Hello Ross,

I have a question regarding copying data to the output buffer when implementing my own FramedSource. There is an fTo buffer with a maximum size fMaxSize; this fMaxSize is taken from the OutPacketBuffer::maxSize value.
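For context, the delivery step in a custom FramedSource copies at most fMaxSize bytes into fTo and reports any excess through fNumTruncatedBytes. The sketch below models that logic outside the library; the struct and function are illustrative stand-ins (the fFrameSize/fNumTruncatedBytes names mirror the LIVE555 convention, but this is not library API):

```cpp
#include <cstring>

// Result of one delivery: how much was copied, how much did not fit.
struct DeliverResult {
  unsigned fFrameSize;         // bytes actually copied into fTo
  unsigned fNumTruncatedBytes; // bytes that exceeded fMaxSize
};

// Copy at most fMaxSize bytes of 'frame' into fTo, never writing past
// the buffer; any remainder is reported as truncated.
DeliverResult deliverFrame(unsigned char* fTo, unsigned fMaxSize,
                           const unsigned char* frame, unsigned frameLen) {
  DeliverResult r;
  if (frameLen > fMaxSize) {   // frame larger than the sink's buffer
    r.fFrameSize = fMaxSize;
    r.fNumTruncatedBytes = frameLen - fMaxSize;
  } else {
    r.fFrameSize = frameLen;
    r.fNumTruncatedBytes = 0;
  }
  std::memcpy(fTo, frame, r.fFrameSize);
  return r;
}
```

The key invariant is that the source must never write more than fMaxSize bytes, whatever the incoming frame size is.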
So, the question is: can I change this buffer size dynamically? E.g. if I receive a frame that is bigger than OutPacketBuffer::maxSize and I increase this value, what should be done to update the fTo buffer and its fMaxSize for the next doGetNextFrame call?

Best regards,
-----------------------------------------
Victor Vitkovskiy
Senior software developer
mailto: victor.vitkovs...@mirasys.com
www.mirasys.com

-----Original Message-----
From: live-devel <live-devel-boun...@us.live555.com> On Behalf Of Victor Vitkovskiy
Sent: Thursday, 20 January 2022 11:26
To: LIVE555 Streaming Media - development & use <live-de...@us.live555.com>
Subject: Re: [Live-devel] [Mirasys] Live555 RTSP server questions

EXTERNAL

Hello Ross,

I have found a mistake, sorry to disturb you.

-----Original Message-----
From: live-devel <live-devel-boun...@us.live555.com> On Behalf Of Victor Vitkovskiy
Sent: Thursday, 20 January 2022 10:57
To: LIVE555 Streaming Media - development & use <live-de...@us.live555.com>
Subject: Re: [Live-devel] [Mirasys] Live555 RTSP server questions

Hello Ross,

I have implemented MetadataServerMediaSubsession as in this example:
http://lists.live555.com/pipermail/live-devel/2021-August/021953.html

This works; I can see that this session is present in the DESCRIBE SDP response:

v=0
o=- 1642668001587955 1 IN IP4 172.24.128.1
s=Session streamed by "testOnDemandRTSPServer"
i=h264ESVideoTest
t=0 0
a=tool:LIVE555 Streaming Media v2021.11.23
a=type:broadcast
a=control:*
a=range:npt=now-
a=x-qt-text-nam:Session streamed by "testOnDemandRTSPServer"
a=x-qt-text-inf:h264ESVideoTest
m=application 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 vnd.onvif.metadata/90000
a=control:track1

But then I tried to use the RTSP client example from testProgs\testRTSPClient.cpp and tried to save the incoming data in this function:

void
DummySink::afterGettingFrame(void* clientData, unsigned frameSize,
                             unsigned numTruncatedBytes,
                             struct timeval presentationTime,
                             unsigned durationInMicroseconds) {
  FILE* file = nullptr;
  if (fopen_s(&file, "d:\\out.xml", "w+b") == 0) {
    // NB: clientData here is the DummySink* passed to the callback, not the
    // frame bytes, so this writes object memory to the file (presumably the
    // mistake mentioned in the follow-up message above).
    fwrite(clientData, 1, frameSize, file);
    fflush(file);
    fclose(file);
  }
  DummySink* sink = (DummySink*)clientData;
  sink->afterGettingFrame(frameSize, numTruncatedBytes,
                          presentationTime, durationInMicroseconds);
}

But in the result file I see some garbage. The frameSize is correct, but clientData contains some strange data. I can confirm that the packets are sent correctly (I can see them in Wireshark), but the client parses the data incorrectly.

Please find attached:
packets.pcapng
in.xml - the input xml file that is sent from the RTSP server
out.xml - the result file that is saved on the RTSP client side
RtspClientTest.cpp - my RTSP client code that I use to save the data

Could you please tell me what is wrong with the RTSP client?

-----Original Message-----
From: live-devel <live-devel-boun...@us.live555.com> On Behalf Of Ross Finlayson
Sent: Friday, 14 January 2022 17:12
To: LIVE555 Streaming Media - development & use <live-de...@us.live555.com>
Subject: Re: [Live-devel] [Mirasys] Live555 RTSP server questions

> On Jan 15, 2022, at 2:42 AM, Victor Vitkovskiy <victor.vitkovs...@mirasys.com> wrote:
>
> Hello Ross,
>
> I was able to create H.264 / H.265 streams from my sources; it works fine now, even for several URIs and clients.
> Could you please advise me how to stream xml metadata?
>
> What RTPSink and Framer should I use for this?

Is this ‘ONVIF’-type metadata?
If so, then you would use a “SimpleRTPSink”, as described here:
http://lists.live555.com/pipermail/live-devel/2021-August/021953.html

If your metadata is being streamed along with an H.264 (or H.265) video stream, then your “ServerMediaSession” would contain two “ServerMediaSubsession” objects - one for the video track, one for the metadata track.

It’s important that the “fPresentationTime” values set by each of your “FramedSource” subclass objects (for the video track, and for the metadata track) be accurate, and aligned with ‘wall clock’ time (the times that you’d get by calling “gettimeofday()”).

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
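On the fMaxSize question at the top of this thread: the usual LIVE555 approach is not to grow fTo per frame, but to raise the static OutPacketBuffer::maxSize once, before the RTSPServer (and therefore any sink) is created; a frame that still exceeds fMaxSize has to be truncated and reported via fNumTruncatedBytes. A minimal sketch against the LIVE555 headers (the port number and buffer value are placeholders, and the exact placement in main() is an assumption based on common usage, not something stated in this thread):

```cpp
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Set this BEFORE any sink/server is created: each subsequently created
  // output buffer (and hence each source's fTo/fMaxSize) is sized from
  // this static limit.
  OutPacketBuffer::maxSize = 300000; // placeholder: >= your largest frame

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}
```

As far as I know, the buffer is sized when a sink is created, so changing OutPacketBuffer::maxSize mid-stream does not resize buffers that already exist; an oversized frame arriving later must still be truncated for that delivery.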