Hello Ross, 

Thank you for your answers. 

Still, I have some open questions:
>> You don’t need to be concerned at all with the internals of the LIVE555 code 
>> to do what you want here.
This doesn't give me any information about how to do this :).
If I don't need to subclass RTSPServer, then how can I detect that a new client 
has connected / disconnected, and which server media session is being used by 
that client (since we can have several sessions in one RTSPServer)?

>> Yes you can.  This FAQ entry 
>> http://live555.com/liveMedia/faq.html#liveInput-unicast
>> tells you exactly what you need to do.  Unless your source data is 
>> accessible as a file (in your OS’s file system), then you will need to write 
>> a subclass of “FramedSource” that delivers, on demand, a frame of encoded 
>> data each time “doGetNextFrame()” is called.  See the “DeviceSource.cpp” 
>> code for a model of how to do this.
It is clear to me how to create my H264FramedSource, but it is not clear how 
to use it at the higher levels.
The testOnDemandRTSPServer example uses this code:

        ServerMediaSession* sms = ServerMediaSession::createNew(*env,
            streamName, streamName, descriptionString);
        sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env,
            inputFileName, reuseFirstSource));
        rtspServer->addServerMediaSession(sms);
So do I need to subclass H264VideoFileServerMediaSubsession and override those 
two virtual functions, createNewStreamSource() and createNewRTPSink(); is this 
correct?
Or do I need to subclass OnDemandServerMediaSubsession and do the same thing 
there (including reading the SDP information from the H.264 stream)?

Best regards,
-----------------------------------------
Victor Vitkovskiy
Senior software developer
mailto: victor.vitkovs...@mirasys.com
www.mirasys.com


-----Original Message-----
From: live-devel <live-devel-boun...@us.live555.com> On Behalf Of Ross Finlayson
Sent: Wednesday, 12 January 2022 11:02
To: LIVE555 Streaming Media - development & use <live-de...@us.live555.com>
Subject: Re: [Live-devel] [Mirasys] Live555 RTSP server questions

> On Jan 12, 2022, at 9:41 PM, Victor Vitkovskiy 
> <victor.vitkovs...@mirasys.com> wrote:
>
> Dear Support,
>
> My name is Victor. I am investigating the possibility of using Live555 in the 
> Mirasys VMS system, and I have several questions; could you please help me?
>
>       • We need to create an RTSP server that will stream data from our 
> system when a client connects to it.
> Unfortunately, I have not found any callbacks / events related to client 
> connections in the RTSPServer class.

You don’t need to be concerned at all with the internals of the LIVE555 code to 
do what you want here.


>       • There are a lot of examples of how to stream files from a PC, but I 
> haven't found any example of how to stream data continuously from memory.
> E.g. I get a video stream from an IP camera, process it, and then I want to 
> stream it over RTSP.
> I have found this description of how to do this:
>                 But what about the "testOnDemandRTSPServer" test program (for 
> streaming via unicast)? How can I modify it so that it takes input from a 
> live source instead of from a file?
>                 But still I can't cope with this.

Yes you can.  This FAQ entry
        http://live555.com/liveMedia/faq.html#liveInput-unicast
tells you exactly what you need to do.  Unless your source data is accessible 
as a file (in your OS’s file system), you will need to write a subclass of 
“FramedSource” that delivers, on demand, a frame of encoded data each time 
“doGetNextFrame()” is called.  See the “DeviceSource.cpp” code for a model of 
how to do this.
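In outline, that model looks like this (a simplification only; the class name is a 
placeholder, and the real "DeviceSource.cpp" also shows how to trigger the delivery 
asynchronously, e.g. with an event trigger, when frames arrive from another thread):

        #include "liveMedia.hh"

        class MyDeviceSource: public FramedSource {
        public:
            static MyDeviceSource* createNew(UsageEnvironment& env) { return new MyDeviceSource(env); }

        protected:
            MyDeviceSource(UsageEnvironment& env): FramedSource(env) {}

        private:
            virtual void doGetNextFrame() {
                // If a frame of encoded data is already available, deliver it now;
                // otherwise return, and arrange for deliverFrame() to be called
                // later, when the next frame arrives from your device/encoder.
                deliverFrame();
            }

            void deliverFrame() {
                if (!isCurrentlyAwaitingData()) return; // we're not being asked for data right now

                // Copy the frame (at most "fMaxSize" bytes) into "fTo"; set "fFrameSize",
                // "fNumTruncatedBytes" and "fPresentationTime"; then tell the downstream
                // object that new data is available:
                FramedSource::afterGetting(this);
            }
        };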

But basically, the “testOnDemandRTSPServer” code (and an appropriate 
“ServerMediaSubsession” subclass) would be the appropriate model for you to use 
here.

Again, you don’t need to concern yourselves with the internals of the LIVE555 
code.  That’s my job, not yours :-)


>       • Is there any example of how to stream some XML metadata? What media 
> source should I use to stream it over RTSP?

I can answer this question, but not until you’ve demonstrated that you’ve 
gotten more common media streaming (e.g., H.264 video) to work.  You need to 
show that you can walk before you try to run :-)


>       • If I understood correctly, it should be possible to add several media 
> sources to an RTSP session; e.g. if an IP camera has 2 video streams (one at 
> full resolution and one cropped), can I add those streams to one RTSP session 
> as different tracks?

You would do this by adding two different “ServerMediaSession” objects - each 
with a different name - to your RTSP server.  One “ServerMediaSession” would be 
for the high-resolution stream; the other would be for the low-resolution 
stream.  An RTSP client would use the stream name (i.e., in the “rtsp://” URL) 
to select which stream it wanted.

(This is basically what the “testOnDemandRTSPServer” code does - except that it 
uses different “ServerMediaSession” objects for streaming different codec types 
- not different resolutions of the same codec type.)
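Schematically (the stream names, and "YourServerMediaSubsession", are placeholders 
for whatever you end up using):

        ServerMediaSession* smsFull
            = ServerMediaSession::createNew(*env, "cam1-full", "cam1-full", "full-resolution stream");
        smsFull->addSubsession(YourServerMediaSubsession::createNew(*env /* ... full-resolution source ... */));
        rtspServer->addServerMediaSession(smsFull);
        // clients would request: rtsp://<server-address>/cam1-full

        ServerMediaSession* smsCropped
            = ServerMediaSession::createNew(*env, "cam1-cropped", "cam1-cropped", "cropped stream");
        smsCropped->addSubsession(YourServerMediaSubsession::createNew(*env /* ... cropped source ... */));
        rtspServer->addServerMediaSession(smsCropped);
        // clients would request: rtsp://<server-address>/cam1-cropped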


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/


_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
