Hi Mojtaba,
Writing a tutorial takes some time, which I don't really have at the moment,
sorry. But in short, what I did to create my streaming server was to:
1) Create a custom DeviceSource class derived from FramedSource (see the FAQ,
based on liveMedia/DeviceSource.cpp and liveMedia/DeviceSource.hh). Basically,
the doGetNextFrame() and deliverFrame() functions contain code similar to the
code in your x264VideoStreamFramer.cpp file (the else{} part of your
doGetNextFrame() function is in deliverFrame()). I also created a simple FIFO
class, based on an STL queue, to store encoded video frames. (See the first
sketch after this list.)
2) Create a custom RTPSink class derived from VideoRTPSink. This is essentially
a modified copy of liveMedia/MPEG4ESVideoRTPSink.cpp, adapted to handle my
type of frames. (Second sketch below.)
3) Create a custom MediaSubsession class, overriding three functions:
createNewStreamSource(), createNewRTPSink() and getAuxSDPLine(). The first two
create instances of your custom DeviceSource and your custom RTPSink. The
getAuxSDPLine() function starts the RTPSink and waits until my system header
has been seen, so that it can be put into the config=... part of the reply to
the DESCRIBE. (Third sketch below.)
4) Create a custom RTSPServer class based on mediaServer/DynamicRTSPServer.hh
and mediaServer/DynamicRTSPServer.cpp, changing the createNewSMS() function to
handle my media type. (Fourth sketch below.)
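
A minimal sketch of step 1, following the DeviceSource pattern. The
EncodedFrame struct, the FrameFifo class and the MyDeviceSource name are my
own illustrative names, not part of liveMedia:

#include "liveMedia.hh"
#include <queue>
#include <vector>
#include <cstring>
#include <sys/time.h>

// A simple FIFO for encoded video frames, based on an STL queue:
struct EncodedFrame { std::vector<unsigned char> data; };

class FrameFifo {
public:
  void push(EncodedFrame const& f) { fQueue.push(f); }
  bool empty() const { return fQueue.empty(); }
  EncodedFrame pop() { EncodedFrame f = fQueue.front(); fQueue.pop(); return f; }
private:
  std::queue<EncodedFrame> fQueue;
};

class MyDeviceSource: public FramedSource {
public:
  static MyDeviceSource* createNew(UsageEnvironment& env, FrameFifo& fifo) {
    return new MyDeviceSource(env, fifo);
  }
  // Called (from the event loop) whenever the encoder has queued a new frame:
  void deliverFrame();
protected:
  MyDeviceSource(UsageEnvironment& env, FrameFifo& fifo)
    : FramedSource(env), fFifo(fifo) {}
private:
  virtual void doGetNextFrame() {
    // If no frame is available yet, return now; deliverFrame() completes
    // the delivery once the encoder has produced one.
    if (fFifo.empty()) return;
    deliverFrame();
  }
  FrameFifo& fFifo;
};

void MyDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // downstream isn't asking for data yet
  EncodedFrame frame = fFifo.pop();

  // Copy the frame into the downstream buffer, truncating if necessary:
  if (frame.data.size() > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = frame.data.size() - fMaxSize;
  } else {
    fFrameSize = frame.data.size();
    fNumTruncatedBytes = 0;
  }
  gettimeofday(&fPresentationTime, NULL);
  memcpy(fTo, &frame.data[0], fFrameSize);

  // Tell liveMedia the frame is ready (must be the last thing we do):
  FramedSource::afterGetting(this);
}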
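
A minimal sketch of step 2, modeled on liveMedia/MPEG4ESVideoRTPSink.cpp. The
class name, the "MYCODEC" payload format string and the 90000 Hz timestamp
frequency are assumptions for illustration:

#include "liveMedia.hh"

class MyVideoRTPSink: public VideoRTPSink {
public:
  static MyVideoRTPSink* createNew(UsageEnvironment& env, Groupsock* RTPgs,
                                   unsigned char rtpPayloadFormat) {
    return new MyVideoRTPSink(env, RTPgs, rtpPayloadFormat);
  }
protected:
  MyVideoRTPSink(UsageEnvironment& env, Groupsock* RTPgs,
                 unsigned char rtpPayloadFormat)
    : VideoRTPSink(env, RTPgs, rtpPayloadFormat, 90000, "MYCODEC") {}
private:
  // Called for each outgoing packet: set the RTP timestamp, and set the
  // marker bit on the packet that carries the end of a frame:
  virtual void doSpecialFrameHandling(unsigned fragmentationOffset,
                                      unsigned char* frameStart,
                                      unsigned numBytesInFrame,
                                      struct timeval framePresentationTime,
                                      unsigned numRemainingBytes) {
    if (numRemainingBytes == 0) setMarkerBit();
    setTimestamp(framePresentationTime);
  }
  virtual Boolean allowFragmentationAfterStart() const { return True; }
  // MPEG4ESVideoRTPSink additionally overrides auxSDPLine() to return the
  // "a=fmtp:... config=..." line once the config bytes are known.
};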
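
A minimal sketch of step 3. The getAuxSDPLine() part follows the same
poll-until-ready pattern that liveMedia's own file server media subsessions
use, and it assumes the MyDeviceSource and MyVideoRTPSink classes from the
sketches above:

#include "liveMedia.hh"

class MyServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
  static MyServerMediaSubsession* createNew(UsageEnvironment& env, FrameFifo& fifo) {
    return new MyServerMediaSubsession(env, fifo);
  }
protected:
  MyServerMediaSubsession(UsageEnvironment& env, FrameFifo& fifo)
    : OnDemandServerMediaSubsession(env, True/*reuseFirstSource*/),
      fFifo(fifo), fAuxSDPLine(NULL), fRTPSink(NULL), fDoneFlag(0) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 2000; // kbps; an assumed estimate
    return MyDeviceSource::createNew(envir(), fFifo);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MyVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }

  // Start the sink, then wait (inside the event loop) until it has seen the
  // system header and can produce its SDP line for the DESCRIBE reply:
  virtual char const* getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
    if (fAuxSDPLine != NULL) return fAuxSDPLine;
    fRTPSink = rtpSink;
    fRTPSink->startPlaying(*inputSource, NULL, NULL);
    checkForAuxSDPLine(this);
    envir().taskScheduler().doEventLoop(&fDoneFlag);
    return fAuxSDPLine;
  }

private:
  static void checkForAuxSDPLine(void* clientData) {
    ((MyServerMediaSubsession*)clientData)->checkForAuxSDPLine1();
  }
  void checkForAuxSDPLine1() {
    char const* dasl = fRTPSink->auxSDPLine();
    if (dasl != NULL) {
      fAuxSDPLine = strDup(dasl);
      fDoneFlag = ~0; // breaks the doEventLoop() call above
    } else {
      // Not ready yet; poll again in 100 ms:
      envir().taskScheduler().scheduleDelayedTask(100000,
          (TaskFunc*)checkForAuxSDPLine, this);
    }
  }

  FrameFifo& fFifo;
  char* fAuxSDPLine;
  RTPSink* fRTPSink;
  char fDoneFlag;
};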
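
A minimal sketch of the createNewSMS() change in step 4. The ".myv" extension
and the global theFifo are assumptions; the structure follows
mediaServer/DynamicRTSPServer.cpp:

#include "liveMedia.hh"
#include <string.h>

extern FrameFifo theFifo; // the FIFO that the encoder fills (an assumption)

static ServerMediaSession* createNewSMS(UsageEnvironment& env,
                                        char const* fileName, FILE* /*fid*/) {
  // Use the file-name extension to decide which subsession(s) to create:
  char const* extension = strrchr(fileName, '.');
  if (extension == NULL) return NULL;

  ServerMediaSession* sms = NULL;
  if (strcmp(extension, ".myv") == 0) {
    // My media type: stream live frames from the encoder's FIFO
    sms = ServerMediaSession::createNew(env, fileName, fileName, "my media type");
    sms->addSubsession(MyServerMediaSubsession::createNew(env, theFifo));
  }
  // ... the other extensions are handled as in the original file ...
  return sms;
}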
That's it! I also set the maximum packet length (OutPacketBuffer::maxSize) in
MediaSink.cpp to 120 KB.
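
For what it's worth, OutPacketBuffer::maxSize is a public static (declared in
MediaSink.hh), so the same change can also be made from application code,
without editing MediaSink.cpp, as long as it happens before the sinks are
created:

// Make the output packet buffer large enough for my biggest encoded frames:
OutPacketBuffer::maxSize = 120000; // bytes (~120 KB)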
The whole thing is started up by (see mediaServer/live555MediaServer.cpp, and
the sketch after this list):
- Creating a BasicTaskScheduler
- Creating a BasicUsageEnvironment
- Creating the RTSP server
- Entering doEventLoop()
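
A minimal sketch of that start-up code, along the lines of
mediaServer/live555MediaServer.cpp (the port number and the lack of access
control are assumptions):

#include "BasicUsageEnvironment.hh"
#include "DynamicRTSPServer.hh"

int main() {
  // Create the task scheduler and usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Create the RTSP server (no access control in this sketch):
  UserAuthenticationDatabase* authDB = NULL;
  RTSPServer* rtspServer = DynamicRTSPServer::createNew(*env, 554, authDB);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}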
By the way, thanks for your tutorial, it helped...
About the packet loss: for my application it will be necessary to create some
container format which will hold the video and audio frames. To simply fix the
packet loss detection problem I could of course put a frame number in the
container format; this would avoid any changes to liveMedia, but it would be
nice if some function or mechanism were added to report to the higher level
that a frame was lost.
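
For illustration, one possible shape for such a container frame header (all
of these names and fields are assumptions, not an existing format):

#include <stdint.h>

struct ContainerFrameHeader {
  uint32_t frameNumber; // increments by one per frame; a gap means a lost frame
  uint32_t frameLength; // number of payload bytes that follow this header
  uint8_t  mediaType;   // e.g. 0 = video, 1 = audio
};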
Hello, I agree. I was also able to create an RTP streaming application for my
H264 encoder within a week. More documentation would surely be appreciated by
new users. Towards this, I documented my work (it has sample code + UML
diagrams) and put it here: http://www.white.ca/patrick/tutorial.tar.gz
Do you think you would be able to do the same, Luc?

I'm also interested in your question about packet loss. I have not yet had
time to look at that part of RTP, but I will have to very soon. Are we to
assume that presentation times will be regular (like every 33 ms), and that if
there is a gap between them on the receiving end, we know a frame was lost? I
may be wrong, but that doesn't seem like an elegant solution...

Mojtaba Hosseini