> And I use ByteStreamFileSource.cpp and ADTSAudioFileSource.cpp to get the 
> frame data.
> 
> For h264/aac sync, I use testProgs/testOnDemandRTSPServer.cpp to do:
> 
> ServerMediaSession* sms
>     = ServerMediaSession::createNew(*env, streamName, streamName,
>                                     descriptionString);
> sms->addSubsession(H264VideoFileServerMediaSubsession
>                    ::createNew(*env, inputFileName, reuseFirstSource));
> sms->addSubsession(ADTSAudioFileServerMediaSubsession
>                    ::createNew(*env, inputFileName3, reuseFirstSource));

Using a byte stream as input works well when you are streaming just a single 
medium (audio or video).  However, if you are streaming both audio and video, 
and want them properly synchronized, then you *cannot* use byte streams as 
input (because, as you discovered, you don't get precise presentation times for 
each frame).

Instead - if you are streaming both audio and video - then each input source 
must deliver *discrete* frames (i.e., one frame at a time), with each frame 
being given a presentation time ("fPresentationTime") when it is encoded.
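
For illustration, here is a minimal sketch of such a discrete-frame audio 
source.  The class name "MyAACSource" and the "getNextEncodedFrame()" hook 
are hypothetical placeholders for your encoder interface; the 
"FramedSource" member variables and the "afterGetting()" call are the 
library's own:

#include "FramedSource.hh"
#include <string.h>
#include <sys/time.h>

class MyAACSource: public FramedSource {
public:
  static MyAACSource* createNew(UsageEnvironment& env) {
    return new MyAACSource(env);
  }

protected:
  MyAACSource(UsageEnvironment& env): FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // Fetch exactly one encoded AAC frame from your encoder
    // (hypothetical helper - replace with your own code):
    unsigned frameSize;
    u_int8_t const* frame = getNextEncodedFrame(frameSize);

    // Copy it into the downstream object's buffer, truncating if needed:
    if (frameSize > fMaxSize) {
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = frameSize - fMaxSize;
    } else {
      fFrameSize = frameSize;
      fNumTruncatedBytes = 0;
    }
    memcpy(fTo, frame, fFrameSize);

    // Stamp the frame with its presentation time.  Ideally this comes
    // from the encoder; gettimeofday() is shown here as a fallback:
    gettimeofday(&fPresentationTime, NULL);

    // Tell the downstream object that the frame is ready:
    FramedSource::afterGetting(this);
  }

  u_int8_t const* getNextEncodedFrame(unsigned& size); // your hook (hypothetical)
};

Your discrete H.264 video source would look much the same, except that it 
delivers one NAL unit at a time.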

Specifically: You will need to define new subclass(es) of "FramedSource" for 
your audio and video inputs.  You will also need to define new subclasses of 
"OnDemandServerMediaSubsession" for your audio and video streams.  In 
particular:
- For audio, your subclass will redefine the "createNewStreamSource()" virtual 
function to create an instance of your new audio source class (that delivers 
one AAC frame at a time).
- For video, your subclass will redefine the "createNewStreamSource()" virtual 
function to create an instance of your new video source class (that delivers 
one H.264 NAL unit at a time - with each H.264 NAL unit *not* having an initial 
0x00 0x00 0x00 0x01 'start code').  It should then feed this into a 
"H264VideoStreamDiscreteFramer" (*not* a "H264VideoStreamFramer").  Your 
implementation of the "createNewRTPSink()" virtual function may be the same as 
in "H264VideoFileServerMediaSubsession", but you may prefer instead to use one 
of the alternative forms of "H264VideoRTPSink::createNew()" that takes SPS and 
PPS NAL units as parameters.  (If you do that, then you won't need to insert 
SPS and PPS NAL units into your input stream.)  A sketch of such a video 
subsession appears after this list.
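
To make this concrete, here is a rough sketch of such a video subsession. 
"MyH264Subsession" and "MyH264Source" are hypothetical names (the latter 
being a "FramedSource" subclass like the audio sketch above, but delivering 
one NAL unit at a time), and the stored SPS/PPS members are assumptions; 
the live555 class names and virtual function signatures are real, though 
you should check your version of the library for the exact 
"H264VideoRTPSink::createNew()" variants available:

#include "OnDemandServerMediaSubsession.hh"
#include "H264VideoStreamDiscreteFramer.hh"
#include "H264VideoRTPSink.hh"

class MyH264Subsession: public OnDemandServerMediaSubsession {
public:
  static MyH264Subsession* createNew(UsageEnvironment& env,
                                     Boolean reuseFirstSource) {
    return new MyH264Subsession(env, reuseFirstSource);
  }

protected:
  MyH264Subsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500; // kbps; a rough estimate, used for RTCP

    // Your source delivers one NAL unit at a time, without start codes;
    // wrap it in a *discrete* framer:
    return H264VideoStreamDiscreteFramer
      ::createNew(envir(), MyH264Source::createNew(envir()));
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    // Passing the stream's SPS and PPS NAL units here means you don't
    // have to insert them into the input stream itself:
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                       rtpPayloadTypeIfDynamic,
                                       fSPS, fSPSSize, fPPS, fPPSSize);
  }

private:
  // SPS/PPS NAL units, captured from your encoder (assumption):
  u_int8_t* fSPS; unsigned fSPSSize;
  u_int8_t* fPPS; unsigned fPPSSize;
};

Your audio subsession is simpler: its "createNewStreamSource()" returns your 
discrete AAC source directly (no framer needed), and its "createNewRTPSink()" 
can be modeled on "ADTSAudioFileServerMediaSubsession".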

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
