Hello, I'm designing a real-time playback application on Linux that acts as an RTP client, receiving H.264-encoded video and uncompressed PCM audio and rendering them to a screen. I've noticed that the presentation times I receive as a client often jump erratically until RTCP synchronization kicks in. Since these timestamps are the only way to tie the two streams together (I'm playing back both audio and video), it's imperative that they be accurate. Is it possible to access this synchronization-state variable somehow from my MediaSink subclass? I know the variable fCurPacketHasBeenSynchronizedUsingRTCP in MultiFramedRTPSource.cpp holds this information, but how do I access it from the sink classes (e.g. FileSink)?
One option I saw in QuickTimeFileSink.cpp is to take the current MediaSession, iterate through its subsessions, and call subsession->rtpSource()->hasBeenSynchronizedUsingRTCP(). But how do I get the current MediaSession from a MediaSink-derived class? Is there a simpler way to get at the RTPSource's fCurPacketHasBeenSynchronizedUsingRTCP variable? I know I can access it through H264VideoRTPSource (it's RTPSource-derived), which lets me reach the variable, but I can't do anything with that information since the class doesn't give me anywhere to store it :-( Any help is appreciated. Thank you, Jerry
_______________________________________________ live-devel mailing list live-devel@lists.live555.com http://lists.live555.com/mailman/listinfo/live-devel