I am experiencing an issue when streaming from a live camera source to a client over a proxy server. My current setup is the following:
[Live555 camera server] <-> [Live555 Proxy server] <-> [Client (VLC)]

I register a media session in the proxy; later, the client connects to that session via the proxy. The first time the client connects, everything goes smoothly. However, after I disconnect and reconnect the client a few times, playback halts for an arbitrary period of time (until the client gives up and reconnects).

I have tracked this problem down to the proxy server starting to send RTP packets to the client with an old RTP timestamp after an RTCP SR is sent from the server to the proxy. Without going into too many details, I think the core problem is that the FramedSource I implemented for the camera feeds an H264VideoStreamFramer, which parses and frames the incoming H.264 data from the camera:

    FramedSource *H264CameraMediaSubsession::createNewStreamSource(
            unsigned /*clientSessionId*/, unsigned &estBitrate) {
        estBitrate = 500; // kbps, estimate
        FramedSource *transportStreamSource =
            CameraDeviceSource::createNew(envir(), fFrameReader);
        return H264VideoStreamFramer::createNew(envir(), transportStreamSource);
    }

I believe the problem is that the H264VideoStreamFramer passes consecutive presentation timestamps (based on the wall clock) to the RTPSink, regardless of the PTS I set in my implementation of CameraDeviceSource::doGetNextFrame().
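For completeness, here is the direction I am considering as a workaround, though I am not certain it is the intended usage: if CameraDeviceSource can deliver exactly one NAL unit per doGetNextFrame() call and set fPresentationTime itself, the parsing framer could be replaced with H264VideoStreamDiscreteFramer, which, as far as I can tell, forwards the source's presentation times instead of re-deriving them from the wall clock. This is an untested sketch:

```cpp
#include "liveMedia.hh"

FramedSource *H264CameraMediaSubsession::createNewStreamSource(
        unsigned /*clientSessionId*/, unsigned &estBitrate) {
    estBitrate = 500; // kbps, estimate

    // Assumption: CameraDeviceSource delivers one H.264 NAL unit (without
    // a start code) per doGetNextFrame() call, and sets fPresentationTime
    // from the camera's own clock.
    FramedSource *cameraSource =
        CameraDeviceSource::createNew(envir(), fFrameReader);

    // The discrete framer does no byte-stream parsing, so (I believe) the
    // source's presentation times reach the RTPSink unchanged.
    return H264VideoStreamDiscreteFramer::createNew(envir(), cameraSource);
}
```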
I added some debug prints to figure out what's going on on the server side:

[Streaming is ongoing]
Frame PTS is 381398.975357                          [the PTS I pass in CameraDeviceSource::doGetNextFrame()]
fTimestampBase: 0x51879d3e, tv: 1582406442.526492   [the PTS that is passed to the RTPSink]
=> RTP timestamp: 1127101430 (0x432e33f6)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1127101430 (frame PTS 1582406442.526492
Frame PTS is 381399.055351
fTimestampBase: 0x51879d3e, tv: 1582406442.566492
=> RTP timestamp: 1127105030 (0x432e4206)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1127105030 (frame PTS 1582406442.566492
Frame PTS is 381399.135529
fTimestampBase: 0x51879d3e, tv: 1582406442.606492
=> RTP timestamp: 1127108630 (0x432e5016)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1127108630 (frame PTS 1582406442.606492

fTimestampBase: 0x51879d3e, tv: 1582406492.200539
=> RTP timestamp: 1131572095 (0x43726b7f)
SR: rtp timestamp 1131572095

[The client disconnects from the proxy; a PAUSE is sent from the proxy to the server. From here on, only RTCP SR packets are sent.]

fTimestampBase: 0x51879d3e, tv: 1582406497.240983
=> RTP timestamp: 1132025734 (0x43795786)
SR: rtp timestamp 1132025734
fTimestampBase: 0x51879d3e, tv: 1582406501.872705
=> RTP timestamp: 1132442589 (0x437fb3dd)
SR: rtp timestamp 1132442589
fTimestampBase: 0x51879d3e, tv: 1582406507.372838
=> RTP timestamp: 1132937601 (0x43874181)
SR: rtp timestamp 1132937601
fTimestampBase: 0x51879d3e, tv: 1582406512.151478
=> RTP timestamp: 1133367679 (0x438dd17f)
SR: rtp timestamp 1133367679
fTimestampBase: 0x51879d3e, tv: 1582406513.312512
=> RTP timestamp: 1133472172 (0x438f69ac)
SR: rtp timestamp 1133472172

[The client connects to the proxy again. A PLAY is sent.]
RTPSink::presetNextTimestamp() 1582406513.313628    [the RTPSink timestamp is preset using the wall-clock time]
fTimestampBase: 0x51879d3e, tv: 1582406513.313628
=> RTP timestamp: 1133472273 (0x438f6a11)
RTPSink::presetNextTimestamp() is adjusting timestamp
Frame PTS is 381421.734938                          [the PTS passed by CameraDeviceSource::doGetNextFrame() has advanced in time]
Frame PTS is 381421.734938
fTimestampBase: 0x51e8a929, tv: 1582406442.646492   [but it is ignored, and the PTS used for the packet is as before the PAUSE]
=> RTP timestamp: 1133472273 (0x438f6a11)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1133472273 (frame PTS 1582406442.646492

[The RTP timestamps of the packets have advanced significantly, from 1131572095 to 1133472273. The proxy side simply applies the increment to the frames it sends to the client, so there is a discontinuity in what the client sees, but it doesn't care since it's a fast-forward. Packets continue to be relayed.]

Frame PTS is 381421.734938
fTimestampBase: 0x51e8a929, tv: 1582406442.646492
=> RTP timestamp: 1133472273 (0x438f6a11)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1133472273 (frame PTS 1582406442.646492
Frame PTS is 381421.734938
fTimestampBase: 0x51e8a929, tv: 1582406442.646492
=> RTP timestamp: 1133472273 (0x438f6a11)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1133472273 (frame PTS 1582406442.646492
...

[Things go bad on the next RTCP SR.]

fTimestampBase: 0x51e8a929, tv: 1582406517.681249
=> RTP timestamp: 1140225401 (0x43f67579)
SR: rtp timestamp 1140225401

[The SR is computed from the wall-clock time. The proxy updates its fSyncTime and fSyncTimestamp with those values.]
Frame PTS is 381426.054815
fTimestampBase: 0x51e8a929, tv: 1582406445.206480
=> RTP timestamp: 1133702672 (0x4392ee10)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1133702672 (frame PTS 1582406445.206480
Frame PTS is 381426.135018
fTimestampBase: 0x51e8a929, tv: 1582406445.246480
=> RTP timestamp: 1133706272 (0x4392fc20)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1133706272 (frame PTS 1582406445.246480
Frame PTS is 381426.174876
fTimestampBase: 0x51e8a929, tv: 1582406445.286480
=> RTP timestamp: 1133709872 (0x43930a30)
MultiFramedRTPSink::setTimestamp(): RTP Packet timestamp = 1133709872 (frame PTS 1582406445.286480

[Now the RTP packets received by the proxy are considered to be 72 seconds in the past. The normalizer employed by the proxy accounts for this change and relays the older packets to the client as well, but the client refuses to accept them because they are considered late. You may notice that the time difference between the CameraDeviceSource::doGetNextFrame() PTS and the one derived by the H264VideoStreamFramer is actually less than 72 s (~22 s); I believe this is because the sample pasted here was taken after running the client a few times, and the delay aggregates on each PAUSE sent to the server.]

I may have been misusing the H264or5VideoStreamFramer for streaming from a live source. If that is the case, I would appreciate some guidance on how things should be done differently. If H264or5VideoStreamFramer is the way to go for live H.264 sources as well, then it may be missing an API that should be invoked when streaming is paused/resumed so that it can update its own time base.

I would appreciate some guidance on this issue.

Thanks,
Micha

--
Micha Kalfon
Software Engineer
Toka - Cyber Builders (CyberToka Ltd.)
Mail: mi...@tokagroup.com
Mobile: +972-52-6086486
_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel