Re: [Live-devel] RTP stream retransmit

2013-02-01 Thread Ross Finlayson
> I think I figured the cause - many of the incoming RTP frames "get" the same presentation time (via 'receptionStatsDB().noteIncomingPacket(...)'.

No, this is a 'wild goose chase'. The presentation times that get set for the outgoing Transport Stream packets are determined *entirely* by the

Re: [Live-devel] Two client sessions to different servermedia sessions

2013-02-01 Thread Ross Finlayson
> I am using OnDemandServerMediaSession and added two servermedia sessions with the name “live0” and “live1”. Both serverMediaSessions will stream MJPG video data.
>
> I have set the reuseFirstSource flag, so that more than one client sessions to the same serverMediaSession (for e.g “li

[Live-devel] Two client sessions to different servermedia sessions

2013-02-01 Thread saravanan
Hi, I am using OnDemandServerMediaSession and added two servermedia sessions with the names "live0" and "live1". Both serverMediaSessions will stream MJPG video data. I have set the reuseFirstSource flag, so that more than one client session can connect to the same serverMediaSession (for e.g. "live0"
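A minimal sketch of the setup being described, assuming live555's standard API. "MJPEGLiveSubsession" is a hypothetical OnDemandServerMediaSubsession subclass (live555 ships no stock MJPEG live-source subsession), and `env` / `rtspServer` are assumed to exist:

```cpp
// Two independent ServerMediaSessions, "live0" and "live1".
// Passing reuseFirstSource = True to the (hypothetical) subsession
// lets every client of one session share a single input source.
ServerMediaSession* sms0 = ServerMediaSession::createNew(*env, "live0");
sms0->addSubsession(MJPEGLiveSubsession::createNew(*env, True /*reuseFirstSource*/));
rtspServer->addServerMediaSession(sms0);

ServerMediaSession* sms1 = ServerMediaSession::createNew(*env, "live1");
sms1->addSubsession(MJPEGLiveSubsession::createNew(*env, True /*reuseFirstSource*/));
rtspServer->addServerMediaSession(sms1);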

Re: [Live-devel] RTP stream retransmit

2013-02-01 Thread Zvika Meiseles
> That's strange. The "MPEG2TransportStreamFramer" class should be scanning the "PTS" (timestamps) in the incoming MPEG Transport Stream packets, and using this to compute an estimated 'duration' for each. So I don't know why this is not working for you.

I think I figured the cause - many of the i
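A sketch of the chain under discussion, assuming live555's standard API (`env`, `rtpGroupsock`, and the input source are assumptions; the payload parameters follow live555's own testMPEG2TransportStreamer demo):

```cpp
// A Transport Stream source is wrapped in MPEG2TransportStreamFramer,
// which scans the timestamps in the 188-byte TS packets to estimate a
// 'duration' for each, before the data reaches the RTP sink.
FramedSource* tsSource = ByteStreamFileSource::createNew(*env, "input.ts");
MPEG2TransportStreamFramer* tsFramer
    = MPEG2TransportStreamFramer::createNew(*env, tsSource);
RTPSink* tsSink
    = SimpleRTPSink::createNew(*env, rtpGroupsock,
                               33, 90000, "video", "MP2T",
                               1, True, False /*no 'M' bit*/);
tsSink->startPlaying(*tsFramer, afterPlaying, tsSink);
```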

Re: [Live-devel] Am I accidentally H.264 encoding twice???

2013-02-01 Thread Ross Finlayson
> Regarding your statement [your 'framer' object should then be fed into a "H264VideoRTPSink" object, for streaming], please help me understand this. Currently, the 'framer' object is sent to videoSink->startPlaying(), where videoSink is a much more fundamental RTPSink.

This is exactly a
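A minimal sketch of the chain Ross describes, assuming live555's standard API; `env`, `encoderSource`, `rtpGroupsock`, and `afterPlaying` are assumed to exist:

```cpp
// The encoder's output (a FramedSource delivering discrete H.264 NAL
// units) is wrapped in a framer; the framer -- not the raw source --
// is what the RTP sink plays.
H264VideoStreamDiscreteFramer* framer
    = H264VideoStreamDiscreteFramer::createNew(*env, encoderSource);
RTPSink* videoSink
    = H264VideoRTPSink::createNew(*env, rtpGroupsock, 96 /*dynamic PT*/);
videoSink->startPlaying(*framer, afterPlaying, videoSink);
```

Note that a framer packages already-encoded NAL units for the sink; it does not re-encode, which is the point of this thread's question.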

Re: [Live-devel] Am I accidentally H.264 encoding twice???

2013-02-01 Thread temp2...@forren.org
(((Jacob, please see Ross's quoted email response further below. ABORT the removal of double H.264...))) Ross, Thanks very much for this info. Will do. Regarding your statement [your 'framer' object should then be fed into a "H264VideoRTPSink" object, for streaming], please help me understand

[Live-devel] Observing dataloss in linux user space with testRTSPClient

2013-02-01 Thread Marathe, Yogesh
Hi, I added some simple logic and a few parameters to DummySink to calculate the received bitrate in afterGettingFrame() of testRTSPClient, and printed it at 30-second intervals. This showed the bitrate received per stream. When I opened 4 connections from IP cameras (streaming at 8Mbps CBR) I saw 30-