Hi,

I've managed, with some effort, to get a solution that uses Live555 to
handle the streaming of video from a USB webcam for a simple application.
It all works quite well, but I see a lot of latency and need to start
tracking it down. My pipeline currently is:
camera -> OpenCV -> x264 -> Live555 -> network (localhost) -> VLC.
With this I see a latency of about 1.5 s, which is quite a lot. So far
I've managed to time the grabbing and encoding part, and it comes to
~50 ms per frame. The network part should add next to nothing, as
everything runs on one machine right now. VLC is the unknown beast here:
it's really hard to know what it's doing and how much it buffers, and
thus how much latency it adds.

What I'm now interested in is whether there is some way to debug latency
within Live555, to make sure the latency is not introduced there. What
I've done so far is to compare the presentation times of the encoded
frames with what a slightly modified testRTSPClient shows. My grabbing
and encoding is done in a FramedSource subclass, in doGetNextFrame().
There all the NALs for the captured frame are given the same
presentation time, based on:

        gettimeofday( &m_currentTime, NULL ); 

I print this time along with some other data:

H264FramedSource::doGetNextFrame: frame done in 30 ms, queue size: 9, time: 1397053387.439
H264FramedSource::doGetNextFrame: frame done in 0 ms, queue size: 8, time: 1397053387.439
H264FramedSource::doGetNextFrame: frame done in 0 ms, queue size: 7, time: 1397053387.439
...

The first call did the grabbing and encoding, pushing 9 NAL units into a
queue; the following doGetNextFrame() calls simply feed the already
created NALs and are thus much faster. The time shown is the
presentation time as retrieved above.
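
For reference, my doGetNextFrame() is roughly shaped like this (a
simplified sketch: the queue type and the grab/encode helper are
stand-ins for the real code, the f* members are Live555's):

    void H264FramedSource::doGetNextFrame()
    {
        // Queue empty: grab a frame from the camera and encode it.
        // x264 typically produces several NALs for the one frame.
        if (m_nalQueue.empty()) {
            grabAndEncodeFrame(m_nalQueue);      // fills the queue
            gettimeofday(&m_currentTime, NULL);  // one timestamp per frame
        }

        NalUnit nal = m_nalQueue.front();
        m_nalQueue.pop();

        // Copy the NAL into the buffer Live555 gave us, truncating
        // if it doesn't fit.
        fFrameSize = nal.size;
        fNumTruncatedBytes = 0;
        if (fFrameSize > fMaxSize) {
            fNumTruncatedBytes = fFrameSize - fMaxSize;
            fFrameSize = fMaxSize;
        }
        memcpy(fTo, nal.data, fFrameSize);

        // All NALs from the same captured frame share this time.
        fPresentationTime = m_currentTime;

        FramedSource::afterGetting(this);
    }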

Then I use a slightly modified testRTSPClient to receive the stream, and
the first frames to arrive look like:

Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 5 bytes.       
Presentation time: 1397053387.439622  now: 1397053387.439772
Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 604 bytes.     
Presentation time: 1397053387.439622  now: 1397053387.439794
Stream "rtsp://192.168.1.12:8554/camera0/"; video/H264: Received 8224 bytes.    
Presentation time: 1397053387.439622  now: 1397053387.439924
...

The presentation time is the same as I set; the "now" is simply an added
gettimeofday() call at the point where DummySink::afterGettingFrame()
gets called. It seems testRTSPClient gets the frames pretty fast, less
than 1 ms (here about 150 us), after they've been passed to Live555 on
the server end. I was not expecting it to be that fast, and that makes
me doubt my logic. Is it really this simple; do the frames arrive this
fast at the client? If so, the latency from the camera to a
testRTSPClient is not too big, far from the 1.5 s I see with VLC
(MPlayer on OS X just breaks the image apart, but the broken image seems
just as late there). testRTSPClient of course does none of the decoding
and such that a real player must do, but that can't be more than a few
milliseconds per frame.
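
For completeness, the modification to the sink is tiny; it is the stock
DummySink::afterGettingFrame() from testRTSPClient with just the
gettimeofday() call and the "now" output added (roughly):

    void DummySink::afterGettingFrame(unsigned frameSize,
                                      unsigned /*numTruncatedBytes*/,
                                      struct timeval presentationTime,
                                      unsigned /*durationInMicroseconds*/)
    {
        // The addition: wall-clock time at which the sink saw the frame.
        struct timeval now;
        gettimeofday(&now, NULL);

        envir() << "Stream \"" << fStreamId << "\"; "
                << fSubsession.mediumName() << "/" << fSubsession.codecName()
                << ": Received " << frameSize << " bytes.";

        char uSecsStr[7];  // zero-padded microseconds, as in the stock code
        sprintf(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec);
        envir() << "  Presentation time: "
                << (int)presentationTime.tv_sec << "." << uSecsStr;

        sprintf(uSecsStr, "%06u", (unsigned)now.tv_usec);
        envir() << "  now: " << (int)now.tv_sec << "." << uSecsStr << "\n";

        // Then continue, to request the next frame of data:
        continuePlaying();
    }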

Am I thus right in assuming that almost all of my latency comes from the
VLC side? My next test would be to extend testRTSPClient with a real
decoding sink in place of the DummySink, but I'd like to avoid doing
that if possible.
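
One cheaper check I might try first, assuming the delay really is VLC's
input buffering: shrink its network cache (given in milliseconds, if I
read the VLC docs right) and see whether the latency follows:

    vlc --network-caching=200 rtsp://192.168.1.12:8554/camera0/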

Any ideas? I'm more than happy to be lectured on my bad ways of measuring time 
and latency. :)


Best regards,
   Jan Ekholm


