Hi,

I am developing a multi-threaded application using the LIVE555 libraries. It acts
as an RTSP client to receive audio and video from several Axis IP cameras
concurrently, decodes the streams to raw frames, processes the data, combines it
into a single outbound audio/video stream, re-encodes it, and streams it out to
a remote receiver via an RTSP server. I am using VLC player to play back the
stream on the receiving end.

I am having an issue on the streaming-out side. The audio and video encoders
read raw data from shared buffers using two derived FramedSource classes
modeled after DeviceSource.cpp. The deliverFrame() function in each derived
class reads raw audio or video from its shared buffer, encodes it using the
ffmpeg libraries, and fills in the output buffer and other parameters
appropriately before returning. Occasionally, when a shared buffer is accessed
for reading, there isn't enough data available to read, possibly due to jitter
in processing time on the write side of the shared buffers. What is the right
action in that case? If my buffer reader waits a few milliseconds until enough
data is available to read (using Events or otherwise), the VLC player on the
receiving side freezes. If I return with fFrameSize = 0, the application
crashes. The only thing that seems to work is to re-encode the previous frame
(for video) or encode all-zero samples (for audio), and fill in the buffers and
other parameters the normal way. Even in this case, the receiving VLC player
freezes every few minutes or so.
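For reference, the fallback I described can be sketched roughly as below. This is a standalone model, not my actual code (the real logic runs inside the derived FramedSource's deliverFrame(), and all the names here are hypothetical): on underrun, the video path repeats the last frame and the audio path substitutes silence, so the reader never blocks and never delivers an empty frame.

```cpp
#include <cstdint>
#include <deque>
#include <mutex>
#include <vector>

// Hypothetical shared buffer between the writer threads and the encoder's
// deliverFrame(). On underrun, the video policy repeats the last delivered
// frame, and the audio policy substitutes silence (all-zero samples).
class SharedFrameBuffer {
public:
    explicit SharedFrameBuffer(bool isAudio) : fIsAudio(isAudio) {}

    // Writer side: called by the capture/processing threads.
    void push(std::vector<uint8_t> frame) {
        std::lock_guard<std::mutex> lock(fMutex);
        fQueue.push_back(std::move(frame));
    }

    // Reader side: called from deliverFrame(). Never blocks; on underrun
    // it falls back instead of returning an empty frame.
    std::vector<uint8_t> pop(size_t frameSize) {
        std::lock_guard<std::mutex> lock(fMutex);
        if (!fQueue.empty()) {
            fLastFrame = fQueue.front();
            fQueue.pop_front();
            return fLastFrame;
        }
        if (fIsAudio || fLastFrame.empty()) {
            // Audio underrun (or nothing received yet): deliver silence.
            return std::vector<uint8_t>(frameSize, 0);
        }
        // Video underrun: repeat the previous frame for re-encoding.
        return fLastFrame;
    }

private:
    std::mutex fMutex;
    std::deque<std::vector<uint8_t>> fQueue;
    std::vector<uint8_t> fLastFrame;
    bool fIsAudio;
};
```

This keeps the encoder fed at a steady rate, but as noted above it only postpones the problem rather than fixing it.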

What am I doing wrong?

Thanks,
Debargha

_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel