But at the same time as transmitting an RTP packet, I need to capture a frame into the buffers from the live source (e.g., a webcam). Does this require multithreaded programming?
No. You can do this, within a single event loop, with a chain of three objects:
	FrameCapture => Encoder => RTP_Sink
where:
- "FrameCapture" is an instance of a new class (that you would write) that subclasses "FramedSource". You should look at the "DeviceSource" code as a model for how you might do this.
- "Encoder" is an instance of a new class (that you would write) that subclasses "FramedFilter". It would compress each frame using your chosen codec.
- "RTP_Sink" is an instance of the appropriate subclass of "RTPSink" (depending on the codec).
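For illustration only, here is a minimal sketch of what the two new classes might look like, loosely modeled on the "DeviceSource" example that ships with the library. The names "FrameCapture" and "Encoder" come from the chain above; the webcam read and the codec call are placeholders that you would replace with your own code, and the members fTo, fMaxSize, fFrameSize, fPresentationTime etc. are the standard "FramedSource" fields.

  // Sketch only: class skeletons modeled on "DeviceSource".  The webcam
  // read and the compression step are placeholders for your own code.
  #include "liveMedia.hh"
  #include "GroupsockHelper.hh" // for gettimeofday()

  class FrameCapture: public FramedSource {
  public:
    static FrameCapture* createNew(UsageEnvironment& env) {
      return new FrameCapture(env);
    }
  protected:
    FrameCapture(UsageEnvironment& env): FramedSource(env) {}
  private:
    virtual void doGetNextFrame() {
      // Placeholder: copy one captured frame (<= fMaxSize bytes) into fTo,
      // set fFrameSize accordingly, timestamp it, then hand it downstream.
      // If no frame is ready yet, you would instead schedule a retry.
      fFrameSize = 0; // e.g. fFrameSize = readWebcamFrame(fTo, fMaxSize);
      gettimeofday(&fPresentationTime, NULL);
      FramedSource::afterGetting(this);
    }
  };

  class Encoder: public FramedFilter {
  public:
    static Encoder* createNew(UsageEnvironment& env, FramedSource* inputSource) {
      return new Encoder(env, inputSource);
    }
  protected:
    Encoder(UsageEnvironment& env, FramedSource* inputSource)
      : FramedFilter(env, inputSource) {}
  private:
    virtual void doGetNextFrame() {
      // Ask the upstream FrameCapture for the next raw frame; it will be
      // delivered to afterGettingFrame() below.
      fInputSource->getNextFrame(fTo, fMaxSize,
                                 afterGettingFrame, this,
                                 FramedSource::handleClosure, this);
    }
    static void afterGettingFrame(void* clientData, unsigned frameSize,
                                  unsigned numTruncatedBytes,
                                  struct timeval presentationTime,
                                  unsigned durationInMicroseconds) {
      Encoder* encoder = (Encoder*)clientData;
      // Placeholder: compress the frame now sitting in fTo with your codec
      // (a real encoder would normally read into its own buffer first).
      encoder->fFrameSize = frameSize;
      encoder->fNumTruncatedBytes = numTruncatedBytes;
      encoder->fPresentationTime = presentationTime;
      encoder->fDurationInMicroseconds = durationInMicroseconds;
      FramedSource::afterGetting(encoder);
    }
  };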
You would begin the streaming by calling
	RTP_Sink->startPlaying(Encoder, ...);
and then
	env->taskScheduler().doEventLoop();
to start the event loop.

--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
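As a rough illustration of that wiring (not from the original message), the following assumes the FrameCapture and Encoder classes sketched above, an arbitrarily chosen multicast destination address and port, and SimpleRTPSink standing in for whatever codec-specific RTPSink subclass you actually need. Exact constructor signatures (e.g. the address type taken by Groupsock) differ between LIVE555 releases, so check the "testProgs" examples that come with your copy of the library.

  #include "liveMedia.hh"
  #include "BasicUsageEnvironment.hh"
  #include "GroupsockHelper.hh"

  int main() {
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    // Example multicast destination for the RTP packets:
    struct in_addr destAddr;
    destAddr.s_addr = our_inet_addr("239.255.42.42");
    const Port rtpPort(18888);
    Groupsock rtpGroupsock(*env, destAddr, rtpPort, 255/*ttl*/);

    // Build the chain: FrameCapture => Encoder => RTP_Sink
    FrameCapture* capture = FrameCapture::createNew(*env);
    Encoder* encoder = Encoder::createNew(*env, capture);
    RTPSink* rtpSink
      = SimpleRTPSink::createNew(*env, &rtpGroupsock,
                                 96, 90000, "video", "MP4V-ES", 1);

    // Start streaming, then run the single event loop (does not return):
    rtpSink->startPlaying(*encoder, NULL, NULL);
    env->taskScheduler().doEventLoop();
    return 0; // never reached
  }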