Hi,
I have created a subclass of MediaSink which processes the received data. I am
using it in place of FileSink.
I am able to see MultiFramedRTPSource::doGetNextFrame1() getting called
before I start playing the streams. It also calls my afterGettingFrame()
and continuePlaying() functions. All this works so far.
I have tried the second approach. I ran openRTSP with -b 4 -r -v
rtsp:/media but it doesn't seem to be working.
No, you should use the "-v" option, but *not* the "-r" option -
because you *want* "openRTSP" to receive the RTP stream, but then
write the resulting video data to stdout. You shoul
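In concrete terms, that advice amounts to a pipeline along these lines (the URL and the downstream program "myProcessor" are placeholders for your actual camera address and processing module):

```shell
# -v writes the received video stream to stdout; no -r, and no file output.
openRTSP -v rtsp://camera-address/media | myProcessor
```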
I have tried the second approach. I ran openRTSP with -b 4 -r -v
rtsp:/media but it doesn't seem to be working. Anyway, I will keep this
as a fallback mechanism. I would like to know more about creating a
subclass of MediaSink; could you please elaborate on this?
On Tue, Dec 1, 2009 at 12:1
I believe it is possible to keep the streams in a buffer, but it is not
clear to me how. I went through the RTPSource class, which has a
function setAuxilliaryReadHandler() that takes a function pointer. As I
understand it, this registers a handler that is invoked on each RTP read.
"setAuxilliaryReadHandler()" is a hack that should not be
Hi,
I am working on a project where I need to record video from an IP camera and
process it. My job is to capture the stream and pass it on to the next module.
After long discussion and research, we decided to use the live555
stack, as it is widely used (and tested). I have gone through the
ope