Am I doing the right thing when I stop and start the RTP stream?

I think so. But what are you using for your server? I *think* our
RTP/RTCP server implementation will handle this correctly (because
RTCP packets should continue to be sent even after the stream is
stopped). But other implementations may not.
Hi,
In my project I need to stop and restart receiving an RTP stream. I'm
using the following code to do this.
Start and restart are the same:
mySink->startPlaying(theRtpSource, afterPlayFun, NULL);
watchVariable=0;
pScheduler->doEventLoop(&watchVariable);
For stop:
mySink->stopPlaying();
watchVariable=1;
Hi All,
I am working on ffplay. Right now I'd like to modify the ffplay code to
use the liveMedia library. Can anyone please help me with how to
proceed? I suspect I need to change rtsp.c (which is in
ffplay/libavformat). Is that right? Please give some ideas on how to proceed.
Thanks in Advance.
--
You said you don't support H.264 + MPEG-TS, but you do support H.264.
"MPEG2TransportStream" is included in many class names, so can I not
stream in MPEG-TS? Maybe I can do this, just not with H.264.

That's right - we support multiplexing MPEG-1 or 2 video (along with
MPEG-1 or 2 audio) into a Transport Stream.
Ross Finlayson wrote:
What can I do to stream H.264 in MPEG-TS format with LiveMedia?
We don't currently support this. Sorry.
Instead, you should stream the H.264 video directly in RTP (using the
"H264VideoStreamFramer" and "H264VideoRTPSink" classes, as described
earlier)
--
Ross Finlayson
Live Networks, Inc.
Guillaume Grimaldi wrote:
Ross Finlayson wrote:
My camera generates raw video, but I plan to use H.264.
OK, so you first need a (hardware or software) H.264 video encoder.
Then, you will need to write your own subclass of
"H264VideoStreamFramer", and feed your H.264 NAL units into this class.