When you mention ffmpeg, do you mean the application, or the underlying library (libav)? I had the same problem a few years ago, and I found it can be handled nicely with a reduced amount of code around avcodec_decode_video. I suggest you have a look at AVCodecParserContext for this matter. If yo
> We have implemented a sender and a receiver for MPEG2TS using Livemedia.
>
Do 'we' not have our own domain name? :-)
> I found some old posts from 2004 talking about Receiver reports.
>
You realize, I hope, that 2004 was 8 years ago :-) We've had full support for
RTCP "RR" and "SR" packets f
Hi Ross
We have implemented a sender and a receiver for MPEG2TS using Livemedia.
I found some old posts from 2004 talking about Receiver reports.
"Note, however, that data from RTCP "RR" (Receiver Report) packets (i.e.,
coming from receivers back to the sender) are currently not processed at
> I'm having a problem when trying to trick play from mkv files produced by
> ffmpeg in that I cannot seek in them. I don't seem to be able to play back
> webm files at all (at least, VLC doesn't like them).
OK, so your first task should be to find out (perhaps using a VLC mailing list)
why VL
I just implemented a custom H264 source and decoder with FFMPEG. Here's how I
did it (in concept):
1.) Get the subsession->fmtp_configuration() string
2.) Do something like this with H264VideoRTPSource.hh included: "auto
records = parseSPropParameters(fmtp_configuration, recordCount);"
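For anyone following along: parseSPropParameters() (from live555's H264VideoRTPSource.hh) splits the comma-separated sprop-parameter-sets value and Base64-decodes each entry into a raw NAL unit. A dependency-free sketch of what it does internally (the helper names here are my own, not live555's):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Base64-decode one sprop entry into raw NAL unit bytes.
// Hand-rolled so the example needs no libraries; real code
// can simply call live555's parseSPropParameters().
static std::vector<uint8_t> base64Decode(const std::string& in) {
    static const std::string tbl =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::vector<uint8_t> out;
    int val = 0, bits = 0;
    for (char c : in) {
        if (c == '=') break;              // padding: stop decoding
        size_t pos = tbl.find(c);
        if (pos == std::string::npos) continue;
        val = (val << 6) | static_cast<int>(pos);
        bits += 6;
        if (bits >= 8) {                  // a full byte has accumulated
            bits -= 8;
            out.push_back(static_cast<uint8_t>((val >> bits) & 0xFF));
        }
    }
    return out;
}

// Split a sprop-parameter-sets string ("<base64 SPS>,<base64 PPS>")
// on commas and decode each parameter set.
static std::vector<std::vector<uint8_t>> parseSProp(const std::string& sprop) {
    std::vector<std::vector<uint8_t>> sets;
    size_t start = 0;
    while (start <= sprop.size()) {
        size_t comma = sprop.find(',', start);
        if (comma == std::string::npos) comma = sprop.size();
        sets.push_back(base64Decode(sprop.substr(start, comma - start)));
        start = comma + 1;
    }
    return sets;
}
```

The first byte of each decoded set identifies it: (byte & 0x1F) == 7 for an SPS, 8 for a PPS.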
> I’m working on my Live555/FFmpeg video player, and I ran into an interesting
> problem that has kept me stumped for several days. I am taking the buffer
> that is delivered to my MediaSink (like the example in testRTSPClient), and I
> am passing the buffer and the size to FFmpeg to decode. It
I have been working with avcodec_decode_video2 for a bit.
I have found it needs a stream like this:
00 00 01 [7] 00 00 01 [8] 00 00 01 [5] 00 00 01 [1] 00 00 01 [1]
where [7] is the SPS, [8] is the PPS, [5] is a keyframe slice, and the
[1]'s are the difference frames.
If you are calling it in a loop it will
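The stream layout described above can be sketched in a few lines: prefix every NAL unit with the 00 00 01 Annex B start code, and read the unit's type (7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice) from the low 5 bits of its first byte. The function names are mine, for illustration only:

```cpp
#include <cstdint>
#include <vector>

// NAL unit type = low 5 bits of the first NALU byte.
static int nalType(const std::vector<uint8_t>& nalu) {
    return nalu.empty() ? -1 : (nalu[0] & 0x1F);
}

// Concatenate NAL units into one Annex B stream:
// 00 00 01 <nalu> 00 00 01 <nalu> ...
static std::vector<uint8_t> withStartCodes(
        const std::vector<std::vector<uint8_t>>& nalus) {
    std::vector<uint8_t> stream;
    for (const auto& n : nalus) {
        stream.insert(stream.end(), {0x00, 0x00, 0x01});  // start code
        stream.insert(stream.end(), n.begin(), n.end());
    }
    return stream;
}
```

A buffer built this way, containing SPS + PPS + keyframe followed by the difference frames, is what the poster found the decoder needs.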
Hi there:
I'm trying to use live media server to stream out video, captured live
from an HD camera, but streamed on demand starting at an arbitrary
point. My plan is to use dvgrab (in linux) to grab the firewire HDV
stream, pipe it into ffmpeg and transcode to a file that can be streamed
by
It sounds like you're not setting up FFmpeg properly. The SDP will contain the
SPS and PPS in the sprop-parameter-sets attribute. You need to decode these
(Base64 encoded) and place them in the extradata field of the AVCodecContext.
Additionally, every NALU passed to FFmpeg must be in Annex B format. That is
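A minimal sketch of the extradata blob this advice describes: start code + SPS + start code + PPS, as plain bytes. In real FFmpeg code this buffer would be copied into AVCodecContext::extradata (allocated with av_mallocz, with AV_INPUT_BUFFER_PADDING_SIZE padding) and its length stored in extradata_size; the helper name here is hypothetical:

```cpp
#include <cstdint>
#include <vector>

// Build the Annex B extradata buffer from already-decoded SPS/PPS bytes:
// 00 00 01 <SPS> 00 00 01 <PPS>. Assumes the Base64 decoding of the
// sprop-parameter-sets values has already been done.
static std::vector<uint8_t> buildExtradata(const std::vector<uint8_t>& sps,
                                           const std::vector<uint8_t>& pps) {
    std::vector<uint8_t> extra;
    for (const auto* ps : {&sps, &pps}) {
        extra.insert(extra.end(), {0x00, 0x00, 0x01});  // Annex B start code
        extra.insert(extra.end(), ps->begin(), ps->end());
    }
    return extra;
}
```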
I'm working on my Live555/FFmpeg video player, and I ran into an interesting
problem that has kept me stumped for several days. I am taking the buffer that
is delivered to my MediaSink (like the example in testRTSPClient), and I am
passing the buffer and the size to FFmpeg to decode. It says tha