I'm pleased to report that almost 99% of you think that the "LIVE555 Streaming
Media" software is perfect, and that no new features should be added. How do I
know this? Because, so far, only 20 of the 1881 people on this mailing list
have responded to the new survey:
How would you impr
I decode H.264 from a network camera.
My decoder wants the SPS/PPS delivered in-band once, before the first I-frame.
So I wait until I get the first I-frame and then insert
0x000001/SPS/0x000001/PPS before the I-frame.
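In case it helps anyone, a minimal sketch of that insertion, assuming the SPS/PPS bytes were obtained out-of-band (e.g. from the SDP) and using the 4-byte form of the Annex B start code; all names below are illustrative:

  #include <vector>

  // Build [start_code][SPS][start_code][PPS][start_code][IDR] once, for the first I-frame only.
  std::vector<unsigned char> prependParameterSets(unsigned char const* sps, unsigned spsLen,
                                                  unsigned char const* pps, unsigned ppsLen,
                                                  unsigned char const* idr, unsigned idrLen) {
    static unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
    std::vector<unsigned char> out;
    out.reserve(spsLen + ppsLen + idrLen + 3 * sizeof startCode);
    out.insert(out.end(), startCode, startCode + sizeof startCode);
    out.insert(out.end(), sps, sps + spsLen);
    out.insert(out.end(), startCode, startCode + sizeof startCode);
    out.insert(out.end(), pps, pps + ppsLen);
    out.insert(out.end(), startCode, startCode + sizeof startCode);
    out.insert(out.end(), idr, idr + idrLen);
    return out;
  }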
Interestingly, one of the cameras I test with sends SPS/PPS inside the RTP
stream.
One more thing. Because your input device (H.264 encoder) is (I presume)
delivering discrete NAL units (one at a time), rather than an unstructured byte
stream, be sure to use a "H264VideoStreamDiscreteFramer", not a
"H264VideoStreamFramer".
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
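To make the framer distinction above concrete, a minimal sketch (the helper and the source argument are placeholders; the two framer classes are the liveMedia ones):

  #include "liveMedia.hh"

  // Choose the framer to match what the input source delivers.
  FramedSource* makeH264Framer(UsageEnvironment& env, FramedSource* nalUnitSource) {
    // One complete NAL unit per delivery -> use the *discrete* framer:
    return H264VideoStreamDiscreteFramer::createNew(env, nalUnitSource);
    // An unstructured Annex B byte stream (e.g. a file) would instead use:
    //   H264VideoStreamFramer::createNew(env, byteStreamSource);
  }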
After you parse the NAL records from parseSPropParameterSets, store them.
For each sample that comes through, construct the buffer you send to your
upstream decoder like this:
Where start_code is 4 bytes 0x00, 0x00, 0x00, 0x01
[start_code][NAL_record_1][start_code][NAL_record_2][start_code][sample]
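A hedged sketch of that recipe, assuming the records came from parseSPropParameterSets() (declared in the library's H264VideoRTPSource.hh) and were cached after the first parse; the helper name is made up:

  #include "liveMedia.hh"
  #include <vector>

  // Parse once and keep the result, e.g.:
  //   unsigned numRecords = 0;
  //   SPropRecord* records =
  //       parseSPropParameterSets(subsession->fmtp_spropparametersets(), numRecords);
  // Then, for each sample, build [start_code][NAL_record_1]...[start_code][sample]:
  std::vector<unsigned char> buildDecoderBuffer(SPropRecord const* records, unsigned numRecords,
                                                unsigned char const* sample, unsigned sampleSize) {
    static unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
    std::vector<unsigned char> buf;
    for (unsigned i = 0; i < numRecords; ++i) {
      buf.insert(buf.end(), startCode, startCode + 4);
      buf.insert(buf.end(), records[i].sPropBytes,
                 records[i].sPropBytes + records[i].sPropLength);
    }
    buf.insert(buf.end(), startCode, startCode + 4);
    buf.insert(buf.end(), sample, sample + sampleSize);
    return buf;
  }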
>> Using the library, what is the most efficient way to get access to the H.264
>> portion of the stream
>
> >>Because this is a Frequently Asked Question, I have now added an entry for
> >>it to the FAQ. See:
> >>http://www.live555.com/liveMedia/faq.html#testRTSPClient-how-to-decode-data
>
> As a side note: I'm having trouble understanding what happens when
> DeviceSource triggers its event trigger.
>
> The trigger calls deliverFrame(). The first line of deliverFrame() is:
>
> if (!isCurrentlyAwaitingData()) return;
>
> and unless I'm mistaken, this is always going to return un
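For what it's worth, in the usual DeviceSource pattern isCurrentlyAwaitingData() only becomes true once a downstream object has called doGetNextFrame(), i.e. once the sink chain has actually started playing, so a trigger that fires before then does return immediately. A rough sketch of that pattern, with illustrative names (cf. liveMedia's DeviceSource example code):

  #include "FramedSource.hh"

  class MyDeviceSource : public FramedSource {
  public:
    static MyDeviceSource* createNew(UsageEnvironment& env) { return new MyDeviceSource(env); }

    // An encoder thread signals new data with
    //   envir().taskScheduler().triggerEvent(fEventTriggerId, this);
    // the event loop then calls deliverFrame0() -> deliverFrame().
    static void deliverFrame0(void* clientData) { ((MyDeviceSource*)clientData)->deliverFrame(); }

  protected:
    MyDeviceSource(UsageEnvironment& env) : FramedSource(env) {
      fEventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }

  private:
    virtual void doGetNextFrame() {
      // From here until afterGetting() runs, isCurrentlyAwaitingData() returns True.
      deliverFrame();
    }

    void deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // nothing downstream has asked for a frame yet
      // ... copy the pending NAL unit into fTo (at most fMaxSize bytes), set fFrameSize ...
      FramedSource::afterGetting(this); // completes the delivery; the flag is reset
    }

    EventTriggerId fEventTriggerId;
  };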
From: Ross Finlayson
Sent: Friday, April 05, 2013 7:10 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] testRTSPClient / H.264 Network Camera Stream
I am experimenting with the testRTSPClient app to develop a DirectShow
source filter to connect to a network camera.
Thanks Ross,
I have made some changes to my architecture and it is working ok now,
although I spent quite a while trying to figure out why the RTPSink
wasn't starting, which turned out to be because I forgot to wrap my
source in a Framer when I made the changes; should I have expected to
see e
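Just to spell out the step that was missing, a minimal sketch (the device source and the callback arguments are placeholders): the sink only starts pulling frames once startPlaying() is called on a framer that wraps the source.

  #include "liveMedia.hh"

  // Wrap the device source in a framer *before* starting the RTP sink.
  void startVideoStream(UsageEnvironment& env, FramedSource* deviceSource,
                        H264VideoRTPSink* videoSink,
                        void (*afterPlaying)(void*), void* afterClientData) {
    FramedSource* framer = H264VideoStreamDiscreteFramer::createNew(env, deviceSource);
    videoSink->startPlaying(*framer, afterPlaying, afterClientData);
  }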
> When I create a new ByteStreamFileSource/H264VideoStreamFramer inside the
> 'createNewStreamSource' method ala H264FileServerMediaSubsession it works as
> expected.
You should continue to do this.
> From this I take it that each call to 'createNewStreamSource' should return a
> pointer to a
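For reference, a sketch of what the body of such a 'createNewStreamSource' override typically looks like, following H264FileServerMediaSubsession (written here as a free function; the file name argument and the bitrate estimate are illustrative):

  #include "liveMedia.hh"

  // A fresh ByteStreamFileSource + framer chain is created on each call.
  FramedSource* createH264FileSource(UsageEnvironment& env, char const* fileName,
                                     unsigned& estBitrate) {
    estBitrate = 500; // kbps, rough estimate
    ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(env, fileName);
    if (fileSource == NULL) return NULL;
    return H264VideoStreamFramer::createNew(env, fileSource);
  }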
I am implementing an RTSP server to stream live H264 video via either
multicast or unicast RTP.
My multicast solution is working fine but the unicast stream is giving
me trouble.
I've read the FAQ and based on this I created a test program like so:
1) Created an OnDemandServerMediaSubsession (see the sketch below)
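For the unicast leg, a hedged sketch of such a subsession subclass; the class name and createLiveNALSource() are placeholders, while reuseFirstSource=True, the discrete framer, and H264VideoRTPSink are the points that usually matter for a live source:

  #include "liveMedia.hh"

  FramedSource* createLiveNALSource(UsageEnvironment& env); // your device source, defined elsewhere

  class MyLiveSubsession : public OnDemandServerMediaSubsession {
  public:
    static MyLiveSubsession* createNew(UsageEnvironment& env) { return new MyLiveSubsession(env); }

  protected:
    MyLiveSubsession(UsageEnvironment& env)
      : OnDemandServerMediaSubsession(env, True /*reuseFirstSource: one encoder feeds all clients*/) {}

    virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
      estBitrate = 1000; // kbps, rough estimate
      return H264VideoStreamDiscreteFramer::createNew(envir(), createLiveNALSource(envir()));
    }

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* /*inputSource*/) {
      return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }
  };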