On Thu, Dec 22, 2011 at 6:12 AM, Jeff Shanab wrote:
> Above the Live555 libs I have my own frame class. It is a simple RAII
> data class with payload, a bit of byte alignment, and some metadata like
> size and type. I use a reference-counted pointer to this. This allows my
> multiple subscribers
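The design Jeff describes - an RAII payload buffer shared among multiple subscribers through a reference-counted pointer - can be sketched in modern C++ with std::shared_ptr. Everything below (the class and member names, the 16-byte alignment choice) is illustrative, not taken from his actual code:

```cpp
#include <cstdint>
#include <cstdlib>
#include <cstring>
#include <memory>

// Hypothetical sketch of the frame class described above: an RAII payload
// buffer with alignment and metadata, shared among subscribers via a
// reference-counted pointer.
class Frame {
public:
    enum class Type { Video, Audio };

    Frame(const uint8_t* data, size_t size, Type type)
        : fSize(size), fType(type),
          // Over-allocate so the payload can be rounded up to 16-byte alignment.
          fRaw(static_cast<uint8_t*>(std::malloc(size + kAlign))) {
        fPayload = reinterpret_cast<uint8_t*>(
            (reinterpret_cast<uintptr_t>(fRaw) + (kAlign - 1)) & ~uintptr_t(kAlign - 1));
        std::memcpy(fPayload, data, size);
    }
    ~Frame() { std::free(fRaw); }          // RAII: buffer freed with the object
    Frame(const Frame&) = delete;          // share via FramePtr, never copy
    Frame& operator=(const Frame&) = delete;

    const uint8_t* payload() const { return fPayload; }
    size_t size() const { return fSize; }
    Type type() const { return fType; }

private:
    static constexpr uintptr_t kAlign = 16;
    size_t fSize;
    Type fType;
    uint8_t* fRaw;      // owning pointer from malloc
    uint8_t* fPayload;  // aligned view into fRaw
};

// All subscribers hold a FramePtr; the frame is freed when the last one drops it.
using FramePtr = std::shared_ptr<Frame>;
```

Each subscriber copies only the small shared_ptr, not the payload, which is the point of the design when memcpy is expensive.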
Re: [Live-devel] Live555 EventLoop crash
> 2. One of the biggest performance hits in my profiling is memcpy (I use an
> embedded platform, so memcpy gets pricy fast), much of it due to copying
> media buffers. Would you ever consider adding (or consider accepting ;) code
> that allows live555 to work in the calling library's buffers in
for me.
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of David J Myers
Sent: Wednesday, December 21, 2011 9:22 AM
To: live-de...@ns.live555.com
Subject: Re: [Live-devel] Live555 EventLoop Crash
Hi Ross,
On Dec 21, 2011, at 6:49 AM, David J Myers wrote:
On further examination of my encoded frame data, it looks like an I-frame
consists of 3 NAL units, each preceded by 00 00 00 01, the first NALU is 17
bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of
the frame size. Each P-frame is just one NALU.
My code is now using H26
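The layout described above - each NAL unit preceded by a 00 00 00 01 start code, with the 17- and 9-byte units most likely being SPS and PPS - can be split into discrete NAL units with a simple scan. This is a sketch, not live555 code, and it relies on the stream's emulation-prevention bytes to guarantee that 00 00 00 01 never occurs inside a NAL unit:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// One NAL unit within an Annex-B buffer, start code already stripped.
struct Nalu {
    const uint8_t* data;  // first byte is the NAL header
    size_t size;
    uint8_t type() const { return data[0] & 0x1F; }  // nal_unit_type: low 5 bits
};

// Split an Annex-B H.264 buffer (units prefixed by 00 00 00 01) into the
// individual NAL units, as a discrete framer expects them.
static std::vector<Nalu> splitAnnexB(const uint8_t* buf, size_t len) {
    std::vector<Nalu> out;
    size_t i = 0;
    while (i + 4 <= len) {
        if (buf[i] == 0 && buf[i+1] == 0 && buf[i+2] == 0 && buf[i+3] == 1) {
            size_t begin = i + 4;
            // Scan forward to the next start code (or end of buffer).
            size_t j = begin;
            while (j + 4 <= len &&
                   !(buf[j] == 0 && buf[j+1] == 0 && buf[j+2] == 0 && buf[j+3] == 1))
                ++j;
            size_t end = (j + 4 <= len) ? j : len;
            out.push_back(Nalu{buf + begin, end - begin});
            i = end;
        } else {
            ++i;
        }
    }
    return out;
}
```

A discrete framer such as "H264VideoStreamDiscreteFramer" expects exactly these units, one at a time, with the start codes already removed.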
> There can’t be much wrong with my encoded data because it displays fine in
> VLC (apart from some frames being truncated) when I use H264VideoStreamFramer
> instead of H264VideoStreamDiscreteFramer.
Well, a "nal_unit_type" of 0 is definitely wrong. But if only a few of the
encoded NAL units
> So I’ve tried using H264VideoStreamDiscreteFramer and removing the first 4
> bytes (which is always 00 00 00 01) from the encoded frame data, but this
> fails with the output:-
> Warning: Invalid 'nal_unit_type': 0.
OK, that means that your encoder is producing invalid H.264 output.
> I’m gu
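For context on that warning: nal_unit_type is the low 5 bits of the first byte after the start code, so a reported type of 0 means the first byte examined was 0x00 - i.e., the fixed 4-byte strip did not land on the real NAL header (some encoders emit extra leading zeros, or use a 3-byte prefix). A hypothetical helper that strips whatever prefix is actually present:

```cpp
#include <cstddef>
#include <cstdint>

// Skip an Annex-B prefix of any length (leading zeros plus the 0x01) and
// report the nal_unit_type of the unit that follows. Returns -1 if the data
// does not begin with a start code. Illustrative only, not live555 code.
static int naluTypeAfterPrefix(const uint8_t* p, size_t len, size_t* headerOff) {
    size_t i = 0;
    while (i < len && p[i] == 0x00) ++i;   // consume leading zero bytes
    if (i == 0 || i >= len || p[i] != 0x01) return -1;  // no start code
    ++i;                                    // skip the 0x01
    if (i >= len) return -1;
    *headerOff = i;
    return p[i] & 0x1F;                     // nal_unit_type
}
```

With a prefix of, say, five zero bytes, stripping a fixed 4 bytes leaves a 0x00 in front, which is exactly the "type 0" symptom.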
Hi Ross,
>That's your problem. Because your "StreamSource" object is delivering
discrete NAL units (in this case, discrete frames, where each frame is a
single NAL unit) - i.e., delivering one NAL unit at a time - you should
be using "H264VideoStreamDiscreteFramer".
>Just make sure that you
> Yes, I am trying to stream H.264 compressed live images from my camera sensor,
> which are 2144x1944 pixels, so just under 5 megapixels.
I'm not sure what that translates into in terms of "bytes", but (as noted
several times already on this mailing list) you really shouldn't be streaming a
very
Hi Ross,
You wrote:
>But in your first message, you talked about "5MP" images. Did you mean "5
MByte images" or "5 Megapixel images"? In any case, what codec is this?
I.e., what kind of data are you trying to stream?
Yes, I am trying to stream H.264 compressed live images from my camera
sensor
> >No, the problem has nothing to do with 'speed'. There's a bug in your code
> >somewhere. Sorry.
>
> Would you please take another look at this?
Well, I'm not sure what the "this" is that I can take a look at, because the
problem is with your custom code, which I know nothing about.
But i
Hi Ross,
>> I've tried increasing OutPacketBuffer::maxSize from 100 to 500 (5
million) but this has no effect. Am I just trying to stream too much data,
too quickly?
>No, the problem has nothing to do with 'speed'. There's a bug in your code
somewhere. Sorry.
Would you please tak
> No, I don't see this error message; however, I am seeing continuous
> truncations - almost every frame is truncated. My own debug output looks like
> this, at a rate of around 4 frames per second:-
> deliverFrame(): newFrameSize:216054, fNumTruncatedBytes:66055
> deliverFrame(): newFrameSize:10899
Hi Ross,
>No, I meant the 'sink' object that those feed into. Is this an "RTPSink"
(subclass)? If so, then you should be seeing
>an error message like
> "MultiFramedRTPSink::afterGettingFrame1(): The input frame data was
too large for our buffer size"
>This error message will also tell you
On Wed, Dec 14, 2011 at 1:27 PM, Ross Finlayson wrote:
> No, you must send only one frame at a time, because downstream objects
> expect 'frames', not 'portions of frames'. "fMaxSize" is the size of the
> buffer that the downstream object specified when it called "getNextFrame()"
> on your input
Hi Ross,
>(Unfortunately, you didn't say what your downstream object is, so until you
do, I can't really tell you how to increase the buffer size.)
Ok, In H.264 mode, I'm directly using H264VideoStreamFramer, and in MPEG4
mode, I'm using MPEG4VideoStreamDiscreteFramer. I hope this is what you mea
>That error message indicates that your input source object did not set
"fFrameSize" properly. In particular, it set it to a value greater than
"fMaxSize" - bad!
>A data source object *must* check "fMaxSize" before delivering data and
setting "fFrameSize".
Ok, this is surely what I'm doing
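The check Ross describes is the standard FramedSource delivery contract, essentially the logic in the "DeviceSource.cpp" model that ships with live555. Modeled standalone below: the fTo/fMaxSize/fFrameSize/fNumTruncatedBytes names follow live555's conventions, but the deliver() wrapper itself is invented for this sketch:

```cpp
#include <cstddef>
#include <cstring>

// What the source must report back after each delivery.
struct Delivery {
    size_t fFrameSize;          // bytes actually copied into fTo
    size_t fNumTruncatedBytes;  // bytes that did not fit
};

// fTo/fMaxSize: the downstream object's buffer and its capacity.
// frame/newFrameSize: the encoded frame the source wants to deliver.
static Delivery deliver(unsigned char* fTo, size_t fMaxSize,
                        const unsigned char* frame, size_t newFrameSize) {
    Delivery d;
    if (newFrameSize > fMaxSize) {
        // Never copy more than fMaxSize; report the overflow instead.
        d.fFrameSize = fMaxSize;
        d.fNumTruncatedBytes = newFrameSize - fMaxSize;
    } else {
        d.fFrameSize = newFrameSize;
        d.fNumTruncatedBytes = 0;
    }
    std::memcpy(fTo, frame, d.fFrameSize);
    return d;
}
```

Applied to the deliverFrame() log earlier in the thread: newFrameSize:216054 with fNumTruncatedBytes:66055 implies the downstream buffer held 216054 - 66055 = 149999 bytes, so the buffer size is what needs raising.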
> Now, after streaming a number of frames to the client, I get the following
> warning (the actual byte counts vary)
> StreamParser::afterGettingBytes() warning: read 9828 bytes; expected no more
> than 4142
That error message indicates that your input source object did not set
"fFrameSize" pro
Hi Ross,
I've modified my live source server application in the following way:-
Where I was using Linux pipes to get data from the main thread to the
Live555 event-loop thread, I now cycle through a series of shared buffers. Linux
pipes were too small (only 64 KB) for my big 5MP images, and they also ha
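The hand-off described above can be sketched as a thread-safe queue of reference-counted buffers. This is a self-contained illustration, not the poster's code: in live555 itself the event-loop side would normally be woken with an event trigger rather than the condition variable used here, and all names are invented.

```cpp
#include <condition_variable>
#include <cstdint>
#include <deque>
#include <memory>
#include <mutex>
#include <vector>

// A frame buffer shared by reference count: the capture thread fills one,
// the event-loop thread drains it, and it is freed when both let go.
using Buffer = std::shared_ptr<std::vector<uint8_t>>;

class FrameQueue {
public:
    void push(Buffer b) {
        std::lock_guard<std::mutex> lk(fMutex);
        fQueue.push_back(std::move(b));
        fCond.notify_one();
    }
    Buffer pop() {  // blocks until a frame is available
        std::unique_lock<std::mutex> lk(fMutex);
        fCond.wait(lk, [this] { return !fQueue.empty(); });
        Buffer b = std::move(fQueue.front());
        fQueue.pop_front();
        return b;
    }
private:
    std::mutex fMutex;
    std::condition_variable fCond;
    std::deque<Buffer> fQueue;
};
```

Unlike a default Linux pipe, whose capacity is 64 KB, such a queue is bounded only by memory, so a multi-hundred-kilobyte compressed 5MP frame passes through whole.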