>In your "OnDemandServerMediaSubsession" subclass constructor, are you
setting the "reuseFirstSource" parameter (in the parent class constructor)
to True? This is important if - as in your case - you're streaming from a
live input source. It prevents a new input source object from being created
e…
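
For reference, the flag goes straight into the parent-class constructor call. A
minimal sketch (the class name "LiveSubsession" and everything around it are
illustrative, not from this thread):

    #include "OnDemandServerMediaSubsession.hh"

    class LiveSubsession: public OnDemandServerMediaSubsession {
    public:
      LiveSubsession(UsageEnvironment& env)
        // True => all clients share the single live input source, instead
        // of a new source object being created per client:
        : OnDemandServerMediaSubsession(env, True/*reuseFirstSource*/) {}

    protected:
      // Required pure-virtual overrides (bodies omitted in this sketch):
      virtual FramedSource* createNewStreamSource(unsigned clientSessionId,
                                                  unsigned& estBitrate);
      virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                        unsigned char rtpPayloadTypeIfDynamic,
                                        FramedSource* inputSource);
    };
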
Hi, I don't think I'm handling multiple client connections properly in my
camera's live RTSP server.
My camera has multiple streams and I derive a class from FramedSource for
each of my substreams, the constructor looks like this:-
StreamSource::StreamSource(UsageEnvironment &env, streamSubStream_…
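
The constructor signature above is cut off; for orientation only, the general
shape of such a class (the sub-stream parameter is a guess at intent, not the
poster's actual type):

    #include "FramedSource.hh"

    class StreamSource: public FramedSource {
    public:
      StreamSource(UsageEnvironment& env, unsigned subStreamIndex)
        : FramedSource(env), fSubStreamIndex(subStreamIndex) {}

    private:
      // Called by the downstream framer/sink each time it wants the next
      // NAL unit; must copy into fTo and then call afterGetting():
      virtual void doGetNextFrame();

      unsigned fSubStreamIndex; // which of the camera's sub-streams we serve
    };
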
Hi Ross,
>> Where can I put some debug to catch the RTCP reports coming in?
>Add
>#define DEBUG 1
>to the start of "liveMedia/RTCP.cpp". You will see reports of RTCP "SR"
packets being sent by the server, and - if your client is working correctly
(e.g., "openRTSP") - RTCP "RR" packets arriving…
Hi Ross,
> Huh? If your RTSP server is "based on testOnDemandRTSPServer", then it
most certainly *is* 'starting' both RTP and RTCP. That's what a unicast
RTSP server does. You seem very confused here.
>But the issue here is your *client*. It is apparently not sending back
periodic RTCP "RR" packets…
>The problem is your client. It is apparently not sending any periodic RTCP
"RR" packets - which it is supposed to do as per the RTP/RTCP standard.
But my RTSP server (based on testOnDemandRTSPServer) doesn't start RTCP (or
RTP). So, if the client doesn't send GET_PARAMETER messages as VLC does, I…
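
For context: a server built this way never "starts" RTP or RTCP explicitly;
the library creates an RTPSink and an RTCPInstance per client session
internally. A skeleton along the lines of testOnDemandRTSPServer (stream
name, port, and the LiveSubsession class from the sketch above are all
placeholders):

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    int main() {
      TaskScheduler* scheduler = BasicTaskScheduler::createNew();
      UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

      RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554); // NULL check omitted
      ServerMediaSession* sms =
        ServerMediaSession::createNew(*env, "cam", "cam", "live camera stream");
      sms->addSubsession(new LiveSubsession(*env));
      rtspServer->addServerMediaSession(sms);

      // RTP and RTCP instances come into being inside the library when a
      // client does SETUP/PLAY; there is nothing extra to "start" here.
      env->taskScheduler().doEventLoop();
      return 0; // never reached
    }
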
Hi,
I have a strange problem with my live embedded RTSP H.264 server when
connected to a certain 3rd party client software package.
Every 60 seconds my FramedSource-derived class (StreamSource) is getting
deleted. If I connect via VLC or openRTSP, this does not happen, but I'm
fairly sure it is n…
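
A 60-second cadence is suspiciously close to the server's default
client-liveness timeout: RTSPServer's "reclamationTestSeconds" parameter
defaults to 65, and a client that sends neither RTCP "RR"s nor periodic RTSP
commands gets its session - and with it the input source - torn down on that
schedule. One way to test that diagnosis (a workaround, not a fix for the
client):

    // Passing 0 disables the liveness check entirely; any other value sets
    // how long a silent client is tolerated before reclamation.
    RTSPServer* rtspServer =
      RTSPServer::createNew(*env, 8554, NULL /*no auth*/, 0 /*reclamationTestSeconds*/);
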
You wrote:
>I suggest using "openRTSP" as your client (give it the "-d <duration>"
flag, to record a specific length of time).
>You should end up with a file named something like "VIDEO-H264-1".
>Rename this file to "test.h264". See whether or not you can play it using
VLC.
>If you can't, then email…
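
With the argument restored, a typical run against this kind of camera looks
like this (address, port, and stream name are placeholders):

    openRTSP -d 30 rtsp://192.168.1.100:8554/cam

This records the stream for 30 seconds and then quits, leaving one file per
received track in the current directory.
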
>> I-Frames from my encoder look like this
>> 00 00 00 01 27 42 00 32 8b 68 02 18 0f 33 02 48 04
>> 00 00 00 01 28 ce 05 0a c8
>> 00 00 00 01 25 b8
>> So, in the above example I
>> send:- The first part, the SPS, 13 bytes 27 42 00 32 8b 68 02 18 0f 33
>> 02 48 04 T…
Hi,
I'm having problems getting a valid stream out of my live video server now I
have switched to using the H264VideoStreamDiscreteFramer. I was using
H264VideoStreamFramer but I couldn't avoid frame truncation problems.
I-Frames from my encoder look like this
00 00 00 01 27 42 00 32 8b 68 0…
Hi Ross,
>> On further examination of my encoded frame data, it looks like an I-frame
consists of 3 NAL units, each preceded by 00 00 00 01, the first NALU is 17
bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of
the frame size. Each P-frame is just one NALU.
>> My code…
On further examination of my encoded frame data, it looks like an I-frame
consists of 3 NAL units, each preceded by 00 00 00 01, the first NALU is 17
bytes long, the second NALU is 9 bytes long, and the 3rd NALU is the rest of
the frame size. Each P-frame is just one NALU.
My code is now using H264VideoStreamDiscreteFramer…
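
The three NAL unit types can be read straight off the first byte after each
start code (nal_unit_type is its low five bits). A quick worked check against
the bytes quoted above:

    #include <cstdio>

    int main() {
      // First byte of each NAL unit in the I-frame dump above:
      unsigned char firstBytes[] = { 0x27, 0x28, 0x25 };
      for (unsigned char b : firstBytes) {
        std::printf("0x%02x -> nal_unit_type %u\n", b, b & 0x1Fu);
      }
      // Prints 7 (SPS), 8 (PPS) and 5 (IDR slice): parameter sets plus the
      // coded picture, matching the 17-byte/9-byte/rest split described.
      return 0;
    }
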
>> So I've tried using H264VideoStreamDiscreteFramer and removing the first
4 bytes (which is always 00 00 00 01) from the encoded frame data, but this
fails with the output:-
>> Warning: Invalid 'nal_unit_type': 0.
>OK, that means that your encoder is producing invalid H.264 output.
>> I'm g…
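
Stripping only the first four bytes leaves the later 00 00 00 01 sequences
embedded in what the framer then treats as one NAL unit, which is exactly how
nonsense types such as 0 appear. A sketch of the alternative - splitting on
every start code and handing each unit over separately (this assumes 4-byte
start codes throughout, as in the dumps above):

    #include <cstddef>
    #include <utility>
    #include <vector>

    static bool isStartCode(const unsigned char* p, std::size_t remaining) {
      return remaining >= 4 && p[0] == 0 && p[1] == 0 && p[2] == 0 && p[3] == 1;
    }

    // Returns one (pointer, length) pair per NAL unit, start codes stripped;
    // each entry is what a discrete framer expects per delivery.
    std::vector<std::pair<const unsigned char*, std::size_t> >
    splitNALUnits(const unsigned char* buf, std::size_t len) {
      std::vector<std::pair<const unsigned char*, std::size_t> > units;
      std::size_t i = 0;
      while (i + 4 <= len) {
        if (!isStartCode(buf + i, len - i)) { ++i; continue; }
        std::size_t begin = i + 4; // first payload byte of this NAL unit
        std::size_t end = begin;
        while (end + 4 <= len && !isStartCode(buf + end, len - end)) ++end;
        if (end + 4 > len) end = len; // final NAL unit runs to end of frame
        units.push_back(std::make_pair(buf + begin, end - begin));
        i = end;
      }
      return units;
    }
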
Hi Ross,
>That's your problem. Because your "StreamSource" object is delivering
discrete NAL units (in this case, discrete frames, where each frame is a
single NAL unit) - i.e., delivering one NAL unit at a time - then you should
be using "H264VideoStreamDiscreteFramer".
>Just make sure that you…
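
That wiring normally lives in the subsession's createNewStreamSource(); a
sketch (StreamSource is the poster's class, the names around it are
illustrative):

    FramedSource* LiveSubsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                        unsigned& estBitrate) {
      estBitrate = 500; // kbps; rough figure, used for RTCP bandwidth sizing
      StreamSource* src = new StreamSource(envir(), 0 /*sub-stream index*/);
      // The discrete framer expects exactly one start-code-free NAL unit per
      // doGetNextFrame() delivery from its upstream source:
      return H264VideoStreamDiscreteFramer::createNew(envir(), src);
    }
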
Hi Ross
You wrote:
>But in your first message, you talked about "5MP" images. Did you mean "5
MByte images" or "5 Megapixel images"? In any case, what codec is this?
I.e., what kind of data are you trying to stream?
Yes, I am trying to stream H.264 compressed live images from my camera
sensor.
Hi Ross,
>> I've tried increasing OutPacketBuffer::maxSize from 100,000 to 5,000,000
(5 million) but this has no effect. Am I just trying to stream too much
data, too quickly?
>No, the problem has nothing to do with 'speed'. There's a bug in your code
somewhere. Sorry.
Would you please take…
Hi Ross,
>No, I meant the 'sink' object that those feed into. Is this a "RTPSink"
(subclass)? If so, then you should be seeing
>an error message like
> "MultiFramedRTPSink::afterGettingFrame1(): The input frame data was
too large for our buffer size"
>This error message will also tell you…
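
The usual remedy for that error is the static OutPacketBuffer::maxSize, set
once before any RTPSink exists, because every sink sizes its internal buffer
from it at construction. A sketch; the 2,000,000 figure is an assumption
scaled for multi-megapixel I-frames, not a value from this thread:

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    int main() {
      OutPacketBuffer::maxSize = 2000000; // bytes; must precede RTPSink creation

      TaskScheduler* scheduler = BasicTaskScheduler::createNew();
      UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
      // ... create the RTSP server and sessions as in the skeleton above ...
      env->taskScheduler().doEventLoop();
      return 0;
    }
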
Hi Ross,
>(Unfortunately, you didn't say what your downstream object is, so until you
do, I can't really tell you how to increase the buffer size.)
OK, in H.264 mode, I'm directly using H264VideoStreamFramer, and in MPEG4
mode, I'm using MPEG4VideoStreamDiscreteFramer. I hope this is what you meant…
>That error message indicates that your input source object did not set
"fFrameSize" properly. In particular, it set it to a value greater than
"fMaxSize" - bad!
>A data source object *must* check "fMaxSize" before delivering data and
setting "fFrameSize".
OK, this is surely what I'm doing…
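
The check Ross describes matches the delivery pattern in live555's
DeviceSource template. A sketch - fPendingData and fPendingSize are
hypothetical members holding the next NAL unit, everything else is the
standard FramedSource machinery:

    #include <cstring>    // memmove
    #include <sys/time.h> // gettimeofday

    void StreamSource::doGetNextFrame() {
      unsigned newFrameSize = fPendingSize;

      if (newFrameSize > fMaxSize) {
        // Never write more than fMaxSize bytes into fTo; report the excess
        // via fNumTruncatedBytes rather than overrunning the sink's buffer.
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = newFrameSize - fMaxSize;
      } else {
        fFrameSize = newFrameSize;
        fNumTruncatedBytes = 0;
      }

      gettimeofday(&fPresentationTime, NULL);
      memmove(fTo, fPendingData, fFrameSize);
      FramedSource::afterGetting(this); // schedule delivery; never block here
    }
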
Hi Ross,
I've modified my live-source server application in the following way:-
Where I was using Linux pipes to get data from the main thread to the
Live555 event thread, I now cycle through a series of shared buffers. Linux
pipes were too small (only 64 KB) for my big 5MP images, and they also had…
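
Whatever the buffering scheme, the hand-off into the event loop is normally
done with live555's event triggers, the one scheduler facility documented as
safe to use from another thread. A sketch; deliverFrame() and the two helper
functions are illustrative names:

    #include "UsageEnvironment.hh"

    void deliverFrame(void* clientData); // hypothetical: copies the shared
                                         // buffer into fTo, calls afterGetting()

    // Run by the event loop each time the trigger fires:
    static void onFrameReady(void* clientData) {
      deliverFrame(clientData);
    }

    // Called once, from the event-loop thread, during setup:
    EventTriggerId makeFrameTrigger(UsageEnvironment& env) {
      return env.taskScheduler().createEventTrigger(onFrameReady);
    }

    // Called from the encoder thread after it publishes a NAL unit:
    void notifyFrameReady(TaskScheduler& scheduler, EventTriggerId id, void* src) {
      scheduler.triggerEvent(id, src); // safe from outside the event loop
    }
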
Hi Ross,
>> FYI, I've now installed a new version (2011.12.02) of the "LIVE555
Streaming Media" code that avoids this problem (for H.264 parsing). You
should no longer need to modify the "BANK_SIZE" constant.
I guess I should download the patch and look at the new code but could you
just e…
Hi Ross,
>> Surely the answer is to test the frame-size from the encoder before
>> passing it into Live555
>Unfortunately not, because we don't know the frame size in advance; we can
figure it out only by parsing/copying the data, looking for the next 'start
code'. And it's this that's ove…
Hi Ross,
Further to your answer on this, we have an image size of 2144x1944 pixels
and have already increased the BANK_SIZE beyond 300,000 to 450,000. However
there is no guarantee that any number is sufficient. Surely the answer is to
test the frame-size from the encoder before passing it into Live555…
Hi Ross,
Thanks again for your sterling advice.
I finally got some video out to a VLC client. You were right, I had to
throttle back the encoder to 500kb/s. This is not enough bandwidth for a
decent image (640x480 @ 12fps) though. The funny thing is, once the stream
is going, I can dynamically b…
Hi,
I am trying to use Live555 to implement an RTSP server in our IP camera.
Our H.264 encoder produces NAL units preceded by a 00 00 00 01 start code.
The encoding main thread then writes the NAL units to a Linux pipe where
they can be read by the RTSP Server thread for streaming out.
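
One conventional way to service such a pipe from the Live555 thread without
blocking is to register its read end with the task scheduler. Only the two
scheduler calls below are library API; the handler body and names are sketch
assumptions:

    #include "UsageEnvironment.hh"

    // Invoked by the event loop whenever the pipe has data to read:
    static void pipeReadable(void* clientData, int /*mask*/) {
      // read() the next NAL unit (non-blocking) and feed it to StreamSource...
    }

    void watchPipe(UsageEnvironment& env, int pipeReadFd) {
      env.taskScheduler().turnOnBackgroundReadHandling(pipeReadFd,
                                                       pipeReadable, NULL);
    }

    // At teardown: env.taskScheduler().turnOffBackgroundReadHandling(pipeReadFd);
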
Hi,
I'm not sure if this sort of request is allowed on the list but I'm looking
for someone to help us modify and debug a Live555 embedded implementation.
Specifically we have RTSP working/streaming with MPEG4 but when we switch to
H.264 compression, it just won't hack it.
We are based in the…