>Hello everyone,
>
>I would like to ask a question about how to let the RTSP Client
>receive the ANNOUNCE message from the RTSP Server.

I stand corrected. I looked at the RTSP 1.0 specification once
again, and see that it does, in fact, allow for "ANNOUNCE" to be sent
from the server to the client.
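For reference, RFC 2326 (section 10.3) defines ANNOUNCE in both directions:
sent from server to client, it carries an updated session description, and
some servers also use it for server-side notifications such as the
end-of-stream notice mentioned below. An illustrative exchange (all header
values here are made up, and the SDP body is only a placeholder) would be:

    S->C: ANNOUNCE rtsp://example.com/stream RTSP/1.0
          CSeq: 312
          Session: 47112344
          Content-Type: application/sdp
          Content-Length: <length of the SDP body>

          <updated SDP session description>

    C->S: RTSP/1.0 200 OK
          CSeq: 312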
>Hello everyone,
>
>I would like to ask a question about how to let the RTSP Client
>receive the ANNOUNCE message from the RTSP Server.

It can't - because the RTSP "ANNOUNCE" command is sent from the
client to the server, rather than vice versa.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Hello everyone,

I would like to ask a question about how to let the RTSP Client receive
the ANNOUNCE message from the RTSP Server.
The server will send out end-of-stream information via ANNOUNCE to inform
the client that this session will close soon.

Thanks & Regards,
Techuan
Thanks Ross, that is very insightful; I realized that what I really want
to process is a Program Stream. I could use testOnDemandRTSPServer
or testMPEG1or2AudioVideoStreamer as my simulated stream source
instead of MPEG1or2VideoStreamer. However, I could not seem to find a
counterpart example.
Does the said NAL unit format indicate a NAL unit stream or a byte stream?
Or is either acceptable?
If our NAL units are in byte-stream format, should I extract the NAL units
from the byte stream?
Thank you very much.
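If the data is an Annex-B byte stream (NAL units separated by 00 00 01 or
00 00 00 01 start codes), one way to recover discrete NAL units is a simple
start-code scan along these lines. This is only an illustrative sketch, not
live555 code, and the function name is made up; note that live555's
"discrete" framers expect NAL units without a preceding start code, while
the regular byte-stream framers parse the start codes themselves.

    #include <cstdint>
    #include <cstddef>
    #include <vector>
    #include <utility>

    // Split an Annex-B byte stream into discrete NAL units by locating
    // 00 00 01 start codes (a 4-byte 00 00 00 01 start code is the same
    // pattern with one extra leading zero).  The returned (pointer, length)
    // pairs exclude the start codes themselves.
    std::vector<std::pair<const uint8_t*, size_t> >
    splitAnnexB(const uint8_t* data, size_t len) {
      std::vector<size_t> starts;     // offset of the byte after each start code
      for (size_t i = 0; i + 2 < len; ++i) {
        if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) {
          starts.push_back(i + 3);
          i += 2;                     // jump past this start code
        }
      }

      std::vector<std::pair<const uint8_t*, size_t> > nals;
      for (size_t k = 0; k < starts.size(); ++k) {
        size_t begin = starts[k];
        size_t end = (k + 1 < starts.size()) ? starts[k + 1] - 3 : len;
        // Drop trailing zero bytes (e.g. the extra 00 of a 4-byte start code):
        while (end > begin && data[end - 1] == 0) --end;
        nals.push_back(std::make_pair(data + begin, end - begin));
      }
      return nals;
    }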
Hello,

I've had great success working with the information and feedback
provided in this devel post. I've created a live555 media server which
extends the DeviceSource class with one of my own (it reads MPEG-4 frames
off a DVR). Thank you all!

The remaining piece of the puzzle for me deals with a web client
that I have written that provides interaction with the live stream.
I am trying to get a timestamp back from the media server to the
client, and ultimately available to a JavaScript function on the page.
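One possible mechanism for pulling a value like this back over RTSP (only a
suggestion; the thread does not say which approach was intended) is the
GET_PARAMETER request, which lets a client ask the server for named
parameters. The parameter names below are just the ones from RFC 2326's own
example; a server would have to implement whatever parameter actually
carries your timestamp, and a stock live555 RTSPServer treats GET_PARAMETER
essentially as a keep-alive, so returning a real value would mean extending
the server. Content-Length headers are omitted here for brevity:

    C->S: GET_PARAMETER rtsp://example.com/stream RTSP/1.0
          CSeq: 431
          Session: 12345678
          Content-Type: text/parameters

          packets_received
          jitter

    S->C: RTSP/1.0 200 OK
          CSeq: 431
          Content-Type: text/parameters

          packets_received: 10
          jitter: 0.3838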
>MPEG1or2VideoStreamer ---> MPEG1or2VideoReceiver |
>MPEG1or2AudioVideoStreamerToDarwin ---> DarwinServer

The problem here is that "testMPEG1or2AudioVideoToDarwin" takes an
MPEG *Program Stream* as input, but the data output by
"testMPEG1or2VideoReceiver" is an MPEG Video *Elementary Stream*.
>Do you know the maximum size of a NAL unit?

Sorry, no.
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
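As far as I know, the H.264 spec itself does not put a fixed upper bound on
the size of a NAL unit, so what matters in practice is the size of the
buffers the data passes through on the live555 side. When large NAL units
are being truncated on the sending side, one common adjustment (some of the
bundled test programs do the same) is to enlarge the output packet buffer
before the RTP sinks are created; the value below is just an example:

    // Must be set before creating the RTP sink(s):
    OutPacketBuffer::maxSize = 300000;  // bytes; larger than the library default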
Hi Ross,

In the provided MPEG1or2AudioVideoStreamerToDarwin example, is it
possible to directly take an inbound stream source object, rather than a
ByteStreamFileSource, as the input to MPEG1or2Demux?
I am experimenting with taking the source stream in
MPEG1or2VideoReceiver and feeding it to MPEG1or2Demux.
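In principle, MPEG1or2Demux::createNew() takes a FramedSource*, so any
FramedSource subclass can stand in for the ByteStreamFileSource, with the
caveat from Ross's reply above that the input must be an MPEG *Program
Stream*, not a Video Elementary Stream. A minimal sketch under that
assumption (the function and variable names here are made up):

    #include "liveMedia.hh"

    // Wire a custom FramedSource (assumed to deliver an MPEG Program Stream)
    // into MPEG1or2Demux in place of a ByteStreamFileSource:
    void setUpDemux(UsageEnvironment& env, FramedSource* programStreamSource) {
      MPEG1or2Demux* demux = MPEG1or2Demux::createNew(env, programStreamSource);

      // Pull the elementary streams back out, as the ToDarwin demo does
      // with its file source:
      FramedSource* videoES = demux->newVideoStream();
      FramedSource* audioES = demux->newAudioStream();

      // ... then feed videoES/audioES into the appropriate framers and sinks.
      (void)videoES; (void)audioES;
    }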
Hi Ross,

Thank you for answering.
Do you know the maximum size of a NAL unit?
In doGetNextFrame(), is it possible for a NAL unit to exceed the available
buffer size and have to be delivered fragmented?

Thanks!
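A single NAL unit can indeed be larger than the buffer that the downstream
object hands your source. The usual pattern in a FramedSource subclass
follows live555's bundled DeviceSource model: deliver what fits and report
the overflow via fNumTruncatedBytes. In the sketch below, MyNALSource,
newFrameDataStart and newFrameSize are placeholders for your own class and
data, and the usual liveMedia includes are assumed:

    // Inside your FramedSource subclass (cf. DeviceSource.cpp):
    void MyNALSource::deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // downstream not ready yet

      if (newFrameSize > fMaxSize) {
        // The NAL unit does not fit in the buffer we were handed:
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = newFrameSize - fMaxSize;
      } else {
        fFrameSize = newFrameSize;
        fNumTruncatedBytes = 0;
      }
      gettimeofday(&fPresentationTime, NULL); // or a timestamp from your device
      memmove(fTo, newFrameDataStart, fFrameSize);

      // Tell the downstream object that the data is ready:
      FramedSource::afterGetting(this);
    }

Fragmentation at the RTP level is a separate matter: for H.264, the RTP sink
performs FU-A fragmentation of large NAL units itself, so the things to watch
in your source are fMaxSize/fNumTruncatedBytes here and OutPacketBuffer::maxSize
as noted earlier.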