Jesus,
The "ProxyServerMediaSession" class was intended to be a self-contained,
fully-featured class that could be used 'as is' to build RTSP proxy servers.
For now at least, it is not intended to be customizable. (This may change in
the future, depending on how the code ends up being used, b
> It seems that RTPTransmissionStats::fOldValid is not initialized; it is only
> set on the second call to RTPTransmissionStats::noteIncomingRR (when
> fFirstPacket becomes false).
>
> Do you think it is possible to initialize it in the constructor?
Yes, this was an oversight. It will be fixed i
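The fix amounts to initializing the flag in the constructor so it can never be read before the second RR arrives. A minimal self-contained sketch of the pattern (illustrative only; the class and member names mirror, but are not, the actual LIVE555 source):

```cpp
#include <cassert>

// Hypothetical sketch: a stats object whose "old value is valid" flag is
// initialized in the constructor, so it is never read uninitialized.
class TransmissionStatsSketch {
public:
  TransmissionStatsSketch()
    : fFirstPacket(true), fOldValid(false), fOldJitter(0) {}

  void noteIncomingRR(unsigned jitter) {
    if (!fFirstPacket) {
      fOldValid = true;   // the previously stored value is now meaningful
    }
    fOldJitter = jitter;
    fFirstPacket = false;
  }

  bool oldValid() const { return fOldValid; }

private:
  bool fFirstPacket;
  bool fOldValid;         // initialized above, fixing the reported oversight
  unsigned fOldJitter;
};
```

Before the fix, reading fOldValid after only one RR was undefined; with the constructor initialization it is reliably false until a second RR has been seen.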
> We are streaming (over WiFi) H.264-encoded frames from a live source, using
> the testH264VideoStreamer code as a reference. The camera frame rate is 30 fps.
> The receiver is an iOS application based on the testRTSPClient code.
>
> We are able to stream with an initial propagation delay of around 200 ms b
> I have RTSP ServerMediaSession with one audio subsession and one video
> subsession.
> The sources for audio and video are live and arrive as RTP streams whose
> timestamps are derived from the same source.
> (I subclassed OnDemandServerMediaSubsession to take live RTP input.)
>
> The RTSP clien
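The "timestamps derived from the same source" setup described above is what makes A/V sync possible: each subsession's RTP timestamps advance at its own media clock rate, but both are anchored to one shared presentation clock. A small illustrative sketch of that relationship (the function and rates are examples, not LIVE555 API):

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch: deriving audio and video RTP timestamps from one
// shared presentation clock, so a receiver can line the two subsessions
// up via the RTCP sender-report wallclock/RTP-timestamp mappings.
static uint32_t rtpTimestamp(double presentationSecs,
                             uint32_t clockRate,      // e.g. 90000 video, 8000 audio
                             uint32_t randomOffset) { // per-stream random base
  // RTP timestamps advance at the media clock rate, modulo 2^32.
  return randomOffset +
         static_cast<uint32_t>(presentationSecs * clockRate);
}
```

One second into the stream, a 90 kHz video stream has advanced 90000 ticks and an 8 kHz audio stream 8000 ticks from their respective random offsets; because both are driven by the same presentationSecs, the receiver can align them.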
Sir,
In one of my applications I am using testMPEG2TransportStreamer.cpp to stream
video captured from a live source, and in one of my header files I am using
"MPEG4VideoStreamDiscreteFramer". Can this class be used to calculate the
frame rate?
Thanks
On Fri, Jun 1, 2012 at 4:55 AM, W
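Whichever framer class is involved, LIVE555 framers deliver each frame with a presentation time, and the frame rate can be estimated from the spacing of successive presentation times. A hedged, self-contained sketch of that calculation (the helper name is illustrative, not part of the library):

```cpp
#include <cassert>
#include <cmath>

// Estimate frames per second from an array of per-frame presentation
// times (in seconds), as delivered alongside each frame by a framer.
static double estimateFps(const double* presentationTimes, int n) {
  if (n < 2) return 0.0;  // need at least two frames to measure spacing
  double span = presentationTimes[n - 1] - presentationTimes[0];
  return span > 0.0 ? (n - 1) / span : 0.0;
}
```

For a 30 fps source, four consecutive frames span 3/30 s, so the estimate comes out at 30.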
> I am using RTSP for transmitting video from server to client. At some
> points during the transmission I need the server to "send" metadata to the
> client. Could you please share how I can do this?
Your first step is to decide what RTP payload format you want to use for this
'metadat
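Whatever payload format is chosen, the metadata would typically travel in its own RTP stream using a dynamic payload type (96-127) declared in the SDP. As a self-contained illustration of what such a stream's packets look like on the wire (generic RTP per RFC 3550, not LIVE555 API), here is a minimal 12-byte RTP header builder:

```cpp
#include <cassert>
#include <cstdint>

// Build a minimal fixed RTP header (RFC 3550): version 2, no padding,
// no extension, no CSRCs, marker bit clear.
static void buildRtpHeader(uint8_t hdr[12], uint8_t payloadType,
                           uint16_t seq, uint32_t timestamp, uint32_t ssrc) {
  hdr[0] = 2 << 6;                    // V=2, P=0, X=0, CC=0
  hdr[1] = payloadType & 0x7F;        // M=0, PT (e.g. 96 for dynamic)
  hdr[2] = seq >> 8;        hdr[3] = seq & 0xFF;
  hdr[4] = timestamp >> 24; hdr[5] = (timestamp >> 16) & 0xFF;
  hdr[6] = (timestamp >> 8) & 0xFF;   hdr[7] = timestamp & 0xFF;
  hdr[8] = ssrc >> 24;      hdr[9] = (ssrc >> 16) & 0xFF;
  hdr[10] = (ssrc >> 8) & 0xFF;       hdr[11] = ssrc & 0xFF;
}
```

The metadata bytes themselves would follow this header, in whatever format the chosen payload format specifies.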
Hi,
I'm currently working on a modified version of wis-streamer which uses a
Davinci D365IPNC Camera to stream H264 video.
I have trouble understanding the flow of data from my camera which provides
live video input, till the final RTPsink where it is being sent through the
network to be received
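The flow being asked about follows LIVE555's pull model: the sink asks its upstream source for a frame, a filter (such as a framer) in turn asks the camera source, transforms the data, and passes it downstream. A much-simplified conceptual sketch of that chain (the class names stand in for, but are not, the real LIVE555 classes):

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Simplified pull-model pipeline: sink -> framer -> camera.
struct Source {
  virtual ~Source() {}
  virtual void getNextFrame(std::function<void(std::string)> deliver) = 0;
};

struct CameraSource : Source {          // stands in for the live capture device
  void getNextFrame(std::function<void(std::string)> deliver) override {
    deliver("raw-nal");                 // one encoded unit from the camera
  }
};

struct Framer : Source {                // stands in for an H.264 framer filter
  explicit Framer(Source& s) : upstream(s) {}
  void getNextFrame(std::function<void(std::string)> deliver) override {
    upstream.getNextFrame([deliver](std::string f) {
      deliver("framed:" + f);           // parse/annotate, then pass downstream
    });
  }
  Source& upstream;
};

struct RtpSinkSketch {                  // stands in for an RTPSink
  std::vector<std::string> sent;
  void startPlaying(Source& src) {
    src.getNextFrame([this](std::string f) { sent.push_back(f); });
  }
};
```

Calling startPlaying on the sink is what starts the whole chain pulling: each request propagates upstream to the camera, and each frame flows back down through the framer to the sink.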
> I used LIVE555 to accept a TS stream source over RTP and then broadcast
> the TS/RTP source to the RTP sink, and now VLC can play this live
> source.
> However, I also need to store/save this live TS stream source on the live555
> server into a stream file, how can I reach thi
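The general shape of a solution is to insert a "tee" between the incoming TS/RTP source and the outgoing sink, so every 188-byte TS packet is both forwarded and appended to a file. In LIVE555 terms this would be a small filter class sitting in front of the existing RTP sink; the sketch below is self-contained and the names are illustrative, not the library's API:

```cpp
#include <cassert>
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

// Tee filter sketch: relay each TS packet downstream and also save it.
class TsTee {
public:
  explicit TsTee(const std::string& path)
    : fOut(path, std::ios::binary) {}

  // Forward one TS packet (typically 188 bytes, sync byte 0x47) and
  // append the same bytes to the output file.
  void deliver(const unsigned char* pkt, std::size_t len,
               std::vector<unsigned char>& downstream) {
    downstream.insert(downstream.end(), pkt, pkt + len);  // relay
    fOut.write(reinterpret_cast<const char*>(pkt),        // save
               static_cast<std::streamsize>(len));
  }

private:
  std::ofstream fOut;
};
```

The downstream vector stands in for whatever the existing RTP sink consumes; the point is only that the same bytes go to both destinations without disturbing the live relay.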