Hi,
In the video frame source's getNextFrame(), the buffer may contain a single
NAL unit rather than a complete frame. So fPresentationTime and
fDurationInMicroseconds should only be set when the buffer holds the last
NAL unit of the current frame.
Is that right?
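To make the question concrete, here is a rough sketch of the scheme being described: one NAL unit delivered per doGetNextFrame() call, with the timing fields stamped only on the frame's final NAL unit (whether that policy is correct is exactly what is being asked). getNALU(), isLastNALUOfFrame(), fNALUBuffer and fNALUSize are hypothetical placeholders for the encoder's actual interface, and the 25 fps duration is just an example:

#include <string.h>
#include <sys/time.h>
#include "FramedSource.hh"

class VideoFrameSource: public FramedSource {
public:
  VideoFrameSource(UsageEnvironment& env): FramedSource(env) {}

protected:
  virtual void doGetNextFrame() {
    getNALU(); // fetch the next NAL unit into fNALUBuffer/fNALUSize (placeholder)

    if (fNALUSize > fMaxSize) { // never copy more than the downstream object asked for
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = fNALUSize - fMaxSize;
    } else {
      fFrameSize = fNALUSize;
      fNumTruncatedBytes = 0;
    }
    memmove(fTo, fNALUBuffer, fFrameSize);

    if (isLastNALUOfFrame()) {
      // Timing fields set once per frame, on its last NAL unit only:
      gettimeofday(&fPresentationTime, NULL);
      fDurationInMicroseconds = 1000000/25; // e.g. a fixed 25 fps source
    }

    FramedSource::afterGetting(this); // complete this delivery
  }

private:
  void getNALU();              // placeholder: obtains the next encoded NAL unit
  Boolean isLastNALUOfFrame(); // placeholder: true for the frame's final NAL unit
  unsigned char* fNALUBuffer;  // placeholder
  unsigned fNALUSize;          // placeholder
};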
It seems that live555 supports RTSP 'trick play' operations for some media
types.
Recently, when I attempted to stream an MKV file through live555, I found
that VLC player's trick-play functionality doesn't work well. When I seeked
to time 1:00, it got data correctly (this means live555 has seeked to the
I tried the unicast server based on the onDemand sample, but my function
"getAuxSDPLine()" never returns. When I debugged, I found that
"checkForAuxSDPLine1()" calls "fDummyRTPSink->auxSDPLine()", which is the
function implemented in MPEG4RTPSink, and which in turn tries to
get the po
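For context, this is roughly the pattern that live555's bundled file-based subsession classes (e.g. H264VideoFileServerMediaSubsession) use for getAuxSDPLine(): a dummy RTPSink is started playing from the input source, and the event loop runs until the sink has seen enough data to produce its auxiliary SDP line. If the source never delivers the data the sink needs (for MPEG-4, the VOS/VOL configuration headers), auxSDPLine() keeps returning NULL and getAuxSDPLine() blocks in the event loop forever, which matches the symptom described above. In this sketch, MySubsession is a hypothetical subclass of OnDemandServerMediaSubsession that declares fAuxSDPLine, fDummyRTPSink, fDoneFlag and a public setDoneFlag()/checkForAuxSDPLine1() itself:

// MySubsession is assumed to declare:
//   char* fAuxSDPLine; RTPSink* fDummyRTPSink; char fDoneFlag;
//   void setDoneFlag() { fDoneFlag = ~0; }

static void afterPlayingDummy(void* clientData) {
  // (the real classes schedule setDoneFlag() after a short delay)
  MySubsession* subsess = (MySubsession*)clientData;
  subsess->setDoneFlag();
}

static void checkForAuxSDPLine(void* clientData) {
  MySubsession* subsess = (MySubsession*)clientData;
  subsess->checkForAuxSDPLine1();
}

void MySubsession::checkForAuxSDPLine1() {
  char const* dasl;
  if (fAuxSDPLine != NULL) {
    setDoneFlag();
  } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
    fAuxSDPLine = strDup(dasl); // the sink finally produced the SDP line
    fDummyRTPSink = NULL;
    setDoneFlag();
  } else if (!fDoneFlag) {
    // Not ready yet; poll again in 100 ms:
    nextTask() = envir().taskScheduler().scheduleDelayedTask(100000,
                   (TaskFunc*)checkForAuxSDPLine, this);
  }
}

char const* MySubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
  if (fAuxSDPLine != NULL) return fAuxSDPLine;

  if (fDummyRTPSink == NULL) {
    fDummyRTPSink = rtpSink;
    // Start 'playing' so the sink sees real data (and, for MPEG-4, the
    // configuration headers it needs for its "a=fmtp:" line):
    fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
    checkForAuxSDPLine(this);
  }

  // Block here until checkForAuxSDPLine1() sets fDoneFlag:
  envir().taskScheduler().doEventLoop(&fDoneFlag);
  return fAuxSDPLine;
}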
Thanks for the input; I really appreciate your help. If I subclass
ServerMediaSubsession, then I need to implement other functions too. Is
there another class which will do most of it for me? I noticed that
OnDemandMediaSubsession is being used for some subsession implementations,
but is it only fo
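The full class name is OnDemandServerMediaSubsession, and it does do most of the work. Apart from the getAuxSDPLine() issue discussed elsewhere in this thread, a subclass for a live source mainly has to supply the stream source and the RTP sink. A rough sketch for an MPEG-4 elementary stream, where MyDeviceSource stands in for whatever hypothetical FramedSource subclass produces the encoded frames:

#include "liveMedia.hh"

class MyMPEG4Subsession: public OnDemandServerMediaSubsession {
public:
  static MyMPEG4Subsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
    return new MyMPEG4Subsession(env, reuseFirstSource);
  }

protected:
  MyMPEG4Subsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500; // kbps; just an estimate
    FramedSource* source = MyDeviceSource::createNew(envir()); // hypothetical encoder source
    // Wrap it in a discrete framer, because the RTP sink expects one
    // complete MPEG-4 frame per delivery:
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), source);
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }
};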
I tried sending this extra data as the first packet and also tried sending
it with each packet, but no use. I debugged it and found that when I send
this packet, my MPEG4discreteFrame found all the information it needed
(I debugged the function analyzeVOLHeader()), but still my testRTSPClient
does not show a
> Thank you for the clarification. I will try to send this extra data as the
> first packet. I have another confusion: if I send this packet as the first
> packet when there is no client attached, and later on a client attaches,
> then how will that client get this packet
As long as you delive
Thank you for the clarification. I will try to send this extra data as the
first packet. I have another confusion: if I send this packet as the first
packet when there is no client attached, and later on a client attaches,
then how will that client get this packet, as it will be lost? Right now
Your "deliverFrame()" function looks OK - provided that it gets called whenever
a new MPEG-4 frame is available. Note that each call to
"MyStreamingDeviceSource::doGetNextFrame()" must be followed (eventually) by a
call to "FramedSource::afterGetting(this)". Therefore, if you return from
"del
Hmm, I have a better idea :-)
On Fri, Oct 18, 2013 at 2:42 PM, Ross Finlayson wrote:
> Embedding code on the Axis camera to call home is an excellent idea !
>
>
> Feel free to suggest this to Axis; I'd be happy to work with them to help
> make this happen.
>
>
> Ross Finlayson
> Live Networks,
Hi,
I spent almost the last 3 days banging my head against RFCs and other
documents. Here is what I want to achieve: I have raw BGRA frames and I
want to stream them using live555. This is what I am doing:
I created a subclass of FramedSource which is responsible for encoding the
raw BGRA frame into MP
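Assuming, as elsewhere in this thread, that the FramedSource subclass produces an MPEG-4 elementary stream, one way to expose it over RTSP is the usual liveMedia wiring: an RTSPServer, a ServerMediaSession, and an OnDemandServerMediaSubsession subclass such as the MyMPEG4Subsession sketched earlier in this digest (the stream name "bgra" and port 8554 are arbitrary):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "bgra",
      "bgra", "Live BGRA -> MPEG-4 stream");
  // reuseFirstSource=True because a live capture/encoder source can't be
  // re-opened separately for each client:
  sms->addSubsession(MyMPEG4Subsession::createNew(*env, True));
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Play this stream using the URL \"" << url << "\"\n";
  delete[] url;

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}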
> Embedding code on the Axis camera to call home is an excellent idea !
Feel free to suggest this to Axis; I'd be happy to work with them to help make
this happen.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Understanding the author's original intent and design philosophy is key,
especially when it comes to using their software :-)
In regards to registering RTSP-over-HTTP back-end servers... I do not
anticipate a need for this; however, your clarification on its
implementation within Proxy Server
> I googled and I found the open source site from that camera (1), I found a
> sheet containing live streaming information (2). There I found Live555.
[...]
> (1): http://gopro.com/support/open-source
> (2):
> http://wpcdn.gopro.com.s3.amazonaws.com/wp-content/uploads/2013/01/live.2012.02.04.tar.g
Hello!
I am a GoPro camera user, and I am a bit disappointed because of the lag
produced in the WiFi live streaming.
The GoPro streams to a server (10.5.5.9:8080) and the preview is here:
http://10.5.5.9:8080/live/amba.m3u8
I googled and I found the open source site from that camera (1), I found
> - Proxy Server -T option does not allow for specifying a unique port per
> stream. In NAT cases streams would have different ports. Would it be possible
> to have this option for streams that will be using RTSP over HTTP?
FYI, right now back-end RTP/RTCP-over-RTSP-over-HTTP tunneling works onl
Ross,
Very nice...I look forward to seeing how this evolves.
I noticed that you implemented the ability to specify stream suffix! I will
be setting up a new test environment with the latest code base to work
through these API changes. I'll provide feedback, should I encounter
anything of interest
FYI, I have just submitted to the IETF a new Internet-Draft document that
describes our new custom "REGISTER" RTSP command (that we use in our proxy
server implementation). You can find a copy online at
http://tools.ietf.org/html/draft-finlayson-rtsp-register-command-00
I have also rele