It's true that the MultiFramedRTPSink class already provides this feature of
adding delay, but what we wanted was an optimization over that.
With the MultiFramedRTPSink::sendPacketIfNecessary function, the delay was
added before sending the packet over the network.
As we are working on live (real-time) transmission
Seemed to be some sort of circular linking dependency. Relinking all of
the libraries a second time fixed the issue. Thanks for the help! I was
surprised at how easy live555 was to cross compile for arm, everything
just worked.
-Thanks
Alex Wright
___
I made some modifications to the code for live streaming. Now I want
getNextFrame to be scheduled with a delay (of one frame duration).
So, in the function MultiFramedRTPSink::packFrame(),
what modifications should I make in order to call
fSource->getNextFrame()
using
envir().taskScheduler()?
Hi, everyone:
I am using live555 to stream *.aac and *.264 files to Android devices.
VLC on Mac can play those streams normally, but the Android media player
can't play them. I can't figure out why...
Is there anyone who could help me here?
Are you sure that 'Android Media Player' even supports the R
___
I am trying to get .avi from a VIS-streamer device with the openRTSP client.
My problem is that I get one image, and then I see in Wireshark that
the client sends TCP ZeroWindow.
This is simply the receiver's operating system's way of telling the
sender - using the TCP protocol - that it can't handle incoming