Hi!
I'm writing some code to stream live audio in u-law format over RTP
multicast. I derived a class from the FramedSource class to read in the
data from a memory buffer. An audio recorder thread feeds u-law audio
data into this buffer. In the derived class, I specified a frame size
of 128 bytes and a duration of
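For 8 kHz u-law, a 128-byte frame holds 128 samples, which works out to 128 / 8000 s = 16 ms (16000 µs) per frame. A minimal sketch of the shared buffer between the recorder thread and the source class, assuming the usual live555 FramedSource pattern; all class and method names here are illustrative, not part of the live555 API:

```cpp
#include <cassert>
#include <cstdint>
#include <mutex>
#include <vector>

// Hypothetical ring buffer shared between the recorder thread (writer)
// and a FramedSource subclass (reader).
class UlawRingBuffer {
public:
    static constexpr size_t kFrameSize = 128;  // bytes per delivered frame
    // 128 samples at 8000 samples/s -> 16000 microseconds per frame.
    static constexpr unsigned kDurationUs = (kFrameSize * 1000000) / 8000;

    explicit UlawRingBuffer(size_t capacity) : buf_(capacity) {}

    // Called by the recorder thread as audio arrives.
    void write(const uint8_t* data, size_t len) {
        std::lock_guard<std::mutex> lk(m_);
        for (size_t i = 0; i < len; ++i) {
            buf_[head_ % buf_.size()] = data[i];
            ++head_;
        }
    }

    // In a real FramedSource subclass this copy would happen inside
    // doGetNextFrame(), with the duration stored in fDurationInMicroseconds.
    // Returns the frame duration in microseconds, or 0 if a full
    // 128-byte frame is not yet available.
    unsigned readFrame(uint8_t* out) {
        std::lock_guard<std::mutex> lk(m_);
        if (head_ - tail_ < kFrameSize) return 0;
        for (size_t i = 0; i < kFrameSize; ++i)
            out[i] = buf_[(tail_ + i) % buf_.size()];
        tail_ += kFrameSize;
        return kDurationUs;
    }

private:
    std::vector<uint8_t> buf_;
    size_t head_ = 0, tail_ = 0;  // monotonically increasing byte counters
    std::mutex m_;
};
```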
Hi All,
I've built a TS-over-RTSP streaming solution based on
"MPEG2TransportFileServerMediaSubsession". The audio/video codec types
in my TS are MP4a and MP4v respectively. It plays back nicely in VLC.
However, my client (an Android 2.2 device with an OpenCore-based media
player) doesn't support the
I built the live555 libraries with MinGW tools on Ubuntu successfully,
following these instructions:
1). genMakefiles config.mingw
2). make
The reason for cross-compiling is that I have other libraries built
this way (namely, FFmpeg). I've built an app with those FFmpeg libs in
Visual C++ 2008. The app wo
I've built a solution per your suggestion. It works well. Thanks.
On Thu, Nov 25, 2010 at 5:11 AM, Ross Finlayson wrote:
>> I'm new to Live555, which I want to use in my project. My
>> requirement is to stream video (H.264/AAC in TS) over RTSP
>> unicast. The video continuously comes in (l
I'm new to Live555, which I want to use in my project. My
requirement is to stream video (H.264/AAC in TS) over RTSP
unicast. The video comes in continuously (like a live feed) and is
captured into a circular buffer (either in memory or on disk). I have
checked out the "testOnDemandRTSPServer" sample.