> The issue we have now is that in retransmitting the audio stream we don’t
> understand if Live555 adds an extra header to the AAC frames.
As I said before - it doesn't. The frames that come from the
"MPEG4GenericRTPSource" are simply AAC frames, with no extra header added.
> We have the jpe
Hi All,
Thanks for the prompt response. What I failed to mention in my original
email is that we are developing a transcoding server that will transcode an
MJPEG stream to an H.264 stream in real time and send both streams out
through RTSP to multiple clients. We were able to build the pipeline and
> Can this be fixed in the next release?
Yes, I have included it in a new version - 2013.01.25 - released just now.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
If you are using HTTP Live Streaming, which depends on the MPEG-2 Transport
Stream: I implemented this on live cameras by guerrilla subclassing
(cut-and-paste and modify) MPEG2TransportStreamFromESSource into a
MPEG2TransportStreamFromESSource4iOS class. I changed the inserting of the PAT
and
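(For reference, the place where such a copied-and-renamed class ends up being
used is wherever the stock multiplexor would otherwise be created. A rough
sketch using the public MPEG2TransportStreamFromESSource API; the helper name
"wrapInTransportStream" and the "h264ES" source are just placeholders of mine:

#include "liveMedia.hh"

// Sketch only: wrap an H.264 elementary-stream source in a Transport Stream
// multiplexor. A copy-and-modify subclass - e.g. the
// "MPEG2TransportStreamFromESSource4iOS" described above - would simply be
// created here in place of the stock class.
MPEG2TransportStreamFromESSource*
wrapInTransportStream(UsageEnvironment& env, FramedSource* h264ES) {
  MPEG2TransportStreamFromESSource* ts
    = MPEG2TransportStreamFromESSource::createNew(env);
  ts->addNewVideoSource(h264ES, 5 /* mpegVersion: 5 => H.264 */);
  return ts;
}
)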
Dear Ross Finlayson,
Thanks. Now it works fine with your input to set the flag to True.
Regards,
Saravanan S
From: Ross Finlayson [mailto:finlay...@live555.com]
Sent: Friday, January 25, 2013 7:18 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Questions
On 01/22/2013 06:32 PM, Ross Finlayson wrote:
>> We think this change was introduced in the sequence of:
>> 2012.09.11:
>> - Fixed a bug in "MPEG2TransportStreamFromESSource": Its destructor wasn't
>> stopping the delivery from upstream objects.
>
> That's correct. That line
> doStopGettingFrames(
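(The general pattern that changelog entry describes - making sure, before a
filter object goes away, that its upstream source stops delivering frames
into it - looks roughly like the following. This is a hypothetical
pass-through filter for illustration only, not the actual
"MPEG2TransportStreamFromESSource" code:

#include "FramedFilter.hh"

class PassThroughFilter: public FramedFilter {
public:
  static PassThroughFilter* createNew(UsageEnvironment& env, FramedSource* inputSource) {
    return new PassThroughFilter(env, inputSource);
  }

protected:
  PassThroughFilter(UsageEnvironment& env, FramedSource* inputSource)
    : FramedFilter(env, inputSource) {}

  virtual ~PassThroughFilter() {
    // The important part: stop the upstream source's delivery, so that it
    // cannot call back into this object after it has been destroyed.
    if (fInputSource != NULL) fInputSource->stopGettingFrames();
  }

private:
  virtual void doGetNextFrame() {
    // Ask upstream for a frame, delivered straight into our client's buffer:
    fInputSource->getNextFrame(fTo, fMaxSize, afterGettingFrame, this,
                               handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    PassThroughFilter* self = (PassThroughFilter*)clientData;
    self->fFrameSize = frameSize;
    self->fNumTruncatedBytes = numTruncatedBytes;
    self->fPresentationTime = presentationTime;
    self->fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(self);
  }
};
)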
Hi,
I am using the Live555 server to stream MJPEG data captured from the device.
We derived a class JPEGDeviceSource from JPEGVideoSource, and read the data
from a shared buffer (sent by the device) in doGetNextFrame() of
JPEGDeviceSource.
Everything works fine for a single RTSP client session,
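(For reference, a minimal sketch of the kind of subclass being described.
The RFC 2435 parameter values and the shared-buffer read
("readFrameFromSharedBuffer") are placeholders for the device specifics, not
part of Live555:

#include "JPEGVideoSource.hh"
#include <sys/time.h>

class JPEGDeviceSource: public JPEGVideoSource {
public:
  static JPEGDeviceSource* createNew(UsageEnvironment& env) {
    return new JPEGDeviceSource(env);
  }

  // Parameters the JPEG/RTP packetizer asks for (RFC 2435); example values:
  virtual u_int8_t type()    { return 1; }      // 1 => 4:2:0
  virtual u_int8_t qFactor() { return 75; }     // <128 => receiver uses default tables
  virtual u_int8_t width()   { return 640/8; }  // in units of 8 pixels
  virtual u_int8_t height()  { return 480/8; }

protected:
  JPEGDeviceSource(UsageEnvironment& env) : JPEGVideoSource(env) {}

private:
  virtual void doGetNextFrame() {
    // Copy one complete JPEG frame from the device's shared buffer into the
    // buffer that the downstream object handed us (at most fMaxSize bytes):
    fFrameSize = readFrameFromSharedBuffer(fTo, fMaxSize);
    gettimeofday(&fPresentationTime, NULL);

    // Tell the downstream object that a frame is now available:
    FramedSource::afterGetting(this);
  }

  // Hypothetical device hook: fill "to" with one JPEG frame, return its size.
  unsigned readFrameFromSharedBuffer(unsigned char* to, unsigned maxSize) {
    (void)to; (void)maxSize;
    return 0; // the real implementation reads from the device's shared buffer
  }
};
)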
> First, is it possible to put "*.ts" streams in a directory other than the
> directory of the "live555MediaServer" application?
Yes, as noted in the online documentation - http://www.live555.com/mediaServer/
- the files to be streamed must be either in the same directory as the
"