On Tue, May 6, 2008 at 6:36 PM, Ross Finlayson <[EMAIL PROTECTED]> wrote:
> Because you are streaming from a live source, then you may instead be able
> to use a simple "FramedSource" subclass (that you would write) that just
> delivers one AAC frame at a time (and sets "fPresentationTime"
> appropriately).
Hi Ross, I've subclassed FramedSource with a class that takes the
FramedSource of the subsession of the session between my program and
the camera and saves it as fInputSource.
I print the presentation time in MultiFramedRTPSource::doGetNextFrame1().
In each doGetNextFrame() my class uses fInputSource to request the next frame.
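The core of my class looks roughly like the following minimal sketch (the
class name AACFrameForwarder and the exact callback names are only
illustrative here; fInputSource is assumed to be the readSource() of the
camera's audio subsession):

#include "FramedSource.hh"

class AACFrameForwarder: public FramedSource {
public:
  static AACFrameForwarder* createNew(UsageEnvironment& env,
                                      FramedSource* inputSource) {
    return new AACFrameForwarder(env, inputSource);
  }

protected:
  AACFrameForwarder(UsageEnvironment& env, FramedSource* inputSource)
    : FramedSource(env), fInputSource(inputSource) {}
  virtual ~AACFrameForwarder() { Medium::close(fInputSource); }

private:
  // Called by the downstream object (e.g. the RTP sink) each time it wants
  // the next AAC frame; we just ask the camera subsession's source for it.
  virtual void doGetNextFrame() {
    fInputSource->getNextFrame(fTo, fMaxSize,
                               afterGettingFrame, this,
                               FramedSource::handleClosure, this);
  }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    ((AACFrameForwarder*)clientData)->afterGettingFrame1(
        frameSize, numTruncatedBytes, presentationTime, durationInMicroseconds);
  }

  void afterGettingFrame1(unsigned frameSize, unsigned numTruncatedBytes,
                          struct timeval presentationTime,
                          unsigned durationInMicroseconds) {
    // Pass the frame and its timing through unchanged, then tell the
    // downstream reader that the data is ready.
    fFrameSize = frameSize;
    fNumTruncatedBytes = numTruncatedBytes;
    fPresentationTime = presentationTime;
    fDurationInMicroseconds = durationInMicroseconds;
    FramedSource::afterGetting(this);
  }

  FramedSource* fInputSource;
};

That way the downstream sink sees the same presentation times that the RTP
source computed for the camera's audio frames.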
To: "LIVE555 Streaming Media - development & use" <[EMAIL PROTECTED]>
Subject: Re: [Live-devel] Live stream of Axis camera
On Tue, May 6, 2008 at 6:36 PM, Ross Finlayson <[EMAIL PROTECTED]> wrote:
> Using "ADTSAudioFileSource" might be wrong, if the input data is not in
> ADTS format.
>
> Because you are streaming from a live source, then you may instead be able
> to use a simple "FramedSource" subclass (that you would write) that just
> delivers one AAC frame at a time (and sets "fPresentationTime" appropriately).
I'm trying to stream in real time from an Axis 207 camera, based on the
examples in the library. I successfully streamed video, but I'm having
trouble adding AAC audio streaming using ADTSAudioFileSource and
MPEG4GenericRTPSink, as I've seen suggested on the list.
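For reference, the wiring I'm attempting follows the pattern of
ADTSAudioFileServerMediaSubsession in the library; a rough sketch (the file
name, groupsock and payload type here are placeholders for my actual values):

#include "ADTSAudioFileSource.hh"
#include "MPEG4GenericRTPSink.hh"
#include "Groupsock.hh"

// Create the AAC source and a matching RTP sink, using the source's own
// sampling frequency, AudioSpecificConfig string and channel count.
void setupAacStream(UsageEnvironment& env, Groupsock* rtpGroupsock,
                    unsigned char rtpPayloadType) {
  ADTSAudioFileSource* audioSource =
      ADTSAudioFileSource::createNew(env, "audio.aac");  // placeholder file
  MPEG4GenericRTPSink* audioSink =
      MPEG4GenericRTPSink::createNew(env, rtpGroupsock, rtpPayloadType,
                                     audioSource->samplingFrequency(),
                                     "audio", "AAC-hbr",
                                     audioSource->configStr(),
                                     audioSource->numChannels());
  audioSink->startPlaying(*audioSource, NULL, NULL);
}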
Using "ADTSAudioFileSource" might be wrong,
I forgot to add that when I try to convert the file that is supposed to
contain the audio with ffmpeg, I get error messages like:
[mpeg4 @ 0xb7e269a8]header damaged
[mpeg4 @ 0xb7e269a8]hmm, seems the headers are not complete, trying to
guess time_increment_bits
[mpeg4 @ 0xb7e269a8]my guess is 1 bits ;)
[mpeg