Hi-

I'm looking for some advice on streaming AAC audio from a webcam to iOS via 
RTSP. Hopefully someone has done this before and can point out my mistake. I 
have been successfully using the Live555 libraries with uLaw-encoded audio for 
some time and am now trying to do the same for AAC on a newer camera.

I am attempting to do this via the iOS Audio File Stream APIs 
(AudioFileStreamOpen/AudioFileStreamParseBytes). First, can anyone tell me 
whether this is the correct approach? I assume that once I receive the AAC 
frames in my custom RTSP stream sink class, the data still needs to be parsed 
into AAC packets, and that the Audio File Stream API is the way to do that.

Unfortunately, AudioFileStreamParseBytes is not finding AAC packets or 
properties in the data, so I never get actual AAC packets to play. This is 
fairly frustrating, as AudioFileStreamParseBytes does not return any errors to 
let me know what I might be doing wrong.
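One guess I'd like confirmed: my understanding is that AudioFileStreamParseBytes only recognizes data in an audio *file* format (e.g. ADTS), while the frames arriving from RTP are raw AAC without any framing, so the parser may simply have nothing to sync on. If that's right, each frame would need a 7-byte ADTS header prepended before being fed to the parser. A minimal sketch of what I believe that header looks like (field values assumed for MPEG-4 AAC-LC; please correct me if I have the bit layout wrong):

```c
#include <stdint.h>
#include <stddef.h>

/* Build a 7-byte ADTS header so a file-stream parser can find packet
 * boundaries in otherwise raw AAC frames.
 *   objectType: MPEG-4 audio object type (2 = AAC-LC)
 *   sfIndex:    sampling-frequency index  (4 = 44100 Hz)
 *   channels:   channel configuration     (2 = stereo)
 *   frameLen:   size of the raw AAC frame in bytes (header excluded) */
static void make_adts_header(uint8_t hdr[7], int objectType, int sfIndex,
                             int channels, size_t frameLen)
{
    size_t len = frameLen + 7;  /* ADTS frame length includes the header */
    hdr[0] = 0xFF;              /* syncword, high 8 bits */
    hdr[1] = 0xF1;              /* syncword low 4 bits, MPEG-4, layer 0, no CRC */
    hdr[2] = (uint8_t)(((objectType - 1) << 6) | (sfIndex << 2) | (channels >> 2));
    hdr[3] = (uint8_t)(((channels & 3) << 6) | ((len >> 11) & 0x03));
    hdr[4] = (uint8_t)((len >> 3) & 0xFF);
    hdr[5] = (uint8_t)(((len & 7) << 5) | 0x1F);  /* buffer fullness = 0x7FF (VBR) */
    hdr[6] = 0xFC;              /* fullness cont., one raw AAC frame per header */
}
```

The idea would be to open the stream with a kAudioFileAAC_ADTSType hint and hand ParseBytes header + frame for each RTP payload. Does that match what others have done?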

I am assuming that the data from the Live555 utility routine 
parseGeneralConfigStr is the so-called magic cookie (the AudioSpecificConfig) 
that AAC decoders need. Since Live555 apparently strips this out of the stream 
early, while the iOS Audio File Stream APIs expect to find it in the stream, 
how do I hand it off to the Audio File Stream APIs?
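In case it helps to show what I'm working with: the bytes parseGeneralConfigStr returns appear to be the two-byte AudioSpecificConfig decoded from the SDP's config= string. Assuming I'm reading the bitstream right (MSB-first: 5 bits object type, 4 bits frequency index, 4 bits channel configuration; I'm ignoring the escape value 31 for simplicity), the fields unpack like this:

```c
#include <stdint.h>

/* Unpack a (>= 2 byte) AudioSpecificConfig -- the "magic cookie" bytes
 * that parseGeneralConfigStr decodes from the SDP config= string.
 * Simplified: does not handle audioObjectType escape value 31. */
typedef struct {
    int objectType;  /* 2 = AAC-LC */
    int sfIndex;     /* 4 = 44100 Hz */
    int channels;    /* channel configuration */
} AacConfig;

static AacConfig parse_audio_specific_config(const uint8_t *cfg)
{
    AacConfig c;
    c.objectType = cfg[0] >> 3;                         /* top 5 bits */
    c.sfIndex    = ((cfg[0] & 0x07) << 1) | (cfg[1] >> 7); /* next 4 bits */
    c.channels   = (cfg[1] >> 3) & 0x0F;                /* next 4 bits */
    return c;
}
```

My working theory is that either (a) these cookie bytes get handed to the playback side directly, e.g. via kAudioQueueProperty_MagicCookie on an AudioQueue, or (b) the unpacked parameters get folded into per-frame ADTS headers so the parser never needs the cookie at all. I'd welcome confirmation of which is correct.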

Are there any byte-order issues I need to address on the ARM platform when 
handing off the data?
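My current assumption (please correct me if wrong) is that no swapping is needed for the config bytes themselves, since the AudioSpecificConfig is an MSB-first bitstream and byte-wise parsing is endian-neutral; byte order would only matter if I reinterpreted a multi-byte span as a host integer. A trivial illustration:

```c
#include <stdint.h>

/* Endian-neutral read: shifting bytes together never reinterprets
 * memory, so this behaves identically on little-endian ARM and on
 * big-endian hosts. A swap (e.g. OSSwapBigToHostInt16) would only be
 * needed with a cast like *(const uint16_t *)p instead. */
static uint16_t read_be16(const uint8_t *p)
{
    return (uint16_t)((p[0] << 8) | p[1]);
}
```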

Thanks-
Frank Vernon


_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
