[Live-devel] MP3FileSource

2009-07-15 Thread Michael Russell
My application creates a FramedSource from an MPEG-1, Layer 3 (.mp3) audio file and feeds it to an input of MPEG2TransportStreamFromESSource, like this: ByteStreamFileSource* audioFileSource = ByteStreamFileSource::createNew(*env, filename); FramedSource* audioES = audioFileSource; MPEG1or2Au…
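The preview above is cut off mid-identifier. A minimal sketch of the pattern it appears to describe, assuming the standard live555 classes (the file names "test.mp3" and "out.ts" are illustrative, not from the original message):

```cpp
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  // Standard live555 setup: scheduler plus usage environment
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Raw byte source for the MP3 file
  ByteStreamFileSource* audioFileSource =
      ByteStreamFileSource::createNew(*env, "test.mp3");
  FramedSource* audioES = audioFileSource;

  // Frame the elementary stream; the framer is also what sets
  // fPresentationTime on each outgoing frame
  MPEG1or2AudioStreamFramer* audioFramer =
      MPEG1or2AudioStreamFramer::createNew(*env, audioES);

  // Multiplex the framed audio into an MPEG-2 Transport Stream
  MPEG2TransportStreamFromESSource* tsSource =
      MPEG2TransportStreamFromESSource::createNew(*env);
  tsSource->addNewAudioSource(audioFramer, 1 /* MPEG-1 audio */);

  // Write the resulting Transport Stream to a file
  MediaSink* outSink = FileSink::createNew(*env, "out.ts");
  outSink->startPlaying(*tsSource, NULL, NULL);
  env->taskScheduler().doEventLoop();
  return 0;
}
```

This mirrors the structure of the live555 test programs; it is a sketch of the usual usage, not a reconstruction of the poster's exact code.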

Re: [Live-devel] .m4v / .mp3 Synchronization

2009-07-09 Thread Michael Russell
I wrote: I have two independent ByteStreamFileSource objects. One feeds MPEG-4 video elementary stream (.m4v) data to an MPEG4VideoStreamFramer; the other feeds MPEG-1, Layer 3 (.mp3) audio data to an MPEG1or2AudioStreamFramer. Those framers then each feed a MPEG2TransportStreamFromESSource object…
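The two-branch pipeline described above can be sketched as follows, assuming the standard live555 API and an already-created UsageEnvironment `*env` (file names are illustrative):

```cpp
#include "liveMedia.hh"

// Sketch: two independent file sources, each wrapped in its own framer,
// both feeding one MPEG2TransportStreamFromESSource, which interleaves
// them into a single Transport Stream.
FramedSource* buildMuxedTS(UsageEnvironment* env) {
  // Video branch: .m4v elementary stream -> MPEG-4 video framer
  ByteStreamFileSource* videoFile =
      ByteStreamFileSource::createNew(*env, "test.m4v");
  MPEG4VideoStreamFramer* videoFramer =
      MPEG4VideoStreamFramer::createNew(*env, videoFile);

  // Audio branch: .mp3 file -> MPEG audio framer
  ByteStreamFileSource* audioFile =
      ByteStreamFileSource::createNew(*env, "test.mp3");
  MPEG1or2AudioStreamFramer* audioFramer =
      MPEG1or2AudioStreamFramer::createNew(*env, audioFile);

  // Multiplexor: both framers feed the same TS source
  MPEG2TransportStreamFromESSource* mux =
      MPEG2TransportStreamFromESSource::createNew(*env);
  mux->addNewVideoSource(videoFramer, 4 /* MPEG-4 video */);
  mux->addNewAudioSource(audioFramer, 1 /* MPEG-1 audio */);
  return mux;
}
```

Because each branch has its own ByteStreamFileSource, the two streams stay synchronized only if both framers stamp consistent presentation times, which is the subject of this thread.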

Re: [Live-devel] .m4v / .mp3 Synchronization

2009-07-08 Thread Michael Russell
Ross Finlayson wrote: That's correct. The timestamps (specifically, the "fPresentationTime" variable) should be set by each Framer object. These are used to set the SCR timestamps in the resulting Transport Stream. So I'm not sure why this isn't working for you; you're going to have to track…
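To make the mechanism concrete: fPresentationTime is a protected member of FramedSource, and a framer stamps it on each frame before handing the frame downstream. A hedged sketch of where that happens, using a hypothetical subclass (MyFramer is not a live555 class; real framers derive the timestamp from the stream's own timing rather than the wall clock):

```cpp
#include "FramedSource.hh"
#include <sys/time.h>

// Hypothetical framer showing where fPresentationTime is set; the
// downstream multiplexor reads this value when generating its clock
// references for the Transport Stream.
class MyFramer : public FramedSource {
protected:
  MyFramer(UsageEnvironment& env) : FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // ... parse one frame from the input, copy it into fTo,
    //     and set fFrameSize ...

    // Stamp the frame. Wall-clock time is used here only as a
    // placeholder; actual framers compute this from frame durations.
    gettimeofday(&fPresentationTime, NULL);

    // Deliver the frame to whoever called getNextFrame()
    FramedSource::afterGetting(this);
  }
};
```

If the two framers in a multiplexed pipeline stamp inconsistent fPresentationTime values, the resulting Transport Stream's audio and video drift apart, which matches the symptom discussed in this thread.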

Re: [Live-devel] .m4v / .mp3 Synchronization

2009-07-03 Thread Michael Russell
Ross Finlayson wrote: Your synchronization problem occurred when you *created* (multiplexed) your Transport Stream from your audio and video inputs. If you used our software to create your Transport Stream (it wasn't really clear from your message whether or not you did), then you should make…

[Live-devel] .m4v / .mp3 Synchronization

2009-07-02 Thread Michael Russell
Hi Ross - I am prototyping a streaming application using an MPEG-1, Layer 3 (.mp3) audio file and an MPEG-4 video elementary stream (.m4v) file as inputs. I am doing this to simulate our actual encoder outputs, since they are not yet available. I recorded these files from two different physical…