Hi, we're developing an audio streaming server that serves a live source. The source audio is encoded with FFmpeg: the sample format is AV_SAMPLE_FMT_S16P (planar, signed 16-bit) and the codec is MP3.
We based our streaming server on testOnDemandRTSPServer. We subclassed FramedSource (AudioFramedSource) and OnDemandServerMediaSubsession (AudioServerMediaSubsession). In the AudioServerMediaSubsession class we return an MPEG1or2AudioStreamFramer from createNewStreamSource() and an MPEG1or2AudioRTPSink from createNewRTPSink(); a stripped-down sketch of this wiring is at the end of this message.

To test this implementation we used FFPlay and VLC as clients. We connected to the audio stream, and the result is similar to what is described at http://www.live555.com/rtp-mp3/

Is there an example of how to use MP3ADU with a live, in-memory source? Or is the problem perhaps somewhere else in our source code?

Thanks,
Ignacio Barreto
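For reference, here is a stripped-down sketch of the subsession wiring described above. It is not our exact code: the AudioFramedSource::createNew() factory, its header name, the bitrate estimate, and the commented-out ADU lines are placeholders/guesses.

// Stripped-down sketch of our AudioServerMediaSubsession (placeholder names,
// not our exact code). "liveMedia.hh" pulls in the live555 classes used here.

#include "liveMedia.hh"
#include "AudioFramedSource.hh"  // our FramedSource subclass fed from the live MP3 buffer

class AudioServerMediaSubsession: public OnDemandServerMediaSubsession {
public:
  static AudioServerMediaSubsession* createNew(UsageEnvironment& env, Boolean reuseFirstSource) {
    return new AudioServerMediaSubsession(env, reuseFirstSource);
  }

protected:
  AudioServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
    : OnDemandServerMediaSubsession(env, reuseFirstSource) {}

  // Called per client session: build the chain live source -> framer
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 128; // kbps; placeholder estimate
    FramedSource* liveSource = AudioFramedSource::createNew(envir()); // placeholder factory
    return MPEG1or2AudioStreamFramer::createNew(envir(), liveSource);

    // ADU variant we are unsure about (guessed from MP3AudioFileServerMediaSubsession):
    //   return ADUFromMP3Source::createNew(envir(), liveSource);
    // (i.e. with no MPEG1or2AudioStreamFramer in between?)
  }

  // Called per client session: build the RTP sink that matches the source
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char /*rtpPayloadTypeIfDynamic*/,
                                    FramedSource* /*inputSource*/) {
    return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);

    // ADU variant we are unsure about:
    //   return MP3ADURTPSink::createNew(envir(), rtpGroupsock, 96); // 96 = dynamic payload type
  }
};

The commented-out lines show where we would guess an ADU-ified pipeline plugs in, based on reading MP3AudioFileServerMediaSubsession; please correct us if that is not the intended way to use it with a live source.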