> When you stream audio over LIVE555's RTSP plugin in MPlayer, what decides
> whether the audio data is big endian vs. little endian?
16-bit PCM audio data - when streamed over RTP using an IETF-standard RTP payload format - is *always* in big-endian order (called "network byte order" in the RFCs). That's the standard[*].

> I am developing a server and I am specifying the format as 97 (dynamic
> format) with rtpmap being: 96 L16/8000 for single channel, 16 bit/sample,
> 8 kHz sample rate audio. But mplayer somehow thinks audio data is in big
> endian format and attaches a BE -> LE conversion filter

MPlayer is correct; your server is not. If your server's PCM data is originally in little-endian order, then you need to insert a LE->BE filter in your server. If you are using our libraries to build your server, this involves just inserting an "EndianSwap16" filter object in front of your "SimpleRTPSink". (If you are streaming from WAV audio files, you could instead use the "WAVAudioFileServerMediaSubsession" class, which does this for you. Note also our "LIVE555 Media Server", which can stream ".wav" audio files.)

[*] The reason for this is that IETF protocol standards began in an era when most computers on the Internet were big-endian computers like Sun workstations (which originally used the Motorola 68xxx architecture). Back then, computers that used the (little-endian) Intel 8086 architecture were (generally speaking) too underpowered to be used as Internet nodes. If we had known back then that the x86 architecture would come to dominate the industry, then perhaps things would have been done differently....

--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel