From: Ross Finlayson <[EMAIL PROTECTED]>
To: LIVE555 Streaming Media - development & use <[EMAIL PROTECTED]>
Date: Wed, 19 Mar 2008 23:18:49 -0700
Subject: Re: [Live-devel] live 555 rtsp library for mplayer

>> When you stream audio over live555's RTSP plugin in mplayer, what
>> decides whether the audio data is big endian vs. little endian?
> 16-bit PCM audio data - when streamed using RTP using an
> IETF-standard RTP payload format - is *always* in Big Endian order
> (called "Network Byte Order" in the RFCs). That's the standard[*].

I am developing the server for an embedded device; I was hoping I could
avoid that, but alas! It's unfortunate that the mplayer code actually
supports a whole slew of combinations (any of the types specified in
the file ad_pcm.c, in the function "static int init(sh_audio_t *sh_audio)").
The code checks sh_audio->format to match a data format; for the data to
be 16-bit little endian, the format has to be one of these: 'sowt'
(0x74776F73), 0x0, 0x1, or 0xfffe. If I could specify a string that
would fall into one of these cases, I would have been in good shape.

>> I am developing a server and I am specifying the format as 97
>> (dynamic format) with rtpmap being: 96 L16/8000 for single-channel,
>> 16 bit/sample, 8 kHz sample rate audio. But mplayer somehow thinks
>> the audio data is in big endian format and attaches a BE -> LE
>> conversion filter

> It is correct; your server is not. If your server's PCM data is
> originally in Little Endian order, then you need to insert a LE->BE
> filter in your server. If you are using our libraries to build your
> server, then this involves just inserting an "EndianSwap16" filter
> object in front of your "SimpleRTPSink". (If you are streaming from
> WAV audio files, then you could just use the
> "WAVAudioFileServerMediaSubsession" class, which does this for you.
> Note also our "LIVE555 Media Server", which can stream ".wav" audio
> files.)

I am not using the live555 libraries because of this project's
light-weight requirement, so I need to come up with an optimized
version of the byte swapper. I wonder if there would be a way to read
the data backwards from the A/D converter that we use for audio.
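For reference, a standalone byte swapper of the kind discussed above can be
very small. This is only a minimal sketch (function names are illustrative,
not from live555 or mplayer) of converting a buffer of 16-bit PCM samples
from host little-endian to the network byte order that RTP's L16 payload
format requires:

```c
#include <stddef.h>
#include <stdint.h>

/* Swap the byte order of a single 16-bit sample. */
static uint16_t swap16(uint16_t v)
{
    return (uint16_t)((v << 8) | (v >> 8));
}

/* In-place LE->BE conversion of a buffer of 16-bit PCM samples,
 * analogous to what live555's "EndianSwap16" filter does before
 * the data reaches the RTP sink. */
static void pcm16_swap_endian(uint16_t *samples, size_t count)
{
    for (size_t i = 0; i < count; i++)
        samples[i] = swap16(samples[i]);
}
```

On most compilers a loop like this optimizes well (often to a byte-reverse
instruction), so an explicit hand-tuned version may not be necessary even on
an embedded target.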
> [*] The reason for this is that IETF protocol standards began in an
> era when most computers on the Internet were Big Endian computers
> like Sun workstations (which originally used the Motorola 68xxx
> architecture). Back then, computers that used the (Little Endian)
> Intel 8086 architecture were (generally speaking) too underpowered to
> be used as Internet nodes. If we had known back then that the x86
> architecture would come to dominate the industry, then perhaps things
> would have been done differently....

The standards definitely lag behind today's computing and multimedia
needs. I was looking at the RFC for media specification - the FourCC
and character-based stream identification all seems like a joke to me;
they need new revisions, like many other RFCs.

Ratin

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel