> If I understand your use-case correctly, you have written
> software that uses libavcodec internally to decode wma.
> Your software works fine with older versions of libavcodec
> (FFmpeg 0.11) but not with newer versions of libavcodec
> (although ffmpeg - the command line application - still
> works fine; its console output looks slightly different
> though).
Exactly!

> The reason is that the wma decoder has changed. wma
> internally uses floats; in the past (until FFmpeg 0.11)
> these floats were converted to s16 (16-bit integers) inside
> the decoder. For clarity reasons, it was decided that it
> makes no sense to convert these floats into ints inside
> the decoder.
> The decoder now outputs (planar) floats, the native wma
> format. Assuming you need ints (s16) in your application,
> you now have to use the aconvert filter (or libswresample
> directly) to convert from float to s16.

Ok, I will try that.

> Please understand that it is not a solution for you to use
> 0.11 forever, because new features are not backported to
> old versions of FFmpeg.

Of course, that's why I'm writing here =)

> I hope that clears it up, Carl Eugen

Now everything is clear. Thank you very much, Carl!
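[Editor's note: for readers hitting the same change, here is a minimal
sketch of the libswresample route described above. It assumes a decoded
AVFrame "frame" and the wma decoder's AVCodecContext "dec_ctx"; those
names, and the helper functions, are placeholders for whatever your
code uses. The API shown is the one current around FFmpeg 1.0.]

#include <libavcodec/avcodec.h>
#include <libswresample/swresample.h>
#include <libavutil/samplefmt.h>

/* Create the converter once, after avcodec_open2() has filled in
 * dec_ctx->sample_fmt (AV_SAMPLE_FMT_FLTP for the new wma decoder).
 * Only the sample format changes; channel layout and sample rate
 * stay the same.  If dec_ctx->channel_layout is 0, substitute a
 * default layout for dec_ctx->channels before calling this. */
static SwrContext *make_s16_converter(AVCodecContext *dec_ctx)
{
    SwrContext *swr = swr_alloc_set_opts(NULL,
            dec_ctx->channel_layout, AV_SAMPLE_FMT_S16,   dec_ctx->sample_rate,
            dec_ctx->channel_layout, dec_ctx->sample_fmt, dec_ctx->sample_rate,
            0, NULL);
    if (swr && swr_init(swr) < 0) {
        swr_free(&swr);
        return NULL;
    }
    return swr;
}

/* Convert one decoded frame.  On success *out holds interleaved s16
 * samples (free with av_freep()) and the return value is the number
 * of samples per channel that were written. */
static int frame_to_s16(SwrContext *swr, AVCodecContext *dec_ctx,
                        AVFrame *frame, uint8_t **out)
{
    if (av_samples_alloc(out, NULL, dec_ctx->channels,
                         frame->nb_samples, AV_SAMPLE_FMT_S16, 0) < 0)
        return -1;
    return swr_convert(swr, out, frame->nb_samples,
                       (const uint8_t **)frame->extended_data,
                       frame->nb_samples);
}

[On the ffmpeg command line of that era, the equivalent was the
aconvert filter (e.g. -af aconvert=s16); aconvert was later removed
in favour of aformat/aresample.]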
