From an old “terminal dog”:

dscott% grep -R 10868 /System/Library/Frameworks/*Audio*
/System/Library/Frameworks/AudioUnit.framework/Headers/AUComponent.h:   kAudioUnitErr_FormatNotSupported                = -10868,


etc.
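For what it's worth, -10868 here usually means the format handed to the engine connection isn't one the underlying audio unit will accept: AVAudioEngine node connections generally want the "standard" deinterleaved Float32 format, not Int16 interleaved. A minimal sketch of the usual workaround, written against the same 2015-era Swift/AVFoundation API as the quoted code (untested against the poster's project; `fileURL`, `audioEngine`, and `audioFilePlayer` are the poster's variables):

```swift
import AVFoundation

// Read the file in the engine's preferred format (Float32, deinterleaved)
// instead of Int16 interleaved, which AUSetFormat rejects with -10868.
let audioFile = AVAudioFile(forReading: fileURL,
                            commonFormat: AVAudioCommonFormat.PCMFormatFloat32,
                            interleaved: false,
                            error: nil)
let buffer = AVAudioPCMBuffer(PCMFormat: audioFile.processingFormat,
                              frameCapacity: AVAudioFrameCount(audioFile.length))
audioFile.readIntoBuffer(buffer, error: nil)

// Connect using the buffer's own (supported) format.
audioEngine.connect(audioFilePlayer, to: audioEngine.mainMixerNode,
                    format: buffer.format)
```

You can still do your Int16 manipulation in a separate buffer and convert; it's the connection format the engine is strict about.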


> On Apr 14, 2015, at 11:26 AM, Steven Clark <[email protected]> wrote:
> 
> -10868 is kAudioUnitErr_FormatNotSupported
>  
> Found this in my notes, I don’t know if it came from documentation or 
> perusing Apple’s .h files.  The macerror command doesn’t seem to know about 
> it.
>  
> Steven J. Clark
> VGo Communications
>  
> From: [email protected] 
> [mailto:[email protected]] On 
> Behalf Of Waverly Edwards
> Sent: Monday, April 13, 2015 5:26 PM
> To: [email protected]
> Subject: avfoundation “error -10868”
>  
> I am in the process of converting an old existing project to use the 
> AVFoundation APIs.  I have two goals:
> 1)  Read the data into an Int16, interleaved-format buffer
> 2)  Manipulate the data in the buffer directly (not in real time)
>  
> In testing, I found that I am getting “error -10868”, but I haven’t been 
> able to figure out what is wrong.  The buffer is created and no error is 
> reported, yet I am unable to play the audio.  How do I overcome this 
> issue?
>  
> My second question is how to access the buffer directly, or more 
> specifically the stereo channels.  When I was using the Core Audio APIs 
> (pre-Swift) I understood this, but now I’m lost.  In Core Audio I 
> previously used an interleaved buffer.  Would someone be kind enough to 
> show me how to access the channels in both interleaved and 
> non-interleaved formats?
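On the channel-access question: with an AVAudioPCMBuffer, a deinterleaved buffer gives you one pointer per channel (`floatChannelData[ch][frame]`), while an interleaved buffer packs every channel into the first pointer, so you index by frame times channel count. A sketch of the addressing (the helper name is mine, not an API):

```swift
// Channel addressing for AVAudioPCMBuffer (sketch, names in comments only):
//   deinterleaved: buffer.floatChannelData[ch][frame]   -- one plane per channel
//   interleaved:   buffer.int16ChannelData[0][frame * channelCount + ch]
//
// The interleaved layout is L R L R ..., so frame n, channel c lives at:
func interleavedIndex(frame: Int, channel: Int, channelCount: Int) -> Int {
    return frame * channelCount + channel
}
```

So for a stereo Int16 interleaved buffer, the right sample of frame 10 is element 21 of the single data plane; in the deinterleaved case it would simply be element 10 of the second plane.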
>  
>  
> My test input file is an Apple Lossless Compression file if that matters.
>  
> Thank you,
>  
>  
> W.
>  
>  
> /*
> Test result
>  
> bufferExists = true
> readErr = nil
> audioFileBuffer.format = <AVAudioFormat 0x6080008968a0:  2 ch,  44100 Hz, 
> Int16, inter>
> File URL:               
> Optional("file:///Users/wedwards/Desktop/tempAudioOutput.m4a 
> <file://///Users/wedwards/Desktop/tempAudioOutput.m4a>")
> File format:            <AVAudioFormat 0x608000892ed0:  2 ch,  44100 Hz, 
> 'alac' (0x00000001) from 16-bit source, 4096 frames/packet>
> File format descr:      <AVAudioFormat 0x608000892ed0:  2 ch,  44100 Hz, 
> 'alac' (0x00000001) from 16-bit source, 4096 frames/packet>
> Processing format:      <AVAudioFormat 0x60800089dce0:  2 ch,  44100 Hz, 
> Int16, inter>
> Length:                 184320 frames, 4.17959183673469 seconds
> 2015-04-13 15:47:36.911 SWIFT - Manipulate Audio Data[2993:250205] 
> 15:47:36.911 ERROR:     AVAudioNode.mm:521: AUSetFormat: error -10868
> 2015-04-13 15:47:36.912 SWIFT - Manipulate Audio Data[2993:250205] error 
> -10868
> */​
>  
>  
> import Foundation
> import AVFoundation
> import Cocoa
>  
> var audioEngine    : AVAudioEngine     = AVAudioEngine()
> var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()
>  
>  
> func displayAudioFormatInfo( fileURL: NSURL, audioFile: AVAudioFile)
> {
>     let fileLength      = AVAudioFramePosition(audioFile.length)
>     let lengthInSeconds = Float64(fileLength) / audioFile.fileFormat.sampleRate
>     
>     println("File URL:               \(fileURL.absoluteString)")
>     println("File format:            \(audioFile.fileFormat)")
>     println("File format descr:      \(audioFile.fileFormat.description)")
>     println("Processing format:      \(audioFile.processingFormat.description)")
>     println("Length:                 \(fileLength) frames, \(lengthInSeconds) seconds")
> }
>  
>  
>  
> func readAudioDataTEST( fileURL: NSURL ) 
> {
>     var readErr         : NSError? = nil
>     let isInterleaved   = true
>     let format          = AVAudioCommonFormat.PCMFormatInt16 // the desired working format
>     let audioFile       = AVAudioFile(forReading: fileURL, commonFormat: format, interleaved: isInterleaved, error: nil)
>     let audioFormat     = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatInt16, sampleRate: 44100.0, channels: AVAudioChannelCount(2), interleaved: true)
>     let audioFrameCount = AVAudioFrameCount(audioFile.length)
>     let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
>     let bufferExists    = audioFile.readIntoBuffer(audioFileBuffer, error: &readErr)
>     
>     println("bufferExists = \(bufferExists)")
>     println("readErr = \(readErr)")
>     println("audioFileBuffer.format = \(audioFileBuffer.format)")
>     displayAudioFormatInfo( fileURL, audioFile )
>     
>     let mainMixer = audioEngine.mainMixerNode
>     audioEngine.attachNode(audioFilePlayer)
>     audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
>     
>     audioEngine.startAndReturnError(nil)
>     
>     audioFilePlayer.play()
>     audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: nil, completionHandler: nil)
> }
>  
>  
>  
> func getUrlFromNavDialog() -> NSURL? {
>     let openPanel = NSOpenPanel()
>     openPanel.allowsMultipleSelection = false
>     openPanel.canChooseDirectories = false
>     openPanel.canCreateDirectories = false
>     openPanel.canChooseFiles = true
>     openPanel.runModal()
>     return openPanel.URL
> }
>  
>  
> if let fileURL = getUrlFromNavDialog() { readAudioDataTEST(fileURL) }
>  
>  
> _______________________________________________
> Do not post admin requests to the list. They will be ignored.
> Coreaudio-api mailing list      ([email protected])
> Help/Unsubscribe/Update your Subscription:
> https://lists.apple.com/mailman/options/coreaudio-api/douglas_scott%40apple.com
> 
> This email sent to [email protected]
