Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Posts under Audio subtopic

Each entry below lists the post, followed by its replies, boosts, views, and latest activity.

Unstable Playlist.Entry.id causes crashes when removing duplicates
When multiple identical songs are added to a playlist, Playlist.Entry.id uses a suffix-based identifier (e.g. songID_0, songID_1, etc.). Removing one entry causes the others to shift, changing their .id values. This leads to diffing errors and collection-view crashes in SwiftUI or UIKit when entries are updated.

Steps to reproduce:
1. Add the same song to a playlist multiple times.
2. Observe the .id.rawValue of the entries (e.g. i.SONGID_0, i.SONGID_1).
3. Remove one entry.
4. Fetch the playlist again and note that the other IDs have shifted.

FB18879062
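Until the entry identifiers are stable, one possible mitigation is to stop using Playlist.Entry.id for diffing altogether. A minimal sketch (assuming MusicKit's Playlist.Entry and its title property; the wrapper type and view are hypothetical, and the wrappers must be reused across refreshes rather than regenerated on every fetch):

```swift
import SwiftUI
import MusicKit

// Wrap each entry with a locally generated, position-independent ID so
// SwiftUI diffing no longer depends on the unstable suffix-based entry ID.
struct StableEntry: Identifiable {
    let id = UUID()              // stable for the lifetime of this wrapper
    let entry: Playlist.Entry
}

struct PlaylistEntriesView: View {
    let entries: [StableEntry]   // must persist across playlist refreshes

    var body: some View {
        List(entries) { item in
            Text(item.entry.title)
        }
    }
}
```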
Replies: 0 · Boosts: 0 · Views: 538 · Activity: Jul ’25
GarageBand displaying error 100001 when loading up some AU plugins
I recently got some plugins from Universal Audio and have licensed them properly through both UA and iLok manager. Whenever I try to load the plugins (specifically from UA) in GarageBand, it first reports "NSCreateObjectFileImageFromMemory-p47UEwps" because the developer cannot be verified. After clicking either 'Show in Finder' or 'OK', it opens the plugin without its GUI, showing it as not licensed (even though it is), and displays error code 100001. I have tried some basic troubleshooting, such as restarting the DAW and my computer and reinstalling/relicensing the software. I don't know if the macOS version has anything to do with it, but for some reason I just can't get it to work.
Replies: 1 · Boosts: 0 · Views: 385 · Activity: Jan ’25
Microphone recording interrupted when phone is ringing
I'm developing an iOS app that requires continuous audio recording. Currently, when a phone call comes in, the AVAudioSession is interrupted and recording stops completely during the ringing phase. While I understand recording should stop if the call is answered, my app needs to continue recording while the phone is merely ringing. I've observed that Apple's Voice Memos app maintains recording during incoming call rings. This indicates the hardware and iOS are capable of supporting this functionality.

Request: Please advise on any available AVAudioSession configurations or APIs that would allow my app to continue recording during an incoming call ring, and only stop recording if/when the call is actually answered.

Impact: This interruption significantly impacts the user experience and core functionality of my app. Workarounds like asking users to enable Airplane Mode are impractical and create a poor user experience.

Questions:
1. Is there an approved way to maintain microphone access during call rings?
2. If not currently possible, could this capability be considered for addition to a future iOS SDK?
3. Are there any interim solutions or best practices Apple recommends for this use case?

Thank you for your help.

Support information: Did someone from Apple ask you to submit a code-level support request? No. Do you have a focused test project that demonstrates your issue? Yes, I have a focused test project to submit with my request. What code-level support issue are you having? Problems with an Apple framework API in my app.
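For reference, the standard interruption handling this post builds on looks like the sketch below (plain AVFoundation API; note that the began case does not distinguish a ringing call from an answered one, which is exactly the limitation being reported):

```swift
import AVFoundation

final class RecorderInterruptionObserver {
    private var token: NSObjectProtocol?

    init(restartRecording: @escaping () -> Void) {
        token = NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { note in
            guard let info = note.userInfo,
                  let raw = info[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }

            switch type {
            case .began:
                // The system has already stopped the recording here; there is
                // no public signal distinguishing "ringing" from "answered".
                break
            case .ended:
                let optRaw = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
                if AVAudioSession.InterruptionOptions(rawValue: optRaw).contains(.shouldResume) {
                    try? AVAudioSession.sharedInstance().setActive(true)
                    restartRecording()
                }
            @unknown default:
                break
            }
        }
    }
}
```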
Replies: 2 · Boosts: 0 · Views: 114 · Activity: Jun ’25
Notification interruptions
My app Balletrax is a music player for people to use while they teach ballet. You used to be able to silence notifications during use, but now the customer seems to have to know how to use Focus mode, remember to turn it on and off, and check which notifications they do and don't want. Is there no way to silence all notifications while the app is in use?
Replies: 0 · Boosts: 0 · Views: 62 · Activity: Apr ’25
Problems recording audio on Tahoe 26.0 (Intel only)
I have some tried-and-tested code that records and plays back audio via AUHAL, which breaks on Tahoe on Intel. The same code works fine on Sequoia, and also works on Tahoe on Apple Silicon. To start with something simple, the following code to request access to the microphone doesn't work as it should:

```objc
bool RequestMicrophoneAccess ()
{
    __block AVAuthorizationStatus status =
        [AVCaptureDevice authorizationStatusForMediaType: AVMediaTypeAudio];

    if (status == AVAuthorizationStatusAuthorized)
        return true;

    __block bool done = false;

    [AVCaptureDevice requestAccessForMediaType: AVMediaTypeAudio
                             completionHandler: ^ (BOOL granted)
    {
        status = (granted) ? AVAuthorizationStatusAuthorized : AVAuthorizationStatusDenied;
        done = true;
    }];

    // Pump the run loop until the completion handler has run.
    while (!done)
        CFRunLoopRunInMode (kCFRunLoopDefaultMode, 2.0, true);

    return status == AVAuthorizationStatusAuthorized;
}
```

On Tahoe on Intel, the code runs to completion, but granted is always returned as NO. Tellingly, the popup asking the user to grant microphone access is never displayed, and the app never appears in the Privacy pane. On Apple Silicon, everything works fine. There are some other problems, but I'm hoping they have a common underlying cause and that the Apple engineers can figure out what's wrong from the information in this post. I'd be happy to test any potential fix. Thanks.
Replies: 2 · Boosts: 0 · Views: 436 · Activity: Oct ’25
Making sense of AVAudioSession interruption notifications
I have an app under development - demo here: https://youtu.be/VbAfUk_eYl0?si=s6EDBx-4G6P_QbZO - which is sort of an audio player for airdropped files - something useful to musicians who dump work in progress to their phone, make notes, revise and update. I've been testing my handling of audio session interruption notifications, but there seems to be a lot of inconsistency in how, when and why iOS delivers them, and I'm wondering if there is some rhyme or reason to it that I'm just not detecting.

For example, I am playing a song in my app. I switch to Apple Music and start playing a song there. My app gets an interruption-began notification - this is consistent. I switch back to my app, and about half the time I will get an interruption-ended notification (often coupled with a blast of the tail of whatever audio buffer was partially played when the interruption started, even though the engine was stopped - and followed by a call to my AVAudioPlayerNodeCompletionCallback - is there some way to avoid this?). Half the time I don't get an interruption-ended notification; my app can (as expected) end the interruption by activating the AVAudioSession and playing something.

I have not been able to determine any pattern to this behavior, other than that if my app started playing using AVAudioPlayerNode.scheduleSegment rather than scheduleFile, I think the notification will be consistently delivered on app activation rather than when I activate the session programmatically. I would like my app to behave deterministically, and would appreciate any help in deciphering what causes the inconsistent notification behavior from iOS.
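On the "blast of the tail of the old buffer" symptom specifically, a hedged sketch of one mitigation (not a confirmed fix): stop the player node rather than only the engine when the interruption begins, since AVAudioPlayerNode.stop() discards its scheduled buffers and reset() flushes whatever is still sitting in the engine's render chain:

```swift
import AVFoundation

func handleInterruptionBegan(engine: AVAudioEngine, player: AVAudioPlayerNode) {
    player.stop()   // discards scheduled buffers/segments (unlike pause())
    engine.stop()
    engine.reset()  // flushes any partially rendered audio in the graph
}
```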
Replies: 2 · Boosts: 0 · Views: 339 · Activity: Mar ’25
Number of songs in the Apple Music Feed
Hello, I'm evaluating the Apple Music Feed dataset and I noticed that the total number of songs available in the feed seems too small. As of today, the number of objects returned in each feed is:

51,198,712 albums
23,093,698 artists
173,235,315 songs

This gives an average of 3.38 songs per album, which is quite low. Also, iterating over the data, I see that there are albums referencing songs that don't exist in the songs feed. I would like to know: Is the feed data incomplete? If so, in what situations may an object be missing from the feed? Thank you in advance!
Replies: 0 · Boosts: 0 · Views: 252 · Activity: Aug ’25
Memory Leak in AVAudioPlayer in Simulator only
I have a memory leak when using AVAudioPlayer. I managed to narrow the issue down to a very simple app, whose code I paste in below. The memory leak starts immediately when I start playing sound, but only in the Simulator; on a real iPhone there is no memory leak.

```swift
import SwiftUI
import AVFoundation

struct ContentView_Audio: View {
    var sound: AVAudioPlayer?

    init() {
        guard let path = Bundle.main.path(forResource: "cd201", ofType: "mp3") else { return }
        let url = URL(fileURLWithPath: path)
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers])
        } catch {
            return
        }
        do {
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            return
        }
        do {
            sound = try AVAudioPlayer(contentsOf: url)
        } catch {
            return
        }
    }

    var body: some View {
        HStack {
            Button {
                playSound()
            } label: {
                ZStack {
                    Circle()
                        .fill(.mint.opacity(0.3))
                        .frame(width: 44, height: 44)
                        .shadow(radius: 8)
                    Image(systemName: "play.fill")
                        .resizable()
                        .frame(width: 20, height: 20)
                }
            }
            .padding()

            Button {
                stopSound()
            } label: {
                ZStack {
                    Circle()
                        .fill(.mint.opacity(0.3))
                        .frame(width: 44, height: 44)
                        .shadow(radius: 8)
                    Image(systemName: "stop.fill")
                        .resizable()
                        .frame(width: 20, height: 20)
                }
            }
            .padding()
        }
    }

    private func playSound() {
        guard sound != nil else { return }
        sound?.volume = 1
        // sound?.numberOfLoops = -1
        sound?.play()
    }

    func stopSound() {
        sound?.stop()
    }
}
```
Replies: 2 · Boosts: 0 · Views: 99 · Activity: Apr ’25
MusicKit - Not showing as a capability in Xcode
A bit of a novice to app development here, but I have a paid developer account, and I have registered the identifier for MusicKit on the developer website (using the bundle identifier I've selected in Xcode), yet the option to add MusicKit as a capability is not available in Xcode. I've manually updated the certificates, closed Xcode and reopened it, started a new project, and tried with a different demo project. Apologies if I am missing something obvious, but could someone help me get this capability added?
Replies: 0 · Boosts: 0 · Views: 117 · Activity: Aug ’25
changePlaybackRateCommand does not work on iOS.
Hi. I work on an audio app for iOS which successfully uses MPRemoteCommandCenter for commands like next, back, skip forward, skip backward, etc. I am trying to implement playback-rate controls in my app (so that users can change the playback speed of audio to 0.5x or 2x, for example). While the above commands work, changePlaybackRateCommand does not seem to. I have enabled the command, given it a target/handler, and set supported rates. With the other commands, this caused the UI to change on the lock screen, in command center, etc., by adding the control for the command (a next button for the next command, for example). However, it does not seem to do anything for the playback-rate command. I can implement my own "rate button" UI and rate-change handling, but I'm wondering if this is a known bug at Apple. Looking online, it seems other people face the same issue and haven't been able to get this command to work. Why is this API provided if it doesn't seem to do anything? Is there something I'm missing? Kind regards.
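For comparison, the registration the post describes looks roughly like this (standard MediaPlayer API; the AVPlayer-based handler is an assumption about the app's player):

```swift
import AVFoundation
import MediaPlayer

func configurePlaybackRateCommand(for player: AVPlayer) {
    let command = MPRemoteCommandCenter.shared().changePlaybackRateCommand
    command.isEnabled = true
    command.supportedPlaybackRates = [0.5, 1.0, 1.5, 2.0]
    _ = command.addTarget { event in
        guard let event = event as? MPChangePlaybackRateCommandEvent else {
            return .commandFailed
        }
        player.rate = event.playbackRate
        // Keep now-playing info in sync so system UI reflects the new rate.
        MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPNowPlayingInfoPropertyPlaybackRate] =
            Double(event.playbackRate)
        return .success
    }
}
```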
Replies: 0 · Boosts: 0 · Views: 340 · Activity: Jan ’25
AVMIDIPlayer.play() function crashes on iOS 18
It only occurs on iOS 18+. Backtrace attached below.

```
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: NoteKeys [24384]
Triggered by Thread: 0

Last Exception Backtrace:
0   CoreFoundation     0x1a2d4c7cc __exceptionPreprocess + 164 (NSException.m:249)
1   libobjc.A.dylib    0x1a001f2e4 objc_exception_throw + 88 (objc-exception.mm:356)
2   CoreFoundation     0x1a2e47748 +[NSException raise:format:] + 128 (NSException.m:0)
3   AVFAudio           0x1bd41f4c8 -[AVMIDIPlayer play:] + 300 (AVMIDIPlayer.mm:145)
4   NoteKeys           0x1023c0670 SoundGenerator.playData() + 20 (SoundGenerator.swift:170)
5   NoteKeys           0x1023c0670 EditViewController.playBtnTapped(startIndex:) + 940 (EditViewController.swift:2034)
6   NoteKeys           0x1024497fc specialized Keyboard.playBtnTapped(sender:) + 1904 (Keyboard.swift:1249)
7   NoteKeys           0x10244631c Keyboard.playBtnTapped(sender:) + 4 (<compiler-generated>:0)
8   NoteKeys           0x10244631c @objc Keyboard.playBtnTapped(sender:) + 48
9   UIKitCore          0x1a58739cc -[UIApplication sendAction:to:from:forEvent:] + 100 (UIApplication.m:5816)
10  UIKitCore          0x1a58738a4 -[UIControl sendAction:to:forEvent:] + 112 (UIControl.m:942)
11  UIKitCore          0x1a58736f4 -[UIControl _sendActionsForEvents:withEvent:] + 324 (UIControl.m:1013)
12  UIKitCore          0x1a5fe8d8c -[UIButton _sendActionsForEvents:withEvent:] + 124 (UIButton.m:4198)
13  UIKitCore          0x1a5fea5a0 -[UIControl touchesEnded:withEvent:] + 400 (UIControl.m:692)
14  UIKitCore          0x1a57bb9ac -[UIWindow _sendTouchesForEvent:] + 852 (UIWindow.m:3318)
15  UIKitCore          0x1a57bb3d8 -[UIWindow sendEvent:] + 2964 (UIWindow.m:3641)
16  UIKitCore          0x1a564fb70 -[UIApplication sendEvent:] + 376 (UIApplication.m:12972)
17  UIKitCore          0x1a565009c __dispatchPreprocessedEventFromEventQueue + 1048 (UIEventDispatcher.m:2686)
18  UIKitCore          0x1a5659f3c __processEventQueue + 5696 (UIEventDispatcher.m:3044)
19  UIKitCore          0x1a5552c60 updateCycleEntry + 160 (UIEventDispatcher.m:133)
20  UIKitCore          0x1a55509d8 _UIUpdateSequenceRun + 84 (_UIUpdateSequence.mm:136)
21  UIKitCore          0x1a5550628 schedulerStepScheduledMainSection + 172 (_UIUpdateScheduler.m:1171)
22  UIKitCore          0x1a555159c runloopSourceCallback + 92 (_UIUpdateScheduler.m:1334)
23  CoreFoundation     0x1a2d20328 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28 (CFRunLoop.c:1970)
24  CoreFoundation     0x1a2d202bc __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2014)
25  CoreFoundation     0x1a2d1ddc0 __CFRunLoopDoSources0 + 244 (CFRunLoop.c:2051)
26  CoreFoundation     0x1a2d1cfbc __CFRunLoopRun + 840 (CFRunLoop.c:2969)
27  CoreFoundation     0x1a2d1c830 CFRunLoopRunSpecific + 588 (CFRunLoop.c:3434)
28  GraphicsServices   0x1eecfc1c4 GSEventRunModal + 164 (GSEvent.c:2196)
29  UIKitCore          0x1a5882eb0 -[UIApplication _run] + 816 (UIApplication.m:3844)
30  UIKitCore          0x1a59315b4 UIApplicationMain + 340 (UIApplication.m:5496)
31  NoteKeys           0x10254bc10 main + 68 (AppDelegate.swift:15)
32  dyld               0x1c870aec8 start + 2724 (dyldMain.cpp:1334)
```

Thanks very much for any help :)
Replies: 1 · Boosts: 0 · Views: 295 · Activity: Feb ’25
How to synchronize the clock sources of two audio devices
I created a virtual audio device to capture system audio with a sample rate of 44.1 kHz. After capturing the audio, I forward it to the hardware sound card using AVAudioEngine, also with a sample rate of 44.1 kHz. However, due to the clock sources being unsynchronized, problems occur after a period of playback. How can I retrieve the clock source of the hardware device and set it for the virtual device?
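On the retrieval half of the question, the HAL exposes a clock-domain property: devices reporting the same non-zero domain are driven by the same clock. A sketch of reading it (Core Audio; publishing the same domain from the virtual device would then happen in its AudioServerPlugIn property handling, which is not shown here):

```swift
import CoreAudio

// Returns the clock domain of a device, or nil on error. Two devices that
// report the same non-zero value share a clock source.
func clockDomain(of deviceID: AudioObjectID) -> UInt32? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyClockDomain,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var domain: UInt32 = 0
    var size = UInt32(MemoryLayout<UInt32>.size)
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &domain)
    return status == noErr ? domain : nil
}
```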
Replies: 2 · Boosts: 0 · Views: 183 · Activity: May ’25
AVPlayerView with .inline controlsStyle macOS 26
My audio app shows a control bar at the bottom of the window. The controls show nicely, but there is a black "slab" appearing behind the inline controls, the same size as the playerView. Setting the player view's background color does nothing:

```swift
playerView.wantsLayer = true
playerView.layer?.backgroundColor = NSColor.clear.cgColor
```

How can I clear the background? If I use the .floating controlsStyle, I don't get the background "slab".
Replies: 0 · Boosts: 0 · Views: 146 · Activity: Oct ’25
Issue using Siphon Tap on input AudioQueue
Hi all, I've developed an audio DSP application in C++ using AudioToolbox and CoreAudio on macOS 14.4.1 with Xcode 15. I use an AudioQueue for input and another for output. This works great. I'm now adding real-time audio analysis, e.g. spectral analysis. I want this to run independently of my audio processing so it cannot interfere with audio playback. Taps on AudioQueues seem to be a good way of doing this.

Since the analytics won't modify the audio data, I am using a siphon tap by setting the AudioQueueProcessingTapFlags to kAudioQueueProcessingTap_PreEffects | kAudioQueueProcessingTap_Siphon. This works fine on my output queue. However, on my input queue the tap callback is called once and then an EXC_BAD_ACCESS occurs (screenshot below). NB: I believe that a callback should only call AudioQueueProcessingTapGetSourceAudio when not using a siphon, so I don't call it.

Relevant code:

```c
AudioQueueProcessingTapCallback tap_callback) {
    // Makes an audio tap for a queue
    void * tap_data_ptr = NULL;
    AudioQueueProcessingTapFlags tap_flags =
        kAudioQueueProcessingTap_PostEffects | kAudioQueueProcessingTap_Siphon;
    uint32_t max_frames = 0;
    AudioStreamBasicDescription asbd;
    AudioQueueProcessingTapRef tap_ref;

    OSStatus status = AudioQueueProcessingTapNew(queue_ref, tap_callback, tap_data_ptr,
                                                 tap_flags, &max_frames, &asbd, &tap_ref);
    if (status != noErr)
        printf("Error while making Tap\n");
    else
        printf("Successfully made tap\n");
}

void tapper(void * tap_data, AudioQueueProcessingTapRef tap_ref,
            uint32_t number_of_frames_in, AudioTimeStamp * ts_ptr,
            AudioQueueProcessingTapFlags * tap_flags_ptr,
            uint32_t * number_of_frames_out_ptr, AudioBufferList * buf_list) {
    // Callback function for audio queue tap
    printf("Tap callback");
}
```

Image of exception stack provided by Xcode: [screenshot attachment: Screenshot 2025-06-14 at 1.29.14 PM.png]

What have I missed? Appreciate any help you learned folks may be able to provide. Best, Geoff.
Replies: 1 · Boosts: 0 · Views: 91 · Activity: Jun ’25
Audio Volume increases by itself
I am using an AVAudioPlayer to play a "tick" sound once per second in a SwiftUI app. When running the app on an iPhone 16 (18.2.1), the tick sounds increase in volume after a few seconds. This does not happen in the Simulator or on an iPhone SE 2020 (18.1.1).
Replies: 2 · Boosts: 0 · Views: 277 · Activity: Jan ’25
Access transport in Logic Pro
Hi, I need to read whether the transport is playing or stopped, but my current method, which works for VST, does not work for AU. Is there an LPX resource available for developers anywhere?

```cpp
if (auto* playHead = processor->getPlayHead()) {
    juce::AudioPlayHead::CurrentPositionInfo posInfo;
    if (playHead->getCurrentPosition(posInfo)) {
        bool isCurrentlyPlaying = posInfo.isPlaying;
        if (isCurrentlyPlaying != wasTransportPlaying) {
            wasTransportPlaying = isCurrentlyPlaying;
            if (isCurrentlyPlaying)
                startAllTimers();
            else
                stopAllTimers();
        }
    }
}
```

Thanks :)
Replies: 0 · Boosts: 0 · Views: 259 · Activity: Mar ’25
How to safely switch between mic configurations on iOS?
I have an iPadOS M-processor application with two different running configurations. In config1, the shared AVAudioSession is configured for .videoChat mode using the built-in microphone, and the input/output nodes of the AVAudioEngine are configured with voice processing enabled; the built-in mic is formatted for 1 channel at 48 kHz. In config2, the shared AVAudioSession is configured for .measurement mode using an external USB microphone, and the input/output nodes of the AVAudioEngine are configured with voice processing disabled; the external mic is formatted for 2 channels at 44.1 kHz.

I've written a configuration manager designed to safely switch between these two configurations. It works by stopping the AVAudioEngine and detaching all but the input and output nodes, updating the shared audio session for the desired mic and sample rates, and setting the appropriate state for voice processing to either true or false as required by the configuration. Finally, the new audio graph is constructed by attaching the appropriate nodes, connecting them, and restarting the AVAudioEngine.

I'm experiencing what I believe is a race condition between switching voice processing on or off and then trying to rebuild and start the new audio graph. Even though notifications, which are dumped to the console, indicate that my requested input and sample-rate settings are in place, I crash when trying to start the audio engine because the sample rate is wrong. Investigating further, it looks like the switch from remote I/O to voice-processing I/O, or vice versa, has not yet actually completed. I introduced a 100 ms delay and that seems to help, but this is obviously not a reliable way to build software that must work consistently.

How can I make sure that these apparently asynchronous configuration changes to the shared audio session and the input/output nodes have completed before I go on? I tried using route-change notifications from the shared AVAudioSession, but these lie: they say my preferred mic input and sample-rate setting are in place, yet when I dump the AVAudioEngine graph to the debugger console, I still see the wrong sample rate assigned to the input/output nodes. These are also the wrong AU nodes; that is, VPIO is still in place when RIO should be, or vice versa. How can I make the switch reliable without arbitrary time delays? Is my configuration manager approach appropriate (question for Apple engineers)?
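One direction worth testing (a sketch under the assumption that the RIO/VPIO swap posts an engine configuration change, which should be verified): gate the rebuild on .AVAudioEngineConfigurationChange instead of a fixed delay, so the graph is reconstructed only after the engine reports that its I/O configuration actually changed:

```swift
import AVFoundation

final class ConfigurationSwitcher {
    private let engine = AVAudioEngine()
    private var token: NSObjectProtocol?

    func enableVoiceProcessing(thenRebuild rebuild: @escaping (AVAudioEngine) -> Void) throws {
        engine.stop()

        token = NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: engine,
            queue: .main
        ) { [weak self] _ in
            guard let self, let token = self.token else { return }
            NotificationCenter.default.removeObserver(token)
            self.token = nil
            rebuild(self.engine)   // attach nodes, connect, then engine.start()
        }

        // The toggle that swaps remote I/O <-> voice-processing I/O.
        try engine.inputNode.setVoiceProcessingEnabled(true)
    }
}
```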
Replies: 1 · Boosts: 0 · Views: 157 · Activity: 3w
MusicKit: Best way to check if all tracks of albums are added to library.
I prefer to use the album fetched from the library instead of the catalog, since this is faster. If doing so, how can I check whether all tracks of an album have been added to the library? In this case I'd like to fetch the catalog version, or throw an error (for example when offline). Using .with(.tracks) on the library album fetches the tracks added to the library. The trackCount property refers to the tracks that can be fetched from the library, and the isComplete property is always nil when fetching from the library.

One possible way is checking the trackNumber and discCount properties. However, this only detects that not all tracks of an album are in the library when there is a song missing ahead of one that is present. I'd like to be able to handle this edge case as well. Is there currently a way to do this? I'd prefer not to rely on the Apple Music catalog for this, since it is supposed to work offline as well. Fetching and storing all trackIDs while connected and later comparing against these would work, but this would potentially mean storing tens of thousands of track IDs. Thank you.
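A sketch of the heuristic described in the post, with the catalog fallback (MusicKit; AlbumTracksError and resolveTracks are hypothetical names, and mapping a library album ID to its catalog equivalent may need an extra step not shown here):

```swift
import MusicKit

enum AlbumTracksError: Error { case unresolvable }

// Treat the library copy as complete when its track count reaches the highest
// trackNumber present; otherwise fall back to the catalog (throws offline).
// As the post notes, this misses albums whose trailing tracks are absent.
func resolveTracks(for libraryAlbum: Album) async throws -> MusicItemCollection<Track> {
    let detailed = try await libraryAlbum.with(.tracks)
    guard let libraryTracks = detailed.tracks else { throw AlbumTracksError.unresolvable }

    let highestTrackNumber = libraryTracks.compactMap { track -> Int? in
        if case .song(let song) = track { return song.trackNumber }
        return nil
    }.max() ?? 0

    if highestTrackNumber > 0, libraryTracks.count >= highestTrackNumber {
        return libraryTracks
    }

    var request = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: libraryAlbum.id)
    request.properties = [.tracks]
    let response = try await request.response()
    guard let tracks = response.items.first?.tracks else { throw AlbumTracksError.unresolvable }
    return tracks
}
```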
Replies: 0 · Boosts: 0 · Views: 75 · Activity: Mar ’25