Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.


Posts under the Audio subtopic

Post · Replies · Boosts · Views · Activity

AudioHardwareError: No Access to Int32 error constants
I am unable to access the underlying Int32 error code from the errors that Core Audio throws as the Swift type AudioHardwareError. This is critical: there is no way to inspect the error code, or even to construct an AudioHardwareError to test error handling.

do {
    _ = try AudioHardwareDevice(id: 0).streams // will throw
} catch {
    if let error = error as? AudioHardwareError { // cast to AudioHardwareError
        print(error) // prints the error code but not the errorDescription
    }
}

How can I reliably get the error's Int32, or create an AudioHardwareError from an error constant? Without knowing what the error is, there is no way for me to handle these errors in code or run tests. On top of that, by default the error's localizedDescription does not contain the errorDescription unless I extend AudioHardwareError with CustomStringConvertible:

extension AudioHardwareError: @retroactive CustomStringConvertible {
    public var description: String {
        return self.localizedDescription
    }
}
Replies: 2 · Boosts: 1 · Views: 594 · Activity: Dec ’24
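There is no confirmed public accessor for the raw code on AudioHardwareError; one hedged fallback, as a sketch, is to rely on standard NSError bridging and compare the bridged code against the classic Core Audio constants. Whether the bridged code actually carries the OSStatus is an assumption to verify, not documented behavior:

import CoreAudio

// Hedged diagnostic sketch: any Swift Error bridges to NSError; whether the
// bridged code matches the classic Core Audio OSStatus constants is an
// assumption, not documented behavior.
func describeCoreAudioError(_ error: Error) {
    let nsError = error as NSError
    print("domain: \(nsError.domain), code: \(nsError.code)")
    // kAudioHardwareBadObjectError is the classic '!obj' constant; if the
    // bridged code carries the raw OSStatus this comparison will match.
    if nsError.code == Int(kAudioHardwareBadObjectError) {
        print("bad object (invalid AudioObjectID)")
    }
}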
How not to block during recording
The problem I have at the moment is that if a phone call comes in during my recording, the recording is interrupted even if I don't answer. The interruption shows up as a frozen picture, and recording resumes automatically after the call ends, but the recorded video's audio and picture end up out of sync. Through the AVCaptureSessionWasInterrupted notification I can get the interruption type and react to it. As far as I can tell, a ringing or vibrating phone takes over the audio channel. I have seen other apps handle the same scenario by silencing the ringtone or vibration, but I don't know how to do it; I tried a lot of approaches, but none of them work. In BlackmagicCam or ProMovie, when a call comes in during recording there is only a notification banner, with no ringtone or vibration, which avoids the recording interruption entirely. I don't know whether this requires some configuration or entitlement; please let me know if it does.
Replies: 1 · Boosts: 0 · Views: 541 · Activity: Dec ’24
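Whatever those apps do to suppress the ring is not something a documented API call clearly provides. What can be done is to observe the interruption and mark the gap so audio and video can be realigned afterward. A minimal sketch, assuming an iOS capture app with an active AVAudioSession:

import AVFoundation

// Hedged sketch: this does not stop the ringtone from seizing the audio
// hardware (no documented API appears to force that); it marks the
// interruption window so audio and video can be realigned afterward.
func observeInterruptions() {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(),
        queue: .main
    ) { note in
        guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
        switch type {
        case .began:
            print("Interruption began: record the timestamp to realign A/V.")
        case .ended:
            print("Interruption ended: safe to resume capture.")
        @unknown default:
            break
        }
    }
}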
Sound randomly
Hello all! I've been having this issue for a while on my iPhone 12 Pro. The volume when listening to music, watching YouTube, TikTok, etc. will randomly lower, but the actual volume slider won't move: it will still be at max volume, yet the sound gets very quiet. The same happens on phone calls. I've followed other instructions, such as turning off audio awareness and other settings, but nothing seems to work. Has anyone else had this issue and managed to fix it?
Replies: 1 · Boosts: 0 · Views: 422 · Activity: Dec ’24
MusicKit initialisation: Uncaught TypeError: Cannot read properties of undefined (reading 'node')
I'm trying to load MusicKit on the server with SolidJS. I can confirm that my implementation has been sufficient to return authentication tokens and for MusicKit.isAuthorized to return true. My issue is that if I reload the page, it only succeeds intermittently (perhaps 25% of the time?). My question is: what is wrong with my implementation? Removing the async keyword ensures it loads every time, but playing and queuing music no longer works. I'm currently assuming this is an SSR issue, but the docs haven't explicitly said this isn't possible. I have the following boilerplate:

export default createHandler(() => (
  <StartServer
    document={({ assets, children, scripts }) => {
      return (
        <html lang="en">
          <head>
            <meta name="apple-music-developer-token" content={authResult.token} />
            <meta name="apple-music-app-name" content="app name" />
            <meta name="apple-music-app-build" content="1978.4.1" />
            {assets}
            <script
              src="https://js-cdn.music.apple.com/musickit/v3/musickit.js"
              async
            />
          </head>
          <body>
            <div id="app">{children}</div>
            {scripts}
          </body>
        </html>
      )
    }}
  />
))

When I first load my app, I'll encounter:

musickit.js:13 Uncaught TypeError: Cannot read properties of undefined (reading 'node')
    at musickit.js:13:10194
    at musickit.js:13:140
    at musickit.js:13:209

The intermittence signals an issue relating to the async keyword. An expansion on this issue can be found here.
Replies: 0 · Boosts: 0 · Views: 553 · Activity: Dec ’24
AudioComponentInstanceNew takes up to five seconds to complete
We are using a VoiceProcessingIO audio unit in our VoIP application on Mac. In certain scenarios, the AudioComponentInstanceNew call blocks for up to five seconds (at least two). We are using the following code to initialize the audio unit:

OSStatus status;
AudioComponentDescription desc;
AudioComponent inputComponent;

desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

inputComponent = AudioComponentFindNext(NULL, &desc);
status = AudioComponentInstanceNew(inputComponent, &unit);

We are seeing the issue with current macOS versions on a host of different Macs (x86 and x64 alike). It takes two to three seconds until AudioComponentInstanceNew returns. We also see the following error in the log multiple times:

AUVPAggregate.cpp:2560 AggInpStreamsChanged wait failed

and these right after (which I don't know whether they matter to this issue):

KeystrokeSuppressorCore.cpp:44 ERROR: KeystrokeSuppressor initialization was unsuccessful. Invalid or no plist was provided. AU will be bypassed.
vpStrategyManager.mm:486 Error code 2003332927 reported at GetPropertyInfo
Replies: 3 · Boosts: 1 · Views: 1.2k · Activity: Dec ’24
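One hedged mitigation, a sketch rather than a confirmed fix: since the stall is inside AudioComponentInstanceNew itself, create the voice-processing unit once, early, and off the main thread, so the multi-second wait cannot block user-facing work. The helper name below is illustrative:

import Foundation
import AudioToolbox

// Pre-warm the VoiceProcessingIO unit at launch on a background queue so the
// slow AudioComponentInstanceNew call happens before it is needed.
func prewarmVoiceProcessingIO(completion: @escaping (AudioUnit?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        var desc = AudioComponentDescription(
            componentType: kAudioUnitType_Output,
            componentSubType: kAudioUnitSubType_VoiceProcessingIO,
            componentManufacturer: kAudioUnitManufacturer_Apple,
            componentFlags: 0,
            componentFlagsMask: 0)
        guard let component = AudioComponentFindNext(nil, &desc) else {
            return completion(nil)
        }
        var unit: AudioUnit?
        let status = AudioComponentInstanceNew(component, &unit) // the slow call
        completion(status == noErr ? unit : nil)
    }
}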
Apple Music won't play using the latest version of Xcode/macOS
I have tried everything. The songs load onto the playlists and in searches, but when prompted to play, they just won't play. I have a wrapper, since my main player (which carries the buttons for play/rewind/forward/etc.) is in Objective-C.

//
//  ApplePlayerWrapper.swift
//  UniversallyMac
//
//  Created by Dorian Mattar on 11/10/24.
//

import Foundation
import MusicKit
import MediaPlayer

@objc public class MusicKitWrapper: NSObject {
    @objc public static let shared = MusicKitWrapper()
    private let player = ApplicationMusicPlayer.shared

    // Play the current track
    @objc public func play() {
        guard !player.queue.entries.isEmpty else {
            print("Queue is empty. Cannot start playback.")
            return
        }
        logPlayerState(message: "Before play")
        Task {
            do {
                try await player.prepareToPlay()
                try await player.play()
                print("Playback started successfully.")
            } catch {
                if let nsError = error as NSError? {
                    print("NSError Code: \(nsError.code), Domain: \(nsError.domain)")
                }
            }
            logPlayerState(message: "After play")
        }
    }

    // Log the current player state
    @objc public func logPlayerState(message: String = "") {
        print("Player State - \(message):")
        print("Playback Status: \(player.state.playbackStatus)")
        print("Queue Count: \(player.queue.entries.count)")
        // Only log current track details if the player is playing
        if player.state.playbackStatus == .playing {
            if let currentEntry = player.queue.currentEntry {
                print("Current Track: \(currentEntry.title)")
                print("Current Position: \(player.playbackTime) seconds")
                print("Track Length: \(currentEntry.endTime ?? 0.0) seconds")
            } else {
                print("No current track.")
            }
        } else {
            print("No track is playing.")
        }
        print("----------")
    }

    // Debug the queue
    @objc public func debugQueue() {
        print("Debugging Queue:")
        for (index, entry) in player.queue.entries.enumerated() {
            print("\(index): \(entry.title)")
        }
    }

    // Ensure track availability in the queue
    public func queueTracks(_ tracks: [Track]) {
        Task {
            do {
                for track in tracks {
                    // Validate Play Parameters
                    guard let playParameters = track.playParameters else {
                        print("Track \(track.title) has no Play Parameters.")
                        continue
                    }
                    // Log the Play Parameters
                    print("Track Title: \(track.title)")
                    print("Play Parameters: \(playParameters)")
                    print("Raw Values: \(track.id.rawValue)")
                    // Ensure the ID is valid
                    if track.id.rawValue.isEmpty {
                        print("Track \(track.title) has an invalid or empty ID in Play Parameters.")
                        continue
                    }
                    // Queue the track
                    try await player.queue.insert(track, position: .afterCurrentEntry)
                    print("Queued track: \(track.title)")
                }
                print("Tracks successfully added to the queue.")
            } catch {
                print("Error queuing tracks: \(error)")
            }
            debugQueue()
        }
    }

    // Clear the current queue
    @objc public func resetMusicPlayer() {
        Task {
            player.stop()
            player.queue.entries.removeAll()
            print("Queue cleared.")
            print("Apple Music player reset successfully.")
        }
    }
}

I opened an Apple Developer ticket, but I'm trying here as well. Thanks!
Replies: 1 · Boosts: 0 · Views: 462 · Activity: Jan ’25
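A silent refusal to play is often an authorization or subscription problem rather than a queueing bug. A hedged diagnostic sketch using documented MusicKit checks, to run before calling play():

import MusicKit

// Hedged diagnostic: verify authorization and catalog playability before
// blaming the queue; both checks are part of MusicKit proper.
func checkPlaybackPreconditions() async {
    let status = await MusicAuthorization.request()
    print("MusicKit authorization: \(status)")
    do {
        let subscription = try await MusicSubscription.current
        print("Can play catalog content: \(subscription.canPlayCatalogContent)")
    } catch {
        print("Subscription check failed: \(error)")
    }
}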
Why is AVAudioEngine input giving all zero samples?
I am trying to get access to raw audio samples from the mic. I've written a simple example application that writes the values to a text file. Below is my sample application. All the input samples from the buffers delivered to the input tap are zero. What am I doing wrong? I did add the Privacy - Microphone Usage Description key to my application target's properties, and I am allowing microphone access when the application launches. I do find it strange that I have to grant permission every time, even though in Settings > Privacy my application is listed as one of the applications allowed to access the microphone.

class AudioRecorder {
    private let audioEngine = AVAudioEngine()
    private var fileHandle: FileHandle?

    func startRecording() {
        let inputNode = audioEngine.inputNode
        let audioFormat: AVAudioFormat
#if os(iOS)
        let hardwareSampleRate = AVAudioSession.sharedInstance().sampleRate
        audioFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareSampleRate, channels: 1)!
#elseif os(macOS)
        audioFormat = inputNode.inputFormat(forBus: 0) // Use input node's current format
#endif
        setupTextFile()
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: audioFormat) { [weak self] buffer, _ in
            self!.processAudioBuffer(buffer: buffer)
        }
        do {
            try audioEngine.start()
            print("Recording started with format: \(audioFormat)")
        } catch {
            print("Failed to start audio engine: \(error.localizedDescription)")
        }
    }

    func stopRecording() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        print("Recording stopped.")
    }

    private func setupTextFile() {
        let tempDir = FileManager.default.temporaryDirectory
        let textFileURL = tempDir.appendingPathComponent("audioData.txt")
        FileManager.default.createFile(atPath: textFileURL.path, contents: nil, attributes: nil)
        fileHandle = try? FileHandle(forWritingTo: textFileURL)
    }

    private func processAudioBuffer(buffer: AVAudioPCMBuffer) {
        guard let channelData = buffer.floatChannelData else { return }
        let channelSamples = channelData[0]
        let frameLength = Int(buffer.frameLength)
        var textData = ""
        var allZero = true
        for i in 0..<frameLength {
            let sample = channelSamples[i]
            if sample != 0 {
                allZero = false
            }
            textData += "\(sample)\n"
        }
        if allZero {
            print("Got \(frameLength) worth of audio data on \(buffer.stride) channels. All data is zero.")
        } else {
            print("Got \(frameLength) worth of audio data on \(buffer.stride) channels.")
        }
        // Write to file
        if let data = textData.data(using: .utf8) {
            fileHandle!.write(data)
        }
    }
}
Replies: 4 · Boosts: 0 · Views: 928 · Activity: Jan ’25
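On macOS, an all-zero input tap frequently means capture permission wasn't actually granted to the process; a sandboxed app also needs the com.apple.security.device.audio-input entitlement, and repeated permission prompts may indicate the debug build's code signature changes between runs, which resets the TCC grant. A hedged sketch that surfaces the authorization state before starting the engine:

import AVFoundation

// Hedged sketch: make the microphone authorization state explicit before
// starting the engine, instead of assuming the Info.plist key is enough.
func ensureMicrophoneAccess(then start: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        start()
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            if granted { DispatchQueue.main.async(execute: start) }
        }
    default:
        print("Microphone access denied or restricted.")
    }
}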
App recedes to background, audioEngine.start() fails
private var audioEngine = AVAudioEngine()
private var inputNode: AVAudioInputNode!

func startAnalyzing() {
    inputNode = audioEngine.inputNode
    let recordingFormat = inputNode.outputFormat(forBus: 0)
    let hardwareSampleRate = recordingSession.sampleRate
    inputNode.removeTap(onBus: 0)

    if recordingFormat.sampleRate != hardwareSampleRate {
        print("。")
        let newFormat = AVAudioFormat(commonFormat: recordingFormat.commonFormat,
                                      sampleRate: hardwareSampleRate,
                                      channels: recordingFormat.channelCount,
                                      interleaved: recordingFormat.isInterleaved)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: newFormat) { buffer, time in
            self.processAudioBuffer(buffer, time: time)
        }
    } else {
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { buffer, time in
            self.processAudioBuffer(buffer, time: time)
        }
    }

    do {
        audioEngine.prepare()
        try audioEngine.start()
    } catch {
        print(": \(error)")
    }
}

I send the app to the background and then call startAnalyzing(), which reports an error even though background recording permissions are configured:

[10429:570139] [aurioc] AURemoteIO.cpp:1668 AUIOClient_StartIO failed (561145187)
[10429:570139] [avae] AVAEInternal.h:109 [AVAudioEngineGraph.mm:1545:Start: (err = PerformCommand(*ioNode, kAUStartIO, NULL, 0)): error 561145187
Audio engine couldn't start.

Is starting recording from the background not allowed?
Replies: 1 · Boosts: 0 · Views: 515 · Activity: Jan ’25
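Error 561145187 is the four-character code '!rec' (recording cannot start). In practice that usually means the session was not configured and activated for recording while the app was still in the foreground, in addition to the audio entry being required in UIBackgroundModes. A hedged configuration sketch:

import AVFoundation

// Hedged sketch: call this while the app is still in the foreground.
// Assumes the target declares UIBackgroundModes = ["audio"] in Info.plist.
func configureSessionForBackgroundCapture() throws {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord keeps input available across the background transition.
    try session.setCategory(.playAndRecord, mode: .default, options: [.mixWithOthers])
    try session.setActive(true)
}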
Rear View Camera Installed – Now CarPlay Audio Stops After 15 Seconds via Bluetooth
I recently installed a rear-view camera in my car, and ever since, I've been experiencing a frustrating issue with my CarPlay. After about 15 seconds of playing audio via Bluetooth, the sound stops coming out of the speakers, even though the song continues to run in the background. For context, my stereo system is an aftermarket unit that I installed to enable CarPlay functionality. Everything worked perfectly before adding the rear-view camera. Unfortunately, my unit does not have a port for a wired connection, so I can't test the audio using a cable. Has anyone experienced a similar issue? Could the camera installation be interfering with the Bluetooth or audio system somehow? Any advice or troubleshooting tips would be greatly appreciated!
Replies: 1 · Boosts: 0 · Views: 314 · Activity: Jan ’25
How to set volume with MusicKit Web?
I've got a web app built with MusicKit that displays a list of songs. I have player controls for play, pause, skip next, skip previous, toggle shuffle, and set repeat mode. All of these work through the music instance. The play button, when nothing is playing and nothing is in the queue, will enqueue all the tracks and start playing, for example:

await music.setQueue({ songs, startPlaying: true });

I've implemented a progress slider based on feedback from the "playbackProgressDidChange" listener. Now, how in the world can I set the volume? This seems like it should be simple, but I am at a complete loss here. The docs say: "The volume of audio playback, which is set directly on the HTMLMediaElement as the HTMLMediaElement.volume property. This value ranges between 0, which would be muting the audio, and 1, which would be the loudest possible." Given that all my controls work off the music instance, I don't understand how I can do that. In this video from WWDC 2022, music web components are touched on briefly. These are also documented very sparsely. The volume docs are here. For the life of me, I can't even get the volume web component to display in the UI. It appears that MusicKit Web is hobbled compared to the native implementation, but surely adjusting volume shouldn't be that hard, right? I'd appreciate any insight on how to do this, including how to get web components to work (in a Next.js app). Thanks.
Replies: 2 · Boosts: 0 · Views: 561 · Activity: Jan ’25
Inquiry about Potential Core Audio Improvements
Hi everyone, I wanted to bring up a question about Core Audio and its potential for future updates or improvements, specifically regarding latency optimization. As someone who relies on Core Audio for real-time audio processing, any enhancements in this area would be incredibly beneficial for professionals in the industry. Does anyone know if Apple has shared any plans or updates regarding Core Audio’s performance, particularly for low-latency applications? I’d appreciate any insights or advice from the community! Thanks so much! Best, Michael
Replies: 1 · Boosts: 0 · Views: 501 · Activity: Jan ’25
How to find `AudioHardwareControl` direction?
I'm working with the modern Core Audio API introduced in macOS Sequoia. I have an AudioHardwareDevice which has several controls of type AudioHardwareControl. I figured out that to filter only volume controls I can use the condition classID == kAudioVolumeControlClassID. Some devices have volume controls for both input and output. How can I determine the direction of a control? Streams (AudioHardwareStream objects) have a direction, but I haven't found a way to map controls to streams. There are kAudioObjectPropertyScopeInput and kAudioObjectPropertyScopeOutput property scopes, but no matter what I tried, controls always return false from any control.hasProperty(address: whatever). Any other ideas?
Replies: 1 · Boosts: 0 · Views: 503 · Activity: Jan ’25
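Until the Swift AudioHardware API exposes a direction directly, one hedged fallback is the classic C property kAudioControlPropertyScope, which reports the scope a control applies to. This sketch assumes you can obtain the control's underlying AudioObjectID:

import CoreAudio

// Hedged sketch via the classic C API: a control object carries a
// kAudioControlPropertyScope property whose value is the scope (input or
// output) the control belongs to.
func direction(ofControl controlID: AudioObjectID) -> String? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioControlPropertyScope,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var scope = AudioObjectPropertyScope(0)
    var size = UInt32(MemoryLayout<AudioObjectPropertyScope>.size)
    let status = AudioObjectGetPropertyData(controlID, &address, 0, nil, &size, &scope)
    guard status == noErr else { return nil }
    switch scope {
    case kAudioObjectPropertyScopeInput: return "input"
    case kAudioObjectPropertyScopeOutput: return "output"
    default: return "other (\(scope))"
    }
}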
[VisionOS Audio] AVAudioPlayerNode occasionally produces loud popping/distortion when playing PCM data
I'm experiencing audio issues while developing for visionOS when playing PCM data through AVAudioPlayerNode.

Issue description:
- Occasionally, the speaker produces loud popping sounds or distorted noise
- This occurs during PCM audio playback using AVAudioPlayerNode
- The issue is intermittent and doesn't happen every time

Technical details:
- Platform: visionOS
- Device: Vision Pro / simulator
- Audio framework: AVFoundation
- Audio node: AVAudioPlayerNode
- Audio format: PCM

I would appreciate any insights on:
- Common causes of audio distortion with AVAudioPlayerNode
- Recommended best practices for handling PCM playback on visionOS
- Potential configuration issues that might cause this behavior

Has anyone encountered similar issues or found solutions? Any guidance would be greatly helpful. Thank you in advance!
Replies: 2 · Boosts: 1 · Views: 642 · Activity: Jan ’25
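Two common culprits for intermittent pops are a format mismatch between the scheduled buffers and the node's connection format, and underruns between consecutive buffers. A hedged sketch that guards against both (the scheduling pattern is illustrative, not a confirmed visionOS fix):

import AVFoundation

// Hedged sketch: connect the player using the same format as the buffers it
// will play, and keep the schedule queue from running dry; an empty queue is
// an audible discontinuity (a pop).
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)
// Start before scheduling: try engine.start(); player.play()

func schedule(_ buffer: AVAudioPCMBuffer, next: @escaping () -> AVAudioPCMBuffer?) {
    player.scheduleBuffer(buffer) {
        // Refill from the completion handler so playback never underruns.
        if let nextBuffer = next() {
            schedule(nextBuffer, next: next)
        }
    }
}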
Audio Volume increases by itself
I am using an AVAudioPlayer to play a "tick" sound once per second in a SwiftUI app. When running the app on an iPhone 16 (18.2.1) the tick sounds increase in volume after a few seconds. This does not happen in the simulator nor on an iPhone SE 2020 (18.1.1).
Replies: 2 · Boosts: 0 · Views: 277 · Activity: Jan ’25
Help Needed: How to Make iOS Timer More Stable?
I’m currently developing an iOS metronome app using DispatchSourceTimer as the timer. The interval is set very small, around 50 milliseconds, and I’m using CFAbsoluteTimeGetCurrent to calculate the elapsed time to ensure the beat is played within a ±0.003-second margin. The problem is that once the app goes to the background, the timing becomes unstable—it slows down noticeably, then recovers after 1–2 seconds. When coming back to the foreground, it suddenly speeds up, and again, it takes 1–2 seconds to return to normal. It feels like the app is randomly “powering off” and then “overclocking.” It’s super frustrating. I’ve noticed that some metronome apps in the App Store have similar issues, but there’s one called “Professional Metronome” that’s rock solid with no such problems. What kind of magic are they using? Any experts out there who can help? Thanks in advance! P.S. I’ve already enabled background audio permissions. The professional metronome that has no issues: https://link.zhihu.com/?target=https%3A//apps.apple.com/cn/app/pro-metronome-%25E4%25B8%2593%25E4%25B8%259A%25E8%258A%2582%25E6%258B%258D%25E5%2599%25A8/id477960671
Replies: 2 · Boosts: 0 · Views: 555 · Activity: Jan ’25
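For beat timing, a CPU timer is usually the wrong clock. Scheduling each click at an exact sample time on the audio render clock keeps accuracy even in the background, as long as the audio session stays active. A hedged sketch; the names are illustrative:

import AVFoundation

// Hedged sketch: schedule clicks at exact sample times on the render clock
// instead of firing a DispatchSourceTimer.
func scheduleClicks(player: AVAudioPlayerNode, click: AVAudioPCMBuffer,
                    bpm: Double, count: Int) {
    // Assumes the engine is running and player.play() has been called;
    // otherwise lastRenderTime is nil.
    guard let start = player.lastRenderTime?.sampleTime else { return }
    let sampleRate = click.format.sampleRate
    let samplesPerBeat = AVAudioFramePosition(sampleRate * 60.0 / bpm)
    for beat in 0..<count {
        let when = AVAudioTime(
            sampleTime: start + AVAudioFramePosition(beat) * samplesPerBeat,
            atRate: sampleRate)
        player.scheduleBuffer(click, at: when, options: [], completionHandler: nil)
    }
}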
AVSpeechUtterance Mandarin voice output replaced by Siri language setting after upgrading to iOS 18
Hi, Apple engineers. Hoping you can reply to this one. We're developing a text-to-speech app. Everything went well until iOS was upgraded to 18. AVSpeechSynthesisVoice(language: "zh-CN") runs well under iOS 16 and iOS 17: it speaks Mandarin correctly. In iOS 18, we noticed that Siri's language setting interferes with AVSpeechSynthesisVoice: it plays Cantonese instead of Mandarin. Siri language settings that affect AVSpeechSynthesisVoice this way:

Chinese (Cantonese - China mainland)
Chinese (Cantonese - Hong Kong)
Replies: 3 · Boosts: 3 · Views: 825 · Activity: Jan ’25
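A hedged workaround sketch: rather than requesting a voice by language code, which iOS 18 appears to remap based on Siri's language, select a concrete Mandarin voice from the installed list and assign it explicitly:

import AVFoundation

let synthesizer = AVSpeechSynthesizer() // keep a strong reference while speaking

// Hedged sketch: pick an installed zh-CN voice explicitly instead of
// relying on the language-code lookup.
func speakMandarin(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    if let mandarin = AVSpeechSynthesisVoice.speechVoices()
        .first(where: { $0.language == "zh-CN" }) {
        utterance.voice = mandarin
    }
    synthesizer.speak(utterance)
}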
Can Logic Pro load an Audio Unit v3 in-process?
After investing more than a week into getting a bunch of audio unit projects converted into app + appex + framework, they all are now correctly loaded in-process in the demo host app that is part of Xcode's template. However, Logic Pro adamantly refuses to load them in-process. Does Logic Pro simply not do that ever, or is there some hint or configuration my plugins need to provide to enable that? If it is unsupported, will it be supported in some future version of Logic? The entire point of investing that week was performance, which is moot if it is impossible to test the impact of loading in-process in a real-world usage scenario.
Replies: 1 · Boosts: 0 · Views: 662 · Activity: Jan ’25
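For context, in-process loading is requested by the host at instantiation time, and the extension must additionally declare an AudioComponentBundle entry (naming a framework that contains the AU code) in its Info.plist; whether Logic Pro ever requests it is Logic's decision, not the plugin's. A hedged sketch of the host-side call, with hypothetical component description values:

import AVFoundation

// Hedged host-side sketch: only meaningful in a host you control, such as
// the template's demo host. The subtype/manufacturer codes are placeholders.
let desc = AudioComponentDescription(
    componentType: kAudioUnitType_Effect,
    componentSubType: 0x64656D6F,      // 'demo' (hypothetical)
    componentManufacturer: 0x44656D6F, // 'Demo' (hypothetical)
    componentFlags: 0,
    componentFlagsMask: 0)

AVAudioUnit.instantiate(with: desc, options: .loadInProcess) { unit, error in
    if let unit {
        print("Loaded in-process: \(unit.name)")
    } else {
        print("In-process load failed: \(String(describing: error))")
    }
}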
AUv3 recent "Failed to find component with type..." frequent issues
I've been generating new Audio Unit Extension apps with Xcode 16 (and newer), and although they generally work initially, it is easy (although I'm not sure how to do it reliably) to cause the app to no longer be able to instantiate the audio unit. Generally the call to AVAudioUnit.findComponent fails and SimplePlayEngine hits the fatalError("Failed to find component with type..."). In the most recent project, merely adding files to the extension (without making any use of them) caused it to go off the rails. If I "Archive" the app + plugin, there is no audio unit extension in the bundle. If I switch to the audio unit extension scheme and build it, it's fine. If I look at the build folder in Library/Developer/Xcode/project_folder, the extension_name.appex is there. Any ideas? If I can coax an unmodified audio unit extension project to exhibit this behavior, I'll attach it here. Right now what I have has code I don't want to share.
Replies: 4 · Boosts: 1 · Views: 703 · Activity: Jan ’25
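A hedged diagnostic sketch: querying the system's component registry directly separates "the extension never got registered" from "the host's lookup is wrong". The manufacturer name below is a hypothetical placeholder:

import AVFoundation

// Hedged diagnostic: list what the system actually registered for this
// manufacturer; an empty result points at registration/embedding, not lookup.
let found = AVAudioUnitComponentManager.shared().components { component, _ in
    component.manufacturerName == "YourManufacturer" // hypothetical
}
for component in found {
    print(component.name, component.typeName, component.audioComponentDescription)
}
if found.isEmpty {
    print("Component not registered; check that the .appex is embedded in the archived app.")
}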