Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics
Posts under Media Technologies topic

Post

Replies

Boosts

Views

Activity

How to correctly set up AVSampleBufferDisplayLayer
How can I correctly set up AVSampleBufferDisplayLayer for video display when my input picture format is kCVPixelFormatType_32BGRA? Currently the video is visible in the Simulator, but not on an iPhone. Am I missing something? Render code:

```swift
var pixelBuffer: CVPixelBuffer?
let attrs: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey as String: width,
    kCVPixelBufferHeightKey as String: height,
    kCVPixelBufferBytesPerRowAlignmentKey as String: width * 4,
    kCVPixelBufferIOSurfacePropertiesKey as String: [:]
]
let status = CVPixelBufferCreateWithBytes(
    nil,
    width,
    height,
    kCVPixelFormatType_32BGRA,
    img,
    width * 4,
    nil,
    nil,
    attrs as CFDictionary,
    &pixelBuffer
)
guard status == kCVReturnSuccess, let pb = pixelBuffer else { return }

var formatDesc: CMVideoFormatDescription?
CMVideoFormatDescriptionCreateForImageBuffer(
    allocator: nil,
    imageBuffer: pb,
    formatDescriptionOut: &formatDesc
)
guard let format = formatDesc else { return }

var timingInfo = CMSampleTimingInfo(
    duration: .invalid,
    presentationTimeStamp: currentTime,
    decodeTimeStamp: .invalid
)
var sampleBuffer: CMSampleBuffer?
CMSampleBufferCreateForImageBuffer(
    allocator: kCFAllocatorDefault,
    imageBuffer: pb,
    dataReady: true,
    makeDataReadyCallback: nil,
    refcon: nil,
    formatDescription: format,
    sampleTiming: &timingInfo,
    sampleBufferOut: &sampleBuffer
)
if let sb = sampleBuffer {
    if CMSampleBufferGetPresentationTimeStamp(sb) == .invalid {
        print("Invalid video timestamp")
    }
    if displayLayer.status == .failed {
        displayLayer.flush()
    }
    DispatchQueue.main.async { [weak self] in
        guard let self = self else {
            print("Lost reference to self drawing")
            return
        }
        self.displayLayer.enqueue(sb)
    }
    frameIndex += 1
}
```
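One thing worth checking here, offered as an aside: CVPixelBufferCreateWithBytes wraps memory you already own and, as far as I understand, ignores allocation attributes such as kCVPixelBufferIOSurfacePropertiesKey, so the resulting buffers are not IOSurface-backed. The Simulator tends to tolerate that, while on-device display paths generally want IOSurface-backed buffers. A minimal sketch of allocating an IOSurface-backed buffer and copying the BGRA bytes in instead (assuming `img` points at tightly packed rows of `width * 4` bytes):

```swift
import Foundation
import CoreVideo

// Sketch: allocate an IOSurface-backed pixel buffer with CVPixelBufferCreate
// and copy the frame in, rather than wrapping the pointer directly.
func makeIOSurfaceBackedBuffer(from img: UnsafeRawPointer,
                               width: Int, height: Int) -> CVPixelBuffer? {
    let attrs: [String: Any] = [
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]
    ]
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary, &pixelBuffer)
    guard status == kCVReturnSuccess, let pb = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(pb, [])
    defer { CVPixelBufferUnlockBaseAddress(pb, []) }

    guard let dst = CVPixelBufferGetBaseAddress(pb) else { return nil }
    let dstBytesPerRow = CVPixelBufferGetBytesPerRow(pb)
    let srcBytesPerRow = width * 4
    // Copy row by row: the allocated buffer may pad rows beyond width * 4.
    for row in 0..<height {
        memcpy(dst + row * dstBytesPerRow,
               img + row * srcBytesPerRow,
               srcBytesPerRow)
    }
    return pb
}
```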
0
0
179
Apr ’25
Destroy MIDIUMPMutableEndpoint again?
Is there a way to destroy a MIDIUMPMutableEndpoint again? In my app, the user has a setting to enable and disable MIDI 2.0. If MIDI 2.0 should not be supported (or if the iOS version is below 18), the app creates a virtual destination and a virtual source. If MIDI 2.0 should be enabled, it instead creates a MIDIUMPMutableEndpoint, which itself creates the virtual destination and source automatically. So here is my problem: I haven't found any way to destroy the MIDIUMPMutableEndpoint again. There is a method to disable it (setEnabled:NO), but that doesn't destroy or hide the virtual destination and source. So when the user turns MIDI 2.0 support off, I end up with two virtual destinations and sources, and cannot get rid of the 2.0 ones. What is the correct way to get rid of a MIDIUMPMutableEndpoint once it has been created?
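For contrast, the classic MIDI 1.0 virtual endpoints mentioned above can be torn down explicitly with MIDIEndpointDispose; MIDIUMPMutableEndpoint does not appear to expose an equivalent, which is the gap the question describes. A minimal sketch of the classic pattern, assuming the endpoint refs are stored by the app:

```swift
import CoreMIDI

// Classic virtual endpoints can be disposed explicitly when MIDI support
// is turned off. (MIDIUMPMutableEndpoint has no obvious counterpart.)
final class LegacyVirtualPorts {
    private var source = MIDIEndpointRef()
    private var destination = MIDIEndpointRef()

    func create(client: MIDIClientRef) {
        MIDISourceCreateWithProtocol(client, "My App Source" as CFString,
                                     ._1_0, &source)
        MIDIDestinationCreateWithProtocol(client, "My App Destination" as CFString,
                                          ._1_0, &destination) { _, _ in
            // Handle incoming event lists here.
        }
    }

    func destroy() {
        if source != 0 { MIDIEndpointDispose(source); source = 0 }
        if destination != 0 { MIDIEndpointDispose(destination); destination = 0 }
    }
}
```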
0
0
123
Sep ’25
AVPlayer playing protected HLS: when the session expires and a new session is created, the player stalls for a short time
I am playing protected HLS streams whose authorization token expires after 3 minutes. I am handling the token refresh with AVAssetResourceLoaderDelegate. I can refresh the token and keep playing, but the problem is that between sessions the player stalls for a short time, around one second. Here's my code (note: the original had a call to a nonexistent `ishttpSchemeValid`; it is corrected to `isHttpsSchemeValid` below):

```swift
class APLCustomAVARLDelegate: NSObject, AVAssetResourceLoaderDelegate {
    static let httpsScheme = "https"
    static let redirectErrorCode = 302
    static let badRequestErrorCode = 400

    private var token: String?
    private var retryDictionary = [String: Int]()
    private let maxRetries = 3

    private func schemeSupported(_ scheme: String) -> Bool {
        let supported = isHttpsSchemeValid(scheme)
        print("Scheme '\(scheme)' supported: \(supported)")
        return supported
    }

    private func reportError(loadingRequest: AVAssetResourceLoadingRequest, error: Int) {
        let nsError = NSError(domain: NSURLErrorDomain, code: error, userInfo: nil)
        print("Reporting error: \(nsError)")
        loadingRequest.finishLoading(with: nsError)
    }

    // Handle token renewal requests to prevent playback stalls
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForRenewalOfRequestedResource renewalRequest: AVAssetResourceRenewalRequest) -> Bool {
        print("Resource renewal requested for URL: \(renewalRequest.request.url?.absoluteString ?? "unknown URL")")
        // Handle renewal the same way we handle initial requests
        guard let scheme = renewalRequest.request.url?.scheme else {
            print("No scheme found in the renewal URL.")
            return false
        }
        if isHttpsSchemeValid(scheme) {
            return handleHttpsRequest(renewalRequest)
        }
        print("Scheme not supported for renewal.")
        return false
    }

    private func isHttpsSchemeValid(_ scheme: String) -> Bool {
        let isValid = scheme == APLCustomAVARLDelegate.httpsScheme
        print("httpsScheme scheme '\(scheme)' valid: \(isValid)")
        return isValid
    }

    private func generateHttpsURL(sourceURL: URL) -> URL? {
        // If you need to modify the URL, do it here
        // Currently this just returns the same URL
        let urlString = sourceURL.absoluteString
        print("Generated HTTPS URL: \(urlString)")
        return URL(string: urlString)
    }

    private func handleHttpsRequest(_ loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        print("Handling HTTPS request.")
        guard let sourceURL = loadingRequest.request.url,
              var redirectURL = generateHttpsURL(sourceURL: sourceURL) else {
            print("Failed to generate HTTPS URL.")
            reportError(loadingRequest: loadingRequest, error: APLCustomAVARLDelegate.badRequestErrorCode)
            return true
        }

        // Track retry attempts with a dictionary keyed by request URL
        let urlString = sourceURL.absoluteString
        let currentRetries = retryDictionary[urlString] ?? 0
        if currentRetries < maxRetries {
            retryDictionary[urlString] = currentRetries + 1
        } else {
            // Too many retries, report a more specific error
            reportError(loadingRequest: loadingRequest, error: NSURLErrorTimedOut)
            retryDictionary.removeValue(forKey: urlString)
            return true
        }

        if var urlComponents = URLComponents(url: redirectURL, resolvingAgainstBaseURL: false) {
            var queryItems = urlComponents.queryItems ?? []
            // Generate a fresh token each time
            let freshToken = AESTimeBaseEncription.secureEncryptSecretText()
            // Check if the token already exists
            if let existingTokenIndex = queryItems.firstIndex(where: { $0.name == "token" }) {
                // Update the existing token
                queryItems[existingTokenIndex].value = freshToken
            } else {
                // Add the token if it doesn't exist
                queryItems.append(URLQueryItem(name: "token", value: freshToken))
            }
            urlComponents.queryItems = queryItems
            redirectURL = urlComponents.url!
        }

        let redirectRequest = URLRequest(url: redirectURL)
        let response = HTTPURLResponse(url: redirectURL,
                                       statusCode: APLCustomAVARLDelegate.redirectErrorCode,
                                       httpVersion: nil,
                                       headerFields: nil)
        print("Redirecting HTTPS to URL: \(redirectURL)")
        loadingRequest.redirect = redirectRequest
        loadingRequest.response = response
        loadingRequest.finishLoading()

        // If successful, reset the retry counter
        if retryDictionary[urlString] == maxRetries {
            retryDictionary.removeValue(forKey: urlString)
        }
        return true
    }
}
```
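A hedged note on the stall itself: each renewal above pays for token generation and a redirect round-trip while the player waits. One common mitigation, sketched below under the assumption that the token can be computed ahead of its use, is to refresh it proactively on a timer so the delegate can answer synchronously from a cache. The names here are illustrative, not a specific Apple API:

```swift
import Foundation

// Sketch: keep a fresh token cached so resource-loading callbacks never
// block on token generation mid-playback.
final class TokenCache {
    private(set) var current: String?
    private var timer: Timer?

    // Refresh comfortably before the 3-minute expiry window closes.
    func start(refreshEvery interval: TimeInterval = 150,
               generate: @escaping () -> String) {
        current = generate()
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.current = generate()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```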
0
0
500
Mar ’25
Windows Apple Music: how to enumerate the local library or export it? Is Library.musicdb documented / API available?
Environment

Windows 11 [edition/build]: [e.g., 23H2, 22631.x]
Apple Music for Windows version: [e.g., 1.x.x from Microsoft Store]
Library folder: C:\Users\<user>\Music\Apple Music\Apple Music Library.musiclibrary

Summary

I need a supported way to programmatically enumerate the local Apple Music library on Windows (track file paths, playlists, etc.) for reconciliation with the on-disk Media folder. On macOS this used to be straightforward via scripting/export; on Windows I can't find an equivalent.

What I'm seeing in the library bundle

Library.musicdb → not SQLite. First 4 bytes: 68 66 6D 61 ("hfma").
Library Preferences.musicdb → also starts with "hfma".
artwork.sqlite → SQLite, but appears to be an artwork cache only (no track file paths).
Extras.itdb → has an "SQLite format 3" header but (from a quick scan) no track locations.
Genius.itdb → not a SQLite database on this machine.

What I've tried

Attempted to open Library.musicdb with SQLite providers → error: "file is not a database."
Binary/string scans (ASCII, UTF-16LE/BE, null-stripped) of Library.musicdb → did not reveal file paths or obvious plist/XML/JSON blobs.
The Windows Apple Music UI doesn't appear to expose "Export Library / Export Playlist" like legacy iTunes did, and I can't find a public API for local library enumeration on Windows.

What I'm trying to accomplish

Read local track entries (absolute or relative paths), detect broken links, and reconcile against the Media folder. A read-only solution is fine; I do not need to modify the library.

Questions for Apple

Is the Library.musicdb file format documented anywhere, or is there a supported SDK/API to enumerate the local library on Windows?
Is there a supported export mechanism (CLI, UI, or API) on Windows Apple Music to dump the local library and/or playlists (XML/CSV/JSON)?
Is there a Windows-specific equivalent to the old iTunes COM automation, or any MusicKit surface that can return local library items (not streaming catalog) and their file locations?
If none of the above exist today, is there a recommended workaround from Apple for library reconciliation on Windows (e.g., documented support for importing M3U/M3U8 to rebuild the local library from disk)?
Are there any plans/timeline for adding Windows feature parity with iTunes/Music on macOS for exporting or scripting the local library?

Why this matters

For large personal libraries, users occasionally end up with orphaned files on disk or broken links in the app. Without an export or API, it's difficult to audit and fix at scale on Windows.

Reference details (in case it helps triage)

Library.musicdb header bytes: 68-66-6D-61-A0-00-00-00-10-26-34-00-15-00-01-00 (ASCII shows hfma…).
artwork.sqlite is readable but doesn't contain track file paths (appears limited to artwork).
I can supply a minimal repro tool and logs if that's helpful.

Feature request (if no current API)

Add an official Export Library/Playlists action on Windows Apple Music, or provide a read-only Windows API (or schema doc) that surfaces track file locations and playlist membership from the local library.

Thanks in advance for any guidance or pointers to docs I might have missed.
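As a small aside, here is a sketch of the signature triage the post describes: classifying files by their leading bytes against the observed "hfma" magic versus SQLite's "SQLite format 3\0" header. Both magic values come straight from the findings above; the sketch is written in Swift only for consistency with the rest of this page:

```swift
import Foundation

// Classify library files by their leading bytes, per the headers observed
// above ("hfma" for the .musicdb files, "SQLite format 3\0" for SQLite).
enum LibraryFileKind { case hfma, sqlite, unknown }

func classify(fileAt url: URL) throws -> LibraryFileKind {
    let handle = try FileHandle(forReadingFrom: url)
    defer { try? handle.close() }
    let header = try handle.read(upToCount: 16) ?? Data()

    if header.prefix(4) == Data("hfma".utf8) { return .hfma }
    if header.prefix(16) == Data("SQLite format 3".utf8) + [0x00] { return .sqlite }
    return .unknown
}
```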
0
0
235
Sep ’25
Getting CoreMediaErrorDomain -15628 playback failure in iOS 26 (AVPlayer, HLS stream)
Hi, after updating to iOS 26, our app is experiencing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.

Error:
Domain [CoreMediaErrorDomain] Code [-15628] Description [The operation couldn’t be completed.]
Underlying Error Domain [(null)] Code [0] Description [(null)]

Environment:
iOS version: iOS 26
Stream type: HLS (m3u8) with segment (.ts) files

Observed behaviour: We don’t have concrete steps to reproduce the issue, but so far we have observed that this error tends to occur under poor network conditions.
0
4
409
Sep ’25
Best approach for auto-play when using AVPlayer
Hello! I am trying to determine the best approach with AVPlayer for implementing auto-play, that is, playback that automatically starts without user initiation. Ideally this would work for both local and streaming audio. My current approach uses KVO on an AVPlayerItem's status, starting playback when it becomes readyToPlay, but I was wondering whether there is a better property or state to use, or whether this use case is already handled when automaticallyWaitsToMinimizeStalling is true, so that I could simply write:

```swift
player.replaceCurrentItem(with: AVPlayerItem(url: streamingUrl))
player.rate = 1
```

or

```swift
let playerItem = AVPlayerItem(url: streamingUrl)
player = AVPlayer(playerItem: playerItem)
player.rate = 1
```

and expect the item to be auto-played when ready. In the context of user-initiated playback, I've typically seen code that makes a button's enabled state contingent on player.currentItem.duration, e.g. in AVFoundationSimplePlayer-iOS. On the other hand, AVAutoWait, which utilizes automaticallyWaitsToMinimizeStalling, does not seem to do this. As a side note, I am not using an AVQueuePlayer.
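For reference, a minimal sketch of the KVO approach described above, using Combine's key-value observing publisher and assuming starting playback on readyToPlay is the desired behavior:

```swift
import AVFoundation
import Combine

final class AutoPlayer {
    private let player = AVPlayer()
    private var statusObservation: AnyCancellable?

    func autoPlay(url: URL) {
        let item = AVPlayerItem(url: url)
        // Begin playback as soon as the item reports readyToPlay.
        statusObservation = item.publisher(for: \.status)
            .filter { $0 == .readyToPlay }
            .first()
            .sink { [weak self] _ in
                self?.player.play()
            }
        player.replaceCurrentItem(with: item)
    }
}
```

Worth noting, hedged: with automaticallyWaitsToMinimizeStalling left at its default of true, calling play() right after replaceCurrentItem appears to defer actual playback until the player can sustain it, which matches the simpler pattern the post proposes.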
0
0
389
Mar ’25
Changing instrument with AVMIDIControlChangeEvent bankSelect
I've been trying to use AVMIDIControlChangeEvent with a bankSelect message type to change the instrument the sequencer uses on an AVMusicTrack, with no luck. I started with the Apple AVAEMixerSample, converting the initial setup/loading and the portions dealing with the sequencer to Swift. I got that working and playing the "bluesyRiff", and then modified it to play individual notes. So my createAndSetupSequencer looked like:

```swift
func createAndSetupSequencer() {
    sequencer = AVAudioSequencer(audioEngine: engine)
    // guard let midiFileURL = Bundle.main.url(forResource: "bluesyRiff", withExtension: "mid") else {
    //     print(" failed guard trying to get URL for bluesyRiff")
    //     return
    // }
    let track = sequencer.createAndAppendTrack()
    var currTime = 1.0
    for i: UInt32 in 0...8 {
        let newNoteEvent = AVMIDINoteEvent(channel: 0, key: 60 + i, velocity: 64, duration: 2.0)
        track.addEvent(newNoteEvent, at: AVMusicTimeStamp(currTime))
        currTime += 2.0
    }
```

The notes played, so then I also replaced the gs_instruments sound bank with GeneralUser GS MuseScore v1.442, first by trying:

```swift
guard let soundBankURL = Bundle.main.url(forResource: "GeneralUser GS MuseScore v1.442", withExtension: "sf2") else { return }
do {
    try sampler.loadSoundBankInstrument(at: soundBankURL, program: 0x001C, bankMSB: 0x79, bankLSB: 0x08)
} catch {
    // ...
}
```

This appears to work: the instrument (8, which is "Funk Guitar") plays. If I change to bankLSB: 0x00 I get the "Palm Muted Guitar". So I know that the soundfont has these instruments. Stuff goes off the rails when I try to change the instruments in createAndSetupSequencer. Putting

```swift
let programChange = AVMIDIProgramChangeEvent(channel: 0, programNumber: 0x001C)
let bankChange = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.bankSelect, value: 0x00)
track.addEvent(programChange, at: AVMusicTimeStamp(1.0))
track.addEvent(bankChange, at: AVMusicTimeStamp(1.0))
```

just before my add-note loop doesn't produce any change. Loading bankLSB 8 (Funk) in sampler.loadSoundBankInstrument and trying to change with bankSelect 0 (Palm Muted) in createAndSetupSequencer results in instrument 8 (Funk) playing, not Palm Muted. Loading bankLSB 0 (Palm Muted) and trying to change with bankSelect 8 (Funk) doesn't work either: 0 (Palm Muted) plays. I also tried sampler.loadInstrument(at: soundBankURL), and then I always get the first instrument in the sound font file (piano) no matter what values I put in my programChange/bankChange. I've also changed the time in track.addEvent to 0, 1.0, 3.0, etc., with no success.

The sampler.loadSoundBankInstrument call specifies two UInt8 parameters, bankMSB and bankLSB, while the AVMIDIControlChangeEvent bankSelect value is UInt32, suggesting it might be some combination of bankMSB and bankLSB. But the documentation makes no mention of what this should look like. I tried various combinations (0x7908, 0x0879, etc.) to no avail. I will also point out that I am able to successfully execute other control change events. For example, adding

```swift
if i == 1 {
    let portamentoOnEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamento, value: 0xFF)
    track.addEvent(portamentoOnEvent, at: AVMusicTimeStamp(currTime))
    let portamentoRateEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamentoTime, value: 64)
    track.addEvent(portamentoRateEvent, at: AVMusicTimeStamp(currTime))
}
```

does produce a change in the sound.

(As an aside, a definition of what portamento time is, other than "the rate of portamento", would be welcome. Is it notes/second? freq/minute? beats/hour?) I was able to get the instrument to change in a different program using MusicPlayer and a series of MusicTrackNewMIDIChannelEvent calls on a track, but these operate on a MusicTrack, not the AVMusicTrack the sequencer uses. Has anyone been successful in switching instruments through an AVMIDIControlChangeEvent, or have any feedback on how to do this?
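Since the raw MusicPlayer route is the one the post reports working, here is a minimal sketch of that pattern for reference. In standard MIDI, a bank change is CC 0 (bank select MSB) plus CC 32 (bank select LSB) followed by a program change; whether AVMIDIControlChangeEvent's single bankSelect value carries one byte or both is exactly the ambiguity raised above, so treat this as the workaround rather than an answer:

```swift
import AudioToolbox

// Queue a bank change (CC 0 = MSB, CC 32 = LSB) followed by a program
// change on a raw MusicTrack at the given beat.
func addBankAndProgramChange(to track: MusicTrack,
                             at beat: MusicTimeStamp,
                             channel: UInt8,
                             bankMSB: UInt8, bankLSB: UInt8,
                             program: UInt8) {
    var msb = MIDIChannelMessage(status: 0xB0 | channel, data1: 0x00, data2: bankMSB, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track, beat, &msb)

    var lsb = MIDIChannelMessage(status: 0xB0 | channel, data1: 0x20, data2: bankLSB, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track, beat, &lsb)

    var pc = MIDIChannelMessage(status: 0xC0 | channel, data1: program, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track, beat, &pc)
}
```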
0
0
366
Mar ’25
Disabling Hardware OIS via AVFoundation — Clarification on AVCaptureVideoStabilizationMode
Hello everyone, I'm looking for a definitive clarification on how to completely disable all video stabilization, including the hardware OIS, using AVFoundation. The goal is to achieve a completely raw, unstabilized video feed, which is crucial when using external equipment like gimbals to avoid conflicting stabilization motions. My research points to using the AVCaptureConnection property preferredVideoStabilizationMode and setting it to AVCaptureVideoStabilizationMode.off. The documentation for the .off case states: "A mode that doesn’t stabilize video capture." This description is slightly ambiguous: it's unclear whether this only affects software-level stabilization (EIS, EIS+OIS, etc.) or guarantees the complete deactivation of the physical OIS module. For professional video applications, this is a critical distinction. So, I'd like to ask the community:

Has anyone been able to definitively confirm that setting preferredVideoStabilizationMode to .off also disables the hardware OIS?
Are there any known tests or documentation that prove this behavior?
Is there an alternative or more direct method to ensure the OIS module is physically inactive during video capture?
What is the community's best practice for ensuring absolutely no stabilization is applied to the video pipeline?

Any insights or shared experiences on this topic would be greatly appreciated. Thank you!
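For completeness, a minimal sketch of requesting .off and checking what the connection reports back. The read-only activeVideoStabilizationMode is the closest runtime signal AVFoundation exposes; whether .off also parks the physical OIS module is the open question above:

```swift
import AVFoundation

// Request no stabilization on the video connection and log what is active.
func disableStabilization(on output: AVCaptureMovieFileOutput) {
    guard let connection = output.connection(with: .video) else { return }
    if connection.isVideoStabilizationSupported {
        connection.preferredVideoStabilizationMode = .off
    }
    // activeVideoStabilizationMode reports what the pipeline is actually doing.
    print("Active stabilization mode: \(connection.activeVideoStabilizationMode.rawValue)")
}
```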
0
1
287
Sep ’25
MPNowPlayingInfoCenter playbackState fails to update after losing audio focus on macOS
My Environment:
Device: Mac (Apple Silicon, arm64)
OS: macOS 15.6.1

Description: I'm developing a music app and have encountered an issue where I cannot update the playbackState in MPNowPlayingInfoCenter after my app loses audio focus to another app. Even though my app correctly calls [MPNowPlayingInfoCenter defaultCenter].playbackState = .paused, the system's Now Playing UI (Control Center, Lock Screen, AirPods controls) does not reflect this change. The UI remains stuck until the app that currently holds audio focus also changes its playback state. I've observed this same behavior in other third-party music apps from the App Store, which suggests it might be a system-level issue.

Steps to Reproduce: Use the two most popular music apps in the Chinese App Store (NeteaseCloud Music and QQ Music); let's call them App A and App B.
Start playback in App A.
Start playback in App B. (App B now has audio focus, and App A is still playing.)
Attempt to pause App A via the system's Control Center or its own UI.

Observed Behavior: App A's audio stream stops, but in the system's Now Playing controls App A still appears to be playing. The progress bar continues to advance, and the pause button becomes unresponsive. If you then pause App B, the Now Playing UI for App A immediately corrects itself and displays the proper "paused" state.

My Questions:
Is there a specific procedure required to update MPNowPlayingInfoCenter when an app is not the current "Now Playing" application?
Is this a known issue or expected behavior in macOS?
Are there any official workarounds or solutions to ensure the UI updates correctly?
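For context, a minimal sketch of the update path described above. MPNowPlayingInfoCenter.playbackState is a macOS-only property; the question is why this stops being reflected once another app owns the Now Playing session:

```swift
import MediaPlayer

// Report a pause to the system Now Playing UI on macOS.
func reportPaused(elapsed: TimeInterval) {
    let center = MPNowPlayingInfoCenter.default()
    var info = center.nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
    info[MPNowPlayingInfoPropertyPlaybackRate] = 0.0
    center.nowPlayingInfo = info
    center.playbackState = .paused   // macOS-only API
}
```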
0
0
169
Sep ’25
iPhone 17 smart framing API not working
I tried to modify the AVCam sample code by copying the code from https://developer.apple.com/documentation/avfoundation/adopting-smart-framing-in-your-camera-app#Configure-the-smart-framing-monitor. I can confirm the active format supports smart framing, but the supported frames in the monitor are always nil. In another project of mine there is a supported value, but the observation is never triggered; I then tried repeatedly printing the recommended frame, and it's always nil. Could the engineers embed the code into AVCam rather than posting a few code pieces?
0
0
152
Sep ’25
AVAudioMixerNode outputVolume range?
According to the header file, the outputVolume property's supported range is 0.0-1.0:

```objc
/*! @property outputVolume
    @abstract The mixer's output volume.
    @discussion
        This accesses the mixer's output volume (0.0-1.0, inclusive).
*/
@property (nonatomic) float outputVolume;
```

However, when setting the volume to 2.0 the audio does indeed play louder. Is the header file out of date, and if so, what is the supported range for outputVolume? Thanks
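As a quick aside, a minimal sketch for experimenting with this. Values above 1.0 do amplify the signal in practice, as the post observes, but since the header only documents 0.0-1.0, clamping is the conservative choice until the range is clarified:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

// Clamp to the documented range; values > 1.0 amplify in practice,
// but that behavior is undocumented and may clip or change.
func setMixerVolume(_ volume: Float) {
    engine.mainMixerNode.outputVolume = max(0.0, min(volume, 1.0))
}
```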
0
0
115
Apr ’25
401 Unauthorized when attempting to access Apple Music Feed API
Hello, I am trying to access the Apple Music Feed API, but I am receiving a 401 Unauthorized error whenever I try to access it. I have tried using my own code to generate a JWT and directly call the API (the same code can call the standard Apple Music API successfully):

```
> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*

< HTTP/2 401
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod
```

I have also tried the Apple-provided Python example code, which gives me authentication errors too:

```
$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path "/Users/foxt/Documents/am-feed/NMBH[...].p8" --out-dir .
running....
INFO:__main__:Sending requests to https://api.media.apple.com
INFO:__main__:Getting the latest export for feed artist
Exception: Authentication Failed. Did you provide the correct team id, key id, and p8 file?
```

Does this API need to be enabled on my account separately from the main Apple Music API? The documentation reads to me as if anyone with an Apple Developer Program membership can use this API, and I did not see any information regarding any other requirements.
0
0
414
Sep ’25
MusicKit: Best way to check if all tracks of albums are added to library.
I prefer to use the album fetched from the library instead of the catalog, since this is faster. If I do so, how can I check whether all tracks of an album have been added to the library? In that case I'd like to fetch the catalog version, or throw an error (for example, when offline). Using .with(.tracks) on the library album fetches the tracks added to the library. The trackCount property refers to the tracks that can be fetched from the library. The isComplete property is always nil when fetching from the library. One possible approach is checking the trackNumber and discCount properties. However, this only detects an incomplete album when a missing song precedes one that is present; tracks missing from the end of an album go unnoticed. I'd like to handle this edge case as well. Is there currently a way to do this? I'd prefer not to rely on the Apple Music catalog, since this is supposed to work offline as well. Fetching and storing all trackIDs while connected and later comparing against them would work, but that could mean storing tens of thousands of track IDs. Thank you
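A minimal sketch of the gap heuristic described above, under the assumption that the library album's tracks expose trackNumber via their song values. As noted, a complete-looking sequence can still be missing tracks at the end, which is exactly the undetectable case:

```swift
import MusicKit

// Detect obvious holes in a library album's track numbering.
// Multi-disc albums would need the same check per disc.
func hasDetectableGap(in tracks: MusicItemCollection<Track>) -> Bool {
    let numbers = tracks.compactMap { track -> Int? in
        if case .song(let song) = track { return song.trackNumber }
        return nil
    }.sorted()
    for (index, number) in numbers.enumerated() where number != index + 1 {
        return true
    }
    return false
}
```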
0
0
94
Mar ’25
Failure on attempt to import track as spatial audio
I'm working on a project to support spatial audio editing, using this sample project as a reference: https://developer.apple.com/documentation/Cinematic/editing-spatial-audio-with-an-audio-mix This sample works well on an unedited capture, but does not work for a capture that has already been edited. The failure occurs at "let audioInfo = try await CNAssetSpatialAudioInfo(asset: myAsset)", which throws "no eligible audio tracks in asset". I also find that for already-edited captures, CNAssetSpatialAudioInfo.assetContainsSpatialAudio returns false. What I mean by "already edited" is that if I take a spatial capture with my iPhone 16, edit that capture in the Photos app using the Cinematic effect, and then save the edited output (e.g. edited_capture.mov), I can't import that edited_capture.mov into my project as a spatial audio asset. Is this intentional behavior or a bug? If it's intentional, can you describe why?
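For reference, a small sketch of guarding the import with the check mentioned above, assuming the API surface as described in the post (CNAssetSpatialAudioInfo from the Cinematic framework; the argument label on assetContainsSpatialAudio is an assumption here):

```swift
import AVFoundation
import Cinematic

// Fall back gracefully when an asset no longer advertises spatial audio,
// e.g. after a Photos "Cinematic" edit, per the behavior described above.
func loadSpatialAudioInfo(for asset: AVAsset) async throws -> CNAssetSpatialAudioInfo? {
    guard CNAssetSpatialAudioInfo.assetContainsSpatialAudio(asset: asset) else {
        print("No eligible spatial audio tracks; skipping audio mix editing.")
        return nil
    }
    return try await CNAssetSpatialAudioInfo(asset: asset)
}
```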
0
1
163
Sep ’25
Option to Set Default Interpolation for Keyframes to “Linear” in Final Cut Pro
In Final Cut Pro, keyframes for transform parameters (such as Position, Scale, and Rotation) are automatically set to “Smooth” interpolation. This often results in undesired easing between keyframes, especially when linear motion is required. Currently, we have to manually adjust each keyframe to "Linear" using the Video Animation Editor, which can be time-consuming when working with many keyframes. Would it be possible to add an option to set the default keyframe interpolation to "Linear"—either globally in Preferences or per parameter in the Inspector? This would greatly streamline the animation workflow for many editors. Thank you for considering this request!
0
0
117
Jun ’25
Can individual Apple Developer accounts stream full tracks with MusicKit?
I have implemented fetching Apple Music preview songs using a Swift framework integrated into a Unity app. My requirement is to fetch full tracks from a user’s Apple Music library and play them inside Unity. To do this, I understand that I need to handle authentication, generate a Developer Token, and then obtain a Music User Token to access the user’s Apple Music content. Currently, I have an Individual Apple Developer account (not an Organization one). Based on my research, it seems that with an Individual account I can implement this functionality and even upload builds to TestFlight for internal testing; however, when releasing the app publicly on the App Store, full-track playback may be restricted for Individual accounts and allowed only for Organization accounts. 👉 Can you confirm whether this understanding is correct? 👉 Specifically, is it possible for an Individual account to fetch and play full-length tracks from a subscribed Apple Music user’s library (at least for internal/TestFlight testing)?
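For context, a minimal sketch of the authorization-plus-playback flow described above using MusicKit on Apple platforms. MusicKit manages the developer and user tokens automatically when the app has the MusicKit app service enabled; full-track playback additionally requires the user to have an active Apple Music subscription:

```swift
import MusicKit

// Request user authorization, then play a full track from the user's library.
func playFirstLibrarySong() async throws {
    guard await MusicAuthorization.request() == .authorized else { return }

    var request = MusicLibraryRequest<Song>()
    request.limit = 1
    let response = try await request.response()
    guard let song = response.items.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = [song]
    try await player.play()
}
```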
0
0
180
Sep ’25
Potential Documentation Error in kAudioAggregateDevicePropertyTapList Sample Code
Hi, I believe I've found a potential error in the sample code on the documentation page for creating and using a process tap with an aggregate device. The issue is in the section explaining how to add a tap to the aggregate device. I have already filed a Feedback Assistant ticket on this (ID: FB17411663) but haven't heard back for months.

Capturing system audio with Core Audio taps

The sample code for modifying kAudioAggregateDevicePropertyTapList incorrectly uses the tapID as the target AudioObjectID when calling AudioObjectSetPropertyData:

```swift
// (Code to get the list and potentially modify listAsArray)
if var listAsArray = list as? [CFString] {
    // ... (modification logic) ...

    // Set the list back on the aggregate device. <--- The comment is correct
    list = listAsArray as CFArray
    _ = withUnsafeMutablePointer(to: &list) { list in
        // INCORRECT: This call uses tapID as the target object.
        AudioObjectSetPropertyData(tapID, &propertyAddress, 0, nil, propertySize, list)
    }
}
```

The kAudioAggregateDevicePropertyTapList property belongs to the aggregate device, not the individual tap. Therefore, to set this property, the AudioObjectSetPropertyData call must target the AudioObjectID of the aggregate device itself. Using tapID as the first argument is logically incorrect for this operation and will not update the aggregate device as intended. Furthermore, the preceding AudioObjectGetPropertyData call to fetch the list also appears to use the incorrect tapID as its target in the sample. The AudioObjectID for both getting and setting this property should be the ID of the aggregate device:

```swift
_ = AudioObjectGetPropertyData(aggregateDeviceID, &propertyAddress, 0, nil, &propertySize, &list)
_ = AudioObjectSetPropertyData(aggregateDeviceID, &propertyAddress, 0, nil, propertySize, newList)
```

Thank you!
0
0
356
Sep ’25