AVFoundation

RSS for tag

Work with audiovisual assets, control device cameras, process audio, and configure system audio interactions using AVFoundation.

Posts under AVFoundation tag

200 Posts

Post

Replies

Boosts

Views

Activity

macOS 26 – NSSound/CoreAudio causes SIGILL crash in caulk allocator
Hi everyone, We are the engineering team behind an enterprise communications application for macOS. We are experiencing a critical crash on macOS 26 that did not occur on any previous macOS version. We are seeking clarification from Apple engineers or anyone who may have insight into this behaviour. Environment Architecturex86_64macOS26.4.1 (25E253)HardwareMac15,13 (MacBook Pro)ExceptionSIGILL / ILL_ILLOPCCrashed ThreadThread 0 (Main Thread)TriggerPlaying a notification sound via NSSound during an incoming call Crash Stack 0 caulk consolidating_free_map::maybe_create_free_node + 119 ← SIGILL 1 caulk tiered_allocator + 1469 2 caulk exported_resource::do_allocate + 15 3 AudioToolboxCore EABLImpl::create + 204 4 CoreAudio AUNotQuiteSoSimpleTimeFactory + 33267 8 AudioToolboxCore AudioUnitInitialize + 189 9 AudioToolbox XAudioUnit::Initialize + 19 10 AudioToolbox MESubmixGraph::initialize + 125 11 AudioToolbox MESubmixGraph::connectInputChannel + 1172 12 AudioToolbox MEDeviceStreamClient::AddRunningClient + 509 15 AudioToolbox AudioQueueObject::StartRunning + 194 16 AudioToolbox AudioQueueObject::Start + 1447 22 AudioToolbox AQ::API::V2Impl::AudioQueueStartWithFlags + 805 23 AVFAudio AVAudioPlayerCpp::playQueue + 354 24 AVFAudio AVAudioPlayerCpp::DoAction + 134 25 AVFAudio -[AVAudioPlayer play] + 26 26 AppKit -[NSSound play] + 100 27 Our App -[AudioHelper tryToStartSound:ofType:] + 569 28 Our App block_invoke + 59 Behaviour Difference Between macOS Versions The exact same code path that triggers this crash on macOS 26 works without any issue on macOS 14 and macOS 15 — no crash, no warning, no log output of any kind. The crash occurs inside Apple's private caulk memory allocator during CoreAudio audio engine initialisation, triggered by a call to [NSSound play]. The SIGILL / ILL_ILLOPC at maybe_create_free_node + 119 suggests a hard ud2 trap — an intentional abort guard inserted at compile time. This strongly suggests that something changed in macOS 26 within NSSound / CoreAudio / caulk that causes this code path to fail in a way it previously did not. Questions We have the following specific questions: Was there a deliberate threading policy change in NSSound / CoreAudio in macOS 26? Is the SIGILL in caulk::consolidating_free_map::maybe_create_free_node an intentional thread-affinity assertion introduced in macOS 26? Are there any other NSSound / AVAudioPlayer / AudioQueue APIs that have similarly tightened their requirements in macOS 26 that we should be aware of? Is there a migration guide, release note, or WWDC session that covers CoreAudio changes in macOS 26 that we may have missed? Has anyone else in the developer community encountered a similar SIGILL crash in caulk on macOS 26 during audio playback?
7
0
1k
1h
Camera launched via Camera Control is terminated with “AVCaptureEventInteraction not installed” when viewing/editing photos
I’m seeing a reproducible system-level Camera crash/termination on iPhone Air running iOS 26.4.2. Steps to reproduce: Press Camera Control to launch the Camera app. Tap the lower-left thumbnail to enter the recent photo view. Browse photos, or tap Edit and start cropping a photo. The Camera/Photos flow unexpectedly exits and returns to the Home Screen or widget view. Additional detail: The issue can happen whether or not a new photo is taken after launching Camera with Camera Control. In other words, using Camera Control as a shortcut into Camera, then tapping the lower-left thumbnail to browse photos, can trigger the issue. Sometimes it happens while only browsing photos, without entering Edit. Expected result: The photo viewer/editor should stay open and allow normal browsing or cropping. Actual result: The flow exits unexpectedly. Mac Console evidence: Around 2026-05-12 21:53:59-21:54:00, Console showed SpringBoard/RunningBoard terminating com.apple.camera. Relevant log excerpt: Capture Application Requirements Unmet: "AVCaptureEventInteraction not installed" reportType: CrashLog ReportCrash Parsing corpse data for pid 94087 com.apple.camera: Foreground: false Storage is sufficient. Restart/reset-style support steps have already been tried and did not resolve the issue. This appears specific to the Camera Control launch path, not normal Photos app browsing. Has anyone else seen this on iOS 26.x, or is this a known Camera Control / AVCaptureEventInteraction regression? Already Filed as FB22766094.
0
0
52
8h
AVCaptureSession runtime error -11800 / 'what' on startRunning() with audio input — what's holding the HAL?
AVCaptureSession.startRunning() triggers AVCaptureSessionRuntimeErrorNotification with AVError.unknown (-11800), underlying OSStatus 2003329396 → fourCC 'what', every cold launch, but only when an audio AVCaptureDeviceInput is attached. Removing only the audio input makes the error disappear. Same code in a fresh project records audio fine — bug only appears in this app's binary. AVAudioApplication.shared.recordPermission == .granted. Info.plist has NSMicrophoneUsageDescription. No interruption notifications fire. Test device: iPhone 16 Pro, iOS 26.4.2. iOS deployment target 17.1. Minimal reproducer import AVFoundation let session = AVCaptureSession() session.beginConfiguration() let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)! session.addInput(try AVCaptureDeviceInput(device: camera)) // Removing ONLY this line makes the error disappear: let mic = AVCaptureDevice.default(for: .audio)! session.addInput(try AVCaptureDeviceInput(device: mic)) session.addOutput(AVCaptureMovieFileOutput()) session.addOutput(AVCapturePhotoOutput()) session.commitConfiguration() NotificationCenter.default.addObserver( forName: .AVCaptureSessionRuntimeError, object: session, queue: nil ) { print($0.userInfo ?? [:]) } session.startRunning() // -11800 / 'what' fires within ~2 sec Observed state at error time AVError.unknown (-11800) underlyingError = NSError(NSOSStatusErrorDomain, 2003329396) userInfo[AVErrorFourCharCode] = 'what' captureSession.isRunning = false ← never came up captureSession.isInterrupted = false captureSession.preset = .high captureSession.inputs = [Back Triple Camera, iPhone Microphone] AVAudioSession.sharedInstance(): category = .playAndRecord mode = .videoRecording sampleRate = 48000.0 isInputAvailable = true isOtherAudioPlaying = false availableInputs = [MicrophoneBuiltIn] (no BT/Continuity/AirPods) currentRoute.inputs = [] ← EMPTY currentRoute.outputs = [Speaker|Speaker] 2003329396 = 0x77686174 = 'what'. From a few SO threads this maps to AURemoteIO::StartIO returning a HAL-bring-up failure. The smoking gun: currentRoute.inputs is empty even though availableInputs contains the built-in mic, isInputAvailable is true, the category is .playAndRecord, and isOtherAudioPlaying is false. The HAL never routes the mic into the session, then 'what' follows. Nothing observable from AVAudioSession indicates a competing client. Environment / SDKs linked Firebase (SPM: Crashlytics, Performance, Messaging, Analytics, AppCheck, RemoteConfig, DynamicLinks), FBSDK, Kingfisher, MetalPetal. Multiple Google ad mediation pods present, but their audio session takeover is already disabled (audioVideoManager.isAudioSessionApplicationManaged = true, IMSdk.shouldAutoManageAVAudioSession(false)). What I've ruled out (all still produce 'what') Audio session config: .playAndRecord/.videoRecording, .playAndRecord/.default, .record/.measurement, .record/.default. With/without .defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP, .mixWithOthers. setActive(true) before vs. after attaching audio input. setPreferredInput(builtInMic) (verified accepted). 200ms Thread.sleep between setActive(true) and startRunning(). Setting usesApplicationAudioSession = false swaps the fourCC to '!rec' but produces the same outcome. Topology: sessionPreset = .high / .hd1920x1080 / .hd1280x720 / .medium. Camera = .builtInTripleCamera / .builtInDualWideCamera / .builtInWideAngleCamera. AVCam-style always-attached graph. Setting sessionPreset before vs. after adding inputs. Threading: All session mutations on a single dedicated DispatchQueue (vs. Swift actor). 1× and 2× full stopRunning()+startRunning() recovery cycles ("do it twice" pattern) — both re-fail with 'what'. SDK takeover prevention: GoogleMobileAdsMediation pods (Vungle, Mintegral, Pangle, Unity, InMobi), Google-Mobile-Ads-SDK, MediaPipeTasksVision removed via full pod uninstall + clean build — 'what' persists. Notifications during the failure window: 3 × AVAudioSession.routeChangeNotification reason categoryChange before the error fires, even though category stays .playAndRecord/.videoRecording. Disabling automaticallyConfiguresApplicationAudioSession drops this to 1, but the runtime error still fires. No AVAudioSession.interruptionNotification. No AVCaptureSessionWasInterruptedNotification. Symbol audit otool -L and nm of the bundle confirm none of the linked frameworks reference AVAudioRecorder, AudioComponentInstanceNew, AURemoteIO, or AudioUnitInitialize in their symbol tables. Only the app's own files reference any audio API. Yet adding AVCaptureDeviceInput(.audio) reproduces 100% in this binary and 0% in a fresh project. My questions Who is most likely holding the audio HAL in a process where no linked framework references the AudioUnit / HAL APIs directly? Are there framework load-time audio initializations that don't show up in symbol tables (e.g., dynamic dlopen, CFBundleLoadExecutable) that could grab the HAL? Is there an os_log subsystem / category that surfaces the underlying AURemoteIO::StartIO failure reason at runtime? com.apple.coreaudio shows 'what' but not the originating cause. currentRoute.inputs is empty at error time even though availableInputs = [MicrophoneBuiltIn], isInputAvailable = true, and the category is .playAndRecord. What does an empty input route under those conditions imply, and what other system-level holders could be preventing the HAL from routing the mic in? Has anyone seen 'what' resolve with a device reboot, an iOS update, or by removing a specific framework? Happy to share a sysdiagnose. Thanks!
1
0
255
22h
RotationCoordinator returns angles 90 degrees lower on iPhone 17 Pro front camera — clarification on contract with AVSampleBufferDisplayLayer
Hi AVFoundation team, I'm seeing a uniform 90° offset in AVCaptureDevice.RotationCoordinator's reported angles between iPhone 17 Pro and iPhone 14 Pro using the front-facing .builtInWideAngleCamera (Center Stage on 17 Pro), and I'd like to confirm whether this is by design and what the recommended consumption pattern is when the rendering surface is an AVSampleBufferDisplayLayer rather than an AVCaptureVideoPreviewLayer. Here is the github repo of sample project. Setup Devices: iPhone 14 Pro (iOS 26.5) and iPhone 17 Pro (iOS 26.4.2) Camera: front, AVCaptureDeviceTypeBuiltInWideAngleCamera Active format: 1920×1080 Three RotationCoordinator instances are created on the same AVCaptureDevice, varying only the previewLayer: argument: - previewLayer: nil - previewLayer: AVSampleBufferDisplayLayer (the surface receiving frames from AVCaptureVideoDataOutput) - previewLayer: AVCaptureVideoPreviewLayer (with .session = captureSession, not displayed) Each instance is KVO-observed for videoRotationAngleForHorizonLevelPreview and videoRotationAngleForHorizonLevelCapture. Observed angles Device / Orientation: 14 Pro · Portrait (interface=1) RC[nil] prev / cap: 0° / 90° RC[AVSampleBufferLayer] prev / cap: 90° / 90° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 90° ──────────────────────────────────────── Device / Orientation: 14 Pro · LandscapeRight (interface=3) RC[nil] prev / cap: 0° / 180° RC[AVSampleBufferLayer] prev / cap: 180° / 180° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 180° ──────────────────────────────────────── Device / Orientation: 17 Pro · Portrait (interface=1) RC[nil] prev / cap: 0° / 0° RC[AVSampleBufferLayer] prev / cap: 0° / 0° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 0° ──────────────────────────────────────── Device / Orientation: 17 Pro · LandscapeRight (interface=3) RC[nil] prev / cap: 0° / 90° RC[AVSampleBufferLayer] prev / cap: 90° / 90° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 90° The −90° offset on 17 Pro is uniform: it appears in every RC variant, in both the preview-angle and the capture-angle properties, at every orientation tested. It is not specific to the previewLayer: argument.
1
0
243
1d
Radiometric interpretation of Apple ProRAW and Bayer RAW access via AVFoundation
I am working on a computational photography research project involving multi-exposure HDR reconstruction using Bayer RAW and Apple ProRAW captures. I would like to clarify the radiometric interpretation of Apple ProRAW and the availability of Bayer RAW capture through AVFoundation. My questions are: 1.On current iPhone Pro devices, is it possible for third-party apps to capture and export true Bayer-pattern RAW DNG files through AVFoundation, rather than Apple ProRAW linear DNG files? If so, which availableRawPhotoPixelFormatTypes correspond to Bayer RAW, and what device or format restrictions apply? 2.Apple ProRAW appears to be demosaiced and computationally processed, and may include multi-frame fusion. Is the decoded ProRAW image intended to be radiometrically linear and scene-referred? 3.For a bracketed ProRAW sequence captured with fixed ISO, white balance, lens, and focus, but different exposure times, can one assume that the decoded linear pixel values Y_i(p) satisfy an exposure-proportional model in non-saturated regions, such as Y_i(p) ≈ t_i R(p), across brackets? This question is about radiometric consistency for algorithmic use, not about visual editing or tone mapping. Thank you for your help.
0
0
138
3d
Radiometric interpretation of Apple ProRAW and Bayer RAW access via AVFoundation
I am working on a computational photography research project involving multi-exposure HDR reconstruction using Bayer RAW and Apple ProRAW captures. I would like to clarify the radiometric interpretation of Apple ProRAW and the availability of Bayer RAW capture through AVFoundation. My questions are: On current iPhone Pro devices, is it possible for third-party apps to capture and export true Bayer-pattern RAW DNG files through AVFoundation, rather than Apple ProRAW linear DNG files? If so, which availableRawPhotoPixelFormatTypes correspond to Bayer RAW, and what device or format restrictions apply? Apple ProRAW appears to be demosaiced and computationally processed, and may include multi-frame fusion. Is the decoded ProRAW image intended to be radiometrically linear and scene-referred? For a bracketed ProRAW sequence captured with fixed ISO, white balance, lens, and focus, but different exposure times, can one assume that the decoded linear pixel values Y_i(p) satisfy an exposure-proportional model in non-saturated regions, such as Y_i(p) ≈ t_i R(p), across brackets? This question is about radiometric consistency for algorithmic use, not about visual editing or tone mapping. Thank you for your help.
0
0
144
3d
AudioHardwareCreateProcessTap delivers all-zero buffers while system audio is audible
Summary Using AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice for system audio capture. During long sessions, the AudioDeviceIOProc callback continues firing normally but every PCM sample is exactly 0.0f — while the system is producing audible output. Environment Field Value macOS 26.5 Beta Hardware MacBook Air (M2) API AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice Tap CATapDescription, processes = [], .unmuted, private Format 48,000 Hz, Float32, interleaved stereo Aggregate anchor kAudioAggregateDeviceMainSubDeviceKey = current default output UID Observed behavior After running normally for several minutes, the stream transitions into an all-zero state: AudioDeviceIOProc continues to fire at expected cadence Frame count, timestamps (mHostTime, mSampleTime), and mDataByteSize all look normal AudioBufferList pointers are valid Every sample in every buffer is exactly 0.0f Other apps are still producing audible output through the same output device The condition may self-recover or persist until the session is stopped Confirmed via RMS logging both inside the IOProc and after the ring buffer consumer — data is zero on delivery, not introduced downstream. Example: 51-minute session on MacBook Air M2 Segment 1 (~7 min): Three all-zero periods: 60 s, 53 s, 141 s. Real PCM briefly returned between them. Segment 2 (~44 min): Two all-zero periods: 16 min 3 s, 3 min 8 s. IOProc cadence, timestamp deltas, default output UID, and kAudioDevicePropertyDeviceIsRunningSomewhere all remained normal throughout. What I have ruled out Actual silence: User was in an active video call and could hear participants through the output device. Default output device change: Monitored kAudioHardwarePropertyDefaultOutputDevice — no change during affected periods. IOProc stall: Heartbeat and kAudioDevicePropertyDeviceIsRunningSomewhere remained normal. Aggregate device destroyed: AudioObjectGetPropertyData on the aggregate UID continued returning the expected device. Tap descriptor misconfiguration: The same tap produces valid PCM earlier in the same session, so this is not a startup-time issue. Why detection is hard All-zero buffers from a broken tap are indistinguishable from legitimate silence (muted participant, waiting room, paused media). kAudioProcessPropertyIsRunningOutput reports whether a process has active output IO, not whether it is contributing non-zero samples — a muted Zoom call still reports true. Possible correlations Sample-rate renegotiation on the output device (44.1 kHz ↔ 48 kHz) when another app changes output Bluetooth device state changes (AirPods sleep/wake) where UID stays the same MacBook Air more frequently affected than MacBook Pro Always occurs after extended uptime — first few minutes are consistently clean Current workaround Full teardown and rebuild restores real PCM. Restarting the IOProc alone or recreating only the aggregate device is not reliable — both the Process Tap and Aggregate Device must be destroyed and recreated. 1. AudioDeviceStop 2. AudioDeviceDestroyIOProcID 3. AudioHardwareDestroyAggregateDevice 4. AudioHardwareDestroyProcessTap 5. AudioHardwareCreateProcessTap 6. AudioHardwareCreateAggregateDevice 7. Create + start new IOProc Applying this automatically is risky because it cannot be reliably distinguished from legitimate silence. Questions Expected failure mode? Can a Process Tap continue delivering zero-filled buffers while the system output is audible? Is this expected under certain device or routing conditions? Detection signal? Is there any HAL property, notification, or diagnostic counter that distinguishes "sources are genuinely silent" from "the tap data path has stopped receiving the real mix"? Targeted recovery? Is there a supported way to re-anchor or reset the tap data path without destroying and recreating both objects? Full rebuild as intended workaround? If so, it would help to confirm this so developers can converge on a consistent approach. Mixer activity signal? kAudioProcessPropertyIsRunningOutput reflects IO registration, not sample contribution. Is there any AudioProcess property that indicates a process is currently delivering non-zero audio to the system mixer?
0
0
222
5d
CarPlay HID transport buttons remap to call-control during continuous mic capture (no opt-out API)
Hello, I am developing Uniq Intercom, a voice-only group communication app for motorcyclists (always-on intercom over WebRTC, used continuously for multi-hour rides). I am seeking guidance on an iOS audio session and CarPlay HID interaction I have not been able to resolve through documented APIs. Problem: As soon as my app activates the microphone (yellow recording indicator visible), iOS appears to classify the app as a real-time communication participant and CarPlay re-routes the steering-wheel / handlebar HID transport buttons (left / right / ok) from the media-control role to the call-control role (answer/decline). Because I do not register a CallKit / LiveCommunicationKit call (the session is a continuous group voice channel, not a discrete telephony call), there is no call object for those buttons to act upon — they effectively become inert. Why this matters: Motorcyclists rely on the intercom for 4–6 hour rides. CarPlay is now built into a growing number of modern motorcycles and with aftermarket display units virtually any bike, and any rider who uses any voice communication platform alongside it — Uniq Intercom, WhatsApp Call currently runs into this same handlebar button remap. With the buttons inert, the rider's only remaining option is to reach for the motorcycle's touchscreen to skip a track or change navigation — this is unsafe. The exact same remap behavior occurs during a real WhatsApp or Phone call — but for those the remap is correct (answer/decline maps to a real call). For continuous voice apps without a CallKit-style discrete call, no equivalent path exists today. As both an iOS developer and a motorcyclist, I would very much like to see this resolved — solving it would meaningfully improve safety for every rider using an iPhone with CarPlay. Configurations I have tested (all reproduce the symptom on iOS 18+ / 26 with wireless CarPlay): AVAudioSession.Category.playAndRecord + .voiceChat mode + various option combinations (duckOthers, mixWithOthers, allowBluetoothHFP, allowBluetoothA2DP, defaultToSpeaker) Same category with .videoChat mode (which @livekit/react-native defaults to) Same category with .default mode (re-applied after setAudioModeAsync to defeat any framework override) — confirmed Mode = Default for ~2 s window in audiomxd log before WebRTC's setActive cycle returned mode to .voiceChat. Buttons remained remapped during the .default window. Disabling MPRemoteCommandCenter and clearing MPNowPlayingInfoCenter.default().nowPlayingInfo JS-side override of WebRTC's global RTCAudioSessionConfiguration via @livekit/react-native's AudioSession.setAppleAudioConfiguration({audioMode: 'default'}) bridge, applied both before connect and after setAudioModeAsync to defeat library overrides In every case the audiomxd system log confirms our session goes active (Mode = VoiceChat or Default, Recording = YES), and CarPlay HID buttons are immediately remapped to call-control. The middle "OK" button remains functional because it is not part of the call-control mapping — confirming the buttons are not blocked, only re-purposed. The remap occurs the instant the iOS recording indicator appears, regardless of audio session mode. This led me to conclude the trigger is not audio session mode but the combination of microphone permission + active session + (likely) the AUVoiceIO unit instantiated by WebRTC. I cannot find a public API path to suppress this classification while maintaining the always-on continuous voice channel. My questions: Is there an entitlement or API that allows an app with active microphone capture to declare itself as a non-call media participant, keeping CarPlay HID transport buttons in the media role? Is AVAudioSession.setPrefersEchoCancelledInput(_:) (iOS 18+) the intended path for retaining platform AEC under .default mode without the focus-engine "communication priority" marking? Documentation is sparse on its CarPlay arbitration implications. Does the PushToTalk framework affect HID arbitration differently from playAndRecord + voiceChat? Our continuous-channel UX does not fit the PTT transmit-on-press model, but understanding the contrast would help. If no current API exists, is this something the iOS Audio team would consider for future SDKs? Solving this would meaningfully improve safety for motorcycle and adventure-sport users on iOS. Thank you for your time and any guidance you can offer. — Emre Erkaya / Uniq Intercom
1
0
123
5d
AVMutableComposition audio silently drops on iOS 26 when streaming over HTTP/2 (FB22696516)
We've discovered a regression in iOS 26 where AVMutableComposition silently drops audio when the source asset is streamed over HTTP/2. The same file served over HTTP/1.1 plays audio correctly through the same composition code. Direct AVPlayer playback (without composition) works fine on HTTP/2. This did not occur on iOS 18.x. It happens on physical devices only. It does not reproduce on a simulator or on macOS. Tested conditions (same MP4 file, different CDNs): CloudFront (HTTP/2) + Composition → ❌ Audio silent Cloudflare (HTTP/2) + Composition → ❌ Audio silent Akamai (HTTP/1.1) + Composition → ✅ Audio works Apple TS (HTTP/1.1) + Composition → ✅ Audio works Downloaded locally, then composed → ✅ Audio works Direct playback, no composition (HTTP/2) → ✅ Audio works The CloudFront and Akamai URLs serve the identical file — same S3 object, different CDN edge. CDN vendor doesn't matter; any HTTP/2 source triggers it. Minimal reproduction: let asset = AVURLAsset(url: http2URL) let videoTrack = try await asset.loadTracks(withMediaType: .video).first! let audioTrack = try await asset.loadTracks(withMediaType: .audio).first! let duration = try await asset.load(.duration) let composition = AVMutableComposition() let fullRange = CMTimeRange(start: .zero, end: duration) let compVideo = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)! try compVideo.insertTimeRange(fullRange, of: videoTrack, at: .zero) let compAudio = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)! try compAudio.insertTimeRange(fullRange, of: audioTrack, at: .zero) let item = AVPlayerItem(asset: composition.copy() as! AVComposition) player.replaceCurrentItem(with: item) player.play() // Video plays, audio goes silent after a while Playing the same asset directly works fine: player.replaceCurrentItem(with: AVPlayerItem(asset: asset)) player.play() // Both video and audio work Filed as FB22696516 Sample project: https://github.com/karlingen/AVCompositionBug
1
9
212
1w
Why doesn’t AVPlayer / AVFoundation support MPEG-DASH (MPD)? Any public rationale?
Hi, I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple. I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.). Questions: Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation? Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)? If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support? Thanks!
2
1
1k
1w
Working with kCVPixelFormatType_96VersatileBayerPacked12
Whilst AVCaptureSession is setup to capture ProRes RAW video, is it possible to get video pixel data which can read and processed, such as using CIImage(cvPixelBuffer: ) AVCaptureVideoDataOutput outputs ProRes RAW in kCVPixelFormatType_96VersatileBayerPacked12 pixel format. Is there a provided way to debayer this pixel format into something more usable?
0
0
100
1w
Manual FairPlay License Renewal: AVContentKeySessionDelegate not triggering via addContentKeyRecipient
Hi everyone, I am working on an app that supports offline playback with FairPlay Streaming (FPS). I have successfully implemented the logic to download and persist the content keys (TLLV), and offline playback is working correctly using the stored persistent keys. However, I am now trying to implement a manual renewal process for these licenses, and I’ve run into an issue where the delegate methods are not being fired as expected. The Issue: I am calling contentKeySession.addContentKeyRecipient(asset) to force a renewal or re-fetch of the content key for a specific asset. Even though the asset is correctly initialized and the session is active, the AVContentKeySessionDelegate methods (specifically contentKeySession(_:didProvide:)) are not being triggered at all. My Questions: Why is the delegate not firing when adding the recipient? Is there a specific state or property the AVURLAsset needs to have (or a specific way it should be initialized) to trigger a new key request via addContentKeyRecipient? Is it possible to perform a manual license renewal triggered by a UI action (e.g., a button tap) without actually initiating playback of the asset? The goal is to allow users to refresh their licenses manually while online, ensuring the content remains playable offline before the previous license expires, all without forcing the user to start the video. Any insights or best practices for this manual renewal flow would be greatly appreciated.
3
0
687
2w
PHPhotosErrorDomain Code: 3302 started affecting my users recently.
Recently I received multiple user email supports because the app cannot save video to photos, they are all on iOS 26.x The code in my app around recording video and saving to Photos hasn't changed in years. I'm not able to reproduce it locally, I tried on all my available devices. In recently published build I added additional logs and it appears that all of cases that fail with 3302 have the photos access set to "Limited Access". It never happens to users with "Full Access". In that build I also added a fallback, when saving to photos fails, the app saves to Documents and it seems it works (two of my users affected users confirmed it), but it's very unfortunate. I think it kind of proves that videos aren't broken given that users are able to play them just fine. On of users says that for him saving to Photos works for 2-3 times after he reinstalls the app and then it stops working. Did anything recently changed in how we should save videos to photos? I'm using the following code. I can see in git blame that I haven't changed in since 2020 and never encountered those errors in development or heard about those issues until around 1 month ago. Thank you. PHPhotoLibrary.shared().performChanges { PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL) } completionHandler: { success, error in
2
0
165
2w
visionOS: AVFoundation cannot deliver simultaneous video from two external (UVC) cameras; no public USB fallback exists
Area: visionOS 26.4 · AVFoundation · AVCapture · External/UVC video Classification: Suggestion / API Enhancement Request (also: Incorrect/Missing Documentation) Device / OS: Apple Vision Pro, visionOS 26.x. Xcode 26.4.1, XROS26.4.sdk. Summary On visionOS, a third-party app cannot display two UVC USB cameras (connected through a powered USB-C hub) at the same time. Every AVFoundation path that would enable this on iPadOS is either unavailable or fails at runtime on visionOS, and there is no public non-AVFoundation fallback (no IOUSBHost, no DriverKit, no usable CoreMediaIO, no MFi path for generic UVC devices). This is a real capability gap relative to iPadOS and macOS, and Camo Studio on iPadOS (App Store ID 6450313385) demonstrates the two-camera USB-hub use case is legitimate and valuable for spatial-video/hybrid-capture workflows on Vision Pro. Steps to reproduce Connect a powered USB-C hub to Apple Vision Pro with two UVC webcams attached. Build a visionOS app that uses AVCaptureDevice.DiscoverySession(deviceTypes: [.external], …). Observe: both cameras are discovered and enumerate as distinct AVCaptureDevices. Attempt A — two independent sessions: Create two independent AVCaptureSessions, each with one AVCaptureDeviceInput and one AVCaptureVideoDataOutput, start both. Result: only one session delivers sample buffers. The other stalls silently with no error and no interruption notification. Attempt B — AVCaptureMultiCamSession with manual connections (the pattern that works on iPadOS 18+): Result: code does not compile. In XROS26.4.sdk: AVCaptureInputPort is API_UNAVAILABLE(visionos) (AVCaptureInput.h) AVCaptureInput.ports is API_UNAVAILABLE(visionos) AVCaptureDeviceInput.portsWithMediaType:sourceDeviceType:sourceDevicePosition: is API_UNAVAILABLE(macos, visionos) Therefore AVCaptureConnection(inputPorts:output:) cannot be constructed. AVCaptureMultiCamSession itself is declared API_AVAILABLE(… visionos(2.1)), which is misleading because without input-port access the manual-connection path the class requires is unreachable. Expected behavior Either of the following would resolve this, in order of preference: Expose the missing API surface on visionOS. Make AVCaptureInputPort, AVCaptureInput.ports, and AVCaptureDeviceInput.portsWithMediaType:sourceDeviceType:sourceDevicePosition: available on visionOS so the documented iPadOS multi-cam pattern compiles and runs. AVCaptureMultiCamSession is already declared available — the supporting API surface should match. Allow two concurrent plain AVCaptureSessions to each own a distinct external AVCaptureDevice. Each session binds a different hardware device, and the current serialization appears to be a software policy rather than a hardware constraint (a powered hub has bandwidth for both). Document the limit explicitly and surface a clear error or interruption reason on the stalled session so apps can fail loudly instead of appearing to work. Actual behavior AVCaptureMultiCamSession advertises visionos(2.1) availability but the APIs required to wire its connections are marked unavailable on visionOS. Two concurrent AVCaptureSessions silently deliver frames to only one session; no error is reported on the other. There is no public alternative framework on visionOS for raw UVC access to work around this: IOUSBHost.framework — not present in XROS26.4.sdk DriverKit — not present in XROS26.4.sdk IOKit — ships a stub (IOKit.tbd); no public USB device interfaces CoreMediaIO — headers are an apinotes stub on visionOS ExternalAccessory — MFi-only; generic UVC devices don't enumerate This means there is no public path, AVFoundation or otherwise, for a third-party visionOS app to display two UVC cameras at once. Impact / use cases Apple Vision Pro is uniquely suited to multi-camera monitoring and capture workflows — spatial creators, broadcast/AV producers, multi-angle reference during immersive authoring, clinical and field-recording use cases, and apps that combine a primary UVC cinema camera with a secondary UVC reference/overview angle. iPadOS already supports this via AVCaptureMultiCamSession (demonstrated shipping by Camo Studio). The current visionOS limitation pushes these workflows back to iPad or macOS and undermines Vision Pro's positioning as a pro capture/monitor environment. References iPadOS reference implementation: Apple sample Displaying Video From Connected Devices + AVCaptureMultiCamSession with manual AVCaptureConnection wiring — works on iPadOS 18+ with two UVC cameras via a powered hub. Shipping precedent: Camo Studio — two simultaneous UVC cameras via USB hub on iPad — https://apps.apple.com/us/app/camo-studio-stream-record/id6450313385 visionOS 26.4 SDK headers cited above (AVCaptureInput.h, AVCaptureSession.h).
1
0
1.2k
2w
Clarification on WWDC25 Session 300: Do iPhone 11 and SE (2nd gen) fully support Frame Interpolation & Super Resolution without issues?
Hello everyone, I have a question regarding the Ultra-Low Latency Frame Interpolation and Super Resolution features introduced in WWDC 2025 Session 300 (https://developer.apple.com/videos/play/wwdc2025/300/). In the video, it was mentioned that these features run on any device as long as it has iOS 26.0 or later and an Apple Silicon chipset. Based on the official support guide (https://support.apple.com/ko-kr/guide/iphone/iphe3fa5df43/ios), the iPhone 11 and iPhone SE (2nd generation) are listed as supported devices. I just want to double-check and confirm: since they meet the criteria mentioned in the video, do these features actually run without any performance issues or limitations on the iPhone 11 and iPhone SE (2nd gen)? I want to make sure I understand the exact hardware capabilities before proceeding with development. Thanks for your help!
1
0
436
3w
AVContentKeySession: Cannot re-fetch content key once obtained — expected behavior?
We are developing a video streaming app that uses AVContentKeySession with FairPlay Streaming. Our implementation supports both online playback (non-persistable keys) and offline playback (persistable keys). We have observed the following behavior: Once a content key has been obtained for a given Content Key ID, AVContentKeySession does not trigger contentKeySession(_:didProvide:) again for that same Key ID We also attempted to explicitly call processContentKeyRequest(withIdentifier:initializationData:options:) on the session to force a new key request for the same identifier, but this did not result in the delegate callback being fired again. The session appears to consider the key already resolved and silently ignores the request. This means that if a user first plays content online (receiving a non-persistable key), and later wants to download the same content for offline use (requiring a persistable key), the delegate callback is not fired again, and we have no opportunity to request a persistable key. Questions Is this the expected behavior? Specifically, is it by design that AVContentKeySession caches the key for a given Key ID and does not re-request it — even when processContentKeyRequest(withIdentifier:) is explicitly called? Should we use distinct Content Key IDs for persistable vs. non-persistable keys? For example, if the same piece of content can be played both online and offline, is the recommended approach to have the server provide different EXT-X-KEY URIs (and thus different key identifiers) for the streaming and download variants? Is there a supported way to force a fresh key request for a Key ID that has already been resolved — for example, to upgrade from a non-persistable to a persistable key? Environment iOS 18+ AVContentKeySession(keySystem: .fairPlayStreaming) Any guidance on the recommended approach for supporting both streaming and offline playback for the same content would be greatly appreciated.
1
0
387
4w
`LockedCameraCaptureManager` practically unusable since iOS 26
Somewhere since iOS 26, the LockedCameraCapture framework gets in an unpredictable state after opening the main app from the LockedCamera extension using LockedCameraCaptureSession.openApplication(for userActivity:). (Feedback with sample code to reproduce: FB21966835) Opening the extension from the lock screen again doesn’t open the extension but puts the lock screen in a state as if it has. Content updated from LockedCameraCaptureManager.shared.sessionContentUpdates comes in inconsistently, usually needs the app to be opened again or the extension to be opened. This makes using this extension impossible for me as I use it to record video files that manually need to be imported when the app is launched (so not through PhotoKit). Does anybody have a suggestion to circumvent this issue or how to get this fixed?
0
0
265
4w
Setting up video and image capture pipeline creates internal errors in AVFoundation.
I have created code for iOS that allows me to start and stop video acquisition from a proprietary USB camera using AVFoundation's AVCaptureSession and AVCaptureDevice APIs. There is a start and stop method. The start method takes an argument to specify one of two formats that I use for my custom camera application. I can start the session and switch between formats all day without any errors. However, if I start and then stop the camera three times in a row, on the third invocation of start, I get errors in the console output and the CMSampleBuffers stop flowing to my callback. Additionally, once I get AVFoundation into this state, stoping the camera doesn't help. I have to kill the app and start over. Here are the errors. And below these, the code. I'm hoping someone who has experience with these errors or an engineer from Apple who knows the AVFoundation image capture pipeline code, can respond and tell me what I'm doing wrong. Thanks. <<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:558) - (err=-16453) <<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:253) - (err=-16453) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:269) - (err=-16453) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:511) - (err=-16453) Capture session error: The operation could not be completed Capture session error: The operation could not be completed func start(for deviceFormat: String) async throws -> AnyPublisher<CMSampleBuffer, Swift.Error> { func configureCaptureDevice(with deviceFormat: String) throws { guard let format = formatDict[deviceFormat] else { throw Error.captureFormatNotFound } captureSession.beginConfiguration() defer { captureSession.commitConfiguration() } try captureDevice.lockForConfiguration() captureDeviceFormat = deviceFormat captureDevice.activeFormat = format captureDevice.unlockForConfiguration() } return try await withCheckedThrowingContinuation { continuation in sessionQueue.async { [unowned self] in logger.debug("Start capture session for \(deviceFormat): \(String(describing: captureSession))") // If we were already steaming camera images from a different mode, terminate that stream. bufferPublisher?.send(completion: .finished) bufferPublisher = nil captureDeviceFormat = "" do { // Re-configure with the new format; should be harmless if called with the currently configured format. try configureCaptureDevice(with: deviceFormat) // Return a new stream publisher for this invocation. bufferPublisher = PassthroughSubject<CMSampleBuffer, Swift.Error>() // If we are not currently running, start the image capture pipeline. if captureSession.isRunning == false { captureSession.startRunning() } continuation.resume(returning: bufferPublisher!.eraseToAnyPublisher()) } catch { logger.fault("Failed to start camera: \(error.localizedDescription)") continuation.resume(throwing: error) } } } } func stop() async throws { try await withCheckedThrowingContinuation { continuation in sessionQueue.async { [unowned self] in logger.debug("Stop capture session: \(String(describing: captureSession))") // The following invocation is synchronous and takes time to execute; // looks like a stall but you can ignore it as the MainActor is not blocked. captureSession.stopRunning() // Terminate the stream and reset our state. bufferPublisher?.send(completion: .finished) bufferPublisher = nil captureDeviceFormat = "" // Signal the caller that we are done here. continuation.resume() } } }
0
0
232
4w
iOS 26.4 regression: The `.pauses` audiovisual background playback policy does not pause video playback anymore when backgrounding the app
Starting with iOS 26.4 and the iOS 26.4 SDK, the .pauses audiovisual background playback policy is not correctly applied anymore to an AVPlayer having an attached video layer displayed on screen. This means that, when backgrounding a video-playing app (without Picture in Picture support) or locking the device, playback is not paused automatically by the system anymore. This issue affects the Apple TV application as well. We have filed FB22488151 with more information.
0
0
230
Apr ’26
iPad Pro M4 giving wrong value for layerPointConverted for ultra wide angle
I am using iPad Pro M4 device to apply exposure point to the camera. While converting layerPointConverted from 0 -1 range to device size point it is giving wrong value. But if same code is used for other iPad like Gen2, it gives proper value. In both cases video gravity used is resizeAspectFill. I tried using true depth camera for M4 device but it does not work.
0
0
200
Apr ’26
macOS 26 – NSSound/CoreAudio causes SIGILL crash in caulk allocator
Hi everyone, We are the engineering team behind an enterprise communications application for macOS. We are experiencing a critical crash on macOS 26 that did not occur on any previous macOS version. We are seeking clarification from Apple engineers or anyone who may have insight into this behaviour. Environment Architecturex86_64macOS26.4.1 (25E253)HardwareMac15,13 (MacBook Pro)ExceptionSIGILL / ILL_ILLOPCCrashed ThreadThread 0 (Main Thread)TriggerPlaying a notification sound via NSSound during an incoming call Crash Stack 0 caulk consolidating_free_map::maybe_create_free_node + 119 ← SIGILL 1 caulk tiered_allocator + 1469 2 caulk exported_resource::do_allocate + 15 3 AudioToolboxCore EABLImpl::create + 204 4 CoreAudio AUNotQuiteSoSimpleTimeFactory + 33267 8 AudioToolboxCore AudioUnitInitialize + 189 9 AudioToolbox XAudioUnit::Initialize + 19 10 AudioToolbox MESubmixGraph::initialize + 125 11 AudioToolbox MESubmixGraph::connectInputChannel + 1172 12 AudioToolbox MEDeviceStreamClient::AddRunningClient + 509 15 AudioToolbox AudioQueueObject::StartRunning + 194 16 AudioToolbox AudioQueueObject::Start + 1447 22 AudioToolbox AQ::API::V2Impl::AudioQueueStartWithFlags + 805 23 AVFAudio AVAudioPlayerCpp::playQueue + 354 24 AVFAudio AVAudioPlayerCpp::DoAction + 134 25 AVFAudio -[AVAudioPlayer play] + 26 26 AppKit -[NSSound play] + 100 27 Our App -[AudioHelper tryToStartSound:ofType:] + 569 28 Our App block_invoke + 59 Behaviour Difference Between macOS Versions The exact same code path that triggers this crash on macOS 26 works without any issue on macOS 14 and macOS 15 — no crash, no warning, no log output of any kind. The crash occurs inside Apple's private caulk memory allocator during CoreAudio audio engine initialisation, triggered by a call to [NSSound play]. The SIGILL / ILL_ILLOPC at maybe_create_free_node + 119 suggests a hard ud2 trap — an intentional abort guard inserted at compile time. This strongly suggests that something changed in macOS 26 within NSSound / CoreAudio / caulk that causes this code path to fail in a way it previously did not. Questions We have the following specific questions: Was there a deliberate threading policy change in NSSound / CoreAudio in macOS 26? Is the SIGILL in caulk::consolidating_free_map::maybe_create_free_node an intentional thread-affinity assertion introduced in macOS 26? Are there any other NSSound / AVAudioPlayer / AudioQueue APIs that have similarly tightened their requirements in macOS 26 that we should be aware of? Is there a migration guide, release note, or WWDC session that covers CoreAudio changes in macOS 26 that we may have missed? Has anyone else in the developer community encountered a similar SIGILL crash in caulk on macOS 26 during audio playback?
Replies
7
Boosts
0
Views
1k
Activity
1h
Camera launched via Camera Control is terminated with “AVCaptureEventInteraction not installed” when viewing/editing photos
I’m seeing a reproducible system-level Camera crash/termination on iPhone Air running iOS 26.4.2. Steps to reproduce: Press Camera Control to launch the Camera app. Tap the lower-left thumbnail to enter the recent photo view. Browse photos, or tap Edit and start cropping a photo. The Camera/Photos flow unexpectedly exits and returns to the Home Screen or widget view. Additional detail: The issue can happen whether or not a new photo is taken after launching Camera with Camera Control. In other words, using Camera Control as a shortcut into Camera, then tapping the lower-left thumbnail to browse photos, can trigger the issue. Sometimes it happens while only browsing photos, without entering Edit. Expected result: The photo viewer/editor should stay open and allow normal browsing or cropping. Actual result: The flow exits unexpectedly. Mac Console evidence: Around 2026-05-12 21:53:59-21:54:00, Console showed SpringBoard/RunningBoard terminating com.apple.camera. Relevant log excerpt: Capture Application Requirements Unmet: "AVCaptureEventInteraction not installed" reportType: CrashLog ReportCrash Parsing corpse data for pid 94087 com.apple.camera: Foreground: false Storage is sufficient. Restart/reset-style support steps have already been tried and did not resolve the issue. This appears specific to the Camera Control launch path, not normal Photos app browsing. Has anyone else seen this on iOS 26.x, or is this a known Camera Control / AVCaptureEventInteraction regression? Already Filed as FB22766094.
Replies
0
Boosts
0
Views
52
Activity
8h
AVCaptureSession runtime error -11800 / 'what' on startRunning() with audio input — what's holding the HAL?
AVCaptureSession.startRunning() triggers AVCaptureSessionRuntimeErrorNotification with AVError.unknown (-11800), underlying OSStatus 2003329396 → fourCC 'what', every cold launch, but only when an audio AVCaptureDeviceInput is attached. Removing only the audio input makes the error disappear. Same code in a fresh project records audio fine — bug only appears in this app's binary. AVAudioApplication.shared.recordPermission == .granted. Info.plist has NSMicrophoneUsageDescription. No interruption notifications fire. Test device: iPhone 16 Pro, iOS 26.4.2. iOS deployment target 17.1. Minimal reproducer import AVFoundation let session = AVCaptureSession() session.beginConfiguration() let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)! session.addInput(try AVCaptureDeviceInput(device: camera)) // Removing ONLY this line makes the error disappear: let mic = AVCaptureDevice.default(for: .audio)! session.addInput(try AVCaptureDeviceInput(device: mic)) session.addOutput(AVCaptureMovieFileOutput()) session.addOutput(AVCapturePhotoOutput()) session.commitConfiguration() NotificationCenter.default.addObserver( forName: .AVCaptureSessionRuntimeError, object: session, queue: nil ) { print($0.userInfo ?? [:]) } session.startRunning() // -11800 / 'what' fires within ~2 sec Observed state at error time AVError.unknown (-11800) underlyingError = NSError(NSOSStatusErrorDomain, 2003329396) userInfo[AVErrorFourCharCode] = 'what' captureSession.isRunning = false ← never came up captureSession.isInterrupted = false captureSession.preset = .high captureSession.inputs = [Back Triple Camera, iPhone Microphone] AVAudioSession.sharedInstance(): category = .playAndRecord mode = .videoRecording sampleRate = 48000.0 isInputAvailable = true isOtherAudioPlaying = false availableInputs = [MicrophoneBuiltIn] (no BT/Continuity/AirPods) currentRoute.inputs = [] ← EMPTY currentRoute.outputs = [Speaker|Speaker] 2003329396 = 0x77686174 = 'what'. From a few SO threads this maps to AURemoteIO::StartIO returning a HAL-bring-up failure. The smoking gun: currentRoute.inputs is empty even though availableInputs contains the built-in mic, isInputAvailable is true, the category is .playAndRecord, and isOtherAudioPlaying is false. The HAL never routes the mic into the session, then 'what' follows. Nothing observable from AVAudioSession indicates a competing client. Environment / SDKs linked Firebase (SPM: Crashlytics, Performance, Messaging, Analytics, AppCheck, RemoteConfig, DynamicLinks), FBSDK, Kingfisher, MetalPetal. Multiple Google ad mediation pods present, but their audio session takeover is already disabled (audioVideoManager.isAudioSessionApplicationManaged = true, IMSdk.shouldAutoManageAVAudioSession(false)). What I've ruled out (all still produce 'what') Audio session config: .playAndRecord/.videoRecording, .playAndRecord/.default, .record/.measurement, .record/.default. With/without .defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP, .mixWithOthers. setActive(true) before vs. after attaching audio input. setPreferredInput(builtInMic) (verified accepted). 200ms Thread.sleep between setActive(true) and startRunning(). Setting usesApplicationAudioSession = false swaps the fourCC to '!rec' but produces the same outcome. Topology: sessionPreset = .high / .hd1920x1080 / .hd1280x720 / .medium. Camera = .builtInTripleCamera / .builtInDualWideCamera / .builtInWideAngleCamera. AVCam-style always-attached graph. Setting sessionPreset before vs. after adding inputs. Threading: All session mutations on a single dedicated DispatchQueue (vs. Swift actor). 1× and 2× full stopRunning()+startRunning() recovery cycles ("do it twice" pattern) — both re-fail with 'what'. SDK takeover prevention: GoogleMobileAdsMediation pods (Vungle, Mintegral, Pangle, Unity, InMobi), Google-Mobile-Ads-SDK, MediaPipeTasksVision removed via full pod uninstall + clean build — 'what' persists. Notifications during the failure window: 3 × AVAudioSession.routeChangeNotification reason categoryChange before the error fires, even though category stays .playAndRecord/.videoRecording. Disabling automaticallyConfiguresApplicationAudioSession drops this to 1, but the runtime error still fires. No AVAudioSession.interruptionNotification. No AVCaptureSessionWasInterruptedNotification. Symbol audit otool -L and nm of the bundle confirm none of the linked frameworks reference AVAudioRecorder, AudioComponentInstanceNew, AURemoteIO, or AudioUnitInitialize in their symbol tables. Only the app's own files reference any audio API. Yet adding AVCaptureDeviceInput(.audio) reproduces 100% in this binary and 0% in a fresh project. My questions Who is most likely holding the audio HAL in a process where no linked framework references the AudioUnit / HAL APIs directly? Are there framework load-time audio initializations that don't show up in symbol tables (e.g., dynamic dlopen, CFBundleLoadExecutable) that could grab the HAL? Is there an os_log subsystem / category that surfaces the underlying AURemoteIO::StartIO failure reason at runtime? com.apple.coreaudio shows 'what' but not the originating cause. currentRoute.inputs is empty at error time even though availableInputs = [MicrophoneBuiltIn], isInputAvailable = true, and the category is .playAndRecord. What does an empty input route under those conditions imply, and what other system-level holders could be preventing the HAL from routing the mic in? Has anyone seen 'what' resolve with a device reboot, an iOS update, or by removing a specific framework? Happy to share a sysdiagnose. Thanks!
Replies
1
Boosts
0
Views
255
Activity
22h
RotationCoordinator returns angles 90 degrees lower on iPhone 17 Pro front camera — clarification on contract with AVSampleBufferDisplayLayer
Hi AVFoundation team, I'm seeing a uniform 90° offset in AVCaptureDevice.RotationCoordinator's reported angles between iPhone 17 Pro and iPhone 14 Pro using the front-facing .builtInWideAngleCamera (Center Stage on 17 Pro), and I'd like to confirm whether this is by design and what the recommended consumption pattern is when the rendering surface is an AVSampleBufferDisplayLayer rather than an AVCaptureVideoPreviewLayer. Here is the github repo of sample project. Setup Devices: iPhone 14 Pro (iOS 26.5) and iPhone 17 Pro (iOS 26.4.2) Camera: front, AVCaptureDeviceTypeBuiltInWideAngleCamera Active format: 1920×1080 Three RotationCoordinator instances are created on the same AVCaptureDevice, varying only the previewLayer: argument: - previewLayer: nil - previewLayer: AVSampleBufferDisplayLayer (the surface receiving frames from AVCaptureVideoDataOutput) - previewLayer: AVCaptureVideoPreviewLayer (with .session = captureSession, not displayed) Each instance is KVO-observed for videoRotationAngleForHorizonLevelPreview and videoRotationAngleForHorizonLevelCapture. Observed angles Device / Orientation: 14 Pro · Portrait (interface=1) RC[nil] prev / cap: 0° / 90° RC[AVSampleBufferLayer] prev / cap: 90° / 90° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 90° ──────────────────────────────────────── Device / Orientation: 14 Pro · LandscapeRight (interface=3) RC[nil] prev / cap: 0° / 180° RC[AVSampleBufferLayer] prev / cap: 180° / 180° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 180° ──────────────────────────────────────── Device / Orientation: 17 Pro · Portrait (interface=1) RC[nil] prev / cap: 0° / 0° RC[AVSampleBufferLayer] prev / cap: 0° / 0° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 0° ──────────────────────────────────────── Device / Orientation: 17 Pro · LandscapeRight (interface=3) RC[nil] prev / cap: 0° / 90° RC[AVSampleBufferLayer] prev / cap: 90° / 90° RC[AVCaptureVideoPreviewLayer] prev / cap: 0° / 90° The −90° offset on 17 Pro is uniform: it appears in every RC variant, in both the preview-angle and the capture-angle properties, at every orientation tested. It is not specific to the previewLayer: argument.
Replies
1
Boosts
0
Views
243
Activity
1d
Radiometric interpretation of Apple ProRAW and Bayer RAW access via AVFoundation
I am working on a computational photography research project involving multi-exposure HDR reconstruction using Bayer RAW and Apple ProRAW captures. I would like to clarify the radiometric interpretation of Apple ProRAW and the availability of Bayer RAW capture through AVFoundation. My questions are: 1.On current iPhone Pro devices, is it possible for third-party apps to capture and export true Bayer-pattern RAW DNG files through AVFoundation, rather than Apple ProRAW linear DNG files? If so, which availableRawPhotoPixelFormatTypes correspond to Bayer RAW, and what device or format restrictions apply? 2.Apple ProRAW appears to be demosaiced and computationally processed, and may include multi-frame fusion. Is the decoded ProRAW image intended to be radiometrically linear and scene-referred? 3.For a bracketed ProRAW sequence captured with fixed ISO, white balance, lens, and focus, but different exposure times, can one assume that the decoded linear pixel values Y_i(p) satisfy an exposure-proportional model in non-saturated regions, such as Y_i(p) ≈ t_i R(p), across brackets? This question is about radiometric consistency for algorithmic use, not about visual editing or tone mapping. Thank you for your help.
Replies
0
Boosts
0
Views
138
Activity
3d
Radiometric interpretation of Apple ProRAW and Bayer RAW access via AVFoundation
I am working on a computational photography research project involving multi-exposure HDR reconstruction using Bayer RAW and Apple ProRAW captures. I would like to clarify the radiometric interpretation of Apple ProRAW and the availability of Bayer RAW capture through AVFoundation. My questions are: On current iPhone Pro devices, is it possible for third-party apps to capture and export true Bayer-pattern RAW DNG files through AVFoundation, rather than Apple ProRAW linear DNG files? If so, which availableRawPhotoPixelFormatTypes correspond to Bayer RAW, and what device or format restrictions apply? Apple ProRAW appears to be demosaiced and computationally processed, and may include multi-frame fusion. Is the decoded ProRAW image intended to be radiometrically linear and scene-referred? For a bracketed ProRAW sequence captured with fixed ISO, white balance, lens, and focus, but different exposure times, can one assume that the decoded linear pixel values Y_i(p) satisfy an exposure-proportional model in non-saturated regions, such as Y_i(p) ≈ t_i R(p), across brackets? This question is about radiometric consistency for algorithmic use, not about visual editing or tone mapping. Thank you for your help.
Replies
0
Boosts
0
Views
144
Activity
3d
AudioHardwareCreateProcessTap delivers all-zero buffers while system audio is audible
Summary Using AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice for system audio capture. During long sessions, the AudioDeviceIOProc callback continues firing normally but every PCM sample is exactly 0.0f — while the system is producing audible output. Environment Field Value macOS 26.5 Beta Hardware MacBook Air (M2) API AudioHardwareCreateProcessTap + AudioHardwareCreateAggregateDevice Tap CATapDescription, processes = [], .unmuted, private Format 48,000 Hz, Float32, interleaved stereo Aggregate anchor kAudioAggregateDeviceMainSubDeviceKey = current default output UID Observed behavior After running normally for several minutes, the stream transitions into an all-zero state: AudioDeviceIOProc continues to fire at expected cadence Frame count, timestamps (mHostTime, mSampleTime), and mDataByteSize all look normal AudioBufferList pointers are valid Every sample in every buffer is exactly 0.0f Other apps are still producing audible output through the same output device The condition may self-recover or persist until the session is stopped Confirmed via RMS logging both inside the IOProc and after the ring buffer consumer — data is zero on delivery, not introduced downstream. Example: 51-minute session on MacBook Air M2 Segment 1 (~7 min): Three all-zero periods: 60 s, 53 s, 141 s. Real PCM briefly returned between them. Segment 2 (~44 min): Two all-zero periods: 16 min 3 s, 3 min 8 s. IOProc cadence, timestamp deltas, default output UID, and kAudioDevicePropertyDeviceIsRunningSomewhere all remained normal throughout. What I have ruled out Actual silence: User was in an active video call and could hear participants through the output device. Default output device change: Monitored kAudioHardwarePropertyDefaultOutputDevice — no change during affected periods. IOProc stall: Heartbeat and kAudioDevicePropertyDeviceIsRunningSomewhere remained normal. Aggregate device destroyed: AudioObjectGetPropertyData on the aggregate UID continued returning the expected device. Tap descriptor misconfiguration: The same tap produces valid PCM earlier in the same session, so this is not a startup-time issue. Why detection is hard All-zero buffers from a broken tap are indistinguishable from legitimate silence (muted participant, waiting room, paused media). kAudioProcessPropertyIsRunningOutput reports whether a process has active output IO, not whether it is contributing non-zero samples — a muted Zoom call still reports true. Possible correlations Sample-rate renegotiation on the output device (44.1 kHz ↔ 48 kHz) when another app changes output Bluetooth device state changes (AirPods sleep/wake) where UID stays the same MacBook Air more frequently affected than MacBook Pro Always occurs after extended uptime — first few minutes are consistently clean Current workaround Full teardown and rebuild restores real PCM. Restarting the IOProc alone or recreating only the aggregate device is not reliable — both the Process Tap and Aggregate Device must be destroyed and recreated. 1. AudioDeviceStop 2. AudioDeviceDestroyIOProcID 3. AudioHardwareDestroyAggregateDevice 4. AudioHardwareDestroyProcessTap 5. AudioHardwareCreateProcessTap 6. AudioHardwareCreateAggregateDevice 7. Create + start new IOProc Applying this automatically is risky because it cannot be reliably distinguished from legitimate silence. Questions Expected failure mode? Can a Process Tap continue delivering zero-filled buffers while the system output is audible? Is this expected under certain device or routing conditions? Detection signal? Is there any HAL property, notification, or diagnostic counter that distinguishes "sources are genuinely silent" from "the tap data path has stopped receiving the real mix"? Targeted recovery? Is there a supported way to re-anchor or reset the tap data path without destroying and recreating both objects? Full rebuild as intended workaround? If so, it would help to confirm this so developers can converge on a consistent approach. Mixer activity signal? kAudioProcessPropertyIsRunningOutput reflects IO registration, not sample contribution. Is there any AudioProcess property that indicates a process is currently delivering non-zero audio to the system mixer?
Replies
0
Boosts
0
Views
222
Activity
5d
CarPlay HID transport buttons remap to call-control during continuous mic capture (no opt-out API)
Hello, I am developing Uniq Intercom, a voice-only group communication app for motorcyclists (always-on intercom over WebRTC, used continuously for multi-hour rides). I am seeking guidance on an iOS audio session and CarPlay HID interaction I have not been able to resolve through documented APIs. Problem: As soon as my app activates the microphone (yellow recording indicator visible), iOS appears to classify the app as a real-time communication participant and CarPlay re-routes the steering-wheel / handlebar HID transport buttons (left / right / ok) from the media-control role to the call-control role (answer/decline). Because I do not register a CallKit / LiveCommunicationKit call (the session is a continuous group voice channel, not a discrete telephony call), there is no call object for those buttons to act upon — they effectively become inert. Why this matters: Motorcyclists rely on the intercom for 4–6 hour rides. CarPlay is now built into a growing number of modern motorcycles and with aftermarket display units virtually any bike, and any rider who uses any voice communication platform alongside it — Uniq Intercom, WhatsApp Call currently runs into this same handlebar button remap. With the buttons inert, the rider's only remaining option is to reach for the motorcycle's touchscreen to skip a track or change navigation — this is unsafe. The exact same remap behavior occurs during a real WhatsApp or Phone call — but for those the remap is correct (answer/decline maps to a real call). For continuous voice apps without a CallKit-style discrete call, no equivalent path exists today. As both an iOS developer and a motorcyclist, I would very much like to see this resolved — solving it would meaningfully improve safety for every rider using an iPhone with CarPlay. Configurations I have tested (all reproduce the symptom on iOS 18+ / 26 with wireless CarPlay): AVAudioSession.Category.playAndRecord + .voiceChat mode + various option combinations (duckOthers, mixWithOthers, allowBluetoothHFP, allowBluetoothA2DP, defaultToSpeaker) Same category with .videoChat mode (which @livekit/react-native defaults to) Same category with .default mode (re-applied after setAudioModeAsync to defeat any framework override) — confirmed Mode = Default for ~2 s window in audiomxd log before WebRTC's setActive cycle returned mode to .voiceChat. Buttons remained remapped during the .default window. Disabling MPRemoteCommandCenter and clearing MPNowPlayingInfoCenter.default().nowPlayingInfo JS-side override of WebRTC's global RTCAudioSessionConfiguration via @livekit/react-native's AudioSession.setAppleAudioConfiguration({audioMode: 'default'}) bridge, applied both before connect and after setAudioModeAsync to defeat library overrides In every case the audiomxd system log confirms our session goes active (Mode = VoiceChat or Default, Recording = YES), and CarPlay HID buttons are immediately remapped to call-control. The middle "OK" button remains functional because it is not part of the call-control mapping — confirming the buttons are not blocked, only re-purposed. The remap occurs the instant the iOS recording indicator appears, regardless of audio session mode. This led me to conclude the trigger is not audio session mode but the combination of microphone permission + active session + (likely) the AUVoiceIO unit instantiated by WebRTC. I cannot find a public API path to suppress this classification while maintaining the always-on continuous voice channel. My questions: Is there an entitlement or API that allows an app with active microphone capture to declare itself as a non-call media participant, keeping CarPlay HID transport buttons in the media role? Is AVAudioSession.setPrefersEchoCancelledInput(_:) (iOS 18+) the intended path for retaining platform AEC under .default mode without the focus-engine "communication priority" marking? Documentation is sparse on its CarPlay arbitration implications. Does the PushToTalk framework affect HID arbitration differently from playAndRecord + voiceChat? Our continuous-channel UX does not fit the PTT transmit-on-press model, but understanding the contrast would help. If no current API exists, is this something the iOS Audio team would consider for future SDKs? Solving this would meaningfully improve safety for motorcycle and adventure-sport users on iOS. Thank you for your time and any guidance you can offer. — Emre Erkaya / Uniq Intercom
Replies
1
Boosts
0
Views
123
Activity
5d
AVMutableComposition audio silently drops on iOS 26 when streaming over HTTP/2 (FB22696516)
We've discovered a regression in iOS 26 where AVMutableComposition silently drops audio when the source asset is streamed over HTTP/2. The same file served over HTTP/1.1 plays audio correctly through the same composition code. Direct AVPlayer playback (without composition) works fine on HTTP/2. This did not occur on iOS 18.x. It happens on physical devices only. It does not reproduce on a simulator or on macOS. Tested conditions (same MP4 file, different CDNs): CloudFront (HTTP/2) + Composition → ❌ Audio silent Cloudflare (HTTP/2) + Composition → ❌ Audio silent Akamai (HTTP/1.1) + Composition → ✅ Audio works Apple TS (HTTP/1.1) + Composition → ✅ Audio works Downloaded locally, then composed → ✅ Audio works Direct playback, no composition (HTTP/2) → ✅ Audio works The CloudFront and Akamai URLs serve the identical file — same S3 object, different CDN edge. CDN vendor doesn't matter; any HTTP/2 source triggers it. Minimal reproduction: let asset = AVURLAsset(url: http2URL) let videoTrack = try await asset.loadTracks(withMediaType: .video).first! let audioTrack = try await asset.loadTracks(withMediaType: .audio).first! let duration = try await asset.load(.duration) let composition = AVMutableComposition() let fullRange = CMTimeRange(start: .zero, end: duration) let compVideo = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)! try compVideo.insertTimeRange(fullRange, of: videoTrack, at: .zero) let compAudio = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)! try compAudio.insertTimeRange(fullRange, of: audioTrack, at: .zero) let item = AVPlayerItem(asset: composition.copy() as! AVComposition) player.replaceCurrentItem(with: item) player.play() // Video plays, audio goes silent after a while Playing the same asset directly works fine: player.replaceCurrentItem(with: AVPlayerItem(asset: asset)) player.play() // Both video and audio work Filed as FB22696516 Sample project: https://github.com/karlingen/AVCompositionBug
Replies
1
Boosts
9
Views
212
Activity
1w
Why doesn’t AVPlayer / AVFoundation support MPEG-DASH (MPD)? Any public rationale?
Hi, I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple. I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.). Questions: Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation? Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)? If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support? Thanks!
Replies
2
Boosts
1
Views
1k
Activity
1w
Working with kCVPixelFormatType_96VersatileBayerPacked12
Whilst AVCaptureSession is setup to capture ProRes RAW video, is it possible to get video pixel data which can read and processed, such as using CIImage(cvPixelBuffer: ) AVCaptureVideoDataOutput outputs ProRes RAW in kCVPixelFormatType_96VersatileBayerPacked12 pixel format. Is there a provided way to debayer this pixel format into something more usable?
Replies
0
Boosts
0
Views
100
Activity
1w
Manual FairPlay License Renewal: AVContentKeySessionDelegate not triggering via addContentKeyRecipient
Hi everyone, I am working on an app that supports offline playback with FairPlay Streaming (FPS). I have successfully implemented the logic to download and persist the content keys (TLLV), and offline playback is working correctly using the stored persistent keys. However, I am now trying to implement a manual renewal process for these licenses, and I’ve run into an issue where the delegate methods are not being fired as expected. The Issue: I am calling contentKeySession.addContentKeyRecipient(asset) to force a renewal or re-fetch of the content key for a specific asset. Even though the asset is correctly initialized and the session is active, the AVContentKeySessionDelegate methods (specifically contentKeySession(_:didProvide:)) are not being triggered at all. My Questions: Why is the delegate not firing when adding the recipient? Is there a specific state or property the AVURLAsset needs to have (or a specific way it should be initialized) to trigger a new key request via addContentKeyRecipient? Is it possible to perform a manual license renewal triggered by a UI action (e.g., a button tap) without actually initiating playback of the asset? The goal is to allow users to refresh their licenses manually while online, ensuring the content remains playable offline before the previous license expires, all without forcing the user to start the video. Any insights or best practices for this manual renewal flow would be greatly appreciated.
Replies
3
Boosts
0
Views
687
Activity
2w
PHPhotosErrorDomain Code: 3302 started affecting my users recently.
Recently I received multiple user email supports because the app cannot save video to photos, they are all on iOS 26.x The code in my app around recording video and saving to Photos hasn't changed in years. I'm not able to reproduce it locally, I tried on all my available devices. In recently published build I added additional logs and it appears that all of cases that fail with 3302 have the photos access set to "Limited Access". It never happens to users with "Full Access". In that build I also added a fallback, when saving to photos fails, the app saves to Documents and it seems it works (two of my users affected users confirmed it), but it's very unfortunate. I think it kind of proves that videos aren't broken given that users are able to play them just fine. On of users says that for him saving to Photos works for 2-3 times after he reinstalls the app and then it stops working. Did anything recently changed in how we should save videos to photos? I'm using the following code. I can see in git blame that I haven't changed in since 2020 and never encountered those errors in development or heard about those issues until around 1 month ago. Thank you. PHPhotoLibrary.shared().performChanges { PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL) } completionHandler: { success, error in
Replies
2
Boosts
0
Views
165
Activity
2w
visionOS: AVFoundation cannot deliver simultaneous video from two external (UVC) cameras; no public USB fallback exists
Area: visionOS 26.4 · AVFoundation · AVCapture · External/UVC video Classification: Suggestion / API Enhancement Request (also: Incorrect/Missing Documentation) Device / OS: Apple Vision Pro, visionOS 26.x. Xcode 26.4.1, XROS26.4.sdk. Summary On visionOS, a third-party app cannot display two UVC USB cameras (connected through a powered USB-C hub) at the same time. Every AVFoundation path that would enable this on iPadOS is either unavailable or fails at runtime on visionOS, and there is no public non-AVFoundation fallback (no IOUSBHost, no DriverKit, no usable CoreMediaIO, no MFi path for generic UVC devices). This is a real capability gap relative to iPadOS and macOS, and Camo Studio on iPadOS (App Store ID 6450313385) demonstrates the two-camera USB-hub use case is legitimate and valuable for spatial-video/hybrid-capture workflows on Vision Pro. Steps to reproduce Connect a powered USB-C hub to Apple Vision Pro with two UVC webcams attached. Build a visionOS app that uses AVCaptureDevice.DiscoverySession(deviceTypes: [.external], …). Observe: both cameras are discovered and enumerate as distinct AVCaptureDevices. Attempt A — two independent sessions: Create two independent AVCaptureSessions, each with one AVCaptureDeviceInput and one AVCaptureVideoDataOutput, start both. Result: only one session delivers sample buffers. The other stalls silently with no error and no interruption notification. Attempt B — AVCaptureMultiCamSession with manual connections (the pattern that works on iPadOS 18+): Result: code does not compile. In XROS26.4.sdk: AVCaptureInputPort is API_UNAVAILABLE(visionos) (AVCaptureInput.h) AVCaptureInput.ports is API_UNAVAILABLE(visionos) AVCaptureDeviceInput.portsWithMediaType:sourceDeviceType:sourceDevicePosition: is API_UNAVAILABLE(macos, visionos) Therefore AVCaptureConnection(inputPorts:output:) cannot be constructed. AVCaptureMultiCamSession itself is declared API_AVAILABLE(… visionos(2.1)), which is misleading because without input-port access the manual-connection path the class requires is unreachable. Expected behavior Either of the following would resolve this, in order of preference: Expose the missing API surface on visionOS. Make AVCaptureInputPort, AVCaptureInput.ports, and AVCaptureDeviceInput.portsWithMediaType:sourceDeviceType:sourceDevicePosition: available on visionOS so the documented iPadOS multi-cam pattern compiles and runs. AVCaptureMultiCamSession is already declared available — the supporting API surface should match. Allow two concurrent plain AVCaptureSessions to each own a distinct external AVCaptureDevice. Each session binds a different hardware device, and the current serialization appears to be a software policy rather than a hardware constraint (a powered hub has bandwidth for both). Document the limit explicitly and surface a clear error or interruption reason on the stalled session so apps can fail loudly instead of appearing to work. Actual behavior AVCaptureMultiCamSession advertises visionos(2.1) availability but the APIs required to wire its connections are marked unavailable on visionOS. Two concurrent AVCaptureSessions silently deliver frames to only one session; no error is reported on the other. There is no public alternative framework on visionOS for raw UVC access to work around this: IOUSBHost.framework — not present in XROS26.4.sdk DriverKit — not present in XROS26.4.sdk IOKit — ships a stub (IOKit.tbd); no public USB device interfaces CoreMediaIO — headers are an apinotes stub on visionOS ExternalAccessory — MFi-only; generic UVC devices don't enumerate This means there is no public path, AVFoundation or otherwise, for a third-party visionOS app to display two UVC cameras at once. Impact / use cases Apple Vision Pro is uniquely suited to multi-camera monitoring and capture workflows — spatial creators, broadcast/AV producers, multi-angle reference during immersive authoring, clinical and field-recording use cases, and apps that combine a primary UVC cinema camera with a secondary UVC reference/overview angle. iPadOS already supports this via AVCaptureMultiCamSession (demonstrated shipping by Camo Studio). The current visionOS limitation pushes these workflows back to iPad or macOS and undermines Vision Pro's positioning as a pro capture/monitor environment. References iPadOS reference implementation: Apple sample Displaying Video From Connected Devices + AVCaptureMultiCamSession with manual AVCaptureConnection wiring — works on iPadOS 18+ with two UVC cameras via a powered hub. Shipping precedent: Camo Studio — two simultaneous UVC cameras via USB hub on iPad — https://apps.apple.com/us/app/camo-studio-stream-record/id6450313385 visionOS 26.4 SDK headers cited above (AVCaptureInput.h, AVCaptureSession.h).
Replies
1
Boosts
0
Views
1.2k
Activity
2w
Clarification on WWDC25 Session 300: Do iPhone 11 and SE (2nd gen) fully support Frame Interpolation & Super Resolution without issues?
Hello everyone, I have a question regarding the Ultra-Low Latency Frame Interpolation and Super Resolution features introduced in WWDC 2025 Session 300 (https://developer.apple.com/videos/play/wwdc2025/300/). In the video, it was mentioned that these features run on any device as long as it has iOS 26.0 or later and an Apple Silicon chipset. Based on the official support guide (https://support.apple.com/ko-kr/guide/iphone/iphe3fa5df43/ios), the iPhone 11 and iPhone SE (2nd generation) are listed as supported devices. I just want to double-check and confirm: since they meet the criteria mentioned in the video, do these features actually run without any performance issues or limitations on the iPhone 11 and iPhone SE (2nd gen)? I want to make sure I understand the exact hardware capabilities before proceeding with development. Thanks for your help!
Replies
1
Boosts
0
Views
436
Activity
3w
AVContentKeySession: Cannot re-fetch content key once obtained — expected behavior?
We are developing a video streaming app that uses AVContentKeySession with FairPlay Streaming. Our implementation supports both online playback (non-persistable keys) and offline playback (persistable keys). We have observed the following behavior: Once a content key has been obtained for a given Content Key ID, AVContentKeySession does not trigger contentKeySession(_:didProvide:) again for that same Key ID We also attempted to explicitly call processContentKeyRequest(withIdentifier:initializationData:options:) on the session to force a new key request for the same identifier, but this did not result in the delegate callback being fired again. The session appears to consider the key already resolved and silently ignores the request. This means that if a user first plays content online (receiving a non-persistable key), and later wants to download the same content for offline use (requiring a persistable key), the delegate callback is not fired again, and we have no opportunity to request a persistable key. Questions Is this the expected behavior? Specifically, is it by design that AVContentKeySession caches the key for a given Key ID and does not re-request it — even when processContentKeyRequest(withIdentifier:) is explicitly called? Should we use distinct Content Key IDs for persistable vs. non-persistable keys? For example, if the same piece of content can be played both online and offline, is the recommended approach to have the server provide different EXT-X-KEY URIs (and thus different key identifiers) for the streaming and download variants? Is there a supported way to force a fresh key request for a Key ID that has already been resolved — for example, to upgrade from a non-persistable to a persistable key? Environment iOS 18+ AVContentKeySession(keySystem: .fairPlayStreaming) Any guidance on the recommended approach for supporting both streaming and offline playback for the same content would be greatly appreciated.
Replies
1
Boosts
0
Views
387
Activity
4w
`LockedCameraCaptureManager` practically unusable since iOS 26
Somewhere since iOS 26, the LockedCameraCapture framework gets in an unpredictable state after opening the main app from the LockedCamera extension using LockedCameraCaptureSession.openApplication(for userActivity:). (Feedback with sample code to reproduce: FB21966835) Opening the extension from the lock screen again doesn’t open the extension but puts the lock screen in a state as if it has. Content updated from LockedCameraCaptureManager.shared.sessionContentUpdates comes in inconsistently, usually needs the app to be opened again or the extension to be opened. This makes using this extension impossible for me as I use it to record video files that manually need to be imported when the app is launched (so not through PhotoKit). Does anybody have a suggestion to circumvent this issue or how to get this fixed?
Replies
0
Boosts
0
Views
265
Activity
4w
Setting up video and image capture pipeline creates internal errors in AVFoundation.
I have created code for iOS that allows me to start and stop video acquisition from a proprietary USB camera using AVFoundation's AVCaptureSession and AVCaptureDevice APIs. There is a start and stop method. The start method takes an argument to specify one of two formats that I use for my custom camera application. I can start the session and switch between formats all day without any errors. However, if I start and then stop the camera three times in a row, on the third invocation of start, I get errors in the console output and the CMSampleBuffers stop flowing to my callback. Additionally, once I get AVFoundation into this state, stoping the camera doesn't help. I have to kill the app and start over. Here are the errors. And below these, the code. I'm hoping someone who has experience with these errors or an engineer from Apple who knows the AVFoundation image capture pipeline code, can respond and tell me what I'm doing wrong. Thanks. <<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:558) - (err=-16453) <<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:253) - (err=-16453) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:269) - (err=-16453) <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:511) - (err=-16453) Capture session error: The operation could not be completed Capture session error: The operation could not be completed func start(for deviceFormat: String) async throws -> AnyPublisher<CMSampleBuffer, Swift.Error> { func configureCaptureDevice(with deviceFormat: String) throws { guard let format = formatDict[deviceFormat] else { throw Error.captureFormatNotFound } captureSession.beginConfiguration() defer { captureSession.commitConfiguration() } try captureDevice.lockForConfiguration() captureDeviceFormat = deviceFormat captureDevice.activeFormat = format captureDevice.unlockForConfiguration() } return try await withCheckedThrowingContinuation { continuation in sessionQueue.async { [unowned self] in logger.debug("Start capture session for \(deviceFormat): \(String(describing: captureSession))") // If we were already steaming camera images from a different mode, terminate that stream. bufferPublisher?.send(completion: .finished) bufferPublisher = nil captureDeviceFormat = "" do { // Re-configure with the new format; should be harmless if called with the currently configured format. try configureCaptureDevice(with: deviceFormat) // Return a new stream publisher for this invocation. bufferPublisher = PassthroughSubject<CMSampleBuffer, Swift.Error>() // If we are not currently running, start the image capture pipeline. if captureSession.isRunning == false { captureSession.startRunning() } continuation.resume(returning: bufferPublisher!.eraseToAnyPublisher()) } catch { logger.fault("Failed to start camera: \(error.localizedDescription)") continuation.resume(throwing: error) } } } } func stop() async throws { try await withCheckedThrowingContinuation { continuation in sessionQueue.async { [unowned self] in logger.debug("Stop capture session: \(String(describing: captureSession))") // The following invocation is synchronous and takes time to execute; // looks like a stall but you can ignore it as the MainActor is not blocked. captureSession.stopRunning() // Terminate the stream and reset our state. bufferPublisher?.send(completion: .finished) bufferPublisher = nil captureDeviceFormat = "" // Signal the caller that we are done here. continuation.resume() } } }
Replies
0
Boosts
0
Views
232
Activity
4w
iOS 26.4 regression: The `.pauses` audiovisual background playback policy does not pause video playback anymore when backgrounding the app
Starting with iOS 26.4 and the iOS 26.4 SDK, the .pauses audiovisual background playback policy is not correctly applied anymore to an AVPlayer having an attached video layer displayed on screen. This means that, when backgrounding a video-playing app (without Picture in Picture support) or locking the device, playback is not paused automatically by the system anymore. This issue affects the Apple TV application as well. We have filed FB22488151 with more information.
Replies
0
Boosts
0
Views
230
Activity
Apr ’26
iPad Pro M4 giving wrong value for layerPointConverted for ultra wide angle
I am using iPad Pro M4 device to apply exposure point to the camera. While converting layerPointConverted from 0 -1 range to device size point it is giving wrong value. But if same code is used for other iPad like Gen2, it gives proper value. In both cases video gravity used is resizeAspectFill. I tried using true depth camera for M4 device but it does not work.
Replies
0
Boosts
0
Views
200
Activity
Apr ’26