Environment
Device: iPhone 15 Pro
iOS: iOS 18.0
Framework: AVFoundation
App type: Custom camera app using AVCaptureSession + AVCaptureVideoPreviewLayer
I’m seeing an intermittent but frequent issue where the camera preview layer briefly flashes empty after certain interruptions, even though the capture session reports itself as running and no errors are emitted.
This happens most often after:
Locking and unlocking the device
Switching cameras (back ↔ front)
The issue is not 100% reproducible, but occurs often enough to be noticeable in normal usage.
What happens
The preview layer briefly flashes as empty (sometimes just a “micro-frame”)
Duration: typically ~0.5–2 seconds before frames resume
session.isRunning == true throughout
No crash, no runtime error, no interruption end failure
Focus/exposure restore correctly once frames resume
Visually it looks like the preview layer loses frames temporarily, even though the session appears healthy.
Repro
Intermittent but frequent after:
Lock → unlock device
Switching camera (front/back)
Timing-dependent and non-deterministic
Happens multiple times per session, but not every time
Key observation
AVCaptureSession.isRunning == true does not guarantee that frames are actually flowing.
To verify this, I temporarily added an AVCaptureVideoDataOutput (see the sketch after this list):
During the blank period, no sample buffers are delivered
Frames resume after ~1–2s without any explicit restart
Session state remains “running” the entire time
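A minimal sketch of that check (the stall threshold here is illustrative):
import AVFoundation

// Minimal frame-flow monitor: records the arrival time of each sample
// buffer so a stall can be detected independently of session.isRunning.
final class FrameFlowMonitor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private(set) var lastFrameTime = Date.distantPast

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        lastFrameTime = Date()
    }

    // Illustrative threshold; anything well above one frame interval works.
    func isStalled(threshold: TimeInterval = 0.5) -> Bool {
        Date().timeIntervalSince(lastFrameTime) > threshold
    }
}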
What I’ve tried (did NOT fix it)
Adding delays before/after startRunning() (0.1–0.5s)
Calling startRunning() on different queues
Restarting the session in AVCaptureSessionInterruptionEnded
Verifying session.connections (all show isActive == true)
Rebuilding inputs/outputs during interruption recovery
Ensuring startRunning() is never called between beginConfiguration() / commitConfiguration()
(Hit the expected runtime warning when attempted)
None of the above removed the brief blank preview.
Workaround (works visually but expensive)
The workaround is to keep the AVCaptureVideoDataOutput described above attached at all times. This visually fixes the issue, but:
Energy impact jumps from Low → High in Xcode Energy Gauge
AVCaptureVideoDataOutput processes 30–60 FPS continuously
The gap only lasts ~1–2s, but toggling the delegate on/off cleanly is difficult (see the sketch after this list)
Overall CPU and energy cost is not acceptable for production
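For context, the delegate toggle I attempted looks roughly like this (a sketch; it assumes the output stays attached to the session and only the delegate is swapped):
import AVFoundation

// Sketch: enable/disable frame delivery by swapping the sample buffer
// delegate rather than removing the output from the session.
func setFrameMonitoring(_ enabled: Bool,
                        on videoOutput: AVCaptureVideoDataOutput,
                        delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                        queue: DispatchQueue) {
    if enabled {
        videoOutput.setSampleBufferDelegate(delegate, queue: queue)
    } else {
        // Passing nil stops delivery but keeps the output attached.
        videoOutput.setSampleBufferDelegate(nil, queue: nil)
    }
}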
Additional notes
CPU usage is already relatively high even without the workaround (this app is camera-heavy by nature)
With the workaround enabled, energy impact becomes noticeably worse
The issue feels like a timing/state desync between session state and actual frame delivery, not a UI issue
Questions
Is this a known behavior where AVCaptureSession.isRunning == true but frames are temporarily unavailable after interruptions?
Is there a recommended way to detect actual frame flow resumption (not just session state)?
Should the AVCaptureVideoPreviewLayer.connection (isActive / isEnabled) be explicitly checked or reset after interruptions?
Is there a lightweight, energy-efficient way to bridge this short “no frames” gap without using AVCaptureVideoDataOutput?
Is rebuilding the entire session the only reliable solution here, or is there a better pattern Apple recommends?
Dear Apple Technical Support Team,
Greetings! I am an iOS app developer, currently upgrading the photo app I develop. Recently, I noticed the new Spatial Photos feature added in iOS 26, which brings an immersive 3D photo experience to users. We hope to integrate similar capabilities into our own app to give users a richer photo viewing experience.
Through technical research, we found that on Apple Vision devices a similar spatial photo display effect can be achieved through the ImagePresentationComponent.Spatial3DImage interface. However, our tests show that this interface is only available on visionOS and cannot be called on iOS. Since iOS 26 already supports Spatial Photos natively, we would like to know how third-party photo apps can gain this capability.
We would appreciate your team's guidance on the following questions:
1. Are there official APIs, SDKs, or frameworks on iOS 26 that allow third-party apps to implement core functions such as generating and displaying spatial photos?
2. If no public interfaces are currently available, are there other compliant technical solutions or alternative paths that achieve a similar effect?
3. For third-party apps integrating the spatial photo feature, are there development documents, technical specifications, or review requirements to follow?
We have completed the basic feature work in our app and have the engineering capacity to adopt new capabilities quickly. We hope to receive guidance from your team so we can bring a better experience to iOS users.
Attached for your reference are details about our app and a report on the interface compatibility issues we observed during testing. If you need any further information, please let us know.
Thank you for taking the time to review this message; we look forward to your reply!
Hi,
I’m trying to implement the new PhotoKit PHBackgroundResourceUploadExtension. I created the extension, enabled full photo library access in the host app, and registered the extension point using the string: com.apple.photos.background-upload.
However, when I attempted to enable the extension with:
try library.setUploadJobExtensionEnabled(true)
I received the following error:
Error Domain=PHPhotosErrorDomain Code=-1 "(null)"
This happens when running the app on Xcode 26.1 and 26.2 Beta, using the iPhone 17 Pro Max simulator (iOS 26.1 and 26.2).
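For reference, here is the call in context (a minimal sketch using only the identifiers above; I'm assuming the method is invoked on the shared PHPhotoLibrary):
import Photos

// Minimal sketch of the enabling call as I'm making it, wrapped to
// surface the NSError domain and code. setUploadJobExtensionEnabled(_:)
// is the method named above; `library` is assumed to be PHPhotoLibrary.shared().
func enableBackgroundUploadExtension() {
    let library = PHPhotoLibrary.shared()
    do {
        try library.setUploadJobExtensionEnabled(true)
        print("background upload extension enabled")
    } catch let error as NSError {
        // On the simulator this currently throws:
        // Error Domain=PHPhotosErrorDomain Code=-1 "(null)"
        print("enable failed: domain=\(error.domain) code=\(error.code)")
    }
}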
My question is: Is this extension supported on the simulator?
I’m asking because at the moment it’s difficult for me to test this on a physical device.
Also, what does this error mean?
Thanks.
Hello, I'm wondering how to capture 24MP photos.
I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). On the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject and maxPhotoQualityPrioritization to .quality.
When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings.
Even so, the captured photos do not come out at 24MP. What could be the issue?
// Objective-C
// Setup: request maximum quality and the largest supported photo dimensions.
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// Capturing: mirror the output's dimensions and quality prioritization
// onto the per-capture settings.
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
I occasionally receive reports from users that photo import from the Photos library gets stuck and the progress appears to stop indefinitely.
I’m using the following APIs:
func fetchAsset(_ asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.resizeMode = .exact
    options.isSynchronous = false
    options.isNetworkAccessAllowed = true
    options.progressHandler = { (progress, error, stop, info) in
        // 🚨 never called
    }
    let requestId = PHImageManager.default().requestImageDataAndOrientation(
        for: asset,
        options: options
    ) { data, _, _, info in
        // 🚨 never called
    }
}
Due to repeated reports, I added detailed logs inside the callback closures. Based on the logs, it looks like the request keeps waiting without any callbacks being invoked — neither the progressHandler nor the completion block of requestImageDataAndOrientation is called.
This happens not only with the PHImageManager approach but also when using PHAsset with PHContentEditingInputRequestOptions: the completion callback is not invoked either.
func fetchAssetByContentEditingInput(_ asset: PHAsset) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: options) { contentEditingInput, info in
        // 🚨 never called
    }
}
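As a mitigation, I'm experimenting with a watchdog that cancels a request that never calls back (a minimal sketch; the timeout is arbitrary, and the finished flag would need proper synchronization in production code):
import Photos

// Watchdog sketch: cancel the image request if neither the progress
// handler nor the completion block fires within the timeout.
func fetchWithWatchdog(_ asset: PHAsset,
                       timeout: TimeInterval = 30,
                       completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    var finished = false // simplified; guard with a lock or serial queue in real code
    let requestId = PHImageManager.default().requestImageDataAndOrientation(
        for: asset, options: options
    ) { data, _, _, _ in
        finished = true
        completion(data)
    }

    DispatchQueue.main.asyncAfter(deadline: .now() + timeout) {
        if !finished {
            PHImageManager.default().cancelImageRequest(requestId)
            completion(nil) // caller can retry or surface an error
        }
    }
}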
I suspect this is related to iCloud Photos.
Here is what I confirmed from affected users:
1. Using the native picker (my app also offers the native picker as an alternative way to attach photos), the iCloud download proceeds normally and the photo can be attached. Using the PHImageManager-based approach in my app, however, the same photo cannot be attached.
2. Even after verifying that the photo has been fully downloaded from iCloud (e.g., by using “Export Unmodified Originals” in the Photos app as described at https://support.apple.com/en-us/111762 and confirming the download completed), the callback is still not invoked for that asset.
Detailed flow for (1):
I asked the user to attach the problematic photo (the one where callbacks never fire) using the native photo picker (UIImagePickerController).
The UI showed “Downloading from iCloud” progress.
The progress advanced and the photo was attached successfully.
Then I asked the user to attach the same photo again using my custom photo picker (which uses the PHImageManager APIs mentioned above).
The progress did not advance (no callbacks were invoked).
The operation waited indefinitely and never completed.
Workaround / current behavior:
If I ask users to reboot the device and try again, about 6 out of 10 users can attach successfully afterward.
The remaining ~4 out of 10 users still cannot attach even after rebooting.
For users not fixed immediately by a reboot, the issue seems to resolve on its own after some time.
I’ve seen similar reports elsewhere, so I’m wondering if Apple is already aware of an internal issue related to this. If there is any known information, guidance, or recommended workaround, I would appreciate it.
I also logged the properties of affected PHAssets (metadata) when the issue occurs, and I can share them below if that helps troubleshooting:
[size=3.91MB] [PHAssetMediaSubtype(rawValue: 528)+DepthEffect | userLibrary | (4284x5712) | adjusted=true]
[size=3.91MB] [PHAssetMediaSubtype(rawValue: 528)+DepthEffect | userLibrary | (4284x5712) | adjusted=true]
[size=2.72MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
[size=2.72MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
[size=2.49MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
[size=2.49MB] [PHAssetMediaSubtype(rawValue: 16)+DepthEffect | userLibrary | (3024x4032) | adjusted=true]
Hi Apple Developer Support Team,
We are developing an iOS application using a camera package within a hybrid (cross-platform) framework, and we would like to confirm whether it is possible to disable the camera shutter sound programmatically.
As per our understanding, the shutter sound on iOS is system-controlled and depends on the device’s silent/ring mode, and there is no App Store–approved API available to force-disable this sound. Kindly confirm whether this understanding is correct or if any supported alternative approach exists for hybrid or native implementations.
Thank you for your clarification.
Best regards,
ParkhyaSolutions
Looking to implement UI in our app that tells the user to clean their lens.
I implemented the KVO for cameraLensSmudgeDetectionStatus (see the sketch below), but I'm having trouble triggering it reliably, both in our app and in the built-in Camera app. I tried to get inventive by putting Tupperware over the lens, but I suspect the model driving this (or the LiDAR sensor) is smart enough to detect that something is right up against the lens.
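For reference, my KVO setup looks roughly like this (a sketch; it assumes cameraLensSmudgeDetectionStatus is KVO-observable on AVCaptureDevice as named above, and I only log the raw change rather than switching on specific enum cases):
import AVFoundation

// Sketch of the KVO observation on the smudge-detection status property.
final class SmudgeMonitor {
    private var observation: NSKeyValueObservation?

    func start(observing device: AVCaptureDevice) {
        observation = device.observe(\.cameraLensSmudgeDetectionStatus,
                                     options: [.initial, .new]) { _, change in
            DispatchQueue.main.async {
                // Drive the "clean your lens" UI from the new status here.
                print("smudge status changed:", String(describing: change.newValue))
            }
        }
    }

    func stop() { observation = nil }
}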
Is there any way to trigger this change in a similar way we can trigger thermal changes in debug?
Thanks.
Hi everyone,
I’m seeing recurring internal AVFoundation camera logs on iOS 26.2 and I’m trying to understand whether this is expected behavior or a regression in the capture pipeline.
These logs appear shortly after starting an AVCaptureSession, while video frames are being delivered, and also when the camera is stopped or the capture session is torn down.
<<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)
To rule out issues caused by my own code, I had GPT generate a minimal SwiftUI capture example from scratch. Even in this clean, minimal setup, the same logs appear on iOS 26.2, while the exact same logic did not produce them on iOS 18.x.
My primary interest is real-time processing of the video frames delivered by the camera (via AVCaptureVideoDataOutput), for tasks such as analysis, computer vision, or custom frame handling, while simultaneously displaying the live preview.
Thanks in advance for any insight.
Example Code
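The capture core of that minimal example looks roughly like this (SwiftUI preview wiring omitted; names are illustrative):
import AVFoundation

// Minimal capture core: one camera input, one video data output, frames
// delivered to a delegate for real-time processing.
final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func configure() {
        session.beginConfiguration()
        defer { session.commitConfiguration() }
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)
        videoOutput.setSampleBufferDelegate(self, queue: frameQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
    }

    func start() {
        frameQueue.async { self.session.startRunning() } // keep off the main thread
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Real-time frame processing (analysis, CV, custom handling) goes here.
    }
}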
At which point in the image processing pipeline does iOS apply the white balance gains which can be set via AVCaptureDevice.setWhiteBalanceModeLocked(with:completionHandler:)?
Are those gains applied in the analog part of the camera pipeline, before the pixel voltage gets converted via the ADC to digital values? Or does the camera first convert the pixel voltages to digital values and then the gains are applied to the digital values?
Is this consistent across devices or can the behavior vary from device to device?
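For reference, the call in question (a minimal sketch; the gain values are illustrative and must be clamped to the device's supported range):
import AVFoundation

// Context for the question: locking white balance with explicit gains.
// The gain values are illustrative; real code must clamp each gain to
// 1.0...device.maxWhiteBalanceGain before applying it.
func lockWhiteBalance(on device: AVCaptureDevice) throws {
    var gains = AVCaptureDevice.WhiteBalanceGains(redGain: 1.8, greenGain: 1.0, blueGain: 2.1)
    gains.redGain   = min(max(gains.redGain, 1.0), device.maxWhiteBalanceGain)
    gains.greenGain = min(max(gains.greenGain, 1.0), device.maxWhiteBalanceGain)
    gains.blueGain  = min(max(gains.blueGain, 1.0), device.maxWhiteBalanceGain)

    try device.lockForConfiguration()
    device.setWhiteBalanceModeLocked(with: gains) { syncTime in
        // syncTime is the timestamp of the first frame with the gains applied.
        print("gains applied at", syncTime.seconds)
    }
    device.unlockForConfiguration()
}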
On an iPhone 16 Pro Max (I haven't tested other devices), there's a noticeable jump in the framing of the preview video when you start recording in the iOS AVCam sample app. The same jump in framing can be observed by switching to the front-facing camera and then back to the rear one.
It looks roughly consistent with switching between the 0.5x and 1x cameras (though not quite a match for the viewable area in the Camera app). It only happens when the app is initially loaded; once recording has started, the preview retains the 'closer' framing no matter how many times recording is stopped and started thereafter.
I'm relatively new to Swift and haven't done anything with the camera before, so odd buggy behaviour in the sample code isn't helping me understand it! :-)
Is there any way to fix this?
Some of our app's users are repeatedly running into a crash in NeutrinoCore, at -[NUIdentifier initWithNamespace:name:version:] + 2352. From the stack trace, it looks like multiple threads are performing PHFetchRequests, but that shouldn't cause a crash. It's isolated to a small number of users, which makes me think it's related to their specific Photos databases (e.g., data corruption).
Do you have any suggestions how I might be able to resolve this?
2   libsystem_c.dylib         abort + 124
3   NeutrinoCore              -[NUAssertionPolicyCrashReport notifyAssertion:] + 66
4   NeutrinoCore              -[NUAssertionPolicyComposite notifyAssertion:] + 160
5   NeutrinoCore              -[NUAssertionPolicyUnique notifyAssertion:] + 176
6   NeutrinoCore              -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7   NeutrinoCore              _NUAssertFailHandler + 176
8   NeutrinoCore              -[NUIdentifier initWithNamespace:name:version:] + 2352
9   NeutrinoCore              -[NUIdentifier initWithName:version:] + 84
10  NeutrinoCore              -[NUIdentifier initWithName:] + 68
11  PhotoImaging              +[PISchema identifier] + 36
12  PhotoImaging              +[PISchema registeredPhotosSchemaIdentifier] + 32
13  PhotoImaging              +[PIPhotoEditHelper newComposition] + 28
14  PhotoImaging              +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15  PhotoImaging              +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16  PhotoLibraryServices      -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1848
17  PhotoLibraryServices      +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18  Photos                    __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19  Photos                    -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20  Photos                    -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21  Photos                    -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 356
22  Photos                    -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23  Photos                    -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24  Photos                    -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 92
25  Photos                    -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26  Photos                    __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27  PhotoLibraryServicesCore  __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.84 + 880
28  CoreFoundation            ___invoking___ + 148
29  CoreFoundation            -[NSInvocation invoke] + 424
30  Foundation                <deduplicated_symbol> + 16
31  Foundation                -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 528
32  Foundation                __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33  libxpc.dylib              _xpc_connection_reply_callout + 124
42  libsystem_pthread.dylib   start_wqthread + 8
I am unable to find any clear-cut documentation on configuring an AVCaptureSession pipeline to capture video with the ProRes RAW codec type, which is a 16-bit format. Is it supported only with AVCaptureMovieFileOutput, or can AVCaptureVideoDataOutput emit 16-bit sample buffers that can be vended to AVAssetWriter?
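In the meantime, a small diagnostic can at least show whether a ProRes RAW variant is offered by the movie file output on a given device and format (a sketch; it does not answer the AVCaptureVideoDataOutput question):
import AVFoundation

// Diagnostic sketch: enumerate the codec types a configured movie file
// output actually offers, to see whether a ProRes RAW variant appears.
func logAvailableCodecs(for movieOutput: AVCaptureMovieFileOutput) {
    for codec in movieOutput.availableVideoCodecTypes {
        print("available codec:", codec.rawValue)
    }
}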