I'm trying to benchmark a Core Image filter chain's memory footprint and noticed a strange quirk in Instruments.
On a real device, even with a simple Core Image chain, memory balloons each time I run the filter. See the attached screenshots.
Running on iPhone 17 Pro:
Running on simulator (M2 MacBook Pro):
As you can see, there's a huge buildup of 4 MB "VM: IOSurface" allocations on the real device, while the simulator seems to clean them up correctly.
Here's my basic code:
func processImage() {
    guard let inputImage = ContentViewModel.loadImageFromBundle(name: "kitty.HEIC") else {
        print("Failed to load sample_image from bundle")
        return
    }
    var outputImage = inputImage
    outputImage = outputImage.applyingFilter("CIBloom", parameters: [
        kCIInputRadiusKey: 20,
        kCIInputIntensityKey: 0.8
    ])
    DispatchQueue.global(qos: .userInitiated).async {
        let data = self.context.jpegRepresentation(of: outputImage, colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
        if let data = data, let uiImage = UIImage(data: data) {
            DispatchQueue.main.async {
                self.displayImage = Image(uiImage: uiImage)
            }
        }
    }
}
Why is this happening? It looks like either a bug or an object I need to release. At the very least, it makes measuring memory usage challenging.
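One mitigation I'm experimenting with, shown here only as a sketch: the assumption (unconfirmed) is that the IOSurface-backed buffers are autoreleased and the background queue rarely drains its pool, so wrapping the render in an explicit autoreleasepool and reusing a single CIContext may bound the growth:

DispatchQueue.global(qos: .userInitiated).async {
    autoreleasepool {
        // Render inside an explicit pool so any autoreleased,
        // IOSurface-backed buffers can be recycled when the block ends.
        if let data = self.context.jpegRepresentation(of: outputImage, colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!),
           let uiImage = UIImage(data: data) {
            DispatchQueue.main.async {
                self.displayImage = Image(uiImage: uiImage)
            }
        }
    }
}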
Any help is greatly appreciated.
Alex
Device: iPhone 17 Pro
iOS Version: iOS 26.1
Camera: Ultra-wide (0.5x) using AVCaptureSession
Our camera app freezes on iPhone 17 when switching frame rates (30fps ↔ 60fps). This works fine on iPhone 16 Pro and earlier.
What We've Observed:
Freeze happens on frame rate change - particularly when stabilization is enabled
Thread.sleep is used - to allow the camera hardware to settle before re-enabling stabilization
Works on older iPhones - only iPhone 17 exhibits this behavior
Console shows these errors before freeze:
<<<< FigXPCUtilities >>>> signalled err=18446744073709534335
<<<< FigCaptureSourceRemote >>>> err=-17281
Is Thread.sleep on the main thread causing the freeze? Should all camera configuration be on a background queue?
Is there something specific about iPhone 17 ultra-wide camera that requires different handling?
Should we use session.beginConfiguration() / session.commitConfiguration() instead of direct device configuration?
Is calling setFrameRate from a property's didSet (which runs synchronously) problematic?
Are the FigCaptureSourceRemote errors (-17281) indicative of the problem, and what do they mean?
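For context, the configuration path we are considering moving to looks like the sketch below. It assumes a dedicated session queue; session, videoDevice, and sessionQueue are placeholders from our controller, not the failing build's code:

func setFrameRate(_ fps: Double) {
    sessionQueue.async {  // keep all configuration off the main thread
        self.session.beginConfiguration()
        defer { self.session.commitConfiguration() }
        do {
            try self.videoDevice.lockForConfiguration()
            let duration = CMTime(value: 1, timescale: CMTimeScale(fps))
            self.videoDevice.activeVideoMinFrameDuration = duration
            self.videoDevice.activeVideoMaxFrameDuration = duration
            self.videoDevice.unlockForConfiguration()
        } catch {
            print("Failed to lock device for configuration: \(error)")
        }
    }
}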
A functioning multiplatform app, which includes use of Continuity Camera, captures photos correctly with AVCapturePhoto on an M1 Mac mini running Sequoia 15.5. However, that app (and a test app written just for Continuity Camera) crashes in the delegate callback when run on a 2017 MacBook Pro under macOS 13.7.5. The app was created with Xcode 16 (various releases) using Swift 6 (also tried with 5). Compiling and running the test app with Xcode 15.2 on the 13.7.5 machine also crashes in the delegate callback.
The iPhone 15 Continuity Camera gets detected and set up correctly, and preview video works correctly. It's when the capturePhoto code runs that the crash occurs.
The relevant capture code is:
func capturePhoto() {
    let captureSettings = AVCapturePhotoSettings()
    captureSettings.flashMode = .auto
    photoOutput.maxPhotoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: captureSettings, delegate: PhotoDelegate.shared)
    print("**** CameraManager: capturePhoto")
}
and the delegate callbacks are:
class PhotoDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    nonisolated(unsafe) static let shared = PhotoDelegate()

    // MARK: - Delegate callbacks
    func photoOutput(
        _ output: AVCapturePhotoOutput,
        didFinishProcessingPhoto photo: AVCapturePhoto,
        error: (any Error)?
    ) {
        print("**** CameraManager: didFinishProcessingPhoto")
        // Check the error before touching the photo object.
        if let error {
            print("**** photoOutput error: \(error)")
            return
        }
        guard let pData = photo.fileDataRepresentation() else {
            print("**** photoOutput is empty")
            return
        }
        print("**** photoOutput data is \(pData.count) bytes")
    }

    func photoOutput(
        _ output: AVCapturePhotoOutput,
        willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings
    ) {
        print("**** CameraManager: willBeginCaptureFor")
    }

    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        print("**** CameraManager: willCapturePhotoFor")
    }
}
The significant parts of the crash report are:
Crashed Thread: 3 Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000000
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [30850]
VM Region Info: 0 is not in any region. Bytes before following region: 4296495104
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
--->
__TEXT 100175000-10017f000 [ 40K] r-x/r-x SM=COW ...tinuityCamera
Thread 0:: Dispatch queue: com.apple.main-thread
0 libsystem_kernel.dylib 0x7ff803aed552 mach_msg2_trap + 10
1 libsystem_kernel.dylib 0x7ff803afb6cd mach_msg2_internal + 78
2 libsystem_kernel.dylib 0x7ff803af4584 mach_msg_overwrite + 692
3 libsystem_kernel.dylib 0x7ff803aed83a mach_msg + 19
4 CoreFoundation 0x7ff803c07f8f __CFRunLoopServiceMachPort + 145
5 CoreFoundation 0x7ff803c06a10 __CFRunLoopRun + 1365
6 CoreFoundation 0x7ff803c05e51 CFRunLoopRunSpecific + 560
7 HIToolbox 0x7ff80d694f3d RunCurrentEventLoopInMode + 292
8 HIToolbox 0x7ff80d694d4e ReceiveNextEventCommon + 657
9 HIToolbox 0x7ff80d694aa8 _BlockUntilNextEventMatchingListInModeWithFilter + 64
10 AppKit 0x7ff806ca59d8 _DPSNextEvent + 858
11 AppKit 0x7ff806ca4882 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214
12 AppKit 0x7ff806c96ef7 -[NSApplication run] + 586
13 AppKit 0x7ff806c6b111 NSApplicationMain + 817
14 SwiftUI 0x7ff90e03a9fb 0x7ff90dfb4000 + 551419
15 SwiftUI 0x7ff90f0778b4 0x7ff90dfb4000 + 17578164
16 SwiftUI 0x7ff90e9906cf 0x7ff90dfb4000 + 10340047
17 ContinuityCamera 0x10017b49e 0x100175000 + 25758
18 dyld 0x7ff8037d1418 start + 1896
Thread 1:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0
Thread 2:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0
Thread 3 Crashed:: Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
0 ??? 0x0 ???
1 AVFCapture 0x7ff82045996c StreamAsyncStillCaptureCallback + 61
2 CoreMediaIO 0x7ff813a4358f __94-[CMIOExtensionProviderHostContext captureAsyncStillImageWithStreamID:uniqueID:options:reply:]_block_invoke + 498
3 libxpc.dylib 0x7ff803875b33 _xpc_connection_reply_callout + 36
4 libxpc.dylib 0x7ff803875ab2 _xpc_connection_call_reply_async + 69
5 libdispatch.dylib 0x7ff80398b099 _dispatch_client_callout3 + 8
6 libdispatch.dylib 0x7ff8039a6795 _dispatch_mach_msg_async_reply_invoke + 387
7 libdispatch.dylib 0x7ff803991088 _dispatch_lane_serial_drain + 393
8 libdispatch.dylib 0x7ff803991d6c _dispatch_lane_invoke + 417
9 libdispatch.dylib 0x7ff80399c3fc _dispatch_workloop_worker_thread + 765
10 libsystem_pthread.dylib 0x7ff803b28c55 _pthread_wqthread + 327
11 libsystem_pthread.dylib 0x7ff803b27bbf start_wqthread + 15
Of course, the MacBook Pro is an old device, but Continuity Camera works with the installed Photo Booth app, so it is clearly possible on this hardware.
Any thoughts on solving this situation would be appreciated.
Regards, Michaela
PHPhotoLibrary.authorizationStatus(for: .readWrite) == .authorized
Info.plist Privacy - Photo Library Usage Description is set.
I check authorization before attempting to get photoPickerItem.itemIdentifier, but the return value from itemIdentifier is nil every time. It seems I'm missing some permission, but I'm unsure why the system still keeps _shouldExposeItemIdentifier set to false.
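In case it's relevant, one known cause: with SwiftUI's PhotosPicker, itemIdentifier is only populated when the picker is created against the shared photo library. A minimal sketch (assuming a $selection binding and import PhotosUI in the surrounding view):

PhotosPicker(selection: $selection,
             matching: .images,
             photoLibrary: .shared()) { // without .shared(), itemIdentifier stays nil
    Text("Select a photo")
}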
Topic: Media Technologies
SubTopic: Photos & Camera
I am new to Swift and iOS development, and I have a question about video capture performance.
Is it possible to capture video at a resolution of 4032×3024 while simultaneously running a vision/ML model on the video stream (e.g., using Vision or CoreML)?
I want to know:
whether iOS devices support capturing video at that resolution,
whether the frame rate drops significantly at that scale,
and whether it is practical to run a Vision/ML model in real-time while recording at such a high resolution.
If anyone has experience with high-resolution AVCaptureSession setups or combining them with real-time ML processing, I would really appreciate guidance or sample code.
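Not an authoritative answer, but for experimenting, here is a sketch of how one might search a device's format list for the largest resolution that still supports a target frame rate. Whether a 4032×3024 video format exists at all depends on the device; highestResolutionFormat is a hypothetical helper:

import AVFoundation

// Sketch: keep the largest format that still supports at least 30 fps.
func highestResolutionFormat(for device: AVCaptureDevice) -> AVCaptureDevice.Format? {
    device.formats
        .filter { format in
            format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 30 }
        }
        .max { a, b in
            let da = CMVideoFormatDescriptionGetDimensions(a.formatDescription)
            let db = CMVideoFormatDescriptionGetDimensions(b.formatDescription)
            return da.width * da.height < db.width * db.height
        }
}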
Hi all,
I'm using Apple Sample Code below to create application using dockkit.
"Controlling a DockKit accessory using your camera app"
https://developer.apple.com/documentation/dockkit/controlling-a-dockkit-accessory-using-your-camera-app?changes=_8
I used Vision hand recognition and passed the observation data to dockAccessory.track, but the Belkin and Insta360 devices never move on an iPhone 16 Pro Max with iOS 18.3.
If I use other functions like face search (system tracking) in the app, those work fine.
I used Belkin and Insta360 Flow 2 Pro to reproduce the problem.
My friend also says that the custom tracking feature was working fine on the iOS 18 beta, but on the current iOS 18.3 it does not work.
If I could get the iOS 18.0 beta we could test that feature, but I cannot revert my iPhone from 18.3 to the 18.0 beta.
Regards,
TO
I'm receiving output from an AVCaptureSession and capturing an image using Vision, but the image is output in landscape orientation instead of portrait.
Even when I set the orientation to .up on the CIImage, CGImage, and UIImage, the image is still landscape.
On iPhone 16 and earlier, the image is output in portrait orientation, but on iPhone 17 and later it is output in landscape.
Please help.
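One thing worth checking, offered as a sketch rather than a confirmed fix: on newer hardware the sensor's native orientation may differ, so the rotation applied to the output connection should come from AVCaptureDevice.RotationCoordinator instead of a fixed assumption. Here connection and device are assumed to be your video data output's connection and the active capture device:

// Sketch: apply the coordinator's capture angle to the output connection
// so buffers arrive rotated for the current device orientation.
let coordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: nil)
let angle = coordinator.videoRotationAngleForHorizonLevelCapture
if connection.isVideoRotationAngleSupported(angle) {
    connection.videoRotationAngle = angle
}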
Is accessing value(forKey: "fileSize") considered use of non-public API?
Has your app been rejected at the review stage because of this?
let resources = PHAssetResource.assetResources(for: asset)
if let resource = resources.first {
    if let fileSize = resource.value(forKey: "fileSize") as? Int {
        return fileSize
    }
}
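For comparison, a KVC-free sketch that stays on documented API by streaming the resource and counting the bytes; slower, and fetchSize is a hypothetical helper name:

import Photos

// Sketch: compute a resource's size without the undocumented "fileSize" key.
func fetchSize(of resource: PHAssetResource, completion: @escaping (Int) -> Void) {
    var total = 0
    PHAssetResourceManager.default().requestData(
        for: resource,
        options: nil,
        dataReceivedHandler: { chunk in total += chunk.count },
        completionHandler: { _ in completion(total) })
}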
Hey, I'm building a camera app and I want to use the captured HDR gain map alongside the photo to do some processing with a CIFilter chain. How can this be done? I can't find any documentation on this anywhere, only on how to access the HDR gain map from an existing HEIC file, which I have done successfully. For that I'm doing something like the following:
guard let gainmap = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) else { return }
let gainDict = gainmap as NSDictionary
let gainData = gainDict[kCGImageAuxiliaryDataInfoData] as? Data
let gainDescription = gainDict[kCGImageAuxiliaryDataInfoDataDescription]
let gainMeta = gainDict[kCGImageAuxiliaryDataInfoMetadata]
However, I'm not sure what the approach is with an AVCapturePhoto output from an AVCaptureDevice.
Thanks!
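One possibility I'm considering, sketched under the assumption that the capture produced a HEIC with a gain map: go through the photo's fileDataRepresentation() and reuse the same CGImageSource auxiliary-data call as for files on disk. gainMapInfo is a hypothetical helper:

// Sketch: read the gain map from an AVCapturePhoto via its file data.
func gainMapInfo(from photo: AVCapturePhoto) -> [CFString: Any]? {
    guard let data = photo.fileDataRepresentation(),
          let source = CGImageSourceCreateWithData(data as CFData, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as? [CFString: Any]
    else { return nil }
    return info
}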
I’ve never had a problem with any update before, but as soon as I updated to 18.3 my camera started blurring at 1x and 2x. I use my camera daily for work, and this is unacceptable. I’m wondering if anyone else is having this issue; it’s really frustrating.
I am trying to use the AVCaptureDevice.RotationCoordinator API to observe angles for preview and capture, and there seems to be an issue with the API when it is used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) and cameras are switched.
Here is my setup. The function below is defined in an actor class called CameraManager that sets up the rotation coordinator.
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device, let displayLayer = displayLayer else { return }
    cancellables.removeAll()

    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    guard let coordinator = rotationCoordinator else { return }

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}
This works the very first time, but when I switch cameras and call this function again, it throws a runtime error saying the view's layer was modified from a non-main thread. The error occurs at the very line where the rotation coordinator is recreated. It's not clear why initializing a rotation coordinator should modify CALayer properties right in its init method.
Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace:
(
0 UIKitCore 0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1 UIKitCore 0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2 QuartzCore 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3 QuartzCore 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4 QuartzCore 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5 QuartzCore 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6 AVFCapture 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7 MyApp.debug.dylib 0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8 MyApp.debug.dylib 0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9 MyApp.debug.dylib 0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10 MyApp.debug.dylib 0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11 MyApp.debug.dylib 0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12 MyApp.debug.dylib 0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
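For what it's worth, the workaround I'm trying, as a sketch only; it assumes the initializer really does touch the layer, which is my reading of the backtrace, so construction is hopped to the main actor:

// Sketch: construct the coordinator on the main actor; `device` and
// `displayLayer` are the same values used in updateRotationCoordinator.
let coordinator = await MainActor.run {
    AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
}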
Hi, I’m working on a photo backup app.
I track the PHAsset localIdentifier to determine which photos have been backed up and which haven’t.
Recently, I’ve noticed that two users seem to have had their localIdentifier values change after transferring data to a new iPhone using Quick Start.
Additionally, others on StackOverflow have mentioned that the localIdentifier sometimes changes after updating the iOS version.
https://stackoverflow.com/questions/40094728/phobject-localidentifier-reliability
I’d like to confirm the reliability of the localIdentifier after an iOS version upgrade or device transfer.
Can I continue using these locally stored localIdentifiers?
Or is there another recommended approach, such as using PHCloudIdentifier?
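For reference, the direction I'm leaning toward, sketched with the PhotoKit cloud-identifier API (available since iOS 15); store(localID:cloudID:) is a hypothetical persistence helper:

import Photos

// Sketch: map volatile local identifiers to durable cloud identifiers.
let mappings = PHPhotoLibrary.shared().cloudIdentifierMappings(forLocalIdentifiers: localIDs)
for (localID, result) in mappings {
    if let cloudID = try? result.get() {
        store(localID: localID, cloudID: cloudID.stringValue)
    }
}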
I would appreciate help with code, or an explanation of what to use in Swift, for an app that can capture LiDAR scans and RGB data from photos, generate a 3D mesh, and create an .OBJ, .MTL, and .JPEG file set for further manipulation of the 3D model.
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: 3D Graphics, Swift Playground, Object Capture
Hi
This is one of our top crashes. It does not contain any of our code in the stack trace, and we can't reproduce it, which makes it very hard to understand and fix. We know that most of the crashes happen on iPhone 13 with iOS 18.x.x. We also see that many cases occur when the app goes into the background (the stack trace contains -[UIApplication _applicationDidEnterBackground]).
2025-03-04_16-06-00.3670_-0500-6a273c7d5da97f098b5cc24898bb9761dc45208e.crash
2025-03-04_20-21-08.6609_-0500-2c08f640900f8a62c4f8a4f6f2a61feb052e66dd.crash
2025-03-04_20-46-27.7138_+0000-4d7ea89b1b564eda22ca63e708f7ad3909c7b768.crash
Hello,
I am getting the following error while attempting to run my LockedCameraCapture compatible app on an iOS 15 device:
dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)
Of course iOS 15 doesn't include the LockedCameraCapture framework, but I have had no issue including Lock Screen widgets (which require iOS 16), so I am not sure why this error occurs.
Thank you!
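In case it helps anyone hitting the same dyld error, my current understanding (an assumption based on how frameworks newer than the deployment target are normally handled, not a confirmed diagnosis): the framework needs to be weak-linked so the app can still launch on systems where it is absent, for example by marking it Optional under "Link Binary With Libraries" or via a linker flag:

OTHER_LDFLAGS = -weak_framework LockedCameraCapture

All uses would then be guarded with availability checks (if #available(iOS 18.0, *)).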
Topic: Media Technologies
SubTopic: Photos & Camera
Hello everyone,
I’m working on an iOS app that fetches videos from the "Recently Deleted" album using the Photos framework in Swift. However, I’m unable to fetch any videos, even though the "Recently Deleted" album contains 233 items (including videos), as seen in the Photos app.
Environment:
iOS Version: 18.3.1
Xcode Version: 16.2
Swift Version: Swift 5
Device: iPhone (simulator and physical device both tested)
Photo Library Permission: "All Photos" access granted
Recently Deleted Lock: Face ID/Passcode is disabled for "Recently Deleted"
If a new photo is added to the library while the app is not running in the foreground (or has not been opened since the photo was added), but the app has full access to the photo library, can it access and read the new photo? And can an app have this capability if it is not specifically a cloud-syncing app, say a game or a beauty-camera app?
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: Privacy, PhotoKit, Background Tasks, Background Assets
Has Objective-C been deprecated?
Does the library exist in Xcode 16.4?
"import WorldCaptureKit" gives the error "No such module 'WorldCaptureKit'".
I cannot find any information about the library in the Apple documentation, but AI tools keep suggesting that I use it.
Topic:
Media Technologies
SubTopic:
Photos & Camera
Hi! I am making an app for Apple Vision Pro (visionOS 2.5) that scans the surroundings and recognizes all the text around you. I tried to use AVCaptureSession, but when I run the app from Xcode on a real AVP device, the camera is not accessible. I enabled camera access in my Info.plist (NSCameraUsageDescription: "Used for live text recognition") and checked the camera settings on the AVP; there are no restrictions. However, I always get a black square with a crossed-out camera icon instead of the image from the camera.
I tried a couple of different apps from GitHub that use AVCaptureSession, and they all display the black square instead of the picture.
What can be wrong with the camera?
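One possible explanation, offered as an assumption to verify rather than a confirmed answer: on visionOS, direct camera-frame access through AVCaptureSession is gated behind the Enterprise API entitlement and its license file, so NSCameraUsageDescription alone is not sufficient. The entitlements entry would look like:

<key>com.apple.developer.arkit.main-camera-access.allow</key>
<true/>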
Topic: Media Technologies
SubTopic: Photos & Camera