ARFrame.sceneDepth not correctly registered with ARFrame.capturedImage on iPad Pro (6th gen) when using high-resolution frame capture.

Hi team,

I believe I’ve found a registration issue between ARFrame.sceneDepth and ARFrame.capturedImage when using high-resolution frame capture on a 2022 iPad Pro (6th gen).

When enabling high-resolution capture:


let config = ARWorldTrackingConfiguration()
config.frameSemantics.insert(.sceneDepth)   // the app needs ARFrame.sceneDepth
if let highResFormat = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
    config.videoFormat = highResFormat
}
arView.session.run(config)
…
arView.session.captureHighResolutionFrame { frame, error in ... }

the depth map provided by ARFrame.sceneDepth no longer aligns with the corresponding high-resolution capturedImage. This misalignment produces consistently over-estimated distance measurements in my app, which relies on mapping depth values to 2D pixel coordinates in the captured image.
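For reference, here is a simplified sketch of the kind of mapping my app does (not my exact code; depthAtPixel is just an illustrative helper). It assumes the depth map and capturedImage share the same intrinsics/FOV and differ only by a uniform scale factor, which is exactly the assumption that seems to break for the high-resolution frame:

import ARKit

func depthAtPixel(_ pixel: CGPoint, in frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    let imageWidth  = CGFloat(CVPixelBufferGetWidth(frame.capturedImage))
    let imageHeight = CGFloat(CVPixelBufferGetHeight(frame.capturedImage))
    let depthWidth  = CVPixelBufferGetWidth(depthMap)
    let depthHeight = CVPixelBufferGetHeight(depthMap)

    // Scale the RGB pixel coordinate into depth-map coordinates
    // (assumes both buffers cover the same field of view).
    let x = Int(pixel.x / imageWidth  * CGFloat(depthWidth))
    let y = Int(pixel.y / imageHeight * CGFloat(depthHeight))
    guard x >= 0, x < depthWidth, y >= 0, y < depthHeight else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    // sceneDepth.depthMap is kCVPixelFormatType_DepthFloat32 (metres).
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    return base.advanced(by: y * rowBytes)
        .assumingMemoryBound(to: Float32.self)[x]
}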

iPad Pro (6th gen): misalignment occurs only when capturing high-resolution frames.

iPhone 16 Pro: depth is correctly registered for both standard and high-resolution captures.

It appears the camera intrinsics (and therefore the field of view) change between the “regular” resolution stream and the high-resolution capture on the iPad. My suspicion is that the depth data is still generated with the intrinsics of the lower-resolution stream, which leaves the depth-to-RGB mapping unregistered.

Once I have the iPad in hand again, I will confirm whether camera.intrinsics or the derived FOV actually differs between the low-res and high-res frames.
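Roughly, I plan to log something like the following (a quick sketch; logIntrinsics is just an illustrative helper) to compare the standard frame against a high-resolution capture:

import ARKit

func logIntrinsics(label: String, for frame: ARFrame) {
    let K = frame.camera.intrinsics                 // 3x3, column-major: fx, fy, cx, cy
    let res = frame.camera.imageResolution
    let depthSize = frame.sceneDepth.map {
        CGSize(width: CVPixelBufferGetWidth($0.depthMap),
               height: CVPixelBufferGetHeight($0.depthMap))
    }
    // Horizontal FOV derived from focal length fx and image width.
    let hFOV = 2 * atan(Float(res.width) / (2 * K[0][0])) * 180 / .pi
    print("\(label): resolution=\(res), fx=\(K[0][0]), fy=\(K[1][1]), " +
          "cx=\(K[2][0]), cy=\(K[2][1]), hFOV=\(hFOV), depthMap=\(String(describing: depthSize))")
}

// Usage:
// if let current = arView.session.currentFrame { logIntrinsics(label: "standard", for: current) }
// arView.session.captureHighResolutionFrame { frame, _ in
//     if let frame { logIntrinsics(label: "high-res", for: frame) }
// }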

Is this a known issue with high-resolution frame capture on the 2022 iPad Pro? If not, I’m happy to provide some more thorough sample code.

Thanks for your time!
