How to fix "Sample 0 missing LiDAR point cloud!" error?

I'm trying to run a PhotogrammetrySession based on photos taken in an AVCaptureSession and stored as .heic files.

When I load the files I'm always seeing the error "Sample 0 missing LiDAR point cloud!" showing up for each individual sample.

Debugging shows that sample.depthDataMap is populated, and the .heic files do contain depth data, which can be extracted with e.g. heif-convert on my Mac.

Comparing a .heic I created against one from ObjectCaptureSession (which doesn't trigger the LiDAR warning), the only difference I noticed was the HEIC metadata shown here:

So my questions are:

  • Is this metadata the missing information in my manual capture that causes the warning?
  • Can I somehow add this information in an AVCaptureSession?
  • Does this information lead to better photogrammetry results?

This question is answered here: https://developer.apple.com/forums/thread/741948?answerId=811104022#811104022

The essence is that this warning refers to custom metadata saved in the HEIC images when using the ObjectCaptureSession UI on iOS devices with LiDAR. There is no public way to provide this data other than our ObjectCaptureSession capture UI, which adds it automatically. If you are using another capture approach, you may safely ignore these warnings. The benefit of using the ObjectCaptureSession front end is that we may additionally use these raw points to improve reconstruction of textureless surfaces (e.g. white walls) where depth maps alone may not be sufficient.

To be clear, this point cloud data is distinct from the depthDataMap in the sample. That depth map may be available when reading HEIC files: it may be populated automatically (even on non-LiDAR devices) from stereo disparity data generated by AVCaptureSession in a multi-camera session, or it may be derived from LiDAR data. Regardless of provenance, the depth data, if available, is used to estimate scale.
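For a custom capture path, the key is asking AVCapturePhotoOutput to deliver depth and to embed it in the photo file, so PhotogrammetrySession can read it back from the HEIC. A minimal sketch, assuming a back LiDAR camera (`.builtInDualCamera` would give stereo disparity on non-LiDAR devices) and omitting session/permission plumbing:

```swift
import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()

// Assumption: a depth-capable back camera is present (LiDAR here; a dual
// camera would produce stereo-disparity depth instead).
guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera,
                                           for: .video, position: .back),
      let input = try? AVCaptureDeviceInput(device: device) else {
    fatalError("No depth-capable camera available")
}
session.addInput(input)

let photoOutput = AVCapturePhotoOutput()
session.addOutput(photoOutput)
// Depth delivery must be enabled on the output before it can be
// requested in the per-capture settings.
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
session.commitConfiguration()

// Per-capture settings: HEVC/HEIC with the depth map embedded in the file.
let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
settings.isDepthDataDeliveryEnabled = true
settings.embedsDepthDataInPhoto = true
// photoOutput.capturePhoto(with: settings, delegate: ...)
```

With `embedsDepthDataInPhoto` set, the depth map ends up in the .heic itself, which matches the behavior the question observed with heif-convert.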

The other HEIC metadata you see in the front-end captured images (position, rotation, etc.) is also used to improve reconstruction, but is not required. This metadata is part of the HEIC standard; you can find some information about another use of it in the context of Spatial Photos: https://developer.apple.com/documentation/ImageIO/writing-spatial-photos

Our front end encodes the ARKit poses and intrinsics into this metadata according to the HEIC standard. These data have various potential uses (e.g. rendering the relative capture locations of images in a 3D viewer), and the reconstruction uses them, if available, as initial estimates for each captured image's camera. That said, our PhotogrammetrySession algorithm estimates poses and intrinsics itself as part of reconstruction, even for images without this data. The computed poses and intrinsics are also available in the outputs of the reconstruction session for use in your own application (see also: https://developer.apple.com/documentation/realitykit/photogrammetrysession/poses).
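Those computed poses can be requested explicitly from the session. A sketch, with placeholder paths for the image folder and output model, that asks for `.poses` alongside the model and reads them from the output stream:

```swift
import RealityKit
import Foundation

// Placeholder paths: inputFolder should contain the captured .heic images.
let inputFolder = URL(fileURLWithPath: "/path/to/images", isDirectory: true)
let modelURL = URL(fileURLWithPath: "/path/to/output.usdz")

let session = try PhotogrammetrySession(input: inputFolder)

// Listen for outputs; the poses result arrives once reconstruction has
// estimated each image's camera transform and intrinsics.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestComplete(_, .poses(let poses)):
            print("Estimated camera poses:", poses)
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to", url)
        case .processingComplete:
            print("Done")
        default:
            break
        }
    }
}

try session.process(requests: [
    .poses,
    .modelFile(url: modelURL, detail: .medium)
])
```

Note this initializer taking a folder URL (and the sample-sequence variant) is macOS-only; on iOS the poses request is still available where PhotogrammetrySession is supported.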
