Discuss Spatial Computing on Apple Platforms.

Posts under General subtopic

Post

Replies

Boosts

Views

Activity

How to use defaultSize with visionOS window restoration?
One of the most common ways to provide a window size in visionOS is to use the defaultSize scene modifier:

```swift
WindowGroup(id: "someID") {
    SomeView()
}
.defaultSize(CGSize(width: 600, height: 600))
```

Starting in visionOS 26, using this has a side effect. visionOS 26 restores windows that have been locked in place or snapped to surfaces. If a user has manually adjusted the size of a locked/snapped window, the user's size is only restored in some cases.

Manual resize respected:
- Leaving a room and returning later
- Taking the headset off and putting it back on later

Manual resize NOT respected:
- Device restart. In this case, the window is reopened where it was locked, but the size is reset to the values passed to defaultSize. The manual resizing adjustments the user has made are lost. This is counter to how all other windows and widgets work.

I reported this last month (FB18429638), but haven't heard back whether this is a bug or intended behavior.

Questions:
- What is the best way to provide a default window size that is only used when opening new windows, and not during scene restoration?
- Should we try to keep track of window sizes after users adjust them and save that somewhere ourselves? (A rough sketch of what I mean is below.)
- If this is intended behavior, can someone please update the docs accordingly?
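For the second question, here is the kind of workaround I had in mind. This is only a guess, not a pattern Apple recommends: it observes the root view's size while the window is open, persists it, and feeds the saved value back into defaultSize the next time the scene is built. The names SomeWindowScene, lastWindowSize, and restoredDefaultSize are made up, and it assumes the root view fills the window.

```swift
import SwiftUI

// Hypothetical workaround: persist the last user-adjusted size and reuse it as the default.
struct SomeWindowScene: Scene {
    var body: some Scene {
        WindowGroup(id: "someID") {
            SomeView()
                // Approximate the window size by observing the root view's size.
                .onGeometryChange(for: CGSize.self) { proxy in
                    proxy.size
                } action: { newSize in
                    UserDefaults.standard.set([newSize.width, newSize.height], forKey: "lastWindowSize")
                }
        }
        .defaultSize(restoredDefaultSize())
    }

    // Fall back to 600 x 600 when no saved size exists yet.
    private func restoredDefaultSize() -> CGSize {
        guard let stored = UserDefaults.standard.array(forKey: "lastWindowSize") as? [Double],
              stored.count == 2 else {
            return CGSize(width: 600, height: 600)
        }
        return CGSize(width: stored[0], height: stored[1])
    }
}
```

Whether this interacts sanely with the new locked/snapped restoration in visionOS 26 is exactly what I am unsure about.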
1
0
453
Jul ’25
Automatically Enabling Spatial Personas in App
Is it possible to create a button in my app that turns on spatial personas for the user? Currently the only way I know of to turn on spatial personas is by selecting the cube icon in the FaceTime window, which is quite clunky for people unfamiliar with the Vision Pro's UI. Any help would be appreciated.
1
0
422
Feb ’25
visionOS 26 PresentationComponent not working
I am trying to get the new PresentationComponent working in visionOS 26, as seen in this WWDC video: https://developer.apple.com/videos/play/wwdc2025/274/?time=962 (18:29 into the video). Here is some other example code, but it doesn't work either: https://stepinto.vision/devlogs/project-graveyard-devlog-002/

My simple Text view (which I am adding as a PresentationComponent) does not appear in my RealityView, even though the entity is found. Here is a simple example built from an Xcode immersive view default project:

```swift
struct ImmersiveView: View {
    @Environment(AppModel.self) var appModel

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)

                if let materializedImmersiveContentEntity = try? await Entity(named: "Test", in: realityKitContentBundle) {
                    content.add(materializedImmersiveContentEntity)

                    var presentation = PresentationComponent(
                        configuration: .popover(arrowEdge: .bottom),
                        content: Text("Hello, World!")
                            .foregroundColor(.red)
                    )
                    presentation.isPresented = true
                    materializedImmersiveContentEntity.components.set(presentation)
                }
            }
        }
    }
}
```

Here is the Apple reference: https://developer.apple.com/documentation/realitykit/presentationcomponent
1
0
587
Jul ’25
How to achieve a "concave in" glass view look?
I have been trying to implement a look where a component appears "pushed in", but I could not find any resources on this effect. The closest I got was a combination of a RoundedRectangle and .glassBackgroundEffect(), but this makes the view look pushed out instead of pushed in. I was wondering whether this is achievable at the SwiftUI level, or even at the UIKit level.
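Not an authoritative answer, but one way to approximate an inset look in plain SwiftUI is to layer an inner shadow over the shape instead of relying on glassBackgroundEffect(). A minimal sketch, assuming a platform where ShapeStyle.shadow(.inner(...)) is available (iOS 16+/visionOS); the ConcaveCard name, colors, and radii are placeholder values:

```swift
import SwiftUI

/// A rounded rectangle that reads as "pushed in" by combining an inner shadow
/// with a subtle edge highlight. Values are illustrative, not a known recipe.
struct ConcaveCard: View {
    var body: some View {
        RoundedRectangle(cornerRadius: 16)
            // The inner shadow darkens the inside edge, suggesting a recessed surface.
            .fill(
                .ultraThinMaterial
                    .shadow(.inner(color: .black.opacity(0.35), radius: 6, x: 0, y: 4))
            )
            // A faint stroke brightens the rim slightly for contrast.
            .overlay(
                RoundedRectangle(cornerRadius: 16)
                    .strokeBorder(.white.opacity(0.15), lineWidth: 1)
            )
            .frame(width: 240, height: 120)
    }
}
```

Whether this holds up next to real visionOS glass is something I have not verified.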
1
0
124
Apr ’25
AVPlayer stutters when using AVPlayerItemVideoOutput
We're trying to build a custom player for Unity. For this, we're using AVPlayer with AVPlayerItemVideoOutput to get textures. However, we noticed that playback is not smooth and the stream often freezes. For testing, we used this 8K video: https://deovr.com/nwfnq1

The video was played using the following code:

```swift
@objc public func playVideo(urlString: String) {
    guard let url = URL(string: urlString) else { return }
    let pItem = AVPlayerItem(url: url)
    playerItem = pItem
    pItem.preferredForwardBufferDuration = 10.0

    let pixelBufferAttributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
        kCVPixelBufferMetalCompatibilityKey as String: true,
    ]
    let output = AVPlayerItemVideoOutput(pixelBufferAttributes: pixelBufferAttributes)
    pItem.add(output)

    playerItemObserver = pItem.observe(\.status) { [weak self] pItem, _ in
        guard pItem.status == .readyToPlay else { return }
        self?.playerItemObserver = nil
        self?.player.play()
    }

    player = AVPlayer(playerItem: pItem)
    player.currentItem?.preferredPeakBitRate = 35_000_000
}
```

When AVPlayerItemVideoOutput is attached, the video stutters and the log looks like this:

```
🟢 Playback likely to keep up
🟡 Buffer ahead: 4.08s | buffer: 4.08s
🟡 Buffer ahead: 4.08s | buffer: 4.08s
🟡 Buffer ahead: -0.07s | buffer: 0.00s
🟡 Buffer ahead: 2.94s | buffer: 3.49s
🟡 Buffer ahead: 2.50s | buffer: 4.06s
🟡 Buffer ahead: 1.74s | buffer: 4.30s
🟡 Buffer ahead: 0.74s | buffer: 4.30s
🟠 Playback may stall
🛑 Buffer empty
🟡 Buffer ahead: 0.09s | buffer: 4.30s
🟠 Playback may stall
🟠 Playback may stall
🛑 Buffer empty
🟠 Playback may stall
🟣 Buffer full
🟡 Buffer ahead: 1.41s | buffer: 1.43s
🟡 Buffer ahead: 1.41s | buffer: 1.43s
🟡 Buffer ahead: 1.07s | buffer: 1.43s
🟣 Buffer full
🟡 Buffer ahead: 0.47s | buffer: 1.65s
🟠 Playback may stall
🛑 Buffer empty
🟡 Buffer ahead: 0.10s | buffer: 1.65s
🟠 Playback may stall
🟡 Buffer ahead: 1.99s | buffer: 2.03s
🟡 Buffer ahead: 1.99s | buffer: 2.03s
🟣 Buffer full
🟣 Buffer full
🟡 Buffer ahead: 1.41s | buffer: 2.00s
🟡 Buffer ahead: 0.68s | buffer: 2.27s
🟡 Buffer ahead: 0.09s | buffer: 2.27s
🟠 Playback may stall
🛑 Buffer empty
🟠 Playback may stall
```

When we remove AVPlayerItemVideoOutput from the player, the video plays smoothly, and the output looks like this:

```
🟢 Playback likely to keep up
🟡 Buffer ahead: 1.94s | buffer: 1.94s
🟡 Buffer ahead: 1.94s | buffer: 1.94s
🟡 Buffer ahead: 1.22s | buffer: 2.22s
🟡 Buffer ahead: 1.05s | buffer: 3.05s
🟡 Buffer ahead: 1.12s | buffer: 4.12s
🟡 Buffer ahead: 1.18s | buffer: 5.18s
🟡 Buffer ahead: 0.72s | buffer: 5.72s
🟡 Buffer ahead: 1.27s | buffer: 7.28s
🟡 Buffer ahead: 2.09s | buffer: 3.03s
🟡 Buffer ahead: 4.16s | buffer: 6.10s
🟡 Buffer ahead: 6.66s | buffer: 7.09s
🟡 Buffer ahead: 5.66s | buffer: 7.09s
🟡 Buffer ahead: 4.66s | buffer: 7.09s
🟡 Buffer ahead: 4.02s | buffer: 7.45s
🟡 Buffer ahead: 3.62s | buffer: 8.05s
🟡 Buffer ahead: 2.62s | buffer: 8.05s
🟡 Buffer ahead: 2.49s | buffer: 3.53s
🟡 Buffer ahead: 2.43s | buffer: 3.38s
🟡 Buffer ahead: 1.90s | buffer: 3.85s
```

We've tried different attribute settings for AVPlayerItemVideoOutput. We also removed all the logic related to reading frame data, but the choppy playback remained. Can you advise whether this is a player issue or whether we're doing something wrong?
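For reference, and not claiming this is what causes the stutter here, the way an AVPlayerItemVideoOutput is usually consumed is by polling it once per display refresh rather than reacting to every decoded frame. A minimal sketch of that pattern, assuming a CADisplayLink-driven loop; FramePuller and handleFrame are names of my own, not framework API:

```swift
import AVFoundation
import QuartzCore

final class FramePuller {
    private let output: AVPlayerItemVideoOutput
    private var displayLink: CADisplayLink?
    var handleFrame: ((CVPixelBuffer) -> Void)?   // your consumer, e.g. upload into a Metal/Unity texture

    init(output: AVPlayerItemVideoOutput) {
        self.output = output
    }

    func start() {
        // Poll once per display refresh instead of per decoded frame.
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func tick(_ link: CADisplayLink) {
        // Map the upcoming display time to an item time and only copy a buffer
        // when the output reports a new one is available.
        let hostTime = link.timestamp + link.duration
        let itemTime = output.itemTime(forHostTime: hostTime)
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) else {
            return
        }
        handleFrame?(pixelBuffer)
    }
}
```

If the stutter persists even with no frame-reading logic at all, as described above, the polling strategy is unlikely to be the culprit, but it may help others reproduce the setup.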
1
0
402
Oct ’25
Is `ParticleEmitterComponent` implemented for `RealityKit` on iOS?
Hi there, I was looking to add a particle emitter to the augmented reality app I'm developing using RealityKit. I'm targeting iOS. The documentation for ParticleEmitterComponent indicates that iOS 18.0+ is supported, but when I try to use ParticleEmitterComponent in my code in Xcode, I get an error that it isn't found. Furthermore, this StackOverflow post seems to indicate that particle systems are not available on iOS. Would it be possible to get clarification on this?
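For context, here is the minimal availability-gated usage I would expect to compile if the component really is exposed in the iOS SDK; this is a sketch based on the documentation signatures, not confirmation that it works on an iOS device, and addSmokeEmitter is just an illustrative name:

```swift
import RealityKit

@available(iOS 18.0, *)
func addSmokeEmitter(to entity: Entity) {
    // If ParticleEmitterComponent is genuinely unavailable on iOS,
    // this is the line that should fail to resolve.
    var emitter = ParticleEmitterComponent()
    emitter.mainEmitter.birthRate = 500
    entity.components.set(emitter)
}
```

Clarification on whether the "Unresolved identifier" error is an SDK/toolchain issue or a documentation error would be great.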
1
0
144
May ’25
Summon gesture
Can you help me write code that can pick up an element a bit far from me, bring it near to me, let me flick it around a bit, and then send it back to its original position when I release it? Thanks a lot, Christophe
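A rough starting point, offered as a sketch rather than a tested answer: on visionOS you can target a RealityKit entity with a SwiftUI drag gesture, move it with the hand while the drag is active, and animate it back to a stored transform on release. It assumes the entities have InputTargetComponent and CollisionComponent, and originalTransforms is just a name I chose; the "flick" physics are not covered here.

```swift
import SwiftUI
import RealityKit

struct SummonGestureView: View {
    // Remember where each entity started so it can be sent back on release.
    @State private var originalTransforms: [ObjectIdentifier: Transform] = [:]

    var body: some View {
        RealityView { content in
            // Add your entities here; they need InputTargetComponent and
            // CollisionComponent to receive gestures.
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    let entity = value.entity
                    if originalTransforms[ObjectIdentifier(entity)] == nil {
                        originalTransforms[ObjectIdentifier(entity)] = entity.transform
                    }
                    // Follow the hand while the drag is active (this is the "bring it near" part).
                    if let parent = entity.parent {
                        entity.position = value.convert(value.location3D, from: .local, to: parent)
                    }
                }
                .onEnded { value in
                    let entity = value.entity
                    // Animate back to the stored transform when the user lets go.
                    if let home = originalTransforms[ObjectIdentifier(entity)] {
                        entity.move(to: home, relativeTo: entity.parent, duration: 0.4, timingFunction: .easeInOut)
                        originalTransforms[ObjectIdentifier(entity)] = nil
                    }
                }
        )
    }
}
```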
1
0
68
Apr ’25
UICollectionViewDataSourcePrefetching does not work when wrapped in SwiftUI on visionOS
The prefetching logic for UICollectionView does not work on visionOS. I have set up a standalone test repo to demonstrate this issue. The repo is basically a visionOS version of Apple's guide project on implementing prefetching logic. In the repo you will see a simple ViewController that has a UICollectionView, wrapped inside UIViewControllerRepresentable. On scroll, it should print "🕊️ prefetch start" to the console to demonstrate that func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) is called. However, it never happens on visionOS devices. With the same code it behaves correctly on iOS devices.
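For anyone trying to reproduce this, the wiring being tested is small; here is a minimal sketch of the pattern described above (identifiers such as GridViewController are illustrative and not taken from the linked repo):

```swift
import UIKit

final class GridViewController: UIViewController, UICollectionViewDataSource, UICollectionViewDataSourcePrefetching {
    private var collectionView: UICollectionView!
    private let itemCount = 1_000

    override func viewDidLoad() {
        super.viewDidLoad()
        let layout = UICollectionViewFlowLayout()
        layout.itemSize = CGSize(width: 120, height: 120)
        collectionView = UICollectionView(frame: view.bounds, collectionViewLayout: layout)
        collectionView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        collectionView.register(UICollectionViewCell.self, forCellWithReuseIdentifier: "cell")
        collectionView.dataSource = self
        // The delegate whose callback reportedly never fires on visionOS.
        collectionView.prefetchDataSource = self
        view.addSubview(collectionView)
    }

    func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        itemCount
    }

    func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        collectionView.dequeueReusableCell(withReuseIdentifier: "cell", for: indexPath)
    }

    func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) {
        print("🕊️ prefetch start", indexPaths)
    }
}
```

On iOS this prints as you scroll; on visionOS, per the report above, it never does.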
1
0
195
Jul ’25
Cursor display issue on attachment view in immersive space
While using Screen Mirroring in developer mode within my immersive space, I noticed an alignment issue with the computer cursor (the transparent circle). When I move it toward an attachment view, the cursor remains horizontal instead of aligning with the surface of the attachment view. It displays correctly on a 2D window; it is only wrong on an attachment view. Is this behavior a bug, or could it be caused by a missing or incorrect configuration on the attachment view? Any help is appreciated, thanks.
1
0
94
Apr ’25
How to Renew visionOS Enterprise API Entitlements?
How can I renew the visionOS Enterprise API entitlements? I've spent a lot of time contacting Apple Developer Support. They said they don't know the renewal process either and are "checking with the internal operations team", but it's been 2 months with no updates. The official documentation (https://developer.apple.com/documentation/visionOS/building-spatial-experiences-for-business-apps-with-enterprise-apis#Request-the-entitlements) says: "The license file comes with an expiration date, so you need to renew it before then to ensure your entitlements continue to function." But it doesn't explain HOW to renew. When I asked Apple Support about this, they told me: "After Apple approves your app for one or more entitlements, you receive a license file, along with additional instructions." But I never received any instructions when I was first approved, and I still don't know how to renew. There's also no direct way to contact the Enterprise API team. My visionOS Enterprise API license has now been expired for 2 months. I submitted a renewal request, but I still haven't heard anything back. Is it normal for approval to take more than 2 months? Any advice or shared experiences would be really helpful. Thanks!
1
0
452
2w
Overlaying SwiftUI content with transparency in front of RealityView
Following up on my previous question here: https://developer.apple.com/forums/thread/774262 Having solved the clipping problem, I am now trying to overlay some content in front of the RealityView. However, it looks like any content with transparency does not render in front of the RealityView, while opaque views seem to work; placing content with transparency, like glassBackgroundEffect(), behind the RealityView in a ZStack causes the entire window to flicker. Additionally, my SwiftUI attachments placed in front of the stereoscopic image plane are invisible if the user looks at them straight on (at 90 degrees). However, if the user looks at them from increasingly oblique angles, the attachments gradually become visible again. Are these behaviors expected? What is a recommended approach to overlaying content in front of a RealityView? Thanks!
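Not sure this addresses the flicker itself, but one commonly suggested way to put SwiftUI content (including translucent glass backgrounds) in front of RealityKit content is to use RealityView attachments positioned in front of the scene rather than a ZStack overlay. A minimal sketch under that assumption; OverlayExample, infoPanel, and the 0.2 m offset are placeholders:

```swift
import SwiftUI
import RealityKit

struct OverlayExample: View {
    var body: some View {
        RealityView { content, attachments in
            // Add your stereoscopic image plane and other entities here.

            // Place the SwiftUI attachment slightly in front of the RealityKit content.
            if let panel = attachments.entity(for: "infoPanel") {
                panel.position = SIMD3<Float>(0, 0, 0.2)   // about 20 cm toward the viewer
                content.add(panel)
            }
        } attachments: {
            Attachment(id: "infoPanel") {
                Text("Overlay content")
                    .padding()
                    .glassBackgroundEffect()
            }
        }
    }
}
```

Whether an attachment positioned this way also shows the angle-dependent visibility described above is exactly the open question.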
1
0
396
Feb ’25
Low-latency live streaming using APMP Wide FoV
Is it possible to achieve sub-second end-to-end latency when displaying live streaming video using APMP (Apple Projected Media Profile) with Wide FoV? APMP supports HLS playback, but my understanding is that standard HLS introduces several seconds of latency. I would like to know whether APMP (especially Wide FoV) supports Low-Latency HLS, or if there are inherent limitations that make sub-second latency impractical. If APMP is not suitable for this use case, are there any recommended alternatives within AVFoundation or related frameworks for rendering wide-FoV live video with very low latency? Thank you for any insights.
1
0
338
1w
Displaying spatial photos and videos on web pages in Safari
The purpose is to create a simple web-based gallery of spatial photos and videos using static HTML files. I have successfully displayed spatial photos using the img tag and .heic files. I can tap and hold the image to bring up the contextual menu and from there select View Spatial Photo. Is there any way to add a control to the image, like a link or an overlay on the image itself, that a user can simply tap to show the image in 3D? And how can I host a video file on a web page without going through a CDN/streaming service? Sample HTML would be much appreciated.
1
0
748
4w
Can't find a DLL in a visionOS app with Unity
Dear all, I'm using Unity 6.2 beta and Xcode 16.2. I'm creating a simple framework to use the text-to-speech functionality of visionOS from Unity. The framework is written in Swift, and I create an Objective-C wrapper with the following declarations:

```objc
...
void _initTTS(int);
...
```

I build the framework, import it into Unity, and call the functions from a C# wrapper class. The code is as follows:

```csharp
public static class TTSPluginManager
{
    [DllImport("TTS_Vision")]
    private static extern void _initTTS(int val);
    ...

    public static void Initialize()
    {
#if UNITY_VISIONOS
        _initTTS(0);
#else
        Debug.LogWarning("NativeTTS.Initialize called on a non-iOS platform. Ignoring.");
#endif
    }
}
```

I have managed to compile and run the program on the Apple Vision Pro, but I keep getting the following error:

```
DllNotFoundException: TTS_Vision assembly: type: member:(null)
TTSPluginManager.Initialize () (at Assets/Plugins/TTSPluginManager.cs:33)
LecturePortalManager.OnCreateStory (Ink.Runtime.Story story) (at Assets/AVRLecture/LecturePortalManager.cs:17)
InkLoader.StartStory () (at Assets/AVRLecture/InkLoader.cs:24)
InkLoader.Start () (at Assets/AVRLecture/InkLoader.cs:18)
```

If I run the generated code from Xcode, I can see the app on the AVP, but I keep getting a loading error:

```
DllNotFoundException: Unable to load DLL 'TTS_Vision'. Tried the load the following dynamic libraries: Unable to load dynamic library '/TTS_Vision' because of 'Failed to open the requested dynamic library (0x06000000) dlerror() = dlopen(/TTS_Vision, 0x0005): tried: '/TTS_Vision' (no such file)
  at TTSPluginManager.Initialize () [0x00000] in <00000000000000000000000000000000>:0
  at LecturePortalManager.OnCreateStory (Ink.Runtime.Story story) [0x00000] in <00000000000000000000000000000000>:0
```

I can see in the generated code that the framework (TTS_Vision) is there, but the path seems wrong. I've tried adding more options to the search paths, with no success. Any hints or suggestions are much appreciated.
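One thing worth checking, offered as a guess rather than a confirmed fix: when native plugin code is statically linked into the app on iOS-family platforms, Unity's documented convention is to P/Invoke against "__Internal" instead of the framework name, so the runtime looks up the symbol in the app binary rather than trying to dlopen a separate dynamic library. A sketch of that variant of the wrapper (whether it applies depends on how TTS_Vision is embedded in the generated Xcode project):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public static class TTSPluginManager
{
#if UNITY_VISIONOS && !UNITY_EDITOR
    // "__Internal" tells Mono/IL2CPP to resolve the symbol from the app binary itself
    // instead of loading a dynamic library named "TTS_Vision" at runtime.
    [DllImport("__Internal")]
    private static extern void _initTTS(int val);
#endif

    public static void Initialize()
    {
#if UNITY_VISIONOS && !UNITY_EDITOR
        _initTTS(0);
#else
        Debug.LogWarning("TTSPluginManager.Initialize called on a non-visionOS platform. Ignoring.");
#endif
    }
}
```

If the framework is instead embedded as a dynamic framework, the DllImport name and the framework's location inside the app bundle have to match, which would point back at the "path seems wrong" observation above.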
1
0
293
Sep ’25