Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and share resources for game developers.

All subtopics
Posts under Graphics & Games topic

Post

Replies

Boosts

Views

Created

Issues building Unity plug-in project: Cannot locate native library Apple.Core/Apple.GameKit for iOS
I'm having issues getting a well built package from the Apple Unity Plug-in project. When building the my game project in Unity the following error is printed to the console: Apple.Core.AppleNativeLibraryUtility] Cannot locate a Debug or Release Apple.Core native library for iOS. Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Debug on iOS. As far as I can tell the build did compile cleanly, but I might be missing something. If anyone can see what I'm doing wrong or has any insight it would be greatly appreciated. Setup is the following: macOS Tahoe 26 Beta Xcode-beta Version 26.0 beta 3 (17A5276g) Unity Plug-in branch: 2025-beta1 Unity game project version: 2022.3.60f M1 Macbook Pro The built packages have been imported into the game project through the Unity Package Manager using the tarball option pointing to the built packages from the Unity Plug-in project. The Unity Plug-in project has been built using the build.py file with the following: python3 build.py -m iOS iPhoneSimulator -p Core GameKit CoreHaptics GameController -k all The output is available in the attached file. build-output.txt Here's an image of the NativeLibraries~ folder inside the built Apple.Core package.
6
1
1.1k
Jul ’25
How to Configure angularLimitInYZ for PhysicsSphericalJoint in RealityKit (Pendulum/Swing Behavior)
Hello RealityKit developers, I'm currently working on physics simulations in my visionOS app and am trying to adapt the concepts from the official sample Simulating physics joints in your RealityKit app. In the sample, a sphere is connected to the ceiling using a PhysicsRevoluteJoint to create a hinge-like simulation. I've successfully modified this setup to use a PhysicsSphericalJoint instead. The basic replacement works as expected: pin1 (attached to the sphere) rotates freely around pin0 (attached to the ceiling), much like a ball-and-socket joint should, removing all translational degrees of freedom. My challenge lies with the PhysicsSphericalJoint's angularLimitInYZ property. The documentation mentions that this property allows limiting the rotation around the Y and Z axes, defining an "elliptical cone shape around the x-axis of pin0." However, I'm struggling to understand how to specify these values to achieve a desired rotational limit. If I have a sphere that is currently capable of rotating 360 degrees around pin0 (like a free-spinning ball on a string), how would I use angularLimitInYZ to restrict its rotation to a certain height or angular range, preventing it from completing a full circle? Specifically, I'm trying to achieve a "swing" like behavior where the sphere oscillates back and forth but cannot rotate completely overhead or underfoot. What values or approach should I use for the angularLimitInYZ tuple to define such a restricted pendulum-like motion? Any insights, code examples, or explanations on how to properly configure angularLimitInYZ for this kind of behavior would be incredibly helpful! The following code is modified from the sample. extension MainView { func addPinsTo(ballEntity: Entity, attachmentEntity: Entity) throws { let hingeOrientation = simd_quatf(from: [1, 0, 0], to: [0, 0, 1]) let attachmentPin = attachmentEntity.pins.set( named: "attachment_hinge", position: .zero, orientation: hingeOrientation ) let relativeJointLocation = attachmentEntity.position( relativeTo: ballEntity ) let ballPin = ballEntity.pins.set( named: "ball_hinge", position: relativeJointLocation, orientation: hingeOrientation ) // Create a PhysicsSphericalJoint between the two pins. let revoluteJoint = PhysicsSphericalJoint(pin0: attachmentPin, pin1: ballPin) try revoluteJoint.addToSimulation() } } The following image is a screenshot of the operation when changing to PhysicsSphericalJoint. Thank you in advance for your assistance.
1
0
264
Jul ’25
Subject: Handling Z-Up Blender USDZ Models in RealityKit (visionOS) for Transform Updates
Hello everyone, I'm working on a visionOS application using RealityKit and am encountering a common coordinate system challenge when integrating 3D models created in Blender. My goal is to display and dynamically update the Transform (position, rotation, scale) of models created in Blender within RealityKit. The issue arises because Blender's default coordinate system is Z-up, and while exporting to USD/USDZ, I don't have a reliable "Y-up" export option that correctly reorients the model and its transform data for RealityKit's Y-up convention. This means I'm essentially exporting models with their "up" direction along the Z-axis. When I load these Z-up exported models into RealityKit, they are often oriented incorrectly. To then programmatically update their Transform (e.g., move them, rotate them based on game logic, or apply physics), I need to ensure that the Transform values I set align with RealityKit's Y-up system, even though the original model data was authored in a Z-up context. My questions are: What is the recommended transformation process (e.g., using simd_quatf or simd_float4x4) to convert a Transform that was conceptually defined in a Z-up coordinate system to RealityKit's Y-up coordinate system? Specifically, when I have a Transform (or its translation, rotation, scale components) from a Z-up context, how should I apply this to a RealityKit Entity so it appears and behaves correctly in a Y-up world? Are there any existing convenience APIs or helper functions within RealityKit, simd, or other Apple frameworks that simplify this Z-up to Y-up Transform conversion process? Or is a manual application of a transformation quaternion (e.g., simd_quatf(angle: -.pi / 2, axis: [1, 0, 0])) the standard approach? Any guidance, code examples, or best practices from those who have faced similar challenges would be greatly appreciated! Thank you.
1
1
360
Jul ’25
GestureComponent does not support DragGesture
The following code using the new GestureComponent demonstrates inconsistency. The tap gesture prints output, but the drag gesture does not. I already checked this post, which points to this seemingly outdated sample code I assume that example is deprecated in favour of the now built in version of GestureComponent. Nonetheless, there are no compiler warnings or errors, it just fails silently. TapGesture, LongPressGesture, MagnifyGesture, RotateGesture all work, so this feels like an oversight. RealityView { content in let testEntity = ModelEntity(mesh: .generateBox(size: .init(x: 1, y: 1, z: 1))) testEntity.position = SIMD3<Float>(0,0,-1) testEntity.components.set(InputTargetComponent()) testEntity.components.set(CollisionComponent( shapes: [.generateBox(size: .init(x: 1, y: 1, z: 1))] )) let testGesture = TapGesture() .onEnded { value in print("Tapped") } testEntity.components.set(GestureComponent(testGesture)) let dragGesture = DragGesture() .onEnded { value in print("Dragged") } testEntity.components.set(GestureComponent(dragGesture)) content.add(testEntity) }
3
1
388
Jul ’25
Struggles with attaching a ModelEntity to the skeleton joints of another ModelEntity
In SceneKit, when creating an .scn file from a rigged model, the framework created an SCNNode for each bone/joint, so you could add and remove child nodes directly to and from joints, and like any other SCNNode, you could access world position and world orientation for each joint. The analog would be for joints to be accessible as child entities of a ModelEntity in RealityKit. I am unable to proceed with migrating my project from SceneKit because of this, as there does not seem to be a way to even access the true world position of a joint with the current jointNames/jointTransforms paradigm. The translation information from the given transforms is insufficient to determine the location of a joint at any given time, and other approaches like creating a GeometricPin for the given joint name and attaching it to another entity do not seem to work. So conveniently being able to attach an item to the hand of a rigged model was trivial in SceneKit and now feels impossible in RealityKit. I am not the first person to notice this, and am feeling demoralized about proceeding with RealityKit with such a critical piece of functionality blocked https://stackoverflow.com/questions/76726241/how-do-i-attach-an-entity-to-a-skeletons-joint-in-realitykit Will this be addressed in some way?
5
2
765
Jul ’25
visionOS + Unity PolySpatial: Is 15,970 MeshFilters the True Upper Limit for Industrial Scenes?
Breaking Through PolySpatial's ~8k Object Limit – Seeking Alternative Approaches for Large-Scale Digital Twins Confirmed: PolySpatial make Doubles MeshFilter Count – Hard Limit at ~8k Active Objects (15.9k Total) Project Context & Research Goals I’m developing an industrial digital twin application for Apple Vision Pro using Unity’s PolySpatial framework (RealityKit rendering in Unbounded_Volume mode). The scene contains complex factory environments with: Production line equipment Many fragmented grid objects need to be merged.) Dynamic product racks (state-switchable assets) Animated worker avatars To optimize performance, I’m systematically testing visionOS’s rendering capacity limits. Through controlled stress tests, I’ve identified a critical threshold: Key Finding When the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects × 2 due to PolySpatial cloning), the application crashes consistently. This suggests: PolySpatial’s mirroring mechanism effectively doubles GameObject overhead An apparent hard limit exists around ~8k active mesh objects in practice Objectives for This Discussion Verify if others have encountered similar limits with PolySpatial/RealityKit Understand whether this is a: Memory constraint (per-app allocation) Render pipeline limit (Metal draw calls) Unity-specific PolySpatial behavior Explore optimization strategies beyond brute-force object reduction Why This Matters Industrial metaverse applications require rendering thousands of interactive objects . Confirming these limits will help our team: Design safer content guidelines Prioritize GPU instancing/LOD investments Potentially contribute back to PolySpatial’s optimization I’d appreciate insights from engineers who’ve: Pushed similar large-scale scenes in visionOS Worked around PolySpatial’s cloning overhead Discovered alternative capacity limits (vertices/draw calls)
4
0
652
Jul ’25
RealityKit and Reality Composer Pro aren't forward or backward compatible with each other
TL;DR: RealityKit and Reality Composer Pro aren't forward or backward compatible with each other, and the resulting error message is terse and unhelpful. (FB14828873) So far, I've been sticking with Xcode 16.4 for development and only using Xcode 26.0 beta experimentally. Yesterday, I used xcode-select to switch to Xcode 26.0 beta 3 to test it, but I forgot to switch back. Consequently, this morning I unintentionally used the future Reality Composer Pro (the version included with Xcode 26) to make a small change to a USD file. Now I realize that if I'm unlucky, it's possible Reality Composer Pro may have silently introduced a small change into the USD file that may make RealityKit fail to read the file on iOS 18 and visionOS 2, which in the past has resulted in hours of debugging to track down the source of the failure, often a single line in the USD file that RealityKit can't communicate to me other than with the error "the operation couldn't be completed". As an analogy, this situation is as if, during regular development (not involving Reality Composer Pro), Xcode didn't warn you about specific API version conflicts, but instead failed with a generic error message, without highlighting the line in your Swift file that was the source of the error.
1
0
439
Jul ’25
Problem running Unreal 5.6
I am using Unreal Engine 5.6 on a MacBook Pro with an M3 chip and macOS 15.5. I’ve installed Xcode and accepted the license, but Unreal is not detecting the latest Metal Shader Standard (Metal v3.0). The maximum version Unreal sees is Metal v2.4, even though the hardware and OS should support Metal 3.0. I’ve also run sudo xcode-select -s /Applications/Xcode.app and accepted the license via Terminal. Is there anything in Xcode settings, SDK availability, or system permissions that could be preventing access to Metal 3.0 features?"
2
0
485
Jul ’25
CAMetalLayer nextDrawable crash
Hi , My application meet below crash backtrace at very low repro rate from the public users, i do not see it relate to a specific iOS version or iPhone model. The last code line from my application is calling CAMetalLayer nextDrawable API. I did some basic studying, suppose it may relate to the wrong CAMetaLayer configuration, like frame property w or h <= 0.0 bounds property w or h <= 0.0 drawableSize w or h <= 0.0 or w or h > max value (like 16384) Not sure my above thinking is right or not? Will the UIView which my CAMetaLayer attached will cause such nextDrawable crash or not ? Thanks a lot Main Thread - Crashed libsystem_kernel.dylib __pthread_kill libsystem_c.dylib abort libsystem_c.dylib __assert_rtn Metal MTLReportFailure.cold.1 Metal MTLReportFailure Metal _MTLMessageContextEnd Metal -[MTLTextureDescriptorInternal validateWithDevice:] AGXMetalA13 0x245b1a000 + 4522096 QuartzCore allocate_drawable_texture(id<MTLDevice>, __IOSurface*, unsigned int, unsigned int, MTLPixelFormat, unsigned long long, CAMetalLayerRotation, bool, NSString*, unsigned long) QuartzCore get_unused_drawable(_CAMetalLayerPrivate*, CAMetalLayerRotation, bool, bool) QuartzCore CAMetalLayerPrivateNextDrawableLocked(CAMetalLayer*, CAMetalDrawable**, unsigned long*) QuartzCore -[CAMetalLayer nextDrawable] SpaceApp -[MetalRender renderFrame:] MetalRenderer.mm:167 SpaceApp -[FrameBuffer acceptFrame:] VideoRender.mm:173 QuartzCore CA::Display::DisplayLinkItem::dispatch_(CA::SignPost::Interval<(CA::SignPost::CAEventCode)835322056>&) QuartzCore CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long) QuartzCore CA::Display::DisplayLink::dispatch_deferred_display_links(unsigned int) UIKitCore _UIUpdateSequenceRun UIKitCore schedulerStepScheduledMainSection UIKitCore runloopSourceCallback CoreFoundation __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ CoreFoundation __CFRunLoopDoSource0 CoreFoundation __CFRunLoopDoSources0 CoreFoundation __CFRunLoopRun CoreFoundation CFRunLoopRunSpecific GraphicsServices GSEventRunModal UIKitCore -[UIApplication _run] UIKitCore UIApplicationMain
3
0
358
Jul ’25
Combining render encoders
When I take a frame capture of my application in Xcode, it shows a warning that reads "Your application created separate command encoders which can be combined into a single encoder. By combining these encoders you may reduce your application's load/store bandwidth usage." In the minimal reproduction case I've identified for this warning, I have two render pipeline states: The first writes to the current drawable, the depth buffer, and a secondary color buffer. The second writes only to the current drawable. Because these are writing to a different set of outputs, I was initially creating two separate render command encoders to handle the draws under each of these states. My understanding is that Xcode is telling me I could only create one, however when I try to do that, I get runtime asserts when attempting to apply the second render pipeline state since it doesn't have a matching attachment configured for the second color buffer or for the depth buffer, so I can't just combine the encoders. Is the only solution here to detect and propagate forward the color/depth attachments from the first state into the creation of the second state? Is there any way to suppress this specific warning in Xcode?
1
0
305
Jul ’25
CustomMetalView sample uses deprecated functions - update?
The sample code here, has code like: // Create a display link capable of being used with all active displays cvReturn = CVDisplayLinkCreateWithActiveCGDisplays(&_displayLink); But that function's doc says it's deprecated and to use NSView/NSWindow/NSScreen displayLink instead. That returns CADisplayLink, not CVDisplayLink. Also the documentation for that displayLink method is completely empty. I'm not sure if I'm supposed to add it to run loop, or what, after I get it. It would be nice to get an updated version of this sample project and/or have some documentation in NSView.displayLink
2
0
362
Jul ’25
How to customize shader code for visionos ?
Hello experts, I'm trying to implement a material with custom shader code, but I saw that visionOS doesn't allow you to inject custom Metal functions or use CustomMaterial like iOS/macOS, nor can you directly write Metal Shading Language (.metal) and use it through ShaderGraphMaterial. So my question is, if i want to implement your own shader code, how should i do it?
1
0
464
Jul ’25
RealityKit not behaving as expected
This week, I developed a small multiplatform RealityKit project. I also created a demo scene in Reality Composer Pro. Afterward, I imported the local package into the project. Running the project on macOS works perfectly. However, when I tried to run it on my iPhone, I encountered a permission error indicating that it couldn’t read the package. This seems unusual to me because I assumed that the dependency is bundled into the binary file. In an attempt to resolve the issue, I pushed the RCP package to GitHub, hoping it would work. Fortunately, everything compiles successfully now, but the loading time is significantly long, and the animations don’t play on tap gestures. Could someone please help me identify the root cause of this problem?
1
0
332
Jul ’25
Unable to package in UE5.6
Im new in the Mac area but for sure not UE. Windows is a long process to packaging but it could be done. All the documentation for Epic and from the internet is basically non existent with exactly how to package a project within UE. I have Xcode installed which makes sense, agreed to terms and install for MacOS, I've been able to make a project for several weeks now and want to package for a test run for my friends to play on Windows. Now I just get this in the log: UATHelper: Packaging (Mac): ERROR: Failed to finalize the .app with Xcode. Check the log for more information UATHelper: Packaging (Mac): Trace written to file /Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.uba with size 12.6kb UATHelper: Packaging (Mac): Total time in Unreal Build Accelerator local executor: 8.12 seconds UATHelper: Packaging (Mac): Result: Failed (OtherCompilationError) UATHelper: Packaging (Mac): Total execution time: 9.71 seconds PackagingResults: Error: Failed to finalize the .app with Xcode. Check the log for more information UATHelper: Packaging (Mac): Took 9.77s to run dotnet, ExitCode=6 UATHelper: Packaging (Mac): UnrealBuildTool failed. See log for more details. (/Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.txt) UATHelper: Packaging (Mac): AutomationTool executed for 0h 0m 10s UATHelper: Packaging (Mac): AutomationTool exiting with ExitCode=6 (6) UATHelper: Packaging (Mac): RunUAT ERROR: AutomationTool was unable to run successfully. Exited with code: 6 PackagingResults: Error: AutomationTool was unable to run successfully. Exited with code: 6 PackagingResults: Error: Unknown Error This absolutely makes no sense to me. Anyone have ideas?
2
0
274
Jul ’25
EXC_BREAKPOINT, QuartzCore , Crash CA::Render::Image::new_image
We are seeing crashes in Xcode organizer. So far we are not able to reproduce them locally. They affect multiple app releases (some older, built with Xcode 15.x and newer built with Xcode 16.0). They only affect iOS 18.5. Is there anything that changed in latest iOS? It's hard to tell what exactly is causing this crash because setting symbolic breakpoint on CA::Render::Image::new_image(unsigned int, unsigned int, unsigned int, unsigned int, CGColorSpace*, void const*, unsigned long const*, void (*)(void const*, void*), void*) triggers this breakpoint all the time, but not necessarily with exactly the previous stack frames matching the crash report. Is it a known issue? crash.crash Thank you.
0
5
294
Jul ’25
How to implement c for vision ?
I want to use reality to create a custom material that can use my own shader and support Mesh instancing (for rendering 3D Gaussian splating), but I found that CustomMaterial does not support VisionOS. Is there any other interface that can achieve my needs? Where can I find examples?
1
0
81
Jul ’25
Why Large-Scale Model Scenes Cause Real Device Crashes
Is there any limitation in Vision Pro when loading scenes with large-scale models? ​Test Case: Asset: Composite USDA file containing ​10 individual models​ (total triangles count: ~4.2M) Simulator: Loads and renders correctly Real Device: Loads asset successfully but ​ failure during rendering phase: Environment abruptly dims System spontaneously reboots How can we resolve this issue? Below are excerpted logs preceding the crash: <<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606 Attempted to add ornament: <MRUIPlatterOrnament: 0x10a658f00; _isInternal: YES; _displaceWindowChrome: NO; _canCaptureUI: NO; _isBeingRemoved: NO; contentAnchorPoint3D: "{0.5, 0.5, 0}"; position: <MRUIPlatterOrnamentRelativePosition: 0x105b68e70; anchorPoint: {0.5, 0.5, 1}>; rotation: "{{0, 0, 0}, 0}"; opacity: 1.000000; canFollowUser: YES; effectiveOffset: "{0, 0, 0}"; presentingViewController: 0x0; billboardingBehavior: 0x0; scalingBehavior: 0x0; relativeToParent: NO; nonHeritableDepthDisplacement: 0.000000; order: 0.000000; _window._determinedSize: {0, 0}; _window: (null)> to nil or non-supporting UIScene: <UIWindowScene: 0x10a8a0000; role: UISceneSessionRoleImmersiveSpaceApplication; persistentIdentifier: test.test:SFBSystemService-BA3A21A3-D1AB-42E2-8AF0-AE0AB83BE528; activationState: UISceneActivationStateUnattached>. No action taken. Failed to set dependencies on asset 2823930584475958382 because NetworkAssetManager does not have an asset entity for that id. apply fence tx failed (client=0x98490e18) [0x10000003 (ipc/send) invalid destination port] Failed to commit transaction (client=0xa86516e2) [0x10000003 (ipc/send) invalid destination port]
1
0
208
Jul ’25