I'm updating our app to support Metal 4, but the Metal 4 types don't seem to be recognized when targeting the simulator. Is it known whether Metal 4 will be supported there in the near future, or am I setting up the app wrong?
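For concreteness, here's the kind of thing that compiles for a device destination but not for the simulator (a minimal sketch; the Metal 4 queue API below is my assumption about the simplest repro):

import Metal

// Minimal sketch: the MTL4* types resolve for device builds, but not when the
// destination is a simulator, so for now I'm guarding the whole path out.
#if !targetEnvironment(simulator)
func makeMetal4Queue(device: MTLDevice) -> MTL4CommandQueue? {
    device.makeMTL4CommandQueue()
}
#endif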
I have several games on the App Store which are set up as "For Kids," which means these games cannot have any way to access the outside world, including the App Store. No problem. These games have worked fine for years, until iOS 26.
Now, all my updates are being rejected because the new version of Game Center running on iPadOS 26 has a way for people to exit the game and go to the App Store. I have no control over this since it's built into Game Center, and the app review folks want me to put a "parental gate" on it, but there's no way to do that because... well... it's in Game Center, not my code.
So I'm unable to update my apps because of this. Presumably, the existing versions on the App Store still do this exact same thing, so my update isn't going to make any difference. Does anyone know of a way to make that crap at the top go away so this isn't an issue?
Topic: Graphics & Games
SubTopic: GameKit
Hi,
I’m using the latest iPad Pro (13-inch) and I can see that Metal offers an rgb10a2unorm texture for rendering, but when I render a grey ramp and measure the actual luminance, I get a pattern that I would expect from an 8-bit texture (see below). Before I start ripping apart all my code, is there anything else I need to do to convince iOS to render my texture in 10-bit?
I already tried setting the pixelFormat of my CAMetalLayer to rgb10a2unorm, but that didn't change anything.
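For reference, here's a minimal sketch of the layer setup I described (the colorspace line is an assumption I'm experimenting with, not something I know is required):

import Metal
import QuartzCore

// Sketch of the CAMetalLayer configuration described above.
let metalLayer = CAMetalLayer()
metalLayer.device = MTLCreateSystemDefaultDevice()
metalLayer.pixelFormat = .rgb10a2Unorm   // 10 bits per colour channel, 2-bit alpha
// Assumption: a wide-gamut colorspace may also be needed for the extra
// precision to survive all the way to the display.
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.displayP3)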
Hello,
Shaders in our application are written in HLSL, and we rely on the Metal Shader Converter to convert DXIL to Metal IR. We ran into an issue that causes Metal pipeline state creation to fail when a vertex stage-in function is used on AMD GPUs.
Here's the error reported by Metal in Xcode output:
Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
XPC_ERROR_CONNECTION_INTERRUPTED
MTLCompiler: Compilation failed with XPC_ERROR_CONNECTION_INTERRUPTED on 4 try. This error suggests an unexpected interruption in the connection. Possible reasons: a crash in the compiler service, termination by the OS due to resource constraints (e.g., jetsam), a timeout in the service, or an issue with IPC. Verify system stability and check the logs for more details.
Compiler failed with XPC_ERROR_CONNECTION_INVALID
XPC_ERROR_CONNECTION_INVALID
MTLCompiler: Compiler encountered XPC_ERROR_CONNECTION_INVALID: failed to check-in, peer may have been unloaded: mach_error=10000003 (is the OS shutting down or process jetsammed?)
Compilation failed due to an interrupted connection: XPC_ERROR_CONNECTION_INTERRUPTED. This error occurred after multiple retries.
which seems to indicate an internal compiler error.
I have a minimal repro here: https://github.com/kcloudy0717/metal_pso_fail/tree/main; simply follow the instructions in the README.
I rewrote my graphics pipeline to make better use of load/store actions for the clear and don't-care cases. All my tests pass, and in the Metal debugger all the draw calls succeed.
But when I present drawables (before [commandBuffer commit]), I only get a pink screen. I've tried everything I can think of, such as making sure the pixel formats of the back buffer and my render targets match, but it's still pink.
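For reference, here's a simplified sketch of how I'm setting the load/store actions on the pass that renders to the drawable (names are illustrative):

import Metal

// Simplified: configure the pass that targets the drawable's texture.
func makeRenderPass(for drawableTexture: MTLTexture) -> MTLRenderPassDescriptor {
    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = drawableTexture
    pass.colorAttachments[0].loadAction = .clear    // .dontCare where the target is fully overdrawn
    pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
    pass.colorAttachments[0].storeAction = .store   // anything presented must be stored
    return pass
}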
Could you point me in the right direction so I can fix this, or describe why it's pink? That would be really helpful.
Thank you,
Brian Hapgood
Topic: Graphics & Games
SubTopic: Metal
I have been tasked with creating content for the Apple Vision Pro. Just the 3D content and animation, not the programming end of things.
I can't seem to get any kind of mesh deformation animation to import into Reality Composer Pro. By that I mean bones/skin, or even point cache.
I work on PC, and my main software is 3DS Max, but I'm borrowing an iMac for this job, and was instructed to use RCP on it for testing before handing things off to the programmer. My files open and play fine in other USD programs, like Omniverse, or USD View, just not Reality Composer Pro.
I've seen the dinosaur demo in AVP, so I know mesh deformation is possible. If there are other essential tools that might make this possible, I have not been made aware of them.
I am experimenting with bouncing things off of Blender, in case that exports better, but I'm not really having luck there either, though my results are different.
Thanks.
Topic: Graphics & Games
SubTopic: RealityKit
I typically read an extended gamepad's capture() snapshot and get all the state. But PSVR2 controllers seem to report nothing, so the sticks and other buttons don't do anything in a built app. They do register as left/right controllers. This is on visionOS 26, Xcode 26, etc.
They work correctly in the main icon view, although they don't honor inverted vertical and horizontal scrolling. Both of the default scrolls just feel wrong: when I move left, I want to scroll left, not right. Same for up/down.
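Here's roughly how I'm reading the state, in case I'm holding it wrong (simplified):

import GameController

// Simplified: snapshot the extended gamepad each frame and read inputs from it.
func pollExtendedGamepad() {
    guard let pad = GCController.controllers().first?.extendedGamepad else { return }
    let snapshot = pad.capture()                      // frozen copy of the full state
    let stickX = snapshot.leftThumbstick.xAxis.value  // stays at 0 for PSVR2 controllers
    let pressed = snapshot.buttonA.isPressed          // likewise never true
    print(stickX, pressed)
}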
A bit of background on what our app is doing:
We have a RealityKit ARView session running.
During this period we place objects in RealityKit.
At some point user can "take photo" and we use session.captureHighResolutionFrame to capture a frame.
We then use the captured frame and frame.camera.projectPoint to project our objects back to 2D (sketched below).
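The projection step looks roughly like this (simplified; the orientation and viewport size are illustrative):

import ARKit

// Simplified: project a world-space point into the captured frame's image space.
func projectToImage(_ worldPoint: SIMD3<Float>, frame: ARFrame, viewportSize: CGSize) -> CGPoint {
    frame.camera.projectPoint(worldPoint,
                              orientation: .portrait,
                              viewportSize: viewportSize)
}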
The issue we found is that on devices running iOS 26, the first photo the user takes, i.e. the first frame received from session.captureHighResolutionFrame, gives an incorrect CGPoint from frame.camera.projectPoint. If the user takes a second photo with the same camera position, the second frame received from session.captureHighResolutionFrame gives the correct CGPoint.
I noticed a difference between the first and subsequent frames that I believe corresponds to the issue: the yaw value of the camera (frame.camera.eulerAngles.y) on the first frame is not correct (it's inconsistent with any subsequent frame).
I also created a small example app, following the Building an Immersive Experience with RealityKit example. The issue exists in this app on iOS 26, while iOS 18.* has consistent values between the first and subsequent captured frames.
Note:
The yaw value seems to differ more if we start the session in portrait but take the photo in landscape.
Example result for 3 captured frames:
Frame captured with yaw: 1.4855177402496338
Frame captured with yaw: -0.08803760260343552
Frame captured with yaw: -0.08179682493209839
Example code:
import SwiftUI
import RealityKit
import ARKit

class CustomARView: ARView, ARSessionDelegate {
    required init(frame: CGRect) { super.init(frame: frame) }
    required init?(coder decoder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func setup() {
        let singleTap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        addGestureRecognizer(singleTap)
    }

    @objc
    func handleTap(_ gestureRecognizer: UIGestureRecognizer) {
        Task {
            do {
                let frame = try await session.captureHighResolutionFrame()
                print("Frame captured with yaw: \(Double(frame.camera.eulerAngles.y))")
            } catch { }
        }
    }
}

struct CustomARViewUIViewRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> some UIView {
        let arView = CustomARView(frame: .zero)
        arView.setup()
        return arView
    }

    func updateUIView(_ uiView: UIViewType, context: Context) { }
}

struct ContentView: View {
    var body: some View {
        CustomARViewUIViewRepresentable()
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            .ignoresSafeArea()
    }
}
I'm developing a prototype Vision Pro app and would like to render a 3D scene made in Reality Composer Pro on an image anchor in a RealityView. But I've had no luck so far making it work and need some guidance to move on.
I got the image file stored in the assets like below:
And below is the source code:
import SwiftUI
import RealityKit
import RealityKitContent

struct AnchorView: View {
    @State var imageEntity: Entity = {
        let anchorEntity = AnchorEntity(.image(group: "AR Resources", name: "reanchor"))
        return anchorEntity
    }()

    var body: some View {
        RealityView { content in
            do {
                // Add the initial RealityKit content (try, not try?, so the catch is reachable)
                let scene = try await Entity(named: "Scene", in: realityKitContentBundle)
                imageEntity.addChild(scene)
                content.add(imageEntity)
            } catch {
                print("Error occurs when adding reality view content: \(error)")
            }
        }
    }
}
Hi,
I've just moved my SpriteKit-based game from UIKit to SwiftUI + SpriteView and I'm getting this message:
Adding 'GCControllerView' as a subview of UIHostingController.view is not supported and may result in a broken view hierarchy. Add your view above UIHostingController.view in a common superview or insert it into your SwiftUI content in a UIViewRepresentable instead.
Here's how I'm doing this:
import SwiftUI
import SpriteKit
import GameplayKit

struct ContentView: View {
    @State var alreadyStarted = false
    let initialScene = GKScene(fileNamed: "StartScene")!.rootNode as! SKScene

    var body: some View {
        ZStack {
            SpriteView(scene: initialScene, transition: .crossFade(withDuration: 1), isPaused: false, preferredFramesPerSecond: 60)
                .edgesIgnoringSafeArea(.all)
                .onAppear {
                    if !self.alreadyStarted {
                        self.alreadyStarted.toggle()
                        initialScene.scaleMode = .aspectFit
                    }
                }
            VirtualControllerView()
                .onAppear {
                    let virtualController = BTTSUtilities.shared.makeVirtualController()
                    BTTSSharedData.shared.virtualGameController = virtualController
                    BTTSSharedData.shared.virtualGameController?.connect()
                }
                .onDisappear {
                    BTTSSharedData.shared.virtualGameController?.disconnect()
                }
        }
    }
}

struct VirtualControllerView: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let result = PassthroughView()
        return result
    }

    func updateUIView(_ uiView: UIView, context: Context) {
    }
}

// Passes touches through to subviews so the controller overlay doesn't block the scene.
class PassthroughView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        for subview in subviews.reversed() {
            let convertedPoint = convert(point, to: subview)
            if let hitView = subview.hitTest(convertedPoint, with: event) {
                return hitView
            }
        }
        return nil
    }
}
Hello,
When testing GameKit "Manage Game Progress" in Xcode 26:
On iOS devices, achievements, leaderboards, and party code data display and work correctly.
On macOS devices, none of these data appear in "Manage Game Progress."
Is this a known issue with macOS GameKit, or is there a limitation compared to iOS?
If it is not a bug, is there any additional configuration needed to make achievements and leaderboards visible on macOS?
I also included the GameKit bundle in my macOS app and enabled "Enable Debug Mode" in the GameKit configuration in the scheme options.
Thank you.
After launching a nearby experience via Quick Look or inside our app, all the users in the group sometimes see the model blinking and becoming transparent...
... just one user doesn't have the issue: either the one who launched SharePlay, or the user who force-aligned the immersive space in front.
Weird.
Can't seem to get the Metal HUD to display value ranges (pre-macOS 26 Tahoe). The documented environment variable MTL_HUD_SHOW_VALUE_RANGE doesn't seem to work.
https://developer.apple.com/documentation/xcode/monitoring-your-metal-apps-graphics-performance#Display-the-value-range-of-metrics
Anyone having any luck?
Topic: Graphics & Games
SubTopic: Metal
Hello -
Upon enabling debug mode for the GameKit configuration for my iOS app, I do not see any devices (my iPhone or simulators) in the device list within the "Manage Game Progress" debug tool. I only see my Mac as a device. Is there any trick to this, or a way to add devices to this list?
Thank you
We have a released game on the App Store with Game Center challenges. The challenges were working well when testing the final version, before the Game Center challenges went live. After release, the activity that should take the player to the challenge in-game results in the printout below, and our code no longer gets informed of the activity by GKGameActivityListener, as it did before the Game Center challenges went live.
“Invalid game activity definition.
Failed to kick activity notification to GameKit. Error: Error Domain=GKErrorDomain Code=17 "(null)"”
Also, Game Center reports that there are no challenges in GKChallengeDefinition.all, even though there are ongoing challenges.
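For reference, this is roughly the check that comes back empty (simplified, using the call named above; the exact signature is from memory and may differ):

import GameKit

// Simplified: after release this returns no definitions, even though
// challenges are ongoing in App Store Connect.
func checkChallengeDefinitions() async throws {
    let definitions = try await GKChallengeDefinition.all
    print("Challenge definitions: \(definitions.count)")   // prints 0 for us
}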
The leaderboard results do get reported to Game Center though and show up as challenge entries in the Games app.
At the moment we are trying to figure out if this is a problem on our end or a Game Center server issue.
Topic: Graphics & Games
SubTopic: GameKit
Hello,
I found an issue with the Games app on macOS 26 (Tahoe) when viewing achievements:
In App Store Connect, each achievement has different values set for the pre-earned description and the post-earned description.
When testing with GameKit directly (GKAchievementDescription), both values are returned correctly.
However, in the macOS Games app, the post-earned description is shown even before the achievement is earned.
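For reference, this is roughly how I verified the values with GameKit directly (simplified):

import GameKit

// Simplified: both description strings come back correctly from GameKit itself.
func dumpAchievementDescriptions() async throws {
    let descriptions = try await GKAchievementDescription.loadAchievementDescriptions()
    for d in descriptions {
        print(d.identifier, d.unachievedDescription ?? "", d.achievedDescription ?? "")
    }
}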
This seems to be a display issue specific to the Games app on macOS.
Could you confirm if this is a known bug in the Games app, or if there is a reason why pre-earned descriptions are not being shown?
Thank you.
I implemented an EntityAction to change the baseColor tint, and had it working on visionOS 2.x.
import RealityKit
import UIKit

typealias Float4 = SIMD4<Float>

extension UIColor {
    var float4: Float4 {
        if
            cgColor.numberOfComponents == 4,
            let c = cgColor.components
        {
            Float4(Float(c[0]), Float(c[1]), Float(c[2]), Float(c[3]))
        } else {
            Float4()
        }
    }
}

struct ColourAction: EntityAction {
    // MARK: - PUBLIC PROPERTIES
    let startColour: Float4
    let targetColour: Float4

    // MARK: - PUBLIC COMPUTED PROPERTIES
    var animatedValueType: (any AnimatableData.Type)? { Float4.self }

    // MARK: - INITIATION
    init(startColour: UIColor, targetColour: UIColor) {
        self.startColour = startColour.float4
        self.targetColour = targetColour.float4
    }

    // MARK: - PUBLIC STATIC FUNCTIONS
    @MainActor static func registerEntityAction() {
        ColourAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(interpolatedColour)
        }
    }
}

extension Entity {
    // MARK: - PUBLIC FUNCTIONS
    func changeColourTo(_ targetColour: UIColor, duration: Double) {
        guard
            let modelComponent = components[ModelComponent.self],
            let material = modelComponent.materials.first as? PhysicallyBasedMaterial
        else {
            return
        }
        let colourAction = ColourAction(startColour: material.baseColor.tint, targetColour: targetColour)
        if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction, duration: duration, bindTarget: .material(0).baseColorTint) {
            playAnimation(colourAnimation)
        }
    }
}
This doesn't work on visionOS 26. My current fix is to directly set the material's base colour, but this feels like the wrong approach:
@MainActor static func registerEntityAction() {
    ColourAction.subscribe(to: .updated) { event in
        guard
            let animationState = event.animationState,
            let entity = event.targetEntity,
            let modelComponent = entity.components[ModelComponent.self],
            var material = modelComponent.materials.first as? PhysicallyBasedMaterial
        else { return }
        let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
        material.baseColor.tint = UIColor(interpolatedColour)
        entity.components[ModelComponent.self]?.materials[0] = material
        animationState.storeAnimatedValue(interpolatedColour)
    }
}
So before I raise this as a bug, was I doing anything wrong in the former version and got lucky? Is there a better approach?
Hello, I’m porting my UIKit/SceneKit app to SwiftUI/RealityKit and I’m wondering how to change the camera target programmatically. I created a simple scene in Reality Composer Pro with two spheres. My goal is straightforward: when the user taps a sphere, the camera should look at it as the main target.
Following Apple’s videos, I implemented the .gesture modifier and it is printing the tapped sphere correctly, but updating my targetEntity state doesn’t change anything, so the camera won't update its target. Is there a way to access the scene content at that level? Or what else should I do?
Here's a simplified sketch of my current implementation (entity and state names are illustrative):
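import SwiftUI
import RealityKit
import RealityKitContent

struct ContentView: View {
    @State private var targetEntity: Entity?

    var body: some View {
        RealityView { content in
            // Load the Reality Composer Pro scene containing the two spheres.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    print("Tapped: \(value.entity.name)")  // prints the right sphere
                    targetEntity = value.entity             // but the camera target never changes
                }
        )
    }
}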
Thanks!
Just wondering if anyone knows what it will take to hit greater than 60 Hz when targeting iPhone. If I set the preferredFramesPerSecond of an MTKView to 120, it works on the iPad, but on iPhone it never goes over 60 Hz, even with a simple hello-triangle sample app... is this a limitation of targeting iPhone?
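Here's roughly the setup I'm describing (simplified):

import MetalKit
import UIKit

// Simplified: request 120 Hz and log what the display claims to support.
func configure(_ view: MTKView) {
    view.preferredFramesPerSecond = 120
    // On my iPad this draws at 120 Hz; on iPhone the draw callbacks never exceed 60 Hz.
    print("Display max FPS: \(UIScreen.main.maximumFramesPerSecond)")
}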
Topic: Graphics & Games
SubTopic: Metal
Hi,
Since iOS 26 introduced the new Games app, I’ve noticed a problem when using a Nintendo Switch Pro Controller in wired USB-C mode, and also with third-party controllers that emulate it (like the GameSir X5 Lite).
In the Games app interface, only the L/R buttons respond, but the D-Pad and analog sticks don’t work at all. Once inside actual games, the controller works fine — the issue only affects the Games app UI.
What I’ve tested so far:
Xbox / PlayStation controllers → work fine in both wired and Bluetooth, including inside the Games app.
Switch Pro Controller (Bluetooth) → works fine, including in the Games app.
Switch Pro Controller (wired) → same issue as the X5 Lite, D-Pad and sticks don’t work in the Games app.
This makes it hard to use the new Games app launcher with these controllers, even though they work perfectly once a game is launched.
My question: is this an iOS bug (Apple needs to add proper support for wired Switch Pro controllers in the Games app), or something that Nintendo / GameSir would need to address?
Thanks in advance to anyone who can confirm this or provide more info.