Hi,
Introducing Swift Concurrency into my Metal app has been a bit challenging, since Swift Concurrency is limited by the cooperative thread pool.
GPU work is obviously not CPU bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks.
My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to run Metal-bound tasks for maximum performance.
To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or OperationQueues. Would this be the best way to bridge async tasks with Metal work? A rough sketch of what I mean is below.
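Something like this (just a sketch; Renderer and encodeRenderWork are placeholder names for my real setup, and the completion handler resumes the continuation instead of blocking on waitUntilCompleted):

```swift
import Foundation
import Metal

// Sketch only: a plain serial DispatchQueue owns the Metal work, and the
// async wrapper suspends on a continuation that is resumed from the command
// buffer's completion handler, so no cooperative-pool thread is blocked.
final class Renderer {
    private let device: MTLDevice
    private let commandQueue: MTLCommandQueue
    private let renderQueue = DispatchQueue(label: "com.example.metal-render")

    enum RendererError: Error { case commandBufferCreationFailed }

    init?() {
        guard let device = MTLCreateSystemDefaultDevice(),
              let commandQueue = device.makeCommandQueue() else { return nil }
        self.device = device
        self.commandQueue = commandQueue
    }

    /// Suspends until the GPU finishes this frame, without calling
    /// waitUntilCompleted() on a cooperative-pool thread.
    func renderFrame() async throws {
        try await withCheckedThrowingContinuation { (continuation: CheckedContinuation<Void, Error>) in
            renderQueue.async {
                guard let commandBuffer = self.commandQueue.makeCommandBuffer() else {
                    continuation.resume(throwing: RendererError.commandBufferCreationFailed)
                    return
                }
                // self.encodeRenderWork(into: commandBuffer)  // placeholder for real encoding
                commandBuffer.addCompletedHandler { buffer in
                    if let error = buffer.error {
                        continuation.resume(throwing: error)
                    } else {
                        continuation.resume()
                    }
                }
                commandBuffer.commit()
            }
        }
    }
}
```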
Thanks!
I'm updating an existing distributed game to add turn-based matches. When the Matchmaker ViewController Info Button next to a game is pressed, the results vary:
iOS 15.x - Button under the avatar says "Accept Invite" or "View Game" (depending on whether the invite has already been accepted)
iOS 18.x - Button always says "App Store" - I assume that means it would lead one to the App Store to install the game.
Both devices (iPad 15.x and iPhone 18.x) have the same version of the game installed. The results are the same when running in the simulator.
When the game is released, I assume this button will work properly, no?
Topic: Graphics & Games
SubTopic: GameKit
Hello. When I try to customize the icons on my phone, the applications that are inside groups (folders) don't pick up the new color; instead of being tinted they just show as all-black images. I don't understand why changing the icon color doesn't apply to apps in a group and only gives me the black versions.
Topic: Graphics & Games
SubTopic: General
Does anyone know if we will be able to AirPlay content from another Apple device, say an iPad or iPhone, to the Vision Pro?
Hey there,
I tried to install GPTK again, since I had to reinstall the OS for unrelated reasons. But every time I try to install the toolkit, it gives me the "Error: apple/apple/game-porting-toolkit 1.1 did not build" error. Before that error occurred, I had the OpenSSL error, which I fixed with the rbenv version of OpenSSL. Is there any way to fix this error? Below you'll find the full error message it gave me. The specs for my Mac, if they are helpful in any way: M1 Pro MBP 14" with 16 GB RAM and 512 GB SSD.
Thanks!
```
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so at (not Homebrew/brew or Homebrew/homebrew-core):
https://github.com/apple/homebrew-apple/issues
```
After watching the WWDC 2025 session "Combine Metal 4 machine learning and graphics", I decided to give it a shot and integrate the new MTL4MachineLearningCommandEncoder into my existing render pipeline. After a lot of trial and error, I managed to set up the pipeline and get the app to compile.
However, I am now stuck on creating an MTLLibrary from a .mtlpackage.
Here is the code I have to create the MTLLibrary, following the WWDC session https://developer.apple.com/videos/play/wwdc2025/262/?time=550:
let coreMLFilePath = bundle.path(forResource: "my_model", ofType: "mtlpackage")!
let coreMLURL = URL(string: coreMLFilePath)!
do {
    let library = try metalDevice.makeLibrary(URL: coreMLURL)
} catch {
    print("error: \(error)")
}
With the above code, I am getting this error:
Error Domain=MTLLibraryErrorDomain Code=1 "Invalid metal package" UserInfo={NSLocalizedDescription=Invalid metal package}
What is the correct way to create an MTLLibrary from a .mtlpackage? Am I seeing this error because the .mtlpackage I am using is incorrect? How should I go about debugging this?
I'd really appreciate some help on this, as I have been stuck on it for some time now. Thanks in advance!
I used Xcode's GPU capture to profile the bandwidth of my game's render pipeline, and found that the depth buffer and stencil buffer use the same buffer, whose format is Depth32Float_Stencil8.
But why, in a single pass of the pipeline, was this buffer loaded twice, with the Load Attachment Size in Encoder Statistics doubled?
Is this a bug in Xcode's GPU capture, or does the pass really load the buffer twice?
Topic: Graphics & Games
SubTopic: Metal
Hey, I want to make an app that tracks changes in a room and its lighting, and I was wondering if it's possible to use VirtualEnvironmentProbeComponent to obtain the EnvironmentResource image and store it.
If so, are there any examples of a similar operation I could use?
Thank you!
In the Creating A 3D Application With Hydra Rendering tutorial on the Apple Developer website, on the last step where I execute this command:
cmake -S ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/ -B ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/
I keep getting an error:
CMake Error at CMakeLists.txt:5 (include):
include could not find requested file:
/Users/macuser/USDInstall/bin/pxrConfig.cmake
I've followed the instructions in the README.md file included with the project at least 5 times, and I've also tried moving the pxrConfig.cmake file around and copying it into different folders, then running the command again, but I still could not generate the files needed to compile and render the HydraPlayer renderer. How do I get CMake to generate the Xcode project to build the HydraPlayer renderer?
Hi,
I have a Unity game. I need multiple app icons for my game so that it can be recognized in different countries.
In other words, is it possible to have an iOS app whose app icon changes based on the device locale/language?
On Android this is possible using the Unity Localization package "com.unity.localization". A sketch of the native iOS mechanism I have in mind is below.
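This is only a sketch of the native side as I imagine it (the alternate icon names here are made up, and they would have to be declared under CFBundleAlternateIcons in Info.plist); I have not wired this up from Unity yet:

```swift
import UIKit

// Sketch: pick an alternate app icon based on the current language.
// Assumes alternate icons named "AppIcon-JP" and "AppIcon-FR" are declared
// under CFBundleAlternateIcons in Info.plist; those names are hypothetical.
@MainActor
func applyLocalizedIcon() {
    guard UIApplication.shared.supportsAlternateIcons else { return }

    let languageCode = Locale.current.language.languageCode?.identifier ?? "en"
    let iconName: String?
    switch languageCode {
    case "ja": iconName = "AppIcon-JP"
    case "fr": iconName = "AppIcon-FR"
    default:   iconName = nil  // nil restores the primary icon
    }

    UIApplication.shared.setAlternateIconName(iconName) { error in
        if let error {
            print("Failed to switch icon: \(error)")
        }
    }
}
```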
Topic: Graphics & Games
SubTopic: General
Hi all,
I've developed some code that enables an arcball camera interaction with my scene, using components and systems. The implementation feels a bit messy: I've got gesture code on my RealityView, and then a bunch of other code that consumes those gesture inputs in my component and system. Roughly, the structure looks like the sketch below.
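A simplified sketch of my current structure (the names are mine; the DragGesture on the RealityView writes yaw/pitch into the component, and the system repositions the camera entity each update):

```swift
import Foundation
import RealityKit

// The gesture handlers on the RealityView write into this component.
struct ArcballCameraComponent: Component {
    var target: SIMD3<Float> = .zero
    var radius: Float = 2
    var yaw: Float = 0
    var pitch: Float = 0
}

// The system turns the stored angles into a camera transform every update.
struct ArcballCameraSystem: System {
    static let query = EntityQuery(where: .has(ArcballCameraComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let arcball = entity.components[ArcballCameraComponent.self] else { continue }
            // Spherical coordinates around the target point.
            let x = arcball.radius * cos(arcball.pitch) * sin(arcball.yaw)
            let y = arcball.radius * sin(arcball.pitch)
            let z = arcball.radius * cos(arcball.pitch) * cos(arcball.yaw)
            entity.look(at: arcball.target,
                        from: arcball.target + SIMD3<Float>(x, y, z),
                        relativeTo: nil)
        }
    }
}
// ArcballCameraComponent.registerComponent() and
// ArcballCameraSystem.registerSystem() are called at app startup.
```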
Is there a demo app, or some example code, that shows a nice way to encapsulate these things into one item for custom cameras, something like Apple's .realityViewCameraControls(.orbit)?
If not, can anyone recommend an approach to take?
My iOS app generates PDF files.
Every time my users open the generated PDF files, the autofill popup appears, but my PDF files are NOT meant to be interacted with.
I'm here to ask if there's a way to mark my PDF files as "not a form", in the metadata or anywhere else.
Given a graph with obstacles added, I want to make a copy of it.
When I make the copy:
currentGraph has 20 obstacles added.
var newGraph = currentGraph.copy() as? GKObstacleGraph
newGraph!.removeObstacles([newGraph!.obstacles.first!])
This returns a BAD ACCESS.
I don't understand what's going on or what the problem is.
If I do the same thing with the original graph, there is no problem:
currentGraph.removeObstacles([currentGraph.obstacles.first!])
Thanks for the help
Topic: Graphics & Games
SubTopic: SpriteKit
Hello Apple team,
I'm working on an iOS AR app using SwiftUI and RealityKit, and I was wondering if the Cinematic API can be used with a RealityKit scene. I'd like to achieve a shallow depth of field while keeping the 3D asset in focus, and vice versa.
Thanks!
I am using Unity's GameKit plugin to implement a turn-based game.
I want to make a UI in Unity to show all the games I can join.
I tried using
var matches = await GKTurnBasedMatch.LoadMatches();
to get all the open matches.
But it seems that I can only get the matches related to the current Apple account.
Can you help me get all the matches?
ALSO
I used
var match = await GKTurnBasedMatchmakerViewController.Request(request);
to dismiss the Game Center interface and start a game (automatic matching, no one was invited).
Another device used
var match = await GKTurnBasedMatch.Find(request);
to find the game, but it did not find that game; instead it started a new game (automatic matching).
Can you help me solve these problems?
Hello,
The macOS 26 betas are limiting games (noticeably, games that use Java) to the refresh rate of the MacBook Pro's built-in display (120 Hz). Even when connecting an external display, this does not change. I have submitted a bug report, but I have not had any response to it yet. I am looking to see if anyone has an answer or a fix for this issue.
Thanks!
Topic: Graphics & Games
SubTopic: General
As in the Environments, where we have real-time reflections of movies on a screen, or reflections of the surrounding Mount Hood scenery in the background...
Could I get a metallic surface to show accurate reflections of a box placed on top of it?
I don't mean using a probe or an HDR cubemap; I mean the same accurate reflections as the water in the Mount Hood environment reflecting the movie I'm watching in another app.
I am trying to load some PNG data with MTKTextureLoader's newTextureWithData, but the result looks wrong in the alpha areas.
Here is the code. I have an image URL; after it downloads successfully, I try to use the downloaded data or UIImagePNGRepresentation(image), and both give wrong results.
[[[NSURLSession sharedSession] dataTaskWithURL:url completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    UIImage *tempImg = [UIImage imageWithData:data];
    CGImageRef cgRef = tempImg.CGImage;
    MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];
    id<MTLTexture> temp1 = [loader newTextureWithData:data options:@{MTKTextureLoaderOptionSRGB: @(NO), MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead), MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)} error:nil];
    NSData *tempData = UIImagePNGRepresentation(tempImg);
    id<MTLTexture> temp2 = [loader newTextureWithData:tempData options:@{MTKTextureLoaderOptionSRGB: @(NO), MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead), MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)} error:nil];
    id<MTLTexture> temp3 = [loader newTextureWithCGImage:cgRef options:@{MTKTextureLoaderOptionSRGB: @(NO), MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead), MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)} error:nil];
}] resume];
Problem Summary
After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.
Environment Details
Operating System: iOS 26.1 & iOS 26.2
Framework: RealityKit
Xcode Version: 16.2 (16C5032a)
Expected vs. Actual Behavior
Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.
Steps to Reproduce
1. Create or open an AR application with RealityKit that uses particle components
2. Attach a ParticleEmitterComponent to an entity via a custom system
3. Run the application on iOS 26.1 or iOS 26.2
4. Observe that particles render at an offset position away from the entity
Minimal Code Example
Here's the setup from my test case:
Custom Component & System:
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
AR Setup:
let boxMesh = MeshResource.generateBox(size: 0.1) // test cube mesh
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)

let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"

let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
Questions for the Community
Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?
Additional Information
I've already submitted this issue via Feedback Assistant (FB21346746).
I’m building an app that uses RealityKit and specifically ARConfiguration.FrameSemantics.personSegmentationWithDepth.
The goal is to insert an AR object into the scene behind a person, and an additional AR object in front of the person, while being as photo realistic as possible.
Through testing, I’ve noticed that many times, the edges of the person segmentation mask are not well matched to the actual person, and parts of the person are transparent, with the AR object bleeding through. It’s sort of like a “bad green screen” effect, which I’d expect to see a little bit, but not to this extent. I’ve been testing on iPhone 16, iPhone 14 Pro, iPad Pro 12.9 inch 6th Generation, and iPhone 12 Pro, with similar results across all devices.
I’m wondering what else I can do to improve this… either code changes, platform (like different iPhone models), or environment (like lighting, distance, etc).
Attaching some example screen grabs and a minimum reproducible code sample. Appreciate any insights!
import ARKit
import SwiftUI
import RealityKit
struct RealityViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.renderOptions.insert(.disableMotionBlur)
        arView.renderOptions.insert(.disableDepthOfField)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        arView.session.run(configuration)

        arView.session.delegate = context.coordinator
        context.coordinator.arView = arView
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: RealityViewContainer
        weak var arView: ARView?
        var floorAnchor: ARPlaneAnchor?

        init(_ parent: RealityViewContainer) {
            self.parent = parent
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            if let arView, floorAnchor == nil {
                for anchor in anchors {
                    if let horizontalPlaneAnchor = anchor as? ARPlaneAnchor,
                       horizontalPlaneAnchor.alignment == .horizontal,
                       horizontalPlaneAnchor.transform.columns.3.y < arView.cameraTransform.translation.y { // filter out ceiling
                        floorAnchor = horizontalPlaneAnchor

                        let backgroundEntity = BackgroundEntity()
                        let anchorEntity = AnchorEntity(anchor: horizontalPlaneAnchor)
                        anchorEntity.addChild(backgroundEntity)

                        let foregroundEntity = ForegroundEntity()
                        backgroundEntity.addChild(foregroundEntity)

                        arView.scene.addAnchor(anchorEntity)
                        arView.installGestures([.rotation, .translation], for: backgroundEntity)
                        break // Stop after adding the first horizontal plane (floor)
                    }
                }
            }
        }
    }
}
Topic: Graphics & Games
SubTopic: RealityKit