I noticed that MTLPixelFormat has these cases:
case r32Float = 55
case rg32Float = 105
case rgba32Float = 125
But there is no case rgb32Float. What's the reason for this omission?
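For context, the workaround I'm using in the meantime is to pad each RGB texel out to RGBA and ignore the fourth channel; a minimal sketch (texture size is arbitrary):

import Metal

// No .rgb32Float exists, so allocate an RGBA texture and waste
// the alpha channel (assuming padding is the intended approach).
let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba32Float,
    width: 512, height: 512,
    mipmapped: false
)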
Hi,
I would like clarification on whether the new hover effects feature introduced in visionOS 26 supports pinch gestures through the PSVR2 controllers.
In your sample application, I was not able to confirm that this works. Only pinch-clicking with my hands worked. Pulling the trigger on the controller while looking at a 3D object did not activate the hover effect spatial event in the sample application. (The object does show the highlight, though.)
This is inconsistent with the hover effect behavior of PSVR2 controllers on SwiftUI views, where a trigger press does count as a button click.
The sample I used was this one:
https://developer.apple.com/documentation/compositorservices/rendering_hover_effects_in_metal_immersive_apps
In my game 854159268 (com.1791entertainment.qugame), the top two entries in my quMostRecent3 leaderboard have 'vanished'. They were there yesterday. I know these players have played today, as I see their scores on other leaderboards.
Any ideas how to get these back?
These two players (me and my tester) are both testing via TestFlight; I'm not sure if that changes things.
I was wondering whether there's a way on macOS to have my application hide a HID device such as a game controller, and instead have the receiving game/application see my app's virtual controller. Is this possible via DriverKit or some other form of kernel-level coding?
On Windows there is a tool called HidHide that hides a game controller from all other applications. Is it possible to implement such behavior in an app, or is that system-level only?
Hey, I'm using the CIDepthBlurEffect Core Image filter in my app. It seems to work OK, but I get these errors in the console when calling the class:
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_scan
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_diffuse
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_copy_back
CoreImage Metal library does not contain function for name: plain_or_sRGB_copy
Am I missing some sort of import to gain access to these Metal functions? I am using my own custom shaders, but I assume you'd be able to use them alongside the built-in ones.
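For reference, this is roughly how I'm invoking the filter; the input images below are placeholders standing in for my actual pipeline inputs:

import CoreImage

// `image` and `disparity` stand in for my real pipeline inputs.
let image = CIImage(color: .gray)      // placeholder
let disparity = CIImage(color: .gray)  // placeholder
let filter = CIFilter(name: "CIDepthBlurEffect")!
filter.setValue(image, forKey: kCIInputImageKey)
filter.setValue(disparity, forKey: kCIInputDisparityImageKey)
let output = filter.outputImage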
Was wondering if anyone from Apple could provide some clarification. The gaming studio Epic Games is wondering whether it could distribute the award-winning game Fortnite back on macOS without any retaliation.
I know Fortnite being back on macOS would benefit thousands of macOS devs. Hoping to get a clarification so Epic could start bringing Fortnite back.
I'm updating an existing distributed game to add turn-based matches. When the info button next to a game is pressed in the matchmaker view controller, the results vary:
iOS 15.x: The button under the avatar says "Accept Invite" or "View Game" (depending on whether the invite has already been accepted).
iOS 18.x: The button always says "App Store". I assume that means it would lead one to the App Store to install the game.
Both devices (an iPad on 15.x and an iPhone on 18.x) have the same version of the game installed. The results are the same when running in the simulator.
When the game is released, I assume this button will work properly, no?
Turn-based games: 2 players
When an opponent declines a game in the Game Center matchmaker view controller, that player sees that they quit, but no message is sent to the listener about that fact. For the person who started the match, their matchmaker view controller shows that it's their turn again. Why doesn't Game Center end the match?
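For context, this is a simplified sketch of the listener I have registered (via GKLocalPlayer.local.register(_:) after authentication); the type name is my own:

import GameKit

class TurnEventHandler: NSObject, GKLocalPlayerListener {
    // Never called when an invitee declines the match.
    func player(_ player: GKPlayer, matchEnded match: GKTurnBasedMatch) {
        // expected to end the match here
    }
}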
I've recently run into an issue in Xcode where the .sks editor's preview canvas just vanishes for every project on my computer. I don't think it is an issue with my .sks files, because everything works as expected on another computer with the same files, and when it happens, it happens for ALL .sks files in all projects. There used to be menu items to toggle the canvas and its settings, but those are now gone for me in .sks files (they still show up for Swift files that have previews, however).
Any idea what is going on here? How do I get the canvas back? I literally cannot get any work done on my primary computer because of this...
How can I paste a string into the findNavigator of a TextEditor?
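For context, this is how I'm presenting the navigator; I can toggle it, but I don't see any API to pre-fill the search field (the view and state names are my own):

import SwiftUI

struct EditorView: View {
    @State private var text = ""
    @State private var isFinding = false

    var body: some View {
        TextEditor(text: $text)
            // Presents the system find navigator; there appears to
            // be no parameter for an initial search string.
            .findNavigator(isPresented: $isFinding)
    }
}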
I've tried out a ParticleEmitter in Reality Composer Pro to produce a burst of particles that don't move (i.e. speed close to zero).
When viewing from different angles, it clearly looks like the particles are rendered exactly in the wrong order, that is, front first and back last. In other words, back particles obscure front particles.
I would prefer it the correct way around.
I've only tried this interactively in Reality Composer Pro, not programmatically, but I assume I would get the same result.
My Reality Composer Pro "File" (zipped):
https://gert-rieger-edv.de/Posts/Post-1/RealityParticles.zip
To reproduce: click on the ParticleEmitter object, then on its Play button, then select the Particles tab and click "Burst" a few times to get a few random particles.
Mac Studio 2025
Apple M4 Max
macOS 15.7.2 (24G325)
Reality Composer Pro
Version 2.0 (494.60.2)
Is the pseudocode below thread-safe? Imagine that the main thread sets the CAMetalLayer's drawableSize to a new size while the rendering thread is in the middle of rendering into an existing MTLDrawable that still has the old size.
Is the change of metalLayer.drawableSize thread-safe in the sense that I can present an old MTLDrawable that has a different resolution than the current value of metalLayer.drawableSize? I assume that setting the drawableSize property informs Metal that the next MTLDrawable vended by the CAMetalLayer should have the new size, right?
Is it valid to assume that "metalLayer.drawableSize = newSize" and "metalLayer.nextDrawable()" are internally synchronized, so that metalLayer.nextDrawable() can never produce e.g. an MTLDrawable with the old width but the new height (or a completely invalid resolution due to a race condition)?
func onWindowResized(newSize: CGSize) {
    // Called on the main thread
    metalLayer.drawableSize = newSize
}

func onVsync(drawable: MTLDrawable) {
    // Called on a background rendering thread
    renderer.renderInto(drawable: drawable)
}
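In case it helps frame the question: the defensive workaround I'm considering, assuming no internal synchronization guarantee, is to forward the resize onto the render thread so drawableSize never changes mid-frame (renderQueue is the serial queue my render loop runs on; it's part of my own setup, not a CAMetalLayer API):

func onWindowResized(newSize: CGSize) {
    // Hop to the render queue so the size change is ordered
    // with respect to nextDrawable() calls.
    renderQueue.async {
        metalLayer.drawableSize = newSize
    }
}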
Hi, I'm trying to set the displayScale environment value for a RealityView, so it renders at 2x instead of 3x on the iPhone, but it seems to have no effect.
.environment(\.displayScale, 2.0)
Is this expected behavior, or a bug?
The reason I want it to render at 2x and not at the default 3x is for game optimization and performance.
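For reference, this is roughly how I'm applying it (the view and its contents are simplified placeholders):

import SwiftUI
import RealityKit

struct GameView: View {
    var body: some View {
        RealityView { content in
            // scene setup elided
        }
        // Intended to force 2x rendering on a 3x device;
        // in practice this appears to have no effect.
        .environment(\.displayScale, 2.0)
    }
}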
We are developing a hybrid iOS app where Angular content is rendered inside a WKWebView, hosted by a native Swift application.
We use the GameController framework to detect whether an external Bluetooth keyboard is connected to an iPad. The following code is executed when the app enters the foreground and also when requested by the web layer:
func keyboardStatusHandler() {
    let isKeyboardConnected = GCKeyboard.coalesced != nil
    if !isKeyboardConnected {
        // send status to Angular
    } else {
        // send status to Angular
    }
}
Crash details
We are seeing intermittent crashes on iPad with the following stack trace:
Crashed: GCDeviceSession.HID
0 libobjc.A.dylib 0x7db8 objc_retain_x8 + 16
1 libsystem_blocks.dylib 0xfb8 void HelperBase<ExtendedInline>::copyCapture<(HelperBase<ExtendedInline>::BlockCaptureKind)3>(unsigned int) + 48
2 libsystem_blocks.dylib 0xbc4 HelperBase<GenericInline>::copyBlock(Block_layout*, Block_layout*) + 108
3 libsystem_blocks.dylib 0xc94 _call_copy_helpers_excp + 60
4 libsystem_blocks.dylib 0xef8 _Block_copy + 412
5 libdispatch.dylib 0x1a70 _dispatch_Block_copy + 32
6 libdispatch.dylib 0x792c dispatch_async + 56
7 libdispatch.dylib 0x792c dispatch_channel_async + 56
8 GameController 0xea6dc -[GCKeyboardInput _handleKeyboardEvent:] + 324
9 GameController 0x22508 __53-[_GCKeyboardEventHIDAdapter initWithSource:service:]_block_invoke + 376
10 GameController 0x11d30 -[_GCHIDEventSubject publishHIDEvent:] + 268
11 GameController 0xb79cc __40-[_GCHIDEventUIKitClient initWithQueue:]_block_invoke_3 + 44
12 libdispatch.dylib 0x1b584 _dispatch_client_callout + 16
13 libdispatch.dylib 0x12088 _dispatch_async_and_wait_invoke_and_complete_recurse + 272
14 libdispatch.dylib 0x8448 _dispatch_async_and_wait_f + 108
15 GameController 0xb7984 __40-[_GCHIDEventUIKitClient initWithQueue:]_block_invoke_2 + 132
16 GameController 0xb746c __48-[__GCHIDEventUIKitClient _initWithApplication:]_block_invoke + 256
17 UIKitCore 0x11fd394 __61-[UIEventFetcher _setHIDGameControllerEventObserver:onQueue:]_block_invoke_3 + 40
18 libdispatch.dylib 0x1aac _dispatch_call_block_and_release + 32
19 libdispatch.dylib 0x1b584 _dispatch_client_callout + 16
20 libdispatch.dylib 0xa2d0 _dispatch_lane_serial_drain + 740
21 libdispatch.dylib 0xadac _dispatch_lane_invoke + 388
22 libdispatch.dylib 0x151dc _dispatch_root_queue_drain_deferred_wlh + 292
23 libdispatch.dylib 0x14a60 _dispatch_workloop_worker_thread + 540
24 libsystem_pthread.dylib 0xa0c _pthread_wqthread + 292
25 libsystem_pthread.dylib 0xaac start_wqthread + 8
Observed scenarios
Crash occurs when the app transitions from background to foreground
Crash also occurs when the Angular layer requests keyboard status, triggering the same code path
Questions
Has anyone encountered crashes related to GCKeyboard.coalesced or GCKeyboardInput like this?
Are there known issues with the GameController framework when querying keyboard state during app lifecycle transitions?
Is there a recommended or safer way to detect external keyboard connection status on iPad (especially when using WKWebView)?
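For what it's worth, the alternative I'm evaluating is to track connection state via the GameController connect/disconnect notifications on the main queue instead of polling GCKeyboard.coalesced during lifecycle transitions (an assumption on my part that this avoids the HID-session race):

import GameController

final class KeyboardStatusObserver {
    private(set) var isKeyboardConnected = GCKeyboard.coalesced != nil

    init() {
        // Update state on the main queue when a keyboard connects.
        NotificationCenter.default.addObserver(
            forName: .GCKeyboardDidConnect, object: nil, queue: .main
        ) { [weak self] _ in
            self?.isKeyboardConnected = true
            // forward status to the Angular layer here
        }
        // And when it disconnects.
        NotificationCenter.default.addObserver(
            forName: .GCKeyboardDidDisconnect, object: nil, queue: .main
        ) { [weak self] _ in
            self?.isKeyboardConnected = false
            // forward status to the Angular layer here
        }
    }
}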
Any insights, known platform issues, or suggested workarounds would be greatly appreciated.
Thanks!
After authenticating the user I'm loading my Game Center leaderboards like this:
let leaderboards = try await GKLeaderboard.loadLeaderboards(IDs: [leaderboardID])
This is working fine, but there are times when this just returns an empty array. When I encounter this situation, the array remains empty for several hours when retrying, but then at some point it suddenly starts working again.
Is this a known issue? Or am I maybe hitting some kind of quota (as I call it quite often while developing my game)?
Edit: My leaderboards are grouped in sets if that makes any difference here.
I do not understand how offline leaderboard score submission is supposed to work in GameKit:
While the documentation briefly states that offline submission is supported, how is that even possible when you first have to fetch a leaderboard object in order to then call its submitScore function? How can I get the leaderboard object in the first place when offline?
Can anyone enlighten me how this works? Or maybe point me to some relevant documentation?
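For reference, the only path I can find that doesn't require a fetched leaderboard object is the class-level submit method, which takes leaderboard IDs directly; I'm assuming this is what the offline-support note refers to (score value and ID below are placeholders):

import GameKit

// Class-level submission takes leaderboard IDs directly, so no
// GKLeaderboard instance has to be fetched first.
GKLeaderboard.submitScore(
    1234,
    context: 0,
    player: GKLocalPlayer.local,
    leaderboardIDs: ["my.leaderboard.id"]
) { error in
    if let error {
        print("Score submission failed: \(error)")
    }
}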
I have implemented Game Center for authentication and for saving the player's game data. Both authentication and saving work correctly all the time, but there is a problem with fetching and loading the data.
The game works like this:
At startup, I start the authentication
After the player successfully logs in, I start loading the player's data by calling the fetchSavedGames method
If game data exists for the player, I receive a list of SavedGame objects containing the player's data
The problem is that after I uninstall the game and install it again, sometimes the SavedGame list is empty (step 3). But if I don't uninstall the game and just reopen it, this process works fine.
Here's the complete code of Game Center implementation:
import GameKit
import UIKit

class GameCenterHandler {
    // The view controller that should present the Game Center
    // sign-in UI (an assumption: the app must supply one).
    weak var presentingViewController: UIViewController?

    public func signIn() {
        GKLocalPlayer.local.authenticateHandler = { viewController, error in
            if let viewController = viewController {
                // Present the sign-in UI from the app's own view
                // controller. (My original code presented the view
                // controller on itself, which cannot work.)
                self.presentingViewController?.present(viewController, animated: false)
                return
            }
            if error != nil {
                // Player could not be authenticated.
                // Disable Game Center in the game.
                return
            }
            // Auth successful
            self.load(filename: "TestFileName")
        }
    }

    public func save(filename: String, data: String) {
        if GKLocalPlayer.local.isAuthenticated {
            GKLocalPlayer.local.saveGameData(Data(data.utf8), withName: filename) { savedGame, error in
                if savedGame != nil {
                    // Data saved successfully
                }
                if error != nil {
                    // Error in saving game data!
                }
            }
        } else {
            // Error in saving game data! User is not authenticated
        }
    }

    public func load(filename: String) {
        if GKLocalPlayer.local.isAuthenticated {
            GKLocalPlayer.local.fetchSavedGames { games, error in
                if let game = games?.first(where: { $0.name == filename }) {
                    game.loadData { data, error in
                        if data != nil {
                            // Data loaded successfully
                        }
                        if error != nil {
                            // Error in loading game data!
                        }
                    }
                } else {
                    // Error in loading game data! Filename not found
                }
            }
        } else {
            // Error in loading game data! User is not authenticated
        }
    }
}
I have also added the Game Center and iCloud capabilities in Xcode. In the iCloud section, I selected iCloud Documents and added a container.
I found a similar question here, but it doesn't make things clearer.
Problem Summary
After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.
Environment Details
Operating System: iOS 26.1 & iOS 26.2
Framework: RealityKit
Xcode Version: 16.2 (16C5032a)
Expected vs. Actual Behavior
Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.
Steps to Reproduce
Create or open an AR application with RealityKit that uses particle components
Attach a ParticleEmitterComponent to an entity via a custom system
Run the application on iOS 26.1 or iOS 26.2
Observe that particles render at an offset position away from the entity
Minimal Code Example
Here's the setup from my test case:
Custom Component & System:
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
AR Setup:
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"
let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
Questions for the Community
Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?
Additional Information
I've already submitted this issue via Feedback Assistant (FB21346746).
After updating iPad/iPhone devices from iOS 18 to iOS 26, PhotogrammetrySession intermittently crashes during photogrammetry processing. The same workflow was stable on iOS 18 with no code changes to the app.
Environment:
OS versions: works on iOS 18, crashes on iOS 26
Device: iPad/iPhone (reproducible across devices)
Source images: ~170-200 JPG files at 2160 x 3840 resolution
Reproduction:
The crash occurs consistently on the second or third sequential run of the photogrammetry session with the same image set. First run typically succeeds.
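For reference, a minimal sketch of the sequential-run pattern that triggers it (the folder URL, output path, and detail level are simplified stand-ins for my actual code):

import RealityKit

// Each run builds a fresh session over the same image folder;
// the crash appears on the second or third such run.
func runPhotogrammetry(imagesFolder: URL, outputURL: URL) async throws {
    let session = try PhotogrammetrySession(input: imagesFolder)
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .reduced)
    ])
    for try await output in session.outputs {
        if case .processingComplete = output { break }
    }
}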
Crash details:
Xcode shows an uncaught exception during image processing:
terminating due to uncaught exception of type std::bad_alloc: std::bad_alloc
VTPixelTransferSession 420f sid 269 (2160.00 x 3840.00) [0.00 0.00 2160 3840]
rowbytes( 2160, 2160 ) Color( (null), 0x0, (null), (null), ITU_R_601_4 )
=> 24 sid 19 (2160.00 x 3840.00) [0.00 0.00 2160 3840] rowbytes( 6528 )
Color( 0x0, (null), (null), (null) )
This appears to be a memory allocation failure in VTPixelTransferSession during color space conversion. Has anyone else experienced similar crashes with CorePhotogrammetry on iOS 26, or found workarounds?