I'm building a game with a client-server architecture. GKMatch.chooseBestHostingPlayer(_:) rarely works: when I started testing it today it worked once at the very beginning, and since then it has always succeeded on one client and returned nil on the other. I'm testing with a Mac and an iPhone. Sometimes it fails on the Mac, sometimes on the iPhone. On the device where it succeeds, the provided host can be either the device itself or the other one.
I created FB9583628 in August 2021, but after the Feedback Assistant team replied that they are not able to reproduce it, the feedback never went forward.
import SceneKit
import GameKit

#if os(macOS)
typealias ViewController = NSViewController
#else
typealias ViewController = UIViewController
#endif

class GameViewController: ViewController, GKMatchmakerViewControllerDelegate, GKMatchDelegate {
    var match: GKMatch?
    var matchStarted = false

    override func viewDidLoad() {
        super.viewDidLoad()
        GKLocalPlayer.local.authenticateHandler = authenticate
    }

    // Presents the system authentication UI if needed, then starts matchmaking.
    private func authenticate(_ viewController: ViewController?, _ error: Error?) {
        #if os(macOS)
        if let viewController = viewController {
            presentAsSheet(viewController)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            GKDialogController.shared().present(viewController)
        }
        #else
        if let viewController = viewController {
            present(viewController, animated: true)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            present(viewController, animated: true)
        }
        #endif
    }

    private func defaultMatchRequest() -> GKMatchRequest {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 2
        request.defaultNumberOfPlayers = 2
        request.inviteMessage = "Ciao!"
        return request
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        print("cancelled")
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        print(error)
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        self.match = match
        match.delegate = self
        startMatch()
    }

    func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
        print("\(player.gamePlayerID) changed state to \(String(describing: state))")
        startMatch()
    }

    // Starts the match once all expected players have connected.
    func startMatch() {
        let match = match!
        if matchStarted || match.expectedPlayerCount > 0 {
            return
        }
        print("starting match with local player \(GKLocalPlayer.local.gamePlayerID) and remote players \(match.players.map({ $0.gamePlayerID }))")
        match.chooseBestHostingPlayer { host in
            print("host is \(String(describing: host?.gamePlayerID))")
        }
    }
}
In Xcode version 16.2 (16C5032a) I created:
One [ScenekitApp] [File/New/Project/iOS/Game/Game Technology: SceneKit].
Three custom Frameworks [File/New/Project/iOS/Framework] [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework].
In all four projects, [Minimum Deployments=iOS 15.0], [Swift version=6.0] and [BUILD_LIBRARY_FOR_DISTRIBUTION=Yes].
I installed [Zip.framework] directly into [ASCSource.framework] without [pod init/pod install], and in the [General tab] under [Frameworks, Libraries and Embedded Content] set [Zip.framework Embed & Sign].
I installed the three frameworks [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework] in [ScenekitApp], and everything works perfectly in Xcode version 16.2 (16C5032a).
I updated Xcode version 16.2 (16C5032a) to Xcode version 16.3 (16E140), and some errors were reported in the SDKs:
[Failed to build module 'ASCSource'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK].
[Failed to build module 'Zip'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK].
If I recompile all the frameworks in Xcode version 16.3 (16E140) everything works, but I don't think that is a good long-term solution.
I found some discussions at the links below, but without success.
https://github.com/firebase/firebase-ios-sdk/issues/13727
https://stackoverflow.com/questions/70556401/swift-version-conflict-this-sdk-is-not-supported-by-the-compiler-please-select
Any help is welcome.
Thanks everyone!!!
Hello
I am trying to get threadgroup memory access in a fragment shader. In essence, I would like all the fragments in a tile to bitwise-OR some value. My idea was to use simd_or across the SIMD group, then have thread 0 of each SIMD group atomically OR the value into threadgroup memory. Finally, the very first thread of the tile would be tasked with writing the value down to a texture with write access.
Now, I can declare the threadgroup memory argument on the fragment function all right. MTLRenderCommandEncoder has a setThreadgroupMemoryLength call, which I am using the following way:
[renderEncoder setThreadgroupMemoryLength:16 offset:0 atIndex:0];
Unfortunately, all I am getting is the following error (runtime assertion):
-[MTLDebugRenderCommandEncoder setThreadgroupMemoryLength:offset:atIndex:]:3487: failed assertion `Set Threadgroup Memory Length Validation
offset + length(16) must be <= threadgroupMemoryLength(0).'
What am I doing wrong? How can I get threadgroup memory in the fragment shader? I know I could use tile shading and a compute function, but the problem is that here I would really like to use the fragment path. I will be grateful for help.
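One guess, purely an assumption on my part from reading the validation message: the encoder call only binds a sub-range of a persistent per-tile allocation, and that allocation is declared on the render pass descriptor, where it defaults to 0 bytes. A minimal Swift sketch of reserving it first (attachment setup omitted):

import Metal

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let commandBuffer = queue.makeCommandBuffer()!

let passDescriptor = MTLRenderPassDescriptor()
// A real pass also needs at least one valid attachment configured here.
passDescriptor.threadgroupMemoryLength = 16  // reserve 16 bytes of per-tile memory

let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor)!
// offset + length (16) is now <= the pass's threadgroupMemoryLength (16).
encoder.setThreadgroupMemoryLength(16, offset: 0, index: 0)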
Hi Apple,
In visionOS, for real-time streaming of large 3D scenes, I plan to create Metal buffers and textures on multiple threads and then use a compute shader on the main thread to copy the Metal resources into RealityKit, minimizing main-thread usage. Given that most of RealityKit's default APIs must run on the main actor (main thread), they are not ideal for streaming data. Is this approach the best way to handle streaming data and real-time rendering?
Thank you very much.
Hi, I am trying to detect whether all the screens are in full-screen mode.
My current approach is to get all windows' information from CGWindowListCopyWindowInfo and then compare each window's frame and coordinates with the frames of NSScreen.screens.
However, there is a problem: the y position of the window seems to be relative to the screen. Since it is not an absolute position, I cannot compare it with the screen's coordinates.
Does anyone know of other information I could use? Or is there a way to retrieve the absolute position or screen ID from the CGWindow object?
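In case it clarifies the question, here is a sketch of the comparison I have in mind, assuming the fix is flipping y against the primary screen's height (CG global coordinates being top-left-origin, NSScreen frames bottom-left-origin; the function names are mine):

import AppKit

// Convert a CGWindowList bounds rect (top-left origin) into the
// bottom-left-origin space that NSScreen frames use.
func cocoaFrame(fromCGBounds bounds: CGRect) -> CGRect {
    guard let primary = NSScreen.screens.first else { return bounds }
    let flippedY = primary.frame.height - bounds.origin.y - bounds.height
    return CGRect(x: bounds.origin.x, y: flippedY, width: bounds.width, height: bounds.height)
}

func isEveryScreenCoveredByAFullScreenWindow() -> Bool {
    guard let info = CGWindowListCopyWindowInfo([.optionOnScreenOnly, .excludeDesktopElements],
                                                kCGNullWindowID) as? [[String: Any]] else {
        return false
    }
    let windowFrames: [CGRect] = info.compactMap { entry in
        guard let dict = entry[kCGWindowBounds as String] as? NSDictionary,
              let bounds = CGRect(dictionaryRepresentation: dict as CFDictionary) else { return nil }
        return cocoaFrame(fromCGBounds: bounds)
    }
    // A screen counts as full screen if some window frame covers it exactly.
    return NSScreen.screens.allSatisfy { screen in
        windowFrames.contains { $0 == screen.frame }
    }
}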
I have a game that I uploaded to the App Store. My game's size is 3 gigabytes, but when I download it, the reported size is only about 100 megabytes. When I upload the game to Google Play, it shows the real size.
So I think the problem happens when it comes out of Xcode; maybe someone can give me a clue about what is going on.
My game was made with Unity 2020, if that helps.
Hello,
I created a new project with the provided template for Immersive Environments.
Straight out of the box, I built for both the Simulator and Vision Pro, and the provided environment looks like this.
What's interesting is that it looks correct in Reality Composer Pro, so how do I achieve the same look?
Thank you in advance!
So I've been trying out GPTK with the Elite Dangerous Horizons game, and from what I can tell, the VRAM usage keeps going up until it goes over the limit, at which point the FPS drops to 1-3 and the game crashes. From the Performance HUD I can see that when using GPTK, the VRAM usage just keeps climbing, and I never saw it drop at all. I did some limited testing, and from that I think I can conclude that it is probably not a VRAM leak but caching: I noticed that if I went back to an area I had visited before, the VRAM usage did not increase.
So either something is wrong with the VRAM-freeing part, or GPTK might not be reporting the right amount of available VRAM, which would explain why it keeps allocating VRAM until it runs out of memory and the game crashes.
Just to test, I ran the game with the DXVK+MoltenVK combo, and it works just fine: VRAM is freed when it is no longer used.
Is this a known issue in some games?
Starting with iOS 18.0 beta 1, I've noticed that RealityKit frequently crashes in the simulator when an app launches and presents an ARView.
I was able to create a small sample app with repro steps that demonstrates the issue, and I've submitted feedback: FB16144085
I've included a crash log with the feedback.
If possible, I'd appreciate it if an Apple engineer could investigate and suggest a workaround. It's awkward to be restricted to the iOS 17 simulator, which does not exhibit this behavior.
Please let me know if there's anything I can do to help.
Thank you.
The code is pretty simple:
kernel void naive(
    constant RunParams *param [[ buffer(0) ]],
    const device float *A [[ buffer(1) ]], // [N, K]
    device float *output [[ buffer(2) ]],
    uint2 gid [[ thread_position_in_grid ]]) {
    float val = 0.0f;       // accumulator; the original snippet omitted this declaration
    uint a_ptr = gid.x * param->K;
    for (uint i = 0; i < param->K; i++, a_ptr++) {
        val += A[a_ptr];    // the original read A[b_ptr]; a_ptr is clearly intended
    }
    output[gid.x] = val;    // the original wrote output[ptr]; gid.x assumed here
}
when uint a_ptr = gid.x * param->K, the code gets 150 GFLOPS
when uint a_ptr = gid.y * param->K, the code gets 860 GFLOPS
param->K = 256;
threads per group: [16, 16]
I'd like to understand why the performance is so different, and how I can profile/diagnose this to help with further optimization.
I am trying to install the Game Porting Toolkit 2.1 according to the Readme file provided with the toolkit.
When I run the following command:
WINEPREFIX=~/my-game-prefix `brew --prefix game-porting-toolkit`/bin/wine64 winecfg
I get an error message:
zsh: no such file or directory: /usr/local/opt/game-porting-toolkit/bin/wine64
I don't know how to resolve this.
When I type the command which brew, I get the path
/usr/local/bin/brew
What am I doing wrong?
In the SceneKit framework, I want to render a point cloud, and I need to set the minimumPointScreenSpaceRadius property of the SCNGeometryElement class, but it doesn't work. I searched for related issues on the internet but didn't find a solution.
Here is my code:
- (SCNGeometry *)createGeometryWithVector3Data:(NSData *)vData colorData: (NSData * _Nullable)cData{
int stride = sizeof(SCNVector3);
long count = vData.length / stride;
SCNGeometrySource *dataSource = [SCNGeometrySource geometrySourceWithData:vData
semantic:SCNGeometrySourceSemanticVertex
vectorCount:count
floatComponents:YES
componentsPerVector:3
bytesPerComponent:sizeof(float)
dataOffset:0
dataStride:stride];
SCNGeometryElement *element = [SCNGeometryElement geometryElementWithData:nil
primitiveType:SCNGeometryPrimitiveTypePoint
primitiveCount:count
bytesPerIndex:sizeof(int)];
element.minimumPointScreenSpaceRadius = 1.0; // has no effect
element.maximumPointScreenSpaceRadius = 20.0;
SCNGeometry *pointCloudGeometry;
SCNGeometrySource *colorSource;
if (cData && cData.length != 0) {
colorSource = [SCNGeometrySource geometrySourceWithData:cData
semantic:SCNGeometrySourceSemanticColor
vectorCount:count
floatComponents:YES
componentsPerVector:3
bytesPerComponent:sizeof(float)
dataOffset:0
dataStride:stride];
pointCloudGeometry = [SCNGeometry geometryWithSources:@[dataSource, colorSource] elements:@[element]];
} else {
pointCloudGeometry = [SCNGeometry geometryWithSources:@[dataSource] elements:@[element]];
}
return pointCloudGeometry;
}
Hi,
Introducing Swift Concurrency to my Metal app has been a bit challenging, as Swift Concurrency is limited by the cooperative thread pool.
GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks.
My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to schedule Metal-bound tasks for maximum performance?
To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best way to bridge async tasks with Metal work?
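For what it's worth, here is a minimal sketch of the continuation idea (the extension and its name are mine, not an official API), assuming the command buffer is fully encoded before the call. Resuming a continuation from the completed handler avoids parking a cooperative thread the way waitUntilCompleted would:

import Metal

extension MTLCommandBuffer {
    // Commits the buffer and suspends until the GPU signals completion.
    // The handler must be registered before commit().
    func completed() async {
        await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
            self.addCompletedHandler { _ in continuation.resume() }
            self.commit()
        }
    }
}

Usage would then be await commandBuffer.completed() from an async context, which keeps the cooperative pool free to run other tasks while the GPU works.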
Thanks!
Hi everyone,
I'm using the Vision framework’s ImageAestheticsScoresObservation class (https://developer.apple.com/documentation/vision/imageaestheticsscoresobservation).
I noticed that the returned overallScore sometimes has negative values. Could someone confirm whether the expected range of the score is from -1.0 to 1.0?
The documentation doesn’t explicitly state the possible score range, so I’d appreciate any clarification or insights.
Thanks in advance!
Hello there,
I'm having trouble matching what I see in the SceneKit editor with the output of the resulting scene in an SCNView.
For a glitter effect I have set a high value on the diffuse intensity, which looks fine in the editor, but when running the game the colors are much darker. To see whether the intensity value is merely capped, I set the same multiplier on the hat below - but it is blown out, which looks to me like there is some grading going on.
I have tried switching on HDR rendering, but that didn't make a difference.
I tried disabling linear rendering, and that simply made everything darker still - which I expected.
Does someone have an idea what else this could be? What rendering is the SceneKit editor using, and how can I match it?
Interestingly, when I took a screenshot of the editor window for this post, the image also came out blown out... what is going on? :)
Thanks so much for any pointers,
Seb
My app is being rejected and all I'm being told is that it is spam.
I've tried improving various aspects of the game, but I just receive the same copy and paste rejection message each time.
I have no idea if I'm moving in the right direction or what part of my game needs to be changed or improved. Is there a game quality benchmark document or some other resource I can use to better understand why my game is being rejected and how to bring it to a level that meets Apple's standards?
I'm implementing optimized matmul on metal: https://github.com/crynux-ai/metal-matmul/blob/main/metal/1_shared_mem.metal
I notice that performance differs significantly with the threadgroup memory length set in
[computeEncoder setThreadgroupMemoryLength:atIndex:]
All other lines are exactly same, the only difference is this parameter.
Matmul performance is roughly 250 GFLOPS if I set 32768 (the max bytes allowed on this M1 Max),
but 400 GFLOPS if I set 8192.
Why does this happen? How can I optimize it?
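One hedged guess on my side is occupancy: reserving more threadgroup memory per threadgroup can reduce how many threadgroups are resident per GPU core at once. Here is a small Swift sketch of the pipeline properties I have been inspecting to check this ("matmul_shared" is a placeholder for the actual kernel name):

import Metal

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let function = library.makeFunction(name: "matmul_shared")!  // placeholder name
let pipeline = try! device.makeComputePipelineState(function: function)

// These bound the launch configuration and hint at per-core residency
// as per-threadgroup resource usage grows.
print("maxTotalThreadsPerThreadgroup:", pipeline.maxTotalThreadsPerThreadgroup)
print("threadExecutionWidth:", pipeline.threadExecutionWidth)
print("staticThreadgroupMemoryLength:", pipeline.staticThreadgroupMemoryLength)
print("device maxThreadgroupMemoryLength:", device.maxThreadgroupMemoryLength)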
Hello!
We are working on a real-time 2-player online game targeting multiple Apple devices.
The following issue only occurs on tvOS:
When selecting matchmaking to connect with another random player, the native Game Center interface opens and begins the matchmaking process.
Almost immediately after clicking "start", the following log appears in the console, and the matchmaking screen remains indefinitely without completing:
Timeout while starting matching with request: <GKMatchRequestInternal 0x30d62f690> {
defaultNumberOfPlayers : 0
isLateJoin : 0
localPlayerID : U:bea182d69b85f0839e3958742fbc4609
matchType : 0
maxPlayers : 2
minPlayers : 2
playerAttributes : 4294967295
playerGroup : 1
preloadedMatch : 0
recipientPlayerIDs : <__NSArrayM 0x3034ed5c0> {}
recipients : <__NSArrayM 0x3034ee280> {}
restrictToAutomatch : 0
version : 1
archivedSharePlayInviteeTokensFromProgrammaticInvite, inviteMessage, localizableInviteMessage, messagesBasedRecipients, properties, queueName, recipientProperties, rid, sessionToken : (null)
} . Error: (null)
However, as shown in the code snippet below, the task does not complete when the log appears. But when we manually cancel the matchmaking process, the "User cancel" log is correctly triggered.
var gkMatchRequest = GKMatchRequest.Init();
gkMatchRequest.MinPlayers = 2;
gkMatchRequest.MaxPlayers = 2;
var matchRequestTask = GKMatchmakerViewController.Request(gkMatchRequest);
matchRequestTask.ContinueWith(t => { Debug.LogException(t.Exception); }, TaskContinuationOptions.OnlyOnFaulted);
matchRequestTask.ContinueWith(t => { Debug.Log("User cancel"); }, TaskContinuationOptions.OnlyOnCanceled);
matchRequestTask.ContinueWith(t => { Debug.Log("Success"); }, TaskContinuationOptions.OnlyOnRanToCompletion);
We have tested this on multiple Apple TV and network types (Wi-Fi, 5G, Ethernet), but we consistently encounter this bug along with the same log message.
Could you please help us understand or resolve this issue?
Thank you.
Dear Apple, there is still no support for WebGPU via WebView. Do you have a roadmap for when you plan to support it? That would be very helpful for releasing our new game.