Posts under Developer Tools & Services topic

Post · Replies · Boosts · Views · Activity
A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?
When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this is governed by your ChatGPT account settings, which let you opt out (sharing defaults to on). When using Xcode with accounts for other model providers, you should check your provider's policies. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same as a debug build, but maybe there's a trick.
Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build. The new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those build phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files, as that can dramatically improve your build performance.

Are we able to provide additional context for the models, like coding standards? Documentation for third-party dependencies? Documentation on your own codebase that explains things like architecture and more?
In general, Xcode will automatically search for the right context based on the question and the evolving answer, as the model can interact multiple times with your project while it develops an answer. This automatically picks up the coding style of the code it sees, and can include files that contain architecture comments, etc. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context like rule files, etc., though Xcode does not support these at this time.

Once ChatGPT is enabled for Coding Intelligence in Xcode 26, and I sign into my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations on Xcode development done previously in the ChatGPT Mac app?
Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?
SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why it isn't doing what you want. This year, we introduced a SwiftUI Instrument that shows why things are happening, like view re-rendering.

Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.
Yes, Xcode 26 supports logging into any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?
Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, and compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can use it to influence the translucency, blur, and specular highlights for your icon.

What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?
One feature we're particularly excited about is the new power profiler for iOS, which gives you further insight into the energy consumption of your app beyond what was possible with the energy instrument previously. You can learn more about how to use this instrument and how it can help you greatly reduce your app's battery usage in the documentation, as well as in the session "Profile and optimize power usage in your app". There were also improvements in accessibility this year with Voice Control, where you can naturally speak your Swift code to Xcode, and it understands the Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26".

We have a software advisory council that is very sensitive to having our private information going to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?
One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can point Xcode at your proxy endpoint, inspect the network traffic for anything you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider. (A minimal sketch of the request shape such an endpoint handles appears at the end of this post.)

Is there a list of recommended LLMs to use with Xcode via Intelligence/Local? I've tried Gemma3-12B, but I hope there are better options.
Apple doesn't have a published list of recommended local models. This is a fast-moving space, and a recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know!

(continued below)
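As a rough illustration of the proxy/local-server route mentioned above, the snippet below sends a single OpenAI-style Chat Completions request with URLSession so you can see exactly what would pass through such an endpoint; the host, port, path, and model name are placeholder assumptions, not values Xcode requires.

import Foundation

// Placeholder endpoint for a local server or enterprise proxy that speaks the
// Chat Completions REST API; substitute your own host, port, and path.
let endpoint = URL(string: "http://localhost:8080/v1/chat/completions")!

var request = URLRequest(url: endpoint)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

// Minimal chat payload; "local-model" is a stand-in for whatever model the server exposes.
let payload: [String: Any] = [
    "model": "local-model",
    "messages": [["role": "user", "content": "Say hello in one short sentence."]]
]
request.httpBody = try? JSONSerialization.data(withJSONObject: payload)

// Print the raw JSON reply so the traffic passing through the proxy can be inspected.
URLSession.shared.dataTask(with: request) { data, _, error in
    if let error {
        print("Request failed:", error)
    } else if let data, let body = String(data: data, encoding: .utf8) {
        print(body)
    }
}.resume()

RunLoop.main.run(until: Date().addingTimeInterval(10)) // keep a command-line run alive for the reply

Pointing a request like this at the proxy first makes it easy to confirm which headers and message content are forwarded before wiring the same URL into a model-provider configuration.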
1 reply · 0 boosts · 807 views · Jul ’25
Development and testing on different machines
Hi developers, I'm looking for a workflow where I develop my apps on one machine and do testing and final builds on another. For development I have a MacBook Pro M4, and I want to offload testing to a Mac mini M1. I searched for a solution and also contacted support, but the answer wasn't really helpful. Any ideas on how to set up this configuration and automate these kinds of tests? Thanks a lot!
0 replies · 0 boosts · 340 views · Jan ’25
Developer Program Enrollment Payment debited but no account yet
I recently enrolled in the Developer Program and made the payment on the Apple Developer website. I saw the payment amount debited from my account, but the developer program enrollment has not been approved. When I contacted Developer Support, they said Apple did not receive the money and that I have to enroll again. So I asked my bank to confirm whether the payment was processed. The bank said it had already been processed and even sent me a reference number and an approval number. I sent screenshots of that information, but the support agent keeps saying they did not receive it, without offering any helpful information to troubleshoot the issue. I am really amazed that a company such as Apple has such terrible customer support. Now I have to open a payment dispute with my bank. If I do not get my money back, I will tag the Apple CEO on x.com and post about it every day with a scheduler script.
0 replies · 0 boosts · 367 views · Dec ’24
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).

What I have so far:
- Successfully capturing the mesh using ARMeshAnchor.
- Converting the mesh into an MDLAsset and exporting an .obj file.
I need help generating the .jpg texture and linking it to the .mtl file.

// Requires: import ARKit, MetalKit, ModelIO (arView and showScanPreview live in the surrounding view controller)
private func exportScannedObject() {
    guard let camera = arView.session.currentFrame?.camera else { return }

    func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }
        let asset = MDLAsset()
        for anchor in meshAnchors {
            let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)
            // Apply a gray material to the mesh
            let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
            material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color
            if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                for submesh in submeshes {
                    submesh.material = material
                }
            }
            asset.add(mdlMesh)
        }
        return asset
    }

    func export(asset: MDLAsset) throws -> URL {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let url = directory.appendingPathComponent("scanned.obj")
        if MDLAsset.canExportFileExtension("obj") {
            do {
                try asset.export(to: url)
                return url
            } catch let error {
                fatalError(error.localizedDescription)
            }
        } else {
            fatalError("Can't export OBJ")
        }
    }

    if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
       let asset = convertToAsset(meshAnchors: meshAnchors) {
        do {
            let url = try export(asset: asset)
            showScanPreview(url)
        } catch {
            print("export error")
        }
    }
}

extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
        return vertex
    }

    // helper adapted from StackOverflow:
    // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
    func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
        // Bake each anchor-local vertex into world space before building the Model I/O mesh.
        func convertVertexLocalToWorld() {
            let verticesPointer = vertices.buffer.contents()
            for vertexIndex in 0..<vertices.count {
                let vertex = self.vertex(at: UInt32(vertexIndex))
                var vertexLocalTransform = matrix_identity_float4x4
                vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3
                let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                let componentStride = vertices.stride / 3
                verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
            }
        }
        convertVertexLocalToWorld()

        let allocator = MTKMeshBufferAllocator(device: device)
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)
        let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                           vertexCount: vertices.count,
                           descriptor: vertexDescriptor,
                           submeshes: [submesh])
        return mesh
    }
}

What I need help with:
- How do I generate the JPG texture from the AR scene?
- How do I save an MTL file linking the OBJ model to the texture?
- How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?
I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
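For the MTL side of this, a minimal sketch of the sidecar file an external viewer expects might look like the following; the material name (scanMaterial), file names (scanned.mtl, scan_texture.jpg), and directory are hypothetical placeholders, and the OBJ still needs matching mtllib/usemtl lines plus UV coordinates before any texture will show up.

import Foundation

// Sketch: write a minimal Wavefront .mtl file that binds a diffuse JPEG texture
// to a named material. Names here are placeholders, not output of the code above.
func writeMaterialFile(in directory: URL,
                       textureFileName: String = "scan_texture.jpg") throws -> URL {
    let mtl = """
    newmtl scanMaterial
    Ka 1.000 1.000 1.000
    Kd 1.000 1.000 1.000
    map_Kd \(textureFileName)
    """
    let url = directory.appendingPathComponent("scanned.mtl")
    try mtl.write(to: url, atomically: true, encoding: .utf8)
    return url
}

The exported OBJ would then start with an "mtllib scanned.mtl" line and declare "usemtl scanMaterial" before its faces; producing the JPEG itself (by projecting captured camera frames onto the mesh's UVs) is the part this sketch does not cover.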
0 replies · 0 boosts · 445 views · Feb ’25
Adding Fonts in Xcode 16.1 Causes XIB Files to Malfunction
I encountered a problem after adding a new custom font in Xcode 16.1. After including the font and opening my XIB files, the interface preview went blank and the application seemed to come under heavy load. To troubleshoot, I removed all custom fonts, and everything returned to normal. However, even after reinstalling Xcode, the issue reappeared as soon as I added the font again. Before the change, the XIB preview loaded correctly; after adding the font, the XIB preview turned blank and became unresponsive.
0 replies · 0 boosts · 527 views · Dec ’24
Missing Developer Kit for build 22H417
I cannot find the KDK for my build 22H417 and need help locating and downloading this Developer Kit.

Error Domain=KMErrorDomain Code=34 "Missing Developer Kit: As of macOS 13.0, you will need to install a KDK matching your build 22H417 to rebuild kernel collections." UserInfo={NSLocalizedDescription=Missing Developer Kit: As of macOS 13.0, you will need to install a KDK matching your build 22H417 to rebuild kernel collections.}
0 replies · 0 boosts · 327 views · Feb ’25
Unable to deploy app from Visual Studio on Windows 11 to iOS device
I am trying to deploy my app to an iOS device (iPhone 14) from Visual Studio on Windows 11. If the device I am deploying to is included in https://developer.apple.com/account/resources/devices/list, I see the error below in the Visual Studio logs:

Xamarin.Messaging.IDB.AppleProvisioningManager Error: 0 : Xamarin.MacDev.AppleSigning.AppleServerException: A device with number '0000xxxx-0014093926Bxxxx' already exists on this team. at Xamarin.MacDev.AppleSigning.AppStoreDeveloperPortal.d__42.MoveNext() in D:\a_work\1\s\External\maciostools\Xamarin.MacDev.AppleSigning\AppleDeveloperPortal\AppStoreDeveloperPortal.cs:line 913

If I disable the device, I see the error below in the Visual Studio logs:

Xamarin.Messaging.Client.MessagingClient Error: 0 : An error occurred on the receiver while executing a post for topic xvs/idb/auto-provision and client vs26896sv3 Xamarin.Messaging.Exceptions.MessagingRemoteException: An error occurred on client xxxxxxx while executing a reply for topic xvs/idb/auto-provision ---> Newtonsoft.Json.JsonSerializationException: Error converting value {null} to type 'System.DateTime'. Path 'data.attributes.addedDate', line 6, position 24

I see no option to completely remove the device from the list. How can this issue be fixed?
0 replies · 0 boosts · 262 views · Jan ’25
Local notifications in Flutter work in debug mode but not in Release mode on iOS (pub package used: awesome_notifications)
When we tested local notifications in debug mode with Background Fetch, Audio, Remote Notifications, and Push Notifications enabled under Signing & Capabilities, everything worked fine. We need this package to work in background mode, i.e., when the app is idle on iOS. This functionality is required so our application can send notifications when it receives a signal from the signalr package on the server, even while the app is idle. It worked perfectly when connected in debug mode, and we proceeded to a TestFlight update, but the app from TestFlight didn't send notifications when the screen was off or the app was in the background.
0 replies · 0 boosts · 228 views · Jan ’25
Is it possible to send pushes through the Apple production server to an app running in Xcode?
I can successfully send pushes to an app (which has been installed/run via Xcode) when the pushes go through the Apple sandbox server. However, I want to test that the server is configured correctly to send them through the Apple production server. In the Xcode scheme I tried changing the build configuration to Release (and unticking Debug executable), however the pushes still only work when sent through the sandbox. Is there a way of installing/running the app using Xcode such that it's compatible with the push production environment? Does the APS Environment entitlement come into play here? It only ever says development. (The app is on behalf of a 3rd-party company; they've added me to their Apple Developer account but with limited powers, so I can't upload to TestFlight nor make an ad-hoc release to test with.)
0 replies · 0 boosts · 285 views · Jan ’25
How does DerivedData really work on Xcode Cloud?
Currently I'm trying to save a few files in it, but on every run the folder is empty. I have the following script in ci_post_clone.sh:

mkdir ${CI_DERIVED_DATA_PATH}
cd ${CI_DERIVED_DATA_PATH}
ls -als
touch test
return 1

My expectation is that on the second run it would show the test file in DerivedData, or fail at creating the directory, but the file is not created. Does it need a successful build for this folder to be saved? In the Xcode Cloud workflow environment I have unchecked the "Clean" build option ("Xcode Cloud will not restore derived data or caches for your builds, which may take significantly longer as a result."). One more question: what is meant by caches? Are there other folders being saved?

Also a bit of context: I'm trying to build a Kotlin Multiplatform project, but it fails with

Showing All Issues
> Could not resolve all files for configuration ':composeApp:iosArm64CompileKlibraries'.
> Could not download lifecycle-viewmodel.klib (androidx.lifecycle:lifecycle-viewmodel-iosarm64:2.9.0-alpha03)
> Could not get resource 'https://dl.google.com/dl/android/maven2/androidx/lifecycle/lifecycle-viewmodel-iosarm64/2.9.0-alpha03/lifecycle-viewmodel-iosarm64-2.9.0-alpha03.klib'.
> Could not GET 'https://dl.google.com/dl/android/maven2/androidx/lifecycle/lifecycle-viewmodel-iosarm64/2.9.0-alpha03/lifecycle-viewmodel-iosarm64-2.9.0-alpha03.klib'.
> Got socket exception during request. It might be caused by SSL misconfiguration
> Connection reset by peer

My guess is that Xcode Cloud or Google's servers have some limitation, so if I cache those libraries my project will hopefully compile again.
0 replies · 1 boost · 523 views · Dec ’24
Multiple issues on Xcode Cloud, WeatherKit and TestFlight after changing bundle id
After changing the bundle identifier of my app, I've encountered several issues that I can't seem to resolve, even though I've followed all the necessary steps. The app with the previous identifier was live on TestFlight and working perfectly fine, but now I'm facing the following problems:

WeatherKit Authentication Issue: WeatherKit has stopped working, and I'm getting authentication errors. I've updated the app in the Developer Portal to reflect the new bundle ID, but it still doesn't authenticate properly.

Xcode Cloud Configuration Issue: Xcode is asking me to set up Xcode Cloud again, which I understand is expected after a bundle ID change. However, during the setup process, it fails to recognize my remote repository, even though the repository is correctly added and visible under the Source Control tab.

TestFlight Internal Testing Issue: I manually uploaded a build to TestFlight, but internal testers cannot use it because the invitation appears as invalid. This wasn't an issue with the app's previous identifier.

It seems like the bundle ID change has caused some fundamental issues that I can't resolve despite following all the usual instructions. Has anyone experienced this before or knows how to resolve these problems? I'm using the latest Xcode 16.2 on macOS Sequoia 15.2.
0 replies · 0 boosts · 533 views · Jan ’25
FairPlay 4.x Certificate Revocation
I created a fairplay.cer file using the commands below:

openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr

Here, I manually entered the Country, Organization, etc. I was supposed to use the commands below instead:

openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"

Owing to this, I am unable to create a .p12 file through Keychain Access. I therefore want to generate a new fairplay.cer file for FairPlay 4.x, and I want to revoke the existing certificate in order to generate a new one (as there is a limit of one certificate for FairPlay). Requesting developer support from Apple. I have raised multiple requests over the past 4 days.
0 replies · 0 boosts · 400 views · Dec ’24
Delay in Apple Developer Account Processing – No Response from Support
Hello, I made a payment for an Apple Developer account on January 22, 2025, but the processing has not been completed yet. According to Apple's stated timeline, the account should be processed within two business days, but it has now been significantly delayed. I have already sent a follow-up email and a message via the support form, but I have not received any response. I would appreciate any guidance from the community or Apple representatives on how to proceed. Has anyone else faced a similar delay recently? Is there any alternative way to escalate this issue? Looking forward to any insights. Ali
0 replies · 1 boost · 230 views · Feb ’25
How to make the Instruments App Launch template wait for a process to launch, rather than launching it actively from Instruments
Hi there, how can I make the Instruments App Launch template wait for a process to launch, rather than having Instruments launch it actively? I need to profile app launch performance when tapping a push notification or cold-launching via a URL link. How do I get Instruments to wait for the process and collect the data? Currently, I tried the command

xctrace record --template "App Launch" --attach MyApp --device-name 'Phone-Dev' --output mytrace.trace

but it soon failed with 'Cannot find process matching name: MyApp'. How can I make it work?
0 replies · 0 boosts · 509 views · Jan ’25