Posts under Developer Tools & Services topic

Post · Replies · Boosts · Views · Activity

A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Q: Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?
A: When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this depends on your ChatGPT account settings, which let you opt out (training defaults to on). When using Xcode with accounts for other model providers, check the policies of your provider. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

Q: We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same as a debug build, but maybe there's a trick.
A: Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build, and the new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those build phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files; that is one of the most effective ways to improve your build performance.

Q: Are we able to provide additional context for the models, like coding standards, documentation for third-party dependencies, or documentation on your own codebase that explains things like architecture?
A: In general, Xcode will automatically search for the right context based on the question and the evolving answer, as the model can interact multiple times with your project while it develops an answer. This automatically picks up the coding style of the code it sees, and can include files that contain architecture comments, etc. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context like rule files, though Xcode does not support these at this time.

Q: Once ChatGPT is enabled for Coding Intelligence in Xcode 26 and I sign in to my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations on Xcode development done previously in the ChatGPT Mac app?
A: Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Q: Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?
A: SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why a view isn't doing what you want. This year, we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Q: Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.
A: Yes, Xcode 26 supports logging in to any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular Chat Completions REST API to talk to your enterprise account however you need.

Q: Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?
A: Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, and compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can use it to influence the translucency, blur, and specular highlights of your icon.

Q: What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?
A: One feature we're particularly excited about is the new power profiler for iOS, which gives you further insight into the energy consumption of your app beyond what was possible with the Energy instrument previously. You can learn more about how to use this instrument and how it can help you greatly reduce your app's battery usage in the documentation, as well as in the session "Profile and optimize power usage in your app". There were also improvements in accessibility this year with Voice Control: you can naturally speak your Swift code to Xcode, and it understands Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26".

Q: We have a software advisory council that is very sensitive to having our private information go to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?
A: One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can point Xcode at your proxy endpoint, inspect the network traffic for anything that you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Q: Is there a list of recommended LLMs to use with Xcode via Intelligence/Local? I've tried Gemma3-12B, but I hope there are better options.
A: Apple doesn't have a published list of recommended local models. This is a fast-moving space, so any recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know!

(continued below)
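Two of the answers above point at the OpenAI-style Chat Completions REST API as the integration seam for enterprise proxies and local servers. As a rough illustration of the request shape such a server has to accept and forward, here is a minimal Swift sketch; the endpoint URL and model name are placeholders, not real Apple or OpenAI endpoints.

import Foundation

// Minimal sketch of an OpenAI-style Chat Completions request: the shape a
// custom enterprise proxy would receive and forward upstream.
struct ChatMessage: Codable {
    let role: String      // "system", "user", or "assistant"
    let content: String
}

struct ChatCompletionRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

func sendThroughProxy() async throws {
    let body = ChatCompletionRequest(
        model: "your-chosen-model", // placeholder
        messages: [ChatMessage(role: "user", content: "Explain this Swift error.")]
    )
    // Placeholder proxy endpoint; the proxy can log or filter the body
    // before forwarding it to the actual model provider.
    var request = URLRequest(url: URL(string: "https://proxy.example.internal/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(body)

    let (data, _) = try await URLSession.shared.data(for: request)
    print(String(data: data, encoding: .utf8) ?? "")
}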
Replies: 1 · Boosts: 0 · Views: 997 · Activity: Jul ’25
Requesting permission for MusicKit in Xcode Cloud
I am experimenting with Swift Testing and Xcode Cloud and would like to write some tests that require MusicKit functionality. For example, I'd like to fetch an album via MusicCatalogResourceRequest to test an initializer of another struct. However, this test fails because permission to access the music library is not granted. Once the permission is granted, the test works as expected.

Things I have tried:

Adding NSPrivacyAccessedAPITypes to the Info.plist. This did not show any effect. Below is the corresponding snippet.
Tapping the button programmatically. Once again, this did not show any effect.

The Info.plist snippet:

<key>NSPrivacyAccessedAPITypes</key>
<array>
    <string>NSPrivacyAccessedAPIMediaLibrary</string>
</array>

The code snippet to tap the button:

let systemAlerts = XCUIApplication(bundleIdentifier: "com.apple.springboard")
let allowButton = systemAlerts.buttons["Allow"]
if allowButton.exists {
    allowButton.tap()
}

What am I doing wrong here? I need access to MusicKit functionality to write meaningful tests. Thank you
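For the alert-tapping approach, XCTest's interruption-monitor API is the usual mechanism for system permission alerts. A minimal sketch follows; it assumes a UI test target (it does not apply to plain Swift Testing unit tests) and is untested against Xcode Cloud.

import XCTest

final class MusicKitPermissionUITests: XCTestCase {
    func testWithPermissionAlertHandler() {
        // Register a handler that taps "Allow" whenever a system alert appears.
        addUIInterruptionMonitor(withDescription: "Media Library Permission") { alert in
            let allowButton = alert.buttons["Allow"]
            if allowButton.exists {
                allowButton.tap()
                return true
            }
            return false
        }

        let app = XCUIApplication()
        app.launch()
        // Interruption monitors only fire when the test next interacts with
        // the app, so perform an interaction after triggering the request.
        app.tap()
    }
}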
Replies: 0 · Boosts: 0 · Views: 429 · Activity: Feb ’25
Unable to Access Organization Developer Account – “Join Now” Appears
Our organization has an active Apple Developer membership, but since last month we've been unable to access our team information or manage certificates. Our apps are still active in App Store Connect, but we cannot update them. On the Developer account page, a "Join Now" button appears, but clicking it results in the message: "Your Apple Account is already associated with the account holder of a membership." Additionally, this account was originally registered with an older email address when it was used to create a personal developer account in 2015, nearly 10 years ago; we do not know if this is causing the issue. We have contacted Apple many times but have had no reply in over a month. Any guidance or assistance would be greatly appreciated. Thanks!
Replies: 0 · Boosts: 0 · Views: 290 · Activity: Feb ’25
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).

What I have so far:

Successfully capturing the mesh using ARMeshAnchor.
Converting the mesh into an MDLAsset and exporting .obj.

I need help generating the .jpg texture and linking it to the .mtl file.

private func exportScannedObject() {
    guard let camera = arView.session.currentFrame?.camera else { return }

    func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }
        let asset = MDLAsset()
        for anchor in meshAnchors {
            let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)
            // Apply a gray material to the mesh
            let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
            material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5)))
            if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                for submesh in submeshes {
                    submesh.material = material
                }
            }
            asset.add(mdlMesh)
        }
        return asset
    }

    func export(asset: MDLAsset) throws -> URL {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let url = directory.appendingPathComponent("scanned.obj")
        if MDLAsset.canExportFileExtension("obj") {
            do {
                try asset.export(to: url)
                return url
            } catch {
                fatalError(error.localizedDescription)
            }
        } else {
            fatalError("Can't export OBJ")
        }
    }

    if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
       let asset = convertToAsset(meshAnchors: meshAnchors) {
        do {
            let url = try export(asset: asset)
            showScanPreview(url)
        } catch {
            print("export error")
        }
    }
}

extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        return vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
    }

    // Based on this Stack Overflow answer:
    // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
    func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
        func convertVertexLocalToWorld() {
            let verticesPointer = vertices.buffer.contents()
            for vertexIndex in 0..<vertices.count {
                let vertex = self.vertex(at: UInt32(vertexIndex))
                var vertexLocalTransform = matrix_identity_float4x4
                vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3
                let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                let componentStride = vertices.stride / 3
                verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
            }
        }
        convertVertexLocalToWorld()

        let allocator = MTKMeshBufferAllocator(device: device)
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)
        return MDLMesh(vertexBuffer: vertexBuffer, vertexCount: vertices.count, descriptor: vertexDescriptor, submeshes: [submesh])
    }
}

What I need help with:

How do I generate the JPG texture from the AR scene?
How do I save an MTL file linking the OBJ model to the texture?
How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?

I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
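For the MTL half of the question, the file format is plain text. A minimal sketch of writing a .mtl that links the OBJ to a JPG texture follows; the material and file names are placeholders, and producing the JPG itself from the AR scene remains the open problem.

import Foundation

// Writes a minimal Wavefront .mtl file pointing a material at a JPG texture.
// The OBJ must reference it via "mtllib scanned.mtl" and "usemtl ScannedMaterial"
// for external viewers to pick up the texture.
func writeMTL(to directory: URL, textureFileName: String = "scanned_texture.jpg") throws -> URL {
    let mtl = """
    newmtl ScannedMaterial
    Ka 1.000 1.000 1.000
    Kd 1.000 1.000 1.000
    map_Kd \(textureFileName)
    """
    let url = directory.appendingPathComponent("scanned.mtl")
    try mtl.write(to: url, atomically: true, encoding: .utf8)
    return url
}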
Replies: 0 · Boosts: 0 · Views: 483 · Activity: Feb ’25
Problem installing the Apple Watch app: it stays stuck on "Installing"
We recently developed an Apple Watch app. We package the iPhone app together with the watch app and publish it on TestFlight for test downloads. However, we often find that after the iPhone app is downloaded, the watch app appears in the Watch app's list of available applications, but tapping Install leaves it stuck installing indefinitely; it never finishes. This happens especially when the test phone has been paired with several watches. Deleting the iPhone app and downloading it again sometimes recovers the situation, but in some cases it still cannot be recovered. Why does it stay stuck on installing? We have confirmed that there is no problem with the network, and the pairing connection between the watch and the phone is normal. This problem has caused us a lot of trouble, and we wonder whether it will also happen in the App Store. Please give an official reply.
Replies: 0 · Boosts: 0 · Views: 429 · Activity: Feb ’25
Apple Developer Program Enrollment
Hi. For over 1.5 months, I have been trying to enroll in the Apple Developer Program for my company. A couple of times, I was informed that I have missing documents, as shown below:

• Applicant's employment verification
• Applicant's employee badge or business card
• Business documents for COMPANY_NAME

I have provided employment verification and business documents, but I don't have an employee badge or business card, so instead I provided a photo of my own ID card. Apple keeps sending me the same email about missing documents. What am I missing here? What should I do if I don't have an employee badge or business card? Thank you
Replies: 0 · Boosts: 0 · Views: 396 · Activity: Feb ’25
Getting an error in Xcode after resetting the simulator
I just erased the data on the simulator and then started getting the error below when building in Xcode:

Entitlements file "Clinic.entitlements" was modified during the build, which is not supported. You can disable this error by setting 'CODE_SIGN_ALLOW_ENTITLEMENTS_MODIFICATION' to 'YES', however this may cause the built product's code signature or provisioning profile to contain incorrect entitlements.
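For reference, the workaround named in the error message is an ordinary build setting. A sketch of applying it in an .xcconfig file, with the error's own caveat kept in mind:

// Debug.xcconfig -- workaround only; per the error message, this may cause
// the built product's code signature or provisioning profile to contain
// incorrect entitlements.
CODE_SIGN_ALLOW_ENTITLEMENTS_MODIFICATION = YES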
Replies: 0 · Boosts: 0 · Views: 324 · Activity: Feb ’25
(React-Native Expo) Add Dependency for Local Native Module
I am converting a project to Expo and have created a new Expo project. I have migrated most of the React Native code but need to add a native module. I added it using:

npx create-expo-module expo-settings --local

The name of the module is DataRetrieval. So far so good, but I need the package SwiftCSV. I added it as a dependency to Pods and ran npx pod-install, but when I try to import SwiftCSV in the subproject, it is not found. So I tried adding s.dependency 'SwiftCSV' to the DataRetrieval podspec; I then get an error about redefined symbols. I am able to import this package in a regular Swift file, but not in the sub-module under Expo. What am I missing about how to add a native module, add dependencies to it, and include it in my project? Thanks, Ray
Replies: 1 · Boosts: 0 · Views: 344 · Activity: Feb ’25
Issue with Module Import and Archiving a Mixed Swift/C Library Using Swift Package Manager
Hello everyone, I'm encountering an issue when trying to build and archive my library BleeckerCodesLib using Swift Package Manager. My project is structured with two targets:

CBleeckerLib: A C target that contains my image processing code (C source files and public headers).
BleeckerCodesLib: A Swift target that depends on CBleeckerLib and performs an import CBleeckerLib.

Below is the relevant portion of my Package.swift:

// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "BleeckerCodesLib",
    platforms: [.iOS(.v16)],
    products: [
        .library(name: "BleeckerCodesLib", targets: ["BleeckerCodesLib"])
    ],
    targets: [
        .target(
            name: "CBleeckerLib",
            publicHeadersPath: "include"
        ),
        .target(
            name: "BleeckerCodesLib",
            dependencies: ["CBleeckerLib"]
        )
    ]
)

Directory Structure

My project directory looks like this:

BleeckerCodesLib/
├── BleeckerCodesLib.xcodeproj/
│   └── xcuserdata/
│       └── robertopitarch.xcuserdatad/
│           └── xcschemes/
│               └── xcschememanagement.plist
├── BleeckerCodesLib.h
├── Package.swift
└── Sources/
    ├── CBleeckerLib/
    │   ├── bleecker-lib.c
    │   └── include/
    │       ├── bleecker-lib.h
    │       └── CBleeckerLib.h
    └── BleeckerCodesLib/
        ├── UIImage+Extensions.swift
        ├── ImageProcessingUtility.swift
        ├── APIManager.swift
        ├── BleeckerCodesLib.swift
        ├── CameraView.swift
        ├── RealTimeCameraView.swift
        └── BleeckerCameraWrapper.swift

Code Example

In my Swift code (for example, in BleeckerCodesLib.swift), I import the C module as follows:

import SwiftUI
import UIKit
import CBleeckerLib // Import the C module

public struct BleeckerCodes {
    public struct DetectedCode {
        public let code: String
        public let corners: [CGPoint]

        public init(code: String, corners: [CGPoint]) {
            self.code = code
            self.corners = corners
        }
    }

    // Initialization function
    public static func initializeLibrary() -> String {
        bleecker_init() // Call the C module function
        return "BleeckerCodesLibrary initialized!"
    }

    // ... other functions
}

The Problem

When I try to compile or archive the project using commands such as:

xcodebuild archive -project BleeckerCodesLib.xcodeproj -scheme BleeckerCodesLib -destination "generic/platform=iOS" -archivePath "archives/BleeckerCodesLib"

I receive the error:

"no such module 'CBleeckerLib'"

Any assistance or step-by-step guidance on resolving this integration issue would be greatly appreciated. Thank you in advance!
Replies: 0 · Boosts: 0 · Views: 258 · Activity: Feb ’25
Not Getting any update on Developer account
I am facing a problem. I enrolled in the developer program on 7th May '25. It said it would take 2 days, but I haven't received any confirmation. I submitted my information again on 11th February '25 but still haven't received any email about confirmation or rejection. I tried to contact Apple Support; they created a ticket for me, but I haven't received any further information from them. It feels like I am a ghost and Apple cannot see me. Can anyone tell me what the problem is? Thank you
Replies: 0 · Boosts: 0 · Views: 324 · Activity: Feb ’25
Why does the Unity application crash with KERN_PROTECTION_FAILURE in GC_clear_stack_inner?
Crash dump:

Crashed Thread:      0 tid_103 Dispatch queue: com.apple.main-thread
Exception Type:      EXC_BAD_ACCESS (SIGILL)
Exception Codes:     KERN_PROTECTION_FAILURE at 0x000000016d3bfea0
Exception Codes:     0x0000000000000002, 0x000000016d3bfea0
Termination Reason:  Namespace SIGNAL, Code 4 Illegal instruction: 4
Terminating Process: Unity [7873]

VM Region Info: 0x16d3bfea0 is in 0x169bbc000-0x16d3c0000; bytes after start: 58736288 bytes before end: 351
      REGION TYPE   START - END          [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
      mapped file   169b00000-169ba8000  [  672K] rw-/rwx SM=PRV Object_id=4d22156e
      GAP OF 0x14000 BYTES
--->  STACK GUARD   169bbc000-16d3c0000  [ 56.0M] ---/rwx SM=NUL stack guard for thread 0
      Stack         16d3c0000-16dbbc000  [ 8176K] rw-/rwx SM=SHM thread 0

Thread 0 Crashed:: tid_103 Dispatch queue: com.apple.main-thread
0   libsystem_platform.dylib   0x1932ee7ac _platform_memset + 108
1   libmonobdwgc-2.0.dylib     0x33977abdc GC_clear_stack_inner + 60
2   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
3   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
4   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
5   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
6   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
7   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
8   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
9   libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
10  libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
11  libmonobdwgc-2.0.dylib     0x33977abf8 GC_clear_stack_inner + 88
12  libmonobdwgc-2.0.dylib     0x33976b518 GC_clear_stack + 76
13  libmonobdwgc-2.0.dylib     0x33973c074 mono_gc_alloc_obj + 112
14  libmonobdwgc-2.0.dylib     0x3396e0db4 mono_object_new_specific_checked + 72
15  libmonobdwgc-2.0.dylib     0x3396e116c ves_icall_object_new_specific + 28
Replies: 2 · Boosts: 0 · Views: 435 · Activity: Feb ’25
Can't get Xcode to stop building x86_64 for Swift packages
I'm trying to improve my build time on macOS by not building for x86_64. I've configured the relevant build settings (screenshot omitted). This gets Xcode not to build x86_64 for my app, but not for all the package dependencies. I've updated most of the packages to swift-tools-version: 6.0, but FlatBuffers is still on 5.8 and .macOS(.v10_14). GPT claims: "If your deployment target is set to macOS 10.15 or earlier, Xcode may force x86_64 support for compatibility reasons." But Xcode is building x86_64 for ALL my packages, even the ones that don't depend on FlatBuffers. When I open a package that depends on FlatBuffers directly in Xcode, it builds arm64 only, so that may be a red herring. Not sure what else to try.
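For reference, the build settings usually involved in suppressing x86_64 look like this in .xcconfig form; a sketch of one common configuration, not a confirmed fix for the package behavior described above:

// Build only the architecture of the machine you're on for Debug builds.
ONLY_ACTIVE_ARCH = YES

// Exclude x86_64 entirely from the build.
EXCLUDED_ARCHS = x86_64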
Replies: 1 · Boosts: 0 · Views: 321 · Activity: Mar ’25
Xcode is throwing errors when I run my Unity build
When I try to build my project in Xcode (from a Unity AR project), it throws several errors (screenshots omitted). I feel like I've tried everything to make the LaunchScreen work. I downloaded Xcode the night I tried running this build, so the usual advice of deleting, redownloading, and restarting everything didn't help. I've also made sure my MacBook and Terminal are fully up to date. I literally can't find a solution! Please help! I will also say I'm fairly new to app building, Xcode, and Unity, but this seems like a barrier that is stopping me from testing my project.
Replies: 0 · Boosts: 0 · Views: 160 · Activity: Mar ’25
Changing Developer Account type from Organization (Business) to Individual
I have just started the process of closing down my limited company and would like to change my Apple Developer account type from Business (Organization) to Individual so I can release my apps under my own name. I was planning on releasing software in the App Store under my limited company, but I encountered an issue (https://developer.apple.com/forums/thread/759605) that has blocked development for almost a year, and Apple has never fixed it. This has forced me to close down the company. I have decided I will release the software as an open source project. All I want now is to be able to sign the software using my regular, personal Apple Account. However, my Apple Account is currently tied to my company's Organization developer account: it is registered as a business with business bank accounts, the App Store Connect agreements are signed by my company, and everything else on my developer profile is tied to that company. I do not see an option to close down the company/business account. The only relevant post I found online is here: https://developer.apple.com/forums/thread/702447. Unfortunately, people mention in the comments that the workaround does not actually work. I have tried reaching Developer Support but had no response via email, and I cannot schedule a callback via the Support page because I cannot verify my overseas phone number. It seems I have really tough luck with the Apple Developer ecosystem... Has anyone ever managed to do this? Any help much appreciated!
Replies: 2 · Boosts: 0 · Views: 328 · Activity: Feb ’25
XCFrameworks deploying .a / include to output target directory
Hello Friends, We have a strange bug where Xcode is deploying static binaries from within an .xcframework into the CONFIGURATION_BUILD_DIR. An example of our xcframework's Info.plist structure is:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>AvailableLibraries</key>
    <array>
        <dict>
            <key>BinaryPath</key>
            <string>libglfw3.a</string>
            <key>HeadersPath</key>
            <string>Headers</string>
            <key>LibraryIdentifier</key>
            <string>macos-arm64_x86_64</string>
            <key>LibraryPath</key>
            <string>libglfw3.a</string>
            <key>SupportedArchitectures</key>
            <array>
                <string>arm64</string>
                <string>x86_64</string>
            </array>
            <key>SupportedPlatform</key>
            <string>macos</string>
        </dict>
    </array>
    <key>CFBundlePackageType</key>
    <string>XFWK</string>
    <key>XCFrameworkFormatVersion</key>
    <string>1.0</string>
</dict>
</plist>

This is for the open source creative coding toolkit https://github.com/openframeworks/openFrameworks. Can you see any issues in the above? One idea is to generate an xcarchive instead of shipping the .a inside the xcframework; however, that does not explain the include directory being packaged there as well, so I think this might be an Xcode issue.
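For context, a structure like the plist above is normally produced by xcodebuild's -create-xcframework; a sketch of the kind of invocation involved (the paths are placeholders, not openFrameworks' actual build script):

# Bundle a prebuilt static library and its headers into an XCFramework.
xcodebuild -create-xcframework \
    -library macos/libglfw3.a \
    -headers macos/include \
    -output glfw3.xcframework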
Replies: 0 · Boosts: 0 · Views: 175 · Activity: Mar ’25
Info.plist
Hello, I'm trying to get my app to ask the device for permission to access the camera. To do so, I created an Info.plist and turned off the Generate Info.plist File setting under Packaging. I then added what I believe are all the necessary keys. However, when I try to build and test on my phone, I keep getting an error that says my app has a missing or invalid CFBundleExecutable in its Info.plist. I tried to fix it by adding:

Key: CFBundleExecutable
Type: String
Value: $(EXECUTABLE_NAME)

However, this isn't working. I have already added a bundle identifier using my com.name.appname, the bundle version string, and the bundle version. I'm not sure what to add to fix this issue. Is there another way to get the camera to work without having to create an Info.plist? Or is this the only way?
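For reference, camera permission itself only requires a usage-description key in Info.plist; a minimal sketch (this assumes the CFBundleExecutable problem is solved separately, for example by re-enabling Generate Info.plist File and adding just this key via the target's Info tab; the description string is a placeholder):

<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture photos.</string>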
Replies: 1 · Boosts: 0 · Views: 383 · Activity: Feb ’25