Posts under Developer Tools & Services topic

A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Q: Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?
A: When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this depends on your ChatGPT account settings, which allow you to opt out (the setting defaults to on). When using Xcode with accounts for other model providers, you should check the policies of your provider. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

Q: We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same (compared to a debug build), but maybe there's a trick.
A: Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build. The new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those build phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files, as that can dramatically improve your build performance.

Q: Are we able to provide additional context for the models, like coding standards? Documentation for third-party dependencies? Documentation on your own codebase that explains things like architecture and more?
A: In general, Xcode will automatically search for the right context based on the question and the evolving answer, as the model can interact multiple times with your project while it develops an answer. This will automatically pick up the coding style of the code it sees, and can include files that contain architecture comments, etc. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context like rule files, though Xcode does not support these at this time.

Q: Once ChatGPT is enabled for Coding Intelligence in Xcode 26, and I sign into my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations on Xcode development done previously in the ChatGPT Mac app?
A: Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Q: Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?
A: SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why a view isn't doing what you want. This year, we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Q: Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.
A: Yes, Xcode 26 supports logging into any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Q: Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?
A: Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, and compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can then use it to influence the translucency, blur, and specular highlights for your icon.

Q: What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?
A: One feature we're particularly excited about is the new power profiler for iOS, which gives you further insights into the energy consumption of your app beyond what was possible with the Energy instrument previously. You can learn more about how to use this instrument and how it can help you greatly reduce your app's battery usage in the documentation, as well as the session "Profile and optimize power usage in your app". There were also improvements in accessibility this year with Voice Control, where you can naturally speak your Swift code to Xcode, and it understands the Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26".

Q: We have a software advisory council that is very sensitive to having our private information going to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?
A: One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can use your proxy endpoint to inspect the network traffic for anything that you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Q: Is there a list of recommended LLMs to use with Xcode via Intelligence/local models? I've tried Gemma3-12B, but I hope there are better options.
A: Apple doesn't have a published list of recommended local models. This is a fast-moving space, and a recommendation would become outdated very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know!

(continued below)
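For illustration, here is a minimal Swift client for the kind of chat completions endpoint mentioned in the last two answers, useful for sanity-checking a local server or proxy before pointing Xcode at it. The localhost URL, port, and model name are placeholder assumptions for this sketch, not anything Apple documents.

import Foundation

// Minimal sketch of a chat-completions request against a local server,
// assuming it exposes the common /v1/chat/completions endpoint.
// Host, port, and model name below are placeholders.
let url = URL(string: "http://localhost:8080/v1/chat/completions")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

let payload: [String: Any] = [
    "model": "local-model",   // placeholder model identifier
    "messages": [["role": "user", "content": "Reply with one word."]],
]
request.httpBody = try JSONSerialization.data(withJSONObject: payload)

// Top-level await works when this is compiled as a main.swift script.
let (data, response) = try await URLSession.shared.data(for: request)
print((response as? HTTPURLResponse)?.statusCode ?? -1)
print(String(decoding: data, as: UTF8.self))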
Replies: 1 · Boosts: 0 · Views: 806 · Created: Jul ’25
Does Xcode's predictive code completion model have some censoring?
It seems Xcode's predictive code completion model is censored. Specifically, when typing the word "torrent," the model stops working completely. It doesn't matter whether the word is written directly in the code or in a comment. It could also be part of another word, such as "qBittorrent." In either case, the model stops working.

Reproducing this issue is fairly simple: create a Swift file and type the word "torrent." The model will stop generating code.

Xcode Version 26.2 (17C52)
Predictive Code Completion Model:
[com.apple.fm.code.generate_small_v2.base: 700.0.81600.13.202379,0]
[com.apple.fm.code.generate_safety_guardrail.base: 1.6.81619.13.202072,0]
[com.apple.gm.safety_deny.input.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
[com.apple.gm.safety_deny.output.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
(Installed)
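A minimal file matching the repro steps might look like this; per the report, the trigger is the word itself, whether in a comment or as a substring of an identifier.

// Completion reportedly stops once this word appears anywhere: torrent
let client = "qBittorrent"   // the substring alone is said to trigger it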
Replies: 0 · Boosts: 0 · Views: 44 · Created: 5h
AR location errors on a cellular + Wi-Fi iPad when the device is connected to Wi-Fi
I am developing an Augmented Reality (AR) navigation application for the iPad, using the ARCL library to place Points of Interest (POIs) in the real world. The application's behavior varies significantly based on the device's networking configuration:

- Cellular network (expected behavior): On an iPad with a cellular modem, when using the cellular network, all POIs are placed accurately with correct orientation.
- Wi-Fi only (expected behavior): On a Wi-Fi-only model (no GPS chip), POI placement is inaccurate, confirming the need for an external GPS receiver for that hardware configuration.
- Cellular + Wi-Fi (anomalous behavior): The iPad is a cellular model (equipped with GNSS/GPS). The device is connected to a Wi-Fi network (enforced via an MDM profile, preventing the user from disabling Wi-Fi). When actively connected to this specific Wi-Fi network, the AR POIs consistently display with incorrect orientation and placement, even though the device hardware has a dedicated GPS chip.

The placement error strongly suggests that the device's determined location or heading is erroneous. It appears that the active Wi-Fi connection is somehow interfering with or overriding the high-accuracy GNSS/GPS data, leading to a flawed Core Location determination that negatively impacts ARCL world tracking and anchor placement.

Has anyone experienced a scenario where an active Wi-Fi connection on a cellular iPad model causes Core Location to prioritize less accurate location data (potentially Wi-Fi-based location services) over the device's built-in GNSS/GPS, resulting in severe orientation errors? We observed that Apple Maps (the native application) also shows the wrong location and orientation when the device is connected to Wi-Fi.
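One way to test the Wi-Fi-versus-GNSS theory is to log the accuracy of each Core Location fix outside of ARCL. A minimal diagnostic sketch (the class name is illustrative): a coarse Wi-Fi-derived fix typically reports tens to hundreds of meters of horizontal accuracy, while an outdoor GNSS fix is usually within a few meters.

import CoreLocation

// Diagnostic sketch: log accuracy of every location and heading update so
// a coarse Wi-Fi fix is distinguishable from a GNSS fix.
final class LocationProbe: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let loc = locations.last else { return }
        print("h-accuracy: \(loc.horizontalAccuracy) m, course: \(loc.course)")
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        print("heading: \(newHeading.trueHeading)°, accuracy: \(newHeading.headingAccuracy)°")
    }
}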
Replies: 0 · Boosts: 0 · Views: 57 · Created: 7h
Trouble setting up AWFK-configured watches to use TestFlight
I am developing a simple watch app, and I use my personal watch for development with Xcode. The personal watch is a Series 10, GPS only. I have two other watches that I want to use for testing the app without needing them to be connected to Xcode. The test watches have the cellular option, and I need a cell plan per watch because the watches need to be standalone (not counting initial setup). To get the standalone cell plan, the watches need to be configured using AWFK.

Here is what I have tried and my current issues. I switch between all three watches on my phone using the Watch app.

- Originally I tried to put the test watches in developer mode, thinking I would connect them to Xcode, but developer mode is not available when a watch is set up using AWFK.
- Pushed the watch app to App Store Connect, set up a TestFlight group, added the test users and my phone user, and accepted the invites.
- TestFlight is installed on my phone, and I see the TestFlight setup for the watch app.
- I select a test watch using the Watch app on the phone and run install for the test app from TestFlight on the phone; the spinner moves for a while, then goes back to "Install".

I am not able to get the watch app installed on the test watches from the phone. Is what I am attempting to do supported? I haven't found much specific documentation on this. If I pair the test watches as regular watches and set them to developer mode, can I pair them again as AWFK, and will developer mode survive the switch? Or is there something really simple that I'm overlooking? I appreciate any help that can be extended.
Replies: 0 · Boosts: 0 · Views: 60 · Created: 9h
How to delete iOS simulator runtimes?
There are multiple iOS simulator runtimes located at /System/Library/AssetsV2/com_apple_MobileAsset_iOSSimulatorRuntime which I don't need. I tried the following approaches to delete them, but they did not work:

- Xcode > Settings > Components > Delete (it can delete some, but not all of them)
- xcrun simctl runtime list (it doesn't show the old runtimes)
- sudo rm (Operation not permitted):

sudo rm -rf cc1f035290d244fca4f74d9d243fcd02d2876c27.asset
Password:
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/AssetData/096-69246-684.dmg: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/AssetData: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/Info.plist: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/version.plist: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset: Operation not permitted

I have two questions:

1. Why is "xcrun simctl runtime list" unable to list some old runtimes?
2. How can I delete them?
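For what it's worth, recent Xcode releases manage runtimes through simctl rather than the file system (the assets are SIP-protected, which is why rm fails even with sudo). A hedged sketch that shells out to the simctl runtime delete subcommand available in recent Xcode versions; the runtime identifier is a placeholder that would come from `xcrun simctl runtime list`:

import Foundation

// Sketch: remove a simulator runtime through simctl instead of rm.
func deleteSimulatorRuntime(_ identifier: String) throws -> Int32 {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/xcrun")
    process.arguments = ["simctl", "runtime", "delete", identifier]
    try process.run()
    process.waitUntilExit()
    return process.terminationStatus
}

// Usage (placeholder identifier):
// let status = try deleteSimulatorRuntime("<runtime-identifier>")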
Replies: 0 · Boosts: 0 · Views: 33 · Created: 1d
Forecasts missing in WeatherKit
I am using the WeatherKit REST API with hourlyStart/hourlyEnd parameters to request up to 240 hours of forecast data. However, when requesting later in the day, the API returns fewer than 240 hourly forecasts (e.g., 239 at 08:00, 238 at 09:00, and so on, down to 224 at 23:00). The returned list appears to be contiguous but truncated at the end compared to the full 240-hour window. I have also tried fetching the data some time later, such as requesting the 09:00 data at 09:45, but the same data was still missing at the end. Is this expected WeatherKit behavior or a bug? If it's expected, is there documentation explaining how the "forecast horizon" is determined and when it is updated? Thank you.
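To quantify the truncation, here is a minimal Swift sketch of the request described above. The coordinates and the developer token are placeholders, and the response keys assume the documented forecastHourly data set shape; only the hourlyStart/hourlyEnd parameters come from the post itself.

import Foundation

// Sketch: request a 240-hour hourly window and count what comes back.
let now = Date()
let end = now.addingTimeInterval(240 * 3600)
let iso = ISO8601DateFormatter()

var components = URLComponents(string: "https://weatherkit.apple.com/api/v1/weather/en/37.33/-122.01")!
components.queryItems = [
    .init(name: "dataSets", value: "forecastHourly"),
    .init(name: "hourlyStart", value: iso.string(from: now)),
    .init(name: "hourlyEnd", value: iso.string(from: end)),
]

var request = URLRequest(url: components.url!)
request.setValue("Bearer <developer-token>", forHTTPHeaderField: "Authorization") // placeholder JWT

// Top-level await works when compiled as a main.swift script.
let (data, _) = try await URLSession.shared.data(for: request)
if let json = try JSONSerialization.jsonObject(with: data) as? [String: Any],
   let hourly = json["forecastHourly"] as? [String: Any],
   let hours = hourly["hours"] as? [[String: Any]] {
    print("hours returned: \(hours.count)")   // e.g., 239, 238, ... per the post
}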
Replies: 0 · Boosts: 0 · Views: 36 · Created: 1d
Critical CallKit Issue: Audio Route Flapping due to reason: 3 (CategoryChange) after User Toggle
I am facing a severe audio routing instability issue when using CallKit and the Zego Express SDK on iOS. The problem is that the audio route immediately reverts from the Speaker back to the Earpiece, effectively disabling the Speaker button functionality.

📝 Observed behavior: When the user taps the native CallKit Speaker button, the audio route is correctly changed to the Speaker, but then instantly flips back to the Receiver (Earpiece), as shown in the system log captured via AVAudioSession.routeChangeNotification monitoring.

🧾 Log evidence (flapping occurs within 0.4 seconds): The following log snippet illustrates the system overriding the user's action (reason: 4) with an unexpected CategoryChange (reason: 3) event:

Timestamp     Component         Reason Code  Description                       Route               Is Speaker
16:31:18.009  [CallKitManager]  4            Override (CallKit/ControlCenter)  Speaker (external)  true
16:31:18.411  [CallKitManager]  3            CategoryChange                    Receiver (earpiece) false
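A minimal sketch of the usual diagnostic setup, assuming the app (rather than the SDK) owns the session configuration. The port names and the observer below are standard AVAudioSession API; whether the Zego SDK re-sets the category behind the app's back, producing the CategoryChange (reason 3) events, remains the open question.

import AVFoundation

// Configure the session once for speaker-capable voice calls and log every
// route change with its reason, to spot which side re-sets the category.
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .voiceChat,
                        options: [.allowBluetooth])
try session.setActive(true)

NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil, queue: .main
) { note in
    let reason = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt ?? 0
    let output = session.currentRoute.outputs.first?.portType.rawValue ?? "none"
    print("route change reason: \(reason), output: \(output)")
}

// Route to the speaker on user request (mirrors the CallKit speaker button):
try session.overrideOutputAudioPort(.speaker)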
Replies: 0 · Boosts: 0 · Views: 8 · Created: 1d
Terminated Account Remaining Balance Payout
I recently had my developer account terminated. In the final termination email, it said: "If applicable, no further payments will be made to you pursuant to Section 7.1 of the Paid Applications agreement (Schedules 2 and 3 to the ADP Agreement)."

I've talked to multiple different developers who also had 3.2f account terminations. Some of them say that they got all their remaining earnings paid out 3-6 months later. Others have said that Apple just kept the money forever. How can I find out if Apple will pay me my account's remaining balance?

When I got the original pending termination notice, I was still able to log into App Store Connect to view everything. But immediately after I got the final termination notice email, I could no longer even log into App Store Connect.

Could anybody please help me? I'm dependent on the unpaid earnings as a 21-year-old indie developer. Thank you!! :)
Replies: 0 · Boosts: 0 · Views: 25 · Created: 1d
Xcode 26.2 / iOS 26.2 Simulator not downloading
Hey all, I recently updated to Xcode 26.2 and I'm having the hardest time trying to download the corresponding iOS simulator. I installed Xcode from developer downloads, and the app did not come loaded with an iOS simulator. When trying to download from Components in Settings, I only get the following message:

Download failed due to a bad URL. (Catalog download for com.apple.MobileAsset.iOSSimulatorRuntime)
Domain: com.apple.MobileAssetError.Download
Code: 49
User Info: {
    checkConfiguration = 1;
}

I also tried downloading via Terminal but get a download failed message there as well. I am on the latest macOS and have over 600 GB of disk space available. In previous versions, I was able to download the iOS simulator directly from Developer Downloads, but anything after 26 is not there. Any suggestions?
Replies: 1 · Boosts: 0 · Views: 123 · Created: 1d
Incremental build not working on Mac mini CI (Xcode 26) — always full rebuild
Hi everyone,

We run our CI builds on a Mac mini. On my local MacBook, incremental builds work properly and build times are fast, but on CI it looks like Xcode rebuilds everything from scratch every time, and the build is about 6 minutes slower than local.

In Xcode 26, "compilation caching" became available. My understanding was that if DerivedData is preserved between CI runs, compilation caching / incremental builds should reduce build time. So I tried specifying the DerivedData path as an absolute path in our xcodebuild command to reuse it across builds. However, it still seems to do a full rebuild every time, and the build time didn't improve.

Has anyone seen a similar issue with incremental builds on CI (Mac mini) with Xcode 26? Any advice on what to check or how to configure CI so incremental builds / caching actually work would be greatly appreciated. Thanks!
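For reference, a sketch of a CI step that pins DerivedData to a stable absolute path with xcodebuild's -derivedDataPath flag. The paths and scheme name are placeholders, and whether anything is reused also depends on the checkout path and toolchain staying identical across runs.

import Foundation

// Sketch: invoke xcodebuild with DerivedData pinned to a fixed location
// so build products survive between CI runs.
let derivedData = "/Users/ci/DerivedData/MyApp"   // must be identical on every run

let xcodebuild = Process()
xcodebuild.executableURL = URL(fileURLWithPath: "/usr/bin/xcrun")
xcodebuild.arguments = [
    "xcodebuild", "build",
    "-scheme", "MyApp",               // placeholder scheme
    "-derivedDataPath", derivedData,  // reuse instead of a per-run temp dir
]
try xcodebuild.run()
xcodebuild.waitUntilExit()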
Replies: 0 · Boosts: 0 · Views: 34 · Created: 1d
Vision face landmarks shifted on iOS 26 but correct on iOS 18 with same code and image
I'm using the Vision framework (DetectFaceLandmarksRequest) with the same code and the same test image to detect face landmarks. On iOS 18 everything works as expected: detected face landmarks align with the face correctly. But when I run the same code on devices with iOS 26, the landmark coordinates are outside the [0,1] range, which indicates they are out of face bounds. Fun fact: the old VNDetectFaceLandmarksRequest API works very well without encountering this issue.

How I get face landmarks:

private let faceRectangleRequest = DetectFaceRectanglesRequest(.revision3)
private var faceLandmarksRequest = DetectFaceLandmarksRequest(.revision3)

func detectFaces(in ciImage: CIImage) async throws -> FaceTrackingResult {
    let faces = try await faceRectangleRequest.perform(on: ciImage)
    faceLandmarksRequest.inputFaceObservations = faces
    let landmarksResults = try await faceLandmarksRequest.perform(on: ciImage)
    ...
}

How I show face landmarks in my SwiftUI view:

private func convert(
    point: NormalizedPoint,
    faceBoundingBox: NormalizedRect,
    imageSize: CGSize
) -> CGPoint {
    let point = point.toImageCoordinates(
        from: faceBoundingBox,
        imageSize: imageSize,
        origin: .upperLeft
    )
    return point
}

At the same time, this works as expected and gives me the correct results (region is a FaceObservation.Landmarks2D.Region):

let points: [CGPoint] = region.pointsInImageCoordinates(
    imageSize,
    origin: .upperLeft
)

After that, I found that the landmarks are normalized relative to the unalignedBoundingBox. However, I can't access it in code. Still, using these values for the bounding box works correctly.

Things I've already tried:
- Same image input
- Tested multiple devices on iOS 26.2 -> always wrong
- Tested multiple devices on iOS 18.7.1 -> always correct

Environment:
- macOS 26.2
- Xcode 26.2 (17C52)
- Real devices, not simulator

(Screenshots: Face Landmarks iOS 18, Face Landmarks iOS 26)
Replies: 0 · Boosts: 0 · Views: 112 · Created: 3d
Xcode Cloud fails while exporting archive
Hi,

Since Xcode 26.2 beta 1 and up to now (I thought the RC or the actual release would solve this issue), I have had problems exporting an archive to the App Store via Xcode Cloud (locally everything works, and using Xcode 26.2 I was able to submit the app to the App Store). The build is archived successfully, but during export I get an error: Exporting for App Store Distribution failed. Please download the logs artifact for more information.

In xcodebuild-export-archive.log I see multiple attempts and errors like this:

2025-12-13 00:29:33.900 xcodebuild[10889:58988] DVTServices: Sending request 9A4173F4-CD87-4A76-95FC-501B4D11EF22 to <http://172.16.55.70:8089/services/v1/capabilities> for session DVTFoundation.DVTServicesSessionProxy.
Method: POST

Headers:
{
    Accept = "application/vnd.api+json";
    "Accept-Encoding" = "gzip, deflate";
    "Content-Length" = 124;
    "Content-Type" = "application/vnd.api+json";
    "User-Agent" = Xcode;
    "X-HTTP-Method-Override" = GET;
    "X-Xcode-Version" = "26.2 (17C52)";
}

Payload:
{"urlEncodedQueryParams":"teamId=8CKLLLC9UE&filter%5BreferenceType%5D=bundle&filter%5BincludeRequestable%5D=true&limit=200"}

2025-12-13 00:29:34.208 xcodebuild[10889:58988] DVTServices: Received response for 9A4173F4-CD87-4A76-95FC-501B4D11EF22 @ <http://172.16.55.70:8089/services/v1/capabilities>. Code = 0
2025-12-13 00:29:34.208 xcodebuild[10889:58988] DVTServices: Response payload: {
  "errors" : [ {
    "id" : "0316154c-332b-4f9d-a90a-5770d4f8dc1a",
    "status" : "400",
    "code" : "PARAMETER_ERROR.INVALID",
    "title" : "A parameter has an invalid value",
    "detail" : "A parameter 'filter[includeRequestable]' has an invalid value : ''includeRequestable' is not a valid field name.'",
    "source" : {
      "parameter" : "filter[includeRequestable]"
    }
  } ]
}

2025-12-13 00:29:34.208 xcodebuild[10889:58988] DVTServices: Could not fetch capabilities from network due to error: error = 'A parameter has an invalid value'

When I use Xcode 26.1.1 to export the same commit, everything works and the workflow sends the build to App Store Connect. Any ideas what might cause this issue? Unfortunately, I need to use Xcode 26.2 if I want to use tabViewBottomAccessory(isEnabled:content:), because even though this API is available since iOS 26.1, it's not available in Xcode 26.1.
Replies: 1 · Boosts: 1 · Views: 159 · Created: 3d