Posts under Developer Tools & Services topic

A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Q: Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?
A: When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this is governed by your ChatGPT account settings, which allow you to opt out (it defaults to on). When using Xcode with accounts for other model providers, you should check the policies of your provider. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

Q: We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same as a debug build, but maybe there's a trick.
A: Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build. The new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those phases add significant time to your build, consider moving some of them out of the build entirely, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files; that is one of the biggest ways to improve your build performance.

Q: Are we able to provide additional context for the models, like coding standards, documentation for third-party dependencies, or documentation on your own codebase that explains things like architecture?
A: In general, Xcode automatically searches for the right context based on the question and the evolving answer, as the model can interact with your project multiple times while it develops an answer. It will automatically pick up the coding style of the code it sees, and can include files that contain architecture comments and the like. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context, like rule files, though Xcode does not support these at this time.

Q: Once ChatGPT is enabled for Coding Intelligence in Xcode 26 and I sign into my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations on Xcode development I had previously in the ChatGPT Mac app?
A: Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Q: Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?
A: SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why the UI isn't doing what you want. This year, we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Q: Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up, which prevents data from being leaked.
A: Yes, Xcode 26 supports logging into any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Q: Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?
A: Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, then compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can use it to influence the translucency, blur, and specular highlights of your icon.

Q: What's one feature or improvement in the new Xcode that you personally think developers will love but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?
A: One feature we're particularly excited about is the new power profiler for iOS, which gives you deeper insight into the energy consumption of your app than was previously possible with the Energy instrument. You can learn more about how this instrument can help you greatly reduce your app's battery usage in the documentation, as well as in the session Profile and optimize power usage in your app. There were also accessibility improvements this year with Voice Control: you can naturally speak your Swift code to Xcode, and it understands Swift syntax as you speak. To see it in action, take a look at the demonstration in What's new in Xcode 26.

Q: We have a software advisory council that is very sensitive to having our private information go to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?
A: One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can point Xcode at your proxy endpoint, inspect the network traffic for anything that you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Q: Is there a list of recommended LLMs to use with Xcode via local intelligence? I've tried Gemma3-12B, but I hope there are better options.
A: Apple doesn't have a published list of recommended local models. This is a fast-moving space, and a recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know!

(continued below)
Replies: 1 · Boosts: 0 · Views: 807 · Created: Jul ’25
Getting Valgrind to run on macOS 10.15 Catalina, reboot
I posted about this about five years ago, and now at last it's close to being finished. The main problem I have now is matching up DWARF debuginfo with global variables. This works fairly well on macOS 10.14; on 10.15, much less so, and I think the reason is how the Mach-O data is mmap'd. When Valgrind runs, it does the job of the OS and loads the guest executable into memory. It then loads and runs dyld inside Valgrind. I can get memory-map debug traces. In one failing test I see four load segments: __DATA_CONST and __DATA both have prot 3 (RW), so we load them RW rather than R then RW. Then, I think, dyld munmaps and re-mmaps the __DATA_CONST segment as read-only. Valgrind reads debuginfo in response to mmaps, and I don't think it handles munmap correctly. I need to debug that a lot more - I can see the changed mappings in the debug output, but I don't see exactly what is happening with munmap and mmap (unless dyld is doing that on a section-by-section basis). Does what I'm saying about the mappings make any sense?
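As a side note for anyone wanting to observe the protection flip from inside an ordinary process rather than from Valgrind's traces, here is a minimal Swift diagnostic sketch (an assumption of mine, not part of Valgrind) that walks the process's VM regions with mach_vm_region and prints their current protections; after dyld finishes binding, the region backing __DATA_CONST should print as r--:

```swift
import Darwin

// Diagnostic sketch: enumerate all VM regions in the current task and
// print their protection bits (r/w/x).
var address: mach_vm_address_t = 0
while true {
    var size: mach_vm_size_t = 0
    var info = vm_region_basic_info_data_64_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<vm_region_basic_info_data_64_t>.size / MemoryLayout<Int32>.size)
    var objectName: mach_port_t = 0
    let kr = withUnsafeMutablePointer(to: &info) { ptr in
        ptr.withMemoryRebound(to: Int32.self, capacity: Int(count)) { rebound in
            mach_vm_region(mach_task_self_, &address, &size,
                           VM_REGION_BASIC_INFO_64, rebound, &count, &objectName)
        }
    }
    guard kr == KERN_SUCCESS else { break }   // no more regions
    let p = info.protection
    let prot = [(VM_PROT_READ, "r"), (VM_PROT_WRITE, "w"), (VM_PROT_EXECUTE, "x")]
        .map { (p & $0.0) != 0 ? $0.1 : "-" }.joined()
    print(String(address, radix: 16) + "-" + String(address + size, radix: 16) + " " + prot)
    address += size                           // advance past this region
}
```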
Replies: 0 · Boosts: 0 · Views: 1 · Created: 6m
Xcode 26.1 re-release?
The developer downloads page now lists an Xcode 26.1 which was released on 11th Dec (the original Xcode 26.1 was posted on 3rd Nov). Strangely, this new Xcode 26.1 has a CFBundleShortVersionString of 26.1.1 and a DTXcodeBuild of 17B55:

```
% ls -ln
total 4413136
-rw-r--r--@ 1 503 20 2259523057 16 Dec 19:01 Xcode_26.1_Apple_silicon.xip
% xip --expand Xcode_26.1_Apple_silicon.xip
xip: signing certificate was "Software Update" (validation not attempted)
xip: expanded items from "/Users/me/Downloads/temp/Xcode_26.1_Apple_silicon.xip"
% plutil -p Xcode.app/Contents/Info.plist | grep CFBundleShort
  "CFBundleShortVersionString" => "26.1.1"
% plutil -p Xcode.app/Contents/Info.plist | grep DTXcodeBuild
  "DTXcodeBuild" => "17B55"
```

17B55 does correspond to the original Xcode 26.1 final release. The Xcode 26.1.1 release that was previously posted had a DTXcodeBuild of 17B100, though. The pairing of 26.1.1 with 17B55 looks new, and is probably a packaging error?
Replies: 0 · Boosts: 0 · Views: 10 · Created: 1h
Controlling UIDesignRequiresCompatibility via Remote Config
Hello, I am currently in the process of gradually adding support for Liquid Glass to my app. The transition is happening incrementally, i.e., new screens and minor features are gradually being adapted to the new design and deployed. Currently, the old design is still active via the feature flag UIDesignRequiresCompatibility, because the existing UI should remain locked for the public App Store version until the transition is complete.

My challenge is as follows: I would like to work with the new Liquid Glass design during development without having to manually change the UIDesignRequiresCompatibility flag with each deployment. Ideally, I am looking for a solution where:
• the new design is only activated for me (e.g., a specific account or specific devices)
• the old design remains active for all other users
• the App Store version can be delivered unchanged

So my question is: is it possible to control UIDesignRequiresCompatibility via remote config or server-side logic in order to activate the new design for specific users or devices? I have observed similar behavior in WhatsApp: two devices with the same app version, but only one shows the new design. This suggests server-side or remote-config-based control. Do you have any experience or recommendations on how to implement something like this cleanly?

Kind regards, Heinz
Replies: 1 · Boosts: 0 · Views: 14 · Created: 3h
Real Time Spatial Video Streaming with Vision Pro
Hello, I am trying to build an AVP app for real-time, "zero-latency" spatial video streaming, and I am trying to figure out, at a high level, the best way to do this. Currently my method is:
1. The server sends stereo images via a WebRTC service (e.g., LiveKit).
2. The WebRTC stream is converted to CVPixelBuffers, written to file, played via AVPlayer, and applied to a plane entity as a VideoMaterial.

However, this is a bit hacky, and it seems like it won't be compatible with Apple's spatial experiences. To my understanding, Apple supports HLS streaming for spatial experiences and APMP content. However, HLS (and even Low-Latency HLS) introduces a second or more of latency, likely due to the segmented nature of HLS, so HLS will not work for us. One alternative I've considered is streaming the live video via WebRTC from the server to a local computer on the AVP's network, and then using LL-HLS to stream from the local computer to the Vision Pro. Still, it seems this would introduce latency on the order of seconds. Is my current approach the best way to implement this? Or could anyone suggest a better way, perhaps something compatible with AVP's spatial experiences?
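For reference, a minimal sketch of the AVPlayer-plus-VideoMaterial step described in the list above; the stream URL is a placeholder, and the WebRTC-to-file plumbing is omitted:

```swift
import AVFoundation
import RealityKit

// Sketch of the playback end of the pipeline. In the real pipeline the
// player would be fed from the file the WebRTC frames are written to;
// the URL here is a placeholder.
let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
let material = VideoMaterial(avPlayer: player)            // RealityKit video material
let plane = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                        materials: [material])            // plane entity showing the stream
player.play()
```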
Replies: 0 · Boosts: 0 · Views: 9 · Created: 4h
Does Xcode's predictive code completion model have some censoring?
It seems Xcode's predictive code completion model is censored. Specifically, when typing the word "torrent," the model stops working completely. It doesn't matter whether the word is written directly in code or in a comment; it can also be part of another word, such as "qBittorrent." In either case, the model stops working. Reproducing the issue is fairly simple: create a Swift file and type the word "torrent." The model will stop generating code.

Xcode Version 26.2 (17C52)
Predictive Code Completion Model:
[com.apple.fm.code.generate_small_v2.base: 700.0.81600.13.202379,0]
[com.apple.fm.code.generate_safety_guardrail.base: 1.6.81619.13.202072,0]
[com.apple.gm.safety_deny.input.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
[com.apple.gm.safety_deny.output.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
(Installed)
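For anyone wanting to confirm, the reported repro is as small as a new Swift file containing the word:

```swift
// Per the report above, a file containing this single word (in code or
// in a comment) is said to halt predictive code completion:
let client = "torrent" // completion reportedly stops generating after this
```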
Replies: 0 · Boosts: 0 · Views: 74 · Created: 13h
AR location errors on a cellular + Wi-Fi model iPad when connected to Wi-Fi
I am developing an Augmented Reality (AR) navigation application for the iPad, utilizing the ARCL library to place Points of Interest (POIs) in the real world. The application's behavior varies significantly based on the device's networking configuration:

Cellular network (expected behavior): On an iPad with a cellular modem, when using the cellular network, all POIs are placed accurately with correct orientation.
Wi-Fi only (expected behavior): On a Wi-Fi-only model (no GPS chip), POI placement is inaccurate, confirming the need for an external GPS receiver for that hardware configuration.
Cellular + Wi-Fi (anomalous behavior): The iPad is a cellular model (equipped with GNSS/GPS). The device is connected to a Wi-Fi network (enforced via an MDM profile, preventing the user from disabling Wi-Fi). When actively connected to this specific Wi-Fi network, the AR POIs consistently display with incorrect orientation and placement, even though the device hardware has a dedicated GPS chip.

The placement error strongly suggests that the device's determined location or heading is erroneous. It appears that the active Wi-Fi connection is somehow interfering with or overriding the high-accuracy GNSS/GPS data, leading to a flawed Core Location fix that negatively impacts ARCL's world tracking and anchor placement. Has anyone experienced a scenario where an active Wi-Fi connection on a cellular iPad causes Core Location to prioritize less accurate location data (potentially Wi-Fi-based positioning) over the device's built-in GNSS/GPS, resulting in severe orientation errors? We observed that Apple Maps (the native application) also shows the wrong location and orientation when the device is connected to Wi-Fi.
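One way to check whether the fix itself degrades on Wi-Fi is to log what Core Location actually delivers in each network configuration. A hedged diagnostic sketch (LocationProbe is a hypothetical helper name, not part of ARCL):

```swift
import CoreLocation

// Logs the accuracy and heading Core Location delivers, so the cellular
// and Wi-Fi configurations can be compared side by side.
final class LocationProbe: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        for loc in locations {
            // A jump in horizontalAccuracy (e.g. 5 m -> 65 m) suggests a
            // Wi-Fi-derived fix rather than a GNSS fix.
            print("acc=\(loc.horizontalAccuracy)m course=\(loc.course)")
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateHeading newHeading: CLHeading) {
        print("heading=\(newHeading.trueHeading) ±\(newHeading.headingAccuracy)")
    }
}
```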
Replies: 0 · Boosts: 0 · Views: 62 · Created: 15h
Trouble setting up AWFK-configured watches to use TestFlight
I am developing a simple watch app, and I use my personal watch (Series 10, GPS only) for development with Xcode. I have two other watches that I want to use for testing the app without needing them connected to Xcode. The test watches have the cellular option, and I need a cell plan per watch because the watches need to be standalone (not counting initial setup). To get the standalone cell plan, the watches need to be configured using AWFK. Here is what I have tried, and my current issues:

I switch between all three watches on my phone using the Watch app.
I originally tried to put the test watches in Developer Mode, thinking I would connect them to Xcode, but Developer Mode is not available when a watch is set up using AWFK.
I pushed the watch app to App Store Connect, set up a TestFlight group, added the test users and my phone's user, and accepted the invites.
TestFlight is installed on my phone, and I see the TestFlight entry for the watch app.
I select a test watch using the Watch app on the phone and run the install for the test app from TestFlight on the phone; the spinner moves for a while, then goes back to Install.

I am not able to get the watch app installed on the test watches from the phone. Is what I am attempting to do supported? I haven't found much specific documentation on this. If I pair the test watches as regular watches and set them to Developer Mode, can I then pair them again as AWFK, and will Developer Mode survive the switch? Or is there something really simple that I'm overlooking? I appreciate any help that can be extended.
Replies: 0 · Boosts: 0 · Views: 63 · Created: 17h
How to delete iOS simulator runtimes?
There are multiple iOS simulator runtimes located at /System/Library/AssetsV2/com_apple_MobileAsset_iOSSimulatorRuntime which I don't need. I tried the following approaches to delete them, but they are not working:

Xcode > Settings > Components > Delete (it can delete some, but not all of them)
xcrun simctl runtime list (it doesn't show the old runtimes)
sudo rm (Operation not permitted):

```
sudo rm -rf cc1f035290d244fca4f74d9d243fcd02d2876c27.asset
Password:
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/AssetData/096-69246-684.dmg: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/AssetData: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/Info.plist: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/version.plist: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset: Operation not permitted
```

I have two questions:
Why is "xcrun simctl runtime list" unable to list some old runtimes?
How do I delete them?
Replies: 0 · Boosts: 0 · Views: 37 · Created: 1d
Forecasts missing in WeatherKit
I am using the WeatherKit REST API with hourlyStart/hourlyEnd parameters to request up to 240 hours of forecast data. However, when requesting later in the day, the API returns fewer than 240 hourly forecasts: e.g., 239 at 08:00, 238 at 09:00, and so on, down to 224 at 23:00. The returned list appears contiguous but truncated at the end relative to the full 240-hour window. I have also tried fetching the data again some time later (e.g., the 09:00 data at 09:45), but the same hours were still missing at the end. Is this expected WeatherKit behavior or a bug? If it's expected, is there documentation explaining how the "forecast horizon" is determined and when it is updated? Thank you.
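For context, here is roughly what the request looks like; a sketch assuming the standard WeatherKit REST endpoint shape, with placeholder coordinates, dates, and token:

```swift
import Foundation

// Sketch of a 240-hour hourly-forecast request; all values are placeholders.
var components = URLComponents(
    string: "https://weatherkit.apple.com/api/v1/weather/en/37.3349/-122.0090")!
components.queryItems = [
    URLQueryItem(name: "dataSets", value: "forecastHourly"),
    URLQueryItem(name: "hourlyStart", value: "2025-07-01T08:00:00Z"), // "now"
    URLQueryItem(name: "hourlyEnd", value: "2025-07-11T08:00:00Z"),   // 240 hours later
]
var request = URLRequest(url: components.url!)
request.setValue("Bearer <developer-token>", forHTTPHeaderField: "Authorization")
// After decoding the response, counting forecastHourly.hours shows whether
// the full 240 hours came back or the tail was truncated as described above.
```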
Replies: 0 · Boosts: 0 · Views: 38 · Created: 1d
Critical CallKit Issue: Audio Route Flapping due to reason: 3 (CategoryChange) after User Toggle
I am facing a severe audio-routing instability issue when using CallKit and the Zego Express SDK on iOS. The problem is that the audio route immediately reverts from the speaker back to the earpiece, effectively disabling the speaker button.

Observed behavior: When the user taps the native CallKit speaker button, the audio route is correctly changed to the speaker, but then instantly flips back to the receiver (earpiece), as shown in the system log captured via AVAudioSession.routeChangeNotification monitoring.

Log evidence (the flap occurs within 0.4 seconds): The following log snippet shows the system overriding the user's action (reason 4, Override) with an unexpected CategoryChange (reason 3) event:

16:31:18.009  [CallKitManager]  reason 4 (Override, CallKit/Control Center)  route: Speaker   isSpeaker: true
16:31:18.411  [CallKitManager]  reason 3 (CategoryChange)                    route: Receiver  isSpeaker: false
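For reference, a minimal sketch of the route-change monitoring described above (the raw values of AVAudioSession.RouteChangeReason: 3 = .categoryChange, 4 = .override):

```swift
import AVFoundation

// Logs each route change with its reason code and the resulting outputs.
let observer = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main
) { note in
    guard let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
          let reason = AVAudioSession.RouteChangeReason(rawValue: raw) else { return }
    let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
        .map { $0.portType.rawValue }
    // reason 4 (.override) is the user's speaker tap; an immediate
    // reason 3 (.categoryChange) afterwards is the flap in question.
    print("routeChange reason=\(reason.rawValue) outputs=\(outputs)")
}
```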
Replies: 0 · Boosts: 0 · Views: 8 · Created: 1d
Terminated Account Remaining Balance Payout
I recently had my developer account terminated. The final termination email said: "If applicable, no further payments will be made to you pursuant to Section 7.1 of the Paid Applications agreement (Schedules 2 and 3 to the ADP Agreement)." I've talked to multiple developers who also had 3.2f account terminations. Some of them say they got all their remaining earnings paid out 3-6 months later; others say Apple just kept the money forever. How can I find out whether Apple will pay out my account's remaining balance? When I got the original pending-termination notice, I could still log into App Store Connect and view everything, but immediately after the final termination email, I could no longer log into App Store Connect at all. Could anybody please help me? I'm dependent on the unpaid earnings as a 21-year-old indie developer. Thank you!! :)
Replies: 0 · Boosts: 0 · Views: 25 · Created: 2d
Xcode 26.2 / iOS 26.2 Simulator not downloading
Hey all, I recently updated to Xcode 26.2 and I'm having the hardest time trying to download the corresponding iOS simulator. I installed Xcode from developer downloads, and the app did not come loaded with an iOS simulator. When trying to download from Components in Settings, I only get the following message:

```
Download failed due to a bad URL. (Catalog download for com.apple.MobileAsset.iOSSimulatorRuntime)
Domain: com.apple.MobileAssetError.Download
Code: 49
User Info: {
    checkConfiguration = 1;
}
```

I also tried downloading via Terminal but got a download-failed message there too. I am on the latest macOS and have over 600 GB of disk space available. In previous versions, I was able to download the iOS simulator directly from Developer Downloads, but anything after 26 is not there. Any suggestions?
Replies: 1 · Boosts: 0 · Views: 137 · Created: 2d
Incremental build not working on Mac mini CI (Xcode 26) — always full rebuild
Hi everyone, We run our CI builds on a Mac mini. On my local MacBook, incremental builds work properly and build times are fast, but on CI it looks like Xcode rebuilds everything from scratch every time, and the build is about 6 minutes slower than local. In Xcode 26, “compilation caching” became available. My understanding was that if DerivedData is preserved between CI runs, compilation caching / incremental builds should reduce build time. So I tried specifying the DerivedData path as an absolute path in our xcodebuild command to reuse it across builds. However, it still seems to do a full rebuild every time and the build time didn’t improve. Has anyone seen a similar issue with incremental builds on CI (Mac mini) with Xcode 26? Any advice on what to check or how to configure CI so incremental builds / caching actually work would be greatly appreciated. Thanks!
Replies: 0 · Boosts: 0 · Views: 34 · Created: 2d