Posts under App & System Services topic

Post | Replies | Boosts | Views | Activity

Terrible BLE attribute notification performance on 11th-generation iPad
We've been developing an iOS app in Swift for several years that runs on iPad, where our proprietary device streams EEG signals over BLE to the app. The device sends the data as BLE notification messages, with the MTU negotiated to the maximum size allowed between our device and the iPad.

When communicating with the app on a 10th-generation iPad running iOS 18.5, our device takes less than 200 ms to transmit one interval of EEG signals, which is produced every 500 ms. Under the same conditions (same version of iOS and the app, same device) but using an 11th-generation iPad, it takes anywhere from 800 ms to 1.1 seconds (4x to 5x) to transmit an interval. Our device transmits each EEG interval as several ATT notification messages at the maximum MTU size. We are perplexed by such a huge step down in performance on the 11th-generation iPads.

iPad generation    Chipset     Firmware
10th               BCM_4387    22.5.614.3457
11th               SRS_7923    HCI Rev. 2504 sub. 5003

We know that the 10th-generation iPad uses a chipset manufactured by Broadcom, whereas the 11th-generation iPads we've received use an SRS chipset, whose manufacturer I'm unfamiliar with. We wonder whether this performance degradation comes from the chipset or its firmware revision when handling attribute notification messages over BLE in this context. We used PacketLogger to log the communication between the iPads and our device, and after analysis we haven't found any difference in the configuration settings exchanged between our device and the two iPads that would account for the degradation. Fortunately, our device is designed to work in complex environments and contexts, so it has mechanisms that account for transmission delays and interference.

I'd appreciate hearing from any other Apple developer, or from Apple staff, who is aware of degraded BLE attribute notification performance on newer Apple devices using this series of chipset. If so: Are there any recommended ways to improve this latency? Is this being addressed for 11th-generation iPads?

Regards, Steven Belbin, Principal Developer at NeuroServo Inc.
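For reference, here is a minimal sketch of how we measure the per-interval delivery time on the app side, independent of PacketLogger. It assumes a known payload size per 500 ms interval; expectedIntervalBytes is a placeholder for the real value.

import CoreBluetooth
import Foundation

// Timestamps incoming ATT notifications and reports how long one EEG
// interval takes to arrive in full.
final class ThroughputMonitor: NSObject, CBPeripheralDelegate {
    private let expectedIntervalBytes = 4096 // placeholder for the real per-interval size
    private var intervalStart: Date?
    private var bytesReceived = 0

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard error == nil, let data = characteristic.value else { return }
        let now = Date()
        if intervalStart == nil { intervalStart = now }
        bytesReceived += data.count
        if bytesReceived >= expectedIntervalBytes, let start = intervalStart {
            print(String(format: "Interval delivered in %.1f ms",
                         now.timeIntervalSince(start) * 1000))
            intervalStart = nil
            bytesReceived = 0
        }
    }
}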
0
1
142
Jul ’25
Why don't my os_log entries show up until the second time my driver loads?
I'm in the process of writing a DriverKit USBHostInterface driver, and while I'm finally starting to get there, I've run into a bit of a frustration with logging. Naturally I have a liberal number of os_log calls that I'm using to troubleshoot my driver. However, I've noticed that they don't show up until after the first time my driver has loaded. For example, suppose I make a new build of my driver and its bundled user-mode app, install the bundle to /Applications, run the installer, verify it took with systemextensionsctl list, fire up Console, start streaming log entries, then plug in my device. I can see the log entries showing that my driver loaded, etc., then a bunch of kernel -> log entries, but none of my Start method's log entries. If I unplug my device and plug it in again, my log entries show up as expected. Why is this and, more importantly, how can I fix it? I'd like to see those log entries the first time the driver loads, if I could.
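In case it helps, here is a sketch of a workaround I'm considering (an assumption on my part, not a confirmed fix): query the persisted log store after the fact, on the theory that the first-load entries are written to the store but missed by the live stream. The sender name below is a placeholder, and reading the system scope may require elevated privileges.

import OSLog
import Foundation

// Fetches entries persisted over the last five minutes, even if no
// streaming session was capturing them live (macOS 12+).
func dumpRecentDriverLogs() throws {
    let store = try OSLogStore(scope: .system)
    let fiveMinutesAgo = store.position(timeIntervalSinceEnd: -300)
    let entries = try store.getEntries(
        at: fiveMinutesAgo,
        matching: NSPredicate(format: "sender == %@", "com.example.MyDriver")) // placeholder sender
    for entry in entries {
        print(entry.date, entry.composedMessage)
    }
}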
0
0
28
2d
iOS AirPrint sends print-quality=high when file-type is photo even if user selects “normal”
Hi everyone, I observed a behavior with AirPrint from an iPhone and wanted to confirm whether this is expected behavior from iOS.

Scenario tested: File type: Photo. Print quality selected by the user: Normal.

Observation (from packet capture): In the PCAP of the request sent from the iPhone, the print-quality attribute is always sent as high, even though the user selected Normal in the UI.

Question: Is it expected behavior in iOS/AirPrint that photos are always sent with print-quality=high regardless of the user-selected print quality, or could this be a bug?
0
0
23
6d
UI Testing and 'Allow Paste'
I am developing an app that lets the user ask it to process the clipboard contents and do something with them. While developing an XCUITest, I find the app stops while it waits for the user to give permission, which breaks the automation. I tried:

let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
let allowButton = springboard.buttons["Allow Paste"]

But that does not work. Is there a way to tell the framework to automatically grant the test permission to access the pasteboard (or to let me write tests that grant this)?
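For anyone else hitting this, here is a sketch of what I plan to try next. It is not a confirmed fix: the system paste prompt isn't always reachable from UI tests, and the "Allow Paste" label is an assumption that may vary by locale and OS version.

import XCTest

final class PastePermissionTests: XCTestCase {
    func testProcessClipboard() {
        let app = XCUIApplication()
        app.launch()

        // Register before triggering the paste, in case the prompt is
        // delivered as an interruption.
        _ = addUIInterruptionMonitor(withDescription: "Pasteboard permission") { alert in
            let button = alert.buttons["Allow Paste"] // label is an assumption
            guard button.exists else { return false }
            button.tap()
            return true
        }

        // ... trigger the action that reads the clipboard here ...

        // Fallback: query SpringBoard's alert directly and wait for it.
        let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
        let allow = springboard.alerts.buttons["Allow Paste"]
        if allow.waitForExistence(timeout: 5) {
            allow.tap()
        }
        app.tap() // nudge the app so the interruption monitor gets a chance to run
    }
}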
0
0
193
Nov ’25
Siri media search unable to provide keyword
Hi, I am developing a music app. We have been using the Siri media search functionality for a while. We recently hit a case where Siri would not provide a keyword for a search. When the user says "Play kids' songs" (in Turkish, "çocuk şarkıları çal"), debugging shows that mediaSearch.mediaName is nil. When the user says "Play kids" (in Turkish, "çocuklar çal"), a keyword is given and we can search for and play a related song. Normally I would think that Siri is somehow censoring the word "kid", but when I try the same voice search in Spotify, I get children's-song search results. I've read the documentation and searched the web but couldn't find any similar experience. What could be the cause? Is there an extra setting for this kind of behavior, or a different capability, that lets Spotify get a keyword out of this voice search when we can't?
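One theory we're testing (an assumption on our side, not confirmed behavior) is that Siri classifies "kids' songs" as a genre or category rather than a title, leaving mediaName nil while other INMediaSearch fields are populated. A sketch of checking every field before treating the search as empty:

import Intents

// Collects whatever Siri did extract from the utterance, not just the title.
func searchTerms(from search: INMediaSearch) -> [String] {
    var terms: [String] = []
    if let name = search.mediaName { terms.append(name) }
    if let genres = search.genreNames { terms.append(contentsOf: genres) }
    if let artist = search.artistName { terms.append(artist) }
    if let album = search.albumName { terms.append(album) }
    if let moods = search.moodNames { terms.append(contentsOf: moods) }
    return terms
}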
0
0
341
Nov ’25
CarPlay Simulator: How to Change Dock Position for Right-Hand Drive (RHD)
Hello everyone, I'm developing a CarPlay app and am trying to test it with the dock on the right side of the screen, as is standard for right-hand drive vehicles like those in Japan. Currently, the CarPlay Simulator always displays the dock on the left, and I can't find an option to change its position. This is important for ensuring a proper user experience for my target market. Has anyone figured out how to configure the simulator for RHD layouts? Any guidance on how to move the dock to the right would be greatly appreciated. Thanks in advance for your help!
0
0
138
Sep ’25
Improving And Scaling App Intent Support
Platform and Version: iOS. Development environment: Xcode 16.2.0, macOS 15.3. Run-time configuration: iOS 18.3, 17.x.

Description of problem: We have started migrating some of the app's core functionality over to App Intents. Our first release of App Intent support focused on two settings a user can modify on their Bose products, Audio Modes and Immersive Audio, giving users the ability to modify these settings via Siri and shortcuts. The implementation uses a separate shortcut for each setting type, with each shortcut supporting a single phrase for Siri: "Change my Bose mode to " and "Change my Bose immersive audio to ". Each shortcut uses its own App Intent, and each App Intent supports optionally providing both a product and a setting when performing the intent. Failing to provide a device, which happens when the intent is performed via Siri, simply auto-selects a currently connected Bose product. Failing to provide a setting, as when a user says "Change my Bose " without naming one, simply has Siri confirm the setting the user wants before changing it.

We use AppEntity to identify a Bose product in both App Intents. Because the App Intent for the Audio Modes setting has a larger number of supported values (up to 15), we also use AppEntity to identify those settings. We use AppEnum to identify the available settings for the Immersive Audio App Intent, since only 3 static values are supported.

Our original implementation supported quite a few phrases per shortcut. We had explicit support for direct synonyms of the verb "Change", such as "Switch" and "Set", as well as for words that are similar but not directly related, like "Toggle". We also supported phrases with and without the setting in each phrase. However, early on we had a lot of trouble with phrase detection. Siri had a hard time identifying which shortcut was being requested, as well as which setting the user was providing for the setting parameter of each App Intent. While researching fixes, we found a response in the Apple forums (https://developer.apple.com/forums/thread/759909) indicating that Siri phrase recognition is very much an aggregate process: the total number of phrases, compounded by the available settings for each phrase, increases the total number of utterances Siri must learn to recognize for each shortcut. So, to improve detection, we added logic to limit the Audio Mode settings offered to those the user had actually set up on their Bose products and, more importantly, we limited each shortcut to a single explicit phrase. In our testing this not only improved phrase recognition; synonyms like "Set" or "Switch" still seemed to be recognized implicitly.

These issues have us a bit concerned about scaling App Intent support to other settings and features in the future. Our app can modify a large number of settings on Bose products, with support constantly expanding as new products are released. Our roadmap for App Intent support was initially very ambitious, covering much more than the two settings above, but our initial experience has us tempering expectations about how much can be supported in total.

One thing we also noticed is less-than-optimal display of default shortcuts in the Shortcuts app. The default shortcuts appeared grouped by the available settings for each shortcut, but we could not find a way to indicate to users that one section pertained to the Audio Mode setting and the other to Immersive Audio. The only information users have to make that determination is the available settings (or shortcuts) in each group, which may not be clear to a new customer using one of our products for the first time. This makes us wonder whether our implementation is what is intended for Shortcuts app support. Surveying other third-party apps, the default shortcuts we saw mostly dealt with navigation, with a single section of options clearly indicating where each shortcut navigates. We couldn't find an app offering changes to different setting types, each with its own available values.

To summarize our questions concerning App Intent support:
1. What can we do with our App Intents and Shortcuts implementation to guarantee optimal performance with Siri? What is an ideal number of phrases per shortcut, and what limits should we place on the total number of available settings for each?
2. Are there phrases that might work better than others for what we're trying to achieve? For instance, is "Change my Bose mode" or "Change my Bose immersive audio" a good phrase for this kind of functionality, or should we use different verbs or wording?
3. Assuming optimal support for each shortcut, what is a reasonable expectation for how many distinct shortcuts we can scale to at the same time? One issue we ran into early on was Siri confusing one shortcut with the other and triggering the wrong App Intent. While this was ultimately resolved, that outcome seems more likely the more individual shortcuts we support.
4. Are there recommendations on how to present these App Intents as default shortcuts in the Shortcuts app? Is what we currently display what was intended for third-party App Intent support? If so, would it be possible to provide additional context for each section of default shortcuts, along the lines of the section headers some first-party apps use, so we can indicate that one set pertains to Audio Modes and the other to Immersive Audio?
5. Are there any recommendations or tips for supporting App Intents, particularly Siri phrases, in other languages?
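For reference, a minimal sketch of the single-phrase-per-shortcut pattern we landed on. All type names and values here are illustrative placeholders, not our actual implementation.

import AppIntents

// Placeholder enum standing in for a setting with a few static values.
enum AudioMode: String, AppEnum {
    case quiet, aware, immersive
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Audio Mode")
    static var caseDisplayRepresentations: [AudioMode: DisplayRepresentation] = [
        .quiet: "Quiet", .aware: "Aware", .immersive: "Immersive"
    ]
}

struct ChangeModeIntent: AppIntent {
    static var title: LocalizedStringResource = "Change Mode"

    // Required parameter: if Siri does not hear a value, it prompts for one,
    // which matches the confirmation behavior described above.
    @Parameter(title: "Mode")
    var mode: AudioMode

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // ... apply `mode` to the currently connected product here ...
        return .result(dialog: "Mode set to \(mode.rawValue).")
    }
}

struct ProductShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // One explicit phrase per shortcut; synonyms like "Set" or "Switch"
        // still seemed to be recognized implicitly in our testing.
        AppShortcut(
            intent: ChangeModeIntent(),
            phrases: ["Change my \(.applicationName) mode to \(\.$mode)"],
            shortTitle: "Change Mode",
            systemImageName: "slider.horizontal.3"
        )
    }
}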
0
0
195
Apr ’25
API to Programmatically Establish SCO Connection for HFP Accessories in iOS
When an iOS device is connected to a Bluetooth accessory that uses the Hands-Free Profile (HFP), we are encountering incorrect audio routing behavior specifically for system notification tones.

Accessory connected: The iOS device is successfully connected to a Bluetooth accessory (specifically, a WM500 device) using the HFP profile for voice communication.
Voice audio: Audio streams related to phone calls or voice communication (using the HFP/SCO link) are correctly routed to the WM500.
Notification tones issue: System notification tones, which are played using the tonetype.systemsounds API, are not routed to the connected HFP accessory (WM500). Instead, they are incorrectly played through the iOS device's built-in speaker.

The accessory team has suggested establishing an SCO connection to route the tones through the WM500, but iOS does not provide a public API (like Android's startBluetoothSco) to explicitly force the establishment of an SCO connection for notification tones. Is there any other approach to establishing an SCO connection in iOS so notification tones are routed through the WM500?
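The closest approach we've found so far (a sketch under an assumption, not a confirmed solution) is to configure and activate an AVAudioSession that permits Bluetooth HFP routing, and then play the tone ourselves through that session. System-generated tones may still bypass the app's session.

import AVFoundation

// Configures the shared session so that playback/recording audio is
// eligible for the HFP (SCO) route to the accessory.
func routeAudioToHFPAccessory() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.allowBluetooth])
    try session.setActive(true)
}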
0
0
129
Dec ’25
SetFocusFilterIntent app cannot be copied to another Mac
I recently added a SetFocusFilterIntent extension target to my app, a system utility that lives in the menu bar (Application is agent = YES). I followed the approach in the WWDC22 video introducing Focus filter intents, and I created an App Group so the extension can communicate with my main app. However, since I did this, I sometimes get this log line when I run the app:

Couldn't read values in CFPrefsPlistSource<0x97cd34700> (Domain: group.xxx.xxx.MyApp, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd

Despite this, the Focus mode integration works correctly on my development Mac. However, I used to archive the app and then copy it to my MacBook, and now when I do that, my other Mac cannot open the app and shows an error. If I revert this change, I can bring the app to my other Mac as usual with the procedure: Product -> Archive, then from the Organizer: Distribute App -> Copy App, then copy the generated app to the Applications folder of the other MacBook. Now it doesn't open anymore. During the archive phase I even get this warning:

MyAppFocus.appex is an ExtensionKit extension and must be embedded in the parent app bundle's Extensions directory, but is embedded in the parent app bundle's ../../../BuildProductsPath/Release/MyApp.app/Contents/Extensions directory.

How can I solve this issue? If I roll back the commit with the SetFocusFilterIntent feature, the app can be copied and moved to the other Mac as before. Is this related to the extension, or to the fact that I had to use the new com.apple.security.application-groups entitlement?
0
1
223
Dec ’25
KDK for current stable version (26.1) missing
The current stable macOS version, 26.1 (build 25B78), is missing a corresponding Kernel Debug Kit (KDK) on the developer downloads page. This means I can't do any kernel-level development tasks at the moment. For example, if I try to build a new kernel collection with kmutil, I get the message:

Missing Developer Kit: As of macOS 13.0, you will need to install a KDK matching your build 25B78 to rebuild kernel collections.

But there is no build 25B78 KDK available to download. The latest 26.1 KDK on the download page is 25B5062e (from a beta, I believe), and the final stable KDK for build 25B78 (which the kernel development tools require) was never published. Is there any workaround that allows kernel-level development targeting the latest stable release, or a timeline for when the KDK will be released? Thanks!
0
3
324
Nov ’25
False delete alarm when renaming a file
I use the code below to rename a file. It works OK, but then the system calls accommodatePresentedItemDeletion(completionHandler:) on an NSFilePresenter that presents the file, immediately after the call to presentedItemDidMove(to:). What am I doing wrong?

let coordinator = NSFileCoordinator()
coordinator.coordinate(writingItemAt: oldURL, options: .forMoving,
                       writingItemAt: newURL, options: [],
                       error: &error) { actualURL1, actualURL2 in
    do {
        coordinator.item(at: actualURL1, willMoveTo: actualURL2)
        try FileManager().moveItem(at: actualURL1, to: actualURL2)
        coordinator.item(at: actualURL1, didMoveTo: actualURL2)
    } catch {...}
}
0
0
120
Nov ’25
App Intents: String array parameter value clears immediately in Shortcuts editor
Hello, I am experiencing an issue with the App Intents framework where a parameter of type [String] (String Array) fails to persist user input in the Shortcuts app action editor.

Issue Description: When adding an item to the String Array parameter in the Shortcuts app action editor, the input text automatically clears/resets to empty within less than 1 second. This happens spontaneously while the keyboard is still active, or immediately after typing, making it impossible to input any values.

Environment:
Xcode Version: 26.2 (17C52)
iOS Version: 26.2.1
Device: iPhone 17

Code Snippet:

import AppIntents
import SwiftUI

struct TestStringArrayIntent: AppIntent {
    static var title: LocalizedStringResource = "Test Array Input Bug"
    static var description: IntentDescription = "Reproduces the issue where String Array input clears automatically."

    // PROBLEM: Input for this parameter vanishes automatically < 1s after typing.
    @Parameter(title: "Test Strings", default: [])
    var strings: [String]

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        return .result(value: "Count: \(strings.count)")
    }
}

Steps to Reproduce:
1. Build and install the app containing the code above.
2. Open the Shortcuts app and create a new shortcut.
3. Add the "Test Array Input Bug" action.
4. Tap the "Test Strings" parameter to add a new item.
5. Type any text (e.g., "Hi").
6. Wait for about 1 second.

Observed Behavior: The text field clears itself automatically. The array remains empty ([]).

Expected Behavior: The text should remain in the field and be successfully added to the array.

Filed as Feedback: FB21808619

Thank you.
0
0
159
Jan ’26
Shortcuts: How to add text to a file name
Hi there, Does anyone know how to modify this Image compressor Shortcut https://www.icloud.com/shortcuts/e13d8013598f4f33830386a956a163dd so that the image it creates has the original file name + “-pressed”? Eg “Image_123” becomes “Image_123-pressed” I know of the action ‘Rename file’ but can’t make it work. The shortcut does batch processing of images if that makes any difference. Any help much appreciated:)
0
0
242
Jan ’26
How to determine TX region when using Declared Age Range (SB2420 compliance)
Hello, I’m working on implementing SB2420 compliance using the Declared Age Range framework. While referring to the documentation at https://developer.apple.com/documentation/declaredagerange, I couldn’t find details on how the TX region (transaction region or territory) is determined when using Declared Age Range. Specifically, I’d like to confirm the following points: How does the system determine the TX region when the user’s declared age range is retrieved? Is it based on the App Store region, the device locale, or the user’s Apple ID country? If the app’s backend needs to verify or log the TX region, is there a way to obtain or infer it from the API response or receipt data? Is there any difference in TX region determination between Sandbox and Production environments? If anyone has experience implementing Declared Age Range (SB2420) and handling region determination, I’d appreciate your insights or best practices. Thank you.
0
5
156
Nov ’25
Non–App Clip NFC URLs show CPSErrorDomain error 2 after creating 50+ Advanced App Clips
We’re seeing unexpected NFC behavior once our app has 50+ Advanced App Clip experiences created.

Expected: Scanning an NFC tag with a URL that is NOT an App Clip invocation should show the standard “Open in Safari” notification.

Actual: After we create ~50+ Advanced App Clips, scanning NFC tags for URLs on the same domain that are not associated with App Clips consistently shows “CPSErrorDomain error 2” instead of the Safari prompt. QR codes for the same non–App Clip URLs work as expected (they show the Safari prompt). Clearing the App Clips “Experience Cache” sometimes helps briefly, but the error returns on subsequent scans.

Notes: The domain has a valid AASA, and App Clip invocation URLs work as expected. The issue appears tied to the number of Advanced App Clips configured: below ~50, non–App Clip NFC scans behave correctly; above that, they fail. It reproduces across the multiple devices and iOS versions we tested.

Repro steps:
1. Configure 50+ Advanced App Clips for paths on a single domain.
2. Encode a different URL on the same domain that is NOT listed as an App Clip invocation into an NFC tag.
3. Scan the NFC tag on an iPhone.
4. Observe “CPSErrorDomain error 2” instead of the “Open in Safari” notification.

Impact: This blocks our NFC use case for regular web links once we scale App Clip experiences.

Sysdiagnose: FB20563121
0
0
108
Nov ’25
Wrong AppIntents in Shortcuts app with iOS 26
I, and many of my users, have observed that the Shortcuts app seems to confuse and swap actions between different apps from the same developer. This happens after updating to iOS 26 and breaks many shortcuts. Deleting one of the apps or re-adding the actions sometimes seems to help. Has anybody else observed this problem, or does anyone know how to handle it?
0
0
116
Sep ’25
Screen time API on parent/child devices
I’m creating an app with the Screen Time API, and I would like to know how to show a parental-control editing view on the parent's device and a view on the child's account that shows which apps are blocked. How can I do this?
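A minimal sketch of one direction, assuming the FamilyControls framework; how the app decides which role it is running as (the isParentDevice flag below) is a placeholder assumption.

import SwiftUI
import FamilyControls

struct ParentalControlsView: View {
    @State private var selection = FamilyActivitySelection()
    let isParentDevice: Bool

    var body: some View {
        Group {
            if isParentDevice {
                // Parents pick which apps are restricted.
                FamilyActivityPicker(selection: $selection)
            } else {
                // On the child account, summarize what is currently blocked.
                Text("\(selection.applicationTokens.count) apps are blocked")
            }
        }
        .task {
            do {
                // Request .individual on the parent's device and .child on
                // the child's device.
                try await AuthorizationCenter.shared
                    .requestAuthorization(for: isParentDevice ? .individual : .child)
            } catch {
                print("Screen Time authorization failed: \(error)")
            }
        }
    }
}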
0
0
118
Jun ’25
Kext user consent cannot be disabled on Apple Silicon?
Hi all, I would like to know whether kext consent can still be disabled on Apple Silicon Macs. I tried spctl kext-consent disable in recoveryOS, but after rebooting, spctl kext-consent status still returns ENABLED. Has this command been disabled, or am I missing something?
0
0
104
May ’25
TestFlight
How can I get a TestFlight invitation code?
0
0
83
Jun ’25
The system does not return peripheralIsReadyToSendWriteWithoutResponse for a long time.
macOS/iOS acts as a BLE client. After successfully establishing a BLE connection, it sends large amounts of data to the peer device. After sending data for a period of time, the system does not call peripheralIsReadyToSendWriteWithoutResponse for a long time, causing the data transmission to stall.
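For context, a sketch of the backpressure pattern in use (the queue type and names here are illustrative): writes are sent only while canSendWriteWithoutResponse is true and resume from the delegate callback, so if the callback never arrives, nothing can restart the drain; that is the stall being observed.

import CoreBluetooth

// Drains queued chunks while the stack reports capacity, and resumes when
// peripheralIsReady(toSendWriteWithoutResponse:) fires.
final class WriteQueue: NSObject, CBPeripheralDelegate {
    private var pending: [Data] = []
    private let peripheral: CBPeripheral
    private let characteristic: CBCharacteristic

    init(peripheral: CBPeripheral, characteristic: CBCharacteristic) {
        self.peripheral = peripheral
        self.characteristic = characteristic
        super.init()
        peripheral.delegate = self
    }

    func enqueue(_ chunk: Data) {
        pending.append(chunk)
        drain()
    }

    private func drain() {
        while peripheral.canSendWriteWithoutResponse, !pending.isEmpty {
            peripheral.writeValue(pending.removeFirst(),
                                  for: characteristic,
                                  type: .withoutResponse)
        }
    }

    func peripheralIsReady(toSendWriteWithoutResponse peripheral: CBPeripheral) {
        drain() // if this is never called, the queue stalls here
    }
}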
0
0
50
Oct ’25