Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.

Posts under General subtopic

Post

Replies

Boosts

Views

Activity

iMessage functionality
Hey guys! I've recently noticed a number of PaaS and CPaaS providers offering bulk outgoing messaging over iMessage the same way it's done with SMS. I always thought iMessage only allowed businesses to send outgoing messages after the user had contacted their account first (to avoid them being spammed). But then there are those providers I mentioned above. Have you come across anything like this? Did Apple change the model so that businesses can now initiate conversations with users? If so, how does it work?
0
0
83
Oct ’25
Live Caller ID Lookup - CipherMLError.missingSecretKey error
I'm trying to implement a Live Caller ID Lookup PIR server in Python and I have an issue related to the evaluation key config. I don't receive the POST /key request even when I install the extension on a new device, and I get this error in the device system logs:

```
error 11:21:30.663022+0200 ciphermld requestData(byKeywords:shardIds:clientConfig:) threw an error: CipherML.CipherMLError.missingSecretKey
```

I think the evaluation key is not generated because of this error, but I'm not sure. It might also be related to the HE params: I tried with the same params as in the Swift server example with plaintext_modulus=17 and it works, but with plaintext_modulus=65537 on the same device the system doesn't send me the evaluation key. Is there a limitation that restricts evaluation key generation for some HE params? Here is what the entire config object that I retrieve looks like:

```json
{
  "configs": {
    "Live-Caller-ID-Lookup.TestLiveCallerID.identity": {
      "pir_config": {
        "encryption_parameters": {
          "polynomial_degree": "4096",
          "plaintext_modulus": "65537",
          "coefficient_moduli": ["134176769", "268369921", "268361729"],
          "security_level": "SECURITY_LEVEL_QUANTUM128",
          "he_scheme": "HE_SCHEME_BFV"
        },
        "shard_configs": [
          {
            "num_entries": "2",
            "entry_size": "55991",
            "dimensions": ["2", "1"],
            "shard_id": ""
          }
        ],
        "keyword_pir_params": {
          "num_hash_functions": "2",
          "sharding_function": { "sha256": {} }
        },
        "algorithm": "PIR_ALGORITHM_MUL_PIR",
        "batch_size": "2",
        "evaluation_key_config_hash": ""
      },
      "config_id": ""
    }
  },
  "key_info": [
    {
      "timestamp": "1738660849",
      "key_config": {
        "encryption_parameters": {
          "polynomial_degree": "4096",
          "plaintext_modulus": "65537",
          "coefficient_moduli": ["134176769", "268369921", "268361729"],
          "security_level": "SECURITY_LEVEL_QUANTUM128",
          "he_scheme": "HE_SCHEME_BFV"
        },
        "galois_elements": [2049, 4097],
        "has_relin_key": true
      }
    }
  ]
}
```

PS: The evaluation key data is just a placeholder, but it should be skipped anyway because of the expired timestamp. More logs:

```
default 11:21:30.535865+0200 ciphermld Running rotation task for ["Live-Caller-ID-Lookup.TestLiveCallerID.identity"]
info 11:21:30.535953+0200 ciphermld Skipping groups that manage their own networking: <private>
default 11:21:30.537007+0200 ciphermld Request to fetchConfigs has started for useCases '["Live-Caller-ID-Lookup.TestLiveCallerID.identity"]', userId: '<private>', existingConfigIds: '["id"]'
default 11:21:30.542174+0200 ciphermld Request to queries-batch has started for userId: '<private>', length: 28350
default 11:21:30.655914+0200 ciphermld Request to fetchConfigs has finished, response length: 230
default 11:21:30.656182+0200 ciphermld Received configurations: 1 usecase(s), 1 key(s) for group 'Live-Caller-ID-Lookup.TestLiveCallerID.identity'
debug 11:21:30.660868+0200 ciphermld Skipping non-active key: timestamp: 1738660849 key_config { encryption_parameters { polynomial_degree: 4096 plaintext_modulus: 65537 coefficient_moduli: [134176769, 268369921, 268361729] security_level: Quantum128 he_scheme: BFV } galois_elements: [2049, 4097] has_relin_key: true }
error 11:21:30.662982+0200 ciphermld No key for use-case 'Live-Caller-ID-Lookup.TestLiveCallerID.identity'
error 11:21:30.663022+0200 ciphermld requestData(byKeywords:shardIds:clientConfig:) threw an error: CipherML.CipherMLError.missingSecretKey
default 11:21:30.663824+0200 com.apple.CallKit.CallDirectory <private> XPC request complete, results(0) error:Error Domain=CipherML.CipherMLError Code=32 "missing secret key" UserInfo={NSLocalizedDescription=missing secret key}
default 11:21:30.972372+0200 ciphermld Request to queries-batch has finished response, length: 0
default 11:21:30.974711+0200 com.apple.CallKit.CallDirectory <private> XPC request complete, results(1) error:(null)
default 11:21:36.161964+0200 com.apple.CallKit.CallDirectory <private> Sending XPC request
default 11:21:36.163149+0200 com.apple.CallKit.CallDirectory <private> Sending XPC request
default 11:21:36.169931+0200 ciphermld requestData(byKeywords:shardIds:clientConfig:) method was called
default 11:21:36.170448+0200 ciphermld requestData(byKeywords:shardIds:clientConfig:) method was called
default 11:21:36.174001+0200 ciphermld Cached: 0 / Missing: 1
error 11:21:36.174997+0200 ciphermld No userId or secretKey for use-case 'Live-Caller-ID-Lookup.TestLiveCallerID.identity'. Running rotation task
default 11:21:36.175075+0200 ciphermld Running rotation task for ["Live-Caller-ID-Lookup.TestLiveCallerID.identity"]
info 11:21:36.175240+0200 ciphermld Skipping groups that manage their own networking: <private>
default 11:21:36.177700+0200 ciphermld Request to fetchConfigs has started for useCases '["Live-Caller-ID-Lookup.TestLiveCallerID.identity"]', userId: '<private>', existingConfigIds: '["id"]'
default 11:21:36.179914+0200 ciphermld Request to queries-batch has started for userId: '<private>', length: 28350
default 11:21:36.336051+0200 ciphermld Request to fetchConfigs has finished, response length: 230
default 11:21:36.336308+0200 ciphermld Received configurations: 1 usecase(s), 1 key(s) for group 'Live-Caller-ID-Lookup.TestLiveCallerID.identity'
debug 11:21:36.341522+0200 ciphermld Skipping non-active key: timestamp: 1738660849 key_config { encryption_parameters { polynomial_degree: 4096 plaintext_modulus: 65537 coefficient_moduli: [134176769, 268369921, 268361729] security_level: Quantum128 he_scheme: BFV } galois_elements: [2049, 4097] has_relin_key: true }
error 11:21:36.356497+0200 ciphermld No key for use-case 'Live-Caller-ID-Lookup.TestLiveCallerID.identity'
error 11:21:36.356669+0200 ciphermld requestData(byKeywords:shardIds:clientConfig:) threw an error: CipherML.CipherMLError.missingSecretKey
default 11:21:36.357075+0200 com.apple.CallKit.CallDirectory <private> XPC request complete, results(0) error:Error Domain=CipherML.CipherMLError Code=32 "missing secret key" UserInfo={NSLocalizedDescription=missing secret key}
default 11:21:36.625701+0200 ciphermld Request to queries-batch has finished response, length: 0
default 11:21:36.626749+0200 com.apple.CallKit.CallDirectory
```
0
0
231
Feb ’25
How to install macOS Tahoe 26 on an external drive
In the past I was always able to install every major macOS version on an external drive so that I can test my apps. But now I'm unable to install macOS Tahoe 26 on an external drive. Actually, as far as I'm aware, there aren't even official links to macOS 26 installers, only instructions on how to update to macOS 26 from an existing macOS installation. So I thought I'd install macOS 15 on a separate drive and then update to macOS 26, but whenever I run the macOS 15 installer, tell it to install on the external drive, and reboot after the setup process completes, my MacBook just boots into my main macOS partition as if nothing happened. Three months ago I somehow managed to install macOS Tahoe beta 1 on an external drive; I don't remember how (but I don't think it was anything crazy). Booting into that beta 1 partition and trying to update doesn't work either, as my MacBook again boots into my main macOS partition. I already asked for help with the update problem one month ago here, but nobody replied. Could someone at Apple please provide instructions on how one is supposed to install macOS 26 on an external drive (if possible before it becomes available to the public)? Are we supposed to buy a separate Mac for every macOS version that we want to test our apps on?
0
1
179
Sep ’25
How do I launch the Outlook Mail app from my development app?
I want to launch the mail application from my development app, but the Apple Mail app launches instead. It doesn't work even if I change the default email application to Outlook. Only when I delete the Apple Mail app does Outlook launch. I also want to automatically attach a generated PDF file to the email, but that doesn't work either. Please tell me how to code this.
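The post itself doesn't include code; as one illustrative approach (a sketch, not necessarily what the poster needs), presenting the system share sheet for the generated PDF lets the user hand it to Outlook, or any other installed mail client, and handles the attachment:

```swift
import UIKit

// Minimal sketch: share a generated PDF via the system share sheet so the user
// can pick Outlook (or another mail app) as the destination.
// `pdfURL` and `viewController` are assumed to be supplied by the caller.
func sharePDF(at pdfURL: URL, from viewController: UIViewController) {
    let activityVC = UIActivityViewController(activityItems: [pdfURL],
                                              applicationActivities: nil)
    viewController.present(activityVC, animated: true)
}
```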
0
0
101
Sep ’25
Testing iMessage extension from recipient POV
Hello, I am building an iMessage extension for my app and I am struggling to figure out how to test it. The extension allows users to send their friends an interactive widget and the recipient experience is very important to test. I tried to do it in the simulators, but simulators do not support iMessage. I have got a second iPhone and created a sandbox account, but I cannot install TestFlight with the sandbox account, as this feature is not supported. Reddit, Stackoverflow, ChatGPT and Apple Developer support also did not help. Can someone share their experience with testing recipient experience in the iMessage extension?
0
0
86
Aug ’25
How can I stop CPNavigationSession properly on CarPlay disconnect
Hi there, I'm facing an issue when disconnecting CarPlay: the navigation session seems to be left in some weird state where it is not properly finished. So when I reconnect CarPlay, the "Metadata in instrument cluster or HUD" does not update anymore until I start another navigation session and stop that one. You can see that the instruction on the left in this screen recording is not updating anymore after a reconnect. https://www.youtube.com/watch?v=sncxyJULjQk I have modified the CoastalRoads sample app to add support for the HUD cluster and to auto-start a navigation simulation when CarPlay connects. https://github.com/g4rb4g3/CoastalRoads Can anyone tell me what I have to do when CarPlay disconnects so I can start a new navigation session on reconnect that has a working HUD cluster? Fun fact: Apple Maps handles this quite nicely (https://www.youtube.com/watch?v=OpJEIyGcwdo); it somehow manages to finish the navigation session and brings up the HUD cluster just fine on reconnect. I wonder how I can achieve the same, does anyone have an idea?
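For reference, here's a minimal sketch of the kind of teardown being asked about, assuming the session is held by the CarPlay scene delegate; whether ending the session on disconnect is by itself enough to restore the cluster metadata is exactly the open question:

```swift
import CarPlay

final class CarPlaySceneDelegate: NSObject, CPTemplateApplicationSceneDelegate {
    var interfaceController: CPInterfaceController?
    var navigationSession: CPNavigationSession?   // set when guidance starts

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController
        // Set up the CPMapTemplate and (re)start a trip here on each connect.
    }

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) {
        // Explicitly end the active session so the next connect starts clean.
        navigationSession?.cancelTrip()
        navigationSession = nil
        self.interfaceController = nil
    }
}
```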
0
0
258
Feb ’25
ASAM supported App on MacOS
I'd like to write a macOS app that makes use of the ASAM functionality as described here: https://developer.apple.com/documentation/devicemanagement/autonomoussingleappmode I have tried the example with Safari: I enrolled a Mac with MDM and installed the profile, but when opening Safari it does not come up in Single App Mode. I've only tried it with Safari so far, but eventually I want to be able to use my own app. Is there an API that has to be used to enter single app mode programmatically? I've found the Assessment API, but as I do not have the required entitlements to use that API, I'm looking for another solution. The documentation on ASAM does not mention the Assessment API at all; is it the only way to enter "a" single app mode on macOS? How is the Assessment API linked to ASAM? As far as I have understood, there's the com.apple.developer.automatic-assessment-configuration entitlement, but apps having it do not need to be configured via MDM? I'm really confused as to what's actually required to get into single app mode on macOS. The app I'm trying to write isn't really related to an assessment task, but I am doing this for an academic institution, so maybe requesting the entitlement would be feasible. The documentation on ASAM also mentions that the app is granted access to the "Accessibility" API, and I've found UIAccessibility/requestGuidedAccessSession, but this does not seem to be available on macOS proper? Any help on this would be greatly appreciated.
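For what it's worth, entering a restricted session via the Assessment API looks roughly like this (a minimal sketch; it assumes the com.apple.developer.automatic-assessment-configuration entitlement mentioned above and doesn't answer how ASAM relates to it):

```swift
import AutomaticAssessmentConfiguration

final class AssessmentController: NSObject, AEAssessmentSessionDelegate {
    private var session: AEAssessmentSession?

    func enterRestrictedMode() {
        let configuration = AEAssessmentConfiguration()
        let session = AEAssessmentSession(configuration: configuration)
        session.delegate = self
        self.session = session
        session.begin()   // fails without the assessment entitlement
    }

    func exitRestrictedMode() {
        session?.end()
    }

    // MARK: - AEAssessmentSessionDelegate

    func assessmentSessionDidBegin(_ session: AEAssessmentSession) {
        // The app is now in its restricted, single-app-style state.
    }

    func assessmentSession(_ session: AEAssessmentSession, failedToBeginWithError error: Error) {
        // Typically indicates a missing entitlement or a denied session.
    }

    func assessmentSessionDidEnd(_ session: AEAssessmentSession) {
        self.session = nil
    }
}
```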
0
0
82
Nov ’25
Any way to adjust the speechRecognitionMetadata pause duration?
Speech framework: I've been checking SFSpeechRecognitionMetadata to determine the end of a sentence when using voice recognition. Yet it doesn't detect small pauses, only large ones, so I've transcribed basically an entire paragraph before moving on to the next one. Besides implementing your own timer, are there any other ways to detect more natural pauses at the end of sentences, similar to the browser's Web Speech recognition? Since it works in Safari, I assume there should be some equivalent feature in macOS.
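The "own timer" workaround mentioned above usually looks something like this: restart a short timer on every partial result from the recognition task and treat its expiry as a sentence boundary (a sketch; the 0.8-second threshold is an arbitrary assumption):

```swift
import Speech

// Minimal sketch of a silence-based sentence segmenter. Feed it every partial
// result from an SFSpeechRecognitionTask's result handler (on the main thread).
final class SentenceSegmenter {
    private var silenceTimer: Timer?
    private var lastText = ""
    var onSentenceEnd: ((String) -> Void)?

    func handlePartialResult(_ result: SFSpeechRecognitionResult) {
        lastText = result.bestTranscription.formattedString
        silenceTimer?.invalidate()
        // No new partial result for ~0.8 s => treat it as the end of a sentence.
        silenceTimer = Timer.scheduledTimer(withTimeInterval: 0.8, repeats: false) { [weak self] _ in
            guard let self, !self.lastText.isEmpty else { return }
            self.onSentenceEnd?(self.lastText)
            self.lastText = ""
        }
    }
}
```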
0
0
343
Dec ’24
App Clip works in TestFlight but not elsewhere
My app is available in TestFlight but has been rejected in App Review with the review feedback that the app clip "just shows a blank screen". However, in the TestFlight app, the App Clip works as expected and brings up the clip. It also works correctly from Xcode testing. Any ideas on what the problem could be? It is using the default App Clip link (appclip.apple.com)
0
0
326
Jan ’25
Embedding an automation command line tool into an App Store app
I am developing a macOS word-processing app that should be distributed via the Apple App Store. Some of the app's functions, like generating HTML and PDF exports, should be automatable via Shortcuts and via shell scripts. To support the latter, I plan to include a command line tool inside the app that can be called from the Terminal or a shell script. The tool should be able to instruct the main app to then perform the desired commands. A well-known App Store app that uses this design is BBEdit, which also contains multiple command line tools that offer functionality from the main app to users of the Terminal. My technical questions now are: Should the command line tool executable be sandboxed and if yes, how? Even after many trials, I have not found a way to make a working sandboxed command line tool. If a sandboxed tool is started from the Terminal, it is immediately terminated with an exception in _libsecinit_appsandbox.cold.12. I am aware of the Apple developer documentation article Embedding A Helper Tool In A Sandboxed App, but it addresses a different architecture in which the helper tool is started from the main app and is therefore able to inherit its sandbox. BBEdit only sandboxes the main app, not its embedded command line tools, and is still allowed in the App Store. Is this the way to go for me as well, or does BBEdit get some special treatment in the App Store? How can the command line tool pass the permission to access files to the main app? As my main app is sandboxed, it needs explicit permission from the user to be able to access files. Users of a command line tool give this permission by providing file paths as arguments. How can I pass these permissions along to the main app? BBEdit is able to do this even when the user has not given it full-disk access. I know that it is using Apple Events for the communication between the command line tool and the main app, but I am not sure how this allows it to pass permissions. Can anyone shed light on how to implement a solution here? Thanks!
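One approach sometimes suggested for the file-access question (a sketch under assumptions, not necessarily how BBEdit does it, since the post says BBEdit uses Apple Events) is for the unsandboxed helper to hand the user-supplied paths to the main app through the standard open-documents path, which the system treats like a Finder open for sandbox-access purposes. The app path below is hypothetical:

```swift
// main.swift of the embedded command line tool (unsandboxed helper sketch).
import AppKit

// Hypothetical location of the main app; a real tool would locate it relative
// to its own bundle or by bundle identifier.
let appURL = URL(fileURLWithPath: "/Applications/MyWordProcessor.app")

// Resolve the user-supplied paths (relative paths resolve against the cwd).
let fileURLs = CommandLine.arguments.dropFirst().map { URL(fileURLWithPath: $0) }

let configuration = NSWorkspace.OpenConfiguration()
configuration.activates = false

NSWorkspace.shared.open(Array(fileURLs),
                        withApplicationAt: appURL,
                        configuration: configuration) { _, error in
    if let error {
        FileHandle.standardError.write(Data("open failed: \(error)\n".utf8))
    }
    exit(error == nil ? EXIT_SUCCESS : EXIT_FAILURE)
}
dispatchMain()   // keep the tool alive until the completion handler runs
```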
0
0
386
Dec ’24
A Summary of the WWDC25 Group Lab - watchOS (Part 1)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 1). 1. I'm really excited about the new design system on all platforms. Liquid Glass is super cool. What do developers need to keep in mind when building for watchOS 26? To adopt the new design system, start with updating your app for watchOS 10 – if you have done so, your app will be mostly ready for watchOS 26. For more information, see Design and build apps for watchOS 10. You can then look into Liquid Glass specific APIs to fine-tune your app. This topic is covered in Adopting Liquid Glass. If you have SwiftUI views using any custom style, make sure they are still legible and fit with the new design system. 2. Something that really stood out to me were the updates to the Smart Stack, with the system prioritizing widgets when they're most relevant. Tell me more about these new opportunities for apps. Workout apps that record workouts using HealthKit may be automatically suggested on the watch face and appear in the Smart Stack without adding a widget. Relevant widgets are a great way to present information related to a date, location, point-of-interest type, sleep schedule, or fitness condition in the Smart Stack when it is relevant. Relevant widgets don't need to display an empty state view when they are not relevant; they are only shown in the Smart Stack when relevant. The watchOS 26 Design ToolKit in the Apple Design Resources includes a set of templates that you can use to lay out your widgets. 3. Is the Wrist Flick gesture available to developers in the same way as Double Tap is? The system uses Wrist Flick to dismiss notifications and incoming calls, silence timers and alarms, or return to the watch face. There is no separate API for the Wrist Flick gesture. Apps that use XCUIAutomation to make sure their user interface behaves as intended can use XCUIDeviceHandGesture.flick to automate tests verifying that the app responds appropriately to the Wrist Flick gesture. For apps using automated testing, XCUIDeviceHandGesture.doubleTap can also be used to automate testing of the app with the Double Tap gesture. See XCUIDevice.perform(handGesture:). 4. Can HRV measurements be triggered on demand via API in watchOS? Guidelines or processes for enabling energy-intensive biometric sampling on development devices for IRB-approved research? You don’t have direct control over the sampling rate in watchOS. You can use HealthKit (HKQuantityTypeIdentifierHeartRateVariabilitySDNN, to be specific) to query the HRV data once the system has sampled and persisted the data to HealthKit. If that doesn’t help, we suggest that you file a feedback report with your concrete use case for us to investigate. Specific to IRB-approved research using Apple Watch or its companion iPhone, you might want to look at this FAQ and SensorKit to see if they can be of any help. 5. What is the best advice for someone who is new to making a watchOS app that’s been on iOS and iPadOS? You can start with exploring the system experience features on watchOS, such as notifications, controls, and widgets, and getting familiar with the system spaces, like the Smart Stack, watch face, and Control Center.
Knowing the watchOS app design principles and practices is important as well. Design and build apps for watchOS 10 is a great resource for this topic. SwiftUI is an amazing cross-platform framework, and you will use it to create your watchOS app. If you're already using it, great! Keep in mind some watch-only constraints. Compared to an iPhone or iPad, Apple Watch has a limited battery and a smaller screen, which significantly impacts how people use your app and how your app works. 6. Was there any extension this year to the 7-day limit on querying Apple Health data on the watch? There is no change to the limit this year. You can get this official limit at runtime using earliestPermittedSampleDate. There are some exceptions, so don't be surprised if you see some data types retained longer. The companion iPhone holds the full set of the health data. If you need to access health data that has been purged from the Apple Watch, consider doing it with your iOS app and then passing the result to your watchOS app.
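To make the HRV answer above concrete, a query for recorded SDNN samples might look roughly like this (a sketch; it assumes read authorization for heart rate variability has already been granted):

```swift
import HealthKit

let healthStore = HKHealthStore()
let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN)!

// On watchOS, samples older than earliestPermittedSampleDate() may already be purged.
let predicate = HKQuery.predicateForSamples(withStart: healthStore.earliestPermittedSampleDate(),
                                            end: Date())
let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)

let query = HKSampleQuery(sampleType: hrvType, predicate: predicate,
                          limit: 25, sortDescriptors: [newestFirst]) { _, samples, _ in
    for sample in (samples as? [HKQuantitySample]) ?? [] {
        let sdnn = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
        print("SDNN \(sdnn) ms at \(sample.startDate)")
    }
}
healthStore.execute(query)
```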
0
0
119
Jul ’25
A Summary of the WWDC25 Group Lab - watchOS (Part 2)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 2). 7. For widget (complication) update budgets, is there an overall budget, or are scheduled updates separate from APNs updates? For context, I have a complication that is updated on a fixed schedule (every 20 min), but there can be times of the day that are more "interesting" where pushes make sense. Like timeline updates, the system budgets WidgetKit push notifications and delivers them opportunistically. You can use WidgetKit push notification updates in addition to timeline updates. For more information, see Updating widgets with WidgetKit push notifications. 8. It seems like the new Control Center widgets can be sourced from either the iPhone or directly on the Watch. Can we control whether a control appears in the watch list, or will it always be a combination of all controls from both sources? iPhone controls will be automatically available on the companion Apple Watch, even if they don’t have an associated watchOS app. When an iPhone control is tapped on the Apple Watch, the action is performed on the iPhone. Controls whose actions foreground the iOS app will not appear on Apple Watch. If a watchOS app has controls, no controls from the companion iOS app will appear on Apple Watch. 9. From a UI/UX perspective, what are the current practices for designing watchOS apps that feel native? The WWDC23 session Design and build apps for watchOS 10 covers the details of watchOS design principles and how to apply them in your app using SwiftUI. A lot of SwiftUI APIs, such as NavigationSplitView, the vertical tab view, and list views, already implement the look and feel native to watchOS. 10. When adopting the new design system on watchOS, it seems like the main place we will use the glass effect is for our buttons in the toolbar? Standard buttons in system apps seem to continue to use a flat appearance and full width. We leave the choice to you – you can use the new GlassButtonStyle API or .buttonStyle(.glass) to apply the Liquid Glass material to buttons. Learn when to use the Liquid Glass styles in Get to know the new design system. 11. Is there any way to gracefully migrate extensions when their bundleIDs have to change? e.g., converting a multi-target watch app to single-target, which drops the .watchkitextension from both the app and WidgetKit extension bundleIDs. Updating a watchOS app to single-target is covered in Technote TN3157: Updating your watchOS project for SwiftUI and WidgetKit. Xcode provides a tool that can do the update automatically, and the technote describes how to use it and how to clean up the project after the automatic update. If there's something the technote doesn't address, please reach out to us on the Developer Forums. 12. What is the status of WatchConnectivity? Is that still the preferred way for iOS + watchOS communications? The Watch Connectivity framework is still supported and is appropriate for communication between a watchOS app and its companion iOS app. The system also provides other APIs for apps to exchange data. For example, watchOS supports the Apple Push Notification service (APNs).
If data for your widget changes on your server, your widget can receive a WidgetKit push notification, and update accordingly. That’s the preferred mechanism for widget updates.
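As a concrete illustration of the button styling mentioned in answer 10, a toolbar button adopting the Liquid Glass material might look like this (a sketch assuming the watchOS 26 SDK; the view and labels are hypothetical):

```swift
import SwiftUI

struct WorkoutDetailView: View {
    var body: some View {
        NavigationStack {
            Text("Route details")
                .toolbar {
                    ToolbarItem(placement: .primaryAction) {
                        Button("Start", systemImage: "figure.run") {
                            // Start the workout here.
                        }
                        .buttonStyle(.glass)   // the Liquid Glass style named above
                    }
                }
        }
    }
}
```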
0
0
110
Jul ’25
How to Archive iMessages via API with User Authorization Workflow?
I’m working on a solution to archive iMessages by using an API or similar mechanism. Here’s the desired workflow: The user provides their phone number to initiate the archiving process. They receive a text message with a URL link. Clicking on the link authorizes the archiving of their iMessages. Once authorized, their text messages are archived. So far, I’ve researched third-party services and APIs but haven’t found any that offer this capability directly for iMessages. Questions: Are there any APIs or frameworks (Apple or third-party) that support accessing and archiving iMessages programmatically?
0
0
437
Jan ’25
WebDomain.token always returns nil - What am I doing wrong?
I'm new to the Screen Time API and trying to block custom websites, but I can't get WebDomain tokens to work. When I create a WebDomain like WebDomain(domain: "reddit.com"), the token property is always nil. I have proper authorization and the app works fine for blocking apps, but website blocking just won't work. I'm confused because I see apps like JOMO that let users type in any website domain and successfully block it using Screen Time API. They have the same 49 domain limit and only ask for Screen Time permission, so they must be using the same API I am. But somehow their WebDomain tokens work and mine don't. I've tried creating the tokens right after getting authorization and during the FamilyActivityPicker session, but still get nil. Am I missing some setup step or API call that makes WebDomain tokens valid? Any help would be really appreciated since I'm stuck on this.
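For comparison, the flow that typically produces usable web-domain tokens goes through the picker selection rather than constructing WebDomain values directly; roughly (a sketch, assuming Screen Time authorization has already been granted):

```swift
import SwiftUI
import FamilyControls
import ManagedSettings

struct WebsiteBlockerView: View {
    @State private var selection = FamilyActivitySelection()
    private let store = ManagedSettingsStore()

    var body: some View {
        // Tokens come from the user's picker selection; a WebDomain built in code
        // (e.g. WebDomain(domain: "reddit.com")) carries no token.
        FamilyActivityPicker(selection: $selection)
            .onChange(of: selection) { newSelection in
                store.shield.webDomains = newSelection.webDomainTokens
            }
    }
}
```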
0
0
59
Aug ’25
iOS magnetometer data processing
Hello, I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework. However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric. I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated. Thank you!
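For reference, this is roughly how the calibrated readings described above are obtained on iOS (a sketch; it assumes device-motion availability and uses a reference frame that keeps the magnetometer active):

```swift
import Foundation
import CoreMotion

let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0   // request ~100 Hz; actual rate may be lower
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                           to: .main) { motion, _ in
        guard let field = motion?.magneticField, field.accuracy != .uncalibrated else { return }
        // CMCalibratedMagneticField: hard-/soft-iron corrected values in microtesla.
        let b = field.field
        let magnitude = sqrt(b.x * b.x + b.y * b.y + b.z * b.z)
        print("calibrated |B| = \(magnitude) µT (accuracy \(field.accuracy.rawValue))")
    }
}
```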
0
0
49
Jun ’25