I am developing a macOS word-processing app that should be distributed via the Apple App Store. Some of the app's functions like generating HTML and PDF exports should be automatable via Shortcuts and via shell scripts.
To support the latter, I plan to include a command line tool inside the app that can be called from the Terminal or a shell script. The tool should be able to instruct the main app to then perform the desired commands.
A well-known App Store app that uses this design is BBEdit, which contains multiple command line tools that expose functionality of the main app to Terminal users.
My technical questions now are:
Should the command line tool executable be sandboxed, and if so, how?
Even after many attempts, I have not found a way to build a working sandboxed command line tool. If a sandboxed tool is started from the Terminal, it is immediately terminated with an exception in _libsecinit_appsandbox.cold.12. I am aware of the Apple developer documentation article Embedding A Helper Tool In A Sandboxed App, but it addresses a different architecture, in which the helper tool is started by the main app and can therefore inherit its sandbox.
BBEdit sandboxes only the main app, not its embedded command line tools, and is still allowed in the App Store. Is this the way to go for me as well, or does BBEdit get some special treatment in the App Store?
How can the command line tool pass file-access permissions to the main app?
As my main app is sandboxed, it needs explicit permission from the user to access files. Users of a command line tool grant this permission by providing file paths as arguments. How can I pass these permissions along to the main app? BBEdit is able to do this even when the user has not given it Full Disk Access. I know that it uses Apple Events for the communication between the command line tool and the main app, but I am not sure how this allows permissions to be passed along. Can anyone shed light on how to implement a solution here?
Thanks!
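For what it's worth, here is a minimal sketch of the Apple Events route, assuming a non-sandboxed command line tool and a placeholder bundle identifier. My understanding is that file URLs delivered to a sandboxed app in an open-documents Apple Event extend that app's sandbox to exactly those files, the same way drag and drop or NSOpenPanel does, so the tool itself only needs to read its arguments:

import Foundation
import CoreServices

// Sketch only: send an 'odoc' (open documents) Apple Event from the CLI tool
// to the main app. "com.example.MyWordProcessor" is a placeholder.
let target = NSAppleEventDescriptor(bundleIdentifier: "com.example.MyWordProcessor")
let event = NSAppleEventDescriptor.appleEvent(
    withEventClass: AEEventClass(kCoreEventClass),
    eventID: AEEventID(kAEOpenDocuments),
    targetDescriptor: target,
    returnID: AEReturnID(kAutoGenerateReturnID),
    transactionID: AETransactionID(kAnyTransactionID))

// Wrap each path argument as a typeFileURL descriptor (AE lists are 1-based).
let fileList = NSAppleEventDescriptor.list()
for (index, path) in CommandLine.arguments.dropFirst().enumerated() {
    fileList.insert(NSAppleEventDescriptor(fileURL: URL(fileURLWithPath: path)),
                    at: index + 1)
}
event.setParam(fileList, forKeyword: AEKeyword(keyDirectObject))

do {
    _ = try event.sendEvent(options: [.noReply], timeout: 30)
} catch {
    FileHandle.standardError.write(Data("Failed to send event: \(error)\n".utf8))
    exit(1)
}

I am not certain whether custom event classes carry the same implicit sandbox grant, so the standard open-documents event seems the safer bet.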
Reposting this here in case it got missed the first time around:
https://developer.apple.com/forums/thread/775900
We had a question that came up when we were comparing data from WeatherKit to other sources: WeatherKit visibility was well beyond the boundaries we had historically, even from Dark Sky. That raises two questions:
is visibility actually in meters like the docs say?
is this visibility at ground level, 500ft, or some other height?
We were seeing visibility numbers of up to 40 miles (after converting the number the API sent to miles), whereas all of our other sources are usually within 10 miles.
Hi,
I'm trying to set up a PIR service for Live Caller ID Lookup (in Python, but based on the Swift example: https://github.com/apple/live-caller-id-lookup-example). The Swift example provides utilities for database setup and encryption, but I can't find any specification of which key is used for database encryption and how the iOS system learns this key in order to construct the PIR requests.
So my question is: how does the PIR service communicate the secret key to the iOS system, or vice versa? (Specific to the test environment, before onboarding.)
Hi everyone,
I'm trying to request an array of strings in an AppIntent, and I want to give the user the chance to add more than one string.
Currently, I do it like this:
import AppIntents

struct AddHomework: AppIntent {
    // some Parameters

    @Parameter(title: "Tasks")
    var tasks: [String]?

    @Parameter(title: "New Task") // Only for the special request
    var input: String?

    private func collectTasks() async throws -> [String] {
        var collectedTasks: [String] = tasks ?? []
        while true {
            if !collectedTasks.isEmpty {
                let addMore = try await $input.requestConfirmation(for: "Would you like to add another task?")
                if !addMore {
                    break
                }
            }
            let newTask = try await $input.requestValue("Please enter your task:")
            collectedTasks.append(newTask)
        }
        return collectedTasks
    }

    @MainActor
    func perform() async throws -> some IntentResult {
        let finalTasks = try await collectTasks()
        // some more code
        return .result()
    }
}
But this is not working: the shortcut ends without requesting anything, though it does not crash.
I would be thankful for some help.
Hey,
I would love to access the user's own contact (i.e. the Me card).
Apple recently released the Invites app, and within this app you can see the user's contact photo. I tried to replicate this behaviour, but I currently need to match based on a name, an email, or something else. I have no idea how to obtain this data, because when using the Invites app I only remember it asking for Contacts permission and nothing else.
let store = CNContactStore()
let keysToFetch = [CNContactImageDataAvailableKey, CNContactImageDataKey, CNContactThumbnailImageDataKey] as [CNKeyDescriptor]
let email = "test@test.de"
let predicate = CNContact.predicateForContacts(matchingEmailAddress: email)
do {
    let contacts = try store.unifiedContacts(matching: predicate, keysToFetch: keysToFetch)
    let imageDatas: [Data] = contacts.compactMap { $0.imageData }
    self.images = imageDatas.map { UIImage(data: $0) ?? UIImage() }
} catch {
    print("Error fetching contacts: \(error)")
}
This is how I am retrieving the image. Maybe someone can help me out.
Thank you so far
~ Flo
I've successfully started the Live Caller ID Lookup example and initialized the PIRService.
I added several identities to the input.txtpb file, some with block: true and others with block: false.
Here is the file, with the phone digits modified:
identities {
  key: "+40790123123"
  value {
    name: "Blocking 1"
    cache_expiry_minutes: 7
    block: true
  }
}
identities {
  key: "+972526111111"
  value {
    name: "Blocking 2"
    cache_expiry_minutes: 7
    block: true
  }
}
identities {
  key: "+123"
  value {
    name: "Adam"
    cache_expiry_minutes: 8
    block: false
    category: IDENTITY_CATEGORY_PERSON
  }
}
identities {
  key: "+972526111112"
  value {
    name: "Identified Business Name 1"
    cache_expiry_minutes: 1
    block: false
    category: IDENTITY_CATEGORY_BUSINESS
  }
}
identities {
  key: "+972526111113"
  value {
    name: "Identified Business Name 2"
    cache_expiry_minutes: 1
    block: false
    category: IDENTITY_CATEGORY_BUSINESS
  }
}
The main issue is that only the number +40790123123 was actually blocked, while "Blocking 2" appeared as an identified contact with its assigned name displayed.
Notably, the only blocked number was a foreign number with a country code different from that of the device being called. The other numbers belonged to the same country.
Can someone clarify whether this is a bug in the example project or an issue with the data file?
I am creating an iOS app that needs to parse the text from a PDF document. I can read the entire PDF document's text using the string property, but if it's a large PDF document, this could cause delays for users.
From the documentation, I came across the beginFindString function, which seems to run asynchronously, with no return value?
https://developer.apple.com/documentation/pdfkit/pdfdocument/beginfindstring(_:withoptions:)
Unfortunately, I cannot find examples of how to use this function or its intended purpose/functionality, so any guidance would be appreciated.
My goal is to read the PDF document one line at a time, searching for newlines ('\n'), then parsing that line as needed. I'm hoping the beginFindString function will be useful.
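In case it's useful: beginFindString(_:withOptions:) kicks off an asynchronous search and returns immediately; matches are delivered through the PDFDocumentDelegate method didMatchString(_:) (or via the PDFDocumentDidFindMatch notification), each as a PDFSelection. A minimal sketch with a placeholder path follows; note that text extracted from PDFs does not reliably contain literal newline characters, so searching for "\n" may never match, and line reconstruction usually has to rely on layout (e.g. selection bounds) instead:

import PDFKit

// Matches from beginFindString arrive asynchronously via the document delegate.
final class MatchCollector: NSObject, PDFDocumentDelegate {
    func didMatchString(_ instance: PDFSelection) {
        // Each match is a PDFSelection; inspect its text and page here.
        print("match:", instance.string ?? "",
              "on page", instance.pages.first?.label ?? "?")
    }
}

let document = PDFDocument(url: URL(fileURLWithPath: "/path/to/file.pdf"))! // placeholder
let collector = MatchCollector() // delegate is weak; keep this alive while the search runs
document.delegate = collector
document.beginFindString("\n", withOptions: []) // may never match; see note above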
Is it possible to integrate a button in an app that displays advertisements to support charities, without offering any direct reward to the user?
Currently, I am working on an iOS app with a deployment target of iOS 15.0 and macOS 12.0. The app is allowed to run on Macs with Apple Silicon.
A customer with a Mac running macOS Monterey (12) is complaining that in the TestFlight app they cannot install the app, since it shows "Requires OS Update", even though the deployment target is lower than the installed macOS version.
Are there any specifications available on which macOS version is required to run iOS apps on Apple Silicon Macs?
I'm trying to update the remoteHandle using CXCallUpdate for an outgoing call, but this works only on iOS 15, not on 17 or 18 (I didn't test 16). The problem affects only outgoing calls; for incoming calls the update works fine.
func startOutgoingCall(with callID: UUID, userID: String) {
    let handle = CXHandle(type: .generic, value: userID)
    let action = CXStartCallAction(call: callID, handle: handle)
    callController.requestTransaction(with: action) { [weak self] error in
        // ...
    }
}

func updateOutgoingCall(with callID: UUID, groupID: String) {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: groupID)
    provider.reportCall(with: callID, updated: update)
}
I also tried the phoneNumber handle type, but it seems the initial handle that I set on CXStartCallAction cannot be changed (neither its value nor its type).
I use this handle value to implement recall by tapping a call in the Recents tab of the system address book. But since my calls can transform from p2p to group calls, I need to update the handle value or find some other way to pass call identification info.
After setting up all permissions, family members are not showing up in the device list.
This is more of a general question: is it possible to share a persistent Core Data store from the main app with Screen Time-related extensions such as a DeviceActivityReportExtension?
I've set my code up (e.g., App Groups, files added to the different targets, an NSPersistentContainer using the App Group URL, etc.) in a way that builds, and the extension seems to recognize my Core Data schema (it can query using a fetch request). But the data returned is always null. So I'm wondering whether it is even possible to read app data from the extension.
I understand it is not possible to write or pass data from the extension back to the app. I have, however, been able to read data in my extension that the main app saved to UserDefaults.
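For comparison, here is a minimal sketch of the shared-store setup being described, with placeholder group and model identifiers. The detail that most often breaks this is that the app and the extension must both point their persistent store description at the same file inside the group container; if either target falls back to the default store location, the extension opens an empty database and every fetch returns nothing:

import CoreData

// Both the app target and the extension target must build the container like
// this, with identical group and file names (placeholders shown here).
let container = NSPersistentContainer(name: "Model")
let storeURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.myapp")!
    .appendingPathComponent("Model.sqlite")
container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: storeURL)]
container.loadPersistentStores { _, error in
    if let error { fatalError("Failed to load store: \(error)") }
}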
I am developing an app that can help users disable selected apps at a specified time, so that users can get away from their phones and enjoy real life.
Here is my data structure:
extension ActivityModel {
    @NSManaged public var id: UUID
    @NSManaged public var name: String
    @NSManaged public var weeks: Data
    @NSManaged public var weekDates: Data
    @NSManaged public var appTokens: Data
}
Among them, weeks decodes to [Bool], indicating which weekdays from Sunday to Saturday are effective; weekDates decodes to [[Date, Date]], indicating the effective time periods; appTokens decodes to a Set containing the selected apps.
At the beginning, I start a main monitor:
let deviceActivityCenter = DeviceActivityCenter()
do {
    try deviceActivityCenter.startMonitoring(
        DeviceActivityName(activityModel.id.uuidString),
        during: DeviceActivitySchedule(
            intervalStart: DateComponents(hour: 0, minute: 0, second: 0),
            intervalEnd: DateComponents(hour: 23, minute: 59, second: 59),
            repeats: true
        )
    )
} catch {
    return false
}
Since the time range may differ from day to day, I start that day's sub-monitor each time the main monitor starts:
override func intervalDidStart(for activity: DeviceActivityName) {
    super.intervalDidStart(for: activity)
    if activity.rawValue.hasPrefix("Sub-") {
        ActivityModelManager.disableApps(
            Tools.getUUIDFromString(activity.rawValue)
        )
        return
    }
    // .weekday is 1-based (1 = Sunday), so subtract 1 to index the Sunday-to-Saturday [Bool] array
    let weekIndex = Calendar.current.component(.weekday, from: .now) - 1
    let weeks = ActivityModelManager.getWeeks(activity.rawValue)
    if weeks[weekIndex] {
        let weekDates = ActivityModelManager.getWeekDates(activity.rawValue)
        let deviceActivityCenter = DeviceActivityCenter()
        do {
            try deviceActivityCenter.startMonitoring(
                // derive the sub-monitor's name from the main activity's rawValue
                DeviceActivityName("Sub-" + activity.rawValue),
                during: DeviceActivitySchedule(
                    intervalStart: getHourAndMinute(weekDates[weekIndex][0]),
                    intervalEnd: getHourAndMinute(weekDates[weekIndex][1]),
                    repeats: false
                )
            )
        } catch {
            return
        }
    } else {
        return
    }
}
I distinguish between main monitoring and sub-monitoring based on the activity name.
When the sub-monitor starts, I fetch the bound applications and disable them:
static func disableApps(_ id: UUID) {
    let appTokens = ActivityModelManager.getLimitAppById(id)
    let name = ManagedSettingsStore.Name(id.uuidString)
    let store = ManagedSettingsStore(named: name)
    store.shield.applications = appTokens
}
When the sub-monitor ends, I re-enable the applications:
static func enableApps(_ id: UUID) {
    let name = ManagedSettingsStore.Name(id.uuidString)
    let store = ManagedSettingsStore(named: name)
    store.shield.applications = []
}
The above is my code logic.
When using DeviceActivityMonitorExtension, I found the following problems:
intervalDidStart may be called multiple times, resulting in several sub-monitors being started.
After a period of time, the monitoring is turned off.
The static methods enableApps and disableApps are sometimes not called.
Hi
I am developing a game app with Epic's Unreal Engine.
I am currently testing it via TestFlight.
My problem is with "Launch URL", a function in Unreal Engine.
This function allows the user to open the entered URL on the Internet.
It worked well before, but not now. I don't know when it stopped working; it seems to have been after an iOS update or the Xcode update.
macOS Sequoia 15.1.1
Xcode 16.2
(Unreal Engine 5.4.4)
iOS is 18.2, but it has not worked since the previous version.
Any advice would be appreciated.
Is there any way I can get updates when I change CarPlay style settings?
I've tried CPSessionConfigurationDelegate.contentStyleChanged and CPTemplateApplicationSceneDelegate.contentStyleDidChange, but they always produce the same result.
When I choose:
Automatic -> I receive light in case of daylight;
Always Dark and Always Show Dark Map toggle on -> dark
Always Dark and Always Show Dark Map toggle off -> light.
But this seems wrong, because CarPlay's toolbar is still dark while I receive light.
Is there a way to get a dark style when choosing Always Dark and Always Show Dark Map toggle off? Or at least get updates when the Always Show Dark Map toggle changes?
I'm writing an app that uses text-to-speech. On a physical iPhone 12 it works, but on the simulator I don't get any speech from my app.
I tried the following to enable it:
Settings -> Accessibility -> "Spoken Content" -> "Speak Selection" to ON (green)
In Voices, I added the German "Anna" voice that I use on my iPhone, with the downloaded enhancements.
The speech test works in this menu, but not in my app. What am I missing?
Cheers,
Gerhard
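One common pitfall worth ruling out here: AVSpeechSynthesizer must be kept alive while it speaks; if it is a local variable, it is deallocated when it goes out of scope and the app stays silent with no error. A minimal test along those lines, assuming the German voice from the post:

import AVFoundation

// Keep a strong reference to the synthesizer; a local instance can be
// deallocated before it finishes (or even starts) speaking.
final class Speaker {
    static let shared = Speaker()
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "de-DE") // assumed voice
        synthesizer.speak(utterance)
    }
}

Speaker.shared.speak("Hallo Welt")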
I am showing a user's daily screen time in my app via a Device Activity Report extension. The only way to get that is to sum up the activityDuration of all apps/categories/domains, but it differs a lot from the screen time in the phone's Settings. Why?
I have debugged this in detail and manually counted the time spent on each app: the calculation appears correct in my app, but the phone's Settings shows considerably less time at the top (Day).
I am working on a Flutter application that is used solely to collect data from a Bluetooth Low Energy (BLE) peripheral and then upload the data to our cloud.
The application runs in the background 99% of the time after the initial login and BLE pairing, which is causing us some issues.
After the application is backgrounded, it works for one to two days and then stops working. (By "working" I mean downloading data from the BLE peripheral and then uploading it to our cloud.) Once the data syncing has stopped, it can take up to 12 hours until data starts flowing again.
I have read in a couple of places that iOS applies some sort of "budget"/heuristics to applications running in the background, and when this budget is used up, iOS stops servicing the application until it decides the application can run in the background again.
My question: is it possible, via an entitlement or some other mechanism, to prevent iOS from blocking our application from running in the background, so we can achieve 24/7 periodic data uploads every 30 minutes?
We have implemented the following so far:
The data sync process is triggered from the BLE peripheral using a notification. This notification is sent every 30 minutes.
Each sync duration is currently 24 seconds on average, we are working on reducing this to below 10 seconds.
We implemented State Restoration to assist iOS in starting the application more efficiently (see the sketch after this list).
We are considering using Silent Push Notifications from the Cloud to wake up the application when data hasn't synced in 6 hours.
Any assistance would be highly appreciated.
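For reference, the Core Bluetooth state-restoration opt-in mentioned in the list above looks roughly like this on the native side, assuming the bluetooth-central background mode is enabled in Info.plist; identifiers are placeholders. Note that restoration lets iOS relaunch the app for Bluetooth events, but it does not exempt the app from the system's background budget:

import CoreBluetooth

// Passing a restore identifier opts the app into state restoration, so iOS can
// relaunch it when the peripheral's 30-minute notification arrives.
final class SyncManager: NSObject, CBCentralManagerDelegate {
    lazy var central = CBCentralManager(
        delegate: self,
        queue: nil,
        options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.sync.central"])

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Required delegate method; begin scanning/connecting when .poweredOn.
    }

    func centralManager(_ central: CBCentralManager,
                        willRestoreState dict: [String: Any]) {
        // iOS relaunched the app; re-attach to restored peripherals here.
    }
}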
I have been using the hourly weather forecast API. For some reason the API sometimes fails with 400 Bad Request, but on retrying just a minute later the call successfully returns data. The start and end times are two days apart, so I don't think it's an issue with the time frame.
The failed calls also don't return any reason, so I'm not sure what the exact failure is.
Has anyone encountered this issue, or does anyone know why this might be happening?
Thanks!!