With the Live Caller ID Lookup example server, the caller lookup dataset is defined in an input.txtpb file and processed by running a ConstructDatabase command, which creates a block.binpb and an identity.binpb file.
In other words, a static input file is processed into static block and identity files.
In the real world, however, the data for identified and blocked numbers is in a constant state of flux: new numbers become available, old ones go stale, numbers initially considered safe come to be considered malicious, and so on.
Is the example server just that, merely an example using fixed datasets, while an actual production server can use live, ever-changing data to formulate its response to the query from iOS?
Here's a concrete use case: suppose the requirement is to permit US NANP numbers but to block everything else. The set of non-US-NANP numbers is so large and ever-changing that it would be unfeasible to capture them in an input.txtpb file, process it, and then re-capture and re-process endlessly. Instead, what would be required is the ability for the Live Caller ID server to evaluate at query time, using a regular expression for example, whether a number is NANP or not.
Is this possible?
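For illustration, the kind of query-time check I have in mind would be something like this (a minimal sketch; the pattern is a rough approximation, not a complete NANP validator):

import Foundation

// Rough NANP shape: +1, then NXX NXX XXXX where N is 2-9 (illustrative only).
func isUSNANP(_ e164Number: String) -> Bool {
    let pattern = #"^\+1[2-9]\d{2}[2-9]\d{6}$"#
    return e164Number.range(of: pattern, options: .regularExpression) != nil
}

// isUSNANP("+14155550123") == true
// isUSNANP("+442071838750") == false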
I can reproduce a bug where CallKit doesn't activate the audio session after an outgoing call is put on hold because of an incoming call.
VoIP calling with CallKit
Steps to reproduce:
Download the Speakerbox example app from the link above and build it with Xcode
Start a new outgoing call
Call your phone from another phone
Hold and Accept the call
After a few seconds, end the call from the other phone
The outgoing call will still be on hold
Tap the call and tap Toggle Hold
The call won't become active again because the audio session isn't activated.
Logs:
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Requested transaction successfully
Starting audio
Type: stdio
AURemoteIO.cpp:1162 failed: 561017449 (enable 3, outf< 1 ch, 44100 Hz, Float32> inf< 1 ch, 44100 Hz, Float32>)
Type: Error | Timestamp: 2024-08-15 12:20:29.949437+02:00 | Process: Speakerbox | Library: libEmbeddedSystemAUs.dylib | Subsystem: com.apple.coreaudio | Category: aurioc | TID: 0x19540d
AVAEInternal.h:109 [AVAudioEngineGraph.mm:1344:Initialize: (err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)): error 561017449
Type: Error | Timestamp: 2024-08-15 12:20:29.949619+02:00 | Process: Speakerbox | Library: AVFAudio | Subsystem: com.apple.avfaudio | Category: avae | TID: 0x19540d
Couldn't start Apple Voice Processing IO: Error Domain=com.apple.coreaudio.avfaudio Code=561017449 "(null)" UserInfo={failed call=err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)}
Type: Notice | Timestamp: 2024-08-15 12:20:29.949730+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
Route change:
Type: Notice | Timestamp: 2024-08-15 12:20:30.167498+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
ReasonUnknown
Type: Notice | Timestamp: 2024-08-15 12:20:30.167549+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
Previous route:
Type: Notice | Timestamp: 2024-08-15 12:20:30.167568+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
<AVAudioSessionRouteDescription: 0x302c00bc0,
inputs = (
"<AVAudioSessionPortDescription: 0x302c01330, type = MicrophoneBuiltIn; name = iPhone Mikrofon; UID = Built-In Microphone; selectedDataSource = (null)>"
);
outputs = (
"<AVAudioSessionPortDescription: 0x302c004d0, type = Receiver; name = Vev\U0151; UID = Built-In Receiver; selectedDataSource = (null)>"
)>
Type: Notice | Timestamp: 2024-08-15 12:20:30.167626+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
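For reference, the audio handling involved follows the standard CXProviderDelegate pattern (a minimal sketch; startAudio()/stopAudio() stand in for Speakerbox's audio controller calls):

import CallKit
import AVFAudio

final class ProviderDelegate: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // Tear down calls and stop audio here.
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Audio may only be started from here. After the un-hold above,
        // this callback is never received, so audio never restarts.
        // startAudio()
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // This is the callback that appears repeatedly in the log above.
        // stopAudio()
    }
}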
I've been working a lot with the FamilyControls API and app shielding recently, but have encountered a problem with no documentation. I used FamilyActivitySelection to select the App Store for shielding (just for testing), and then printed out the application token:
1wfY¸êB ò S« öi #×(É?âðw ù/jQ ¿ J ïE¢? ·¿ º<Òd?ý r7¥Ãn N átJ¹ÿ85B_{VAF fC8. ,,¸¯3 T7F ±õü; ¹?v@¯ô Ä \-õ# Ò
I know the application token is a Codable object, so I was wondering:
How do I create an application token using the Token<Application> initializer
init(from: any Decoder) throws
("Creates a new instance by decoding from the given decoder")
with the above data? Do I have to encode it first in order to decode it?
For reference, the code I tried to use is:
if let encoded = try? JSONEncoder().encode(newValue.applicationTokens) {
    data = encoded
    print(String(data: data, encoding: .utf8)!)
}
if let app = try? JSONDecoder().decode(Token<Application>.self, from: data) {
    let token = Application(token: app)
    print(token)
} else {
    print("didn't work")
}
But it prints "didn't work" every time.
What should I do differently?
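For clarity, my understanding is that the encoded and decoded types have to match: applicationTokens is a Set<ApplicationToken> (that is, Set<Token<Application>>), so decoding a single Token<Application> from its JSON can't succeed. The round-trip I believe should work is this (a sketch I haven't verified end-to-end):

import Foundation
import FamilyControls
import ManagedSettings

// Encode the whole set...
let data = try JSONEncoder().encode(newValue.applicationTokens)

// ...and decode the same type back.
let tokens = try JSONDecoder().decode(Set<ApplicationToken>.self, from: data)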
PDFKit's PDFPage.characterBounds(at: Int) is returning incorrect coordinates with iOS 18 beta 4 and later / Xcode 16 beta 4.
It worked fine in iOS 17 and earlier (after showing the same issue during the early iOS 17 beta cycle).
It breaks critical functionality that my app relies on.
I have filed feedback (FB14843671).
So far no changes in the latest betas. iOS release date is approaching fast!
Anybody having the same issue? Any workaround available?
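In case anyone wants to reproduce it, the call in question is simply this (a minimal sketch; pdfURL is a placeholder for any local PDF):

import PDFKit

if let document = PDFDocument(url: pdfURL),
   let page = document.page(at: 0) {
    // Bounding box of the character at the given index, in page-space points.
    // Correct on iOS 17; wrong on iOS 18 beta 4 and later.
    let rect = page.characterBounds(at: 0)
    print(rect)
}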
It looks like Apple has added some new API(s) to SFSpeechRecognizer.
My app, which is currently listed on the App Store, features speech recognition.
Yet, trying to use it under iOS 18.0 throws errors:
-[SFSpeechRecognitionTask localSpeechRecognitionClient:speechRecordingDidFail:]_block_invoke Ignoring subsequent local speech recording error: Error Domain=kAFAssistantErrorDomain Code=1101 "(null)"
What happens is that after several words are transcribed and displayed, the next sentence causes the previous words to disappear.
That's probably what the "Ignoring subsequent local speech recording error: Error Domain=kAFAssistantErrorDomain Code=1101 "(null)"" portion of the message means.
The problem occurs ONLY when the app is running under iOS 18.0.
Even when it's compiled with Xcode 16.0, everything works fine on iOS 17.5.
Any suggestions?
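For context, my setup follows the usual partial-results pattern (a minimal sketch, not my exact code; the audio engine plumbing that feeds the request is omitted):

import Speech

let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechAudioBufferRecognitionRequest()
request.shouldReportPartialResults = true

recognizer?.recognitionTask(with: request) { result, error in
    if let result {
        // Under iOS 18.0, previously transcribed words vanish from this
        // partial transcription as new sentences arrive.
        print(result.bestTranscription.formattedString)
    }
    if let error {
        print("Recognition error: \(error)")
    }
}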
Hello everyone,
I'm experiencing a crash in my iOS application that occurs predominantly on devices running iOS 16.6.0. The crash seems to happen on the main thread during a UI operation, specifically within the UIKitCore framework.
Crash Log Summary
Thread 0 Crashed:
0 libsystem_kernel.dylib 0xca4 mach_msg2_trap + 8
1 libsystem_kernel.dylib 0x13b74 mach_msg2_internal + 80
2 libsystem_kernel.dylib 0x13e4c mach_msg_overwrite + 540
3 libsystem_kernel.dylib 0x11e8 mach_msg + 24
4 CoreFoundation 0x79024 __CFRunLoopServiceMachPort + 160
5 CoreFoundation 0x7a250 __CFRunLoopRun + 1208
6 CoreFoundation 0x7f3ec CFRunLoopRunSpecific + 612
7 GraphicsServices 0x135c GSEventRunModal + 164
8 UIKitCore 0x39cf58 -[UIApplication _run] + 888
9 UIKitCore 0x39cbbc UIApplicationMain + 340
10 MyApp 0x24050 main + 51 (AppDelegate.swift:51)
11 ??? 0x1d3594dec (Missing)
I've attached the full crash log (crashlog.txt) and would appreciate any insights or recommendations on how to resolve this issue.
Our app involves showing an external web service in a web view or Safari and returning to the app via a custom URL scheme or universal link.
When we set the "Hide and Require Face ID" feature, which became available in iOS 18, neither the custom URL scheme nor the universal link activated the app.
If we only set "Require Face ID", the deep link worked properly.
Here is what we've tried:
Define custom URL scheme or universal link in the app
https://developer.apple.com/documentation/xcode/defining-a-custom-url-scheme-for-your-app
https://developer.apple.com/documentation/xcode/supporting-universal-links-in-your-app
Implement external web service with one of the following frameworks
ASWebAuthenticationSession
https://developer.apple.com/documentation/authenticationservices/aswebauthenticationsession/
SFSafariViewController
https://developer.apple.com/documentation/safariservices/sfsafariviewcontroller
Safari
WKWebView
https://developer.apple.com/documentation/webkit/wkwebview
On iOS 18 device, install the app and set "Hide and Require Face ID"
Access external web page and tap the link which activates custom URL scheme or universal link
We expected the deep link to work, but the results were:
Custom URL scheme &amp; ASWebAuthenticationSession/SFSafariViewController/Safari
The system shows "Cannot open the page because the address is invalid"
Custom URL scheme &amp; WKWebView
Nothing happens when tapping the link
Universal link
Directed to the server hosting the associated domain file, but the system doesn't open the app defined in that file
We tested the feature with the app built with Xcode 16 beta 6, on a device running iOS 18 Seed 8 (22A5350a).
Does the hide-app feature support custom URL schemes and universal links?
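For reference, the app implements the standard UIKit entry points for both link types (a sketch; the routing bodies are placeholders):

import UIKit

final class AppDelegate: UIResponder, UIApplicationDelegate {
    // Custom URL scheme.
    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        print("Opened via custom URL scheme: \(url)")
        return true
    }

    // Universal link.
    func application(_ application: UIApplication, continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
              let url = userActivity.webpageURL else { return false }
        print("Opened via universal link: \(url)")
        return true
    }
}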
We are in the process of updating our legacy Spotlight MDImporter to the new macOS Spotlight App Extension.
The transition works well for standard attributes such as title, textContent, and keywords.
However, we encounter an issue when adding custom attributes to the CSSearchableItemAttributeSet.
These custom attributes are not being persisted, which means they cannot be queried using a Spotlight NSMetadataQuery.
Does anyone have an idea how to add custom attributes so that they are included in the file's indexed attributes, as displayed by the shell command mdimport -t -d3 <path>?
A sample project illustrating the problem is available here: https://www.dropbox.com/scl/fi/t8qg51cr1rpwouxdl900b/2024-09-04-Spotlight-extAttr.zip?rlkey=lg6n9060snw7mrz6jsxfdlnfa&dl=1
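For context, this is roughly how we attach the custom attribute in the extension (a sketch; the key name and values are examples):

import CoreSpotlight
import UniformTypeIdentifiers

let attributes = CSSearchableItemAttributeSet(contentType: .content)
attributes.title = "Example document"   // standard attribute: persists fine
attributes.keywords = ["example"]       // standard attribute: persists fine

// Custom attribute: this is what never shows up in mdimport -t -d3 <path>
// output and can't be found via NSMetadataQuery.
if let key = CSCustomAttributeKey(keyName: "com_example_customField") {
    attributes.setValue("some value" as NSString, forCustomKey: key)
}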
Our app uses a 24-hour DeviceActivityMonitor repeating schedule to send users notifications for every hour of screen time they spend on their phone per day. Notifications are sent from eventDidReachThreshold callbacks at 1, 2, 3, etc, hour thresholds to keep them aware of their screen time.
We have recently received an influx of emails from our users that after updating to iOS 17.6.1 their DeviceActivityMonitor notifications are saying their screen time was much higher than what is shown in DeviceActivityReport and their device's Screen Time settings.
These users have disabled "Share Across Devices", but I suspect DeviceActivityMonitor is still counting screen time from their other devices even though that setting is turned off.
Has anybody else noticed this, understands what is causing this, or could recommend a fix that we can tell our users to do?
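For context, our monitoring setup is essentially the following (a sketch; the names, the one-hour event, and the FamilyActivitySelection parameter are illustrative, not our exact code):

import DeviceActivity
import FamilyControls

func startDailyMonitoring(with selection: FamilyActivitySelection) throws {
    // A 24-hour schedule that repeats every day.
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )
    // Fires once the monitored activity reaches one hour within the interval.
    let oneHour = DeviceActivityEvent(
        applications: selection.applicationTokens,
        categories: selection.categoryTokens,
        webDomains: selection.webDomainTokens,
        threshold: DateComponents(hour: 1)
    )
    try DeviceActivityCenter().startMonitoring(
        DeviceActivityName("daily"),
        during: schedule,
        events: [DeviceActivityEvent.Name("oneHour"): oneHour]
    )
}

// In the monitor extension:
class Monitor: DeviceActivityMonitor {
    override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name,
                                         activity: DeviceActivityName) {
        super.eventDidReachThreshold(event, activity: activity)
        // Post the hourly screen-time notification here.
    }
}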
Hi! We are having a hard time with the universal link, help is appreciated! Thanks in advance!
The universal link doesn't work for some time after installation. A user has to wait anywhere from 5 minutes to a couple of hours after the app is installed on the device.
This has also affected App Review, since we need the universal link to work for a successful login. Each submission receives a "we cannot log in" rejection, and it isn't approved until we kindly ask them to try again.
I believe the JSON is delivered to devices by Apple's CDN, and the fact that it works on most devices most of the time should imply that we have a valid apple-app-site-association setup.
So I am really confused about the wait time, which is causing us trouble with App Review and a bad user experience.
I'm a font developer. In the development process, I will revise a font and overwrite the OTF file that is currently enabled (registered) with macOS.
If I then launch an app, it will immediately use the revised version of the font, while apps that are already running continue to use the old version.
This suggests that each app is loading new and separate font data, rather than getting it from some existing cache in memory. Yet macOS does have a "font cache" of some sort.
Some apps, like TextEdit, seem to only load the fonts that they need to use. However, other apps, like Pages, load every enabled (registered) font on the OS!! (According to the Open Files list in Activity Monitor.)
Given that /System/Library/Fonts/ is 625 MB, and we can't disable any of it, isn't that a lot of data to be duplicating? How many fonts is too many fonts?
I can't find much documentation about the process.
We have found a large number of memory-related crashes on iOS 18, and multiple issues ultimately point to the same crashing line, xzm_xzone_malloc_tiny_outlined. We don't know how to solve it.
A Message Filter extension is only forwarded SMS messages by the OS for filtering; iMessages aren't.
But what is the situation with RCS messages? Will they be filterable by a Message Filter extension?
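For reference, today's SMS filtering goes through this entry point (a minimal sketch), and the question is whether RCS traffic will also be routed through it:

import IdentityLookup

final class MessageFilterExtension: ILMessageFilterExtension, ILMessageFilterQueryHandling {
    func handle(_ queryRequest: ILMessageFilterQueryRequest,
                context: ILMessageFilterExtensionContext,
                completion: @escaping (ILMessageFilterQueryResponse) -> Void) {
        let response = ILMessageFilterQueryResponse()
        // Inspect queryRequest.sender / queryRequest.messageBody and classify.
        response.action = .allow  // or .junk, .promotion, .transaction
        completion(response)
    }
}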
Hi,
I'm currently working on an app originally made for iOS 15. In it, I add an observer in my ViewController's viewDidLoad to listen for changes to the UserDefaults values for connection settings.
NotificationCenter.default.addObserver(self, selector: #selector(settingsChanged), name: UserDefaults.didChangeNotification, object: nil)
Those values can only be modified in the app's section of System Settings.
The thing is, up to iOS 17 the notification fired as expected, but starting with iOS 18 the notification no longer seems to be sent by the OS.
Is there anything I should change in my observer, or is there another technique to listen for the described event?
Thanks in advance.
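For what it's worth, the only alternative I can think of trying is KVO directly on UserDefaults (a sketch; "connectionHost" is a placeholder key, and I haven't verified that KVO fires for changes made from System Settings under iOS 18):

import Foundation

final class SettingsObserver: NSObject {
    private let key = "connectionHost"  // placeholder defaults key

    override init() {
        super.init()
        UserDefaults.standard.addObserver(self, forKeyPath: key, options: [.new], context: nil)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        guard keyPath == key else { return }
        // React to the settings change here.
    }

    deinit {
        UserDefaults.standard.removeObserver(self, forKeyPath: key)
    }
}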
I'm the developer of Camera RawX (available on the Mac App Store).
I'm working on Camera RawX for iOS to provide Quick Look support for camera RAW files not supported by iOS.
I use the Files app to open a RAW file to invoke Quick Look on my iPad (it is running iOS 17.6.1).
The RAW file in question is a Fuji compressed RAF file.
When I tap on the RAF file, iOS opens the Quick Look window, but my app's Quick Look extension is not called.
If the RAW file in question is a Sigma Foveon X3F file, a file that has no native Apple RAW support, then my Quick Look extension is called and I'm able to display the image in the Quick Look window without issue.
It seems that a system-recognized RAW file extension (RAF in this case) is not triggering my Quick Look extension. On macOS, this works fine without any issue.
The strange thing is that my Thumbnail extension is called when the RAW files show up in Files, even for a RAF file. So it seems like a bug to me, or am I missing something crucial in my Info.plist file?
Albert
Since I updated to iOS 18, the CallKit-linked caller name is not displayed on the CarPlay screen.
CarPlay displays only "{App Name} Caller ID".
When the iOS version was 17.x, CarPlay displayed the caller name of the CallKit-linked contact.
I think CarPlay should behave the same as it did on iOS 17.
Please review it.
We have developed an application using Xamarin.Forms. Our iOS app works fine up to iOS 17, but if we upgrade the OS to iOS 18, the app doesn't work properly.
Visual Studio for Mac 2022
Xcode 16
Minimum OS version 15.4
Xamarin.iOS version 16.4.023
Hello,
I think it is quite a common use case to open the parent app that owns the ShieldActionDelegate when the user selects an action in the shield.
There are only three options available that we can do in response to an action:
ShieldActionResponse.none
ShieldActionResponse.close
ShieldActionResponse.defer
It would be great if this new one would be added as well:
ShieldActionResponse.openParentApp
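For context, the extension entry point where one of these responses must be returned looks like this (a minimal sketch):

import ManagedSettings

class ShieldActionExtension: ShieldActionDelegate {
    override func handle(action: ShieldAction,
                         for application: ApplicationToken,
                         completionHandler: @escaping (ShieldActionResponse) -> Void) {
        switch action {
        case .primaryButtonPressed:
            // This is where opening the parent app would belong.
            completionHandler(.close)
        case .secondaryButtonPressed:
            completionHandler(.defer)
        @unknown default:
            completionHandler(.none)
        }
    }
}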
While looking for a workaround for now, the problem is that the ShieldActionDelegate is not a normal app extension, which means the usual tricks for opening the parent app don't work here.
For example, UIApplication.shared.open(url) does not work because, unfortunately, we can't access UIApplication from the ShieldActionDelegate.
NSExtensionContext is also not available in the ShieldActionDelegate, so that's not possible either.
There are apps, however, that have managed to find a workaround; in my research I stumbled across these two:
https://apps.apple.com/de/app/applocker-passcode-lock-apps/id1132845904?l=en-GB
https://apps.apple.com/us/app/app-lock/id6448239603
Please find a screen recording (gif) attached.
Their workaround is 100% what I'm looking for, so there MUST be a way to do this that is compliant with the App Store guidelines (after all, the apps are available on the App Store!).
I documented my feature request more than 2 years ago in this radar as well: FB10393561
Hey, I was wondering how to react to a Focus mode being turned off inside a FocusFilterIntent. I've successfully managed to trigger a specific action when the Focus is set, but I now want to react when the Focus is deactivated. How can I achieve something like this?
Hello everyone,
I'm currently receiving feedback from clients in a production environment who are encountering a BadDeviceToken error with Live Activities, which is preventing their Live Activity states from updating. For other clients, however, the token works fine and everything functions as expected.
I’m collaborating with the back-end developers to gather more information about this issue, but the only log message we’re seeing is:
Failed to send a push, APNS reported an error: BadDeviceToken
I would greatly appreciate it if anyone could provide some insight or information on how to resolve this issue.