I've noticed issues with logging in the Mac's Console app for a very long time now (since Xcode 15?). I've just spent several hours trying to objectively observe what's going on and set up reliable logging, but I've been totally unable to. My conclusion is that logging simply cannot be relied upon at all; it's so chronically bad as to be unusable.
If, for example, I add some logging lines right at the start of didFinishLaunchingWithOptions and use a variety of logging mechanisms (NSLog(), print(), os_log_with_type(), OSLog; the AppDelegate is in Obj-C and calls Obj-C logging, then calls a Swift function for Swift logging), none of them are reliable.
If the app is built/installed via Xcode then logging is reliable, both within Xcode's console and within the Mac's Console app. But if the app is uploaded/installed via TestFlight it's a very different matter.
Sometimes, but not very often, the logging is as expected; more often than not, some of it is missing. How much is missing seems totally random: sometimes it's a little, sometimes it's a lot. Something else that very often happens is lots of duplicate logging, where each log line appears two or three times.
Here's a very simple example to illustrate what happens (for simplicity I'm just showing NSLog; don't focus on that, as I know NSLog is "old", and the result is exactly the same regardless of how the logging is actually performed).
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    appDelegate = self;
    NSLog(@"Log line 1");
    NSLog(@"Log line 2");
    NSLog(@"Log line 3");
    NSLog(@"Log line 4");
    NSLog(@"Log line 5");
    return YES;
}
When the app is downloaded from TestFlight, only very rarely will I see all 5 lines of logging; sometimes it'll be 4, sometimes 2, sometimes none. And quite often the logging is duplicated, i.e. I might see, for example, in the Console app:
Log line 1
Log line 1
Log line 2
Log line 2
Log line 3
Log line 3
In general it's just totally unreliable. It cannot be used at all.
Why is it this bad? What can be done to make logging reliable and useful?
I've spent days and days reading the recommended approaches and trying things out, including the newer mechanisms like OSLog. But it remains dreadful.
What is the recommended approach to make logging 100% reliable?
There's never any problem with Xcode's console; it's only the Mac's Console app. However, when an app installed from TestFlight is being tested, Xcode's console can't be used. So if a QA team finds problems with a TestFlight build and attaches the Console log, it's utterly useless, as its contents are effectively random.
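A workaround I've been experimenting with, in case it's useful to others: rather than relying on the Console app, have the app read back its own persisted log entries with OSLogStore (iOS 15+) and expose them for export, so QA can attach something deterministic. This is only a minimal sketch, and it assumes logging was done through os.Logger with a subsystem; "com.example.myapp" is a placeholder.

import OSLog

// Minimal sketch: read this process's persisted log entries since boot so
// they can be shown in a debug screen or exported by QA.
// "com.example.myapp" is a placeholder subsystem, not a real identifier.
func collectLogs() throws -> [String] {
    let store = try OSLogStore(scope: .currentProcessIdentifier)
    let position = store.position(timeIntervalSinceLatestBoot: 0)
    return try store.getEntries(at: position)
        .compactMap { $0 as? OSLogEntryLog }
        .filter { $0.subsystem == "com.example.myapp" }
        .map { "\($0.date) \($0.composedMessage)" }
}

This doesn't explain the missing/duplicated lines in Console, but it at least gives a log trail that doesn't depend on Console's streaming behavior.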
I am trying to play with the sample code that you provided to run the Fedora distribution. However, when I compile it with swift from the terminal, I get the following error.
error: 'VZVirtualMachineConfiguration' is only available in macOS 11.0 or newer
How can I instruct swift to fetch the proper framework?
PS: I am running everything from the terminal; I am not an IDE user.
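In case it helps anyone who hits the same error: it usually means the build's deployment target is below macOS 11, not that the framework is missing. A minimal sketch of one fix, assuming a SwiftPM setup (the package and target names here are placeholders), is to declare a platforms requirement:

// swift-tools-version:5.5
// Package.swift sketch; "FedoraVM" is a placeholder name.
import PackageDescription

let package = Package(
    name: "FedoraVM",
    platforms: [
        .macOS(.v11) // satisfies the availability check for VZVirtualMachineConfiguration
    ],
    targets: [
        .executableTarget(name: "FedoraVM")
    ]
)

When invoking swiftc directly instead, the equivalent is passing a newer target triple, e.g. -target arm64-apple-macos11.0.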
I have shortcuts up and running, and I have had my custom response added to my completion handler since day 1.
Recently I upgraded to iOS 18 and found that the app I develop cannot display the custom response.
I tested the app on iOS 17.6, and the custom response displays with no problem.
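For context, the pattern I'm using looks essentially like this (a simplified sketch; MyIntent and MyIntentResponse stand in for the Xcode-generated classes of my actual custom intent):

import Intents

// Simplified sketch of the handler. The .success response code is assumed to
// have a dialog template defined in the intent definition file.
class MyIntentHandler: NSObject, MyIntentHandling {
    func handle(intent: MyIntent,
                completion: @escaping (MyIntentResponse) -> Void) {
        completion(MyIntentResponse(code: .success, userActivity: nil))
    }
}

On iOS 17.6 the dialog from this response is displayed; on iOS 18 it is not.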
The situation is exactly like the problem posted in 2018: https://forums.developer.apple.com/forums/thread/109324
Can anyone help me, or has anyone hit the same bug?
Thank you so much!
Happy 2025
Hello, has anyone encountered the issue of NSApplicationServices being invalid when uploading a TestFlight build?
We are facing an issue with our latest iOS build.
For context, we are trying to add support for Apple Watch connectivity with tvOS. After uploading our build, we get the following error:
Invalid Info.plist key. The key 'NSApplicationServices' in bundle myapp.app/Watch/watch.app is invalid.
However, the doc indicates that NSApplicationServices must be declared in the Info.plist file (source: https://developer.apple.com/documentation/devicediscoveryui/connecting_a_tvos_app_to_other_devices_over_the_local_network?changes=_1_7)
Dev environment:
Xcode 14.0 (14A309) for development and archiving
Deployment targets: watchOS 6.0 & iOS 13.0
The watch app project is split into a Watch App target and a Watch App Extension target; it is not a watchOS-only app.
Value of the NSApplicationServices key in the Watch App's Info.plist:
<key>NSApplicationServices</key>
<dict>
    <key>Advertises</key>
    <array>
        <dict>
            <key>NSApplicationServiceIdentifier</key>
            <string>MyAppConnectId</string>
        </dict>
    </array>
</dict>
We tried creating a new watch app with the NSApplicationServices key in its Info.plist, but that fails too, with the same error.
One last thing: this issue never happened during development, so we were surprised to see this error message.
FYI, the docs we are referring to:
https://developer.apple.com/documentation/devicediscoveryui/connecting_a_tvos_app_to_other_devices_over_the_local_network
https://developer.apple.com/documentation/bundleresources/information_property_list/nsapplicationservices/
Anyone who is facing this issue, please comment on the post or contact me. Thanks in advance!
let myE_mail = "whailong" + "2010" + "@" + "g" + "ma" + "il." + "com"
First of all:
Thanks for the great presentation (wwdc2023-10180), Siraj!
This new, simple API looks like what we've been looking for: easily manageable background location updates with automatic battery-drain minimization :-)
There were two questions that came to my mind. As far as I understand, CLLocationUpdate.LiveConfiguration is used to help the location services improve the location fixes.
Are there other options planned to specify the granularity of delivered locations, e.g. how accurate the locations need to be (like the desiredAccuracy and distanceFilter settings on the old CLLocationManager)?
Does the implementation switch between significant location changes and regular, more expensive methods (like GPS hardware), or does it just deliver the most feasible accuracy available at the time of notification?
I'm just curious - if I get the most feasible granularity, everything is fine for me anyway :-)
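For reference, the usage shape I'm working from is just the async sequence (a minimal sketch; the .fitness configuration is only an example):

import CoreLocation

// Minimal sketch: consume live location updates as an async sequence.
// .fitness is one example of a LiveConfiguration value.
func observeLocations() async throws {
    for try await update in CLLocationUpdate.liveUpdates(.fitness) {
        if let location = update.location {
            print("Location: \(location.coordinate)")
        }
    }
}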
Thanks again,
Michael
Hello Apple Developer Community,
I'm developing a call-blocking app for iOS and have encountered an issue with iOS 18. Despite calls being successfully blocked by our app's Call Directory extension, they are still appearing as unanswered calls in the native Phone app.
Details:
iOS version: 18
App uses CallKit and Call Directory extension
Calls are blocked successfully (not ringing on device)
Blocked calls show up as "Unanswered" in Phone app's recent calls list
Expected behavior: Blocked calls should not appear in the Phone app's recent calls list.
Actual behavior: Blocked calls appear as "Unanswered" in the recent calls list.
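For reference, the blocking path in our extension looks essentially like this (simplified; the hard-coded number is a placeholder for the list we actually load from shared storage):

import CallKit

// Simplified sketch of our Call Directory handler.
class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Placeholder number; entries must be added in ascending order.
        let blocked: [CXCallDirectoryPhoneNumber] = [15555550100]
        for number in blocked.sorted() {
            context.addBlockingEntry(withNextSequentialPhoneNumber: number)
        }
        context.completeRequest()
    }
}

With this in place the calls are suppressed as expected; the only issue is the "Unanswered" entry left in the recents list on iOS 18.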
I have an app which passes GroupActivities messages between instances running on iOS and visionOS, provided both instances were built from the same target. The messages do not pass successfully if the apps were built from different targets, even though one target is a duplicate of the other. I have a sample demonstrating the issue:
https://github.com/bwake2012/GroupActivitiesColors
I need different targets because not all third-party libraries support all devices. Libraries which support connected external hardware may never support visionOS. Multiple targets are the simplest way I can see to deal with that.
The two targets are duplicates, except for the destinations.
The app instances appear to join the session correctly. You can see screenshots from two devices in the same session.
I see errors in the debugger:
messageStream(for:messageType:):618 Explanation: Decoding message from data Error: Swift.DecodingError.valueNotFound(Any, Swift.DecodingError.Context(codingPath: [CodingKeys(stringValue: "message", intValue: nil), CodingKeys(stringValue: "t", intValue: nil)], debugDescription: "Decoder for value of GroupActivitiesColors.ChooseColorMessage.self not found.", underlyingError: nil))
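For context, the send/receive path is essentially the following (a simplified sketch; ChooseColorMessage is reconstructed here from the error above). One thing I notice is that the error names the fully qualified type, module name plus type name, so my working guess is that the two targets produce different module names and the messenger cannot match the registered decoder across them:

import GroupActivities

// Simplified sketch of the message type and the receive loop.
struct ChooseColorMessage: Codable, Sendable {
    let colorName: String
}

func receiveColors(from messenger: GroupSessionMessenger) async {
    for await (message, _) in messenger.messages(of: ChooseColorMessage.self) {
        print("Received color: \(message.colorName)")
    }
}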
I have been struggling to test the IAP response, but it keeps returning empty. I am at the very beginning of building an app, and I don't want to submit the contacts, banking, and tax information this early. Is all of that necessary even for testing IAP locally? It doesn't make sense to me if so.
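For what it's worth, an empty response locally can also simply mean no StoreKit configuration file is selected in the scheme. With a local .storekit configuration file (Xcode's StoreKit Testing), a fetch like this minimal sketch returns products without any banking or tax setup; the product identifier is a placeholder:

import StoreKit

// Minimal StoreKit 2 sketch; "com.example.premium" is a placeholder ID that
// would need to exist in the .storekit configuration file.
func loadProducts() async {
    do {
        let products = try await Product.products(for: ["com.example.premium"])
        print(products.map(\.displayName))
    } catch {
        print("Product request failed: \(error)")
    }
}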
I have a text-based Action extension for iPhone and Mac Catalyst that I am developing in Xcode 14.3.1 on macOS 13.4.1.
I have the container app, an action, and an App Group defined.
I have confirmed that I can read the necessary shared defaults when the action launches.
At this point the UI for the action is a simple text view in which I hope to display a modified version of the text passed to the action in the NSExtensionItems.
I am not using a simulator. I am running the action directly using Xcode.
What is happening is that the ActionViewController viewDidLoad runs but no visible window opens.
In the console I see this as the action launches:
2023-07-05 18:27:23.692277-0700 XYZ[4634:279295] [ViewBridge] ViewBridge attempted to look up a hosted window with identifier 8E816BD5-67D3-402D-ADEB-AC59EDFA1F3B, but it was never registered.
2023-07-05 18:27:23.692408-0700 XYZ[4634:279295] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
....
The last line above is repeated 12 times....
Any helpful ideas would be deeply appreciated!
Steve
I'm having trouble deleting AppleDouble files residing on my custom filesystem through the Finder.
This also affects files that use the AppleDouble naming convention, i.e. their names start with '._', but aren't AppleDoubles themselves.
dtrace output
In vnop_readdir, a 'struct dirent' is set up for dotbar files and written to the uio_t buffer.
It's just that my vnop_remove is never called for dotbar files, and I don't understand why not.
Dotbar files are removed successfully when deleted via the command line.
For SMBClients, vnop_readdir is followed by vnop_access, followed by vnop_lookup, followed by vnop_remove of dotbar files.
SMBClient rm dotbar files dtrace output
Implementing vnop_access for my filesystem did not result in the combination of vnop_lookup and vnop_remove being called for dotbar files.
Perusing the kernel sources, I observed the following functions that might be involved, but I have no way of verifying this, as none of the functions of interest are dtrace(1)-able, rmdir_remove_orphaned_appleDouble() in particular.
rmdir_remove_orphaned_appleDouble() -> VNOP_READDIR()
rmdirat_internal() -> rmdir_remove_orphaned_appleDouble()
unlinkat() -> rmdirat_internal()
rmdir() -> rmdirat_internal()
Any pointers on how dotbar files may be removed through Finder would be greatly appreciated.
I'm writing a SwiftUI LDAP browser. I built a command-line Swift app to do some testing and it works fine. I had to add the certificates from the LDAP server to the system keychain before it would work with TLS/SSL.
Then I ported the same code into a SwiftUI app but I cannot get it to connect via TLS/SSL. On the same machine with the same certs it errors with:
An unexpected error occurred: message("Can't contact LDAP server")
It connects fine without TLS/SSL.
I suspect this may have to do with App Transport Security. Can anyone point me in the right direction to resolve this? The app is macOS only.
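One data point that may help narrow it down: App Transport Security only applies to URLSession/NSURLConnection traffic, not to the raw sockets an LDAP library typically uses, so ATS is probably not the culprit. The usual difference between a command-line tool and an app target is the App Sandbox: a sandboxed app needs the outgoing network connections entitlement before any outgoing connection (TLS or not) will succeed. Worth ruling out first, via the target's .entitlements file:

<key>com.apple.security.network.client</key>
<true/>

If plain (non-TLS) connections already work from the app, then the sandbox isn't blocking networking, and the remaining suspect is certificate trust evaluation differing between the two contexts.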
I would like to implement the effect where content pops out of a window into an immersive space, based on the following WWDC video.
This video introduces the new features of visionOS 2.0 by reworking Apple's sample app BOT-anist.
https://developer.apple.com/jp/videos/play/wwdc2024/10153/?time=1252
In the video, it looks like an ImmersiveSpaceAppModel is newly implemented.
However, the key code is not shown anywhere.
It passes appModel.robot as the from argument to the transform method of RealityViewContent.
It seems that BOT-anist has since been updated and can be downloaded from the following URL, but there is no class such as ImmersiveSpaceAppModel implemented in this app either.
https://developer.apple.com/documentation/visionos/bot-anist
Has it been updated yet again?
Frankly, I'm not sure if it is possible to proceed as shown in the WWDC video.
Topic:
App & System Services
SubTopic:
Processes & Concurrency
Tags:
Swift
RealityKit
Reality Composer Pro
visionOS
iOS 18 (22A3354) does not offer an option in Settings (> Apps > Phone) to enable the Live Caller ID Lookup extension after calling openSettings.
The iPhone and the MacBook are on the same network.
The PIRService runs on the MacBook and is reachable from iPhone Safari (via http://MacBookPro:8080/).
Hummingbird prints the log: hb_method=GET hb_uri=/ [Hummingbird] Request.
After deploying the application via Xcode to the iPhone, no requests are printed in the terminal.
The extension was added as documented, and the bundle ID has also been checked multiple times.
issuerRequestUri in service-config.json is http://MacBookPro:8080/issue.
As far as I can tell, everything has been set up in accordance with the Testing Live Caller ID instructions.
Is there something missing?
Receiving "The disk you attached was not readable by this computer." for an external USB recorder that worked with MacOS 14
Aiworth voice recorder.
I am trying to get universal links to work in our app, Firefox for iOS.
The Problem:
I am not able to get universal links to work for our release or beta app schemes, either locally or with a TestFlight build.
I am able to get them working on our development scheme with a locally hosted apple-app-site-association file. I was also able to get them working using our development scheme with the bundle ID set to the release app bundle ID. I also built a demo app with the release app ID and the release app development certificate; it succeeded there as well.
Implementation Steps:
Added the associated domains entitlement to the production and beta schemes for our main app target (no associated domains entitlements or capabilities added for any extensions)
Confirmed that the bundle IDs associated with these schemes have the Associated Domains capability
Added applinks:blog.mozilla.org to associated domains list
Confirmed in code that user activities are being handled via SceneDelegate.swift
Steps to Debug:
I have gone through and validated every step in the provided Universal Links debugging guide. All were successful:
Associated Domains Development -> Diagnostics: Opens Installed App
Validate AASA host and applinks match
curl -v https://blog.mozilla.org/.well-known/apple-app-site-association returns the expected json file
swcutil dl correctly downloads the AASA blob
swcutil verify succeeds
I have inspected the IPA of our beta build and confirmed the App ID and the associated domains are present in the entitlements.
I have looked at the console logs, filtering by swcd. I am not seeing any errors, and I see the download for the AASA file kick off:
Beginning data task AASA-4BABF039-3C69-4E36-AA4E-ECCDF3D14878 { domain: bl….mo….org, bytes: 0, route: cdn }
There is only one error that appears in the console, but our app is not enterprise-managed, so I assume this is normal.
Error getting enterprise-managed associated domains data. If this device is not enterprise-managed, this is normal: Error Domain=SWCErrorDomain Code=1701 "Failed to get associated domain data from ManagedConfiguration framework." UserInfo={NSDebugDescription=Failed to get associated domain data from ManagedConfiguration framework., Line=298, Function=<private>}
I have run sysdiagnose and identified for our App ID:
Site/Fmwk Approval: approved
I am at a loss as to what is preventing universal links from working even though all validation steps pass.
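For completeness, the AASA shape we validate against looks like the following (TEAMID and the app ID are placeholders here, not our actual values):

{
  "applinks": {
    "details": [
      {
        "appIDs": ["TEAMID.org.example.app"],
        "components": [
          { "/": "/*" }
        ]
      }
    ]
  }
}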
Hi everyone,
I’m currently developing an app using Apple’s RoomPlan framework, and so far, everything is working great! However, I’d like to extend the functionality to include scanning smaller objects, such as light switches or power outlets, in addition to the walls and larger furniture that RoomPlan already supports.
From what I understand, based on the documentation, RoomPlan doesn’t natively support the detection or measurement of smaller objects like these. Is that correct?
If that’s the case, does anyone have suggestions or ideas on how this could be achieved? Perhaps by integrating another framework or technology alongside RoomPlan?
I’d appreciate any insights or advice from those who have worked on similar use cases.
Thanks in advance!
I'm a complete newbie to Swift and app development, but I'm playing around with an idea that uses the ScreenTime API.
Some of the articles (and AI) mention that you need to request access from Apple to use it, but based on my research it seems like this is a bit outdated.
Can anyone provide a clear answer here + any resources you've used to navigate this? The documentation is pretty sparse.
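For what it's worth, my current understanding: for development you add the Family Controls capability to the target and request authorization at runtime (a minimal sketch below), while distributing on the App Store still appears to require requesting the distribution entitlement from Apple, which may be what those articles refer to.

import FamilyControls

// Minimal sketch (iOS 16+): request Screen Time authorization for the
// device's individual user. Requires the Family Controls capability.
func requestScreenTimeAccess() async {
    do {
        try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
    } catch {
        print("Authorization failed: \(error)")
    }
}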
Thank you in advance!
Creating my first iOS App Intents.
I created two simple App Intents: one to create a random number, and the other to store it (actually it just prints it).
Yet, when I run a shortcut that connects the two, the intent that stores the number is not receiving the entity.
It receives nil instead of the entity created in the first step.
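In case the code shape matters, here is a minimal sketch of the pattern as I understand it, using a plain Int instead of my entity: the producing intent has to declare its output via ReturnsValue for Shortcuts to pass it into the next action.

import AppIntents

// Minimal sketch: the first intent returns a value, the second receives it
// as a parameter wired up in the Shortcuts editor.
struct RandomNumberIntent: AppIntent {
    static var title: LocalizedStringResource = "Random Number"

    func perform() async throws -> some IntentResult & ReturnsValue<Int> {
        .result(value: Int.random(in: 1...100))
    }
}

struct StoreNumberIntent: AppIntent {
    static var title: LocalizedStringResource = "Store Number"

    @Parameter(title: "Number")
    var number: Int

    func perform() async throws -> some IntentResult {
        print("Stored \(number)") // stand-in for real storage
        return .result()
    }
}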
We updated the apple-app-site-association file two weeks ago, and we are only seeing the new content from Apple's CDN in certain regions, such as Texas and Canada. Regions such as Colorado intermittently see the old content, and California has been receiving the old content the whole time.
Is this a known issue? If yes, when can we expect it to be fixed, and where can we check the status? If not, can someone in charge of the CDN please look into this? Let me know if there is a better place to report this issue and get support ASAP.
Thank you in advance and happy new year!
I am writing a SwiftUI-based app and have the following requirements:
1. Use a file browser (such as UIDocumentPickerViewController) to find an arbitrary file (not one that the application knows how to open) which is external to the app bundle but local to the device the app is running on, either in local storage or on an iCloud drive.
2. Save this location.
3. At a later time, open this file. The file should open in an app that knows how to open it, or in a browser.
4. Do all of the above in a way that works across multiple devices (synced via CloudKit/SwiftData). For example, select a file on my iCloud drive on my Mac, save it (using CloudKit/SwiftData), and open it on an iPad that has an app that can open it.
I am addressing requirement #1 using UIDocumentPickerViewController wrapped in a UIViewControllerRepresentable. It returns a security-scoped URL. (Note: this worries me because of requirement #4.)
I use the Bookmark API to implement requirement #2.
For requirement #3, I load the bookmark data, convert it back to a security-scoped URL, and use either
Link("Open", destination: url)
or
@Environment(\.openURL) private var openURL

if url.startAccessingSecurityScopedResource() {
    defer { url.stopAccessingSecurityScopedResource() }
    openURL(url) { accepted in
        // do something here
    }
}
Both of these implementations fail. The Link call responds with "invalid input parameters" (Error Domain=NSOSStatusErrorDomain, Code=-50), and the openURL() call just returns false.
So, my questions are:
Since it appears that Link and openURL work for internet URLs but not for security-scoped file URLs, how do I cause a document to be opened (using an application which knows how to open it, or a browser)?
Since UIDocumentPickerViewController is returning a security-scoped URL, how can I make this work on a different device than the one on which the user selected the document? (Assuming, of course, that we are talking about a document that is on an iCloud drive that both devices have access to).
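One avenue I've found since posting, in case it helps others (not a definitive answer): since Link/openURL appear not to accept file URLs from a third-party app, the file can instead be previewed in-process with QuickLook, which handles many common document types. A minimal sketch, assuming url is the security-scoped URL resolved from the bookmark:

import SwiftUI
import QuickLook

// Minimal sketch: preview a security-scoped file with QuickLook (iOS 14+).
// NOTE: stopAccessingSecurityScopedResource() should be called after the
// preview is dismissed; omitted here for brevity.
struct PreviewButton: View {
    let url: URL
    @State private var previewURL: URL?

    var body: some View {
        Button("Open") {
            if url.startAccessingSecurityScopedResource() {
                previewURL = url
            }
        }
        .quickLookPreview($previewURL)
    }
}

UIDocumentInteractionController is another option for handing the file off to other apps. Question #2 about cross-device bookmarks remains open, since security-scoped bookmarks are, as far as I can tell, device-specific.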