Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

macOS Sequoia support for VoiceOver AppleScript automation
We are unable to programmatically enable AppleScript automation for VoiceOver on macOS 15 (Sequoia). In macOS 15, Apple moved the VoiceOver configuration from:

`~/Library/Preferences/com.apple.VoiceOver4/default.plist`

to a sandboxed path:

`~/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist`

Steps to reproduce:

1. Use a macOS 15 (ARM64) machine (or a GitHub Actions runner image with macOS 15 ARM).
2. Open VoiceOver: `open /System/Library/CoreServices/VoiceOver.app`
3. Set the SCREnableAppleScript flag to true in the new sandboxed .plist: `plutil -replace SCREnableAppleScript -bool true ~/Library/Group\ Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist`
4. Confirm `csrutil status` reports SIP as disabled or not enforced.
5. Attempt to control VoiceOver via AppleScript (e.g., using `osascript voiceOverPerform.applescript`).
6. Observe that the AppleScript command fails with no useful output (exit code 1), and VoiceOver does not respond to automation.
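A minimal sketch of those steps as a helper script, assuming SIP is disabled and that cfprefsd and VoiceOver must be relaunched to pick up the flag (the relaunch step is an assumption, not a confirmed fix):

```swift
import Foundation

// Sketch: write SCREnableAppleScript to both the legacy and the macOS 15
// sandboxed plist, then restart cfprefsd and VoiceOver so cached preferences
// are re-read. Paths are the ones from the report above.
func run(_ tool: String, _ args: [String]) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: tool)
    process.arguments = args
    try process.run()
    process.waitUntilExit()
}

let home = FileManager.default.homeDirectoryForCurrentUser.path
let plists = [
    "\(home)/Library/Preferences/com.apple.VoiceOver4/default.plist",
    "\(home)/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist"
]

for plist in plists where FileManager.default.fileExists(atPath: plist) {
    try run("/usr/bin/plutil", ["-replace", "SCREnableAppleScript", "-bool", "true", plist])
}

try run("/usr/bin/killall", ["cfprefsd"])                                // flush the prefs cache
try run("/usr/bin/open", ["/System/Library/CoreServices/VoiceOver.app"]) // relaunch VoiceOver
```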
3 replies · 0 boosts · 197 views · Jun ’25
VoiceOver cursor focus tracking
In some places of our app we use NSAccessibilityElement subclasses to vend extra items to accessibility clients, and we need to know which item has VoiceOver focus so we can keep track of it. However, setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElement instances; it is only called when accessibility clients focus view-based accessibility elements (i.e., when an NSView subclass gets focused).

At the same time, we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject, so we can't make them first responder.

Is this the expected behavior? What are our options for reacting to the VoiceOver cursor moving around? And what are our options for programmatically moving the VoiceOver cursor to a different element?

Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking

If you run the app, a window shows up containing a button and a red square. If you enable VoiceOver, you can move the cursor over the red square and a message is logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver, but when it gets focused, no message is logged.
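For reference, a sketch of both halves, assuming the focus setter and the focus-changed notification are the intended hooks for object-based elements; whether VoiceOver actually honors them for NSAccessibilityElement is exactly what is in question here:

```swift
import AppKit

final class TrackedElement: NSAccessibilityElement {
    // Logging hook: per the report, this fires for view-based elements but
    // apparently not for plain NSAccessibilityElement instances.
    override func setAccessibilityFocused(_ focused: Bool) {
        super.setAccessibilityFocused(focused)
        print("VoiceOver focus changed: \(focused)")
    }

    // Assumption: posting .focusedUIElementChanged asks assistive clients to
    // move the VoiceOver cursor here. Not confirmed for object-based elements.
    func requestVoiceOverFocus() {
        setAccessibilityFocused(true)
        NSAccessibility.post(element: self, notification: .focusedUIElementChanged)
    }
}
```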
4 replies · 0 boosts · 491 views · Mar ’25
Microphone Not Working When Running Unity Vision Pro App Normally
```csharp
// Fields declared earlier in the class (elided): isListening, microphoneInput,
// detectionThreshold, lastVoiceTime, silenceDuration, SoundDetected,
// debugText, slider.

// Start listening to the microphone
public void StartListening()
{
    if (!isListening)
    {
        try
        {
#if UNITY_IOS || UNITY_TVOS
            microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
            // Use 16,000 Hz instead of 44,100
            microphoneInput = Microphone.Start(null, true, 10, 16000);
            if (microphoneInput == null)
            {
                microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
            }
#endif
            isListening = true;
            Debug.Log(Microphone.devices.Length + " - Started listening...");
            debugText.text = Microphone.devices.Length + " - Started listening...";
        }
        catch (System.Exception e)
        {
            Debug.LogError($"Starting microphone failed: {e.Message}");
            debugText.text = $"Starting microphone failed: {e.Message}";
        }
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            // Only report silence once the threshold window has elapsed.
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    microphoneInput.GetData(samples, Microphone.GetPosition(null));
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}
```

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can't seem to access the microphone; the debug logs indicate no microphone is detected.

Question: Is there any additional configuration needed for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could cause microphone access to fail in this scenario?

Thanks in advance for any insights!

Best, Siddharth
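Not from the post itself, but the usual first check for this symptom: a visionOS build only receives microphone input when NSMicrophoneUsageDescription is present in Info.plist and the user has granted access, and running from Xcode can mask a missing prompt. A hedged sketch of the native-side check (API names assume parity with iOS):

```swift
import AVFoundation

// Sketch: if authorization was never requested (no usage-description key, or
// the prompt never fired), Unity's Microphone.devices can come back empty.
func ensureMicrophoneAccess() {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        print("Microphone access granted")
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            print("Microphone access \(granted ? "granted" : "denied")")
        }
    default:
        print("Microphone access denied or restricted; check Settings > Privacy & Security")
    }
}
```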
2 replies · 0 boosts · 438 views · Feb ’25
Attempting to go directly to a help book page results in the main help book page being displayed instead
There is an issue with Help Books that started with the release of macOS 14.4: when an app attempts to go directly to a Help Book page, the help viewer opens the Help Book's main index page rather than the specific page requested.

As I investigated the issue, I found that the requested page was actually part of the help viewer's navigation history; all I had to do was click the Back navigation arrow and the requested page would be displayed. So it seems the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.

Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code our app has used for more than a decade without issue, and everything works fine on macOS 14.3 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g., reworking the code, creating new indexes for the app's Help), but none of it seems to make a difference. As far as I can tell, the issue stems from some change made to the OS.

So my questions are:

- Is this a known bug? And if so, is there any ETA on a fix?
- Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
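One avenue to test (an alternative entry point, not a confirmed fix for the 14.4 regression) is AppKit's anchor-based lookup, which goes through the same help viewer but avoids AHGotoPage(). The anchor name below is hypothetical:

```swift
import AppKit

// Sketch: jump to a help anchor via NSHelpManager instead of AHGotoPage().
// "gettingStarted" is a hypothetical anchor the target page would need to
// define; the help book must be indexed (hiutil) for anchor lookups to resolve.
func showHelpTopic() {
    let book = Bundle.main.object(forInfoDictionaryKey: "CFBundleHelpBookName") as? String
    NSHelpManager.shared.openHelpAnchor("gettingStarted", inBook: book)
}
```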
3 replies · 0 boosts · 1.9k views · Oct ’25
iOS 26 Full Keyboard Access (navigation) and WKWebView
We use an embedded WKWebView for several screens in our app. Recently, we have been testing keyboard navigation via Full Keyboard Access in our apps. On iOS 18, everything works pretty much as expected. On iOS 26, it does not: you can "tab" away from the web view and then never tab back to it to continue keyboard navigation. Is this a known issue? Are there any workarounds that anyone is aware of?
2 replies · 0 boosts · 367 views · Nov ’25
Accessibility Full Keyboard Access issue
In our application we use a UIAlertController, presented as a popover. When accessibility Full Keyboard Access is enabled and we try to dismiss that alert controller with the Esc key on an external keyboard, nothing happens. We need to dismiss the UIAlertController with an Esc key press from an external keyboard.
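A sketch of one possible workaround, assuming the presenting view controller is under our control: register an Escape key command there and dismiss the presented alert ourselves. Whether Full Keyboard Access intercepts the key before it reaches the responder chain is the open question:

```swift
import UIKit

// Hypothetical presenting controller that handles Esc from an external keyboard.
class AlertPresentingViewController: UIViewController {
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(input: UIKeyCommand.inputEscape,
                      modifierFlags: [],
                      action: #selector(dismissPresentedAlert))]
    }

    @objc private func dismissPresentedAlert() {
        // Dismiss the popover-presented UIAlertController, if any.
        presentedViewController?.dismiss(animated: true)
    }
}
```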
2 replies · 0 boosts · 551 views · Mar ’25
AVSpeechSynthesisProviderVoice audioFileSettings field
Hello, AVSpeechSynthesisVoice has an audioFileSettings attribute:

```swift
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
// ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1,
//  "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1,
//  "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813,
//  "AVLinearPCMBitDepthKey": 32]
```

It is declared in AVSpeechSynthesisVoice:

```swift
@available(iOS 13.0, *)
open var audioFileSettings: [String : Any] { get }

@available(iOS 17.0, *)
open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
```

How can we specify the audioFileSettings attributes for an AVSpeechSynthesisProviderVoice? There is no such field in AVSpeechSynthesisProviderVoice:

```swift
open var name: String { get }
open var identifier: String { get }
open var primaryLanguages: [String] { get }
open var supportedLanguages: [String] { get }
open var voiceSize: Int64
open var version: String
open var gender: AVSpeechSynthesisVoiceGender
open var age: Int
```

Regards
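As far as I can tell, for provider voices the output format is described on the audio unit rather than on the voice. A sketch under that assumption (the format values are illustrative, mirroring the 22,050 Hz mono float output printed above):

```swift
import AVFAudio

// Sketch: an AVSpeechSynthesisProviderAudioUnit advertises its PCM format via
// its output bus, which plays the role audioFileSettings plays for system voices.
final class MyProviderAudioUnit: AVSpeechSynthesisProviderAudioUnit {
    private lazy var outputBusArray: AUAudioUnitBusArray = {
        let format = AVAudioFormat(standardFormatWithSampleRate: 22_050, channels: 1)!
        let bus = try! AUAudioUnitBus(format: format)
        return AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [bus])
    }()

    override var outputBusses: AUAudioUnitBusArray { outputBusArray }
}
```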
2 replies · 0 boosts · 132 views · Mar ’25
Guided Access Mode From Background
My team is designing an app for retail associates who need to share managed iPads. We keep the device locked in Guided Access mode on our login app until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out.

But with idle sign-out (i.e., after 60 minutes of inactivity), we need to be able to make a call from the background (even in a locked state) that signs out the user, clears the PIN code, and re-enters Single App Mode before restarting, so that once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token.

We are struggling to see whether this is even possible. Our bosses will be displeased if we tell them it isn't, so any tips would be very appreciated.
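The closest public hook I'm aware of is the Guided Access session request below, which requires a supervised device with the appropriate restrictions payload; whether it succeeds from a background task at idle sign-out, which is the crux of the question, is unverified:

```swift
import UIKit

// Sketch: ask the system to re-enter Single App Mode programmatically.
// On success the device is locked to this app until the session is ended.
func reenterSingleAppMode() {
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        print("Guided Access session request \(succeeded ? "succeeded" : "failed")")
    }
}
```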
2 replies · 0 boosts · 258 views · Mar ’25
Programmatically setting accessibility focus broken?
Hello! I'm trying to improve the accessibility of a UIKit login form in our iOS app. If an error occurs, an error message is shown in a label that is hidden by default. For our VoiceOver users, I want to move the focus to the error message label so that VoiceOver reads out the error message. I'm trying to achieve this using UIAccessibility.post, but try as I might, it does not work.

To better understand the problem, I created a very simple app which shows a button and a label (always visible); on pressing the button, I post an accessibility notification:

```swift
UIAccessibility.post(notification: .layoutChanged, argument: label)
```

What I expect to happen is for the focus to move from the button to the label. What happens instead is that the focus stays with the button and VoiceOver reads out the button's label again. So it seems to process the notification but ignore the argument. Am I misunderstanding how accessibility notifications work, or is this simply broken at the moment? I am testing this with my iPhone on the current iOS version, 18.2.1.

By the way, using the more modern variant leads to the same result:

```swift
AccessibilityNotification.LayoutChanged(label).post()
```
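Two variations that are commonly suggested for this exact symptom, sketched below with no guarantee: defer the post until the element is on screen and layout has settled, and try .screenChanged, which resets the VoiceOver cursor rather than moving it:

```swift
import UIKit

// Sketch: post after layout has settled; the argument is sometimes ignored
// when the target isn't yet on screen. The 0.5 s delay is an arbitrary guess.
func moveVoiceOverFocus(to element: UIView) {
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        UIAccessibility.post(notification: .screenChanged, argument: element)
    }
}
```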
3 replies · 0 boosts · 520 views · Jan ’25
Developer Mode restart without Home button
After enabling Developer Mode on my iPhone and restarting it, the device asks me to press the Home button to confirm. Unfortunately, my Home button is broken, so I can’t access Developer Mode. The iPhone itself still works, but I can’t enable the mode. Is there any way to bypass this without the Home button?
3 replies · 0 boosts · 94 views · Mar ’25
Custom tab bar in SwiftUI
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons, which control the selection of a paged TabView. This works well, but with VoiceOver they don't behave like the bottom tab bar or, e.g., a segmented picker. Specifically, VoiceOver does not say something like "tab, one of two" when the first button is focused.

According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup. In SwiftUI there is also the trait isTabBar, but it does not seem to have any effect on VoiceOver, and I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain), but that also does not seem to have any impact.

So, is there any way in SwiftUI to make a button behave like a tab button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
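Until isTabBar does something useful here, a fallback sketch that hand-rolls the announcement with accessibilityValue and the isSelected trait (view names and titles are illustrative, not from the original post):

```swift
import SwiftUI

// Sketch: announce "Tab n of m" manually on each button of the custom bar.
struct CustomTabBar: View {
    @Binding var selection: Int
    let titles = ["One", "Two"]

    var body: some View {
        HStack {
            ForEach(titles.indices, id: \.self) { index in
                Button(titles[index]) { selection = index }
                    .accessibilityValue("Tab \(index + 1) of \(titles.count)")
                    .accessibilityAddTraits(selection == index ? [.isSelected] : [])
            }
        }
    }
}
```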
2 replies · 0 boosts · 325 views · Aug ’25
Programmatically Modifying Per-Application or System-Wide Color Filters using Cocoa/Swift on macOS?
I'm looking into how to programmatically control color filters in the Accessibility settings under System Settings > Accessibility > Color Filters, in particular the "Intensity" and "Filter type" settings. As far as I can gather (I've poked around GitHub and Stack Overflow and queried some LLMs), changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs, but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, short of ripping source code from another project wholesale or using private APIs.

My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to layer this capability on an application-by-application basis as a third-party developer, that would be the ideal scenario: for example, modifying the look and feel/UX of Launchpad, Photos, etc., without access to the source code of the application being modified (with appropriate user consent, of course).
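For the overlay idea (as opposed to driving the system setting itself, for which I also know of no public API), a minimal sketch using only public AppKit: a click-through tinted window floated above other content. "Filter type" and "Intensity" here reduce to the overlay color and its alpha:

```swift
import AppKit

// Sketch: approximate a color filter with a borderless, click-through overlay.
func makeTintOverlay(on screen: NSScreen) -> NSWindow {
    let window = NSWindow(contentRect: screen.frame,
                          styleMask: .borderless,
                          backing: .buffered,
                          defer: false)
    window.level = .screenSaver                 // float above normal windows
    window.ignoresMouseEvents = true            // let clicks pass through
    window.isOpaque = false
    window.backgroundColor = NSColor.systemRed.withAlphaComponent(0.15) // "intensity"
    window.orderFrontRegardless()
    return window
}
```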
0 replies · 0 boosts · 375 views · Jul ’25
VoiceOver doesn't work for AVRoutePickerView wrapped in UIViewRepresentable
Hi, I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:

```swift
func makeUIView(context: Context) -> UIView {
    let routePickerView = AVRoutePickerView()

    // Configure the button's color.
    //routePickerView.delegate = context.coordinator
    //routePickerView.backgroundColor = .secondarySystemBackground
    routePickerView.tintColor = .accent
    routePickerView.activeTintColor = .accent

    // Indicate whether your app prefers video content.
    routePickerView.prioritizesVideoDevices = false

    return routePickerView
}
```

I commented out routePickerView.delegate = context.coordinator because it doesn't compile; context.coordinator is of type Void and I'm not sure how to fix that. I'm not sure if that has anything to do with the issue.

Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But with VoiceOver, if I select the button and double-tap, nothing happens; it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show with VoiceOver?
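On the Void coordinator: Context.Coordinator defaults to Void when the representable doesn't implement makeCoordinator(). A sketch that restores the delegate hookup; it may not cure the VoiceOver double-tap, but it removes one variable:

```swift
import SwiftUI
import AVKit

struct RoutePicker: UIViewRepresentable {
    // Declaring a Coordinator means context.coordinator is no longer Void.
    final class Coordinator: NSObject, AVRoutePickerViewDelegate {
        func routePickerViewWillBeginPresentingRoutes(_ routePickerView: AVRoutePickerView) {
            print("AirPlay route picker will present")
        }
    }

    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.delegate = context.coordinator  // now compiles
        routePickerView.prioritizesVideoDevices = false
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {}
}
```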
3 replies · 0 boosts · 385 views · Aug ’25
VoiceOver spells word letter by letter
We currently have an odd issue with VoiceOver spelling a word letter by letter, while the same word is spoken as a whole for other items. The app is in German. I have a view in SwiftUI whose button traits are removed, and a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here; all fine. If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole word?
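One SwiftUI-level knob worth trying (no guarantee it overrides the German voice's heuristics for the token "Tab"): speechSpellsOutCharacters(false), which hints that the label should be read as words:

```swift
import SwiftUI

// Sketch: hint that the label should be spoken as words rather than spelled out.
struct TabLabel: View {
    var body: some View {
        Text("SimplyGo Tab 3 von 5")
            .accessibilityLabel("SimplyGo Tab 3 von 5")
            .speechSpellsOutCharacters(false) // iOS 15+; a hint, not a command
    }
}
```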
4 replies · 0 boosts · 1.2k views · Jan ’25