In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when an NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message is logged.
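For reference, here's a minimal sketch of the two pieces involved; whether posting the notification reliably moves the VoiceOver cursor to a non-view element is exactly what we're unsure about:

import Cocoa

final class ExtraElement: NSAccessibilityElement {
    // Called for view-based elements; for NSAccessibilityElement
    // subclasses VoiceOver never seems to invoke it.
    override func setAccessibilityFocused(_ focused: Bool) {
        super.setAccessibilityFocused(focused)
        NSLog("focus changed: \(focused)")
    }
}

// Our attempt at programmatically moving the VoiceOver cursor:
func moveVoiceOverCursor(to element: NSAccessibilityElement) {
    NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
}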
Hello everyone,
I'd like to report an issue I've encountered when using a Bluetooth mouse together with AssistiveTouch on an iPhone running iOS 16.5.
This has also been reported via Feedback Assistant with
Feedback ID: FB17806167
Description:
When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation.
Specifically:
The pointer cannot move past the center of the screen
Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape.
This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.
Steps to Reproduce:
Enable AssistiveTouch
Connect a Bluetooth mouse to the iPhone
Rotate the device to landscape orientation
Try moving the mouse pointer across the screen
→ Notice that:
Pointer cannot move past the center
Horizontal/vertical input is interpreted incorrectly (as if still in portrait)
Expected Behavior:
The mouse pointer should move across the entire screen correctly, regardless of device orientation.
Actual Behavior:
In landscape orientation, the pointer is either restricted to part of the screen or misaligned.
It behaves as if the device is still in portrait.
Horizontal mouse movement causes vertical pointer movement, and vice versa
User experience feels broken and unintuitive
Feature Suggestion:
Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS.
I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.
Thanks in advance for any insights or suggestions.
Best regards,
Jannis
Dear developer team,
After updating to iOS 18.3.1, I noticed the font in the Notes app became too small to read comfortably, and my eyesight is already poor.
There is no way to increase the font size. When I select my preferred text size through the Accessibility settings, it only changes the size of headings in the Notes app; the body text of the note itself remains too small. I'm using an iPhone 13.
I googled the issue, and it seems other users across the internet are also unhappy about the inability to change the text size in Notes to a comfortable level.
I hope the developers will address this issue in the next version of iOS, because the reading size in a standard app matters to those of us with tired and diminished eyesight.
Kind regards,
Maria
I'm looking into how to programmatically control color filters in the Accessibility settings under "System Settings" -> "Accessibility" -> "Color Filters", in particular the "Intensity" and "Filter type" settings.
As far as I have gathered, changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs (I've poked around GitHub, Stack Overflow, and queried some LLMs), but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, without ripping off source code from another project wholesale or using private APIs.
My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to overlay this capability on an application-by-application basis as a third-party developer, that would be the most ideal scenario. For example, modifying the look and feel/UX for Launchpad, Photos, etc, as a third-party developer without accessing the source code of the application that I'm modifying the look/feel for (with appropriate user consent of course).
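Absent a public hook into Color Filters itself, the fallback I'm considering is a click-through tinted overlay window built only from public AppKit API; a rough sketch (the window level, tint, and intensity handling are my assumptions, not a recreation of the system filter):

import Cocoa

// A borderless, click-through window that tints everything beneath it.
func makeTintOverlay(color: NSColor, intensity: CGFloat) -> NSWindow {
    let frame = NSScreen.main?.frame ?? .zero
    let window = NSWindow(contentRect: frame, styleMask: .borderless,
                          backing: .buffered, defer: false)
    window.level = .screenSaver                  // float above app windows
    window.ignoresMouseEvents = true             // let clicks pass through
    window.isOpaque = false
    window.backgroundColor = color.withAlphaComponent(intensity)
    window.collectionBehavior = [.canJoinAllSpaces, .stationary]
    return window
}

A per-application version would additionally need to track the target app's window frames, which is where it gets murky without private APIs.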
If I use NSAlert the buttons look like this:
The Cancel button has a gray background. We got complaints about the bad contrast and people pointed out that the alerts from System Settings look like this:
Here the Cancel button has a white background. Unfortunately I did not find out how to make the buttons in my own alerts look like those in System Settings. Setting the button's bezel color to white did not work. Any help would be highly appreciated. Thanks.
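For reference, a minimal sketch of what I tried; the bezelColor line is the part that has no visible effect:

import AppKit

let alert = NSAlert()
alert.messageText = "Delete the file?"
alert.addButton(withTitle: "Delete")   // default button
alert.addButton(withTitle: "Cancel")   // renders with the gray background
alert.buttons[1].bezelColor = .white   // no visible effect on the bezel
alert.runModal()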
Best regards,
Marc
[macOS 15.4] Game Controller Background Input Capture Broken - Accessibility App No Longer Functions
Our application,
https://apps.apple.com/us/app/gamecontroller-mapper/id6737088417
which maps game controller inputs to keyboard/mouse events system-wide, has stopped functioning properly after the macOS 15.4 update. Specifically, the app can no longer capture game controller inputs when running in the background, severely impacting its core functionality.
Environment
macOS version: 15.4
Previous working versions: All versions prior to 15.4
App type: Background utility with accessibility permissions
Hardware: All game controller brands compatible with macOS
Detailed Description
Before macOS 15.4
Our application correctly captured game controller inputs from any brand connected to Mac and successfully translated them to keyboard/mouse events system-wide. Users could control any application (e.g., scrolling through documents in Preview using controller buttons) while our app ran in the background with the accessibility permissions granted.
After macOS 15.4
The application only works when it has active focus (is in the foreground). When any other application gains focus, our app completely stops receiving or detecting any input events from the game controller while running in the background. For instance, pressing the 'down' button on the controller while another app is active results in no event being registered within our application.
We've tried updating the app to work in accessory mode (in the menubar), but the issue persists.
Steps to Reproduce
Install our application on macOS 15.3 or earlier
Grant accessibility permissions when prompted
Connect a compatible game controller (e.g., Xbox or other controller)
Open another application (e.g., Preview with a PDF document)
Press buttons on the controller to navigate the document without touching the keyboard
Expected result on 15.3: Controller inputs are translated to keyboard events, even when our app is in the background
Upgrade to macOS 15.4
Repeat steps 2-5
Actual result on 15.4: Controller inputs are only translated to keyboard events when our application has focus
Technical Implementation
Our app uses the following (a condensed sketch follows this list):
CGEvent.tapCreate() to create a global event tap
CGEvent for simulating keyboard and mouse events
GCController.extendedGamepad?.valueChangedHandler for detecting controller inputs
Proper NSAccessibilityUsageDescription and appropriate entitlements
GCController.shouldMonitorBackgroundEvents = true to ensure controller events continue when the app is inactive
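Here is that input path, condensed (simplified; the event-tap creation is omitted, and the key mapping is illustrative):

import GameController
import CoreGraphics

func startControllerBridge() {
    // Ask for controller events even while the app is inactive.
    GCController.shouldMonitorBackgroundEvents = true

    NotificationCenter.default.addObserver(forName: .GCControllerDidConnect,
                                           object: nil, queue: .main) { note in
        guard let controller = note.object as? GCController else { return }
        controller.extendedGamepad?.valueChangedHandler = { _, element in
            // Map the element to a key and synthesize the press (requires
            // the accessibility permission). 125 = down arrow, illustrative.
            let keyCode: CGKeyCode = 125
            CGEvent(keyboardEventSource: nil, virtualKey: keyCode, keyDown: true)?
                .post(tap: .cghidEventTap)
            CGEvent(keyboardEventSource: nil, virtualKey: keyCode, keyDown: false)?
                .post(tap: .cghidEventTap)
        }
    }
}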
Possible Relation to Recent Changes
We noticed in the macOS 15.4 Release Notes:
Game Controller - Resolved Issues:
Fixed: Game controllers might stop responding when accessibility features, such as Voice Over, are enabled. (141497799)
We suspect this fix might have introduced a regression or intentional limitation affecting applications like ours that rely on background event simulation with game controller input.
Impact
This change severely impacts:
Applications designed to use game controllers as assistive input devices for users who may have difficulty using traditional keyboard and mouse inputs
Applications for media control, presentation navigation, and other similar use cases
Users who rely on our application for accessibility purposes
Questions
Is this an intentional security change or an unintended side effect of the controller fix mentioned in the release notes?
Are there any new APIs or alternative approaches we should implement to restore functionality?
If this is a system bug, when can we expect a fix?
We would greatly appreciate any guidance on how to restore our application's functionality. Thank you for your assistance.
I updated to the 18.3 beta, but lost the video and audio options with that update. I tried to restore with iTunes on Windows 11, but the restore got stuck partway through. I forcefully turned the iPad off; after two tries it went off, with a blanked-out screen. Since then I've tried every trick to turn it on, but it won't power on, and I've used all the advice I could find on the internet. It was 90% charged and working superbly before this. Now there is no charging icon and no sign of life. Please tell me a solution: how can I turn it on?
accessibilityUserInputLabels works fine on any view I've tried it with, meaning the control can be activated via the provided alternative names when using Voice Control.
When setting this property on a UIBarButtonItem, though, Voice Control seems to ignore the alternative names provided through accessibilityUserInputLabels. For comparison, accessibilityLabel works perfectly when set on a UIBarButtonItem.
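A minimal sketch of the comparison:

import UIKit

// Honored by Voice Control on a regular control:
let button = UIButton(type: .system)
button.accessibilityUserInputLabels = ["Send", "Submit"]

// Seemingly ignored on a bar button item:
let barItem = UIBarButtonItem(title: "Send", style: .plain, target: nil, action: nil)
barItem.accessibilityLabel = "Send"                         // works
barItem.accessibilityUserInputLabels = ["Send", "Submit"]   // ignored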
Is anyone facing the same issue?
Using Xcode 16.0 (16A242) on iOS 18
In our application we use a UITableView for data population, and the table view cell contains a button. With Full Keyboard Access enabled, only the table view cell receives focus, not the button. We need the cell and the button to be focusable separately.
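One thing we plan to try (an untested sketch, on the assumption that Full Keyboard Access follows the accessibility tree) is exposing the label and the button as separate accessibility elements:

import UIKit

final class ButtonCell: UITableViewCell {   // hypothetical cell class
    let titleLabel = UILabel()
    let actionButton = UIButton(type: .system)

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        contentView.addSubview(titleLabel)
        contentView.addSubview(actionButton)
        // Expose the label and the button as separate elements so focus
        // can land on each one individually.
        isAccessibilityElement = false
        accessibilityElements = [titleLabel, actionButton]
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}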
I’m trying to enroll in the Apple Developer Program as an individual. I’ve gone through the steps on the website and started the purchase process. However, after a couple of days when I return to the site, it doesn’t remember my progress — I have to start the enrollment from scratch every time.
Is this expected behavior? Am I missing a step to save my progress or complete the enrollment properly?
Any help or guidance would be appreciated. Thank you!
It's very annoying: on my iPhone 12 Pro, the accessibility app keeps opening by itself with the microphone on, showing a blank screen, and every time I close it, it just reopens. I don't know why it keeps doing this, but it drives me crazy. Does anyone know what else to do? I'm on the iOS 26 beta, but it was doing this on the previous update as well.
I am trying to grant Input Monitoring permission using MDM (Mobile Device Management), but I am facing issues. While I am able to deny the permission, I am unable to grant it.
In some profile configurator tools, I noticed a note stating:
"Allows the application to use CoreGraphics and HID APIs to listen to (receive) CGEvents and HID events from all processes. Access to these events cannot be given in a profile; it can only be denied."
This seems to suggest that granting Input Monitoring permission via an MDM profile may not be possible.
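For context, the relevant fragment of a PPPC (Privacy Preferences Policy Control) payload would look like this; the bundle ID and code requirement are placeholders, and Allowed only accepts false for the ListenEvent service:

<key>Services</key>
<dict>
    <key>ListenEvent</key>
    <array>
        <dict>
            <key>Identifier</key>
            <string>com.example.myapp</string>
            <key>IdentifierType</key>
            <string>bundleID</string>
            <key>CodeRequirement</key>
            <string>identifier "com.example.myapp" and anchor apple generic</string>
            <key>Allowed</key>
            <false/>
        </dict>
    </array>
</dict>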
Has anyone successfully granted Input Monitoring permission using MDM, or is there an alternative way to achieve this on managed macOS devices?
This issue keeps coming back again and again. I restarted my laptop and I have enough storage, but I don't know why it keeps happening!
Hi,
On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor).
In a VStack, you just have to add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work if the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.
There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following.
.accessibilityRotor(.headings) {
    ForEach(entries) { entry in
        // entry.id must be the same as the id of the SwiftUI view it is about
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}
It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears. You have to move the accessibility focus inside the screen before your headers show up.
I'm a beginner in regards to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor would mean some users would miss them.
So my question is: is there a way for headers inside a LazyVStack (which are not necessarily visible at first) to be in the headings rotor as soon as the screen appears? (be it using .accessibilityRotor(.headings) or anything else)
The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).
Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.
Thanks
Hi! I have noticed a few glitches as well as some overall unfortunate cons with the assistive access mode.
Alarms, timers, stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that app's sound alerts, so I know it is possible. Do I need to download a separate alarm app for alerts to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it on too long without touching the screen (auto locks).
There is no flashlight option. I downloaded an app to add this feature, but since the screen locks when untouched, the flashlight in the app shuts off until I unlock the phone again.
I have a couple follow up questions after the "Accessibility technologies group lab".
I know it was briefly mentioned that user feedback is an excellent way to grow inclusivity in the design of an app, and that these forums are one avenue for that.
Is inviting folks here on the forums via TestFlight a reasonable approach to this for a solo developer?
Are there other strategies, avenues, or examples to promote user feedback?
I need to direct text-to-speech audio from my app simultaneously to a Bluetooth speaker AND to the internal iPad speaker. The app uses AVSpeechSynthesizer and several third-party speech engines. How best to do this?
I noticed the outputChannels property on AVSpeechSynthesizer. Are there any examples of how to use it?
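For the discussion, here is roughly what I'd expect outputChannels usage to look like (a sketch, not verified; whether .multiRoute will actually drive a Bluetooth device and the built-in speaker at the same time is exactly what I'm unsure about):

import AVFoundation

let session = AVAudioSession.sharedInstance()
// .multiRoute is the category documented for addressing multiple output
// ports at once (its Bluetooth support is limited, though).
try? session.setCategory(.multiRoute)
try? session.setActive(true)

let synthesizer = AVSpeechSynthesizer()
// Hand the synthesizer every channel on every active output port.
synthesizer.outputChannels = session.currentRoute.outputs.flatMap { $0.channels ?? [] }
synthesizer.speak(AVSpeechUtterance(string: "Hello"))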
I have a product for designing particle emitters, which I suspect may be of limited interest to people with limited vision.
I'd still like to ensure I'm doing a good job with VoiceOver mode.
There's a related, simplified sample online, if you want to look at the code
As you can see from the picture below, a large part of the interface mimics Xcode's particle editor, with many value entry controls that combine up/down buttons with a tappable label. Tapping the label goes into edit mode.
Apart from changing how labels are stepped through with VoiceOver in my app, how should I handle these stepper buttons? Is this a good place to use a Custom Rotor?
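One pattern I'm considering (a SwiftUI sketch; "Birth rate" and the step amount are made-up placeholders) is collapsing each cluster into a single adjustable element, so VoiceOver users swipe up or down to change the value instead of stepping through three separate controls:

import SwiftUI

struct StepperField: View {
    @State private var value = 100.0
    private let step = 10.0

    var body: some View {
        HStack {
            Button("-") { value -= step }
            Text(value, format: .number)
            Button("+") { value += step }
        }
        // Collapse the three controls into one adjustable element.
        .accessibilityElement(children: .ignore)
        .accessibilityLabel("Birth rate")
        .accessibilityValue(Text(value, format: .number))
        .accessibilityAdjustableAction { direction in
            switch direction {
            case .increment: value += step
            case .decrement: value -= step
            @unknown default: break
            }
        }
    }
}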
At present, on iOS, our in-house (enterprise-distributed) app may crash on iOS 18.3 and later, while it works normally on other phones; the certificate is not the problem.
We found a total of 3 affected machines, with no pattern among them: different models and different system versions.
On a machine that crashes, the same app downloaded from the App Store does not crash; but if the same app is packaged in the development tool and installed directly, it crashes. Is this related to compatibility with the new version of iOS?
Is there a solution? Have others run into this as well?
After replacing macOS Big Sur 11.0 with the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window
AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid of the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@ frontWindow: %@", app, frontWindow);
The 'frontWindow' reference is never created, and I get error -25204 (kAXErrorCannotComplete). It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
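One thing I still need to rule out: the AX APIs also return kAXErrorCannotComplete when the process has lost its Accessibility grant in System Preferences > Security & Privacy (an OS update can reset it). The quick check/prompt I'm using, continuing in Objective-C:

// Prompts the user if the process isn't trusted for Accessibility.
NSDictionary *options = @{(__bridge id)kAXTrustedCheckOptionPrompt: @YES};
BOOL trusted = AXIsProcessTrustedWithOptions((__bridge CFDictionaryRef)options);
NSLog(@"trusted: %d", trusted);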