Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under the Accessibility & Inclusion topic. Each post below lists its reply count, boost count, view count, and most recent activity date.


iPhone screen layers
I need to understand the different layers in the iPhone X and later OLED screens, as I am designing a hardware attachment. The screens seem to project letters and images from a layer other than the subpixel layer. Is this proprietary information, or is there a resource that explores these layers?
Replies: 0 · Boosts: 0 · Views: 104 · Apr ’25
Frames rotated and shifted in landscape for iOS simulator
When I try to get the frames of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames are shifted and rotated on the iOS simulator. I get the same frames from the Accessibility Inspector when the Mac is selected as the host; when I switch the host to the iOS simulator, the frames are correct. How is the Accessibility Inspector getting the correct frames, and how can I do the same in my app?
Replies: 1 · Boosts: 0 · Views: 130 · Jun ’25
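For reference, a minimal sketch of the attribute lookup being discussed. How to map the returned host-space coordinates back into the simulator's own coordinate space (the actual question above) is not shown and remains an open assumption:

    import ApplicationServices

    // Sketch: read an element's frame from kAXPositionAttribute and kAXSizeAttribute.
    // Both come back in screen (host) coordinates, which may be why simulator
    // content appears shifted and rotated relative to the simulated device.
    func frame(of element: AXUIElement) -> CGRect? {
        var positionRef: CFTypeRef?
        var sizeRef: CFTypeRef?
        guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &positionRef) == .success,
              AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeRef) == .success,
              let posValue = positionRef, let sizeValue = sizeRef else { return nil }
        var origin = CGPoint.zero
        var size = CGSize.zero
        guard AXValueGetValue(posValue as! AXValue, .cgPoint, &origin),
              AXValueGetValue(sizeValue as! AXValue, .cgSize, &size) else { return nil }
        return CGRect(origin: origin, size: size)
    }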
I have a problem
I want to open a developer account as an organization rather than as an individual. I have an existing company, a D-U-N-S number, a finished website, and an official email address, so everything is ready. But when I apply, Apple emails me saying they want a public-facing website in the organization's name, even though all of these requirements have already been met. Why do they not respond to us?
Replies: 1 · Boosts: 0 · Views: 575 · Sep ’25
Getting precise text position with Swift for macOS
Hey there! Hope you are starting the year with great joy.

My situation: I'm building a new product that is based on detecting certain text on screen in real time. The product is targeted only at the Mac and is built with Swift.

My problem: I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.

My discoveries so far: I've tried OCR, but it is too slow for what I'm building, so the only possible way I can think of is the Accessibility API.

Thank you in advance.
Replies: 2 · Boosts: 0 · Views: 552 · Jan ’25
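A hedged sketch of one approach that goes beyond the element's overall frame: text elements that support the parameterized bounds-for-range attribute can report the on-screen rect of a specific character range. Whether the target app's text element actually supports this attribute is an assumption:

    import ApplicationServices

    // Sketch: ask a text element for the screen rect of a character range via
    // kAXBoundsForRangeParameterizedAttribute (not every element supports it).
    func bounds(of range: CFRange, in textElement: AXUIElement) -> CGRect? {
        var cfRange = range
        guard let rangeValue = AXValueCreate(.cfRange, &cfRange) else { return nil }
        var boundsRef: CFTypeRef?
        guard AXUIElementCopyParameterizedAttributeValue(
                textElement,
                kAXBoundsForRangeParameterizedAttribute as CFString,
                rangeValue,
                &boundsRef) == .success,
              let boundsValue = boundsRef else { return nil }
        var rect = CGRect.zero
        guard AXValueGetValue(boundsValue as! AXValue, .cgRect, &rect) else { return nil }
        return rect // screen coordinates
    }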
Accessibility IDs showing up in Accessibility Inspector, but automated testing script is unable to find them
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:

    ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
        Button {
            navigationCallback(paymentTool.segueID)
        } label: {
            PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
                .accessibilityElement()
                .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
        }
        if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
            Divider()
        }
    }

So you can see the accessibility ID is there, and it shows up properly when I open Accessibility Inspector with the simulator, but the testing script isn't picking it up, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one. If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
Replies: 1 · Boosts: 0 · Views: 153 · May ’25
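Not a confirmed fix, but one quick diagnostic sketch: Appium's XCUITest driver queries roughly the same element tree as XCUITest, so checking whether the identifier is reachable from a plain XCUITest can narrow down where it gets lost. The "Billing_ApplePay" identifier below is hypothetical, standing in for whatever one payment tool title produces:

    import XCTest

    // Sketch: verify the generated identifier is exposed in the XCUI element tree.
    final class PaymentToolsMenuTests: XCTestCase {
        func testPaymentToolRowIsFindable() {
            let app = XCUIApplication()
            app.launch()
            let row = app.buttons["Billing_ApplePay"] // hypothetical example identifier
            XCTAssertTrue(row.waitForExistence(timeout: 5),
                          "Identifier not exposed; worth trying it on the Button itself or combining children")
        }
    }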
Default Voices for AVSpeechUtterance
It appears iOS only comes with low-quality voices installed, and it requires the user to go into Settings to download higher-quality voices for use with AVSpeechUtterance. There doesn't seem to be any API that makes this process easier for the app user. Is there a way, or an API, that would allow an app to download and use a higher-quality voice? Will Apple ever install higher-quality voices by default? We really want to use the text-to-speech API in iOS, but the very high amount of user friction involved in getting high-quality voices is stopping us. I would appreciate a response. Thanks
Replies: 0 · Boosts: 0 · Views: 709 · Sep ’25
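For reference, a minimal sketch of what the public API allows today, as far as I know: an app can enumerate the voices the user has already installed and prefer the best quality available, but it cannot trigger a voice download itself.

    import AVFoundation

    // Sketch: prefer the best-quality installed voice for a language.
    // Apps can only choose among voices the user has already downloaded.
    func bestInstalledVoice(for language: String) -> AVSpeechSynthesisVoice? {
        let candidates = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == language }
        return candidates.first { $0.quality == .premium }   // .premium requires iOS 16+
            ?? candidates.first { $0.quality == .enhanced }
            ?? candidates.first
    }

    let synthesizer = AVSpeechSynthesizer()   // keep a strong reference in real code
    let utterance = AVSpeechUtterance(string: "Hello there")
    utterance.voice = bestInstalledVoice(for: "en-US")
    synthesizer.speak(utterance)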
Please consider having Name Recognition in a shortcut automation
Request: Name Recognition → Shortcut trigger for SOS flashlight + vibration.

Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss. I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts — or expanded “Custom” to support names/words — this would open up far more practical, real-world alerts.
Replies: 1 · Boosts: 0 · Views: 592 · Sep ’25
VoiceOver and limited vision approaches for custom stepper control
I have a product for designing particle emitters, which I suspect may be of limited interest to people with limited vision. I'd still like to ensure I'm doing a good job with VoiceOver. There's a related, simplified sample online if you want to look at the code. As you can see from the picture below, a large part of the interface mimics Xcode's particle editor, with many value-entry controls that combine up/down buttons with a tappable label; tapping the label enters edit mode. Apart from changing how labels are stepped through with VoiceOver in my app, how should I handle these stepper buttons? Is this a good place to use a custom rotor?
Replies: 0 · Boosts: 0 · Views: 79 · Jun ’25
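One common pattern for this kind of control, offered as a hedged sketch rather than a definitive answer: collapse each up/down-plus-label group into a single adjustable element, so VoiceOver users swipe up or down to change the value instead of hunting for the small stepper buttons. The view and property names below are illustrative.

    import SwiftUI

    // Sketch: one adjustable VoiceOver element per value control.
    struct EmitterValueControl: View {
        @Binding var value: Double
        let label: String
        let step: Double

        var body: some View {
            HStack {
                Button("-") { value -= step }
                Text("\(label): \(value, specifier: "%.2f")")
                Button("+") { value += step }
            }
            .accessibilityElement(children: .ignore)   // hide the individual buttons
            .accessibilityLabel(label)
            .accessibilityValue(String(format: "%.2f", value))
            .accessibilityAdjustableAction { direction in
                switch direction {
                case .increment: value += step
                case .decrement: value -= step
                @unknown default: break
                }
            }
        }
    }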
Accessibility: VoiceOver is not treating the navigation bar left button as the first focused element
VoiceOver is not treating the navigation bar left button as the first focused element. If we navigate from A to B, focus goes to the first element inside the B view, not to the back button or B's navigation title. If we post an accessibility notification in onAppear of B, focus does not shift; VoiceOver reads the back button first and then reads B's content item, but it does not actually focus the back button in SwiftUI. What should I do if I want focus to land on the navigation bar back button or the navigation title? My understanding is that the system prioritizes the first focusable element in the view hierarchy, but the navigation bar (including the close button and title) is managed separately by the system; it is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If my understanding is right, this seems a little strange. Why was it designed this way? Can you share the thinking behind it? Thanks
Replies: 0 · Boosts: 0 · Views: 377 · Sep ’25
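A sketch of the mechanism SwiftUI does offer, as far as I know: @AccessibilityFocusState can move VoiceOver focus to elements you own (for example a title view placed in the toolbar), but the system-provided back button is not directly addressable this way. The view names and the delay value below are illustrative assumptions.

    import SwiftUI

    // Sketch: move VoiceOver focus to an element you own when the screen appears.
    struct DetailScreen: View {
        @AccessibilityFocusState private var titleIsFocused: Bool

        var body: some View {
            List { Text("Detail content") }
                .navigationTitle("Detail")
                .toolbar {
                    ToolbarItem(placement: .principal) {
                        Text("Detail")
                            .accessibilityAddTraits(.isHeader)
                            .accessibilityFocused($titleIsFocused)
                    }
                }
                .onAppear {
                    // Give VoiceOver a moment to load the new screen before moving focus.
                    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
                        titleIsFocused = true
                    }
                }
        }
    }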
ARKit Eye Tracking Calibration Issues - Word-Level Reading Tracking Feasibility
Hi Apple Developer Community,

I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection, using Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.

Current implementation:
- Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
- Implementing custom sensitivity curves and smoothing algorithms
- Applying baseline correction and coordinate mapping
- Using quadratic regression for calibration point mapping

Issues I'm facing:
1. Calibration mismatch: the red dot position doesn't align with where I'm actually looking; there is a significant offset between the intended gaze point and the actual cursor position, and calibration seems to drift or become inaccurate over time.
2. Extreme eye movement requirements: I need to make exaggerated eye movements to reach screen edges and corners; natural eye movements don't translate to proportional cursor movement, and some screen regions are difficult to reach even with calibration.
3. Sensitivity and stability issues: the cursor jitters or jumps around when looking at the center; there is too much sensitivity to micro-movements and inconsistent behavior between calibration and normal operation.
4. I also noticed that tracking on the calibration screen, as well as on the reading screen, works better (as expected) when head movement is present, but I do not want much head movement. I want tracking with normal eye movement while reading an ebook.

Primary question: word-level eye-tracking feasibility. Is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware? I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering if the TrueDepth camera and ARKit's eye-tracking capabilities are sufficient for:
- Tracking natural reading patterns (left-to-right, line-by-line progression)
- Detecting which specific words a user is looking at
- Maintaining accuracy for sustained reading sessions (15-30 minutes)
- Working reliably across different users and lighting conditions

Questions for the community:
- Hardware limitations: are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
- Calibration best practices: what calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
- Reading-specific challenges: are there particular challenges when tracking reading behavior vs. general gaze tracking?
- Alternative approaches: are there better approaches than ARKit blend shapes for this use case?

Current setup:
- Device: iPhone 14 Pro
- iOS version: iOS 18.3
- ARKit version: latest available

Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye-tracking applications or have experience with the limitations and capabilities of ARKit's eye-tracking features. Thank you for your time and expertise!
Replies: 0 · Boosts: 0 · Views: 689 · Oct ’25
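For context, a minimal sketch of the per-frame data ARKit actually exposes, assuming a front-camera ARFaceTrackingConfiguration session; mapping these values to on-screen gaze points (the calibration step the post struggles with) is not shown:

    import ARKit

    // Sketch: read the eye-related signals ARKit provides on each face anchor update.
    final class FaceGazeReader: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
            let blend = face.blendShapes
            let lookUpLeft = blend[.eyeLookUpLeft]?.floatValue ?? 0   // 0...1 coefficient
            let lookInLeft = blend[.eyeLookInLeft]?.floatValue ?? 0
            // lookAtPoint (in face-anchor space) is often steadier than raw blend shapes.
            let lookAt = face.lookAtPoint
            print(lookUpLeft, lookInLeft, lookAt)
        }
    }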
Defining boundaries of inline dialogs for VO users
Hello, I had submitted a question asking which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182. That question stems from a more direct one about specific components: do tab lists and disclosures natively include haptics, a screen reader hint, or other state or properties to indicate to screen reader users where the component begins or ends? In some web experiences there is screen reader hint text stating "end of..." or "entering..." as a way to define the boundaries of these inline dialogs. I had asked about haptics in the prior thread because I do not recall a natively implemented version of this, except for some haptic cues that I have not experienced consistently, so I am not sure whether that is an intended native Swift implementation or perhaps something custom.
Replies: 0 · Boosts: 0 · Views: 112 · May ’25
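I'm not aware of a documented built-in "end of group" haptic or hint for these components, so treat the following as an assumption-laden sketch of how the web pattern could be approximated manually with hints; the strings and view names are made up for illustration.

    import SwiftUI

    // Sketch: hand-rolled "entering / end of" boundary hints on a disclosure.
    struct ShippingFAQ: View {
        @State private var isExpanded = false

        var body: some View {
            DisclosureGroup("Shipping details", isExpanded: $isExpanded) {
                Text("Orders ship within two business days.")
                    .accessibilityHint("End of shipping details")
            }
            .accessibilityHint(isExpanded ? "Expanded, entering shipping details" : "Collapsed")
        }
    }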
Having trouble with Accessibility API of the ApplicationServices framework
After replacing Big Sur macOS 11.0 with the latest 11.5, my app's AXObserverAddNotification methods fail. Here is sample code I tested, from StackOverflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window

    AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for front-running Xcode 12.5.1
    CFTypeRef frontWindow = NULL;
    AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
    if (err != kAXErrorSuccess) {
        NSLog(@"failed with error: %i", err);
    }
    NSLog(@"app: %@  frontWindow: %@", app, frontWindow);

The 'frontWindow' reference is never created and I get the error number -25204. It seems the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
Replies: 2 · Boosts: 0 · Views: 798 · Jun ’25
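For what it's worth, -25204 corresponds to kAXErrorCannotComplete in AXError.h, which generally means the AX messaging to the target process could not complete. After an OS update, one common thing worth re-checking is the app's Accessibility permission; a small sketch of the standard check (the prompt option is optional):

    import ApplicationServices

    // Sketch: check (and optionally prompt for) Accessibility permission, which
    // sometimes needs to be re-granted in System Preferences after an OS upgrade.
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    let trusted = AXIsProcessTrustedWithOptions(options)
    print("Accessibility access granted:", trusted)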
False 3.1.1 Rejection: Real-World Dues Payments App
Hello everyone,

Our community dues payment app only facilitates real-world maintenance-dues payments directly to property managers' bank accounts. However, during testing it was likely flagged by the AI-driven review system for a metadata criterion and rejected under Guideline 3.1.1 ("Paid digital content must use IAP"). Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:
- The app is completely free
- No digital content or subscriptions are sold
- Dues payments are made via bank transfer or credit card directly to the manager

Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process? Thanks!
Replies: 0 · Boosts: 0 · Views: 103 · May ’25
VoiceOver Headings Accessibility Rotor with SwiftUI on iOS
Hi,

On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor). In a VStack, you just add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work while the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense. There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following:

    .accessibilityRotor(.headings) {
        ForEach(entries) { entry in
            // entry.id must be the same as the id of the SwiftUI view it is about
            AccessibilityRotorEntry(entry.name, id: entry.id)
        }
    }

It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears; you have to move the accessibility focus inside the screen before your headers show up. I'm a beginner in regards to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor means some users would miss them.

So my question is: is there a way to have headers inside a LazyVStack (which are not necessarily visible at first) appear in the headings rotor as soon as the screen appears? (be it using .accessibilityRotor(.headings) or anything else)

The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).

Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.

Thanks
Replies: 0 · Boosts: 0 · Views: 98 · Jun ’25
How to disable the default focus effect and detect keyboard focus in SwiftUI?
I’m trying to customize the keyboard focus appearance in SwiftUI. In UIKit (see WWDC 2021 session "Focus on iPad keyboard navigation"), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not.

In SwiftUI I’ve tried the following:

    .focusable()
    // .focusable(true, interactions: .activate)
    .focusEffectDisabled()
    .focused($isFocused)

However, I’m running into several issues:
- .focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
- .focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
- Using @FocusState prevents Space from triggering the action when the view has keyboard focus

My main questions:
- How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
- What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?

Any guidance or best practices would be greatly appreciated! Here's my sample code:

    import SwiftUI

    struct KeyboardFocusExample: View {
        var body: some View {
            // The ScrollView is required, otherwise the custom focus value resets to false after a few seconds. I also need it for my actual use case
            ScrollView {
                VStack {
                    Text("First button")
                        .keyboardFocus()
                        .button { print("First button tapped") }
                    Text("Second button")
                        .keyboardFocus()
                        .button { print("Second button tapped") }
                }
            }
        }
    }

    // MARK: - Focus Modifier

    struct KeyboardFocusModifier: ViewModifier {
        @FocusState private var isFocused: Bool

        func body(content: Content) -> some View {
            content
                .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
                // .focusable(true, interactions: .activate) // ⚠️ This causes an infinite loop, so keyboard navigation no longer responds
                .focusEffectDisabled() // ⚠️ Has no effect on iOS
                .focused($isFocused)
                // Custom Halo effect
                .padding(4)
                .overlay(
                    RoundedRectangle(cornerRadius: 18)
                        .strokeBorder(isFocused ? .red : .clear, lineWidth: 2)
                )
                .padding(-4)
        }
    }

    extension View {
        public func keyboardFocus() -> some View {
            modifier(KeyboardFocusModifier())
        }
    }

    // MARK: - Button Modifier

    /// ⚠️ Using a Button view makes no difference
    struct ButtonModifier: ViewModifier {
        let action: () -> Void

        func body(content: Content) -> some View {
            content
                .contentShape(Rectangle())
                .onTapGesture { action() }
                .accessibilityAction { action() }
                .accessibilityAddTraits(.isButton)
                .accessibilityElement(children: .combine)
                .accessibilityRespondsToUserInteraction()
        }
    }

    extension View {
        public func button(action: @escaping () -> Void) -> some View {
            modifier(ButtonModifier(action: action))
        }
    }
Replies: 1 · Boosts: 0 · Views: 448 · Sep ’25
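Not a verified fix for the issues above, but one alternative worth noting as a sketch: SwiftUI exposes an isFocused environment value that reflects whether the nearest focusable ancestor has focus, which can drive a custom border from inside a child view without @FocusState. Whether it also preserves Space-bar activation on iOS is an assumption I have not confirmed.

    import SwiftUI

    // Sketch: read focus from the environment instead of @FocusState.
    // The nearest focusable ancestor's focus state drives the border.
    struct FocusBorder: ViewModifier {
        @Environment(\.isFocused) private var isFocused

        func body(content: Content) -> some View {
            content
                .padding(4)
                .overlay(
                    RoundedRectangle(cornerRadius: 18)
                        .strokeBorder(isFocused ? Color.red : .clear, lineWidth: 2)
                )
                .padding(-4)
        }
    }

    extension View {
        // Apply inside a view that is itself wrapped in .focusable()
        func focusBorder() -> some View { modifier(FocusBorder()) }
    }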
"illegal character encoding in string literal" warnings in Xcode
Good day! I have a long-term project ported all the way up from old Think C through many versions of Xcode. Its source files are encoded in "Western (Mac OS Roman)". Some of my error messages have characters outside the straight ASCII character set (e.g. "å"). The editor correctly displays these, but I get plenty of illegal-character warnings and the messages do not display properly. I imagine there's a way to have separate files of localized text for internationalized applications, but I am the only end user of this application, and it used to just plain work in earlier Xcode versions. Furthermore, there must be developers throughout Europe who use such characters in string literals, just typing in their native languages straight off their keyboards. I was thinking there must be a Clang setting or something, but I have been unable to find it, and an internet search turns up no solution except to cumbersomely escape each individual character. I can't imagine that a French programmer does that every time they want to type "è", "é", or "à"! Any help? (Disclaimer: I'm an English speaker and only use such characters whimsically, but want to keep them for legacy's sake.) Thanks....

p.s. using Xcode 15.3, and under Settings -> Text Editing -> Editing, "Western (Mac OS Roman)" is already selected as the default text encoding with "Convert existing files on save" checked.
Replies: 3 · Boosts: 0 · Views: 185 · Jun ’25