Explore best practices for creating inclusive apps that cater to users with diverse abilities

Learn More

Posts under General subtopic

Post

Replies

Boosts

Views

Activity

I have a problem
I want to open a developer account, but it is not personal, but rather a company, and I have an existing company, and I have DUNS, and I have a website that has been made, and everything is ready, and an official email, but when the application is made at Apple, he sends to my email that he wants a public website for people, and it will be in the name of the organization, and all of these matters have been resolved. Why do they not respond to us?
1
0
683
Sep ’25
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, In our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT. Is this the expected system behavior for emergency numbers on iOS? 
And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
2
0
722
Sep ’25
Custom Keyboard Extension Not Showing in Settings for Activation
Hi everyone, I’m developing a React Native iOS app that includes a custom keyboard extension for sending stickers across apps. The project builds successfully, and the main app installs fine on my test device. However, I’m not seeing the keyboard extension appear under Settings → General → Keyboard → Keyboards → Add New Keyboard, which means I can’t activate it or grant access. At this point, I’m not even sure if the extension is actually being installed on the device along with the main app. Here’s what I’ve done so far. I created a Keyboard Extension target in Xcode, set the correct bundle identifiers and provisioning profiles, and enabled “Requests Open Access” in the extension’s Info.plist. I built and installed the app on a physical device rather than the simulator to ensure proper testing. My main questions are: how can I confirm that the extension is being installed on the device, and if it isn’t, what might prevent it from installing even though the build completes successfully? Any insights, troubleshooting steps, or guidance would be greatly appreciated.
0
0
911
Nov ’25
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app majorly contains documents/books that user can read. Issue: The issue is VoiceOver is skipping the lines given to it when there are some leading spaces in it. We have observed this issue in different languages. This is only happening for line granularity, other granularities seems to be working as expected. Implementation: We are using below API's to provide line content to voice over. UIAccessibilityReadingContent - accessibilityPageContent - accessibilityFrameForLineNumber - accessibilityContentForLineNumber We are creating UIAccessibilityElement objects to pass to VoiceOver and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use below APIs to cross element boundaries for all granular navigations. accessibilityNextTextNavigationElement accessibilityPreviousTextNavigationElement We want to know whether skipping the line when provided with leading spaces is expected or a bug in UIKit.
1
0
527
Nov ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this. The Concept: Skeleton-based Normalization Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input. Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space. Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance. Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers). Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees"). Why this approach? Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body. Privacy: We are processing coordinates, not video streams. Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life. Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
1
0
891
Dec ’25
pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I’m implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I’ve added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration ). com.apple.developer.hearing.aid.app xxxxx but the app won't even compile with this entitlement Problem NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like: HearingDeviceSession.shared.pairedDevices always return an empty list. What I expected According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens. Questions Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file? Is there a way to verify that iOS is actually honoring this entitlement? Has anyone successfully received this notification on a real device? Any help or confirmation would be greatly appreciated.
1
0
771
Dec ’25
Voice Control evaluation questions: "Stop Recording" command failure & Item numbers on non-interactive web elements
Hello everyone, I am currently evaluating my app's accessibility features to accurately display the "Accessibility" information on the App Store. I have encountered two specific issues regarding Voice Control testing and would appreciate any guidance. Voice Command for "Stop Recording" According to the evaluation criteria, if an app supports audio recording or dictation, users must be able to start and stop recording using only their voice. Behavior: I can successfully trigger the recording using the command "Start Recording". However, I cannot find a command to stop it. Commands like "Stop Recording" or "Stop" are not recognized by the system. Question: Is there a specific standard voice command intended for stopping a recording? Item Number Overlays on Non-Interactive Web Elements (WKWebView) I noticed an inconsistency between native views and web content regarding Voice Control item numbering. Behavior: When testing web content within the app (WKWebView) or in Safari, Voice Control displays item number overlays on non-interactive text elements (such as standard or tags). In native views, static labels do not receive item numbers. Question: Is this expected behavior for web content? Since these elements are not interactive, I am unsure if this should be considered a bug (fail) or an acceptable exception for the accessibility evaluation. Has anyone experienced similar issues or know the correct criteria for these cases? Thank you.
1
0
1.8k
Feb ’26
Icon label's missing
Since the last bet upgrade for iPad to 26.3 labels have disappeared. Going into system/accessibility the toggle setting for labels makes no difference whether on or off. labels are permanently not there/missing.
1
0
1.4k
Jan ’26
VoiceOver with Swift Charts summaries
I had a VoiceOver user point out an issue with my app that I’ve definitely known about but have never been able to fix. I thought that I had filed feedback for it but it looks like I didn’t. Before I do I’m hoping someone has some insight. With Swift Charts when I tap part of a chart it summarizes the three hours and then you can swipe vertically to hear it read out details of each hour. For example, the Y-Axis is the amount of precipitation for the hour and the X-Axis is the hours of the day. The units aren't being read in the summary but they are for individual hours when you vertical swipe. The summary says something such as "varies between 0.012 and 0.082". In the AXChartDescriptor I’ve tried everything I can think of, including adding a label to the Y axis in the DataPoint but nothing seems to work in getting that summary to include units. With a vertical swipe it seems to just be using my accessibility label and value (like I would expect).
0
0
428
Feb ’26
Apple Pay e installazione di app di terze parti non funzionanti
Scrivo questo post per farmi notare meglio, il 6 marzo ho mandato un feedback (poi aggiornato oggi, 18 marzo) tramite l‘app Feedback installata su iPhone chiedo a chiunque lavori all’interno di Apple, specialmente agli ingegneri informatici che si occupano delle funzioni di accessibilità di iOS 26 di visionare questo Feedback per aumentare ancora di più le opzioni di accessibilità degli utenti Apple, vi lascio di seguito l’ID del Feedback, grazie mille per il lavoro che fate FB22142615
1
0
638
Mar ’26
Left-flick and right-flick gestures with VoiceOver and UIAccessibilityReadingContent
Hi, I have an app that displays lines of text, that I want to make accessible with VoiceOver. It's based on a UITextView. I have implemented the UIAccessibilityReadingContent protocol, following the instructions in https://developer.apple.com/videos/play/wwdc2019/248 and now users can see the screen line by line, by moving their fingers on the screen. That works fine. However, users would also like to be able to use left-flick and right-flick to move to the previous or next line on the screen, and I haven't been able to make this work. I can see that left-flick triggers accessibilityPreviousTextNavigationElement and right-flick triggers accessibilityNextTextNavigationElement, but I don't understand what these variables should be.
1
0
1.4k
Apr ’26
Full Keyboard Access Photos app Information button not selectable
When Full Keyboard Access is enabled, some controls in the Photos app appear to be unreachable using keyboard navigation. Steps to reproduce: Enable Full Keyboard Access. Open the Photos app. Navigate to the Collections tab. Use arrow keys to move focus through the screen. The Info (“i”) button related to library optimization is not reachable. When focus is on the left side (e.g. “Reorder”), pressing the down arrow moves focus directly to the tab bar. When focus is on the right side (e.g. the chevron next to “Wallpaper Suggestions”), focus moves directly to the Search button. The Info button is skipped in both cases. Tested on simulator iPhone 17 Pro with iOS 26.2.
0
0
66
1w
Full Keyboard Access Photos app scroll view is not accessible
When using Full Keyboard Access on iPhone in Photos app some interactive content is not available to enter. Steps to reproduce: Open Photos app Go to Collections tab Go to memories Try to select any memory playlist Whole scroll view is selectable as one element and using space is not making any difference. Using tab goes to navigation controls. Tested on iPhone 13 mini with iOS 26.4 using Magic Keyboard.
0
0
120
1w
Full Keyboard Access Reminders app inaccessible content
When using Full Keyboard Access some content in Reminders app is not accessible. Steps to reproduce: Open Reminders app Open one of the lists with few (ex. 4) records. Try to check reminder. Issue: Navigating with keyboard focuses on whole row, using arrows left/right doesn't move to check control. Using space on whole row activates textfield, then still left/right arrow does not move outside textfield. Tested on iPhone 13 mini with iOS 26.4 using Magic Keyboard. Solving this issue might help with similar issues in our app.
0
0
161
1w
Full Keyboard Access Health app close button not accessible
I found difficulty interacting with close button on placement inside Health app on newest iOS. Steps to reproduce: Open Health app Scroll to "Get more from health" Try to focus on close button on any box like "Set Up your medical ID" Focus is on the whole box and there is now way to move to close button. After hitting space it opens next screen. Solving this issue might help with similar issue in our app.
1
0
597
1w
input type="number" not mapped to spinbutton role
input[type=number] mapped to AXTextField instead of AXIncrementor/UIAccessibilityTraitAdjustable in Safari (macOS and iOS). According to ML-AAM 1.0, <input type="number"> is required to map to the ARIA spinbutton role, but it is not being mapped as expected on WebKit (macOS and iOS) to the platform accessibility APIs: the element is reported as AXTextField on macOS and lacks UIAccessibilityTraitAdjustable on iOS. As a consequence, VoiceOver announces the element as a textfield rather than a spinbutton, does not increment with arrow keys on macOS, and does not respond to the swipe up/down gesture on iOS. This affects every <input type="number"> on the web (quantity steppers, age inputs, year pickers, etc). Authors are currently forced to work around it by reimplementing the spinbutton with role="spinbutton", which force the authors to emulate the native HTML solutions with JavaScript, contradicts the First Rule of ARIA Use and presents another interaction issues in WebKit (will create an issue about this and update this post later). References: HTML-AAM 1.0, input type=number: https://www.w3.org/TR/html-aam-1.0/#el-input-number ARIA 1.2, spinbutton role: https://www.w3.org/TR/wai-aria-1.2/#spinbutton First Rule of ARIA Use: https://www.w3.org/TR/using-aria/#firstrule Reproduction: https://codesandbox.io/p/sandbox/beautiful-hofstadter-vn7nj3 <!DOCTYPE html> <html lang="en"> <head> <meta charset="utf-8" /> <title>spinbutton techniques</title> </head> <body> <main> <h1>Spinbutton pattern</h1> <section aria-labelledby="html-solution"> <h2 id="html-solution">HTML solution: spinbutton</h2> <label for="qty">Quantity</label> <input id="qty" type="number" min="0" max="10" value="1" /> </section> </main> </body> </html> Expected behavior: AX role on macOS: AXIncrementor (mapped from spinbutton) AX trait on iOS: UIAccessibilityTraitAdjustable VoiceOver on macOS announces "[value], [name of the input], stepper" and then "you are currently on a stepper. To begin interacting with this stepper, press Control-Shift-Down Arrow", after pressing the combo should announce "in stepper" and inmmediately should announce "you are currently in a stepper. To decrease this value, press Control-Option-Down Arrow. To increase this value, press Control-Option-Up Arrow. to exit this stepper, press Control-Option-Shift-Up Arrow" One-finger swipe up/down on iOS increments/decrements the value Actual behavior on Safari: AX role on macOS: AXTextField AX trait on iOS: standard textfield, no Adjustable VoiceOver announces "[value], insertion at the beginning/end of the text, [name of the input], number field" and then "you are currently on text field. To enter text in this field, type". Omitting the native functionality of a spinbutton One-finger swipe up/down on iOS does nothing Cross-platform comparison (same HTML, same spec): Firefox + NVDA on Windows: Chrome + NVDA on Windows: same as Firefox Safari + VoiceOver on macOS: broken as described above Safari + VoiceOver on iOS: broken as described above Environment: macOS: maOS Tahoe 26.4.1 Safari: 26.4 (21624.1.16.11.4) iOS: iOS 26.4.2 Device: iPhone 15 VoiceOver: default settings
0
0
402
3d
VoiceOver spatial navigation doesn't focus elements using UISheetPresentationController with small detent
I have already filed a bug report with a sample project via Feedback Assistant: FB22760723 When presenting a UIViewController using UIModalPresentationFormSheet alongside UISheetPresentationController with a small custom detent (e.g., around 300pt height), VoiceOver spatial swipe navigation breaks. The user is unable to swipe left or right to navigate sequentially through the accessible elements inside the sheet. Accessibility Inspector reveals that the focus seems to get trapped by the background layer (UIDimmingView / "dismiss popup"). If the sheet is taller (e.g., 600pt), the issue does not occur and swipe navigation works as expected Steps to Reproduce: Run the attached sample Objective-C/UIKit project. Turn on VoiceOver. Tap the “Open transparent modal” button to present the modal with UIModalPresentationOverFullScreen presentation. Tap the "Open form sheet" button to present the sheet (configured with a custom 300pt detent). Attempt to swipe right using VoiceOver to navigate to the next element (e.g., from the title label to the buttons). Expected Results: VoiceOver should smoothly navigate through the sequential accessibility elements inside the sheet's view hierarchy, respecting the bounds of the modal sheet. Actual Results: VoiceOver gets stuck. The swipe right/left gestures fail to move focus to the next element inside the sheet. Instead, focus often escapes to the background or doesn’t change completely.
1
0
176
1d
Disable sleep/wake when in Autonomous Single App Mode (ASAM)
If a user enables/disables Guided Access, they can modify the session settings to disable the top (sleep/wake) button. In Single App Mode (SAM), there is a payload option for disabling the sleep/wake button via Mobile Device Management (MDM). In Autonomous Single App Mode (ASAM), there doesn't appear to be any way to disable the top button. ASAM does not honor the Guided Access sessions settings, and there is no payload option in the MDM. This is a glaring issue especially when ASAM is marketed as the solution for apps in a medical setting where the app is trading hands from a medical professional to a patient. Our app is used during a lengthy procedure and does not function properly if the patient puts the iPad to sleep. We're stuck asking our medical professionals to put the iPad in Guided Access, but the user experience is clunky and would be much improved by implementing ASAM. Is there some little-known API for disabling the sleep/wake button during ASAM that I'm just missing?
0
0
35
8h
I have a problem
I want to open a developer account, but it is not personal, but rather a company, and I have an existing company, and I have DUNS, and I have a website that has been made, and everything is ready, and an official email, but when the application is made at Apple, he sends to my email that he wants a public website for people, and it will be in the name of the organization, and all of these matters have been resolved. Why do they not respond to us?
Replies
1
Boosts
0
Views
683
Activity
Sep ’25
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, In our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT. Is this the expected system behavior for emergency numbers on iOS? 
And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
Replies
2
Boosts
0
Views
722
Activity
Sep ’25
Custom Keyboard Extension Not Showing in Settings for Activation
Hi everyone, I’m developing a React Native iOS app that includes a custom keyboard extension for sending stickers across apps. The project builds successfully, and the main app installs fine on my test device. However, I’m not seeing the keyboard extension appear under Settings → General → Keyboard → Keyboards → Add New Keyboard, which means I can’t activate it or grant access. At this point, I’m not even sure if the extension is actually being installed on the device along with the main app. Here’s what I’ve done so far. I created a Keyboard Extension target in Xcode, set the correct bundle identifiers and provisioning profiles, and enabled “Requests Open Access” in the extension’s Info.plist. I built and installed the app on a physical device rather than the simulator to ensure proper testing. My main questions are: how can I confirm that the extension is being installed on the device, and if it isn’t, what might prevent it from installing even though the build completes successfully? Any insights, troubleshooting steps, or guidance would be greatly appreciated.
Replies
0
Boosts
0
Views
911
Activity
Nov ’25
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app majorly contains documents/books that user can read. Issue: The issue is VoiceOver is skipping the lines given to it when there are some leading spaces in it. We have observed this issue in different languages. This is only happening for line granularity, other granularities seems to be working as expected. Implementation: We are using below API's to provide line content to voice over. UIAccessibilityReadingContent - accessibilityPageContent - accessibilityFrameForLineNumber - accessibilityContentForLineNumber We are creating UIAccessibilityElement objects to pass to VoiceOver and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use below APIs to cross element boundaries for all granular navigations. accessibilityNextTextNavigationElement accessibilityPreviousTextNavigationElement We want to know whether skipping the line when provided with leading spaces is expected or a bug in UIKit.
Replies
1
Boosts
0
Views
527
Activity
Nov ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this. The Concept: Skeleton-based Normalization Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input. Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space. Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance. Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers). Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific correction (e.g., "Raise your elbow 10 degrees"). Why this approach? Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body. Privacy: We are processing coordinates, not video streams. Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life. Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
Replies
1
Boosts
0
Views
891
Activity
Dec ’25
Accessibility voice command recording does not start on Apple Vision Pro
Is the accessibility feature, voice command recording available on the Apple Vision Pro? It does not start on my device. The Apple Vision Pro is on 26.1. Regular single voice commands work on the Apple Vision Pro. Recording commands worked on other devices. (iPad and iPhone)
Replies
2
Boosts
0
Views
894
Activity
Dec ’25
pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I’m implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I’ve added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration ). com.apple.developer.hearing.aid.app xxxxx but the app won't even compile with this entitlement Problem NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like: HearingDeviceSession.shared.pairedDevices always return an empty list. What I expected According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens. Questions Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file? Is there a way to verify that iOS is actually honoring this entitlement? Has anyone successfully received this notification on a real device? Any help or confirmation would be greatly appreciated.
Replies
1
Boosts
0
Views
771
Activity
Dec ’25
Voice Control evaluation questions: "Stop Recording" command failure & Item numbers on non-interactive web elements
Hello everyone, I am currently evaluating my app's accessibility features to accurately display the "Accessibility" information on the App Store. I have encountered two specific issues regarding Voice Control testing and would appreciate any guidance. Voice Command for "Stop Recording" According to the evaluation criteria, if an app supports audio recording or dictation, users must be able to start and stop recording using only their voice. Behavior: I can successfully trigger the recording using the command "Start Recording". However, I cannot find a command to stop it. Commands like "Stop Recording" or "Stop" are not recognized by the system. Question: Is there a specific standard voice command intended for stopping a recording? Item Number Overlays on Non-Interactive Web Elements (WKWebView) I noticed an inconsistency between native views and web content regarding Voice Control item numbering. Behavior: When testing web content within the app (WKWebView) or in Safari, Voice Control displays item number overlays on non-interactive text elements (such as standard or tags). In native views, static labels do not receive item numbers. Question: Is this expected behavior for web content? Since these elements are not interactive, I am unsure if this should be considered a bug (fail) or an acceptable exception for the accessibility evaluation. Has anyone experienced similar issues or know the correct criteria for these cases? Thank you.
Replies
1
Boosts
0
Views
1.8k
Activity
Feb ’26
Icon label's missing
Since the last bet upgrade for iPad to 26.3 labels have disappeared. Going into system/accessibility the toggle setting for labels makes no difference whether on or off. labels are permanently not there/missing.
Replies
1
Boosts
0
Views
1.4k
Activity
Jan ’26
square mouse and lack of transparency
after the 26.3 beta update, my mouse has been having major problems with transparency, have to keep going to reset colors in display, but it doesn't hold, anyone else?
Replies
1
Boosts
0
Views
1.2k
Activity
Feb ’26
VoiceOver with Swift Charts summaries
I had a VoiceOver user point out an issue with my app that I’ve definitely known about but have never been able to fix. I thought that I had filed feedback for it but it looks like I didn’t. Before I do I’m hoping someone has some insight. With Swift Charts when I tap part of a chart it summarizes the three hours and then you can swipe vertically to hear it read out details of each hour. For example, the Y-Axis is the amount of precipitation for the hour and the X-Axis is the hours of the day. The units aren't being read in the summary but they are for individual hours when you vertical swipe. The summary says something such as "varies between 0.012 and 0.082". In the AXChartDescriptor I’ve tried everything I can think of, including adding a label to the Y axis in the DataPoint but nothing seems to work in getting that summary to include units. With a vertical swipe it seems to just be using my accessibility label and value (like I would expect).
Replies
0
Boosts
0
Views
428
Activity
Feb ’26
Apple Pay e installazione di app di terze parti non funzionanti
Scrivo questo post per farmi notare meglio, il 6 marzo ho mandato un feedback (poi aggiornato oggi, 18 marzo) tramite l‘app Feedback installata su iPhone chiedo a chiunque lavori all’interno di Apple, specialmente agli ingegneri informatici che si occupano delle funzioni di accessibilità di iOS 26 di visionare questo Feedback per aumentare ancora di più le opzioni di accessibilità degli utenti Apple, vi lascio di seguito l’ID del Feedback, grazie mille per il lavoro che fate FB22142615
Replies
1
Boosts
0
Views
638
Activity
Mar ’26
Left-flick and right-flick gestures with VoiceOver and UIAccessibilityReadingContent
Hi, I have an app that displays lines of text, that I want to make accessible with VoiceOver. It's based on a UITextView. I have implemented the UIAccessibilityReadingContent protocol, following the instructions in https://developer.apple.com/videos/play/wwdc2019/248 and now users can see the screen line by line, by moving their fingers on the screen. That works fine. However, users would also like to be able to use left-flick and right-flick to move to the previous or next line on the screen, and I haven't been able to make this work. I can see that left-flick triggers accessibilityPreviousTextNavigationElement and right-flick triggers accessibilityNextTextNavigationElement, but I don't understand what these variables should be.
Replies
1
Boosts
0
Views
1.4k
Activity
Apr ’26
Full Keyboard Access Photos app Information button not selectable
When Full Keyboard Access is enabled, some controls in the Photos app appear to be unreachable using keyboard navigation. Steps to reproduce: Enable Full Keyboard Access. Open the Photos app. Navigate to the Collections tab. Use arrow keys to move focus through the screen. The Info (“i”) button related to library optimization is not reachable. When focus is on the left side (e.g. “Reorder”), pressing the down arrow moves focus directly to the tab bar. When focus is on the right side (e.g. the chevron next to “Wallpaper Suggestions”), focus moves directly to the Search button. The Info button is skipped in both cases. Tested on simulator iPhone 17 Pro with iOS 26.2.
Replies
0
Boosts
0
Views
66
Activity
1w
Full Keyboard Access Photos app scroll view is not accessible
When using Full Keyboard Access on iPhone in Photos app some interactive content is not available to enter. Steps to reproduce: Open Photos app Go to Collections tab Go to memories Try to select any memory playlist Whole scroll view is selectable as one element and using space is not making any difference. Using tab goes to navigation controls. Tested on iPhone 13 mini with iOS 26.4 using Magic Keyboard.
Replies
0
Boosts
0
Views
120
Activity
1w
Full Keyboard Access Reminders app inaccessible content
When using Full Keyboard Access some content in Reminders app is not accessible. Steps to reproduce: Open Reminders app Open one of the lists with few (ex. 4) records. Try to check reminder. Issue: Navigating with keyboard focuses on whole row, using arrows left/right doesn't move to check control. Using space on whole row activates textfield, then still left/right arrow does not move outside textfield. Tested on iPhone 13 mini with iOS 26.4 using Magic Keyboard. Solving this issue might help with similar issues in our app.
Replies
0
Boosts
0
Views
161
Activity
1w
Full Keyboard Access Health app close button not accessible
I found difficulty interacting with close button on placement inside Health app on newest iOS. Steps to reproduce: Open Health app Scroll to "Get more from health" Try to focus on close button on any box like "Set Up your medical ID" Focus is on the whole box and there is now way to move to close button. After hitting space it opens next screen. Solving this issue might help with similar issue in our app.
Replies
1
Boosts
0
Views
597
Activity
1w
input type="number" not mapped to spinbutton role
input[type=number] mapped to AXTextField instead of AXIncrementor/UIAccessibilityTraitAdjustable in Safari (macOS and iOS). According to ML-AAM 1.0, <input type="number"> is required to map to the ARIA spinbutton role, but it is not being mapped as expected on WebKit (macOS and iOS) to the platform accessibility APIs: the element is reported as AXTextField on macOS and lacks UIAccessibilityTraitAdjustable on iOS. As a consequence, VoiceOver announces the element as a textfield rather than a spinbutton, does not increment with arrow keys on macOS, and does not respond to the swipe up/down gesture on iOS. This affects every <input type="number"> on the web (quantity steppers, age inputs, year pickers, etc). Authors are currently forced to work around it by reimplementing the spinbutton with role="spinbutton", which force the authors to emulate the native HTML solutions with JavaScript, contradicts the First Rule of ARIA Use and presents another interaction issues in WebKit (will create an issue about this and update this post later). References: HTML-AAM 1.0, input type=number: https://www.w3.org/TR/html-aam-1.0/#el-input-number ARIA 1.2, spinbutton role: https://www.w3.org/TR/wai-aria-1.2/#spinbutton First Rule of ARIA Use: https://www.w3.org/TR/using-aria/#firstrule Reproduction: https://codesandbox.io/p/sandbox/beautiful-hofstadter-vn7nj3 <!DOCTYPE html> <html lang="en"> <head> <meta charset="utf-8" /> <title>spinbutton techniques</title> </head> <body> <main> <h1>Spinbutton pattern</h1> <section aria-labelledby="html-solution"> <h2 id="html-solution">HTML solution: spinbutton</h2> <label for="qty">Quantity</label> <input id="qty" type="number" min="0" max="10" value="1" /> </section> </main> </body> </html> Expected behavior: AX role on macOS: AXIncrementor (mapped from spinbutton) AX trait on iOS: UIAccessibilityTraitAdjustable VoiceOver on macOS announces "[value], [name of the input], stepper" and then "you are currently on a stepper. To begin interacting with this stepper, press Control-Shift-Down Arrow", after pressing the combo should announce "in stepper" and inmmediately should announce "you are currently in a stepper. To decrease this value, press Control-Option-Down Arrow. To increase this value, press Control-Option-Up Arrow. to exit this stepper, press Control-Option-Shift-Up Arrow" One-finger swipe up/down on iOS increments/decrements the value Actual behavior on Safari: AX role on macOS: AXTextField AX trait on iOS: standard textfield, no Adjustable VoiceOver announces "[value], insertion at the beginning/end of the text, [name of the input], number field" and then "you are currently on text field. To enter text in this field, type". Omitting the native functionality of a spinbutton One-finger swipe up/down on iOS does nothing Cross-platform comparison (same HTML, same spec): Firefox + NVDA on Windows: Chrome + NVDA on Windows: same as Firefox Safari + VoiceOver on macOS: broken as described above Safari + VoiceOver on iOS: broken as described above Environment: macOS: maOS Tahoe 26.4.1 Safari: 26.4 (21624.1.16.11.4) iOS: iOS 26.4.2 Device: iPhone 15 VoiceOver: default settings
Replies
0
Boosts
0
Views
402
Activity
3d
VoiceOver spatial navigation doesn't focus elements using UISheetPresentationController with small detent
I have already filed a bug report with a sample project via Feedback Assistant: FB22760723 When presenting a UIViewController using UIModalPresentationFormSheet alongside UISheetPresentationController with a small custom detent (e.g., around 300pt height), VoiceOver spatial swipe navigation breaks. The user is unable to swipe left or right to navigate sequentially through the accessible elements inside the sheet. Accessibility Inspector reveals that the focus seems to get trapped by the background layer (UIDimmingView / "dismiss popup"). If the sheet is taller (e.g., 600pt), the issue does not occur and swipe navigation works as expected Steps to Reproduce: Run the attached sample Objective-C/UIKit project. Turn on VoiceOver. Tap the “Open transparent modal” button to present the modal with UIModalPresentationOverFullScreen presentation. Tap the "Open form sheet" button to present the sheet (configured with a custom 300pt detent). Attempt to swipe right using VoiceOver to navigate to the next element (e.g., from the title label to the buttons). Expected Results: VoiceOver should smoothly navigate through the sequential accessibility elements inside the sheet's view hierarchy, respecting the bounds of the modal sheet. Actual Results: VoiceOver gets stuck. The swipe right/left gestures fail to move focus to the next element inside the sheet. Instead, focus often escapes to the background or doesn’t change completely.
Replies
1
Boosts
0
Views
176
Activity
1d
Disable sleep/wake when in Autonomous Single App Mode (ASAM)
If a user enables/disables Guided Access, they can modify the session settings to disable the top (sleep/wake) button. In Single App Mode (SAM), there is a payload option for disabling the sleep/wake button via Mobile Device Management (MDM). In Autonomous Single App Mode (ASAM), there doesn't appear to be any way to disable the top button. ASAM does not honor the Guided Access sessions settings, and there is no payload option in the MDM. This is a glaring issue especially when ASAM is marketed as the solution for apps in a medical setting where the app is trading hands from a medical professional to a patient. Our app is used during a lengthy procedure and does not function properly if the patient puts the iPad to sleep. We're stuck asking our medical professionals to put the iPad in Guided Access, but the user experience is clunky and would be much improved by implementing ASAM. Is there some little-known API for disabling the sleep/wake button during ASAM that I'm just missing?
Replies
0
Boosts
0
Views
35
Activity
8h