Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post | Replies | Boosts | Views | Activity

Attaching procedural audio to an ARKit SCNNode
I'm developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode produces no sound, nor any error messages (see my Stack Overflow post).

Working implementation with a static audio file:

```swift
let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)
```

Attempted implementation with procedural audio:

```swift
let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    // Audio generation code
    return noErr
}
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)
```

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn't work is creating procedural audio with an AVAudioNode, as documented here: Apple docs.

Additionally, I explored the WWDC18 AR game project, SwiftShot, which uses SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics work correctly, but the audio does not play.

I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
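To make the failing path concrete, here is a self-contained version of the attempted setup with an actual render block. The 440 Hz tone, the 44.1 kHz/Float32 format assumption, and the attachProceduralTone name are illustrative choices of mine, and this sketch reproduces the silent behavior the post reports rather than fixing it:

```swift
import SceneKit
import AVFoundation

// A sketch, not a fix: a 440 Hz sine-wave source node attached to an
// SCNNode the way the post describes. `node` is assumed to be the SCNNode
// for a detected plane; 44.1 kHz Float32 is assumed to match the engine format.
func attachProceduralTone(to node: SCNNode) {
    var phase: Float = 0
    let phaseIncrement = 2 * Float.pi * 440 / 44_100

    let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
        let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
        for frame in 0..<Int(frameCount) {
            let sample = sin(phase) * 0.5
            phase += phaseIncrement
            if phase > 2 * Float.pi { phase -= 2 * Float.pi }
            for buffer in ablPointer {
                let buf: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
                buf[frame] = sample
            }
        }
        return noErr
    }

    // This is the path the post reports as silent, even though the same
    // node produces audio when wired directly into an AVAudioEngine.
    let player = SCNAudioPlayer(avAudioNode: audioNode)
    node.addAudioPlayer(player)
}
```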
0 | 0 | 210 | Apr ’25
How to Enable Group Navigation Behavior for Custom Views in VoiceOver?
In VoiceOver, when using Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Tool Bar. How can I achieve the same behavior for a custom view? I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
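A minimal sketch of the attempt the post describes, using a hypothetical UIKit container view; per the post, setting the container type takes effect under Mac Catalyst but appears to do nothing for VoiceOver group navigation on iOS:

```swift
import UIKit

// Hypothetical custom container; the class name and label are illustrative.
final class CardGroupView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = false
        // Per the post, this works for Mac Catalyst but has no observed effect on iOS.
        accessibilityContainerType = .semanticGroup
        accessibilityLabel = "Card group"
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```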
0 | 0 | 427 | Mar ’25
VoiceOver Headings Accessibility Rotor with SwiftUI on iOS
Hi,

On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor). In a VStack, you just add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work while the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.

There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following:

```swift
.accessibilityRotor(.headings) {
    ForEach(entries) { entry in
        // entry.id must be the same as the id of the SwiftUI view it is about
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}
```

It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears; you have to move the accessibility focus inside the screen before your headers show up. I'm a beginner in regards to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor means some users would miss them.

So my question is: is there a way to have headers inside a LazyVStack (which are not necessarily visible at first) appear in the headings rotor as soon as the screen appears, be it using .accessibilityRotor(.headings) or anything else?

The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried moving my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).

Also, a related question: in a single screen, is it fine to mix uses of .accessibilityAddTraits(.isHeader) and .accessibilityRotor(.headings)? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.

Thanks
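For concreteness, a self-contained sketch of the setup the post describes, with a hypothetical Entry model; the key detail, as the post notes, is that the rotor entry ids must match the ids ForEach assigns to the row views:

```swift
import SwiftUI

// Hypothetical model; names and the isHeader flag are illustrative.
struct Entry: Identifiable {
    let id = UUID()
    let name: String
    let isHeader: Bool
}

struct SectionedList: View {
    let entries: [Entry]

    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading) {
                ForEach(entries) { entry in
                    Text(entry.name)
                        .font(entry.isHeader ? .headline : .body)
                }
            }
        }
        // Declared outside the LazyVStack so offscreen headers are known
        // to VoiceOver; the ids must match the ForEach ids above.
        .accessibilityRotor(.headings) {
            ForEach(entries.filter(\.isHeader)) { entry in
                AccessibilityRotorEntry(entry.name, id: entry.id)
            }
        }
    }
}
```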
0 | 0 | 104 | Jun ’25
App Store Policy for apps that allow users to unlock contact details of the person who posted content
We have an app under development that allows musicians to unlock the contact details of people who posted about an upcoming event. The musician pays a fee to unlock these contact details. Both the musician and the post owner are registered users. We will reveal the same contact info that the post owner used for account signup verification. Questions:
1. Is this allowed (given that we obtain consent to share contact info with other people and clearly state this in our privacy policy)?
2. If yes, will we have to use App Store in-app purchase to facilitate this transaction, or are we free to use a payment processor such as Stripe?
0 | 1 | 626 | Feb ’25
Default Voices for AVSpeechUtterance
It appears iOS ships with only low-quality voices installed and requires the user to go into Settings to download higher-quality voices for use with AVSpeechUtterance. There doesn't seem to be any API that can make this process easier for the app user. Is there a way / API that would allow an app to download and use a higher-quality voice? Will Apple ever install higher-quality voices by default? We really want to use the text-to-speech API in iOS, but the very high amount of user friction involved in getting high-quality voices is stopping us. I would appreciate a response. Thanks
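As a starting point, an app can at least detect which quality tiers are currently installed. A short sketch (filtering on en-US is an illustrative choice):

```swift
import AVFoundation

// List installed voices and their quality tiers. As the post notes, there
// is (to my knowledge) no public API to download a higher-quality voice
// on the user's behalf; this only inspects what is already installed.
for voice in AVSpeechSynthesisVoice.speechVoices() where voice.language == "en-US" {
    let quality: String
    switch voice.quality {
    case .enhanced: quality = "enhanced"
    case .premium:  quality = "premium"
    default:        quality = "default"
    }
    print("\(voice.name) [\(voice.identifier)] - \(quality)")
}
```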
0 | 0 | 744 | Sep ’25
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.

Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store of finding apps that will work for them. How should developers get started with Accessibility Nutrition Labels?

A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.

Which accessibility features can I indicate support for in Accessibility Nutrition Labels?

The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. The display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.

With the new Accessibility Nutrition Labels, will App Store reviewers validate what we select?

The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.

Are there any updates to tools for analyzing the accessibility of our apps?

Although there aren't new updates this year, continued support for accessibility audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build. These audits analyze aspects like contrast, Dynamic Type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.

What are accessibility features you wish more people integrated?

User input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support could all be adopted more widely, to the benefit of users.

What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?

Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As the design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.

How does Liquid Glass respond to contrast, especially for text and low-contrast environments?

Content legibility is a crucial aspect of the Liquid Glass design, which inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.

What are some Apple apps that stand out for their accessibility?

Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
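Since the panel called out XCTest's accessibility audits, here is a minimal sketch of such a UI test (the test class and target app are illustrative; requires Xcode 15 or later):

```swift
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAuditMainScreen() throws {
        let app = XCUIApplication()
        app.launch()

        // Audits the current view hierarchy for contrast, Dynamic Type,
        // text clipping, element label, and related issues.
        try app.performAccessibilityAudit()
    }
}
```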
0 | 0 | 900 | Jul ’25
ApplePay Merchant Session - Could not create SSL/TLS secure channel issue
As part of our Apple Pay implementation, we are trying to create a merchant session by connecting to the Apple endpoint https://apple-pay-gateway-cert.apple.com/paymentservices/startSession. While trying to do so, we get the error "An error occurred while sending the request. The request was aborted: Could not create SSL/TLS secure channel." I call the validation URL from a C# .NET Framework 4.8 Web API. The API sets up an HttpClient with the Merchant Identity Validation Certificate found in my Apple account and calls the validation URL, passing in the required JSON validation object. When I call PostAsync() I get an exception with the above error message. The code works successfully on my local machine, but the issue occurs when deployed to the Dev / Model environment for testing. We deploy to Azure App Service, and TLS 1.2 is already enabled there. We have used the Merchant Identity certificate that was issued, and we have also checked with the networking and infrastructure teams to make sure it's not an issue on our side. Does anyone have any other idea what could be causing this error? Thank you, Supriya
0 | 0 | 118 | Jun ’25
Unable to Grant Input Monitoring Permission via MDM
I am trying to grant Input Monitoring permission using MDM (Mobile Device Management), but I am facing issues. While I am able to deny the permission, I am unable to grant it. In some profile configurator tools, I noticed a note stating: "Allows the application to use CoreGraphics and HID APIs to listen to (receive) CGEvents and HID events from all processes. Access to these events cannot be given in a profile; it can only be denied." This seems to suggest that granting Input Monitoring permission via an MDM profile may not be possible. Has anyone successfully granted Input Monitoring permission using MDM, or is there an alternative way to achieve this on managed macOS devices?
0 | 0 | 493 | Feb ’25
Accessibility Personal Voice Issue with iOS & iPadOS Beta 1-6
Please refer to Feedback report FB19701007.

A Personal Voice created in English changes to either Spanish or Chinese and no longer works properly. This has been happening since Beta 1 of iOS/iPadOS 26. I have been unable to pinpoint what causes it - possibly downloading foreign voices to the device, or using different voices via AVSpeechSynthesizer. I run an app in Xcode on my device that prints info about the installed voices to the console. Initially, after creation, here is the output:

Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: en-US

Then I hear a Spanish accent and find this:

Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: es-MX

Currently it isn't working, and here is the output:

Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: zh-CN

Note that the voice file in all three outputs is the same. No matter what the user does, a created voice file should never be able to change languages. I reset all my test devices by erasing all content and settings and creating a new English Personal Voice, and the issue persists.

A side issue: the share-across-devices toggle doesn't remain off when turned off. I tried not sharing to see if that could be the cause, but the toggle turns back on automatically; it won't remain off.
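For reference, a sketch of the diagnostic the post describes, assuming personal voices are enumerated via AVSpeechSynthesisVoice and the .isPersonalVoice trait (iOS 17 and later):

```swift
import AVFoundation

// Request Personal Voice access, then log identifier and language for
// each personal voice, mirroring the console output quoted in the post.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }
    for voice in AVSpeechSynthesisVoice.speechVoices()
        where voice.voiceTraits.contains(.isPersonalVoice) {
        print("Voice Identifier: \(voice.identifier)")
        print("Language: \(voice.language)")
    }
}
```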
0 | 0 | 865 | Aug ’25
Defining boundaries of inline dialogs for VO users
Hello, I had submitted a question to clarify which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182. It stems from a perhaps more direct question about specific components: do tab lists and disclosures natively include haptics, screen reader hints, or other states or properties that indicate to screen reader users where the component begins or ends? In some web experiences there is screen reader hint text stating "end of..." or "entering..." as a way to define the boundaries of these inline dialogs. I asked about haptics in the prior thread because I do not recall a natively implemented version of this, apart from some haptic cues that I have not experienced consistently, so I am not sure whether that is an intended native Swift implementation or something custom.
0 | 0 | 128 | May ’25
Programmatically Modifying Per-Application or System-Wide Color Filters Using Cocoa/Swift on macOS?
I'm looking into how to programmatically control the color filters in the Accessibility settings under System Settings -> Accessibility -> Color Filters, in particular the "Intensity" and "Filter type" settings. As far as I can gather (I've poked around GitHub and Stack Overflow and queried some LLMs), changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs, but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, short of lifting source code wholesale from another project or using private APIs. My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to provide this capability on an application-by-application basis as a third-party developer, that would be the ideal scenario - for example, modifying the look and feel/UX of Launchpad, Photos, etc., without access to the source code of the application being modified (with appropriate user consent, of course).
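There is no public API I know of for the system Color Filters intensity or filter type, but as a sketch of the overlay idea from the post's last paragraph: a borderless, click-through tinted window layered above other windows. All names, the tint color, and the opacity here are illustrative choices:

```swift
import AppKit

// A click-through, full-screen tint overlay: one rough approximation of a
// color filter drawn by a third-party app, not the system setting itself.
func makeTintOverlay(on screen: NSScreen) -> NSWindow {
    let window = NSWindow(contentRect: screen.frame,
                          styleMask: .borderless,
                          backing: .buffered,
                          defer: false)
    window.backgroundColor = NSColor.systemYellow.withAlphaComponent(0.15)
    window.isOpaque = false
    window.hasShadow = false
    window.ignoresMouseEvents = true           // clicks pass through to apps below
    window.level = .screenSaver                // stay above normal windows
    window.collectionBehavior = [.canJoinAllSpaces, .stationary]
    window.orderFrontRegardless()
    return window
}
```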
0 | 0 | 381 | Jul ’25
Handling VoiceOver Focus When Screen Changes (Push, Present, and SplitViewController)
I have some doubts about how VoiceOver handles focus when the screen updates. When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus on? Is there a way to control or customize this behavior? In a UISplitViewController, when an item is selected in the primary view controller, the focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
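For the first question, one mechanism I'm aware of (a sketch; the view controller and label are hypothetical) is posting a UIAccessibility notification once the new screen is on screen, passing the element that should receive the VoiceOver cursor; the same kind of post can be made in the secondary pane of a UISplitViewController after a selection:

```swift
import UIKit

final class DetailViewController: UIViewController {
    private let titleLabel = UILabel()  // hypothetical first element

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // .screenChanged for push/present/new screens;
        // .layoutChanged for in-place updates. The argument receives focus.
        UIAccessibility.post(notification: .screenChanged, argument: titleLabel)
    }
}
```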
0 | 0 | 149 | Apr ’25
App Store Connect – “Unable to Handle This Request” Error
Hello, I'm currently unable to access App Store Connect. When I try to open https://appstoreconnect.apple.com, I receive the following error message: "appstoreconnect.apple.com is currently unable to handle this request." I've tried the following steps, but the issue persists:
- Cleared browser cache and cookies
- Tried different browsers (Safari, Chrome)
- Attempted from multiple devices and networks
Is this a known issue or is there any workaround available? Would appreciate any help or update on the current status. Thank you,
0 | 0 | 161 | Jun ’25
VoiceOver and limited vision approaches for custom stepper control
I have a product for designing particle emitters, which I suspect may be of limited interest to people with limited vision. I'd still like to ensure I'm doing a good job with VoiceOver mode. There's a related, simplified sample online, if you want to look at the code. As you can see from the picture below, a large part of the interface mimics Xcode's particle editor, with many value-entry controls that combine up/down buttons with a tappable label; tapping the label enters edit mode. Apart from changing how labels are stepped through in VoiceOver in my app, how should I handle these stepper buttons? Is this a good place to use a Custom Rotor?
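For what it's worth, a common pattern for controls like these (a sketch with illustrative names, not code from the linked sample) is to merge each buttons-plus-label cluster into one adjustable element, so VoiceOver users adjust the value with vertical swipes; custom rotors are typically for jumping between related elements rather than editing a single value:

```swift
import SwiftUI

// Hypothetical value-entry control combining up/down buttons and a label.
struct EmitterValueControl: View {
    let title: String
    @Binding var value: Double
    var step: Double = 1

    var body: some View {
        HStack {
            Button("−") { value -= step }
            Text("\(title): \(value, specifier: "%.1f")")
            Button("+") { value += step }
        }
        .accessibilityElement(children: .ignore)   // one element, not three
        .accessibilityLabel(title)
        .accessibilityValue("\(value)")
        .accessibilityAdjustableAction { direction in
            switch direction {
            case .increment: value += step
            case .decrement: value -= step
            @unknown default: break
            }
        }
    }
}
```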
0 | 0 | 83 | Jun ’25
False 3.1.1 Rejection: Real-World Dues Payments App
Hello everyone, Our community dues payment app only facilitates real-world maintenance-dues payments directly to property managers' bank accounts. However, during testing it was likely flagged by the AI-driven review system for a metadata criterion and rejected under Guideline 3.1.1 ("Paid digital content must use IAP"). Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:
- The app is completely free
- No digital content or subscriptions are sold
- Dues payments are made via bank transfer or credit card directly to the manager
Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process? Thanks!
0 | 0 | 114 | May ’25
How to handle AX (Keyboard Access & VoiceOver) for AttributedString used in UILabel
I use AttributedString to create a string containing a link, and I set the AttributedString on a UILabel. How should I set up the accessibility features to make sure that:
1. I can move keyboard focus to the linked substring and open the link using keyboard operation
2. VoiceOver can read the whole string, and can also activate the linked substring to open the link
Thanks a lot.
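A sketch of one workaround I believe is common, with hypothetical names throughout: expose the link to VoiceOver as a custom action on the label. This covers the VoiceOver half of the question; keyboard focus via Full Keyboard Access on a plain UILabel is not addressed here (a selectable UITextView, which handles link interaction natively, may be the easier route for that):

```swift
import UIKit

final class LinkLabelViewController: UIViewController {
    let label = UILabel()                                   // hypothetical
    let termsURL = URL(string: "https://example.com")!      // hypothetical

    override func viewDidLoad() {
        super.viewDidLoad()
        var text = AttributedString("Read the Terms of Service")
        if let range = text.range(of: "Terms of Service") {
            text[range].link = termsURL
        }
        label.attributedText = NSAttributedString(text)

        // Expose the link to VoiceOver as a custom action on the label.
        label.isAccessibilityElement = true
        label.accessibilityTraits = [.staticText, .link]
        label.accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Open Terms of Service") { [weak self] _ in
                guard let self else { return false }
                UIApplication.shared.open(self.termsURL)
                return true
            }
        ]
    }
}
```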
0 | 0 | 149 | Mar ’25
Apple greets Global Accessibility Awareness Day with severe accessibility violations on macOS
I'm reposting here my FB17602742, submitted yesterday:

The strong wording of this message comes from years of Apple ignoring the needs of users who can't tolerate UI animations and convulsions. At this point, it's clear that Apple is either intentionally harming users like me or simply doesn't care about meeting even the most basic accessibility standards on macOS. Yes, many UI animations can, fortunately, be disabled - but not through straightforward UI controls. Instead, users are forced to hunt for obscure Terminal commands scattered across the Internet. The "Reduce motion" checkbox in System Settings is simply a fake control: it does not actually disable all UI animations. What's worse, two of the most offensive UI animations cannot be disabled at all. Apple has consistently dismissed requests to let users disable the following:

1. The scroll bar rollover highlight effect (introduced in macOS 10.7.3). Every time the cursor passes over a scroll bar, the scroll bar gets highlighted, drawing the user's attention to random scroll bars for no reason - just because the cursor happened to pass over them. This results in hundreds of unnecessary, annoying distractions daily.
2. The expand/collapse animation of NSOutlineView (e.g., when opening or closing folders in the Finder's list view, or in any other app that uses outline views). This animation is extremely distracting, irritating, and time-wasting.

Global Accessibility Awareness Day is approaching. Dear Apple, please adhere to the most basic accessibility standards. Stop the needless suffering of countless users like me: let us disable the two aforementioned animations. Thank you for your attention to the issue.
0 | 0 | 160 | May ’25
Download Voices screen
System Settings => Accessibility => System Voice => the little (i) beside the pulldown => Voices => THIS SCREEN lets you download Premium voices. Is there a way to trigger this screen programmatically, or at least a link to get my users there without having to dig through that swamp of screens?
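For context, a sketch of the closest public hooks I know of: detect that only lower-quality voices are installed, then send the user to the Settings app. To my knowledge there is no public deep link to the voice-download screen itself, so this only opens the app's own settings page:

```swift
import AVFoundation
import UIKit

// If no enhanced/premium en-US voice is installed, point the user toward
// Settings. openSettingsURLString opens this app's own settings page;
// deep-linking to the voice-download screen is not publicly supported,
// as far as I know.
let hasHighQualityVoice = AVSpeechSynthesisVoice.speechVoices().contains {
    $0.language == "en-US" && ($0.quality == .enhanced || $0.quality == .premium)
}

if !hasHighQualityVoice,
   let url = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(url)
}
```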
0 | 1 | 526 | Nov ’25