Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post

Replies

Boosts

Views

Activity

Accessibility for detents behaves differently in a fullscreen cover
The only way I've found to make accessibility focus work correctly in the detent of a fullscreen cover is to apply the focus manually. The issue: in ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover when in ContentView I do not.

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")

                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)

            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
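For reference, a minimal sketch of the manual-focus workaround described above, essentially the commented-out code from the sample with the pieces filled in. The view name is a hypothetical stand-in for BottomSheetView, and the one-second delay is an arbitrary value chosen to let the sheet finish presenting (requires iOS 15+ for @AccessibilityFocusState):

import SwiftUI

struct ManuallyFocusedSheet: View { // hypothetical stand-in for BottomSheetView
    @AccessibilityFocusState private var isFocused: Bool

    var body: some View {
        Text("Bottom Sheet")
            .accessibilityAddTraits(.isHeader)
            .accessibilityFocused($isFocused)
            .onAppear {
                // Move VoiceOver focus into the sheet once presentation has settled.
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    isFocused = true
                }
            }
    }
}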
1
1
609
Feb ’25
iMessage and FaceTime error
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked... I tried going through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message. Does anyone have any ideas what the problem could be?
1
1
148
Jun ’25
Unable to set the Chinese dialect of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it always reads Chinese content, e.g.

AVSpeechUtterance(string: "中文") // Any Chinese content

in the dialect specified by Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect I specified in AVSpeechUtterance.voice:

AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

However, setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" worked on iOS 17 and below.

My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect under Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, nor restored from a backup after the fresh installation), the bug does not occur. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug occurs.

This bug puzzles me because I need both Chinese dialects to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays) my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. It is this iOS 18 bug that prevents my app from performing its expected behavior.

Would Apple developers look into this and advise whether there is any possible workaround in app code to overcome this bug, or please fix it in an iOS 18 update. Thank you.
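For illustration, a minimal sketch of the two-dialect readout the app needs, assuming the zh-TW and zh-HK voices are installed; on affected iOS 18 devices both utterances may still follow the Spoken Language setting, which is the bug:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Speak the same text in Mandarin first, then Cantonese.
let mandarin = AVSpeechUtterance(string: "中文")
mandarin.voice = AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin (Taiwan)

let cantonese = AVSpeechUtterance(string: "中文")
cantonese.voice = AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese

// Utterances are queued and spoken in order.
synthesizer.speak(mandarin)
synthesizer.speak(cantonese)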
1
1
98
Jun ’25
How to Implement Dynamic Type for UITextFields Without Resetting Data
Hello! I was doing some accessibility testing for my app and found out that when the user switches the text size, all of the data in the text fields is reset, which causes major disruption. I've tried looking for documentation, but all I've found is information on how to dynamically scale the UI for different text sizes, which I've already implemented. My guess is that every time Dynamic Type registers a change, it redraws my UI instead of just updating it. How can I make sure the data is not reset when the text size changes?
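One common cause of this is rebuilding the view hierarchy whenever the content size category changes. As a minimal sketch (UIKit assumed; the field name is illustrative), letting UIKit update fonts in place avoids recreating the text fields, so their text survives:

import UIKit

final class FormViewController: UIViewController {
    // Illustrative field; in practice these would hold user-entered data.
    private let nameField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Scale the font with Dynamic Type and let UIKit update it in place,
        // instead of tearing the view hierarchy down and rebuilding it.
        nameField.font = UIFont.preferredFont(forTextStyle: .body)
        nameField.adjustsFontForContentSizeCategory = true
        view.addSubview(nameField)
    }

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        if traitCollection.preferredContentSizeCategory != previousTraitCollection?.preferredContentSizeCategory {
            // Only re-layout; the existing fields (and their text) survive.
            view.setNeedsLayout()
        }
    }
}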
1
1
522
2w
Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone.
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion and activates only when you look at it, without touching your main battery?

The Core Idea

Imagine this:

Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements: walking, picking up the phone, putting it in your pocket.

Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.

Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.

On-Demand, Ultra-Low-Power Clock: Only when the accelerometer detects one of these gestures would the stored kinetic energy be used to illuminate just the pixels needed to display the time on the main OLED/AMOLED screen. The rest of the screen stays completely black (consuming no power on OLED).

Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display turns off, conserving the limited harvested energy.

Why This Matters

This isn't just a cool gimmick; it offers significant benefits:

True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.

Ultimate Convenience: A "magical" interaction: just pick up your phone and the time instantly appears. No taps, no button presses.

Sustainable & Innovative: Showcases practical energy harvesting in a consumer device, pushing boundaries for self-sufficient tech.

Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and lighting only a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (the accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
1
1
114
Jun ’25
The brightness of the iPad Pro screen is gone after the iOS 26 update
After the iOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device’s high display quality before the update. Please advise if this is a known issue or if there's a fix.
1
1
94
Jun ’25
AirPods Pro 3 HRV Data Access Through HealthKit?
Hey everyone, I'm working on a health app that's heavily focused on HRV tracking and analysis, and I'm trying to figure out what's actually possible with AirPods Pro 3 from a developer standpoint. The hardware clearly has a much better heart rate sensor than the previous generation, but I'm hitting some walls when it comes to actually accessing the data I need.

Here's the situation I'm dealing with: when I query HealthKit for HRV samples, I'm not seeing anything coming from AirPods Pro 3. The device is obviously capable of tracking heart rate continuously during workouts and listening sessions, and from what I've read about the hardware, it should theoretically be able to capture the inter-beat intervals needed for HRV calculation. But either that data isn't being processed on-device, or it's just not being made available through the standard HealthKit data types that third-party apps can access.

What I'm really after is either direct HRV metrics (like SDNN, which Apple Watch already provides through HKQuantityTypeIdentifierHeartRateVariabilitySDNN) or, even better, access to the raw R-R interval data. With R-R intervals, I could calculate RMSSD, pNN50, and other time-domain and frequency-domain HRV metrics that are super valuable for tracking recovery, autonomic nervous system balance, and stress levels. This would be especially useful since a lot of users wear AirPods during activities when they're not wearing their Apple Watch.

Has anyone managed to find a way to pull this data from AirPods Pro 3? Are there any private frameworks or entitlements I should be looking into? Or is this just fundamentally not exposed to developers at the OS level right now? I've gone through the HealthKit documentation pretty thoroughly and haven't found anything that specifically addresses this, but I'm wondering if I'm missing something or if there are any known workarounds.

I'm also curious whether anyone has heard anything from Apple about future plans to expose this data. It seems like a missed opportunity given how capable the hardware is and how much value developers could provide with access to this physiological data. Would love to hear if anyone else is working on similar features or has insights into the technical limitations here.
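For anyone who wants to verify what HealthKit exposes, here's a minimal sketch that queries recent SDNN samples and logs their source devices, assuming read authorization for the type has already been granted; whether AirPods Pro 3 ever show up as a source is exactly the open question:

import HealthKit

let store = HKHealthStore()
let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN)!

// Fetch the 50 most recent SDNN samples and log where each one came from.
let query = HKSampleQuery(
    sampleType: hrvType,
    predicate: nil,
    limit: 50,
    sortDescriptors: [NSSortDescriptor(key: HKSampleSortIdentifierEndDate, ascending: false)]
) { _, samples, _ in
    for case let sample as HKQuantitySample in samples ?? [] {
        let ms = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
        // The device/source reveal which hardware contributed the sample.
        let origin = sample.device?.name ?? sample.sourceRevision.source.name
        print("SDNN \(ms) ms from \(origin)")
    }
}
store.execute(query)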
1
0
618
Oct ’25
Live Captions only partially works - help?
Hope it's okay to post here; I haven't gotten resolution anywhere else. Apple's iOS Live Captions is supposed to translate speech into written text, either on the phone (works like a charm!) or via the microphone (think meeting in a conference room). The microphone doesn't work anywhere, anytime on a new iPhone 14 purchased in November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
3
1
228
Mar ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences

Example:

// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: the new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB[20271264]

Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
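A partial workaround sketch, assuming the Settings selection itself stays unreadable: enumerate the installed voices for the language with AVSpeechSynthesisVoice.speechVoices() and let the app persist the user's pick by identifier, instead of relying on language-based auto-selection:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// List every installed voice for en-GB, including third-party ones.
let gbVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "en-GB" }
for voice in gbVoices {
    print(voice.name, voice.identifier)
}

// Speak with an explicit identifier rather than language-based auto-selection.
let utterance = AVSpeechUtterance(string: "Hello")
utterance.voice = gbVoices.first.flatMap { AVSpeechSynthesisVoice(identifier: $0.identifier) }
synthesizer.speak(utterance)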
4
1
360
Oct ’25
iOS 26 VoiceOver is reporting an extra tab
Feedback number: FB20451665

When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:

/* This is a sample project to show that I believe there is a VoiceOver bug in iOS 26.
   When swiping through tabs with VoiceOver active, there always appears to be an extra tab.
   Here I have 5 tabs; when on tab one VO reads out tab 1 of 6, then tab 2 of 6, all the way
   to the last tab, when VoiceOver reads out tab 5 of 6. Never tab 6 of 6.
   Is there a possibility that VoiceOver is picking up the underlying `more` tab and reading that out?
   This has also reportedly been found in the Files app here:
   https://www.applevis.com/comment/195441#comment-195441 */

struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.home.title.capitalized) tab")

            Tab(RootTab.diary.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.diary.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.diary.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.diary.title.capitalized) tab")

            Tab(RootTab.meals.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.meals.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.meals.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.meals.title.capitalized) tab")

            Tab(RootTab.knowledge.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.knowledge.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.knowledge.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.knowledge.title.capitalized) tab")

            Tab(RootTab.profile.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.profile.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.profile.title.capitalized) tab")
            .accessibilityHint("Double tap to open the \(RootTab.profile.title.capitalized) tab")

            /// Activating this also has VoiceOver telling us there are 6 tabs.
            // ForEach(RootTab.allCases, id: \.self) { tab in
            //     Text("This is the \(tab.title.capitalized) screen")
            //         .tabItem {
            //             Label(tab.title.capitalized, systemImage: "circle.fill")
            //         }
            //         .accessibilityLabel("\(tab.title.capitalized) tab")
            //         .accessibilityHint("Double tap to open the \(tab.title.capitalized) tab")
            // }
        }
    }

    enum RootTab: CaseIterable {
        case home
        case diary
        case meals
        case knowledge
        case profile

        var title: String {
            switch self {
            case .home: "home"
            case .diary: "diary"
            case .meals: "meals"
            case .knowledge: "knowledge"
            case .profile: "profile"
            }
        }
    }
}

I'm curious if anyone else can see this issue, or if anyone knows of a workaround for it.
3
0
2k
Oct ’25
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.
Your app's binary includes references to HealthKit components, but the app still does not appear to include any primary features that require health or fitness data.

Next Steps

To resolve this issue, please remove any HealthKit functionality from the app, as well as any references to this app’s interactivity with HealthKit from the app or its metadata. This includes removing any HealthKit-related keys in the app's Info.plist or InfoPlist.strings files, as well as removing any calls to HealthKit APIs, including those from third-party platforms, from the app.
1
1
257
Oct ’25
Medication data insert from a third-party app
I want to insert the medication data that is available from iOS 26 from my app into Apple HealthKit. I tried to get permission to read and write this data, but the app crashed when I requested it. Does Apple allow inserting medication data into HealthKit the same way we can add other health and fitness data?

let healthStore = HKHealthStore()

@available(iOS 26.0, *)
@objc func requestAuthorization(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: @escaping RCTPromiseRejectBlock) {
    guard HKHealthStore.isHealthDataAvailable() else {
        print("not available")
        return
    }

    let doseType = HKObjectType.medicationDoseEventType()
    let medType = HKObjectType.userAnnotatedMedicationType()

    healthStore.requestAuthorization(toShare: [doseType], read: [doseType]) { success, error in
        if let err = error { reject("auth_error", err.localizedDescription, err); return }
        self.healthStore.requestPerObjectReadAuthorization(for: medType, predicate: nil) { s, e in
            if let err2 = e { reject("per_obj_auth", err2.localizedDescription, err2); return }
            resolve(["ok": success && s])
        }
    }
}
1
1
867
Oct ’25
Making VoiceOver more concise on a SwiftUI Menu
A common UI idiom in Apple's first-party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:

Apple Music, Library tab
Photos, Album view
Reminders

In all these cases, VoiceOver reads this element as "More, Button". In my SwiftUI app, I've implemented a visually identical button.

Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")

However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise, in line with the apps mentioned above?
2
1
506
Jan ’25
App Store policy for apps that allow users to unlock the contact details of the person who posted content
We have an app under development which allows musicians to unlock the contact details of people who posted about an upcoming event. The musician pays a fee to unlock these contact details. Both the musician and the post owner are registered users. We will reveal the same contact info that the post owner used for account signup verification.

Questions:
Is this allowed? (Given that we obtain consent to share contact info with other people and clearly state this in our privacy policy.)
If yes, will we have to use App Store in-app purchase to facilitate this transaction, or are we free to use a payment processor such as Stripe?
0
1
614
Feb ’25