Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

TestFlight access stuck on old device
Hey all — hoping someone here has dealt with this before. I’m testing an iOS app via TestFlight, and when I originally got access, I didn’t have an iPhone. So I signed in with my Apple ID on my girlfriend’s iPhone and used TestFlight there. Everything worked fine. Now I finally have my own iPhone (iPhone 16), downloaded TestFlight, signed in with the same Apple ID, and had the developer resend the invite. But when I tap "Open in TestFlight" from the invite email, I get this error: “Couldn’t load app because your Apple account has already been associated to this app.” The dev tried removing me as a tester and re-adding me, I’ve deleted TestFlight from both phones, rebooted, reinstalled, waited in between — still no luck. Even tried opening the invite link in Safari instead of Mail. Is there any way to get Apple to fully reset the association with the old device so I can use TestFlight on my new iPhone? Or do I really need to make a new Apple ID just to get around this? Any help would be huge — thanks!
Replies: 0 · Boosts: 2 · Views: 321 · Jun ’25
Implementing App Clips with .NET MAUI
We have an iOS app built in .NET MAUI (Multi-platform App UI). It is a web-view app, and we wish to integrate App Clips into it, but we have been unable to, because very few resources on such an implementation are available online. We do not wish to share code between the .NET MAUI app and the App Clip, and we understand it is not possible to add an App Clip without a parent Swift/Xcode app. As an alternative, we are considering creating a new app in App Store Connect using Xcode/Swift and attaching the App Clip to it; when downloaded, this parent app would only redirect users to our main .NET MAUI app on the App Store. We need to know whether such an app would be approved by App Review. Please guide us on this, and also let us know if there is any other way to integrate App Clips with a .NET MAUI app.
Replies: 0 · Boosts: 1 · Views: 127 · Jun ’25
Unable to set Chinese dialect of AVSpeechSynthesisVoice in iOS 18
AVSpeechSynthesizer on some iOS 18 devices has a bug: it always reads Chinese content, e.g.

AVSpeechUtterance(string: "中文") // any Chinese content

in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect I specify in AVSpeechUtterance.voice:

AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

Setting the Chinese dialect via "zh-HK" or "zh-TW" worked on iOS 17 and below. My app has a feature that reads sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time, so setting the dialect in Spoken Language in Settings is not a workaround that lets my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (18.5 in my case) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not occur. If the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug does occur.

This bug puzzles me because I need both Chinese dialects read aloud one after the other, but as many users report, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays) my app reads Cantonese twice or Mandarin twice, depending on the Spoken Language setting. This iOS 18 bug prevents my app from performing its expected behavior. Would Apple look into this and advise whether any workaround is possible within app code, or please fix this bug in an iOS 18 update. Thank you.
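For what it's worth, a minimal sketch of the behavior the post expects (the shared synthesizer and function name are illustrative); on affected iOS 18 devices both utterances reportedly come out in the Settings-selected dialect instead:

import AVFoundation

// Sketch: speak the same text in Mandarin, then Cantonese,
// by pinning an explicit voice on each utterance.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String, dialect: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: dialect)
    synthesizer.speak(utterance) // utterances queue and play in order
}

speak("中文", dialect: "zh-TW") // expected: Mandarin
speak("中文", dialect: "zh-HK") // expected: Cantonese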
Replies: 1 · Boosts: 1 · Views: 116 · Jun ’25
Accessibility for detents behaves differently in a fullscreen cover
The only way I found to make accessibility focus work correctly in a detent presented inside a fullscreen cover is to apply the focus manually. The issue: in ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover while in ContentView I do not.

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")
                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)
            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
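For reference, the manual-focus workaround the commented-out code above hints at, distilled into a minimal sketch (the view name is illustrative; it moves VoiceOver focus into the sheet by hand once it appears):

import SwiftUI

struct FocusedBottomSheet: View {
    @AccessibilityFocusState private var isFocused: Bool

    var body: some View {
        Text("Bottom Sheet")
            .accessibilityFocused($isFocused)
            .onAppear {
                // Give the sheet time to finish presenting, then move
                // VoiceOver focus into it manually.
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    isFocused = true
                }
            }
    }
}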
Replies: 1 · Boosts: 1 · Views: 616 · Feb ’25
The brightness of the iPad Pro screen is gone after the iOS 26 update
After the iOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device's high display quality before the update. Please advise if this is a known issue or if there's a fix.
Replies: 1 · Boosts: 1 · Views: 103 · Jun ’25
Problem with the Notes application
I have more than 1,000 notes organized in parent/child folders up to 5 levels deep. From the 5th folder level onward, I can no longer share a note: the note itself is not shared; instead, the note of the parent folder is shared. Thank you very much. Best regards, Christophe
Replies: 1 · Boosts: 1 · Views: 245 · May ’25
Live Captions only partially works - help?
Hope it's okay to post here - I haven't gotten resolution anywhere else. Apple's iOS Live Captions is supposed to turn speech into written text, either audio playing on the phone itself (works like a charm!) or picked up via the microphone (think of a meeting in a conference room). The microphone mode doesn't work anywhere, anytime on a new iPhone 14 purchased in November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
Replies: 3 · Boosts: 1 · Views: 250 · Mar ’25
How accessible is enough for Accessibility Nutrition Labels?
My team has a robust digital accessibility program and processes for WCAG conformance in our apps. Because of this, there are definitely accessibility defects that get caught and addressed in order of impact and business priority, like any other bug. Obviously we want to aim for 100% accessibility for our users, but it's a continual work in progress as new enhancements or changes are released.

I'm stuck on the appropriate measurement to indicate support. If we have 50 common tasks and the most central 10 tasks are solid, but some supporting (yet also common) tasks have a contrast fail or a missing accessibleLabel, does that make the whole app not support the feature? If "completing the task" is the rubric, there is a whole range of interpretations of that. In a complex app, I anticipate that a group like ours will have strong support for many of the Accessibility Nutrition Labels features across tasks and devices, but will realistically never be 100% free of defects for a given Apple accessibility feature, even among core tasks.

As I consider the next steps for Nutrition Labels, I do not see anything in the documentation that gives a baseline or measurement for inclusion. We plan to test all steps to complete a task and log defects accordingly, with an assigned timeline for fixing them (as would be true for functional defects).
Replies: 4 · Boosts: 0 · Views: 2.0k · 3w
"Captions" in the Accessibility Nutrition Label for text-based apps
My game app is text-based interactive fiction containing no audio/video content, making captions unnecessary. Our game is completely accessible to deaf users. Despite this, in the Accessibility Nutrition Label I'm only able to leave the "Captions" box checked or unchecked. Leaving it unchecked would give deaf players the wrong impression that they can't enjoy our game; leaving it checked would imply that we do have A/V content with captions included.

In the WWDC video on this, https://developer.apple.com/videos/play/wwdc2025/224/, the presenter says: "After we completed common tasks, we realized our app doesn't have any video or audio only content. In this case, we aren't going to indicate that Landmarks supports Captions. That's okay. This accurately describes the features that people will expect to be available while using the app."

Maybe that's "OK," but I wish the form allowed me to say "This app doesn't contain audio/video content."
Replies: 3 · Boosts: 1 · Views: 130 · Jun ’25
Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone.
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?

The Core Idea

Imagine this:
- Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements: walking, picking up the phone, putting it in your pocket.
- Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
- Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
- On-Demand, Ultra-Low-Power Clock: Only when the accelerometer detects one of these gestures would the stored kinetic energy be used to illuminate just the pixels needed to display the time on the main OLED/AMOLED screen. The rest of the screen stays completely black (consuming no power on OLED).
- Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.

Why This Matters

This isn't just a cool gimmick; it offers significant benefits:
- True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
- Ultimate Convenience: A "magical" interaction: just pick up your phone, and the time instantly appears. No taps, no button presses.
- Sustainable & Innovative: Showcases practical energy harvesting in a consumer device, pushing boundaries for self-sufficient tech.
- Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (the accelerometer), efficient display technology (OLED/AMOLED true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
Replies: 1 · Boosts: 1 · Views: 124 · Jun ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: when users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

Particularly affects:
- Third-party speech synthesis voices (CereProc, Grammatek, etc.)
- Apps relying on automatic voice selection based on user preferences

Example:

// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: the new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
- AVSpeechSynthesisVoice(language:)
- AVSpeechUtterance auto-selection
- Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB[20271264]

Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
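Not a fix for the regression, but a possible partial mitigation while it stands (a sketch, assuming you surface a voice picker in your own app rather than relying on the system preference): enumerate the installed voices and pin the chosen one explicitly on each utterance.

import AVFoundation

// Sketch: bypass AVSpeechSynthesisVoice(language:)'s automatic selection
// by looking up an installed voice by name. "HEATHER" is the third-party
// voice from the example above; any voice returned by speechVoices()
// works the same way.
func installedVoice(named name: String, language: String) -> AVSpeechSynthesisVoice? {
    AVSpeechSynthesisVoice.speechVoices().first {
        $0.language == language &&
        $0.name.caseInsensitiveCompare(name) == .orderedSame
    }
}

let utterance = AVSpeechUtterance(string: "Hello")
utterance.voice = installedVoice(named: "HEATHER", language: "en-GB")
    ?? AVSpeechSynthesisVoice(language: "en-GB") // fall back to system default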
Replies: 4 · Boosts: 1 · Views: 434 · Oct ’25
IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) does not work
I have an app that needs Input Monitoring permissions to get keyboard access in the background. I've attempted to use both IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) and IOHIDRequestAccess(kIOHIDRequestTypeListenEvent), but they always return denied, even though I have given the permission for Input Monitoring to the app in Settings. Is there something I need to put in my Info.plist to enable this permission to work?
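For comparison, a minimal sketch of the check-then-request flow I'd expect to work (macOS, non-sandboxed target assumed; no Info.plist key is required for Input Monitoring, to my knowledge). One common gotcha is that the TCC grant is tied to the binary's code signature, so an ad-hoc-signed build that gets rebuilt can look permanently denied; resetting with tccutil reset ListenEvent <bundle-id> and re-granting sometimes clears it.

import IOKit.hid

// Sketch: check Input Monitoring state, then prompt if not yet determined.
let access = IOHIDCheckAccess(kIOHIDRequestTypeListenEvent)
if access == kIOHIDAccessTypeGranted {
    print("Input Monitoring granted")
} else if access == kIOHIDAccessTypeDenied {
    // Prompting again won't help; the user must flip the toggle in
    // System Settings > Privacy & Security > Input Monitoring.
    print("Input Monitoring denied")
} else {
    // kIOHIDAccessTypeUnknown: shows the system prompt and adds the
    // app to the Input Monitoring list.
    let granted = IOHIDRequestAccess(kIOHIDRequestTypeListenEvent)
    print("Requested access, granted now: \(granted)")
}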
Replies: 1 · Boosts: 1 · Views: 2.4k · 3w
Assistive Access + Firebase Authentication
I have an issue in my app when it is used together with the Assistive Access feature. For authentication we use the Capacitor Firebase Authentication plugin (https://www.npmjs.com/package/@capacitor-firebase/authentication), which lets users log in via Apple (FirebaseAuthentication.signInWithApple(...)), Google (FirebaseAuthentication.signInWithGoogle(...)), or email. This works just fine.

However, when Assistive Access is enabled, login fails for Apple ("The operation couldn't be completed. com.apple.AuthenticationServices.AuthorizationError error 1000") and Google ("The user canceled the sign-in flow"). It seems the sign-in popups are blocked, so an error is returned immediately. The popups may be blocked by Assistive Access, leaving the Capacitor plugin unable to authenticate.

I have tested this on my iPhone 12 Pro running iOS 17.7. I would appreciate any suggestions for handling this issue!
Replies: 1 · Boosts: 1 · Views: 747 · Jul ’25
Download Voices screen
System Settings => Accessibility => System Voice => the little (i) beside the pulldown => Voices => THIS SCREEN will let you download premium voices.

Is there a way to trigger this screen programmatically? Or at least a link to get my users there without having to dig through that swamp of screens?
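As far as I know there is no public deep link to that pane; App-Prefs:-style URLs into arbitrary Settings screens are private API and a rejection risk. The only supported programmatic jump into Settings is your app's own settings page, a sketch:

import UIKit

// Opens Settings at this app's own page; there is no public URL for the
// voice-download screen itself.
if let url = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(url)
}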
Replies: 0 · Boosts: 1 · Views: 526 · Nov ’25
VoiceOver Accessibility Tree out of sync with WKWebView contents
Hey, we've run into an issue where WKWebView contents are not always available to VoiceOver users. It seems to occur when the WKWebView contents are loaded asynchronously. I have a sample project where this can be reproduced, and a video showing the issue; see FB21257352.

The only solution we currently see is forcing an update continuously using UIAccessibility.post(notification: .layoutChanged, argument: nil), but this is of course a last resort, as it may have other unintended side effects.
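A less aggressive variant of that workaround might post the notification once when a load settles rather than continuously (a sketch; it assumes the async content arrives with the navigation, so XHR-driven DOM updates may still need their own nudge):

import UIKit
import WebKit

final class WebContentViewController: UIViewController, WKNavigationDelegate {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.navigationDelegate = self
        view.addSubview(webView)
        webView.load(URLRequest(url: URL(string: "https://example.com")!))
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        // Nudge VoiceOver to re-sync its accessibility tree once the
        // page has finished loading.
        UIAccessibility.post(notification: .layoutChanged, argument: nil)
    }
}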
Replies: 1 · Boosts: 0 · Views: 738 · Dec ’25