Hello,
I am working on a braille keyboard using the HID approach.
Currently, the device works with the iPhone 11 and iPhone SE (3rd generation).
However, when tested on an iPhone 6s running iOS 15, the device connects and is recognized as a braille device on the VoiceOver screen, but the phone shows no response to key-press reports.
Are there any additional requirements, for example in the HID descriptor, for braille device support on the iPhone 6s? If the iPhone 6s does not support such devices, what are the minimum system requirements?
Thank you!
I’m developing an iOS app, and I’ve noticed that when the user enables Accessibility → Display & Text Size → Color Filters → Grayscale, my app icon loses a lot of visual contrast. The original colored version looks fine, but in grayscale it appears “flat” and harder to distinguish, unlike a pure black-and-white design.
What I want to achieve:
Ensure the app icon remains visually clear and high-contrast even when iOS renders it in grayscale.
Ideally, provide an alternate “high-contrast” app icon version when grayscale mode is enabled.
What I’ve tried:
Increased color contrast in the original icon design.
Added outlines and stronger shapes.
Tested with grayscale filters in design tools.
Researched Asset Catalog and alternate icons, but found no documented API to detect or respond to grayscale mode.
Questions:
Is there any API in iOS that allows detecting when the system is in grayscale mode so that I can programmatically switch to an alternate app icon?
If not, are there Apple-recommended best practices for designing app icons that still look clear in grayscale?
Are there any accessibility guidelines specifically addressing icon design for grayscale or color-blind modes?
Additional info:
iOS version tested: iOS 17.5
Development in Swift + SwiftUI, using Asset Catalog for icons.
I am aware that iOS supports alternate icons via setAlternateIconName, but I haven’t found a trigger for grayscale mode.
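For illustration, this is the kind of switch I have in mind, assuming a usable grayscale signal existed (UIAccessibility.isGrayscaleEnabled is the closest thing I've found, but I haven't confirmed it reflects the Color Filters grayscale setting; "HighContrastIcon" is a hypothetical alternate icon declared in the project):

import UIKit

// "HighContrastIcon" is a hypothetical alternate icon name; whether
// isGrayscaleEnabled actually tracks the Color Filters grayscale mode
// is exactly what I'm unsure about.
func updateIconForGrayscale() {
    let iconName = UIAccessibility.isGrayscaleEnabled ? "HighContrastIcon" : nil // nil = primary icon
    guard UIApplication.shared.supportsAlternateIcons,
          UIApplication.shared.alternateIconName != iconName else { return }
    UIApplication.shared.setAlternateIconName(iconName)
}

// Re-evaluate whenever the grayscale status changes.
let observer = NotificationCenter.default.addObserver(
    forName: UIAccessibility.grayscaleStatusDidChangeNotification,
    object: nil, queue: .main
) { _ in updateIconForGrayscale() }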
I am facing an issue with the back camera on my iPhone 14 Plus: it shows a black screen. My iPhone was manufactured between April 2023 and April 2024, but it is still not eligible for the Apple service program, even though it has the same issue. Why is it not eligible?
I have a Swift app. In the latest version, I changed some class models. The old version used archivedData(withRootObject:) to save the class model; the latest version now crashes with -[NSKeyedUnarchiver decodeObjectForKey:]: cannot decode object of class.
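A sketch of the class-name mapping I'm now trying, assuming the archive was written under the old name (here "OldModel" and NewModel are hypothetical stand-ins for my actual classes):

import Foundation

// Hypothetical renamed model; the old archive stored it as "OldModel".
class NewModel: NSObject, NSCoding {
    let name: String
    init(name: String) { self.name = name }
    required init?(coder: NSCoder) {
        guard let n = coder.decodeObject(forKey: "name") as? String else { return nil }
        name = n
    }
    func encode(with coder: NSCoder) { coder.encode(name, forKey: "name") }
}

func decodeLegacyArchive(_ data: Data) throws -> NewModel? {
    let unarchiver = try NSKeyedUnarchiver(forReadingFrom: data)
    unarchiver.requiresSecureCoding = false // the old archive predates secure coding
    // Map the class name stored in the old archive onto the renamed class.
    unarchiver.setClass(NewModel.self, forClassName: "OldModel")
    defer { unarchiver.finishDecoding() }
    return unarchiver.decodeObject(forKey: NSKeyedArchiveRootObjectKey) as? NewModel
}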
Hello,
Whenever I put accessibility focus on an image, and the image has some text in it, VoiceOver reads that text along with the image's accessibility label.
Is there a way to programmatically turn off text recognition on images for accessibility?
I couldn't find any relevant accessibility APIs that could help here.
Thanks!
Hope it's okay to post here - I haven't gotten a resolution anywhere else. Apple's iOS Live Captions is supposed to translate speech into written text, either from audio on the phone (works like a charm!) or via the microphone (think a meeting in a conference room). The microphone mode doesn't work anywhere, anytime on a new iPhone 14 purchased in November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
Hi guys, can anyone help me? I can't pair my Apple Watch Series 1 with my iPhone 15.
I have updated to iPadOS 26, and my pointer is still the circle one, not the arrow cursor.
When my macOS Cocoa app displays a modal alert with beginSheetModal(for:completionHandler:), VoiceOver sometimes seems to focus on an "illegal" upper level, where any attempt at navigation gives the unhelpful response "Alert, dialog," until you "drill down" with VO + Shift + Down or switch apps. After that, things work as expected.
Is this a known bug? Does it happen to anybody else, or am I doing something wrong?
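For context, the presentation is essentially a plain NSAlert sheet, like this sketch (strings are illustrative):

import AppKit

// Present a standard alert as a sheet; the VoiceOver focus oddity
// shows up once the sheet is displayed.
func showAlert(on window: NSWindow) {
    let alert = NSAlert()
    alert.messageText = "Something happened"
    alert.informativeText = "Details about what happened."
    alert.addButton(withTitle: "OK")
    alert.beginSheetModal(for: window) { response in
        _ = response // handle NSApplication.ModalResponse here
    }
}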
Hey everyone
I'm working on a health app that's heavily focused on HRV tracking and analysis, and I'm trying to figure out what's actually possible with AirPods Pro 3 from a developer standpoint. The hardware clearly has a much better heart rate sensor than the previous generation, but I'm hitting some walls when it comes to actually accessing the data I need.
So here's the situation I'm dealing with: When I query HealthKit for HRV samples, I'm not seeing anything coming from AirPods Pro 3. The device is obviously capable of tracking heart rate continuously during workouts and listening sessions, and from what I've read about the hardware, it should theoretically be able to capture the inter-beat intervals needed for HRV calculation. But either that data isn't being processed on-device, or it's just not being made available through the standard HealthKit data types that third-party apps can access.
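For concreteness, a simplified sketch of the kind of query I'm running (read authorization for the HRV type is assumed; the sorting and limit are illustrative):

import HealthKit

let healthStore = HKHealthStore()
let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN)!

// Fetch recent SDNN samples and log the source/device of each one,
// to see whether anything is attributed to AirPods Pro 3.
let sort = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
let query = HKSampleQuery(sampleType: hrvType, predicate: nil, limit: 50, sortDescriptors: [sort]) { _, samples, _ in
    for case let sample as HKQuantitySample in samples ?? [] {
        let ms = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
        print("SDNN \(ms) ms from \(sample.sourceRevision.source.name), device: \(sample.device?.name ?? "unknown")")
    }
}
healthStore.execute(query)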
What I'm really after is either direct HRV metrics (like SDNN, which Apple Watch already provides through HKQuantityTypeIdentifierHeartRateVariabilitySDNN) or even better, access to the raw R-R interval data. With R-R intervals, I could calculate RMSSD, pNN50, and other time-domain and frequency-domain HRV metrics that are super valuable for tracking recovery, autonomic nervous system balance, and stress levels. This would be especially useful since a lot of users wear AirPods during activities when they're not wearing their Apple Watch.
Has anyone managed to find a way to pull this data from AirPods Pro 3? Are there any private frameworks or entitlements I should be looking into? Or is this just fundamentally not exposed to developers at the OS level right now?
I've gone through the HealthKit documentation pretty thoroughly and haven't found anything that specifically addresses this, but I'm wondering if I'm missing something or if there are any known workarounds.
I'm also curious if anyone has heard anything from Apple about future plans to expose this data. It seems like a missed opportunity given how capable the hardware is and how much value developers could provide with access to this physiological data. Would love to hear if anyone else is working on similar features or has insights into the technical limitations here.
Hi,
I want to detect if there is a fullscreen window on each screen.
_AXUIElementGetWindow and kAXFullscreenAttribute methods work, but I have to be in a non-sandbox environment to use them.
Is there any other way that also works? I don't think it's enough to judge fullscreen status by comparing the window size to the screen size, since that doesn't work on a MacBook with a notch or when the menu bar is set to auto-hide.
Thanks.
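For reference, a sketch of the non-sandboxed approach that does work (it requires the Accessibility permission, and since "AXFullScreen" has no public constant, the string literal is used):

import ApplicationServices

// Returns true if any window of the given process reports AXFullScreen.
func hasFullscreenWindow(pid: pid_t) -> Bool {
    let app = AXUIElementCreateApplication(pid)
    var value: CFTypeRef?
    guard AXUIElementCopyAttributeValue(app, kAXWindowsAttribute as CFString, &value) == .success,
          let windows = value as? [AXUIElement] else { return false }
    for window in windows {
        var fullscreen: CFTypeRef?
        if AXUIElementCopyAttributeValue(window, "AXFullScreen" as CFString, &fullscreen) == .success,
           let isFullscreen = fullscreen as? Bool, isFullscreen {
            return true
        }
    }
    return false
}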
I have an issue in my app when it is used together with the Assistive Access feature.
For authentication, we are using the Capacitor Firebase Authentication plugin (https://www.npmjs.com/package/@capacitor-firebase/authentication), which enables users to log in via Apple (FirebaseAuthentication.signInWithApple(...)), Google (FirebaseAuthentication.signInWithGoogle(...)), or email. This works just fine. However, when Assistive Access is enabled, the login fails for Apple ("The operation couldn't be completed. com.apple.AuthenticationServices.AuthorizationError error 1000") and Google ("The user canceled the sign-in flow").
It seems the sign-in popups are blocked by Assistive Access, so an error is returned immediately and the Capacitor plugin is unable to authenticate.
I have tested this on my iPhone 12 Pro running iOS 17.7.
I would appreciate any suggestions to handle this issue!
Hi All,
I am developing a braille keyboard for iOS. When I test the typing function in Notes, the screen updates very slowly after typing; we expected the response to be instant. I am not sure how to set up the VoiceOver settings.
Is there any documentation on this issue, or any guideline for it?
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently, and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and the object wouldn't detect surfaces correctly. I then updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out.
Can someone help me fix or diagnose the issue?
Thank you.
In our system, when a user enables a mobile hotspot and the system connects to it, the system verifies Wi-Fi availability by sending an HTTP GET request to http://captive.apple.com.
Normally, the server returns:
HTTP Status: 200 (OK)
Content-Type: text/html
This has always been used as a sign of normal connectivity.
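For illustration, a minimal Swift sketch of this kind of probe (the endpoint normally returns a small HTML body containing "Success", so checking the status code and body may be more robust than relying on the Content-Type header):

import Foundation

// Probe captive.apple.com and report connectivity. Assumes plain-HTTP
// access to this URL is permitted (on Apple platforms, an ATS exception).
func checkConnectivity(completion: @escaping (Bool) -> Void) {
    let url = URL(string: "http://captive.apple.com")!
    URLSession.shared.dataTask(with: url) { data, response, _ in
        let status = (response as? HTTPURLResponse)?.statusCode
        let body = data.flatMap { String(data: $0, encoding: .utf8) } ?? ""
        completion(status == 200 && body.contains("Success"))
    }.resume()
}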
Issue:
Since last Friday, the server sometimes responds with:
Content-Type: application/octet-stream
When this occurs, our system determines that the network is unavailable and displays a connection warning (a “!” icon).
Question:
Has Apple recently made any backend or CDN configuration changes to captive.apple.com that could affect the response type?
Any advice on how we can solve this problem?
Thanks!
I’d love to see Apple implement a Bionic Reading feature as a system-wide accessibility option. This type of reading aid highlights the first part of each word in bold to help guide the eyes and improve comprehension.
It’s been shown to be especially helpful for people with ADHD, dyslexia, and other neurodivergent needs. Having a toggle in Settings > Accessibility would be life-changing.
Ideally, it would be:
• Enabled system-wide, or per-app
• Customizable in how much of each word is bolded
• Available in Safari, Messages, Books, News, etc.
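For illustration, a rough SwiftUI sketch of the core technique (bolding the first ~40% of each word and splitting on spaces are simplifying assumptions; real implementations vary):

import SwiftUI

// Bold roughly the first 40% of each word - the core of the effect.
// Splitting on spaces is deliberately naive; punctuation is ignored.
func bionicText(_ text: String) -> Text {
    text.split(separator: " ").reduce(Text("")) { result, word in
        let boldCount = max(1, Int(Double(word.count) * 0.4))
        let head = String(word.prefix(boldCount))
        let tail = String(word.dropFirst(boldCount))
        return result + Text(head).bold() + Text(tail) + Text(" ")
    }
}

struct BionicDemo: View {
    var body: some View {
        bionicText("Having a toggle in Settings would be life-changing.")
            .padding()
    }
}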
In iOS 18, when a button using @FocusState is inside a ScrollView, and the view is opened via a NavigationLink, the button is not accessible via an external Bluetooth keyboard.
Is this a known issue in iOS 18?
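For reference, a minimal reproduction sketch (view names are illustrative): a NavigationLink pushes a ScrollView containing a focusable button.

import SwiftUI

struct DetailView: View {
    @FocusState private var isButtonFocused: Bool

    var body: some View {
        ScrollView {
            Button("Tap me") { print("tapped") }
                .focused($isButtonFocused)
        }
    }
}

struct RootView: View {
    var body: some View {
        NavigationStack {
            NavigationLink("Open detail") { DetailView() }
        }
    }
}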
I've just received an email from Apple regarding the Global Accessibility Awareness Day and some forthcoming sessions to promote their accessibility features.
What a joke.
For many years, Apple has refused to provide the most basic accessibility requirement on macOS:
LET USERS DISABLE ALL NON-CONSENSUAL UNSOLICITED ANIMATIONS AND OTHER UI CONVULSIONS.
The scourge of animations started with macOS Lion.
Yes, many of them can, fortunately, be disabled through obscure Terminal commands (that is, if the user is lucky enough to discover them on some obscure internet resource).
The "Reduce motion" control in System Settings is a fake option that doesn't do anything.
And the two most glaring accessibility violations cannot be disabled at all:
1. The scroll bar rollover highlight effect introduced in macOS 10.7.3. Every time you move the cursor over a scroll bar, the bar gets highlighted. This draws the user's attention to random scroll bars for no reason whatsoever, just because the cursor happens to pass over the bar at some point. HUNDREDS of unnecessary, annoying distractions daily!
2. The expand/collapse animation of NSOutlineView (such as when opening or closing a folder in the Finder's list view, as well as in any other app that uses outline views). It's extremely annoying, distracting, and time-wasting.
All feedback submitted about this over the years remains mostly ignored (except for a few cases where I received ridiculous replies from employees who, apparently, are barely familiar with Macs in general).
Apple does NOT care about accessibility. Not only that, but it's obvious that Apple is, in fact, intentionally abusing those users who can't tolerate distracting, time-wasting animations and UI convulsions.
Updated to the iOS 26 beta, and now the TV Remote app in Control Center won't open. I've tried the following:
Restart the phone
Remove the shortcut and re-add it
Can't find any other troubleshooting methods for this issue online, so I'm guessing it's a new problem.
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.
Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.
Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences
Example:
// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)
Interesting observation: The new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.
Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs
All return the system default voice, not the user's preference.
Filed: FB20271264
Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
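The closest workaround I can think of (a sketch, not a fix for the regression) is to sidestep automatic selection: enumerate the installed voices and let users pick one explicitly, then set it on each utterance instead of relying on the language-based initializer.

import AVFoundation

// List installed en-GB voices (third-party voices show up here too)
// so the user can choose one explicitly.
let enGBVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "en-GB" }
for voice in enGBVoices {
    print(voice.name, voice.identifier)
}

// Use the chosen voice directly (here just the first match for brevity).
if let chosen = enGBVoices.first {
    let utterance = AVSpeechUtterance(string: "Hello from an explicitly selected voice.")
    utterance.voice = chosen
    let synthesizer = AVSpeechSynthesizer() // keep a strong reference in real code
    synthesizer.speak(utterance)
}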