Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic


Need app blocking permission for Screen Time Limit app - CAN'T GET ANSWER FROM SUPPORT FOR 3 WEEKS. APP HAS 100K FOLLOWERS ON SOCIAL MEDIA ALREADY
Hey everyone! I am developing a screen time limit app to help people spend less time in distracting apps. It works like this: users pick the apps that are distracting for them and, on the opposite side, productivity apps; time spent on healthy habits can then be exchanged for time to scroll or use the distracting apps. Social media loved the idea, and the app already has 100k followers without even being launched yet. So I am waiting on just one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago. There are a lot of similar apps on the market, and this feature exists in other screen time limit apps.

Why is app blocking permission needed?
- Time exchange functionality: users independently select which apps are productive and which are distracting for them. The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones. This encourages healthy device usage.
- Full user control: every app to be blocked is manually selected by the user in the settings. The extension does not impose any restrictions without explicit permission.
- Transparency and security: blocking happens locally, and no data is collected about app usage. We adhere to Apple's privacy policy.
- Compliance with App Store guidelines: we understand that app blocking is a sensitive feature, but in our case it is used for the benefit of the user (digital detox, productivity improvement), does not interfere with system processes or other developers' apps, and does not misuse API access.

My questions to the forum: Did you have similar problems, and how did you resolve them? Are there any ways to speed up the process or to contact someone from the approval team directly? Should I give up and release on Android? I am very disappointed and frustrated. Hope to get some useful tips. Thank you very much!
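The post doesn't name the API involved, but user-selectable app blocking on iOS is normally built on the Screen Time stack (FamilyControls plus ManagedSettings), whose App Store use is gated behind the Family Controls distribution entitlement that Apple grants on request. A minimal sketch of that presumed stack:

```swift
import FamilyControls
import ManagedSettings

// Ask for Screen Time (Family Controls) authorization on-device. Shipping
// this on the App Store additionally requires Apple to grant the Family
// Controls distribution entitlement, presumably the approval being waited on.
func requestScreenTimeAccess() async {
    do {
        try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
    } catch {
        print("Screen Time authorization failed: \(error)")
    }
}

// Once authorized, blocking the user-selected "negative" apps is a matter of
// shielding the tokens the user picked in a FamilyActivityPicker:
let store = ManagedSettingsStore()

func block(_ selection: FamilyActivitySelection) {
    store.shield.applications = selection.applicationTokens
}

func unblock() {
    store.shield.applications = nil
}
```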
0 replies · 1 boost · 147 views · May ’25
Apple Pay Merchant Session - Could not create SSL/TLS secure channel issue
As part of an Apple Pay implementation, we are trying to create a merchant session by connecting to the Apple endpoint https://apple-pay-gateway-cert.apple.com/paymentservices/startSession. While doing so, we get the error "An error occurred while sending the request. The request was aborted: Could not create SSL/TLS secure channel." I call the validation URL by passing it to a C# .NET Framework 4.8 Web API. The API sets up an HttpClient with the Merchant Identity Validation Certificate found in my Apple account and calls the validation URL, passing in the required JSON validation object. When I call PostAsync() I get an exception with the above error message. The code works successfully on my local machine, but the issue appears when deployed to the Dev / Model environment for testing. We use an Azure App Service for deployment, and TLS 1.2 is already enabled there. We have used the Merchant Identity certificate that was issued and have also checked with the networking and infrastructure teams to make sure it's not an issue on our side. Does anyone have any other idea what could be causing this error? Thank you, Supriya
0 replies · 0 boosts · 107 views · Jun ’25
How to lookup keybinding translation across input sources
I have an application that binds a menu item to trigger on ⌘]. With the US input source set, I press ⌘] to trigger that item. However, when I switch the input source to QWERTZ (German), the OS automatically changes the trigger to ⌘Ä; it appears to translate key equivalents for different input sources. The problem is that I also render the keybindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other key shortcuts in my application which are not bound to menu items, and I want to make sure those get translated too. Does AppKit expose a way to look up what a keystroke will become when macOS translates it, i.e. to look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs. I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't give the same result, because it converts ] to + instead, which seems to be based on physical key location, different from how the menu-item key equivalents are handled. Update: I notice similar behavior in VS Code's menu bar, e.g. in their "Terminal" menu; switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because their menu items are specified in a different way (xib files with hard-coded key equivalents).
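For reference, a sketch of the round trip described above, assuming Carbon's UCKeyTranslate (the post doesn't show code, so this is an assumption about the approach). As the post notes, this maps physical key positions, so with QWERTZ active the key code for the US ] position yields +, not the Ä that AppKit displays for the menu item, which is exactly the discrepancy in question:

```swift
import Carbon
import Foundation

// Translate a virtual key code to the character it produces under the
// CURRENT keyboard layout. UCKeyTranslate operates on physical key
// positions, which is why ']' round-trips to '+' on QWERTZ rather than
// the 'Ä' that AppKit shows for menu key equivalents.
func character(for keyCode: UInt16) -> String? {
    guard let source = TISCopyCurrentKeyboardLayoutInputSource()?.takeRetainedValue(),
          let rawLayout = TISGetInputSourceProperty(source, kTISPropertyUnicodeKeyLayoutData)
    else { return nil }
    let layoutData = Unmanaged<CFData>.fromOpaque(rawLayout).takeUnretainedValue() as Data

    return layoutData.withUnsafeBytes { (buffer: UnsafeRawBufferPointer) -> String? in
        guard let layout = buffer.bindMemory(to: UCKeyboardLayout.self).baseAddress
        else { return nil }
        var deadKeyState: UInt32 = 0
        var length: UniCharCount = 0
        var chars = [UniChar](repeating: 0, count: 4)
        let err = UCKeyTranslate(layout,
                                 keyCode,
                                 UInt16(kUCKeyActionDisplay),
                                 0,                          // no modifier state
                                 UInt32(LMGetKbdType()),
                                 OptionBits(kUCKeyTranslateNoDeadKeysMask),
                                 &deadKeyState,
                                 UniCharCount(chars.count),
                                 &length,
                                 &chars)
        return err == noErr ? String(utf16CodeUnits: chars, count: Int(length)) : nil
    }
}
```

A related lead worth checking: macOS 12 added NSMenuItem.allowsAutomaticKeyEquivalentLocalization, which controls exactly this automatic remapping of key equivalents, so that property (or disabling it) may be another angle on keeping the rendered keybindings and the menu bar in sync.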
0 replies · 0 boosts · 433 views · Jan ’25
App in Unlisted Language
I am building a language learning app for an unlisted primary language. Any suggestions or heads-ups? My plan is to select English and go with it. It's unfortunate that I have to list a language learning app incorrectly, and a tag for that language probably does not exist anywhere in the Apple system.
0 replies · 0 boosts · 169 views · Jul ’25
When using UIScrollView and UITextView together, inserting a link causes the following text to disappear.
Since UITextView does not support zooming on its own, I add the UITextView as a subview of a UIScrollView to get zoom behavior. However, when the text contains a link, the text after the link goes missing. E.g., with the content https://appstoreconnect.apple.com/login\nApple Developer Login, the text "Apple Developer Login" does not appear. If anyone has experienced the same problem or knows a solution, please leave a comment. Note: it works normally on iOS 16, but the text after the link disappears on iOS 18. The text is not visible, yet you can copy and paste it to retrieve the missing text.
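For anyone wanting to reproduce this, a minimal sketch of the described setup (class names, sizes, and the attribute range are illustrative, not from the original post):

```swift
import UIKit

// Minimal reproduction sketch: a non-scrolling UITextView nested in a
// UIScrollView for pinch-to-zoom, with a .link attribute on the first line.
final class ZoomableTextViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()

        scrollView.frame = view.bounds
        scrollView.minimumZoomScale = 1
        scrollView.maximumZoomScale = 4
        scrollView.delegate = self
        view.addSubview(scrollView)

        let string = "https://appstoreconnect.apple.com/login\nApple Developer Login"
        let text = NSMutableAttributedString(string: string)
        text.addAttribute(.link,
                          value: URL(string: "https://appstoreconnect.apple.com/login")!,
                          range: NSRange(location: 0, length: 39))  // the URL portion only

        textView.attributedText = text
        textView.isScrollEnabled = false
        textView.frame = CGRect(x: 0, y: 0, width: view.bounds.width, height: 200)
        scrollView.addSubview(textView)
        scrollView.contentSize = textView.frame.size
    }

    // Zooming the text view is what the nesting is for.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? { textView }
}
```

That the hidden text survives copy and paste suggests the characters are still in the text storage and only the rendering is affected.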
0 replies · 0 boosts · 244 views · Feb ’25
iOS: How to maintain good app icon contrast in grayscale mode?
I’m developing an iOS app, and I’ve noticed that when the user enables Accessibility → Display & Text Size → Color Filters → Grayscale, my app icon loses a lot of visual contrast. The original colored version looks fine, but in grayscale it appears “flat” and harder to distinguish, unlike a pure black-and-white design.

What I want to achieve:
- Ensure the app icon remains visually clear and high-contrast even when iOS renders it in grayscale.
- Ideally, provide an alternate “high-contrast” app icon when grayscale mode is enabled.

What I’ve tried:
- Increased color contrast in the original icon design.
- Added outlines and stronger shapes.
- Tested with grayscale filters in design tools.
- Researched the asset catalog and alternate icons, but found no documented API to detect or respond to grayscale mode.

Questions:
1. Is there any API in iOS that allows detecting when the system is in grayscale mode, so that I can programmatically switch to an alternate app icon?
2. If not, are there Apple-recommended best practices for designing app icons that still look clear in grayscale?
3. Are there any accessibility guidelines specifically addressing icon design for grayscale or color-blind modes?

Additional info: tested on iOS 17.5; developed in Swift + SwiftUI, using an asset catalog for icons. I am aware that iOS supports alternate icons via setAlternateIconName, but I haven’t found a trigger for grayscale mode.
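On question 1: UIKit does expose UIAccessibility.isGrayscaleEnabled and a matching change notification, which can be paired with setAlternateIconName. Whether the Color Filters grayscale setting is reflected by this property is worth verifying on-device, since the API predates Color Filters. A sketch, with an illustrative icon name:

```swift
import UIKit

// Sketch of an alternate-icon switch driven by the grayscale setting.
// Assumptions: UIAccessibility.isGrayscaleEnabled reflects the Color Filters
// grayscale mode (verify on-device), and "AppIcon-HighContrast" is an
// alternate icon registered in the project; the name is illustrative.
func updateIconForGrayscale() {
    let target: String? = UIAccessibility.isGrayscaleEnabled ? "AppIcon-HighContrast" : nil
    guard UIApplication.shared.supportsAlternateIcons,
          UIApplication.shared.alternateIconName != target else { return }
    UIApplication.shared.setAlternateIconName(target) { error in  // nil = primary icon
        if let error { print("Icon switch failed: \(error)") }
    }
}

// Re-evaluate whenever the setting changes.
let token = NotificationCenter.default.addObserver(
    forName: UIAccessibility.grayscaleStatusDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    updateIconForGrayscale()
}
```

One caveat: setAlternateIconName presents a system alert each time the icon changes, so switching automatically on every toggle may feel intrusive to users.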
0 replies · 1 boost · 419 views · Aug ’25
IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) does not work
I have an app that needs Input Monitoring permissions to get keyboard access in the background. I've attempted to use both IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) and IOHIDRequestAccess(kIOHIDRequestTypeListenEvent), but they always return denied, even though I have given the permission for Input Monitoring to the app in Settings. Is there something I need to put in my Info.plist to enable this permission to work?
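A sketch of the check/request pair in Swift, using the C constant spellings from the question. One common gotcha: the Input Monitoring grant is keyed by TCC to the app's code signature and bundle identifier, so an ad-hoc-signed or frequently re-signed debug build can keep reporting denied even after being toggled on in System Settings; running `tccutil reset ListenEvent` and granting again sometimes clears a stale entry.

```swift
import IOKit.hid

// Check current Input Monitoring status without prompting the user.
let granted = IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) == kIOHIDAccessTypeGranted

if !granted {
    // Prompts the user and returns true if access was given. No Info.plist
    // key is required for Input Monitoring; the grant is tracked by TCC
    // against the app's code signature, not a usage-description string.
    let nowGranted = IOHIDRequestAccess(kIOHIDRequestTypeListenEvent)
    print("Input Monitoring after request: \(nowGranted)")
}
```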
0 replies · 0 boosts · 438 views · 1w
Make Accessibility Focus move to UIPickerView when tapping on UITextField (Full Keyboard Access)
I have a UITextField in my application for entering a state. If I tap on it, a UIPickerView pops up and lets the user select a state (they can still type too). The issue relates to Full Keyboard Access: if the UITextField is selected with an external keyboard, the UIPickerView appears, but to reach it the user has to tab through the whole view controller. What would be nice is to (a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable right away with the keyboard), or (b) make the UIPickerView the next view reachable by tabbing or the arrow keys.

I've tried:
- UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This only announced the view and didn't help with Full Keyboard Access.
- Making the UIPickerView a first responder when it appears.
- Changing the accessibilityElements order (but with so many nested views, this isn't really viable).

Pressing Tab + → (Tab and the right-arrow key) quickly takes the user to the end of the chain of accessibility elements, i.e. to the UIPickerView. But there has to be a cleaner way of automatically setting focus to the UIPickerView or making it the next element reached with the arrow keys.
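One avenue not in the list above: on iOS 15+ keyboard focus is driven by UIFocusSystem, so explicitly requesting a focus update toward the picker may achieve option (a). A sketch under that assumption (names are illustrative, and whether Full Keyboard Access honors the request should be verified on-device):

```swift
import UIKit

// A picker subclass that opts into focusability, so the focus engine
// is allowed to land on it at all.
final class StatePickerView: UIPickerView {
    override var canBecomeFocused: Bool { true }
}

// Call after the picker has been added to the hierarchy (e.g. when the
// text field begins editing) to steer keyboard focus toward it.
func moveKeyboardFocus(to picker: UIPickerView, in viewController: UIViewController) {
    guard let focusSystem = UIFocusSystem.focusSystem(for: viewController) else { return }
    focusSystem.requestFocusUpdate(to: picker)
    focusSystem.updateFocusIfNeeded()
}
```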
0 replies · 0 boosts · 395 views · Mar ’25
Seeking API Support for Marking Substrings as Headings in NSTextView for VoiceOver
I'm developing a document editor for macOS using AppKit, which supports structured content such as titles and multiple heading levels—similar to what you see in the Pages app. I'm looking for a way to programmatically mark a specific substring within an NSTextView as a heading, so that VoiceOver can recognize it and announce it appropriately (e.g., by saying “heading” before reading the text). This would be similar in spirit to how NSAccessibilityLinkTextAttribute works for links. Is there an existing accessibility text attribute or recommended approach to achieve this behavior for headings? If not, I’d appreciate any guidance or suggestions on how best to implement this in a VoiceOver-friendly way. Thank you in advance for your help! Best regards,
0 replies · 0 boosts · 100 views · May ’25
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: open the path above; “Hattori” is not listed and cannot be downloaded.
Expected: Hattori is available to download and select.
Actual: Hattori is absent from the catalog.
Regression: it was available on iOS 18.x on the same device.
0 replies · 0 boosts · 382 views · Sep ’25
VoiceOver is not respecting lang in HTML option
I have an HTML select whose options contain Spanish text. When VoiceOver reads the selected option (unopened), it switches to Spanish as expected. However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text. I have tried adding the lang attribute to both the select tag and the option tags, but neither helps. https://codepen.io/grahamfowles/pen/VYYRxMK
0 replies · 0 boosts · 137 views · May ’25
Thai input glitch after Command + Delete in macOS beta
Issue: When using the shortcut Command + Delete to clear a line of text, the next character I type in Thai unexpectedly appears as an English character, even though the input source is still set to Thai. After that, subsequent characters return to Thai as expected.

Details:
- Affected apps: Notes, Messages, and some other native apps
- Not affected: browser text fields (Safari, Chrome, etc.)
- Does not occur when using Option + Delete or just Delete
- macOS [insert beta version + build number]
- Mac model: [insert model]
- Input sources: Thai – Kedmanee, English – U.S.

Steps to reproduce:
1. Open Notes (or Messages).
2. Switch to Thai input.
3. Type a few Thai words.
4. Press Command + Delete.
5. Type again; the first character shows up in English.

Expected: the first character should remain in Thai, consistent with the active input source.
Actual: the first character shows as English, then input switches back to Thai.
0 replies · 0 boosts · 795 views · Aug ’25
Download Voices screen
System Settings => Accessibility => System Voice => the little (i) beside the pulldown => Voices => THIS SCREEN allows you to download premium voices. Is there a way to trigger this screen programmatically? Or at least a link to get my users there without having to dig through that swamp of screens?
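Not a full answer, but AppKit can at least deep-link into System Settings with the x-apple.systempreferences URL scheme; the legacy Accessibility pane identifier below has historically resolved, though whether the nested Voices/download sheet is reachable by URL appears to be undocumented:

```swift
import AppKit

// Opens System Settings at the Accessibility pane (legacy pane identifier;
// verify it still resolves on your target macOS version). Deep-linking all
// the way to the Voices download sheet is, as far as I can tell, not
// covered by any documented URL.
if let url = URL(string: "x-apple.systempreferences:com.apple.preference.universalaccess") {
    NSWorkspace.shared.open(url)
}
```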
0 replies · 1 boost · 392 views · Nov ’25
CFPrefsPlistSource Read Error with App Group Preferences
I am encountering the following issue while working with app group preferences in my Safari web extension:

Couldn't read values in CFPrefsPlistSource<0x3034e7f80> (Domain: [MyAppGroup], User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd.

I am trying to read/write shared preferences using UserDefaults with an App Group but keep running into this error. Has anyone encountered it before? How can I properly configure my app group preferences to avoid it? Any guidance would be greatly appreciated!
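For comparison, the usual app-group pattern is sketched below (the group ID is illustrative). This cfprefsd message typically indicates a configuration mismatch: the suite name must be an App Group identifier that the current process (the extension itself, not just the containing app) is actually entitled to.

```swift
import Foundation

// Shared defaults across the app and its Safari web extension. Requires the
// App Groups capability, with the SAME group identifier, on both targets.
let shared = UserDefaults(suiteName: "group.com.example.myapp")  // illustrative ID

shared?.set(true, forKey: "extensionEnabled")
let enabled = shared?.bool(forKey: "extensionEnabled") ?? false
```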
0 replies · 0 boosts · 684 views · Feb ’25
Using WebSocket for BCI Click Input in visionOS - FocusState vs. System-Level Limitations
Hi everyone, my team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with locked-in syndrome by using brain-computer interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.

Problem 1: Simulating eye tracking in the Simulator. We are testing onHover with "Send pointer to the device" under I/O > Input in the simulator, and while it mostly works (a bit laggy), we found that onHover won't function on the actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?

Problem 2: A WebSocket-triggered "click" doesn't work outside the app. We successfully use a WebSocket to send a custom signal (a "1" from the brain-signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:
- trigger interaction events outside the app using custom input (like BCI via WebSocket)?
- access system-wide click/tap simulation APIs from within visionOS apps?
- integrate this with accessibility services (like Voice Control or AssistiveTouch)?

We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods on visionOS. Thanks in advance for any insight you can provide!
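For reference, the in-app half of Problem 2 can be as small as the sketch below (the endpoint and the "1" message format are illustrative stand-ins for the BCI bridge). The out-of-app half matches the sandbox behavior described: visionOS exposes no public API for injecting taps into other processes, so system-wide synthetic input would have to come from a system accessibility feature such as Voice Control rather than from the app itself.

```swift
import Foundation

// Listens for the BCI "click" signal over a WebSocket and runs a callback.
// This can only drive actions inside the app's own process.
final class BCIClickListener {
    private var task: URLSessionWebSocketTask?

    func start(onClick: @escaping @Sendable () -> Void) {
        let url = URL(string: "ws://localhost:8765")!  // hypothetical bridge address
        task = URLSession.shared.webSocketTask(with: url)
        task?.resume()
        receive(onClick: onClick)
    }

    private func receive(onClick: @escaping @Sendable () -> Void) {
        task?.receive { [weak self] result in
            if case .success(.string("1")) = result {
                DispatchQueue.main.async { onClick() }  // trigger the in-app action
            }
            self?.receive(onClick: onClick)  // keep listening for the next signal
        }
    }
}
```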
0 replies · 0 boosts · 113 views · Apr ’25