In our application we are using a popover view with VoiceOver accessibility enabled. When the user is navigating inside the popover and reaches the last element, a right swipe at that point should dismiss the popover.
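One possible approach, sketched below under the assumption of a UIKit popover whose content view controller we control: observe VoiceOver focus-change notifications and dismiss once focus leaves the popover's view hierarchy. (The focused element is not always a UIView, so this sketch is not exhaustive.)

import UIKit

final class PopoverContentViewController: UIViewController {
    private var focusObserver: NSObjectProtocol?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // When VoiceOver focus lands on an element outside the popover
        // (e.g. after a right swipe past the last element), dismiss it.
        focusObserver = NotificationCenter.default.addObserver(
            forName: UIAccessibility.elementFocusedNotification,
            object: nil,
            queue: .main
        ) { [weak self] note in
            guard let self,
                  let focused = note.userInfo?[UIAccessibility.focusedElementUserInfoKey] as? UIView,
                  !focused.isDescendant(of: self.view)
            else { return }
            self.dismiss(animated: true)
        }
    }

    deinit {
        if let focusObserver {
            NotificationCenter.default.removeObserver(focusObserver)
        }
    }
}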
Hello, my submission is based on haptics. Without them the app doesn't make sense, and only a real iPhone can provide them. But it says that Xcode playgrounds will be tested on the Simulator.
Is that really the case? What can I do?
Thank you in advance!
I am registering my startup for an Apple Developer Account so that we can put our app on the App Store. I do not want to use my personal Apple ID's payment method to pay the $99 annual fee. How can I change the payment method so that I can continue with my enrollment?
It's very annoying, but on my iPhone 12 Pro the accessibility app keeps opening by itself with the microphone on. It shows a blank screen, and every time I close it, it just reopens. I don't know why it keeps doing this, and it drives me crazy. Does anyone know what to do? I'm also on the iOS 26 beta, but it was doing this even on the previous update.
Hi,
I am setting the accessibilityLabel and accessibilityHint properties of a UIAlertAction. However, VoiceOver is only reading the label out. Usually, the label is read out, followed by a short pause and then the hint. Is this a known issue where hints do not work for this element? I can append the hint to the label, but I'm interested to know if there's something I'm doing wrong.
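For context, a minimal sketch of the setup described (titles are hypothetical):

import UIKit

let action = UIAlertAction(title: "Delete", style: .destructive, handler: nil)
action.accessibilityLabel = "Delete draft"
action.accessibilityHint = "Removes the draft permanently" // reportedly not spoken for alert actions

// The workaround mentioned above: fold the hint into the label.
action.accessibilityLabel = "Delete draft. Removes the draft permanently."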
Regards.
I am encountering the following issue while working with app group preferences in my Safari web extension:
Couldn't read values in CFPrefsPlistSource<0x3034e7f80> (Domain: [MyAppGroup], User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd.
I am trying to read/write shared preferences using UserDefaults with an App Group but keep running into this error. Any guidance on how to resolve this would be greatly appreciated!
Has anyone encountered this before? How can I properly configure my app group preferences to avoid this issue?
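For reference, the pattern being attempted is roughly the following (the suite name is a placeholder; it must match the App Groups entitlement of both the app and the Safari web extension target):

import Foundation

// Shared preferences via an App Group suite.
let shared = UserDefaults(suiteName: "group.com.example.myapp")
shared?.set(true, forKey: "extensionEnabled")
let enabled = shared?.bool(forKey: "extensionEnabled") ?? false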
Topic: Accessibility & Inclusion
SubTopic: General
Tags: Swift Packages, Messages, Xcode, Group Activities
On iOS, there is accessibilityLanguage.
The Text view seems to automatically prevent orphaned words on screen, barring exceptional circumstances such as the font size being too large.
I couldn't find any documentation on this behaviour or on how to configure it, and I would also be very interested in how it's implemented.
Thanks!
Topic: Accessibility & Inclusion
SubTopic: General
In an iOS app I am using NEAppPushSession and, on receiving an NEAppPushDelegate notification, displaying the CallKit incoming-call screen. I am running into the following problem.
8/13
The error below appeared repeatedly in the logs.
I ran the notification-delivery test about 120 times, and this error kept appearing the whole time.
エラー 2025-08-14 11:27:06.793073 +0900 nesessionmanager NESMAppPushSession[SimplePushDefaultConfiguration:7B7218F3-94B5-4AE5-9B9E-94E176694D02] failed to report incoming call to CallKit, error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process." UserInfo={NSDebugDescription=The connection to service named com.apple.callkit.networkextension.messagecontrollerhost was invalidated from this process.}
After this error log flooded in, the CallKit incoming-call screen stopped being displayed.
However, notification monitoring still appears to start.
8/14, 15 hours later
Perhaps because some time had passed, notifications started arriving again.
But when I ran the delivery test again, the same error log appeared.
After about 120 more test deliveries, the error log flooded in again and the CallKit incoming-call screen stopped being displayed.
However, notification monitoring again appears to start.
8/15, 15 hours later
Today, no notifications could be received even after 15 hours.
However, notification monitoring still appears to start.
Restarting the iPhone, cleaning the Xcode build, and uninstalling and reinstalling the app did not bring the notifications back.
Also, strangely, devices other than the one where the issue occurred can no longer receive these notifications either.
Other notifications are received; only our own NEAppPushManager notifications cannot be received.
My questions:
What should I do to get notifications delivered again?
Is this failure caused by raising the 4099 error too many times?
What is causing the 4099 error?
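For reference, the reporting path looks roughly like this (a sketch only; the delegate wiring and the "caller" userInfo key are assumptions, not the actual project code):

import NetworkExtension
import CallKit

final class CallHandler: NSObject, NEAppPushDelegate, CXProviderDelegate {
    private let provider: CXProvider

    override init() {
        provider = CXProvider(configuration: CXProviderConfiguration())
        super.init()
        provider.setDelegate(self, queue: nil)
    }

    // Called by the system when the local push provider delivers a call message.
    func appPushManager(_ manager: NEAppPushManager,
                        didReceiveIncomingCallWithUserInfo userInfo: [AnyHashable: Any]) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic,
                                       value: userInfo["caller"] as? String ?? "Unknown")
        // A failure like the 4099 error above would surface here or in the system log.
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error { print("reportNewIncomingCall failed: \(error)") }
        }
    }

    func providerDidReset(_ provider: CXProvider) {}
}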
Topic: Accessibility & Inclusion
SubTopic: General
Tags: User Notifications, PushKit, CallKit, Push To Talk
Request: Name Recognition → Shortcut for SOS Flashlight + Vibration
Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss.
I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts — or expanded “Custom” to support names/words — this would open up far more practical, real-world alerts.
Topic: Accessibility & Inclusion
SubTopic: General
I added a view controller in the storyboard, added a table view to that view, and added a cell to the table. When I run the app and navigate to that page with VoiceOver enabled, swiping up or down with three fingers makes VoiceOver announce a sentence in English. Without changing the accessibility of the cell, how can I make the announcement for the three-finger swipe up or down be spoken in Chinese?
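One approach that might help, sketched below: supply the scroll announcement yourself by adopting UIScrollViewAccessibilityDelegate on the table view's delegate. (Whether VoiceOver uses a Chinese voice still depends on the spoken-language settings; the class name and announcement text here are placeholders.)

import UIKit

final class ListViewController: UITableViewController, UIScrollViewAccessibilityDelegate {
    // VoiceOver calls this after a three-finger scroll; return the announcement yourself.
    func accessibilityScrollStatus(for scrollView: UIScrollView) -> String? {
        guard let first = tableView.indexPathsForVisibleRows?.first else { return nil }
        return "第\(first.row + 1)行" // announced instead of the English default
    }
}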
Hello everyone,
Our community dues payment app only facilitates real-world maintenance-dues payments directly to property managers’ bank accounts. However, during testing it was likely flagged by the AI-driven review system for a metadata criterion and rejected under Guideline 3.1.1 (“Paid digital content must use IAP”).
Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:
The app is completely free
No digital content or subscriptions are sold
Dues payments are made via bank transfer or credit card directly to the manager
Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process?
Thanks!
Topic: Accessibility & Inclusion
SubTopic: General
I need to understand the different layers in the iPhone X and later OLED screens, as I am designing a hardware attachment. They seem to project letters and images from a different layer than the subpixel layer. Is this proprietary information, or is there a resource that explores them?
Topic: Accessibility & Inclusion
SubTopic: General
Please excuse me if this is obvious. I'm new to Apple development.
Is there a SwiftUI Accessibility Inspector? I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that "SwiftUI" is primarily implemented using macros and that these things are either generated or boilerplate lower-level things. But if so, then why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them?
Or... is there a demangler somewhere that will translate from these names into something this human might recognize?
I'm targeting macOS, btw, if that makes any difference.
Topic: Accessibility & Inclusion
SubTopic: General
There is no way to set the Allow Mobile Data switch. I have the latest update, but it still does not work. I noticed it when I went to another country and could not set my mobile data, and when I came back I still could not.
Topic: Accessibility & Inclusion
SubTopic: General
Hi everyone,
My team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with Locked-In Syndrome using Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.
Problem 1: Simulating Eye Tracking in Simulator
We are testing onHover with Send Pointer to Device under I/O > Input in the simulator, and while it mostly works (a bit laggy), we found that onHover won't function on the actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?
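For concreteness, a minimal sketch of the FocusState approach we mean (standard SwiftUI; whether gaze drives it on the physical device is exactly the open question):

import SwiftUI

struct GazeTargetView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        Button("Tap target") {
            // Normally triggered by the BCI "1" signal elsewhere in the app.
        }
        .focusable(true)
        .focused($isFocused)
        .scaleEffect(isFocused ? 1.1 : 1.0) // visual feedback while focused
    }
}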
Problem 2: WebSocket-triggered "Click" doesn't work outside the app
We successfully use WebSocket to send a custom signal (a "1" from the brain signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works.
We suspect this is due to sandbox restrictions or lack of system-level permissions.
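For context, the in-app trigger path we use looks roughly like this (a sketch; the URL and the "1" message format are placeholders for the actual BCI bridge):

import Foundation

final class BCISignalListener {
    private var task: URLSessionWebSocketTask?

    func connect(onTrigger: @escaping () -> Void) {
        guard let url = URL(string: "ws://localhost:8080/bci") else { return }
        task = URLSession.shared.webSocketTask(with: url)
        task?.resume()
        receive(onTrigger: onTrigger)
    }

    private func receive(onTrigger: @escaping () -> Void) {
        task?.receive { [weak self] result in
            if case .success(.string(let text)) = result, text == "1" {
                DispatchQueue.main.async(execute: onTrigger) // fire the in-app "tap"
            }
            self?.receive(onTrigger: onTrigger) // keep listening
        }
    }
}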
Is it possible in any way to:
Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
Access system-wide click/tap simulation APIs from within visionOS apps?
Integrate this with accessibility services (like Voice Control or AssistiveTouch)?
We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods in VisionOS.
Thanks in advance for any insight you can provide!
Topic: Accessibility & Inclusion
SubTopic: General
Tags: Xcode, Accessibility, iPad and iOS apps on visionOS
Should I allow the CIJSULAgent to find devices on the local network?
Description
When calling AppStore.showManageSubscriptions(in:), the system modal for managing subscriptions appears visually. However, it does not automatically receive VoiceOver focus, and in some cases VoiceOver still allows interaction with elements in the underlying view controller, such as buttons and labels. This creates confusion and violates accessibility expectations.
Steps to Reproduce
1. In a UIKit app, present the system subscription sheet via AppStore.showManageSubscriptions(in:).
2. Ensure VoiceOver is enabled on the device.
3. Observe the focus behavior when the modal appears.
4. Try swiping right/left — VoiceOver continues to announce items in the presenting view controller.
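For reference, the presentation call in step 1 is essentially the following (a sketch; the scene lookup is assumed):

import StoreKit
import UIKit

// Present the system manage-subscriptions sheet (iOS 15+).
func presentManageSubscriptions(from viewController: UIViewController) {
    guard let scene = viewController.view.window?.windowScene else { return }
    Task {
        try? await AppStore.showManageSubscriptions(in: scene)
    }
}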
Expected Result
The modal should automatically take VoiceOver focus, and all elements behind it should be non-accessible until dismissed.
Actual Result
VoiceOver continues to focus and interact with elements behind the presented modal.
Notes
• Tested on iOS 18.5
• Reproducible on device
• Using Swift/UIKit (not SwiftUI)
Topic: Accessibility & Inclusion
SubTopic: General
In SwiftUI, the date picker component fails the colour-contrast accessibility check. The code below was used to create the date picker:
struct ContentView: View {
    @State private var date = Date()
    @State private var selectedDate: Date = .init()

    var body: some View {
        let min = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        let max = Calendar.current.date(byAdding: .year, value: 4, to: Date()) ?? Date()

        DatePicker(
            "Start Date",
            selection: $date,
            in: min ... max,
            displayedComponents: [.date]
        )
        .datePickerStyle(.graphical)
        .frame(alignment: .topLeading)
        .onAppear {
            selectedDate = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        }
    }
}

#Preview {
    ContentView()
}
Attaching a screenshot of the accessibility failure.
Hello,
I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued.
I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.
My questions are:
Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?
Thank you for your guidance.
Topic: Accessibility & Inclusion
SubTopic: General