Hello all.
Currently I am trying to get WKWebView to scroll with a physical keyboard, and it just will not work. I tried allowsKeyboardScrolling() and it did not work. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access working to make our app more accessible, but WKWebView does not want to play nice.
Has anyone else had issues using WKWebView with an external keyboard, and if so, did you find any solutions? Any help is greatly appreciated!
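For context, here is roughly the setup I'm testing: a minimal sketch, assuming the allowsKeyboardScrolling mentioned above refers to UIScrollView's iOS 17+ property (the URL is just an example).

import UIKit
import WebKit

final class WebViewController: UIViewController {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)

        // Opt the web view's scroll view into hardware keyboard scrolling
        // (arrow keys, space bar, Page Up/Down). Available since iOS 17.
        if #available(iOS 17.0, *) {
            webView.scrollView.allowsKeyboardScrolling = true
        }

        webView.load(URLRequest(url: URL(string: "https://www.example.com")!))
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Key events only reach the scroll view while it has focus.
        webView.becomeFirstResponder()
    }
}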
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks.
I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap.
How can I properly observe and handle the accessibilityPerformMagicTap() action?
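For reference, a minimal sketch of what I tried in the view controller; FeedViewController and togglePlayback are just illustrative names:

import UIKit

final class FeedViewController: UIViewController {
    // VoiceOver dispatches the two-finger double tap (magic tap) to the
    // focused element first, then up through the containing view
    // controllers, and finally to the application and its delegate.
    override func accessibilityPerformMagicTap() -> Bool {
        togglePlayback()
        return true // returning true stops the search up the chain
    }

    private func togglePlayback() {
        // App-specific primary action would go here.
    }
}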
I have a parent view containing 10 subviews. To control the VoiceOver navigation order, I set only a few elements in accessibilityElements. However, the remaining elements are not being focused or are completely inaccessible.
Is this the expected behavior? If I only specify a subset of elements in accessibilityElements, does it exclude the rest? What’s the best way to ensure all elements remain accessible while customising the order?
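From the documentation as I understand it, accessibilityElements is exhaustive: once set, VoiceOver exposes only what the array contains, so subviews you leave out become unreachable. A minimal sketch that reorders without dropping anything (view names are illustrative):

import UIKit

final class ProductCardView: UIView {
    let titleLabel = UILabel()
    let priceLabel = UILabel()
    let buyButton = UIButton(type: .system)

    func configureAccessibilityOrder() {
        // List *every* focusable subview, just in the desired order;
        // anything omitted here is skipped by VoiceOver entirely.
        accessibilityElements = [titleLabel, buyButton, priceLabel]
    }
}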
Hi everyone,
After installing the macOS beta (Tahoe 26.0) on my MacBook Pro M3, I’ve noticed two issues:
1. Significant increase in system temperature: the laptop feels hot even with light usage like Safari and Figma.
2. Rapid battery drain: the battery drops unusually fast compared to macOS Sonoma.
I've tried restarting the device. I'm aware this is a beta, but just wondering:
Is anyone else experiencing this?
Is this a known issue?
Would love to hear if others are facing similar problems or if it might be something specific to my setup.
Thanks in advance!
I watched the videos and the blog post and downloaded their projects, and there Core Spotlight works as expected.
I copied the code into an empty project and did the same as they did, but it still is not working.
OS: macOS and iOS.
On the Core Data model I set up an attribute to be indexed for Spotlight, and on the object itself I put the attribute name in the display name for Spotlight.
class PersistenceController {
    static let shared = PersistenceController()

    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType
                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
In my @main App:
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}
onContinueUserActivity(CSSearchableItemActionType) { _ in
    print("")
}
never gets triggered. So what am I missing that they don't explain in the blog post or videos?
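For comparison, the ordering I've seen in Apple's sample code sets the store description options before loadPersistentStores and only creates the Spotlight delegate afterwards. A sketch of that variant of the init above (same "SpotLightSearchTest" model assumed, and I'm not certain this is the fix):

init(inMemory: Bool = false) {
    container = NSPersistentContainer(name: "SpotLightSearchTest")
    if inMemory {
        container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
    }
    if let description = container.persistentStoreDescriptions.first {
        // Description options only take effect if set *before* the store loads.
        description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
    }
    container.loadPersistentStores { _, error in
        if let error = error as NSError? {
            fatalError("Unresolved error \(error), \(error.userInfo)")
        }
    }
    if let description = container.persistentStoreDescriptions.first {
        spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
            forStoreWith: description,
            coordinator: container.persistentStoreCoordinator
        )
        spotlightDelegate?.startSpotlightIndexing()
    }
    container.viewContext.automaticallyMergesChangesFromParent = true
}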
I'm working on a BLE-connected device that uses ANCS and the system Clock to receive alarm notification events for hearing-impaired people. It used to work up to the iPhone 13 with the latest iOS 18.x. Starting with the iPhone 14 (iOS 18.x), the system Clock alarm notification is not sent anymore.
Is there any reason for this to happen?
Is anyone aware of this behaviour?
Any suggestions would be really appreciated.
Cheers
This has been an ongoing issue and continues in Tahoe. When dictating into Gmail in Safari, whole portions of sentences are duplicated, making the text a mess. I have reported this through Feedback for a couple of years, and it has never been resolved.
This issue keeps coming up again and again. I restarted my laptop and I have storage available; I don't know why this keeps happening!
I have an app that uses Nearby Interaction with a custom accessory.
It works great on iPhone 11-13.
Starting with the iPhone 14, one must use ARKit to get angles.
We have two problems:
ARKit is light-sensitive, and we do not control the lighting where this app runs. The iPhone 11-13 approach works great even in the dark. (Our users are blind; this is an accessibility app.)
ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions. If ARKit is in the foreground, our app doesn't work.
With an iPhone 15 Pro Max on iOS 18, I got an error: access denied (not permission denied). Now that I am on iOS 26, the Bluetooth scan doesn't happen at all.
It also fails the same way on an iPhone 17 on iOS 26; I can't get a callback now, as release signing is no longer done.
This same code works fine on iOS 17.1 on an iPhone 12.
Info.plist attached:
info.txt
if SearchedServices == [] {
    services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
}
logger.info(
    "scannerready, starting scan for peripherals \(services) and devices \(IDs)")
filteredIDs = IDs
scanning = true
centralManager.scanForPeripherals(withServices: services,
                                  options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
The calling code:
dataChannel.autoConnect = autoConnect
dataChannel.start(x, ids) // dataChannel.start is above
self.scanning = true
return "scanning started"
... log output
services from js = and devices= 5FE04CBB
services in implementation =
bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
⚡️ TO JS {"value":"scanning started"}
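Since the error reads like an authorization failure rather than a scan-filter problem, this is the state/authorization check I would add before scanning. A minimal sketch: CBManager.authorization and the central's state are standard CoreBluetooth API, while the logging and nil service filter are illustrative.

import CoreBluetooth

func centralManagerDidUpdateState(_ central: CBCentralManager) {
    // Scanning is only valid once the manager reports .poweredOn.
    guard central.state == .poweredOn else {
        print("central not ready, state = \(central.state.rawValue)")
        return
    }
    // .denied or .restricted here would explain an "access denied" style
    // failure: the user (or a management profile) refused Bluetooth access.
    switch CBManager.authorization {
    case .allowedAlways:
        central.scanForPeripherals(withServices: nil,
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    case .denied, .restricted:
        print("Bluetooth authorization refused; check Settings and NSBluetoothAlwaysUsageDescription")
    case .notDetermined:
        print("authorization prompt not yet shown")
    @unknown default:
        break
    }
}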
AVPlayer has 3 visual accessibility issues with videos out of the box:
The contrast fails for the current time in the video
The contrast fails for the remaining time in the video
The hit area is too small for the time slider. The WCAG AA requirement is a minimum hit size of 24 x 24. The height of the hit area of the offending region is 8.
Is there a known fix for any of these?
This can be reproduced with this code in an app playground:
import SwiftUI
import AVKit
import UIKit

struct ContentView: View {
    private let video = URL(string: "https://server15700.contentdm.oclc.org/dmwebservices/index.php?q=dmGetStreamingFile/p15700coll2/15.mp4/byte/json")!
    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayerView(player: player)
                .frame(maxWidth: .infinity, maxHeight: 200)
        }
        .task {
            player = try? await loadPlayer(video: video)
        }
    }
}

private struct VideoPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.modalPresentationStyle = .overFullScreen
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}

private func loadPlayer(video: URL) async throws -> AVPlayer {
    let videoAsset = AVURLAsset(url: video)
    let videoPlusSubtitles = AVMutableComposition()
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .video)
    try await videoPlusSubtitles.add(videoAsset, withMediaType: .audio)
    return await AVPlayer(playerItem: AVPlayerItem(asset: videoPlusSubtitles))
}

private extension AVMutableComposition {
    func add(_ asset: AVAsset, withMediaType mediaType: AVMediaType) async throws {
        let duration = try await asset.load(.duration)
        try await asset.loadTracks(withMediaType: mediaType).first.map { track in
            let newTrack = self.addMutableTrack(withMediaType: mediaType, preferredTrackID: kCMPersistentTrackID_Invalid)
            let range = CMTimeRangeMake(start: .zero, duration: duration)
            try newTrack?.insertTimeRange(range, of: track, at: .zero)
        }
    }
}
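For what it's worth, the only mitigation I know of is replacing the system transport bar with custom controls that meet the contrast and 24 x 24 hit-target requirements. A sketch of that direction, reusing the imports from the repro above; showsPlaybackControls is standard AVPlayerViewController API, the overlay itself is left to you:

private struct CustomControlsPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer?

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        // Hide the built-in controls whose time labels and 8pt-tall slider
        // fail WCAG, then overlay custom controls (not shown) that are sized
        // and colored to pass.
        controller.showsPlaybackControls = false
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        uiViewController.player = player
    }
}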
Hello,
AVSpeechSynthesisVoice has an audioFileSettings attribute:
let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in:
AVSpeechSynthesisVoice {
    ...
    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }
    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify the audioFileSettings attributes on an AVSpeechSynthesisProviderVoice? Because AVSpeechSynthesisProviderVoice has no such field:
AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
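My unconfirmed working theory: for provider voices the output format is not a per-voice property at all, but comes from the synthesis provider extension's audio unit, whose output bus advertises the PCM format. A rough sketch of that idea (the class name and wiring are assumptions, not verified API usage):

import AVFAudio

// Hypothetical provider audio unit; the format values mirror the
// audioFileSettings printed above (22050 Hz, mono, Float32).
final class MyProviderAudioUnit: AVSpeechSynthesisProviderAudioUnit {
    private var outputBusArray: AUAudioUnitBusArray!

    override var outputBusses: AUAudioUnitBusArray { outputBusArray }

    func configureOutputFormat() throws {
        let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                   sampleRate: 22050,
                                   channels: 1,
                                   interleaved: false)!
        let bus = try AUAudioUnitBus(format: format)
        outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output, busses: [bus])
    }
}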
Regards
At present on iOS, our in-house (enterprise-distributed) app may crash on iOS 18.3 and later versions, but it works normally on other phones, and the certificate is not the problem.
A total of 3 affected machines were found, and there was no pattern between the machines and the system: different models and versions.
On a machine that crashes, the same app downloaded from the store doesn't crash; but if the same app is packaged and installed directly from the development tool, it crashes. Is this related to compatibility with the new version of iOS?
Is there a solution? Have others run into the same situation?
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra-wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm if the Ultra-wide camera supports this resolution or if I’m missing any other configuration.
Any guidance would be appreciated!
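This is how I would verify what the ultra-wide format actually supports before hardcoding 5712 x 4284. A minimal sketch: supportedMaxPhotoDimensions and maxPhotoQualityPrioritization are standard AVFoundation API, the logging helper is illustrative.

import AVFoundation

func logSupportedPhotoDimensions(for device: AVCaptureDevice, output: AVCapturePhotoOutput) {
    // Each format advertises its own list; the last entry is the largest.
    for format in device.formats {
        let dims = format.supportedMaxPhotoDimensions
        print(format, dims.map { "\($0.width)x\($0.height)" })
    }
    if let best = device.activeFormat.supportedMaxPhotoDimensions.last {
        // Configure the output first, then mirror it in the per-shot settings.
        output.maxPhotoDimensions = best
        output.maxPhotoQualityPrioritization = .quality
        print("using \(best.width)x\(best.height)")
    }
}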
Voice Control Disabling System Services After Reboot
I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case—what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.
To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services—including FaceTime, Apple Music, and News—stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.
I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction—perhaps between the neural engine and the new Wi-Fi chip—rather than a purely software problem.
I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter.
Sincerely,
Donald Spencer Kirby
Dayton, Ohio
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
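A minimal sketch of the moving-element setup in question, in case it helps frame the question: overriding accessibilityFrame with UIAccessibility.convertToScreenCoordinates is standard UIKit, and the view name is illustrative.

import UIKit

final class MovingBadgeView: UIView {
    // Report the view's current on-screen position on demand by converting
    // its bounds to screen coordinates, so assistive tech reads the frame
    // as it is right now rather than a stale cached rect.
    override var accessibilityFrame: CGRect {
        get {
            UIAccessibility.convertToScreenCoordinates(bounds, in: self)
        }
        set { super.accessibilityFrame = newValue }
    }
}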
Is the accessibility feature "voice command recording" available on the Apple Vision Pro? It does not start on my device.
The Apple Vision Pro is on visionOS 26.1.
Regular single voice commands work on the Apple Vision Pro.
Recording commands worked on other devices (iPad and iPhone).
If you are on the TikTok website in Safari, you are able to view the video that originally brought you to the website from the search results. But if you click on another video listed on the website, it claims you need to use the app to go further, and upon proceeding it just brings you to the App Store regardless of whether you already have the app, and you are unable to view that video without searching for it separately in the app.
I have been working on a feature where I have a List in SwiftUI with previous and next page loading; the user can scroll up and down to load previous/next page data.
Recently, I hit an accessibility issue while testing VoiceOver: when the user lands on the listing screen and swipes across the screen from the navigation, once focus reaches the list it should highlight the first visible item.
But when the user swipes back:
Should it load the previous data and announce the previous item, or should it go back to the navigation items?
If it loads the previous item, what if the user wants to go to the navigation to switch to other actions, and vice versa?
Did anyone come across this kind of issue? What is the standard expected behavior in this case, when the list scrolls to both previous and next pages?
I tried different gestures from https://support.apple.com/en-in/guide/iphone/iph3e2e2281/ios, but it isn't working.
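One pattern I've been experimenting with for this (not sure it's the standard answer): pin VoiceOver focus to a stable anchor row when a previous page is inserted, using SwiftUI's AccessibilityFocusState. The row model and load trigger here are illustrative.

import SwiftUI

struct PagedListView: View {
    @State private var items = (0..<20).map { "Item \($0)" }
    @AccessibilityFocusState private var focusedItem: String?

    var body: some View {
        List(items, id: \.self) { item in
            Text(item)
                .accessibilityFocused($focusedItem, equals: item)
                .onAppear {
                    // Hypothetical trigger: reaching the first row loads
                    // the previous page.
                    if item == items.first { loadPreviousPage(anchor: item) }
                }
        }
    }

    private func loadPreviousPage(anchor: String) {
        let olderItems = (0..<20).map { "Older item \(items.count + $0)" }
        items.insert(contentsOf: olderItems, at: 0)
        // Re-assert focus on the row the user was on, so VoiceOver stays
        // in the list rather than jumping back to the navigation items.
        DispatchQueue.main.async { focusedItem = anchor }
    }
}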
In our application we use a search bar in a popover view, we have enabled Full Keyboard Access, and we are using an external keyboard. Currently, when focus is on the search bar, the next Tab key press dismisses the search bar; we need focus to shift to the next UI element instead.
Dear developer team,
After updating to iOS 18.3.1, I noticed the font in the Notes app became too small to read comfortably, and I already have poor eyesight.
There is no way to increase the font size. When I select my preferred text size through the Accessibility settings, it only changes the size of headings in the Notes app; the text inside the note itself remains too small. I'm using the iPhone 13.
I googled the issue, and it seems other users across the Internet are also unhappy about the inability to change the text size in Notes to a comfortable level.
I hope this issue will be addressed by the developers in the next version of iOS, because the reading size in a standard app matters for people with tired and diminished eyesight.
Kind regards,
Maria