Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under the General subtopic

Post | Replies | Boosts | Views | Activity

VoiceOver Not Scrolling to Focused TableView Cell
I have a view that is dynamically overlaid on a UITableView, with appropriate padding (it is added when certain conditions are met). When VoiceOver focuses on a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app. Please find the attached image for reference. How can I resolve this and ensure VoiceOver scrolls the focused cell into view?
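One workaround worth trying (a minimal sketch, not a confirmed fix, assuming the overlay is pinned to the bottom; overlayView and presentOverlay() are hypothetical names): give the table view a bottom content inset equal to the overlay's height, so the automatic scrolling that VoiceOver triggers can bring the focused cell clear of the overlay.

    import UIKit

    final class ListViewController: UIViewController {
        let tableView = UITableView()
        let overlayView = UIView() // hypothetical stand-in for the dynamic overlay

        // Called when the conditions for showing the overlay are met.
        func presentOverlay() {
            view.addSubview(overlayView)
            let overlayHeight = overlayView.bounds.height
            // Reserve space so rows can scroll clear of the bottom-pinned overlay.
            tableView.contentInset.bottom = overlayHeight
            tableView.verticalScrollIndicatorInsets.bottom = overlayHeight
            // Ask VoiceOver to re-evaluate the changed layout.
            UIAccessibility.post(notification: .layoutChanged, argument: nil)
        }
    }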
Replies: 1 · Boosts: 0 · Views: 177 · Apr ’25
SwiftUI tvOS Accessibility VoiceOver - prevent reading all items in ScrollView over and over
Hi, I'm trying to fix a tvOS view for the VoiceOver accessibility feature:

    TabView { // 5 tabs
        Text(title)
        Button(play)
        ScrollView { // Live
            LazyHStack { 200 items }
        }
        ScrollView { // Continue watching
            LazyHStack { 500 items }
        }
    }

When the view shows up, VoiceOver reads: "Home tab 1 of 5, Item 2". I'm not sure why it reads "Item 2" from the first cell in the scroll view; maybe because it was just loaded by the LazyHStack. VoiceOver should only read "Home tab 1 of 5".

When moving focus to the scroll view it reads "Live, Item 1" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4". When moving focus to the second item it reads "Item 2" and, after a slight delay, "Item 1, Item 2, Item 3, Item 4". The same happens for the third item. It should read only what is focused: ideally just "Live, Item 1, 1 of 200", then after moving focus to item 2 just "Item 2, 2 of 200", this time without the word "Live" because we are still in the same horizontal list.

Currently the app is unusable. We have visually impaired testers, and this reading of everything on the screen is totally confusing, because users don't know where they are and what is actually focused. This is a video streaming app and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, there is usually a never-ending live stream playing, and the user can switch TV channels while we continue to play. VoiceOver should only read what's focused after user interaction. The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting; it correctly reads only the focused item in scrolling lists.

How do I disable reading of content that is not focused? I tried:

    .accessibilityLabel(isFocused ? title : "")
    .accessibilityHidden(!isFocused)
    .accessibilityHidden(true) // tried on various levels of the view hierarchy
    .accessibilityElement(children: .ignore) // even the focused item is not read back by VoiceOver
    .accessibilityElement(children: .contain) // tried on various levels of the view hierarchy
    .accessibilityElement(children: .combine) // tried on various levels of the view hierarchy
    .accessibilityAddTraits(.isHeader) // tried on various levels of the view hierarchy
    .accessibilityRemoveTraits(.isHeader) // tried on various levels of the view hierarchy
    // the last two were basically an attempt to hack it
    .accessibilityRotor("", ranges: []) // another hack, tried on the ScrollView, the LazyHStack, and the top-level view

...plus 50+ other attempts at configuring accessibility modifiers on views. I have seen all the accessibility videos and tried all the sample code projects; I haven't found a solution anywhere, internet search didn't turn up anything, and AI didn't help, as it can only provide code that someone else wrote before. Any idea how to fix this? Thanks.
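A structural pattern that sometimes helps (a sketch under assumptions, not a verified fix for this report): collapse each card into a single accessibility element with an explicit label. That differs from the bare .accessibilityElement(children: .ignore) attempt above, which removes the children without supplying a replacement label. MediaItem and LiveRow are hypothetical names.

    import SwiftUI

    struct MediaItem: Identifiable {
        let id: Int
        let title: String
    }

    struct LiveRow: View {
        let items: [MediaItem]

        var body: some View {
            ScrollView(.horizontal) {
                LazyHStack {
                    ForEach(items) { item in
                        Button(item.title) { /* play */ }
                            // One element per card: children are ignored,
                            // but an explicit label replaces them.
                            .accessibilityElement(children: .ignore)
                            .accessibilityLabel(item.title)
                    }
                }
            }
            // Spoken once when focus enters the row.
            .accessibilityLabel("Live")
        }
    }

If the delayed re-reading of every item persists even with this structure, it may be a tvOS/LazyHStack bug worth filing with a small focused sample.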
Replies: 1 · Boosts: 0 · Views: 139 · Apr ’25
How to set an accessibility label on an NSTextAttachment?
I have the following method to insert @mentions into a text field:

    func insertMention(user: Token, at range: NSRange) {
        let tokenImage: UIImage = renderMentionToken(text: "@\(user.username)")
        let attachment = NSTextAttachment()
        attachment.image = tokenImage
        attachment.bounds = CGRect(x: 0, y: -3, width: tokenImage.size.width, height: tokenImage.size.height)
        attachment.accessibilityLabel = user.username
        attachment.accessibilityHint = "Mention of \(user.username)"

        let attachmentString = NSMutableAttributedString(attributedString: NSAttributedString(attachment: attachment))
        attachmentString.addAttribute(.TokenID, value: user.id, range: NSRange(location: 0, length: 1))
        attachmentString.addAttribute(.Tokenname, value: user.username, range: NSRange(location: 0, length: 1))

        let mutableText = NSMutableAttributedString(attributedString: textView.attributedText)
        mutableText.replaceCharacters(in: range, with: attachmentString)
        mutableText.append(NSAttributedString(string: " "))
        textView.attributedText = mutableText
        textView.selectedRange = NSRange(location: range.location + 2, length: 0)

        mentionRange = nil
        tableView.isHidden = true
    }

When I use Xcode's Accessibility Inspector to inspect the text input, the inserted token is not read by the inspector; instead, a whitespace is shown for the token. I want to set the accessibility label to the string content of the NSTextAttachment. How?
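A text view's accessibility value comes from its plain-text contents, and an attachment is backed by the object-replacement character U+FFFC, which is why the inspector reads whitespace. One possible workaround (a sketch, untested against this exact setup): subclass UITextView and substitute each attachment's accessibilityLabel when building the accessibility value.

    import UIKit

    final class MentionTextView: UITextView {
        override var accessibilityValue: String? {
            get {
                guard let attributed = attributedText else { return super.accessibilityValue }
                let spoken = NSMutableString()
                let fullRange = NSRange(location: 0, length: attributed.length)
                attributed.enumerateAttributes(in: fullRange) { attributes, range, _ in
                    if let attachment = attributes[.attachment] as? NSTextAttachment {
                        // Replace the U+FFFC placeholder with the token's label.
                        spoken.append(attachment.accessibilityLabel ?? "")
                    } else {
                        spoken.append(attributed.attributedSubstring(from: range).string)
                    }
                }
                return spoken as String
            }
            set { super.accessibilityValue = newValue }
        }
    }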
Replies: 1 · Boosts: 1 · Views: 868 · Jul ’25
visionOS - Gamepad steals focus
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed, and custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is prevented from running: when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, the system's focus interception is stealing my A-press events. Is there a way to disable the built-in gamepad interaction and allow only my custom gamepad mappings?
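For reference, the plain GameController wiring looks like the sketch below. Whether this handler fires while the system focus engine claims the press is exactly the open question, and I'm not aware of a supported switch to disable the built-in focus interaction; GamepadMonitor is a hypothetical name.

    import GameController

    final class GamepadMonitor {
        private var connectionToken: NSObjectProtocol?

        init() {
            connectionToken = NotificationCenter.default.addObserver(
                forName: .GCControllerDidConnect,
                object: nil,
                queue: .main
            ) { notification in
                guard let controller = notification.object as? GCController,
                      let gamepad = controller.extendedGamepad else { return }
                // Custom mapping: A locks the ROV's depth.
                gamepad.buttonA.pressedChangedHandler = { _, _, isPressed in
                    if isPressed {
                        print("A pressed: lock depth")
                    }
                }
            }
        }
    }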
Replies: 1 · Boosts: 0 · Views: 211 · Apr ’25
Add VoiceOver touch gesture guidance for frame iframe in webView and Safari web
Please update the Accessibility OS settings for VoiceOver on iOS and iPadOS to include frames on the Rotor, and make web-navigation component gestures easier to find and assign. Please also add content to the iPhone and iPad user guides on using VoiceOver for web navigation with touch gestures, specifically for iframes.

There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS on how to access iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and in internet searches, is that iframes in Safari or in an in-app webView are only reachable with explore-by-touch. If explore-by-touch is the only option for some interactions, that needs to be stated in the Apple user guides. If not, the touch-gesture equivalents of the keyboard interactions VoiceOver has on the Mac need to be made clear to users.

VoiceOver for Mac includes a default keyboard interaction, VO-Command-F, in its extensive user guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac), and a user can add a rotor option for navigating the web by frame. VoiceOver for iPhone and iPad includes no default swipe gesture assigned to frames, and the option is not available for the Rotor. While the iPhone user guide notes that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that the command to assign, "Move to the next frame", is tucked into the advanced navigation commands in the VoiceOver accessibility settings. At least on my phone, the word "frame" was not searchable, despite the All Commands screen having a search bar.
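This is ultimately a request for an OS change, but an app embedding web content can expose its own rotor entry today. A minimal sketch, assuming the app has already collected one accessibility element per frame region (frameElements is a hypothetical array):

    import UIKit

    func makeFramesRotor(frameElements: [NSObject]) -> UIAccessibilityCustomRotor {
        UIAccessibilityCustomRotor(name: "Frames") { predicate in
            guard !frameElements.isEmpty else { return nil }
            // Find where we are, then step forward or back through the frames.
            let current = predicate.currentItem.targetElement as? NSObject
            let index = current.flatMap { frameElements.firstIndex(of: $0) } ?? -1
            let next: Int
            switch predicate.searchDirection {
            case .next: next = index + 1
            default: next = index - 1
            }
            guard frameElements.indices.contains(next) else { return nil }
            return UIAccessibilityCustomRotorItemResult(
                targetElement: frameElements[next],
                targetRange: nil
            )
        }
    }

    // Attach to the container view:
    // webViewContainer.accessibilityCustomRotors = [makeFramesRotor(frameElements: frames)]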
Replies: 1 · Boosts: 0 · Views: 146 · Apr ’25
VoiceOver for Accessibility Labels with Localization
Hello! I'm adding VoiceOver support to my app, but I'm having an issue where my accessibility value is not being spoken. I have made a helper class that creates an NSString from a double, formatted as the user's regional currency.

CurrencyFormatter.m:

    + (NSString *)localizedCurrencyStringFromDouble:(double)value {
        NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
        formatter.numberStyle = NSNumberFormatterCurrencyStyle;
        formatter.locale = [NSLocale currentLocale];
        NSString *currencyString = [formatter stringFromNumber:@(value)];
        [formatter release];
        return currencyString;
    }

View controller:

    self.checkTotalLabel.accessibilityLabel = NSLocalizedString(@"Total Amount", @"Accessibility Label for Total");
    self.checkTotalLabel.accessibilityValue = [CurrencyFormatter localizedCurrencyStringFromDouble:total];

I'm confused about whether the value should go into the accessibility label or not. When the currency is just USD and the language is English, it's a simple fix; but when the currency needs to be converted, I'm not sure where to go from here. If anyone has any guidance, it would help me a lot. Thank you!
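The split shown is the conventional one: the label says what the element is and stays static and localized, while the value carries the formatted amount; VoiceOver speaks them together, for example "Total Amount, $12.50". A Swift rendering of the same idea (names follow the poster's, translated):

    import UIKit

    func configureTotalLabel(_ label: UILabel, total: Double) {
        let formatter = NumberFormatter()
        formatter.numberStyle = .currency
        formatter.locale = .current // symbol and grouping follow the user's region
        label.accessibilityLabel = NSLocalizedString("Total Amount",
                                                     comment: "Accessibility label for total")
        // The formatted amount belongs in the value, not the label.
        label.accessibilityValue = formatter.string(from: NSNumber(value: total))
    }

No manual conversion is needed for speech: in most regions VoiceOver speaks the formatted value string appropriately for the user's language.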
Replies: 1 · Boosts: 0 · Views: 759 · Jul ’25
Please consider making Name Recognition available as a Shortcuts automation trigger
Request: Name Recognition → Shortcut for SOS flashlight + vibration. Right now, iOS Name Recognition works, but all it can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut; that way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss. I tried using a custom alarm, but it won't let me record my spoken name, so it doesn't really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts, or expanded "Custom" to support names and words, it would open up far more practical, real-world alerts.
Replies: 1 · Boosts: 0 · Views: 606 · Sep ’25
Siri misreads local currency in notifications (Bug reported, still unresolved)
I'm experiencing an issue where Siri incorrectly announces currency values in notifications: instead of reading the local currency, it always reads amounts as US dollars.

Issue details: My iPhone is set to Region: Chile and Language: Spanish (Chile). In Chile, the currency symbol $ represents Chilean pesos (CLP), not US dollars. A notification with the text

    let content = UNMutableNotificationContent()
    content.body = "¡Has recibido un pago por $5.000!"

is read aloud by Siri as "¡Has recibido un pago por 5.000 dólares!" (English: "You have received a payment of five thousand dollars!") instead of the correct "¡Has recibido un pago por 5.000 pesos!" (English: "You have received a payment of five thousand pesos!").

Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177

This incorrect behavior is not limited to iOS notifications; it also occurs on watchOS, iPadOS, and macOS, where Siri misreads currency values in various system interactions. Siri's currency handling misinterprets $ as USD even when the device is set to a region where $ represents a different currency. Announce Notifications on AirPods exhibits the same issue, making it confusing when Siri announces transaction amounts. Apple Intelligence interactions are also affected; for example, asking Siri to read my latest emails when one of them contains a monetary value results in Siri misreading the currency.

I have submitted a bug report via Feedback Assistant; the Feedback ID is FB16561348. This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars. Has anyone found a workaround, or is there any update from Apple on this?
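A possible mitigation while the bug stands (a sketch, not a fix for Siri itself): avoid the bare "$" in text that will be spoken aloud and emit an explicit currency code via NumberFormatter, producing "CLP 5.000" instead of "$5.000". The currencySymbol override below is a deliberate, hypothetical choice, and makePaymentBody is a hypothetical helper.

    import Foundation
    import UserNotifications

    func makePaymentBody(amount: Double) -> String {
        let formatter = NumberFormatter()
        formatter.numberStyle = .currency
        formatter.locale = Locale(identifier: "es_CL")
        formatter.currencyCode = "CLP"
        formatter.currencySymbol = "CLP " // force an unambiguous code instead of "$"
        let text = formatter.string(from: NSNumber(value: amount)) ?? "\(amount) CLP"
        return "¡Has recibido un pago por \(text)!"
    }

    // Usage:
    // let content = UNMutableNotificationContent()
    // content.body = makePaymentBody(amount: 5000)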
Replies: 1 · Boosts: 1 · Views: 692 · Feb ’25
SwiftUI Accessibility Inspector?
Please excuse me if this is obvious; I'm new to Apple development. Is there a SwiftUI-specific Accessibility Inspector? I run the standard one, in Xcode 26 beta 3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that SwiftUI is implemented largely with macros and that these elements are either generated or lower-level boilerplate. But if so, why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them? Or is there a demangler somewhere that will translate these names into something this human might recognize? I'm targeting macOS, by the way, if that makes any difference.
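There is no separate SwiftUI inspector; the standard Accessibility Inspector shows the platform elements SwiftUI generates, which is why the names look mangled. What you can do from SwiftUI is label and identify your own elements so the rows that are yours become recognizable; warnings on purely generated internals may not be actionable from your code. A small sketch with hypothetical names:

    import SwiftUI

    struct SettingsToggle: View {
        @State private var isSyncEnabled = false

        var body: some View {
            Toggle("Enable sync", isOn: $isSyncEnabled)
                // An explicit label answers most "missing label" warnings...
                .accessibilityLabel("Enable sync")
                // ...and an identifier makes the element easy to spot by name
                // in the Accessibility Inspector.
                .accessibilityIdentifier("settings.enableSync")
        }
    }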
Replies: 1 · Boosts: 0 · Views: 1.3k · Jul ’25
Speak Selection broken with SwiftUI text
I have users who need to be able to hear the content of SwiftUI Text views, so I have applied the .textSelection(.enabled) modifier to them. Adding this modifier causes a "Copy" option to appear on long press, but it doesn't enable visible selection of the text, nor does it provide the "Speak" menu item that UIKit offers on text selection. Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that a related accessibility feature does work: enabling the Speech Controller button for "Speak Screen". Do I need to tell my users that "Speak Selection" won't work with my app and that they need to use the Speech Controller instead? Or is there something else I can do in my SwiftUI code to get that feature to work?
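One fallback to consider while this is unresolved (a sketch, assuming a custom menu action is acceptable UX): speak the text directly with AVSpeechSynthesizer from a context-menu item, independent of the Speak Selection setting. SpeakableText is a hypothetical name.

    import SwiftUI
    import AVFoundation

    struct SpeakableText: View {
        let text: String
        // Shared so the synthesizer outlives any single view update.
        static let synthesizer = AVSpeechSynthesizer()

        var body: some View {
            Text(text)
                .contextMenu {
                    Button("Speak") {
                        Self.synthesizer.speak(AVSpeechUtterance(string: text))
                    }
                }
        }
    }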
Replies: 1 · Boosts: 0 · Views: 239 · Jul ’25
Speak Screen gesture not working
I am testing the accessibility feature in the Settings app called "Speak Screen". The help text states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this to work: every time I try the two-finger swipe down, it behaves the same as a one-finger swipe down, which usually manifests as making scroll views bounce. I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I trigger Speak Screen from the Speech Controller button, the screen content is spoken as expected, so I know the feature is enabled; it's just the gesture that doesn't work. Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
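For the app-side concern at the end: an app can at least verify programmatically that Speak Screen is turned on before documenting the gesture for users, though this cannot confirm that the gesture itself is working. A small sketch using the public status API:

    import UIKit

    // True when Settings > Accessibility > Spoken Content > Speak Screen is on.
    func speakScreenIsOn() -> Bool {
        UIAccessibility.isSpeakScreenEnabled
    }

    // Observe toggling while the app runs; keep the returned token alive.
    func observeSpeakScreen() -> NSObjectProtocol {
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.speakScreenStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("Speak Screen enabled:", UIAccessibility.isSpeakScreenEnabled)
        }
    }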
Replies: 1 · Boosts: 0 · Views: 209 · Jul ’25
VoiceOver doesn't work well for accessing PDFs/forms with tables
I have been working to remediate PDFs for a client. The documents and forms have many tables. When I correctly tag a table using Foxit Editor Pro, it works beautifully on a PC read with NVDA. On a Mac, using VoiceOver, the same table isn't accessible, whether I read it in Adobe Acrobat, Foxit, or Preview: the reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data. The documents have essentially the same tagging semantics behind them as tables on the web. Why do they perform so well on a PC with NVDA but so poorly with VoiceOver on the Mac? I am a quality assurance specialist; I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac? As a Mac user, it frustrates me that I can't use my preferred system for checking whether documents are accessible, because VoiceOver doesn't handle them well. I actually have to recommend to my clients and their customers that they use a PC with NVDA or JAWS to get all the information from these documents; unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality, but this and other issues with VoiceOver seem to be a large gap in Apple's offerings and functionality. I would appreciate a human response to the original email I sent about this on 7/30/2025.
Replies: 1 · Boosts: 0 · Views: 137 · Jul ’25
Unexpected behaviour of hardware keyboard focus in UITests
Hello! I was faced with unexpected behavior of hardware keyboard focus in UITests.

A clear description of the problem: When running UITests on the iOS Simulator with both "Full Keyboard Access" and "Connect Hardware Keyboard" enabled, there is a noticeable delay between keyboard actions that manage focus (like pressing Tab or the arrow keys). The delay seems to increase with repeated input, which suggests that events are being queued instead of processed immediately; the last reproduction step below shows why I make that assumption.

Steps to reproduce:
1. Launch the iOS Simulator.
2. Enable both "Full Keyboard Access" and "Connect Hardware Keyboard" in the Simulator settings.
3. Run a UITest on a target application (ideally an endless or long-running test).
4. Once the app is launched, press the Tab key several times and observe the delay in focus movement.
5. Optionally, press the Tab or arrow keys rapidly, then stop the UITest. After stopping, you'll see a burst of rapid focus changes.

Expected results: Keyboard actions (like Tab) are handled immediately and the UI focus updates smoothly during UITests.

Actual results: There was a 4-10 second (and longer) delay between pressing keys and seeing a response, and all queued keyboard events (used for managing focus) are performed at once after stopping the UITest.

Versions: Xcode 16.3 (16E140); Simulator: iPhone 16 Pro (iOS 18.4 and 18.1); Simulator: iPad Pro 11-inch (M4) (iPadOS 17.5)
Replies: 1 · Boosts: 2 · Views: 225 · Apr ’25