Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post

Replies

Boosts

Views

Activity

System clock ANCS notification is not sent on iPhone 14 and later
I'm working on a BLE-connected device that uses ANCS and the system clock to receive alarm notification events for hearing-impaired people. It used to work up to the iPhone 13 with the latest iOS 18.x. Starting with the iPhone 14 (iOS 18.x), the system clock alarm notification is no longer sent. Is there any reason for this to happen? Is anyone aware of this behaviour? Any suggestions would be really appreciated. Cheers
2
0
161
Jun ’25
Custom tab bar in SwiftUI
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or, e.g., a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused. According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element, and giving it the accessibility container type semanticGroup. In SwiftUI there is also the trait isTabBar, but that does not seem to have any impact on VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain), but that also does not seem to have any impact. So, is there any way in SwiftUI to make a button behave like a tab button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
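For reference, this is a minimal sketch of the manual workaround I'm considering in the meantime (the view and tab titles are made up for illustration): it keeps the isTabBar and .contain modifiers I already tried, flags the selected button with .isSelected, and announces the position explicitly via accessibilityValue, since the automatic "tab one of two" phrasing doesn't seem to come for free.

import SwiftUI

struct CustomTabBar: View {
    @Binding var selection: Int
    let titles = ["First", "Second"]   // hypothetical tab titles

    var body: some View {
        HStack {
            ForEach(titles.indices, id: \.self) { index in
                Button(titles[index]) {
                    selection = index
                }
                // Announce the position manually, e.g. "Tab 1 of 2".
                .accessibilityValue("Tab \(index + 1) of \(titles.count)")
                // Mark the current tab so VoiceOver says "selected".
                .accessibilityAddTraits(selection == index ? .isSelected : [])
            }
        }
        // These two are what I expected to do the work automatically,
        // but they don't seem to change the VoiceOver output on their own.
        .accessibilityElement(children: .contain)
        .accessibilityAddTraits(.isTabBar)
    }
}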
2
0
325
Aug ’25
Programmatically Modifying Per-Application or System-Wide Color Filters Using Cocoa/Swift on macOS?
I'm looking into how to programmatically control color filters in the Accessibility settings under "System Settings" -> "Accessibility" -> "Color Filters", in particular the "Intensity" and "Filter type" settings. As far as I have gathered, changing this setting can only be accomplished using the CoreGraphics APIs or Accessibility APIs (I've poked around GitHub, Stack Overflow, and queried some LLMs), but there doesn't seem to be a clear-cut example of doing this with public-facing APIs, without ripping off source code from another project wholesale or using private APIs. My goal is to overlay a color filter at either a per-application or system level to help with accessibility. If there's a way to overlay this capability on an application-by-application basis as a third-party developer, that would be the most ideal scenario. For example, modifying the look and feel/UX for Launchpad, Photos, etc., as a third-party developer without accessing the source code of the application that I'm modifying the look/feel for (with appropriate user consent, of course).
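In case it helps frame the question: the closest I've gotten without private APIs is drawing my own tinted overlay window rather than touching the system Color Filters setting at all. This is only a sketch of that fallback idea; the window level, tint color, and alpha are arbitrary choices of mine, and varying alpha only approximates an "intensity" control rather than reproducing Apple's real filter algorithms.

import AppKit

// A borderless, click-through window that tints everything beneath it.
func makeTintOverlay(on screen: NSScreen,
                     color: NSColor,
                     intensity: CGFloat) -> NSWindow {
    let window = NSWindow(contentRect: screen.frame,
                          styleMask: .borderless,
                          backing: .buffered,
                          defer: false)
    window.isOpaque = false
    window.hasShadow = false
    window.ignoresMouseEvents = true                  // clicks pass through
    window.level = .screenSaver                       // sit above normal app windows
    window.collectionBehavior = [.canJoinAllSpaces, .stationary]
    window.backgroundColor = color.withAlphaComponent(intensity)
    return window
}

// Usage sketch: one overlay per screen, ordered front from the app delegate.
// let overlay = makeTintOverlay(on: NSScreen.main!, color: .systemYellow, intensity: 0.15)
// overlay.orderFrontRegardless()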
0
0
375
Jul ’25
I cannot pair my PHONAK hearing aids
I cannot pair my Phonak hearing aids after upgrading to iOS 18.3. Under Hearing Devices, it just keeps searching and will not find anything. I've deleted and re-added them four times and uninstalled the app. Under Bluetooth, the phone finds them and lists them, but under Hearing Devices, where I can control everything, it will not. Once I add the hearing aids, my phone goes completely silent, and I do not know if anyone calls or texts me.
4
0
512
Jan ’25
USB-C cable for Galaxy Watch 7 not recognized by MacBook Pro M4
I get it, "Why don't you just get an Apple Watch?" Regardless, my MacBook Pro M4 doesn't recognize my charging cable in any of its USB-C ports. The cable works with any other power supply I plug it into, but the MacBook doesn't even register that a cable is connected.
-- Running the Sequoia 15.4 beta, thinking it may have been software related. No change.
-- Settings > Privacy & Security > Accessories: changed between all available options. No change.
-- Option + Apple logo > System Information > Thunderbolt/USB4: none of the ports show that the cable is connected.
-- Any other USB-C cable is recognized in any of the ports on the MacBook, just not the cable for the Galaxy Watch.
Again, the cable charges from ANY other USB-C port on ANY other device I connect it to. Am I missing something? Or is this an intended jab at Samsung from Apple? lol
1
0
724
Mar ’25
Safari to app relationship
If you are on the TikTok website in Safari, you are able to view the video that originally brought you to the website from the search log. But if you click on another video listed on the website, it claims you need to use the app to go further, and upon proceeding it just brings you to the App Store, regardless of whether you already have the app or not. You are then unable to view the video displayed on the website without searching for it separately in the app.
2
0
406
Feb ’25
macOS (Tahoe) Beta 26.0 — Overheating & Rapid Battery Drain on MacBook Pro M3
Hi everyone, after installing the macOS Tahoe 26.0 beta on my MacBook Pro M3, I've noticed two issues:
1. Significant increase in system temperature: the laptop feels hot even with light usage like Safari and Figma.
2. Rapid battery drain: the battery is dropping unusually fast compared to macOS Sonoma.
I've tried restarting the device. I'm aware this is a beta, but just wondering: is anyone else experiencing this? Is it a known issue? Would love to hear if others are facing similar problems or if it might be something specific to my setup. Thanks in advance!
3
0
480
Jun ’25
Programmatically force VoiceOver to read parentheses for math expressions
How can I force VoiceOver to read parentheses for math expressions like this: Text("(2+3)×4") // VoiceOver: Two plus three, times four I’m looking for a way to have VoiceOver announce parentheses (e.g. “left paren”, “right paren”) without relying on NumberFormatter.Style.spellOut or .speechAlwaysIncludesPunctuation(), as both have drawbacks. Using .spellOut breaks braille output and Rotor › Characters menu by turning numbers and symbols into words. And .speechAlwaysIncludesPunctuation() makes VoiceOver overly verbose—for example, it reads “21” as “twenty hyphen one.” Is there a better way to selectively announce specific punctuation like parentheses while keeping numbers and symbols intact for braille and Rotor use?
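One direction I'm experimenting with, in case it helps frame the question: building the Text from an AttributedString and setting the speech-punctuation attribute only on the parenthesis characters. This is just a sketch under two assumptions of mine: that accessibilitySpeechIncludesPunctuation is the AttributedString counterpart of UIKit's accessibilitySpeechPunctuation key, and that SwiftUI's Text actually forwards per-range accessibility speech attributes to VoiceOver, which is exactly the part I haven't been able to confirm.

import SwiftUI

struct ExpressionText: View {
    var body: some View {
        // Hypothetical wrapper; "(2+3)×4" is the example from above.
        Text(spokenExpression("(2+3)×4"))
    }

    // Build the string character by character and attach the
    // "include punctuation when speaking" attribute only to the
    // parentheses, leaving digits and operators untouched so braille
    // output and the Characters rotor keep the real symbols.
    private func spokenExpression(_ source: String) -> AttributedString {
        var result = AttributedString()
        for character in source {
            var piece = AttributedString(String(character))
            if character == "(" || character == ")" {
                piece.accessibilitySpeechIncludesPunctuation = true
            }
            result.append(piece)
        }
        return result
    }
}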
2
0
371
Jul ’25
Not able to receive silent pushes in background
I've developed the Pro Talkie app, a walkie-talkie solution designed to keep you connected with family and friends.
App Store: https://apps.apple.com/in/app/pro-talkie/id6742051063
Play Store: https://play.google.com/store/apps/details?id=com.protalkie.app
While the app works flawlessly on Android and in the foreground on iOS, I'm facing issues with establishing connections when the app is in the background or terminated on iOS. Specifically, I've attempted the following:
- Silent pushes and alert payloads: these are intended to wake the app in the background, but they often fail; notifications may not be received or can be delayed by 20–30 minutes, leading to a poor user experience.
- VoIP pushes: these reliably wake the app, but they trigger the incoming call UI, which isn't suitable for a walkie-talkie app that should connect directly without displaying a call screen.
I've enabled all the necessary background modes (audio, remote notifications, VoIP, background fetch, processing), but the challenge remains. How can I ensure a consistent background connection on iOS without triggering the call UI?
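For reference, this is a simplified sketch of the silent-push path I'm describing, not my actual app code; the payload and headers shown in the comments are the values I understand Apple requires for background pushes.

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        application.registerForRemoteNotifications()
        return true
    }

    // Called for silent pushes, i.e. a payload like:
    //   { "aps": { "content-available": 1 } }
    // sent with the headers apns-push-type: background and apns-priority: 5.
    // Delivery of these pushes is throttled by the system, which matches the
    // delays I'm seeing.
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Kick off the connection attempt, then report the result promptly;
        // the system grants roughly 30 seconds of background runtime here.
        completionHandler(.newData)
    }
}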
1
0
439
Mar ’25
VoiceOver doesn't work for AVRoutePickerView wrapped in UIViewRepresentable
Hi, I've wrapped AVRoutePickerView in SwiftUI using pretty much the code given here, with a few changes:

func makeUIView(context: Context) -> UIView {
    let routePickerView = AVRoutePickerView()

    // Configure the button's color.
    //routePickerView.delegate = context.coordinator
    //routePickerView.backgroundColor = .secondarySystemBackground
    routePickerView.tintColor = .accent
    routePickerView.activeTintColor = .accent

    // Indicate whether your app prefers video content.
    routePickerView.prioritizesVideoDevices = false

    return routePickerView
}

I commented out routePickerView.delegate = context.coordinator because it doesn't compile; context.coordinator is of type Void and I'm not sure how to fix that. I'm not sure if that has anything to do with the issue. Anyway, this works fine without VoiceOver; if I tap the button, I get the AirPlay popover. But in VoiceOver, if I select the button and double-tap, nothing happens… it just reads the button's accessibilityLabel again. How can I get the AirPlay popover to show in VoiceOver?
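As a side note on the compile error I mentioned: context.coordinator is Void because the representable doesn't declare a Coordinator. This is a minimal sketch of what I think is missing; the struct name, the Coordinator type, and the delegate method are my guesses at what the sample intended, not code taken from it.

import SwiftUI
import AVKit

struct RoutePickerRepresentable: UIViewRepresentable {
    func makeCoordinator() -> Coordinator {
        Coordinator()
    }

    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        // Compiles now: context.coordinator is Coordinator, not Void.
        routePickerView.delegate = context.coordinator
        routePickerView.prioritizesVideoDevices = false
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {}

    // The delegate methods are optional; this just logs when the picker presents.
    final class Coordinator: NSObject, AVRoutePickerViewDelegate {
        func routePickerViewWillBeginPresentingRoutes(_ routePickerView: AVRoutePickerView) {
            print("Route picker will present")
        }
    }
}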
3
0
385
Aug ’25
I can’t create an iTunes Connect account.
I learned that I need to create an iTunes Connect account to publish my book translations on Apple Books, following the instructions on the Apple Support page. I was then directed to the iTunes Connect website. Despite trying multiple Apple accounts with different credit cards on my Mac, iPad, and iPhone, I kept getting the error “This Apple Account does not have a valid credit card on file.” In the end, I began to wonder if iTunes Connect is unavailable in Turkey. What do I need to do to publish content on Apple Books from Turkey? Do I need to obtain a developer account, or is this service not available in Turkey? The Apple customer service representative I contacted in Turkey said they didn’t have information on the matter and directed me here.
1
0
466
Mar ’25
Exploring VoiceOver Accessibility for UITableView
I'm currently exploring VoiceOver accessibility in iOS and looking for the best way to reduce the number of swipes required to navigate a UITableView. I've come across a couple of potential solutions but am unsure which is preferred.

Solution 1: Grouping Subviews in Each Cell
- Combine all subviews inside a UITableViewCell into a single accessibility element.
- Provide a concise and meaningful accessibilityLabel.
- Use custom actions (UIAccessibilityCustomAction) or accessibilityActivationPoint to handle interactions on specific elements within the cell.

Solution 2: Using UIAccessibilityContainerDataTableCell & UIAccessibilityContainerDataTable
- Implement UIAccessibilityContainerDataTable for structured table navigation.
- Make each cell conform to UIAccessibilityContainerDataTableCell, defining its row and column positions.
However, I'm finding this approach a bit complex, and I need guidance on properly implementing these protocols. Additionally, in my case, VoiceOver is not navigating to Section 2, and I'm not sure why.

Questions:
- Which of these approaches is generally preferred for better VoiceOver navigation?
- How do I properly implement UIAccessibilityContainerDataTable so that all sections and rows are navigable?
- Any best practices or alternative recommendations?

Would really appreciate any insights or guidance!
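To make Solution 1 concrete, here is a minimal sketch of the kind of cell I have in mind; the cell class, labels, and the "Delete" action are illustrative rather than from a real project. The cell becomes a single accessibility element with one label and a custom action, so VoiceOver users swipe once per row instead of once per subview.

import UIKit

final class ContactCell: UITableViewCell {
    let nameLabel = UILabel()
    let detailLabel = UILabel()
    // Layout of nameLabel/detailLabel is omitted for brevity.

    func configure(name: String, detail: String, onDelete: @escaping () -> Void) {
        nameLabel.text = name
        detailLabel.text = detail

        // Solution 1: collapse the cell's subviews into one accessibility element
        // with a single, concise label.
        isAccessibilityElement = true
        accessibilityLabel = "\(name), \(detail)"

        // Expose secondary interactions through the actions rotor instead of
        // separate focusable elements.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Delete") { _ in
                onDelete()
                return true
            }
        ]
    }
}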
3
0
682
Mar ’25
AXIsProcessTrusted returns true, but AXUIElementCopyAttributeValue fails with .cannotComplete
This was working a few days ago, but it has since stopped and I can't figure out why. I've tried resetting TCC, double-checking my entitlements, restarting, deleting and rebuilding, and nothing works. My app is a sandboxed macOS SwiftUI LSUIElement app that, when invoked, checks to see if the frontmost process is Terminal, then tries to get the frontmost window's title.

func getFrontmostWindowTitle() throws -> String? {
    let trusted = AXIsProcessTrusted()
    print("getFrontmostWindowTitle AX trusted: \(trusted)")

    guard let app = NSWorkspace.shared.frontmostApplication else { return nil }
    let appElement = AXUIElementCreateApplication(app.processIdentifier)

    var focusedWindow: AnyObject?
    let status = AXUIElementCopyAttributeValue(appElement, kAXFocusedWindowAttribute as CFString, &focusedWindow)
    guard status == .success, let window = focusedWindow else {
        if status == .cannotComplete {
            throw Errors.needAccessibilityPermission
        }
        return nil
    }

    var title: AnyObject?
    let titleStatus = AXUIElementCopyAttributeValue(window as! AXUIElement, kAXTitleAttribute as CFString, &title)
    guard titleStatus == .success else { return nil }
    return title as? String
}

I recently renamed the app, but the Bundle ID has not yet changed. I have com.apple.security.accessibility set to YES in the Entitlements file (although I had to add it manually), and an NSAccessibilityUsageDescription string set in Info.plist. The first time I ran this, macOS nicely prompted for permission. Now it won't do that, even when I use AXIsProcessTrustedWithOptions() to try to force it. If I use tccutil to reset accessibility and apple events, it still doesn't prompt. If I drag my app from the build products folder to System Settings, it gets added to the system TCC DB (not the user DB). It shows an auth value of 2 for my app:

% sudo sqlite3 "/Library/Application Support/com.apple.TCC/TCC.db" "SELECT client,auth_value FROM access WHERE service='kTCCServiceAccessibility' OR service='kTCCServiceAppleEvents';"
com.latencyzero.<redacted>|2
<redactd>

I'm at a loss as to what went wrong. I proved out the concept earlier and it worked, and have since spent a lot of time enhancing and polishing the app, and now things aren't working and I'm starting to worry.
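For anyone comparing notes, this is the shape of the AXIsProcessTrustedWithOptions() call I mentioned; a minimal sketch, assuming the goal is simply to re-trigger the system prompt, which in my experience only ever appears once per TCC record and is part of what confuses me here.

import ApplicationServices

// Ask the system to show the Accessibility permission prompt if the app
// isn't already trusted. Returns the current trust state either way.
func promptForAccessibilityIfNeeded() -> Bool {
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    return AXIsProcessTrustedWithOptions(options)
}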
4
0
1k
Jul ’25
How to Receive Callbacks for UIAccessibilityAction Methods Like accessibilityPerformMagicTap()?
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks. I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap. How can I properly observe and handle the accessibilityPerformMagicTap() action?
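For concreteness, this is the kind of override I have in place (simplified; PlayerViewController and togglePlayback are placeholder names). My understanding is that the magic tap is delivered starting at the VoiceOver-focused element and then up through the hierarchy toward UIApplication and the app delegate, so I'd expect this to fire whenever focus is anywhere inside this controller, yet it never does.

import UIKit

final class PlayerViewController: UIViewController {

    // Magic tap (two-finger double-tap in VoiceOver). Returning true tells
    // the system the gesture was handled; returning false passes it up.
    override func accessibilityPerformMagicTap() -> Bool {
        togglePlayback()
        return true
    }

    private func togglePlayback() {
        print("Magic tap received: toggling playback")
    }
}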
3
0
474
Mar ’25
Speak Selection broken with SwiftUI text
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection. Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
1
0
235
Jul ’25
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I'm experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case, what you might call "patient zero." I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services, including FaceTime, Apple Music, and News, stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple's efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction, perhaps between the neural engine and the new Wi-Fi chip, rather than a purely software problem.

I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple's mission of inclusion.

Thank you for your attention to this matter.

Sincerely,
Donald Spencer Kirby
Dayton, Ohio
2
0
174
Jun ’25
SwiftUI PhotosPicker accessibility issue
iOS 18.3.1, iPhone 16 Pro. I pick photos from the user's photo library, using a connected physical keyboard, via:

.photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images)

When the picker appears, accessibility focus is moved to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker by keyboard without tapping on the view and moving focus to it manually. I noticed the same behavior in the Notes app.
1
0
519
Mar ’25
VoiceOver doesn't work well for accessing PDFs/forms with tables
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On a Mac using VoiceOver, the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview. The reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data. The documents have essentially the same coding behind them as for the web. Why is it that they perform so well on a PC with NVDA, but so poorly with Mac VoiceOver? I am a Quality Assurance Specialist. I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system? As a Mac user, it frustrates me that I can't use my preferred system for checking whether documents are accessible, because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they use a PC with NVDA or JAWS for these documents to be able to get all the information. Unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality. This, along with other issues with VoiceOver, seems to be a large gap in Apple's offerings and functionality. I would appreciate a human response to the original email I sent about this on 7/30/2025.
1
0
132
Jul ’25
Hi, my application for the COMMUNICATION capability failed. What should I do next?
Hi, I applied for the COMMUNICATION capability but got a message that I already have the driving task app entitlement. After that, I applied one more time, and there has been no reply since. I do not have the com.apple.developer.carplay-communication capability. Does that mean I cannot apply for this capability? What should I do next to get it? Thanks
2
0
1.1k
Jul ’25