Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic


iPhone screen layers
I need to understand the different layers in the iPhone X and later OLED screens, as I am designing a hardware attachment. The display seems to project letters and images from a layer other than the subpixel layer. Is this proprietary information, or is there a resource that explores these layers?
0 replies · 0 boosts · 102 views · Apr ’25
Accessibility IDs showing up in Accessibility Inspector, but automated testing script is unable to find them
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:

    ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
        Button {
            navigationCallback(paymentTool.segueID)
        } label: {
            PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
                .accessibilityElement()
                .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
        }
        if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
            Divider()
        }
    }

So you can see the accessibility ID is there, and it shows up properly when I open Accessibility Inspector with the simulator, but the testing script isn't picking it up, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one. If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
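For reference, one way to confirm whether the identifier is actually exposed to the automation layer, independently of Appium, is a small XCUITest query. This is only a sketch; "Billing_ApplePay" is a made-up example of what the string interpolation above might produce:

    import XCTest

    final class PaymentToolsAccessibilityTests: XCTestCase {
        func testPaymentToolIdentifierIsExposed() {
            let app = XCUIApplication()
            app.launch()

            // Hypothetical identifier; substitute one produced by the ForEach above.
            let row = app.buttons["Billing_ApplePay"]

            // If this fails while the plain buttons in the other embedded SwiftUI views pass,
            // the element is probably not being exposed as a button to the automation APIs.
            XCTAssertTrue(row.waitForExistence(timeout: 5))
        }
    }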
1 reply · 0 boosts · 153 views · May ’25
I have a problem
I want to open a developer account, but as a company rather than as an individual. I have an existing company, a DUNS number, a completed website, and an official email, so everything is ready. But when the application is made with Apple, they write to my email saying they want a public website, available to people, in the name of the organization, and all of these matters have already been resolved. Why do they not respond to us?
1 reply · 0 boosts · 575 views · Sep ’25
Getting precise text position with Swift for macOS
Hey there! Hope you are starting the year with great joy.

My situation: I'm building a new product based on detecting certain text on screen in real time. The product is targeted only at Mac and is built with Swift.

My problem: I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.

My discoveries so far: I've tried OCR, but it is too slow for what I'm building, so the only possible way I can think of is the Accessibility API.

Thank you in advance.
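For context, the Accessibility API can report the on-screen bounds of a specific character range rather than the bounds of the whole element, via the parameterized kAXBoundsForRangeParameterizedAttribute. A minimal sketch, assuming `element` is an AXUIElement backed by text and the process already has Accessibility permission:

    import ApplicationServices

    /// Returns the screen rectangle of the given character range in a text element,
    /// or nil if the element does not support the parameterized bounds attribute.
    func boundsOfRange(_ range: CFRange, in element: AXUIElement) -> CGRect? {
        var cfRange = range
        guard let rangeValue = AXValueCreate(.cfRange, &cfRange) else { return nil }

        var result: CFTypeRef?
        let err = AXUIElementCopyParameterizedAttributeValue(
            element,
            kAXBoundsForRangeParameterizedAttribute as CFString,
            rangeValue,
            &result
        )
        guard err == .success, let raw = result, CFGetTypeID(raw) == AXValueGetTypeID() else { return nil }

        // The rectangle is returned in screen coordinates (origin at the top-left of the main display).
        var rect = CGRect.zero
        guard AXValueGetValue(raw as! AXValue, .cgRect, &rect) else { return nil }
        return rect
    }

Not every app exposes this attribute on its text elements, so treating a nil result as "fall back to the element's own frame" is a reasonable design.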
2 replies · 0 boosts · 548 views · Jan ’25
Having trouble with Accessibility API of the ApplicationServices framework
After updating from macOS Big Sur 11.0 to the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window

    AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for front-running Xcode 12.5.1
    CFTypeRef frontWindow = NULL;
    AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
    if (err != kAXErrorSuccess) {
        NSLog(@"failed with error: %i", err);
    }
    NSLog(@"app: %@  frontWindow: %@", app, frontWindow);

The frontWindow reference is never created, and I get the error number -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
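For context on the "permission switch" mentioned above: the Accessibility API only returns results once the calling process has been granted Accessibility permission in System Preferences, and the process can check (and prompt for) that permission. A minimal Swift sketch, not the poster's code and not necessarily the cause of this particular error:

    import ApplicationServices

    /// Checks whether this process may use the Accessibility API,
    /// prompting the user to grant permission if it may not.
    func ensureAccessibilityPermission() -> Bool {
        let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
        return AXIsProcessTrustedWithOptions(options)
    }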
2 replies · 0 boosts · 798 views · Jun ’25
Default Voices for AVSpeechUtterance
It appears iOS only comes with low-quality voices installed. iOS requires the user to go into Settings to download higher-quality voices for use with AVSpeechUtterance. There doesn't seem to be any API that makes this process easier for the app user. Is there a way, or an API, that would allow an app to download and use a higher-quality voice? Will Apple ever install higher-quality voices by default? We really want to use the text-to-speech API in iOS, but the very high amount of user friction involved in getting high-quality voices is stopping us. I would appreciate a response. Thanks
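For reference, an app can at least detect which voices the user has already installed and prefer the best available quality; it cannot trigger a download, which is the limitation described above. A minimal sketch:

    import AVFoundation

    /// Returns the highest-quality installed voice for the given BCP 47 language code,
    /// e.g. "en-US". Downloading additional voices still has to happen in Settings.
    func bestInstalledVoice(for language: String) -> AVSpeechSynthesisVoice? {
        let candidates = AVSpeechSynthesisVoice.speechVoices()
            .filter { $0.language == language }
        // Quality ranks as .premium > .enhanced > .default (.premium requires iOS 16 or later).
        return candidates.max { $0.quality.rawValue < $1.quality.rawValue }
    }

    // Usage (keep a strong reference to the synthesizer for the duration of speech;
    // a short-lived local would be deallocated before it finishes speaking):
    let synthesizer = AVSpeechSynthesizer()
    let utterance = AVSpeechUtterance(string: "Hello there")
    utterance.voice = bestInstalledVoice(for: "en-US")
    synthesizer.speak(utterance)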
0 replies · 0 boosts · 708 views · Sep ’25
"illegal character encoding in string literal" warnings in Xcode
Good day! I have a long-term project ported all the way up from old Think C through many versions of Xcode. Its source files are encoded in "Western (Mac OS Roman)". Some of my error messages have characters outside the straight ASCII character set (i.e. "å"). The editor correctly displays these, but I get plenty of "illegal character encoding" warnings and the messages do not display properly.

I imagine there's a way to have separate files of localized text for internationalized applications, but I am the only end user of this application, and it used to just plain work in earlier Xcode versions. Furthermore, there must be developers throughout Europe who use such characters in string literals, just typing in their native languages, straight off their keyboards. I was thinking that there must be a Clang setting or something, but I have been unable to find it, and an internet search turns up no solution except to cumbersomely escape each individual character. I can't imagine that a French programmer does that every time they want to type "è", "é", or "à"! Any help?

(Disclaimer: I'm an English speaker and only use such characters whimsically, but want to keep them for legacy's sake.) Thanks....

P.S. Using Xcode 15.3, and under Settings -> Text Editing -> Editing, "Western (Mac OS Roman)" is already selected as the default text encoding with "Convert existing files on save" checked.
3 replies · 0 boosts · 184 views · Jun ’25
Please consider having Name Recognition in a shortcut automation
Request: Name Recognition → Shortcut for SOS Flashlight + Vibration

Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss. I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts, or expanded “Custom” to support names/words, this would open up far more practical, real-world alerts.
1 reply · 0 boosts · 592 views · Sep ’25
Accessibility: VoiceOver is not treating the navigation bar left button as the first focused element
VoiceOver is not treating the navigation bar left button as the first focused element. If we navigate from A -> B, focus goes to the first element inside the B view, not to the back button or B's navigation title. If we post an accessibility notification in onAppear of B, focus does not shift: VoiceOver reads the back button first and then reads B's content items, but it doesn't actually focus the back button in SwiftUI.

What should I do if I want to focus the navigation bar back button or the navigation title? My understanding is that the system prioritizes the first focusable element in the view hierarchy, but the navigation bar (including the close button and title) is managed separately by the system. It is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If my understanding is right, it seems a little strange. Why did you design it this way? Can you tell me your thinking? Thanks
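For context, within SwiftUI's own view hierarchy VoiceOver focus can be moved explicitly with @AccessibilityFocusState; as the post notes, the system-managed navigation bar items are not part of that hierarchy, so a sketch like this only reaches elements the view itself owns:

    import SwiftUI

    struct DetailView: View {
        @AccessibilityFocusState private var isTitleFocused: Bool

        var body: some View {
            List {
                Text("Detail title")
                    .accessibilityFocused($isTitleFocused)
                Text("More content")
            }
            .onAppear {
                // Move VoiceOver focus to an element inside this view once it appears.
                // The navigation bar's back button is owned by the system and cannot be
                // targeted this way.
                isTitleFocused = true
            }
        }
    }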
0 replies · 0 boosts · 376 views · Sep ’25
ARKit Eye Tracking Calibration Issues - Word-Level Reading Tracking Feasibility
Hi Apple Developer Community,

I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection, built with Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.

Current Implementation
- Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
- Implementing custom sensitivity curves and smoothing algorithms
- Applying baseline correction and coordinate mapping
- Using quadratic regression for calibration point mapping

Issues I'm Facing
1. Calibration mismatch: the red dot position doesn't align with where I'm actually looking; there is a significant offset between the intended gaze point and the actual cursor position, and calibration seems to drift or become inaccurate over time.
2. Extreme eye movement requirements: I need to make exaggerated eye movements to reach screen edges and corners; natural eye movements don't translate into proportional cursor movement, and certain screen regions are difficult to reach even with calibration.
3. Sensitivity and stability issues: the cursor jitters or jumps around when looking at the center; there is too much sensitivity to micro-movements and inconsistent behavior between calibration and normal operation.
4. I also noticed that tracking on the calibration screen, as well as on the reading screen, works better when head movement is involved, but I do not want much head movement. I want tracking with normal eye movement while reading an ebook.

Primary Question: Word-Level Eye Tracking Feasibility
Is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware? I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering if the TrueDepth camera and ARKit's eye-tracking capabilities are sufficient for:
- Tracking natural reading patterns (left-to-right, line-by-line progression)
- Detecting which specific words a user is looking at
- Maintaining accuracy for sustained reading sessions (15-30 minutes)
- Working reliably across different users and lighting conditions

Questions for the Community
- Hardware limitations: Are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
- Calibration best practices: What calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
- Reading-specific challenges: Are there particular challenges when tracking reading behavior vs. general gaze tracking?
- Alternative approaches: Are there better approaches than ARKit blend shapes for this use case?

Current Setup
- Device: iPhone 14 Pro
- iOS version: iOS 18.3
- ARKit version: latest available

Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye-tracking applications or have experience with the limitations and capabilities of ARKit's eye-tracking features. Thank you for your time and expertise!
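For context, reading the per-eye blend shape coefficients the post describes looks roughly like this. It is a minimal sketch of the raw inputs only; the smoothing, baseline correction, and calibration mapping discussed above are left out:

    import ARKit

    final class GazeTracker: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

            // Blend shape coefficients are in the range 0...1.
            let lookIn = face.blendShapes[.eyeLookInLeft]?.floatValue ?? 0
            let lookOut = face.blendShapes[.eyeLookOutLeft]?.floatValue ?? 0
            let lookUp = face.blendShapes[.eyeLookUpLeft]?.floatValue ?? 0
            let lookDown = face.blendShapes[.eyeLookDownLeft]?.floatValue ?? 0

            // A crude horizontal/vertical gaze estimate before any calibration is applied.
            let horizontal = lookIn - lookOut
            let vertical = lookUp - lookDown
            print("gaze estimate:", horizontal, vertical)
        }
    }

ARFaceAnchor also exposes lookAtPoint and per-eye transforms, which some gaze implementations use instead of, or in addition to, the blend shapes.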
0 replies · 0 boosts · 685 views · Oct ’25
Defining boundaries of inline dialogs for VO users
Hello, I had submitted a question to clarify which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182. That question stems from a more direct question about specific components: do tablists and disclosures natively include haptics, a screen reader hint, or other state or properties that indicate to screen reader users where the component begins or ends? In some web experiences there is screen reader hint text stating "end of..." or "entering" as a way to define the boundaries of these inline dialogs. I had asked about haptics in the prior thread because I do not recall a natively implemented version of this, apart from some haptic cues that I have not experienced consistently, so I am not sure whether that is an intended native Swift implementation or perhaps something custom.
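For reference, the web-style "entering" / "end of" announcements described above can be approximated with accessibility hints on the first and last elements inside the group; this is a minimal SwiftUI sketch of that pattern, not a native tablist or disclosure behavior:

    import SwiftUI

    struct PaymentOptionsDisclosure: View {
        @State private var isExpanded = false

        var body: some View {
            DisclosureGroup("Payment options", isExpanded: $isExpanded) {
                Text("Apple Pay")
                    .accessibilityHint("Entering payment options") // announced after the element's label
                Text("Credit card")
                Text("Bank transfer")
                    .accessibilityHint("End of payment options")
            }
        }
    }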
0 replies · 0 boosts · 111 views · May ’25
Seeking API Support for Marking Substrings as Headings in NSTextView for VoiceOver
I'm developing a document editor for macOS using AppKit, which supports structured content such as titles and multiple heading levels—similar to what you see in the Pages app. I'm looking for a way to programmatically mark a specific substring within an NSTextView as a heading, so that VoiceOver can recognize it and announce it appropriately (e.g., by saying “heading” before reading the text). This would be similar in spirit to how NSAccessibilityLinkTextAttribute works for links. Is there an existing accessibility text attribute or recommended approach to achieve this behavior for headings? If not, I’d appreciate any guidance or suggestions on how best to implement this in a VoiceOver-friendly way. Thank you in advance for your help! Best regards,
0 replies · 0 boosts · 100 views · May ’25
False 3.1.1 Rejection: Real-World Dues Payments App
Hello everyone, Our community dues payment app only facilitates real-world maintenance-dues payments directly to property managers’ bank accounts. However, during testing it was likely flagged by the AI-driven review system for a metadata criterion and rejected under Guideline 3.1.1 (“Paid digital content must use IAP”). Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:
- The app is completely free
- No digital content or subscriptions are sold
- Dues payments are made via bank transfer or credit card directly to the manager

Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process? Thanks!
0 replies · 0 boosts · 102 views · May ’25
How to Enable Group Navigation Behavior for Custom Views in VoiceOver?
In VoiceOver, when using Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Tool Bar. How can I achieve the same behavior for a custom view? I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
0 replies · 0 boosts · 410 views · Mar ’25
How to disable the default focus effect and detect keyboard focus in SwiftUI?
I’m trying to customize the keyboard focus appearance in SwiftUI. In UIKit (see the WWDC 2021 session "Focus on iPad keyboard navigation"), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not. In SwiftUI I’ve tried the following:

    .focusable()
    // .focusable(true, interactions: .activate)
    .focusEffectDisabled()
    .focused($isFocused)

However, I’m running into several issues:
- .focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
- .focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
- Using @FocusState prevents Space from triggering the action when the view has keyboard focus

My main questions:
- How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
- What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?

Any guidance or best practices would be greatly appreciated! Here's my sample code:

    import SwiftUI

    struct KeyboardFocusExample: View {
        var body: some View {
            // The ScrollView is required, otherwise the custom focus value resets to false
            // after a few seconds. I also need it for my actual use case.
            ScrollView {
                VStack {
                    Text("First button")
                        .keyboardFocus()
                        .button { print("First button tapped") }
                    Text("Second button")
                        .keyboardFocus()
                        .button { print("Second button tapped") }
                }
            }
        }
    }

    // MARK: - Focus Modifier

    struct KeyboardFocusModifier: ViewModifier {
        @FocusState private var isFocused: Bool

        func body(content: Content) -> some View {
            content
                .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
                // .focusable(true, interactions: .activate) // ⚠️ Causes an infinite loop, so keyboard navigation no longer responds
                .focusEffectDisabled() // ⚠️ Has no effect on iOS
                .focused($isFocused)
                // Custom halo effect
                .padding(4)
                .overlay(
                    RoundedRectangle(cornerRadius: 18)
                        .strokeBorder(
                            isFocused ? .red : .clear,
                            lineWidth: 2
                        )
                )
                .padding(-4)
        }
    }

    extension View {
        public func keyboardFocus() -> some View {
            modifier(KeyboardFocusModifier())
        }
    }

    // MARK: - Button Modifier

    /// ⚠️ Using a Button view makes no difference
    struct ButtonModifier: ViewModifier {
        let action: () -> Void

        func body(content: Content) -> some View {
            content
                .contentShape(Rectangle())
                .onTapGesture { action() }
                .accessibilityAction { action() }
                .accessibilityAddTraits(.isButton)
                .accessibilityElement(children: .combine)
                .accessibilityRespondsToUserInteraction()
        }
    }

    extension View {
        public func button(action: @escaping () -> Void) -> some View {
            modifier(ButtonModifier(action: action))
        }
    }
1 reply · 0 boosts · 446 views · Sep ’25
Make Accessibility Focus move to UIPickerView when tapping on UITextField (Full Keyboard Access)
I have a UITextField in my application for entering a state. If I tap on it, a UIPickerView pops up and lets the user select a state (but they can still type too). The issue relates to Full Keyboard Access: if we select the UITextField using an external keyboard, the UIPickerView appears, but in order to get to it the user has to tab through the whole view controller to reach the UIPickerView at the end. What would be nice is to (a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable right away with the keyboard) or (b) make the UIPickerView the next view that's reachable when tabbing over or using the arrow keys.

I've tried:
- UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This ended up only announcing the view, but didn't help with Full Keyboard Access.
- Making the UIPickerView a first responder when it appears.
- Attempting to change the accessibilityElements order (but with so many views and views within views, this isn't really a viable option either).

Pressing Tab plus the right arrow key will quickly take the user to the end of the chain of accessibility elements, in other words, to the UIPickerView. But there has to be a cleaner way of automatically setting focus to the UIPickerView or making it the next element reached by pressing the arrow key.
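For context, Full Keyboard Access follows the UIKit focus system rather than VoiceOver focus, so one avenue is to steer the focus engine itself. A minimal sketch using preferredFocusEnvironments and a focus update request; `statePicker` and `showPicker()` are hypothetical stand-ins for the picker shown when the field is selected:

    import UIKit

    final class StateEntryViewController: UIViewController {
        let stateField = UITextField()
        let statePicker = UIPickerView()

        // Tell the focus engine to prefer the picker whenever focus is re-evaluated.
        override var preferredFocusEnvironments: [UIFocusEnvironment] {
            statePicker.isHidden ? [stateField] : [statePicker]
        }

        func showPicker() {
            statePicker.isHidden = false
            // Ask the focus system to move focus; Full Keyboard Access follows the result.
            setNeedsFocusUpdate()
            updateFocusIfNeeded()
        }
    }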
0 replies · 0 boosts · 395 views · Mar ’25