Accessibility: VoiceOver is not treating the navigation bar's back (left) button as the first focused element.
If we navigate from A to B, focus goes to the first element inside the B view, not to the back button or B's navigation title.
If we post an accessibility notification in onAppear of B (a rough sketch of what I tried is shown below), focus does not shift: VoiceOver announces the back button first and then reads B's content items, but it never actually moves focus to the back button in SwiftUI.
What should I do if I want focus to land on the navigation bar's back button or navigation title?
My understanding is that the system prioritizes the first focusable element in the view hierarchy, while the navigation bar (including the back button and title) is managed separately by the system; it is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If that's right, the behavior still seems a little strange.
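For reference, here is roughly what I tried in B's onAppear (a minimal sketch, not a confirmed fix; the view and its content are placeholders):

import SwiftUI
import UIKit

struct DetailViewB: View {
    var body: some View {
        List {
            Text("First content item")
            Text("Second content item")
        }
        .navigationTitle("B")
        .onAppear {
            // Ask VoiceOver to re-evaluate focus when B appears. Passing nil lets
            // the system choose the target, which ends up on the first content
            // element rather than the back button or the navigation title.
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }
    }
}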
Why did you design it this way? Can you tell me your thinking?
Thanks
Request: Name Recognition → Shortcut for SOS Flashlight + Vibration
Right now, iOS Name Recognition works, but all I can do is flash the tiny notification light. It would be much more useful if Name Recognition could trigger a Shortcut. That way, I could set it to flash the flashlight in an SOS pattern and vibrate, making the alert impossible to miss.
I tried using Custom Alarm, but it won’t let me record my spoken name, so it doesn’t really solve the problem. If Apple allowed Name Recognition to trigger Shortcuts — or expanded “Custom” to support names/words — this would open up far more practical, real-world alerts.
It appears iOS only comes with low-quality voices installed.
iOS requires the user to go into Settings to download the higher-quality voices to be used with AVSpeechUtterance.
There doesn't seem to be any API that can be used to make this process easier for the app user.
Is there a way, or an API, that would allow an app to download and use a higher-quality voice?
Will Apple ever install higher-quality voices by default?
We really want to use the text-to-speech API in iOS, but the very high amount of user friction involved in getting high-quality voices is stopping us. I would appreciate a response.
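For what it's worth, the best we seem able to do today is pick the highest-quality voice the user has already installed via Settings (a sketch only; as far as I can tell there is no public API to trigger a voice download from inside the app, and .premium requires iOS 16 or later):

import AVFoundation

// Pick the best voice already installed for a language, preferring premium,
// then enhanced, then falling back to the default voice for that language.
func bestInstalledVoice(for language: String = "en-US") -> AVSpeechSynthesisVoice? {
    let candidates = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language == language }
    return candidates.first { $0.quality == .premium }
        ?? candidates.first { $0.quality == .enhanced }
        ?? AVSpeechSynthesisVoice(language: language)
}

let utterance = AVSpeechUtterance(string: "Hello, world.")
utterance.voice = bestInstalledVoice()
let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance)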
Thanks
I use the floating keyboard extensively. Often it starts to jump up and down, and I have to pinch out to the full-size keyboard and pinch in again to restore the floating version. Sometimes just touching a key sets it off; sometimes returning to a window in which the keyboard is displayed starts the issue. This was never a problem in iPadOS 18.
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: Open the path above → “Hattori” is not listed and cannot be downloaded
Expected: Hattori is available to download and select
Actual: Hattori is absent from the catalog
Regression: Was available on iOS 18.x on the same device
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but I see that this feature is not enabled.
Hello,
I am working on a Braille keyboard by using HID approach.
Currently the device works with the iPhone 11 and iPhone SE 3.
However, when tested on an iPhone 6s with iOS 15, although the device can be connected and is recognized as a braille device in the VoiceOver screen, the phone shows no response to key-press reports.
Are there any requirements, such as in the HID descriptor, for the iPhone 6s to support braille devices? If the iPhone 6s does not support such devices, what are the minimum system requirements?
Thank you!
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar.
I’ve noticed that this indicator is not consistent:
For some windows, the badge reliably appears when they are being shared.
For other windows, the badge never appears, even though the window is actively shared.
In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.
My questions are:
What are the exact conditions under which macOS decides to draw the “shared window” indicator in a window’s title bar?
Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs borderless)?
Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
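For context, here is roughly what I mean by the two window styles in question (illustrative only; I don't know whether the style mask is actually what the system checks when deciding to draw the badge):

import AppKit

// Window with a standard system title bar; this is the kind that seems to
// show the shared-window badge reliably.
let titledWindow = NSWindow(
    contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
    styleMask: [.titled, .closable, .resizable],
    backing: .buffered,
    defer: false
)

// Borderless window with custom-drawn chrome; this is the kind where the
// badge never seems to appear.
let borderlessWindow = NSWindow(
    contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
    styleMask: [.borderless],
    backing: .buffered,
    defer: false
)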
Why did the screen recorder button disappear? It cannot be found anywhere.
I created a desktop app for Mac using Xojo. The app has a controller in the main window and displays advertisements and notices on a connected external display.
I'm currently connecting my iMac (24-inch) to a REGZA 55M550M via AirPlay and displaying video from the iMac on the REGZA, but the connection occasionally drops. Yesterday, the connection dropped about 3.5 hours after connecting. I do have other apps running on the iMac, but nothing that puts a strain on the network or memory.
Does AirPlay connection to non-Apple products become unstable over long periods of time?
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" on Apple Community.
VoiceOver also breaks the combobox from the official W3C ARIA website: https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VO is turned off, I can use the up/down arrows to go through the menu items in the dropdown, but when VO is turned on, the up/down arrows cannot reach the dropdown menu items.
Is there an official tutorial on how to control a combobox using VoiceOver?
Kind regards,
Jakub
I’m trying to customize the keyboard focus appearance in SwiftUI.
In UIKit (see WWDC 2021 session Focus on iPad keyboard navigation), it’s possible to remove the default UIFocusHaloEffect and change a view’s appearance depending on whether it has focus or not.
In SwiftUI I’ve tried the following:
.focusable() // .focusable(true, interactions: .activate)
.focusEffectDisabled()
.focused($isFocused)
However, I’m running into several issues:
.focusable(true, interactions: .activate) causes an infinite loop, so keyboard navigation stops responding
.focusEffectDisabled() doesn’t seem to remove the default focus effect on iOS
Using @FocusState prevents Space from triggering the action when the view has keyboard focus
My main questions:
How can I reliably detect whether a SwiftUI view has keyboard focus? (Is there an alternative to FocusState that integrates better with keyboard navigation on iOS?)
What’s the recommended way in SwiftUI to disable the default focus effect (the blue overlay) and replace it with a custom border?
Any guidance or best practices would be greatly appreciated!
Here's my sample code:
import SwiftUI

struct KeyboardFocusExample: View {
    var body: some View {
        // The ScrollView is required, otherwise the custom focus value resets to false
        // after a few seconds. I also need it for my actual use case.
        ScrollView {
            VStack {
                Text("First button")
                    .keyboardFocus()
                    .button {
                        print("First button tapped")
                    }

                Text("Second button")
                    .keyboardFocus()
                    .button {
                        print("Second button tapped")
                    }
            }
        }
    }
}

// MARK: - Focus Modifier

struct KeyboardFocusModifier: ViewModifier {
    @FocusState private var isFocused: Bool

    func body(content: Content) -> some View {
        content
            .focusable() // ⚠️ Must come before .focused(), otherwise the FocusState won’t be recognized
            // .focusable(true, interactions: .activate) // ⚠️ This causes an infinite loop, so keyboard navigation no longer responds
            .focusEffectDisabled() // ⚠️ Has no effect on iOS
            .focused($isFocused)
            // Custom halo effect
            .padding(4)
            .overlay(
                RoundedRectangle(cornerRadius: 18)
                    .strokeBorder(
                        isFocused ? .red : .clear,
                        lineWidth: 2
                    )
            )
            .padding(-4)
    }
}

extension View {
    public func keyboardFocus() -> some View {
        modifier(KeyboardFocusModifier())
    }
}

// MARK: - Button Modifier

/// ⚠️ Using a Button view makes no difference
struct ButtonModifier: ViewModifier {
    let action: () -> Void

    func body(content: Content) -> some View {
        content
            .contentShape(Rectangle())
            .onTapGesture {
                action()
            }
            .accessibilityAction {
                action()
            }
            .accessibilityAddTraits(.isButton)
            .accessibilityElement(children: .combine)
            .accessibilityRespondsToUserInteraction()
    }
}

extension View {
    public func button(action: @escaping () -> Void) -> some View {
        modifier(ButtonModifier(action: action))
    }
}
There is no way to set the Allow Mobile Data switch. I have the latest update, but it still does not work. I noticed it when I went to another country and could not set my mobile data, and when I came back I still could not.
I’m working on a macOS Accessibility setup for a French-speaking user and I’ve hit a wall. (I'm not a developer; I'm trying to help my kid with dyslexia.)
I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard.
Here’s what I have so far:
• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.
I suspect this is because:
• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don’t register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don’t hook into Google Docs’ custom editor.
Important:
I’m not a programmer. I’d like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.
Technical setup:
• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: Enabled via Settings > Accessibility > Keyboard
• Custom panel: Built using Panel Editor (Keyboard), named “Philemon Prédiction”
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: Panel is visible, suggestions appear, but inserting text does nothing in Google Docs
Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs’ editor?
Thanks a lot!
Hi guys, I'm facing an issue with the native interface for adding a card to Wallet. Does anyone have ideas on how to fix or work around it?
STEPS TO REPRODUCE:
Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
Connect an external keyboard and confirm that you can navigate other iOS interfaces with it.
In any app, present a PKAddPassesViewController with a valid .pkpass file (a rough sketch is shown after these steps).
When the Wallet “Add Pass” sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.
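Here is a rough sketch of step 3; the pass file name and the presenting view controller are placeholders, not from a real project:

import PassKit
import UIKit

// Present the Wallet "Add Pass" sheet for a bundled sample pass.
func presentAddPassSheet(from presenter: UIViewController) {
    guard
        let url = Bundle.main.url(forResource: "sample", withExtension: "pkpass"),
        let data = try? Data(contentsOf: url),
        let pass = try? PKPass(data: data),
        let controller = PKAddPassesViewController(pass: pass)
    else { return }
    presenter.present(controller, animated: true)
}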
EXPECTED RESULT:
All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.
ACTUAL RESULT:
Keyboard navigation is not possible.
No elements receive focus.
Users cannot activate Cancel or Add buttons using keyboard input.
The only way to interact is by touch or enabling VoiceOver, which does not satisfy keyboard accessibility requirements.
IMPACT:
Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible).
Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet.
Affects users of external keyboards who rely on tab/arrow navigation.
Creates an inconsistent accessibility experience compared to other iOS system modals.
There is an issue with Help Books that started with the release of macOS 14.4. The issue is that when an app attempts to go directly to a Help Book page, the help viewer opens to the Help Book's main index page, rather than the specific page requested. As I investigated the issue I found that the requested page was actually part of help viewer's navigation history, and all I had to do was to click the Back navigation arrow and the requested page would be displayed. So it seems like the requested page is momentarily visited but is then (for whatever reason) quickly replaced by the main index page.
Our app uses the AHGotoPage() API for directly accessing our Help Book's pages. This is the same mechanism/code that our app has used for more than a decade and has never caused us any issues. Everything works fine on macOS 14.3.0 and earlier. I've scoured the documentation and can't find any newer APIs for accessing Help pages. I've also tried various other things (e.g. reworking the code, creating new indexes for the app's Help, etc.), but none of it seems to make a difference. As far as I can tell, the issue seems to stem from some change made to the OS.
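For reference, the call looks roughly like this (a minimal sketch from Swift, assuming the legacy Apple Help API is still reachable via the Carbon module; our shipping code is older and the page name here is a placeholder):

import Carbon
import Foundation

// Jump directly to a specific page of the app's registered Help Book.
// On macOS 14.4 the viewer lands on the main index page instead of the requested one.
func showHelpPage() {
    let bookName = Bundle.main.object(forInfoDictionaryKey: "CFBundleHelpBookName") as? String
    let status = AHGotoPage(bookName.map { $0 as CFString },
                            "specific-page.html" as CFString,
                            nil)
    if status != 0 {
        print("AHGotoPage failed with status \(status)")
    }
}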
So my questions are:
Is this a known bug? And if so, is there any ETA on a fix?
Is there something different we should be doing for newer versions of the OS (create indexes differently, use a different API, etc.)?
Issue:
When using the shortcut Command + Delete to clear a line of text, the next character I type in Thai unexpectedly appears as an English character, even though the input source is still set to Thai. After that, subsequent characters return to Thai as expected.
Details:
Affected apps: Notes, Messages, and some other native apps
Not affected: Browser text fields (Safari, Chrome, etc.)
Does not occur when using Option + Delete or just Delete
macOS [insert beta version + build number]
Mac model: [insert model]
Input sources: Thai – Kedmanee, English – U.S.
Steps to reproduce:
Open Notes (or Messages).
Switch to Thai input.
Type a few Thai words.
Press Command + Delete.
Type again — the first character shows up in English.
Expected:
First character should remain in Thai, consistent with the active input source.
Actual:
First character shows as English, then input switches back to Thai.
“Double-tap three fingers and drag to change zoom” should suppress “Three Finger to Drag”. Currently these gestures are triggered simultaneously, for no real reason. I saw different behaviors in different environments, but none of them is desired.
Current and desired behavior:
This seems to be a bug, so I filed a feedback.
Please refer to Feedback report: FB19701007
The Personal Voice file created in English changes to either Spanish or Chinese and no longer works properly. This has been happening since Beta 1 of iOS/iPadOS 26.
I have been unable to pinpoint what causes this to occur. Possibly downloading foreign voices to a device or using different voices via AVSpeechSynthesizer.
I run an app in Xcode on my device that prints info about the installed voices to the console (roughly the sketch shown after the output below). Initially, after creating the voice, here is the output:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: en-US
Then I hear a Spanish accent and find this:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: es-MX
Currently it isn't working and here is the output:
Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: zh-CN
Note that the voice file on all three above is the same. No matter what the user does, the created voice file should never be able to change languages. On my test devices I reset them all by erasing all content and settings and creating a new English Personal Voice and the issue persists.
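For reference, here is roughly what the diagnostic app does to produce that output (a minimal sketch, assuming Personal Voice authorization has already been granted; the function name is just for illustration):

import AVFoundation

// List the installed Personal Voices and the language code each one reports.
func dumpPersonalVoices() {
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }
    for voice in personalVoices {
        print("Voice Identifier: \(voice.identifier)")
        print("Language: \(voice.language)")
    }
}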
A side issue is that the Share Across Devices toggle doesn’t remain off when turned off. I tried not sharing to see if that could be the cause, but the toggle turns back on automatically; it won’t remain off.
Hi Apple Developer Community,
I'm experiencing persistent issues with the Apple Search Ads API since this morning (August 16, 2025). My application keeps getting "Service Unavailable" errors when trying to connect to the API endpoints.
Error Details:
Error Message: "Service Unavailable"
HTTP Status: 503/500
API Endpoint: https://api.searchads.apple.com/api/v5/*
Frequency: Consistent failures since August 16, 2025
What I've Tried:
Verified API credentials and certificates are valid
Tested multiple API endpoints
Checked network connectivity
The API was working fine until yesterday, and no changes were made to our implementation. Any insights or updates from the community would be greatly appreciated.
Thanks in advance for your help!