Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Why does the macOS window sharing indicator appear for some windows but not others?
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar. I’ve noticed that this indicator is not consistent:
- For some windows, the badge reliably appears when they are being shared.
- For other windows, the badge never appears, even though the window is actively shared.
In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.
My questions are:
- What are the exact conditions under which macOS decides to draw the "shared window" indicator in a window's title bar?
- Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs. borderless)?
- Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
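In case it helps frame the question, the closest thing I can do today is the heuristic implied above, i.e. checking whether a window uses a standard titled style mask. Below is a minimal sketch of that idea; it is only an assumption about when the badge appears, not a documented contract, and the helper name is made up.

import AppKit

// Hypothetical helper: guesses whether a window is likely to show the system
// "shared window" badge, based on the (unconfirmed) assumption that only
// windows with a standard, visible title bar display it.
func windowLikelyShowsSharingBadge(_ window: NSWindow) -> Bool {
    // Borderless or fully custom-chrome windows appear not to get the badge.
    guard window.styleMask.contains(.titled) else { return false }
    // Windows that hide the standard title bar chrome may also be skipped.
    return !window.titlebarAppearsTransparent
}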
0 replies · 0 boosts · 1.1k views · Sep ’25

Avoid trackpad gesture conflict between dragging and accessibility zooming when using three fingers
“Double-tap three fingers and drag to change zoom” should suppress “Three Finger Drag”. Currently these gestures are triggered simultaneously, for no real reason. I have seen different behaviors in different environments, but none of them is the desired one.
Current and desired behavior:
This seems like a bug, so I filed a feedback report.
0 replies · 0 boosts · 766 views · Aug ’25

iOS: How to maintain good app icon contrast in grayscale mode?
I’m developing an iOS app, and I’ve noticed that when the user enables Accessibility → Display & Text Size → Color Filters → Grayscale, my app icon loses a lot of visual contrast. The original colored version looks fine, but in grayscale it appears "flat" and harder to distinguish, unlike a pure black-and-white design.

What I want to achieve:
- Ensure the app icon remains visually clear and high-contrast even when iOS renders it in grayscale.
- Ideally, provide an alternate "high-contrast" app icon when grayscale mode is enabled.

What I’ve tried:
- Increased color contrast in the original icon design.
- Added outlines and stronger shapes.
- Tested with grayscale filters in design tools.
- Researched the Asset Catalog and alternate icons, but found no documented API to detect or respond to grayscale mode.

Questions:
- Is there any API in iOS that allows detecting when the system is in grayscale mode, so that I can programmatically switch to an alternate app icon?
- If not, are there Apple-recommended best practices for designing app icons that still look clear in grayscale?
- Are there any accessibility guidelines specifically addressing icon design for grayscale or color-blind modes?

Additional info:
- iOS version tested: iOS 17.5
- Development in Swift + SwiftUI, using the Asset Catalog for icons.
- I am aware that iOS supports alternate icons via setAlternateIconName, but I haven’t found a trigger for grayscale mode.
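For what it's worth, UIKit does expose a grayscale status check (UIAccessibility.isGrayscaleEnabled, which as far as I can tell reflects the Grayscale color filter) and a matching change notification. A minimal sketch of one possible approach follows, assuming an alternate icon named "HighContrastIcon" has been added to the project; that name is hypothetical.

import UIKit

// Sketch: switch to a hypothetical "HighContrastIcon" alternate icon whenever
// the system reports that grayscale is enabled, and back to the primary icon
// when it is not. Note that iOS shows a system alert each time the icon changes.
final class IconSwitcher {
    private var observer: NSObjectProtocol?

    func startObserving() {
        updateIcon()
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.grayscaleStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.updateIcon()
        }
    }

    private func updateIcon() {
        guard UIApplication.shared.supportsAlternateIcons else { return }
        // Passing nil restores the primary icon.
        let iconName = UIAccessibility.isGrayscaleEnabled ? "HighContrastIcon" : nil
        UIApplication.shared.setAlternateIconName(iconName) { error in
            if let error = error {
                print("Icon switch failed: \(error)")
            }
        }
    }
}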
0 replies · 1 boost · 450 views · Aug ’25

VoiceOver is not respecting lang in HTML option
I have an HTML select that has Spanish text in the options. When VoiceOver reads the selected option (unopened), it switches to Spanish as expected. However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text. I have tried adding lang to both the select tag and the option tags, but neither helps: https://codepen.io/grahamfowles/pen/VYYRxMK
0 replies · 0 boosts · 139 views · May ’25

Seeking API Support for Marking Substrings as Headings in NSTextView for VoiceOver
I'm developing a document editor for macOS using AppKit, which supports structured content such as titles and multiple heading levels—similar to what you see in the Pages app. I'm looking for a way to programmatically mark a specific substring within an NSTextView as a heading, so that VoiceOver can recognize it and announce it appropriately (e.g., by saying “heading” before reading the text). This would be similar in spirit to how NSAccessibilityLinkTextAttribute works for links. Is there an existing accessibility text attribute or recommended approach to achieve this behavior for headings? If not, I’d appreciate any guidance or suggestions on how best to implement this in a VoiceOver-friendly way. Thank you in advance for your help! Best regards,
0 replies · 0 boosts · 111 views · May ’25

Camera Crashes
Hi everybody, I'm trying to build a QR code scanner and generator app for iOS. Whenever I try to implement the camera, the app crashes with this message: "This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data." I tried to reduce the app to the minimum of nothing but the camera, with the same result. Any ideas? Thank you and best regards, Horst Schippers
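For reference, the crash message itself points at the fix: the app target's Info.plist needs an NSCameraUsageDescription entry (shown in Xcode's Info tab as "Privacy - Camera Usage Description"). A minimal example entry, with placeholder wording you would adapt to your own app:

<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan QR codes.</string>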
0 replies · 0 boosts · 72 views · Apr ’25

iOS 26 regression: `DeviceActivityEvent`: `eventDidReachThreshold` called immediately (instead of waiting till threshold is reached)
Hello Albert! I am experiencing some strange bugs around DeviceActivityEvents (part of the DeviceActivity framework) on iOS 26 / iOS 26.1 / iOS 26.2 beta. When creating a DeviceActivityEvent, we can assign a threshold and applicationTokens. The idea is that after the user has spent said threshold on said apps, eventDidReachThreshold() is called. The property includesPastActivity is set to false. On iOS 26, however, it quite often happens (fairly reliably after updating to a new beta seed) that eventDidReachThreshold() is called immediately (after a couple of seconds) instead of waiting for the threshold to be met. Is anyone else seeing similar issues on iOS 26 / iOS 26.1 / iOS 26.2 beta? The only workaround I have found is to ask users to revoke and re-grant Screen Time permissions. This only holds for about two weeks, though, or at most until the next iOS 26 beta update is installed, so it is unfortunately not a permanent solution. Feedback (incl. sysdiagnoses and a sample project) is filed under FB18061981 and FB18927456. One of our users has filed their own feedback as well: FB20817853. Thanks a lot for any help on this!
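To make the setup concrete, here is a minimal sketch of the pattern described above. The activity and event names, the 30-minute threshold, and the token set are placeholders; the tokens would normally come from a FamilyActivitySelection, and eventDidReachThreshold(_:activity:) lives in the DeviceActivityMonitor extension:

import DeviceActivity
import ManagedSettings

extension DeviceActivityName {
    static let daily = Self("daily")
}

extension DeviceActivityEvent.Name {
    static let thirtyMinutes = Self("thirtyMinutes")
}

func startMonitoring(apps: Set<ApplicationToken>) throws {
    // All-day schedule that repeats every day.
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )

    // The threshold event: eventDidReachThreshold should fire only after
    // 30 minutes of accumulated usage of the selected apps.
    let event = DeviceActivityEvent(
        applications: apps,
        threshold: DateComponents(minute: 30)
    )

    try DeviceActivityCenter().startMonitoring(
        .daily,
        during: schedule,
        events: [.thirtyMinutes: event]
    )
}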
0 replies · 1 boost · 443 views · Nov ’25

Accessibility Traits for Children of a Tab Bar
Hi! I'm working on an application where I'd like VoiceOver to give each element of a tab bar the "Tab" trait. I'm testing this using the Accessibility Inspector. Essentially, I'd like to replicate the behavior of how Safari identifies each of its tabs as a "Tab" (I've attached a photo below). How exactly is this accomplished? I've tried using the .isTabBar trait to designate the child objects as "Tabs", but this doesn't seem to be working and I've struggled to find documentation about this. For additional context, these child items are Buttons, and I would like to have the .isButton trait essentially replaced by something like an .isTab trait. Not sure if this is actually possible or not, but curious how the Accessibility Inspector recognizes this in Safari.
0 replies · 0 boosts · 176 views · Jun ’25

iMessage and FaceTime error
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked. I tried to go through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message. Does anyone have any ideas what the problem could be?
1 reply · 1 boost · 154 views · Jun ’25

Unable to set dialect of Chinese of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it always reads Chinese content, e.g.

AVSpeechUtterance(string: "中文") // any Chinese content

in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect that I specified in AVSpeechUtterance.voice:

AVSpeechSynthesisVoice(language: "zh-HK") // Cantonese
AVSpeechSynthesisVoice(language: "zh-TW") // Mandarin

However, setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" has been working on iOS 17 and below. My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time. Therefore, setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18.

Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not happen. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug does happen.

This bug puzzles me because I need both dialects of Chinese to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (since a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice (depending on the Spoken Language setting). This iOS 18 bug makes my app unable to perform the expected behavior. Would Apple developers look into this and advise whether there is any possible workaround within the app's code, or please fix this bug in an iOS 18 update. Thank you.
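For context, here is a minimal sketch of the pattern in question (the same text spoken once in Cantonese and once in Mandarin by setting the voice on each utterance), which behaves as expected on iOS 17 and earlier:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Speak the same text in Cantonese, then in Mandarin (Taiwan),
// by setting the voice on each utterance individually.
let cantonese = AVSpeechUtterance(string: "中文")
cantonese.voice = AVSpeechSynthesisVoice(language: "zh-HK")

let mandarin = AVSpeechUtterance(string: "中文")
mandarin.voice = AVSpeechSynthesisVoice(language: "zh-TW")

synthesizer.speak(cantonese)
synthesizer.speak(mandarin) // queued; spoken after the first utterance finishes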
1 reply · 1 boost · 117 views · Jun ’25

Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.

The Concept: Skeleton-based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input:
- Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
- Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
- Comparison: Feed these vectors into a Core ML model trained on "Reference Skeletons" (recorded by native signers).
- Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees"). A rough sketch of this comparison step is included below.

Why this approach?
- Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
- Privacy: We are processing coordinates, not video streams.
- Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
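Here is a minimal sketch of what the comparison step could look like, assuming reference poses are stored as per-joint positions in the body anchor's local space. The joint subset, the reference format, and the scoring are all placeholder choices, not a finished design:

import ARKit
import simd

// Upper-body joints of interest for signing (a placeholder subset).
let trackedJoints: [ARSkeleton.JointName] = [.leftHand, .rightHand, .leftShoulder, .rightShoulder, .head]

// A reference pose recorded from a native signer: joint positions keyed by joint name,
// expressed in the body anchor's coordinate space.
typealias ReferencePose = [String: SIMD3<Float>]

/// Mean distance (in meters) between the user's tracked joints and the reference pose.
/// Smaller values mean the user's pose is closer to the reference.
func poseError(bodyAnchor: ARBodyAnchor, reference: ReferencePose) -> Float {
    var total: Float = 0
    var count = 0
    for joint in trackedJoints {
        guard let transform = bodyAnchor.skeleton.modelTransform(for: joint),
              let target = reference[joint.rawValue] else { continue }
        // Take the translation column of the joint's model transform.
        let position = SIMD3<Float>(transform.columns.3.x, transform.columns.3.y, transform.columns.3.z)
        total += simd_distance(position, target)
        count += 1
    }
    return count > 0 ? total / Float(count) : .infinity
}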
1 reply · 0 boosts · 633 views · Dec ’25

Accessibility for detents behaves differently in a fullscreen cover
The only way I found to make accessibility focus work correctly in the detent in a fullscreen cover is to apply the focus manually. The issue is that in the ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover while in the ContentView I do not.

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")

                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)

            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
1 reply · 1 boost · 616 views · Feb ’25

The virtual home button is not displayed in Developer Mode.
I have a question about Developer Mode on iPhone. Currently, the home button on my iPhone SE (2nd generation) is broken, so I use AssistiveTouch to display a virtual home button. However, in Developer Mode, the virtual home button does not appear, making it impossible to enable Developer Mode. Is there any way to enable Developer Mode in this situation?
1 reply · 2 boosts · 288 views · Feb ’25

VoiceOver for Accessibility Labels with Localization
Hello! I'm adding VoiceOver support for my app, but I'm having an issue where my accessibility value is not being spoken. I have made a helper class that creates an NSString from a double and converts it to the user's region currency.

CurrencyFormatter.m

// Formats a double as a currency string in the user's current locale.
+ (NSString *)localizedCurrencyStringFromDouble:(double)value {
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    formatter.numberStyle = NSNumberFormatterCurrencyStyle;
    formatter.locale = [NSLocale currentLocale];
    NSString *currencyString = [formatter stringFromNumber:@(value)];
    [formatter release]; // manual retain-release (non-ARC)
    return currencyString;
}

View Controller

self.checkTotalLabel.accessibilityLabel = NSLocalizedString(@"Total Amount", @"Accessibility Label for Total");
self.checkTotalLabel.accessibilityValue = [CurrencyFormatter localizedCurrencyStringFromDouble:total];

I'm confused about whether the value should go into the accessibility label or not. When the currency is just USD and the language is English, it's a simple fix. But when the currency needs to be converted, I'm not sure where to go from here. If anyone has any guidance, it would help me a lot! Thank you!
1 reply · 0 boosts · 759 views · Jul ’25

Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone.
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?

The Core Idea
Imagine this:
- Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator. This generator would capture the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
- Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
- Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
- On-Demand, Ultra-Low Power Clock: Only when the accelerometer detects one of these specific gestures would the stored kinetic energy be used to illuminate just the necessary pixels on the main OLED/AMOLED screen to display the time. The rest of the screen stays completely black (consuming no power on OLED).
- Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display would turn off, conserving the limited harvested energy.

Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
- True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
- Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
- Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
- Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and only lighting a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
1 reply · 1 boost · 124 views · Jun ’25

The brightness of the iPad Pro screen is reduced after the new iOS 26
After the iOS 26 update, the colors on my new iPad Pro M4 have become extremely dull, almost like those on a very old device. The screen brightness is significantly reduced, and it's now difficult to see UI elements clearly. This is very disappointing considering the device’s high display quality before the update. Please advise if this is a known issue or if there's a fix.
1 reply · 1 boost · 103 views · Jun ’25