Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

VoiceOver Not Scrolling to Focused TableView Cell
I have a view that is dynamically overlaid on a UITableView, with padding, added when certain conditions are met. When VoiceOver focuses a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app; see the attached image for reference. How can I resolve this and ensure VoiceOver scrolls the focused cell into view?
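One possible cause, sketched below under the assumption of a bottom-anchored overlay of known height: the table view may not know the overlay occludes part of its scroll area, so extending contentInset can let it scroll focused cells clear of the overlay. The view names here are hypothetical, not from the original post.

import UIKit

final class ListViewController: UIViewController {
    @IBOutlet private var tableView: UITableView!

    // Hypothetical overlay, shown when certain conditions are met.
    private let overlayView = UIView()
    private let overlayHeight: CGFloat = 120

    private func showOverlay() {
        overlayView.frame = CGRect(x: 0,
                                   y: view.bounds.height - overlayHeight,
                                   width: view.bounds.width,
                                   height: overlayHeight)
        view.addSubview(overlayView)

        // Tell the table view its bottom edge is occluded so that scrolling,
        // including VoiceOver's scroll-to-focused-element, can bring cells
        // clear of the overlay instead of leaving them underneath it.
        tableView.contentInset.bottom = overlayHeight
        tableView.verticalScrollIndicatorInsets.bottom = overlayHeight
    }
}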
1 reply · 0 boosts · 144 views · Apr ’25
Allow Mobile Data switching
There is no way to set the Allow Mobile Data switching toggle. I have the latest update, but it still does not work. I realised it when I went to another country and could not set my mobile data, and when I came back I still could not.
0 replies · 0 boosts · 782 views · Sep ’25
Handling VoiceOver Focus When Screen Changes (Push, Present, and SplitViewController)
I have some questions about how VoiceOver handles focus when the screen updates. When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus? Is there a way to control or customize this behavior? Similarly, in a UISplitViewController, when an item is selected in the primary view controller, focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
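For reference, the usual way to direct VoiceOver focus after a push, present, or split-view selection is to post an accessibility notification once the new content is on screen. A minimal sketch; the controller and label names are placeholders:

import UIKit

final class DetailViewController: UIViewController {
    @IBOutlet private var headerLabel: UILabel!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // .screenChanged tells VoiceOver the screen's contents were replaced;
        // passing an element asks it to move focus there. A secondary column
        // in a UISplitViewController could post the same notification after
        // a selection in the primary column updates its content.
        UIAccessibility.post(notification: .screenChanged, argument: headerLabel)
    }
}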
0 replies · 0 boosts · 139 views · Apr ’25
Clarification on Color Path Recommendation (Green, Yellow, Orange) in Wallet Provisioning
Hi, I’ve been reviewing the Apple Wallet provisioning documentation (Getting Started with Apple Pay In-App Provisioning: Verification, Security, Wallet Extensions) and had a few questions regarding the color path recommendation (Green, Yellow, Orange, Red) returned during the in-app provisioning flow:

- Who determines the color path: Apple directly, the Payment Network Operator (PNO), or both?
- What criteria are used to determine the color path (e.g., device info, Apple ID reputation, past provisioning attempts)?
- At what point in the provisioning flow is the color path recommendation received? Is it included in the response after the PKAddPaymentPassRequest is submitted? Is it accessible through any specific property or callback in the delegate method?

Additionally, for the Orange path with reason code 0G, I understand that in-app verification is not allowed and must be handled via tenured channels (e.g., SMS or email). Can you confirm whether this logic still applies for requests initiated from within the issuer's iOS app?

Would appreciate any clarification or pointers to related documentation.
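For orientation, here is a sketch of where the in-app provisioning handshake surfaces in PassKit. As far as I can tell, none of these PassKit types expose a color path property, which suggests the recommendation is exchanged server-side between the issuer, the PNO, and Apple; treat that reading as an assumption, not documented fact. The issuer-backend step is a hypothetical placeholder.

import PassKit

final class ProvisioningDelegate: NSObject, PKAddPaymentPassViewControllerDelegate {
    func addPaymentPassViewController(
        _ controller: PKAddPaymentPassViewController,
        generateRequestWithCertificateChain certificates: [Data],
        nonce: Data,
        nonceSignature: Data,
        completionHandler handler: @escaping (PKAddPaymentPassRequest) -> Void
    ) {
        // Hypothetical step: the issuer backend forwards certificates, nonce,
        // and nonceSignature to the PNO and returns encrypted pass data.
        let request = PKAddPaymentPassRequest()
        // request.encryptedPassData = ...
        // request.activationData = ...
        // request.ephemeralPublicKey = ...
        handler(request)
    }

    func addPaymentPassViewController(
        _ controller: PKAddPaymentPassViewController,
        didFinishAdding pass: PKPaymentPass?,
        error: Error?
    ) {
        // Success and failure both land here once provisioning finishes.
        controller.dismiss(animated: true)
    }
}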
0 replies · 0 boosts · 112 views · May ’25
Using WebSocket for BCI Click Input in visionOS: FocusState vs. System-Level Limitations
Hi everyone, my team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with locked-in syndrome by using Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.

Problem 1: Simulating eye tracking in the Simulator. We are testing onHover with "Send pointer to the device" under I/O > Input in the Simulator, and while it mostly works (a bit laggy), we found that onHover won't function on actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?

Problem 2: A WebSocket-triggered "click" doesn't work outside the app. We successfully use a WebSocket to send a custom signal (a "1" from the brain-signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:

- Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
- Access system-wide click/tap simulation APIs from within visionOS apps?
- Integrate this with accessibility services (like Voice Control or AssistiveTouch)?

We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods in visionOS. Thanks in advance for any insight you can provide!
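A minimal sketch of the FocusState pattern referred to in Problem 1, under the post's own assumption that focus is the supported proxy for gaze on device. visionOS does not expose raw eye-tracking data to apps, so the names and behavior here are illustrative only:

import SwiftUI

struct GazeTargetView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        Button("Tap target") {
            // The same action an in-app BCI "click" signal would invoke.
            print("Activated")
        }
        .focusable(true)
        .focused($isFocused)
        .onChange(of: isFocused) { _, focused in
            // React when the element gains or loses focus.
            print(focused ? "Focus entered" : "Focus left")
        }
    }
}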
0 replies · 0 boosts · 113 views · Apr ’25
.
I have a Twitter account that I registered with my Apple ID. I still don't know the PIN and am having a problem because of it. I need help. privaterelay.appleid.com
1 reply · 0 boosts · 255 views · Feb ’25
Attaching procedural audio to an ARKit SCNNode
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages (see my Stack Overflow post).

Working implementation with a static audio file:

let audioPlayer = SCNAudioPlayer(source: audioSource)
node.addAudioPlayer(audioPlayer)

Attempted implementation with procedural audio (the opening of the render block was lost in extraction; reconstructed here, with the generation code still elided):

let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    // Audio generation code
    return noErr
}
let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
node.addAudioPlayer(audioPlayer)

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn’t work is creating procedural audio with an AVAudioNode, as documented here: Apple docs.

Additionally, I explored the WWDC18 AR game project, SwiftShot, which utilizes SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics function correctly, but the audio does not play.

I also noted that the Apple documentation mentions an audioPlayerWithAVAudioNode: method, stating: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, this method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
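For contrast, the direct AVAudioEngine connection that the post says does work looks roughly like this. A minimal sketch: the 440 Hz sine generator is an assumed stand-in for the post's actual generation code.

import AVFoundation

let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)!
var phase = 0.0
let increment = 2 * Double.pi * 440 / format.sampleRate

// Stand-in generator: writes a 440 Hz sine tone into the output buffers.
let sourceNode = AVAudioSourceNode { _, _, frameCount, audioBufferList in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase)) * 0.25
        phase += increment
        for buffer in buffers {
            buffer.mData?.assumingMemoryBound(to: Float.self)[frame] = sample
        }
    }
    return noErr
}

engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: format)
do {
    try engine.start()
} catch {
    print("Engine failed to start: \(error)")
}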
0 replies · 0 boosts · 186 views · Apr ’25
FamilyControls API access
I’m requesting access to the Family Controls API for an iOS app currently in development. I’ve submitted the request through the official form here: https://developer.apple.com/contact/request/family-controls-distribution However, after submitting, I received no confirmation email or support ticket ID. The page only shows a “Thank you for requesting the API” message, leaving me with no way to track or confirm the request. This entitlement is essential for my app’s functionality, and I need to move forward with development and testing. Can someone from the Apple team please confirm receipt of the request and provide guidance on the next steps or estimated timelines?
0 replies · 0 boosts · 353 views · May ’25
iPhone screen layers
I need to understand the different layers present in iPhone X and later OLED screens, as I am designing a hardware attachment. They seem to project letters and images from a different layer than the subpixel layer. Is this proprietary information, or is there a resource that explores them?
0 replies · 0 boosts · 103 views · Apr ’25
Will the iCloud service lose data access because users are in different regions?
My app uses Core Data (targeting iOS 13.0) combined with iCloud to store data, which automatically manages synchronization between Core Data and iCloud. Some users have reported that after going abroad, their original data disappeared, and when they returned to China, the data displayed normally again. I'm located in Mainland China, and I've learned that iCloud data in China is all stored in Guizhou-Cloud Big Data (the data center in Guizhou). Could this problem be caused by display abnormalities resulting from switching between the iCloud data storage centers accessed in different regions?
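For context, the setup described is presumably the standard Core Data + CloudKit container introduced in iOS 13. A minimal sketch, with the model name as a placeholder:

import CoreData

// NSPersistentCloudKitContainer mirrors the local store to the user's
// private CloudKit database; "Model" is a placeholder model name.
let container = NSPersistentCloudKitContainer(name: "Model")
container.loadPersistentStores { _, error in
    if let error {
        // Sync issues (e.g., an unreachable iCloud environment) surface
        // asynchronously; the local store itself still loads.
        print("Store load error: \(error)")
    }
}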
0 replies · 0 boosts · 386 views · Dec ’24
Unable to Grant Input Monitoring Permission via MDM
I am trying to grant Input Monitoring permission using MDM (Mobile Device Management), but I am facing issues. While I am able to deny the permission, I am unable to grant it. In some profile configurator tools, I noticed a note stating: "Allows the application to use CoreGraphics and HID APIs to listen to (receive) CGEvents and HID events from all processes. Access to these events cannot be given in a profile; it can only be denied." This seems to suggest that granting Input Monitoring permission via an MDM profile may not be possible. Has anyone successfully granted Input Monitoring permission using MDM, or is there an alternative way to achieve this on managed macOS devices?
0 replies · 0 boosts · 468 views · Feb ’25
What is the appropriate accessibility trait for selectable UITableViewCell?
I’m trying to understand the best practice for assigning accessibilityTraits to a UITableViewCell that users can select from a list of options. In Apple’s first-party apps like Settings, I’ve noticed an inconsistent approach: some cells use the Button trait, while others simply announce the label along with the Selected trait when applicable, without any additional role like Button or Adjustable.

So my question is: What is the most appropriate accessibility trait to use for a selectable table view cell that updates a selection (like a settings option)? Is using .button the right approach, or should we rely solely on .selected? Is there any user experience guideline from Apple that recommends one over the other? Would love to hear how others handle this for clarity and consistency in VoiceOver behavior.
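For concreteness, a sketch of the .selected-only approach described above, in a hypothetical single-selection list; this illustrates the pattern, not an Apple-endorsed answer:

import UIKit

final class OptionsViewController: UITableViewController {
    // Hypothetical model: option titles and the currently chosen row.
    private let options = ["Small", "Medium", "Large"]
    private var selectedRow = 0

    override func tableView(_ tableView: UITableView,
                            cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Option",
                                                 for: indexPath)
        cell.textLabel?.text = options[indexPath.row]
        cell.accessoryType = indexPath.row == selectedRow ? .checkmark : .none

        // Toggle only the .selected trait; omitting .button matches the
        // Settings screens that announce just the label plus "selected".
        if indexPath.row == selectedRow {
            cell.accessibilityTraits.insert(.selected)
        } else {
            cell.accessibilityTraits.remove(.selected)
        }
        return cell
    }
}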
1 reply · 0 boosts · 119 views · Apr ’25
AXSpeech Thread Crash SEGV_ACCERR
Hi everyone, I've encountered a rare and strange crash in my app that I can't consistently reproduce. The crash seems to occur deep within Apple's internal frameworks, and I can't pinpoint which line of my own code is causing it. Here's the crash stack trace:

#44 AXSpeech SIGSEGV SEGV_ACCERR
0  CoreFoundation  ___CFCheckCFInfoPACSignature + 4
1  CoreFoundation  _CFRunLoopSourceSignal + 28
2  Foundation      _performQueueDequeue + 492
3  Foundation      ___NSThreadPerformPerform + 88
4  CoreFoundation  ___CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28
5  CoreFoundation  ___CFRunLoopDoSource0 + 176
6  CoreFoundation  ___CFRunLoopDoSources0 + 340
7  CoreFoundation  ___CFRunLoopRun + 828
8  CoreFoundation  _CFRunLoopRunSpecific + 608
9  Foundation      -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
10 TextToSpeech    _TTSCFAttributedStringCreateStringByBracketingAttributeWithString + 776
11 Foundation      ___NSThread__start__ + 732
12 libsystem_pthread.dylib  __pthread_start + 136

Sometimes, instead of frame 10 referencing _TTSCFAttributedStringCreateStringByBracketingAttributeWithString, it shows:

10 TextToSpeech LogWarning(char const*, ...) + 7288

Has anyone experienced a similar issue or know what might be triggering this crash? Any guidance on how to investigate or resolve this would be greatly appreciated. Thank you!
1 reply · 0 boosts · 147 views · Jun ’25
Add custom emoji tags
Allow the user to add their own tags to the default emoji tags. For instance, this emoji, for me, is nonna: 🤌🏻. My efficiency would improve immensely if I could search for it as the “Nonna” emoji, rather than searching for nonna, remembering it doesn’t exist, trying the search for other things it might be called, realising I don’t know what it is, then having to scroll through all the hand emojis twice to find it. 🤌🏻🤞🏼👌
1 reply · 0 boosts · 506 views · Jan ’25
Japanese “Hattori” TTS voice missing from Settings > General > Read & Speak > Voices > Japanese on iOS 26
Steps: Open Settings > General > Read & Speak > Voices > Japanese; “Hattori” is not listed and cannot be downloaded.
Expected: Hattori is available to download and select.
Actual: Hattori is absent from the catalog.
Regression: The voice was available on iOS 18.x on the same device.
0 replies · 0 boosts · 383 views · Sep ’25
Frames rotated and shifted in landscape for iOS simulator
When I try to get the frame of an AXUIElementRef using AXUIElementCopyAttributeValue(element, (CFStringRef)attribute, &result), the frames are shifted and rotated on the iOS Simulator. I get the same frames from the Accessibility Inspector when the Mac is selected as the host; when I switch the host to the iOS Simulator, the frames are correct. How is the Accessibility Inspector getting the correct frames, and how can I do the same in my app?
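For reference, a sketch of the attribute lookup the post describes, reading an element's position and size via the macOS accessibility API. The constants and calls are the standard AX API; how the Simulator maps its coordinate space onto these values is exactly the open question:

import ApplicationServices

// Reads an accessibility element's frame from kAXPositionAttribute and
// kAXSizeAttribute, the usual way to recover a frame via the AX API.
func frame(of element: AXUIElement) -> CGRect? {
    var positionRef: CFTypeRef?
    var sizeRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element, kAXPositionAttribute as CFString, &positionRef) == .success,
          AXUIElementCopyAttributeValue(element, kAXSizeAttribute as CFString, &sizeRef) == .success
    else { return nil }

    var origin = CGPoint.zero
    var size = CGSize.zero
    guard AXValueGetValue(positionRef as! AXValue, .cgPoint, &origin),
          AXValueGetValue(sizeRef as! AXValue, .cgSize, &size)
    else { return nil }

    return CGRect(origin: origin, size: size)
}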
1 reply · 0 boosts · 130 views · Jun ’25
CFPrefsPlistSource Read Error with App Group Preferences
I am encountering the following issue while working with app group preferences in my Safari web extension:

Couldn't read values in CFPrefsPlistSource<0x3034e7f80> (Domain: [MyAppGroup], User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd.

I am trying to read/write shared preferences using UserDefaults with an App Group but keep running into this error. Has anyone encountered this before? How can I properly configure my app group preferences to avoid this issue? Any guidance on how to resolve this would be greatly appreciated!
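For reference, a sketch of the standard app-group pattern, which goes through the current user's group container rather than kCFPreferencesAnyUser. The group identifier is a placeholder; both the app and the extension would need the App Groups entitlement with the same identifier:

import Foundation

// Shared defaults scoped to the app group.
let groupID = "group.com.example.myapp" // placeholder identifier
if let shared = UserDefaults(suiteName: groupID) {
    shared.set(true, forKey: "extensionEnabled")
    let enabled = shared.bool(forKey: "extensionEnabled")
    print("extensionEnabled:", enabled)
}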
0 replies · 0 boosts · 689 views · Feb ’25