We got the response from Apple Pay after the customer completed the Face ID / Touch ID authorization.
But the shipping contact is not complete, for example:
```
{
"addressLines": [
"1************ kwy"
],
"administrativeArea": "FL",
"country": "",
"countryCode": "",
"emailAddress": "S*********le.com",
"familyName": "******i",
"givenName": "******m",
"locality": "*******s",
"phoneNumber": "+*******79",
"phoneticFamilyName": "",
"phoneticGivenName": "",
"postalCode": "*****3",
"subAdministrativeArea": "",
"subLocality": ""
}
```
As the documentation says, this should be the complete shipping contact, but country and countryCode are empty:
https://developer.apple.com/documentation/apple_pay_on_the_web/applepaypayment/1916097-shippingcontact
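For comparison, here is a minimal sketch of the native PassKit analog (the merchant identifier and summary items are placeholders, not a working configuration): the full postal address, including country and countryCode, should only be expected in the authorized payment when .postalAddress is among the required shipping contact fields.

```swift
import PassKit

// Sketch: requesting the full shipping contact, including the postal address.
// All values below are placeholders.
let request = PKPaymentRequest()
request.merchantIdentifier = "merchant.com.example"
request.countryCode = "US"
request.currencyCode = "USD"
request.supportedNetworks = [.visa, .masterCard]
request.merchantCapabilities = .threeDSecure
request.requiredShippingContactFields = [.postalAddress, .name, .emailAddress, .phoneNumber]
request.paymentSummaryItems = [
    PKPaymentSummaryItem(label: "Order", amount: NSDecimalNumber(string: "1.00"))
]
```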
I'm facing a bizarre issue with Apple's Accessibility APIs. I am registering an AXObserver that listens for, among other things, the kAXSelectedTextChangedNotification. For many new users, the kAXSelectedTextChangedNotification is not triggered, even though they have enabled Accessibility permission for the app. Other notifications are getting through (kAXWindowMovedNotification, kAXWindowResizedNotification, kAXValueChangedNotification etc - full list here), just not the kAXSelectedTextChangedNotification!
We've found that we can reproduce the error by removing accessibility permission for the app and rebooting our computers. After restarting and reenabling accessibility permissions, the kAXSelectedTextChangedNotification was not received, even though other notifications were fine.
Strangely, the issue can be resolved by launching Apple's Accessibility Inspector app on an impacted computer. Once the Accessibility Inspector is loaded, the kAXSelectedTextChangedNotifications start coming through as expected. This implies to me that either:
We are missing some needed setup when starting the observers. Accessibility Inspector gets it right, thus ‘starting’ the system properly.
Accessibility Inspector is using some Apple private APIs that we don’t have access to.
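For context, our registration is roughly the following (a simplified sketch; the pid and the callback body are placeholders):

```swift
import ApplicationServices

// Simplified sketch of the observer setup. Assumes the app already has
// Accessibility permission; `pid` is the target application's process ID.
func observeSelectedText(pid: pid_t) {
    let callback: AXObserverCallback = { _, _, notification, _ in
        print("received \(notification)")
    }
    var observer: AXObserver?
    guard AXObserverCreate(pid, callback, &observer) == .success,
          let observer else { return }
    let appElement = AXUIElementCreateApplication(pid)
    AXObserverAddNotification(observer, appElement,
                              kAXSelectedTextChangedNotification as CFString, nil)
    CFRunLoopAddSource(CFRunLoop.getCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
}
```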
Things I’ve tried:
I've tried subscribing to the kAXSelectedTextChangedNotification on different AXUIElements, including the SystemWide element, the Application element, and child elements of the AXApplication. None of these received the kAXSelectedTextChangedNotification until Accessibility Inspector was booted up. No surprises here, as Apple's documentation confirms that you should add the notification to the root Application AXUIElement if you want to receive notifications for all its children.
I had a theory that the issue might be due to my code calling AXUIElementCreateApplication multiple times, possibly creating multiple "Applications" in Apple's Accessibility implementation. If that’s the case, the notifications might be sent to the wrong application AXUIElement. However, refactoring my code to only call AXUIElementCreateApplication once didn't resolve the issue.
I thought the issue might be caused by subscribing to the kAXSelectedTextChangedNotification on the high-level application element (at odds with Apple's documentation). I've tried traversing the child AXUIElements until we find one with the kAXSelectedTextAttribute and then subscribing to that. This did not resolve the issue. I don't think it's the correct path to continue exploring, given that the notifications are received correctly after Accessibility Inspector is launched.
There is one exception to the above: if I add the kAXSelectedTextChangedNotification listener to a specific text field AXUIElement, I do receive the notification on that text field. However, this is not practical; I need a solution that will work for all text fields within an app. The Accessibility Inspector appears to be doing something that causes the selected-text-changed notifications to be correctly passed up to the high-level application AXUIElement.
Another thought is that I could traverse the entire Accessibility hierarchy and add listeners to every subview that has the kAXSelectedTextAttribute. However, I don't like this as a long-term solution. It will be slow and incomplete: new elements get added and removed frequently. I just want the kAXSelectedTextChangedNotification to be received by the high-level Application AXUIElement, which the documentation suggests it should be. I also have evidence that this can work, since notifications start coming through after Accessibility Inspector is launched. It's just a matter of discovering how to replicate whatever Accessibility Inspector is doing.
An interesting wrinkle: I implemented the 'traverse' strategy above, but was surprised by how few elements were in the hierarchy. Most apps only go down ~2-3 levels, which didn't seem right to me. Perhaps the Accessibility tree isn't fully initialized? I tried adding a 5-second delay to allow more initialization time, but it didn't change anything.
Does anyone have any ideas? Here's our file.
Hi, I'm planning to build a macOS app and distribute it through the Mac App Store.
My question is whether I can force an update when one is needed.
The reason I want this feature is that I don't want users to keep using a previous version of the app.
My plan is like this:
When the app needs an update, send the user to a special page that explains why the update is required, with a button to download the new version (see the sketch below for the version check).
The download would happen automatically in the background, with no need to visit the App Store.
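A minimal sketch of the version-check half of this plan, assuming the public iTunes lookup endpoint (the app ID is a placeholder); the silent background download itself is the part I have found no support for:

```swift
import Foundation

// Sketch: fetch the latest Mac App Store version via the public iTunes
// lookup endpoint, then gate the UI if the installed version is older.
func fetchStoreVersion(appID: String) async throws -> String? {
    let url = URL(string: "https://itunes.apple.com/lookup?id=\(appID)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let results = json?["results"] as? [[String: Any]]
    return results?.first?["version"] as? String
}

// Usage: compare against the installed version from the bundle.
// let installed = Bundle.main.infoDictionary?["CFBundleShortVersionString"] as? String
```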
I've searched several forums and asked GPT, but there were no positive replies about this,
so I'm finally posting to ask whether there is really no way to do it.
Thank you!
Topic: Accessibility & Inclusion
SubTopic: General
Before I start: this could just be me and a handful of people, but I like to reorganize my phone screen to my needs based on what's going on in life. I was just thinking it would be easier if you could get rid of all the folders at once and then reorganize, or have something easier than the long, extensive process it is now.
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" on Apple Community.
VoiceOver also breaks the combobox from the official ARIA W3C website https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VO is turned off, I can use the up/down arrow to go through the menu items from the dropdown, but when VO is turned on, the up/down arrows cannot access the dropdown menu items.
Is there an official tutorial on how to control it using VoiceOver?
Kind regards,
Jakub
Topic: Accessibility & Inclusion
SubTopic: General
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and objects wouldn't detect surfaces correctly. I updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out.
Can someone help me fix or diagnose the issue?
Thank you.
I'm encountering an issue related to BLE device discovery on iOS.
I have a BLE peripheral device that I initially connected to using an iOS device. After this connection, the BLE device's advertised name was programmatically changed by the peripheral. Now, when I try to scan for this device using other iOS devices, it does not appear in the scan results in most apps — including nRF Connect and our own custom BLE app that uses CoreBluetooth.
A few observations:
The device is definitely powered on and advertising (confirmed via Android).
The name change is reflected correctly on Android and on the iOS device that originally connected to it.
Other iOS devices no longer see the device in their scan list.
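For reference, a minimal CoreBluetooth scan sketch that logs both the cached peripheral.name and the advertised local name; the two can diverge after a rename, since iOS may keep serving an old cached GAP name:

```swift
import CoreBluetooth

// Sketch: log the name iOS has cached for a peripheral alongside the name
// currently being advertised, to see whether the rename is the problem.
final class Scanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        let advertised = advertisementData[CBAdvertisementDataLocalNameKey] as? String
        print("cached: \(peripheral.name ?? "-"), advertised: \(advertised ?? "-")")
    }
}
```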
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:
```swift
ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
    Button {
        navigationCallback(paymentTool.segueID)
    } label: {
        PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
            .accessibilityElement()
            .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
    }

    if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
        Divider()
    }
}
```
So you can see the accessibility ID is there, and it shows up properly when I open up Accessibility Inspector with the simulator, but the testing script isn't picking up on it, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one.
If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
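One thing worth trying (a guess, not a confirmed fix): move the identifier onto the Button itself and combine the row's children, so the automation tree exposes a single element carrying the ID:

```swift
// Variation on the loop body above; .accessibilityElement(children: .combine)
// and the identifier now apply to the Button rather than its label.
Button {
    navigationCallback(paymentTool.segueID)
} label: {
    PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
}
.accessibilityElement(children: .combine)
.accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
```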
I have more than 1000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders I can no longer share a note: the note itself is not shared; it is the parent folder's note that gets shared instead.
Thank you very much.
Best regards,
Christophe
Topic: Accessibility & Inclusion
SubTopic: General
I have users who need to be able to hear the content of SwiftUI Text views. I have specified the .textSelection(.enabled) modifier for the text views. Adding this modifier causes a "copy" option to appear on long press, but it doesn't enable the visible selection of text, nor does it provide the "Speak" menu item that UIKit allows on text selection.
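For reference, the usage in question is nothing more than this minimal snippet:

```swift
import SwiftUI

struct SpeakableText: View {
    var body: some View {
        Text("Hello, world")
            .textSelection(.enabled) // yields a Copy option on long press, but no visible selection or Speak item
    }
}
```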
Is the "Speak Selection" accessibility feature broken for SwiftUI Text views? I've found that there's another accessibility feature that does work (enabling the Speech Controller button for "Speak Screen"). Do I need to tell my users that Apple is deprecating the "Speak Selection" accessibility feature, and that they need to use the Speech Controller instead? Or is there something else I can do to my SwiftUI to get that feature to work?
I am testing the accessibility feature available in the Settings app called "Speak Screen". The help text in the Setting app states that swiping down with two fingers will cause the screen content to be spoken. However, I've been unable to get this feature to work. Every time I try the double finger swipe down, it behaves the same as the single finger swipe down gesture. Usually this manifests as making scroll views bounce.
I've tried toggling the feature on and off, turning off Reachability, and rebooting my phone, but I can't get the Speak Screen gesture to work. If I access the Speak Screen feature from the Speech Controller button, then the screen's content is spoken, as expected, so I know the feature is enabled. It's just the gesture that doesn't work.
Is there something else I need to do to get this gesture to work? I don't want to tell my users to turn this feature on if I can't verify that the gesture will work with my app.
Hello, my submission is based on haptics. Without them the app doesn't make sense, and only a real iPhone can provide them. But it says that Xcode playgrounds will be tested on the Simulator.
Is that really the case? What can I do?
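In case it helps, a capability check like the following (using CHHapticEngine's published API) at least lets the experience degrade gracefully where haptics are unavailable, such as the Simulator:

```swift
import CoreHaptics

// Sketch: detect haptics support so the app can fall back to visual or
// audio feedback on hardware (or the Simulator) without a Taptic Engine.
let supportsHaptics = CHHapticEngine.capabilitiesForHardware().supportsHaptics
if !supportsHaptics {
    // e.g. drive the same feedback with an animation or a sound instead
}
```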
Thank you in advance!
Hello,
I am a student studying accessibility.
I aim to analyze the smartphone usage patterns of visually impaired individuals.
Therefore, I would like to log the VoiceOver usage records of visually impaired iPhone users.
Is there a way to output VoiceOver logs, similar to the AccessibilityService API on Android?
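The closest public hook I'm aware of is UIAccessibility's VoiceOver status notification, which reports only on/off changes rather than a full event log; a minimal sketch:

```swift
import UIKit

// Sketch: iOS exposes VoiceOver *status*, not an event stream like
// Android's AccessibilityService. This logs only on/off transitions.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.voiceOverStatusDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    print("VoiceOver running: \(UIAccessibility.isVoiceOverRunning)")
}
```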
Thank you in advance for your responses.
Please excuse me if this is obvious. I'm new to Apple development.
Is there a SwiftUI Accessibility Inspector? I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that "SwiftUI" is primarily implemented using macros and that these things are either generated or boilerplate lower-level things. But if so, then why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them?
Or... is there a demangler somewhere that will translate from these names into something this human might recognize?
I'm targeting macOS, by the way, if that makes any difference.
Topic: Accessibility & Inclusion
SubTopic: General
I am trying to implement VoiceOver in my game, and I have encountered an issue where a static text node takes focus away from all other interactions. I have a tutorial scene with one short "static text" accessibility node, but the rest of the gameplay has none. This static text field occupies a small part of the screen and works fine, but I am not able to click on anything else, including any of my gameplay elements; wherever on the screen I click, it just re-highlights that static text.
Is there a requirement for all elements to use accessibility nodes, or can I have a mixed setup where some don't have them?
How can I get around it?
Question number 2: What decides which accessibility node gets selected when entering a new UI screen? I have multiple buttons and am observing rather random behaviour; every time, a different button is highlighted first.
Question number 3: The plugin documentation mentions runtime support in Play mode. Are there any specific steps needed to make this work? I can't seem to manage it. I have VoiceOver enabled on macOS, and Unity is on the macOS platform (I also tried iOS), but it doesn't seem to do anything. Note that my button and label accessibility nodes work correctly in an iOS build.
Thanks in advance for any help
I added a view controller in the storyboard, added a table view to that view, and added a cell to the table. When I run the app, navigate to that page, and use VoiceOver, I find that swiping up or down with three fingers announces a sentence in English. Without changing the cell's accessibility, how can I get the announcement for the three-finger swipe up/down to be spoken in Chinese?
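If that English sentence is VoiceOver's scroll status announcement, one possible hook is UIScrollViewAccessibilityDelegate (a sketch, assuming the view controller is the table view's delegate):

```swift
import UIKit

// Sketch: supply the status VoiceOver announces after a three-finger scroll.
// Returning a localized string here replaces the default announcement.
class PageViewController: UIViewController, UITableViewDelegate, UIScrollViewAccessibilityDelegate {
    func accessibilityScrollStatus(for scrollView: UIScrollView) -> String? {
        // e.g. a Chinese page status via NSLocalizedString
        return NSLocalizedString("scroll.status", comment: "spoken after 3-finger scroll")
    }
}
```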
Is there any way to get history?
Topic: Accessibility & Inclusion
SubTopic: General
I have been working to remediate PDFs for a client. The documents/forms have many tables. When I correctly tag a table, using Foxit Editor Pro, it works beautifully on a PC reading it with NVDA. On Mac using VoiceOver the table isn't accessible. It doesn't matter if I try to read it in Adobe Acrobat, Foxit, or Preview. The reader often says the document is empty, omits column headers, and/or associates the wrong header with the column data.
The documents have essentially the same coding behind them as for the web. Why is it that they perform so well on a PC with NVDA, but so poorly with VoiceOver on a Mac? I am a Quality Assurance Specialist; I review websites, apps, and documents for accessibility. Why can't I do my job using only my Mac system?
As a Mac user, it frustrates me that I can't use my preferred system for checking documents to see if they are accessible because VoiceOver doesn't work well. I actually have to recommend to my clients and their customers that they need to use a PC with NVDA or Jaws for these documents to be able to get all the information. Unfortunately, most people aren't able to have, or maintain, both systems. Overall, Mac products are very high quality. This, and other issues with VoiceOver, seems to be a large gap in Apple's offerings and functionality.
I would appreciate a human response to the original email I sent about this on 7/30/2025.
Topic: Accessibility & Inclusion
SubTopic: General
Hello community,
We're designing an app that can optionally be controlled by a stylus with a mesh tip. In this case, the mesh tip we're using is 5 mm in diameter. It seems that mesh tip contact detection is unstable in this size, although it works better with a larger diameter.
Is it possible to access a setting in iOS that lets you define the minimum contact area needed to detect a contact on the screen? This would enable us to use this 5 mm stylus.
Best regards,
Edwin
Topic: Accessibility & Inclusion
SubTopic: General
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.
Issue details:
My iPhone is set to Region: Chile and Language: Spanish (Chile).
In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
A notification with the text:
```swift
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
```
is read aloud by Siri as:
"¡Has recibido un pago por 5.000 dólares!"
(English: "You have received a payment of five thousand dollars!")
instead of the correct:
"¡Has recibido un pago por 5.000 pesos!"
(English: "You have received a payment of five thousand pesos!")
Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177
This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
Apple Intelligence interactions are also affected—for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.
I have submitted a bug report via Feedback Assistant, and the Feedback ID is FB16561348.
This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars.
Has anyone found a workaround, or is there any update from Apple on this?
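The only app-side mitigation I can offer, as a guess rather than a fix: avoid a bare $ in notification text and spell the currency out, so there is nothing ambiguous for TTS to expand.

```swift
import UserNotifications

// Hypothetical workaround: spell out the currency instead of using "$",
// so Siri has no ambiguous symbol to read as "dólares".
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por 5.000 pesos chilenos!"
```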
Topic: Accessibility & Inclusion
SubTopic: General
Tags: Siri and Voice, User Notifications, Localization, Apple Intelligence