Hello,
I added a new In-App Purchase to my app. It was approved on the 2nd of October, but now, on the 7th of October, I still cannot see it in the list of products coming from the Store.
I already have 2 subscriptions and 1 In-App Purchase in my app, but the new In-App Purchase still isn't returned by the store among the available products. What could cause this?
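For context, a minimal sketch of the kind of product request involved, assuming StoreKit 2 is used (the identifiers below are placeholders):
import StoreKit

// Hypothetical sketch: fetch the configured products; IDs are placeholders.
func loadProducts() async throws -> [Product] {
    let identifiers = ["com.example.sub.monthly",
                       "com.example.sub.yearly",
                       "com.example.iap.existing",
                       "com.example.iap.new"]        // the newly approved one
    // Products missing from the result are the ones the store does not return.
    return try await Product.products(for: identifiers)
}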
Hello, we completed an Apple store setup, but the invoice for the payment we made has not arrived yet. Could you give me some information on where I can access the invoice?
Topic:
Accessibility & Inclusion
SubTopic:
General
I have an application that binds a menu item to trigger on ⌘]. When I set the US input source, I press ⌘] in order to trigger that item. However, when I switch the input source to QWERTZ (German), the trigger changes to ⌘Ä automatically by the OS. It seems to translate keystrokes for different input sources.
The problem is that I also render the keybindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other key shortcuts in my application which are not bound to menu items, and I want to make sure those get translated too.
Does AppKit expose a way to look up what a keystroke will become when macOS translates it, i.e. to look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs.
I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't produce the same result, because it ends up converting ] to + instead, which seems to be based on physical key location and therefore different from how the key bindings are handled.
Update: I notice similar behavior in VS Code's menu bar, e.g. in their "Terminal" menu. Switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because their menu items are specified in a different way: xib files with hard-coded key equivalents.
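For reference, a minimal sketch of how the menu item in question is set up (the title and action are placeholders). I also came across a macOS 12+ per-item property that appears related to this automatic localization, though I haven't confirmed it is the mechanism at play:
import AppKit

// Sketch of the menu item binding described above (title and selector are placeholders).
let item = NSMenuItem(title: "Next Pane",
                      action: Selector(("nextPane:")),   // hypothetical action
                      keyEquivalent: "]")
item.keyEquivalentModifierMask = [.command]

// Assumption: this macOS 12+ property governs the automatic re-mapping
// (⌘] becoming ⌘Ä under QWERTZ); setting it to false should keep ⌘] as-is.
if #available(macOS 12.0, *) {
    item.allowsAutomaticKeyEquivalentLocalization = false
}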
I have a TextField and entered, for example, "sg?!". On the TextField I set the modifier speechAlwaysIncludesPunctuation(). But when I activate VoiceOver, the content of the TextField is read out, yet the special characters are not read.
How can I fix this?
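For context, a minimal sketch of the setup described above (the label and state are placeholders):
import SwiftUI

// Minimal sketch of the setup described above (label and state are placeholders).
struct PunctuationFieldView: View {
    @State private var text = "sg?!"

    var body: some View {
        TextField("Input", text: $text)
            // Expectation: VoiceOver should also speak the "?" and "!".
            .speechAlwaysIncludesPunctuation()
    }
}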
I have an HTML select that has Spanish text in the options.
When VoiceOver reads the selected option (unopened), it switches to Spanish as expected.
However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text.
I have tried adding lang to the select tag and to the option tag, but neither helps.
https://codepen.io/grahamfowles/pen/VYYRxMK
Topic:
Accessibility & Inclusion
SubTopic:
General
I would like to enable the option to resize windows with the Apple Pencil Pro. I tried, but I see that this feature is not enabled.
Topic:
Accessibility & Inclusion
SubTopic:
General
I am trying to implement VoiceOver in my game, and have encountered an issue where a static text takes focus away from all other interactions. I have a tutorial scene with one short "static text" accessibility node, but the rest of the gameplay has no such nodes. This static text field occupies a small part of the screen and works fine, but I am not able to click on anything else, including any of my gameplay elements; wherever on the screen I click, it just re-highlights that static text.
Is there a requirement for all elements to use Accessibility Nodes, or can I have a mixed setup where some elements don't have them?
How can I get around this?
Question number 2: What decides which Accessibility Node gets selected when entering a new UI screen? I have multiple buttons and am observing rather random behaviour; every time a different button is highlighted first.
Question number 3: The plugin documentation mentions runtime support in play mode. Are there any specific steps for this to work? I can't seem to get it working: I have VoiceOver enabled on macOS and Unity is set to the macOS platform (I also tried iOS), but it doesn't seem to do anything. Note that my button and label accessibility nodes work correctly in an iOS build.
Thanks in advance for any help
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar.
I’ve noticed that this indicator is not consistent:
For some windows, the badge reliably appears when they are being shared.
For other windows, the badge never appears, even though the window is actively shared.
In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not.
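For reference, a rough sketch of the two kinds of windows being compared (frames and style flags are arbitrary examples):
import AppKit

// Rough sketch of the two window configurations being compared (frames are arbitrary).
let rect = NSRect(x: 0, y: 0, width: 400, height: 300)

// Standard system title bar: the sharing badge seems to appear reliably here.
let titledWindow = NSWindow(contentRect: rect,
                            styleMask: [.titled, .closable, .resizable],
                            backing: .buffered,
                            defer: false)

// Custom / non-standard chrome: the badge never seems to appear for this one.
let borderlessWindow = NSWindow(contentRect: rect,
                                styleMask: [.borderless],
                                backing: .buffered,
                                defer: false)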
My questions are:
What are the exact conditions under which macOS decides to draw the “shared window” indicator in a window’s title bar?
Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs borderless)?
Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi,
On iOS, I'd like to mark views that are inside a LazyVStack as headers for VoiceOver (make them appear in the headings rotor).
In a VStack, you just have to add .accessibilityAddTraits(.isHeader) to your header view. However, if your view is in a LazyVStack, that won't work if the view is not visible. As its name implies, LazyVStack is lazy, so that makes sense.
There is very little information online about system rotors, but it seems you are supposed to use .accessibilityRotor() with the headings system rotor (.accessibilityRotor(.headings)) outside of the LazyVStack. Something like the following.
.accessibilityRotor(.headings) {
    ForEach(entries) { entry in
        // entry.id must be the same as the id of the SwiftUI view it is about
        AccessibilityRotorEntry(entry.name, id: entry.id)
    }
}
It kind of works, but only kind of. When using .accessibilityAddTraits(.isHeader) in a VStack, the view is in the headings rotor as soon as you change screens. However, when using .accessibilityRotor(.headings), the headers (headings?) are not in the headings rotor at the time the screen appears. You have to move the accessibility focus inside the screen before your headers show up.
I'm a beginner with regard to VoiceOver, so I don't know how a blind user used to VoiceOver would perceive this, but it feels to me that having to move the focus before the headers are in the headings rotor means some users would miss them.
So my question is: is there a way to have headers that are inside a LazyVStack (and not necessarily visible at first) appear in the headings rotor as soon as the screen appears (be it using .accessibilityRotor(.headings) or anything else)?
The "SwiftUI Accessibility: Beyond the basics" talk from WWDC 2021 mentions custom rotors, not system rotors, but that should be close enough. It mentions that for accessibilityRotor to work properly it has to be applied on an accessibility container, so just in case I tried to move my .accessibilityRotor(.headings) to multiple places, with and without the accessibilityElement(children: .contain) modifier, but that did not seem to change the behavior (and I could not understand why accessibilityRotor could not automatically make the view it is applied on an accessibility container if needed).
Also, a related question: when using .accessibilityRotor(.headings) on a screen, is it fine to mix several uses of .accessibilityRotor(.headings), declared in different places? In a screen with multiple types of content (something like ScrollView { VStack { MyHeader(); LazyVStack { /* some content */ }; LazyVStack { /* something else */ } } }), having to declare all headers in one place would make code reusability harder.
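To make the layout concrete, here is a rough sketch of the structure described above, with all rotor entries declared in one place (Entry, MyHeader and EntryRow are placeholder types, not my real code):
import SwiftUI

// Placeholder model and views, just so the sketch is self-contained.
struct Entry: Identifiable { let id = UUID(); let name: String }
struct MyHeader: View { var body: some View { Text("Overview").font(.title) } }
struct EntryRow: View { let entry: Entry; var body: some View { Text(entry.name) } }

struct EntriesScreen: View {
    let entries: [Entry]

    var body: some View {
        ScrollView {
            VStack {
                MyHeader()
                    .accessibilityAddTraits(.isHeader)   // header in a plain VStack
                LazyVStack {
                    ForEach(entries) { entry in
                        EntryRow(entry: entry)           // lazily created content
                    }
                }
                // A second LazyVStack with different content would go here.
            }
        }
        // Rotor declared in one place, outside the LazyVStack, so that
        // entries that have not been created yet can still be listed.
        .accessibilityRotor(.headings) {
            ForEach(entries) { entry in
                AccessibilityRotorEntry(entry.name, id: entry.id)
            }
        }
    }
}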
Thanks
I am invoking the UIImagePickerController of type UIImagePickerControllerSourceTypePhotoLibrary from my view controller. I want to shift the keyboard focus to the Cancel button, which is the first interactive element in the gallery picker. When a user has Full Keyboard Access turned on, they should be able to press Tab and interact with the gallery picker modal. How do I achieve this?
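For reference, a minimal sketch of how the picker is presented (assuming a plain UIKit view controller that conforms to the picker's delegate protocols):
import UIKit

// Minimal sketch of presenting the photo-library picker from a view controller.
final class GalleryViewController: UIViewController,
                                   UIImagePickerControllerDelegate,
                                   UINavigationControllerDelegate {

    func presentPhotoLibrary() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
        // Goal: with Full Keyboard Access on, focus should be movable to the
        // picker's Cancel button via the Tab key.
    }
}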
I upgraded to iPadOS 26.1 beta 2 yesterday and suddenly I can't use screen sharing in apps like GNeet, Discord, Zoom, etc.
Topic:
Accessibility & Inclusion
SubTopic:
General
I'd like to add borders to all buttons in the iOS Simulator from my Mac app. First I get the simulator window. Then I access the children of all AXGroups, and if a child is a button or a static text, I add a border.
But for some buttons this does not work. In the example image the navigation bar buttons are not found. I guess the problem is that for some AXGroups the children array accessed via AXChildren is empty.
Here is some relevant code:
- (NSArray<DDHOverlayElement *> *)overlayChildrenOfUIElement:(AXUIElementRef)element index:(NSInteger)index {
    NSMutableArray<DDHOverlayElement *> *tempOverlayElements = [[NSMutableArray alloc] init];

    NSLog(@">>> -----------------------------------------------------");
    NSString *role = [UIElementUtilities roleOfUIElement:element];
    NSRect frame = [UIElementUtilities frameOfUIElement:element];
    NSLog(@"%@, role: %@, %@", element, role, [NSValue valueWithRect:frame]);
    NSArray *lineage = [UIElementUtilities lineageOfUIElement:element];
    NSLog(@"lineage: %@", lineage);

    NSArray<NSValue *> *children = [UIElementUtilities childrenOfUIElement:element];
    if (children.count < 1) {
        NSLog(@"NO CHILDREN");
    }
    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"----%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
    }
    NSLog(@"<<< -----------------------------------------------------");

    for (NSInteger i = 0; i < [children count]; i++) {
        NSValue *child = children[i];
        AXUIElementRef uiElement = (__bridge AXUIElementRef)child;
        NSString *role = [UIElementUtilities roleOfUIElement:uiElement];
        NSRect frame = [UIElementUtilities frameOfUIElement:uiElement];
        NSLog(@"%@, role: %@, %@", child, role, [NSValue valueWithRect:frame]);
        if ([role isEqualToString:@"AXButton"] ||
            [role isEqualToString:@"AXTextField"] ||
            [role isEqualToString:@"AXStaticText"]) {
            NSString *tag = [NSString stringWithFormat:@"%ld%ld", (long)index, (long)i];
            NSLog(@"tag: %@", tag);
            DDHOverlayElement *overlayElement = [[DDHOverlayElement alloc] initWithUIElementValue:child tag:tag];
            [tempOverlayElements addObject:overlayElement];
        } else if ([role isEqualToString:@"AXGroup"] ||
                   [role isEqualToString:@"AXToolbar"]) {
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:++index]];
        } else if ([role isEqualToString:@"AXWindow"]) {
            [self.overlayWindowController setFrame:[UIElementUtilities frameOfUIElement:uiElement]];
            [tempOverlayElements addObjectsFromArray:[self overlayChildrenOfUIElement:uiElement index:index]];
        }
    }
    return [tempOverlayElements copy];
}
For some AXGroups the children are found; for others they are empty. I cannot figure out why.
Does anyone have an idea what I'm doing wrong?
Topic:
Accessibility & Inclusion
SubTopic:
General
We get the response from Apple Pay after the customer completes the Face ID / Touch ID authorization.
But the shipping contact is not complete, for example:
{
    "addressLines": [
        "1************ kwy"
    ],
    "administrativeArea": "FL",
    "country": "",
    "countryCode": "",
    "emailAddress": "S*********le.com",
    "familyName": "******i",
    "givenName": "******m",
    "locality": "*******s",
    "phoneNumber": "+*******79",
    "phoneticFamilyName": "",
    "phoneticGivenName": "",
    "postalCode": "*****3",
    "subAdministrativeArea": "",
    "subLocality": ""
}
As the documentation says, it should be the complete shipping contact, but country and countryCode are empty:
https://developer.apple.com/documentation/apple_pay_on_the_web/applepaypayment/1916097-shippingcontact
Hi guys, I'm facing an issue with the native interface to add a card into the wallet - does someone have some ideas on how to fix/work around that?
STEPS TO REPRODUCE:
Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
Connect and confirm that you can navigate other iOS interfaces using an external keyboard.
In any app, present a PKAddPassesViewController with a valid .pkpass file (a minimal sketch of this step follows the list).
When the Wallet “Add Pass” sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.
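A minimal sketch of step 3 above (the pass file name is a placeholder):
import PassKit
import UIKit

// Minimal sketch of step 3: present the Wallet "Add Pass" sheet for a bundled pass.
func presentAddPassSheet(from presenter: UIViewController) {
    guard let url = Bundle.main.url(forResource: "SamplePass", withExtension: "pkpass"),
          let data = try? Data(contentsOf: url),
          let pass = try? PKPass(data: data),
          let controller = PKAddPassesViewController(pass: pass) else {
        return
    }
    presenter.present(controller, animated: true)
}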
EXPECTED RESULT:
All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.
ACTUAL RESULT:
Keyboard navigation is not possible.
No elements receive focus.
Users cannot activate Cancel or Add buttons using keyboard input.
The only way to interact is by touch or enabling VoiceOver, which does not satisfy keyboard accessibility requirements.
IMPACT:
Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible).
Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet.
Affects users of external keyboards who rely on tab/arrow navigation.
Creates an inconsistent accessibility experience compared to other iOS system modals.
Issue:
When using the shortcut Command + Delete to clear a line of text, the next character I type in Thai unexpectedly appears as an English character, even though the input source is still set to Thai. After that, subsequent characters return to Thai as expected.
Details:
Affected apps: Notes, Messages, and some other native apps
Not affected: Browser text fields (Safari, Chrome, etc.)
Does not occur when using Option + Delete or just Delete
macOS [insert beta version + build number]
Mac model: [insert model]
Input sources: Thai – Kedmanee, English – U.S.
Steps to reproduce:
Open Notes (or Messages).
Switch to Thai input.
Type a few Thai words.
Press Command + Delete.
Type again — the first character shows up in English.
Expected:
First character should remain in Thai, consistent with the active input source.
Actual:
First character shows as English, then input switches back to Thai.
Topic:
Accessibility & Inclusion
SubTopic:
General
Is there any way to get history?
Topic:
Accessibility & Inclusion
SubTopic:
General
"Double-tap three fingers and drag to change zoom" should suppress "Three Finger Drag". Currently these gestures are triggered simultaneously, for no real reason. I saw different behaviors in different environments, but none is desired.
Current and desired behavior:
This seems like an issue, so I filed a feedback.
As part of our Apple Pay implementation we are trying to create a merchant session by connecting to the Apple endpoint https://apple-pay-gateway-cert.apple.com/paymentservices/startSession.
While trying to do so we are facing the error "An error occurred while sending the request. The request was aborted: Could not create SSL/TLS secure channel."
I call the validation URL by passing it to a C# .NET Framework 4.8 Web API. The API sets up an HttpClient with the Merchant Identity Validation Certificate found in my Apple account and calls the validation URL, passing in the required JSON validation object. When I call PostAsync() I get an exception with the above error message.
The code works successfully on my local machine, but we face this issue when it is deployed to the Dev / Model environments for testing.
We use Azure App Service for deployment, and TLS version 1.2 is already enabled there.
We have used the Merchant Identity certificate that was issued and have also checked with the networking and infrastructure teams to make sure it's not an issue on our side.
Does anyone have any other idea what could be causing this error?
Thank you,
Supriya
Topic:
Accessibility & Inclusion
SubTopic:
General
How can I get my NFC-based app approved quickly on the iOS App Store?
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi!
I'm working on an application where I'd like VoiceOver to give each element of a tab bar the "Tab" trait. I'm testing this using the Accessibility Inspector. Essentially, I'd like to replicate how Safari identifies each of its tabs as a "Tab" (I've attached a photo below).
How exactly is this accomplished? I've tried using the .isTabBar trait to designate the child objects as "Tabs", but this doesn't seem to work, and I've struggled to find documentation about it. For additional context, these child items are Buttons, and I would essentially like the .isButton trait to be replaced by something like an .isTab trait. I'm not sure if this is actually possible, but I'm curious how the Accessibility Inspector recognizes this in Safari.
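For context, a rough sketch of what I've tried so far (labels and actions are placeholders; grouping the children into a container element is my own assumption, not a confirmed requirement):
import SwiftUI

// Rough sketch of the current attempt (labels/actions are placeholders).
struct CustomTabBar: View {
    var body: some View {
        HStack {
            Button("Home") { /* switch to Home */ }
            Button("Search") { /* switch to Search */ }
        }
        // Assumption: group the buttons and tag the group with .isTabBar,
        // hoping each child is then announced as a "Tab" like Safari's tabs.
        .accessibilityElement(children: .contain)
        .accessibilityAddTraits(.isTabBar)
    }
}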