Hello, we encountered a 403 error while accessing AASA.
> curl -i 'https://app-site-association.cdn-apple.com/a/v1/finture.id'
HTTP/1.1 404 Not Found
Content-Type: text/plain; charset=utf-8
Content-Length: 10
Connection: keep-alive
Server: nginx
Date: Fri, 28 Feb 2025 03:17:02 GMT
Expires: Fri, 28 Feb 2025 03:17:12 GMT
Age: 1122
Apple-Failure-Details: {"status":"403 Forbidden"}
Apple-Failure-Reason: SWCERR00101 Bad HTTP Response: 403 Forbidden
Apple-From: https://finture.id/.well-known/apple-app-site-association
Apple-Try-Direct: false
Via: https/1.1 jptyo12-3p-pst-007.ts.apple.com (acdn/14454.1), http/1.1 jptyo12-3p-pac-027.ts.apple.com (acdn/14454.1), https/1.1 jptyo12-3p-pfe-014.ts.apple.com (acdn/14454.1)
X-Cache: MISS KS-CLOUD
CDNUUID: 51e5b30b-1f3c-4778-bb6f-cff5447ad763-1988011596
x-link-via: ntct03:443;xianymp018:443;gzct61:443;xg36:443;
x-b2f-cs-cache: no-cache
X-Cache-Status: MISS from KS-CLOUD-XG-FOREIGN-36-07
X-Cache-Status: MISS from KS-CLOUD-GZ-CT-61-05
X-Cache-Status: MISS from KS-CLOUD-XIANY-MP-018-25
X-Cache-Status: MISS from KS-CLOUD-NT-CT-03-03
X-KSC-Request-ID: f1f2bf47e4b7e7b93596bbe7d60b1583
CDN-Server: KSFTF
X-Cdn-Request-ID: f1f2bf47e4b7e7b93596bbe7d60b1583
Not Found
But we can access https://finture.id/.well-known/apple-app-site-association directly in a browser.
How should we solve this? Thank you.
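In case it helps anyone reproducing this: the Apple-Failure-Reason header indicates the origin returned 403 Forbidden to Apple's CDN even though the file loads in a browser, which usually points to a WAF/CDN rule on the origin filtering non-browser requests. A hedged diagnostic sketch (the function name and User-Agent string are made up) to test that from code:
import Foundation

// Hypothetical diagnostic: fetch the AASA file the way a non-browser client
// would (custom User-Agent, no cookies) to see whether a WAF/CDN rule on the
// origin returns 403 to such requests, as the headers above suggest happened
// to Apple's CDN fetch.
func checkAASA() async throws {
    var request = URLRequest(url: URL(string: "https://finture.id/.well-known/apple-app-site-association")!)
    request.setValue("AASA-Diagnostic/1.0", forHTTPHeaderField: "User-Agent")

    let (data, response) = try await URLSession.shared.data(for: request)
    if let http = response as? HTTPURLResponse {
        print("Status:", http.statusCode)
        print("Content-Type:", http.value(forHTTPHeaderField: "Content-Type") ?? "none")
    }
    print(String(data: data, encoding: .utf8) ?? "<non-UTF8 body>")
}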
Hello,
I would like to ask whether there is any possibility of invoking Apple Sysdiagnose via an API call. I cannot find any API reference for Sysdiagnose.
I am only aware of the manual invocation: https://it-training.apple.com/tutorials/support/sup075/
However, this is pretty annoying, since reproducing the bug we are hunting takes several hours, so I am looking for a way to invoke Sysdiagnose from our code.
Is there any way I can show a popover tip on a tabItem inside a TabView?
TabView(selection: $selected) {
    Group {
        HomeView()
            .tabItem {
                Label {
                    Text("Home")
                } icon: {
                    Image(selected == 1 ? "home-icon" : "home-unselect")
                }
                // show tip over the Home icon
            }
            .tag(1)
    }
}
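As a hedged sketch only (assuming iOS 17+ and TipKit, with HomeTip and the stub HomeView as placeholders): views inside .tabItem are limited to Text, Image, and Label and generally ignore other modifiers, so one workaround is to anchor the popover tip to the tab's content rather than to the tab bar item itself:
import SwiftUI
import TipKit

// Illustrative tip; the name and wording are placeholders.
struct HomeTip: Tip {
    var title: Text { Text("This is Home") }
    var message: Text? { Text("Tap here to get back to your dashboard.") }
}

// Stand-in for the real HomeView from the snippet above.
struct HomeView: View {
    var body: some View { Text("Home") }
}

struct ContentView: View {
    @State private var selected = 1
    private let homeTip = HomeTip()

    var body: some View {
        TabView(selection: $selected) {
            HomeView()
                // Anchor the tip to the tab's content; views inside .tabItem
                // are limited to Text/Image/Label and ignore other modifiers.
                .popoverTip(homeTip, arrowEdge: .bottom)
                .tabItem {
                    Label("Home", image: selected == 1 ? "home-icon" : "home-unselect")
                }
                .tag(1)
        }
        .task {
            // Tips must be configured once per launch before any tip can show.
            try? Tips.configure()
        }
    }
}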
We are developing a parental-control app that needs to block a large number of apps. The child's phone has more than 200 apps installed that the parent needs to block and disable, but the parent cannot block more than 50 apps. Is there any option to block all 200 apps on the child's phone?
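Not an official answer, but since the ceiling being hit here is on individual application tokens per shield, one hedged alternative is to shield whole activity categories instead of listing apps one by one; the function name and the allowedApps placeholder below are illustrative, and whether category-level blocking fits a policy of exactly 200 hand-picked apps is a separate question:
import FamilyControls
import ManagedSettings

// Sketch only: shield by category rather than per application, so the
// per-shield cap on individual app tokens is not the limiting factor.
func applyCategoryShield(using selection: FamilyActivitySelection) {
    let store = ManagedSettingsStore()

    // Block every app in the categories the parent selected, with no exceptions.
    store.shield.applicationCategories = ShieldSettings.ActivityCategoryPolicy.specific(selection.categoryTokens, except: Set())

    // Or, to block everything except a small allow-list of apps
    // (allowedApps: Set<ApplicationToken> is a hypothetical placeholder):
    // store.shield.applicationCategories = ShieldSettings.ActivityCategoryPolicy.all(except: allowedApps)
}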
The example database/server provided by Apple for Live Caller ID contains a hardcoded database with a tiny number of pre-defined numbers.
However, it's not expected to be representative of a live real-world server.
But the question is: how can that be accomplished if it's a requirement that the data be KPIR-encrypted?
In real-world scenarios, the factors that affect whether a number should be blocked are continually changing on a minute-by-minute basis, as new information becomes available or existing information changes.
If the database supports tens or hundreds of millions of constantly changing phone numbers, then meeting the Live Caller ID requirement that the data be KPIR-encrypted would imply the server has to re-encrypt its database of millions of entries endlessly, for all time.
That seems unfeasible and impractical to implement.
Therefore, how do the Apple designers of this feature envisage a real-world server backing millions of changing entries meeting the requirement to be KPIR-encrypted?
I have an app that I configured correctly: I checked WeatherKit for the App ID, added WeatherKit to the entitlements, and verified the Bundle ID was correct. I am getting the following error:
Failed to generate jwt token for: com.apple.weatherkit.authservice with error: Error Domain=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors Code=2 "(null)"
Encountered an error when fetching weather data subset; location=<+38.97170000,-95.23530000> +/- 0.00m (speed -1.00 mps / course -1.00) @ 3/1/25, 8:34:24 AM Central Standard Time, error=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors 2 Error Domain=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors Code=2 "(null)"
I then made a super simple app that has a hard-coded CLLocation and makes a single API call to get the weather, and that fails with the same error. The error is still happening 12 to 48 hours after creating the App ID. I have dumped the contents of the provisioning profile for both apps, and the entitlements section looks correct for both. I am not sure what is configured incorrectly, or what steps I should take next.
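For reference, this is roughly the minimal reproduction described above (a hard-coded CLLocation taken from the log, one WeatherService call); the function name is illustrative:
import CoreLocation
import WeatherKit

// Minimal repro: one hard-coded location, one call to WeatherService.
// Requires the WeatherKit capability/entitlement on the App ID.
func fetchWeather() async {
    let location = CLLocation(latitude: 38.9717, longitude: -95.2353)
    do {
        let weather = try await WeatherService.shared.weather(for: location)
        print("Current temperature: \(weather.currentWeather.temperature)")
    } catch {
        // The JWT/auth failure quoted above surfaces here.
        print("WeatherKit request failed: \(error)")
    }
}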
Hello Apple Developer Community,
I’m working on integrating Siri into my React Native app using native iOS code and bridging to React Native. I’ve followed the necessary steps to set up Siri support, including:
Adding the Siri capability.
Adding Siri usage descriptions in Info.plist.
Using AppIntent and AppShortcutsProvider to define shortcuts.
However, I’m facing the following issues:
Siri Prompts for Confirmation
When a user says a phrase, Siri asks, "Turn on 'MyApp' shortcuts with Siri?" instead of directly recognizing the phrase. Is this expected behavior? If so, how can I reduce friction for users and make the experience more seamless?
Inconsistent Behavior for Existing Users
For users updating to a version with Siri support:
When the app is closed, Siri says, "MyApp hasn't added support for that with Siri."
When the app is open, Siri prompts, "Turn on shortcut for MyApp?" and the rest works fine.
Why does Siri not recognize the shortcut when the app is closed, even though the shortcut is defined in AppShortcutsProvider? How can I ensure that Siri recognizes the shortcut regardless of whether the app is open or closed? Other than using AppIntent and AppShortcutsProvider, should I try donating shortcuts (would that help for the existing-user update case)? Please help me with this.
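For context, here is a hedged sketch of the kind of AppIntent and AppShortcutsProvider setup described above; the intent name, title, and phrase are placeholders. App Shortcuts declared this way are meant to be registered when the app is installed or updated, without any donation, which is why the behaviour reported for updated users is surprising:
import AppIntents

// Illustrative intent; the name, title, and phrase are placeholders.
struct OpenDashboardIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Dashboard"
    // Launch the app when the shortcut runs.
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenDashboardIntent(),
            phrases: ["Open the dashboard in \(.applicationName)"],
            shortTitle: "Open Dashboard",
            systemImageName: "rectangle.grid.2x2"
        )
    }
}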
Hello everyone,
I’m currently developing an app that uses the Family Controls API, specifically the Screen Time API. However, my current entitlement is limited to development mode, which prevents me from publishing my app on TestFlight.
I have already contacted Apple Developer Support about production access, but I wanted to reach out to the community as well. I was referred to the FamilyControls API documentation and couldn't find anything related to my case. Has anyone successfully upgraded their entitlement from development-only to production? Any insights on the process, tips for communicating with Developer Support, or guidance on ensuring full compliance with the Family Controls guidelines would be extremely helpful.
I have an app with Message Filtering Extension enabled and I have been experiencing unusual behaviour.
When I send messages from a local number to a local number, the filtering works correctly, but when I send messages from certain international numbers to my local number, the messages are not filtered. I couldn't find any errors in Console.
I can see that the normalisation is correct. Are there any specifications for SMS from certain countries, or a reason why message filtering is not activated when an SMS is received?
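For anyone debugging something similar, here is a minimal sketch of a query handler that only logs the sender (the class name and logging approach are placeholders), which can help confirm whether the extension is invoked at all for the international numbers in question:
import IdentityLookup

// Illustrative handler: logs the (already normalised) sender so it is
// visible in Console, then leaves delivery untouched while diagnosing.
final class MessageFilterExtension: ILMessageFilterExtension, ILMessageFilterQueryHandling {

    func handle(_ queryRequest: ILMessageFilterQueryRequest,
                context: ILMessageFilterExtensionContext,
                completion: @escaping (ILMessageFilterQueryResponse) -> Void) {
        NSLog("Message filter invoked, sender: %@", queryRequest.sender ?? "nil")

        let response = ILMessageFilterQueryResponse()
        response.action = .none   // defer to normal delivery while diagnosing
        completion(response)
    }
}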
Is using /api/v1/weather/{language}/{latitude}/{longitude} the correct way to retrieve historical weather data for a location, e.g. getting hourly weather data for Seattle from 2016-03-04 to 2016-03-11?
https://developer.apple.com/documentation/weatherkitrestapi/get-api-v1-weather-_language_-_latitude_-_longitude_
This is pointed out in the API docs as the method for pulling hourly weather data for a specific location, but it doesn't indicate whether it is for current or historical weather.
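For what it's worth, here is a sketch of what such a request might look like, under the assumption that the hourlyStart and hourlyEnd query parameters of this endpoint are the intended way to bound the hourly window (whether the service actually returns data as far back as 2016 is a separate question); the coordinates and token handling are illustrative:
import Foundation

// Sketch of a WeatherKit REST request for a bounded hourly window.
// The JWT is assumed to be generated elsewhere.
func hourlyHistoryRequest(token: String) -> URLRequest {
    var components = URLComponents(string: "https://weatherkit.apple.com/api/v1/weather/en/47.6062/-122.3321")!
    components.queryItems = [
        URLQueryItem(name: "dataSets", value: "forecastHourly"),
        URLQueryItem(name: "hourlyStart", value: "2016-03-04T00:00:00Z"),
        URLQueryItem(name: "hourlyEnd", value: "2016-03-11T00:00:00Z"),
        URLQueryItem(name: "timezone", value: "America/Los_Angeles")
    ]
    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
    return request
}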
We're having trouble getting Siri to hand off specific trigger words to our app via shortcuts. I want to be able to say "Hey Siri, Myappname Foobar", but in some cases, if Foobar is the name of a specific business, it may launch Maps instead, showing locations of those businesses. Is there any way to inform Siri, "no, *****, launch our app as the shortcut specifies!"?
All the threads only contain system calls. The crashed thread only contains a single call to my app's code which is main.swift:13.
What could cause such a crash?
crash.crash
Is there documentation where we can find details on the historical parameters and limitations? So far I've found that the limit on a single API call is 7 days. Any other similar specs would be good to have.
Regarding forecastHourly payloads, what is the timezone of forecastStart?
We just want to confirm what we are seeing in the payloads.
Where can we find documentation on the following fields included in payloads? They're not listed alongside the other fields in the documentation linked below:
https://developer.apple.com/documentation/weatherkitrestapi/hourweatherconditions
precipitationIntensity
snowfallAmount
Or, if we could get the data type, the unit used, and a description here, that would be great.
I want to start monitoring again from the below function of my DeviceActivityMonitor extension. I have a startMonitoring function like this:
// Inside the DeviceActivityMonitor subclass
// (requires DeviceActivity, FamilyControls, and ManagedSettings).
override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name, activity: DeviceActivityName) {
    super.eventDidReachThreshold(event, activity: activity)
    startMonitoring()
}

public func startMonitoring() {
    // Monitor the whole day, repeating daily.
    let startTime = DateComponents(hour: 0, minute: 0, second: 0)
    let endTime = DateComponents(hour: 23, minute: 59, second: 59)
    let schedule = DeviceActivitySchedule(
        intervalStart: startTime,
        intervalEnd: endTime,
        repeats: true
        // warningTime: DateComponents(minute: 1)
    )

    // Restore the saved app/category/web-domain selection.
    let selection: FamilyActivitySelection = savedSelection() ?? FamilyActivitySelection()

    // Apply shields immediately to the selected content.
    let store = ManagedSettingsStore()
    store.shield.applications = selection.applicationTokens
    store.shield.applicationCategories = ShieldSettings.ActivityCategoryPolicy.specific(selection.categoryTokens, except: Set())
    store.shield.webDomains = selection.webDomainTokens

    // Event that fires once the selected content reaches the threshold
    // (e.g. DateComponents(minute: 15) for a 15-minute time limit).
    let event = DeviceActivityEvent(
        applications: selection.applicationTokens,
        categories: selection.categoryTokens,
        webDomains: selection.webDomainTokens,
        threshold: DateComponents(minute: 0)
    )

    let center = DeviceActivityCenter()
    do {
        try center.startMonitoring(
            .weekend,
            during: schedule,
            events: [.weekend: event]
        )
        print("ScreenTime Monitoring Started")
    } catch {
        print(error.localizedDescription)
    }
}
Please provide us with a solution for starting monitoring from the DeviceActivityMonitor extension's eventDidReachThreshold function, or let us know if there is another way.
I have been working to implement Apple's Live Caller ID feature, which requires setting up a relay server. Following Apple's guidelines, I submitted a request through the provided link to utilize Apple's relay server. However, it's been three weeks, and I have yet to receive a response. I contacted Apple Support, but they indicated that this is a technical matter beyond their scope.
Has anyone successfully received confirmation from Apple regarding the use of their relay server for Live Caller ID? If so, could you share your experience or any advice on how to proceed?
url: https://developer.apple.com/contact/request/live-caller-id-lookup/
Thank you.
Hi, Team.
We are currently creating a VoIP calling app using pjsip and want to be able to handle 4 calls at the same time. It is also necessary to be able to notice calls that are on hold or not yet answered hands-free (without looking at the screen).
However, it seems like CallKit intends to silence the ringtone when multiple calls come in.
Problem
What actually happens?
If a second call comes in while the first one has not been answered yet, the CallKit screen keeps showing the first one.
If the "Accept" button is tapped, the second call ends.
If the "Decline" button is tapped, the CallKit screen finally shows the second call.
If the first call has been put on hold before the second one comes in, the CallKit screen shows the "Hold & Accept" button even though the first call is already on hold.
When the "Hold & Accept" button is displayed, ringtones and other sounds from the app stop.
When the "Hold & Accept" button is tapped for the second call and the first one is then unheld in provider(_ provider: CXProvider, perform action: CXAnswerCallAction), the ringtone of the first call doesn't ring.
What is expected to happen?
If a second call comes in while the first one has not been answered yet, the CallKit screen shows the second call.
If the "Accept" button is tapped, the second call starts and the first call is displayed on the CallKit screen.
If the "Decline" button is tapped, the first call appears again.
If the first call has been put on hold before the second one comes in, the CallKit screen shows only the "Accept" and "Decline" buttons (not in full screen).
If a second call comes in while the first one has not been answered yet, ringtones continue to ring.
When the second call is accepted, the ringtone of the first call rings again.
Information
Sample code
Using CallKit to simulate three incoming calls and two alarm notifications.
Whether or not the first call is put on hold before the second one comes in is switchable from the bottom menu.
https://github.com/ryu-akaike/CallKit-Multiple-Incoming-Test
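For reference, a hedged sketch (assumed provider setup; the function name is illustrative) of how the second incoming call is reported while the first is still ringing, which is the scenario described above:
import CallKit

// Sketch: report the second incoming call via the existing CXProvider.
func reportSecondIncomingCall(to provider: CXProvider, uuid: UUID, handle: String) {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: handle)
    update.supportsHolding = true

    provider.reportNewIncomingCall(with: uuid, update: update) { error in
        if let error {
            print("Failed to report incoming call: \(error)")
        }
    }
}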
Versions
macOS: Sequoia 15.1
Xcode: 16.2
iPhone: 11
iOS: 18.1.1
Thank you.
Ryu Akaike
We had a question that came up when comparing data from WeatherKit to other sources: WeatherKit visibility was well beyond the boundaries we had seen historically, even from Dark Sky. That raises two questions:
is visibility actually in meters like the docs say?
is this visibility at ground level, 500ft, or some other height?
We were seeing visibility numbers of up to 40 miles (after converting the number the API sent to miles), whereas all of our other sources are usually within 10 miles.
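As a sanity check on the conversion described above, assuming visibility really is in meters as the docs state, 40 miles corresponds to roughly 64 km:
import Foundation

// A reading around 64,000 m converts to about 40 miles.
let visibilityMeters = Measurement(value: 64_374, unit: UnitLength.meters)
print(visibilityMeters.converted(to: .miles))  // ≈ 40 mi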
We're looking to get some clarification on how the hourly forecasts should be interpreted, to ensure we're using your data correctly. Answers to the following questions would be extremely helpful:
1. What do the data points (e.g. temperature) in the hourly forecast represent for a future hour? Do they represent the expected average over that future hour, or the forecast for the point in time at the top of the hour?
2. What do those same data points represent in the hourly forecast for an hour which has already begun? e.g. it’s 8:30 and we pull the hourly forecast and the 8:00 hour is still returned. Which of the following would be the correct interpretation for the values returned for the 8:00 hour:
The values represent the forecast for the point in time at the top of the 8:00 hour (if this is the case we would expect it to stop updating)
The values represent the current forecast i.e. what the weather is right now
The values represent the average over the remaining portion of the 8:00 hour
The values represent the average over the full 8:00 hour including both the portion which has already elapsed and the portion which is still to come
3. What does the data represent after the hour (i.e. looking at historical hours)? Is it:
The last forecast made within the hour? If so, is that point-in-time or average for the hour (as explained above)?
The actual weather for that hour (using some non-forecast measure of real weather)? If so, again is that point-in-time at top of hour / point-in-time at end of hour / average over the hour?