I am currently using Core Location to get the user's current location and a few surrounding coordinates to draw annotations in augmented reality. It works best on a cellular network. On Wi-Fi it sometimes works fine, but other times the orientation is completely wrong while the device is connected to Wi-Fi. I checked in Apple Maps as well: even there the orientation was wrong, and although the user stayed at the same location, the current-location marker kept fluctuating. On Wi-Fi-only models, GPS accuracy is not good.
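A common mitigation for jumpy fixes (my note, not from the original post) is to discard updates whose reported accuracy is poor before anchoring AR annotations. A minimal sketch, with illustrative thresholds:

import CoreLocation

final class LocationFilter: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // Reject stale or low-quality fixes (thresholds are illustrative).
        let isFresh = abs(fix.timestamp.timeIntervalSinceNow) < 5
        let isAccurate = fix.horizontalAccuracy >= 0 && fix.horizontalAccuracy <= 20
        guard isFresh, isAccurate else { return }
        // Anchor AR annotations to `fix` here.
    }
}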
Maps & Location
Learn how to integrate MapKit and Core Location to unlock the power of location-based features in your app.
Are the Apple Weather precipitation radars available through the REST API, or have I hit a brick wall? I would love to have the visuals of the native overlays in my app, but I can't find an affordable or comparable API. Any help would be appreciated. I really need the smooth hourly scrubbing, and I'm not keen on rendering my own with vectors yet.
Thank you.
Hello,
I am Asmaa Atine
I would like to suggest an improvement for the Apple Maps app.
My idea is to allow users to draw the general path they would like to follow directly on the map with their finger, and then have the app automatically generate an optimized route that follows the drawn trajectory as closely as possible.
This feature would be very useful in several situations, such as:
• when the user wants to pass through a specific area but the suggested routes don’t match,
• when they want to avoid certain places or include a particular spot,
• or when they simply want a more flexible, intuitive way to customize a route.
The concept would be:
1. the user draws a rough path on the map,
2. Apple Maps interprets the drawing,
3. and then proposes the best possible route based on that drawn line.
I believe this would greatly enhance the flexibility of Apple Maps and provide a more intuitive way to create personalized routes.
Thank you for considering this suggestion, and congratulations on the great work already done on the app.
Topic:
App & System Services
SubTopic:
Maps & Location
I am experiencing a persistent issue with my CarPlay application where images rendered within the CarPlay Template interface disappear after the application has been used for an extended period, typically during prolonged navigation.
Problem Description
Images used directly within the CarPlay Template framework disappear. In the attached image showing the issue (IMG_1022.PNG), you can see that the icons for 'parking', 'gasstation', 'conveniencestore', and 'favoritespot' are missing. The sidebar icons (car, battery, etc.) remain visible and the text labels are present, but the Template-specific images/icons vanish.
By contrast, images displayed on a custom UIViewController remain visible. Some of our screens integrate a UIViewController (e.g., for map display), and any images rendered on that view controller (not the template itself) continue to display correctly without issue.
Example Images
IMG_1021.PNG (Normal/Correct Display): This image shows the SearchMenu screen with all icons displayed correctly next to their respective labels ('word', 'home', 'route', 'history', 'parking', 'gasstation', 'conveniencestore', 'favoritespot').
IMG_1022.PNG (Problem State): This image shows the same screen after prolonged use, where the icons next to 'parking', 'gasstation', 'conveniencestore', and 'favoritespot' have disappeared, leaving only the text labels.
Question
Has anyone encountered a similar issue? This seems to be a rendering or resource management problem specific to images within the CarPlay Template components when the application runs for an extended duration.
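One workaround worth trying (my suggestion, not something from the original post) is to re-set item images when the template becomes visible again, since CPListItem allows updating its image in place. A minimal sketch, assuming a hypothetical asset lookup:

import CarPlay
import UIKit

// Hypothetical lookup from the item's label to an asset name.
func imageName(for item: CPListItem) -> String? { item.text }

// Re-applies images to a CPListTemplate's items, e.g. when the
// CarPlay scene becomes active again after prolonged navigation.
func refreshImages(in template: CPListTemplate) {
    for section in template.sections {
        for item in section.items {
            guard let listItem = item as? CPListItem else { continue }
            if let name = imageName(for: listItem),
               let image = UIImage(named: name) {
                listItem.setImage(image) // CPListItem supports updating the image in place.
            }
        }
    }
}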
Recently I noticed an app called “Lookus”. Even if I force‑kill it, it still seems to obtain information such as my charging status and network status, and it can even send real‑time notifications. I’m curious how this is technically possible. Does anyone know how this could be achieved?
In MapKit, the MKAnnotation takes a CLLocationCoordinate2D. However, in 3D/Flyover mode, the user marker has a height position on the map.
We are currently plotting points that have altitude, speed, heading, etc., and I have a method for creating a CLLocation with this information. What I'm trying to figure out is whether there's a way to pass that information along to the MapKit rendering engine / annotations / AnnotationViews so it is recognized and shown in 3D mode. Is there any support for that currently?
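As far as I can tell there is no public API that makes MKAnnotationView consume the altitude, but for reference, CLLocation can already carry all of that per-point metadata. A minimal sketch (illustrative values; only the 2D coordinate reaches the annotation):

import CoreLocation
import MapKit

// Package altitude, speed, and heading alongside the coordinate.
let point = CLLocation(
    coordinate: CLLocationCoordinate2D(latitude: 46.85, longitude: -121.76),
    altitude: 4392,            // meters
    horizontalAccuracy: 5,
    verticalAccuracy: 10,
    course: 270,               // heading, in degrees from true north
    speed: 12,                 // meters per second
    timestamp: Date()
)

// MKPointAnnotation still only takes the 2D coordinate.
let annotation = MKPointAnnotation()
annotation.coordinate = point.coordinate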
Hello,
I’d like to ask about best practices for handling interactive snippet intents when working with the user’s location.
My use case is:
1. Get the user’s location
2. Fetch nearby data
3. Display it
My current flow is: show the snippet view in a "loading" state while waiting for the Core Location manager, then fetch data and reload() the view.
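Concretely, the location step looks roughly like this (a simplified sketch; the async wrapper is my own illustration and assumes authorization was already granted in the main app):

import CoreLocation

// One-shot location fetch, bridging the delegate callback to async/await.
final class OneShotLocation: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var continuation: CheckedContinuation<CLLocation, Error>?

    func requestLocation() async throws -> CLLocation {
        try await withCheckedThrowingContinuation { continuation in
            self.continuation = continuation
            manager.delegate = self
            manager.requestLocation()
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        continuation?.resume(returning: locations[0])
        continuation = nil
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        continuation?.resume(throwing: error) // This is where the error 1 shows up.
        continuation = nil
    }
}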
But I'm running into an issue where I sometimes receive Core Location error 1 (kCLErrorDenied, i.e., not authorized), even though the main app has "While In Use" authorization.
It seems that in some cases, especially when the app has been force-closed, App Intents are unable to start location updates, even though I’m using supportedModes = .foreground(.dynamic).
Any guidance would be appreciated.
Cheers,
Ondrej
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
Core Location
Maps and Location
Intents
App Intents
I have tried to make colored annotations in my map view (the attempts are shown in the commented-out sections), but they always appear in black. Any help would be appreciated.
func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
    let annotationView = MKAnnotationView(annotation: annotation, reuseIdentifier: "TempAnnotationView")
    annotationView.canShowCallout = true
    annotationView.rightCalloutAccessoryView = UIButton(type: .detailDisclosure)
    let configuration = UIImage.SymbolConfiguration(pointSize: 10, weight: .thin, scale: .default)
    if annotation.title == "Start" {
//        let config = UIImage.SymbolConfiguration.preferringMulticolor()
//        let image = UIImage(systemName: "flag.fill", withConfiguration: config)
//        // palette
//        let config2 = UIImage.SymbolConfiguration(paletteColors: [.systemRed, .systemGreen, .systemBlue])
//        let image2 = UIImage(systemName: "person.3.sequence.fill", withConfiguration: config2)
//        // hierarchical symbols
//        let config3 = UIImage.SymbolConfiguration(hierarchicalColor: .systemIndigo)
//        let image3 = UIImage(systemName: "square.stack.3d.down.right.fill", withConfiguration: config3)
//        // color
//        let image4 = UIImage(systemName: "cone.fill")?.withTintColor(.systemRed, renderingMode: .alwaysTemplate)
//        annotationView.image = image4
        annotationView.image = UIImage(systemName: "poweron", withConfiguration: configuration)
    }
    return annotationView
}
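One likely culprit (an editorial note, not from the original post): SF Symbol images are template images by default, and rendering with .alwaysTemplate forces a single tint, which MKAnnotationView draws as black. Baking the color in with .alwaysOriginal usually fixes it. A minimal sketch of the "Start" branch:

// Inside the "Start" branch above: bake the color into the image so
// MKAnnotationView does not re-tint the template version to black.
let startImage = UIImage(systemName: "flag.fill", withConfiguration: configuration)?
    .withTintColor(.systemRed, renderingMode: .alwaysOriginal)
annotationView.image = startImage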
Loading tile overlays is slow even when the raster data is locally available on the device (running iOS 18.2, built with Xcode 16.2).
In this video (https://3dtopo.com/superSlowTileLoading.mov) it takes 38 seconds to load tiles readily available on the device. Then, the whole screen flashes when tiles that are already drawn are redrawn, making for a very poor user experience. 38 seconds to load a dozen or so small images (512x512) stored locally on the device is simply unacceptable. I can't release a product like this that I've spent the last 1.5 years building and many years developing the maps themselves. This severe issue is new since I committed to basing my app on MapKit.
Note that this issue does not occur with Apple's base map tiles.
I created a Feedback Assistant case, FB16110803, for this issue.
For the video, I disabled loading any tiles from the network and disabled loading any other data, such as polylines. Essentially all I am doing is loading the tiles stored on the device and returning them, such as:
public func loadTile(at path: MKTileOverlayPath, result: @escaping (Data?, Error?) -> Void) {
    // `key` is presumably derived from `path` in the full implementation.
    fetchData(forKey: key,
              failure: { error in result(nil, error) },
              success: { data in result(data, nil) })
}

open func fetchData(forKey key: String, failure fail: ((Error?) -> ())? = nil, success succeed: @escaping (Data) -> ()) {
    let path = self.path(forKey: key)
    do {
        let data = try Data(
            contentsOf: URL(fileURLWithPath: path),
            options: Data.ReadingOptions())
        succeed(data)
        self.updateDiskAccessDate(atPath: path)
    } catch {
        if let block = fail {
            block(error)
        }
    }
}
Hello
I started using CLMonitor in my app, and I am noticing the following crash in Xcode Organizer for dozens of my app's users:
Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000001
Exception Codes: 0x0000000000000001, 0x0000000000000001
VM Region Info: 0x1 is not in any region. Bytes before following region: ………….
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
--->
__TEXT ………-…….. [ 176K] r-x/r-x SM=COW /var/containers/Bundle/Application/.........../MyApp
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler […..]
Thread 4 name:
Thread 4 Crashed:
0 libswiftCoreLocation.dylib 0x000000021680b4c8 @objc completion handler block implementation for @escaping @callee_unowned @convention(block) (@unowned CLMonitor) -> () with result type CLMonitor + 44 (<compiler-generated>:0)
1 CoreLocation 0x0000000196cdddd4 __76-[CLMonitorConfiguration vendMonitorWithIdentityAndAuthorizationAttributes:]_block_invoke + 216 (CLMonitorConfiguration.m:195)
2 libdispatch.dylib 0x0000000191138370 _dispatch_call_block_and_release + 32 (init.c:1549)
3 libdispatch.dylib 0x000000019113a0d0 _dispatch_client_callout + 20 (object.m:576)
4 libdispatch.dylib 0x00000001911416d8 _dispatch_lane_serial_drain + 744 (queue.c:3934)
5 libdispatch.dylib 0x00000001911421e0 _dispatch_lane_invoke + 380 (queue.c:4025)
6 libdispatch.dylib 0x000000019114d258 _dispatch_root_queue_drain_deferred_wlh + 288 (queue.c:7193)
7 libdispatch.dylib 0x000000019114caa4 _dispatch_workloop_worker_thread + 540 (queue.c:6787)
8 libsystem_pthread.dylib 0x0000000211933c7c _pthread_wqthread + 288 (pthread.c:2696)
9 libsystem_pthread.dylib 0x0000000211930488 start_wqthread + 8
Does anyone have a similar issue when using CLMonitor?
How can I debug or fix this issue?
Is it a CLMonitor API bug? Should I file a bug report?
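For comparison, here is a minimal sketch of the usage pattern I'd check against (assuming iOS 17+): keep a single long-lived CLMonitor instance and consume events from one task, since the monitor being released while a completion handler is still in flight is one plausible source of this kind of crash.

import CoreLocation

actor GeofenceMonitor {
    private var monitor: CLMonitor?

    func start() async throws {
        // Keep one CLMonitor per name for the life of the app.
        let monitor = await CLMonitor("AppGeofences")
        self.monitor = monitor

        let condition = CLMonitor.CircularGeographicCondition(
            center: CLLocationCoordinate2D(latitude: 37.33, longitude: -122.01),
            radius: 100)
        await monitor.add(condition, identifier: "office")

        for try await event in await monitor.events {
            print("Geofence event:", event.identifier, event.state)
        }
    }
}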
I have some questions about why the latest iOS no longer acts on (scans for or monitors) our custom beacon devices.
Since about 2015, we have provided location-based services using our custom iBeacon devices.
However, we've just realized that the latest iOS devices don't work with our custom iBeacon devices, while they still work with other, normal iBeacon devices.
So I dug into this issue for a while and finally found the answer: it comes down to one byte of the iBeacon advertising packet payload.
The following are the differences in the manufacturer data between a normal iBeacon and our custom beacon.
Normal iBeacon:
0xFF 0x4C00 0x02 0x15 0x736E75685F70656F706C655F74656331 0xEA61 0x03EB 0xC5
Our custom iBeacon:
0xFF 0x4C00 0x02 0x15 0x736E75685F70656F706C655F74656331 0xEA61 0x03EB 0xC5 0xDA
Yes, I know. After a lot of searching and research, I now understand that the length byte (the length of the following payload) would need to be 0x16 to match our extra byte.
But this is something that worked well until not so long ago.
Anyway, the introduction was long, but here is the one question I'd like to ask:
I need to know exactly which version of iOS this change came from. (I've tried, but I couldn't find anything about this in the official documentation.)
I need to explain to my customers what's going on, and for that I need to know exactly which version of iOS it stopped working in.
Thanks in advance.
Regards.
My organization, Los Angeles Pierce College, rents space to "Topanga Vintage Market", which is a monthly weekend swap meet operation.
Apple Maps shows the location as roughly 34.18715° N, 118.58058° W. However, this is the location of the campus Child Development Center, which provides child care services and is not open during the hours of the Topanga Vintage Market.
The actual location should be in the adjacent large parking lot, roughly 34.18740° N, 118.57782° W. They do not have a physical building.
How do I get this resolved? I am putting a campus mapping application into the App Store real soon now.
There is also an entry for "ALC Taco Truck" about 34.18533° N, 118.57349° W, which as far as I know has not been on campus since Covid.
Thanks in advance for any guidance you can provide.
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
MapKit JS
MapKit
Maps and Location
Apple Maps Server API
I really need some help. I have been going back and forth with a customer of mine for weeks. Our app is supposed to track location in the background after a user starts it in the foreground. Every time I test it, it works. I can put the app in the background and walk around for hours. Every time he tests it, it doesn't work. He puts the app into the background and about a minute later, it stops tracking him. Then it starts again when the app comes back to the foreground.
We have each tried it on two devices with the same results.
I'm willing to post the rest of the details if anyone is interested in helping me, but the last couple of times I got no response, so I'm not going to bother unless I can get some help this time. Thanks.
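For anyone comparing notes, this is the baseline configuration such a session needs (a minimal sketch of the standard setup, not necessarily the code in question):

import CoreLocation

let manager = CLLocationManager()

func startTracking() {
    // Requires the "Location updates" background mode and,
    // for indefinite tracking, Always authorization.
    manager.desiredAccuracy = kCLLocationAccuracyBest
    manager.allowsBackgroundLocationUpdates = true
    manager.pausesLocationUpdatesAutomatically = false
    manager.showsBackgroundLocationIndicator = true
    manager.startUpdatingLocation()
}

Beyond code, differences in device settings (authorization level, Precise Location, Low Power Mode) are worth ruling out, since they can produce exactly this works-for-me, fails-for-them split.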
I am working on a duress app and would like to improve location accuracy by encouraging users to enable Wi-Fi. In Apple Maps, I noticed that when Wi-Fi is off, a dialog prompts users to turn on Wi-Fi to enhance location accuracy. I am looking to implement similar functionality in my app.
Specifically, I would like to check whether Wi-Fi is enabled on the user's device (even if it is not connected to a network). Despite exploring several methods, I have been unable to determine a reliable way to check the Wi-Fi status.
Can you guide me on whether it is possible to access this functionality in iOS, and if so, how I can implement it within my app?
The easiest way to explain this is to show it. On any device, open Maps and set it to Driving (which will show traffic). Go to Baltimore, Maryland. In the water just southeast of the city there is a bridge (the Francis Scott Key Bridge). On Apple Maps the road is colored dark red.
At certain zoom levels, there is a "button" (red circle with a white - in it). When you click on that "button", it says 1 Advisory (Road Closed).
How do I show this "button" on my map. My map shows the dark red color, but no "button" appears.
The only "advisory" that I've been able to find is when you create a route. Of course you can't create a route over a road that fell into the water.
struct ContentView: View {
    @State private var position = MapCameraPosition.region(
        MKCoordinateRegion(
            center: CLLocationCoordinate2D(latitude: 39.22742855118304, longitude: -76.52228412310761),
            span: MKCoordinateSpan(latitudeDelta: 0.05407607689684113, longitudeDelta: 0.04606660133347873)
        )
    )

    var body: some View {
        Map(position: $position)
            .mapStyle(.standard(pointsOfInterest: .all, showsTraffic: true))
            .cornerRadius(25)
    }
}
Is this a WCDWAD ("We Can't Do What Apple Does"), or is there a way to show the "button"?
First of all, my English skills are not good, so I used an AI program to help write this question. Sorry.
I'm developing a safety monitoring application that requires continuous BLE scanning for temperature and humidity sensors. I need clarification on the technical feasibility of background and sleep mode operation.
Key Requirements:
Continuous monitoring of BLE advertisements from temperature/humidity sensors
Must detect critical temperature/humidity changes immediately
Data logging every minute
Includes navigation features showing routes
Technical Questions:
Background Mode Operation
If using background modes (bluetooth-central + location):
Can we receive BLE advertisements reliably?
What is the actual scanning interval limitation?
Will CBCentralManagerScanOptionAllowDuplicatesKey limitation affect critical monitoring?
Sleep Mode Operation
Can the app maintain BLE scanning during device sleep?
Would combining with navigation background mode help?
Are there any recommended approaches for continuous monitoring?
Sample Code of Current Approach:
// Central manager setup with state restoration.
let options: [String: Any] = [
    CBCentralManagerOptionShowPowerAlertKey: true,
    CBCentralManagerOptionRestoreIdentifierKey: "uniqueIdentifier"
]
centralManager = CBCentralManager(delegate: self, queue: nil, options: options)

// Scanning setup. Note: CBCentralManagerScanOptionAllowDuplicatesKey is
// ignored while the app is scanning in the background.
centralManager.scanForPeripherals(
    withServices: [serviceUUID],
    options: [CBCentralManagerScanOptionAllowDuplicatesKey: true]
)
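Since the code opts into state restoration, one thing worth confirming (a sketch under the assumption that the delegate doesn't already do this; the class name is hypothetical) is that the restoration callback is implemented, so a relaunched app reattaches to its scanning state:

import CoreBluetooth

final class BLEManager: NSObject, CBCentralManagerDelegate {
    var centralManager: CBCentralManager?
    var restoredPeripherals: [CBPeripheral] = []

    // Called when iOS relaunches the app to restore Bluetooth state.
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        if let peripherals = dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral] {
            restoredPeripherals = peripherals
        }
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Resume scanning once the radio is powered on again.
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil, options: nil)
        }
    }
}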
Has anyone successfully implemented continuous BLE monitoring in background/sleep modes? Are there any special entitlements or techniques that could help achieve this?
This is for a safety-critical application where missing sensor data could lead to serious issues.
Any guidance would be greatly appreciated.
I want a solution to keep tracking the user from the moment driving starts until the car is parked.
I tried many approaches, such as significant location changes, silent push notifications, and background tasks, but none of them worked as expected.
I need the app to stay active from when the user starts driving until the user parks the car.
I'm using Core Motion and Core Location.
The challenge is when the app is not active, i.e., killed or suspended.
So, how can I do this? Is it possible or not?
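One relevant detail (my note, not from the original post): if significant-change updates are running, iOS will relaunch a terminated app for new location events, but the app must recreate its CLLocationManager immediately at launch to receive them. A minimal sketch:

import UIKit
import CoreLocation

class AppDelegate: UIResponder, UIApplicationDelegate, CLLocationManagerDelegate {
    // Recreated at every launch so relaunch-for-location events are delivered.
    let locationManager = CLLocationManager()

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        locationManager.delegate = self
        if launchOptions?[.location] != nil {
            // Relaunched in the background for a location event:
            // resume monitoring before doing anything else.
            locationManager.startMonitoringSignificantLocationChanges()
        }
        return true
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Handle the driving-session update here.
    }
}

Note that behavior after a user force-quit differs from a system termination, so it is worth testing both cases explicitly.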
Hello,
I’m experiencing an issue with my iOS app that uses CoreBluetooth in combination with beacon monitoring. My app is designed to wake via beacon region monitoring and then start scanning for a specific BLE peripheral (with specific service UUIDs). When the device screen is bright (i.e., the device is unlocked, or locked but the screen is active/bright), everything works perfectly—the connection is established and maintained without any issues in both: foreground and background.
However, when the device is left alone for a while and the lock-screen dims (sleeps), the app continues to run in the background and range the beacon (I can confirm this via realtime console logs), but the connection attempt fails. Here’s what I observe:
The central manager’s delegate method didConnect is called, indicating that the peripheral was connected.
Almost immediately afterward, didDisconnect is triggered with the error message:
"The specified device has disconnected from us.".
The interesting part is (I repeatedly see this error in the console, because the app repeatedly tries to connect to peripheral until a success), when I touch the lockscreen (not unlock, but just touch, which makes the screen to light up brighter), the connection is being established without any further issues!
I have the necessary background modes enabled in the app’s capabilities (e.g., bluetooth-central, location-always-mode, etc..). My expectation was that, thanks to beacon monitoring, the app would be awakened when needed, and scanning/connection would work reliably in the background regardless of whether the device is active or dimmed.
My questions are:
Why might the connection fail with this error when the device is locked/dimmed?
Is this behavior expected due to iOS power management policies even if the app remains active in the background?
Is there a way to ensure a reliable connection in such cases?
Any insights, workarounds, or suggestions would be greatly appreciated. Thank you in advance!
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
Core Location
Background Tasks
Core Bluetooth
I am able to fetch the location in the foreground and in the background.
I just need confirmation: can we get the location when the app is in a killed state?
If it's possible, how can I do that?
I'm calling .startUpdatingLocation() from the background to detect the user's location, but the updates stop shortly after they start.
The issue seem to also be discussed here:
https://developer.apple.com/forums/thread/726945
I wonder if any solution has been found?
This is a critical feature for our app.
I have:
kCLLocationAccuracyBestForNavigation
allowsBackgroundLocationUpdates = true
pausesLocationUpdatesAutomatically = false
the "Location updates" background mode enabled
distanceFilter not set (i.e., kCLDistanceFilterNone)