Maps & Location


Learn how to integrate MapKit and Core Location to unlock the power of location-based features in your app.


Posts under Maps & Location subtopic

Post · Replies · Boosts · Views · Activity

Direction data not available with U2 chip (iPhone 15 Pro and iPhone 16 Pro) when using Murata SR040/SR150 accessory
Hello, I am developing with the Nearby Interaction framework using third-party UWB accessories (Murata SR040/SR150). I observed a difference between U1-based and U2-based iPhones:

iPhone 12 Pro (U1 chip): NINearbyObject.direction returns a valid 3D vector (x, y, z). Distance and direction both work as expected.

iPhone 15 Pro and iPhone 16 Pro (U2 chip): NINearbyObject.direction is always nil. Only distance is returned (around 0.35–0.40 m in my test). Effectively behaves as "distance-only mode".

Environment:
- Hardware: iPhone 12 Pro, iPhone 15 Pro
- iOS version: 18.5
- Accessory: Murata UWB SR040 / SR150
- App: Using NINearbyAccessoryConfiguration with BLE-based discovery
- Info.plist includes NSNearbyInteractionUsageDescription
- Camera assistance was tested both ON and OFF

Expectation: I expected the U2 chip to behave consistently with U1, i.e. provide direction vectors when possible. Instead, on iPhone 15 Pro, direction is always unavailable (nil) while distance is returned correctly.

Questions:
- Is this an intentional limitation for U2 chip + third-party accessories?
- Is there a new requirement (e.g. certification, firmware update, capability flags) to enable direction on U2 devices?
- Could this be related to NIDeviceCapability or the new Extended Distance Measurement (EDM) mode in U2?

Thanks in advance for any clarification.
2 replies · 3 boosts · 272 views · Dec ’25
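Related to the question above, a minimal sketch of checking the device's reported UWB capabilities (NISession.deviceCapabilities, iOS 16+; the EDM flag is iOS 17+) and treating direction as optional in the session delegate. It only surfaces what the hardware reports and is not a fix for the behaviour described:

```swift
import NearbyInteraction

final class AccessoryRangingDelegate: NSObject, NISessionDelegate {
    // Check what the local UWB hardware reports before relying on direction.
    static func logCapabilities() {
        let caps = NISession.deviceCapabilities
        print("precise distance:", caps.supportsPreciseDistanceMeasurement)
        print("direction:", caps.supportsDirectionMeasurement)
        print("camera assistance:", caps.supportsCameraAssistance)
        if #available(iOS 17.0, *) {
            print("extended distance (EDM):", caps.supportsExtendedDistanceMeasurement)
        }
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let direction = object.direction {
                print("direction vector:", direction)      // U1 behaviour in the report above
            } else if let distance = object.distance {
                print("distance only:", distance, "m")     // what the poster sees on U2 devices
            }
        }
    }
}
```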
OS Location via Bluetooth GPS receiver
Hello,

We are a software and hardware development company for the forestry and environmental sectors, based in Quebec (Canada) for over 30 years. Our Canadian market covers Quebec, Ontario, and the Maritime provinces in the east, and we are currently expanding across Canada and into the northern United States. We are on Android platforms with several map and data entry applications. To ensure the success of our expansion, we aim to become part of the Apple family, which is why we are contacting you today.

We have developed our own GNSS receiver, called GSFGPS, to increase the location accuracy of our users. It uses Bluetooth LE to communicate with mobile devices and a high-precision GPS that transmits its position using the NMEA protocol. We would like this device to be compatible with an iPhone/iPad. We have developed a mock location application in MAUI (multi-platform). Based on our interpretation of your documentation, we understand that the concept of mock location does not exist at Apple.

How can we ensure that our Bluetooth GNSS device is compatible with your iPhone/iPad devices and that they can use the position of the Bluetooth device rather than the internal GPS of your devices?

We are a reseller for Juniper Systems, and we know that they have an app on the App Store with the same features as our product: https://junipersys.com/index.php/support/article/14709

We look forward to your follow-up and recommendations.
2 replies · 0 boosts · 210 views · Oct ’25
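As the post notes, there is no mock-location concept on iOS. One common approach (an assumption here, not an Apple recommendation) is for the app itself to read the receiver's NMEA stream over Core Bluetooth and use those coordinates in place of CLLocationManager fixes. A minimal GGA parsing sketch; in practice this would be called from CBPeripheral's peripheral(_:didUpdateValueFor:error:) after subscribing to the accessory's NMEA characteristic:

```swift
import CoreLocation

// Converts one NMEA GGA sentence, e.g.
// "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,...",
// into a CLLocationCoordinate2D. Returns nil for non-GGA or incomplete sentences.
func coordinate(fromGGA sentence: String) -> CLLocationCoordinate2D? {
    let fields = sentence.split(separator: ",", omittingEmptySubsequences: false).map(String.init)
    guard fields.count > 5, fields[0].hasSuffix("GGA"),
          let rawLat = Double(fields[2]), let rawLon = Double(fields[4]) else { return nil }

    // NMEA packs degrees and minutes together as (d)ddmm.mmmm.
    func degrees(_ value: Double) -> Double {
        let wholeDegrees = (value / 100).rounded(.down)
        return wholeDegrees + (value - wholeDegrees * 100) / 60
    }

    var latitude = degrees(rawLat)
    var longitude = degrees(rawLon)
    if fields[3] == "S" { latitude = -latitude }
    if fields[5] == "W" { longitude = -longitude }
    return CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
}
```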
CLLocation.sourceInformation.isSimulatedBySoftware not detecting third-party location spoofing tools
Summary
CLLocationSourceInformation.isSimulatedBySoftware (iOS 15+) fails to detect location spoofing when using third-party tools like LocaChange, despite Apple's documentation stating that it should detect simulated locations.

Environment
- iOS 18.0 (tested and confirmed)
- Physical device with Developer Mode enabled
- Third-party location spoofing tools (e.g., LocaChange)

Expected Behavior
According to Apple's documentation, isSimulatedBySoftware should return true "if the system generated the location using on-device software simulation."

Actual Behavior
Tested on iOS 18.0: when using LocaChange, sourceInformation.isSimulatedBySoftware returns false. This occurs even though the location is clearly being simulated.

Steps to Reproduce
1. Enable Developer Mode on an iOS 18 device.
2. Connect the device to a Mac via USB.
3. Use LocaChange to spoof the location to a different city/country.
4. In your app, request location updates and check CLLocation.sourceInformation?.isSimulatedBySoftware.
5. Observe that it returns false or that sourceInformation is nil.
6. Compare with direct Xcode location simulation (Debug → Simulate Location), which correctly returns true.
2 replies · 0 boosts · 214 views · Oct ’25
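For reference, a minimal sketch of reading the source-information flags discussed above. It only reports what Core Location attaches to each fix, so it cannot flag spoofing that the system itself does not detect:

```swift
import CoreLocation

final class SpoofAwareDelegate: NSObject, CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            guard let source = location.sourceInformation else {
                print("No source information attached to this update")
                continue
            }
            // isSimulatedBySoftware covers on-device simulation (e.g. Xcode's
            // Simulate Location); isProducedByAccessory covers external receivers.
            print("simulated:", source.isSimulatedBySoftware,
                  "accessory:", source.isProducedByAccessory)
        }
    }
}
```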
DataCloneError in MapKit JS Worker when posting non-detachable ArrayBuffers (Chrome ≥120)
Since integrating MapKit JS, we’ve begun receiving production error reports with the following message:

Uncaught DataCloneError: Failed to execute 'postMessage' on 'DedicatedWorkerGlobalScope': ArrayBuffer is not detachable and could not be cloned.

It appears that MapKit JS’s internal worker occasionally calls postMessage() with an ArrayBuffer that cannot be detached under Chrome 120+. This causes the structured clone to fail, and the error surfaces uncaught from within the worker.

MapKit JS version: 5.79.109
Browser: Chrome 120.0+
OS: Windows 10

Is this a known issue with MapKit JS? If so, are there recommended workarounds or planned fixes?
2 replies · 0 boosts · 195 views · Dec ’25
iOS 26: Maps share sheet no longer provides com.apple.mapkit.map-item and only shares short maps.apple/p/... URLs (how to get coordinates?)
Since iOS 26, the Apple Maps share sheet no longer provides a com.apple.mapkit.map-item attachment when sharing a location to my Share Extension. Additionally, on real devices the shared URL is now a short link (https://maps.apple/p/...), which does not contain coordinates. On the simulator, the URL still includes coordinates (as in previous iOS versions). I'm trying to find the official or recommended way to extract coordinates from these new short URLs.

Environment:
- Devices: iPhone (real device) on iOS 26.0 / 26.0.1
- Simulator: iOS 26.0 / 26.0.1 simulator (behaves like iOS 18, see below)
- App: Share Extension invoked from Apple Maps -> Share -> my app
- Xcode: 26.0.1

Steps to Reproduce
1. Open Apple Maps on iOS 26 (real device).
2. Pick a POI (store/restaurant).
3. Share -> choose my share extension.

iOS 18 and earlier:

```
(lldb) po extensionContext?.inputItems
▿ Optional<Array<Any>>
  ▿ some : 1 element
    - 0 : <NSExtensionItem: 0x60000000c5d0>
          - userInfo: {
              NSExtensionItemAttachmentsKey = (
                  "<NSItemProvider: 0x600002930d20> {types = (\"public.plain-text\")}",
                  "<NSItemProvider: 0x600002930c40> {types = (\"com.apple.mapkit.map-item\")}",
                  "<NSItemProvider: 0x600002930bd0> {types = (\"public.url\")}"
              );
          }
```

Typical URL:
https://maps.apple.com/place?address=Apple%20Inc.,%201%20Apple%20Park%20Way,%20Cupertino,%20CA%2095014,%20United%20States&coordinate=37.334859,-122.009040&name=Apple%20Park&place-id=I7C250D2CDCB364A&map=explore

iOS 26:

```
(lldb) po extensionContext?.inputItems
▿ 1 element
  - 0 : <NSExtensionItem: 0x6000000058d0>
        - userInfo: {
            NSExtensionItemAttachmentsKey = (
                "<NSItemProvider: 0x600002900b60> {types = (\"public.url\")}",
                "<NSItemProvider: 0x600002900fc0> {types = (\"public.plain-text\")}"
            );
        }
```

URL looks like:
https://maps.apple/p/U8rE9v8n8iVZjr

On the iOS 26 simulator the map-item provider is also missing, but the URL is still long and contains coordinates, like this:
https://maps.apple.com/place?coordinate=37.334859,-122.009040&name=Apple%20Park&..

Issue
The short URLs (maps.apple/p/...) cannot be resolved directly - following redirects ends with:
https://maps.apple.com/unsupported

The only way I've found to get coordinates is to intercept intermediate redirects - one of them contains the expanded URL with coordinate=....

Example of my current workaround:

```swift
final class RedirectSniffer: NSObject, URLSessionTaskDelegate {
    private(set) var redirects: [URL] = []

    func urlSession(_ session: URLSession,
                    task: URLSessionTask,
                    willPerformHTTPRedirection response: HTTPURLResponse,
                    newRequest request: URLRequest) async -> URLRequest? {
        if let url = request.url {
            redirects.append(url)
        }
        return request
    }
}
```

Then I look through redirects to find a URL containing "coordinate=". This works, but feels unreliable and undocumented.

Questions
1. Was the removal of com.apple.mapkit.map-item from the Maps share payload intentional in iOS 26?
2. If yes, is there a new attachment type or API to obtain an MKMapItem?
3. What’s the official or supported way to resolve https://maps.apple/p/... to coordinates? Is there any MapKit API or documented URL scheme for this? Is intercepting redirect chains the only option for now?
4. Why does the iOS 26 simulator still return coordinate URLs, while real devices don't?
2 replies · 0 boosts · 318 views · Oct ’25
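A small usage sketch of the redirect-interception workaround shown above, reusing the poster's RedirectSniffer. The resolveCoordinates name and the coordinate parsing are illustrative, not an official resolution API:

```swift
import Foundation

func resolveCoordinates(from shortURL: URL) async -> (lat: Double, lon: Double)? {
    let sniffer = RedirectSniffer()
    let session = URLSession(configuration: .ephemeral, delegate: sniffer, delegateQueue: nil)
    _ = try? await session.data(from: shortURL)   // we only care about the redirect chain

    // Look for an intermediate URL carrying "coordinate=lat,lon".
    for url in sniffer.redirects {
        guard let components = URLComponents(url: url, resolvingAgainstBaseURL: false),
              let value = components.queryItems?.first(where: { $0.name == "coordinate" })?.value
        else { continue }
        let parts = value.split(separator: ",").compactMap { Double($0) }
        if parts.count == 2 { return (parts[0], parts[1]) }
    }
    return nil
}
```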
MapKit detailAccessoryView buttons not working on macOS Tahoe
Hi, I have been working with an implementation of MapKit which shows custom annotations with a detailCalloutAccessoryView built using SwiftUI. This has been working fine for many years, but starting with macOS Tahoe, the SwiftUI buttons in this view have stopped being tappable. I have reproduced the issue in the code below; the same code works fine on macOS 14 and macOS 15 but no longer works correctly on macOS 26:

```swift
import Cocoa
import MapKit
import SwiftUI

class ViewController: NSViewController {
    private var mapView: MKMapView!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupMapView()
    }

    private func setupMapView() {
        // Create and configure the map view
        mapView = MKMapView()
        mapView.translatesAutoresizingMaskIntoConstraints = false
        mapView.delegate = self
        view.addSubview(mapView)

        // Pin the map to all edges of the view
        NSLayoutConstraint.activate([
            mapView.topAnchor.constraint(equalTo: view.topAnchor),
            mapView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            mapView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            mapView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])

        // Create an annotation for San Francisco
        let sanFranciscoCoordinate = CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194)
        let annotation = MKPointAnnotation()
        annotation.coordinate = sanFranciscoCoordinate
        annotation.title = "San Francisco"
        annotation.subtitle = "The City by the Bay"

        // Add the annotation to the map
        mapView.addAnnotation(annotation)

        // Center the map on San Francisco
        let region = MKCoordinateRegion(center: sanFranciscoCoordinate,
                                        latitudinalMeters: 5000,
                                        longitudinalMeters: 5000)
        mapView.setRegion(region, animated: false)
    }
}

// MARK: - MKMapViewDelegate
extension ViewController: MKMapViewDelegate {
    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        let identifier = "CustomAnnotation"
        var annotationView = mapView.dequeueReusableAnnotationView(withIdentifier: identifier) as? MKMarkerAnnotationView

        if annotationView == nil {
            annotationView = MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: identifier)
            annotationView?.canShowCallout = true

            // Create the SwiftUI view for the callout
            let calloutView = CalloutContentView()
            let hostingView = NSHostingView(rootView: calloutView)
            hostingView.frame = NSRect(x: 0, y: 0, width: 200, height: 100)

            // Set the SwiftUI view as the detail callout accessory
            annotationView?.detailCalloutAccessoryView = hostingView
        } else {
            annotationView?.annotation = annotation
        }

        return annotationView
    }
}

// MARK: - SwiftUI Callout View
struct CalloutContentView: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Welcome to San Francisco!")
                .font(.headline)
                .multilineTextAlignment(.center)

            HStack(spacing: 12) {
                Button(action: {
                    print("Directions button tapped")
                }) {
                    Label("Directions", systemImage: "arrow.triangle.turn.up.right.circle.fill")
                        .font(.caption)
                }
                .buttonStyle(.borderedProminent)

                Button(action: {
                    print("Info button tapped")
                }) {
                    Label("Info", systemImage: "info.circle.fill")
                        .font(.caption)
                }
                .buttonStyle(.bordered)
            }
        }
        .padding()
        .frame(width: 200)
    }
}
```

I've looked at other problems with Map and onTap handlers not getting called, but this is a SwiftUI view inside an AppKit MapKit annotation's callout view. Any idea of how to handle this?
2 replies · 0 boosts · 169 views · Nov ’25
Stopping and Resuming Background Location Activity with CLLocationUpdates and CLBackgroundActivitySession
Hello,

This is my first post in the forums, and I'm still learning my way with iOS development and Swift. My apologies if the formatting is not correct or if I'm making any mistakes.

I'm currently trying to implement an iOS app where the device needs to share its location with my server via an API call. The use case is as follows: the server expects location updates to determine if a device is inside/outside a geofence. If the device is stationary, no locations need to be sent. If the device begins moving, regardless of whether the app is in the foreground, background, or terminated, the app should resume posting locations to the server.

I've decided to use the CLLocationUpdate.liveUpdates() stream together with CLBackgroundActivitySession. However, I have not been able to achieve the behavior successfully. My app either keeps the blue background-activity indicator active regardless of whether the phone is stationary or not, or kills the indicator (and the background capability) and does not restore it when moving again. Below I've attached my latest code snippet (the indicator disappears and does not come back).

```swift
// This method is called in didFinishLaunchingWithOptions
func startLocationUpdates(precise: Bool) {
    // Show the location permission pop-up
    requestAuthorization()

    // Stop any previous sessions
    stopLocationUpdates()

    Task {
        do {
            // If we have the right authorization, we will launch the updates in the
            // background using CLBackgroundActivitySession
            if self.manager.authorizationStatus == .authorizedAlways {
                self.backgroundActivity = true
            } else {
                self.backgroundActivity = false
                self.backgroundSession?.invalidate()
            }

            // We will start collecting live location updates
            for try await update in CLLocationUpdate.liveUpdates() {
                // Handle deprecation
                let stationary = if #available(iOS 18.0, *) {
                    update.stationary
                } else {
                    update.isStationary
                }

                // If the update is identified as stationary, we will skip this update
                // and turn off background location updates
                if stationary {
                    self.backgroundSession?.invalidate()
                    continue
                }

                // If background activity is enabled, we restore the Background Activity Session
                if backgroundActivity == true {
                    self.backgroundSession = CLBackgroundActivitySession()
                }

                guard let location = update.location else { continue }

                // Do POST with location to server
            }
        } catch {
            print("Could not start location updates")
        }
    }
}
```

I'm not sure why the code does not work as expected, and I believe I may be misunderstanding how the libraries work. My understanding is that the liveUpdates stream is capable of emitting values even if the app has gone to the background or been terminated, which is why I'm trying to stop/resume the background activity using the "stationary" or "isStationary" attribute coming from the update.

Is the behavior I'm trying to achieve possible? If so, am I using the right libraries for it? Is my implementation correct? And if not, what would be the recommended approach?

Regards
2 replies · 1 boost · 127 views · Dec ’25
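A minimal sketch of an alternative structure for the goal above: hold one CLBackgroundActivitySession for the whole tracking period and only invalidate it when tracking ends, instead of tearing it down on every stationary update. This is one possible restructuring under that assumption, not a confirmed fix for the behaviour described:

```swift
import CoreLocation

final class TrackingController {
    private var backgroundSession: CLBackgroundActivitySession?
    private var updatesTask: Task<Void, Never>?

    // Start while the app is in the foreground so the session can take effect.
    func startTracking() {
        backgroundSession = CLBackgroundActivitySession()
        updatesTask = Task {
            do {
                for try await update in CLLocationUpdate.liveUpdates() {
                    // Stationary updates are skipped, but the session stays alive so
                    // the stream can keep delivering once the device moves again.
                    let stationary: Bool
                    if #available(iOS 18.0, *) {
                        stationary = update.stationary
                    } else {
                        stationary = update.isStationary
                    }
                    guard !stationary, let location = update.location else { continue }
                    print("Posting", location.coordinate, "to the server")
                }
            } catch {
                print("Location updates ended with error: \(error)")
            }
        }
    }

    // Invalidate only when the user explicitly stops tracking.
    func stopTracking() {
        updatesTask?.cancel()
        backgroundSession?.invalidate()
        backgroundSession = nil
    }
}
```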
Geolocation tracking for iOS apps using .NET MAUI
I'm working on an in-house iOS app designed to help users accurately track their routes during trips. Currently, I've implemented a method to track users when the app is open in the background. However, I'm facing challenges, as the tracking stops when the device is locked for more than 10 minutes. I'm looking for a solution to continuously track a user's geolocation, even if the app is closed or not in use. Specifically, I want to ensure uninterrupted tracking, especially when the device is locked. Here are some key points:

Current method: I'm using Core Location together with background tasks and a repeating timer to fetch the user's location and update a log for geolocation tracking when the app is open in the background.

Issues faced: The tracking stops when the device is locked for more than 10 minutes. This limitation impacts the accuracy of the route tracking during longer trips.

Objective: My goal is to achieve continuous geolocation tracking, even when the app is closed or not actively used, to provide users with a seamless and accurate record of their routes.

Platform: The app is developed for iOS using the .NET MAUI platform, and I'm seeking solutions or suggestions that are compatible with the iOS .NET MAUI environment.

If anyone has experience or insights into achieving continuous geolocation tracking on iOS, especially when the app is not in use or the device is locked, I would greatly appreciate the assistance.
1 reply · 1 boost · 1.1k views · Feb ’25
How to Reduce Geolocation Drifting Inside Buildings on iOS?
I am developing a Flutter app that uses geolocation data extensively. While the location accuracy is excellent under an open sky, I’ve noticed significant drifting when users are inside large buildings. This impacts the app’s functionality, as precise location data is critical. I would like to know:

- Are there any specific configurations or APIs available in the Apple ecosystem to enhance indoor geolocation accuracy?
- Would combining GPS with other location technologies (like Wi-Fi or Bluetooth) reduce drifting effectively?
- Are there recommended practices for handling geolocation indoors on iOS?

Any advice, examples, or guidance would be greatly appreciated!
1 reply · 0 boosts · 388 views · Jan ’25
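Related to the question above, a minimal Core Location sketch of one common mitigation: discarding fixes whose reported horizontalAccuracy is poor. The 25 m threshold is an arbitrary illustration, not a recommended value, and this filters drift rather than preventing it:

```swift
import CoreLocation

final class FilteringDelegate: NSObject, CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            // horizontalAccuracy is the estimated error radius in metres;
            // negative values mean the fix is invalid.
            guard location.horizontalAccuracy >= 0,
                  location.horizontalAccuracy <= 25 else { continue }
            print("Usable fix:", location.coordinate,
                  "±\(location.horizontalAccuracy) m")
        }
    }
}
```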
Background location tracking not reliable
I'm writing an app in which the user is expected to initiate location tracking, let the app track for a period of time (a few minutes to a couple of hours), and then discontinue tracking. We want the user to be able to switch apps or let their device lock while tracking without losing any location updates. My understanding is that this can be done with the "While in Use" location permission and does not require "Always". We don't want to have to ask our users for the "Always" permission.

I'm configuring the location manager this way:

```swift
locationManager.delegate = self
locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
locationManager.allowsBackgroundLocationUpdates = true
locationManager.showsBackgroundLocationIndicator = true
locationManager.distanceFilter = kCLDistanceFilterNone
locationManager.activityType = .otherNavigation
locationManager.pausesLocationUpdatesAutomatically = false
```

(The user is expected to be walking around in an outdoor location, stopping occasionally to take notes and pictures.)

I've tested this using both an iPhone and an iPad that relies on an external GPS device. It works. I can lock the device and see a continuous stream of location updates in the debugger for hours. I've also tested it while walking outdoors.

However, my customer keeps reporting that the app stops tracking his location whenever it goes into the background. He says that it will track his location fine while in the foreground, but when he backgrounds it, it stops getting location updates. Then when it comes into the foreground again, it resumes. When we plot the locations on a map, you see a straight line between the place where the app went into the background and where it woke up again. We know for sure that the app is just transitioning to and from the background and that it is not being terminated and restarted.

I can't reproduce this result on my devices and can't figure out what I'm doing wrong. The customer says he has another app on his device (which is also an iPad with an external GPS) and that the other app does track him when it is in the background. My app does process all of the locations received in the didUpdateLocations method, not just the last one, so it's not that I'm getting the updates and ignoring them. I'm also not receiving any calls to locationManagerDidPauseLocationUpdates, didFinishDeferredUpdatesWithError, or didFailWithError.

The only explanation I can think of at the moment is that something changed in iOS. I know that the other app my customer is using is fairly old and built against an old version of the iOS SDK.

Thanks for your help.
1 reply · 1 boost · 477 views · Jan ’25
Location not tracking in background
I really need some help. I have been going back and forth with a customer of mine for weeks. Our app is supposed to track location in the background after a user starts it in the foreground. Every time I test it, it works. I can put the app in the background and walk around for hours. Every time he tests it, it doesn't work. He puts the app into the background and about a minute later, it stops tracking him. Then it starts again when the app comes back to the foreground. We have each tried it on two devices with the same results. I'm willing to post the rest of the details if anyone is interested in helping me, but the last couple of times I got no response, so I'm not going to bother unless I can get some help this time. Thanks.
1 reply · 0 boosts · 488 views · Feb ’25
LookAroundPreview navigation not working on macOS
The code below using LookAroundPreview works fine on iOS (showing the preview image with a button saying "Look Around" at the top to enter full screen with navigation), but on macOS (15.3) there is no button and no way to navigate the view. Is this a bug, or is there something I need to do differently on macOS? I have also tried using AppKit with MKLookAroundViewController, and I don't seem to get the button to launch full screen there either.

```swift
import SwiftUI
import MapKit

struct ContentView: View {
    var body: some View {
        LookAroundPreviewView(coordinate: CLLocationCoordinate2D(latitude: 37.33182, longitude: -122.03118))
            .frame(width: 300, height: 200)
    }
}

struct LookAroundPreviewView: View {
    let coordinate: CLLocationCoordinate2D
    @State private var scene: MKLookAroundScene?
    @State private var errorMessage: String?

    var body: some View {
        Group {
            if scene != nil {
                LookAroundPreview(scene: $scene, allowsNavigation: true)
            } else if let errorMessage = errorMessage {
                Text("Error: \(errorMessage)")
                    .foregroundColor(.red)
            } else {
                ProgressView("Loading Look Around Preview...")
            }
        }
        .task {
            do {
                let request = MKLookAroundSceneRequest(coordinate: coordinate)
                let fetchedScene = try await request.scene
                scene = fetchedScene
            } catch {
                errorMessage = error.localizedDescription
                print("Error loading Look Around scene: \(error)")
            }
        }
    }
}
```
1 reply · 0 boosts · 365 views · Feb ’25
Background BLE Scanning with Navigation Mode - Technical Feasibility for Safety Monitoring App
First of all, my English skills are not good, so I used an AI tool to help me write this question. Sorry.

I'm developing a safety monitoring application that requires continuous BLE scanning for temperature and humidity sensors. I need clarification on the technical feasibility of background and sleep mode operation.

Key Requirements:
- Continuous monitoring of BLE advertisements from temperature/humidity sensors
- Must detect critical temperature/humidity changes immediately
- Data logging every minute
- Includes navigation features showing routes

Technical Questions:

Background Mode Operation
If using background modes (bluetooth-central + location):
- Can we receive BLE advertisements reliably?
- What is the actual scanning interval limitation?
- Will the CBCentralManagerScanOptionAllowDuplicatesKey limitation affect critical monitoring?

Sleep Mode Operation
- Can the app maintain BLE scanning during device sleep?
- Would combining with the navigation background mode help?
- Are there any recommended approaches for continuous monitoring?

Sample code of my current approach:

```swift
let options: [String: Any] = [
    CBCentralManagerOptionShowPowerAlertKey: true,
    CBCentralManagerOptionRestoreIdentifierKey: "uniqueIdentifier"
]
centralManager = CBCentralManager(delegate: self, queue: nil, options: options)

// Scanning setup
centralManager.scanForPeripherals(
    withServices: [serviceUUID],
    options: [CBCentralManagerScanOptionAllowDuplicatesKey: true]
)
```

Has anyone successfully implemented continuous BLE monitoring in background/sleep modes? Are there any special entitlements or techniques that could help achieve this? This is for a safety-critical application where missing sensor data could lead to serious issues. Any guidance would be greatly appreciated.
1 reply · 0 boosts · 406 views · Feb ’25
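A small companion sketch for the restore-identifier option shown above: implementing centralManager(_:willRestoreState:) so a relaunched app can pick the scan back up. This is only a sketch of one piece; it does not change the system's background scan coalescing or duplicate filtering. The service UUID here is a placeholder:

```swift
import CoreBluetooth

final class SensorScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private let serviceUUID = CBUUID(string: "180A") // placeholder service UUID

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "sensorScanner"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // In the background, scans are coalesced and duplicate advertisements are filtered by the system.
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    // Called when the system relaunches the app to hand back Bluetooth state.
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        if let peripherals = dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral] {
            print("Restored \(peripherals.count) peripheral(s) after relaunch")
        }
    }
}
```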
MapPolygon
I recently converted my map from Mapbox Maps to a MapKit Map. I have been able to add my polygons to the map using MapPolygon. The issue I am having is selecting a polygon so I can view information about it. Has anyone figured out a way to tap on a polygon? I have tried selection, but the polygon doesn't recognize the tap. I would really appreciate it if anyone could point me in the right direction on how I can accomplish this.
1 reply · 0 boosts · 403 views · Feb ’25
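Related to the question above, a minimal sketch of one approach (iOS 17+ SwiftUI Map, assuming the polygon's coordinates are kept around): convert the tap to a coordinate with MapReader, then hit-test it against an MKPolygon. The triangle coordinates are illustrative only:

```swift
import SwiftUI
import MapKit

struct PolygonMapView: View {
    // Illustrative triangle; replace with your own polygon coordinates.
    let polygonCoordinates = [
        CLLocationCoordinate2D(latitude: 37.78, longitude: -122.43),
        CLLocationCoordinate2D(latitude: 37.76, longitude: -122.40),
        CLLocationCoordinate2D(latitude: 37.74, longitude: -122.44)
    ]

    var body: some View {
        MapReader { proxy in
            Map {
                MapPolygon(coordinates: polygonCoordinates)
                    .foregroundStyle(.blue.opacity(0.3))
            }
            .onTapGesture { screenPoint in
                guard let tapped = proxy.convert(screenPoint, from: .local) else { return }
                if contains(tapped, in: polygonCoordinates) {
                    print("Polygon tapped at", tapped)
                }
            }
        }
    }

    // Uses MKPolygonRenderer's path to test whether the coordinate falls inside the polygon.
    private func contains(_ coordinate: CLLocationCoordinate2D,
                          in coordinates: [CLLocationCoordinate2D]) -> Bool {
        let polygon = MKPolygon(coordinates: coordinates, count: coordinates.count)
        let renderer = MKPolygonRenderer(polygon: polygon)
        let point = renderer.point(for: MKMapPoint(coordinate))
        return renderer.path?.contains(point) ?? false
    }
}
```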
Keep tracking user driving sessions when the app is killed.
I want a solution to keep tracking the user from the moment they start driving until they park. I tried many solutions, like significant location changes, silent push notifications, and background tasks, but none of them worked as expected. I need the app to stay active from when the user starts driving until the car is parked. I'm using CoreMotion and CoreLocation. The challenge is when the app is not active (killed or suspended). So, how can I do this? Is it possible or not?
1 reply · 0 boosts · 443 views · Feb ’25
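Related to the question above, a minimal sketch of the relaunch path Core Location offers for terminated apps: significant-change monitoring plus handling the location launch key. This is a sketch of one piece of the puzzle; it delivers coarse wake-ups, not continuous tracking, and it assumes "Always" authorization has already been granted:

```swift
import UIKit
import CoreLocation

class AppDelegate: UIResponder, UIApplicationDelegate, CLLocationManagerDelegate {
    let locationManager = CLLocationManager()

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        locationManager.delegate = self

        // Relaunches the app in the background for significant moves
        // (roughly cell-tower scale) even after the system terminated it.
        locationManager.startMonitoringSignificantLocationChanges()

        if launchOptions?[.location] != nil {
            // Relaunched because of a location event: restart whatever
            // finer-grained tracking the current session needs here.
            print("Relaunched by a location event")
        }
        return true
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        print("Significant change:", locations.last?.coordinate as Any)
    }
}
```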
Bachelor Thesis -Geolocation
Hello,

I am writing a Bachelor's thesis about geolocation on an iPhone. I have a signal generator from R&S to simulate GPS data. I wrote an app on an Android phone that can read out the GPS ID, the signal strength, and the time and position for geolocation, e.g. Iceland or Africa.

Now my thesis is about the iPhone, and I am writing an app that gets the geolocation from the location manager and saves it to an SQLite database with longitude, latitude and time. But the app only recognizes the real world for geolocation via GPS (LTE and WLAN are disabled!). With my radio generator it does not recognize any geolocation, unlike the Android phone.

So I need some fast help for my thesis: where are my problems? I already have a function like this:

```swift
func updateAccuracy(highAccuracy: Bool) {
    locationManager.desiredAccuracy = highAccuracy
        ? kCLLocationAccuracyBestForNavigation
        : kCLLocationAccuracyHundredMeters
    print("🎯 GPS-Genauigkeit geändert: \(highAccuracy ? "Hoch" : "Plane-Genauigkeit")")
}
```

but nothing happens?

Best regards
1 reply · 0 boosts · 334 views · Feb ’25
MapKit: Customize view shown by .mapFeatureSelectionAccessory
Hello,

Probably a noob question, but I can't find any example code around this use case, and really no past questions I could find that address it either.

I really love that, when a user taps a POI on the map in my app, the map figures out the right POI every time. Flawless. However, when using .mapFeatureSelectionAccessory, I've tried probably 10-12 iterations, and there doesn't seem to be any way to either:

a) add custom content to the place card that's shown, or
b) replace that place card altogether with a view of my own, or
c) capture the place data but use it in a custom view

I want to be able to leverage the accuracy of Apple's tap gesture detection, but show, for example, only the name of the place, along with buttons and a form I have in a view. Right now, if I'm using .mapFeatureSelectionAccessory, I can't seem to bypass the place card at all.
1 reply · 0 boosts · 281 views · Mar ’25
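Related to the question above, a minimal sketch under the assumption that binding the Map's selection to a MapFeature (iOS 17+ SwiftUI MapKit) and presenting your own UI is acceptable in place of the system place card. The view name and sheet contents are illustrative:

```swift
import SwiftUI
import MapKit

struct FeaturePickerMap: View {
    @State private var selectedFeature: MapFeature?

    var body: some View {
        Map(selection: $selectedFeature)
            // Show our own UI for the tapped POI instead of the system place card.
            .sheet(isPresented: Binding(
                get: { selectedFeature != nil },
                set: { if !$0 { selectedFeature = nil } }
            )) {
                VStack(spacing: 16) {
                    Text(selectedFeature?.title ?? "Unknown place")
                        .font(.headline)
                    if let coordinate = selectedFeature?.coordinate {
                        Text("\(coordinate.latitude), \(coordinate.longitude)")
                            .font(.caption)
                    }
                    Button("Use this place") {
                        // Your own form / buttons would go here.
                        selectedFeature = nil
                    }
                }
                .padding()
                .presentationDetents([.medium])
            }
    }
}
```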