Reported as FB20357097
In SwiftUI, an empty .safeAreaInset modifier attached to a Map causes the map to become zoomed out to planet level.
Minimal reproduction:
import SwiftUI
import MapKit

@main
struct map_region_safe_area_inset_bugApp: App {
    var body: some Scene {
        WindowGroup {
            Map {
                // Any Map content
                MapCircle(center: .init(latitude: 35.6895, longitude: 139.6917), radius: 1000)
            }
            .safeAreaInset(edge: .top) {
                // No content, `EmptyView()`, `Color.clear`
            }
        }
    }
}
Note: ZStack { } inside the safeAreaInset prevents the bug.
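For reference, a minimal sketch of that workaround, using the same reproduction as above; the only change is the empty ZStack inside the inset:

Map {
    MapCircle(center: .init(latitude: 35.6895, longitude: 139.6917), radius: 1000)
}
.safeAreaInset(edge: .top) {
    ZStack { }   // empty, but this prevents the zoom-out to planet level
}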
Screenshots: empty safeAreaInset (bug) vs. non-empty safeAreaInset.
I am using Xcode 16 to test ARGeoTracking in India with a recorded session from the U.S. (where ARGeoTracking is supported), but the app remains stuck at "Attaching to App" on my iPad. I've tried two different ways to include the AR session video: placing it in the project folder and adding it to the Shared folder, but neither has resolved the issue.
I am the developer of the Progressive Web App (PWA) FindMeSAR, which displays the user's coordinates in several different formats. https://findmesar.com
When this PWA is not installed on my iPhone 17 Pro, I can use Safari to open the webpage and give it permission to use my location. FindMeSAR works fine and displays my coordinates as latitude/longitude in decimal degrees.
I can also use Safari to open Google Maps, and geolocation works fine there too.
But if I install FindMeSAR for offline use as a PWA, I get an error message saying location access is denied.
My iPhone is running the latest iOS, 26.0.1.
I should add that I recently traded in an iPhone 13 Pro running iOS 17. FindMeSAR worked fine on that device as a PWA, including the geolocation feature.
Has anyone else with iOS 26 tried a PWA that does geolocation? Results?
Topic:
App & System Services
SubTopic:
Maps & Location
I am currently using Core Location to get the user's current location and a few surrounding coordinates in order to draw annotations in augmented reality. It works best on a cellular network; on Wi-Fi it sometimes works fine, but at other times the orientation is completely wrong while the device is connected to Wi-Fi. I checked in Apple Maps as well, and there too the orientation was wrong, and even though the user stayed at the same location, the current location on the map kept fluctuating. On Wi-Fi-only models the GPS accuracy is not good.
I'm trying to simulate GPS in the iOS Simulator, but Debug → Simulate Location is greyed out even though I'm:
Running the iOS scheme
On an iPhone simulator only (no watch paired)
Build configuration = Debug
allowLocationSimulation="YES" in the scheme
Topic:
App & System Services
SubTopic:
Maps & Location
I'm working on an in-house iOS app designed to help users accurately track their routes during trips. Currently, I've implemented a method to track users while the app is running in the background. However, I'm facing challenges, as the tracking stops when the device is locked for more than 10 minutes.
I'm looking for a solution to continuously track a user's geolocation, even if the app is closed or not in use. Specifically, I want to ensure uninterrupted tracking, especially when the device is locked.
Here are some key points:
Current Method: I'm currently using Core Location with a combination of background tasks and a repeating timer to fetch the user's location and update a geolocation tracking log while the app is running in the background.
Issues Faced: The tracking stops when the device is locked for more than 10 minutes. This limitation impacts the accuracy of the route tracking during longer trips.
Objective: My goal is to achieve continuous geolocation tracking, even when the app is closed or not actively used, to provide users with a seamless and accurate record of their routes.
Platform: The app is developed for iOS using .NET MAUI, and I'm seeking solutions or suggestions that are compatible with the iOS .NET MAUI environment.
If anyone has experience or insights into achieving continuous geolocation tracking on iOS, especially when the app is not in use or the device is locked, I would greatly appreciate the assistance.
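For reference, a minimal sketch of the native Core Location configuration that continuous tracking ultimately relies on, under the assumptions that the location background mode is declared in Info.plist and the user has granted at least "While Using" authorization; whichever .NET MAUI location plugin is used needs to enable the equivalent settings:

import CoreLocation

// Sketch: continuous updates that keep flowing while the device is locked.
final class RouteTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.allowsBackgroundLocationUpdates = true    // required for updates while backgrounded/locked
        manager.pausesLocationUpdatesAutomatically = false
        manager.showsBackgroundLocationIndicator = true
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()                   // delegate keeps firing; no repeating timer needed
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Append the new fixes to the route log here.
    }
}

If updates are driven only by background tasks and a repeating timer, iOS will eventually suspend the app, which could explain the roughly ten-minute cutoff described above.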
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
Core Location
Background Tasks
Maps and Location
This question has been asked several times by other users, but it seems no solution has been provided, so I am asking it here. I have a screen where I add a map view as a subview, and in that map view something else is shown instead of "Legal".
When I set notifyOnExit and notifyOnEntry to true while registering a CLCircularRegion, I can confirm that the didExitRegion and didEnterRegion delegate methods are called. However, each time they fire, they are called twice in a row. I was wondering if this is an internal bug in the API.
There is also a Stack Overflow report related to this issue; I would appreciate your confirmation.
stackoverflow - why the didEnterRegion called twice?
Thank you.
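If it helps while waiting for confirmation, a common mitigation (a sketch, not an official fix) is to de-duplicate the callbacks by remembering the last event per region identifier and ignoring an identical event that arrives within a short window; the type and window below are hypothetical:

import CoreLocation

// Hypothetical de-duplication helper for region enter/exit callbacks.
final class RegionEventDeduplicator {
    private var lastEvent: [String: (entered: Bool, at: Date)] = [:]

    /// Returns true if this (region, entered) event should be processed,
    /// false if it repeats the previous event within `window` seconds.
    func shouldProcess(region: CLRegion, entered: Bool, window: TimeInterval = 5) -> Bool {
        let now = Date()
        if let last = lastEvent[region.identifier],
           last.entered == entered,
           now.timeIntervalSince(last.at) < window {
            return false
        }
        lastEvent[region.identifier] = (entered, now)
        return true
    }
}

In locationManager(_:didEnterRegion:) you would then only act when shouldProcess(region: region, entered: true) returns true, and similarly for the exit callback.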
I'm trying to create a link from a restaurant annotation on a map in my app (created using MapKit) that will open the Apple Maps app on an iPhone. I've been using the restaurant name, telephone number, and coordinates, but I cannot get Apple Maps to open the enhanced place page (which contains photographs and customer reviews and is much more descriptive than the page that currently opens, which only shows the location on a map with the phone number and coordinates). I'm trying to make it very easy to jump back and forth between my app and the enhanced page in Apple Maps. Here's what I'm using in my request:

private func openInAppleMaps() {
    let coordinate = CLLocationCoordinate2D(latitude: restaurant.latitude, longitude: restaurant.longitude)
    let placemark = MKPlacemark(coordinate: coordinate)
    let mapItem = MKMapItem(placemark: placemark)
    mapItem.name = restaurant.name
    if let phone = restaurant.telephone1 {
        mapItem.phoneNumber = phone
    }
    mapItem.openInMaps(launchOptions: [MKLaunchOptionsShowsTrafficKey: true])
}

The entire file is attached. Any help or advice would be much appreciated.
RestaurantCallOutBox.swift
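One approach that might help (a sketch, not a confirmed fix): an MKMapItem created from a bare MKPlacemark carries no identity for the real point of interest, so Maps can only show a generic location card. Searching for the restaurant with MKLocalSearch returns map items tied to the actual POI, and opening one of those is more likely to show the full place page. The restaurant properties below are assumed to match the ones used in the snippet above:

import MapKit

// Sketch: look up the real POI near the stored coordinate, then open it in Maps.
private func openEnhancedPageInAppleMaps() {
    let coordinate = CLLocationCoordinate2D(latitude: restaurant.latitude, longitude: restaurant.longitude)
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = restaurant.name
    request.region = MKCoordinateRegion(center: coordinate,
                                        latitudinalMeters: 500,
                                        longitudinalMeters: 500)
    MKLocalSearch(request: request).start { response, _ in
        // Prefer the matched POI; fall back to the plain placemark item.
        let item = response?.mapItems.first ?? MKMapItem(placemark: MKPlacemark(coordinate: coordinate))
        item.openInMaps(launchOptions: [MKLaunchOptionsShowsTrafficKey: true])
    }
}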
Topic:
App & System Services
SubTopic:
Maps & Location
Having multiple issues with Google Maps via wireless Apple CarPlay:
1. Maps freezing.
2. The direction I'm heading seems to be off, and it sometimes sits searching.
3. The most annoying: the audio doesn't work when I'm using Google Maps for a trip.
Topic:
App & System Services
SubTopic:
Maps & Location
I am writing to address a concern regarding the background permission functionality in my app, which is critical for ensuring user safety as they navigate various terrains. This feature also enables users to smoothly record their navigation tracks for review after their activities. Recently, I've noticed that this functionality is not working as seamlessly as before.
Additionally, I observed that the app is not categorized under "Health & Fitness"; could reclassifying it improve background activity? Before I delve into a detailed code review, I wanted to check whether this issue might be related to sync or settings on the App Store side, such as permission configurations, app updates, or other related factors, or whether it is more likely an issue stemming from the app's codebase.
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
Maps Web Snapshots
Health and Fitness
Core Location
Background Tasks
I am working on an iOS app where I need to detect when a user starts and stops driving using the Apple Core Motion framework. I've implemented the following MotionActivityManager class to handle activity updates and display the detected states in a SwiftUI view.
While I can accurately detect "Stationary" and "Walking" states, detecting the "Driving" (Automotive) state has been unreliable. The accuracy often fails, and the framework frequently misclassifies driving as other states like "Unknown" or "Walking."
Here's the implementation:
import SwiftUI
import UIKit
import Combine
import CoreMotion

final class MotionActivityManager: ObservableObject {
    @Published var motionStates: [MotionState] = []
    @Published var startDate: String = ""
    @Published var confidence: String = ""

    private let motionActivityManager = CMMotionActivityManager()
    // Formatter configuration assumed; not shown in the original snippet.
    private let dateFormatter: DateFormatter = {
        let formatter = DateFormatter()
        formatter.dateStyle = .medium
        formatter.timeStyle = .medium
        return formatter
    }()

    init() {
        setupDefaultStates()
        startActivityUpdates()
    }

    private func setupDefaultStates() {
        motionStates = [
            MotionState(label: "Stationary", value: false),
            MotionState(label: "Walking", value: false),
            MotionState(label: "Running", value: false),
            MotionState(label: "Automotive", value: false),
            MotionState(label: "Cycling", value: false),
            MotionState(label: "Unknown", value: false)
        ]
    }

    func startActivityUpdates() {
        guard CMMotionActivityManager.isActivityAvailable() else {
            print("Motion activity is not available.")
            return
        }
        motionActivityManager.startActivityUpdates(to: .main) { [weak self] motion in
            guard let self = self, let motion = motion else { return }
            DispatchQueue.main.async {
                self.updateProperties(with: motion)
            }
        }
    }

    private func updateProperties(with motion: CMMotionActivity) {
        motionStates = [
            MotionState(label: "Stationary", value: motion.stationary),
            MotionState(label: "Walking", value: motion.walking),
            MotionState(label: "Running", value: motion.running),
            MotionState(label: "Automotive", value: motion.automotive),
            MotionState(label: "Cycling", value: motion.cycling),
            MotionState(label: "Unknown", value: motion.unknown)
        ]
        startDate = dateFormatter.string(from: motion.startDate)
        switch motion.confidence {
        case .low:
            confidence = "Low"
        case .medium:
            confidence = "Medium"
        case .high:
            confidence = "High"
        @unknown default:
            confidence = "Unknown"
        }
    }
}
struct MotionState: Identifiable {
    let id = UUID()
    let label: String
    let value: Bool
}

struct ContentView: View {
    @StateObject private var motionManager = MotionActivityManager()

    var body: some View {
        ScrollView {
            VStack(spacing: 16) {
                ForEach(motionManager.motionStates) { state in
                    LabelView(label: state.label, value: state.value ? "True" : "False")
                }
                LabelView(label: "Confidence", value: motionManager.confidence)
            }
            .padding()
        }
        .onAppear {
            UIApplication.shared.isIdleTimerDisabled = true
            motionManager.startActivityUpdates()
        }
        .navigationTitle("Motion Activity")
    }
}
Issues:
The motion.automotive state is often not detected accurately.
The confidence level remains low for the automotive state, even when the device is clearly in a car.
How can I improve the detection accuracy of the "Driving" state using the Core Motion framework?
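One mitigation worth trying (a sketch under the assumption that some detection latency is acceptable, since automotive classification often lags the actual start of a drive): treat the state as "driving" only once the automotive flag has been reported with at least medium confidence and has persisted for a short hold time, which rides out the brief Unknown/Walking misclassifications. The class and threshold below are hypothetical:

import CoreMotion

// Hypothetical debounced driving detector layered on CMMotionActivityManager.
final class DrivingDetector {
    private let manager = CMMotionActivityManager()
    private var automotiveSince: Date?
    private var isDriving = false

    /// Calls the handler once automotive has persisted for `holdTime` seconds with medium/high confidence.
    func start(holdTime: TimeInterval = 30, onDrivingStarted: @escaping () -> Void) {
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        manager.startActivityUpdates(to: .main) { [weak self] activity in
            guard let self = self, let activity = activity, activity.confidence != .low else { return }
            if activity.automotive {
                if self.automotiveSince == nil { self.automotiveSince = Date() }
                if !self.isDriving,
                   let since = self.automotiveSince,
                   Date().timeIntervalSince(since) >= holdTime {
                    self.isDriving = true
                    onDrivingStarted()
                }
            } else {
                // A confident non-automotive reading resets the detector; low-confidence blips are ignored above.
                self.automotiveSince = nil
                self.isDriving = false
            }
        }
    }
}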
Hi All,
I am currently working on an app that has some navigation functionality, and since my minimum deployment target is iOS 18, I wanted to incorporate the new APIs that yield an AsyncStream of locations. I have watched both WWDC sessions: the one where the new API for retrieving location updates is introduced, and the one covering the new, simplified location authorization flow.
I have an app currently working in its current state, but am noticing some weird quirks when using the CLBackgroundActivitySession to get the elevated background permission.
What I am doing is creating this stream; the code and the background session object are below:
return AsyncThrowingStream { continuation in
    let task = Task {
        do {
            for try await update in CLLocationUpdate.liveUpdates(updateType) {
                if shouldStopUpdate {
                    continuation.finish()
                    break
                }
                continuation.yield(update)
            }
        } catch {
            continuation.finish(throwing: error)
        }
    }
    state = .started(locationTask: task, background: CLBackgroundActivitySession())
}
When I have an active navigation session going, I am strongly holding this object, and the user force quits the app (or I stop the target through Xcode), the navigation activity indicator in the status bar (or Dynamic Island) remains present. Even if I relaunch the app, start navigation again, and then call the invalidate method on the CLBackgroundActivitySession, I still see that navigation indicator, even after deleting my app, and I often need a full device restart to get out of this state.
Is there a step I am missing, or do I not understand the way the new API works to run in the background?
I have a question about a change in recent iOS versions that stops devices from acting on (scanning for or monitoring) our custom beacon devices.
Since about 2015, we have provided a location-based service using our custom iBeacon devices.
However, we've just realized that the latest iOS devices no longer work with our custom iBeacon devices,
while they still work with other, normal iBeacon devices.
So I dug into this issue for a while and finally found the answer: it comes down to one byte of the iBeacon advertising packet payload.
The following shows the difference in the manufacturer data between a normal iBeacon and our custom beacon.
Normal iBeacon:
0xFF 0x4C00 0x02 0x15 0x736E75685F70656F706C655F74656331 0xEA61 0x03EB 0xC5
Our custom iBeacon:
0xFF 0x4C00 0x02 0x15 0x736E75685F70656F706C655F74656331 0xEA61 0x03EB 0xC5 0xDA
Yes, I know. After a lot of searching and research, I now understand that this byte (the length of the following payload) should be changed to 0x16. But this is certainly something that worked well until not long ago.
Anyway, the introduction was long, but here is the one question I'd like to ask: I need to know exactly which version of iOS this change came from. (I've tried, but I couldn't find anything about this in the official documentation.)
I need to explain to my customers what's going on, and for that I need to know exactly which version of iOS it stopped working in.
Thanks in advance.
Regards.
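For context, here is a minimal sketch of how an app normally ranges iBeacons with Core Location; the UUID is taken from the payload bytes posted above, purely as an illustration. Ranging like this only reports advertisements that parse as valid iBeacon frames, which would explain why the extra trailing byte stops detection:

import CoreLocation

// Sketch: standard iBeacon ranging; UUID taken from the posted payload, for illustration only.
final class BeaconScanner: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let uuid = UUID(uuidString: "736E7568-5F70-656F-706C-655F74656331")!

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: uuid))
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // Only advertisements that parse as valid iBeacon frames show up here.
        print("Ranged \(beacons.count) beacon(s)")
    }
}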
My organization, Los Angeles Pierce College, rents space to "Topanga Vintage Market", which is a monthly weekend swap meet operation.
Apple Maps shows the location as roughly 34.18715° N, 118.58058° W. However, this is the location of the campus Child Development Center, which provides child care services and is not open during the hours of the Topanga Vintage Market.
The actual location should be in the adjacent large parking lot, roughly 34.18740° N, 118.57782° W. They do not have a physical building.
How do I get this resolved? I am putting a campus mapping application into the App Store real soon now.
There is also an entry for "ALC Taco Truck" at roughly 34.18533° N, 118.57349° W, which as far as I know has not been on campus since Covid.
Thanks in advance for any guidance you can provide.
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
MapKit JS
MapKit
Maps and Location
Apple Maps Server API
I am developing a Flutter app that uses geolocation data extensively. While the location accuracy is excellent under an open sky, I’ve noticed significant drifting when users are inside large buildings. This impacts the app’s functionality as precise location data is critical.
I would like to know:
Are there any specific configurations or APIs available in the Apple ecosystem to enhance indoor geolocation accuracy?
Would combining GPS with other location technologies (like Wi-Fi or Bluetooth) reduce drifting effectively?
Are there recommended practices for handling geolocation indoors on iOS?
Any advice, examples, or guidance would be greatly appreciated!
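On the native side, one common practice (a sketch of one option, not an official recommendation) is to request the best available accuracy and then discard fixes whose reported horizontalAccuracy is too coarse; Core Location blends the available sources (GPS, Wi-Fi, and others) on its own, so filtering by the reported accuracy is usually the practical lever. The threshold below is an assumption to tune per use case, and a Flutter location plugin would surface the same filtered stream:

import CoreLocation

// Sketch: request best accuracy, then drop fixes too coarse to trust indoors.
final class IndoorAwareLocator: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let maxAcceptableAccuracy: CLLocationAccuracy = 30   // meters; assumed threshold

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            // horizontalAccuracy < 0 means an invalid fix; large values indicate drift.
            guard location.horizontalAccuracy >= 0,
                  location.horizontalAccuracy <= maxAcceptableAccuracy else { continue }
            // Forward the trusted fix to the app layer here.
        }
    }
}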
I'm writing an app in which the user is expected to initiate location tracking, let the app track for a period of time (a few minutes to a couple of hours) and then discontinue tracking. We want the user to be able to switch apps or let their device lock while tracking without losing any location updates.
My understanding is that this can be done with the "While in use" location permission and does not require "Always". We don't want to have to ask our users for the "Always" permission.
I'm configuring the location manager this way:
locationManager.delegate = self
locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
locationManager.allowsBackgroundLocationUpdates = true
locationManager.showsBackgroundLocationIndicator = true
locationManager.distanceFilter = kCLDistanceFilterNone
locationManager.activityType = .otherNavigation
locationManager.pausesLocationUpdatesAutomatically = false
(The user is expected to be walking around in an outdoor location, stopping occasionally to take notes and pictures).
I've tested this using both an iPhone and an iPad that relies on an external GPS device. It works. I can lock the device and see a continuous stream of location updates in the debugger for hours. I've also tested it while walking outdoors.
However, my customer keeps reporting that the app stops tracking his location whenever it goes into the background. He says that it will track his location fine while in the foreground, but when he backgrounds it, it stops getting location updates. Then when it comes into the foreground again, it resumes. When we plot the locations on a map, you see a straight line between the place where the app went into background and where it woke up again. We know for sure that the app is just transitioning to and from the background and that it is not being terminated and restarted.
I can't reproduce this result on my devices and can't figure out what I'm doing wrong. The customer says he has another app on his device (which is also an iPad with an external GPS) and that the other app does track him when it is in the background.
My app does process all of the locations received in the didUpdateLocations method and not just the last one, so it's not that I'm getting the updates and ignoring them. I'm also not receiving any calls to 'locationManagerDidPauseLocationUpdates', 'didFinishDeferredUpdatesWithError', or 'didFailWithError'.
The only explanation I can think of at the moment is that something changed in iOS. I know that the other app my customer is using is fairly old and built against an old version of the iOS SDK.
Thanks for your help.
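In case it helps with comparing devices, a small diagnostic sketch (a suggestion, not a known fix) that logs the settings most likely to differ between a device where background tracking keeps working and one where it stops:

import CoreLocation
import UIKit

// Hypothetical diagnostic dump to compare devices that behave differently in the background.
func logLocationDiagnostics(for manager: CLLocationManager) {
    print("Authorization status:", manager.authorizationStatus.rawValue)
    print("Accuracy authorization:", manager.accuracyAuthorization.rawValue)
    print("Background updates allowed:", manager.allowsBackgroundLocationUpdates)
    print("Background refresh status:", UIApplication.shared.backgroundRefreshStatus.rawValue)
    print("Low Power Mode:", ProcessInfo.processInfo.isLowPowerModeEnabled)
}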
I really need some help. I have been going back and forth with a customer of mine for weeks. Our app is supposed to track location in the background after a user starts it in the foreground. Every time I test it, it works. I can put the app in the background and walk around for hours. Every time he tests it, it doesn't work. He puts the app into the background and about a minute later, it stops tracking him. Then it starts again when the app comes back to the foreground.
We have each tried it on two devices with the same results.
I'm willing to post the rest of the details if anyone is interested in helping me, but the last couple of times I got no response, so I'm not going to bother unless I can get some help this time. Thanks.
The code below using LookAroundPreview works fine on iOS (showing the preview image with a "Look Around" button at the top to enter full screen with navigation), but on macOS (15.3) there is no button and no way to navigate the view. Is this a bug, or is there something I need to do differently on macOS? I have also tried using AppKit with MKLookAroundViewController, and I don't seem to get the button to launch full screen there either.
import SwiftUI
import MapKit

struct ContentView: View {
    var body: some View {
        LookAroundPreviewView(coordinate: CLLocationCoordinate2D(latitude: 37.33182, longitude: -122.03118))
            .frame(width: 300, height: 200)
    }
}

struct LookAroundPreviewView: View {
    let coordinate: CLLocationCoordinate2D
    @State private var scene: MKLookAroundScene?
    @State private var errorMessage: String?

    var body: some View {
        Group {
            if scene != nil {
                LookAroundPreview(scene: $scene, allowsNavigation: true)
            } else if let errorMessage = errorMessage {
                Text("Error: \(errorMessage)")
                    .foregroundColor(.red)
            } else {
                ProgressView("Loading Look Around Preview...")
            }
        }
        .task {
            do {
                let request = MKLookAroundSceneRequest(coordinate: coordinate)
                let fetchedScene = try await request.scene
                scene = fetchedScene
            } catch {
                errorMessage = error.localizedDescription
                print("Error loading Look Around scene: \(error)")
            }
        }
    }
}
First of all, my English is not good, so I used an AI tool to help write this question. Sorry.
I'm developing a safety monitoring application that requires continuous BLE scanning for temperature and humidity sensors. I need clarification on the technical feasibility of background and sleep mode operation.
Key Requirements:
Continuous monitoring of BLE advertisements from temperature/humidity sensors
Must detect critical temperature/humidity changes immediately
Data logging every minute
Includes navigation features showing routes
Technical Questions:
Background Mode Operation
If using background modes (bluetooth-central + location):
Can we receive BLE advertisements reliably?
What is the actual scanning interval limitation?
Will CBCentralManagerScanOptionAllowDuplicatesKey limitation affect critical monitoring?
Sleep Mode Operation
Can the app maintain BLE scanning during device sleep?
Would combining with navigation background mode help?
Are there any recommended approaches for continuous monitoring?
Sample Code of Current Approach:
let options: [String: Any] = [
    CBCentralManagerOptionShowPowerAlertKey: true,
    CBCentralManagerOptionRestoreIdentifierKey: "uniqueIdentifier"
]
centralManager = CBCentralManager(delegate: self, queue: nil, options: options)

// Scanning setup
centralManager.scanForPeripherals(
    withServices: [serviceUUID],
    options: [CBCentralManagerScanOptionAllowDuplicatesKey: true]
)
Has anyone successfully implemented continuous BLE monitoring in background/sleep modes? Are there any special entitlements or techniques that could help achieve this?
This is for a safety-critical application where missing sensor data could lead to serious issues.
Any guidance would be greatly appreciated.
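For what it's worth, a sketch of the delegate-side handling under the documented background constraints (background scanning must specify service UUIDs, duplicate filtering is enforced, and advertisement deliveries are coalesced); the service UUID and parsing are placeholders:

import CoreBluetooth

// Sketch: background-friendly scanning for a sensor's advertisements.
final class SensorScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private let sensorService = CBUUID(string: "181A")   // placeholder service UUID

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "sensor-scanner"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Background scanning requires explicit service UUIDs; AllowDuplicates is ignored there.
        central.scanForPeripherals(withServices: [sensorService], options: nil)
    }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Re-establish scanning when the system relaunches the app for a Bluetooth event.
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Decode temperature/humidity from the advertisement's service data here.
        if let serviceData = advertisementData[CBAdvertisementDataServiceDataKey] as? [CBUUID: Data] {
            _ = serviceData[sensorService]
        }
    }
}

For immediate, safety-critical alerting, relying on passive advertisement scanning alone in the background is risky because deliveries are coalesced; maintaining a connection to the sensor so it can send notifications is the usual alternative when the hardware supports it.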