I recently converted my map from Mapbox to MapKit's SwiftUI Map. I have been able to add my polygons to the Map using MapPolygon. The issue I am having is being able to select a polygon to view information about it.
Has anyone figured out a way to tap on a MapPolygon? I have tried selection, but the polygon doesn't recognize the tap. I would really appreciate it if anyone could point me in the right direction.
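For reference, a minimal sketch of one possible workaround (MapPolygon does not appear to expose built-in tap selection): wrap the Map in a MapReader, convert the tap point to a coordinate with the proxy, and hit-test your own polygon data. The PolygonRegion model and the contains helper below are illustrative assumptions, not MapKit API.

import SwiftUI
import MapKit

// Hypothetical model for illustration; only the coordinates matter for hit-testing.
struct PolygonRegion: Identifiable {
    let id = UUID()
    let name: String
    let coordinates: [CLLocationCoordinate2D]
}

struct RegionMapView: View {
    let regions: [PolygonRegion]
    @State private var selectedRegion: PolygonRegion?

    var body: some View {
        MapReader { proxy in
            Map {
                ForEach(regions) { region in
                    MapPolygon(coordinates: region.coordinates)
                        .foregroundStyle(.blue.opacity(0.3))
                }
            }
            .onTapGesture { screenPoint in
                // Convert the tap point to a map coordinate, then test it against each polygon.
                guard let coordinate = proxy.convert(screenPoint, from: .local) else { return }
                selectedRegion = regions.first { contains($0.coordinates, coordinate) }
            }
        }
        .sheet(item: $selectedRegion) { region in
            Text(region.name)   // show the polygon's details here
        }
    }

    // Even-odd ray-casting test in latitude/longitude space; fine for simple
    // polygons that don't cross the antimeridian.
    private func contains(_ polygon: [CLLocationCoordinate2D], _ point: CLLocationCoordinate2D) -> Bool {
        var inside = false
        var j = polygon.count - 1
        for i in 0..<polygon.count {
            let a = polygon[i], b = polygon[j]
            if (a.latitude > point.latitude) != (b.latitude > point.latitude) {
                let slope = (b.longitude - a.longitude) / (b.latitude - a.latitude)
                let intersectLongitude = a.longitude + (point.latitude - a.latitude) * slope
                if point.longitude < intersectLongitude { inside.toggle() }
            }
            j = i
        }
        return inside
    }
}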
MapKit
Display map or satellite imagery from your app's interface, call out points of interest, and determine placemark information for map coordinates using MapKit.
The easiest way to explain this is to show it. On any device, open Maps and set it to Driving (which shows traffic). Go to Baltimore, Maryland. In the water just southeast of the city there is a bridge (the Francis Scott Key Bridge). On Apple Maps the road is colored dark red.
At certain zoom levels, there is a "button" (a red circle with a white minus sign in it). When you tap that "button", it says 1 Advisory (Road Closed).
How do I show this "button" on my map? My map shows the dark red color, but no "button" appears.
The only "advisory" I've been able to find is when you create a route. Of course, you can't create a route over a road that fell into the water.
import SwiftUI
import MapKit

struct ContentView: View {
    @State private var position = MapCameraPosition.region(
        MKCoordinateRegion(
            center: CLLocationCoordinate2D(latitude: 39.22742855118304, longitude: -76.52228412310761),
            span: MKCoordinateSpan(latitudeDelta: 0.05407607689684113, longitudeDelta: 0.04606660133347873)
        )
    )

    var body: some View {
        Map(position: $position)
            .mapStyle(.standard(pointsOfInterest: .all, showsTraffic: true))
            .cornerRadius(25)
    }
}
Is this a WCDWAD (We Can't Do What Apple Does), or is there a way to show the "button"?
Hi. I've been developing an app that uses MapKit. Is MapKit allowed in the Swift Student Challenge? If not, does anyone know of any offline alternatives to MapKit, or a way to make MapKit work offline? I've been developing my app for a long time without stopping to question whether it's allowed.
Thanks,
Creeper25
In Apple's Maps app, an annotation is made up of a circle shape or a rounded rectangle with a glyph image.
When an annotation is selected, it animates into a balloon marker (see attached GIF).
How does Apple Maps achieve this animation from a custom annotation to a balloon marker with a spring animation?
I switched my Maps implementation from SwiftUI to UIKit with a UIViewRepresentable to support annotation clustering, and it works beautifully.
But how do I subclass MKAnnotationView (or MKMarkerAnnotationView, the balloon) to enable selection and animation as in Apple Maps?
MKMarkerAnnotationView only shows balloon markers, and I have tried everything I can think of inside MKAnnotationView (CALayer, etc.).
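For what it's worth, a minimal sketch of the general override point (an assumption about how such an effect could be built, not Apple's actual implementation): subclass MKAnnotationView and run a spring animation from setSelected(_:animated:).

import UIKit
import MapKit

// Sketch: a custom annotation view that "pops" on selection. Apple's real
// transition probably cross-fades to a separate balloon-shaped view, but the
// override point is the same.
final class BalloonAnnotationView: MKAnnotationView {
    private let glyphView = UIImageView(image: UIImage(systemName: "mappin"))

    override init(annotation: MKAnnotation?, reuseIdentifier: String?) {
        super.init(annotation: annotation, reuseIdentifier: reuseIdentifier)
        frame = CGRect(x: 0, y: 0, width: 28, height: 28)
        backgroundColor = .systemRed
        layer.cornerRadius = 14
        glyphView.tintColor = .white
        glyphView.frame = bounds.insetBy(dx: 6, dy: 6)
        addSubview(glyphView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func setSelected(_ selected: Bool, animated: Bool) {
        super.setSelected(selected, animated: animated)
        let animations = {
            // Grow and shift upward so the view appears to pop out of its coordinate.
            self.transform = selected
                ? CGAffineTransform(scaleX: 1.6, y: 1.6).translatedBy(x: 0, y: -10)
                : .identity
        }
        if animated {
            UIView.animate(withDuration: 0.35,
                           delay: 0,
                           usingSpringWithDamping: 0.6,
                           initialSpringVelocity: 0.8,
                           options: [],
                           animations: animations)
        } else {
            animations()
        }
    }
}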
The code below using LookAroundPreview works fine on iOS (showing the preview image with a "Look Around" button at the top to enter full screen with navigation), but on macOS (15.3) there is no button and no way to navigate the view. Is this a bug, or is there something I need to do differently on macOS? I have also tried using AppKit with MKLookAroundViewController, and I don't seem to get the button to launch full screen there either.
import SwiftUI
import MapKit

struct ContentView: View {
    var body: some View {
        LookAroundPreviewView(coordinate: CLLocationCoordinate2D(latitude: 37.33182, longitude: -122.03118))
            .frame(width: 300, height: 200)
    }
}

struct LookAroundPreviewView: View {
    let coordinate: CLLocationCoordinate2D
    @State private var scene: MKLookAroundScene?
    @State private var errorMessage: String?

    var body: some View {
        Group {
            if scene != nil {
                LookAroundPreview(scene: $scene, allowsNavigation: true)
            } else if let errorMessage = errorMessage {
                Text("Error: \(errorMessage)")
                    .foregroundColor(.red)
            } else {
                ProgressView("Loading Look Around Preview...")
            }
        }
        .task {
            do {
                let request = MKLookAroundSceneRequest(coordinate: coordinate)
                let fetchedScene = try await request.scene
                scene = fetchedScene
            } catch {
                errorMessage = error.localizedDescription
                print("Error loading Look Around scene: \(error)")
            }
        }
    }
}
I have a map where I am using UserAnnotation() to show the user's location.
Location permissions are handled elsewhere in the app.
If the user has previously granted location permission, that is enough for the UserAnnotation() blue pin to appear. Otherwise, it just doesn't draw.
So the Map already knows the user's permission and location without my code requesting the location again.
I was looking for a way to leverage the map's knowledge of the user's location and came across this struct in the documentation for SwiftUI MapKit:
public struct UserLocation {
    public var heading: CLHeading?
    public var location: CLLocation?
}
I thought this struct might expose the user's location, but how it is expected to be used, or when it should be populated, is not clear from the documentation.
Would someone please share the purpose and use of this struct?
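A minimal sketch of how it appears to be meant to be used, assuming the UserAnnotation initializer whose content closure receives the current UserLocation (its heading and location mirror Core Location's values):

import SwiftUI
import MapKit

struct UserSpeedMapView: View {
    var body: some View {
        Map {
            UserAnnotation { userLocation in
                // userLocation.location is a CLLocation; speed is negative when invalid.
                if let speed = userLocation.location?.speed, speed >= 0 {
                    Label("\(Int(speed)) m/s", systemImage: "location.north.fill")
                        .padding(6)
                        .background(.blue, in: Capsule())
                        .foregroundStyle(.white)
                } else {
                    Image(systemName: "location.north.fill")
                }
            }
        }
    }
}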
Hello everyone,
I’m encountering a problem on the latest iOS 18 related to location permissions. When the user denies location access, my app triggers the standard system prompt asking them to enable location from Settings. On iOS 17 and below, tapping the “Settings” button in this system alert would successfully navigate the user to my app’s Settings page. However, on iOS 18, nothing happens. Instead, I see the following warning in the Xcode console:
Warning:
BUG IN CLIENT OF UIKIT: The caller of UIApplication.openURL(_:) needs to migrate to the non-deprecated UIApplication.open(_:options:completionHandler:). Force returning false (NO).
Important details and context:
In my own code, I have already replaced all calls to openURL(_:) with open(_:options:completionHandler:).
I searched the entire codebase for usage of openURL: and didn’t find any.
The alert that appears is the system location alert (iOS-generated), not a custom UIAlertController. Thus, I have no direct control over the underlying call.
On iOS 17 (and below), tapping “Settings” in the same system dialog works perfectly and takes the user to the app’s permission page.
The console message implies that somewhere—likely inside the system’s own flow—the deprecated API is being called and blocked on iOS 18.
What I’ve tried:
Verified I am not calling openURL: anywhere in my code.
Confirmed that UIApplication.openSettingsURLString works when I programmatically open it in a custom alert.
Tested multiple times on iOS 17 and iOS 18 to confirm the behavior difference.
Steps to reproduce:
Install the app on a device running iOS 18 Beta.
Deny location permission when prompted.
Trigger a piece of code that relies on location (e.g., loading a map screen) so that the OS automatically shows its standard “Location is disabled” alert, which includes a “Settings” button.
Tap “Settings.” On iOS 17, this navigates to the app’s Settings. On iOS 18 Beta, it does nothing, and the console logs the BUG IN CLIENT OF UIKIT warning.
Questions:
Is this a known iOS 18 bug where the system’s own alert is still using the deprecated openURL: call?
If so, are there any workarounds besides presenting a custom alert that manually calls open(_:options:completionHandler:)?
Thank you in advance. Any guidance or confirmation would be appreciated!
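For reference, a minimal sketch of the custom-alert workaround mentioned above, using only the non-deprecated API (titles and presentation details are placeholders):

import UIKit

func presentLocationSettingsAlert(from presenter: UIViewController) {
    let alert = UIAlertController(
        title: "Location Access Needed",
        message: "Please enable location access for this app in Settings.",
        preferredStyle: .alert
    )
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
    alert.addAction(UIAlertAction(title: "Settings", style: .default) { _ in
        // Open this app's page in Settings with the non-deprecated API.
        guard let url = URL(string: UIApplication.openSettingsURLString),
              UIApplication.shared.canOpenURL(url) else { return }
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    })
    presenter.present(alert, animated: true)
}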
My organization, Los Angeles Pierce College, rents space to "Topanga Vintage Market", which is a monthly weekend swap meet operation.
Apple Maps shows the location as roughly 34.18715° N, 118.58058° W. However, this is the location of the campus Child Development Center, which provides child care services and is not open during the hours of the Topanga Vintage Market.
The actual location should be in the adjacent large parking lot, roughly 34.18740° N, 118.57782° W. They do not have a physical building.
How do I get this resolved? I am putting a campus mapping application into the App Store real soon now.
There is also an entry for "ALC Taco Truck" about 34.18533° N, 118.57349° W, which as far as I know has not been on campus since Covid.
Thanks in advance for any guidance you can provide.
We are developers of a golf app designed to assist golfers on the course. A key feature of our app is displaying a map of each hole, and we are currently transitioning to using MapKit and camera functionalities for this purpose. However, we are encountering issues with the downloading of map tiles when using the default satellite imagery.
We have tried several things to diagnose the issue:
We implemented the mapViewDidFailLoadingMap(_:withError:) delegate method, but it is inconsistent, sometimes reporting offline errors even when map tiles are cached.
We implemented the mapViewDidFinishRenderingMap(_:fullyRendered:) method, but when offline it either reports fullyRendered as false or the callback never arrives, so it doesn't tell us that tile rendering has failed.
We would appreciate your guidance on the following specific questions:
Does MapKit provide a way to confirm if a map tile has fully loaded?
Is there a method to detect if a portion of the map hasn't loaded or if a tile request has failed?
Can we determine whether a map tile is cached, and if so, how long it will remain cached, similar to Cache-Control HTTP headers?
Is there a way to trigger the preloading of map tiles when we know the user has a good internet connection?
Please see the sample project for steps to reproduce the issue.
Thank you for any assistance!
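On the preloading question, one experiment that is sometimes suggested (an assumption, not a documented preloading mechanism) is to render the relevant regions ahead of time with MKMapSnapshotter while a good connection is available. Whether its imagery shares a cache with MKMapView is not documented, so treat this as a sketch to test rather than a guaranteed solution.

import MapKit

func prefetchImagery(for region: MKCoordinateRegion) {
    let options = MKMapSnapshotter.Options()
    options.region = region
    options.size = CGSize(width: 512, height: 512)
    options.preferredConfiguration = MKImageryMapConfiguration()  // satellite-style imagery

    let snapshotter = MKMapSnapshotter(options: options)
    snapshotter.start { snapshot, error in
        // The snapshot image could also be stored in your own cache as an
        // offline fallback for the region being prefetched.
        if let error {
            print("Prefetch failed: \(error)")
        } else if snapshot != nil {
            print("Prefetched imagery for region")
        }
    }
}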
I have a test application I'm working on (a fresh Xcode project under Sonoma, with older map code borrowed from another project). It is a macOS application, written in Objective-C.
When the map window is opened, the logs contain the following, which I've been trying to hunt down and resolve. Thank you in advance for any clues/pointers.
Failed to locate resource "default.csv"
Failed to locate resource "satellite@2x.styl"
Failed to locate resource "satellite@2x.styl"
Failed to locate resource "satellite.styl"
Failed to locate resource "satellite@2x.styl"
Failed to locate resource "satellite@2x.styl"
Failed to locate resource "satellite.styl"
Failed to locate resource "satellite.styl"
Couldn't find satellite.styl in framework, file name satellite.styl
Authorization status: Authorized
The application does have MapKit.framework included.
Hey everyone!
I’m encountering an issue while attempting to animate height changes of the content inside safeAreaInset(edge:alignment:spacing:content:).
When animating a reduction in the frame height, the container view (in my case, Map) also animates unexpectedly.
However, when animating an increase in the frame height, the animation works smoothly, and the Map view remains still.
How can I address this odd resizing behavior of the container?
Code:
import SwiftUI
import MapKit

struct MapView: View {
    var body: some View {
        Map()
            .safeAreaInset(edge: .bottom) {
                MapDetailView()
            }
    }
}

struct MapDetailView: View {
    @State private var oldHeightOffset: CGFloat = 0
    @State private var newHeightOffset: CGFloat = 0
    @State private var containerHeight: CGFloat = 0

    private var drag: some Gesture {
        DragGesture(coordinateSpace: .global)
            .onChanged { value in
                withAnimation(.interactiveSpring) {
                    newHeightOffset = oldHeightOffset + value.translation.height
                }
            }
            .onEnded { value in
                switch newHeightOffset {
                case containerHeight * 0.625...containerHeight:
                    withAnimation(.spring) {
                        newHeightOffset = containerHeight * 0.75
                    }
                case containerHeight * 0.25..<containerHeight * 0.625:
                    withAnimation(.spring) {
                        newHeightOffset = containerHeight * 0.5
                    }
                case 0..<containerHeight * 0.25:
                    withAnimation(.spring) {
                        newHeightOffset = 0
                    }
                default:
                    break
                }
                oldHeightOffset = newHeightOffset
            }
    }

    var body: some View {
        NavigationStack {
            Rectangle()
                .fill(.clear)
                .containerBackground(.ultraThinMaterial, for: .navigation)
        }
        .gesture(drag)
        .containerRelativeFrame(.vertical) { length, _ in
            length - newHeightOffset
        }
        .onGeometryChange(for: CGFloat.self) { geometryProxy in
            let frame = geometryProxy.frame(in: .local)
            return frame.height + newHeightOffset
        } action: { containerHeight in
            self.containerHeight = containerHeight
        }
    }
}
Reducing safe area inset's content height (drag down):
Increasing safe area inset's content height (drag up):
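One thing that might be worth trying (a sketch, not a verified fix for this layout): suppress implicit animations on the Map itself with a transaction, so only the inset content animates while the map resizes instantly.

import SwiftUI
import MapKit

struct MapView: View {
    var body: some View {
        Map()
            // Clear the animation on transactions that reach the Map so it does not
            // animate its resize when the safe area inset's height changes.
            .transaction { transaction in
                transaction.animation = nil
            }
            .safeAreaInset(edge: .bottom) {
                MapDetailView()
            }
    }
}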
I have developed a mobile app using SwiftUI that uses Google Maps. Now I am in the process of building a CarPlay application. I assume CarPlay only supports Apple MapKit, as I could not find any way to integrate Google Maps. I have a few queries:
Could you please guide me on how to obtain the user's current location when the CarPlay app launches? Is there a way for CarPlay to get these details from the mobile app (I'm not sure, since it uses Google Maps)?
If the user is logged out of the mobile app, what is the flow in CarPlay? Is there a standard login page asking the user to log in to the mobile app first?
Is there any UI in CarPlay for prompting the user for location access?
This is my first CarPlay app. Kindly guide me to a document or so that covers these details.
Thanks a ton!!
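On the first question, a minimal sketch under the assumption that the CarPlay scene runs in the same process as the iOS app, so the usual Core Location APIs (and any authorization the phone app already has) apply; CarPlay itself does not add a separate location API:

import CarPlay
import CoreLocation
import UIKit

final class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate, CLLocationManagerDelegate {
    var interfaceController: CPInterfaceController?
    private let locationManager = CLLocationManager()

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()  // shared with the iOS app's permission state
        locationManager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        // Center the CarPlay map (MapKit-based) on the current location here.
        print("CarPlay current location: \(current.coordinate)")
    }
}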
If I change MKMapView's .preferredConfiguration property from .realistic to .flat, or .mapType from .hybridFlyover to .hybrid, subsequent scrolling causes a crash:
-[MTLDebugRenderCommandEncoder validateDrawIndexedPrimitives:indexCount:indexType:indexBuffer:indexBufferOffset:instanceCount:function:]:6179: failed assertion `Draw Indexed Primitives Validation
indexBufferOffset(0) + (indexCount(864) * 2) must be <= [indexBuffer length] (12).
For example, changing:
mapView.preferredConfiguration = MKHybridMapConfiguration(elevationStyle: .realistic)
to:
mapView.preferredConfiguration = MKHybridMapConfiguration(elevationStyle: .flat)
Then, scroll the map view, and it will crash. It is OK the other way around.
Or change:
self.mapView.mapType = .hybridFlyover
to:
self.mapView.mapType = .hybrid
I've tried everything I can think of, including calling functions like these after the change:
mapView.setNeedsDisplay()
mapView.setRegion(self.mapView.region, animated: false)
.mapType and .preferredConfiguration are settable properties, so it should be possible to change them. I could create a new MKMapView, but I'd have to perfectly recreate its state, which is not trivial and far from ideal.
I'm just trying to work around issue FB14553276, where my map tiles show tiling seams in 2D, which is a new issue introduced with iOS 18. This potential workaround still shows the seams in 3D, but that is better than always showing seams. It seems like whatever I do, I can't defeat MapKit bugs, and it puts me in an impossible situation. :(
I've submitted Feedback for this issue: FB16153802
It seems like others are experiencing the issue:
https://forums.developer.apple.com/forums/thread/730780
I'm building a weather app where users can search locations to get the weather. The problem is that the results only show locations in my country, not worldwide. For example, I'm in China and I can't search for New York; it just shows nothing. Here's my code:
import SwiftUI
import MapKit

@Observable
class SearchPlaceManager: NSObject {
    var searchText: String = ""
    let searchCompleter = MKLocalSearchCompleter()
    var searchResults: [MKLocalSearchCompletion] = []

    override init() {
        super.init()
        searchCompleter.resultTypes = .address
        searchCompleter.delegate = self
    }

    @MainActor
    func seachLocation() {
        if !searchText.isEmpty {
            searchCompleter.queryFragment = searchText
        }
    }
}

extension SearchPlaceManager: MKLocalSearchCompleterDelegate {
    func completerDidUpdateResults(_ completer: MKLocalSearchCompleter) {
        withAnimation {
            self.searchResults = completer.results
        }
    }
}
Also, I've tried to set searchCompleter.region = MKCoordinateRegion( center: CLLocationCoordinate2D(latitude: 0, longitude: 0), span: MKCoordinateSpan(latitudeDelta: 180, longitudeDelta: 360) ), but it doesn't work.
Hi, I am having some trouble creating "nested" RealityView content using a MapKit attachment.
I am building a visionOS app that has a horizontal MapKit map as an attachment to a RealityView. I want to display 3D pins on that map, so I am using native map annotations, and inside each annotation I create a new RealityView just for the 3D pin. This worked completely fine until I wanted those RealityViews to interact with each other.
By interaction I mean that I wanted to group entities from the first, "main" RealityView's content with the 3D pins using ModelSortGroupComponent.
Why do I want this? I want to make the map circular, which is not a problem. The problem is that when I move the map with its 3D pins, the pins have their own RealityView space and are bounded only by the volumetric window dimensions. What happens is that the pins float next to the map (shown in the attached image). So I came up with this solution: create a custom "toroid"-like 3D entity model that occludes the pins that go outside the map region. In order to occlude only the pins, I need to use ModelSortGroupComponent to group the "toroid" entity with the 3D pin entities (as described in another forum thread).
To summarize: I need the content of the outer RealityView to interact with the map attachment's annotation RealityView content so that I can group them. There might of course be another, better way to achieve my overall goal, so I would naturally appreciate any help or guidance.
The image below shows the 3D pins on the circular map. Since the pins' RealityView knows nothing about the other RealityViews, it just overflows and hangs in space until it is cropped by the volumetric window boundary.
Simplified code:
var body: some View {
    let modelSortGroup = ModelSortGroup(depthPass: .prePass)

    RealityView { content, attachments in
        let mainEntity = Entity()
        // My other entities here...

        if let mapAttachment = attachments.entity(for: "mapAttachment") {
            // Edit map properties, position, horizontal layout etc.
            mainEntity.addChild(mapAttachment)
        }

        // Create and add to content a mask "toroid" entity mapMaskEntity. Use OcclusionMaterial() material.
        mapMaskEntity.components.set(ModelSortGroupComponent(group: modelSortGroup, order: 0))

        // For all pins, somehow also set the group
        // 3DPinEntity.components.set(ModelSortGroupComponent(group: modelSortGroup, order: 1))

        content.add(mainEntity)
    } attachments: {
        Attachment(id: "mapAttachment") {
            Map {
                ForEach(mapViewModel.clusters, id: \.id) { cluster in
                    Annotation("", coordinate: cluster.coordinate) {
                        MapPin3DView(cluster: cluster)
                    }
                }
            }
            .clipShape(Circle())
        }
    }
}

// MapPin3DView is a map annotation view that includes a model of a 3D pin and some details like an image etc.; it uses RealityView.
struct MapPin3DView: View {
    var body: some View {
        RealityView { content in
            // 3D pin entities...
        }
    }
}
Hi! I'm attempting to run the Quakes Sample App^1 on macOS. Breakpoints confirm that the mapCameraKeyframeAnimator is being called:
.mapCameraKeyframeAnimator(trigger: selectedId) { initialCamera in
    let start = initialCamera.centerCoordinate
    let end = quakes[selectedId]?.location.coordinate ?? start
    let travelDistance = start.distance(to: end)

    let duration = max(min(travelDistance / 30, 5), 1)
    let finalAltitude = travelDistance > 20 ? 3_000_000 : min(initialCamera.distance, 3_000_000)
    let middleAltitude = finalAltitude * max(min(travelDistance / 5, 1.5), 1)

    KeyframeTrack(\MapCamera.centerCoordinate) {
        CubicKeyframe(end, duration: duration)
    }
    KeyframeTrack(\MapCamera.distance) {
        CubicKeyframe(middleAltitude, duration: duration / 2)
        CubicKeyframe(finalAltitude, duration: duration / 2)
    }
}
But I don't actually see any map animations taking place when that selection changes.
Running the application from iPhone simulator does show the animations.
I am building from Xcode Version 16.2 and macOS 15.2. Are there known issues with this API on macOS?
Loading tile overlays is slow even when the raster data is locally available on the device running iOS 18.2 and built with Xcode 16.2.
In this video (https://3dtopo.com/superSlowTileLoading.mov) it takes 38 seconds to load tiles readily available on the device. Then, the whole screen flashes when tiles that are already drawn are redrawn, making for a very poor user experience. 38 seconds to load a dozen or so small images (512x512) stored locally on the device is simply unacceptable. I can't release a product like this that I've spent the last 1.5 years building and many years developing the maps themselves. This severe issue is new since I committed to basing my app on MapKit.
Note that this issue does not occur with Apple's base map tiles.
I created a Feedback Assistant case, FB16110803, for this issue.
For the video, I disabled loading any tiles from the network and disabled loading any other data, such as polylines. Essentially all I am doing is loading the tiles stored on the device and returning them, such as:
public func loadTile(at path: MKTileOverlayPath, result: @escaping (Data?, Error?) -> Void) {
    fetchData(forKey: key,
              failure: { error in result(nil, error) },
              success: { data in result(data, nil) })
}

open func fetchData(forKey key: String, failure fail: ((Error?) -> ())? = nil, success succeed: @escaping (Data) -> ()) {
    let path = self.path(forKey: key)
    do {
        let data = try Data(
            contentsOf: URL(fileURLWithPath: path),
            options: Data.ReadingOptions())
        succeed(data)
        self.updateDiskAccessDate(atPath: path)
    } catch {
        if let block = fail {
            block(error)
        }
    }
}
I am trying to derive the Apple Place ID from a CLPlacemark (or via a MKMapItem derived from it) created via either CLGeocoder().reverseGeocodeLocation() or CLGeocoder().geocodeAddressString(). In many cases, the placemark returned from these functions contains detailed information (name, address, coordinates, etc), implying that the Apple Place ID is known, but the identifier is not present.
The only way I have found to get a Place ID is via MKLocalSearch. Wondering if I am missing something here.
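For reference, a minimal sketch of the MKLocalSearch route mentioned above: search the placemark's name near its coordinate and read the identifier from the first returned MKMapItem (the matching logic here is a naive assumption and may need refining):

import MapKit
import CoreLocation

@available(iOS 18.0, *)
func fetchPlaceIdentifier(for placemark: CLPlacemark) async throws -> MKMapItem.Identifier? {
    guard let location = placemark.location else { return nil }

    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = placemark.name ?? ""
    request.region = MKCoordinateRegion(center: location.coordinate,
                                        latitudinalMeters: 200,
                                        longitudinalMeters: 200)

    // Take the first result; a real implementation should verify it actually
    // matches the placemark (name, distance, etc.).
    let response = try await MKLocalSearch(request: request).start()
    return response.mapItems.first?.identifier
}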
I am displaying a number of annotations on a map; they vary their locations over time and can inherently cluster together. I'd like to be able to set the Z order on the annotations, as I have simple logic for how to order them in the Z direction.
I see the zPriority property on the UIKit annotation view, but it's unclear how I can do that in SwiftUI. Is this exposed in any manner?
I thought I could create a ViewModifier and, using the content parameter, drop down into the UIKit view to apply the value to the property, but alas I'm not seeing a clear way to do so.
Is this at all possible without dropping down entirely to creating a NSViewRepresentable/UIViewRepresentable implementation of Map?
I have noticed a discrepancy between behavior on physical devices and simulators in iOS 18.
I am using the latest MapKit APIs to fetch MKMapItems using the following MKLocalSearch:
private func performLocalSearch(_ query: String) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = query

    let search = MKLocalSearch(request: request)
    return try await search.start().mapItems
}
This returns an array of MKMapItem on both the simulator and the physical device. The key difference is that on my physical device (iOS 18.1.1) the MKMapItem's identifier value is missing, whereas on the simulator the identifier is always populated for my search. Any ideas on how to resolve this?
The new MapKit API for those curious:
@available(iOS 6.0, *)
open class MKMapItem : NSObject {

    @available(iOS 18.0, *)
    open var identifier: MKMapItem.Identifier? { get }
}