Hello,
I am developing with the Nearby Interaction framework using third-party UWB accessories (Murata SR040/SR150).
I observed a difference between U1-based and U2-based iPhones:
iPhone 12 Pro (U1 chip)
NINearbyObject.direction returns a valid 3D vector (x, y, z).
Distance and direction both work as expected.
iPhone 15 Pro and iPhone 16 Pro (U2 chip)
NINearbyObject.direction is always nil.
Only distance is returned (around 0.35–0.40 m in my test).
Effectively behaves as "distance-only mode".
Environment:
Hardware: iPhone 12 Pro, iPhone 15 Pro
iOS version: 18.5
Accessory: Murata UWB SR040 / SR150
App: Using NINearbyAccessoryConfiguration with BLE-based discovery
Info.plist includes NSNearbyInteractionUsageDescription
Camera assistance was tested both ON and OFF (configuration sketch below)
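For reference, the session is configured essentially like this (a trimmed-down sketch; accessoryConfigurationData is the blob we receive over our BLE characteristic exchange):

import NearbyInteraction

// Trimmed-down sketch of our setup; the BLE exchange and error handling are omitted.
func runAccessorySession(accessoryConfigurationData: Data,
                         delegate: any NISessionDelegate) throws -> NISession {
    let session = NISession()
    session.delegate = delegate

    let configuration = try NINearbyAccessoryConfiguration(data: accessoryConfigurationData)
    // Toggled between test runs; direction stayed nil on the U2 devices either way.
    configuration.isCameraAssistanceEnabled = true

    session.run(configuration)
    return session
}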
Expectation:
I expected the U2 chip to behave consistently with U1, i.e. provide direction vectors when possible.
Instead, on iPhone 15 Pro, direction is always unavailable (nil) while distance is returned correctly.
Questions:
Is this an intentional limitation for U2 chip + third-party accessories?
Is there a new requirement (e.g. certification, firmware update, capability flags) to enable direction on U2 devices?
Could this be related to NIDeviceCapability or the new Extended Distance Measurement (EDM) mode in U2? (Capability check sketch below.)
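For completeness, this is how I read the capability flags today (a simple sketch tied to the NIDeviceCapability question; I am only logging the values, not drawing conclusions from them yet):

import NearbyInteraction

// Sketch: log the capability flags reported by the current device.
func logDeviceCapabilities() {
    let capabilities = NISession.deviceCapabilities
    print("supportsPreciseDistanceMeasurement: \(capabilities.supportsPreciseDistanceMeasurement)")
    print("supportsDirectionMeasurement: \(capabilities.supportsDirectionMeasurement)")
    print("supportsCameraAssistance: \(capabilities.supportsCameraAssistance)")
    if #available(iOS 17.0, *) {
        print("supportsExtendedDistanceMeasurement: \(capabilities.supportsExtendedDistanceMeasurement)")
    }
}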
Thanks in advance for any clarification.
Maps & Location
Learn how to integrate MapKit and Core Location to unlock the power of location-based features in your app.
Hello,
We are a software and hardware development company for the forestry and environmental sectors. We have been based in Quebec (Canada) for over 30 years now. Our Canadian market covers Quebec, Ontario, and the Maritime provinces in the east. We are currently expanding across Canada and into the northern United States. We are on Android platforms with several map and data entry applications.
To ensure the success of our expansion, we aim to become part of the Apple family, which is why we are contacting you today.
We have developed our own GNSS receiver to increase the location accuracy of our users. It communicates with mobile devices over Bluetooth Low Energy (BLE) and contains a high-precision GPS that transmits its position using the NMEA protocol. We would like this device to be compatible with iPhone and iPad. We have developed a mock-location application in MAUI (multi-platform). Based on our interpretation of your documentation, we understand that the concept of mock location does not exist on Apple platforms. How can we ensure that our Bluetooth GNSS device is compatible with your iPhone/iPad devices and that apps can use the position of the Bluetooth device rather than the internal GPS of your devices? Our working assumption is sketched below.
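To make the question concrete, our working assumption is that on iOS the app itself has to consume the receiver's position (reading the NMEA sentences over Core Bluetooth and converting them to coordinates), rather than replacing the system location. A rough sketch of the conversion step, using a GGA sentence as an example:

import CoreLocation

// Rough sketch (our assumption, not a confirmed Apple-recommended pattern): parse a
// single NMEA GGA sentence received over BLE into a coordinate the app can use.
func coordinate(fromGGA sentence: String) -> CLLocationCoordinate2D? {
    // Example: "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
    let fields = sentence.components(separatedBy: ",")
    guard fields.count > 5, fields[0].hasSuffix("GGA"),
          let rawLat = Double(fields[2]), let rawLon = Double(fields[4]) else { return nil }

    // NMEA encodes latitude/longitude as ddmm.mmmm / dddmm.mmmm; convert to decimal degrees.
    func decimalDegrees(_ value: Double) -> Double {
        let degrees = (value / 100).rounded(.down)
        return degrees + (value - degrees * 100) / 60
    }

    var latitude = decimalDegrees(rawLat)
    var longitude = decimalDegrees(rawLon)
    if fields[3] == "S" { latitude = -latitude }
    if fields[5] == "W" { longitude = -longitude }
    return CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
}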
We are a reseller for Juniper Systems, and we know that they have an app on the App Store that has the same features as our product.
https://junipersys.com/index.php/support/article/14709
We look forward to your follow-up and recommendations.
Hello Apple Developer Team,
I’m currently using Apple MapKit JS as the main map provider for our logistics, telematics, and HR platform TADMIN, and we are extremely satisfied with its reliability, accuracy, and visual quality.
We would now like to expand our Apple integration by migrating our backend reverse geolocation services to Apple as well. However, our current usage requirements significantly exceed the standard 25,000 daily service request limit.
At this stage, we already need between 250,000 and 350,000 reverse geocoding requests per day, and this number will continue to grow rapidly. Our TADMIN Tracking product processes live GPS data from connected vehicle telematics boxes, and each vehicle sends an average of 1.5 pings per minute in normal mode. We currently manage around 140 vehicles and are already in discussions with new customers that will add over 1,000 additional vehicles to the platform soon. As our customer base continues to expand, we expect this growth trend to accelerate significantly over the coming months.
We already make extensive use of caching to minimize redundant geolocation calls. For example, we reuse location data when vehicles remain within a defined radius. However, since trucks rarely stay stationary for long, there is still a constant flow of new coordinates that require reverse geolocation.
To give you a broader picture:
TADMIN is a comprehensive SaaS ecosystem for the transport and logistics industry, combining HR management, telematics and tracking, dispatching, and data analysis into one integrated platform.
The Tracking module is just one part of this system and serves as the live data backbone for our dispatching, HR, and telematics analytics modules.
We would therefore like to increase our quotas for:
Service Requests (especially Reverse Geocoding)
Snapshot Requests, which we use for our Geofence Alerts. These are sent via push notification and email, and we would love to include the snapshot images in the emails for a clear and visually rich presentation
MapKit JS Views, since we also use MapKit JS heavily in our web dashboards, for example in our tracking portal
Higher quotas would allow us to rely even more on Apple services, including Autocomplete and Geocoding for customer-facing address searches inside our applications.
We already have three apps published on the App Store, with a fourth one scheduled for release this week, and I will soon be upgrading my Apple Developer account to an Organization Account for our company.
We are currently evaluating providers for this next stage of integration, as we are preparing a new major version of our TADMIN software, which will introduce a reworked telematics backend. Our goal is to migrate to Apple’s geolocation and map services as part of this new release.
Could you please advise how we can best address this use case, for example through higher quotas or an adjusted configuration?
Thank you very much for your time and support.
Best regards,
Timo Köhler
Founder & CTO, TADMIN GmbH
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
Maps Web Snapshots
MapKit JS
MapKit
Apple Maps Server API
Since integrating MapKit JS, we’ve begun receiving production error reports with the following message:
Uncaught DataCloneError: Failed to execute 'postMessage' on 'DedicatedWorkerGlobalScope': ArrayBuffer is not detachable and could not be cloned.
It appears that MapKit JS’s internal worker occasionally calls postMessage() with an ArrayBuffer that cannot be detached under Chrome 120+. This causes the structured clone to fail and the error surfaces uncaught from within the worker.
MapKit JS Version: 5.79.109
Browser: Chrome 120.0+
OS: Windows 10
Is this a known issue with MapKit JS? If so, are there recommended workarounds or planned fixes?
Are the Apple Weather precipitation radars available through the REST API, or have I hit a brick wall? I would love to have the visuals of the native overlays in my app, but I can't find an affordable, comparable API. Any help would be appreciated. I really need the smooth hourly scrubbing, and I'm not keen on rendering my own with vectors yet.
Thank you.
Hi,
I’m a member of the Apple Developer Program and I’m planning to use Apple Maps Server API together with MapKit JS for a production, customer-facing web service.
I have reviewed the Apple Developer Program License Agreement (including Schedule 6 – Apple Maps Services) and the documentation, but I still need clarification on several points to ensure that our usage fully complies with Apple’s policies.
Daily quota and additional capacity
From the documentation, I understand that there is a daily limit of 250,000 map views and 25,000 service calls per Apple Developer Program membership, shared between MapKit JS and Apple Maps Server API.
When the 25,000 service calls are exceeded, the API returns HTTP 429.
Should this limit be considered a hard limit for production use?
The wording “For additional capacity needs, contact us” is unclear. Is there any official channel or program to request a higher quota, or should we assume this is not practically available and design our system to always stay within the documented limit?
Caching of geocoding / reverse-geocoding results
Schedule 6 section 2.5 restricts caching, prefetching, or storing map data except when temporary and only as necessary for Apple Maps Services, and any cached data must be deleted after use.
To understand what “temporary” means in practice, I would like to confirm whether the following scenarios are acceptable:
(a) In-memory cache during a single page or tab session:
- Store geocoding results (latitude/longitude and normalized address) only in a JavaScript in-memory structure (e.g., a Map object) during the lifetime of the browser tab.
- Delete all cached results when the tab is closed or after a short TTL (for example, a few minutes).
(b) sessionStorage with a short TTL:
- Store geocoding results in window.sessionStorage on a per-tab basis.
- Apply a short TTL (for example, a few minutes), and delete the data when the TTL expires or the tab is closed.
Are both (a) and (b) considered acceptable forms of “temporary caching” under section 2.5, or should we avoid sessionStorage and limit ourselves to purely in-memory (non-persistent) caching?
Use on a commercial customer-facing website
Our intended use case is a public website that:
Displays store locations on a map
Allows users to search for nearby stores
Optionally shows routing directions
We do not do fleet management, asset tracking, enterprise route optimization, or insurance risk scoring.
Is this type of consumer-facing store-locator use case permitted under the Apple Maps Services terms?
Any clarification from the Maps or MapKit teams would be greatly appreciated.
Thank you.
Best regards,
Naoto Omori
Hello,
I am Asmaa Atine.
I would like to suggest an improvement for the Apple Maps app.
My idea is to allow users to draw the general path they would like to follow directly on the map with their finger, and then have the app automatically generate an optimized route that follows the drawn trajectory as closely as possible.
This feature would be very useful in several situations, such as:
• when the user wants to pass through a specific area but the suggested routes don’t match,
• when they want to avoid certain places or include a particular spot,
• or when they simply want a more flexible, intuitive way to customize a route.
The concept would be:
1. the user draws a rough path on the map,
2. Apple Maps interprets the drawing,
3. and then proposes the best possible route based on that drawn line.
I believe this would greatly enhance the flexibility of Apple Maps and provide a more intuitive way to create personalized routes.
Thank you for considering this suggestion, and congratulations on the great work already done on the app.
Topic:
App & System Services
SubTopic:
Maps & Location
I am experiencing a persistent issue with my CarPlay application where images rendered within the CarPlay template interface disappear after the application has been used for an extended period, typically during prolonged navigation.
Problem Description
Images used directly within the CarPlay template framework disappear. In the attached image showing the issue (IMG_1022.PNG), you can see that the icons for 'parking', 'gasstation', 'conveniencestore', and 'favoritespot' are missing. The sidebar icons (car, battery, etc.) remain visible, and the text labels are present, but the template-specific images/icons vanish.
Images displayed on a custom UIViewController remain visible. Some of our screens integrate a UIViewController (e.g., for map display), and any images rendered on that view controller (not the template itself) continue to display correctly without issue.
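For context, the icons that disappear are supplied to the template roughly like this (a simplified sketch; the titles stand in for our real asset names):

import CarPlay
import UIKit

// Simplified sketch of how the SearchMenu items are built; the disappearing icons are
// the UIImage values handed to CPListItem, not anything drawn by our own view controller.
func makeSearchMenuSection() -> CPListSection {
    let titles = ["parking", "gasstation", "conveniencestore", "favoritespot"]
    let items = titles.map { title in
        CPListItem(text: title, detailText: nil, image: UIImage(named: title))
    }
    return CPListSection(items: items)
}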
Example Images
IMG_1021.PNG (Normal/Correct Display): This image shows the SearchMenu screen with all icons displayed correctly next to their respective labels ('word', 'home', 'route', 'history', 'parking', 'gasstation', 'conveniencestore', 'favoritespot').
IMG_1022.PNG (Problem State): This image shows the same screen after prolonged use, where the icons next to 'parking', 'gasstation', 'conveniencestore', and 'favoritespot' have disappeared, leaving only the text labels.
Question
Has anyone encountered a similar issue? This seems to be a rendering or resource management problem specific to images within the CarPlay Template components when the application runs for an extended duration.
I want to use MapKit with App Intents, but the map does not show up. (See the attached image.)
Can anyone help me solve this?
import SwiftUI
import MapKit

struct ContentView: View {
    // Region centered on Apple Park, shown in a fixed 300x300 map.
    @State private var region = MKCoordinateRegion(
        center: CLLocationCoordinate2D(latitude: 37.334_900, longitude: -122.009_020),
        latitudinalMeters: 750,
        longitudinalMeters: 750
    )

    var body: some View {
        VStack {
            Map(coordinateRegion: $region)
                .frame(width: 300, height: 300)
                .disabled(true)
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
import AppIntents
import SwiftUI
import MapKit

struct test20220727bAppIntentsExtension: AppIntent {
    static var title: LocalizedStringResource = "test20220727bAppIntentsExtension"

    func perform() async throws -> some IntentResult {
        // Return the SwiftUI map view as the snippet shown for the intent result.
        return .result(value: "aaa", view: ContentView())
    }
}

struct testShortcuts: AppShortcutsProvider {
    @available(iOS 16.0, *)
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: test20220727bAppIntentsExtension(),
            phrases: ["test20220727bAppIntentsExtension"]
        )
    }
}
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
App Intents
wwdc2022-10032
wwdc2022-10170
First of all :
Thanks for the great presentation (wwdc2023-10180), Siraj !
This new, simple API looks like what we've been looking for: easily manageable background location updates with 'automatic battery drain minimization' :-)
There were two questions that came to my mind. As far as I understood, the CLLocationUpdate.LiveConfiguration is used to help the location services improve the location fixes.
Are there other options planned to specify the granularity of the delivered locations, e.g. how accurate the locations need to be (like the desiredAccuracy and distanceFilter settings on the old CLLocationManager)?
Does the implementation switch between significant location changes and the regular, more expensive sources (like the GPS hardware), or does it just deliver the most feasible accuracy available at the time of notification?
I'm just curious - if I get the most feasible granularity, everything is fine for me anyway :-)
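For context, this is roughly how I plan to consume the stream (a minimal sketch; .automotiveNavigation is just the configuration I happened to pick):

import CoreLocation

// Minimal sketch: consume the async stream and use whatever fix it delivers.
func observeLocations() async {
    do {
        for try await update in CLLocationUpdate.liveUpdates(.automotiveNavigation) {
            guard let location = update.location else { continue }
            print("fix: \(location.coordinate), accuracy: \(location.horizontalAccuracy) m")
        }
    } catch {
        print("liveUpdates ended with error: \(error)")
    }
}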
Thanks again,
Michael
I have the CarPlay Entitlement "Driving Task" and two of my apps use it.
Now, in both apps, I have implemented Navigation. I requested the Navigation CarPlay Entitlement when the feature was mature and builds were available in Test Flight, since I wanted to release the new versions of the apps with navigation available both on the iPhone and in CarPlay.
I got no answer to my request, so I decided to release the apps with navigation only on the iPhone and the Driving Task functionality in CarPlay, thinking that maybe being live in the App Store with navigation was a requirement. I have asked for the entitlement again, and so far the request is being ignored again.
What are the requirements to get the Navigation CarPlay Entitlement?
If the app is approved for navigation, is there something else the app must do to get the entitlement?
Requirements for CarPlay Entitlements seem quite obscure, are they listed anywhere?
Is there a technical problem with moving from one existing CarPlay entitlement to another? Could that be the reason the entitlement has not been granted?
Some of my competitors have the CarPlay Navigation entitlement. My use case is the same (in a better app in my opinion, of course). But I am only getting bad reviews because "the app does not include the map in CarPlay" after the big investment in implementing navigation in the apps.
Any help or insight would be appreciated.
I have implemented geofencing using CLMonitor.
The implementation follows this general structure:
private var monitorTask: Task<Void, Never>?
private var backgroundSession: CLBackgroundActivitySession?

func start() async {
    // Keep the app eligible to receive location updates in the background.
    backgroundSession = CLBackgroundActivitySession()

    monitorTask = Task {
        do {
            let monitor = await CLMonitor("monitor")
            // Await condition events (entry/exit) delivered by the monitor.
            for try await event in await monitor.events {
                handleEvent(event: event)
            }
        } catch {}
    }
}

func addSpot() async {
    let monitor = await CLMonitor("monitor")
    let center = CLLocationCoordinate2D(latitude: 0, longitude: 0)
    let condition = CLMonitor.CircularGeographicCondition(center: center, radius: 100)
    await monitor.add(condition, identifier: "sample-1")
}
While the app has not been terminated, the code inside handleEvent executes as expected.
However, after the user force-quits the app (a user-initiated task kill), the monitoring no longer works properly.
My app was uploaded to the App Store but has been rejected 5 times because of the location and photo permission purpose strings. The idea of the app is to add a compliment, choose a location on the map, and add some info. The location is not associated with the user's identity; it is only used to show a place on the map. Even if the place the user wants to pick is nearby, they can refuse the location permission: the map still opens and they can pick the point manually. The photo permission is needed so they can add images to a compliment. I need help writing the location and photo permission purpose strings.
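In case it helps, these are the kinds of purpose strings I am considering (the wording is only a draft of my own; the keys are the standard Info.plist keys for when-in-use location and photo library access):

<key>NSLocationWhenInUseUsageDescription</key>
<string>Your location is used only to center the map and suggest a nearby spot for your compliment. It is never linked to your identity, and you can always pick the spot manually instead.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Photo access lets you attach pictures from your library to the compliments you post.</string>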
Topic:
App & System Services
SubTopic:
Maps & Location
Hi,
I have developed an application in React Native. The application is for truck drivers: we add a load, and when a driver accepts the load we send their location to Firebase. The issue is that this does not work when the app is in the background on a physical device. We tried it on the simulator, and it works perfectly in the background.
But when I make a build and test it on a physical device, the background task does not work.
Hello,
I'm currently migrating my app's location service to the new CLLocationUpdate.Updates.
I'm trying to understand what can fail in this AsyncSequence. Based on the previous CLError values, I thought authorization was one of them, for example, but it turns out that this is handled by CLLocationUpdate itself, where we can check different properties.
So, is there a list of errors available somewhere?
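For context, my migration currently looks like this (a minimal sketch), and the catch block is the part I'm unsure about:

import CoreLocation

// Minimal sketch of the migrated loop. Authorization and accuracy problems seem to
// surface as diagnostic properties on each CLLocationUpdate rather than as thrown
// errors, so I'm unsure what can actually land in the catch block.
func startUpdates() async {
    do {
        for try await update in CLLocationUpdate.liveUpdates() {
            guard let location = update.location else { continue }
            // use the location here
            _ = location
        }
    } catch {
        print("liveUpdates threw: \(error)")
    }
}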
Thanks
Axel, @alpennec
Hi,
I'm building an aftermarket solution to enable Apple Maps to support EV routing for any EV.
I am going through the documentation and found some gaps - does anyone know how the following properties work?
INGetCarPowerLevelStatusIntentResponse - consumptionFormulaArguments
INGetCarPowerLevelStatusIntentResponse - chargingFormulaArguments
Is there a working example that anyone has seen?
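For reference, this is as far as I have got (a sketch only; the dictionary keys below are placeholders, because the expected schema for the two formula-arguments properties is exactly what I cannot find documented):

import Intents
import Foundation

// Sketch of the response I can build today; the formula-arguments keys are placeholders.
func makePowerLevelResponse() -> INGetCarPowerLevelStatusIntentResponse {
    let response = INGetCarPowerLevelStatusIntentResponse(code: .success, userActivity: nil)
    response.chargePercentRemaining = 80
    response.distanceRemaining = Measurement(value: 240, unit: UnitLength.kilometers)
    response.consumptionFormulaArguments = ["placeholderKey": 0.18]  // assumption
    response.chargingFormulaArguments = ["placeholderKey": 11.0]     // assumption
    return response
}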
Many thanks
Topic:
App & System Services
SubTopic:
Maps & Location
Tags:
CarPlay
SiriKit
Maps and Location
App Intents
Does anyone know how long it usually takes for us to hear back from Apple regarding a request for Location Push Service Extension entitlement?
I am currently working on preventing devices from reporting a simulated (spoofed) location. At the moment I cannot directly determine, through Apple's APIs, whether a location comes from a real or a simulated source. How should I go about detecting this?
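The closest first-party signal I have found so far is CLLocation.sourceInformation (available since iOS 15), but it does not seem sufficient on its own; a small sketch of how I currently check it:

import CoreLocation

// Sketch: these flags catch software-simulated or accessory-provided locations,
// but they are not a complete anti-spoofing guarantee.
func looksSimulated(_ location: CLLocation) -> Bool {
    guard let source = location.sourceInformation else { return false }
    return source.isSimulatedBySoftware || source.isProducedByAccessory
}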
Topic:
App & System Services
SubTopic:
Maps & Location
I'm following:
https://developer.apple.com/documentation/applemapsserverapi/creating-a-maps-identifier-and-a-private-key#Create-a-Maps-ID
to create a Maps ID and a private key. On step #4 I can't find the "Maps IDs" checkbox on the web page; below is a screen capture showing all the options I have on my page:
Context: Currently in iOS, both “Allow Once” and “While Using the App” location permission decisions yield .authorizedWhenInUse. This conflation prevents apps from knowing whether the user has provided a one-time allowance or a persistent in-use allowance.
Problem Statement
Ambiguous App Behavior: After a user selects “Allow Once,” the app remains in .authorizedWhenInUse, making it appear to the developer as if the user granted a more persistent “While Using” permission.
Poor User Experience: If the user later indicates they want to upgrade to “Always,” developers must guess whether iOS will show another system prompt. This can lead to “dead” button presses or pointless transitions to Settings.
Lack of Transparency: The user’s real intention—“I only trust you this one time”—gets lost in .authorizedWhenInUse with no direct or synchronous detection mechanism.
Why This Wouldn’t Violate SRP
The CLLocationManager's single responsibility: manage and expose the user's current location authorization state.
Adding .authorizedOneTime or an isOneTime property fits neatly into that responsibility. It’s still describing the user’s level of trust for location usage, just with more specificity.
No Overreach: This doesn’t add new logic outside location permissions—it merely refines the existing state definitions for clarity.
Simplifies the Developer Flow: Instead of co-mingling “Allow Once” and “While Using,” the system returns the precise state, letting developers handle transitions more gracefully while abiding by iOS’s privacy rules.
Benefits
Improved UX: Developers can present more accurate prompts or guidance. If .authorizedOneTime, the app can immediately direct the user to Settings for a persistent upgrade, rather than futilely calling requestAlwaysAuthorization() again (see the sketch after this list).
Less Confusion: A distinctly reported “Allow Once” state eliminates guesswork, polling, or timed approaches that degrade user experience.
Consistent with iOS’s Privacy Focus: Providing a read-only flag or status for “One Time” aligns with Apple’s approach to clarity around permissions, without letting apps forcibly bypass user intentions.
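A sketch of the developer flow this proposal would enable (isOneTimeAuthorization is hypothetical and stubbed here purely for illustration; it does not exist in CoreLocation today):

import CoreLocation
import UIKit

// Hypothetical stub standing in for the proposed refinement so the flow below compiles.
extension CLLocationManager {
    var isOneTimeAuthorization: Bool { false } // proposed, not a real API
}

func handleAuthorizationChange(_ manager: CLLocationManager) {
    switch manager.authorizationStatus {
    case .authorizedWhenInUse where manager.isOneTimeAuthorization:
        // "Allow Once": calling requestAlwaysAuthorization() again would be a dead end,
        // so send the user straight to Settings for a persistent upgrade.
        if let url = URL(string: UIApplication.openSettingsURLString) {
            UIApplication.shared.open(url)
        }
    case .authorizedWhenInUse:
        // Persistent "While Using the App": the Always upgrade prompt is still possible.
        manager.requestAlwaysAuthorization()
    default:
        break
    }
}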
Topic:
App & System Services
SubTopic:
Maps & Location