I have a Map in SwiftUI using MapKit, and the map has several annotations and MapCircles added to it. I need the ability to center the map on a specific latitude and longitude. The issue is that the map instead centers itself so that all annotations, MapCircles, etc. are visible. How can I have it disregard the items added to the map and center at a specific latitude and longitude, and ideally also control the zoom level of the map?
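A minimal sketch of one way to do this, assuming the iOS 17 Map(position:) API; the coordinates and camera distance below are placeholders. Giving the map an explicit MapCameraPosition stops it from auto-fitting to its annotations and overlays:

```swift
import SwiftUI
import MapKit

struct CenteredMap: View {
    // Explicit camera: the map no longer frames all content automatically.
    @State private var position: MapCameraPosition = .camera(
        MapCamera(
            centerCoordinate: CLLocationCoordinate2D(latitude: 51.5074, longitude: -0.1278),
            distance: 2_000 // meters above the center; smaller means more zoomed in
        )
    )

    var body: some View {
        Map(position: $position) {
            // Annotations, MapCircles, etc. go here; they no longer drive the framing.
        }
    }
}
```

If you prefer span-based zoom, MapCameraPosition.region(MKCoordinateRegion(center:latitudinalMeters:longitudinalMeters:)) works the same way.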
In the iOS 10 beta 2, the Maps app uses a view controller that has three sizes (I'll refer to them as minimized, compact, and expanded) and supports a handle. Pull up on the handle when in compact mode and the view expands over the map view and dims it. Pull down and the background view is slowly undimmed until you get to the compact size. Keep pulling down and the view is then minimized to just the search bar. A great look for the app I am working on. I am looking to use the system framework that does this, or to replicate it. Does anyone know if this is offered to developers? I cannot figure out what Apple used to support this. Any pointers would be appreciated.
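For later readers: system support for this arrived for third parties much later, with UISheetPresentationController in iOS 15; on iOS 10 it had to be replicated with a custom container and pan gesture. A hedged sketch of the iOS 15 route, with placeholder controller names:

```swift
import UIKit

// Presents `content` as a resizable sheet over a map, mimicking the Maps app:
// a grabber handle, a compact (medium) size, and an expanded (large) size that
// dims the map behind it. A third "minimized" size needs a .custom detent (iOS 16).
func presentMapSheet(from host: UIViewController, content: UIViewController) {
    if let sheet = content.sheetPresentationController {
        sheet.detents = [.medium(), .large()]            // compact and expanded sizes
        sheet.prefersGrabberVisible = true               // the pull handle
        sheet.largestUndimmedDetentIdentifier = .medium  // dim only when expanded
    }
    host.present(content, animated: true)
}
```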
Greetings, I'm attempting to refactor a SwiftUI application. From: a UIViewRepresentable-wrapped MKMapView. To: SwiftUI's native Map() view. However, my application has to be able to react to the user's panning and zooming. In my UIViewRepresentable version, I added the MKMapViewDelegate protocol to the Coordinator class and implemented mapView(_:regionDidChangeAnimated:). How can I assign a delegate to the SwiftUI native version to accomplish this? I've seen some posts use an init() method to adjust the appearance of the map with MKMapView.appearance(). It turns out this has a delegate property, but assigning a delegate there does not result in mapView(_:regionDidChangeAnimated:) being called...
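For later readers: the native Map gained a camera-change callback in iOS 17 that covers the delegate method's role. A minimal sketch, assuming iOS 17:

```swift
import SwiftUI
import MapKit

struct PanZoomAwareMap: View {
    @State private var position: MapCameraPosition = .automatic

    var body: some View {
        Map(position: $position)
            // Fires when a pan/zoom gesture ends, much like
            // mapView(_:regionDidChangeAnimated:); use .continuous for every frame.
            .onMapCameraChange(frequency: .onEnd) { context in
                print("New visible region:", context.region)
            }
    }
}
```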
This is frustrating. I started using Google Maps instead to avoid this issue with the Apple mouse. Scroll zooming works as expected on Google Maps.
There are a lot of conditions for an app to appear as a service that guides the user from A to B. Notably, to register your app as a directions provider: configure your app to accept directions requests and declare a document type to handle incoming routing requests; declare the map regions that your app supports using a geographic coverage file (specified as a GeoJSON file); process direction request URLs when they are sent to your app (see the sketch below). Did you check that you did everything described in the document?
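A rough sketch of the third step (processing direction request URLs), assuming a UIKit app delegate; the document type and GeoJSON coverage file from the first two steps are declared separately in the target's settings:

```swift
import UIKit
import MapKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        // Maps hands the app a directions-request URL; verify before parsing.
        guard MKDirections.Request.isDirectionsRequestURL(url) else { return false }
        let request = MKDirections.Request(contentsOf: url)
        let origin = request.source            // MKMapItem? for point A
        let destination = request.destination  // MKMapItem? for point B
        // Feed origin/destination to your own routing engine here.
        return origin != nil && destination != nil
    }
}
```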
So MKMapViewDelegate is no longer supported in watchOS 2. How, then, do I retrieve a map view to show in my watch app? Currently a map is still supported as a drag-and-drop object in WatchKit IB, but if you can't have an MKMapViewDelegate, nor import MapKit, it seems like a useless object. Do you have to use WCSession to manage this? If so, how? I've been scouring around and have found nothing so far on how to show a live (updating) map view on the watch as of version 2. Thanks for the help!
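For what it's worth, the drag-and-drop object is WKInterfaceMap, which is driven imperatively from the interface controller rather than through a delegate, and it renders a static snapshot, so a truly live map isn't available this way. A minimal sketch, assuming an IB outlet named map:

```swift
import WatchKit
import MapKit // for MKCoordinateRegion / MKCoordinateSpan

class MapController: WKInterfaceController {
    @IBOutlet var map: WKInterfaceMap!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        let center = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
        let span = MKCoordinateSpan(latitudeDelta: 0.05, longitudeDelta: 0.05)
        map.setRegion(MKCoordinateRegion(center: center, span: span)) // recenter the snapshot
        map.addAnnotation(center, with: .red)                          // drop a pin
    }
}
```

To keep it "live", you would push fresh coordinates (e.g. received over WCSession) and call setRegion again.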
Hello everyone, recently I enabled the Maps capability in my iOS Xcode project and now I cannot disable it. iMac 14,3, Xcode 8.2.1, macOS Sierra 10.12.2.
I was just playing around with the Maps app, and it shows my location incorrectly. Has anyone else noticed this? A 6s Plus and a 7 show the same, but an SE on 11.2.5 is correct.
The labels are stored in your mlmodel file. If you open the mlmodel in Xcode 12, it will display what those labels are. My guess is that instead of actual labels, your mlmodel contains CICAgICAwPmveRIJQWdsYWlzX2lv and so on. If that is the case, you can make a dictionary in the app that maps CICAgICAwPmveRIJQWdsYWlzX2lv and so on to the real labels, or you can replace these labels inside the mlmodel file by editing it using coremltools. (My e-book Core ML Survival Guide has a chapter on how to replace the labels in the model.)
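A sketch of the dictionary approach; the decoded name "Aglais_io" is hypothetical here, so substitute your model's real classes:

```swift
import Vision

// Maps the encoded identifiers baked into the mlmodel to human-readable labels.
let labelMap: [String: String] = [
    "CICAgICAwPmveRIJQWdsYWlzX2lv": "Aglais_io",
    // ... one entry per encoded class label in the model
]

func readableLabel(for observation: VNClassificationObservation) -> String {
    // Fall back to the raw identifier if a label is missing from the map.
    labelMap[observation.identifier] ?? observation.identifier
}
```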
Hi all. In the iBook Apple Education course App Development with Swift, there is this statement: “For example, reopen the documentation window (if it's not still open) and search for UIViewController again. Near the bottom of the search results, you'll see two sections called Guides and Sample Code.” Excerpt from: Apple Education, “App Development with Swift,” Apple Inc. - Education, 2017. iBooks. https://itunes.apple.com/ca/book/app-development-with-swift/id1219117996?mt=11 Well... in my version of Xcode (9.2) under OS X (10.12.6), those sections are NOT there! And judging from the example provided in the book, I would think they would be VERY helpful! Where are they now??? Thanks!
Are Maps turn alert haptics and sounds working for anyone in watchOS 5? I cannot get them to work. Tried unpairing and setting up as new; that did not work.
My issue is essentially the same as described in "MKUserLocation stops updating when map is touched", except for MapKit JS. That question has been left without an answer and the chat has been removed, so I'm left wondering what the outcome was. In short, I create a MapKit JS map and I set: map.tracksUserLocation = true; map.showsUserLocation = true; This shows and tracks the user's position on the map. Once I pan/zoom the map, map.tracksUserLocation is set to false (as it should be). However, the user's dot stops updating entirely as well. I have run some tests, and it looks like if I initially only set map.showsUserLocation to true but don't specify whether the map should track the user's location, the user's dot will simply not show up at all. I can override this behaviour by constantly setting both variables to true, but as you can imagine the experience is awful. My code is essentially the bare basics: const map = new mapkit.Map('map'); map.tracksUserLocation = true; map.showsUserLocation = true;
I want to know whether the depth map and RGB image are perfectly aligned (do both have the same principal point)? If yes, then how is the depth map created? The depth map on iPhone 12 has 256x192 resolution, as opposed to the RGB image (1920x1440). I am interested in exact pixel-wise depth. Is it possible to get a raw depth map at 1920x1440 resolution? How is the depth map created at 256x192 resolution? Behind the scenes, does the pipeline capture it at 1920x1440 and then resize it to 256x192? I have so many questions, as there are no intrinsic, extrinsic, or calibration data given for the LiDAR. I would greatly appreciate it if someone could explain the steps from a computer-vision perspective. Many thanks.
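Not an answer to the calibration questions, but for reference, this is how the 256x192 buffer is reached in ARKit, with the intrinsics that are available; the rescaling noted in the comments is the usual computer-vision workaround, not documented Apple behavior:

```swift
import ARKit

// Enable LiDAR scene depth on a supported device.
func configureSession(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    config.frameSemantics.insert(.sceneDepth)
    session.run(config)
}

func handle(_ frame: ARFrame) {
    guard let depth = frame.sceneDepth else { return }
    let depthMap = depth.depthMap    // CVPixelBuffer, 256x192, Float32 meters
    let color = frame.capturedImage  // CVPixelBuffer, e.g. 1920x1440
    // frame.camera.intrinsics is expressed at the capturedImage resolution;
    // to project depth pixels, scale fx/cx by 256/1920 and fy/cy by 192/1440.
    let intrinsics = frame.camera.intrinsics
    _ = (depthMap, color, intrinsics)
}
```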
Are external applications such as Yelp or Zomato not supposed to work with Maps yet, or is it a bug? The apps don't do what they say they can do with Apple Maps.
How are debug maps saved in Mach-O files? I'm trying to essentially recreate the output of dsymutil -dump-debug-map.
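For reference, the debug map is not a dedicated load command: it is the set of STAB entries (N_SO for source files, N_OSO for object file paths plus mtimes, N_FUN and friends for symbols) that the linker leaves in the executable's nlist symbol table, and dsymutil -dump-debug-map reconstructs its output by walking them. A rough sketch that lists those entries by shelling out to nm, whose -a flag includes the debugger-only (STAB) symbols:

```swift
import Foundation

// Prints the raw STAB entries that make up a binary's debug map.
func dumpStabs(of binaryPath: String) throws -> String {
    let nm = Process()
    nm.executableURL = URL(fileURLWithPath: "/usr/bin/nm")
    nm.arguments = ["-ap", binaryPath] // -a: include STABs, -p: no sorting
    let pipe = Pipe()
    nm.standardOutput = pipe
    try nm.run()
    nm.waitUntilExit()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    return String(decoding: data, as: UTF8.self)
}
```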