Apple Maps Guides


Post not yet marked as solved
2 Replies
394 Views
I am wondering where the source code of ld-prime is on GitHub. There are references to it in the GitHub repository for dyld, but I can't find it anywhere. I suspect it uses LLVM for linking. I'm looking out of curiosity, to see whether Apple is slowly closing down its compiler toolchain, and maybe to learn how Apple sped up the linking stage.
Posted Last updated
.
Post not yet marked as solved
1 Replies
1k Views
Hi All, I am writing a small app for the Apple Watch Series 7 where I am interested in using the accelerometer, gyroscope, and magnetometer. But it seems we can only sample the accelerometer, as I am not receiving any data from the other sensors. When running similar code on an iPhone, I can read raw data from all the sensors. Does anyone know how we can read the gyroscope and magnetometer? The Apple Watch Series 7 specifications suggest these sensors are part of the watch hardware. I would appreciate any help.
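For reference, a minimal sketch of how one might probe sensor availability with Core Motion (the properties and methods are standard Core Motion API; the 50 Hz interval is an arbitrary choice). In practice, many developers report that the raw gyro and magnetometer interfaces stay unavailable on watchOS even where the fused device-motion interface still delivers rotation-rate data, so checking both paths is worthwhile:

```swift
import CoreMotion

let motionManager = CMMotionManager()

func startSampling() {
    // Check which raw sensors the OS exposes on this device.
    print("accelerometer available: \(motionManager.isAccelerometerAvailable)")
    print("gyro available: \(motionManager.isGyroAvailable)")
    print("magnetometer available: \(motionManager.isMagnetometerAvailable)")

    // Device motion fuses the available sensors; on watchOS it is often
    // the only route to rotation-rate and magnetic-field data.
    if motionManager.isDeviceMotionAvailable {
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz, arbitrary
        motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion else { return }
            print("rotation rate: \(motion.rotationRate)")
            print("magnetic field: \(motion.magneticField.field)")
        }
    }
}
```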
Posted Last updated
.
Post not yet marked as solved
1 Replies
Can someone please make the answer to this conversation public? I have been trying to figure this out for a year; it's such a simple task. capturedRoom.Surface.Walls defines an array of walls. I want to collect the color of the wall that is scanned, then apply that color to an SCNBox set to the dimensions of the wall, and replace the wall with my box, so I have a room that shows painted walls. Similarly, I want to provide a sort of model provider for common floor types, and then map images of those floor types (wood, tile, of many varieties) to the diffuse contents of the floor. But when I try to set anything as the diffuse contents of the model that RoomPlan produces, nothing happens: it remains white and blank. Why? Please post the answer here. Don't take it private, because then when the next person is struggling with this question they have no way of figuring it out beyond waiting for the response to come in an email. Ugh.
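One way to sketch the SCNBox approach described above, assuming a RoomPlan CapturedRoom.Surface for the wall (its dimensions and transform are standard RoomPlan properties; the paint color, wall thickness, and function name are placeholders):

```swift
import SceneKit
import RoomPlan
import UIKit

// Build a paintable box node from a RoomPlan wall surface.
// `paintColor` is a placeholder; a real app would sample it from the scan.
func makeWallNode(for wall: CapturedRoom.Surface, paintColor: UIColor) -> SCNNode {
    let box = SCNBox(width: CGFloat(wall.dimensions.x),
                     height: CGFloat(wall.dimensions.y),
                     length: 0.1,            // assumed wall thickness in meters
                     chamferRadius: 0)

    let material = SCNMaterial()
    material.diffuse.contents = paintColor   // or a UIImage for floor textures
    box.materials = [material]

    let node = SCNNode(geometry: box)
    node.simdTransform = wall.transform      // place the box where the wall was
    return node
}
```

Building fresh SCNBox geometry per surface, as here, sidesteps retexturing RoomPlan's own exported model, which is a different problem from the blank-material behavior described in the post.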
Post marked as solved
1 Replies
I figured it out. You have to use mapCameraKeyframeAnimator().

@State private var centerCoordinate = CLLocationCoordinate2D(latitude: 38.9072, longitude: -77.0369)
@State private var distance: CLLocationDistance = 1000000
@State private var triggerCamera = false

Map(initialPosition: .camera(MapCamera(centerCoordinate: centerCoordinate, distance: distance))) {
}
.frame(height: geo.size.height * 0.60)
.shadow(color: .black.opacity(0.5), radius: 1, y: 1)
.onReceive(locationManager.$location, perform: { location in
    if let location {
        centerCoordinate = location
        triggerCamera = true
    }
})
.mapCameraKeyframeAnimator(trigger: triggerCamera, keyframes: { camera in
    KeyframeTrack(\MapCamera.centerCoordinate, content: {
        LinearKeyframe(centerCoordinate, duration: 1)
    })
    KeyframeTrack(\MapCamera.distance, content: {
        LinearKeyframe(300, duration: 1)
    })
})

Updating the trigger to true will start the animation, which moves to the provided location.
Post marked as solved
1 Replies
75 Views
Map(initialPosition: .camera(mapCamera)) {
    Marker("Here", coordinate: location)
}
.frame(height: 300)
.clipShape(RoundedRectangle(cornerSize: CGSize(width: 10, height: 10)))
.onMapCameraChange(frequency: .continuous) { cameraContext in
    locationManager.location = cameraContext.camera.centerCoordinate
}
.onReceive(locationManager.$location, perform: { location in
    if let location {
        mapCamera.centerCoordinate = location
    }
})

class LocationDataManager: NSObject, CLLocationManagerDelegate, ObservableObject {
    enum LoadingState {
        case loading
        case notLoading
        case finished
    }

    static let shared = LocationDataManager()
    private let locationManager = CLLocationManager()
    @Published var location: CLLocationCoordinate2D? = nil
    @Published var loading: LoadingState = .notLoading

    override init() {
        super.init()
        locationManager.delegate = self
    }

    func resetLocation() {
        loading = .notLoading
        location = nil
    }

    func getLocation() {
        locationManager.requestLocation()
        loading = .loading
    }

    func locationManager(_ manager: CLLocationM
Posted
by Xavier-k.
Last updated
.
Post not yet marked as solved
2 Replies
83 Views
I am trying to load and view several locations on a map from a JSON file in my SwiftUI project, but I continually encounter the error "no exact matches in call to initializer" in my ContentView.swift file.

What I am trying to do: I am working on a SwiftUI project where I need to display several locations on a map. These locations are stored in a JSON file, which I have successfully loaded into Swift. My goal is to display these locations as annotations on a Map view.

JSON file contents:
coordinates: latitude and longitude
name: name of the location
uuid: unique identifier for each location

Code and screenshots: here are the relevant parts of my code and the error I keep encountering:

import SwiftUI
import MapKit

struct ContentView: View {
    @State private var mapPosition = MapCameraPosition.region(
        MKCoordinateRegion(
            center: CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194),
            span: MKCoordinateSpan(latitudeDelta: 0.05, longitudeDelta: 0.05)
        )
    )
    @State private var featur
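For comparison, a minimal sketch of a decodable location model matching the JSON fields listed above, shaped so it works with MapKit's ForEach/Marker pattern (the type name, property names, and "locations.json" filename are assumptions):

```swift
import Foundation
import CoreLocation

// Matches the JSON fields described above: coordinates, name, uuid.
struct Feature: Decodable, Identifiable {
    struct Coordinates: Decodable {
        let latitude: Double
        let longitude: Double
    }

    let coordinates: Coordinates
    let name: String
    let uuid: UUID

    // Identifiable conformance lets ForEach iterate the array directly.
    var id: UUID { uuid }

    // Convenience for Marker(coordinate:) and friends.
    var coordinate: CLLocationCoordinate2D {
        CLLocationCoordinate2D(latitude: coordinates.latitude,
                               longitude: coordinates.longitude)
    }
}

// Decoding from a bundled file (filename assumed):
func loadFeatures() throws -> [Feature] {
    guard let url = Bundle.main.url(forResource: "locations", withExtension: "json") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let data = try Data(contentsOf: url)
    return try JSONDecoder().decode([Feature].self, from: data)
}
```

"No exact matches in call to initializer" often means no Map overload exists for the exact arguments passed, or an element inside the Map content builder doesn't satisfy its requirements (e.g. the model type not being Identifiable).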
Posted
by deatour.
Last updated
.
Post not yet marked as solved
2 Replies
The error is telling you that there is no such form of Map that matches what you've entered. If you remove showsUserLocation: true it works, because that form of Map() is valid. If you want to show the user's location you need to use UserAnnotation(), i.e.:

Map(position: $mapPosition, interactionModes: .all) {
    UserAnnotation() // <<-- Use this
    ForEach(features) { feature in
        Marker(coordinate: feature.coordinate) {
            FeatureAnnotation(feature: feature)
        }
    }
}
Post not yet marked as solved
5 Replies
1.8k Views
Safari for visionOS (or spatial computing) supports WebXR, as reported here. I am developing a web app that intends to leverage WebXR, so I've tested several code samples in the Safari browser of the Vision Pro Simulator to understand the level of support for immersive web content. I am currently facing an issue that looks like a bug, where video playback stops working when entering an XR session (i.e., going into VR mode) in a 3D web environment (using Three.js or similar). There's an example from the Immersive Web Community Group called Stereo Video (https://immersive-web.github.io/webxr-samples/stereo-video.html) that lets you easily replicate the issue; the code is available here. It's worth mentioning that video playback has been successfully tested on other VR platforms such as the Meta Quest 2. The issue has been reported in the following forums: https://discourse.threejs.org/t/videotexture-playback-html5-videoelement-apple-vision-pro-simulator-in-vr-mode-not-playing/53374 https://bug
Posted Last updated
.
Post not yet marked as solved
29 Replies
Hello, has anyone found a solution for this? I'm currently experiencing the same issue: I submitted a build in TestFlight but am unable to install it. Not sure what to do; Apple support is still not responding.
Post not yet marked as solved
29 Replies
18k Views
When I archive and upload to App Store Connect to test in TestFlight, the app cannot be downloaded, and TestFlight shows "Could not install [App]. The requested app is not available or doesn't exist." When I look at previous builds, the same message shows for every build version. This started happening after I renewed my Apple Developer Program membership.
Posted
by ts.dstws.
Last updated
.
Post not yet marked as solved
2 Replies
679 Views
Hi everyone, I'm using the new mergeable-libraries feature in Xcode 15 beta 2 to create an xcframework, following the instructions at https://developer.apple.com/documentation/xcode/configuring-your-project-to-use-mergeable-libraries. I can create the mergeable framework for devices (ios-arm64) and it can be used in another project. But the mergeable framework for simulators (ios-arm64_x86_64-simulator) can't be used in another project; Xcode throws the error:

arm64-apple-ios-simulator.private.swiftinterface:6:8: No such module 'Framework3'

'Framework3' is a linked framework, even though it is already in the ReexportedBinaries of the mergeable framework. Structure of the mergeable framework:

- Mergeable.framework
--- ReexportedBinaries
------- Framework3.framework

The xcframework has the same issue. So, is this an issue with Xcode 15 beta for mergeable frameworks for the simulator and xcframeworks? Do we have to pass any specific argument when creating the xcframework
Posted Last updated
.
Post not yet marked as solved
1 Replies
I have a fully Swift & SwiftUI-based app with the following targets:

Main app = com.abc.appname
Watch app = com.abc.appname.watchkitapp

So I'm not sure why your apps are not connected, unless it's because you changed com.x.watchkitapp to com.x.watch. Are you sure you've only changed the one bundle ID? Have a look through all your targets' Build Settings and make sure everything is in order there. Xcode's settings, and how to get everything to link together, are sometimes akin to magic. I had massive issues with IntentsExtensions until Apple moved to AppIntents.
Post not yet marked as solved
1 Replies
84 Views
I'm very shocked: after waiting more than two days, still no one has contacted me to charge the fee for program enrollment. Apple support is just as bad; they say 1-2 business days, but two weeks have passed with no reply. I thought Google was bad, but their support replies in 5-6 hours and has live chat, while Apple has nothing: no phone, no live chat, only email, and no one looks at it. First I downloaded the Apple Developer app and tried to enroll from the app. I scanned my personal ID and got to the payment step. I added a debit card to my Apple ID and all the information was OK, but when I tried to pay it said "unknown error, contact iTunes." I deleted the card from my Apple ID and added it again, but the enrollment option in the app was missing, so I enrolled through the website, which said they would contact me within 2 business days. A week has passed.
Posted
by citybgsky.
Last updated
.
Post not yet marked as solved
1 Replies
From what I can see, Apple has long abandoned support for the developer enrollment program, and I have no answer here. I am very disappointed.