SwiftUI List performance

39,832 results found

Post not yet marked as solved
0 Replies
21 Views
I am trying to store usdz files with SwiftData for now. I am converting the usdz to Data, then storing it with SwiftData.

My model:

```swift
import Foundation
import SwiftData
import SwiftUI

@Model
class Item {
    var name: String
    @Attribute(.externalStorage) var usdz: Data? = nil
    var id: String

    init(name: String, usdz: Data? = nil) {
        self.id = UUID().uuidString
        self.name = name
        self.usdz = usdz
    }
}
```

My function to convert usdz to data. I am currently using a local usdz just to test if it is going to work.

```swift
func usdzData() -> Data? {
    do {
        guard let usdzURL = Bundle.main.url(forResource: "tv_retro", withExtension: "usdz") else {
            fatalError("Unable to find USDZ file in the bundle.")
        }
        let usdzData = try Data(contentsOf: usdzURL)
        return usdzData
    } catch {
        print("Error loading USDZ file: \(error)")
    }
    return nil
}
```

Loading the items:

```swift
@Query private var items: [Item]
...
var body: some View {
    ...
    ForEach(items) { item in
        HStack {
            Model3D(?????) { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
        }
    }
    ...
}
```

How can
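Since the question is cut off: Model3D loads from a URL rather than raw Data, so one common workaround (a sketch under that assumption, not confirmed by the post; the helper name `usdzURL(for:named:)` is my own) is to write the stored blob back out to a temporary .usdz file and hand that URL to Model3D:

```swift
import Foundation

// Hypothetical helper: writes the SwiftData-backed `Data` blob to a
// temporary .usdz file so it can be passed to Model3D(url:).
func usdzURL(for data: Data, named name: String) throws -> URL {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(name)
        .appendingPathExtension("usdz")
    try data.write(to: url, options: .atomic)
    return url
}

// Sketch of use inside the ForEach:
// if let data = item.usdz,
//    let url = try? usdzURL(for: data, named: item.id) {
//     Model3D(url: url) { model in
//         model.resizable().scaledToFit()
//     } placeholder: {
//         ProgressView()
//     }
// }
```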
Post not yet marked as solved
0 Replies
20 Views
I'm working on an app that does peer-to-peer communication between Apple devices. As far as I understand, the Network framework is a good choice for this. I have something that works, but I'm curious about the details of how this works and if I might somehow optimize this. My current understanding is that the best connection I can get between two devices is over AWDL. Is this true? If so, does Network use this? Can I ask it to use it preferentially? What kind of bandwidth and latency should I expect out of this, and are there any drawbacks to using it like power usage or transport limitations? If both devices are on the same LAN, I assume they can also talk to each other over Wi-Fi (or a wired connection if both are plugged in, I guess). If I use Bonjour service discovery, is this what I will be getting? What does Network do if the LAN network does not perform well? Will it swap the underlying connection if it figures out there is something better? I am not tied to any particular API or transport pro
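For reference, peer-to-peer discovery and advertising with the Network framework is typically wired up like the sketch below; setting `includePeerToPeer` is what allows Network to consider AWDL as a path alongside the LAN. The service type "_myapp._tcp" is a placeholder:

```swift
import Network

// Advertise a Bonjour service, allowing peer-to-peer (AWDL) links.
let params = NWParameters.tcp
params.includePeerToPeer = true

let listener = try! NWListener(using: params)
listener.service = NWListener.Service(name: "MyDevice", type: "_myapp._tcp")
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
}
listener.start(queue: .main)

// On the other device, browse for the same service.
let browser = NWBrowser(for: .bonjour(type: "_myapp._tcp", domain: nil), using: params)
browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        // Connect with NWConnection(to: result.endpoint, using: params).
        print(result.endpoint)
    }
}
browser.start(queue: .main)
```

With Bonjour endpoints like this, Network performs its own path selection among the available interfaces, so the answers to the "which transport am I getting" questions above generally depend on what the framework chooses at connect time.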
Post not yet marked as solved
0 Replies
28 Views
I have an app that has the camera continuously running, as it is doing its own AI, has zero need for Apple's video effects, and I am seeing a 200% performance hit after updating to Sonoma. The video effects are the heaviest stack trace when profiling my app with Instruments' CPU profiler (see below). Is forcing your software onto developers not something Microsoft would do? Is there really no way to opt out?

```
6671 Jamscape_exp (23038)
  2697 start_wqthread
    2697 _pthread_wqthread
      2183 _dispatch_workloop_worker_thread
        2156 _dispatch_root_queue_drain_deferred_wlh
          2153 _dispatch_lane_invoke
            2146 _dispatch_lane_serial_drain
              1527 _dispatch_client_callout
                1493 _dispatch_call_block_and_release
                  777 __88-[PTHandGestureDetector initWithFrameSize:asyncInitQueue:externalHandDetectionsEnabled:]_block_invoke
                    777 -[VCPHandGestureVideoRequest initWithOptions:]
                      508 -[VCPHandGestureClassifier initWithMinHandSize:]
                        508 -[VCPCoreMLRequest initWithModelName:]
                          506 +[MLModel modelWithContentsOfURL:configuration:error:]
                            506 -[MLMod
```
Post not yet marked as solved
0 Replies
25 Views
I started to use Xcode Cloud recently, trying to understand how the whole build process etc. works. I created some workflows, integrated CI scripts to let fastlane create snapshots in the end, and everything seemed to work while I was making progress step by step to get it up and running (struggling with the environment, etc.). Then last night it suddenly stopped working, in that Xcode shows the builds, but the last build still shows a spinner, although the job has finished already. When I try to cancel the last run from Xcode (I thought it was not finished) I get an error: "Failed to Cancel Build 57. An internal error occurred while authenticating. Try again later." (FB13802231) When I open Manage Workflows…, Xcode suddenly shows "This operation could not be completed" and details reveal: "The operation couldn't be completed. ((extension in XcodeCloudKit):XcodeCloudAPI.Client.HTTPClientError error 0.)" (FB13802952) Trying to access the Xcode Cloud tab of the app or general from users and permissions shows a whi
Post not yet marked as solved
1 Reply
It looks like your Spacer() should be inside the VStack, after the Text. You're currently telling SwiftUI to display a background image, then on top of that (because of the ZStack) display Text inside a VStack, but your Text is the only thing in the VStack so it just sits there in the centre. Then you're saying have a Spacer(), but this is inside the ZStack so it isn't pushing the Text up, it's just layered on top of the VStack. The way to fix it is to put the Spacer() in the VStack. You can see this clearly if you change your Spacer() to something like:

```swift
Rectangle()
    .fill(Color.green)
    .frame(width: 100, height: 50)
```

Let's see what it does in the different formats:

```swift
ZStack {
    VStack {
        Text("What would you like to do?")
            .font(.title)
            .fontWeight(.bold)
    }
    .padding()
    // -- Outside of the VStack, as you currently have it:
    Rectangle()
        .fill(Color.green)
        .frame(width: 100, height: 50)
    // Spacer()
}
```

So, here, your Spacer() (the Rectangle()) is where your current Spacer() is, and it produces this: Now, moving the
Post marked as solved
1 Reply
I figured it out. You have to use mapCameraKeyframeAnimator().

```swift
@State private var centerCoordinate = CLLocationCoordinate2D(latitude: 38.9072, longitude: -77.0369)
@State private var distance: CLLocationDistance = 1000000
@State private var triggerCamera = false

Map(initialPosition: .camera(MapCamera(centerCoordinate: centerCoordinate, distance: distance))) {
}
.frame(height: geo.size.height * 0.60)
.shadow(color: .black.opacity(0.5), radius: 1, y: 1)
.onReceive(locationManager.$location, perform: { location in
    if let location {
        centerCoordinate = location
        triggerCamera = true
    }
})
.mapCameraKeyframeAnimator(trigger: triggerCamera, keyframes: { camera in
    KeyframeTrack(\MapCamera.centerCoordinate, content: {
        LinearKeyframe(centerCoordinate, duration: 1)
    })
    KeyframeTrack(\MapCamera.distance, content: {
        LinearKeyframe(300, duration: 1)
    })
})
```

Updating the trigger to true will start the animation, which moves to the provided location.
Post not yet marked as solved
2 Replies
43 Views
I am trying to load and view several locations onto a map from a JSON file in my SwiftUI project, but I continually encounter the error "no exact matches in call to initializer" in my ContentView.swift file.

What I Am Trying to Do: I am working on a SwiftUI project where I need to display several locations on a map. These locations are stored in a JSON file, which I have successfully loaded into Swift. My goal is to display these locations as annotations on a Map view.

JSON File Contents:
coordinates: latitude and longitude
name: name of the location
uuid: unique identifier for each location

Code and Screenshots: Here are the relevant parts of my code and the error I keep encountering:

```swift
import SwiftUI
import MapKit

struct ContentView: View {
    @State private var mapPosition = MapCameraPosition.region(
        MKCoordinateRegion(
            center: CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194),
            span: MKCoordinateSpan(latitudeDelta: 0.05, longitudeDelta: 0.05)
        )
    )
    @State private var featur
```
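The post is cut off before the failing initializer, but as a hedged sketch of the shape MapKit's SwiftUI API expects (the `Feature` model name is a placeholder matching the described JSON): "no exact matches in call to initializer" often means the Map content closure returns something that is not MapContent, or the model lacks Identifiable for the ForEach.

```swift
import SwiftUI
import MapKit

// Placeholder model matching the described JSON (coordinates, name, uuid).
struct Feature: Identifiable, Decodable {
    struct Coordinates: Decodable {
        let latitude: Double
        let longitude: Double
    }
    let coordinates: Coordinates
    let name: String
    let uuid: UUID
    var id: UUID { uuid }
    var coordinate: CLLocationCoordinate2D {
        CLLocationCoordinate2D(latitude: coordinates.latitude,
                               longitude: coordinates.longitude)
    }
}

// Map's content closure must produce MapContent such as Marker or Annotation.
struct FeatureMap: View {
    let features: [Feature]
    var body: some View {
        Map {
            ForEach(features) { feature in
                Marker(feature.name, coordinate: feature.coordinate)
            }
        }
    }
}
```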
Post marked as solved
1 Reply
44 Views
```swift
Map(initialPosition: .camera(mapCamera)) {
    Marker("Here", coordinate: location)
}
.frame(height: 300)
.clipShape(RoundedRectangle(cornerSize: CGSize(width: 10, height: 10)))
.onMapCameraChange(frequency: .continuous) { cameraContext in
    locationManager.location = cameraContext.camera.centerCoordinate
}
.onReceive(locationManager.$location, perform: { location in
    if let location {
        mapCamera.centerCoordinate = location
    }
})

class LocationDataManager: NSObject, CLLocationManagerDelegate, ObservableObject {
    enum LoadingState {
        case loading
        case notLoading
        case finished
    }

    static let shared = LocationDataManager()
    private let locationManager = CLLocationManager()
    @Published var location: CLLocationCoordinate2D? = nil
    @Published var loading: LoadingState = .notLoading

    override init() {
        super.init()
        locationManager.delegate = self
    }

    func resetLocation() {
        loading = .notLoading
        location = nil
    }

    func getLocation() {
        locationManager.requestLocation()
        loading = .loading
    }

    func locationManager(_ manager: CLLocationM
```
Post not yet marked as solved
1 Reply
35 Views
No matter what I have tried, I can't get my text "What would you like to do?" to appear at the top of the screen. What have I missed? Is it something to do with the ZStack?

```swift
import SwiftUI
import SwiftData

struct ContentView: View {
    @Environment(\.modelContext) private var context
    @Query private var readings: [Readings]
    @State private var readingTimeStamp: Date = Date()

    var body: some View {
        ZStack {
            Image("IPhone baqckgound 3")
                .resizable()
                .aspectRatio(contentMode: .fill)
                .padding(.top, 40)
            VStack {
                Text("What would you like to do?")
                    .font(.title)
                    .fontWeight(.bold)
            }
            .padding()
            Spacer()
        }
    }
}
```
Post not yet marked as solved
0 Replies
41 Views
I'm having an issue where, when my asset catalog has more than 2 images (all have @1x, @2x and @3x variants in PNG format), my NSImage in my NSImageView cannot be clicked. Does anyone know why this happens? Thanks in advance!

```swift
import SwiftUI

struct ContentView: View {
    @State private var window: NSWindow?

    var body: some View {
        VStack {
            Button("Open Window") {
                // Create and show the NSWindow
                self.window = NSWindow(
                    contentRect: NSScreen.main?.frame ?? NSRect.zero,
                    styleMask: [.borderless],
                    backing: .buffered,
                    defer: false
                )
                // Set up window properties
                self.window?.isOpaque = false
                self.window?.hasShadow = false
                self.window?.backgroundColor = .clear
                self.window?.level = .screenSaver
                self.window?.collectionBehavior = [.canJoinAllSpaces]
                self.window?.makeKeyAndOrderFront(nil)
                // Create an NSImageView
                let petView = PetView()
                // Add the NSImageView to the window's content view
                if let contentView = self.window?.contentView {
                    contentView.addSubview(petView)
                    // Center the petView
                    petView.centerXAnchor.constraint(equalTo:
```
Post not yet marked as solved
1 Reply
I have a fully Swift & SwiftUI-based app with the following targets:

Main app = com.abc.appname
Watch app = com.abc.appname.watchkitapp

So I'm not sure why your apps are not connected, unless you changed it to com.x.watchkitapp -> com.x.watch. Are you sure you've only tried to change the one bundle id? Have a look through all your targets' Build Settings and make sure everything is in order there. Xcode's settings, and how to get everything to link together, is sometimes akin to magic. I had massive issues with IntentsExtensions until Apple moved to AppIntents.
Post not yet marked as solved
0 Replies
43 Views
Hello, I am trying to detect the orientation of text in images. (Each image has a label with a number, but sometimes the label is not in the right orientation, and I would like to detect these cases and add a prefix to the image files.) This code is working well, but when the text is upside down it considers that the text is well oriented. Is there a way to distinguish the difference? Thanks for your help!

```swift
import SwiftUI
import Vision

struct ContentView: View {
    @State private var totalImages = 0
    @State private var processedImages = 0
    @State private var rotatedImages = 0
    @State private var remainingImages = 0

    var body: some View {
        VStack {
            Button(action: chooseDirectory) {
                Text("Choisir le répertoire des images")
                    .padding()
            }
            Text("TOTAL: \(totalImages)")
            Text("TRAITEES: \(processedImages)")
            Text("ROTATION: \(rotatedImages)")
            Text("RESTANT: \(remainingImages)")
        }
        .padding()
    }

    func chooseDirectory() {
        let openPanel = NSOpenPanel()
        openPanel.canChooseDirectories = true
        openPanel.canChooseFiles = false
        openPanel.allowsM
```
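One heuristic that may help here (a sketch of my own, not tested against the original workflow): Vision reports a confidence per recognized text observation, and running the same image both as-is and rotated 180° usually gives a clearly higher total confidence for the correct orientation. Passing `CGImagePropertyOrientation.down` expresses the rotation without touching pixels:

```swift
import Vision
import CoreGraphics

// Sums recognition confidence for an image in a given orientation.
func textConfidence(for cgImage: CGImage,
                    orientation: CGImagePropertyOrientation) throws -> Float {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: orientation)
    try handler.perform([request])
    return (request.results ?? [])
        .compactMap { $0.topCandidates(1).first?.confidence }
        .reduce(0, +)
}

// Heuristic: if the 180°-rotated pass recognizes text with higher total
// confidence, the label is probably upside down.
func isUpsideDown(_ cgImage: CGImage) throws -> Bool {
    let up = try textConfidence(for: cgImage, orientation: .up)
    let down = try textConfidence(for: cgImage, orientation: .down)
    return down > up
}
```

This costs a second recognition pass per image, so it may be worth applying only when the first pass already succeeded (the ambiguous case described above).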
Post not yet marked as solved
0 Replies
59 Views
I have an executable on macOS that I'm launching as a User Agent. The same executable can be launched in multiple ways: the user can directly click the exe to launch it, or launch it from the terminal using ./, etc. One similar way is when the user launches the exe as a User Agent (i.e. a daemon in the user session). In this scenario, I want to identify in my exe if the user has launched it as an agent, to perform certain tasks. I wanted to know how I can accurately determine this. I have tried figuring out if there is some unique session that agents operate in, but I could not find anything. Can someone help here? Is this even possible?
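One common heuristic (my own suggestion, not an official API): processes spawned by launchd have launchd itself (PID 1) as their direct parent, while Finder and terminal launches do not. A minimal check:

```swift
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif

// Heuristic: a launchd user agent is spawned directly by launchd,
// whose PID is 1, so such a process sees a parent PID of 1.
func launchedByLaunchd() -> Bool {
    getppid() == 1
}
```

Note this also matches other launchd-spawned contexts, so a more robust signal is often an explicit command-line flag added to the ProgramArguments array of the LaunchAgent plist, which only the agent invocation will carry.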
Post not yet marked as solved
1 Reply
Actually, I want to add another one: (5) Runtime-modifiable ReferenceImageLibrary for image tracking. Also, I want to mention that the reason I listed these features is that all of them are existing features on other systems like iOS and iPadOS; furthermore, some of them are purely software features, so I am very confused as to why they are not on Vision Pro/visionOS.
Post not yet marked as solved
1 Reply
71 Views
Why does this Regex Builder code in my SwiftUI app not work? I'm parsing a string that might be a date and time with either AM or PM specified for the time. This bit of code looks for the optional AM or PM. The error I get is:

The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions

What would 'distinct sub-expressions' mean in this case? The code:

```swift
let ampmRef = Reference()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
            }
        }
    } transform: {
        $0.lowercase
    }
}.ignoresCase()
```

In a related question, is there a way to return a default if the ChoiceOf fails both AM and PM?
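As a sketch of what "distinct sub-expressions" could look like here (my own restructuring, not from the thread): factor the ChoiceOf out into its own named Regex so the type-checker handles it separately, and use Optionally rather than ZeroOrMore, since the AM/PM marker appears at most once. Applying the default after matching also answers the second question without a transform closure in the builder:

```swift
import RegexBuilder

// Sub-expression factored out so it is type-checked on its own.
let ampm = Regex {
    ChoiceOf {
        "am"
        "pm"
    }
}.ignoresCase()

// The optional-capture form, for embedding in a larger date/time regex.
// (Standalone, this also matches the empty string, so don't search with
// it directly.)
let ampmReg = Regex {
    Optionally {
        Capture(ampm)
    }
}

// Default when neither AM nor PM matched, applied after the match:
let marker = "3:15 PM".firstMatch(of: ampm)
    .map { $0.output.lowercased() } ?? "am"   // the "am" default is my assumption
```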