Discuss Swift.


VisionKit can lift a subject, but the bounding rectangle always returns x, y, width, and height as 0, 0, 0, 0
In our app, we need to use the VisionKit framework to lift the subject out of an image and crop it. Here is the piece of code:

if #available(iOS 17.0, *) {
    let analyzer = ImageAnalyzer()
    let analysis = try? await analyzer.analyze(image, configuration: self.visionKitConfiguration)
    let interaction = ImageAnalysisInteraction()
    interaction.analysis = analysis
    interaction.preferredInteractionTypes = [.automatic]
    guard let subject = await interaction.subjects.first else { return image }
    let s = await interaction.subjects
    print(s.first?.bounds)
    guard let cropped = try? await subject.image else { return image }
    return cropped
}

But s.first?.bounds always returns a CGRect with all zero values. Is there another way to get the position of the cropped subject? I need the position in the original image from which the subject was cropped. Can anyone help?
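One possible explanation, offered as an assumption rather than a confirmed fix: Subject.bounds is documented in view coordinates, and an ImageAnalysisInteraction that is never attached to a laid-out view has no view space to report from, so the rect may come back zero. A minimal sketch that hosts the interaction in a UIImageView sized to the image; the analysis types stand in for the post's visionKitConfiguration, and the whole approach is untested here:

import UIKit
import VisionKit

@available(iOS 17.0, *)
@MainActor
func subjectRect(in image: UIImage) async -> CGRect? {
    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.visualLookUp])   // stand-in for visionKitConfiguration
    guard let analysis = try? await analyzer.analyze(image, configuration: configuration) else { return nil }

    // Give the interaction a host view whose size matches the image so that
    // Subject.bounds (reported in view coordinates) maps 1:1 onto image points.
    let imageView = UIImageView(image: image)
    imageView.frame = CGRect(origin: .zero, size: image.size)
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)
    interaction.analysis = analysis
    interaction.preferredInteractionTypes = [.imageSubject]

    guard let subject = await interaction.subjects.first else { return nil }
    // With the 1:1 mapping above this rect is in image points; multiply by
    // image.scale if pixel coordinates are needed.
    return subject.bounds
}

The view may still need to be part of a window for the interaction to do its work; that part is an open question in this sketch.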
Replies: 1 · Boosts: 0 · Views: 899 · Dec ’23
Strange visionOS Simulator
I've found that my visionOS simulator behaves strangely: many functions and features are missing. For example, I learned online that the immersive Environments scenes can be opened in the visionOS simulator, but when I click them nothing happens. And it's not just that; many other system features that other developers appear to have are missing for me. I'm worried this will affect testing of my app. Why might this be? Some information: my Xcode is updated to the latest Xcode 15.1 beta. Device: iMac (2021). Simulator build number: 21N305.
Replies: 1 · Boosts: 0 · Views: 673 · Jan ’24
OpenAPI Swift Generator
Hello, I'm using the generator to create a client from this API: https://github.com/griptape-ai/griptape-cloud-control-plane/blob/main/models/Griptape.openapi.json. Everything seems fine until I call a method, at which point I get a DateTime format mismatch error in the response:

Client error - cause description: 'Unknown', underlying error: DecodingError: dataCorrupted - at : Expected date string to be ISO8601-formatted. (underlying error: <nil>)

Is there a decoder option or something I can attach to the generated client code to adjust for this?
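For what it's worth, swift-openapi-runtime exposes a dateTranscoder option on the client Configuration, which is the usual hook for date-format mismatches. A sketch, assuming the server's timestamps include fractional seconds; the Servers.server1() helper name is whatever the generator produced for this spec:

import Foundation
import OpenAPIRuntime
import OpenAPIURLSession

// Builds a client whose date decoding accepts ISO 8601 strings with fractional seconds.
func makeClient() throws -> Client {
    Client(
        serverURL: try Servers.server1(),   // generated helper; the name depends on the spec
        configuration: Configuration(dateTranscoder: .iso8601WithFractionalSeconds),
        transport: URLSessionTransport()
    )
}

If the API's format is something else entirely, Configuration should also accept a custom DateTranscoder implementation, though that is an assumption about the runtime's protocol rather than something verified against this spec.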
Replies: 2 · Boosts: 0 · Views: 874 · Jan ’24
Swift and C++ In The Same Project
Hi -- I am attempting to use C++ and Swift in a single project, but I'm struggling to find the proper way to do this. Ideally I want both my C++ and Swift code in the same project, without using a framework to keep them separate. As an example, how would I create an object of the following C++ class?

class Foo {
public:
    Foo() {
        // do stuff
    }
};

From what I've read, it should be as simple as this:

let foo = Foo()

But Xcode says "Cannot find 'Foo' in scope". How do I import Foo into my Swift code? Is there a project setting that needs to be changed to automatically make my C++ classes available? Or do I need to create a Clang module (as described on this page: https://www.swift.org/documentation/cxx-interop/#importing-c-into-swift) to expose the C++ code to Swift? If so, where in my Xcode project should that go? I am using Xcode 15.2 on macOS 14.2.1. I have also set the "C++ and Objective-C Interoperability" setting for my project to "C++/Objective-C++".
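A hedged sketch, not a confirmed answer: for a single app target, one commonly described route is to expose the C++ header through the target's Objective-C bridging header once interoperability is enabled, rather than writing a module map by hand. Assuming Foo lives in Foo.hpp and the target has a bridging header named MyApp-Bridging-Header.h (both names are placeholders):

// In MyApp-Bridging-Header.h (assumed name), include the C++ header:
//
//     #include "Foo.hpp"
//
// With "C++ and Objective-C Interoperability" set to "C++ / Objective-C++",
// the type declared there should then be visible to Swift directly:

let foo = Foo()   // invokes the C++ default constructor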
Replies: 3 · Boosts: 1 · Views: 3.7k · Jan ’24
Using the same app group ID with another team?
Hi, just a quick one. I am working with a client who doesn't share his team's credentials (certificates, mobile provisioning profiles, etc.). He even refused to add me as a developer in his Apple Developer account. So I created a new scheme for myself that uses my own personal team and app ID to build the app, while the main app's original scheme is basically unusable for me since I don't have the credentials to build it; the client still needs it for his CI/CD, though. Now, the app has a Notification Service extension that shares UserDefaults via an App Group. When I try to create a container with the same group ID as his, it always fails. It seems we can't use it because it has already been taken by the client. How do I fix this so I can just switch schemes between the client's setup and mine? Thanks.
Replies: 1 · Boosts: 0 · Views: 1.5k · Jan ’24
How to get corresponding pixel from Object Detection
Hey guys! I'm building an app which detects cars via Vision and then retrieves the distance to a detected car from a synchronized depthDataMap. However, I'm having trouble finding the correct corresponding pixel in that depthDataMap. While the CGRect of the object observation ranges from 0-300 (x) and 0-600 (y), the depthDataMap is only 320 x 180 pixels, so I can't find the right corresponding pixel. Any idea how to solve this? Kind regards
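A sketch of one common approach, under the assumption that the observation's boundingBox is still available in Vision's normalized coordinate space (origin bottom-left, values 0...1): scaling the normalized rect by the depth map's own dimensions yields a matching pixel regardless of the video buffer's resolution. Rotation/orientation handling between the portrait detection and the landscape depth map is left out here.

import Vision
import AVFoundation

// Maps a detection to the matching pixel in the depth map by scaling the
// normalized bounding box's center by the depth map's own dimensions.
func depthMapPixel(for observation: VNRecognizedObjectObservation,
                   in depthData: AVDepthData) -> (x: Int, y: Int) {
    let map = depthData.depthDataMap
    let width = CVPixelBufferGetWidth(map)      // e.g. 320
    let height = CVPixelBufferGetHeight(map)    // e.g. 180
    // Center of the detection in Vision's normalized space (origin bottom-left).
    let center = CGPoint(x: observation.boundingBox.midX,
                         y: observation.boundingBox.midY)
    // Pixel-buffer rows start at the top, so flip the y axis.
    let px = Int(center.x * CGFloat(width))
    let py = Int((1.0 - center.y) * CGFloat(height))
    return (x: min(max(px, 0), width - 1),
            y: min(max(py, 0), height - 1))
}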
Replies: 1 · Boosts: 0 · Views: 630 · Jan ’24
Error -16831/START-TIME is too close to live
Hello everyone, I was playing a livestream when I received the error -16831/START-TIME is too close to live from the AVPlayerItemNewErrorLogEntry notification. I don't know why this error is returned; please help me understand it. Similarly, I am also getting the error -12888/Playlist File unchanged for longer than 1.5 * target duration. I read about error -12888 on page 170 of this documentation: https://docs.huihoo.com/apple/wwdc/2018/502_measuring_and_optimizing_hls_performance.pdf but still don't understand the cause. Hope you can help!
Replies: 1 · Boosts: 1 · Views: 744 · Jan ’24
Playing Timed Sound Effects in Background
Hi, I'm relatively new to iOS development and would appreciate feedback on a strategy to achieve the desired behavior in my app.

My question: what would be the best strategy for sound-effect playback with precise timing while the app is in the background? Is this even possible?

Context: I created a basic countdown timer app (targeting iOS 17 with Swift/SwiftUI). Countdown sessions can last 30-60 minutes. When the timer is started, it progresses through a series of sub-intervals and plays a short sound for each one. I used AVAudioPlayer and everything works fine while the app is in the foreground. I'm considering switching to AVAudioEngine because precise timing is very important, and I'm told it offers better precision. I'm already setting "App plays audio or streams audio/video using AirPlay" in my Info.plist, and have configured:

AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: .mixWithOthers)

Curiously, when testing on my iPhone 13 mini, sounds sometimes still play when the app is in the background, but not always.

What I've considered (see the sketch below for one more direction):
Background Tasks: would they make any sense for this use case? Seemingly not, since the allowed time is short and limited by the system.
Pre-scheduling all sounds: not sure this would even work, and it seems like a lot of memory would be needed (there could be hundreds of intervals).
ActivityKit alerts: works, but with a ~50 ms delay, which is too long for my purposes.
Pre-rendering all SFX into one large audio file: seems like a lot of work and processing time, and probably not worth it.

I hope there's a better solution; I'd really appreciate any feedback.
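A sketch of one direction, stated as an assumption rather than a recommendation: with the Audio background mode enabled, an AVAudioEngine that keeps running (and therefore keeps the audio session alive) can schedule each interval's buffer at a sample-accurate time on an AVAudioPlayerNode. The class is illustrative only; buffer loading, error handling, and keeping the engine fed between intervals (e.g. with silence so the system doesn't suspend it) are omitted.

import AVFoundation

final class IntervalSoundScheduler {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setActive(true)

        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        try engine.start()
        player.play()
    }

    // Schedules `buffer` to begin `seconds` from now, sample-accurately.
    func schedule(_ buffer: AVAudioPCMBuffer, after seconds: TimeInterval) {
        guard let nodeTime = player.lastRenderTime,
              let playerTime = player.playerTime(forNodeTime: nodeTime) else { return }
        let sampleRate = playerTime.sampleRate
        let when = AVAudioTime(
            sampleTime: playerTime.sampleTime + AVAudioFramePosition(seconds * sampleRate),
            atRate: sampleRate
        )
        player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
    }
}

Whether this survives an hour in the background without the app being suspended is exactly the open question in the post, so treat it as a starting point, not a guarantee.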
Replies: 1 · Boosts: 0 · Views: 1k · Feb ’24
visionOS WKWebView FullScreen bug
I'm having a problem implementing a WKWebView in a window on visionOS: when the page enters full screen and then exits full screen, the size of the web view changes, becoming smaller or larger, while the window stays the same size as before. I enabled full-screen mode with webView.configuration.preferences.isElementFullscreenEnabled = true. I made a simple test case:

import SwiftUI

@main
struct TestApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

import SwiftUI
import WebKit

struct WebView: UIViewRepresentable {
    let url: URL

    func makeUIView(context: Context) -> WKWebView {
        let wkwebView = WKWebView()
        wkwebView.configuration.preferences.isElementFullscreenEnabled = true
        let request = URLRequest(url: url)
        wkwebView.load(request)
        return wkwebView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
    }
}

struct ContentView: View {
    var body: some View {
        WebView(url: URL(string: "https://glitch.com/~fullscreen-test")!)
    }
}
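Purely as a speculative workaround sketch (untested on visionOS): WKWebView exposes a fullscreenState property that can be key-value observed, so one option is to nudge layout when the web view reports it has left full screen, in case the stray size is a leftover frame from the transition.

import WebKit

// Returns a KVO token that must be retained (e.g. in the representable's Coordinator).
func observeFullscreenExit(of webView: WKWebView) -> NSKeyValueObservation {
    webView.observe(\.fullscreenState, options: [.new]) { view, _ in
        if view.fullscreenState == .notInFullscreen {
            // Ask the hosting hierarchy to lay out again once fullscreen is dismissed,
            // in case the changed size is a stale frame left over from the transition.
            view.setNeedsLayout()
            view.superview?.setNeedsLayout()
        }
    }
}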
Replies: 3 · Boosts: 0 · Views: 1.3k · Feb ’24
ARKit face data in picture-in-picture (PIP) mode
Hi, I have an iPhone app that uses ARKit to track a user's face landmarks in order to measure the user's biomarkers over time. Tracking takes several minutes, so it would be convenient if the user could use their phone for other things while the ARKit session is running, such as scrolling Reddit. Is it possible to run an ARKit session in picture-in-picture (PiP) mode while the user has other apps open? If yes, how? If not, is there any other way to run an AR session and track face landmarks in an app that is not full-screen in the foreground?
Replies: 1 · Boosts: 0 · Views: 449 · Feb ’24
Overlaying a UIControl over a SKSpriteNode
I'm currently using Metal to create a game board with floating balloons; each balloon is an SKSpriteNode with a balloon image attached. The user touches and drags a balloon onto a second balloon, merging the two. Exactly how they get merged is based on input from the user: a UISegmentedControl pops up where the user selects one of four responses, and the merge occurs. Currently the UISegmentedControl pops up in the middle of the game board; I would like it to overlay the first balloon instead. Once the two balloons touch each other, I have tried this:

bubble1.physicsBody!.velocity = CGVector(dx: 0, dy: 0) // stop the balloon
requestView.frame.origin.x = bubble1.position.x
requestView.frame.origin.y = bubble1.position.y

where requestView is a UIView (with the same dimensions as the balloon) containing various subviews (including the UISegmentedControl), and bubble1 is the SKSpriteNode (balloon). However, when I add requestView as a subview of the game board, it does not overlay the SKSpriteNode (bubble1). In fact, each time I try it, it doesn't even appear in the same place relative to bubble1's location. Any thoughts on what I might be doing wrong? Thanks!
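A likely cause, offered as an assumption: SpriteKit scene coordinates have their origin at the bottom-left and can be scaled by the scene's size and scaleMode, while UIKit frames are top-left based, so a node position cannot be copied straight into a view frame. A minimal sketch that converts through the SKView (skView and the exact view hierarchy are placeholders):

import SpriteKit
import UIKit

func position(_ requestView: UIView, over balloon: SKSpriteNode, in skView: SKView) {
    guard let scene = balloon.scene else { return }
    // If the balloon is nested inside another node, convert its position up to scene space first.
    let pointInScene = balloon.parent.map { scene.convert(balloon.position, from: $0) } ?? balloon.position
    // SKView.convert(_:from:) maps scene coordinates (bottom-left origin) to view coordinates (top-left origin).
    let pointInView = skView.convert(pointInScene, from: scene)
    requestView.center = pointInView   // assumes requestView is a subview of skView
}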
Replies: 1 · Boosts: 0 · Views: 825 · Feb ’24
My app was rejected because it offers repair services for mobile devices not produced by Apple
Hello, my app was rejected because it "offers repair services for mobile devices not produced by Apple." Could anybody explain what the problem is here and how I can avoid rejection because of it? My app sells mobile spare parts for all kinds of phones; it doesn't provide any repair services. It just sells spare parts and maintenance tools such as tweezers. Is the problem with the maintenance tools? This is Apple's rejection message:

Guideline 2.3.10 - Performance - Accurate Metadata
We noticed that your app includes functionality that isn't focused on the iPhone, iPad, Mac, Apple TV, or Apple Watch experience. Specifically, your app offers repair services for mobile devices not produced by Apple, which is not appropriate for the App Store.
Replies: 2 · Boosts: 0 · Views: 473 · Feb ’24
Xcode "Build documentation" not working with Swift Macro package in project
I have a workspace with my project and a Swift macro. When I use the "Build Documentation" command, the build fails with this error:

fatal error: module map file '/Users/me/Library/Developer/Xcode/DerivedData/Project-fmdkuqlofexbqdhhitpgjnoqzyrz/Build/Intermediates.noindex/GeneratedModuleMaps-iphoneos/Macros.modulemap' not found

Is there a way around this?
Replies: 2 · Boosts: 2 · Views: 969 · Feb ’24
SPM show-dependencies broken
I have a Package.swift:

// swift-tools-version: 5.9
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "SharedUI",
    defaultLocalization: "en_US",
    platforms: [.iOS(.v16)],
    products: [
        .library(
            name: "SharedUI",
            targets: [
                "AppTheme",
            ]
        ),
    ],
    dependencies: [
        .package(url: "https://github.com/apple/swift-markdown.git", "0.2.0"..<"0.3.0"),
    ],
    targets: [
        .target(
            name: "AppTheme",
            dependencies: [
                .product(name: "Markdown", package: "swift-markdown"),
            ],
            path: "AppTheme"
        ),
    ]
)

Running swift package show-dependencies shows an error:

yuantong-macbookpro2:Downloads yuantong$ swift package show-dependencies
Fetching https://github.com/apple/swift-markdown.git from cache
Fetched https://github.com/apple/swift-markdown.git (0.67s)
error: Couldn’t get the list of tags: fatal: cannot use bare repository '/Users/yuantong/Downloads/.build/repositories/swift-markdown-b692ce3c' (safe.bareRepository is 'explicit')

which I think used to work before Xcode 15.
Replies: 8 · Boosts: 4 · Views: 1.9k · Feb ’24
local doc save and iCloud data in same app
I have an app that uses the container for small settings data and iCloud for larger storage, but I also want to be able to save to the local Documents folder. When I do that, the URL that is created is not what I expected for local docs, but:

path: file:///var/mobile/Containers/Data/Application/85A8B8C9-C0C3-4843-A74C-5A951F593790/Documents/Dialog08:45,%2022%20Feb%202024

Here is the code I'm using to save to the local Documents folder:

func saveDataToFile(data: String) {
    // file name will be "Dialog" plus the date and time
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm, d MMM y"
    var dateTime = Date.getCurrentDate()
    dateTime = formatter.string(from: Date.now)
    var saveFilename = "Dialog"
    saveFilename.append(dateTime)
    let path = getDocumentsDirectory().appendingPathComponent(saveFilename)
    print("path: \(path)")
    // let fileURL = URL(fileURLWithPath: data, relativeTo: path)
    do {
        try data.write(to: path, atomically: true, encoding: String.Encoding.utf8)
    } catch {
        print(error.localizedDescription)
    }
}

func getDocumentsDirectory() -> URL {
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    return paths[0]
}
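Two observations, offered as assumptions rather than a confirmed diagnosis: on iOS the app's Documents directory always lives inside the sandbox container, so a /var/mobile/Containers/... path is what a local Documents write looks like; and the colons and commas in the date-based name are what produce the percent-encoded URL. A small variant with a filesystem-friendly name:

import Foundation

// Writes the text into the app's Documents directory with a name that avoids
// characters (":" and ",") that tend to be awkward in file names and URLs.
func saveDialog(_ text: String) throws -> URL {
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy-MM-dd_HH-mm-ss"
    let fileName = "Dialog_" + formatter.string(from: Date())
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let url = documents.appendingPathComponent(fileName).appendingPathExtension("txt")
    try text.write(to: url, atomically: true, encoding: .utf8)
    return url
}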
Replies: 1 · Boosts: 0 · Views: 669 · Feb ’24
AVAudioRecorder currentTime different than AVPlayer.duration
I have a small audio app that records audio which the user can then play back. I'm having an issue trying to display elapsed and total times. When I record the file, I use audioRecorder.currentTime to get the recording length, but when I load the file in AVPlayer it shows a different length, usually around 200 ms off, but not always. When comparing the raw files, AVPlayer does seem to report the correct time (length). While recording I use a timer that fires every 100 ms to update the recording time from audioRecorder.currentTime, since AVAudioRecorder doesn't have a periodic time observer like AVPlayer does. I've checked a number of things online, and all of the example code I've found seems to have the same issue with the recorded time and player duration not matching (the examples show the recorded time but never the playback length; I manually loaded the files to check, and the playback length was always longer than the displayed recorded length). There has to be a way to do this properly. Hopefully someone has ideas or can tell me how they worked around this. I am recording with the following settings:

let settings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 16000,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
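A sketch of one way to keep the displayed numbers consistent, under the assumption that the gap comes from AAC encoder priming and remainder frames (so the live recording counter and the decoded file length will never agree exactly): once recording stops, ask the asset for a precisely computed duration and use that single value for the total time and any progress math.

import AVFoundation

// Loads the asset's duration with precise timing enabled (slower, but exact for encoded files).
func preciseDuration(of fileURL: URL) async throws -> TimeInterval {
    let asset = AVURLAsset(
        url: fileURL,
        options: [AVURLAssetPreferPreciseDurationAndTimingKey: true]
    )
    let duration = try await asset.load(.duration)
    return CMTimeGetSeconds(duration)
}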
Replies: 0 · Boosts: 0 · Views: 553 · Feb ’24