Swift is a powerful and intuitive programming language for Apple platforms and beyond.

Posts under Swift tag

200 Posts


Correct way to label TextField inside Form in SwiftUI
Hello everyone. I'm building a simple Form in a multiplatform app with SwiftUI. Originally I had something like this:

import SwiftUI

struct OnboardingForm: View {
    @State var firstName: String = ""
    @State var lastName: String = ""
    @State var email: String = ""
    @State var job: String = ""
    @State var role: String = ""

    var body: some View {
        Form {
            TextField("First Name", text: $firstName, prompt: Text("Required"))
            TextField("Last Name", text: $lastName, prompt: Text("Required"))
            TextField("Email", text: $email, prompt: Text("Required"))
            TextField("Job", text: $job, prompt: Text("Required"))
            TextField("Role", text: $role, prompt: Text("Required"))
        }
    }
}

#Preview {
    OnboardingForm()
}

On macOS it looks OK, but on iOS it's impossible to know what each field is for, since all the prompts are the same. I tried adding LabeledContent around each text field, and that solves it for iOS, but on macOS the labels are then shown twice and the columns are out of alignment. I think I could get around it by doing something like this:

#if os(iOS)
LabeledContent {
    TextField("First Name", text: $firstName, prompt: Text("Required"))
} label: {
    Text("First Name")
}
#else
TextField("First Name", text: $firstName, prompt: Text("Required"))
#endif

but it seems to me like reinventing the wheel. Is there a "correct" way to declare TextFields with labels that works for both iOS and macOS?
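One pattern that might avoid the #if — a minimal sketch only, assuming (unverified on both platforms) that labelsHidden() suppresses the duplicate leading label macOS Forms draw for a TextField; RequiredField is an illustrative helper name, not an API:

import SwiftUI

struct RequiredField: View {
    let title: String
    @Binding var text: String

    var body: some View {
        // LabeledContent is the single source of the visible label on both platforms.
        LabeledContent(title) {
            TextField(title, text: $text, prompt: Text("Required"))
                .labelsHidden() // the title stays available to accessibility
        }
    }
}

// Usage inside the Form:
// RequiredField(title: "First Name", text: $firstName)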
Replies: 2 · Boosts: 0 · Views: 896 · Activity: Feb ’25
TextField using numberPad incorrectly shows autofill for one-time code
If I show a text field in my app and set nothing else on it but the following, the keyboard shows an autofill suggestion from a password manager for a one-time code:

textField.keyboardType = .numberPad

In this case, the text field is for typing in a count, so iOS suggesting to autofill a one-time code is incorrect. Setting textField.textContentType to nil has no effect on the behaviour.

Prerequisites to reproduce:
- an app with an associated domain
- an entry in a password manager with a one-time code for the domain
- a text field with keyboardType set to numberPad
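For reference, a minimal repro sketch of the configuration described above (the associated domain and the saved one-time code are external preconditions, not shown here):

import UIKit

func makeCountField() -> UITextField {
    let countField = UITextField()
    countField.keyboardType = .numberPad // this alone triggers the one-time-code suggestion
    countField.textContentType = nil     // per the report above, this has no effect
    return countField
}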
Topic: UI Frameworks · SubTopic: UIKit
Replies: 2 · Boosts: 0 · Views: 643 · Activity: Feb ’25
SwiftData coexistence with CoreData — "Persistent History has to be truncated" error
When integrating SwiftData into an already existing app that uses Core Data for data management, I encounter errors. When building the ModelContainer for the first time, the following error appears:

Error: Persistent History (184) has to be truncated due to the following entities being removed: (all entities except for the 2 where I defined a SwiftData model)

class SwiftDataManager: ObservableObject {
    static let shared = SwiftDataManager()
    private let persistenceManager = PersistenceManager.shared
    private init() {}

    lazy var modelContainer: ModelContainer = {
        do {
            let storeUrl = persistenceManager.storeURL()
            let schema = Schema([
                HistoryIncident.self,
                HistoryEvent.self
            ])
            let modelConfig = ModelConfiguration(url: storeUrl)
            return try ModelContainer(for: schema, configurations: [modelConfig])
        } catch {
            fatalError("Could not create ModelContainer: \(error)")
        }
    }()
}

@Model public class HistoryIncident {
    var missionNr: String?
    @Relationship(deleteRule: .cascade) var events: [HistoryEvent]?
    public init() {}
}

@Model class HistoryEvent {
    var decs: String?
    var timestamp: Date?
    init() {}
}

As soon as I call the following function:

func addMockEventsToCurrentHistorie() {
    var descriptor = FetchDescriptor<HistoryIncident>()
    let key = self.hKey ?? ""
    descriptor.predicate = #Predicate { mE in
        key == mE.key
    }
    let historyIncident = try? SwiftDataManager.shared.modelContext.fetch(descriptor).first
    guard var events = historyIncident?.events else { return }
    events.append(contentsOf: createEvents())
}

I get the error:

CoreData: error: (1) I/O error for database at /var/mobile/Containers/Data/Application/55E9D59D-48C4-4D86-8D9F-8F9CA019042D/Library/Private Documents/appDatabase.sqlite. SQLite error code:1, 'no such column: t0.Z1EVENTS'
with userInfo of {
    NSFilePath = "/var/mobile/Containers/Data/Application/55E9D59D-48C4-4D86-8D9F-8F9CA019042D/Library/Private Documents/appDatabase.sqlite";
    NSSQLiteErrorDomain = 1;
}
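Not a confirmed fix for this exact error, but worth checking: Apple's documented requirement for CoreData/SwiftData coexistence is that the Core Data stack enable persistent history tracking on the shared store, since SwiftData always turns it on for its side. A sketch of that option (the container name is a placeholder):

import CoreData

let container = NSPersistentContainer(name: "AppModel")
if let description = container.persistentStoreDescriptions.first {
    // Required for CoreData/SwiftData coexistence on the same store file.
    description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
}
container.loadPersistentStores { _, error in
    if let error { fatalError("Store loading failed: \(error)") }
}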
Replies: 1 · Boosts: 0 · Views: 615 · Activity: Feb ’25
Labels appear dimmed in UIView
In this setup, labels do not show properly because of their textColor. Labels are defined in IB, in the following hierarchy:

ViewController
    View
        Label 1
        scrollView
            View
                Button
                Label 2

Buttons show properly, but the labels, even though defined with the default label color, appear as if their alpha were 0.2. It is even worse in dark mode. I have checked the settings for the labels and did not find anything abnormal. I have tried changing the label color to systemGray2, to no avail; if I change it to red, it does not show in red in IB either. The problem appears for both Label 1 (at the top level of the view) and Label 2.
Replies: 1 · Boosts: 0 · Views: 352 · Activity: Feb ’25
Winner's visa issue
It was mentioned in the Swift Student Challenge materials that outstanding winners will have the opportunity to visit Apple Park in the United States. As a challenger from China who is not currently in the U.S., this means that if I receive the outstanding award, I will need to apply for a visa to travel to Apple Park. Since I am under 18, my guardian would also need to apply for a visa. Therefore, I would like to know whether Apple provides visa assistance for outstanding winners and their guardians from China, or whether we are responsible for applying for the visas on our own.
Replies: 1 · Boosts: 1 · Views: 468 · Activity: Feb ’25
SceneView selective draw since concurrency
I have used SceneKit for several years but recently have a problem where a scene with fewer than 50 nodes is partially drawn, i.e., some nodes are drawn and some aren't, while scenes with more than 50 nodes are always drawn correctly. This seems to have happened since concurrency was introduced. (With respect to concurrency, I had been using DispatchQueue successfully before then.) Since all nodes (few or many) are constructed and implemented by the same functions etc., I'm baffled. When I print the node hierarchy, all nodes are present whether few or many. SceneView() has the [.rendersContinuously] option selected. Every node created (few or many) has .opacity = 1.0 and .isHidden = false. I haven't tried setting back the compiler version, as that is not a long-term solution, and I know the same code worked fine then.
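Since the symptom appeared alongside the move to Swift concurrency, one thing worth ruling out (an assumption, not a diagnosis): scene-graph mutations running off the main actor, which can race the renderer. A sketch of confining the attach step; buildNodes() here is a stand-in for whatever constructs your nodes:

import SceneKit

// Placeholder for your existing node-construction code.
func buildNodes() async -> [SCNNode] {
    (0..<50).map { _ in SCNNode() }
}

func addNodes(to scene: SCNScene) async {
    let nodes = await buildNodes()      // heavy work can stay off the main actor
    await MainActor.run {
        // Attach to the scene graph on the main actor so the renderer
        // never sees a half-built hierarchy.
        for node in nodes { scene.rootNode.addChildNode(node) }
    }
}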
Replies: 8 · Boosts: 0 · Views: 694 · Activity: Feb ’25
Run Time Issues with Swift/Core ML
Hello! I have a Swift program that tracks the location of a ball (through the back camera). It seems to be working fine, but the only issue is the run time, particularly my concatenate, normalize, and argmax functions, which are meant to be a 1-to-1 copy of the PyTorch argmax function and the following Python lines:

imgs = np.concatenate((img, img_prev, img_preprev), axis=2)
imgs = imgs.astype(np.float32)/255.0
imgs = np.rollaxis(imgs, 2, 0)
inp = np.expand_dims(imgs, axis=0) # used to pass into model

However, I need my program to run in real time and, in an ideal world, I want it to run way under real time. Below is a rundown of the run times that result from my code:

Starting model inference
Setup took: 0.0 seconds
Resize took: 0.03741896152496338 seconds
Concatenation took: 0.3359949588775635 seconds
Normalization took: 0.9906361103057861 seconds
Model prediction took: 0.3425499200820923 seconds
Argmax took: 28.17007803916931 seconds
Postprocess took: 0.054128050804138184 seconds
Model inference took 29.934185028076172 seconds

Here are the concatenateBuffers, normalizeBuffer, and argmax functions that I use:

func concatenateBuffers(_ buffers: [CVPixelBuffer?]) -> CVPixelBuffer? {
    guard buffers.count == 3, let first = buffers[0] else { return nil }
    let width = CVPixelBufferGetWidth(first)
    let height = CVPixelBufferGetHeight(first)
    let targetChannels = 9
    var concatenated: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue] as CFDictionary
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA, attrs, &concatenated)
    guard let output = concatenated else { return nil }
    CVPixelBufferLockBaseAddress(output, [])
    defer { CVPixelBufferUnlockBaseAddress(output, []) }
    guard let outputData = CVPixelBufferGetBaseAddress(output) else { return nil }
    let outputPtr = UnsafeMutablePointer<UInt8>(OpaquePointer(outputData))
    // Lock all input buffers at once
    buffers.forEach { buffer in
        guard let buffer = buffer else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
    }
    defer { buffers.forEach { CVPixelBufferUnlockBaseAddress($0!, .readOnly) } }
    // Process each input buffer
    for (frameIdx, buffer) in buffers.enumerated() {
        guard let buffer = buffer, let inputData = CVPixelBufferGetBaseAddress(buffer) else { continue }
        let inputPtr = UnsafePointer<UInt8>(OpaquePointer(inputData))
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
        let totalPixels = width * height
        // Process all pixels in one go for this frame
        for i in 0..<totalPixels {
            let y = i / width
            let x = i % width
            let inputOffset = y * bytesPerRow + x * 4
            let outputOffset = i * targetChannels + frameIdx * 3
            // BGR order to match numpy
            outputPtr[outputOffset] = inputPtr[inputOffset + 2]     // B
            outputPtr[outputOffset + 1] = inputPtr[inputOffset + 1] // G
            outputPtr[outputOffset + 2] = inputPtr[inputOffset]     // R
        }
    }
    return output
}

func normalizeBuffer(_ buffer: CVPixelBuffer?) -> MLMultiArray? {
    guard let input = buffer else { return nil }
    let width = CVPixelBufferGetWidth(input)
    let height = CVPixelBufferGetHeight(input)
    let channels = 9
    CVPixelBufferLockBaseAddress(input, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(input, .readOnly) }
    guard let inputData = CVPixelBufferGetBaseAddress(input) else { return nil }
    let shape = [1, NSNumber(value: channels), NSNumber(value: height), NSNumber(value: width)]
    guard let output = try? MLMultiArray(shape: shape, dataType: .float32) else { return nil }
    let inputPtr = inputData.assumingMemoryBound(to: UInt8.self)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(input)
    let ptr = UnsafeMutablePointer<Float>(OpaquePointer(output.dataPointer))
    let totalSize = width * height
    for c in 0..<channels {
        for idx in 0..<totalSize {
            let h = idx / width
            let w = idx % width
            let inputIdx = h * bytesPerRow + w * channels + c
            ptr[c * totalSize + idx] = Float(inputPtr[inputIdx]) / 255.0
        }
    }
    return output
}

func argmax(_ array: MLMultiArray) -> MLMultiArray? {
    let shape = array.shape.map { $0.intValue }
    guard shape.count == 3, shape[0] == 1, shape[1] == 256, shape[2] == 230400 else { return nil }
    guard let output = try? MLMultiArray(shape: [1, NSNumber(value: 230400)], dataType: .int32) else { return nil }
    let ptr = UnsafePointer<Float>(OpaquePointer(array.dataPointer))
    let outputPtr = UnsafeMutablePointer<Int32>(OpaquePointer(output.dataPointer))
    let channelSize = 230400
    for pos in 0..<230400 {
        var maxValue = -Float.infinity
        var maxIndex: Int32 = 0
        for channel in 0..<256 {
            let value = ptr[channel * channelSize + pos]
            if value > maxValue {
                maxValue = value
                maxIndex = Int32(channel)
            }
        }
        outputPtr[pos] = maxIndex
    }
    return output
}

Are there any glaring areas of inefficiency that can be reduced to allow for under-real-time processing whilst following the same logic as the Python code exactly? Would using Obj-C speed things up for some reason? Are there any tools I can use so I don't have to write these functions myself? Additionally, in the class's init function, I tried to check the compute units being used, since I feel 0.34 seconds for a single model prediction is also far too long, but no print statements are showing for some reason:

init() {
    guard let loadedModel = try? BallTrackerModel() else {
        fatalError("Could not load model")
    }
    let config = MLModelConfiguration()
    config.computeUnits = .all
    guard let configuredModel = try? BallTrackerModel(configuration: config) else {
        fatalError("Could not configure model")
    }
    self.model = configuredModel
    print("model loaded with compute units \(config.computeUnits.rawValue)")
}

Thanks!
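On "are there any tools": the Accelerate framework is the usual answer for loops like this argmax. A hedged sketch, using the same 256 × 230400 channel-major layout as the code above — unbenchmarked, and the divide-by-stride index convention of vDSP_maxvi is worth double-checking against the docs:

import Accelerate
import CoreML

func argmaxAccelerate(_ array: MLMultiArray) -> [Int32] {
    let channels = 256
    let channelSize = 230400
    let ptr = array.dataPointer.assumingMemoryBound(to: Float.self)
    var result = [Int32](repeating: 0, count: channelSize)
    for pos in 0..<channelSize {
        var maxValue: Float = 0
        var maxIndex: vDSP_Length = 0
        // Vectorized max-with-index over the 256 channel values at this
        // position; they sit at stride channelSize in channel-major storage.
        vDSP_maxvi(ptr + pos, vDSP_Stride(channelSize), &maxValue, &maxIndex, vDSP_Length(channels))
        // vDSP reports the index in raw element units, i.e. a multiple of the
        // stride, so divide to recover the channel number (verify this).
        result[pos] = Int32(Int(maxIndex) / channelSize)
    }
    return result
}

A separate, typically bigger win is to avoid producing the 256 × 230400 logits at all, e.g. by folding the argmax into the Core ML model itself before conversion, but that changes the pipeline rather than the Swift loop.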
Replies: 3 · Boosts: 0 · Views: 692 · Activity: Feb ’25
Error thrown while using the speech recognition service in my app
Recently I updated to Xcode 14.0. I am building an iOS app to convert recorded audio into text. I got an exception while testing the application in the simulator (iOS 16.0):

[SpeechFramework] -[SFSpeechRecognitionTask handleSpeechRecognitionDidFailWithError:]_block_invoke Ignoring subsequent recongition error: Error Domain=kAFAssistantErrorDomain Code=1101 "(null)"
Error Domain=kAFAssistantErrorDomain Code=1107 "(null)"

I'd like to know what these error codes mean and why this error occurred.
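These kAFAssistantErrorDomain codes aren't publicly documented; reports in threads like this one often associate Code=1101 with on-device (local) recognition failing, particularly in the simulator. Purely as a hedged sketch, the usual checks before starting a task look like this — whether requiresOnDeviceRecognition = false actually sidesteps 1101 is an assumption, not a documented fix:

import Speech

func startRecognition(recognizer: SFSpeechRecognizer,
                      request: SFSpeechAudioBufferRecognitionRequest) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized else { return }
        // isAvailable is false when the recognition service can't be reached.
        guard recognizer.isAvailable else { return }
        // Assumption: allowing server-side recognition avoids local-recognition
        // failures that surface as Code=1101 in the simulator.
        request.requiresOnDeviceRecognition = false
        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result { print(result.bestTranscription.formattedString) }
            if let error { print("Recognition error: \(error)") }
        }
    }
}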
Replies: 20 · Boosts: 3 · Views: 12k · Activity: Feb ’25
overnightForecast data from the weatherResponse.dailyForecast.forecast array
Background: We are using Apple WeatherKit in an app to fetch future weather data, specifically cloudCover, condition, and precipitationChance from the overnightForecast data of the weatherResponse.dailyForecast.forecast array.

Code sample (Xcode, Swift and SwiftUI, using Apple WeatherKit):

task {
    do {
        let weatherResponse = try await self.weatherService.forecast(for: latLong)
        let cloudCover = weatherResponse.dailyForecast.forecast[0].overnightForecast.cloudCover

Issue: While in debug mode, we can literally see overnightForecast.cloudCover data (as well as condition and precipitationChance) available from index 0 to index 10 of the weatherResponse.dailyForecast.forecast array, but when we try to assign this property it throws the error "overnightforecast object not present under weatherResponse.dailyForecast.forecast array object".
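A hedged sketch of the same flow using the documented API names — note the camelCase, since "overnightforecast" in all lowercase won't resolve, which may be what the error is really complaining about. This assumes the newer DayWeather.overnightForecast part-day forecast API and that cloudCover on it is a Double:

import WeatherKit
import CoreLocation

func overnightCloudCover(for location: CLLocation) async throws -> Double? {
    // weather(for:) is the documented WeatherService entry point.
    let weather = try await WeatherService.shared.weather(for: location)
    guard let today = weather.dailyForecast.forecast.first else { return nil }
    // Assumption: overnightForecast also exposes condition and precipitationChance.
    return today.overnightForecast.cloudCover
}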
Replies: 3 · Boosts: 0 · Views: 362 · Activity: Feb ’25
Quick look preview is closing sheet by accident on visionOS
Hello! I have a simple app that opens a sheet, and when you press a button on the sheet it opens a Quick Look preview of a picture. That works great, but when I exit the Quick Look preview it closes the sheet too. This seems like unexpected behavior because it doesn't happen on iOS. Any help is appreciated, thank you. Here is a simple repro:

import QuickLook
import SwiftUI

struct ContentView: View {
    @State private var pictureURL: URL?
    @State private var openSheet = false

    var body: some View {
        Button("Open Sheet") {
            openSheet = true
        }
        .sheet(isPresented: $openSheet) {
            Button("Open Picture") {
                pictureURL = URL(fileURLWithPath: "someImagePath")
            }
            // When quick look closes it will close the sheet too.
            .quickLookPreview($pictureURL)
        }
    }
}

And here is a quick video:
Replies: 1 · Boosts: 1 · Views: 297 · Activity: Jan ’25
@Environment(\.dismiss) causes endless loop in iOS 18.2.1
Hello, I've been using @Environment(\.dismiss) var dismiss in a SwiftUI app for the last 2 years, which means it was working as expected in iOS 16, iOS 17, and for the most part iOS 18, up until iOS 18.2.1, in which it causes an endless loop and eventually a crash. It seems to be something about using @Environment(\.dismiss) with a NavigationLink that causes this issue. When I add a log to my SwiftUI views with let _ = Self._printChanges(), I see the following printed out in a loop:

CurrentProjectView: _dismiss changed.
SurveyView: @self changed.

Similar issues have been reported:
https://forums.developer.apple.com/forums/thread/720803
https://forums.developer.apple.com/forums/thread/739512

Any idea how to resolve this?
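One workaround pattern discussed in threads like those linked above (a sketch, not a confirmed fix for 18.2.1): stop reading \.dismiss inside the view that sits in the NavigationLink hierarchy, and drive dismissal from explicit state instead, so the view no longer re-renders when the dismiss environment value changes. The view names here are illustrative:

import SwiftUI

struct ParentView: View {
    @State private var showSurvey = false

    var body: some View {
        NavigationStack {
            Button("Start Survey") { showSurvey = true }
                .navigationDestination(isPresented: $showSurvey) {
                    // The child gets a closure instead of @Environment(\.dismiss).
                    SurveyView(onDone: { showSurvey = false })
                }
        }
    }
}

struct SurveyView: View {
    let onDone: () -> Void
    var body: some View {
        Button("Done", action: onDone)
    }
}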
Replies: 2 · Boosts: 1 · Views: 589 · Activity: Jan ’25
Calling Async Functions in SwiftUI
Hi, I'd like to call an async function upon a state change or onAppear() but I'm not sure how to do so. Below is my code:

.onAppear() {
    if !subscribed {
        await Subscriptions().checkSubscriptionStatus()
    }
}

class Subscriptions {
    var subscribed = UserDefaults.standard.bool(forKey: "subscribed")

    func checkSubscriptionStatus() async {
        if !subscribed {
            await loadProducts()
        }
    }

    func loadProducts() async {
        for await purchaseIntent in PurchaseIntent.intents {
            // Complete the purchase workflow.
            await purchaseProduct(purchaseIntent.product)
        }
    }

    func purchaseProduct(_ product: Product) async {
        // Complete the purchase workflow.
        do {
            try await product.purchase()
        } catch {
            // Add your error handling here.
        }
        // Add your remaining purchase workflow here.
    }
}
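For context on the question itself: onAppear takes a synchronous closure, so await isn't allowed inside it directly. A sketch of the two standard patterns, reusing the names from the code above — the task modifier (which also cancels the work when the view disappears), and task(id:) for reacting to a state change:

import SwiftUI

struct SubscriptionGate: View {
    @State private var subscribed = UserDefaults.standard.bool(forKey: "subscribed")

    var body: some View {
        Text(subscribed ? "Subscribed" : "Not subscribed")
            // Runs when the view appears; this is an async context, so await works.
            .task {
                if !subscribed {
                    await Subscriptions().checkSubscriptionStatus()
                }
            }
            // Re-runs whenever `subscribed` changes value.
            .task(id: subscribed) {
                // react to the state change here
            }
    }
}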
Replies: 1 · Boosts: 0 · Views: 366 · Activity: Jan ’25
Swift Student Challenge Playgrounds app in 3 min
I intend to participate in the Swift Student Challenge '25. The Rules mention that an app playground submission should be a work that can be experienced in three minutes: "Create an interactive scene in an app playground that can be experienced within three minutes." However, my work does not meet this requirement. Initially, my work was not intended for the Challenge but for the App Store. I then decided to submit it to the Challenge, and my work and I meet the Challenge's other requirements. But because my work is a complete application, it is impossible for the judges to experience it within three minutes; it may take more time. Does this have any impact?
Replies: 2 · Boosts: 0 · Views: 612 · Activity: Jan ’25
Clarification on App Tracking Transparency (ATT) and Cookie Banner Integration
We are currently using Single Sign-On (SSO) for user authentication within our app, which is presented through a web view. This web view includes a cookie banner that allows users to accept, reject all, or manage cookies. In some reviews, Apple suggests implementing App Tracking Transparency (ATT) if cookies are used. In other reviews, Apple may refer to guideline 5.1.2, which states: "Revise the app so that users are not required to enable tracking in order to access the app's content and functionality." I have a few questions regarding the interaction between ATT and the cookie banner:

1. Is App Tracking Transparency required for the cookie banner? If yes: iOS developers have no direct control over the cookies used on the webpage when the user selects "Ask App Not to Track" or "Allow". Despite this selection, the cookie banner still appears, prompting the user to accept or reject cookies.

2. How should App Tracking Transparency be implemented when a cookie banner is presented on a web page within an iOS app? Since iOS developers do not have control over the cookies stored in the web view, is there a way to manage this interaction so that users aren't repeatedly prompted by the cookie banner after selecting their tracking preference in ATT?

I would appreciate any guidance on how to properly implement ATT in this scenario, particularly when a web page within the app displays a cookie consent banner.
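For reference, the ATT prompt itself is the only piece the native app controls directly; a minimal sketch of requesting it is below. How the web page's cookie banner reacts to the result is governed by the site, which is the crux of the question:

import AppTrackingTransparency

func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Tracking allowed; the in-page cookie banner is still the site's decision.
            break
        case .denied, .restricted, .notDetermined:
            // Per guideline 5.1.2, app content must remain accessible here.
            break
        @unknown default:
            break
        }
    }
}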
Replies: 2 · Boosts: 2 · Views: 677 · Activity: Jan ’25
Scrolling to a `Section` cuts off header
Hello, I was wondering if this is expected behavior or if there is a way I can fix this to get the behavior I am expecting. I have a Form that has many sections in it, and when the content of a section is selected I would like that section to be scrolled to the top, to make it easier for the user to know they selected that section. But when the Section is selected, it is anchored to the top of the form with the header of the Section cut off. When the Section is anchored to the top I would like the whole section to be seen (the header, content, and footer). I also tried applying an ID to the section and scrolling to that, and that also didn't work. Any help would be appreciated. Here is some code to repro this:

struct ContentView: View {
    @State private var selectionSectionContent: SectionContent?

    var body: some View {
        ScrollViewReader { proxy in
            Form {
                ForEach(contents, id: \.self) { content in
                    Section {
                        Text(content.text)
                            .onTapGesture {
                                selectionSectionContent = content
                            }
                    } header: {
                        Text("Header")
                    } footer: {
                        Text("Footer")
                    }
                }
            }
            .onChange(of: selectionSectionContent) { _, newValue in
                if let newValue {
                    // When text is tapped, scroll that section to the top.
                    withAnimation {
                        proxy.scrollTo(newValue, anchor: .top)
                    }
                }
            }
            .padding()
        }
    }

    let contents: [SectionContent] = (0..<15).map { _ in SectionContent() } // 15 distinct sections
}

class SectionContent: Hashable {
    let text = "Fun Section"

    public var id: ObjectIdentifier { ObjectIdentifier(self) }

    static func == (lhs: SectionContent, rhs: SectionContent) -> Bool {
        lhs.id == rhs.id
    }

    func hash(into hasher: inout Hasher) {
        hasher.combine(id)
    }
}

Here is a GIF of the header getting cut off when it is pinned to the top.
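One more thing that might be worth trying — a sketch only, not verified against the pinned-header layout: attach the scroll-target id to the Section's header rather than to the row content, so the proxy anchors to the header itself. A compact self-contained variant of the repro above:

import SwiftUI

struct SectionScrollView: View {
    @State private var selected: Int?

    var body: some View {
        ScrollViewReader { proxy in
            Form {
                ForEach(0..<15, id: \.self) { index in
                    Section {
                        Text("Fun Section")
                            .onTapGesture { selected = index }
                    } header: {
                        Text("Header")
                            .id(index) // anchor on the header, not the row below it
                    } footer: {
                        Text("Footer")
                    }
                }
            }
            .onChange(of: selected) { _, newValue in
                if let newValue {
                    withAnimation { proxy.scrollTo(newValue, anchor: .top) }
                }
            }
        }
    }
}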
Replies: 1 · Boosts: 2 · Views: 358 · Activity: Jan ’25
Best `AVMediaType` for depth data.
Dear Apple Developer Forum,

I have a question regarding AVCaptureDevice on iOS. We're trying to capture photos in the best quality possible along with depth data of the highest accuracy possible. We were delighted when we saw that AVCaptureDevice could be initialized with AVMediaType = .depthData, which works as expected (depthData is part of the AVCapturePhoto). When setting AVMediaType = .video, we still receive depth data (of the same quality, according to our own internal tests). That confused us. Mind you, we set the device format and depth format as well:

private func getDeviceFormat() throws -> AVCaptureDevice.Format {
    // Ensures high video format and an appropriate color profile.
    let format = camera?.formats.first(where: {
        $0.isHighPhotoQualitySupported &&
        $0.supportedDepthDataFormats.count > 0 &&
        $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    })
    // Check and see if it's available.
    guard format != nil else {
        throw CaptureDeviceError.necessaryFormatNotAvailable
    }
    return format!
}

private func getDepthDataFormat(for format: AVCaptureDevice.Format) throws -> AVCaptureDevice.Format {
    // Access the depth format.
    let depthDataFormat = format.supportedDepthDataFormats.first(where: {
        $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_DepthFloat32
    })
    // Check if it exists
    guard depthDataFormat != nil else {
        throw CaptureDeviceError.necessaryFormatNotAvailable
    }
    // Returns it.
    return depthDataFormat!
}

We're wondering what steps we can take to ensure the best-quality photo along with the most accurate depth data. Which properties are the most important? Which have an effect, and which don't? Are there ways to optimize our current configuration? We find it difficult, as there are very limited guides and explanations of the media subtypes, for example kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. Is it the best? Is it the best for our use case of high-quality photo + most accurate depth data?

Important comment: Our app only runs on iPhone 14 Pro, iPhone 15 Pro, and iPhone 16 Pro on the latest iOS versions.

We hope someone with greater knowledge at Apple can help us and guide us on how we can get photos of the best quality and depth data with the most accuracy. Thank you very much!

Kind regards.
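Complementing the format selection above, the per-capture settings also gate depth quality. A hedged sketch of the standard knobs, assuming an AVCapturePhotoOutput-based pipeline (photoOutput here is a placeholder for your configured output):

import AVFoundation

func makePhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    // Depth delivery must be enabled on the output before each capture can opt in.
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
        settings.isDepthDataDeliveryEnabled = true
        settings.isDepthDataFiltered = false // raw, unsmoothed depth — often preferred for accuracy
    }
    // Trade capture speed for image quality.
    settings.photoQualityPrioritization = .quality
    return settings
}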
Replies: 0 · Boosts: 0 · Views: 364 · Activity: Jan ’25
Content Filter Permission Prompt Not Appearing in TestFlight
I added a Content Filter to my app, and when running it in Xcode (Debug/Release), I get the expected permission prompt: "Would like to filter network content (Allow / Don't Allow)". However, when I install the app via TestFlight, this prompt doesn’t appear at all, and the feature doesn’t work. Is there a special configuration required for TestFlight? Has anyone encountered this issue before? Thanks!
Replies: 1 · Boosts: 0 · Views: 274 · Activity: Jan ’25
Using protocols with XPC C API instead of dictionaries for sending and receiving messages
I have followed this post for creating a Launch Agent that provides an XPC service on macOS using Swift: https://rderik.com/blog/creating-a-launch-agent-that-provides-an-xpc-service-on-macos/

In the Swift code, the interface of the XPC service is defined by protocols, which makes the code nice and neat. I want to implement the XPC service using the C APIs for XPC, but the C APIs send and receive messages using dictionaries, which need manual handling with conditional statements. I want to know if it's possible to use the protocol-based approach with the C APIs.
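The C API has no equivalent of NSXPCInterface protocols, but you can layer one on yourself: keep a Swift protocol as the public interface and translate each call to and from an xpc dictionary keyed by a method name. A minimal sketch — the Calculator protocol, the "method" key, and the field names are illustrative, not from the linked post:

import XPC

// Illustrative interface; the C layer only ever sees dictionaries.
protocol Calculator {
    func add(_ a: Int64, _ b: Int64) -> Int64
}

// Server side: dispatch on a "method" key instead of ad hoc conditionals everywhere.
func handle(_ event: xpc_object_t, calculator: Calculator, connection: xpc_connection_t) {
    guard xpc_get_type(event) == XPC_TYPE_DICTIONARY,
          let methodPtr = xpc_dictionary_get_string(event, "method"),
          let reply = xpc_dictionary_create_reply(event) else { return }
    switch String(cString: methodPtr) {
    case "add":
        let a = xpc_dictionary_get_int64(event, "a")
        let b = xpc_dictionary_get_int64(event, "b")
        xpc_dictionary_set_int64(reply, "result", calculator.add(a, b))
    default:
        xpc_dictionary_set_string(reply, "error", "unknown method")
    }
    xpc_connection_send_message(connection, reply)
}

On the client side you would mirror this: a type conforming to Calculator whose methods build the dictionary and call xpc_connection_send_message_with_reply_sync, so the dictionary plumbing stays confined to one translation layer.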
Replies: 2 · Boosts: 0 · Views: 454 · Activity: Jan ’25