Search results for "SwiftUI List performance"
50,605 results found

Post · Replies · Boosts · Views · Activity
Reply to Listing files of a background asset
Hello! There’s not currently an API to do that directly, but you can achieve the same thing with AssetPackManager.url(for:). If you pass a directory path, such as "NAV/", to that method, then you can use the standard methods on FileManager from Foundation, including contentsOfDirectory(atPath:), to explore the contents of the directory at the returned URL. For example:

    let url = try AssetPackManager.shared.url(for: "NAV")
    let itemsInDirectory = try FileManager.default.contentsOfDirectory(atPath: url.path(percentEncoded: false))
    for itemName in itemsInDirectory {
        let fullURL = url.appending(component: itemName)
        // Do something with the item at fullURL…
    }

If a dedicated method in Background Assets to list the contents of a directory inside your asset packs would be useful, then please file a feedback report with details about your use case (what you mentioned in your original post should suffice) and reply to this thread with the feedback ID. Let us know if you need further assistance!
Topic: App & System Services SubTopic: General Tags:
Jan ’26
SwiftUI .task does not update its references on view update
I have this sample code:

    import SwiftUI

    struct ContentView: View {
        var body: some View {
            ParentView()
        }
    }

    struct ParentView: View {
        @State var id = 0
        var body: some View {
            VStack {
                Button {
                    id += 1
                } label: {
                    Text("update id by 1")
                }
                TestView(id: id)
            }
        }
    }

    struct TestView: View {
        var sequence = DoubleGenerator()
        let id: Int
        var body: some View {
            VStack {
                Button {
                    sequence.next()
                } label: {
                    Text("print next number").background(content: { Color.green })
                }
                Text("current id is \(id)")
            }.task {
                for await number in sequence.stream {
                    print("next number is \(number)")
                }
            }
        }
    }

    final class DoubleGenerator {
        private var current = 1
        private let continuation: AsyncStream<Int>.Continuation
        let stream: AsyncStream<Int>

        init() {
            var cont: AsyncStream<Int>.Continuation!
            self.stream = AsyncStream { cont = $0 }
            self.continuation = cont
        }

        func next() {
            guard current >= 0 else {
                continuation.finish()
                return
            }
            continuation.yield(current)
            current &*= 2
        }
    }

The print statement is only ever executed if I don't tap the "update id by 1" button.
1
0
174
Jan ’26
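A commonly suggested way to address the behavior described in the post above is the .task(id:) modifier, which cancels and restarts the task whenever the given value changes, so the running closure always captures the current view state. A minimal sketch adapted from the post's code (not a confirmed fix for this exact sample):

```swift
import SwiftUI

// Sketch: .task(id:) restarts the async work whenever `id` changes,
// so the awaiting loop belongs to the latest view value rather than
// the one captured when the view first appeared.
struct TestView: View {
    var sequence = DoubleGenerator()
    let id: Int

    var body: some View {
        VStack {
            Button("print next number") { sequence.next() }
            Text("current id is \(id)")
        }
        .task(id: id) {   // cancelled and restarted on each id change
            for await number in sequence.stream {
                print("next number is \(number)")
            }
        }
    }
}
```

One caveat: because `sequence` here is recreated along with the view, restarting the task also switches to the new generator's stream; whether that is the desired behavior depends on where the generator is meant to live (for example, in the parent as @State).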
Reply to SwiftUI .task does not update its references on view update
I tried to follow your application and its current objective. The button you showed hits this guard, which returns without incrementing the counter:

    guard current >= 0 else {
        continuation.finish()
        return
    }

The button in the parent view does increment the counter, but that is not the button you are referring to:

    Button {
        id += 1
    } label: {
        Text("update id by 1")
    }

Just let me know what the goal is, and I'm sure many developers here can help provide the SwiftUI code you need. Albert Pascual
  Worldwide Developer Relations.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jan ’26
Issue with SwiftPM and multiple targets
Hi! I have a bigger Xcode project that has multiple targets:

- the main app (MainApp)
- a helper command line tool (HelperTool)

The project also lists multiple package dependencies via SwiftPM, and these dependencies have dependencies between themselves. One such package produces a dynamic library, MyFramework, which is a dependency of both the main app and HelperTool, which has it listed under Build Phases > Dependencies as well as under Link with Frameworks. This builds just fine, but the issue comes when I want to add another target called AdditionalHelperTool, which has pretty much the same dependencies as HelperTool. When I add this second target, I start running into issues like the following:

    Multiple commands produce '[...]/Build/Products/Debug/Frameworks/MyFramework.framework/Versions/A'
    Target 'HelperTool' (project 'MyApp') has copy command from '[...]/Build/Products/Debug/PackageFrameworks/MyFramework.framework' to '[...]/Build/Products/Debug/Frameworks/MyFramework.framework'
3
0
137
Jan ’26
Reply to Can LiveActivityIntent open the app when tapping a Live Activity button on Lock Screen & Dynamic Island expanded view?
Thank you for your post. You have accurately observed the behavior of Live Activities! It is not possible to open an app using a LiveActivityIntent. A LiveActivityIntent is designed for background execution. Its purpose is to perform a specific action within your app’s process without necessarily bringing the app to the foreground or displaying its UI. This is also why Live Activity updates are driven by push notifications. In my opinion, this is an intentional design: a LiveActivityIntent focuses on providing quick, actionable interactions. If every tap on a LiveActivity button brought the app to the foreground, it could lead to a disruptive user experience. For the url parameter in your activityButton action, use a Universal Link. Universal Links open directly in your app if it is installed; if it is not, they can gracefully fall back to a web page for installation. I hope this information is helpful, but I'm looking for other developers' ideas and recommendations as well. Albert Pascual
  Worldwide Developer Relations.
Jan ’26
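As a sketch of the Universal Link suggestion in the reply above (the URL and view content are placeholders, not from the original thread), the whole Live Activity view can carry a tap URL via widgetURL(_:):

```swift
import SwiftUI

// Sketch: tapping the Live Activity opens this URL; if it is a
// Universal Link, it routes into the app when installed and can
// fall back to a web page otherwise.
struct DeliveryActivityView: View {
    var body: some View {
        VStack {
            Text("Delivery en route")
        }
        .widgetURL(URL(string: "https://example.com/track/123")) // placeholder Universal Link
    }
}
```

This sets the URL for the whole presentation rather than a single button, so it complements (rather than replaces) background-running LiveActivityIntent buttons.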
Hybrid Wired-to-Wireless Audio Mode Using AirPods Charging Case
Many Apple users own both Bluetooth earphones (AirPods) and traditional wired earphones. While Bluetooth audio provides freedom of movement, some users still prefer wired earphones for comfort, sound profile, or personal preference. However, plugging wired earphones directly into an iPhone can feel restrictive and inconvenient during daily use. This proposal suggests a hybrid audio approach where wired earphones can be connected to a Bluetooth-enabled AirPods charging case (or a similar Apple-designed module), allowing users to enjoy wired earphones without a physical connection to the iPhone.

# Problem Statement

* Wired earphones offer consistent audio quality and zero latency
* Bluetooth earphones provide freedom from cables
* Users must currently choose one or the other
* Plugging wired earphones into an iPhone limits movement and can feel intrusive in daily scenarios (walking, commuting, working)

There is no native Apple solution that allows wired earphones to function wirelessly while maintaining Apple’s audi
1
0
291
Jan ’26
Accessible Speech Practice App - R Helper Launch
Hi Community, I'm excited to share R Helper, a speech practice app I built with accessibility as the core focus from day one.

App Store: https://apps.apple.com/app/speak-r-clearly/id6751442522

WHY I BUILT THIS
I personally struggled with R sound pronunciation growing up. It affected my confidence in school and job interviews. That experience taught me how important accessible practice tools are. R Helper helps children and adults practice R sounds with full accessibility support.

ACCESSIBILITY FEATURES IMPLEMENTED
- VoiceOver - complete navigation and feedback
- Voice Control - hands-free operation
- Dynamic Type - scales to large accessibility sizes
- Reduce Motion - respects user preference
- Dark Mode - user controllable
- High Contrast compatibility
- Differentiate Without Color

THE CHALLENGE
Most speech practice apps ignore accessibility. I wanted to change that and prove that specialized educational apps can be fully accessible.

KEY FEATURES
- Works 100% offline, no internet needed
- Zero data collection, privacy first
G
2
0
1.5k
Jan ’26
Does accessing multiple Keychain items with .userPresence force multiple biometric prompts despite reuse duration?
Hi everyone, I'm working on an app that stores multiple secrets in the Keychain, each protected with .userPresence. My goal is to authenticate the user once via FaceID/TouchID and then read multiple Keychain items without triggering subsequent prompts. I am reusing the same LAContext instance for these operations, and I have set:

    context.touchIDAuthenticationAllowableReuseDuration = LATouchIDAuthenticationMaximumAllowableReuseDuration

However, I'm observing that every single SecItemCopyMatching call triggers a new FaceID/TouchID prompt, even if they happen within seconds of each other using the exact same context. Here is a simplified flow of what I'm doing:

1. Create a LAContext.
2. Set touchIDAuthenticationAllowableReuseDuration to max.
3. Perform a query (SecItemCopyMatching) for Item A, passing [kSecUseAuthenticationContext: context]. Result: the system prompts for FaceID. Success.
4. Immediately perform a query (SecItemCopyMatching) for Item B, passing the same [kSecUseAuthenticationContext: context].
3
0
519
Jan ’26
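One approach sometimes suggested for the Keychain flow above is to authenticate the context once up front with evaluatePolicy, then attach that pre-authenticated context to each query. A sketch under that assumption (the service and account names are hypothetical, not from the original post):

```swift
import LocalAuthentication
import Security

let context = LAContext()
context.touchIDAuthenticationAllowableReuseDuration = LATouchIDAuthenticationMaximumAllowableReuseDuration

// Authenticate once, then reuse the same context for every query.
context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                       localizedReason: "Unlock your stored secrets") { success, _ in
    guard success else { return }

    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.secrets",  // hypothetical
        kSecAttrAccount as String: "itemA",                // hypothetical
        kSecReturnData as String: true,
        kSecUseAuthenticationContext as String: context,
    ]
    var result: CFTypeRef?
    let status = SecItemCopyMatching(query as CFDictionary, &result)
    // Repeat with the same `context` for item B; if the items' access
    // control allows reuse, no further prompt should appear.
    _ = status
}
```

Whether this avoids repeated prompts also depends on how each item's SecAccessControl was created (.userPresence vs. .biometryCurrentSet, for example), so it is a sketch to experiment with rather than a guaranteed fix.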
Reply to Signing succeeds but validate fails with "Missing code-signing certificate"
[quote='872597022, davertay-j, /thread/812770?answerId=872597022#872597022, /profile/davertay-j'] there is mention of the first certificate is the one that matters - why? [/quote] The context here really matters: In a provisioning profile, the certificates act as an allowlist. So the profile holds a list of leaf certificates that it authorises. In a code signature, the certificates act as a chain of trust [1]. The first certificate is the leaf, the next is the one that issued the leaf, and so on until you get to a root. So when the trusted execution system evaluates code for execution, it checks whether the first certificate in the code signature is in the list of certificates in the profile. Regarding your original issue, when you check certificates it’s critical that you look at the serial number. That’s what matters when it comes to matching. If you double check that and the certificate that signed the code is in the profile then the next thing to check is whether this is the right type o
Jan ’26
AVSpeechSynthesizer & Bluetooth Issues
Hello, I have a CarPlay navigation app and utilize the AVSpeechSynthesizer to speak directions to a user. Everything works great on my CarPlay simulator as well as when plugged into my GMC truck. However, I found out yesterday that for one of my users with a Ford truck the audio would cut in and out. After much troubleshooting, I was able to replicate this on my own truck when using Bluetooth to connect to CarPlay. My user was also utilizing Bluetooth. Has anyone else experienced this? Is there a fix to the problem?

    import SwiftUI
    import AVFoundation

    class TextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
        private var speechSynthesizer = AVSpeechSynthesizer()
        static let shared = TextToSpeechService()

        override init() {
            super.init()
            speechSynthesizer.delegate = self
        }

        func configureAudioSession() {
            speechSynthesizer.delegate = self
            do {
                try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers, .allowBluetooth])
            } catch {
                print("Failed
1
0
742
Jan ’26
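One pattern sometimes suggested for intermittent Bluetooth audio like the report above is to activate the session just before speaking and deactivate it with .notifyOthersOnDeactivation when the utterance finishes, optionally using .duckOthers instead of .mixWithOthers. This is a sketch of that idea, not a confirmed fix for the Ford head-unit behavior:

```swift
import AVFoundation

// Sketch: activate the session per utterance; .duckOthers lowers other
// audio while the prompt plays, .allowBluetooth permits HFP routes
// (assumption: this may trade fidelity for routing reliability).
func speakDirection(_ text: String, with synthesizer: AVSpeechSynthesizer) {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .voicePrompt,
                                options: [.duckOthers, .allowBluetooth])
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
    synthesizer.speak(AVSpeechUtterance(string: text))
}

// In the AVSpeechSynthesizerDelegate, release the session when done:
// func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
//                        didFinish utterance: AVSpeechUtterance) {
//     try? AVAudioSession.sharedInstance()
//         .setActive(false, options: .notifyOthersOnDeactivation)
// }
```

Deactivating between utterances gives the car's head unit a clean hand-back of the audio route, which is the behavior some head units appear to need.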
Reply to NSHostingView stops receiving mouse events when layered above another NSHostingView (macOS Tahoe 26.2)
Thanks, but if you look at the demo code in my post: https://stackoverflow.com/questions/79862332/nshostingview-with-swiftui-gestures-not-receiving-mouse-events-behind-another-ns, I tried logging the hit tests for both the top and middle NSHostingViews. The logs confirm that clicks are reaching both views, but the middle layer’s button with the DragGesture still doesn’t respond. Do you know how your solution could be used here?
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jan ’26
NSHostingView stops receiving mouse events when layered above another NSHostingView (macOS Tahoe 26.2)
I’m running into a problem with SwiftUI/AppKit event handling on macOS Tahoe 26.2. I have a layered view setup: Bottom: AppKit NSView (NSViewRepresentable) Middle: SwiftUI view in an NSHostingView with drag/tap gestures Top: Another SwiftUI view in an NSHostingView On macOS 26.2, the middle NSHostingView no longer receives mouse or drag events when the top NSHostingView is present. Events pass through to the AppKit view below. Removing the top layer immediately restores interaction. Everything works correctly on macOS Sequoia. I’ve posted a full reproducible example and detailed explanation on Stack Overflow, including a single-file demo: Stack Overflow post: https://stackoverflow.com/q/79862332 I also found a related older discussion here, but couldn’t get the suggested workaround to apply: https://developer.apple.com/forums/thread/759081 Any guidance would be appreciated. Thanks!
4
0
351
Jan ’26
Defining a Foundation Models Tool with arguments determined at runtime
I'm experimenting with Foundation Models and I'm trying to understand how to define a Tool whose input argument is defined at runtime. Specifically, I want a Tool that takes a single String parameter that can only take certain values defined at runtime. I think my question is basically the same as this one: https://developer.apple.com/forums/thread/793471 However, the answer provided by the engineer doesn't actually demonstrate how to create the GenerationSchema. Trying to piece things together from the documentation that the engineer linked to, I came up with this:

    let citiesDefinedAtRuntime = ["London", "New York", "Paris"]
    let citySchema = DynamicGenerationSchema(
        name: "CityList",
        properties: [
            DynamicGenerationSchema.Property(
                name: "city",
                schema: DynamicGenerationSchema(
                    name: "city",
                    anyOf: citiesDefinedAtRuntime
                )
            )
        ]
    )
    let generationSchema = try GenerationSchema(root: citySchema, dependencies: [])
    let tools = [CityInfo(parameters: generationSchema)]
    let session = LanguageModelSession(tools: tools, instructions:
2
0
1.2k
Jan ’26