Whenever I start editing a TextField, or while editing it, Xcode shows this warning, and it takes a few seconds to show the keyboard.
There is no 'availabilityDetailedInfo' in my source code, and I could not find similar errors on the internet.
Can't find or decode availabilityDetailedInfo
unavailableReasonsHelper: Failed to get or decode availabilityDetailedInfo
Can't find or decode reasons
unavailableReasonsHelper: Failed to get or decode unavailable reasons as well
import SwiftUI

// Hashable conformance is required for Product? to be used as a Picker selection value.
struct Product: Identifiable, Hashable {
    let id = UUID()
    let name: String
    let pricePerKg: Double
}

struct ContentView: View {
    @State private var selectedProduct: Product?
    @State private var quantity: Double = 1.0
    @State private var orderDate = Date()
    @State private var showingConfirmation = false

    let products = [
        Product(name: "Lamb", pricePerKg: 15.0),
        Product(name: "Beef", pricePerKg: 20.0),
        Product(name: "Chicken", pricePerKg: 10.0)
    ]

    var body: some View {
        NavigationView {
            Form {
                Section(header: Text("Select Meat")) {
                    Picker("Meat Type", selection: $selectedProduct) {
                        ForEach(products) { product in
                            Text(product.name).tag(product as Product?)
                        }
                    }
                }
                if let selectedProduct = selectedProduct {
                    Section(header: Text("Quantity (kg)")) {
                        Stepper(value: $quantity, in: 0.5...10, step: 0.5) {
                            Text("\(quantity, specifier: "%.1f") kg")
                        }
                    }
                    Section(header: Text("Delivery Date")) {
                        DatePicker("Select Date", selection: $orderDate, in: Date()..., displayedComponents: .date)
                    }
                    Section(header: Text("Total Price")) {
                        Text("$\(selectedProduct.pricePerKg * quantity, specifier: "%.2f")")
                    }
                    Button("Confirm Order") {
                        showingConfirmation = true
                    }
                    .alert(isPresented: $showingConfirmation) {
                        Alert(title: Text("Order Confirmed"), message: Text("You have ordered \(quantity, specifier: "%.1f") kg of \(selectedProduct.name) for \(orderDate.formatted(date: .long, time: .omitted))."), dismissButton: .default(Text("OK")))
                    }
                }
            }
            .navigationTitle("Halal Butcher")
        }
    }
}

@main
struct HalalButcherApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
Hey,
It seems that when Apple Intelligence is enabled, scrolling can become completely broken while using an app. This is affecting several apps, including Telegram:
https://github.com/TelegramMessenger/Telegram-iOS/issues/1570?reload=1
It seems that UIPanGestureRecognizer is affected by this (MapKit stops being able to scroll too).
Killing and relaunching the app fixes the problem.
Bug report ID, containing a video: FB16780431
I am developing an app in SwiftUI using Xcode 12.3, with a deployment target of iOS 14.0. The launch screen is set up through Info.plist by specifying 'background color' and 'image name'. The file used in 'image name' is from the asset catalog (PNG format, size 300 x 300, with corresponding @2x and @3x resolutions). What I have observed is that when the app is installed for the first time, the launch image is centered at its original resolution, but all subsequent launches show the launch image stretched to cover the full screen. Any ideas why this is happening and how to get more consistent behavior either way?
I have tried the 'respect safe area' option, but it does not make a difference.
Thank you.
I have an NSViewController as the root view controller, with a SwiftUI view embedded in it via NSHostingView.
override func loadView() {
    self.view = NSHostingView(rootView: SwiftUiView())
}
In the SwiftUiView, I have a TextField and an NSTextView embedded using NSViewRepresentable, along with a few buttons. There is also a menu:
Menu {
    ForEach(menuItems, id: \.self) { item in
        Button {
            buttonClicked()
        } label: {
            Text(item)
        }
    }
} label: {
    Image("DropDown")
        .contentShape(Rectangle())
        .frame(maxWidth: .infinity)
        .frame(maxHeight: .infinity)
}
The NSTextView and TextField work fine, and I can type in them until I click on the menu or show an alert. After that, I can no longer place my cursor in the text fields. I am able to select the text but not type in it. When I click on the NSTextView or TextField, nothing happens.
At first, I thought it was just a cursor visibility issue and tried typing, but I received an alert sound. I've been trying to fix this for a couple of days and haven't found any related posts. Any help would be greatly appreciated.
The issue is that I cannot automatically acquire Bluetooth keyboard focus in PHPickerViewController after enabling 'Full Keyboard Access' on my iPhone 14 running iOS 18.3.1. The keyboard focus in PHPickerViewController does show, but only after I tap on blank space in the picker. How can I make the focus appear in the first place?
I'm using UINavigationController and calling setNavigationBarHidden(true, animated: false). Then I use this controller to present a PHPickerViewController with the configuration setup below.
self.configuration = PHPickerConfiguration()
configuration.filter = .any(of: filters)
configuration.selectionLimit = selectionLimit
if #available(iOS 15.0, *), allowOrdering {
    configuration.selection = .ordered
}
configuration.preferredAssetRepresentationMode = .current
Finally, I set the delegate on the PHPickerViewController and call UINavigationController.present(PHPickerViewController, animated: true) to render it.
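For context, that presentation step looks roughly like this (a sketch, assuming self conforms to PHPickerViewControllerDelegate and navigationController is the controller mentioned above):

// Sketch of the presentation step described above (names assumed).
let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self // conforms to PHPickerViewControllerDelegate
navigationController.present(picker, animated: true)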
Also, I notice the animation shows in the first video and then disappears.
This is an issue that occurred while using SwiftUI.
Cannot find '$state' in scope
Other views find properties normally.
May I know why the error is occurring?
The following is the full code that causes the problem.
import SwiftUI

@Observable
class HomeState {
    var title: String = "Home"
}

struct HomeView: View {
    @Binding var state: HomeState

    var body: some View {
        Text(state.title)
    }
}

#Preview {
    @Previewable @State var state: HomeState = .init()
    HomeView(state: $state) /// Error: Cannot find '$state' in scope
}
The same error occurs when using the String type rather than the object.
What did I do wrong?
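For comparison, here is a minimal sketch that compiles under the same setup, assuming the goal is simply to read and bind the @Observable model: pass the object itself, using @Bindable instead of @Binding, so no $state projection is needed.

struct HomeView: View {
    @Bindable var state: HomeState // @Bindable works with @Observable classes

    var body: some View {
        Text(state.title)
    }
}

#Preview {
    @Previewable @State var state: HomeState = .init()
    HomeView(state: state) // pass the object directly, no $ projection
}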
I have a simple SwiftUI project with two basic build configurations (Debug, Release) as shown below.
I now choose Product > Scheme > Edit Scheme and select Release as the current build configuration, as shown below.
The Preview canvas then exhibits errors.
If I click on the Diagnostics button, it says under PREVIEW UPDATE ERROR
OptimizationLevelError: not building -Onone
"BuildSchemeCrazyDaughter.app" needs -Onone Swift optimization level to use previews (current setting is -O)
What does that mean and why don't I get the preview for the Release build configuration? Thanks.
Perhaps I just have the wrong expectations, but I discovered some odd behavior from SwiftData that sure seems like a bug to me...
If you make any change to any SwiftData model object — even just setting a property to its current value — every SwiftUI view that uses SwiftData is rebuilt. Every query and every entity reference, even if the property was set on a model class that is completely unrelated to the view.
SwiftUI does such a good job of optimizing UI updates that it's hard to notice the issue. I only noticed it because the updates were triggering my debug print statements.
To double-check this, I went back to Apple's new iOS app template — the one that is just a list of dated items — and added a little code to touch an unrelated record in the background:
@Model
class UnrelatedItem {
    var name: String

    init(name: String) {
        self.name = name
    }
}

@main
struct jumpyApp: App {
    var sharedModelContainer: ModelContainer = {
        let schema = Schema([
            Item.self,
            UnrelatedItem.self
        ])
        let modelConfiguration = ModelConfiguration(schema: schema, isStoredInMemoryOnly: false)
        do {
            return try ModelContainer(for: schema, configurations: [modelConfiguration])
        } catch {
            fatalError("Could not create ModelContainer: \(error)")
        }
    }()

    init() {
        let context = sharedModelContainer.mainContext
        // Create 3 items at launch so we immediately have some data to work with.
        if try! context.fetchCount(FetchDescriptor<Item>()) == 0 {
            for _ in 0..<3 {
                let item = Item(timestamp: Date())
                context.insert(item)
            }
        }
        // Now create one unrelated item.
        let unrelatedItem = UnrelatedItem(name: "Mongoose")
        context.insert(unrelatedItem)
        try? context.save()
        // Set up a background task that updates the unrelated item every second.
        Task {
            while true {
                try? await Task.sleep(nanoseconds: 1_000_000_000)
                Task { @MainActor in
                    // We don't even have to change the name or save the context.
                    // Just setting the name to the same value will trigger a change.
                    unrelatedItem.name = "Mongoose"
                }
            }
        }
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .modelContainer(sharedModelContainer)
    }
}
I also added a print statement to the ContentView so I could see when the view updates.
struct ContentView: View {
    @Environment(\.modelContext) private var modelContext
    @Query private var items: [Item]

    var body: some View {
        NavigationSplitView {
            List {
                let _ = Self._printChanges()
                ...
The result is that the print statement logs 2 messages to the debug console every second. I checked in iOS 17, 18.1, and 18.2, and they all behave this way.
Is this the intended behavior? I thought the whole point of the new Observation framework in iOS 17 was to track which data had changed and only send change notifications to observers who were using that data.
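For reference, here is a minimal sketch (plain Observation, no SwiftData) of the property-level tracking that description implies; Counter and its properties are made up for illustration:

import Observation

@Observable
class Counter {
    var tracked = 0
    var untracked = 0
}

let counter = Counter()
withObservationTracking {
    _ = counter.tracked // only properties read here are registered
} onChange: {
    print("tracked changed") // fires once, on the next change to `tracked`
}
counter.untracked += 1 // not tracked: no notification
counter.tracked += 1   // tracked: triggers onChange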
I am trying to implement Live Activities in my app. I am following the Apple docs.
Link: https://developer.apple.com/documentation/activitykit/displaying-live-data-with-live-activities
Example code:
struct LockScreenLiveActivityView: View {
    let context: ActivityViewContext<PizzaDeliveryAttributes>

    var body: some View {
        VStack {
            Spacer()
            Text("\(context.state.driverName) is on their way with your pizza!")
            Spacer()
            HStack {
                Spacer()
                Label {
                    Text("\(context.attributes.numberOfPizzas) Pizzas")
                } icon: {
                    Image(systemName: "bag")
                        .foregroundColor(.indigo)
                }
                .font(.title2)
                Spacer()
                Label {
                    Text(timerInterval: context.state.deliveryTimer, countsDown: true)
                        .multilineTextAlignment(.center)
                        .frame(width: 50)
                        .monospacedDigit()
                } icon: {
                    Image(systemName: "timer")
                        .foregroundColor(.indigo)
                }
                .font(.title2)
                Spacer()
            }
            Spacer()
        }
        .activitySystemActionForegroundColor(.indigo)
        .activityBackgroundTint(.cyan)
    }
}
Actually, the code is pretty straightforward. We can use the timerInterval for count-down animation. But when the timer ends, I want to update the Live Activity view. If the user re-opens the app, I can update it, but what happens if the user doesn't open the app? Is there a way to update the live activity without using push notifications?
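For reference, the in-app update path mentioned above (updating when the user re-opens the app) looks roughly like this; a sketch, with the content-state fields inferred from the view code:

import ActivityKit
import Foundation

// Sketch: updating a running Live Activity from inside the app (iOS 16.2+).
// The state values below are placeholders inferred from the view above.
func refresh(_ activity: Activity<PizzaDeliveryAttributes>) async {
    let newState = PizzaDeliveryAttributes.ContentState(
        driverName: "TJ",
        deliveryTimer: Date.now...Date.now.addingTimeInterval(15 * 60)
    )
    await activity.update(ActivityContent(state: newState, staleDate: nil))
}

Note that this path only runs while the app itself gets execution time, which is the limitation the question is about.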
Using the standard Apple example at https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui
I only made a small change: printing, with the time, whenever a file is read.
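Roughly, the change was along these lines (a sketch against a FileDocument-style read initializer like the one in Apple's template; only the print is the addition):

// Hypothetical sketch: logging added to the document's read path.
init(configuration: ReadConfiguration) throws {
    print("reading file! at \(Date())")
    guard let data = configuration.file.regularFileContents else {
        throw CocoaError(.fileReadCorruptFile)
    }
    text = String(decoding: data, as: UTF8.self) // `text` stands in for the sample's document property
}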
When you use 'Revert to Saved', it writes the current version (expected), then loads the saved version (expected), then a few seconds later (without moving the mouse, making edits, etc.) it reloads the document again. Then, if you click away from the window, it loads it yet again - four times!
This repeated loading of the document breaks apps where loading may take longer (large documents): the document gets replaced while the user has already started editing the recently loaded copy.
This is a really bad bug. Any ideas?
Here are the added logs:
reading file! testfile.story at 2025-03-11 20:35:16 +0000
saving file! testfile.story at 2025-03-11 20:35:27 +0000
reading file! testfile.story at 2025-03-11 20:35:27 +0000
reading file! testfile.story at 2025-03-11 20:35:30 +0000
reading file! testfile.story at 2025-03-11 20:35:31 +0000
I see the same behavior with 'Revert To Last Opened'. It seems to work as expected when you browse all versions and pick a specific version.
struct ContentView: View {
    var body: some View {
        ScrollView(.vertical) {
            LazyVStack(spacing: 0) {
                ForEach(0..<10000) { index in
                    // If the VStack is removed, the memory issue occurs
                    // VStack {
                    CustomView(index: index)
                    // }
                }
            }
        }
    }
}

struct CustomView: View {
    var index: Int

    var body: some View {
        VStack {
            Text("\(index)")
        }
    }
}
I reduced it to this shorter and simpler version, and the issue still reproduces.
At first, I struggled to figure out why the initial code was causing lag. After investigating with the Debug Memory Graph, I found that the generated custom views' memory was not being released properly.
This seemed strange because I was using the custom view inside a lazy container (LazyVStack).
So, I tried various approaches to resolve the issue.
In the Debug Memory Graph, I started suspecting that SwiftUI’s built-in views like VStack and HStack might be affecting memory management. To test this, I wrapped my custom view inside a VStack, and the memory issue disappeared.
However, I want to understand why I need to include the custom view inside a VStack for proper memory management.
(I simplified this code by wrapping it into a shorter version. However, in a real project, the custom view is more complex, and the data list contains more than 10,000 items. This caused severe lag.)
Xcode 16.2; iOS 18, iOS 16
In a UIKit application, removing a view from the hierarchy is straightforward—we simply call myView.removeFromSuperview(). This not only removes myView from the UI but also deallocates any associated memory.
Now that I'm transitioning to SwiftUI, I'm struggling to understand the recommended way to remove a view from the hierarchy, given SwiftUI's declarative nature.
I understand that in SwiftUI, we declare everything that should be displayed. However, once a view is rendered, what is the correct way to remove it? Should all UI elements be conditionally controlled to determine whether they appear or not?
Below is an example of how I’m currently handling this, but it doesn’t feel like the right approach for dynamically removing a view at runtime.
Can someone guide me on the best way to remove views in SwiftUI?
struct ContentView: View {
    @State private var isVisible = true

    var body: some View {
        VStack {
            if isVisible { // set this to false to remove the Text view?
                Text("Hello, SwiftUI!")
                    .padding()
            }
            Button("Toggle View") {
                isVisible.toggle()
            }
        }
    }
}
Hello.
I have created a UI component using SwiftUI's TextField.
struct SearchBar: View {
    @Binding private var text: String

    var body: some View {
        HStack {
            TextField("", text: $text, prompt: Text("Search"))
                .textFieldStyle(.plain)
                .padding()
                .foregroundStyle(.white)
            Button {
                text = ""
            } label: {
                Image(systemName: "xmark")
                    .foregroundStyle(.black)
            }
        }
        .padding(.horizontal, 8)
        .background(RoundedRectangle(cornerRadius: 8).fill(.gray))
        .padding(.horizontal, 8)
    }

    init(text: Binding<String>) {
        _text = text
    }
}

struct ParentView: View {
    @State var text = ""

    var body: some View {
        SearchBar(text: $text)
    }
}
A String value is bound to the component, and the text is cleared when the xmark button on the right is pressed.
However, Japanese text is not cleared under certain conditions:
Type in Hiragana and press xmark without pressing "confirm" on the keyboard → can be cleared.
Type in Hiragana, press "confirm" on the keyboard, and then press xmark → cannot be cleared.
Convert the Japanese text to Kanji characters → can be cleared.
Does anyone know of a workaround?
I am using Xcode 16.0.
I am trying to use a Broadcast Upload Extension, but the broadcast picker starts its countdown and then stops (SwiftUI).
Steps I followed:
added BroadcastUploadExtension as a target
used the same App Group for the main app and the extension
added packages using SPM
It seems the extension functions are not getting triggered; I checked using UIScreen.main.isCaptured as well, which always comes back false. I tried using logs, which never appeared.
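For context, the picker is typically embedded in SwiftUI by wrapping RPSystemBroadcastPickerView in a UIViewRepresentable, roughly like this (the extension bundle identifier below is a placeholder):

import ReplayKit
import SwiftUI

// Sketch: embedding the system broadcast picker in SwiftUI.
struct BroadcastPickerView: UIViewRepresentable {
    func makeUIView(context: Context) -> RPSystemBroadcastPickerView {
        let picker = RPSystemBroadcastPickerView(frame: .zero)
        // Placeholder: use your Broadcast Upload Extension's bundle identifier.
        picker.preferredExtension = "com.example.MainApp.BroadcastUploadExtension"
        picker.showsMicrophoneButton = false
        return picker
    }

    func updateUIView(_ uiView: RPSystemBroadcastPickerView, context: Context) {}
}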
I am developing a macOS app using SwiftUI, and I am encountering an issue when launching the app at login. The app starts as expected, but the window does not appear automatically. Instead, it remains in the Dock, and the user must manually click the app icon to make the window appear.
Additionally, I noticed that the timestamp obtained during the app's initialization (init) differs from the timestamp obtained in .onAppear. This suggests that .onAppear does not trigger until the user interacts with the app. However, I want .onAppear to execute automatically upon login.
Steps to Reproduce
Build the app and add it to System Settings > General > Login Items as an item that opens at login.
Quit the app and restart the Mac.
Log in to macOS.
Observe that the app starts and appears in the Dock but does not create a window.
Click the app icon in the Dock, and only then does the window appear.
Expected Behavior
The window should be created and appear automatically upon login without requiring user interaction.
.onAppear should execute immediately when the app starts at login.
Observed Behavior
The app launches and is present in the Dock, but the window does not appear.
.onAppear does not execute until the user manually clicks the app icon.
A discrepancy exists between the timestamps obtained in init and .onAppear.
Sample Code
Here is a minimal example that reproduces the issue:
LoginTestApp.swift
import SwiftUI

@main
struct LoginTestApp: App {
    @State var date2: Date

    init() {
        date2 = Date()
    }

    var body: some Scene {
        WindowGroup {
            MainView(date2: $date2)
        }
    }
}
MainView.swift
import SwiftUI

struct MainView: View {
    @State var date1: Date?
    @Binding var date2: Date

    var body: some View {
        Text("This is MainView")
        Text("MainView created: \(date1?.description ?? "")")
            .onAppear {
                date1 = Date()
            }
        Text("App initialized: \(date2.description)")
    }
}
Test Environment
MacBook Pro 13-inch, M1, 2020
macOS Sequoia 15.2
Xcode 16.2
Questions
Is this expected behavior in macOS Sequoia 15.2?
How can I ensure that .onAppear executes automatically upon login?
Is there an alternative approach to ensure the window is displayed without user interaction?
Hi all, I am looking for a future-proof way of getting the screen resolution of my display device using SwiftUI on macOS. I understand that it can't really be done to the fullest extent, meaning that the closest API we have is GeometryProxy, and that would only give the resolution of the parent view, which on macOS would not give us the display's screen resolution. The only viable option I am left with is NSScreen's frame.
However, my issue is that Apple seems to be moving toward SwiftUI aggressively, and to future-proof my application I should not rely on AppKit methods too much. Hence my question: Is there a way to get the screen resolution of a display using SwiftUI that Apple itself recommends? If not, can I safely rely on NSScreen's frame API?
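For reference, the AppKit route under discussion looks roughly like this (a sketch; pixel dimensions derived from the point size via backingScaleFactor):

import AppKit

// Sketch: reading the current display's size via NSScreen.
if let screen = NSScreen.main {
    let points = screen.frame.size         // screen size in points
    let scale = screen.backingScaleFactor  // e.g. 2.0 on Retina displays
    let pixels = CGSize(width: points.width * scale,
                        height: points.height * scale)
    print("Screen: \(points) pt, \(pixels) px")
}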
Hello.
I am currently building an app using ARKit.
For the UI, I am using SwiftUI with NavigationStack + NavigationLink for navigation and screen transitions.
I need to go back and forth between the AR screen and other screens.
If the number of screen transitions is small, this is not a problem.
However, if the number of screen transitions increases to 10 or 20, the app crashes somewhere along the way.
We are struggling with this problem. (The nature of the application requires multiple screen transitions.)
The crash log showed the following.
error: read memory from 0x1e387f2d4 failed
AR_Crash_Sample-2025-03-07-115914.txt
Incident Identifier: B23D806E-D578-4A95-8828-2A1E8D6BB7F8
Beta Identifier: 924A85AB-441C-41A7-9BC2-063940BDAF32
Hardware Model: iPhone16,1
Process: AR_Crash_Sample [2375]
Path: /private/var/containers/Bundle/Application/FAC3D662-DB10-434E-A006-79B9515D8B7A/AR_Crash_Sample.app/AR_Crash_Sample
Identifier: ar.crash.sample.AR.Crash.Sample
Version: 1.0 (1)
AppStoreTools: 16C7015
AppVariant: 1:iPhone16,1:18
Beta: YES
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: ar.crash.sample.AR.Crash.Sample [1464]
Date/Time: 2025-03-07 11:59:14.3691 +0900
Launch Time: 2025-03-07 11:57:47.3955 +0900
OS Version: iPhone OS 18.3.1 (22D72)
Release Type: User
Baseband Version: 2.40.05
Report Version: 104
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: AR_Crash_Sample [2375]
Triggered by Thread: 7
Application Specific Information:
abort() called
Thread 7 name: Dispatch queue: com.apple.arkit.depthtechnique
Thread 7 Crashed:
0 libsystem_kernel.dylib 0x1e387f2d4 __pthread_kill + 8
1 libsystem_pthread.dylib 0x21cedd59c pthread_kill + 268
2 libsystem_c.dylib 0x199f98b08 abort + 128
3 libc++abi.dylib 0x21ce035b8 abort_message + 132
4 libc++abi.dylib 0x21cdf1b90 demangling_terminate_handler() + 320
5 libobjc.A.dylib 0x18f6c72d4 _objc_terminate() + 172
6 libc++abi.dylib 0x21ce0287c std::__terminate(void (*)()) + 16
7 libc++abi.dylib 0x21ce02820 std::terminate() + 108
8 libdispatch.dylib 0x199edefbc _dispatch_client_callout + 40
9 libdispatch.dylib 0x199ee65cc _dispatch_lane_serial_drain + 768
10 libdispatch.dylib 0x199ee7158 _dispatch_lane_invoke + 432
11 libdispatch.dylib 0x199ee85c0 _dispatch_workloop_invoke + 1744
12 libdispatch.dylib 0x199ef238c _dispatch_root_queue_drain_deferred_wlh + 288
13 libdispatch.dylib 0x199ef1bd8 _dispatch_workloop_worker_thread + 540
14 libsystem_pthread.dylib 0x21ced8680 _pthread_wqthread + 288
15 libsystem_pthread.dylib 0x21ced6474 start_wqthread + 8
Perhaps I am using too much memory!
How can I address this phenomenon?
For the AR functionality, we are using UIViewRepresentable so the UIKit-based AR view can be called from SwiftUI:
import ARKit
import AsyncAlgorithms
import AVFoundation
import SCNLine
import SwiftUI

internal struct MeasureARViewContainer: UIViewRepresentable {
    @Binding var tapCount: Int
    @Binding var distance: Double?
    @Binding var currentIndex: Int

    var focusSquare: FocusSquare = FocusSquare()
    let coachingOverlay: ARCoachingOverlayView = ARCoachingOverlayView()

    func makeUIView(context: Context) -> ARSCNView {
        let arView: ARSCNView = ARSCNView()
        arView.delegate = context.coordinator
        let configuration: ARWorldTrackingConfiguration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics = [.sceneDepth, .smoothedSceneDepth]
        }
        arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
        context.coordinator.sceneView = arView
        context.coordinator.scanTarget()
        coachingOverlay.session = arView.session
        coachingOverlay.delegate = context.coordinator
        coachingOverlay.goal = .horizontalPlane
        coachingOverlay.activatesAutomatically = true
        coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverlay.translatesAutoresizingMaskIntoConstraints = false
        arView.addSubview(coachingOverlay)
        return arView
    }

    func updateUIView(_ _: ARSCNView, context: Context) {
        context.coordinator.mode = MeasurementMode(rawValue: currentIndex) ?? .width
        if tapCount == 0 {
            context.coordinator.resetMeasurement()
            return
        }
        if distance != nil {
            return
        }
        DispatchQueue.main.async {
            if context.coordinator.distance == nil {
                context.coordinator.handleTap()
            }
        }
    }

    static func dismantleUIView(_ uiView: ARSCNView, coordinator: Coordinator) {
        uiView.session.pause()
        coordinator.stopScanTarget()
        coordinator.stopSpeech()
        DispatchQueue.main.async {
            uiView.removeFromSuperview()
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, ARSCNViewDelegate, ARSessionDelegate, ARCoachingOverlayViewDelegate {
        var parent: MeasureARViewContainer
        var sceneView: ARSCNView?
        var startPosition: SCNVector3?
        var pointedCount: Int = 0
        var distance: Float?
        var mode: MeasurementMode = .width
        let synthesizer: AVSpeechSynthesizer = AVSpeechSynthesizer()
        var scanTargetTask: Task<Void, Never>?
        var currentResult: ARRaycastResult?

        init(_ parent: MeasureARViewContainer) {
            self.parent = parent
        }

        // ... etc
    }
}
The documentation here states that the saveItem command group placement contains Save As as a default command, but it doesn't appear.
My document type specifies multiple 'writableContentTypes'; I expected this would enable Save As.
How do I do this?
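For what it's worth, commands can be added at that placement manually. Here is a hedged sketch (MyDocument and ContentView stand in for the app's existing document types; the action body is left open):

import SwiftUI

@main
struct SketchApp: App {
    var body: some Scene {
        DocumentGroup(newDocument: MyDocument()) { file in
            ContentView(document: file.$document)
        }
        .commands {
            // Adds an item right after the standard save commands.
            CommandGroup(after: .saveItem) {
                Button("Save As…") {
                    // trigger a save-as flow here
                }
                .keyboardShortcut("s", modifiers: [.command, .shift])
            }
        }
    }
}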
Hi, I'm experiencing the behaviour outlined below. When I navigate programmatically on iPadOS or macOS from a tab that hides the tab bar to another tab, the tab bar remains hidden. The real app has its entry point in UIKit (i.e. it uses a UITabBarController instead of a SwiftUI TabView), but since the problem is reproducible with a SwiftUI-only app, I used one for the sake of simplicity.
import SwiftUI

@main
struct HiddenTabBarTestApp: App {
    @State private var selectedIndex = 0

    var body: some Scene {
        WindowGroup {
            TabView(selection: $selectedIndex) {
                Text("First Tab")
                    .tabItem {
                        Label("1", systemImage: "1.circle")
                    }
                    .tag(0)
                NavigationStack {
                    Button("Go to first tab") {
                        selectedIndex = 0
                    }
                    .searchable(text: .constant(""))
                }
                .tabItem {
                    Label("2", systemImage: "2.circle")
                }
                .tag(1)
            }
        }
    }
}
Reproduction:
Create a new SwiftUI App with the iOS App template and use the code from above
Run the app on iPadOS or macOS
Navigate to the second tab
Click into the search bar
Click the "Go to first tab" button
The tab bar is no longer visible
Is this a bug in the Framework or is it the expected behaviour? If it's the expected behaviour, do you have a good solution/workaround that doesn't require me to end the search programmatically (e.g. by using @Environment(\.dismissSearch)) before navigating to another tab? The goal would be to show the tab bar in the first tab while keeping the search open in the second tab.