When attempting to replicate the tvOS Settings menu layout, where the screen is divided horizontally into two sections, placing a NavigationStack or a Form view on either side of the screen causes focusable views (such as Button, TextField, Toggle, etc.) to be visually clipped when they receive focus and apply the default scaling animation.
Specifically:
If the Form or NavigationStack is placed on the right side, the left edge of the focused view gets clipped.
If placed on the left side, the right edge of the focused view gets clipped.
This issue affects any focusable child view inside the Form or NavigationStack when focus scaling is triggered.
Example code:
struct TVAppMenuMainView: View {
    var body: some View {
        VStack {
            Text("Settings Menu")
                .font(.title)

            HStack {
                VStack {
                    Text("Left Pane")
                }
                .frame(width: UIScreen.main.bounds.width * 0.4) // represents only 40% of the screen
                .frame(maxHeight: .infinity)
                .padding(.bottom)

                Divider()

                NavigationStack {
                    Form { // All the buttons will get cut on the left side when each button is focused
                        Button("First Button") {}
                        Button("Second Button") {}
                        Button("Third Button") {}
                        Button("Fourth Button") {}
                    }
                }
            }
            .frame(maxHeight: .infinity)
            .frame(maxWidth: .infinity)
        }
        .background(.ultraThickMaterial)
    }
}
What I have tried:
.clipped modifiers
.ignoresSafeArea
Modifying the size manually
Using just a ScrollView with a VStack works as intended, but as soon as a NavigationStack or Form is added, the buttons get clipped.
This was tested on the latest tvOS 18.5 beta.
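For reference, here is a minimal sketch of the ScrollView-plus-VStack variant mentioned above that avoids the clipping (layout values are illustrative, not from the original post):
struct TVAppMenuWorkaroundView: View {
    var body: some View {
        HStack {
            VStack {
                Text("Left Pane")
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity)

            Divider()

            // Plain ScrollView + VStack: focused buttons scale without being clipped.
            ScrollView {
                VStack(spacing: 20) {
                    Button("First Button") {}
                    Button("Second Button") {}
                    Button("Third Button") {}
                    Button("Fourth Button") {}
                }
                .padding()
            }
            .frame(maxWidth: .infinity)
        }
    }
}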
Hello,
I would like to report a critical issue with Arabic text rendering in SwiftUI apps on iOS and iPadOS.
When using Arabic as the default language (Right-to-Left - RTL), Arabic text appears reversed and disconnected inside several SwiftUI components like:
List
Section
TextField
Picker
Custom views (like StudentRowView)
Even though the environment is set to .layoutDirection(.rightToLeft), the dynamic Arabic text is not rendered properly. Static headers display correctly, but any dynamic content (student names, notes, field titles) becomes broken and unreadable.
Examples where the issue occurs:
AboutView.swift → Arabic text inside List and Section
SettingsView.swift → TextField placeholders and Picker options
StudentRowView.swift → Student names and grade field titles
Environment:
SwiftUI 5 (Xcode 15+)
iOS 17+
Reproducible 100% on both Simulator and real devices.
Expected Behavior:
Arabic text should appear properly connected, right-aligned, and readable without any manual workaround for each Text or TextField.
Workarounds Tried:
Manually setting .multilineTextAlignment(.trailing) (inefficient)
Wrapping every Text inside an HStack with Spacer (hacky)
Building custom UIKit views (defeats purpose of SwiftUI simplicity)
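For context, a minimal sketch of the kind of setup described above, assuming a simple List with dynamic Arabic strings and the layout direction forced through the environment (type and property names are illustrative):
struct StudentsListView: View {
    let students = ["أحمد", "فاطمة", "خالد"] // dynamic Arabic content

    var body: some View {
        List {
            Section("الطلاب") {
                ForEach(students, id: \.self) { name in
                    Text(name) // dynamic text that reportedly renders disconnected
                }
            }
        }
        .environment(\.layoutDirection, .rightToLeft)
    }
}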
Formal Feedback:
I have submitted a Feedback Assistant report
We hope this issue will be prioritized and fixed to improve SwiftUI's support for Arabic and other RTL languages.
Thank you.
There seems to be a bug; when scrolling very quickly down a List, and then scrolling up at normal speed, scrolling becomes very janky and jumpy, often skipping one or two rows. This only happens on macOS.
I'm kind of surprised I've seen no one else mention this bug, as I can recreate it in a very simple Xcode Project. I'm wondering if anyone knows of a workaround?
Steps to reproduce:
Build and launch the code below
Very quickly scroll all the way down using the scrollbar
Scroll up at a normal speed, after a few rows it will get janky
Code:
struct MinimalAlbum: Identifiable {
    let id: Int
    let title: String
}

struct ContentView: View {
    private let staticAlbums: [MinimalAlbum] = (0..<1000).map { i in
        MinimalAlbum(id: i, title: "Album Title \(i)")
    }

    var body: some View {
        List {
            ForEach(staticAlbums) { album in
                Text("Album ID: \(album.id) - \(album.title)")
                    .frame(height: 80) // Fixed height
            }
        }
    }
}
I am a little lost. Why is Xcode complaining about this? Everything looks right here. I even removed my code and copy-pasted Apple's sample from here: https://developer.apple.com/documentation/swiftui/list
I did a clean build and still get the error.
Not using List removes the error.
I'm playing with a simple document-based application with TextEditor for macOS. In Cocoa, a view controller can call updateChangeCount(_:) on NSDocument to clear the document's change count. I wonder whether SwiftUI's View has access to the same functionality. Ideally, I would like to manually set the change count to zero if the user clears the text in the TextEditor. I bet SwiftUI doesn't have it. Thanks.
import SwiftUI

struct ContentView: View {
    @Binding var document: SampleDocumentApp

    var body: some View {
        VStack {
            TextEditor(text: $document.text)
                .onChange(of: document.text) { _, _ in
                    guard !document.text.isEmpty else {
                        return
                    }
                    // clear change count //
                }
        }
        .frame(width: 360, height: 240)
    }
}
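Not an official answer, but one possible direction (untested, and assuming the macOS document scene is still bridged to NSDocument under the hood) is to reach the frontmost document through AppKit's NSDocumentController:
import AppKit

/// Attempts to clear the dirty state of the frontmost document via AppKit.
/// This relies on SwiftUI's document scene being backed by NSDocument, which is an assumption.
func clearCurrentDocumentChangeCount() {
    NSDocumentController.shared.currentDocument?.updateChangeCount(.changeCleared)
}
If that assumption holds, this could be called where the // clear change count // comment sits in the onChange handler above.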
Apologies in advance for the long post. I'm new to HomeKit and Matter but not to development, I'm trying to write a SwiftUI app for my smart home to store all of my HomeKit and Matter setup barcodes along with other bits of information.
The intention is to scan the QR codes with my App and then save that QR payload in a simple Database along with other manually entered device details. Example payloads:
X-HM://00GWIN0B5PHPG <-- Eufy V120 HomeKit Camera
MT:GE.01-C-03FOPP6B110 <-- Moes GU10 Matter Bulb
I have it 99% working; my app is even able to discern the manual pairing code from the above payloads. However, one of the key features is that I want to open a device entry in my app, tap the HomeKit or Matter code displayed there, and either:
a) Ideally pass it off to the Apple Home app to initiate pairing just like the native Camera App can.
b) Create a custom flow in my app using the HomeKit or Matter API's to initiate paring from within my app.
So ideally just like the flow that happens when you scan a setup QR with the normal camera and tap "Open in Home". However I want to trigger this flow with just knowing the Payload and not with scanning it via the camera.
I was hoping there might be something as simple as a URL scheme that I could call with the payload as a variable and it then deep links and switches to the Home app, but I haven't found any info relating to this that actually works.
This is some code I have tried with the HomeKit API but this also results in an error:
import HomeKit

func startHomePairing(with setupCode: String) {
    // Handle HomeKit setup
    guard let payload = HMAccessorySetupPayload(url: URL(string: setupCode)!) else {
        print("Invalid HomeKit setup code or format.")
        return
    }

    let setupRequest = HMAccessorySetupRequest()
    setupRequest.payload = payload

    let setupManager = HMAccessorySetupManager()

    // Perform the setup request and handle the result
    setupManager.performAccessorySetup(using: setupRequest) { result, error in
        if let error = error {
            // Error handling: print the error details
            print("Error starting setup: \(error.localizedDescription)")
            // Print more details for debugging
            print("Full Error: \(error)")
        } else {
            // Success: pairing was successful
            print("Successfully launched Home app for HomeKit setup.")
        }
    }
}
But when passing in the QR payloads above, it gives the following:
HomeKit Code
[0CAB3B05] Failed to perform accessory setup using request: Error Domain=HMErrorDomain Code=17 "(null)"
Matter Code
Failed to create HMSetupAccessoryPayload from setup payload URL MT:GE.01-C-03FOPP6B110: Error Domain=HMErrorDomain Code=3 "(null)"
I have added the "HomeKit" and "Matter Allow Setup Payload" capabilities to my app, and I have also ensured these entries are in the .plist:
<key>NSHomeKitUsageDescription</key>
<string>Access required to HomeKit to initiate pairing for new accessories</string>
I also added a call to ensure my app appears in the Settings / Privacy / HomeKit section. I originally thought this was a seemingly simple task, but I am really struggling to make it work!
Nice to meet you,
I'm currently trying to create an app like a data logger using BLE.
When a user uses the above app, they will probably put the app in the background and lock their iPhone if they want to collect data for a long period of time.
Therefore, the app I want to create needs to continue scanning for BLE even when it goes into the background.
The purpose is to continue to obtain data from the same device at precise time intervals for a long period of time (24 hours).
In that case, can background BLE scanning be used to keep reading and recording advertising data from the same device periodically (at intervals of 10 seconds, 1 minute, or 5 minutes) after the app goes into the background?
Any advice, no matter how small, is welcome.
Please feel free to reply.
Also, if this question has already been asked and answered on this forum, I would appreciate it if you could point me to it.
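Not an answer, but for reference, a minimal sketch of the Core Bluetooth setup this involves, assuming the bluetooth-central background mode is enabled for the target (the service UUID is a placeholder):
import CoreBluetooth

final class AdvertisementLogger: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private let serviceUUID = CBUUID(string: "180D") // placeholder; background scans need explicit service UUIDs

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Wildcard scans are not delivered while in the background, so scan for specific services.
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // Record a timestamped sample of the advertisement data.
        print("Advertisement at \(Date()): \(advertisementData)")
    }
}
As far as I know, background scanning coalesces discoveries and ignores the allow-duplicates option, so precise fixed intervals may not be achievable; worth verifying against the Core Bluetooth background execution documentation.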
In an iPadOS SwiftUI app supporting multiple scenes, each Scene responds to a particular way in which the app was launched. If app was launched by tapping an associated file or a deep link (custom URL), then, the URLHandlerScene is invoked. If app was launched by QuickAction (long tap on the app icon), then another Scene is invoked etc. Each Scene has a purpose and responds to a particular launch.
But after adding the handlesExternalEvents(matching:) scene modifier, the scene is not launched when the user taps an associated file or opens one of the app's deep links.
@main
struct IOSSwiftUIScenesApp: App {
    var body: some Scene {
        DefaultScene()

        URLHandlerScene()
            .handlesExternalEvents(matching: ["file://"])      // Launched by an associated file
            .handlesExternalEvents(matching: ["Companion://"]) // Launched by Deeplink.

        // Other scenes
    }
}

struct URLHandlerScene: Scene {
    @State private var inputURL: URL // Store the incoming URL

    init() {
        self.inputURL = URL(string: "Temp://")!
    }

    var body: some Scene {
        WindowGroup {
            URLhandlerView(inputURL: $inputURL)
                .onOpenURL(perform: { (fileURL: URL) in
                    log(String(format: "URLhandlerView().onOpenURL | Thread.current = %@", String(describing: Thread.current)))
                    log("fileURL = " + String(describing: fileURL))
                    inputURL = fileURL
                })
        }
    }
}
As shown above, I've attached the handlesExternalEvents(matching:) modifier with "file://" for the associated file, and "Companion" is my custom URL scheme. As per the scene matching rules documented here, my URLHandlerScene should get launched, but every time I launch the app using an associated file or open a deep link, the DefaultScene is launched instead.
What is missing here? Can someone please help?
I have developed several document-based (NSDocument) applications for macOS in Cocoa. Now, I'm playing with a document app project in SwiftUI. If I launch the application out of the box, a file-select panel opens, just as you see in TextEdit. How do we prevent it from appearing? I would rather show a blank window, which in fact appears if I just press Command + N. Thanks.
When we place a Button inside a ScrollView, the button's fade animation is delayed, so I think most users won't see it.
You can see this in the trivial example
struct ContentView: View {
    var body: some View {
        ScrollView {
            Button {
                // empty
            } label: {
                Text("Fade animation test")
            }
        }
    }
}
Is there any way to opt out of this behavior? In UIKit this was also the default behavior, but you could always change it by overriding the touchesShouldCancel(in:) method.
I think I could probably do that by rewriting the animation completely with a custom ButtonStyle or by rewriting the Button component entirely, but that doesn't seem like a good solution to me, as I want the native look and feel (although in the case of the button animation it is pretty easy to mimic).
And also for some components, like lists, Apple has already implemented the correct behavior by themselves somehow.
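For what it's worth, a minimal sketch of the custom ButtonStyle fallback mentioned above (styling values are arbitrary); it reacts to the press state immediately instead of relying on the default highlight timing:
struct ImmediateFadeButtonStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .opacity(configuration.isPressed ? 0.4 : 1.0) // follows the press state without delay
            .animation(.easeOut(duration: 0.15), value: configuration.isPressed)
    }
}
Applied with .buttonStyle(ImmediateFadeButtonStyle()), though as noted this gives up the exact native look.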
In AppKit, NSSegmentedControl has various styles defined by NSSegmentStyle and various tracking modes defined by NSSegmentSwitchTracking.
How can we set these properties in SwiftUI?
I'm currently using a Picker with the view modifier .pickerStyle(.segmented) applied but this seems to produce a segmented control with tracking set to "select one".
In particular I'm looking for momentary tracking so that I can create navigation-style buttons for backward/forward navigation.
Under AppKit, the canonical way to do this is an NSSegmentedControl of style separated and tracking momentary.
Is that possible under SwiftUI for macOS? (Using the latest versions of everything.)
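One fallback that should work, sketched here under the assumption that momentary tracking is not exposed through Picker, is wrapping NSSegmentedControl in an NSViewRepresentable:
import SwiftUI
import AppKit

struct MomentarySegmentedControl: NSViewRepresentable {
    var labels: [String]
    var onSelect: (Int) -> Void

    func makeNSView(context: Context) -> NSSegmentedControl {
        let control = NSSegmentedControl(labels: labels,
                                         trackingMode: .momentary,
                                         target: context.coordinator,
                                         action: #selector(Coordinator.segmentTapped(_:)))
        control.segmentStyle = .separated
        return control
    }

    func updateNSView(_ nsView: NSSegmentedControl, context: Context) {
        context.coordinator.onSelect = onSelect
    }

    func makeCoordinator() -> Coordinator { Coordinator(onSelect: onSelect) }

    final class Coordinator: NSObject {
        var onSelect: (Int) -> Void
        init(onSelect: @escaping (Int) -> Void) { self.onSelect = onSelect }

        @objc func segmentTapped(_ sender: NSSegmentedControl) {
            onSelect(sender.selectedSegment)
        }
    }
}
Used e.g. as MomentarySegmentedControl(labels: ["Back", "Forward"]) { index in ... }, which mirrors the separated/momentary NSSegmentedControl described above; whether a pure-SwiftUI equivalent exists, I don't know.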
I am having trouble passing custom data in an array through a NavigationStack. I want to display a detail subview with the same structure (title, headline, picture placement, etc.) but with different data attached for each of my list items.
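Hard to say more without seeing the code, but the usual pattern is a value-based NavigationLink plus navigationDestination(for:). A minimal sketch with a hypothetical Topic type standing in for your data structure (title, headline, picture, etc.):
struct Topic: Hashable {
    let title: String
    let headline: String
    let imageName: String
}

struct TopicsListView: View {
    // Hypothetical sample data; each item drives the same detail layout with different content.
    let topics: [Topic] = [
        Topic(title: "First", headline: "Headline one", imageName: "photo"),
        Topic(title: "Second", headline: "Headline two", imageName: "star")
    ]

    var body: some View {
        NavigationStack {
            List(topics, id: \.self) { topic in
                NavigationLink(topic.title, value: topic)
            }
            .navigationDestination(for: Topic.self) { topic in
                TopicDetailView(topic: topic)
            }
        }
    }
}

struct TopicDetailView: View {
    let topic: Topic

    var body: some View {
        VStack(spacing: 12) {
            Image(systemName: topic.imageName)
            Text(topic.title).font(.title)
            Text(topic.headline)
        }
    }
}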
When presenting a SwiftUI sheet containing ObservableObject's injected using environmentObject(_) modifier, the objects are unexpectedly retained after the sheet is dismissed if a TextField within the sheet gains focus or is edited.
This issue occurs on iOS and iPadOS (on macOS the objects are always released), observable both in the simulator and on physical devices, and happens even when the view does not explicitly reference these environment objects, and the TextField's content isn't bound to them.
Expected Results:
When the sheet is dismissed, all environment objects passed to the sheet’s content view should be released (deinitialized), regardless of whether the TextField was focused or edited.
Actual Results:
If the TextField was focused or edited, environment objects (ObservableA and ObservableB) are retained after the sheet is dismissed. They are not deinitialized as expected, leading to unintended retention.
Interestingly, previously retained copies of these environment objects, if any, are released precisely at the moment the TextField becomes focused on subsequent presentations, indicating an inconsistent lifecycle behavior.
I have filed an issue FB17226970
Sample Code
Below is sample code that consistently shows the issue on iOS 18.3+.
Steps to Reproduce:
Run the attached SwiftUI sample.
Tap the button labeled “Show Sheet” to present a sheet.
Tap on the TextField to focus or begin editing.
Dismiss the sheet by dragging it down or by other dismissal methods (e.g., tapping outside on iPadOS).
import SwiftUI

struct ContentView: View {
    @State private var showSheet: Bool = false

    var body: some View {
        VStack {
            Button("Show Sheet") {
                showSheet = true
            }
        }
        .sheet(isPresented: $showSheet) {
            SheetContentView()
                .environmentObject(ObservableA())
                .environmentObject(ObservableB())
        }
    }
}

struct SheetContentView: View {
    @State private var text: String = ""

    var body: some View {
        TextField("Select to retain observable objects", text: $text)
            .textFieldStyle(.roundedBorder)
    }
}

final class ObservableA: ObservableObject {
    init() {
        print(type(of: self), #function)
    }
    deinit {
        print(type(of: self), #function)
    }
}

final class ObservableB: ObservableObject {
    init() {
        print(type(of: self), #function)
    }
    deinit {
        print(type(of: self), #function)
    }
}

#Preview {
    ContentView()
}
I use SwiftUI to build apps for iPhone and iPad.
There is no problem with the iPhone app: the game display is fully shown on iPhone.
However, on the iPad the game display is not shown and the screen goes black.
I had to tap the button on the upper-left side (it looks like a sidebar button).
After that, the game display is shown only on the left side, at a very small size.
How can I make the game display fully shown on the iPad?
While using Screen Mirroring in developer mode within my immersive space, I noticed an alignment issue with the computer cursor (transparent circle). When I move it toward an attachment view, the cursor remains horizontal instead of aligning with the surface of the attachment view. It shows correctly on a 2D window; it is only wrong on the attachment view.
Is this behavior a bug, or could it be caused by a missing or incorrect configuration on the attachment view?
Any help would be appreciated, thanks.
I can't for the life of me get transitions and animations to work well with SwiftData and List on macOS 15 and iOS 18.
I've included an example below, where I define several animations and a transition type, but they are all ignored.
How do I animate items being added to / removed from a List()?
I am attached to List() due to its support for selection, context menu, keyboard shortcuts, etc. If I would switch to ScrollView with VStack I would have to rebuild all of that.
Also, this is super basic and should just work, right?
Thanks for reading.
import SwiftUI
import SwiftData

struct ContentView: View {
    @Environment(\.modelContext) private var modelContext

    /// Issues on iOS:
    /// Items animate into and out of view, but I seem to have no control over the animation.
    /// In the code here I've specified a 'bouncy' and a slow 'easeIn' animation: both are not triggered.
    /// The code also specifies using a 'slide' transition, but it is ignored.
    /// -> How do I control the transition and animation timing on iOS?
    ///
    /// Issues on MacOS:
    /// Items do not animate at all on MacOS! They instantly appear and are instantly removed.
    /// -> How do I control the transition and animation timing on MacOS?

    // animation added here -> has no effect?
    @Query(animation: .bouncy) private var items: [Item]

    var body: some View {
        VStack {
            Button("Add to list") {
                // called without 'withAnimation' -> no animation
                let newItem = Item(timestamp: Date())
                modelContext.insert(newItem)
            }

            List() {
                ForEach(items, id: \.self) { item in
                    Text(item.timestamp, format: Date.FormatStyle(date: .numeric, time: .standard))
                        .transition(.slide) // items do not slide in/out of view
                        .onTapGesture {
                            // called with 'withAnimation' -> no animation
                            withAnimation(.easeIn(duration: 2)) {
                                modelContext.delete(item)
                            }
                        }
                }
                .animation(.spring(duration: 3), value: items)
            }
        }
        .padding()
    }
}

#Preview {
    ContentView()
        .modelContainer(for: Item.self, inMemory: true)
}
I can't use breakpoints on the simulator after updating Xcode and the simulator. I can use breakpoints on a physical iPhone. I tried downloading other iOS simulator versions, but it's still not working. Both SwiftUI and UIKit projects are affected.
Xcode version: 16.2
SDK version: 18.2 (22C146)
Simulator version: 18.3.1 (22D8075)
Mac OS version: 15.4 Beta
Couldn't find the Objective-C runtime library in loaded images.
Message from debugger: The LLDB RPC server has crashed. You may need to manually terminate your process. The crash log is located in ~/Library/Logs/DiagnosticReports and has a prefix 'lldb-rpc-server'. Please file a bug and attach the most recent crash log.
Goal: Drag a sphere across the room and track its position.
Problem: The gesture seems to have no effect on the sphere ModelEntity. I don't know how to properly attach the gesture to the ModelEntity. Any help is greatly appreciated. Thank you.
import SwiftUI
import RealityKit
import RealityKitContent
import Foundation

@main
struct testApp: App {
    @State var immersionStyle: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace {
            ContentView()
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full, .progressive)
    }
}

struct ContentView: View {
    @State private var lastPosition: SIMD3<Float>? = nil
    @State var subscription: EventSubscription?
    @State private var isDragging: Bool = false

    var sphere: ModelEntity {
        let mesh = MeshResource.generateSphere(radius: 0.05)
        let material = SimpleMaterial(color: .blue, isMetallic: false)
        let entity = ModelEntity(mesh: mesh, materials: [material])
        entity.generateCollisionShapes(recursive: true)
        return entity
    }

    var drag: some Gesture {
        DragGesture()
            .targetedToEntity(sphere)
            .onChanged { _ in self.isDragging = true }
            .onEnded { _ in self.isDragging = false }
    }

    var body: some View {
        Text("Hello, World!")
        RealityView { content in
            // 1. Anchor Entity
            let anchor = AnchorEntity(world: SIMD3<Float>(0, 0, -1))
            let ball = sphere

            // 1.2 add component to ball
            ball.components.set(InputTargetComponent())

            // 2. add anchor to sphere
            anchor.addChild(ball)
            content.add(anchor)

            subscription = content.subscribe(to: SceneEvents.Update.self) { event in
                let currentPosition = ball.position(relativeTo: nil)
                if let last = lastPosition, last != currentPosition {
                    print("Sphere moved from \(last) to \(currentPosition)")
                }
                lastPosition = currentPosition
            }
        }
        .gesture(drag)
    }
}
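One thing that stands out (just a guess, not verified): sphere is a computed property, so .targetedToEntity(sphere) targets a newly created entity rather than the ball instance that was added to the scene. A sketch of a possible fix that keeps a single instance and moves it during the drag:
struct ContentView: View {
    // A single sphere instance shared by the RealityView content and the gesture target.
    @State private var ball: ModelEntity = {
        let entity = ModelEntity(mesh: .generateSphere(radius: 0.05),
                                 materials: [SimpleMaterial(color: .blue, isMetallic: false)])
        entity.generateCollisionShapes(recursive: true)
        entity.components.set(InputTargetComponent())
        return entity
    }()

    var body: some View {
        RealityView { content in
            let anchor = AnchorEntity(world: SIMD3<Float>(0, 0, -1))
            anchor.addChild(ball)
            content.add(anchor)
        }
        .gesture(
            DragGesture()
                .targetedToEntity(ball) // same instance that lives in the scene
                .onChanged { value in
                    // Follow the drag; conversion call as used in Apple's visionOS gesture samples,
                    // worth verifying against your SDK version.
                    value.entity.position = value.convert(value.location3D,
                                                          from: .local,
                                                          to: value.entity.parent!)
                }
        )
    }
}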
I have a TextField and entered, for example, "sg?!". On the TextField I set the speechAlwaysIncludesPunctuation() modifier. But when I activate VoiceOver and the TextField's content is read out, the special characters are not spoken.
How can I fix this?
I am developing an application that makes use of two ornaments anchored to a volumetric window, one used as a toolbar and one to display different views.
The problem I am facing consistently is that the ornaments seem to scale up or down after moving the volume using the OS handle or after starting a GroupActivity session.
This first image shows the ornaments as soon as I started the app, no dragging nor group activities:
This second image shows them as soon as I join a group activity session:
The map, which might seem smaller, has not been touched and still has the same scale.
In this last image I had just dragged the entire volume using the OS toolbar, resulting in the ornaments scaling down:
This is how the volume and the ornaments are declared:
WindowGroup(id: "CityVolume") {
    let cityVM = CityViewModel(volumeSize: CityView.initialVolumeSize)

    CityView(cityVM: cityVM)
        .ornament(attachmentAnchor: .scene(.bottomFront)) {
            HStack {
                TourismChartsButton()
                LandmarksListButton()
                CenterMapButton()
                ToggleImmersiveSpaceButton()
                TrafficDataButton()
                BusLinesButton()
            }
            .padding()
            .offset(z: 10)
            .rotation3DEffect(Angle(degrees: 15), axis: (x: 1.0, y: 0.0, z: 0.0))
        }
        .ornament(attachmentAnchor: .scene(.back)) {
            ZStack {
                if AppModel.Instance.tourismVM.isChartViewVisible {
                    TourismChartsView()
                }
                if AppModel.Instance.busLinesVM.isDataViewEnabled {
                    BusLineView()
                }
            }
        }
        .task(observeGroupActivity)
        .onAppear {
            appModel.cityVM = cityVM
        }
}
.windowStyle(.volumetric)
.windowResizability(.contentSize)
.volumeWorldAlignment(.gravityAligned)
.defaultSize(CityView.initialVolumeSize, in: .meters)
It also happens without starting a SharePlay session, but not as frequently as during SharePlay. I experienced the same behaviour with toolbars.
Am I doing something wrong with how I created the ornaments? Am I missing something?