Hi everyone,
I’m currently testing iOS 26 on my iPhone as part of the developer program. According to Apple’s documentation and demo materials, a new screenshot animation was introduced in this version. However, when I take a screenshot on my device, the animation remains the same as in previous iOS versions.
I’ve double-checked that I’m running the correct build of iOS 26, and I haven’t found any settings that might enable or disable this feature.
Is anyone else experiencing the same issue? Could this new animation be device-specific, region-limited, or require additional configuration?
Any insight would be appreciated!
Thanks in advance,
Alonso Rivera
Posts under the SwiftUI tag
Provide views, controls, and layout structures for declaring your app's user interface using SwiftUI.
I'm trying to make a Swift Chart where 24 AreaMarks, an hour apart on the X axis over a day, display a vertical gradient.
The gradient is vertical and is essentially [color.opacity(0.1), color, color.opacity(0.1)].
The idea here is that where the upper and lower points of each AreaMark are the same or close to each other on the Y axis, the chart essentially displays a line; where they are far apart, you get a nice fading vertical gradient.
However, it seems that the .alignsMarkStylesWithPlotArea modifier is always in effect for AreaMarks, even when manually set to false.
Investigating further, I've learnt that with AreaMarks in a series, Swift Charts seems to honor only the first foreground style that is set. I've created some sample code to demonstrate this.
import SwiftUI
import Charts

struct DemoChartView: View {
    var body: some View {
        Chart {
            AreaMark(x: .value("Time", Date().addingTimeInterval(0)), yStart: .value("1", 40), yEnd: .value("2", 60))
                .foregroundStyle(LinearGradient(colors: [.pink, .teal], startPoint: .top, endPoint: .bottom))
                .alignsMarkStylesWithPlotArea(false)
            AreaMark(x: .value("Time", Date().addingTimeInterval(3600)), yStart: .value("1", 44), yEnd: .value("2", 58))
                .foregroundStyle(LinearGradient(colors: [.orange, .yellow], startPoint: .top, endPoint: .bottom))
                .alignsMarkStylesWithPlotArea(false)
            AreaMark(x: .value("Time", Date().addingTimeInterval(3600 * 2)), yStart: .value("1", 50), yEnd: .value("2", 90))
                .foregroundStyle(LinearGradient(colors: [.green, .blue], startPoint: .top, endPoint: .bottom))
                .alignsMarkStylesWithPlotArea(false)
        }
    }
}
Which produces this:
So here, all the different .foregroundStyle LinearGradients are being ignored AND the .alignsMarkStylesWithPlotArea(false) is also ignored - the amount of pink on the first mark is different from the second and third 🤷‍♂️
Has anyone encountered this? Are AreaMarks the correct choice, or are they just not set up to create this type of data display? Thanks
Hello, I am wondering if it is possible to have a LineMark with different line styles. I am trying to create a LineMark where part of the line is solid and another part of the line is dashed. Even with a conditional, it only displays one or the other. Is it currently possible in Swift Charts to do something like the attached image? Thank you.
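One workaround that gets close, assuming the split point in the data is known: draw the solid and dashed portions as two separate series that share a boundary point, and give the dashed series its own StrokeStyle. A minimal sketch (the data and the "Measured"/"Projected" split are made up for illustration):

import SwiftUI
import Charts

struct SplitLineChart: View {
    // Hypothetical data: the two series share the point at day 3 so the line looks continuous.
    let measured: [(day: Int, value: Double)] = [(0, 10), (1, 14), (2, 12), (3, 18)]
    let projected: [(day: Int, value: Double)] = [(3, 18), (4, 21), (5, 25)]

    var body: some View {
        Chart {
            ForEach(measured, id: \.day) { point in
                LineMark(x: .value("Day", point.day),
                         y: .value("Value", point.value),
                         series: .value("Kind", "Measured"))
                .foregroundStyle(.blue) // same color on both series so it reads as one line
            }
            ForEach(projected, id: \.day) { point in
                LineMark(x: .value("Day", point.day),
                         y: .value("Value", point.value),
                         series: .value("Kind", "Projected"))
                .foregroundStyle(.blue)
                .lineStyle(StrokeStyle(lineWidth: 2, dash: [4, 4])) // dashed portion
            }
        }
    }
}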
If you try to add a graph for a function in Apple Notes you can see that numbers marking coordinates are positioned along the axes (see screenshot 1).
But when I am making my own plot view with Swift Charts I don't see that option. Marks for X axis are positioned at the bottom, and marks for Y axis are positioned to the right. I don't see an API that can configure them to be shown along the axes.
Is there something that I am missing? Or is Apple just using some private API for that?
I could make a custom overlay to display these marks, but then I would have to adjust them myself while zooming, which can be problematic.
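For what it's worth, the built-in placements I'm aware of only pin the axis labels to an edge of the plot area; I haven't found a public API that anchors them to the origin lines the way Notes does. A sketch of what is configurable (assuming iOS 16+ Swift Charts):

import SwiftUI
import Charts

struct EdgeAxisChart: View {
    // Sample parabola points, just to have something to plot.
    let points: [(x: Double, y: Double)] = stride(from: -3.0, through: 3.0, by: 0.5).map { ($0, $0 * $0 - 2) }

    var body: some View {
        Chart {
            ForEach(points, id: \.x) { p in
                LineMark(x: .value("x", p.x), y: .value("y", p.y))
            }
        }
        // Labels can be moved between edges, but not onto the axes themselves.
        .chartXAxis { AxisMarks(position: .bottom) } // or .top
        .chartYAxis { AxisMarks(position: .leading) } // or .trailing
    }
}

So a custom overlay (or an answer pointing to an API I've missed) does seem to be required for origin-anchored labels.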
Hello there!
I wanted to give the native scrolling mechanism of the Swift Charts graph a try and experiment a bit to see whether the scenario we're trying to achieve is possible, but it seems that Swift Charts scrolling performance is very poor.
The graph was created as follows:
the X-axis is based on a date range,
the Y-axis is based on integer values roughly between 0 and 320,
the graph is scrollable horizontally only (X-axis),
the time range for the scrolling content was set so that the user can scroll up to one year into the past from now, the minimum visible date (.chartXScale),
the X-axis shows 3 hours of data per screen width (.chartXVisibleDomain),
the data points are generated once, when the screen is about to appear, so that the Charts engine can use them (no lazy loading implemented yet),
the line data points (LineMark views) consist of 2,880 points spaced 5 minutes apart, which simulates two days of the continuous data stream that we want to present. The rest of the graph displays no data at all.
The performance results:
during the initial loading phase, the graph is frozen for about 10-15 seconds until the data appears,
scrolling is very laggy - CPU usage is at 100%, which is unacceptable for end users,
if we show no data at all (so no LineMark views are created), the result is similar - scrolling the empty graph is also very laggy.
Below I am sharing the test code:
import SwiftUI
import Charts

@main
struct ChartsTestApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
            Spacer()
        }
    }
}

struct LineDataPoint: Identifiable, Equatable {
    var id: Int
    let date: Date
    let value: Int
}

actor TestData {
    func generate(startDate: Date) async -> [LineDataPoint] {
        var values: [LineDataPoint] = []
        for i in 0..<(1440 * 2) {
            values.append(
                LineDataPoint(
                    id: i,
                    date: startDate.addingTimeInterval(
                        TimeInterval(60 * 5 * i) // Every 5 minutes
                    ),
                    value: Int.random(in: 1...100)
                )
            )
        }
        return values
    }
}

struct ContentView: View {
    var startDate: Date {
        return endDate.addingTimeInterval(-3600 * 24 * 30 * 12) // one year into the past from now
    }
    let endDate = Date()
    @State var dataPoints: [LineDataPoint] = []

    var body: some View {
        Chart {
            ForEach(dataPoints) { item in
                LineMark(
                    x: .value("Date", item.date),
                    y: .value("Value", item.value),
                    series: .value("Series", "Test")
                )
            }
        }
        .frame(height: 200)
        .chartScrollableAxes(.horizontal)
        .chartYAxis(.hidden)
        .chartXScale(domain: startDate...endDate) // one year possibility to scroll back
        .chartXVisibleDomain(length: 3600 * 3) // 3 hours visible on screen
        .onAppear {
            Task {
                dataPoints = await TestData().generate(startDate: startDate)
            }
        }
    }
}
I would be grateful for any insights or suggestions on how to improve this, or to hear whether improvements are planned for the future.
Currently, we use a UIKit UICollectionView where we split the graph into smaller chunks and present the SwiftUI Chart content in the cells, so we rely on the scrolling offered there. I wonder if native SwiftUI can be used for such a scenario, so that later on we could also implement some kind of lazy loading of the data as the user scrolls into the past.
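One mitigation worth prototyping, assuming iOS 17's scroll-position API is available: keep the full array in memory but feed the Chart only the points near the visible window, tracked via .chartScrollPosition(x:). A rough sketch, building on the test code above (same imports and LineDataPoint type; the 3-hour padding is an arbitrary choice):

struct WindowedChartView: View {
    let allPoints: [LineDataPoint] // the full data set, e.g. from TestData
    @State private var scrollX = Date()

    // Only the points within the visible 3-hour window, padded on both sides.
    private var visiblePoints: [LineDataPoint] {
        let padding: TimeInterval = 3600 * 3
        let lower = scrollX.addingTimeInterval(-padding)
        let upper = scrollX.addingTimeInterval(3600 * 3 + padding)
        return allPoints.filter { $0.date >= lower && $0.date <= upper }
    }

    var body: some View {
        Chart {
            ForEach(visiblePoints) { item in
                LineMark(
                    x: .value("Date", item.date),
                    y: .value("Value", item.value),
                    series: .value("Series", "Test")
                )
            }
        }
        .chartScrollableAxes(.horizontal)
        .chartScrollPosition(x: $scrollX) // re-filters as the user scrolls
        .chartXVisibleDomain(length: 3600 * 3)
    }
}

The filter runs on every scroll update, so for large data sets a binary search over the sorted dates would be cheaper than a linear pass.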
I noticed in the Find My app in the new iOS 26 beta that the TabView and the sheet seem to be part of the same view. When you collapse the sheet, the TabView is still visible, and you can swipe up to view the sheet again. Is there a way to recreate this effect? Preferably in SwiftUI, but UIKit works too.
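Not an exact recreation, but you can get part of the way with public API by keeping a sheet permanently presented and letting it collapse to a small detent, so the content behind stays visible and interactive. A sketch assuming iOS 16.4+ (the detent heights are arbitrary):

import SwiftUI

struct FindMyStyleView: View {
    @State private var showSheet = true

    var body: some View {
        TabView {
            Text("Map goes here")
                .tabItem { Label("Items", systemImage: "mappin") }
        }
        .sheet(isPresented: $showSheet) {
            Text("Sheet content")
                .presentationDetents([.height(80), .medium, .large]) // collapsed state keeps a sliver visible
                .presentationBackgroundInteraction(.enabled(upThrough: .medium)) // keep the view behind tappable
                .interactiveDismissDisabled() // the sheet can collapse but never fully dismiss
        }
    }
}

Note that a standard sheet is still presented above the tab bar, so matching Find My's look, where the tab bar floats above the collapsed sheet, likely needs the new iOS 26 tab bar behavior rather than this approximation.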
I understand this is a known issue, but it’s truly unacceptable that it remains unresolved. Allowing users to customize toolbars is a fundamental macOS feature, and it has been broken since the release of macOS 15.
How is it possible that this issue persists even in macOS 15.3 beta (24D5040f)?
FB15513599
import SwiftUI

struct ContentView: View {
    @State private var showEditItem = false

    var body: some View {
        VStack {
            VStack {
                Text("Instructions to reproduce the crash")
                    .font(.title)
                    .padding()
                Text("""
                1. Click on "Toggle Item"
                2. In the menu go to File > New Window
                3. In new window, click on "Toggle Item"
                """)
            }
            .padding()
            Button {
                showEditItem.toggle()
            } label: {
                Text("Toggle Item")
            }
        }
        .padding()
        .toolbar(id: "main") {
            ToolbarItem(id: "new") {
                Button {
                } label: {
                    Text("New…")
                }
            }
            if showEditItem {
                ToolbarItem(id: "edit") {
                    Button {
                    } label: {
                        Text("Edit…")
                    }
                }
            }
        }
    }
}
Why does SwiftUI re-render the UI even if the view does not use the counter, as in the example below? Shouldn't the SwiftUI framework be smart enough to detect that?
import SwiftUI

class ViewModel: ObservableObject {
    @Published var counter: Int = 0 // Not used in the view's body
    @Published var displayText: String = "Hello" // Used in the view's body
}

struct ContentView: View {
    @StateObject private var viewModel = ViewModel()

    var body: some View {
        VStack {
            Text(viewModel.displayText) // Depends on displayText
        }
        .onChange(of: viewModel.counter) { newValue in
            print("Counter changed to: \(newValue)")
        }
    }
}
Is there a more elegant solution that doesn't involve using Publishers?
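For context: with ObservableObject, any @Published change fires objectWillChange, and every view observing the object re-evaluates its body; SwiftUI has no way to know which properties the body actually reads. The @Observable macro (iOS 17+) tracks property access instead, so a change to counter no longer invalidates a body that only reads displayText. A sketch of the migration, assuming iOS 17 can be targeted:

import SwiftUI
import Observation

@Observable
class ViewModel {
    var counter: Int = 0              // not read in body, so changes won't re-render the view
    var displayText: String = "Hello" // read in body, so changes do re-render the view
}

struct ContentView: View {
    @State private var viewModel = ViewModel() // @State replaces @StateObject for @Observable

    var body: some View {
        VStack {
            Text(viewModel.displayText) // the body now depends only on displayText
        }
        // Caveat: adding .onChange(of: viewModel.counter) here would itself read
        // counter during body evaluation and re-establish the dependency on it.
    }
}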
let dic: [AnyHashable: Any] = [
kCGPDFXRegistryName: "http://www.color.org" as CFString,
kCGPDFXOutputConditionIdentifier: "FOGRA43" as CFString,
kCGPDFContextOutputIntent: "GTS_PDFX" as CFString,
kCGPDFXOutputIntentSubtype: "GTS_PDFX" as CFString,
kCGPDFContextCreateLinearizedPDF: "" as CFString,
kCGPDFContextCreatePDFA: "" as CFString,
kCGPDFContextAuthor: "Placeholder" as CFString,
kCGPDFContextCreator: "Placeholder" as CFString
]
Hello,
I would now like to export my PDFs as PDF/A. As far as I can tell, Core Graphics has the right options for this.
Unfortunately, the documentation does not show what string value is required for 'kCGPDFContextCreatePDFA' or 'kCGPDFContextCreateLinearizedPDF'.
What I have already tried: GTS_PDFA1, PDF/A-1, true as CFString.
(Above is my CFDictionary. ...Author etc. are working perfectly.)
In the Finder you can see these two options, which I would also like to implement in my app.
Thank you in advance!
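In case it helps, my assumption, and it is only an assumption since the documentation is silent, is that the two Create* keys expect boolean values rather than strings. A sketch of how I would try it with the known CGContext PDF initializer (the page size is a placeholder):

import CoreGraphics
import Foundation

func makePDF(at url: URL) {
    var mediaBox = CGRect(x: 0, y: 0, width: 595, height: 842) // A4 in points

    let auxiliaryInfo: [AnyHashable: Any] = [
        kCGPDFXRegistryName: "http://www.color.org" as CFString,
        kCGPDFXOutputConditionIdentifier: "FOGRA43" as CFString,
        kCGPDFContextOutputIntent: "GTS_PDFX" as CFString,
        kCGPDFXOutputIntentSubtype: "GTS_PDFX" as CFString,
        // Assumption: booleans, not strings; unverified against the docs.
        kCGPDFContextCreateLinearizedPDF: kCFBooleanTrue as Any,
        kCGPDFContextCreatePDFA: kCFBooleanTrue as Any,
        kCGPDFContextAuthor: "Placeholder" as CFString,
        kCGPDFContextCreator: "Placeholder" as CFString
    ]

    guard let context = CGContext(url as CFURL, mediaBox: &mediaBox, auxiliaryInfo as CFDictionary) else { return }
    context.beginPDFPage(nil)
    // ... draw page content here ...
    context.endPDFPage()
    context.closePDF()
}

Whether the resulting file actually validates as PDF/A would still need to be checked with an external validator.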
The other day I was playing with iBeacon and found out that CLBeaconIdentityConstraint will be deprecated after iOS 18.5, so I've written code with BeaconIdentityCondition in reference to this sample project from Apple.
import Foundation
import CoreLocation

let monitorName = "BeaconMonitor"

@MainActor
public class BeaconViewModel: ObservableObject {
    private let manager: CLLocationManager
    static let shared = BeaconViewModel()
    public var monitor: CLMonitor?
    @Published var UIRows: [String: [CLMonitor.Event]] = [:]

    init() {
        self.manager = CLLocationManager()
        self.manager.requestWhenInUseAuthorization()
    }

    func startMonitoringConditions() {
        Task {
            print("Set up monitor")
            monitor = await CLMonitor(monitorName)
            await monitor!.add(getBeaconIdentityCondition(), identifier: "TestBeacon")
            for identifier in await monitor!.identifiers {
                guard let lastEvent = await monitor!.record(for: identifier)?.lastEvent else { continue }
                UIRows[identifier] = [lastEvent]
            }
            for try await event in await monitor!.events {
                guard let lastEvent = await monitor!.record(for: event.identifier)?.lastEvent else { continue }
                if event.state == lastEvent.state {
                    continue
                }
                UIRows[event.identifier] = [event]
                UIRows[event.identifier]?.append(lastEvent)
            }
        }
    }

    func updateRecords() async {
        UIRows = [:]
        for identifier in await monitor?.identifiers ?? [] {
            guard let lastEvent = await monitor!.record(for: identifier)?.lastEvent else { continue }
            UIRows[identifier] = [lastEvent]
        }
    }

    func getBeaconIdentityCondition() -> CLMonitor.BeaconIdentityCondition {
        CLMonitor.BeaconIdentityCondition(uuid: UUID(uuidString: "abc")!, major: 123, minor: 789)
    }
}
It works, except that my sample app can take as long as 90 seconds to see event changes. You would get an instant update the old way (CLBeacon and CLBeaconIdentityConstraint). Is there anything I can do to see changes faster? Thanks.
Hi,
I’d like to display items in a grid from right to left, like the image:
Is this possible with a grid? What would be the best approach in terms of performance?
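A sketch of one approach, assuming a LazyVGrid fits the layout: flip the grid's layout direction so items flow right to left (the sample items are made up):

import SwiftUI

struct RTLGridView: View {
    let items = (1...12).map { "Item \($0)" }
    let columns = Array(repeating: GridItem(.flexible()), count: 3)

    var body: some View {
        LazyVGrid(columns: columns) {
            ForEach(items, id: \.self) { item in
                Text(item)
                    .frame(maxWidth: .infinity, minHeight: 60)
                    .background(.quaternary)
            }
        }
        // Lays the grid out right-to-left; LazyVGrid stays lazy, so performance
        // should match the default direction.
        .environment(\.layoutDirection, .rightToLeft)
    }
}

If the cell content itself must remain left-to-right (e.g. Latin text), re-apply .environment(\.layoutDirection, .leftToRight) inside each cell.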
Hi,
I'm developing a SwiftUI app using RealityKit and ARKit for an AR measuring feature. I’ve noticed that after navigating away from my AR view and performing extensive cleanup (including removing all anchors/entities, pausing the ARSession, and nil-ing out all references), memory usage remains elevated and sometimes grows with repeated AR sessions.
Each time I enter and exit the AR view, memory increases
The memory does not return to the baseline after cleanup, even though all custom objects are deallocated.
Are there best practices beyond what I’ve described to ensure all ARKit/RealityKit resources are released after an AR session?
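For reference, this is the shape of the teardown being described, in case a step is missing; arView and the call site are placeholders from my own setup:

import ARKit
import RealityKit

// Hypothetical cleanup mirroring the steps described above.
func tearDownAR(_ arView: ARView) {
    arView.session.pause()           // stop the ARSession
    arView.scene.anchors.removeAll() // drop all anchors and their entities
    arView.session.delegate = nil    // break any delegate reference cycles
    arView.removeFromSuperview()     // detach the view itself
}

Note that some ARKit/RealityKit allocations appear to be process-wide caches that are not returned after the first session, so a one-time jump after the first AR session is not necessarily a leak; steady growth on every enter/exit cycle is the part worth profiling with Instruments.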
Edit: Well, this is embarrassing. It looks like I didn't research this thoroughly enough; animations block UIView tap events. I found a solution by using DispatchQueue.
I ran into an unexpected issue when presenting a UIView-based toast inside a separate UIWindow in a SwiftUI app. Specifically, when animations are applied to the toast view (UIToastView), the tap gesture no longer works.
To help identify the root cause, I created a minimal reproducible example (MRE) with under 500 lines of code, demonstrating the behavior:
Demo GIF: Screen Recording
Code Repo: ToastDemo
What I Tried:
Using a separate UIWindow to present the toast overlay.
Adding a tap gesture directly to the UIView.
Referencing related solutions:
A Blog Post explaining UIWindow usage in SwiftUI - https://www.fivestars.blog/articles/swiftui-windows (Sorry, Apple Dev Forum will not allow a link to this)
A Stack Overflow thread on handling touch events in multiple windows.
Problem Summary:
When animations are involved (fade in, slide up), taps on the toast are not recognized.
Without animations, taps work as expected.
UIWindow setup seems correct, so I’m wondering if animation effects are interfering with event propagation.
I could potentially work around this by restructuring the touch handling, but I'd love insight from the community on why this happens, or if there’s a cleaner fix.
I have encountered the following error and reduced my code to the minimum necessary to reliably reproduce this error.
Fatal error: Duplicate keys of type 'AnyHashable' were found in a Dictionary. This usually means either that the type violates Hashable's requirements, or that members of such a dictionary were mutated after insertion.
It occurs when
instances of a SwiftData model are inserted (the error occurs reliably when inserting five or more instances; fewer insertions seem to make the error either rarer or go away entirely), and
a Picker with the .menu pickerStyle is present.
Any of the following changes prevents the error from occurring:
adding id = UUID() to the Item class
removing .tag(item) in the picker content
using any pickerStyle other than .menu
using an observable class instead of a SwiftData class
I would greatly appreciate it if anyone knows what exactly is going on here.
Tested using
Xcode Version 16.4 (16F6),
iPhone 16 Pro iOS 18.5 Simulator and
iPhone 15 Pro iOS 18.5 real device.
import SwiftUI
import SwiftData

@Model class Item {
    var name: String
    init(name: String) {
        self.name = name
    }
}

struct DuplicateKeysErrorView: View {
    @Environment(\.modelContext) private var modelContext
    @Query(sort: \Item.name) private var items: [Item]
    @State var selection: Item? = nil

    var body: some View {
        List {
            Picker("Picker", selection: $selection) {
                Text("Nil").tag(nil as Item?)
                ForEach(items) { item in
                    Text(item.name).tag(item)
                }
            }
            .pickerStyle(.menu)
            Button("Add 5 items") {
                modelContext.insert(Item(name: UUID().uuidString))
                modelContext.insert(Item(name: UUID().uuidString))
                modelContext.insert(Item(name: UUID().uuidString))
                modelContext.insert(Item(name: UUID().uuidString))
                modelContext.insert(Item(name: UUID().uuidString))
            }
        }
        .onAppear {
            try! modelContext.delete(model: Item.self)
        }
    }
}

#Preview {
    DuplicateKeysErrorView()
        .modelContainer(for: Item.self)
}
According to the docs, .focusedObject() usage should be moved to .focusedValue() when migrating to @Observable, but there is no .focusedSceneValue() overload that accepts an Observable like there is for .focusedValue(). So how are we supposed to migrate .focusedSceneObject() to @Observable?
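One workaround, sketched below under the assumption that the generic key-path overload is acceptable: declare a custom FocusedValues entry whose value type is the @Observable class, then pass the instance through .focusedSceneValue(_:_:). MyModel and the myModel key are placeholder names:

import SwiftUI
import Observation

@Observable
final class MyModel {
    var text = ""
}

private struct MyModelFocusedKey: FocusedValueKey {
    typealias Value = MyModel
}

extension FocusedValues {
    var myModel: MyModel? {
        get { self[MyModelFocusedKey.self] }
        set { self[MyModelFocusedKey.self] = newValue }
    }
}

struct EditorView: View {
    @State private var model = MyModel()

    var body: some View {
        TextField("Text", text: $model.text)
            // Publishes the observable model to scene-wide consumers such as menu commands.
            .focusedSceneValue(\.myModel, model)
    }
}

struct MyCommands: Commands {
    @FocusedValue(\.myModel) private var model

    var body: some Commands {
        CommandMenu("Model") {
            Button("Clear") { model?.text = "" }
                .disabled(model == nil)
        }
    }
}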
In reference to this webpage, I'm turning my iPad into an iBeacon device.
import Foundation
import CoreBluetooth
import CoreLocation

class BeaconViewModel: NSObject, ObservableObject, CBPeripheralManagerDelegate {
    private var peripheralManager: CBPeripheralManager?
    private var beaconRegion: CLBeaconRegion?
    private var beaconIdentityConstraint: CLBeaconIdentityConstraint?
    //private var beaconCondition: CLBeaconIdentityCondition?

    override init() {
        super.init()
        if let uuid = UUID(uuidString: "abc") {
            beaconIdentityConstraint = CLBeaconIdentityConstraint(uuid: uuid, major: 123, minor: 456)
            beaconRegion = CLBeaconRegion(beaconIdentityConstraint: beaconIdentityConstraint!, identifier: "com.example.myDeviceRegion")
            peripheralManager = CBPeripheralManager(delegate: self, queue: nil, options: nil)
        }
    }

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        switch peripheral.state {
        case .poweredOn:
            startAdvertise()
        case .poweredOff:
            peripheralManager?.stopAdvertising()
        default:
            break
        }
    }

    func startAdvertise() {
        guard let beaconRegion = beaconRegion else { return }
        let peripheralData = beaconRegion.peripheralData(withMeasuredPower: nil)
        peripheralManager?.startAdvertising(((peripheralData as NSDictionary) as! [String: Any]))
    }

    func stopAdvertise() {
        peripheralManager?.stopAdvertising()
    }
}
On the line where I create the CLBeaconIdentityConstraint, Xcode says that this class is deprecated and suggests that we use CLBeaconIdentityCondition. But if I try to use it, Xcode says
Cannot find type 'CLBeaconIdentityCondition' in scope
I've just updated Xcode to 16.4 and I still get the same error. So how do we use CLBeaconIdentityCondition to constrain the beacon? My macOS version is Sequoia 15.5. Thanks.
Hi there,
I’m developing a watchOS app using SwiftUI, and I want to allow users to interact with the map using the panning gesture and also drop waypoints by long pressing anywhere on the map—just like in the built-in Apple Maps app on watchOS, where a long press drops a pin and panning still works seamlessly.
However, with SwiftUI’s Map, any attempt to attach a gesture other than .onTapGesture (such as LongPressGesture or DragGesture) seems to block the built-in map interactions, making panning impossible.
Is there a supported approach to detect long press gestures anywhere on the map while still allowing all standard map interactions (as seen in Apple Maps on watchOS)? Or is this something only possible with private APIs or internal access?
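For reference, the closest public-API experiment I know of is attaching the gesture via .simultaneousGesture, which is meant to run alongside the map's built-in recognizers rather than replace them; in practice it may still interfere with panning, so treat this sketch as an experiment rather than a confirmed fix (watchOS 10+ Map API assumed):

import SwiftUI
import MapKit

struct WaypointMapView: View {
    @State private var position: MapCameraPosition = .automatic

    var body: some View {
        Map(position: $position)
            .simultaneousGesture(
                LongPressGesture(minimumDuration: 0.6) // duration is arbitrary
                    .onEnded { _ in
                        // Drop a waypoint here. Getting the press location would
                        // additionally require wrapping the map in a MapReader.
                        print("Long press detected")
                    }
            )
    }
}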
Any guidance or best practices would be greatly appreciated!
Thank you!
Hello!
We encountered a very intermittent crash in our application starting with devices running iOS 18.4. We have a screen that can display a long list of products in two states (expanded or collapsed), based on a boolean tracking whether the user has interacted with that product yet.
With this list, we very intermittently encounter a crash when we
scroll rapidly up and down the list, or
search the list quickly (search is performed on each character change and the list is filtered).
Our project has iOS 17.0 as a minimum deployment target and has Swift 6 enabled. Again, this started happening only with iOS 18.4 and is still occurring (a handful of occurrences each week).
The crash report seems to point deep inside SwiftUI and the Obj-C runtime.
5895AC17-6886-4070-BC80-8912E8394BDB.crash
Any insights would be greatly appreciated!
I'm using RealityView in my iOS game mixed with SwiftUI. For the following two example usages, the Simulator will only render the first RealityView, and the second one is either super laggy or shows a black model. Running on a real device is all good; just the Simulator has this issue.
Have a TabView where each tab has a RealityView.
Have a root view and a detail view connected via a push navigation, where both root and detail have a RealityView.
In the Simulator, the second RealityView is very choppy and basically unusable, but on a real iPhone everything looks great.
Is this a known Simulator issue, or did I do something wrong?
Hello, I'm working on a SwiftUI iOS app that shows a list of timers. When a timer is up, I pop up an alert struct, and the user hits "OK" to dismiss it. I am trying to include an alarm sound using AVFoundation. I can get the sound to play if I change the code to play on a button click, so I believe I have the URL path correct. But I really want it to play when the alert pops up. I have not been able to find examples where this is done using an alert, so I suspect I need a custom view, but thought I'd try the alert route first. Has anyone tried this before?
import AVFoundation

@State var audioPlayer: AVAudioPlayer?

.alert(isPresented: $showAlarmAlert) {
    playSound() // calls AVFoundation
    return Alert(title: Text("Time's Up!"))
}

func playSound() {
    let alertSoundPath = Bundle.main.url(forResource: "classicAlarm", withExtension: "mp3")!
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: alertSoundPath)
        audioPlayer?.play()
    } catch {
        appData.logger.debug("Error playing sound: \(alertSoundPath)")
    }
}
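A possible explanation: the alert's content closure is a view builder, so calling playSound() inside it is a side effect during view construction, and SwiftUI makes no guarantee about when (or how often) it runs. One alternative sketch is to trigger the sound when the flag flips instead; this assumes iOS 17's two-parameter onChange and reuses the "classicAlarm" resource from the post:

import SwiftUI
import AVFoundation

struct TimerAlertView: View {
    @State private var showAlarmAlert = false
    @State private var audioPlayer: AVAudioPlayer?

    var body: some View {
        Button("Demo: fire alarm") { showAlarmAlert = true }
            .alert("Time's Up!", isPresented: $showAlarmAlert) {
                Button("OK") { audioPlayer?.stop() }
            }
            // Play the sound when the alert becomes visible, not while building its view.
            .onChange(of: showAlarmAlert) { _, isShowing in
                if isShowing { playSound() }
            }
    }

    private func playSound() {
        guard let url = Bundle.main.url(forResource: "classicAlarm", withExtension: "mp3") else { return }
        audioPlayer = try? AVAudioPlayer(contentsOf: url)
        audioPlayer?.play()
    }
}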