Instruments


Instruments is a performance-analysis and testing tool for iOS, iPadOS, watchOS, tvOS, and macOS apps.

Posts under Instruments tag

39 Posts

Post

Replies

Boosts

Views

Activity

Instruments Malfunction
I’m reporting a severe, reproducible issue in Instruments, specifically when using the SwiftUI instrument and opening Show Cause & Effect Graph.

What happens:
• Instruments becomes extremely laggy/unresponsive
• The graph/detail area can turn solid magenta/pink
• Memory usage rapidly increases (I observed around 18 GB, 25 GB, and up to 34 GB)
• My Mac has crashed/restarted during this; in other words, a kernel panic: the Mac froze and everything was unresponsive. The trackpad wouldn't even click.

Important detail:
• I could not find a generated kernel panic log after the crash/restart.

Repro context:
• SwiftUI iOS app profiled from Xcode
• The trigger is specifically entering Show Cause & Effect Graph
• Recordings can be short and still trigger it
• The issue is much less severe or absent if I avoid that view

What I already tried:
• Rebooting
• Short captures / fewer instruments
• Clearing Xcode/Instruments caches/preferences
• Retesting after cleanup
• Reinstalling Xcode

Is this a known Instruments regression? Is there a workaround besides avoiding Show Cause & Effect Graph? What exact diagnostics should I collect when no kernel panic file is generated?

Specs: Xcode Version 26.3 (17C529), Instruments Version 26.3 (17C529), macOS Version 26.4 Beta (25E5223i), MacBook Pro (13-inch, M1, 2020), 16 GB RAM
2
0
126
1w
SwiftUI Instruments tool error: "Time Profiler: Time Profiler does not support the iOS platform"
I am trying to run the SwiftUI Instruments tool for an iOS app, and every time I run it, I either get the "Time Profiler: Time Profiler does not support the iOS platform" error or end up with no data at all; however, when I run just the Time Profiler by itself, it works fine. I am running this on a physical device.
1
0
60
3w
How to programmatically determine fixed CPU frequency for memory latency benchmarking on Apple Silicon?
Hi everyone, I am developing a benchmarking tool to measure memory latency (L1/L2/DRAM) on Apple Silicon. I am currently using Xcode Instruments (CPU Counters) to validate my results. In my latest run, for a 128 MB buffer with random access, Instruments shows:

Latency (cycles): ~259 cycles (derived from LDST_UNIT_OLD_L1D_CACHE_MISS / L1D_CACHE_MISS_LD).
Manual timer result: ~80 ns.

To correlate these two values, I need the exact CPU frequency (GHz) at the time of the sample.

My questions:
• Is there a recommended way to programmatically fetch the current frequency of the performance cores (P-cores) during a benchmark run?
• Does Apple provide a "nominal" frequency value for M-series chips that we should use for cycle-to-nanosecond conversions?
• In Instruments, is there a hidden counter or "Average Frequency" metric that I can enable to avoid manual math?

Hardware/software environment:
Tool: Instruments 26.3+ (CPU Counters template).
Chip: A19, iPhone 17 Pro.
OS: 26.3.
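A quick sanity check on those two numbers, sketched in Swift below: cycles per access divided by nanoseconds per access is cycles per nanosecond, which is GHz, so the trace itself implies the effective clock during the sampled window. The constants are the post's example values, not fresh measurements.

    import Foundation

    // Example values from the post; substitute your own measurements.
    let cyclesPerAccess = 259.0 // from Instruments CPU Counters
    let nanosPerAccess = 80.0   // from the manual timer

    // cycles/access divided by ns/access = cycles/ns = GHz during the run.
    let impliedGHz = cyclesPerAccess / nanosPerAccess
    print(String(format: "effective frequency during run: %.2f GHz", impliedGHz))
    // ~3.24 GHz here. If this lands far from a plausible P-core clock,
    // one of the two measurements (or the cycle-count derivation) is off.

This doesn't answer the nominal-frequency question, but it sidesteps needing one: disagreement between the counters and the timer shows up directly as an implausible implied clock.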
0
0
101
4w
SwiftUI Instruments Template doesn't work
I am profiling a simple SwiftUI test app on my new iPhone through my new MacBook Pro and everything is version 26.2 (iOS, macOS, Xcode). I run Instruments with the SwiftUI template using all of the default settings and get absolutely zero data after interacting with the app for about 20 seconds. Using the Time Profiler template yields trace data. Trying the SwiftUI template again with the sample Landmarks app has the same issue as my app.
2
0
271
Jan ’26
SwiftUI List cell reuse / view lifecycle behavior when scrolling
I’m trying to understand how SwiftUI List handles row lifecycle and reuse during scrolling. I have a list with around 60 card views; on initial load, only about 7 rows are created, but after scrolling to the bottom all rows appear to be created, and when scrolling back to the top I again observe multiple updates and apparent re-creation of rows. I confirmed this behavior using Instruments by profiling my app. Even though each row has a stable identifier, the row views still seem to be destroyed and recreated, which doesn’t resemble UIKit’s cell reuse model. I’d like clarity on how List uses identifiers internally, what actually gets reused versus recreated, and how developers should reason about performance and view lifetime in this case.
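Row lifetime can also be observed directly, without Instruments. A minimal sketch, assuming a made-up CardRow and item model (the names are illustrative, not from the post):

    import SwiftUI

    struct Item: Identifiable {
        let id: Int
    }

    struct CardRow: View {
        let item: Item

        init(item: Item) {
            self.item = item
            print("init row \(item.id)")  // fires whenever SwiftUI rebuilds the row value
        }

        var body: some View {
            Text("Card \(item.id)")
                .frame(height: 120)
                .onAppear { print("appear \(item.id)") }       // row entered the visible region
                .onDisappear { print("disappear \(item.id)") } // row left the visible region
        }
    }

    struct CardList: View {
        let items = (0..<60).map(Item.init)

        var body: some View {
            List(items) { CardRow(item: $0) } // stable ids via Identifiable
        }
    }

Seeing init fire repeatedly is expected: SwiftUI view values are cheap structs that List rebuilds freely, while identity, and any @State attached to it, is preserved through the stable id, so value re-creation in a trace is not by itself the same thing as lost state or redone work.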
0
1
302
Dec ’25
How to help Instruments' Swift Task lifetime summary group the same tasks so that the count for tasks is not always 1.
This is a screenshot from the Swift Task track in Xcode. I made these tasks with:

    public actor ResourceManager {
        func foo() {
            for observer in observers {
                Task(name: "ResourceManager notify observers") {
                    await notification(observer)
                }
            }
        }
    }

I am confused about why each of the tasks shows up as a separate task in the task lifetime summary. Is there a way to clue the trace in Instruments in to the fact that these are indeed the same task?
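One restructuring worth trying, sketched below under the assumption that the summary buckets child tasks under a single named parent (unverified against the Swift Task instrument's actual grouping): spawn one named task and fan the per-observer work out through a task group instead of creating N unrelated top-level tasks.

    struct Observer: Sendable {}

    func notification(_ observer: Observer) async {
        // stand-in for the real notification work
    }

    public actor ResourceManager {
        var observers: [Observer] = []

        func foo() {
            // One named parent task; each observer becomes a child task
            // of it, rather than N sibling tasks that each count as 1.
            Task(name: "ResourceManager notify observers") {
                await withTaskGroup(of: Void.self) { group in
                    for observer in self.observers {
                        group.addTask { await notification(observer) }
                    }
                }
            }
        }
    }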
1
0
162
Dec ’25
CoreImage memory build up on real device but not on simulator
I'm trying to benchmark a Core Image filter chain's memory footprint and noticed a weird quirk in Instruments. On a real device, even with a simple Core Image chain, the memory balloons each time I run the filter. See the attached screenshots: running on iPhone 17 Pro, and running on the simulator (M2 MacBook Pro). As you can see, there's a huge build-up of 4 MB "VM: IOSurface" allocations on the real device, but the simulator seems to clean them up correctly. Here's my basic code:

    func processImage() {
        guard let inputImage = ContentViewModel.loadImageFromBundle(name: "kitty.HEIC") else {
            print("Failed to load sample_image from bundle")
            return
        }
        var outputImage = inputImage
        outputImage = outputImage.applyingFilter("CIBloom", parameters: [
            kCIInputRadiusKey: 20,
            kCIInputIntensityKey: 0.8
        ])
        DispatchQueue.global(qos: .userInitiated).async {
            let data = self.context.jpegRepresentation(of: outputImage,
                                                       colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
            if let data = data, let uiImage = UIImage(data: data) {
                DispatchQueue.main.async {
                    self.displayImage = Image(uiImage: uiImage)
                }
            }
        }
    }

Why is this happening? It seems like a bug to me, or I need to release an object. At the very least it makes it challenging to measure memory usage. Any help is greatly appreciated. Alex
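One cheap experiment before calling it a leak, as a hedged sketch adapting the post's own dispatch block: Core Image render buffers are commonly autoreleased, and a background GCD queue may not drain its implicit pool promptly, so an explicit autoreleasepool shows whether the IOSurfaces are merely pending release.

    DispatchQueue.global(qos: .userInitiated).async {
        // Drain autoreleased objects (CIContext render buffers are often
        // among them) when this block ends, instead of whenever the
        // queue's implicit pool next drains.
        let jpegData: Data? = autoreleasepool {
            self.context.jpegRepresentation(
                of: outputImage,
                colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!
            )
        }
        if let jpegData, let uiImage = UIImage(data: jpegData) {
            DispatchQueue.main.async {
                self.displayImage = Image(uiImage: uiImage)
            }
        }
    }

If memory stabilizes with the pool in place, the build-up was deferred release rather than a true leak, which would also explain the device/simulator difference in drain timing.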
1
0
420
Nov ’25
SwiftUI Performance With Rapid UI Updates
The code below is a test that triggers UI updates 30 times per second. I'm trying to keep most work off the main thread and only push to main once I have the string (which is cached). Why is updating SwiftUI 30 times per second so expensive? This code causes 10% CPU on my M4 Mac, but comment out the following line: Text(model.timeString) and it's 0% CPU. The reason I think I have too much work on main is because of this from Instruments. But I'm no Instruments expert.

    import SwiftUI
    import UniformTypeIdentifiers

    @main
    struct RapidUIUpdateTestApp: App {
        var body: some Scene {
            DocumentGroup(newDocument: RapidUIUpdateTestDocument()) { file in
                ContentView(document: file.$document)
            }
        }
    }

    struct ContentView: View {
        @Binding var document: RapidUIUpdateTestDocument
        @State private var model = PlayerModel()

        var body: some View {
            VStack(spacing: 16) {
                Text(model.timeString) // only this changes
                    .font(.system(size: 44, weight: .semibold, design: .monospaced))
                    .transaction { $0.animation = nil } // no implicit animations
                HStack {
                    Button(model.running ? "Pause" : "Play") {
                        model.running ? model.pause() : model.start()
                    }
                    Button("Reset") { model.seek(0) }
                    Stepper("FPS: \(Int(model.fps))", value: $model.fps, in: 10...120, step: 1)
                        .onChange(of: model.fps) { _, _ in model.applyFPS() }
                }
            }
            .padding()
            .onAppear { model.start() }
            .onDisappear { model.stop() }
        }
    }

    @Observable
    final class PlayerModel {
        // Publish ONE value to minimize invalidations
        var timeString: String = "0.000 s"
        var fps: Double = 30
        var running = false

        private var formatter: NumberFormatter = {
            let f = NumberFormatter()
            f.minimumFractionDigits = 3
            f.maximumFractionDigits = 3
            return f
        }()

        @ObservationIgnored private let q = DispatchQueue(label: "tc.timer", qos: .userInteractive)
        @ObservationIgnored private var timer: DispatchSourceTimer?
        @ObservationIgnored private var startHost: UInt64 = 0
        @ObservationIgnored private var pausedAt: Double = 0
        @ObservationIgnored private var lastFrame: Int = -1

        // cache timebase once
        private static let secsPerTick: Double = {
            var info = mach_timebase_info_data_t()
            mach_timebase_info(&info)
            return Double(info.numer) / Double(info.denom) / 1_000_000_000.0
        }()

        func start() {
            guard timer == nil else { running = true; return }
            let desiredUIFPS: Double = 30 // or 60, 24, etc.
            let periodNs = UInt64(1_000_000_000 / desiredUIFPS)
            running = true
            startHost = mach_absolute_time()
            let t = DispatchSource.makeTimerSource(queue: q)
            // ~30 fps, with leeway to let the kernel coalesce wakeups
            t.schedule(
                deadline: .now(),
                repeating: .nanoseconds(Int(periodNs)), // 33_333_333 ns ≈ 30 fps
                leeway: .milliseconds(30) // allow coalescing
            )
            t.setEventHandler { [weak self] in self?.tick() }
            timer = t
            t.resume()
        }

        func pause() {
            guard running else { return }
            pausedAt = now()
            running = false
        }

        func stop() {
            timer?.cancel()
            timer = nil
            running = false
            pausedAt = 0
            lastFrame = -1
        }

        func seek(_ seconds: Double) {
            pausedAt = max(0, seconds)
            startHost = mach_absolute_time()
            lastFrame = -1 // force next UI update
        }

        func applyFPS() { lastFrame = -1 } // next tick will refresh string

        // MARK: - Tick on background queue
        private func tick() {
            let s = now()
            let str = formatter.string(from: s as NSNumber) ?? String(format: "%.3f", s)
            let display = "\(str) s"
            DispatchQueue.main.async { [weak self] in self?.timeString = display }
        }

        private func now() -> Double {
            guard running else { return pausedAt }
            let delta = mach_absolute_time() &- startHost
            return pausedAt + Double(delta) * Self.secsPerTick
        }
    }

    nonisolated struct RapidUIUpdateTestDocument: FileDocument {
        var text: String

        init(text: String = "Hello, world!") {
            self.text = text
        }

        static let readableContentTypes = [UTType(importedAs: "com.example.plain-text")]

        init(configuration: ReadConfiguration) throws {
            guard let data = configuration.file.regularFileContents,
                  let string = String(data: data, encoding: .utf8)
            else {
                throw CocoaError(.fileReadCorruptFile)
            }
            text = string
        }

        func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
            let data = text.data(using: .utf8)!
            return .init(regularFileWithContents: data)
        }
    }
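For clock-driven text like this, an alternative worth measuring is letting SwiftUI own the timer with TimelineView, which re-evaluates only the dated subtree at the requested cadence instead of round-tripping a string through a GCD timer and the main queue. A minimal sketch under that assumption (pause/seek plumbing omitted):

    import SwiftUI

    struct TimecodeView: View {
        let start = Date()

        var body: some View {
            // Re-evaluates this subtree ~30 times per second with no
            // external timer and no main-queue hop for the string.
            TimelineView(.periodic(from: .now, by: 1.0 / 30.0)) { context in
                Text(String(format: "%.3f s", context.date.timeIntervalSince(start)))
                    .font(.system(size: 44, weight: .semibold, design: .monospaced))
            }
        }
    }

Whether that is actually cheaper here is exactly what a Time Profiler comparison would have to confirm; re-rendering a 44-point monospaced string 30 times per second has some unavoidable cost either way.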
1
0
283
Nov ’25
iPhone 17, iOS 26 - No option to enable ‘Processor Trace’
According to the documentation for Processor Trace, it should be available on the iPhone 16 or later. Going off of the Optimize CPU performance with Instruments WWDC session, the toggle for it should be under Developer > Performance, but I don’t see this option anywhere on my iPhone 17. I can’t run a Processor Trace in Instruments without this feature turned on, because it claims my iPhone’s CPU is unsupported. Has anyone else managed to enable Processor Trace on the A19 chips?
5
1
728
Oct ’25
Lock Contention in APFS/Kernel?
Hello! Some colleagues and I work on Jujutsu, a version control system compatible with git, and I think we've uncovered a potential lock-contention bug in either APFS or the Darwin kernel. There are four contributing factors to us thinking this is related to APFS or the kernel:
1. jj's testsuite uses nextest, a test runner for Rust that spawns each individual test as a separate process.
2. The testsuite slowed down by a factor of ~5x on macOS after jj started using fsync.
3. The slowdown increases as additional cores are allocated.
4. A similar slowdown did not occur on ext4.
Similar performance issues were reported in the past by a former Mercurial maintainer: https://gregoryszorc.com/blog/2018/10/29/global-kernel-locks-in-apfs/. My friend and colleague André has measured the test suite on an M3 Ultra with both a ramdisk and a traditional SSD and produced this graph: (The most thorough writeup is the discussion on this pull request.) I know I should file a feedback/bug report, but before I do, I'm struggling with profiling and finding kernel/APFS frames in my profiles so that I can properly attribute the cause of this apparent lock contention. Naively, I ran

    xctrace record --template 'Time Profiler' --output output.trace --launch /Users/dbarsky/.cargo/bin/cargo-nextest nextest run

and while that detected all processes spawned by nextest, it didn't record all processes as part of the same inspectable profile and didn't really show any frames from the kernel/APFS; I had to select individual processes. So that I don't waste people's time and can point at a frame/smoking gun in the right system: how can I use Instruments to profile where the kernel and/or APFS are spending their time? Do I need to disable SIP?
9
1
519
Nov ’25
Regarding the issue of Xcode not displaying Apple Watch devices
My version is iOS 18.6.2 (22G100), watchOS 10.6.1 (21U580), macOS 15.3.1 (24D70), Xcode Version 16.4 (16F6). My iPhone can connect to Xcode and complete app installation testing. I have connected the iPhone to Xcode via USB, and both the iPhone and Apple Watch have been set to trust this device. The iPhone has Developer Mode enabled, but I cannot find Developer Mode in the Privacy & Security settings on the Apple Watch. As shown in the image, the Devices section in Xcode's Developer Documentation displays my iPhone but not the Apple Watch. However, the Open Console for the displayed iPhone shows my Apple Watch—actually two of them, though I only have one Apple Watch (the other might be from a previous backup). The iPhone app installs and tests normally, but the Apple Watch app cannot select a target device and fails to start installation. How can I resolve this issue?
1
0
411
Sep ’25
RealityKit / visionOS – Memory not released after dismissing ImmersiveSpace with USDZ models
Hi everyone, I’m encountering a memory overflow issue in my visionOS app and I’d like to confirm if this is expected behavior or if I’m missing something in cleanup.

App context:
The app showcases apartments in real scale using AR. Apartments are heavy USDZ models (hundreds of thousands of triangles, high-resolution textures). Users can walk inside the apartments, and performance is good even close to hardware limits.

Flow:
The app starts in a full immersive space (RealityView) for selecting the apartment. When an apartment is selected, a new ImmersiveSpace opens and the apartment scene loads. The scene includes multiple USDZ models, EnvironmentResources, and dynamic textures for skyboxes. When the user dismisses the experience, we attempt cleanup: nulling out all entity references, removing ModelComponents, clearing cached textures and skyboxes, and forcing dictionaries/collections to empty. Despite this cleanup, memory usage remains very high.

Problem:
After dismissing the ImmersiveSpace, memory does not return to baseline. Check the attached screenshot of the profiling made using Instruments:
• Initial state: ~30 MB (main menu).
• After loading models sequentially: ~3.3 GB.
• Skybox textures bring it near ~4 GB.
• After dismissing the experience (at the ~01:00 mark): memory only drops slightly (to ~2.66 GB).
• When loading the second apartment, memory continues to increase until ~5 GB, at which point the app crashes due to memory pressure.
The issue is consistently visible under VM: IOSurface in Instruments. No leaks are detected. So it looks like RealityKit (or lower-level frameworks) keeps caching meshes and textures, and does not free them when the RealityView is ended. But for my use case, these resources should be fully released once the ImmersiveSpace is dismissed, since new apartments will load entirely different models and textures.

Cleanup code example. Here’s a simplified version of the cleanup I’m doing:

    func clearAllRoomEntities() {
        for (entityName, entity) in entityFromMarker {
            entity.removeFromParent()
            if let modelEntity = entity as? ModelEntity {
                modelEntity.components.removeAll()
                modelEntity.children.forEach { $0.removeFromParent() }
                modelEntity.clearTexturesAndMaterials()
            }
            entityFromMarker[entityName] = nil
            removeSkyboxPortals(from: entityName)
        }
        entityFromMarker.removeAll()
    }

    extension ModelEntity {
        func clearTexturesAndMaterials() {
            guard var modelComponent = self.model else { return }
            for index in modelComponent.materials.indices {
                removeTextures(from: &modelComponent.materials[index])
            }
            modelComponent.materials.removeAll()
            self.model = modelComponent
            self.model = nil
        }

        private func removeTextures(from material: inout any Material) {
            if var pbr = material as? PhysicallyBasedMaterial {
                pbr.baseColor.texture = nil
                pbr.emissiveColor.texture = nil
                pbr.metallic.texture = nil
                pbr.roughness.texture = nil
                pbr.normal.texture = nil
                pbr.ambientOcclusion.texture = nil
                pbr.clearcoat.texture = nil
                material = pbr
            } else if var simple = material as? SimpleMaterial {
                simple.color.texture = nil
                material = simple
            }
        }
    }

Questions:
• Is this expected RealityKit behavior (textures/meshes cached internally)?
• Is there a way to force RealityKit to release GPU resources tied to USDZ models when they’re no longer used?
• Should dismissing the ImmersiveSpace automatically free those IOSurfaces, or do I need to handle this differently?
Any guidance, best practices, or confirmation would be hugely appreciated. Thanks in advance!
9
0
2.1k
Jan ’26
Where is the instruments command line tool?
I was reading through this documentation about the instruments command-line tool https://help.apple.com/instruments/mac/current/#/devb14ffaa5 and how it can be launched from the command line. However, unlike what the documentation states, there's no such instruments command anywhere on my M1 Mac (macOS 15.6). That command gives:

    $> instruments
    zsh: command not found: instruments

I do have Xcode installed, which has the Instruments.app (GUI app) but not the command-line utility:

    $> ls Xcode.app/Contents/Applications/
    ... Instruments.app

Is that linked documentation up to date (it does say "latest" in the URL)? Is there some other way to install this command-line utility?
3
0
573
Aug ’25
How do I capture localhost traffic in instruments?
I'm currently exploring Instruments for profiling and tracing on macOS 15.6.1. I know there is the "Network Connections" instrument, which records TCP/UDP information; however, it seems not to include the "lo0" (loopback) interface. Is there a way to configure it so that localhost traffic is included in the recording? The application I'm tracing uses localhost, and I want that information to be included in traces. The documentation for the network-interface-detection schema makes no mention of how it detects interfaces. Thanks in advance.
8
0
797
Sep ’25
Instruments Failure in Xcode 16.4
I updated Xcode from 16.2 to 16.4. Running Time Profiler in Instruments launches the trace, but it does not install or load on the connected device, which breaks the functionality. I am unable to debug...

    ERROR: Connection with the remote side was unexpectedly closed : <dictionary: 0x1f3c8b6d0> { count = 1, transaction: 0, voucher = 0x0, contents = "XPCErrorDescription" => <string: 0x1f3c8b850> { length = 22, contents = "Connection interrupted" } }
    Domain: IXRemoteErrorDomain
    Code: 6
    User Info: { DVTErrorCreationDateKey = "2025-08-09 00:47:53 +0000"; }
    --
    Connection with the remote side was unexpectedly closed : <dictionary: 0x1f3c8b6d0> { count = 1, transaction: 0, voucher = 0x0, contents = "XPCErrorDescription" => <string: 0x1f3c8b850> { length = 22, contents = "Connection interrupted" } }
    Domain: IXRemoteErrorDomain
    Code: 6

iOS 18.5
1
0
566
Aug ’25
Processor Trace cannot finish due to "failed stoping ktrace session"
I enabled Processor Trace on my Mac, and other profiling templates work fine. However, Processor Trace keeps showing nothing, and I see the error "Failed to stop recording session. Failed stoping ktrace session". How do I solve this?
1
0
51
6d
Audio System Trace: Zero Time Stamp
In Instruments, I'm seeing "Zero Time Stamp" events in the "Audio Server" lane. What does that mean?
1
0
176
4w
Instruments: Trace file had no SwiftUI data
Using Version 26.2 (17C52), I often get "Trace file had no SwiftUI data". Why is that?
5
0
270
Dec ’25
SwiftUI Instrumentation Fails to start
I am trying to perform SwiftUI instrumentation on my iOS app. Whenever I hit the record button, the app launches on the target device and closes with the error: "Failed to start the recording: Failed starting ktrace session." How do I resolve this, please?
6
1
843
Jan ’26
Excessive red flags in SwiftUI Instruments on iOS 26
I'm running the new iOS 26 SwiftUI Instruments and seeing a lot of red/hitches, even with Apple's own Backyard Birds sample project. Is anyone else experiencing this? Not sure if it's a general iOS 26 issue or specific to my device (iPhone SE 2nd gen). Would appreciate hearing if others are seeing similar results.
1
0
531
Oct ’25