Instruments


Instruments is a performance-analysis and testing tool for iOS, iPadOS, watchOS, tvOS, and macOS apps.

Instruments Documentation

Posts under Instruments subtopic

Post · Replies · Boosts · Views · Activity

Error recording with the CPU Profiler in the CLI: [Error] Failed to start the recording: Failed to force all hardware CPU counters: 13.
Context: I created a short script to CPU-profile a program from the command line. I am able to record via the Instruments app, but when I try from the command line I get the error shown below. This example just profiles the grep command.

Error:

% cpu_profile /usr/bin/grep \
    --recursive "Brendan Gregg" \
    "$(xcode-select --print-path)/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk"
Profiling /usr/bin/grep into /tmp/cpu_profile_grep.trace
Starting recording with the CPU Profiler template. Launching process: grep. Ctrl-C to stop the recording
Run issues were detected (trace is still ready to be viewed):
* [Error] Failed to start the recording: Failed to force all hardware CPU counters: 13.
Recording failed with errors. Saving output file...

Script:

#!/bin/sh
set -o errexit
set -o nounset

if [ "$#" -lt 1 ]
then
    echo "Usage $0 <program> [arguments...]" 1>&2
    exit 1
fi

PROGRAM="$(realpath "$1")"
shift
OUTPUT="/tmp/cpu_profile_$(basename "$PROGRAM").trace"

echo "Profiling $PROGRAM into $OUTPUT" 1>&2

# Delete potential previous traces
rm -rf "$OUTPUT"

xcrun xctrace record \
    --template 'CPU Profiler' \
    --no-prompt \
    --output "$OUTPUT" \
    --target-stdout - \
    --launch -- "$PROGRAM" "$@"

open "$OUTPUT"

I think the error has to do with xctrace based on this post, but according to this post it should have been resolved in macOS 15.4.

System:
Chip: Apple M3 Pro
macOS: Sequoia 15.4.1
xctrace version: 16.0 (16E140)
xcrun version: 70.
Xcode version: 16.3 (16E140)

Working screenshots from the Instruments app:
(A hedged command-line workaround is sketched after this entry.)
Replies: 2 · Boosts: 0 · Views: 157 · May ’25
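The failure code 13 looks like EACCES (permission denied), which suggests the CPU Profiler template could not get privileged access to the hardware performance counters when launched from the command line; that is an assumption, not something confirmed by the error text. A minimal sketch of two things to try, reusing only flags already present in the script above (the grep arguments are placeholders for a quick test):

# Assumption: the counter error is a privilege problem. Try the same
# recording as root; everything else is unchanged from the script above.
sudo xcrun xctrace record \
    --template 'CPU Profiler' \
    --no-prompt \
    --output /tmp/cpu_profile_grep.trace \
    --target-stdout - \
    --launch -- /usr/bin/grep --version

# Fallback: the Time Profiler template samples call stacks and does not
# force hardware CPU counters, so it may record where CPU Profiler fails.
xcrun xctrace record \
    --template 'Time Profiler' \
    --no-prompt \
    --output /tmp/time_profile_grep.trace \
    --target-stdout - \
    --launch -- /usr/bin/grep --version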
Recording WKWebView
I am trying to record the requests and responses in a WKWebView, but Instruments does not seem to record them. Is this to be expected? The webView is set to inspectable, and I am using the HTTP Traffic instrument. All the requests the app itself makes are recorded, but neither the original request for the webView nor any subsequent traffic is recorded. When I use Safari to inspect the webView, all I see is the last page (even when I start the inspector before the first request is made). How can I see these requests?
Replies: 0 · Boosts: 0 · Views: 89 · May ’25
Trouble with MDLMesh.newBox()
I made a box with MDLMesh.newBox() and added normals:

let mdlMesh = MDLMesh.newBox(withDimensions: SIMD3<Float>(1, 1, 1),
                             segments: SIMD3<UInt32>(2, 2, 2),
                             geometryType: MDLGeometryType.triangles,
                             inwardNormals: false,
                             allocator: allocator)
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.25)

After I convert to MTKMesh, the normals are (0, 0, 0) for a group of vertices. I can only inspect the geometry after I convert to MTKMesh. Is there a way to use the Geometry Viewer on an MDLMesh?
Replies: 0 · Boosts: 0 · Views: 57 · May ’25
CoreML memory allocation logic
Hello, I have a question about Core ML. I loaded a Core ML model in my project and set the compute units to CPU+GPU. When I used Instruments to analyze performance, I found that there was "prepare GPU request" overhead before each inference. I also checked the freezing point graph and found that memory was being allocated frequently. Is this expected? Is there any way to avoid the frequent prepares? I have tried some approaches, such as sharing memory for the prediction input parameters, but they seem to be ineffective.
Replies: 0 · Boosts: 0 · Views: 72 · May ’25
Cannot get Instruments to profile my apps
Hi, I need help getting Instruments to profile my application. I tried to profile my main app (Qt 5.15 framework, C++, Intel architecture only) from Xcode. My app starts, Instruments runs Time Profiler or Leaks for about 15 seconds, and then quits. No crash, no message, nothing. This has been happening for a while on my Mac Studio M1 Max running macOS 14.7.6 and the Xcode 15.4 IDE with a toolchain from Xcode 14.3 for the qmake (Qt) project. However, it also happens if I set up a new vanilla SwiftUI project from scratch without any Qt code. In addition to the Mac Studio I also have a MacBook Pro M4 running macOS 15.5 and Xcode 16.4. On that machine I get the same results, whether I try Instruments on my Qt project or a vanilla SwiftUI project. It also makes no difference if I change the toolchain with sudo xcode-select -s /Applications/Xcode_143.app or sudo xcode-select -s /Applications/Xcode_164.app; same results in either case. I also tried switching to a Debug build when editing the scheme for profiling, but got no better results. I also tried launching Instruments from Xcode using the Open Developer Tool menu entry, with no better results. When I start Instruments first, run my program in Xcode, and attach to it, I get the same results. Do you have any advice on what to check or set up, maybe in signing or such? I am probably missing something basic. Thanks in advance. (A command-line diagnostic sketch follows this entry.)
Replies: 4 · Boosts: 0 · Views: 180 · Jun ’25
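One way to narrow this down is to take the Instruments UI out of the loop and record the same target with xctrace from Terminal; if the command-line recording also stops after roughly 15 seconds, the problem lies with the recording itself rather than with Xcode. This is only a diagnostic sketch: the app path and name below are placeholders, not the poster's actual targets.

# Launch the app under the Time Profiler template from the command line
# (the path is a placeholder -- point it at your own built .app binary).
xcrun xctrace record \
    --template 'Time Profiler' \
    --output /tmp/myapp.trace \
    --launch -- /path/to/MyApp.app/Contents/MacOS/MyApp

# Or attach to an already-running instance by name or pid:
xcrun xctrace record \
    --template 'Time Profiler' \
    --output /tmp/myapp_attach.trace \
    --attach MyApp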
(iOS 26) - PowerProfiler trace file cannot be opened
I kept CoreLocation’s startUpdatingLocation running for a full day and used Performance trace - PowerProfiler to track the power usage during that time. The trace file was successfully generated on the iOS device, and I later transferred it to my MacBook. However, when I tried to open the .atrc file, I received the following warning: The document cannot be imported because of an error: File ‘/Users/jun/Downloads/PowerProfiler_25-06-16_181049_to_25-06-17_091037_001.atrc’ doesn’t contain any events. Why is this happening? Is there a known issue with PowerProfiler in iOS 26, or am I missing something in the tracing setup? Note: The .aar file and the extracted .atrc file are not attached here, as forum uploads do not support these formats.
Replies: 1 · Boosts: 0 · Views: 98 · Jun ’25
Request for clarification / Documentation Feedback
Dear Apple Developer Support team,

I would like to request an official confirmation regarding the handling of transaction status in the App Store Server API, specifically for the GET /inApps/v1/transactions/{transactionId} endpoint. As per our current understanding from the official documentation (Get Transaction Info), the API’s behavior appears to be:

If a transaction is finalized and successfully processed by the App Store, querying this API returns HTTP 200 OK along with the transaction details.
If a transaction is still in a pending or deferred state (such as awaiting Ask to Buy approval or pending authorization), the API does not return a 200, and instead responds with HTTP 404 Not Found or an appropriate error.

Could you please confirm whether this behavior is accurate and officially supported? Specifically:

Does a 200 OK response guarantee that a transaction is finalized and successfully recorded on App Store servers?
In cases where a transaction is pending approval (e.g. Ask to Buy), is it correct that GET /transactions/{transactionId} returns 404 Not Found until the transaction is finalized?

We would greatly appreciate your confirmation so we can align our server-side logic for transaction validation accordingly. Thank you very much for your support!

Kind regards,
cuongnx
Replies: 0 · Boosts: 0 · Views: 107 · Jun ’25
Request for PMU Counter Support on Context Switches in Instruments
Hi, My name is Hani Nemati, and I work at Microsoft, where we support several macOS applications such as Microsoft Edge and Teams. I’m also the primary contributor to Microsoft Performance Tools for Apple (https://github.com/microsoft/Microsoft-Performance-Tools-Apple), an open-source project aimed at improving trace analysis across platforms. We are exploring ways to enhance our performance tracing capabilities on macOS and are particularly interested in the ability to attach PMU (Performance Monitoring Unit) counters to context switch events during trace collection. For reference, this capability is supported on Linux via LTTng using the add-context option (https://lttng.org/man/1/lttng-add-context/v2.13), and on Windows through Windows Performance Recorder (WPR), which allows PMU counters to be added at the start and end of context switches, enabling delta computation. Would it be possible to introduce similar support in Instruments for macOS? I’d appreciate any guidance or suggestions you might have on this request. Thank you, Hani Nemati Email: hanemati@microsoft
Replies: 2 · Boosts: 0 · Views: 172 · Jul ’25
Bottleneck analysis is not available in my Instruments
Hello, I wanted to try the new Bottleneck Analysis mode showcased in a recent Apple video, but when I select the CPU Counters template in Instruments there is no such option - just the same old "sample by Time/Events". I have Xcode 16.4 and macOS 15.5, and the system is an M4 Max. While Instruments shows version 16.0 in its About dialog for some reason (a bug?), it definitely comes from the Xcode 16.4 package and the build ID is the same (16F6) as Xcode 16.4. I also checked, just in case, on another M1 system (also fully updated) and it is the same there. Any clues why Bottleneck Analysis is missing? Regards, Maxim
Replies: 1 · Boosts: 0 · Views: 143 · Jul ’25
Datadog Mobile Vitals equivalent in Instruments
Hello, we use Datadog Mobile Vitals in our app and I'm trying to run some tools in Instruments for comparison. I'm not sure which tool I should use for some of these metrics:

Slow Renders
Description: With slow-render data, you can monitor which views take longer than 16 ms (the 60 Hz frame budget) to render.
Instruments equivalent: Hangs, including micro-hangs (?)

CPU Ticks Per Second
Description: RUM tracks CPU ticks per second for each view and CPU utilization over the course of a session. The recommended range is <40 for good and <80 for moderate.
Instruments equivalent: CPU Profiler (?)

Frozen Frames
Description: Frames that take longer than 700 ms to render appear stuck and unresponsive in your application. These are classified as frozen frames.
Instruments equivalent: Hangs > 500 ms (?)

Memory Utilization
Description: The amount of physical memory used by your application in bytes for each view, over the course of a session. The recommended range is <200 MB for good and <400 MB for moderate.
Instruments equivalent: Allocations (?)

(A command-line recording sketch follows this entry.)
Replies: 0 · Boosts: 0 · Views: 125 · Jul ’25
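For side-by-side comparisons it can help to record the candidate Instruments counterparts in one scripted pass. This is only a sketch under assumptions: the bundle identifier and device UDID are placeholders, and the template names should be confirmed against xcrun xctrace list templates on your install.

# Confirm the template names available with the installed Xcode.
xcrun xctrace list templates

# Record render hitches (closest match for slow/frozen frames) for a fixed window.
xcrun xctrace record --template 'Animation Hitches' --device <udid> \
    --launch com.example.app --time-limit 2m --output hitches.trace

# Record memory use (closest match for memory utilization) over the same kind of run.
xcrun xctrace record --template 'Allocations' --device <udid> \
    --launch com.example.app --time-limit 2m --output alloc.trace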
Question about Metrics Analysis in Xcode 26
Hello, I have recently been using the new Power Profiler tool introduced in Xcode 26 to analyze the power consumption of my app. My app primarily operates in the background. During a profiling session of 5 hours and 30 minutes, I observed that the app was active in the background for 2 hours and 30 minutes, while it remained suspended for the remaining 3 hours. While the Power Profiler lets me identify spikes in CPU, networking, and other resource usage at specific points, it is difficult to determine whether these values are objectively high. For example, in my case, the total QoS Execution Time under CPU Impact recorded during the 5 hours and 30 minutes was 12.18 seconds. I am wondering whether this is considered a good result.

Could you please advise on the following points?
1. Is there a commonly accepted or recommended ratio between app active time and CPU time that developers should aim for?
2. Are there any guidelines or reference materials on how to interpret CPU usage and other resource metrics for apps that primarily run in the background?

Any insights or advice would be greatly appreciated. Thank you.
Replies: 1 · Boosts: 0 · Views: 114 · Aug ’25
Complete control flow trace... possible?
Is there an xctrace instrument capable of capturing the complete control flow of a process? So far the best I can find is high-frequency sampling, but what I need is a trace of all machine instructions executed. This is easily done on Linux/Intel using the perf tool, which provides access to Intel's hardware-assisted tracing module (Intel Processor Trace). According to the Arm specification, my Mac mini M1 (armv8.4-a) and M4 (armv9.2-a) both have hardware support in the CoreSight ETM (Embedded Trace Macrocell) for full instruction tracing (i.e., no sampling, no gaps, no statistics - capturing the complete execution path). But it's not clear how I can access these features, or whether they are supported by the macOS XNU kernel at all. After hours of searching online, it's nothing but dead ends. Any suggestions for documentation, Xcode tools, open-source tools, or built-in macOS tools would be much appreciated! (A short enumeration sketch follows this entry.)
Replies: 1 · Boosts: 0 · Views: 90 · Aug ’25
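Not an answer to the ETM question, but a quick way to enumerate everything xctrace exposes on a given machine is sketched below; whether any full instruction-trace instrument appears in that list is exactly the open question, and nothing here grants direct access to the CoreSight ETM.

# List the recording templates and individual instruments that the
# installed Xcode actually provides.
xcrun xctrace list templates
xcrun xctrace list instruments

# Filter for anything trace- or counter-related by name.
xcrun xctrace list instruments | grep -i -e trace -e counter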
Touch screen stops working on shutdown screen in single app mode
We have two iPhones (a 16 Pro on iOS 18.2 and a 16 on iOS 18.5) in Single App Mode, and sometimes we need to shut them down manually. After holding Power and Volume Up, the shutdown screen appears as usual, but the slider does not respond to touch, nor does the rest of the screen. After a force restart using the volume buttons the issue disappears, but it reappears after the next phone restart. If we disable Single App Mode, the issue is gone and the touch screen works every time on the shutdown screen. Both iPhones show the same behavior. Is there any other way to reliably shut down the iPhone locally without using MDM, or a way to fix this issue?
Replies: 2 · Boosts: 0 · Views: 65 · Aug ’25
Capturing the instruction trace from the ARM ETM
According to the ARM documentation for the CPU models available in Apple Silicon, the CoreSight implementation includes an Embedded Trace Macrocell which can perform a complete "Instruction Trace" (https://developer.arm.com/documentation/102119/0200/What-is-trace-). Although other operating systems such as Linux make this easy, we have not been able to find any tools or even a system-level API for accessing this feature of the ETM. In the "Instruments" window of Xcode 16+, there is a "Processor Trace" instrument, but this performs sampling and is totally unrelated to the Instruction Trace we need for debugging and analysis purposes. Because it produces a complete, contiguous sequence of branch instructions, the Instruction Trace is essential for identifying precise execution behaviors that are otherwise invisible to the developer. On other platforms, an alternative is debugger scripting, but we have found far too many bugs and reliability issues with the macOS implementation of lldb. Any suggestions would be greatly appreciated!
Replies: 1 · Boosts: 0 · Views: 146 · 2w
Xcode 26 - Create ML doesn't work
I tried using Create ML from Xcode 26.0 beta 7 to generate a model using the "Word Tagging" template, and I received the error: Training progress unavailable - Unexpected error. Using Create ML from Xcode 16.4 with the same documentation, I was able to build the model and use it in a test app. I'd like to understand why Create ML in Xcode 26 no longer works.
Replies: 0 · Boosts: 0 · Views: 80 · 2w
How to export Allocations report in XML (Call Tree format) with xctrace?
Hello Apple team, I am using xctrace to record an Allocations trace on iOS. For example:

xctrace record --template "Allocations" --launch com.example.myapp --time-limit 30s --output alloc.trace

After recording, I can export the results in Allocations List format (a flat list of allocations) using:

xcrun xctrace export --input ./alloc.trace \
    --xpath '/trace-toc/run/tracks/track[@name="Allocations"]/details/detail[@name="Allocations List"]' \
    --output ./alloc.xml

This works fine and produces XML output. However, what I really need is to export the data in Call Tree format (as shown in the Instruments GUI). I checked xctrace export --help, but it seems that the Allocations template only supports the List view for export, not the Call Tree breakdown.

My questions are:
👉 Is there a way to export an Allocations trace as XML with Call Tree details using xctrace?
👉 If not, is there an API or recommended workflow to automate this instead of exporting manually from the Instruments GUI?

Thanks in advance for your help! (A hedged export sketch follows this entry.)
Replies: 0 · Boosts: 0 · Views: 130 · 2w
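A hedged starting point rather than a confirmed answer: xctrace can print a trace's exportable table of contents, which lists every detail view that --xpath can address. If a call-tree style detail exists for the Allocations track in your trace, it will show up there by name; if it does not appear, the GUI's Call Tree view may simply not be exportable this way.

# Print the exportable table of contents for the recorded trace.
xcrun xctrace export --input ./alloc.trace --toc

# Then point --xpath at whatever detail name the TOC actually lists.
# The detail name below is a placeholder, not a documented value.
xcrun xctrace export --input ./alloc.trace \
    --xpath '/trace-toc/run/tracks/track[@name="Allocations"]/details/detail[@name="<name-from-toc>"]' \
    --output ./alloc_calltree.xml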
On-device hang detection with threshold lower than 250 ms
There is a WWDC session about this useful instrument for detecting hangs on-device: https://developer.apple.com/videos/play/wwdc2022/10082/ I really like this tool; it's handy and gives useful stack traces, but the 250 ms threshold is too high in practice for a testing environment. It would be better if there were a lower option, such as 100 ms. I think most people can perceive a scrolling hang of 100-250 ms, and it would be great to detect such hangs with this instrument. Is there a way to set a lower threshold, or could Apple consider adding one? Thank you!
Replies: 1 · Boosts: 0 · Views: 719 · Sep ’24