I am a developer creating an app for iOS 16, and my app has mysteriously started showing a blank screen in previews and in the simulator. I profiled the app launch with Instruments, and the results showed dlopen running on the main thread for the entire duration of the profiling session (several minutes).
I've discovered what appears to be a system-level memory leak when pressing any key in SwiftUI projects. This issue occurs even in a completely empty SwiftUI project with no custom code or event handlers.
When monitoring with Instruments' Leaks tool, I observe multiple memory leaks each time any key is pressed. These leaks consist primarily of:
NSExtraData objects (240 bytes each)
NSMenuItem objects (112 bytes each)
Other AppKit and Foundation objects
Has anyone else encountered this issue? How can I fix this behavior? While the leaks are small (about 5-6KB per keypress), they could potentially accumulate in applications where keyboard input is frequent.
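For reference, a minimal sketch of the kind of "empty" project described above (the app and view names are just placeholders); pressing any key while this app is frontmost is enough to see the reported leaks in Instruments:

import SwiftUI

// Minimal SwiftUI app with no custom code or event handlers; names are illustrative.
@main
struct EmptyApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Empty window")   // no keyboard handling of any kind
        }
    }
}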
I want to inspect the disassembly of a vImage function using Instruments.
But when I double-click a stack frame in Time Profiler, nothing is shown.
My Mac has an Intel chip and my iPhone is a 14 Pro Max.
The Xcode version is 16.3.
How can I fix this (maybe a missing .dSYM in iOS Device Support)?
I kept CoreLocation’s startUpdatingLocation running for a full day and used Performance trace - PowerProfiler to track the power usage during that time. The trace file was successfully generated on the iOS device, and I later transferred it to my MacBook.
However, when I tried to open the .atrc file, I received the following warning:
The document cannot be imported because of an error: File ‘/Users/jun/Downloads/PowerProfiler_25-06-16_181049_to_25-06-17_091037_001.atrc’ doesn’t contain any events.
Why is this happening? Is there a known issue with PowerProfiler in iOS 26, or am I missing something in the tracing setup?
Note: The .aar file and the extracted .atrc file are not attached here, as forum uploads do not support these formats.
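For context, the tracing setup was essentially just continuous location updates; here is a minimal sketch (the class name and configuration details are illustrative, not the exact production code):

import CoreLocation

// A minimal sketch of the continuous location updates that were running
// while PowerProfiler recorded the trace; names and settings are illustrative.
final class LocationTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.allowsBackgroundLocationUpdates = true  // requires the location background mode
        manager.requestAlwaysAuthorization()
        manager.startUpdatingLocation()                 // kept running for a full day
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Updates arrive continuously for the whole profiling window.
    }
}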
Hello,
I wanted to try the new Bottleneck Analysis mode showcased in a recent Apple video, but when I select the CPU Counters template in Instruments, there is no such option, just the same old "sample by Time/Events".
I have the latest Xcode 16.4 and macOS 15.5 on an M4 Max system. While Instruments shows version 16.0 in its About dialog for some reason (a bug?), it definitely comes from the Xcode 16.4 package and the build id is the same (16F6) as Xcode 16.4. I also checked, just in case, on another M1 system (fully updated as well), and the result is the same.
Any clues why Bottleneck analysis is missing?
Regards,
Maxim
Hello,
I have recently been using the new Power Profiler tool introduced in Xcode 26 to analyze the power consumption of my app. My app primarily operates in the background. During a profiling session of 5 hours and 30 minutes, I observed that the app was active in the background for 2 hours and 30 minutes, while it remained in a suspended state for the remaining 3 hours.
While the Power Profiler allows me to identify spikes in CPU, networking, and other resource usage at specific points, it is difficult to determine whether these values are objectively considered high.
For example, in my case, the total QoS Execution Time of CPU Impact recorded during the 5 hours and 30 minutes was 12.18 seconds. I am wondering whether this is considered a good metric.
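As a rough sanity check (not an Apple-documented threshold), those figures work out to a very small CPU duty cycle:

// Back-of-the-envelope duty cycle using the numbers above; these are the
// measurements from this profiling session, not a recommended target.
let activeSeconds = 2.5 * 3600            // 2 h 30 m active in the background = 9,000 s
let sessionSeconds = 5.5 * 3600           // 5 h 30 m total session = 19,800 s
let cpuSeconds = 12.18                    // total QoS execution time reported by Power Profiler

let dutyCycleWhileActive = cpuSeconds / activeSeconds * 100   // ≈ 0.14 %
let dutyCycleOverSession = cpuSeconds / sessionSeconds * 100  // ≈ 0.06 %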
Could you please advise on the following points?
1. Is there a commonly accepted or recommended ratio between app active time and CPU time that developers should aim for?
2. Are there any guidelines or reference materials on how to interpret CPU usage and other resource metrics for apps that primarily run in the background?
Any insights or advice would be greatly appreciated.
Thank you.
Is there an xctrace instrument capable of capturing the complete control flow of a process? So far the best I can find is high-frequency sampling, but what I need is a trace of all machine instructions executed. This is easily done on Linux/Intel using the perf tool, which provides access to Intel's hardware-assisted tracing facility (Intel PT). According to the Arm specification, my Mac mini M1 (armv8.4-a) and M4 (armv9.2-a) both have hardware support in the CoreSight ETM (Embedded Trace Macrocell) for full instruction tracing (i.e., no sampling, no gaps, no statistics; capturing the complete execution path). But it's not clear how I can access these features, or whether they are supported by the macOS XNU kernel at all. After hours of searching online, it's nothing but dead ends. Any suggestions for documentation, Xcode tools, open-source tools, or built-in macOS tools would be much appreciated!
After updating Xcode from 16.2 to 16.4, running Time Profiler in Instruments launches the trace, but it does not install or load on the connected device, which breaks the functionality. I am unable to debug...
ERROR:
Connection with the remote side was unexpectedly closed : <dictionary: 0x1f3c8b6d0> { count = 1, transaction: 0, voucher = 0x0, contents =
"XPCErrorDescription" => <string: 0x1f3c8b850> { length = 22, contents = "Connection interrupted" }
}
Domain: IXRemoteErrorDomain
Code: 6
User Info: {
DVTErrorCreationDateKey = "2025-08-09 00:47:53 +0000";
}
--
Connection with the remote side was unexpectedly closed : <dictionary: 0x1f3c8b6d0> { count = 1, transaction: 0, voucher = 0x0, contents =
"XPCErrorDescription" => <string: 0x1f3c8b850> { length = 22, contents = "Connection interrupted" }
}
Domain: IXRemoteErrorDomain
Code: 6
iOS 18.5
According to the ARM documentation for the CPU models available in Apple Silicon, the CoreSight implementation includes an Embedded Trace Macrocell which can perform a complete "Instruction Trace" (https://developer.arm.com/documentation/102119/0200/What-is-trace-). Although other operating systems such as Linux make this easy, we have not been able to find any tools or even a system-level API for accessing this feature of the ETM.
In the "Instruments" window of Xcode 16+, there is a "Processor Trace" instrument, but this performs sampling and is totally unrelated to the Instruction Trace we need for debugging and analysis purposes. Because it produces a complete, contiguous sequence of branch instructions, the Instruction Trace is essential for identifying precise execution behaviors that are otherwise invisible to the developer. On other platforms, an alternative is debugger scripting, but we have found far too many bugs and reliability issues with the macOS implementation of lldb.
Any suggestions would be greatly appreciated!
What exactly is included/calculated in the following metrics within RealityKit Trace - RealityKit Metrics - 3D render attributes:
3D Mesh Triangles
Total Triangles Submitted
3D Mesh Vertices
Total Vertices Submitted
As the second part of the question, what differentiates between:
3D Mesh Triangles vs Total Triangles Submitted
3D Mesh Vertices vs Total Vertices Submitted
When I was using Instruments to examine the Display instrument on my phone, I discovered a long-duration frame. Below that frame there were gaps in the VSync lane. As I understand it, VSync signals should appear at a consistent, steady cadence. How can this behavior be explained?
My phone is an iPhone 15 Plus running iOS 18.
First of all, you can't unlock the device, even if you know the password. That's one thing.
Then, you can't use your main device to navigate through the device you are controlling. You basically can't do shit, to be honest. And Siri is in a bad mood these days.
That's what I have to say to you, my friends.
I love you all.
Kisses.
Fix this crap.
And Eye Tracking is buggy too. It doesn't work properly.
Hello, I'm developing an app in pure React Native, which uses CocoaPods under the hood to install dependencies.
With some dependencies I get the following error, but only on a real device (on the simulator everything is fine):
"dyld[53510]: Symbol not found: __ZN5swift39swift51override_conformsToSwiftProtocolEPKNS_14TargetMetadataINS_9InProcessEEEPKNS_24TargetProtocolDescriptorIS1_EEN7__swift9__runtime4llvm9StringRefEPFPKNS_35TargetProtocolConformanceDescriptorIS1_EES4_S8_SC_E
Referenced from: <4A3492BF-0479-3124-BE58-05BAED71BB20> /private/var/containers/Bundle/Application/0D9FDF5C-BBC9-4060-972B-B2D6FD91E321/appName/Frameworks/pathToPod
Expected in: <0549B906-CB15-3735-AA15-FAEB5F687C8B> /private/var/containers/Bundle/Application/0D9FDF5C-BBC9-4060-972B-B2D6FD91E321/appName/Frameworks/pathToPod"
Anyone else having the same problem or have any ideas on how to fix this?
Hi there,
How can I make the Instruments App Launch template wait for the process to launch, rather than have Instruments launch it actively?
I need to profile app launch performance when tapping a push notification or cold launching via a URL link. How can I make Instruments wait for the process and collect the data?
Currently, I have tried this command:
xctrace record --template "App Launch" \
    --attach MyApp \
    --device-name 'Phone-Dev' \
    --output mytrace.trace
But it soon failed with 'Cannot find process matching name: MyApp'. How can I make it work?
Hello,
I’m encountering an issue with the Instruments app while running a benchmark on an M2 Ultra Mac Studio. Despite being certain that GPU activities involving memory read and write operations are occurring, all related performance counters consistently return 0.
Interestingly, this problem does not occur when using the same code on an M1 MacBook Air, where the counters behave as expected.
What could be causing this discrepancy? Any insights or suggestions would be greatly appreciated.
Thank you!
Tags:
Metal
Metal Performance Shaders
metal-cpp
Hey folks,
We are looking for a way to increase the sampling frequency beyond what's currently called "high frequency sampling" for the CPU Profiler (or Time Profiler; it doesn't really matter for us).
We are aware that this is not offered through the UI, but we are wondering if we can somehow experimentally enable it via the .tracetemplate (plist).
Basically, we see that samplingRate exists in the plist, but we don't see it having any effect on the actual runs. The resulting trace file always lists sample-rate-micro-seconds="1000" for the data table, e.g.,
<table trigger="time" pmc-events="Cycles" target-pid="SINGLE" schema="counters-profile" needs-kernel-callstack="0" sample-rate-micro-seconds="1000"/>
Cheers
It looks like, for some reason, our apps are using a bunch of power sometimes. sysdiagnose has this in the power log:
Never mind. Including the output of sysdiagnose gets rejected for containing "sensitive language," and it won't tell me what is sensitive, making this a waste of my time.
ETA: OK, I can attach the file: power.log
I've gone through the energy documentation, but it seems geared towards embedded, not macOS, so I'm not sure how I can figure this out more. The extra problem, of course, is that we have a network extension, two daemons, and a GUI app. 😄
I am using Xcode 16 and macOS 14.7.2. Previously, using Instruments with an iPhone on iOS 14.3 worked fine, but since I upgraded to iOS 18, Instruments often can't find the library.
I have to restart Instruments to restore normal operation, but the problem occurs again after using it for a while.
I am trying to record the requests and responses in a WKWebView, but Instruments does not seem to record them. Is this to be expected?
The webView is set to inspectable, and I am using the HTTP Traffic instrument.
All the requests the app is doing are recorded, but neither the original request for the webView, nor subsequent traffic is recorded.
When I use Safari to inspect the webView, all I see is the last page (even when I start the inspector before the first request is made).
How can I see these requests?
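For reference, here is a minimal sketch of the web view setup being profiled (the variable name and URL are placeholders):

import WebKit

// Web view marked as inspectable, as described above; the URL is a placeholder.
let webView = WKWebView(frame: .zero)
if #available(iOS 16.4, macOS 13.3, *) {
    webView.isInspectable = true   // lets Safari's Web Inspector attach
}
webView.load(URLRequest(url: URL(string: "https://example.com")!))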
I made a box with MDLMesh.newBox(). I added normals.
let mdlMesh = MDLMesh.newBox(withDimensions: SIMD3<Float>(1, 1, 1),
                             segments: SIMD3<UInt32>(2, 2, 2),
                             geometryType: .triangles,
                             inwardNormals: false,
                             allocator: allocator)
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.25)
After I convert it to an MTKMesh, the normals are (0, 0, 0) for a group of vertices. I can only inspect the geometry after I convert to MTKMesh. Is there a way to use the Geometry Viewer on an MDLMesh?
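In the meantime, one workaround is to dump the normals straight from the MDLMesh in code before the conversion; a sketch, assuming the mdlMesh above and reading the normal attribute as packed floats:

import ModelIO
import simd

// Read the normal attribute directly from the MDLMesh, before the MTKMesh
// conversion, to check whether the zeros are introduced by the conversion.
if let normals = mdlMesh.vertexAttributeData(forAttributeNamed: MDLVertexAttributeNormal,
                                             as: .float3) {
    var cursor = normals.dataStart
    for i in 0..<mdlMesh.vertexCount {
        let f = cursor.assumingMemoryBound(to: Float.self)
        let n = SIMD3<Float>(f[0], f[1], f[2])
        print("vertex \(i): normal = \(n)")
        cursor = cursor.advanced(by: normals.stride)
    }
}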