I am running Xcode 16.1, macOS 15.1, and iOS 18.1, and I see the error when trying to run the Instruments Network Profile.
Instruments
Instruments is a performance-analysis and testing tool for iOS, iPadOS, watchOS, tvOS, and macOS apps.
Does the new HTTP Traffic instrument require iOS 15 running on the device? I am attempting to profile my app, and I get an error saying 'This device is lacking a required capability'.
I'm trying to run the "Animation Hitches" instrument, but it fails immediately after starting on the iPhone. This happens in multiple projects, including a brand-new project.
I get the following errors:
Timestamp | Message
(Before Run Started) | Unexpected failure: Couriers have returned unexpectedly.
(Before Run Started) | Failed to start the recording: Failed starting ktrace session.
The issue also happens with the App Launch instrument, but not with any other instrument I tested.
Is this a bug?
Using Xcode 16.1 on an M1 Mac mini running Sequoia 15.1, with the project running on an iPhone 14 Pro with iOS 18.1.
While I was recently profiling some code from a Swift library, I noticed that XCTest added signposts for the measurement tests, which I found really helpful for homing in on the code I wanted to profile while digging around in the stack trace.
I tried to add my own signposts to place a few of my own markers in there; the code compiles and profiles as before, but the signposts just aren't showing up. This is with Xcode 16.1, macOS Sequoia (15.1), and a Swift library, using XCTest and profiling within one of the unit tests.
Is there something in this sequence that prevents the library from setting up signposts and having Instruments collect them?
The flow I'm using:
import os

let subsystem = "MyLibrary"

class MyClass {
    let logger: Logger = .init(subsystem: subsystem, category: "fastloop")
    let signposter: OSSignposter

    init() {
        signposter = OSSignposter(logger: logger)
    }

    func goFast() {
        let signpostId = signposter.makeSignpostID()
        let state = signposter.beginInterval("tick", id: signpostId)
        // ... do a bunch of work here - all synchronous
        signposter.endInterval("tick", state)
    }
}
Is there something I'm doing incorrectly with this API, or something I'm not enabling that would allow those signposts to be collected by the profiler?
I do see the signposts that XCTest injects into the system, just not any of the ones I'm creating.
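One thing worth checking, as an assumption rather than a confirmed diagnosis: the default Time Profiler template records the built-in Points of Interest track, while custom subsystem/category signposts only show up once an os_signpost instrument has been added to the trace document. A minimal sketch routing the same interval through Points of Interest (goFastAnnotated is a hypothetical name):

import os

// Assumption: Points of Interest is visible in the default Time Profiler
// template, which could explain why XCTest's signposts appear while
// custom subsystem/category ones do not.
let poiHandle = OSLog(subsystem: "MyLibrary", category: .pointsOfInterest)
let poiSignposter = OSSignposter(logHandle: poiHandle)

func goFastAnnotated() {
    let signpostId = poiSignposter.makeSignpostID()
    let state = poiSignposter.beginInterval("tick", id: signpostId)
    // ... same synchronous work as goFast() ...
    poiSignposter.endInterval("tick", state)
}

Alternatively, the os_signpost instrument can be added to the trace by hand from the instrument library in Instruments, which should pick up the original subsystem/category pair.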
I've been developing an app for macOS for some time. As I've been approaching the app's final development stages, I decided to try Instruments, as I suspected a memory leak was occurring: my app's memory usage slowly grows over time. Instruments found one leak, and I've spent considerable time trying to find the cause. Long story short, I ended up with just an EmptyView() and Instruments was still showing a leak. I then tried creating a new project with a placeholder "Hello, world!" text, and Instruments was still detecting a leak. Am I doing something wrong here? Maybe I'm not using Instruments correctly? Or is this a bug? My Instruments version is 16.0, on macOS Sequoia 15.1.
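For reference, a minimal sketch of the reduced project described above, assuming nothing beyond the default Xcode SwiftUI app template:

import SwiftUI

// Reduced repro as described: a brand-new SwiftUI app showing only
// placeholder text, against which the Leaks instrument still reports one leak.
@main
struct LeakTestApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Hello, world!")
        }
    }
}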
Hi Apple Engineers,
I am encountering an issue where the memory usage reported by the Xcode memory report and the Xcode Instruments memory profiler are not aligned. Specifically:
Xcode Memory Report:
After wrapping the work in autoreleasepool, reading the zip file with URLSession, and moving the task inside DispatchQueue.global().async, the memory usage goes down from 900 MB to 450 MB, indicating a potential memory leak.
Xcode Instruments Memory Profiler:
The memory usage goes down from 900 MB to 100 MB, suggesting that the memory has been properly released and there is no significant memory leak.
Could you please help me understand the discrepancy between these two tools and provide guidance on the appropriate way to interpret the memory usage in my application? Which result should I rely on?
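For context, here is a minimal sketch of the pattern described above; the function name and the file handling are my own hypothetical simplification:

import Foundation

// Hypothetical simplification of the described setup: read a zip file
// off the main queue, inside an autorelease pool so Foundation's
// temporary objects are released as soon as the pool drains.
func processZip(at url: URL) {
    DispatchQueue.global().async {
        autoreleasepool {
            guard let data = try? Data(contentsOf: url) else { return }
            // ... unpack / parse the archive here ...
            print("Read \(data.count) bytes")
        }
    }
}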
I would greatly appreciate your insights and expertise on this matter. Thank you in advance for your assistance.
Good afternoon!
I am working on a test project that creates a PKPASS file and a server that updates this file on the device and sends push notifications. The APNs service is used with a JWT bearer token. The server is written in Java Spring Boot.
When we send a push notification, the pkpass update happens and APNs responds with 200 OK, but no notification appears. Can you please tell me why this is happening?
I am also sending you a sample request:
curl -v -X POST \
  -H "apns-push-type: alert" \
  -H "apns-id: 5af474c5-a212-42f3-9f99-70a9e587e1e2" \
  -H "apns-topic: pas.example225" \
  -H "apns-expiration: 0" \
  -H "authorization: bearer eyJraWQiOiJGQ0pQVFFMV1ZNIiwiYWxnIjoiRVMyNTYifQ.eyJpc3MiOiI2N0ZGUTY1TEQzIiwiaWF0IjoxNzMwMzY3NzQ4fQ.FGXSLCR6mxkQyi7bNliZKZbVdN3m0xQzFSMUDRFU4aAYIgsgflk5MDEkS9k5riMHp10wpr80b20uq9cuPnoQqw" \
  --data '{"aps":{"alert":{"title":"title","subtitle":"subtitle","body":"body"}}}' \
P.S. I checked the JWT at https://icloud.developer.apple.com and it was built correctly.
Thank you for your reply!
Hello, I develop an app using pure React Native, which under the hood uses pods to install dependencies.
With some of the dependencies I get the following error, but only on a real device (on a simulator everything is fine):
"dyld[53510]: Symbol not found: __ZN5swift39swift51override_conformsToSwiftProtocolEPKNS_14TargetMetadataINS_9InProcessEEEPKNS_24TargetProtocolDescriptorIS1_EEN7__swift9__runtime4llvm9StringRefEPFPKNS_35TargetProtocolConformanceDescriptorIS1_EES4_S8_SC_E
Referenced from: <4A3492BF-0479-3124-BE58-05BAED71BB20> /private/var/containers/Bundle/Application/0D9FDF5C-BBC9-4060-972B-B2D6FD91E321/appName/Frameworks/pathToPod
Expected in: <0549B906-CB15-3735-AA15-FAEB5F687C8B> /private/var/containers/Bundle/Application/0D9FDF5C-BBC9-4060-972B-B2D6FD91E321/appName/Frameworks/pathToPod"
Anyone else having the same problem or have any ideas on how to fix this?
Hello,
I've upgraded both of my Apple TVs to tvOS 18. Since then, my app developed with SwiftUI has become almost unusable due to severe lag, particularly when scrolling in a LazyVStack. On the A1625 (Apple TV HD), the lag can last up to 20 seconds, while on the A2843 (Apple TV 4K, 3rd generation, Wi-Fi + Ethernet), it’s about one second.
I can consistently reproduce the issue with this minimal example:
@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ScrollView {
                LazyVStack {
                    ForEach(0..<1000) { nb in
                        Button("Item \(nb)") {}
                    }
                }
            }
        }
    }
}
Using Instruments, I found that the hang is related to this call:
389.00 ms 71,4 % 6.00 ms +[_UIFocusRegionEvaluator __regionsByEvaluatingOcclusionsForBaseRegions:occludingRegions:baseRegionsCanOccludeEachOther:inSnapshot:]
Unfortunately, I can't attach the Instruments trace directly here, but you can download it from this link: https://drive.google.com/file/d/1sEIwXhr7_ajjRHZevCIW6jNOlPjaeU6L/view?usp=sharing
Important notes:
The same screen, when written in UIKit, runs smoothly on both devices.
After performing a factory reset on the older device, the performance issue disappeared. However, as you can imagine, I’m already receiving complaints from users who are understandably unwilling to reset their devices.
Does anyone know of a workaround until this is addressed by Apple?
First of all, you can't unlock the device, even if you know the password. That's one thing.
Then, you can't use your main device to navigate through the device you are controlling. You basically can't do shit, to be honest. And Siri is in a bad mood these days.
That's what I have to say to you, my friends.
I love you all.
Kisses.
Fix this damn thing.
And "Eye Tracking" is bugged too. It doesn't work properly.
When I was using Instruments to test the Display on my phone, I discovered a long-duration frame. Below that frame, there were some gaps in the vsync queue. As I understand it, vsync signals should appear consistently and steadily. How can this behavior be explained?
My phone is an iPhone 15 Plus running iOS 18.
Recently, we reworked a crucial part of our app and managed to halve the number of CPU cycles our app requires (according to Xcode Instruments).
Nonetheless, the Time Profiler instrument shows that the CPU time spent is either higher or the same (depending on the execution).
The main time-consuming factor here is libsystem_pthread.dylib: the CPU time spent in this library has doubled from the original implementation to the reworked one.
This leaves me with a few questions:
How should I interpret this result?
How is this even possible if the CPU clock cycles were halved?
Which is the better metric here: CPU cycles or Time Profiler time?
How can I reduce the impact of said library? What does this library do, and how can I influence its performance?
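As a hypothetical illustration of one way cycles and profiled time can diverge (not necessarily what is happening in our app): a thread blocked in a pthread-level wait accrues wall-clock time without consuming CPU cycles, so the two metrics can move independently.

import Foundation

// Hypothetical illustration, not our production code: the waiting thread
// below spends about a second parked in libsystem_pthread/libdispatch
// wait primitives while consuming almost no CPU cycles.
let semaphore = DispatchSemaphore(value: 0)

DispatchQueue.global().async {
    Thread.sleep(forTimeInterval: 1)  // negligible CPU work
    semaphore.signal()
}

semaphore.wait()  // blocks: wall time elapses, cycles stay low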
Thanks in advance.
What exactly is included/calculated in the following metrics within RealityKit Trace - RealityKit Metrics - 3D render attributes:
3D Mesh Triangles
Total Triangles Submitted
3D Mesh Vertices
Total Vertices Submitted
As the second part of the question, what is the difference between:
3D Mesh Triangles vs Total Triangles Submitted
3D Mesh Vertices vs Total Vertices Submitted
Trying to examine performance issues in Xcode Instruments using the Animation Hitches instrument in Xcode 16.0 beta 6 (16A5230g).
When I'm connected to my iPhone 15 Pro Max and try to start a run with my app, I get the error “Failed to split user provided arguments: working directory doesn't exist” with timestamp “(Before Run Started)”. When running the app on an iOS simulator, the instrument runs fine, but I want to profile on a real device.
In Instruments > Settings, the Recording Location is set to Default, and that directory does exist.
Can somebody help me find the official documentation for Instruments?
Google brings you to this forum; the link at the top of this forum for Instruments documentation brings you to the Xcode page, which links to these Apple developer docs, where a search returns nothing for "Instruments". Searching the Xcode docs for "Instruments" returns only one specific use case on analyzing HTTP traffic.
Hi there,
In a project that I am working on, whenever I run the Allocations instrument to see the memory allocations happening under the hood, I see the statistics and the traces updating; however, the chart never updates.
I have created new projects on this machine, and they all show the chart just fine. I have also tried running the project on other machines, with no success. I have double-checked the arguments and options on the active scheme I am trying to profile against the scheme of a new project, and they are identical.
Here is a picture of how it looks:
My questions are as follows:
What properties and settings can prevent the chart from showing up?
What diagnostic steps would you recommend I take?
I cannot share a reproducible project, as the only project with this problem is not mine, but please tell me if there is anything else I can provide to help debug this.
All the best
Parsa
I cannot connect my AirPods (3rd gen) to my phone; it won't play videos or music when connected. The screen bugs out sometimes and doesn't register when I swipe. Too many bugs.
There is a WWDC session about this cool instrument to detect hangs on-device:
https://developer.apple.com/videos/play/wwdc2022/10082/
I really like this tool; it's handy and gives useful stack traces, but the threshold of 250 ms is too high in practice for a testing environment. It would be better if there were an option like 100 ms or so.
I think most people can see a scrolling hang with a duration of 100-250 ms, and it would be great to detect such hangs with this instrument.
Is there a way to set a lower threshold, or can Apple consider adding one?
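In the meantime, a rough workaround sketch (entirely hypothetical code of my own, not part of the hang-detection instrument): a main-thread watchdog with a configurable threshold, for test builds only.

import Foundation

// Hypothetical test-build watchdog: pings the main queue periodically and
// reports whenever servicing a ping took longer than the chosen threshold.
final class HangWatchdog {
    private let threshold: TimeInterval
    private let watchdogQueue = DispatchQueue(label: "hang.watchdog")

    init(threshold: TimeInterval = 0.1) {
        self.threshold = threshold
        ping()
    }

    private func ping() {
        let sent = Date()
        let threshold = self.threshold
        // Measure how long the main queue takes to service this block.
        DispatchQueue.main.async {
            let latency = Date().timeIntervalSince(sent)
            if latency > threshold {
                print("Possible hang: main thread busy for ~\(Int(latency * 1000)) ms")
            }
        }
        // Re-arm from a background queue so probing continues even while
        // the main thread is blocked.
        watchdogQueue.asyncAfter(deadline: .now() + 0.05) { [weak self] in
            self?.ping()
        }
    }
}

To use it, keep a strong reference alive in a debug-only code path, e.g. let watchdog = HangWatchdog(threshold: 0.1). It can't capture stack traces the way the instrument does, but it does flag sub-250 ms blockages.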
Thank you!
Hello,
I'm trying to use the itmstransporter installer (iTMSTransporter_installer_linux_aarch64_3.3.0.8.sh) that I downloaded from my Mac using the following command:
iTMSTransporter -m downloadInstaller -arch aarch64 -os Linux -destination ~/Downloads
When I move it to the Linux machine and run it, I'm shown the terms to be approved; I type Yes and press Return, then it asks "Continue? [yes or no]", and again I type yes and press Return, but the command immediately quits without any output.
Any idea what's happening?
Hi everyone,
I'm developing a visionOS app using SwiftUI and RealityKit, and I'm encountering an issue with the pivot points of primitive shapes created in Reality Composer Pro.
Scenario:
When I use Reality Composer Pro within Xcode to add primitive shapes (such as cubes, capsules, etc.) to my scene, the pivot points for these objects seem to be set incorrectly. The pivot is located far from the actual object, which affects transformations and positioning.
Question:
Is there a way to correct or adjust the pivot point for primitive shapes created in Reality Composer Pro?
Additional Information:
I’ve attached a screenshot illustrating the issue with the pivot point being misaligned. Any guidance on how to resolve this would be greatly appreciated.
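In case it's useful while waiting for guidance, one workaround that may be worth trying (a sketch assuming the offset can be neutralized by reparenting; recenterPivot is a hypothetical helper of mine):

import RealityKit

// Hypothetical helper: wrap the entity in a fresh parent and shift the
// child so the center of its visual bounds sits at the parent's origin;
// the parent can then be transformed as if the pivot were centered.
// Assumes the entity still has its freshly loaded (identity) transform.
func recenterPivot(of entity: Entity) -> Entity {
    let pivotParent = Entity()
    let bounds = entity.visualBounds(relativeTo: nil)
    entity.position -= bounds.center
    pivotParent.addChild(entity)
    return pivotParent
}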
Thanks in advance for your help!
Best,
Siddharth