I am trying to debug a crash due to uncaught exception 'NSInvalidArgumentException', reason: '-[__NSFrozenDictionary layoutSublayers]: unrecognized selector sent to instance'.
I see 2 things that I find interesting about it.
The fact that the instance is a __NSFrozenDictionary tells me that something still holds a reference to a CALayer that has since been deallocated, with its memory reused for a dictionary.
The call to layoutSublayers tells me that the CALayer was dealloc-ed at some point between the call to setNeedsLayout (or layoutIfNeeded) and the actual layout pass.
This seemingly occurs as part of a call to -[UITableView reloadData].
Furthermore, each cell created by the UITableView has a UIStackView.
As part of the call to cellForRowAtIndexPath the code adds an instance of "custom view" to the stack view's subviews.
As part of the call to prepareForReuse the code removes the "custom view" from the stack view's subviews.
Therefore, as part of prepareForReuse, the "custom view" (and its layer) is deallocated.
My theory is that the table view does a layout pass on a visible cell whose subview has since been removed, which causes the crash.
My question is: what are the constraints on when/where to call reloadData, and when/where you should definitely avoid it, as it relates to this context?
This code modifies the cell's view hierarchy as part of its lifecycle, which AFAIK is "not supported", since prepareForReuse is meant for resetting view state and cellForRowAtIndexPath for resetting content.
In that sense, another question: are you not allowed to modify the cell view hierarchy at all as part of the cycle that draws the visible cells, or is it more a case of "do not call reloadData"?
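For what it's worth, the pattern usually recommended for this situation can be sketched as follows (a simplification, and the cell and view names here are hypothetical, not from the post): install the subview once when the cell is created, and only reset its state on reuse, so a reused cell's layer tree is never torn down mid-layout.

```swift
import UIKit

// Sketch: add the subview exactly once, toggle/configure it per row,
// and never remove it in prepareForReuse.
final class MyCell: UITableViewCell {
    private let customView = UIView() // stand-in for the post's "custom view"

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        // Added exactly once; never removed during the cell's lifetime.
        contentView.addSubview(customView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func prepareForReuse() {
        super.prepareForReuse()
        // Reset state only; hide rather than remove, so the layer survives.
        customView.isHidden = true
    }

    func configure() {
        customView.isHidden = false
        // ...populate customView for this row...
    }
}
```

Whether this sidesteps the crash in your exact setup is an open question, but it keeps both lifecycle methods doing what they are documented to do.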
Core Animation
Render, compose, and animate visual elements using Core Animation.
In visionOS, the tab bar of a TabView sits outside the window by default.
If I switch from a page without a TabView to a page that needs a TabView in my program, the tab bar suddenly appears on the left side of the screen without any animation. I would like it to animate in (for example easeIn or move). I tried adding animation-related modifiers such as .animation under the TabView, but the tab bar itself does not animate; only the view inside the tab animates, which is not what I want. What I want is for the tab bar outside the window to animate. What should I do?
In the WWDC24 session video 'Enhance your UI animations and transitions', Apple shows these new animation methods for UIKit:
switch gesture.state {
case .changed:
    UIView.animate(.interactiveSpring) {
        bead.center = gesture.translation
    }
case .ended:
    UIView.animate(spring) {
        bead.center = endOfBracelet
    }
}
As of iOS 18 beta 2, I get an error for `UIView.animate(.interactiveSpring)`.
Are these new methods not available yet?
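One thing worth checking while the new API settles: since `UIView.animate` taking a SwiftUI `Animation` is iOS 18-only, a deployment target below 18 will surface it as an error unless it is gated. A minimal sketch of such a guard (the `bead`/`endOfBracelet` names are carried over from the session snippet; the fallback spring parameters are my own guesses, not Apple's):

```swift
import UIKit

func finishDrag(bead: UIView, endOfBracelet: CGPoint) {
    if #available(iOS 18.0, *) {
        // New iOS 18 API: UIView.animate taking a SwiftUI Animation.
        UIView.animate(.spring) {
            bead.center = endOfBracelet
        }
    } else {
        // Pre-iOS 18 fallback using the classic spring animation API.
        UIView.animate(withDuration: 0.4,
                       delay: 0,
                       usingSpringWithDamping: 0.8,
                       initialSpringVelocity: 0,
                       options: [.allowUserInteraction]) {
            bead.center = endOfBracelet
        }
    }
}
```

If the error appears even with the target set to 18.0, it may simply be a beta SDK gap.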
We want to animate images using the animationImages property of UIImageView in our app.
There are 45 HEIF images with dimensions of 1668x2388.
Below is the code I am using to animate the images.
let imgViewAnimation = UIImageView()
imgViewAnimation.animationImages = images
imgViewAnimation.animationDuration = 5.0
imgViewAnimation.animationRepeatCount = 1
imgViewAnimation.startAnimating()

Timer.scheduledTimer(withTimeInterval: 5.0 + 0.5, repeats: false) { _ in
    DispatchQueue.main.async {
        self.showAlert()
    }
}
The issue I am facing is that it takes more than 5.0 seconds (animationDuration) to display all the images, so the alert is shown before all the images have been shown.
We face this issue only for a few sets of images; for other sets it works fine.
Is this issue due to the HEIF images used, or due to the memory and CPU load of decoding a large number of large images?
When I use CAEmitterLayer on screen, it does not seem to affect performance no matter how many layers and particles I use; I always see CPU usage at zero.
Is it possible to know how CAEmitterLayer affects performance, or even to measure it?
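Part of the answer is that CAEmitterLayer is rendered by the render server (backboardd), out of your app's process, so your app's CPU gauge will read near zero regardless. One rough in-app proxy, sketched below under that assumption, is to watch frame pacing with CADisplayLink: if the emitter strains the render server, delivered frame intervals stretch. This is a probe, not a profiler.

```swift
import UIKit

/// Rough frame-pacing probe: logs frames that arrive noticeably late,
/// which is one visible symptom of render-server load.
final class FrameRateProbe {
    private var link: CADisplayLink?
    private var lastTimestamp: CFTimeInterval = 0

    func start() {
        link = CADisplayLink(target: self, selector: #selector(tick(_:)))
        link?.add(to: .main, forMode: .common)
    }

    @objc private func tick(_ link: CADisplayLink) {
        if lastTimestamp > 0 {
            let frameTime = link.timestamp - lastTimestamp
            // Flag anything well past a 60 Hz frame budget.
            if frameTime > 1.5 / 60.0 {
                print("Slow frame: \(frameTime * 1000) ms")
            }
        }
        lastTimestamp = link.timestamp
    }

    func stop() { link?.invalidate() }
}
```

For real numbers, the Core Animation template and Metal System Trace in Instruments measure the render server and GPU directly, which is where the emitter's cost actually lands.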
Repo is here
Visual Description is here:
Basically the red view should never be inside the blue view. The red view has a fixed aspect ratio. The blue view initially has an aspect ratio different from the red view and is animated to have the same aspect ratio.
The red view is constrained inside the blue view as follows:
let ac = brokenView.widthAnchor.constraint(equalTo: brokenView.heightAnchor, multiplier: 9/16)
let xc = brokenView.centerXAnchor.constraint(equalTo: animView.centerXAnchor)
let yc = brokenView.centerYAnchor.constraint(equalTo: animView.centerYAnchor)
let widthC = brokenView.widthAnchor.constraint(equalTo: animView.widthAnchor)
widthC.priority = .defaultLow
let gewc = brokenView.widthAnchor.constraint(greaterThanOrEqualTo: animView.widthAnchor)
let geHC = brokenView.heightAnchor.constraint(greaterThanOrEqualTo: animView.heightAnchor)
geHC.priority = .required
So the red view should start at equal width, but prioritize always being taller and wider than the blue view while staying centered and keeping a fixed aspect ratio. As you can see in the visual description, it is not prioritizing being taller.
Any suggestions on how to fix, work around, or otherwise get past this would be appreciated. I'm really trying to avoid manually doing a spring animation with keyframes and a pile of math.
I have filed a bug in Feedback Assistant, but figured someone here might have experience/know-how. Thanks.
Why do I get this error almost immediately on starting my rendering pass?
2024-05-29 20:02:22.744035-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
2024-05-29 20:02:22.744455-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
2024-05-29 20:05:54.079981-0500 RoomPlanExampleApp[491:10025] [CAMetalLayer nextDrawable] returning nil because allocation failed.
2024-05-29 20:05:54.080144-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
With the introduction of the new matchedTransitionSource from iOS 18, we can apply a zoom transition in a navigation view using .navigationTransition(.zoom). This works well for zoom animations.
However, when I try to apply a matched geometry effect to views that are similar in the source and destination views, the zoom transition works, but those views don't transition seamlessly as they do with a matched geometry effect.
Is it possible to still use matched geometry for subviews of the source and destination views along with the new navigationTransition?
Here’s a little demo that reproduces this behaviour:
struct ContentView: View {
    let colors: [[Color]] = [
        [.red, .blue, .green],
        [.yellow, .purple, .brown],
        [.cyan, .gray]
    ]

    @Namespace() var namespace

    var body: some View {
        NavigationStack {
            Grid(horizontalSpacing: 50, verticalSpacing: 50) {
                ForEach(colors, id: \.hashValue) { rowColors in
                    GridRow {
                        ForEach(rowColors, id: \.self) { color in
                            NavigationLink {
                                DetailView(color: color, namespace: namespace)
                                    .navigationTransition(
                                        .zoom(
                                            sourceID: color,
                                            in: namespace
                                        )
                                    )
                                    .edgesIgnoringSafeArea(.all)
                            } label: {
                                ZStack {
                                    RoundedRectangle(cornerRadius: 5)
                                        .foregroundStyle(color)
                                        .frame(width: 48, height: 48)
                                    Image(systemName: "star.fill")
                                        .foregroundStyle(Material.bar)
                                        .matchedGeometryEffect(id: color,
                                                               in: namespace,
                                                               properties: .frame, isSource: false)
                                }
                            }
                            .matchedTransitionSource(id: color, in: namespace)
                        }
                    }
                }
            }
        }
    }
}

struct DetailView: View {
    var color: Color
    let namespace: Namespace.ID

    var body: some View {
        ZStack {
            color
            Image(systemName: "star.fill")
                .resizable()
                .foregroundStyle(Material.bar)
                .matchedGeometryEffect(id: color,
                                       in: namespace,
                                       properties: .frame, isSource: false)
                .frame(width: 100, height: 100)
        }
        .navigationBarHidden(false)
    }
}

#Preview {
    ContentView()
}
I've got an iOS app that is using MetalKit to display raw video frames coming in from a network source. I read the pixel data in the packets into a single MTLTexture rows at a time, which is drawn into an MTKView each time a frame has been completely sent over the network. The app works, but only for several seconds (a seemingly random duration), before the MTKView seemingly freezes (while packets are still being received).
Watching the debugger while my app was running revealed that the freezing of the display happened when there was a large spike in memory. Seeing the memory profile in Instruments revealed that the spike was related to a rapid creation of many IOSurfaces and IOAccelerators. Profiling CPU Usage shows that CAMetalLayerPrivateNextDrawableLocked is what happens during this rapid creation of surfaces. What does this function do?
Being a complete newbie to iOS programming as a whole, I wonder if this issue comes from a misuse of the MetalKit library. Below is the code that I'm using to render the video frames themselves:
class MTKViewController: UIViewController, MTKViewDelegate {
    /// Metal texture to be drawn whenever the view controller is asked to render its view.
    private var metalView: MTKView!
    private var device = MTLCreateSystemDefaultDevice()
    private var commandQueue: MTLCommandQueue?
    private var renderPipelineState: MTLRenderPipelineState?
    private var texture: MTLTexture?
    private var networkListener: NetworkListener!
    private var textureGenerator: TextureGenerator!

    override public func loadView() {
        super.loadView()
        assert(device != nil, "Failed creating a default system Metal device. Please, make sure Metal is available on your hardware.")
        initializeMetalView()
        initializeRenderPipelineState()

        networkListener = NetworkListener()
        textureGenerator = TextureGenerator(width: streamWidth, height: streamHeight, bytesPerPixel: 4, rowsPerPacket: 8, device: device!)
        networkListener.start(port: NWEndpoint.Port(8080))
        networkListener.dataRecievedCallback = { data in
            self.textureGenerator.process(data: data)
        }
        textureGenerator.onTextureBuiltCallback = { texture in
            self.texture = texture
            self.draw(in: self.metalView)
        }
        commandQueue = device?.makeCommandQueue()
    }

    public func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        /// need implement?
    }

    public func draw(in view: MTKView) {
        guard
            let texture = texture,
            let _ = device
        else { return }

        let commandBuffer = commandQueue!.makeCommandBuffer()!

        guard
            let currentRenderPassDescriptor = metalView.currentRenderPassDescriptor,
            let currentDrawable = metalView.currentDrawable,
            let renderPipelineState = renderPipelineState
        else { return }

        currentRenderPassDescriptor.renderTargetWidth = streamWidth
        currentRenderPassDescriptor.renderTargetHeight = streamHeight

        let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: currentRenderPassDescriptor)!
        encoder.pushDebugGroup("RenderFrame")
        encoder.setRenderPipelineState(renderPipelineState)
        encoder.setFragmentTexture(texture, index: 0)
        encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4, instanceCount: 1)
        encoder.popDebugGroup()
        encoder.endEncoding()

        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }

    private func initializeMetalView() {
        metalView = MTKView(frame: CGRect(x: 0, y: 0, width: streamWidth, height: streamWidth), device: device)
        metalView.delegate = self
        metalView.framebufferOnly = true
        metalView.colorPixelFormat = .bgra8Unorm
        metalView.contentScaleFactor = UIScreen.main.scale
        metalView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.insertSubview(metalView, at: 0)
    }

    /// initializes render pipeline state with a default vertex function mapping texture to the view's frame and a simple fragment function returning texture pixel's value.
    private func initializeRenderPipelineState() {
        guard let device = device, let library = device.makeDefaultLibrary() else {
            return
        }

        let pipelineDescriptor = MTLRenderPipelineDescriptor()
        pipelineDescriptor.rasterSampleCount = 1
        pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
        pipelineDescriptor.depthAttachmentPixelFormat = .invalid

        /// Vertex function to map the texture to the view controller's view
        pipelineDescriptor.vertexFunction = library.makeFunction(name: "mapTexture")
        /// Fragment function to display texture's pixels in the area bounded by vertices of `mapTexture` shader
        pipelineDescriptor.fragmentFunction = library.makeFunction(name: "displayTexture")

        do {
            renderPipelineState = try device.makeRenderPipelineState(descriptor: pipelineDescriptor)
        }
        catch {
            assertionFailure("Failed creating a render state pipeline. Can't render the texture without one.")
            return
        }
    }
}
My question is simply: what gives?
Our app in the field is facing significant crashes at CA::Transaction::commit() from QuartzCore.
The crash report does not point to any code in our app and the information available is very limited.
It's a EXC_BAD_ACCESS KERN_INVALID_ADDRESS crash.
We have tried using sanitisers and zombies but unable to reproduce the crash locally.
Can someone help explain this and point us in the right direction?
Below are the crash details.
Full crash report
Distributor ID: com.apple.AppStore
Hardware Model: iPhone15,3
AppStoreTools: 15F31e
AppVariant: 1:iPhone15,3:16
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Date/Time: 2024-06-03 20:41:59.3315 +0530
Launch Time: 2024-06-03 19:15:20.6051 +0530
OS Version: iPhone OS 17.4.1 (21E236)
Release Type: User
Baseband Version: 2.51.04
Report Version: 104
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000110
Exception Codes: 0x0000000000000001, 0x0000000000000110
VM Region Info: 0x110 is not in any region. Bytes before following region: 4372119280
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
--->
__TEXT 104994000-105420000 [ 10.5M] r-x/r-x SM=COW /var/containers/Bundle/Application/
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler [35195]
Triggered by Thread: 0
Kernel Triage:
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
Thread 0 name:
Thread 0 Crashed:
0 QuartzCore 0x000000018f4633b4 CA::Transaction::commit() + 1152 (CATransactionInternal.mm:480)
1 QuartzCore 0x000000018f4633ec CA::Transaction::commit() + 1208 (CATransactionInternal.mm:480)
2 QuartzCore 0x000000018f462e64 CA::Transaction::flush_as_runloop_observer(bool) + 88 (CATransactionInternal.mm:942)
3 UIKitCore 0x00000001900b1260 _UIApplicationFlushCATransaction + 52 (UIApplication.m:3160)
4 UIKitCore 0x00000001900b0d78 _UIUpdateSequenceRun + 84 (_UIUpdateSequence.mm:119)
5 UIKitCore 0x00000001900b0468 schedulerStepScheduledMainSection + 144 (_UIUpdateScheduler.m:1037)
6 UIKitCore 0x00000001900b0524 runloopSourceCallback + 92 (_UIUpdateScheduler.m:1186)
7 CoreFoundation 0x000000018ddc162c __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 28 (CFRunLoop.c:1957)
8 CoreFoundation 0x000000018ddc08a8 __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2001)
9 CoreFoundation 0x000000018ddbf0b8 __CFRunLoopDoSources0 + 340 (CFRunLoop.c:2046)
10 CoreFoundation 0x000000018ddbdd88 __CFRunLoopRun + 828 (CFRunLoop.c:2955)
11 CoreFoundation 0x000000018ddbd968 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
12 GraphicsServices 0x00000001d20b34e0 GSEventRunModal + 164 (GSEvent.c:2196)
13 UIKitCore 0x0000000190230edc -[UIApplication _run] + 888 (UIApplication.m:3692)
14 UIKitCore 0x0000000190230518 UIApplicationMain + 340 (UIApplication.m:5282)
15 AppName 0x000000010542072c main + 64 (AppDelegate.swift:13)
16 dyld 0x00000001b12ded84 start + 2240 (dyldMain.cpp:1298)
Recently I tried to apply a custom transition to a custom contextMenu.
However, I want to make sure that during the transition process (which is not over yet), my contextMenu elements such as buttons can be tapped.
But I tried a lot of things without success. I know you have a lot of experience, so I would like to ask you about how to implement the transition and be able to interact before it is over.
I know that a UIView can be tapped during animation, but I haven't tried the button in a UIView.
I've been trying to transition ViewControllers. For example, in a transition from fromViewController to toViewController, I wanted to be able to tap on a tableView in toViewController during the transition, but I was frustrated and found it very difficult to implement. I would like to ask you about the possibilities of interaction during the Viewcontrollers transition.
(PS: In Github Issues, I uploaded a GIF example of the plus button on the left of the input box in iMessage. After tapping the plus button, you can tap the "Apple Cash" button before the transition is finished.)
Your advice would be incredibly valuable to me. Thank you in advance for your time and assistance.
GitHub link: https://github.com/Juhnkerg/DemoForInteractionDuringTransition
I've caught a very strange problem (that ONLY OCCURS ON REAL DEVICES) involving @Namespace and matchedGeometryEffect with NavigationStack. If I go from the first screen to a second screen that contains a view using matchedGeometryEffect, the animation starts to work very badly (ONLY ON A REAL DEVICE); it feels like the frame rate drops to a minimum, but in the Simulator and previews everything works fine.
However, if I use the screen without NavigationStack, there is no such animation problem. What can this be related to? And how can it be fixed?
It only takes a couple of lines of code to reproduce the animation problem, but it took all day to figure out what the problem is.
FirstView
struct NameSpaceTest2Navigation: View {
    @State private var nameSpaceTest2Path: [String] = []

    var body: some View {
        NavigationStack {
            Button(action: {
                nameSpaceTest2Path.append("nameSpaceTest2")
            }, label: {
                Text("Button")
            })
            .navigationDestination(for: String.self) { path in
                NameSpaceTest2()
            }
        }
    }
}
Second View
struct NameSpaceTest2: View {
    @State private var selection: Int = 0

    var body: some View {
        VStack {
            TabView(selection: $selection) {
                ForEach(0..<5, id: \.self) { _ in
                    Color.white
                }
            }
            .tabViewStyle(.page(indexDisplayMode: .never))
        }
        .overlay(alignment: .top) {
            NameSpaceTest2Header(selection: $selection)
        }
    }
}
Third View
struct NameSpaceTest2Header: View {
    @Binding var selection: Int
    @Namespace var sectionUnderline

    var body: some View {
        ScrollViewReader { scrollReader in //START: ScrollViewReader
            ScrollView(.horizontal, showsIndicators: false) { //START: ScrollView
                HStack(spacing: 0) { //START: HStack
                    ForEach(0..<5, id: \.self) { index in
                        VStack { //START: VStack
                            Text("Name: \(index)")
                                .foregroundStyle(Color.green)
                                .padding(.horizontal, 24)
                                .padding(.vertical, 15)
                                .overlay(alignment: .bottom) {
                                    if selection == index {
                                        Rectangle()
                                            .fill(Color.red)
                                            .frame(height: 5)
                                            .matchedGeometryEffect(id: "sectionUnderline", in: sectionUnderline, properties: .frame)
                                            .transition(.scale(scale: 1))
                                    }
                                }
                                .animation(.smooth, value: selection)
                        } //END: VStack
                        .onTapGesture {
                            withAnimation(.smooth) {
                                scrollReader.scrollTo(index)
                                selection = index
                            }
                        }
                        .tag(index)
                    }
                } //END: HStack
            } //END: ScrollView
            .onChange(of: selection, perform: { value in
                withAnimation(.smooth) {
                    scrollReader.scrollTo(value, anchor: .center)
                }
            })
        } //END: ScrollViewReader
    }
}
Try this code on a real device with NavigationStack and you will see the animation bug: it stutters as if running at 5-10 frames per second.
Then run the second view without the NavigationStack and you will get the smooth animation it should have.
What could be the problem? How do I get the smooth animation back?
Once again, you should only test on a real device.
I've got a full-screen animation of a bunch of circles filled with gradients, with plenty of (careless) overdraw, plus real-time audio processing driving the animation, plus the overhead of SwiftUI's dependency analysis, and that app uses less energy (on iPhone 13) than the Xcode "Metal Game" template which is a rotating textured cube (a trivial GPU workload). Why is that? How can I investigate further?
Does CoreAnimation have access to a compositor fast-path that a Metal app cannot access?
Maybe another data point: when I do the same circles animation using SwiftUI's Canvas, the energy use is "Very High" and GPU utilization is also quite high. Eventually the phone's thermal state goes "Serious" and I get a message on the device that "Charging will resume when iPhone returns to normal temperature".
With the latest Xcode running on macOS 14.5 Developer Beta, log messages carry a time and date, plus some other fields of indeterminate origin/type:
"2024-05-06 15:37:32.383996-0500 RoomPlanExampleApp[24190:1708576] [CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."
Specifically, I need to know how the string [24190:1708576] relates to a location in my application so I can act on the message. I certainly can't find the text "[CAMetalLayerDrawable texture]" anywhere in the user documentation or the developer documentation. For a diagnostic message to be actionable and remedied by a user, it must identify the module and source line of the initiating code, and there must be accessible documentation explaining potential remedies. This interface fails to supply enough information to diagnose the problem. The label [CAMetalLayerDrawable texture] cannot even be found in a search of the package information attached to the Xcode release paired with the iOS and macOS system releases.
I am implementing pan and zoom features for an app using a custom USB camera device, in iPadOS. I am using an update function (shown below) to apply transforms for scale and translation, but they are not working. By re-enabling the animation I can see that the scale transform seems to initially take effect, but then the image animates back to its original scale. This all happens in a fraction of a second, but I can see it. The translation transform seems to have no effect at all. Printing out the value of AVCaptureVideoPreviewLayer.transform before and after does show that my values have been applied.
private func updateTransform() {
#if false
    // Disable default animation.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }
#endif

    // Apply the transform.
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
    let transform = CATransform3DIdentity
    let translate = CATransform3DTranslate(transform, translationX, translationY, 0)
    let scale = CATransform3DScale(transform, scale, scale, 1)
    videoPreviewLayer.transform = CATransform3DConcat(translate, scale)
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
}
My question is this, how can I properly implement pan/zoom for an AVCaptureVideoPreviewLayer? Or even better, if you see a problem with my current approach or understand why the transforms I am applying do not work, please share that information.
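One pattern I've seen suggested for this kind of "transform snaps back" symptom: the snap-back looks like an implicit layer animation plus a later layout pass resetting the layer, so transform the hosting view rather than the preview layer itself, with implicit actions disabled. This is only a sketch under assumptions: `previewContainer` is a hypothetical UIView hosting the preview layer (not from the post), and `translationX`/`translationY`/`scale` are the post's own properties.

```swift
import UIKit

// Sketch: apply pan/zoom to a container view hosting the preview layer,
// so UIKit layout passes don't fight a transform set directly on the layer.
private func updateTransform() {
    // Suppress the implicit layer animation so the change applies instantly.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }

    previewContainer.transform = CGAffineTransform.identity
        .translatedBy(x: translationX, y: translationY)
        .scaledBy(x: scale, y: scale)
}
```

If the container view approach is not an option, the alternative is usually to find what is re-setting the layer's transform (often a layoutSubviews override or the session's own updates) rather than to fight it per-frame.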
I have an app that displays an instrument dial that looks a bit like a speedometer or voltmeter with a circular face and a pointer. When operating, it updates perhaps once or twice a second, possibly more (it's triggered by data updates received by the network), using a CABasicAnimation as shown in the code below. But it's also quite possible that the dial is not visible to the user, for instance, if another tab view is active, or the dial is scrolled off-screen. What happens to animations on views/layers that aren't visible, do they just get discarded? I'm wondering if I should add a check of some sort in my code to minimize unnecessary performance hits. If so, how would I check if a UIView/CALayer is user-visible?
I'm assuming animations are "efficient" in the way I imagine but you know what they say about assuming...
func updateDialWithAnimation(degrees: CGFloat) {
    let radians = degrees * CGFloat.pi / 180
    let animation = CABasicAnimation()
    animation.keyPath = "transform.rotation.z"
    animation.fromValue = CGFloat(angle.value) * CGFloat.pi / 180
    animation.toValue = radians
    animation.duration = 0.33

    angleLabel.text = String(Float(degrees))
    angle.value = Float(degrees)

    // needle is a subclass of UIView
    needle.layer.add(animation, forKey: "basic")
    needle.layer.transform = CATransform3DMakeRotation(radians, 0, 0, 1)
}
In the iOS app I'm developing, I've noticed that since upgrading to iOS 17 (Xcode 15.1), crashes of this type occur frequently. The crashes are random and can't be reliably reproduced. Below is a typical crash report:
CrashReporter Key: fd24cf14a51d73ebfc1852cccb1b8d50822b247c
Hardware Model: iPhone11,2
Process: MyApp [89057]
Path: /private/var/containers/Bundle/Application/06B982E0-B818-48A9-B2D1-F28999EC3BC0/MyApp.app/MyApp
Identifier: com.company.MyApp
Version: 2.0.0 (72)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: com.company.MyApp [4659]
Date/Time: 2024-03-24 14:50:46.3982 +0800
Launch Time: 2024-03-24 14:38:38.1438 +0800
OS Version: iPhone OS 17.3.1 (21D61)
Release Type: User
Baseband Version: 6.00.00
Report Version: 104
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x000000018c44e838
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [89057]
Triggered by Thread: 0
Kernel Triage:
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0 libobjc.A.dylib 0x18c44e838 object_getClass + 48
1 Foundation 0x1930807b4 _NSKeyValueObservationInfoGetObservances + 264
2 Foundation 0x19307fc7c NSKeyValueWillChangeWithPerThreadPendingNotifications + 232
3 QuartzCore 0x19572f14c CAAnimation_setter(CAAnimation*, unsigned int, _CAValueType, void const*) + 128
4 QuartzCore 0x19574a6b4 -[CAAnimation setBeginTime:] + 52
5 QuartzCore 0x1957485b4 CA::Layer::commit_animations(CA::Transaction*, double (*)(CA::Layer*, double, void*), void (*)(CA::Layer*, CA::Render::Animation*, void*), void (*)(CA::Layer*, __CFString const*, void*), CA::Render::TimingList* (*)(CA::Layer*, void*), void*) + 740
6 QuartzCore 0x195700bf0 invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 148
7 QuartzCore 0x195700af8 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368
8-14 QuartzCore 0x195700a84 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
15 QuartzCore 0x195745248 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11192
16 QuartzCore 0x19573bb80 CA::Transaction::commit() + 648
17 QuartzCore 0x19573b828 CA::Transaction::flush_as_runloop_observer(bool) + 88
18 CoreFoundation 0x1940ff7bc __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__ + 36
19 CoreFoundation 0x1940fe1c4 __CFRunLoopDoObservers + 548
20 CoreFoundation 0x1940fd8e0 __CFRunLoopRun + 1028
21 CoreFoundation 0x1940fd3f8 CFRunLoopRunSpecific + 608
22 GraphicsServices 0x1d768b4f8 GSEventRunModal + 164
23 UIKitCore 0x1965238a0 -[UIApplication _run] + 888
24 UIKitCore 0x196522edc UIApplicationMain + 340
25 MyApp 0x102c1f014 main + 140
26 dyld 0x1b6e52dcc start + 2240
By looking up information on the Exception Type and Termination Reason, I found that Apple officially mentions that EXC_BREAKPOINT (SIGTRAP) SIGNAL 5 Trace/BPT trap: 5 could be caused by Swift runtime error crashing mechanisms, mainly due to:
If you use the ! operator to force unwrap an optional value that’s nil, or if you force a type downcast that fails with the as! operator, the Swift runtime catches these errors and intentionally crashes the app.
For details, see the link: https://developer.apple.com/documentation/xcode/addressing-crashes-from-swift-runtime-errors
My project mixes Objective-C and Swift; the crash usually occurs after a button tap triggers a change in UIView properties, mostly layout-related. All the Swift code in the project is unrelated to this kind of UI. So I speculate it might not be a Swift runtime error, but I'm unsure what other causes could lead to the crash above.
A common denominator in all similar crash reports is that they occur on the main thread during system framework calls.
All show multiple instances of
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter
and CA::Layer::commit_if_needed is always on the stack. I noticed many crashes relate to CALayer property setters internally creating CAAnimations, so I added:
@implementation CALayer (Animation)

/// Prevent crashes
+ (void)disableAnimation:(VoidBlock)block {
    [CATransaction begin];
    [CATransaction setValue:(id)kCFBooleanTrue forKey:kCATransactionDisableActions];
    block();
    [CATransaction commit];
}

@end
Using [CALayer disableAnimation:^{ view.layer.someProperty = someValue; }] to disable animations has prevented some crashes, but I'm powerless in situations like the stack trace above, where all the calls are made by system frameworks.
I've also noticed other similar crash issues on forums:
EXC_BREAKPOINT - libobjc.A.dylib object_getClass Crash on the main thread.
The author experienced this issue after iOS 16 and iOS 17, with very similar stack information to mine.
I suspect other potential causes might include:
Whether it's related to KVO observers in UI code not being correctly removed.
Whether it involves calls to GPU resources from other threads. I've rewritten most of the code to ensure no GPU-related image operations occur on other threads during CoreAnimation runtime.
Whether it's related to high memory usage and peak virtual memory. My app is related to image processing, and opening 4K photos for processing typically consumes more than 500MB of memory.
If you've encountered similar situations or can help identify potential causes, please advise. Many thanks!
Hi. Can anyone who's into cosmology give hints as to how I might depict and animate dark matter for the VisionPro?
To reproduce this bug:
Set the .large navigation title in a View (which is in a NavigationView)
Add the .searchable modifier
And add the .refreshable modifier over a scroll view for example
Drag the pull to refresh
Actual result:
The navigation bar is blinking and jumping to the top without smooth animation
Expected result:
Navigation bar behaves expectedly as usual
I noticed that if we use the .inline navigation title style, or remove the .searchable modifier, it works correctly.
I've started seeing several users getting an app crash that I've been unable to find the root cause for so far. I've tried running the app in release build with address sanitizer and zombie objects checks enabled but have been unable to reproduce it. It only occurs for iOS 17 users. Any ideas on how I can troubleshoot this?
Crashed: com.apple.main-thread
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000
Crashed: com.apple.main-thread
0 libsystem_platform.dylib 0xed4 _platform_memmove + 52
1 QuartzCore 0x137120 CA::Render::InterpolatedFunction::encode(CA::Render::Encoder*) const + 248
2 QuartzCore 0x136f40 CA::Render::GradientLayer::encode(CA::Render::Encoder*) const + 44
3 QuartzCore 0x2e384 CA::Render::Layer::encode(CA::Render::Encoder*) const + 284
4 QuartzCore 0x2e224 CA::Render::encode_set_object(CA::Render::Encoder*, unsigned long, unsigned int, CA::Render::Object*, unsigned int) + 196
5 QuartzCore 0x2b654 invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 244
6 QuartzCore 0x2b4fc CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368
7 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
8 QuartzCore 0x2b4bc CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 304
9 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
10 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
11 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
12 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
13 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
14 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
15 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
16 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
17 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
18 QuartzCore 0x2b488 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
19 QuartzCore 0x6fc60 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11192
20 QuartzCore 0x66574 CA::Transaction::commit() + 648
21 UIKitCore 0x31b5ec __34-[UIApplication _firstCommitBlock]_block_invoke_2 + 36
22 CoreFoundation 0x373a8 __CFRUNLOOP_IS_CALLING_OUT_TO_A_BLOCK__ + 28
23 CoreFoundation 0x35b9c __CFRunLoopDoBlocks + 356
24 CoreFoundation 0x33a9c __CFRunLoopRun + 848
25 CoreFoundation 0x33668 CFRunLoopRunSpecific + 608
26 GraphicsServices 0x35ec GSEventRunModal + 164
27 UIKitCore 0x22c2b4 -[UIApplication _run] + 888
28 UIKitCore 0x22b8f0 UIApplicationMain + 340
29 Coach 0x799d8 main + 14 (main.m:14)
30 ??? 0x1abefadcc (Missing)