Hi @Vision Pro Engineer and thanks for the reply!
I don't necessarily need this to work on color. That was just an example to make it easily reproducible.
My use case is basically: I have a view with a sidebar on the left that contains items which should be keyboard-navigable, e.g. list cells that can receive focus by moving up and down and then listen for key presses. For example, I could press backspace to delete a cell, or return to rename it (turn the name label into a text field). To the right of the sidebar is a RealityKit scene view with custom camera controls. The idea is that when the user moves focus to the right, or clicks/taps the scene view, it becomes focused and accepts keyboard input via onKeyPress. But this currently does not seem possible with what SwiftUI offers on Mac Catalyst. Should I rather roll my own focus management? I'm using a mix of UIKit and SwiftUI and currently retrieve key presses on a container UIViewController via pressesBegan.
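For reference, here's a minimal sketch of the pattern I'm after (MySceneView and the key handling are illustrative placeholders, not my actual code):

```swift
import SwiftUI

struct ScenePane: View {
    @FocusState private var isSceneFocused: Bool

    var body: some View {
        MySceneView() // placeholder for the RealityKit scene view
            .focusable(true, interactions: .activate)
            .focused($isSceneFocused)
            .onTapGesture { isSceneFocused = true }
            .onKeyPress(phases: .down) { press in
                // Only react while the scene view owns focus.
                guard isSceneFocused else { return .ignored }
                // Drive the custom camera controls here.
                return .handled
            }
    }
}
```

Something along these lines is what I'd expect to work based on the docs.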
Anyone? This is one of the last bits of polish for my Catalyst app that I'd like to implement, but I just don't see what's going wrong.
Based on these docs: https://developer.apple.com/documentation/swiftui/view/focusable(_:interactions:)
The focus interactions allowed for custom views changed in macOS 14—previously, custom views could only become focused with keyboard navigation enabled system-wide. Clients built using older SDKs will continue to see the older focus behavior, while custom views in clients built using macOS 14 or later will always be focusable unless the client requests otherwise by specifying a restricted set of focus interactions.
It reads to me like this should be possible with macOS 14 or later?
Essentially what I want is similar to the behavior in Reality Composer Pro where when the user clicks the scene view keyboard controls are enabled to move around in the scene, whereas when focus changes e.g to elements on the sidebar keyboard navigation will cycle between them.
What does work in my example above is using the arrow keys right away to move focus. But then I only get the focus ring; the isFocused FocusState still does not update. Very confusing.
This sample also didn't help much: https://developer.apple.com/documentation/swiftui/focus-cookbook-sample
Is this just somewhat unfinished behavior specific to Mac Catalyst?
The issue is still present in iOS 18 and macOS 15.2.
Hi, what is your simulationSpace property (https://developer.apple.com/documentation/realitykit/particleemittercomponent/simulationspace) currently set to? Also, is particlesInheritTransform (https://developer.apple.com/documentation/realitykit/particleemittercomponent/particlesinherittransform) set to true or false?
You probably need to set their physics simulation to .none (https://developer.apple.com/documentation/realitykit/anchoringcomponent/physicssimulation-swift.enum/none). The default is .isolated where AnchorEntities run in their own Physics Simulation scope.
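If it helps, here's a quick sketch of flipping that switch (it assumes an existing AnchorEntity named anchor):

```swift
import RealityKit

// Opt the anchor out of its own isolated physics simulation so its
// descendants participate in the surrounding scene's simulation.
if var anchoring = anchor.components[AnchoringComponent.self] {
    anchoring.physicsSimulation = .none
    anchor.components.set(anchoring)
}
```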
Did you call https://developer.apple.com/documentation/realitykit/component/registercomponent() at some point?
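In case it's the missing registration, a minimal sketch (the component name is illustrative):

```swift
import RealityKit

struct SpinComponent: Component {
    var speed: Float = 1.0
}

// Register once, early in the app's lifetime (e.g. in the App
// initializer), before any entity uses or decodes the component.
SpinComponent.registerComponent()
```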
Found a workaround! Setting environmentTexturing = .manual brings back working environment texturing!
Never mind, solved it by setting Other Linker Flags → -weak_framework RealityFoundation
Feedback ID: FB15081450
I finally found the culprit!
»Dead Code Stripping« was set to »Yes«. If I set it to »No«, the toolbar magically appears and the delegate is set properly.
Can anyone explain to me why that would happen or is it a bug?
My toolbar code is wrapped in a compiler condition #if targetEnvironment(macCatalyst)…#endif but that's the only thing I could think of.
Okay, so what I've discovered so far: for some reason the toolbar's NSToolbarDelegate is being reset to nil (in the code above you can see that I do set it), and thus no items are added. I have absolutely no idea why, though, and have tested countless different setups. The exact same app runs fine on Sonoma, so what could have changed here?
Hi, you likely forgot to reapply the material to the mesh after updating its parameters. RealityKit Materials are value types.
I always use a helper function like this:
extension Entity {
    func modifyMaterials(_ closure: (Material) throws -> Material) rethrows {
        try children.forEach { try $0.modifyMaterials(closure) }
        guard var comp = components[ModelComponent.self] as? ModelComponent else { return }
        comp.materials = try comp.materials.map { try closure($0) }
        components[ModelComponent.self] = comp
    }
}
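For example, to use it (this assumes your materials are PhysicallyBasedMaterial; adjust the cast for your material type):

```swift
// Tint every PhysicallyBasedMaterial in the entity tree red and
// write the modified value types back onto their ModelComponents.
entity.modifyMaterials { material in
    guard var pbr = material as? PhysicallyBasedMaterial else { return material }
    pbr.baseColor.tint = .red
    return pbr
}
```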
You could use a SwiftUI 3D Transform and apply it on the view (https://developer.apple.com/documentation/swiftui/view/rotation3deffect(_:axis:anchor:anchorz:perspective:)) or put your view into a RealityKit Attachment and then rotate the Attachment Entity (https://developer.apple.com/documentation/realitykit/transform/rotation).
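Rough sketches of both options (the view content and the attachmentEntity name are placeholders):

```swift
import SwiftUI
import RealityKit

// Option 1: rotate the SwiftUI view itself.
Text("Hello")
    .rotation3DEffect(.degrees(45), axis: (x: 0, y: 1, z: 0))

// Option 2: rotate the RealityKit entity that hosts the attachment.
attachmentEntity.transform.rotation = simd_quatf(angle: .pi / 4, axis: [0, 1, 0])
```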
Could you share the console crash log for more context?
One idea coming to my mind:
updateImage() might be called on a non-main thread, and as far as I know a ModelEntity should be created on the main thread. So you could try marking the function as @MainActor.
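Something along these lines (the function body is just a placeholder for yours):

```swift
import RealityKit

// Hop onto the main actor so the ModelEntity is created on the main thread.
@MainActor
func updateImage() {
    let entity = ModelEntity(mesh: .generateSphere(radius: 0.1))
    // … update the scene with `entity`
}
```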
Same exact issue here, I think this is a bug on Apple's side.
Any workarounds to fix this?