Reality Composer Pro

Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

200 Posts
Post · Replies · Boosts · Views · Activity

[Newbie] Why does my ShaderGraphMaterial appear distorted?
Disclaimer: I am new to all things 3D, so there could be a variety of things wrong with what I'm doing that are not unique to RealityKit; any domain info would be appreciated. I'm following what I think are the recommended steps to import a shader graph material from Reality Composer Pro and apply it to another ModelEntity. I do the following:

guard let entity = try? Entity.load(named: "Materials", in: RealityKitContent.realityKitContentBundle) else { return model }
let materialEntity = entity.findEntity(named: "materialModel") as? ModelEntity
guard let materialEntity else { return model }

I then configure a property on it like so:

guard var material = materialEntity.model?.materials[0] as? ShaderGraphMaterial else { return model }
try material.setParameter(name: "BaseColor", value: .color(matModel.matCoreUIColor))

I then apply it. This is what my texture looks like in Reality Composer Pro: I notice that my rendered object has distortions in the actual RealityView. Note the diagonal lines that appear "stretched". What could be doing this? I thought node shaders were supposed to be more resilient to distortions like this, so I'm not sure if I've hit a bug or I'm using it wrong. FWIW, this is a shader based on Apple's felt material shader. My graph looks like this: Thanks
2
0
680
Aug ’23
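For reference on the API the question above is using, here is a minimal sketch of loading a shader graph material directly from the Reality Composer Pro package and setting a parameter on it. The material path, file name, and parameter name ("/Root/FeltMaterial", "Materials.usda", "BaseColor") are placeholders for illustration.

import RealityKit
import RealityKitContent
import CoreGraphics

func makeFeltMaterial(baseColor: CGColor) async throws -> ShaderGraphMaterial {
    // Load the material straight from the .usda in the RealityKitContent bundle
    // instead of digging it out of a placeholder ModelEntity.
    var material = try await ShaderGraphMaterial(
        named: "/Root/FeltMaterial",   // material prim path inside the scene (placeholder)
        from: "Materials.usda",        // scene file name (placeholder)
        in: realityKitContentBundle
    )

    // Parameter names must match the inputs promoted in the Reality Composer Pro graph.
    try material.setParameter(name: "BaseColor", value: .color(baseColor))
    return material
}

Because ShaderGraphMaterial is a value type, the modified copy still has to be written back into the target entity's model.materials array for the change to show up.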
How to apply an impulse
Hi. Given a Reality Composer entity with

var physicsBody = tappedObject.entity.components[PhysicsBodyComponent.self]! as PhysicsBodyComponent

how does one call

func applyImpulse(_ impulse: SIMD3<Float>, at position: SIMD3<Float>, relativeTo referenceEntity: Entity?)

to apply an impulse to the object? Or, the real question: how does one obtain a physics body on the ModelEntity? Thanks!
6
0
840
Sep ’23
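A minimal sketch for the question above, assuming the tapped entity is a ModelEntity whose physics body can be switched to dynamic mode (applyImpulse has no effect on static or kinematic bodies); the function and entity names are placeholders.

import RealityKit

func push(_ entity: Entity) {
    guard let model = entity as? ModelEntity,
          var body = model.components[PhysicsBodyComponent.self] else { return }

    // applyImpulse only affects dynamic bodies.
    body.mode = .dynamic
    model.components.set(body)

    // A 0.5 N·s impulse along the model's local +Z, applied at its own origin.
    model.applyImpulse([0, 0, 0.5], at: .zero, relativeTo: model)
}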
WebDriverAgent issue, appium inspector black screen
Hello, I am trying to do automation on Apple Vision Pro. I use these capabilities:

{
  "platformName": "iOS",
  "appium:platformVersion": "17.0",
  "appium:deviceName": "Apple Vision Pro",
  "appium:automationName": "XCUITest",
  "appium:udid": "EB2BB922-DCAF-44BF-8C15-2EDE677C08DA"
}

but Appium Inspector shows just a black screen and doesn't work properly. There seems to be an issue with WebDriverAgent, so could you help me fix this black screen issue in Appium Inspector, or suggest what to change in the WebDriverAgent project?
1
0
354
Aug ’23
In an immersive scene, how do I anchor my model to the surroundings rather than my head position?
I'm loading a model of a room's interior from Reality Composer Pro in a RealityView in my full immersive scene. The model loads fine, but the problem is that if I pan the simulator camera around, the model follows the camera rather than remaining stationary and allowing the camera to move around it. How can I keep the model stationary?
1
0
422
Aug ’23
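One way to approach the question above, sketched under the assumption that the room model should stay fixed in the world: parent it to a world-anchored AnchorEntity instead of positioning it relative to the view. The scene name "RoomScene" is a placeholder.

import SwiftUI
import RealityKit
import RealityKitContent

struct RoomView: View {
    var body: some View {
        RealityView { content in
            // A world-fixed anchor 1.5 m in front of the space origin;
            // content parented to it stays put as the camera moves.
            let worldAnchor = AnchorEntity(world: [0, 0, -1.5])
            if let room = try? await Entity(named: "RoomScene", in: realityKitContentBundle) {
                worldAnchor.addChild(room)
            }
            content.add(worldAnchor)
        }
    }
}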
How to play video in full immersive space in vision Pro
I am trying to implement a feature to play video in a full immersive space, but I am unable to achieve the desired results. When I run the app in the full immersive space, it shows the video in the center of the screen/window (see the screenshot below). Can you please guide me on how to play video across the full immersive space, just like the image below? Regards, Yasir Khan
2
0
2.2k
Aug ’23
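A minimal sketch for the question above, assuming the video ships in the app bundle and that mapping it onto a large RealityKit plane inside the immersive space is acceptable; the file name "intro.mp4" and the dimensions are placeholders.

import SwiftUI
import RealityKit
import AVFoundation

struct ImmersiveVideoView: View {
    var body: some View {
        RealityView { content in
            guard let url = Bundle.main.url(forResource: "intro", withExtension: "mp4") else { return }
            let player = AVPlayer(url: url)

            // A 4 m x 2.25 m screen, 3 m in front of the viewer.
            let screen = ModelEntity(
                mesh: .generatePlane(width: 4, height: 2.25),
                materials: [VideoMaterial(avPlayer: player)]
            )
            screen.position = [0, 1.5, -3]
            content.add(screen)

            player.play()
        }
    }
}

This view would be the content of an ImmersiveSpace scene; for video that surrounds the viewer, a sphere mesh could be used in place of the plane.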
visionOS - RealityKit objects cut off / occluded when outside the RealityView since beta 8
I've updated my project from beta 7 to beta 8. Now the models outside the screen view are being cut off: they become invisible once they leave the 2D window's width/height, while depth seems OK. Is there any property, like CSS's overflow: hidden, that I can give to the RealityView so I can see the outside parts? The code is something like this:

struct AnimationTest: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Card", in: realityKitContentBundle) {
                content.add(scene)
            }
            let transform = Transform(translation: [0.25, 0.0, 0.0])
            ent.transform = transform
            ...
0
0
376
Aug ’23
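For context on the post above: a RealityView hosted in a window or volume is clipped to that container's bounds on visionOS, while content in an ImmersiveSpace is not. A minimal sketch of hosting the same content in an ImmersiveSpace, with placeholder names:

import SwiftUI
import RealityKit
import RealityKitContent

struct CardSpace: Scene {
    var body: some Scene {
        // Content in an ImmersiveSpace is not clipped to a window's bounds.
        ImmersiveSpace(id: "CardSpace") {
            RealityView { content in
                if let scene = try? await Entity(named: "Card", in: realityKitContentBundle) {
                    scene.transform = Transform(translation: [0.25, 0.0, 0.0])
                    content.add(scene)
                }
            }
        }
    }
}

The space is then opened from SwiftUI via the openImmersiveSpace environment action.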
Reality Composer Pro Version 1.0 (393.3) - no longer able to drag images into scene
I am no longer able to drag images into the scene in Version 1.0 (393.3) of Reality Composer Pro. This used to work in the older versions. Is it no longer possible? It was really nice for prototyping, so I guess I can file a feedback unless I'm doing something wrong. It was a bit of a PITA to drop images as materials onto 3D objects in the past... I hope that's not the only way. I got this version with Xcode Version 15.0 beta 8 (15A5229m).
1
0
478
Sep ’23
Where to apply "grounding shadow" in Reality Composer Pro?
So there's a "grounding shadow" component we can add to any entity in Reality Composer Pro. My use case is Apple Vision Pro + RealityKit, and I'm wondering: by default, I think I'd want to add this to any entity in my scene that's a ModelEntity... right? Should we just add this component once to the root transform, or add it individually to every model entity? Or should we not add it at all because RealityKit will do it for us? And does it also depend on whether we use a volume or a full space?
2
0
685
Sep ’23
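A minimal sketch related to the question above, assuming the shadow should be opted into per model: GroundingShadowComponent applies to the entity it is set on, so one approach is to walk the loaded hierarchy and set it on each ModelEntity. The scene structure is a placeholder.

import RealityKit

func addGroundingShadows(to root: Entity) {
    // Visit the root and every descendant.
    var stack: [Entity] = [root]
    while let entity = stack.popLast() {
        if entity is ModelEntity {
            entity.components.set(GroundingShadowComponent(castsShadow: true))
        }
        stack.append(contentsOf: entity.children)
    }
}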
Failed to preview iOS app due to RealityKitContent.rkassets (should be linked only on visionOS)
Hello! My app supports iOS and visionOS in a single target, but when I preview the app on an iOS device/simulator, an error occurs. The RealityKitContent.rkassets bundle is located in my RealityKitContent package, which is linked only on visionOS, yet Xcode Previews seems to ignore the link settings and attempts to build RealityKitContent.rkassets for iOS. Steps to reproduce:
1. Clone my demo project at https://github.com/gongzhang/rkassets-preview-issue-demo
2. Build the app for iOS (succeeds)
3. Preview ContentView.swift (fails due to the RealityKitContent.rkassets issue)
3
0
701
Sep ’23
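For context on the post above, a common source-level pattern for single-target apps is to guard visionOS-only content behind conditional compilation; whether Xcode Previews respects the package's platform linking is exactly the issue being reported, so this is only a mitigation sketch, not a confirmed fix. Names are placeholders.

import SwiftUI
import RealityKit
#if os(visionOS)
import RealityKitContent
#endif

struct ContentView: View {
    var body: some View {
        #if os(visionOS)
        // Only visionOS builds reference the RealityKitContent package.
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
        #else
        Text("3D content is available on visionOS.")
        #endif
    }
}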
Exporting .reality files from Reality Composer Pro
I've been using Reality Composer on macOS (installed with Xcode) to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems that this is only for building content that ships inside native iOS etc. apps, rather than content that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
2
1
981
Oct ’23
Diorama Demo Issues
I'm trying to run this demo, which opened fine a month ago but now produces two issues in Xcode: https://developer.apple.com/documentation/visionos/diorama My goal is to better understand how to use Reality Composer Pro to develop visionOS apps for the Vision Pro.

BillboardSystem.swift:39:58: Value of type 'WorldTrackingProvider' has no member 'queryDeviceAnchor'
Cannot find 'Attachment' in scope

The canvas also shows "Preview paused." Any thoughts on where I'm off? Thanks!
2
0
471
Sep ’23
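For reference on the first error in the post above, which likely comes from a mismatch between the demo code and the installed SDK beta, here is the current shape of the device-anchor query on visionOS; the surrounding function names are placeholders.

import ARKit
import QuartzCore
import simd

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

// Start world tracking once, e.g. when the immersive space appears.
func startTracking() async throws {
    try await session.run([worldTracking])
}

// Query the headset pose; returns nil until tracking has produced data.
func currentDevicePose() -> simd_float4x4? {
    guard let anchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
        return nil
    }
    return anchor.originFromAnchorTransform
}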