Reality Composer Pro


Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

200 Posts
Post · Replies · Boosts · Views · Activity
[visionOS] Entities spawned at runtime won't respond to my gesture
I'm creating an immersive experience with RealityView (think of a Fruit Ninja-like experience). Say I have some randomly generated fruits that are spawned, according to certain criteria, in a System's update function, and I want to interact with these generated fruits using hand gestures. It simply doesn't work: the gesture's onChanged closure doesn't fire as I expected. I put both an InputTargetComponent and a CollisionComponent on them to make them detectable in an immersive view. It works fine if I set these fruits up in the scene with Reality Composer Pro before the app runs.

Here is what I did. First I load the fruit template:

    let tempScene = try await Entity(named: "fruitPrefab.usda", in: realityKitContentBundle)
    fruitTemplate = tempScene.findEntity(named: "fruitPrefab")

Then I clone it during the System's update(context:) function. parent is an invisible object placed at .zero in my loaded immersive scene:

    let fruitClone = fruitTemplate!.clone(recursive: true)
    fruitClone.position = pos
    fruitClone.scale = scale
    parent.addChild(fruitClone)

I attached my gesture to the RealityView with:

    .gesture(
        DragGesture(minimumDistance: 0.0)
            .targetedToAnyEntity()
            .onChanged { value in
                print("dragging")
            }
            .onEnded { tapEnd in
                print("dragging ends")
            }
    )

I was wondering whether runtime-generated entities are not tracked by RealityView, but since I added each one as a child of a placeholder entity in the scene, it should be fine... right? Or do I just need to put a new AnchorEntity there? Thanks for any advice in advance. I've been trying to work this out all day.
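One thing worth ruling out (a minimal sketch, not the poster's actual project — the template setup and the sphere radius are assumptions): clone(recursive:) copies components, so a runtime clone stays hit-testable only if the template entity itself carries both an InputTargetComponent and a CollisionComponent with a real shape:

```swift
import RealityKit

// Hypothetical template setup: both components must live on the same
// entity that receives the gesture hit test.
let fruitTemplate = Entity()
fruitTemplate.components.set(InputTargetComponent())
fruitTemplate.components.set(CollisionComponent(
    shapes: [.generateSphere(radius: 0.05)]  // shape should roughly match the visual model
))

// clone(recursive:) copies components, so runtime-spawned copies
// remain targetable as long as the template was configured first.
let fruitClone = fruitTemplate.clone(recursive: true)
assert(fruitClone.components.has(InputTargetComponent.self))
assert(fruitClone.components.has(CollisionComponent.self))
```

If the components were only set on a child inside the Reality Composer Pro prefab, findEntity(named:) may be returning a parent that lacks them, which would match the symptom described.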
2 replies · 1 boost · 723 views · Jan ’24
Reality Composer Pro: Triggering Animations and changing scenes from an onClick Event
Hello. I'm currently working on a project that was finished in Reality Composer, but then we noticed the pink material after changing scenes on iOS 17 devices. So I updated to macOS 14 and the Xcode 15 beta to use Reality Composer Pro, and I'm currently stuck on how to set up the animations and onClick triggers so that the USDZ model in the scene plays its animation. Once the animation is finished, it should trigger the next scene. This was done through behaviors in Reality Composer, and it was simple drag and drop. But now it seems we need to do it with components, which I don't mind; I just don't see many resources on how to set this up properly. Is there a way to do behaviors like in Reality Composer?

Extra: is there a way to use alpha PNGs, or to drag PNGs into the scene like in Reality Composer?
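For reference, one possible approach (a sketch, not a confirmed Reality Composer Pro workflow — the scene name "Scene1" and entity name "Model" are assumptions) is to play the USDZ's baked animation in code and subscribe to its completion event to drive the scene change:

```swift
import SwiftUI
import RealityKit

struct AnimationChainView: View {
    var body: some View {
        RealityView { content in
            // Hypothetical names; substitute your own scene and model.
            guard let scene = try? await Entity(named: "Scene1"),
                  let model = scene.findEntity(named: "Model") else { return }
            content.add(scene)

            // Play the first animation baked into the USDZ.
            if let clip = model.availableAnimations.first {
                model.playAnimation(clip, transitionDuration: 0.0, startsPaused: false)
            }

            // When playback completes, load and swap in the next scene here.
            _ = content.subscribe(to: AnimationEvents.PlaybackCompleted.self, on: model) { _ in
                // e.g. remove `scene` and add the next Entity(named:)
            }
        }
    }
}
```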
1 reply · 1 boost · 629 views · Oct ’23
How to place a 3D model in front of the user in a Full Space app
Hi, I am currently developing a Full Space app. I have a question about how to display an Entity or ModelEntity in front of the user. I want to move the Entity or ModelEntity to the user's front, not only at the initial display but also when the user takes an action such as tapping. (Animation is not required.) I want to perform the initial placement in front of the user when the reset button is tapped. Thanks.

Sadao Tokuyama
https://twitter.com/tokufxug
https://www.linkedin.com/in/sadao-tokuyama/
https://1planet.co.jp/tech-blog/category/applevisionpro
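One common pattern for this on visionOS (a sketch under assumptions — the session setup is abbreviated and the one-meter offset is illustrative) is to query the device anchor from a WorldTrackingProvider and reposition the entity relative to the head pose:

```swift
import ARKit
import RealityKit
import QuartzCore

// Assumed to be run earlier, e.g. at app launch:
//   try await session.run([worldTracking])
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

// Place `entity` one meter in front of the device (head) pose.
// Call this at initial placement and again from the reset button.
func placeInFront(of entity: Entity) {
    guard let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
    let m = device.originFromAnchorTransform
    // -z is "forward" in RealityKit's convention.
    let forward = -SIMD3<Float>(m.columns.2.x, m.columns.2.y, m.columns.2.z)
    let headPosition = SIMD3<Float>(m.columns.3.x, m.columns.3.y, m.columns.3.z)
    entity.setPosition(headPosition + forward * 1.0, relativeTo: nil)
}
```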
1 reply · 0 boosts · 424 views · Oct ’23
RealityView: toggling entities on and off
I'm trying to create a feature in my visionOS app where I show a RealityView and, on a button click, toggle different entities: showing them on one click and hiding them again on another. Is this possible in visionOS, and if so, how can I do it? All I did so far was instantiate my scene, which contains a car 3D model, red tyres, and blue tyres. On a button click I'm trying to show the blue tyres instead of the red ones. Is this possible? Thank you.
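A common way to do this (a minimal sketch; the entity names "RedTyres" and "BlueTyres" are assumptions about how the Reality Composer Pro scene is organized) is to toggle each entity's isEnabled flag from the button's action:

```swift
import RealityKit

// Swap which tyre set is visible by enabling/disabling the entities.
// A disabled entity is neither rendered nor hit-tested.
func showBlueTyres(in scene: Entity) {
    scene.findEntity(named: "RedTyres")?.isEnabled = false
    scene.findEntity(named: "BlueTyres")?.isEnabled = true
}
```

In SwiftUI, a Button's action would simply call showBlueTyres(in:) with a reference to the loaded scene entity kept in state.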
1 reply · 0 boosts · 577 views · Sep ’23
Playing Animation using visionOS - Reality Composer Pro - USDZ
I have a USDZ model with an animation that I can preview in Reality Composer Pro. When I create a new base/example visionOS project in Xcode, it's set up to load the "Scene" and "Immersive" RealityKit content, but my models don't play their animations. How do I fire off the animations contained in those files? Is there a code snippet someone can share that takes into account how the example project is set up?
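One general-purpose answer (a sketch, not the official template code; the infinite repeat is an assumption) is to recurse through the loaded content and start every animation the USDZ files carry:

```swift
import RealityKit

// Walk the entity hierarchy and play every baked-in animation clip.
// Call this on the scene right after `content.add(scene)`.
func playAllAnimations(in entity: Entity) {
    for clip in entity.availableAnimations {
        // .repeat() loops the clip indefinitely; drop it for one-shot playback.
        entity.playAnimation(clip.repeat(), transitionDuration: 0.0, startsPaused: false)
    }
    for child in entity.children {
        playAllAnimations(in: child)
    }
}
```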
1 reply · 2 boosts · 1.1k views · Oct ’23
Xcode Version 15.0 (15A240d) Reality Composer is missing
Hi guys, I've just installed the newest Xcode, Version 15.0 (15A240d), and I can see that Reality Composer is missing. It is not appearing in the Xcode → Open Developer Tool menu. Where and how can I find it? My old Reality Composer project now opens like a text file, and I have no option to open it in Reality Composer as there was in the old Xcode. I'm kind of stuck with my project, so any help would be useful. Thanks, J
9 replies · 2 boosts · 2.6k views · Apr ’24
Diorama demo issues
I'm trying to run this demo, which opened fine a month ago, but it now produces two errors in Xcode: https://developer.apple.com/documentation/visionos/diorama. My goal is to better understand how to use Reality Composer Pro to develop visionOS apps for the Vision Pro.

    BillboardSystem.swift:39:58: Value of type 'WorldTrackingProvider' has no member 'queryDeviceAnchor'
    Cannot find 'Attachment' in scope

"Preview paused" also shows. Any thoughts on where I'm off? Thanks!
2 replies · 0 boosts · 504 views · Sep ’23
Exporting .reality files from Reality Composer Pro
I've been using the macOS Reality Composer bundled with Xcode to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems this tool is only for building content that ships inside native iOS (and other) apps, rather than content that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
2 replies · 1 boost · 1.1k views · Oct ’23
Failed to preview iOS app due to RealityKitContent.rkassets (should be linked only on visionOS)
Hello! My app supports iOS and visionOS in a single target, but when I preview the app on an iOS device or in the simulator, an error occurs: the RealityKitContent.rkassets is located in my RealityKitContent package, which is linked only on visionOS. It seems that Xcode Previews ignores the link settings and attempts to build RealityKitContent.rkassets on iOS.

Steps to reproduce:
1. Clone my demo project at https://github.com/gongzhang/rkassets-preview-issue-demo
2. Build the app for iOS (succeeds)
3. Preview ContentView.swift (fails due to the RealityKitContent.rkassets issue)
3 replies · 0 boosts · 758 views · Sep ’23
Where to apply "grounding shadow" in Reality Composer Pro?
So there's a "grounding shadow" component we can add to any entity in Reality Composer Pro. My use case is Apple Vision Pro + RealityKit. I'm wondering: by default, I think I'd want to add this to every entity in my scene that's a ModelEntity... right? Should we just add this component once to the root transform, or should we add it individually to each model entity? Or should we not add it at all; will RealityKit do it for us? And does it also depend on whether we use a volume or a full space?
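If per-entity placement turns out to be the answer, a minimal sketch (whether RealityKit applies any default shadowing for you is exactly the open question above) could walk the hierarchy and set the component on each ModelEntity:

```swift
import RealityKit

// GroundingShadowComponent is per-entity, so recurse and set it on
// every ModelEntity rather than once on the root transform.
func addGroundingShadows(to entity: Entity) {
    if entity is ModelEntity {
        entity.components.set(GroundingShadowComponent(castsShadow: true))
    }
    for child in entity.children {
        addGroundingShadows(to: child)
    }
}
```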
2 replies · 0 boosts · 715 views · Sep ’23
Reality Composer Pro Version 1.0 (393.3) - no longer able to drag images into scene
I am no longer able to drag images into the scene in Version 1.0 (393.3) of Reality Composer Pro. This used to work in older versions. Is it no longer possible? It was really nice for prototyping, so I guess I'll file feedback unless I'm doing something wrong. It was a bit of a pain to drop images as materials onto 3D objects in the past... I hope that's not the only way. I got this version with Xcode Version 15.0 beta 8 (15A5229m).
1 reply · 0 boosts · 515 views · Sep ’23
visionOS - RealityKit objects cut off / occluded when outside the RealityView since beta 8
I've updated my project from beta 7 to beta 8. Now, models outside the screen view are being cut off, becoming invisible when they leave the 2D screen's width/height. Depth seems OK. Is there any property, like CSS's overflow: hidden, that I can give to the RealityView so I can see the outside? The code is something like this:

    struct AnimationTest: View {
        var body: some View {
            RealityView { content in
                if let scene = try? await Entity(named: "Card", in: realityKitContentBundle) {
                    content.add(scene)
                }
                let transform = Transform(translation: [0.25, 0.0, 0.0])
                ent.transform = transform
                ...
0 replies · 0 boosts · 389 views · Aug ’23
How to play video in a full immersive space on Vision Pro
I am trying to implement a feature to play video in a full immersive space, but I am unable to achieve the desired result. When I run the app in a full immersive space, it shows the video in the center of the screen/window. See the screenshot below. Can you please guide me, or refer me to how to implement playing video in a full immersive space, just like the image below? Regards, Yasir Khan
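One approach (a sketch; the bundled file name "video.mp4" and the placement offsets are assumptions) is to attach a VideoPlayerComponent to an entity added to the immersive space, rather than relying on a 2D window:

```swift
import AVFoundation
import RealityKit

// Build an entity that renders video via VideoPlayerComponent.
// Add the returned entity to the RealityView content of the
// ImmersiveSpace, then scale/position it as needed.
func makeVideoEntity() -> Entity {
    // "video.mp4" is a placeholder for a file in the app bundle.
    let url = Bundle.main.url(forResource: "video", withExtension: "mp4")!
    let player = AVPlayer(url: url)

    let entity = Entity()
    entity.components.set(VideoPlayerComponent(avPlayer: player))
    entity.position = [0, 1.5, -2]  // roughly eye height, 2 m in front
    player.play()
    return entity
}
```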
2 replies · 0 boosts · 2.4k views · Aug ’23