Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Reality Composer Documentation

Posts under Reality Composer tag

75 Posts
Post not yet marked as solved
1 Reply
449 Views
Hi there, I was wondering whether it's possible, in Reality Composer, to add a mesh object to the scene with an animated texture applied to it — whether a sprite sheet or a simple tiling-offset animation. If not, is there a way to switch the textures on an object, either immediately on tap or by gradually fading from one to the other?
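Reality Composer itself doesn't expose texture animation, but swapping a texture on tap can be done in code once the scene is in RealityKit. A minimal sketch, assuming a `ModelEntity` and a texture image named "frame2" in the asset catalog (both names are hypothetical), using the RealityKit 2 material API:

```swift
import RealityKit

// Sketch: replace the material on a model with one using a different texture.
// "frame2" and the target entity are assumptions; adapt to your own scene.
func swapTexture(on model: ModelEntity, toTextureNamed name: String) {
    do {
        let texture = try TextureResource.load(named: name)
        var material = UnlitMaterial()
        material.color = .init(tint: .white, texture: .init(texture))
        // Replacing the materials array takes effect immediately (a hard cut,
        // not a fade -- cross-fading would need a custom shader or two models).
        model.model?.materials = [material]
    } catch {
        print("Texture load failed: \(error)")
    }
}
```

Calling this from a tap gesture handler gives the "switch immediately on tap" behavior; a gradual fade has no built-in material API on iOS and would need a different approach.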
Post not yet marked as solved
1 Reply
345 Views
Hello everyone, I am working on a Quick Look project using model-viewer. I am trying to play a USDZ animation with a tap behavior set up in Reality Composer, and it looks good in the composer. But once exported, the animation does not play on click. Any trick or hack for this? You can test what I want here: https://ti-hardikshah.github.io/UsExpo/index.html
Post not yet marked as solved
0 Replies
316 Views
I designed a scene in Reality Composer with triggers, events, and actions. I saved the scene as an *.rcproject file and exported it as a *.reality file. In Xcode I load the file as shown here https://developer.apple.com/documentation/realitykit/creating_3d_content_with_reality_composer/loading_reality_composer_files_manually_without_generated_code and add it to the ARSession as an Entity. Everything works perfectly according to the designed logic (tapping the screen makes the billiard ball start moving and knock down the pins). My questions: How do I access, from code, the triggers that I designed in Reality Composer? The generated code is not available to me here. Can I load the *.reality file only as an Entity? If I cannot work with the *.reality file as I suppose, can I avoid physically adding the *.rcproject file to the project and instead load the file from an external source, thereby gaining access to the generated code and event triggers?
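For reference, the manual-loading path from that article can be sketched as below. When a .reality file is loaded this way, the behaviors baked into the file still fire, but there is no generated Swift API for them — entities can only be reached by the names assigned in Reality Composer. The file name "Experience" and entity name "BilliardBall" are assumptions:

```swift
import RealityKit

// Sketch: load a .reality file without the Xcode-generated code.
func loadRealityScene(into arView: ARView) throws {
    guard let url = Bundle.main.url(forResource: "Experience",
                                    withExtension: "reality") else {
        fatalError("Experience.reality not found in bundle")
    }
    // loadAnchor keeps the anchoring information and the baked-in behaviors
    // (tap triggers etc. keep working inside the scene).
    let anchor = try Entity.loadAnchor(contentsOf: url, withName: nil)
    arView.scene.addAnchor(anchor)

    // The only programmatic handle is name-based lookup (assumed entity name).
    if let ball = anchor.findEntity(named: "BilliardBall") {
        print("Found entity: \(ball.name)")
    }
}
```

Because the URL-based initializer accepts any file URL, the same call works for a .reality file downloaded from an external source, but that does not recover the generated trigger code — only the entities and their baked-in behaviors.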
Post not yet marked as solved
0 Replies
425 Views
Hello, I would like to ask whether, using RealityKit and Reality Composer, I can set up image tracking so that the object does not disappear when the iPhone no longer sees the image. In short: after scanning the image, the 3D object should stay fixed in that position, so that I could go anywhere and it would still be there. For now I only created a simple project with RealityKit: I added the model to Experience and set up image tracking for it via the composer. However, when the phone does not see the image, the model disappears. Can I make it stay put? Thank you for any advice.
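One workable pattern for this kind of question is to re-anchor the content in world space once the image has been detected, so the model no longer depends on the image staying in view. A minimal sketch, assuming `content` is the entity from the Reality Composer scene (the property names here are hypothetical):

```swift
import ARKit
import RealityKit

// Sketch: pin the model to a world-space anchor at the image's detected pose,
// so it survives the image leaving the camera frame.
final class SessionCoordinator: NSObject, ARSessionDelegate {
    weak var arView: ARView?
    var content: Entity?          // the model from the Reality Composer scene
    private var pinned = false

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard !pinned,
              let imageAnchor = anchors.compactMap({ $0 as? ARImageAnchor }).first,
              let arView = arView, let content = content else { return }
        // Create a world anchor at the image's transform and reparent the
        // model to it, keeping its current world position.
        let worldAnchor = AnchorEntity(world: imageAnchor.transform)
        content.setParent(worldAnchor, preservingWorldTransform: true)
        arView.scene.addAnchor(worldAnchor)
        pinned = true
    }
}
```

After this runs once, the image anchor can disappear from view and the model stays where it was placed.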
Post not yet marked as solved
1 Reply
386 Views
I'm aware that you can create a simple scene in Reality Composer, import it into Xcode, and then add additional 3D objects with OcclusionMaterial applied to them using RealityKit. However, I would like to accomplish the reverse, i.e. export a .reality file (or USDZ?) from Xcode with OcclusionMaterial applied, and bring it into Reality Composer. Is that possible? If so, what's the workflow?
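For context, the documented direction is a one-liner in RealityKit; as far as I know there is no public API to write that entity back out as a .reality file for Reality Composer, so the sketch below only covers the in-Xcode side:

```swift
import RealityKit

// Sketch: a runtime-generated box that hides anything RealityKit renders
// behind it (the standard occluder pattern).
let occluder = ModelEntity(
    mesh: .generateBox(size: 0.2),
    materials: [OcclusionMaterial()]
)
```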
Post not yet marked as solved
1 Reply
282 Views
Hi there, I recorded an AR session using the built-in AR session recorder inside Reality Composer on my iPhone 12 Pro Max. Problematically, Reality Composer refuses to let me play the recorded session: the play button remains grayed out. According to the Apple docs I should be able to simply press the play button to begin playback, but the button stays disabled. Is there some minimum amount of data required for the session to be considered valid? Pretty stumped here; we need to be able to record and play back AR sessions to automate tests.
Post not yet marked as solved
0 Replies
383 Views
Using Maya's latest version, we are attempting to convert a few FBX animations with sound into a .reality file, without success. In the converter, two of the three animations shut down the conversion after about two seconds; in the latest attempt with the third animation, the animated characters stack on top of each other in the viewer (see image) and we get root-node errors. Can you assist us in solving the issue and help us better understand how to build iOS animations from Maya?
Post not yet marked as solved
1 Reply
397 Views
Hello, I'm trying to bring a USDZ model of a radio tower with a red light on top (using the emissive channel in my USDZ) into Reality Composer. As of right now, it looks like RC ignores emissive materials. If I finish all the interaction I want in RC, is there an easy way to add a bloom post-processing effect in Xcode to make my red light glow? I saw there is post-processing in RealityKit 2, but as a new developer I'm not sure how, or whether it's even possible, to dig into my RC project with Xcode to add that post-processing. If anybody could provide a good beginner's tutorial or a path to start learning, I'd really appreciate it. Thank you, Daniel Jones
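The RealityKit 2 hook for this is `ARView.renderCallbacks.postProcess` (iOS 15+), which hands you the rendered frame as a Metal texture that you can run through a Core Image bloom filter. A sketch under those assumptions — the filter parameter values are placeholders to tune, and this blooms the whole frame rather than just the emissive light:

```swift
import RealityKit
import CoreImage

// Sketch: full-frame bloom via the RealityKit 2 post-processing callback.
func installBloom(on arView: ARView) {
    let ciContext = CIContext()
    arView.renderCallbacks.postProcess = { context in
        guard let source = CIImage(mtlTexture: context.sourceColorTexture) else { return }
        let bloom = CIFilter(name: "CIBloom")!
        bloom.setValue(source, forKey: kCIInputImageKey)
        bloom.setValue(10.0, forKey: kCIInputRadiusKey)   // assumed tuning value
        bloom.setValue(1.0, forKey: kCIInputIntensityKey) // assumed tuning value
        guard let output = bloom.outputImage else { return }
        // Render the filtered image into the frame's output texture on the
        // same command buffer RealityKit is using.
        let destination = CIRenderDestination(
            mtlTexture: context.targetColorTexture,
            commandBuffer: context.commandBuffer)
        _ = try? ciContext.startTask(toRender: output, to: destination)
    }
}
```

This works on a scene loaded from a Reality Composer project too, since the callback operates on the final rendered frame, not on individual materials.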
Post not yet marked as solved
2 Replies
490 Views
Hello, I'm having a problem with showing/hiding objects in the AR view. The asset's transparency doesn't transition between 0 and 100%; the object just pops in and out as if switched on/off. Is this something anyone else is experiencing? Or maybe it's related to memory? I appreciate any advice.
Post marked as solved
2 Replies
401 Views
I'm currently on a research project where we are making some trees for AR experiences, and I'm having some issues with the PNGs I'm using for leaves. I model the trees in a program called Tree It, then export into Maya to convert to a GLB with Babylon.js's Maya plugin. To test how it looks, I put the tree into the Babylon viewer, and it looks perfect: the PNG transparency works fine and there is no hazy white background around the planes I'm using as leaves. Now comes the issue. When I drop the GLB into Reality Converter, the PNGs suddenly have a mostly transparent but still visible white background. It seems the issue is with Reality Converter, since I don't run into this problem anywhere else in the pipeline. I vaguely remember reading something about Apple adding a faint white background to PNGs, but I can't remember if I just imagined that. Anyway, here is a closeup of the issue:
Post not yet marked as solved
1 Reply
482 Views
Hi, I'm struggling to find a way to get a simple unlit material working with Reality Composer. With all the standard objects, it doesn't seem like we have the option of an unlit material, which seems really odd to me: this is kind of a basic feature. The only way I was able to get an unlit material inside Reality Converter was to import a mesh without a material, which gave me a white unlit material. I have seen that you can set an unlit material using RealityKit, but from what I saw, RealityKit builds an app at the end, right? Honestly, I'm not sure what you get when creating an AR app using RealityKit... What I'm looking for is a simple .reality file to be displayed on the web.
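For the in-app side of this, RealityKit does ship an `UnlitMaterial` type; a minimal sketch of applying it to a runtime-generated object is below. Note this only covers rendering inside an app — as far as I know there is no public API to write the result back out as a .reality file for the web:

```swift
import RealityKit

// Sketch: a sphere with RealityKit's built-in unlit material, which ignores
// scene lighting entirely.
let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [UnlitMaterial(color: .white)]
)
```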
Post not yet marked as solved
1 Reply
304 Views
Hello, I have an augmented reality app with a simple Reality Composer project. It works fine on an iPad on 14.4, but I'm having problems on higher versions (14.7 and 15). Anchor detection is much more sensitive, which causes my scenes to restart with each new detection. On top of that, the scenes are interrupted as soon as the camera can no longer see the anchor image. I am using Xcode 13.1, with this simple code:

    class ViewController: UIViewController {
        @IBOutlet var arView: ARView!

        override func viewDidLoad() {
            super.viewDidLoad()
            guard let anchor2 = try? Enigme1.loadDebut() else { return }
            arView.scene.anchors.append(anchor2)
        }
    }

Thank you very much for the help you could give me.
Post not yet marked as solved
2 Replies
528 Views
I have two scenes in my .reality file, built using Reality Composer. The first scene is loaded first; then, after 1 second, I load the second scene. But this crashes the Quick Look app randomly. When I removed the Change Scene action, it worked fine. I tested on an iPhone 7. Any pointers to avoid the crashes?
Post not yet marked as solved
6 Replies
809 Views
Hi, so I created a scene in Reality Composer, but when I export to USDZ, all of the animations, audio, and interactions work, yet the textures are completely changed (the map should be green, for example). Everything looks fine in Reality Composer. Unfortunately, exporting as a .reality file isn't an option in this project. Could I get some insight into this, please? Thank you
Post marked as solved
2 Replies
398 Views
Hi all! Does anyone know if there is a version of Reality Composer for the MacBook Pro? I recently saw a girl on YouTube who had the program working on a Mac, but my App Store says the program is only available on iPad and iPhone. Video link: https://youtu.be/WdxA9Y0k3EU?t=768
Post not yet marked as solved
0 Replies
216 Views
When I clone a rather complex scene in RealityKit and then immediately try to reference the clone's entity content, I get a nil return. There is no completion: form of clone, to my awareness. How can one safely ensure a scene is fully cloned before use? I load the scene from Reality Composer using load..Async, and that completes successfully before I do the clone operation.
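One common pattern for this situation is to perform the clone inside the async load's completion, since `clone(recursive: true)` itself is synchronous and deep-copies the hierarchy once loading has actually finished. A sketch, assuming a Combine-based `loadAsync` load and hypothetical names:

```swift
import RealityKit
import Combine

// Sketch: clone only after the load publisher delivers the entity, so the
// hierarchy is fully built before clone(recursive:) runs.
var cancellables = Set<AnyCancellable>()

func loadAndClone(named name: String, into arView: ARView) {
    Entity.loadAsync(named: name)
        .sink(receiveCompletion: { completion in
            if case .failure(let error) = completion {
                print("Load failed: \(error)")
            }
        }, receiveValue: { entity in
            // Deep copy of the loaded hierarchy, including children.
            let copy = entity.clone(recursive: true)
            let anchor = AnchorEntity(world: .zero)
            anchor.addChild(copy)
            arView.scene.addAnchor(anchor)
        })
        .store(in: &cancellables)
}
```

If the clone's content is still nil inside the completion, that would point at a different problem than timing.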
Post not yet marked as solved
0 Replies
169 Views
I am new to ARKit. Can we add gestures to an asynchronously loaded model?
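Yes — RealityKit's built-in gestures can be attached once the async load completes. The entity needs collision shapes for the gestures to hit-test against. A minimal sketch:

```swift
import RealityKit

// Sketch: enable the built-in drag/rotate/scale gestures on a loaded model.
func enableGestures(for model: ModelEntity, in arView: ARView) {
    // Gestures hit-test against collision shapes, so generate them first.
    model.generateCollisionShapes(recursive: true)
    arView.installGestures([.translation, .rotation, .scale], for: model)
}
```

Call this from the load completion handler (or after awaiting the load), once the `ModelEntity` exists.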
Post not yet marked as solved
1 Reply
488 Views
After I updated Xcode to 13, the animation that comes with my USDZ no longer plays in Reality Composer / RealityKit, even though it works fine in the older Xcode 12 and 11. The problem: nothing happens when playing any USDZ animation, even when running on a real device. Has anyone else faced this? I haven't found a solution or a fixing update yet.
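As a code-side workaround for auto-play not kicking in, a loaded entity's animations can be started explicitly. A sketch using the standard RealityKit animation API:

```swift
import RealityKit

// Sketch: explicitly start every animation a loaded USDZ carries, looping,
// as a workaround when nothing auto-plays.
func playAllAnimations(on entity: Entity) {
    for animation in entity.availableAnimations {
        entity.playAnimation(animation.repeat(),
                             transitionDuration: 0.3,
                             startsPaused: false)
    }
}
```

If `availableAnimations` comes back empty, the animation was lost on import rather than merely not started, which is a different bug.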