Reality Composer


Prototype and produce content for AR experiences using Reality Composer.


Posts under Reality Composer tag

75 Posts
Post not yet marked as solved
2 Replies
319 Views
Any USDZ file exported from Reality Composer fails to import into Reality Converter. Every USDZ file I have exported from Composer (even just a cube with a default glossy paint finish anchored on a horizontal surface) fails on import to Converter with the message "Conversion Failed: 1 Error". I suspect that the problem is with Composer and not Converter, since USDZ files created by other means import into Converter just fine. Steps to reproduce:
1. Enable USDZ export in Reality Composer (Preferences > Enable USDZ export)
2. Add a cube
3. Export the USDZ, current scene only (although this seems to make no difference)
4. Open Reality Converter
5. Drag the USDZ file from step 3 into the window
6. Observe the error message
Better import diagnostics would be good too.
Posted. Last updated.
Post not yet marked as solved
2 Replies
490 Views
Hello, I'm having a problem with showing/hiding objects in the AR view. The asset's transparency doesn't transition from 0 to 100% (or back); the object simply pops in and out, as if it were being switched on and off. Is anyone else experiencing this? Or could it be memory related? I'd appreciate any advice.
Posted by Dan1979. Last updated.
Post not yet marked as solved
1 Reply
343 Views
I have a Reality Composer file with several scenes, all of which start empty; then some models appear one by one, every second. The animation works perfectly in Quick Look and Reality Composer, but has a glitch when rendered in ARKit: the scenes don't start empty. When the very first scene is launched, or when we move to another scene, we see all of that scene's models displayed for a tiny split second, only to disappear immediately. I have tried fixing it using async loading, as well as delaying with DispatchQueue by 0.5 seconds. If anyone has an idea, it would mean the world if you could help. This is the final bug before the launch of our app.
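One workaround to try (a hedged sketch, not from the original post; the helper name is hypothetical) is to disable every top-level entity before the scene is attached to the view, then re-enable them after the first frame, so the scene genuinely starts empty:

```swift
import RealityKit

// Hypothetical workaround: hide all of the scene's entities before it is
// added to the ARView, then reveal them on the next run-loop pass, after
// the initial (empty) frame has already been rendered.
func attachHidden(_ scene: Entity & HasAnchoring, to arView: ARView) {
    let children = Array(scene.children)
    children.forEach { $0.isEnabled = false }

    arView.scene.anchors.append(scene)

    DispatchQueue.main.async {
        children.forEach { $0.isEnabled = true }
    }
}
```

Whether this removes the flash depends on when RealityKit composites the first frame, so treat it as something to experiment with rather than a confirmed fix.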
Posted by tea_fm. Last updated.
Post marked as solved
2 Replies
401 Views
I'm currently on a research project where we are making some trees for AR experiences, and I'm having some issues with the PNGs I'm using for leaves. I model the trees in a program called Tree It, then export into Maya to convert to a GLB with Babylon.js's Maya plugin. To test how it looks, I put the tree into the Babylon viewer, and it looks perfect: the PNG transparency works fine and there is no hazy white background around the planes I'm using as leaves. Now comes the issue. When I drop the GLB into Reality Converter, the PNGs suddenly have a mostly transparent but still visible white background. It seems the issue is with Reality Converter, since I don't run into this problem anywhere else in the pipeline. I vaguely remember reading something about Apple adding a faint white background to PNGs, but I can't remember if I just imagined that. Anyway, here is a closeup of the issue:
Posted. Last updated.
Post not yet marked as solved
1 Reply
256 Views
With Reality Composer I could get my React Native app to replay the recorded scene through Xcode. Question: is using the replay data supported in a standalone app, with no reliance on Xcode? Why? I am looking to automate our augmented reality tests on the lab's physical devices, which won't be physically moving for plane detection.
Posted by jypandey. Last updated.
Post not yet marked as solved
1 Reply
272 Views
I converted an FBX file to USDZ using Reality Converter. This seemed to work: I can double-click the USDZ file and an Xcode view pops up and shows the object without a problem. But when I go to Reality Composer and try to import this USDZ, I can see it added to the import list under + (Add), and I can drag it to the scene, but when I release, everything disappears and I see nothing in the scene. How can I figure out what happened? Is there a console view I can bring up? Is there a way to list all the objects in the scene?
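One way to answer the "list all the objects in the scene" part in code (a sketch, not from the original post; the helper is hypothetical) is to recursively dump a loaded scene's entity hierarchy:

```swift
import RealityKit

// Hypothetical debugging helper: print every entity under `entity`,
// indented by depth, so empty or missing models are easy to spot.
func dumpHierarchy(_ entity: Entity, depth: Int = 0) {
    let indent = String(repeating: "  ", count: depth)
    let name = entity.name.isEmpty ? "<unnamed>" : entity.name
    print("\(indent)\(name) (\(type(of: entity)))")
    for child in entity.children {
        dumpHierarchy(child, depth: depth + 1)
    }
}

// Usage, assuming `anchor` is a loaded Reality Composer scene:
// dumpHierarchy(anchor)
```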
Posted by fwalker. Last updated.
Post not yet marked as solved
0 Replies
152 Views
Hello all, hope someone can help with this issue I have encountered. I was using Reality Composer to create an image-anchor AR scene. I could load it in a previous version of Xcode and it worked fine. I upgraded Xcode from 12 to 13 and it still worked fine. But when I change the Reality Composer file and run it again, I receive a runtime error saying the anchor failed to be created because there is no asset named locatorImage.png:

var imageAnchor: RealComp.Scene?
// to simply load the file in viewDidLoad:
imageAnchor = try! RealComp.loadScene()

Fatal error: 'try!' expression unexpectedly raised an error: Failed to create anchor file because the archive does not have an asset named locatorImage.png.

Has anyone else come across this issue? Does "the archive" refer to the Reality Composer file? It does contain the image, as noted above. Hope someone can help :-)
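One small step that can help diagnose this (a sketch; `RealComp.loadScene()` is the generated loader named in the post) is replacing `try!` with `do`/`catch`, so the full error is logged instead of crashing:

```swift
import RealityKit

// Sketch: load the generated Reality Composer scene with do/catch so the
// full error (including the missing asset name) is printed rather than
// terminating the app via `try!`.
func loadImageAnchorScene() -> RealComp.Scene? {
    do {
        return try RealComp.loadScene()
    } catch {
        print("Failed to load scene:", error)
        return nil
    }
}
```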
Posted by ACTYAR. Last updated.
Post not yet marked as solved
3 Replies
328 Views
This code works perfectly on a simple cube; however, when loading a larger (250 MB) .rcproject file I get the "Unexpectedly found nil while implicitly unwrapping an Optional value" error. My code is as follows:

struct ARSandBox: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        var map: ModelEntity!
        let arView = ARView(frame: .zero)
        let delayTime = DispatchTime.now() + 3.0
        let scene = try! TeaTestMap.loadMapScene()
        map = scene.theMap!.children[0] as? ModelEntity
        map.isEnabled = false // error is here, at map.isEnabled
        arView.scene.anchors.append(scene)
        DispatchQueue.main.asyncAfter(deadline: delayTime) {
            map.isEnabled = true
        }
        return arView
    }
}
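A hedged alternative (names taken from the post; the async loader is the `load…Async` variant Reality Composer generates alongside the synchronous one, assumed here) is to load the large scene asynchronously and bind the model optionally, so a slow or partial load cannot leave an implicitly unwrapped optional nil:

```swift
import RealityKit
import SwiftUI

// Sketch: load the 250 MB scene asynchronously and only touch the model
// once it actually exists, using optional binding instead of `!`.
struct ARSandBox: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        TeaTestMap.loadMapSceneAsync { result in
            switch result {
            case .success(let scene):
                arView.scene.anchors.append(scene)
                // Optional binding avoids the crash when the child is
                // missing or is not a ModelEntity.
                if let map = scene.theMap?.children.first as? ModelEntity {
                    map.isEnabled = false
                    DispatchQueue.main.asyncAfter(deadline: .now() + 3.0) {
                        map.isEnabled = true
                    }
                }
            case .failure(let error):
                print("Failed to load scene:", error)
            }
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```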
Posted by tea_fm. Last updated.
Post not yet marked as solved
2 Replies
528 Views
I have two scenes in my .reality file, which was built using Reality Composer. The first scene is loaded first; then, after 1 second, I load the second scene. But this randomly crashes the Quick Look app. When I remove the Change Scene action, it works fine. I tested on an iPhone 7. Any pointers on avoiding the crashes?
Posted. Last updated.
Post not yet marked as solved
1 Reply
482 Views
Hi, I'm struggling to find a way to get a simple unlit material working with Reality Composer. With all the standard objects, it doesn't seem like we have the option of an unlit material, which seems really odd to me; this is kind of a basic feature. The only way I was able to get an unlit material inside Reality Converter was to import a mesh without a material, which gave me a white unlit material. I have seen that you can set an unlit material using RealityKit, but from what I saw, RealityKit builds an app at the end, right? Honestly, I'm not sure what you get when creating an AR app using RealityKit... What I'm looking for is a simple .reality file to be displayed on the web.
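For reference, the RealityKit route the post mentions looks roughly like this (a sketch; the mesh and anchoring are illustrative, and this produces an in-app entity rather than a .reality file):

```swift
import RealityKit
import UIKit

// Sketch: apply an unlit material to a simple box in RealityKit.
// UnlitMaterial ignores scene lighting entirely.
let material = UnlitMaterial(color: .white)
let mesh = MeshResource.generateBox(size: 0.1)
let entity = ModelEntity(mesh: mesh, materials: [material])

// Anchoring it, assuming an existing ARView `arView`:
// let anchor = AnchorEntity(plane: .horizontal)
// anchor.addChild(entity)
// arView.scene.anchors.append(anchor)
```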
Posted. Last updated.
Post not yet marked as solved
0 Replies
425 Views
Hello, I would like to ask whether, using RealityKit and Reality Composer, I can set up image tracking so that the object does not disappear when the iPhone no longer sees the image. In short: can I set it up in Reality Composer so that the 3D object stays fixed in position after scanning the image, and I could walk anywhere and it would still be there? So far I have only created a simple project with RealityKit: I added the model to Experience and set up image tracking for it via the composer. However, when the phone doesn't see the image, the model disappears. Can I make it stay put? Thank you for any advice.
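One common approach in code (a hedged sketch, not something Reality Composer does for you; `model` stands for the image-anchored entity) is to re-anchor the content at its current world transform once the image has been detected, so it no longer depends on the image staying in view:

```swift
import RealityKit

// Sketch: pin an image-anchored entity at its current world-space
// position by moving it under a world anchor.
func pinInWorld(_ model: Entity, in arView: ARView) {
    let worldAnchor = AnchorEntity(world: model.transformMatrix(relativeTo: nil))
    arView.scene.anchors.append(worldAnchor)
    // Keep the model exactly where it currently appears on screen.
    worldAnchor.addChild(model, preservingWorldTransform: true)
}
```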
Posted. Last updated.
Post not yet marked as solved
1 Reply
568 Views
I'm currently trying to project a transparent .png file onto a flat surface (table/paper), but the shadow is giving me a gray box. I am a huge fan of the shadows in Reality Composer; however, I'm searching for an option to adjust the shadow's opacity or turn it off.
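If the scene is rendered with RealityKit rather than viewed inside Reality Composer, grounding shadows can be disabled for the whole view (a sketch; there is no per-object shadow-opacity setting here as far as I know):

```swift
import RealityKit

// Sketch: turn off RealityKit's automatic grounding shadows for the
// entire ARView via its render options.
let arView = ARView(frame: .zero)
arView.renderOptions.insert(.disableGroundingShadows)
```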
Posted. Last updated.
Post not yet marked as solved
1 Reply
347 Views
I'm new to Reality Composer and RealityKit. I've set up a .rcproject with multiple image-tracking scenes, each set up identically. When an image is recognized, a cube is overlaid with a tap gesture that should send a notification to Xcode. In Xcode, I'm loading each of the scenes and adding the anchors to my ARView. When I run the app on my device, each image overlay displays; however, only the first loaded anchor's tap behavior works. Is this how it's meant to work? Any suggestions on getting multiple image-tracking behaviors from Reality Composer to work in Xcode would be appreciated.
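For reference, each loaded scene exposes its own notify actions in the generated code, so every scene needs its own handler; a hedged sketch where `Experience`, the scene loaders, and `cubeTapped` are hypothetical names from a generated .rcproject:

```swift
import RealityKit

// Sketch: load two Reality Composer scenes and attach a tap-notify
// handler to each. Setting a handler on one scene's actions does not
// cover the other scene's identically named action.
func loadScenes(into arView: ARView) {
    let sceneA = try! Experience.loadSceneA()
    let sceneB = try! Experience.loadSceneB()

    sceneA.actions.cubeTapped.onAction = { entity in
        print("Tapped cube in scene A:", entity?.name ?? "<unnamed>")
    }
    sceneB.actions.cubeTapped.onAction = { entity in
        print("Tapped cube in scene B:", entity?.name ?? "<unnamed>")
    }

    arView.scene.anchors.append(sceneA)
    arView.scene.anchors.append(sceneB)
}
```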
Posted by bryanmm. Last updated.
Post not yet marked as solved
0 Replies
223 Views
Hello, I am trying to export to USDZ the Cosmonaut model, which is available here: https://developer.apple.com/augmented-reality/quick-look/ It has points that trigger animations, and I wonder if something like that is supported in USDZ. My guess is no, since I can't export this particular model to USDZ in Reality Composer, but I wanted to be 100% sure. I would really appreciate it if someone could confirm that this functionality (triggered animations) is not available in USDZ.
Posted by rrrzzz. Last updated.
Post not yet marked as solved
1 Reply
367 Views
Forgive me, but I am new to AR and Reality Composer, and I am at a loss as to what app someone needs on their iPhone, iPad, Android device, etc. to read the files. As an example, I made a project on my iPad where a student can go up to a book and it will give information about that book, but that only works inside Reality Composer. I don't want students to be in Reality Composer, as that is fraught with issues and concerns. What application does one use so that when a student points their phone or iPad at the book, it does what it does in Reality Composer? Is this going to work only on Apple products? We use iPads, but we have a mix of Android and iPhones. I am really enjoying making these items, and they will soon be ready for prime time, but there is a missing link in how to properly share what you created on everyone else's devices. Thanks in advance, Tim
Posted. Last updated.
Post marked as solved
2 Replies
639 Views
Just upgraded my iPadOS app to iOS 15, and the AR part of the app is now completely broken; literally no AR functionality works. Other than that, an awesome upgrade ;) I verified I'm not crazy by installing the same app on an iPad 8 running iOS 14.8, where my AR functionality works fine. Broken environment: Xcode 13, iOS 15 (prod release), iPad Pro. Working environment: Xcode 13, iOS 14.8, iPad 8. I think this is worth a Zoom meeting to demo my findings with the Apple dev team.
Posted by MikeNunez. Last updated.
Post not yet marked as solved
1 Reply
198 Views
I opened Reality Composer in Xcode and created a lot of items, most of which are from the existing libraries, but I can't download them on my iPad. It displays: "The object type is no longer available."
Posted by BBEETTYY. Last updated.
Post not yet marked as solved
1 Reply
488 Views
After I updated Xcode to 13, the animation that comes with a USDZ is no longer played in Reality Composer / RealityKit, even though it works fine on the older Xcode 12 and 11. The problem: nothing happens when playing any USDZ animation, even when running on a real device. Does anyone else face this problem? I haven't found a solution or a fix yet.
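For reference, explicitly starting a USDZ's bundled animations in RealityKit looks roughly like this (a sketch; "model.usdz" is a placeholder file name, and this is a way to rule out autoplay issues, not a confirmed fix for the Xcode 13 regression):

```swift
import RealityKit

// Sketch: load a USDZ and explicitly play its first bundled animation,
// looping indefinitely.
if let entity = try? Entity.load(named: "model.usdz") {
    if let animation = entity.availableAnimations.first {
        entity.playAnimation(animation.repeat())
    } else {
        print("No animations found in the USDZ.")
    }
}
```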
Posted by Ahmad1652. Last updated.