Reality Composer

Prototype and produce content for AR experiences using Reality Composer.

Reality Composer Documentation

Posts under Reality Composer tag

76 Posts
Post not yet marked as solved
5 Replies
1.3k Views
One of people's favourite features of AR Quick Look is video playback. Aside from the feature itself, it represents familiarity: people have used video for decades, so it places AR right next to other common formats, which is awesome! Previously, if a USDZ's animation was longer than 10 seconds, the playback controls would appear. This rule was recently changed so that the option comes from the USDZ metadata, set using the Python tool. Unfortunately, most designers are still not using the Python toolkit, so the feature is now hidden in most cases, which is too bad. It would be great to add the ability to set that metadata from Reality Converter & Composer. Better yet, if it were available as an HTML fragment identifier, similar to "allowsContentScaling", that would be super helpful.
Posted
by
Post not yet marked as solved
1 Reply
446 Views
Right, so I've been playing with RealityKit and Reality Composer on my Mac Pro. In general, what I want to achieve is a scene that can respond to tap interactions, but I want to build it as an app rather than access it via the Reality Composer app. So I set up my AR project and built a bunch of tap-based flows with behaviours like hide/unhide etc. I imported the rcproject into Xcode and loaded each scene with let planeAnchor = try! ExperienceRcproject203847722.loadSceneName() followed by arView.scene.anchors.append(planeAnchor). Then when I run the app I get the following log:

2020-07-23 09:06:12.818800+0100 realityComposer[17010:2669060] Compiler failed to build request
2020-07-23 09:06:12.819015+0100 realityComposer[17010:2669060] [Graphics] makeRenderPipelineState failed [output of type ushort is not compatible with a MTLPixelFormatR16Float color attachement.].
2020-07-23 09:06:12.819055+0100 realityComposer[17010:2669060] [Graphics] makeRenderPipelineState failed.
2020-07-23 09:06:19.027024+0100 realityComposer[17010:2669100] [Graphics] Failed to find reflection for buffer clusterIndexTable
2020-07-23 09:06:19.027401+0100 realityComposer[17010:2669100] [Graphics] Failed to find reflection for buffer clusterIndexTable
2020-07-23 09:06:20.365286+0100 realityComposer[17010:2669100] [Graphics] Failed to find reflection for buffer clusterIndexTable
2020-07-23 09:06:20.365588+0100 realityComposer[17010:2669100] [Graphics] Failed to find reflection for buffer clusterIndexTable
2020-07-23 09:06:25.644804+0100 realityComposer[17010:2669100] [Graphics] Failed to find reflection for buffer clusterIndexTable
2020-07-23 09:06:25.690250+0100 realityComposer[17010:2669100] [Graphics] Failed to find reflection for buffer clusterIndexTable

Plus, each time I tap on objects to perform the behaviour I created in Reality Composer, I get the following in the output log:

2020-07-23 09:06:31.537449+0100 realityComposer[17010:2669060] [Collision] Bad paramater (SphereRadius), value = 0.000000, passed to shape creation.

Can anyone advise on what the solution could be here?
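As an aside, the try!-based loader above crashes on any load failure, so errors surface as a crash rather than a diagnostic. Xcode also generates an asynchronous loader next to the synchronous one; a minimal sketch, assuming the generated type ExperienceRcproject203847722 exposes a loadSceneNameAsync loader (substitute your real scene name):

```swift
import RealityKit
import UIKit

final class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Asynchronous load: a failure is reported through the Result
        // instead of crashing the app the way try! does.
        ExperienceRcproject203847722.loadSceneNameAsync { [weak self] result in
            switch result {
            case .success(let planeAnchor):
                self?.arView.scene.anchors.append(planeAnchor)
            case .failure(let error):
                print("Scene failed to load:", error)
            }
        }
    }
}
```

This won't silence the Metal warnings, but the SphereRadius message suggests a collision shape in the scene may be ending up with a zero radius, which is worth checking separately.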
Posted
by
Post not yet marked as solved
1 Reply
678 Views
Hi! I've been working with AR Quick Look a lot recently and find it really useful. However, when I combine two .usdz files into one, the experience isn't as great. I'm working with 3D models that are supposed to be attached to vertical surfaces, and one by one they work flawlessly. But as soon as I add more models into the same file, the objects won't "stick" as close to the wall as they do when they are separate. It's like there is some sort of margin applied. To create nested .usdz files, I use Apple's command-line tool: $ usdzcreateassetlib outputFile.usdz asset1.usdz [asset2.usdz [...]] Any idea why this might be the case? Thanks!
Posted
by
Post not yet marked as solved
1 Reply
568 Views
I'm currently trying to project a transparent .png file onto a flat surface (table/paper), but the shadow is giving me a gray box. I'm a huge fan of the shadows in Reality Composer; however, I'm searching for an option to adjust the shadow's opacity or turn it off.
Posted
by
Post not yet marked as solved
1 Reply
402 Views
Hi, is it possible to use a function like "installGestures" (.translation, .scale, .rotation) while using a Reality Composer project in Xcode? This works fine when applied to a single imported USDZ file, but I need to get it working on a Reality Composer project, which has various behaviors set. Thanks in advance for any tips. Regards, Steven
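For reference, the commonly cited workaround is to generate collision shapes on the loaded scene first, since gesture hit-testing needs them, and then install gestures on each child that carries collision. A sketch, assuming sceneAnchor was loaded from the .rcproject and arView is the hosting ARView (names are placeholders):

```swift
import RealityKit

// Hypothetical helper: Reality Composer entities don't always carry
// collision components, which installGestures needs for hit-testing.
func installGestures(on sceneAnchor: Entity, in arView: ARView) {
    // Give every descendant a collision shape derived from its mesh.
    sceneAnchor.generateCollisionShapes(recursive: true)
    for child in sceneAnchor.children {
        if let model = child as? Entity & HasCollision {
            arView.installGestures([.translation, .rotation, .scale], for: model)
        }
    }
}
```

Note that Reality Composer behaviors and manually installed gestures can fight over the same taps, so this may need testing against each behavior in the scene.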
Posted
by
Post not yet marked as solved
1 Reply
790 Views
I have created a multi-level scene where the user can tap an audio file icon and an .mp3 file will play a short audio clip. This works beautifully locally on my computer and when I share to my iPhone/iPad. However, when I export the USDZ file, upload it to my website, and test from there, the audio does not play. The other behaviors (Tap + Show) work, but the audio does not play. I've tried the .mp3, .wav, and .caf formats with no luck. I created a new Reality Composer file, added a cube, and added a Tap + Sound behavior using the built-in audio options (Happy Chime), but the exact same issue occurs. Any suggestions on what I can adjust in my project to allow audio to play? Do the audio files need to be in the same directory as the USDZ file? Any help is appreciated.
Posted
by
Post marked as solved
5 Replies
1.1k Views
After updating my iPhone XS Max to iOS 14.3, I found that all of my exported USDZ files have z-buffer issues and clipping when it comes to opacity. The files are displayed correctly in Reality Converter. I am in the process of rolling the phone back now to see if that fixes the issue. Has anyone else encountered this after updating?
Posted
by
Post not yet marked as solved
1 Reply
952 Views
I use usdzaudioimport to embed a simple audio file in my USDZ, but in Quick Look (iOS or macOS) the audio is never played. What I've tried:

usdzaudioimport ./myfile.usdz ./myfile.usda -a /test audio.mp3 -auralMode notSpatial -playbackMode loopFromStart
usdzip myfile.usdz myfile.usda ./0/*
(where ./0 is the asset folder of myfile.usda)

I tried with usdc:
usdzaudioimport ./myfile.usdz ./myfile.usdc -a /test audio.mp3 -auralMode notSpatial -playbackMode loopFromStart
usdzip myfile.usdz myfile.usdc ./0/*

I tried with usdz without converting to usd(a/c):
usdzaudioimport ./myfile.usdz -a /test audio.mp3 -auralMode notSpatial -playbackMode loopFromStart

I tried m4a and mp3. I also tried the sample cube from Reality Composer, adding my audio.mp3 as a behavior at the start of the scene and exporting to USDZ for Quick Look. None of these solutions worked; I'm not able to play any audio at the start of the scene with an AR Quick Look USDZ file.
Posted
by
Post not yet marked as solved
1 Reply
559 Views
Hi, I'm exporting an object from Blender to an FBX (or GLTF/GLB) file to then convert it in Reality Converter. Inside there, everything looks good: all animations work and the object is positioned at 0 0 0. However, when I view the exported USDZ file in Safari on my iPhone, the object does not align to the horizontal plane but floats in the air, yet in Reality Composer the object is aligned perfectly with the floor. How can you set the correct alignment position for the object, either in Blender, Reality Composer, or Reality Converter? Thanks
Posted
by
Post not yet marked as solved
1 Reply
423 Views
In Reality Composer, I designate an object as hidden at scene start. In my application, I attach the object as a new child (anEntity) to an AnchorEntity active in the ARView: anchorEntity.addChild(anEntity). When I execute anEntity.isEnabled = true, the object does not become visible, even though anEntity.isActive is true. What am I missing? Or is this a Reality Composer/RealityKit bug?
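One thing worth checking (a sketch, not a confirmed fix): re-parenting the entity under a different AnchorEntity detaches it from the Reality Composer scene that owns its behaviors. Keeping the object inside its loaded scene and toggling it there looks like this, assuming a generated Experience.loadScene() loader and an object named "MyObject" in Reality Composer (both placeholders):

```swift
import RealityKit

func showHiddenObject(in arView: ARView) throws {
    // Keep the Reality Composer scene intact so its behaviors still apply.
    let scene = try Experience.loadScene()   // placeholder loader name
    arView.scene.anchors.append(scene)

    // Look the object up by the name given in Reality Composer and
    // un-hide it in place rather than re-parenting it.
    if let hidden = scene.findEntity(named: "MyObject") {
        hidden.isEnabled = true
    }
}
```

If the scene instead defines a "Show" behavior with a notification trigger, posting that notification should run the behavior's animation as well.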
Posted
by
Post not yet marked as solved
1 Reply
415 Views
We are producing an augmented reality experience of about 2 minutes with various scenes and animations. On iOS 14 everything worked without any problem, but on iOS 15 AR Quick Look suddenly says it cannot open it. It seems that reality files that are large or contain two or more animations do not open. Please check. I attach a video of it working well on iOS 14: https://youtu.be/KPo_ikRclmc Thanks.
Posted
by
Post marked as solved
2 Replies
638 Views
Just upgraded my iPadOS app to iOS 15 and the AR part of the app is now completely broken; literally no AR functionality works. Other than that, an awesome upgrade ;) I verified I'm not crazy by installing the same app on an iPad 8 running iOS 14.8, where my AR functionality works fine. Broken environment: Xcode 13, iOS 15 (prod release), iPad Pro. Working environment: Xcode 13, iOS 14.8, iPad 8. I think this is worth a Zoom meeting to demo my findings with the Apple dev team.
Posted
by
Post marked as solved
1 Reply
447 Views
Long-time listener, first-time caller here. We'd like to use AR Quick Look to render a rigged game character that walks to where you tap on the base surface. So far, I'm only seeing the ability to trigger actions sequentially, so moving and animating simultaneously seems un-doable. I guess one could parent the USDZ to a "mover" object and then run the transform on the mover and the animation on the USDZ, but ugh. Has anyone done something like this?
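Within AR Quick Look itself the sequential-actions limitation stands, but if dropping down to a RealityKit app is an option, a transform move and a skeletal animation can run concurrently on one entity. A sketch with placeholder names (the character entity and walk distance are assumptions):

```swift
import RealityKit

// Walk a rigged character to a tapped point: the walk cycle baked into
// the USDZ and the translation run at the same time.
func walk(_ character: Entity, to target: SIMD3<Float>) {
    // Loop whatever animation the asset ships with.
    if let walkCycle = character.availableAnimations.first {
        character.playAnimation(walkCycle.repeat())
    }
    // Simultaneously translate the entity toward the tap point.
    var destination = character.transform
    destination.translation = target
    character.move(to: destination, relativeTo: character.parent, duration: 2.0)
}
```

The target point would typically come from an ARView raycast against the tapped surface.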
Posted
by
Post marked as solved
3 Replies
567 Views
Hoping someone can help me with this. I'm trying to create an AR experience for a prefab house builder. I've made a video showing my Reality Composer screen and viewing the files on an iPhone, viewable here: https://www.youtube.com/watch?v=7VsBxxnw3pE

I exported both a .reality file and a .usdz file because I wasn't sure which was the proper way. The first thing I show on my iPhone is the .reality file. You can see that at first it has issues placing the blue hexagon shape but then locates it correctly after I move the phone left to right. The main problem here is that it doesn't bring in the house at all: when I switch to object view, only the blue hexagon is present, no house. The other problem is that the model loses its tether to the image target as soon as the image is out of view of the camera. This obviously won't work for something like a prefab house walkthrough, because people want to walk all around and through the house on site.

Now for the .usdz export. As you can see, the house is present in object view, although it is tipped on its side. I believe I can fix this with the Revit exporter; it came into Reality Composer like this too, but I used the rotation tools to make it sit flat and orient it with the image target. The other problem is that in AR the image tracking doesn't seem to be working at all. I did notice a slight vibration when the phone saw the image, but no blue hexagon or house was visible.

For some background, I modeled this house in Revit and exported it as a USDZ file using a plugin. I'm running Reality Composer version 1.5 in Xcode. I'm trying to develop a procedure for this so I can do it for clients often. Please help! Thanks.
Posted
by
Post not yet marked as solved
0 Replies
255 Views
Is anyone else having trouble with image targets in Reality Composer? When I move my phone directly over or in very close proximity to an image target, my scene shows correctly. But as soon as I move my camera away, the scene seems to jump to some arbitrary location in the room. When I move back to the target, it returns to the correct location shortly after.
Posted
by
Post not yet marked as solved
0 Replies
228 Views
I'd like to bring a prefab house into Reality Composer, and the model has a foundation that needs to be below ground level. Reality Composer seems to snap the bottom of every object to the surface wherever you are, but that of course doesn't work for a house with a subterranean foundation. Is there a way to force it to sit 2' or so lower?
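In Reality Composer itself, one option is to type a negative Y value into the object's position in the Transform panel rather than dragging it. If the scene is instead loaded in code, the same offset can be applied to the entity under its anchor; a sketch with a hypothetical asset name and a ~2 ft offset:

```swift
import RealityKit

func placeHouse(in arView: ARView) throws {
    // Anchor on any detected horizontal surface.
    let anchor = AnchorEntity(plane: .horizontal)
    let house = try Entity.load(named: "House")  // hypothetical asset name
    // Sink the model about 2 feet (0.61 m) below the detected surface
    // so the foundation ends up underground.
    house.position.y = -0.61
    anchor.addChild(house)
    arView.scene.anchors.append(anchor)
}
```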
Posted
by
Post not yet marked as solved
0 Replies
375 Views
Is there a way to import custom fonts into Reality Composer on iPad? I can see them properly on desktop, but I'm trying to figure out how to do the same on iPad so that when I export my project on the remote device there is visual parity. Thanks
Posted
by