Post not yet marked as solved
The Xcode project contains seven Reality Composer .rcproject files, each 11.5 MB to 16 MB in size.
Crash:
Beach-ball spin after toggling between .rcproject files in Xcode, where I choose Open in Reality Composer for each project.
I am inferring that there is a memory-deallocation bug in Xcode: after a .rcproject is opened in a tab, I open Reality Composer, work on a scene, close the scene, then pick another .rcproject to work on. After about 3-4 iterations, Xcode becomes non-responsive, the beach ball appears, and then Xcode crashes or I have to force-kill it.
Post not yet marked as solved
If I create an object, for example a sphere of a certain material, can I know its weight? Can I know the specific weight (density) of each material I can choose in Reality Composer? Thank you to anyone who can help me.
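Reality Composer itself doesn't expose mass or density for its material finishes, but if the scene ends up in a RealityKit app, mass can be assigned in code. A minimal sketch, assuming RealityKit's physics APIs; the density value is an assumption (roughly steel), not something Reality Composer provides:

```swift
import RealityKit

// Sketch: give a sphere a mass derived from a density value.
// The density (7850 kg/m^3, approximately steel) is an assumption;
// Reality Composer's built-in finishes carry no physical density.
let radius: Float = 0.1
let sphere = ModelEntity(mesh: .generateSphere(radius: radius))
let shape = ShapeResource.generateSphere(radius: radius)
let massProperties = PhysicsMassProperties(shape: shape, density: 7850)
sphere.components.set(PhysicsBodyComponent(massProperties: massProperties,
                                           material: .default,
                                           mode: .dynamic))
```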
Post not yet marked as solved
Can we put videos in Reality Composer on iPad or iPhone?
Post not yet marked as solved
Hello, is there a way to import 3D hat objects and have them anchored to the head? Using a face anchor only seems to anchor to the front of the face, no matter how I position it.
Thanks in advance.
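If the scene is loaded into a RealityKit app, one workaround is to offset the hat from the face anchor's origin in code. A sketch, assuming an `ARView` named `arView` and a hypothetical "hat" model in the bundle; the offsets are guesses to be tuned per model:

```swift
import RealityKit

// Sketch: a face anchor's origin sits at the front-center of the face,
// so the hat model is nudged up (and slightly back) from the origin.
// Model name and offsets are assumptions.
func addHat(to arView: ARView) {
    let faceAnchor = AnchorEntity(.face)
    guard let hat = try? Entity.load(named: "hat") else { return }
    hat.position = [0, 0.12, -0.02]   // ~12 cm up, 2 cm back; tune per model
    faceAnchor.addChild(hat)
    arView.scene.addAnchor(faceAnchor)
}
```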
Post not yet marked as solved
Using Reality Composer [within Xcode]... I've built an .rcproject file with 25 scenes containing Behaviors w/Triggers + Actions to change scenes, one after another.
When exported as a .reality file, it does run successfully through all 25 scenes.
❓However... when exported as a .usdz file via "Export: Project" (not "Export: Current Scene"), it only runs the 1st scene, as though the rest of the scenes do not exist. The Trigger to change to the 2nd scene does nothing.
✅ Btw I do have "Enable USDZ export" checked (which is not on by default) in the Preferences for Reality Composer.
✨Thank you✨ I'd greatly appreciate any troubleshooting tips❗️
Post not yet marked as solved
Hi everyone, is it possible to use the ARKit Replay Data option for XCUITests? If not, this would be a great feature for automation.
Thanks!
Post not yet marked as solved
It's been a few years since I've worked on a Mac, and a client needs some work done with Reality Composer. So forgive me if I'm forgetting, or simply don't know, more modern troubleshooting on macOS. I come from the days of zapping the PRAM.
https://youtu.be/O95MCQmECcM
A screen capture of the behavior.
In short, after launching the app, selecting an object causes it to close out/crash.
Any guidance would help.
Some error info:
RAQLPreviewExtension (process name)
Logical CPU: 4
Error Code: 0x00000015 (invalid protections for user instruction read)
Trap Number: 14
Post not yet marked as solved
I'm having some issues with Reality Composer (the latest v1.5 with the latest Xcode beta) and I just wanted to check if these are known issues.
I have an image of a printed map which I'd like to turn into an AR-based interactive map. Something simple, where I tap a pin on the map, and details about that location move up from beneath the map, then move back out of view. The map, the pin and the location information are all separate flat images, and I'm using a horizontal anchor. Unfortunately, it seems that simple bugs are stopping this from working at all.
Say I set the pin to respond to a Tap, then use the "Move, Rotate, Scale to" action to move a second object (the location info) up, add a Wait action, then add another "Move, Rotate, Scale to" to move the info object back down to its original position. The result: the info object doesn't initially move up as far as it should, and the second "Move" pushes the info object away and out of sight completely.
If I try another approach, using the Show and Hide actions (using "Move from below" and "Move to below" as the Motion type), and again with a Wait in the middle, it works the first time, but subsequent taps cause the info object to simply appear, with no incoming animation, and then the outgoing animation works correctly.
Is it just something wrong with my system, or is this broken? If I don't try to move objects around (i.e. Show/Hide with "No Motion") then I have more luck, but I'm feeling pretty constrained.
Thanks in advance for all help with this.
Post not yet marked as solved
Hi, I'm working with RealityKit and Reality Composer. When I build a scene in Reality Composer, place the experience in Xcode and try the app out, the 3D models appear fine. But when I place my hand over a model, the app doesn't recognise that my hand is in front of it, and the model shows through. When I place an object in front of it, the app also doesn't recognise whether the model is in front of or behind the object. How do I fix this?
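This sounds like occlusion, which isn't enabled by default in an ARView. A sketch of turning on people occlusion (A12 chip or later) and, on LiDAR devices, scene-mesh occlusion for real-world objects, assuming an `ARView` named `arView`:

```swift
import ARKit
import RealityKit

// Sketch: enable occlusion so real hands and objects hide virtual content.
func enableOcclusion(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    // People occlusion (hands, bodies): needs an A12 chip or later.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    // Real-world object occlusion: needs a LiDAR-equipped device.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    }
    arView.session.run(config)
}
```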
Hoping someone can help me with this. I'm trying to create an AR experience for a prefab house builder. I've made a video showing my Reality Composer screen and viewing the files on an iPhone, viewable here:
https://www.youtube.com/watch?v=7VsBxxnw3pE
I exported both a .reality file and a .usdz file because I wasn't sure what was the proper way.
The first thing I'm showing on my iPhone is the .reality file. You can see that at first it has issues placing the blue hexagon shape, but then it locates correctly after I move the phone left to right. The main problem here is that it doesn't bring in the house at all. You can see when I switch to object view that only the blue hexagon is present, no house. The other problem is that the model loses its tether to the image target as soon as the image is out of view of the camera. This obviously won't work for something like a prefab house walkthrough, because people want to walk all around and through the house on site.
Now for the .usdz export. As you can see, the house is present in object view, although it is tipped on its side. I believe I can fix this issue with the Revit exporter. It did bring the house in like this in Reality Composer, but I used the rotation tools to make it sit flat and orient it with the image target. The other problem is that in AR the image tracking doesn't seem to be working at all. I did notice a slight vibration when the phone saw the image, but no blue hexagon or house was visible.
For some background, I modeled this house in Revit and exported it as a .usdz file using a plugin. I'm running Reality Composer version 1.5 in Xcode. I'm trying to develop a procedure for this so I can do it for clients often. Please help! Thanks.
Post not yet marked as solved
Can you place an Augmented Reality Anchor in a private apple indoor map?
https://register.apple.com/resources/indoor/program/indoor_maps
https://developer.apple.com/augmented-reality/tools/
Post not yet marked as solved
Start Xcode 13.4 and choose a .rcproject file in the current project. Go to the window where the RC scene is rendered, choose Open in Reality Composer, and Xcode crashes. Restart Xcode and the crash cycle happens again immediately, because the current window is still an RC scene. This bad behavior repeats until, upon opening Xcode, you quickly choose another (non-RC) file tab before Xcode reaches a steady running state.
Post not yet marked as solved
Hi, I have a strange problem with my exported Reality Composer scene. I have composed a scene of around 19 MB with an animated character, two iPhones, text prompts, and quite a few behaviors. When I export the project, share it, and use it on any of my devices (iPhone 12, iPhone X and an older iPad), it works perfectly. Whenever I share this same file with anyone else (I've tried friends and many demo models at the Apple Store), the same bugs persist: shadows are weird, but worst of all, certain buttons (made from text prompts) that have a behavior attached to them are the wrong size and don't behave correctly. I have tried exporting several times, from different devices, with the same result. Can anyone help?
Post not yet marked as solved
I'm aware that you can create a simple scene in RC, import it into Xcode, and then use RealityKit to add additional 3D objects with OcclusionMaterial applied to them.
However, I would like to accomplish that the other way around, i.e. export a .reality file (or .usdz?) from Xcode with OcclusionMaterial applied, and bring it into RC. Is that possible? If so, what's the workflow?
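For reference, the forward direction mentioned above is only a few lines in RealityKit; Reality Composer's own material picker has no occlusion material, which is presumably why the reverse export is the open question. A sketch of the code-side occluder:

```swift
import RealityKit

// Sketch: an invisible occluder box added in code on top of an RC scene.
// It renders nothing itself but hides virtual content behind it.
// The size is an arbitrary example value.
let occluder = ModelEntity(mesh: .generateBox(size: [0.5, 0.5, 0.5]),
                           materials: [OcclusionMaterial()])
```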
Post not yet marked as solved
Hi,
So I created a scene in Reality Composer, but when I export to USDZ, all of the animations, audio and interactions work, yet the textures are completely changed (the map should be green, for example).
Everything looks fine in Reality Composer
Unfortunately, exporting as a .reality file isn't an option in this project.
Could I get some insight into this please? Thank you
Post not yet marked as solved
Hello, I am new to Reality Composer and AR, so I would like some advice.
I've added hide, show, and wait actions to my image planes via the Reality Composer tool on the Mac. Through this, I've created a fake 'frame by frame' animation. The animation plays smoothly in Reality Composer. However, when exported as a .usdz file, the animation plays in the Mac's Quick Look but not when the file is imported into Xcode. In Xcode, it just displays all my images and does not hide them. Does anyone have any idea how to fix this? Thanks!
tl;dr: why don't hide and show behaviors created in Reality Composer play in Xcode?
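One hedged workaround: Reality Composer behaviors are generally preserved in .reality exports, so loading the .reality file (rather than the .usdz) in the Xcode app may keep the hide/show actions working. A sketch, assuming a hypothetical "MyScene.reality" file in the app bundle and an `ARView` named `arView`:

```swift
import RealityKit

// Sketch: load the .reality export, which keeps RC behaviors,
// instead of the .usdz. The file name is hypothetical.
func loadRealityScene(into arView: ARView) throws {
    guard let url = Bundle.main.url(forResource: "MyScene",
                                    withExtension: "reality") else { return }
    let anchor = try Entity.loadAnchor(contentsOf: url)
    arView.scene.addAnchor(anchor)
}
```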
Post not yet marked as solved
I am developing an app that sticks a 3D object to the face. Right now I am using Xcode and Reality Composer. The camera that shows up in the ARView is the rear one; I need the front camera. I don't know if I have the right code for the Reality Composer file. Could anyone help me?
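By default an ARView configures a rear-camera world-tracking session; running a face-tracking configuration explicitly switches to the front (TrueDepth) camera. A sketch, assuming an `ARView` named `arView`:

```swift
import ARKit
import RealityKit

// Sketch: switch the session to the front camera for face tracking.
func runFrontCameraFaceTracking(on arView: ARView) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    // Stop ARView from re-running its own default world-tracking config.
    arView.automaticallyConfigureSession = false
    arView.session.run(ARFaceTrackingConfiguration())
}
```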
Post not yet marked as solved
Is there any way to anchor a 3D object to the face in Reality Composer using the rear camera?
The context is:
ARSessionDelegate
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {}
Upon Image Detection, place an overlay of a RC Entity at the ImageAnchor location.
Using iOS 14, I have been reliably using:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
to render additive RC scene content. Additive model entities render correctly in the field of view.
When I upgraded to iOS 15, the same code that had worked for many (~12) months on all versions of iOS 14 failed to render any RC scene content. I get a ghosting of all of the render content in the proper location, but it is visible only for a moment, then it disappears.
So, I finally found the root cause of the issue. It appears that iOS 15 only renders correctly in my application using:
mySceneAnchorEntity = AnchorEntity(world: imageAnchor.transform)
This led to many frustrating days of debugging. As a side note, iOS 14 renders RC scene entities correctly using both variants of AnchorEntity:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
and
mySceneAnchorEntity = AnchorEntity(world: imageAnchor.transform)
So this leads me to believe there is an issue in iOS 15 with the following variant of AnchorEntity:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
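Putting the workaround above into the same delegate callback looks roughly like this; `arView` is assumed to be a property on the delegate class, and attaching the loaded RC scene content is elided:

```swift
import ARKit
import RealityKit

// Sketch of the iOS 15 workaround described above.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let imageAnchor as ARImageAnchor in anchors {
        // iOS 15: anchor to the world transform rather than to the
        // ARImageAnchor itself, which ghosts and disappears.
        let sceneAnchor = AnchorEntity(world: imageAnchor.transform)
        arView.scene.addAnchor(sceneAnchor)
        // ...add the loaded RC scene entities as children of sceneAnchor
    }
}
```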
Post not yet marked as solved
Any USDZ file exported from Reality Composer fails to import into Reality Converter. Every USDZ file I have exported from Composer (even just a cube with a default glossy paint finish anchored on a horizontal surface) fails on import to Converter with the message "Conversion Failed: 1 Error". I suspect that the problem is with Composer and not Converter, since USDZ files created by other means import into Converter just fine.
Steps to repro:
1. Enable USDZ export in Reality Composer (Preferences/Enable USDZ export)
2. Add a cube
3. Export USDZ, current scene only (although this seems to make no difference)
4. Open Reality Converter
5. Drag the USDZ file from step 3 into the window
6. Observe the error message
Better import diagnostics would be good too.