Post not yet marked as solved
If I create an object, for example a sphere made of a certain material, can I find out its weight? Can I find the specific weight (density) of the materials I can choose from in Reality Composer? Thank you to anyone who can help me.
Can we put videos in Reality Composer on iPad or iPhone?
Xcode project contains:
7 Reality Composer .rcproject files, ranging from 11.5 MB to 16 MB each.
Crash:
Beach-ball spin after toggling between .rcproject files in Xcode, where I open Reality Composer for each project.
I am inferring that there is a memory-deallocation bug in Xcode: after a .rcproject is opened in a tab, I open Reality Composer, work on a scene, close the scene, then pick another .rcproject to work on. After about 3-4 iterations, Xcode becomes unresponsive, the beach ball appears, and then Xcode either crashes or I have to force-kill it.
Hello, is there a way to import 3D hat objects and have them anchored to the head? Using a face anchor only seems to anchor to the front of the face, no matter how I position it.
Thanks in advance.
Using Reality Composer [within Xcode]... I've built an .rcproject file with 25 scenes containing Behaviors w/Triggers + Actions to change scenes, one after another.
When exported as a .reality file, it does run successfully through all 25 scenes.
❓However... when exported as a .usdz file via "Export: Project" (not "Export: Current Scene"), it only runs the 1st scene, as though the rest of the scenes do not exist. The Trigger to change to the 2nd scene does nothing.
✅ Btw I do have "Enable USDZ export" checked (which is not on by default) in the Preferences for Reality Composer.
✨Thank you✨ I'd greatly appreciate any troubleshooting tips❗️
Hi everyone, is it possible to use the ARKit Replay Data option for XCUITests? If not, this would be a great feature for automation.
Thanks!
I'm having some issues with Reality Composer (the latest v1.5 with the latest Xcode beta) and I just wanted to check if these are known issues.
I have an image of a printed map which I'd like to turn into an AR-based interactive map. Something simple, where I tap a pin on the map, and details about that location move up from beneath the map, then move back out of view. The map, the pin and the location information are all separate flat images, and I'm using a horizontal anchor. Unfortunately, it seems that simple bugs are stopping this from working at all.
Say I set the pin to respond to a Tap, and I use the "Move, Rotate, Scale to" action to move a second object (the location info) up, then add a Wait action, then another "Move, Rotate, Scale to" to move the info object back down to its original position. The result: the info object doesn't initially move up as far as it should, and the second "Move" pushes it away and out of sight completely.
If I try another approach, using the Show and Hide actions (using "Move from below" and "Move to below" as the Motion type), and again with a Wait in the middle, it works the first time, but subsequent taps cause the info object to simply appear, with no incoming animation, and then the outgoing animation works correctly.
Is it just something wrong with my system, or is this broken? If I don't try to move objects around (i.e. Show/Hide with "No Motion") then I have more luck, but I'm feeling pretty constrained.
Thanks in advance for all help with this.
Hi, I'm working with RealityKit and Reality Composer. When I build a scene in Reality Composer, place the experience in Xcode, and try the app out, the 3D model appears fine. But when I place my hand over it, the app doesn't recognise that my hand is in front of it, and the model shows through. When I place an object in front of it, the app likewise doesn't recognise whether the model is in front of or behind the object. How do I fix this?
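What this post describes sounds like missing occlusion. A minimal sketch, assuming an existing `ARView` named `arView`, of enabling people occlusion and (on LiDAR devices) scene occlusion in RealityKit:

```swift
import ARKit
import RealityKit

func enableOcclusion(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // People occlusion: hide virtual content behind detected people.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Scene occlusion (LiDAR devices only): hide virtual content
    // behind reconstructed real-world geometry.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    }

    arView.session.run(config)
}
```

Note that occluding virtual content with arbitrary real objects (as opposed to people) requires a LiDAR-equipped device; without one, only people occlusion is available.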
Start Xcode 13.4 and choose a .rcproject file in the current project. Go to the window where the Reality Composer scene is rendered, choose "Open in Reality Composer", and Xcode crashes. Restarting Xcode triggers the crash cycle again immediately, because the current window is still a Reality Composer scene. This bad behavior repeats until you quickly select another (non-Reality-Composer) file tab when Xcode opens, before Xcode reaches a steady running state.
Hi, I have a strange problem with my exported Reality Composer scene. I have composed a scene of around 19 MB with an animated character, two iPhones, text prompts, and quite a few behaviors. When I export the project, share it, and use it on any of my devices (iPhone 12, iPhone X, and an older iPad), it works perfectly. But whenever I share this same file with anyone else (I have tried friends and many models at an Apple Store), the same bugs appear: the shadows are wrong, and worst of all, certain buttons (made from text prompts) that have a behavior attached are the wrong size and don't behave correctly. I have tried exporting several times, from different devices, with the same result. Can anyone help?
It's been a few years since I've worked on a Mac, and a client needs some work done with Reality Composer. So forgive me if I'm forgetting things or don't know more modern troubleshooting on macOS. I come from the days of zapping the PRAM.
https://youtu.be/O95MCQmECcM
A screen capture of the behavior.
In short: after launching the app, selecting an object makes it close out/crash.
Any guidance would help.
Some error info:
RAQLPreviewExtension (process name)
Logical CPU: 4
Error Code: 0x00000015 (invalid protections for user instruction read)
Trap Number: 14
Can you place an augmented reality anchor in a private Apple indoor map?
https://register.apple.com/resources/indoor/program/indoor_maps
https://developer.apple.com/augmented-reality/tools/
Hello, I am new to Reality Composer and AR, so I would like some advice.
I've added Hide, Show, and Wait actions to my image planes via the Reality Composer tool on the Mac. Through this, I've created a fake "frame by frame" animation. The animation plays smoothly in Reality Composer. However, when exported as a .usdz file, the animation plays in the Mac's Quick Look but not when the file is imported into Xcode. In Xcode, it just displays all my images and does not hide them. Does anyone have any idea how to fix this? Thanks!
tl;dr: Hide and Show behaviors created in Reality Composer do not play in Xcode. Why?
I am developing an app that sticks a 3D object to the face. Right now I am using Xcode and Reality Composer. The ARView that shows up uses the rear camera, but I need the front camera. I don't know the right code for the Reality Composer file. Could anyone help me?
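Not an official answer, but a common approach for face-anchored content: run the ARView's session with an `ARFaceTrackingConfiguration`, which always uses the front (TrueDepth) camera. A sketch, where the commented-out loader name is illustrative and depends on what Xcode generates from the .rcproject:

```swift
import ARKit
import RealityKit
import UIKit

final class FaceViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }

        // Running the session manually overrides ARView's automatic
        // configuration (which defaults to rear-camera world tracking).
        let config = ARFaceTrackingConfiguration()
        arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])

        // Then append the face-anchored scene generated from the .rcproject,
        // e.g. (hypothetical generated names):
        // let scene = try! Experience.loadFace()
        // arView.scene.anchors.append(scene)
    }
}
```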
Is there any way to anchor a 3D object to a face in Reality Composer using the rear camera?
In the Run scheme options for my ARKit application, I am able to select a Reality Composer video to be replayed instead of using the device's camera/sensors. This is very useful for manual testing of my AR application. But what I would really like to do is use this feature in my automated UI tests. However, when launching my application:
let app = XCUIApplication()
app.launch()
The Reality Composer video is not replayed.
Am I missing something? Is this feature supported in UI Tests?
Any USDZ file exported from Reality Composer fails to import into Reality Converter. Every USDZ file I have exported from Composer (even just a cube with a default glossy paint finish anchored on a horizontal surface) fails on import to Converter with the message "Conversion Failed: 1 Error". I suspect that the problem is with Composer and not Converter, since USDZ files created by other means import into Converter just fine.
Steps to repro:
1. Enable USDZ export in Reality Composer (Preferences > Enable USDZ export)
2. Add a cube
3. Export USDZ, current scene only (although this seems to make no difference)
4. Open Reality Converter
5. Drag the USDZ file from step 3 into the window
6. Observe the error message
Better import diagnostics would be good too.
I have made a .reality file in Reality Composer and imported it into Swift Playgrounds, but I cannot find a way to run the file.
How would I go about this?
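A sketch of one way this is commonly done, assuming the file is added as a playground resource named `Experience.reality` (the name is illustrative):

```swift
import ARKit
import RealityKit
import PlaygroundSupport
import UIKit

// Create an ARView to host the AR session and rendered content.
let arView = ARView(frame: CGRect(x: 0, y: 0, width: 600, height: 800))

// Load the first anchor from the .reality file and add it to the scene.
if let url = Bundle.main.url(forResource: "Experience", withExtension: "reality"),
   let anchor = try? Entity.loadAnchor(contentsOf: url) {
    arView.scene.anchors.append(anchor)
}

// Display the ARView as the playground's live view.
PlaygroundPage.current.liveView = arView
```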
I have created a .rcproject in Reality Composer on iPad and would like to know how to add it to my Swift Playgrounds app.
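As far as I know, the Swift code that Xcode generates for a .rcproject is not produced inside Swift Playgrounds, so a common workaround is to export the scene from Reality Composer as a .reality (or .usdz) file, add it to the app's resources, and load it directly. A sketch, assuming an exported file named `MyScene.reality` (the name is illustrative):

```swift
import RealityKit
import SwiftUI

struct ContentView: View {
    var body: some View {
        ARViewContainer().ignoresSafeArea()
    }
}

// Wrap ARView for use in a SwiftUI-based Playgrounds app.
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Load the anchor exported from Reality Composer.
        if let url = Bundle.main.url(forResource: "MyScene", withExtension: "reality"),
           let anchor = try? Entity.loadAnchor(contentsOf: url) {
            arView.scene.anchors.append(anchor)
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```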
Hello All,
I hope someone can help with an issue I have encountered.
I was using Reality Composer to create an image-anchor AR scene.
I could load it in a previous version of Xcode and it worked fine.
I upgraded Xcode from 12 to 13 and it still worked fine.
But when I try to change the Reality Composer file and run it again, I receive a runtime error saying the anchor could not be created because the archive does not have an asset named locatorImage.png.
var imageAnchor: RealComp.Scene?
// to simply load the file in the viewDidLoad:
imageAnchor = try! RealComp.loadScene()
Fatal error: 'try!' expression unexpectedly raised an error: Failed to create anchor file because the archive does not have an asset named locatorImage.png.
Has anyone else come across this issue?
Does "the archive" refer to the Reality Composer file? It does contain the image, as before.
Hope someone can help :-)
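Not a fix for the missing-asset error itself, but replacing `try!` with `do`/`catch` at least keeps the app from crashing at launch and surfaces the full underlying error (this sketch reuses the `RealComp` names from the post, which come from the generated code):

```swift
// Inside viewDidLoad: load the scene defensively instead of with try!.
do {
    imageAnchor = try RealComp.loadScene()
} catch {
    // Log the underlying error. A missing-asset failure often means the
    // reference image was renamed or removed inside the .rcproject, so
    // re-adding the image anchor asset in Reality Composer may resolve it.
    print("Failed to load Reality Composer scene: \(error)")
}
```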