Post not yet marked as solved
I converted an FBX file to USDZ using Reality Converter. This seemed to work: I can double-click the USDZ file, and an Xcode view pops up and shows the file/object without a problem.
But when I go to Reality Composer and try to import this USDZ, I can see it added to the import list under + (add), and I can drag it into the scene, but when I release it, everything disappears and I see nothing in the scene.
How can I figure out what happened? Is there a console view I can bring up? Is there a way to list all the objects in the scene?
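On listing objects: RealityKit entities do expose a `children` collection and a `name` property, so one way to see everything in a scene is a recursive walk over `arView.scene.anchors`. A minimal sketch, using a stand-in `Node` type so it runs outside RealityKit (the traversal itself is the same on `Entity`):

```swift
// Stand-in for a scene-graph node: in RealityKit the same walk works on
// Entity, whose `children` collection and `name` property are real API.
final class Node {
    let name: String
    var children: [Node] = []
    init(_ name: String) { self.name = name }
}

// Recursively print the hierarchy with indentation and return a flat
// list of every node name, so the whole scene can be inspected at once.
func dumpHierarchy(_ node: Node, depth: Int = 0) -> [String] {
    print(String(repeating: "  ", count: depth) + node.name)
    return [node.name] + node.children.flatMap { dumpHierarchy($0, depth: depth + 1) }
}

let root = Node("Scene")
let anchor = Node("AnchorEntity")
anchor.children = [Node("importedModel")]
root.children = [anchor]
let allNames = dumpHierarchy(root)
```

In a real app you would run the equivalent walk over each anchor in `arView.scene.anchors`; a model that imported as an empty entity (or with zero scale) would still show up in the printed hierarchy even though nothing is visible.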
Post not yet marked as solved
Using Reality Composer, I could get my React Native app to replay the recorded scene through Xcode.
Question: is using the replay data supported in a standalone app, with no reliance on Xcode?
Why? I am looking to automate our augmented-reality tests on physical devices in our lab that won't be physically moving for plane detection.
Post not yet marked as solved
This code works perfectly on a simple cube; however, when loading a larger (250 MB) .rcproject file, I get the "Unexpectedly found nil while implicitly unwrapping an Optional value" error.
My code is as follows:
    struct ARSandBox: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            var map: ModelEntity!
            let arView = ARView(frame: .zero)
            let delayTime = DispatchTime.now() + 3.0
            let scene = try! TeaTestMap.loadMapScene()
            map = scene.theMap!.children[0] as? ModelEntity
            map.isEnabled = false   // <- the error is thrown here, at map.isEnabled
            arView.scene.anchors.append(scene)
            DispatchQueue.main.asyncAfter(deadline: delayTime) {
                map.isEnabled = true
            }
            return arView
        }
    }
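For reference, the crash pattern above comes from `as? ModelEntity` returning nil and the implicitly unwrapped `map` then being force-unwrapped. A hedged sketch of a safer shape, with stand-in classes in place of RealityKit's types (only `Entity`/`ModelEntity` mirror real names; the helper is illustrative):

```swift
// Stand-ins mirroring RealityKit's Entity / ModelEntity relationship.
class Entity {
    var isEnabled = true
    var children: [Entity] = []
}
final class ModelEntity: Entity {}

// Conditionally cast and bail out early instead of storing into an
// implicitly unwrapped optional: a failed cast is reported here,
// not as a crash later at `map.isEnabled`.
func firstModel(in parent: Entity) -> ModelEntity? {
    guard let model = parent.children.first as? ModelEntity else {
        print("first child is not a ModelEntity; check the scene hierarchy")
        return nil
    }
    return model
}

let parent = Entity()
parent.children = [ModelEntity()]
if let map = firstModel(in: parent) {
    map.isEnabled = false   // safe: the cast is known to have succeeded
}
```

With a large .rcproject, the first child may well be a plain `Entity` grouping node rather than a `ModelEntity`, which this pattern would reveal immediately instead of crashing.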
Post not yet marked as solved
I have a Reality Composer file with several scenes, all of which start empty, and then some models appear one by one, every second. The animation works perfectly in Quick Look and Reality Composer, but has a glitch when rendered in ARKit. When the very first scene is launched, or when we go to another scene, they don't start empty: for a tiny split second, we see all the models of that scene being displayed, only to disappear immediately.
I have tried fixing it using async loading, as well as delaying the DispatchQueue by 0.5 seconds. If anyone has an idea, it would mean the world if you could help. This is the final bug before the launch of our app.
Post not yet marked as solved
Hello, I am trying to export to USDZ the Cosmonaut model which is available here: https://developer.apple.com/augmented-reality/quick-look/
It has points which trigger animations, and I wonder if something like that is supported in USDZ. My guess is no, since I can't export this particular model to USDZ in Reality Composer, but I wanted to be 100% sure. So I would really appreciate it if someone could confirm that this functionality (triggered animations) is not available in USDZ.
Post not yet marked as solved
Forgive me, but I am new to AR and Reality Composer, and I am at a loss as to what app someone needs on their iPhone, iPad, Android, etc. to read the files.
As an example, I made a program on my iPad where a student can go up to a book and it will give information about that book, but that only works inside Reality Composer.
I don't want students to be in Reality Composer, as that is fraught with issues and concerns.
What application does one use or need so that when the student points their phone or iPad at the book it does what it does in Reality Composer?
Is this going to be readable only on Apple products? We use iPads, but we have a mix of Android phones and iPhones.
I am really enjoying making these items, and they will soon be ready for prime time, but there is a missing link in how to properly share what you created on everyone else's devices.
Thanks in advance,
Tim
Post not yet marked as solved
I opened Reality Composer in Xcode and created a lot of items, most of which are from existing libraries, but I can't download them on my iPad. It displays: "The object type is no longer available."
Post not yet marked as solved
Hi,
I have a question about how to control behaviors from Xcode.
I have loaded a Reality Composer file (.rcproject) into Xcode. In an Xcode storyboard I made a button and wired it to trigger a behavior which I had implemented in Reality Composer before.
This works great.
    @IBAction func button(_ sender: Any) {
        Anchor!.notifications.actionname.post()
    }
But now I want the notification to be posted when the .rcproject file starts. So, when the .rcproject file starts, this behavior should be executed first.
How can I do this?
Thanks
Post not yet marked as solved
Hello everyone,
I created a 3D character animation in Blender and I would like to import it into Reality Composer.
However I export the animation from Blender, only the static object shows up in Composer, not the animation.
My USDZ scene has the character parented to an animated armature.
Is there any way to import a 3D character animation (made in 3D software) into Reality Composer?
Thanks
Post not yet marked as solved
Hi, it's me again. I have now solved my three problems.
1. I can load .rcproject files in Xcode.
2. I can assign gestures to a group of objects created in Reality Composer (the group contains two spirals).
3. I can use a menu button to trigger behaviors of objects I created in Reality Composer (a spiral fades out at startup and fades back in when the button is clicked).
Individually, this all works the way I thought it would.
But now I want to bring it all together.
So I put everything together in a ViewController. But now I see two groups of objects on my device: one group reacts to the gestures, and the other reacts to the button and the notifications. How do I get both the notifications and the gesture control to work together on one group?
You can find me project here:
https://wetransfer.com/downloads/2e217e5d86d273360fb50a7298e986b520220203152701/26a044d7ffc54197ac6055c4065f6f1c20220203152827/cf5ad8
Thanks for your help.
Thomas
Post not yet marked as solved
I'm new to Reality Composer and RealityKit. I've set up a .rcproject with multiple ImageTracking scenes.
Each is set up identically. At this point, when an image is recognized, a cube is overlaid and has a tap gesture that should send a notification to Xcode.
In Xcode, I'm loading each of the scenes and adding the anchors to my ARView. When I run the app on my device, each image overlay displays; however, only the first loaded anchor's tap behavior works.
Is this how it's meant to work? Any suggestions on getting multiple image-tracking behaviors from Reality Composer to work in Xcode? Any help is appreciated.
Post not yet marked as solved
We create reality files on Reality Composer on an iPad Pro 2020 (iPad OS 14.8.1).
They can be opened in AR correctly on iOS 14 devices (our iPad & an iPhone 12 Pro).
However on our iOS 15 iPhone X (iOS 15.2.1), AR Quick Look opens but it displays an error message saying “the object cannot be opened”.
I have tried creating a .reality file on Reality Composer directly on my iOS 15 iPhone X (I just exported the template project for horizontal plane tracking), but the export fails to open on the very same device.
However, the file generated on my iOS 15 iPhone X opens fine on my iPad Pro (iPad OS 14.8.1).
USDZ files work fine on iOS though.
Any lead on how to solve this issue?
The context is:
ARSessionDelegate
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {}
Upon image detection, place an overlay of an RC entity at the ImageAnchor location.
Using iOS 14, I have been reliably using:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
to render additive RC scene content. Additive model entities render correctly in the field of view.
When I upgraded to iOS 15, the same code that had worked for many (~12) months on all versions of iOS 14 failed to render any RC scene content. I get a ghosting of all of the render content in the proper location, but it is visible only for a moment, then it disappears.
So, I finally found the root cause of the issue. It appears that iOS 15 only renders correctly in my application using:
mySceneAnchorEntity = AnchorEntity(world: imageAnchor.transform)
This led to many frustrating days of debugging. As a side note, iOS 14 renders RC scene entities correctly using both variants of AnchorEntity:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
and
mySceneAnchorEntity = AnchorEntity(world: imageAnchor.transform)
So this leads me to believe there is an issue with the behavior of iOS 15 with the following version of AnchorEntity:
mySceneAnchorEntity = AnchorEntity(anchor: imageAnchor)
Hello, I’ve noticed that when I set the image of a picture-frame asset in Reality Composer, it changes its size and aspect ratio to match the image. That’s pretty nice!
I would like to let a user dynamically modify that picture while running the app. Is this possible? Or are the model's properties you set in the composer locked in when you export?
Post not yet marked as solved
Hi!
I'm designing a scaled model of the solar system in Reality Composer (v1.5).
When I set the positions of the planets to be meters apart, they still show up on one screen, only centimeters apart.
Even more so, the sun is set at one end of the building (0,0,0) and Neptune at the other (584 meters away), and when I play the experience on my iPad, it is happy to let me see them both on screen at the same time.
Thoughts? Thank you!
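For comparison, here is what the scaled positions should come out to in plain Swift. The type and method names are illustrative assumptions, and the metres-per-AU factor is chosen to match the post, putting Neptune about 584 m from the origin:

```swift
// Hypothetical helper: map real solar-system distances (in AU) to
// scene coordinates (in metres) for a walk-through scale model.
struct ScaledOrbit {
    let name: String
    let distanceAU: Double

    // Distance from the sun in scene metres at the given scale.
    func sceneMetres(metresPerAU: Double) -> Double {
        distanceAU * metresPerAU
    }
}

// Scale chosen so Neptune (~30.07 AU) sits ~584 m from the origin.
let metresPerAU = 584.0 / 30.07
let planets = [
    ScaledOrbit(name: "Sun", distanceAU: 0),
    ScaledOrbit(name: "Earth", distanceAU: 1.0),
    ScaledOrbit(name: "Neptune", distanceAU: 30.07),
]
for p in planets {
    print("\(p.name): \(p.sceneMetres(metresPerAU: metresPerAU)) m")
}
```

At this scale even Earth should sit roughly 19 m from the sun, so if everything appears centimetres apart on screen, the position values are evidently not being honoured at full scale in the rendered experience.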
Post not yet marked as solved
After updating Xcode to 13, the animation that comes with the USDZ is no longer played in Reality Composer / RealityKit, even though it works fine on the older Xcode 12 and 11.
The problem: nothing happens when playing any USDZ animation, even when running on a real device.
Has anyone faced this problem? I haven't found a solution or a fixing update yet.
Post not yet marked as solved
I am new to ARKit. Can we add gestures to an asynchronously loaded model?
Post not yet marked as solved
When I clone a rather complex scene in RealityKit and then immediately try to reference that clone's entity content, I get a nil return. There is no completion: form of clone, to my awareness. How can one safely ensure a scene is fully cloned before use? I load this scene using load..Async from Reality Composer; that completes successfully before I do the clone operation.
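Since `clone()` offers no completion handler, one workaround is to poll until the cloned content becomes reachable, with a retry cap. A generic sketch in plain Swift (the helper and the idea of probing the clone are assumptions, not RealityKit API):

```swift
import Dispatch
import Foundation

// Retry `probe` until it yields a value or `attempts` run out, waiting
// `interval` seconds between tries. In an app, `probe` would check the
// cloned scene's entity content, and `completion` would receive it
// (or nil if the content never became reachable).
func waitUntilReady<T>(
    attempts: Int,
    interval: TimeInterval,
    probe: @escaping () -> T?,
    completion: @escaping (T?) -> Void
) {
    func attempt(_ remaining: Int) {
        if let value = probe() {
            completion(value)
        } else if remaining > 0 {
            DispatchQueue.global().asyncAfter(deadline: .now() + interval) {
                attempt(remaining - 1)
            }
        } else {
            completion(nil)   // gave up: probe never produced a value
        }
    }
    attempt(attempts)
}
```

A bounded retry keeps a broken clone from hanging the app forever, and the nil case in the completion gives a single place to surface the failure.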
Hi all! Does anyone know if there is a version of Reality Composer for the MacBook Pro?
I recently saw a girl on YouTube who had the program working on a Mac.
My App Store says that the program is only available on iPad and iPhone.
video link : https://youtu.be/WdxA9Y0k3EU?t=768
Post not yet marked as solved
Hi,
So I created a scene in Reality Composer, but when exporting to USDZ, all of the animations, audio, and interactions work, yet the textures have been completely changed (the map should be green, for example).
Everything looks fine in Reality Composer.
Unfortunately, exporting as a .reality file isn't an option in this project.
Could I get some insight into this, please? Thank you.