Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Reality Composer Documentation

Posts under Reality Composer tag

89 Posts
Post not yet marked as solved
1 Reply
349 Views
Is anyone else having trouble with image targets in Reality Composer? When I move my phone directly over or in very close proximity to an image target, my scene shows correctly. But as soon as I move my camera away, the scene seems to jump to some arbitrary location in the room. When I move back to the target, the scene returns to the correct location shortly after.
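A minimal sketch of one possible workaround, assuming the jump happens because the image anchor stops being tracked once it leaves view: run the session with a world-tracking configuration that keeps tracking the detected image, so the anchor's transform keeps updating. The resource group name "AR Resources" is a placeholder.

```swift
import ARKit
import RealityKit

func configureImageTracking(for arView: ARView) {
    // Load reference images from an asset catalog group (group name is a placeholder).
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) else { return }

    let config = ARWorldTrackingConfiguration()
    config.detectionImages = referenceImages
    // Keep updating the image anchor while it is visible, instead of
    // detecting it once and freezing its transform in world space.
    config.maximumNumberOfTrackedImages = 1

    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```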
Posted by skyeg3. Last updated.
Post not yet marked as solved
1 Reply
63 Views
Is there a timeline for when additional features will be added to Reality Composer? Right now, the tool offers some nice features, but it's missing a lot of things that could be really powerful:
- adding clickable links within the experience
- dragging and dropping video like you can do with pictures
- adding haptics
I know some of this is possible with Xcode, but including these features would expand the use of Reality Composer significantly.
Posted by KDP. Last updated.
Post not yet marked as solved
0 Replies
106 Views
I'm making an app using Reality Composer that has multiple scenes, each with a different anchor type. Several use a horizontal anchor, one uses an image anchor, and one uses a face anchor. It works great when testing within the Reality Composer app on an iOS device, but the face and image anchor scenes do not work when publishing from Xcode. Does anyone know if this is possible, whether it is a limitation of Reality Composer, or whether I'm not including something properly in Xcode?
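One thing worth checking, offered as a hedged guess rather than a confirmed cause: face-anchored scenes only track when the session runs a face-tracking configuration, which the default world-tracking setup does not provide. A minimal sketch of switching the session before showing the face scene; `Experience.loadFaceScene()` is a placeholder for the generated loader name.

```swift
import ARKit
import RealityKit

// Before showing the face-anchored scene, switch the session to face
// tracking; a world-tracking configuration cannot drive a face anchor.
func showFaceScene(in arView: ARView) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    arView.session.run(ARFaceTrackingConfiguration(),
                       options: [.resetTracking, .removeExistingAnchors])
    // Placeholder generated loader; substitute your scene's load method.
    if let faceScene = try? Experience.loadFaceScene() {
        arView.scene.anchors.removeAll()
        arView.scene.anchors.append(faceScene)
    }
}
```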
Posted. Last updated.
Post not yet marked as solved
1 Reply
134 Views
I tried to display a scene loaded from a .rcproject file in an ARView with .nonAR camera mode, but I could not display it on my iPhone 8, which is an actual device. I have confirmed that the simulator displays the scene properly. If the camera mode is set to .ar, the scene is displayed on the actual device as well. I am puzzled as to why the scene loaded from the .rcproject file does not show up on my actual device. If anyone has had a similar experience or has an idea of the cause, I would appreciate your help. Thank you for taking the time to read this post.

```swift
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        // If cameraMode is ".ar", this works properly.
        let arview = ARView(frame: .zero,
                            cameraMode: .nonAR,
                            automaticallyConfigureSession: false)

        Sample.loadMySceneAsync { (result) in
            do {
                let myScene = try result.get()
                arview.scene.anchors.append(myScene)
            } catch {
                print("Failed to load myScene")
            }
        }

        // Add a camera so the .nonAR view has a viewpoint.
        let camera = PerspectiveCamera()
        let cameraAnchor = AnchorEntity(world: [0, 0.2, 0.5])
        cameraAnchor.addChild(camera)
        arview.scene.addAnchor(cameraAnchor)
        return arview
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```
Posted by polaris_. Last updated.
Post not yet marked as solved
0 Replies
135 Views
I'm new to the Reality Composer world and I'm working on a project where I need to load more than one Reality Composer project in the same app, but it doesn't work. For example, I have two different RC projects, pro1 and pro2, that are distinguished by the framed image that acts as the anchor for the scenario. I put them in Xcode and add them in the ContentView this way:

```swift
// Load the "Box" scene from the reality file.
let boxAnchor = try! pro1.loadMenu()
let boxAnchor2 = try! pro2.loadScene()

// Add the anchors to the scene.
arView.scene.anchors.append(boxAnchor)
arView.scene.anchors.append(boxAnchor2)
```

When I start the project on the iPad, it installs the app and it works: it recognizes the image I use as an anchor and loads the correct project, but after the first interaction it does nothing. If I change the framed image to the one connected to pro2, the app loads the right project but, again, after the first interaction it does nothing.

While I use the app on my iPad Pro, Xcode outputs the following error: "World tracking performance is being affected by resource constraints [1]". However, the app continues to be active, and every time I change the framed image, the project I view also changes; moreover, the projects keep the state they were in, allowing me to interact with the objects for only a single tap before they freeze.

Is there a solution to make the iPad load only the Reality Composer project requested when I frame a certain image? As an alternative solution, I had thought of creating an initial menu that lets the user choose which project to use, creating a different ContentView for each of them, so the right project is shown through the user's choice rather than through the framed image. Is this a possible solution? Thanks in advance for your attention and any answers.
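One possible direction, sketched under assumptions: rather than appending both projects up front, observe which reference image ARKit actually detects and append only the matching scene. The delegate wiring below and the image names ("menuImage", "sceneImage") are illustrative guesses, not the project's real names, and the reference images must be known to the session configuration for detection to fire.

```swift
import ARKit
import RealityKit

final class ImageSwitcher: NSObject, ARSessionDelegate {
    weak var arView: ARView?

    // Append only the project whose anchor image was actually detected.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let arView = arView else { return }
        for case let imageAnchor as ARImageAnchor in anchors {
            guard let name = imageAnchor.referenceImage.name else { continue }
            switch name {
            case "menuImage":   // placeholder name for pro1's anchor image
                if let menu = try? pro1.loadMenu() {
                    arView.scene.anchors.removeAll()
                    arView.scene.anchors.append(menu)
                }
            case "sceneImage":  // placeholder name for pro2's anchor image
                if let scene = try? pro2.loadScene() {
                    arView.scene.anchors.removeAll()
                    arView.scene.anchors.append(scene)
                }
            default:
                break
            }
        }
    }
}

// Usage (sketch): keep a strong reference to the switcher and assign
// arView.session.delegate = switcher before running the session.
```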
Posted. Last updated.
Post not yet marked as solved
0 Replies
118 Views
There is absolutely no documentation I can find about this. On the iOS version of Reality Composer, when you are editing an action sequence, you can toggle Loop and Modal. In the Mac version, you can toggle an icon that says "toggle exclusive state for this action sequence" when you hover over it. What is that?
Posted by mklitsner. Last updated.
Post not yet marked as solved
0 Replies
146 Views
I am trying to run the application on an iPad M1 device pinned to the Xcode debugger. The scheme has Replay data enabled, with the recording prerecorded on the very same iPad M1. Errors:

```
2022-09-19 19:21:52.790061+0100 ARPathFinder[1373:320166] ⛔️⛔️⛔️ ERROR [MOVReaderInterface]: Error Domain=com.apple.AppleCV3DMOVKit.readererror Code=9 "CVAUserEvent: Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}" UserInfo={NSLocalizedDescription=CVAUserEvent: Error Domain=com.apple.videoeng.streamreaderwarning Code=0 "Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'." UserInfo={NSLocalizedDescription=Cannot grab metadata. Unknown metadata stream 'CVAUserEvent'.}} ⛔️⛔️⛔️

2022-09-19 19:21:54.103813+0100 ARPathFinder[1373:320166] [Session] ARSession <0x104f77ec0>: did fail with error: Error Domain=com.apple.arkit.error Code=101 "Required sensor unavailable." UserInfo={NSLocalizedDescription=Required sensor unavailable., NSLocalizedFailureReason=A required sensor is not available on this device.}
```

Any help will be appreciated, thanks.
Posted by jmgawe. Last updated.
Post not yet marked as solved
0 Replies
175 Views
Working in the real world, sharing and creating new 3D models often results in dissimilar pivot points across models. This makes it difficult to add and manipulate models within an active AR scene. The ability to change the anchor point on the fly would make a world of difference in user experience.
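Until such a feature exists, a common workaround is to re-pivot in code. A minimal sketch, assuming a RealityKit pipeline: wrap the model in an empty parent entity and offset the child, so the parent acts as the adjusted pivot point. The function name and offset are illustrative.

```swift
import RealityKit

// Wrap the model in an empty parent so the parent's origin becomes
// the effective pivot point for transforms applied afterwards.
func repivot(_ model: Entity, offset: SIMD3<Float>) -> Entity {
    let pivot = Entity()
    pivot.addChild(model)
    // Shift the model so the desired point sits at the parent's origin.
    model.position = -offset
    return pivot
}

// Usage (sketch): rotating or scaling `pivoted` now happens around the new point.
// let pivoted = repivot(myModel, offset: [0, 0.05, 0])
```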
Posted. Last updated.
Post not yet marked as solved
1 Reply
168 Views
I created a scene in Reality Composer which includes lots of different models. I then load the scene and try to load the models and place them separately. The following is my UITapGestureRecognizer handler:

```swift
guard let loadModel = loadedScene.findEntity(named: selectedPlant.selectedModel) else {
    return
}
loadModel.setPosition(SIMD3(0.0, 0.0, 0.0), relativeTo: nil)
```

My confusion is: when you use .findEntity and place this model on the detected plane, it cannot be retrieved again. I try to call this again to place a second model after placing the first one, and .findEntity returns nil. Does anyone know the mechanism behind this? I thought loading the scene would create a template in memory; on the contrary, it seems to create a sort of list and pop an entity off it for every call.
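A hedged sketch of one way around this, assuming the loaded scene should stay intact as a template: clone the found entity and place the copy, leaving the original in the loaded scene graph so later findEntity(named:) calls still succeed. The `planeAnchor` name below is a placeholder for whatever anchor entity receives the placed model.

```swift
// Clone the template entity instead of moving it out of the loaded scene.
guard let template = loadedScene.findEntity(named: selectedPlant.selectedModel) else {
    return
}
let instance = template.clone(recursive: true)
planeAnchor.addChild(instance)  // `planeAnchor` is a placeholder anchor entity
instance.setPosition(SIMD3(0.0, 0.0, 0.0), relativeTo: nil)
```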
Posted by daiyukun. Last updated.
Post not yet marked as solved
1 Reply
209 Views
I have several Reality Composer models in my project. When I open more than one of them at a time, Xcode loves to crash. Workflow: open Xcode, open RC project1, open RC project2, share objects across scenes. Close RC project2, open RC project3, share and repeat. This causes Xcode to beach-ball and crash. The AR support, testing, and overall quality are poor.
Posted by MikeNunez. Last updated.
Post not yet marked as solved
0 Replies
119 Views
Hello, when I go to select an image from the preloaded images, many are missing. For example, under Nature there are just rocks, not the forest with a river. Please help.
Posted by Lsnyders. Last updated.
Post not yet marked as solved
0 Replies
208 Views
Hi there, I am happy to have found Reality Composer. As I continue to create content, I have run into some issues. When exporting my model to .obj format to import into Reality Composer, I notice that the objects are not solid. I have gone through the export process to make sure that they are, but when viewed in Reality Composer, the closest surfaces are not showing and the object looks see-through. Any idea what is going on?
Posted. Last updated.
Post not yet marked as solved
0 Replies
228 Views
To reproduce the issue:
1. After importing an RC .rcproject scene, associate a scene entity (theEntity) with an anchor entity.
2. Enable theEntity (isEnabled = true). theEntity renders in the scene correctly.
3. Execute a Reality Composer behavior Notification to disable theEntity (Hide > Affected Object > theEntity). theEntity no longer renders in the scene, which is correct behavior.
4. The bug then occurs: you are now unable to enable theEntity (isEnabled = true). theEntity no longer renders in the scene.
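A condensed sketch of the repro sequence above, assuming theEntity was retrieved from the imported scene and that the Hide behavior is fired through a generated scene notification; `hideTheEntity` and `anchorEntity` are placeholder names.

```swift
// 1. Attach the scene entity to an anchor entity (names are placeholders).
anchorEntity.addChild(theEntity)

// 2. Enabling it renders the entity correctly.
theEntity.isEnabled = true

// 3. Trigger the Reality Composer "Hide" behavior via its notification;
//    the entity correctly stops rendering.
loadedScene.notifications.hideTheEntity.post()

// 4. Bug: re-enabling no longer makes the entity render.
theEntity.isEnabled = true
```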
Posted by MikeNunez. Last updated.
Post not yet marked as solved
1 Reply
199 Views
Background

I have generated some content using Reality Composer and imported the resulting .rcproject file into my iOS application in Xcode. Xcode creates a .reality file that is then loaded into the app bundle, while also auto-generating code to streamline the interface with this file. The load<scene name>() type method is an example of a useful method from the generated code, which returns the specified scene from the .reality file.

Observations

Invoking the above method synchronously results in the app crashing (and only signalling Thread <thread>: EXC_BREAKPOINT). The crash occurs at loadAnchor(contentsOf:withName:) within the load<scene name>() method. The documentation for this does state that the method "Blocks your app while loading an anchor entity from a file URL", but I don't see why this should result in a crash (I would anticipate an unresponsive UI, and only if it's being run on the main thread to begin with). Scheduling this operation on the main queue (via DispatchQueue or MainActor) overcomes the above issue. Placing the method within an unstructured Task does not result in a crash either, but scheduling it on a global queue results in a crash. The above behaviour is observed during unit tests as well, but there the use of an unstructured Task also results in a crash.

Versions

While I've listed the respective versions below, I've also experienced this behaviour in prior versions.
Xcode 13.4.1
iOS 15.6
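A brief sketch of the two loading patterns the post reports as safe, using a placeholder generated type `Experience` with a scene named `Scene` and an `arView` assumed to be in scope.

```swift
import RealityKit

// Option 1: schedule the synchronous generated loader on the main queue.
DispatchQueue.main.async {
    if let scene = try? Experience.loadScene() {
        arView.scene.anchors.append(scene)
    }
}

// Option 2: use the async variant the generated code also provides,
// which avoids blocking the calling thread entirely.
Experience.loadSceneAsync { result in
    if case .success(let scene) = result {
        arView.scene.anchors.append(scene)
    }
}
```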
Posted by ranveerm. Last updated.
Post not yet marked as solved
0 Replies
256 Views
Hello :), My name is Alec and I'm experimenting with the USD file format. Here's what I've done:
1. I used my iPad to take a point scan of my own head (funny, I know).
2. I brought the PLY cloud into Blender, where I used GeoNodes to instance objects on those points. I also loaded audio into my Blend file because I wanted my head to react to audio.
3. I exported my model into Alembic format (.abc) and exported the animation data as well (.mdd). I followed this video to do it: https://www.youtube.com/shorts/awArQyvncmU
4. I brought the .mdd and .abc files back into Blender and re-exported as a GLB. AT THIS POINT I KNOW I HAVE A WORKING GLB MODEL AND ANIMATION (see a link to the file here; warning, it's 500 MB: https://drive.google.com/file/d/15nQ2eIqhFQtHce5EbwH4_pXq-maR3rKo/view?usp=sharing )
5. I brought the GLB file into Reality Converter hoping there would be a seamless conversion to USDZ. For some reason, my animation data is getting lost :(

I don't know what to do. Does anyone know what I might be doing wrong?

Bonus question: Is it even possible to put audio in a USDZ file in Reality Composer?

Many thanks to everyone out there willing to help! All the best, Alec
Posted. Last updated.