Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Posts under Reality Composer tag

65 Posts

Quick Look AR shows "Object requires a newer version of iOS" on iOS 17 when using .reality files
Hi there. Hosting on my server a no-doubt-well-formed AR file, namely the "CosmonautSuit_en.reality" from Apple's examples (https://developer.apple.com/augmented-reality/quick-look/), the infamous and annoying "Object requires a newer version of iOS." message appears, even though I'm running iOS 17.1 on my iPad, that is, the very latest available version. All works flawlessly on iOS 16 and below. Of course, my markup follows the required format, namely: <a rel="ar" href="https://artest.myhost.com/CosmonautSuit_en.reality"> <img class="image-model" src="https://artest.myhost.com/cosmonaut.png"> </a> Accessing this same .reality file from the aforementioned Apple page works fine. Why is it not working from my hosting server? For your information, when I use a USDZ on my server instead, also from Apple's page of examples, such as the toy_drummer_idle.usdz file, all works flawlessly. Again, I'm using the same markup schema: <a rel="ar" href="https://artest.myhost.com/toy_drummer_idle.usdz"> <img class="image-model" src="https://artest.myhost.com/toy_drummer.png"> </a> Also, when I delete the rel="ar" attribute, the AR experience is launched, but with an extra step that goes through an ugly poster (generated on the fly by AR Quick Look), which ruins the UX/UI of my web app. This behavior is, by the way, the same one you get when accessing the .reality file directly by typing its URL in the Safari address bar. Any tip on this? Thanks for your time.
2 replies · 0 boosts · 776 views · Oct ’23
Reloading a scene from Reality Composer
Hello, I'm setting up an AR view using scene anchors from Reality Composer. The scenes load perfectly fine the first time I enter the AR view. When I go back to the previous screen and re-enter the AR view, the app crashes before any of the scenes appear on screen. I've tried pausing and resuming the session and am still getting the following error: validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].' Any help would be very much appreciated. Thanks
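A minimal sketch of one workaround pattern, not a confirmed fix: tear the ARView down completely when leaving the screen and rebuild it (and re-load the Reality Composer scene) on re-entry, instead of pausing and resuming the same view. The view controller and the Experience.loadBox() loader are placeholders standing in for the poster's own navigation and generated code.

```swift
import RealityKit
import UIKit

// Hypothetical view controller: a fresh ARView and freshly loaded scene on every
// entry, full teardown on exit, so no Metal state from a dead session is reused.
final class ARSceneViewController: UIViewController {
    private var arView: ARView?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let newARView = ARView(frame: self.view.bounds)
        newARView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.view.addSubview(newARView)

        // Re-load the Reality Composer scene instead of reusing anchors from a
        // previous session. "Experience.loadBox()" is the generated loader for a
        // scene named "Box"; substitute your own.
        if let scene = try? Experience.loadBox() {
            newARView.scene.anchors.append(scene)
        }
        arView = newARView
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        arView?.session.pause()
        arView?.removeFromSuperview()   // release the old view entirely
        arView = nil
    }
}
```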
1 reply · 0 boosts · 743 views · Oct ’23
[BUG] Gizmo Doesn't Update When Changing 'Up Axis' to 'Z' in Reality Composer Pro
When changing the 'Up axis' setting in the Layer Data tab to 'Z', the gizmo does not reflect the change. It continues to display as if the Up axis were 'Y'. This results in the gizmo becoming disconnected from the object itself, making it challenging to perform accurate transformations using the gizmo. Steps to reproduce: (1) Open Reality Composer Pro in the latest Xcode beta. (2) Click on empty space inside your scene. (3) Navigate to the Layer Data tab. (4) Change the "Up axis" setting to 'Z'. (5) Observe the gizmo's orientation.
0 replies · 0 boosts · 334 views · Oct ’23
Reality Composer Exporting USDZ files - slow animation
It seems that something must have changed in Reality Composer's USDZ export feature. Importing any animated .usdz file into Reality Composer and then exporting it reduces the playback frame rate to about 30%. The same file imported and then exported as a .reality file plays back just fine. Is anyone else experiencing this issue? It's happening for every USDZ file imported, and across two different Apple laptops running the software.
2 replies · 1 boost · 1.6k views · Oct ’23
Workflow Suggestions from Blender to Reality Composer
Are there any good tutorials or suggestions on creating models in Blender and exporting them with the associated materials and nodes? Specifically, I'm looking to see if there is a way to export translucency associated with an object (e.g., a glass bottle). I have created a simple cube with a Principled BSDF shader, but the transmission and IOR settings are not porting over. Any tips or suggestions would be helpful.
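For the glass-bottle case specifically, one workaround sketch (not an exporter fix, and assuming a RealityKit target): reapply a translucent PhysicallyBasedMaterial in code after loading, since transmission/IOR from the Principled BSDF may not survive the export. The function name and values below are illustrative placeholders.

```swift
import RealityKit
import UIKit

// Hypothetical helper: approximate a glass look on an already-loaded ModelEntity.
func applyGlassLook(to model: ModelEntity) {
    var glass = PhysicallyBasedMaterial()
    glass.baseColor = .init(tint: UIColor(white: 0.9, alpha: 1.0))
    glass.roughness = .init(floatLiteral: 0.05)
    glass.metallic = .init(floatLiteral: 0.0)
    // Approximate transmission with plain alpha blending; RealityKit's PBR
    // material does not expose a Blender-style transmission/IOR input here.
    glass.blending = .transparent(opacity: .init(floatLiteral: 0.25))
    model.model?.materials = [glass]
}
```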
4 replies · 0 boosts · 1.7k views · Oct ’23
Face Anchor in Reality Composer: Enabling Ball Movement Based on Head Tilts
Using the face anchor feature in Reality Composer, I'm exploring the potential for generating content movement based on facial expressions and head movement. In my current project, I've positioned a horizontal wood plane on the user's face, and I've added some dynamic physics-enabled balls on the wood surface. While I've successfully anchored the wood plane to the user's head movements, I'm facing a challenge with the balls. I'm aiming to have these balls respond to the user's head tilts, effectively rolling in the direction of the head movement. For instance, a tilt to the right should trigger the balls to roll right, and likewise for leftward tilts. However, my attempts thus far have not yielded the expected results, as the balls seem to be unresponsive to the user's head movements. The wood plane, on the other hand, follows the head's motion seamlessly. I'd greatly appreciate any insights, guidance, or possible solutions you may have regarding this matter. Are there specific settings or techniques I should be implementing to enable the balls to respond to the user's head movement as desired? Thank you in advance for your assistance.
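One approach worth sketching, with the caveat that it is an assumption about the setup rather than a known fix: nudge the balls manually each frame with a small force derived from the head-anchored plane's tilt, instead of relying on the anchored scene's own gravity. The planeAnchor and balls names are placeholders for the poster's entities, and the force magnitude is arbitrary.

```swift
import Combine
import RealityKit
import simd

var updateSubscription: Cancellable?

// Hypothetical setup: planeAnchor is the head-anchored wood plane, balls are the
// dynamic physics-enabled ball entities from the Reality Composer scene.
func startTiltDrivenRolling(arView: ARView, planeAnchor: Entity, balls: [ModelEntity]) {
    updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
        // World-space orientation of the head-anchored plane.
        let orientation = planeAnchor.orientation(relativeTo: nil)
        // The horizontal projection of the plane's tilted up vector points downhill.
        let up = orientation.act(SIMD3<Float>(0, 1, 0))
        var downhill = SIMD3<Float>(up.x, 0, up.z)
        guard simd_length(downhill) > 0.001 else { return }   // head is roughly level
        downhill = simd_normalize(downhill)

        for ball in balls {
            // Small continuous force each frame; tune the magnitude to taste.
            ball.addForce(downhill * 0.05, relativeTo: nil)
        }
    }
}
```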
0 replies · 0 boosts · 586 views · Oct ’23
Exporting .reality files from Reality Composer Pro
I've been using the macOS Reality Composer bundled with Xcode to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems that this tool is only for building content that ships inside native iOS (and other platform) apps, rather than content that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
2 replies · 1 boost · 1.1k views · Oct ’23
How to trigger scene custom behaviour in Xcode 15
Hi, I'm working on an AR app. With Reality Composer and Xcode 14 I was triggering custom behaviours with just: myScene.notifications.myBox.post() called on let myScene = try! Experience.loadBox() Now in Xcode 15 I don't have an Experience; instead, with the .reality file I have to use entities, so: let objectAR = try! Entity.load(named: "myProject.reality") How can I trigger my custom behaviour, previously exported from Reality Composer, from that?
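One community-reported approach, offered with a strong caveat: the generated notifications.*.post() helpers appear to wrap an undocumented NotificationCenter notification, so posting it directly may still fire a Reality Composer trigger on an entity loaded with Entity.load. The notification name, userInfo keys, and the "myBox" identifier below are assumptions, not documented API, and may break in future releases.

```swift
import Foundation
import RealityKit

// Hedged sketch: fire a Reality Composer "Notification" trigger without the
// generated Experience class. The string constants are community-reported and
// undocumented; "sceneEntity" is the root entity loaded from the .reality file.
func postBehaviorTrigger(named identifier: String, in sceneEntity: Entity) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": sceneEntity,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}

// Usage (placeholder names, after objectAR has been anchored in the ARView):
// postBehaviorTrigger(named: "myBox", in: objectAR)
```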
0 replies · 1 boost · 411 views · Oct ’23
Reality Composer Pro: Triggering Animations and changing scenes from an onClick Event
Hello, I'm currently working on a project that was finished in Reality Composer, but then we noticed the pink material after changing scenes on iOS 17 devices. So I updated to macOS 14 and the Xcode 15 beta to use Reality Composer Pro, and I'm currently stuck on how to set up the animations and onClick triggers to play an animation from the USDZ model in the scene. Once the animation is finished, it should trigger the next scene. This was done through behaviors in Reality Composer and was a simple drag and drop. Now it seems we need to do it with components, which I don't mind, but I don't see many resources on how to set this up properly. Is there a way to do behaviors like in Reality Composer? Extra: is there a way to use alpha PNGs, or to drag PNGs into the scene like in Reality Composer?
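A minimal sketch of how the old "play animation, then change scene" behavior might be recreated in code, assuming the USDZ ships its clip in availableAnimations; entity and callback names are placeholders.

```swift
import Combine
import RealityKit

// Hypothetical sequencer: play the model's built-in animation, then call a
// closure when playback finishes (e.g. to swap in the next scene's anchor).
final class SceneSequencer {
    private var completionSubscription: Cancellable?

    func playAndAdvance(model: Entity, in arView: ARView, nextScene: @escaping () -> Void) {
        guard let animation = model.availableAnimations.first else { return }
        model.playAnimation(animation, transitionDuration: 0.2, startsPaused: false)

        // Fires once animation playback on this entity completes.
        completionSubscription = arView.scene.subscribe(
            to: AnimationEvents.PlaybackCompleted.self,
            on: model
        ) { _ in
            nextScene()   // e.g. remove this anchor and append the next one
        }
    }
}
```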
1 reply · 1 boost · 638 views · Oct ’23
Creating 3D Content with Reality Composer no longer possible in Xcode 15
With Xcode 11 through 14, Reality Composer for Mac has been included with Xcode. Apple's own documentation states: "You automatically get Reality Composer for macOS when you install Xcode 11 or later. The app is one of the developer tools bundled with Xcode. From the menu, choose Xcode > Open Developer Tool, and select Reality Composer." This is simply not the case with Xcode 15. In fact, we cannot even open existing RCPROJECT files such as the classic Experience.rcproject found in Apple's own sample code Creating a Game with Reality Composer. As an AR developer this makes my current project virtually unworkable. It's my understanding that when Reality Composer Pro is finally released, it will only be compatible with visionOS apps, as opposed to iOS apps built with RealityKit; this was certainly the case with early iterations. Apple's Creation tools for spatial apps page still mentions Reality Composer, but it is only available for iOS and iPadOS. Taking away such a critical tool without notice is crippling, to say the least. Any official announcements on the future state of RealityKit development would be welcome.
2 replies · 0 boosts · 1.1k views · Sep ’23
Reality Converter: problem converting a USDZ file with multiple animations
Hello, I created in Blender a simple cube with two animations: one moves the cube up and down, and the second rotates the cube in place. I exported this file in glb format and tried to convert it using Reality Converter; unfortunately, I can only see one animation. Is there a limitation in Reality Converter? Can it include more than one animation? The original glb file has both animations inside; as you can see from the screenshot, I checked the file using an online glb viewer and there is no problem, both animations are there. The converter unfortunately sees only the last one created. Any reason or explanation? I believe this is a limitation of Reality Converter. Regards
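A small diagnostic sketch that may help narrow this down (the asset name is a placeholder): load the converted USDZ with RealityKit and count the animation clips that actually survived the conversion, to confirm whether the second clip was dropped or merely not shown in Reality Converter's preview.

```swift
import RealityKit

// Load a bundled USDZ and report how many animation clips RealityKit can see.
func listAnimations(inUSDZNamed name: String) {
    do {
        let entity = try Entity.load(named: name)
        print("'\(name)' contains \(entity.availableAnimations.count) animation(s)")
    } catch {
        print("Failed to load '\(name)': \(error)")
    }
}

// listAnimations(inUSDZNamed: "AnimatedCube")   // placeholder asset name
```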
3 replies · 1 boost · 1.7k views · Sep ’23
Bounce a virtual object off a real-world wall?
Hi! While waiting for scene understanding to work in the visionOS simulator, I am trying to bounce a virtual object off a real-world wall or other real-world object on iOS ;). I load my virtual objects from a Reality Composer file where I set them to participate in physics with the dynamic motion type. With this I am able to have them collide with each other nicely, and occlusion also works, but they go right through walls and other real-world objects rather than bouncing off... I've tried a couple of variations of the following code: func makeUIView(context: Context) -> ARGameView { arView.environment.sceneUnderstanding.options.insert(.occlusion) arView.environment.sceneUnderstanding.options.insert(.physics) arView.debugOptions.insert(.showSceneUnderstanding) arView.automaticallyConfigureSession = false let config = ARWorldTrackingConfiguration() config.planeDetection = [.horizontal, .vertical] config.sceneReconstruction = .meshWithClassification arView.session.run(config) if let myScene = try? Experience.loadMyScene() { ... arView.scene.anchors.append(myScene) } return arView } I have found several references that this should "just work", e.g. in https://developer.apple.com/videos/play/tech-talks/609 What am I missing? Testing on iPhone 13 Pro Max with iOS 16.5.1 🤔
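A variation of the poster's setup that may be worth trying, not a confirmed fix: enable the .collision scene-understanding option alongside .physics so the reconstructed mesh also receives collision shapes, and run the session before appending the anchors. Experience.loadMyScene() and the container struct are placeholders from the original post.

```swift
import ARKit
import RealityKit
import SwiftUI

// Hypothetical SwiftUI wrapper around the poster's setup, with .collision added.
struct ARGameViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.automaticallyConfigureSession = false

        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.environment.sceneUnderstanding.options.insert(.physics)    // mesh joins the physics simulation
        arView.environment.sceneUnderstanding.options.insert(.collision)  // mesh gets collision shapes as well
        arView.debugOptions.insert(.showSceneUnderstanding)

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        config.sceneReconstruction = .meshWithClassification
        arView.session.run(config)

        if let myScene = try? Experience.loadMyScene() {
            arView.scene.anchors.append(myScene)
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```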
1 reply · 0 boosts · 817 views · Aug ’23
Construct a Voronoi diagram with the Shader Graph Editor?
I'm trying to make a material that has flecks of glitter in it. The main technique I've found to achieve this effect is to use a Voronoi diagram as a normal map, with various amounts of embellishment on top. The Shader Graph editor has a Worley noise node, which is related but produces the "spider web" version of a Voronoi diagram instead of flat polygons of a consistent color. Is there a trick for converting this Worley texture into a vanilla Voronoi diagram, or am I missing something else obvious? Or is what I want not currently possible?
0 replies · 0 boosts · 534 views · Aug ’23
Image Input for ShaderGraphMaterial
In Reality Composer Pro, I've set up a Custom Material that receives an Image File as an input. When I manually select an image and upload it in Reality Composer Pro as the input value, I can easily drive the surface of my object/scene with this image. However, I am unable to drive the value of this "cover" parameter via shaderGraphMaterial.setParameter(name:, value:) in Swift, since there is no way to supply an Image as a value of type MaterialParameters.Value. When I print out shaderGraphMaterial.parameterNames I see both "color" and "cover", so I know this parameter is exposed. Is this a feature that will be supported soon, or is there a workaround? I assume that if something can be created as an input to a Custom Material (in this case an Image File), there should be an equivalent way to drive it via Swift. Thanks!
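A hedged sketch of a possible workaround: load the image as a TextureResource and pass it through the .textureResource case of MaterialParameters.Value. The parameter name "cover" comes from the post, but the asset name and whether the material's image input accepts a texture this way are assumptions to verify.

```swift
import Foundation
import RealityKit

// Hypothetical helper: set a shader-graph image parameter from a bundled image.
// "coverImage" is a placeholder asset name.
func setCoverTexture(on material: inout ShaderGraphMaterial) throws {
    let texture = try TextureResource.load(named: "coverImage")
    try material.setParameter(name: "cover", value: .textureResource(texture))
}
```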
1 reply · 0 boosts · 911 views · Aug ’23