Notes on Reality Composer for use in K12 courses

I recently spent several days messing around in Reality Composer with the intention of creating a course we could teach to students to get them started using Augmented Reality to tell stories and play with digital assets. We would use it in combination with other apps: Tinkercad to teach modeling, Voice Memos so they can record dialogue and interaction sounds, and iMovie to edit a demo reel of their application, while also taking advantage of online asset libraries like Sketchfab, which offers .usdz files and even some animations for free. The focus would be on creating an interactive application that works in AR.

Here are some notes I took while trying things out.

UI Frustrations:

The Behaviors tab doesn’t go up far enough; I’d like to be able to drag it to take up more space on the screen. Several of the actions have sections that sit just below the edge of the screen, and it’s frustrating to have to constantly scroll to see all the information. I’ll select an “Ease Type” on the Move, Rotate, Scale To action and buttons will appear at the very edge of my screen in such a way that I can’t read them until I scroll down. This happens for so many different actions that it feels like I never have enough space to see all the necessary information.

Importing audio from the content library is difficult to navigate. First, I wish there were a way to import a single sound instead of having to import an entire category of sounds. Second, it would be nice to see all the categories of sounds in some kind of sidebar, similar to the “Add Object” menu that already exists.

I wish there were a way to easily copy and paste position and rotation vectors so we could make sure objects are in the same place, especially when we need to duplicate objects in order to get a second-tap implementation. Currently you have to keep flipping back and forth between objects to get the numbers right.

Is there a way to see all of the behaviors a selected object is referenced in? Since the “Affected Objects” list lives inside all sorts of behaviors, actions, triggers, etc., it can be hard to find exactly where a behavior is coming from, especially if your scene has a lot of them. I come from a Unity background, so I’m used to behaviors being attached directly to the object itself; not knowing which behaviors reference a given object makes it possible to accidentally have my physics response overwritten by an animation triggered from somewhere else, and then I have to search for it through all of the behaviors in my scene.
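
For comparison, here’s a minimal sketch of what I mean, assuming the scene were exported and driven from RealityKit instead (the “Experience” and “TourGuide” names are hypothetical, standing in for Xcode’s generated scene code and an object in my scene):

```swift
import RealityKit

// Hypothetical names: "Experience" stands in for Xcode's generated code for a
// .rcproject, and "TourGuide" is an object named in the Reality Composer scene.
func inspectTourGuide() throws {
    let sceneAnchor = try Experience.loadScene()

    if let guide = sceneAnchor.findEntity(named: "TourGuide") {
        // RealityKit is also component-based, so the components attached
        // directly to an entity are at least inspectable...
        let hasPhysics = guide.components.has(PhysicsBodyComponent.self)
        print("TourGuide has a physics body: \(hasPhysics)")
        // ...but behaviors authored in Reality Composer are baked into the
        // scene file, so there's still no list of which behaviors reference
        // this entity, which is exactly the gap I'm describing.
    }
}
```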

Is there a way to see the result of my object scanning? Right now it all happens behind the scenes, and it feels like the object scanning doesn’t work unless the object is in the same position relative to the background as it was before. It’s a black box, and it’s hard to understand what I’m doing wrong with the scanning, because when I move the object everything stops working.
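
The only workaround I can think of is to drop down to ARKit and watch object detection fire directly. Here’s a rough sketch, assuming a scanned “Sculpture.arobject” file is bundled with the app (that name is made up):

```swift
import ARKit

// Minimal sketch: detect a previously scanned object outside of Reality
// Composer, just to see when and where detection actually fires.
class ObjectDetectionDebugger: NSObject, ARSessionDelegate {
    func run(on session: ARSession) throws {
        // "Sculpture.arobject" is a hypothetical scan exported into the bundle.
        let url = Bundle.main.url(forResource: "Sculpture", withExtension: "arobject")!
        let reference = try ARReferenceObject(archiveURL: url)

        let config = ARWorldTrackingConfiguration()
        config.detectionObjects = [reference]
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let objectAnchor as ARObjectAnchor in anchors {
            // Logs the detected object's name and position, which makes the
            // "black box" a little more observable than it is in the app.
            print("Detected \(objectAnchor.referenceObject.name ?? "object") at \(objectAnchor.transform.columns.3)")
        }
    }
}
```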

I could use a scene hierarchy or list of all objects in a scene. Sometimes I don’t know where an object is but I know what it is called, and I’d like to be able to select it to make changes to it. Sometimes objects start right on top of each other in the scene (like a reset button for a physics simulation), which makes it frustrating to select one of them over the other, especially since it seems that the only way to select “affected objects” is to tap on them, instead of choosing from a list of those available in the scene.

Feature Requests:

One thing other apps have that makes it easy to add personality to a scene is characters with a variety of animations they can play depending on context. It would be nice to have some kind of character creator that comes with a set of pre-made animations, or at least a library of characters with animations. For example, if we want to create a non-player character that waves to the player, then moves somewhere else and talks again, we could switch the character’s animation at the appropriate parts of the movement to make it feel more real. This is harder to do with .usdz files that only play one animation: the movement is cool, but it typically fits only one situation, so you end up juggling a bunch of objects being turned off and on, even if you do find an importable character with a couple of animations (such as you might find on Mixamo in .fbx format). I believe it may be possible for a .usdz file to contain more than one animation, but I haven’t seen any examples of this.
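
For reference, here’s a minimal RealityKit sketch of what I’m hoping a multi-animation character would look like, assuming a hypothetical “Guide.usdz” were loaded in code (whether it really contains several animations depends entirely on how it was exported):

```swift
import RealityKit

// Minimal sketch, assuming the character is loaded in RealityKit rather than
// placed directly in Reality Composer. "Guide.usdz" is a hypothetical asset.
func loadCharacter() throws -> Entity {
    let character = try Entity.load(named: "Guide")

    // RealityKit exposes every animation it finds in the file here, so a
    // .usdz with more than one animation would show up as multiple entries.
    print("Animations found: \(character.availableAnimations.count)")

    // Play the first one (e.g. a wave); later you could switch to another.
    if let wave = character.availableAnimations.first {
        character.playAnimation(wave.repeat(), transitionDuration: 0.3)
    }
    return character
}
```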

Any chance we’ll get a version of the Reality Converter app that works on iPads and iPhones? We don’t want to assume our students have access to a Mac, and being able to convert .fbx or .obj files would open up access to a much wider variety of downloadable online assets.

Something that would really help with more complex scenes is the ability to add a second trigger to an object that relies on a condition being met first. The easiest example is being able to tap on a tour guide a second time in order to move on to the next object. This gets a little deeper into code blocks, but there could be an “if” block or a condition statement that checks whether something is in proximity before allowing a tap, or checks how many times the object has been tapped by storing the count in an integer variable you could set and compare. The way I first imagined it, you’d be able to add a trigger that only becomes enabled AFTER the first action sequence has completed, so you can build longer chains. This also comes into play with physics interactions: let’s say I want to tap a ball to launch it, but once its speed drops below some threshold I want it to automatically reset.
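
To make the idea concrete, here’s a rough sketch of the tap-counting version, assuming the scene were driven from RealityKit code with a hypothetical “tourGuide” entity rather than from Reality Composer’s behavior list:

```swift
import UIKit
import RealityKit

// Rough sketch of the "second tap" idea, driven from RealityKit code.
// "tourGuide" is a hypothetical entity from my scene.
class TourGuideController: NSObject {
    let arView: ARView
    let tourGuide: Entity
    private var tapCount = 0   // the "integer variable" mentioned above

    init(arView: ARView, tourGuide: Entity) {
        self.arView = arView
        self.tourGuide = tourGuide
        super.init()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        // Only count taps that actually hit the tour guide.
        guard arView.entity(at: gesture.location(in: arView)) === tourGuide else { return }

        tapCount += 1
        switch tapCount {
        case 1:
            print("First tap: play the intro, then walk to the next exhibit")
        case 2:
            print("Second tap: only meaningful after the first sequence; continue the tour")
        default:
            break
        }
    }
}
```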

I’d like the ability to make one object follow another with a block, or some kind of system similar to “parenting” objects together like you can in a 3D engine. This way you could separate the visuals of an object from its physics, letting you play animations on launched objects, spin them, or emphasize them while still allowing the physics simulation to work.
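
Here’s a minimal sketch of the kind of parenting I mean, as it exists in RealityKit today (the “FancyBall” asset name is hypothetical): the physics lives on a simple proxy, and the visible, animated model rides along as its child.

```swift
import RealityKit

// The physics lives on a simple proxy sphere; the visible, animated model is
// parented to it and just rides along. "FancyBall.usdz" is a hypothetical asset.
func makeLaunchableBall() throws -> ModelEntity {
    let physicsBall = ModelEntity(mesh: .generateSphere(radius: 0.05))
    physicsBall.generateCollisionShapes(recursive: false)
    physicsBall.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                                   material: .default,
                                                   mode: .dynamic)

    // The visible mesh follows the physics body automatically, while staying
    // free to spin, pulse, or play its own animations.
    let visualBall = try Entity.load(named: "FancyBall")
    physicsBall.addChild(visualBall)
    return physicsBall
}
```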

For physics simulations, is it possible to add a feature where the direction of a force points toward another object in the scene? Or better yet, away from it, using negative values? Specifically, I’d like to be able to launch a projectile in the direction the camera is facing, or give the user some control over the direction during playtime. It would also be nice to edit the material or color of an object with an action, to give the user a little pulse of color as feedback when they tap, or even to let them customize their environment with different textures.
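
As a sketch of the camera-relative launch, assuming a RealityKit ARView and a ball entity that already has a dynamic physics body (the impulse strength and material color are arbitrary):

```swift
import UIKit
import RealityKit

// Sketch of launching a ball in the direction the camera is facing, plus a
// quick color swap as tap feedback.
func launch(_ ball: ModelEntity, from arView: ARView) {
    // -Z is "forward" in camera space; rotate it into world coordinates.
    let forward = arView.cameraTransform.rotation.act(SIMD3<Float>(0, 0, -1))

    // Push the ball along the camera's forward direction; negating the vector
    // would push it away from the camera instead.
    ball.applyLinearImpulse(forward * 2.0, relativeTo: nil)

    // A rough stand-in for the "pulse of color" feedback: swap the material.
    ball.model?.materials = [SimpleMaterial(color: .orange, isMetallic: false)]
}
```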

If you want this app to be used in education, there needs to be a way for teachers to share the experiences they’ve built with each other, some kind of online repository where you can try out what others have made.

Replies

Hi, thanks for your detailed feedback and for using Reality Composer for educational purposes. I would suggest that you report your frustrations and feature requests on Feedback Assistant if you haven't already. That way the requests can make it to the team and be prioritized accordingly. Thank you!