Reality Converter


Convert, view, and customize USDZ 3D objects on Mac using Reality Converter.

Reality Converter Documentation

Posts under Reality Converter tag

29 Posts
Post not yet marked as solved
4 Replies
1.5k Views
Are there any good tutorials or suggestions on creating models in Blender and exporting them with the associated materials and nodes? Specifically, I'm looking to see whether there is a way to export translucency associated with an object (e.g. a glass bottle). I have created a simple cube with a Principled BSDF shader, but the transmission and IOR settings are not porting over. Any tips or suggestions would be helpful.
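A note on why this tends to happen, plus a hedged workaround: USDZ viewers such as AR Quick Look read UsdPreviewSurface shaders rather than Blender's Principled BSDF nodes, and UsdPreviewSurface has no transmission input, only opacity and ior. One option is to author the glass look directly on the exported USD file with the USD Python API. The sketch below assumes the pxr package is installed (for example via Apple's usdpython tools or pip install usd-core) and uses placeholder file, prim, and material paths.

```python
# Minimal sketch, assuming the USD Python package (pxr) is available and the
# Blender export is named "bottle.usd". The prim and material paths below are
# hypothetical placeholders for whatever your exporter actually writes.
from pxr import Usd, UsdShade, Sdf

stage = Usd.Stage.Open("bottle.usd")

# Define a UsdPreviewSurface material; USDZ viewers read this shader rather
# than Blender's Principled BSDF node graph.
material = UsdShade.Material.Define(stage, "/Root/Materials/Glass")
shader = UsdShade.Shader.Define(stage, "/Root/Materials/Glass/PreviewSurface")
shader.CreateIdAttr("UsdPreviewSurface")

# UsdPreviewSurface has no "transmission" input; translucency is expressed
# through opacity (plus ior as a refraction hint), so Blender's transmission
# value has to be mapped onto these by hand.
shader.CreateInput("opacity", Sdf.ValueTypeNames.Float).Set(0.2)
shader.CreateInput("ior", Sdf.ValueTypeNames.Float).Set(1.45)
shader.CreateInput("roughness", Sdf.ValueTypeNames.Float).Set(0.05)

material.CreateSurfaceOutput().ConnectToSource(shader.ConnectableAPI(), "surface")

# Bind the material to the mesh (the path is an assumption about the export).
mesh = stage.GetPrimAtPath("/Root/Cube")
UsdShade.MaterialBindingAPI(mesh).Bind(material)

stage.GetRootLayer().Save()
```

Opacity-based translucency is only an approximation of real transmission, so in most viewers a glass bottle authored this way will look see-through but not refractive.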
Post not yet marked as solved
2 Replies
892 Views
I've tried converting a glTF to USDZ, but it only puts out a USDZ conversion error. Under details it only says "An unexpected error occurred while converting this file to USDZ. Please fix any other errors and try again." This error just showed up recently. I'm on macOS 12.3.1 and Reality Converter 1.0 (47.1).
Post not yet marked as solved
3 Replies
1.5k Views
Hello, I created a simple cube in Blender with 2 animations: one moves the cube up and down, and the second rotates the cube in place. I exported this file in glb format and tried to convert it using Reality Converter, but unfortunately I can only see 1 animation. Is there a limitation in Reality Converter? Can I include more than 1 animation? The original glb file has both animations inside; as you can see from the screenshot, I checked the file using an online glb viewer and there is no problem, both animations are there. The converter unfortunately only sees the last one created. Any reason or explanation? I believe it is a limitation of Reality Converter. Regards
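One way to see what actually survived the glb-to-USDZ conversion is to open the converted file with the USD Python API and list the prims that carry animation data. This is only a diagnostic sketch; it assumes the pxr package is available and uses "cube.usdz" as a placeholder file name.

```python
# Minimal sketch: list animation-carrying prims in the converted file.
from pxr import Usd

stage = Usd.Stage.Open("cube.usdz")
for prim in stage.Traverse():
    # SkelAnimation prims hold skeletal / blend-shape animation.
    if prim.GetTypeName() == "SkelAnimation":
        print("skeletal animation prim:", prim.GetPath())
    # Time-sampled attributes catch plain transform animation on any prim.
    for attr in prim.GetAttributes():
        if attr.GetNumTimeSamples() > 1:
            print("time-sampled:", prim.GetPath(), attr.GetName())
```

If only one of the two Blender actions shows up, the data was dropped during conversion; if both appear, the limitation is on the playback side, since AR Quick Look and Reality Converter's preview have generally played only a single default animation per file, as far as I can tell.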
Post not yet marked as solved
0 Replies
935 Views
Currently, I have a requirement to use models created in Unity in RealityKit, so I need to convert the models to USDZ format. I used this approach (How to easily create AR content for iPhone using Unity), but the result was not as expected. Converted models do not display correctly, and animations on objects do not appear in the converted files. I also noticed that objects made using the Unity particle system (e.g., confetti) were not converted with this approach. I also tried converting by selecting the ‘Export selected as USDZ’ menu item from Unity’s main menu bar, but nothing worked. So is there any effective way to convert Unity models, including the particle systems, to USDZ?
Post not yet marked as solved
1 Reply
897 Views
I recently spent several days messing around in Reality Composer with the intention of creating a course we could teach to students to get them started using augmented reality to tell stories and play with digital assets. We would use it in combination with other apps like Tinkercad to teach them modeling, the Voice Memos recorder so they can record dialogue and interaction sounds, and iMovie to edit a demo reel of their application, as well as taking advantage of online asset libraries like Sketchfab, which has .usdz files and even some animations available for free. The focus would be on creating an interactive application that works in AR. Here are some notes I took while trying things out.

UI frustrations:
- The Behaviors tab doesn’t go up far enough; I’d like to be able to drag it to take up more space on the screen. Several of the actions have sections that go just below the edge of the screen, and it’s frustrating to have to constantly scroll in order to see all the information. I’ll select an “Ease Type” on the Move, Rotate, Scale To action and buttons will appear at the very edge of my screen in such a way that I can’t read them until I scroll down. This happens for so many different actions that it feels like I don’t have enough space to see all the necessary information.
- Importing audio from the content library is difficult to navigate. First, I wish there were a way to import only one sound instead of having to import an entire category of sounds. Second, it would be nice to see all the categories of sounds in some kind of sidebar, similar to the “Add Object” menu that already exists.
- I wish there were a way to copy and paste position and rotation vectors easily so we could make sure objects are in the same place, especially if we need to duplicate objects in order to get a second tap implementation. Currently you have to keep flipping back and forth between objects to get the numbers right.
- Is there a way to see all of the behaviors a selected object is referenced in? Since the “Affected Objects” list is inside all sorts of behaviors, actions, triggers, etc., it can be hard to find exactly where a behavior is coming from, especially if your scene has a lot of behaviors. I come from a Unity background, so I’m used to behaviors being put directly onto the object itself; not knowing which behaviors reference a given object makes it possible for my physics response to be accidentally overwritten by an animation triggered from somewhere else, which then sends me searching through all of the behaviors in my scene.
- Is there a way to see the result of my object scanning? Right now it’s all behind the scenes, and it feels like the object scanning doesn’t work unless the object is in the same position relative to the background as it was before. It’s a black box, and it’s hard to understand what I’m doing wrong with the scanning, because when I move the object everything stops working.
- I could use a scene hierarchy or list of all objects in a scene. Sometimes I don’t know where an object is but I know what it is called, and I’d like to be able to select it to make changes to it. Sometimes objects start right on top of each other in the scene (like a reset button for a physics simulation), which makes it frustrating to select one of them over the other, especially since it seems that the only way to select “affected objects” is to tap on them, instead of choosing from a list of those available in the scene.
Feature requests:
- One thing other apps have that makes it easy to add personality to a scene is characters with a variety of animations they can play depending on context. It would be nice to have some kind of character creator that came with a bunch of pre-made animations, or at least a library of characters with animations. For example, if we want to create a non-player character that waves to the player, then moves somewhere else, then talks again, we could switch the character’s animation at the appropriate parts of the movement to make it feel more real. This is difficult to do with .usdz files that only play one animation; although the movement is cool, it typically only fits one setting, so you have to juggle turning a bunch of objects off and on, even if you do find an importable character with a couple of animations (such as you might find on Mixamo in .fbx format). I believe it may be possible for a .usdz file to have more than one animation in it, but I haven’t seen any examples of this.
- Any chance we’ll get a version of the Reality Converter app that works on iPads and iPhones? We don’t want to assume our students have access to a MacBook, and being able to convert .fbx or .obj files would open up access to a wider variety of online downloadable assets.
- Something that would really help make more complex scenes is the ability to add a second trigger to an object that relies on a condition being met first. The easiest example is being able to tap a tour guide a second time in order to move on to the next object. This gets a little deeper into code blocks, but possibly there could be an if block or condition statement that checks whether something is in proximity before allowing a tap, or checks how many times the object has been tapped by storing the count in an integer variable you could set and check the value of. The way I first imagined it, maybe you’d be able to add a trigger that enables AFTER the first action sequence has completed, so you can build longer chains. This also comes into play with physics interactions: let’s say I want to tap a ball to launch it, but when it stops moving faster than some threshold I want it to automatically reset.
- I’d like the ability to make one object follow another with a block, or some kind of system similar to “parenting” objects together like you can in a 3D engine. This way you could separate the visuals of an object from its physics, letting you play animations on launched objects, spin them, and emphasize them while still allowing the physics simulation to work.
- For physics simulations, is it possible to implement a feature where the direction of force points toward another object in the scene? Or better yet, away from it using negative values? Specifically, I’d like to be able to launch a projectile in the direction the camera is facing, or give the user some control over the direction during playtime.
- It would be nice to edit the material or color of an object with an action, give the user a little pulse of color as feedback when they tap, or even allow the user to customize their environment with different textures.
- If you want this app to be used in education, there must be a way for teachers to share their built experiences with each other, some kind of online repository where you can try out what others have made.
Post not yet marked as solved
0 Replies
684 Views
Hi, I was wondering whether there is any way to extract depth data from a 3D photogrammetry model or some other kind of scan. I am doing a project on measuring skin wounds/extrusions, and I am trying to see if I could use an iPhone to measure certain parts of wounds accurately enough to determine their depths. I have also seen OpenCV stereo vision being used to measure depth, but I was hoping someone had an idea of how I could accurately extract this data using an Apple SDK.
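This is not an Apple SDK answer, but once you have a reconstructed mesh (for example from an Object Capture photogrammetry session), a rough way to estimate a wound's depth is to fit a plane to points sampled around the wound rim and measure how far the interior points fall below that plane. The sketch below is purely illustrative: rim_points and wound_points are hypothetical (N, 3) arrays in metres that you would have to segment out of the mesh yourself.

```python
# Rough sketch of estimating depth from a reconstructed mesh: fit a plane to
# points sampled around the wound rim, then measure how far points inside the
# wound fall away from it.
import numpy as np

def plane_fit(points):
    """Least-squares plane through `points`; returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def wound_depth(rim_points, wound_points):
    centroid, normal = plane_fit(rim_points)
    # Signed distance of each wound point from the rim plane.
    distances = (wound_points - centroid) @ normal
    # Depth is the largest excursion from the plane.
    return float(np.abs(distances).max())
```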
Post not yet marked as solved
2 Replies
1.9k Views
Hello Dev Community, I've been thinking over Apple's preference for USDZ for AR and 3D content, especially given the widely used glTF format. I'm keen to discuss and hear your insights on this choice. USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. In contrast, glTF is also a popular format with its own merits, like being an open standard and offering flexibility. Here are some of my questions about the use of USDZ: Why did Apple choose USDZ over other 3D file formats like glTF? What benefits does USDZ bring to Apple's AR and 3D content ecosystem? Are there any limitations of USDZ compared to other file formats? Could factors like compatibility, security, or ease of integration have influenced Apple's decision? I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
Post not yet marked as solved
3 Replies
782 Views
Hi, I have this pesky issue where whenever I open an .fbx or .usdc file in Reality Converter, it fails to load the image textures due to a lack of permissions. I then have to click on each individual one to open a file dialog and select it manually. This gets tedious very quickly. I have granted Full Disk Access to Reality Converter in my Privacy & Security preferences, but this made no difference. Does anyone know how to get around this issue?
Post not yet marked as solved
3 Replies
947 Views
If you have a scene with a simple custom .usda material applied to a primitive like a cube, the exported (.usdz) material definition is unknown to tools like Reality Converter Version 1.0 (53) or Blender Version 3.6.1. Reality Converter shows warnings such as "Missing references in USD file" and "Invalid USD shader node in USD file". Even Reality Composer Pro is unable to recreate the material correctly from its own exported .usdz files. Feedback: FB12699421
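A quick way to see why a custom material gets rejected is to dump the shader identifiers in the exported file: Reality Converter and AR Quick Look generally only understand UsdPreviewSurface-style networks (UsdPreviewSurface, UsdUVTexture, UsdPrimvarReader_*), so any other info:id is a likely culprit. A minimal diagnostic sketch, assuming the pxr Python package is available and using "scene.usdz" as a placeholder file name:

```python
# Print every shader prim and its info:id to spot shaders that other
# viewers will not recognise.
from pxr import Usd, UsdShade

stage = Usd.Stage.Open("scene.usdz")
for prim in stage.Traverse():
    shader = UsdShade.Shader(prim)
    if shader:
        # Prints e.g. "UsdPreviewSurface", or whatever custom id the exporter
        # wrote; None means no info:id was authored at all.
        print(prim.GetPath(), "->", shader.GetIdAttr().Get())
```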
Post not yet marked as solved
0 Replies
455 Views
Just trying to figure out how to make an occlusion mask in Reality Composer or Reality Converter. I use Blender to make my assets and want to be able to make something similar to a portal, which would require an occlusion mask.
Post not yet marked as solved
0 Replies
471 Views
I'm trying to convert a GLB file to USDZ using Reality Converter. According to the Khronos glTF validator, it is valid, with no warnings. The file contains a single animation. Is it possible to get any more information about the error that occurred than this? The file I'm trying to convert is https://www.dropbox.com/scl/fi/ng38kle0drr4srypw5ntz/Project-Name-copy-1-1.glb?rlkey=ay4d1q3n1ykyixe3krlvnfeyb&dl=1 (I cannot attach it here because the limit is 244.140625 KB...). The workaround to get this conversion working is to remove the animation or change the order of the objects. Removing the animation is not an option, and changing the order of the objects is a bit of a magic solution that cannot really be automated without understanding what is happening. I downloaded Reality Converter from https://developer.apple.com/augmented-reality/tools/ today, so I assume it is the latest version.
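Since Reality Converter's error message gives so little to go on, one way to narrow the problem down is to read the GLB's embedded JSON chunk directly and compare the animation and node order between the variant that converts and the one that fails. The sketch below uses only the Python standard library; the file name "model.glb" is a placeholder.

```python
# Read the JSON chunk of a .glb file and print its animations and node order.
import json
import struct

def read_glb_json(path):
    with open(path, "rb") as f:
        # GLB header: magic "glTF", version, total length (12 bytes).
        magic, version, _length = struct.unpack("<4sII", f.read(12))
        assert magic == b"glTF" and version == 2
        # First chunk must be the JSON chunk (type 0x4E4F534A == 'JSON').
        chunk_length, chunk_type = struct.unpack("<II", f.read(8))
        assert chunk_type == 0x4E4F534A
        return json.loads(f.read(chunk_length))

gltf = read_glb_json("model.glb")
for anim in gltf.get("animations", []):
    print("animation:", anim.get("name"), "channels:", len(anim.get("channels", [])))
for i, node in enumerate(gltf.get("nodes", [])):
    print("node", i, node.get("name"))
```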
Post marked as solved
2 Replies
607 Views
I can only download usdpython from the following website: https://developer.apple.com/augmented-reality/tools/ Where can I get the various versions of usdpython? Is Apple's usdzconvert (usdpython) open source? I want to learn how Apple achieves the conversion from glTF to USDZ, because I'm currently using version 0.66 and I feel that the conversion of glTF features is not quite sufficient.
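For experimenting with how glTF features come across, one approach is to drive usdzconvert from a script and then inspect the resulting stage with the pxr API. A minimal sketch, assuming usdzconvert from Apple's usdpython package is on your PATH and that "model.gltf" exists (both names are placeholders):

```python
# Convert a glTF with usdzconvert, then inspect what it produced.
import subprocess
from pxr import Usd, UsdShade

# usdzconvert usage: usdzconvert inputFile [outputFile] [options];
# check=True raises if the conversion exits with an error.
subprocess.run(["usdzconvert", "model.gltf", "model.usdz"], check=True)

# Print the prim hierarchy and any shader ids, to compare against the
# original glTF structure and materials.
stage = Usd.Stage.Open("model.usdz")
for prim in stage.Traverse():
    shader = UsdShade.Shader(prim)
    shader_id = (shader.GetIdAttr().Get() or "") if shader else ""
    print(prim.GetPath(), prim.GetTypeName(), shader_id)
```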