Apple's own USDZ files from their website do not display properly in either OPReview or Keynote when imported. They do display properly in Reality Converter, but with this error:
"Invalid USD shader node in USD file
Shader nodes must have “id” as the implementationSource, with id values that begin with “Usd”. Also, shader inputs with connections must each have a single, valid connection source."
I tried importing other models from external sources and they work without any issue at all.
Is there any potential fix or workaround for this?
Thanks in advance.
Reality Converter
Convert, view, and customize USDZ 3D objects on Mac using Reality Converter.
Posts under the Reality Converter tag (14 posts)
I have designed a 3D object and exported it as a USDZ. I also 3D printed the object. I want to use the object as a 3D trigger for an AR experience I am building. My question is: is there a process that would let me take the .usdz file and convert it to an .arobject or an .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I used the "scan" option when setting up my scene, but the resolution/fidelity seems really low and the results I get are mediocre.
I would love to take the 3D USDZ I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, or is there a process to do it? I am able to take the 3D scan I make in Reality Composer (which is exported as an .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other way: .usdz > .objcap or .arobject.
I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object I built and textured in Substance Painter. I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that starts a scene placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally, the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
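For reference (this does not solve the .usdz-to-.arobject conversion being asked about), the trigger side of such a setup is ARKit object detection once a reference .arobject exists. A minimal sketch, where PrintedModel.arobject is a hypothetical file bundled with the app:
import ARKit

// Sketch: use a scanned .arobject as the trigger for the AR overlay.
// "PrintedModel.arobject" is a hypothetical resource name.
final class ObjectTriggerSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() throws {
        guard let url = Bundle.main.url(forResource: "PrintedModel", withExtension: "arobject") else { return }
        let reference = try ARReferenceObject(archiveURL: url)

        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionObjects = [reference]

        session.delegate = self
        session.run(configuration)
    }

    // Called when ARKit recognizes the printed object in the camera feed.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARObjectAnchor {
            // Place/overlay the textured model at this anchor so it stays
            // spatially attached as the print is tracked.
            print("Detected printed object at", anchor.transform)
        }
    }
}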
Hey everyone, this is my first post here in the Apple forum.
I need your help to better understand RealityKit and file exports, so let me explain.
I'm trying to create a little 3D object editor, and it seems to work pretty well using RealityView and managing materials on the entity.
I'm currently working with all the beta APIs, and I would like to export my entity to a .usdz or an .obj file.
I've found a method that allows me to create a .reality file:
// Destination in the app's Documents directory.
let path = FileManager.default.urls(for: .documentDirectory,
                                    in: .userDomainMask)[0]
    .appendingPathComponent("model.reality")
// Serialize the entity hierarchy to a .reality file.
try await self.appState.parentEntity.write(to: path)
but now I don't know how to convert it into a .usdz or .obj file, or any other standard 3D format.
Do you have any idea how I could do this?
Thank you so much!
Have a nice day ^^
I’d like to convert a FileMaker 18 runtime to a Mac (Catalina) application package.
I found a YouTube video that describes how to convert a shell script into a macOS app.
This is the shell script:
I’ve had no luck adapting it to convert my FileMaker runtime into an app.
Thanks in advance for any advice.
Regards,
Lara
I've got a couple of 2D PNG assets that I want to add to a scene made of a couple of other USDZ files in Reality Composer Pro (picture adding a couple of 2D video game characters to a simple 3D diorama).
When I try to drag the PNGs to the workspace or the file tree…nothing happens.
I found a walkthrough on Medium (called "Importing and Exporting Personalized Objects for Augmented Reality: Reality Composer and SwiftUI" for those curious as I can't link to Medium posts here) that makes it look like users could do this with simple drag-and-drop. The Medium post is from June 2023, and in the screenshots RCP visually looks a lot more like Reality Composer on iPad, so I'm assuming it's changed a lot since then?
Is there still a way to do this? I've tried adding the 2D elements to a scene with Blender's "Import Images as Planes", but I'm getting weird halos around them and was hoping RCP could make the process a bit easier/cleaner.
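One possible workaround, if RCP won't take the PNGs directly, is to build the 2D elements in code as unlit, alpha-cutout planes in RealityKit; a hard alpha cutoff is usually what avoids the halo fringing seen with Blender-style image planes. A minimal sketch, where "character" is a hypothetical image asset name:
import RealityKit

// Sketch: show a 2D PNG as an unlit, alpha-cutout plane.
// "character" is a hypothetical PNG asset name; the 0.2 m size is arbitrary.
func makeSpriteEntity() throws -> ModelEntity {
    let texture = try TextureResource.load(named: "character")

    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    // Treat the PNG alpha as a hard cutout rather than blending, to avoid edge halos.
    material.opacityThreshold = 0.5

    let plane = MeshResource.generatePlane(width: 0.2, height: 0.2)
    return ModelEntity(mesh: plane, materials: [material])
}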
I'm trying to make a simple demo of using ShaderGraphMaterial in a USDZ file that I can preview on Mac and visionOS, but I'm having trouble.
In Reality Composer, I make a sphere, then assign a ShaderGraphMaterial to the material, with a simple diffuse color (green) input. When I save the file as .usda, it displays as a gray sphere on Mac rather than the green sphere shown in Reality Composer. If I then convert to USDZ using Reality Converter, I get a warning on import:
"Shader nodes must have “id” as the implementationSource, with id values that begin with “Usd”. Also, shader inputs with connections must each have a single, valid connection source."
And the exported .usdz also shows as a gray sphere.
Is there a simple demo of a .usda file using ShaderGraphMaterial that displays on Mac, iOS, and visionOS that I can look at to see how it looks internally?
My actual problem is creating .usdz/.usda files on visionOS for viewing on iOS, Mac, and visionOS, but the first step is showing it's possible to even use ShaderGraphMaterial across all platforms.
Thanks
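Not a standalone .usda demo, but for the visionOS/RealityKit side, one route that is known to work is loading the shader-graph material from the Reality Composer Pro package at runtime rather than from a converted .usdz. A minimal sketch, assuming a hypothetical material at /Root/GreenMaterial in a Scene.usda inside the default RealityKitContent package (names are assumptions):
import RealityKit
import RealityKitContent   // the default visionOS template package (assumption)

// Sketch (visionOS): load a Reality Composer Pro shader-graph material at runtime.
// "/Root/GreenMaterial" and "Scene.usda" are hypothetical names from an RCP package.
func applyGreenShaderGraph(to model: ModelEntity) async throws {
    let material = try await ShaderGraphMaterial(named: "/Root/GreenMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)
    model.model?.materials = [material]
}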
We are porting an iOS Unity AR app to native visionOS.
Ideally, we want to reuse our AR models in both applications. These AR models are rather simple, but converting them manually would still be time-consuming, especially when it comes to the shaders.
Is anyone aware of any attempts to write conversion tools for this? Maybe in other ecosystems like Godot or Unreal, where folks also want to convert the proprietary Unity format to something else?
I've seen there's an FBX converter, but it would not handle shaders or particles.
I am basically looking for something like the PolySpatial-internal conversion tools, but without the weight of the rest of Unity. Alternatively, is there a way to export a Unity project to visionOS and then just take the models out of the Xcode project?
I'm working on a project in which RealityKit for iOS will be used to display 3D files (USDZ) in a real-world environment. The model will also need to animate differently depending on which button is pressed. When using models downloaded from various websites or via Apple Quick Look, the code works well: I can place the model and click a button to play its animation.
Unfortunately, although the model my team provided (made in Blender) animates in SceneKit, it does not play at all when placed in the real world, not even when a button is pressed.
I checked the file with the RealityKit USDZ tool, and it reports that the USDZ file is not valid, but it doesn't say what's wrong.
Could you please help me figure out what's wrong with my USDZ file?
Working USDZ: https://developer.apple.com/augmented-reality/quick-look/models/drummertoy/toy_drummer_idle.usdz
My file: https://drive.google.com/file/d/1UibIKBy2fx4q0XxSNodOwQZMLgktKiKF/view?usp=sharing
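One way to narrow this down is to check whether RealityKit sees any animation clips in the file at all once it is loaded; if availableAnimations comes back empty, the problem is in the export rather than in the button/playback code. A small diagnostic sketch, where "model" stands in for the problem file:
import RealityKit

// Diagnostic sketch: load the USDZ and report whether RealityKit finds any animation clips.
// "model" is a placeholder for the problem file's name.
func loadAndInspect() async throws -> Entity {
    let entity = try await Entity(named: "model")
    print("Animation clips found:", entity.availableAnimations.count)

    // If a clip exists, loop it; if the count is 0, the USDZ export itself dropped the animation.
    if let clip = entity.availableAnimations.first {
        entity.playAnimation(clip.repeat(), transitionDuration: 0.3, startsPaused: false)
    }
    return entity
}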
Hey everyone, I'm running into an issue where my USDZ model is not showing up in Reality Composer Pro; it was exported from Blender as a USD and converted in Reality Converter.
See attached image:
It's strange, because the USDZ model appears fine in previews. But once it is brought into RCP, I receive this pop-up and the model does not appear.
I'm not sure how to resolve this multiple-root-level issue. If anyone can point me in the right direction or offer any feedback, it is much appreciated! Thank you!
I am trying to use my animated model in Xcode with SceneKit. I exported my model from Maya with animation data in .usd format, then converted it to .usdz with Reality Converter. When I open it in the Xcode viewer it is animated and everything is fine. However, when I try to use it in my app it doesn't animate. On the other hand, when I try the robot_walk_idle model from Apple's example models, it is animated. Maybe I am missing an option in the export settings. Thanks for any help.
import SwiftUI
import SceneKit

struct ModelView: View {
    var body: some View {
        VStack {
            // Apple's sample robot_walk_idle.usdz animates here; my own converted model does not.
            SceneView(scene: SCNScene(named: "robot_walk_idle.usdz"))
        }
    }
}
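In case it helps with debugging: when SceneKit loads a USDZ, the clips typically arrive as SCNAnimationPlayer objects attached to individual nodes, and depending on the asset they may not start on their own. A small sketch (plain SceneKit API, not specific to this model) that walks the node tree and starts whatever players it finds:
import SceneKit

// Sketch: start every animation player embedded in a loaded scene.
func playAllAnimations(in scene: SCNScene) {
    scene.rootNode.enumerateHierarchy { node, _ in
        for key in node.animationKeys {
            node.animationPlayer(forKey: key)?.play()
        }
    }
}

// Usage with the view above (hypothetical):
// if let scene = SCNScene(named: "robot_walk_idle.usdz") {
//     playAllAnimations(in: scene)
//     // then pass `scene` to SceneView(scene:)
// }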
RealityKit doesn't appear to support particles. After exporting particles from Blender 4.0.1 in standard .usdz format, the particle system renders correctly in Finder and Reality Converter, but when it is loaded and anchored in RealityKit… nothing happens. This appears to be a bug in RealityKit. I tried with one and with several particle instances, and nothing renders.
Hello Dev Community,
I've been thinking over Apple's preference for USDZ for AR and 3D content, especially given the widely used GLTF. I'm keen to discuss this and hear your insights on the choice.
USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. In contrast, GLTF is also a popular format with its own merits, like being an open standard and offering flexibility.
Here are some of my questions about the use of USDZ:
Why did Apple choose USDZ over other 3D file formats like GLTF?
What benefits does USDZ bring to Apple's AR and 3D content ecosystem?
Are there any limitations of USDZ compared to other file formats?
Could factors like compatibility, security, or integration ease have influenced Apple's decision?
I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
I have a single FBX model with several animations (idle, walk, run, eat, …), but after I convert it to USDZ format I can only play one animation. Where can I find the other animations and how can I play them, or does USDZ not support this yet?
Thank you.
Cyan
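For what it's worth, on the RealityKit side whatever clips survive the conversion are listed in availableAnimations, so printing them is a quick way to see whether the USDZ really contains more than one take (some FBX export paths bake everything into a single merged clip). A minimal sketch, where "model" is a hypothetical file name:
import RealityKit

// Sketch: list the animation clips in the converted USDZ and play one by index.
// "model" is a hypothetical file name.
func inspectClips() async throws {
    let entity = try await Entity(named: "model")
    let clips = entity.availableAnimations

    print("Clip count:", clips.count)
    print("Clip names:", clips.map(\.name))

    // Play the second clip (e.g. "walk") if it exists.
    if clips.count > 1 {
        entity.playAnimation(clips[1].repeat(), transitionDuration: 0.25, startsPaused: false)
    }
}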
I've tried converting a glTF to USDZ, but it only puts out a USDZ conversion error. Under Details it only says "An unexpected error occurred while converting this file to USDZ. Please fix any other errors and try again."
This error just showed up recently. I'm on macOS 12.3.1 and Reality Converter 1.0 (47.1).