USDZ is a 3D file format that can be displayed as AR content on a website.

USDZ Documentation

Posts under USDZ tag

74 Posts
Post not yet marked as solved
0 Replies
45 Views
Please refer to https://webkit.org/blog/8421/viewing-augmented-reality-assets-in-safari-for-ios/. In the "Linking to USDZ" section you can read that, once the content is served with the correct MIME type, you can link to USDZ content in the normal manner, e.g. https://webkit.org/demos/ar/heart.usdz. When you tap that link, Safari on iOS navigates to a new page that shows a static thumbnail of the 3D asset: this thumbnail is a render of the USDZ computed on the fly (we are not passing an image placeholder as in the other method mentioned on that page). What I want is to see this preview without needing to tap the link, that is, to somehow force the link to resolve and stream the rendered thumbnail back to the original page (such as this forum post). Basically Apple offers two methods:
Method 1: page 1 presents a text link --> tap --> go to page 2, where the asset thumbnail is rendered --> tap --> enter AR mode.
Method 2: page 1 presents a static pre-rendered image as the link --> tap the image --> go straight to AR mode (no page 2 needed).
We need a combination of these methods: on page 1 you see an on-the-fly rendered preview (no pre-rendered static image) that is also a link; tap it --> go to the AR view. My point is that requiring a pre-rendered static image is an unnecessary complication, since the browser can obviously render this preview itself, as method 1 shows. Paolo
Posted by pberto. Last updated.
Post not yet marked as solved
0 Replies
148 Views
I am trying to play a specific skeletal animation on my 3D character (loaded from a USDZ file). The animation is also in a USDZ file. I tried the following:

```swift
Entity.loadAsync(contentsOf: Bundle.main.url(forResource: "Character", withExtension: "usdz")!)
    .append(Entity.loadAsync(contentsOf: Bundle.main.url(forResource: "Animation", withExtension: "usdz")!))
    .collect()
    .sink(receiveCompletion: {
        if case .failure(let error) = $0 { print(error) }
    }, receiveValue: { data in
        let character = data[0]
        self.anchorEntity?.addChild(character)
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            let animationEntity = data[1]
            animationEntity.transform.matrix = character.transform.matrix
            if let animation = animationEntity.availableAnimations.first {
                character.playAnimation(animation, startsPaused: false)
            }
        }
    })
    .store(in: &self.cancellables)
```

I am seeing these messages in the console:

```
[Animation] Invalid bind path: Ann_Body_anim_Neutral.RootNode.root.Root_M_bnd.Spine1_M_bnd.Spine2_M_bnd.Spine3_M_bnd.Chest_M_bnd.Scapula_R_bnd.Shoulder_R_bnd.Elbow_R_bnd.Transform.transform
[Animation] Invalid bind path: Ann_Body_anim_Neutral.RootNode.root.Root_M_bnd.Spine1_M_bnd.Spine2_M_bnd.Spine3_M_bnd.Chest_M_bnd.Scapula_R_bnd.Shoulder_R_bnd.ShoulderPart1_R_bnd.Transform.transform
[Animation] Invalid bind path: Ann_Body_anim_Neutral.RootNode.root.Root_M_bnd.Spine1_M_bnd.Spine2_M_bnd.Spine3_M_bnd.Chest_M_bnd.Scapula_R_bnd.Shoulder_R_bnd.ShoulderPart2_R_bnd.Transform.transform
...
```

It seems that the joint/node hierarchy differs between the animation file and the character file. Is there any way to fix this in code? If not, how can I make it work? I receive the animation as an FBX file, convert it to glTF with the fbx2gltf tool, and then convert the glTF to USDZ with usdzconvert.

```swift
let task = Process()
task.executableURL = Self.fbx2gltfBinURL
task.arguments = [
    "-i", url.path,
    "-o", temporaryURL.path,
    "-b",
    "--blend-shape-normals",
    "--blend-shape-tangents",
    "--fbx-temp-dir", temporaryDirectory.path,
]
try task.run()
```
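One way to narrow this down is to confirm whether the two files actually share the same skeleton. A minimal diagnostic sketch (the helper names here are hypothetical, not from the post), assuming RealityKit's ModelEntity.jointNames is available, that prints the joint paths of both hierarchies so you can spot extra prefixes or missing joints that would produce the "Invalid bind path" errors:

```swift
import RealityKit

// Hypothetical helper: recursively collect every ModelEntity in a hierarchy.
func modelEntities(in entity: Entity) -> [ModelEntity] {
    var result: [ModelEntity] = []
    if let model = entity as? ModelEntity { result.append(model) }
    for child in entity.children {
        result.append(contentsOf: modelEntities(in: child))
    }
    return result
}

// Print the joint names of both hierarchies. If the paths differ (an extra
// "RootNode" level, different joint prefixes, missing joints), the animation
// cannot bind to the character's skeleton.
func compareSkeletons(character: Entity, animation: Entity) {
    let characterJoints = modelEntities(in: character).flatMap(\.jointNames)
    let animationJoints = modelEntities(in: animation).flatMap(\.jointNames)
    print("character joints:", characterJoints)
    print("animation joints:", animationJoints)
}
```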
Posted. Last updated.
Post not yet marked as solved
0 Replies
155 Views
I'm using the usdzconvert command to convert an Alembic (.abc) file to USDZ, and I get this error. An OBJ file converts to USDZ fine.

```
Input file: cube.abc
Traceback (most recent call last):
  File "/Applications/usdpython/usdzconvert/usdzconvert", line 859, in <module>
    errorValue = main()
  File "/Applications/usdpython/usdzconvert/usdzconvert", line 854, in main
    return tryProcess(sys.argv[1:])
  File "/Applications/usdpython/usdzconvert/usdzconvert", line 809, in tryProcess
    ret = process(argumentList)
  File "/Applications/usdpython/usdzconvert/usdzconvert", line 665, in process
    usdStage = Usd.Stage.Open(srcPath)
pxr.Tf.ErrorException:
  Error in 'pxrInternal_v0_22__pxrReserved__::SdfLayer::OpenLayerAndUnlockRegistry' at line 3257 in file /Users/sergei/repos/USD/pxr/usd/sdf/layer.cpp : 'Cannot determine file format for @cube.abc:SDF_FORMAT_ARGS:target=usd@'
  Error in 'pxrInternal_v0_22__pxrReserved_::UsdStage::Open' at line 879 in file /Users/sergei/repos/USD/pxr/usd/usd/stage.cpp : 'Failed to open layer @cube.abc@'
```
Posted by Senchoi. Last updated.
Post not yet marked as solved
1 Reply
204 Views
I have two USDZ files: the Cube was created in Blender and is 10 cm on a side, and the RetroTV was provided by Apple in the AR Quick Look Gallery. The issue is that after loading them the TV shows at scale, but the cube is too small to be seen. From what I understood, and saw in the WWDC talks, the unit of measurement in RealityKit is the meter, so in the case of the Cube 0.1 meters is 10 cm. But that would mean the TV is supposed to be 84 m wide, and that can't be right, especially since the model comes directly from Apple. I also tried converting both to USDA, and both of them have metersPerUnit = 1. The only way I could make it work is to load the cube entity into RealityKit and then change its scale to [1, 1, 1], but if I do the same to the TV it becomes huge. What am I missing here? What's wrong?
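A quick way to see what RealityKit actually thinks each asset's size is: load both files and print their world-space bounds and scale. A minimal sketch (the asset names are assumed from the post):

```swift
import RealityKit

// Compare the real-world size RealityKit assigns to each asset after loading.
func printExtents() throws {
    let cube = try Entity.load(named: "Cube")      // Blender export
    let tv = try Entity.load(named: "RetroTV")     // AR Quick Look Gallery asset
    // visualBounds(relativeTo: nil) returns world-space bounds, in meters.
    print("cube extents:", cube.visualBounds(relativeTo: nil).extents)
    print("tv extents:", tv.visualBounds(relativeTo: nil).extents)
    // A non-unit scale here would explain why one asset needs scale = [1, 1, 1].
    print("cube scale:", cube.scale(relativeTo: nil))
    print("tv scale:", tv.scale(relativeTo: nil))
}
```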
Posted by sGerli. Last updated.
Post not yet marked as solved
1 Reply
254 Views
I'm trying to convert a model from Blender to USDZ, and I am having trouble getting the alpha map to show transparency properly on the hair and eyelashes. The hair ends up too transparent, with a white sheen over the surface. It is not just the hair: the eyelashes and hairpiece are also more transparent than they are intended to be. I have tried different file formats for the opacity map, but it seems Reality Converter can only read PNG or JPEG. I have also tried using a transparent (pre-multiplied?) PNG as the diffuse map, and that doesn't work either.
Posted. Last updated.
Post not yet marked as solved
5 Replies
503 Views
I am trying the new photogrammetry command-line sample code provided by Apple from this link and ended up with the error Type 'PhotogrammetrySession' has no member 'isSupported'. The documentation given here does include that Boolean member. Specs are below: macOS Monterey 12.5, MacBook Pro 2021 M1 Max, 32 GB memory, Xcode Version 13.4.1 (13F100). Any help would be appreciated!
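If the SDK you are building against does not expose PhotogrammetrySession.isSupported, one possible stand-in is a Metal-based hardware check along the lines of what earlier Object Capture sample code used. A sketch only; the exact thresholds below are assumptions, not a documented requirement:

```swift
import Metal

// Capability check as a substitute for PhotogrammetrySession.isSupported:
// require a non-low-power GPU with barycentric coordinate support, a large
// working set, and ray-tracing support.
func supportsObjectCapture() -> Bool {
    let reconstructionCapable = MTLCopyAllDevices().contains { device in
        !device.isLowPower &&
        device.areBarycentricCoordsSupported &&
        device.recommendedMaxWorkingSetSize >= UInt64(4e9)
    }
    let rayTracingCapable = MTLCopyAllDevices().contains { $0.supportsRaytracing }
    return reconstructionCapable && rayTracingCapable
}
```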
Posted by viknesh. Last updated.
Post not yet marked as solved
0 Replies
195 Views
Hi, I'm trying to open a glTF file in Reality Converter and the texture maps come out wrong. When I import the model as FBX the texture mapping is right (but unfortunately the model quality suffers). Does anyone have a clue what's going on?
Posted by Ladde. Last updated.
Post not yet marked as solved
0 Replies
204 Views
I'm loading a .usdz file into my view and so far it works fine, except that when I load multiple objects into my scene the app crashes with the message: Terminated due to memory issue. I'm using Apple's default .usdz samples. It works with up to 8-10 instances and then crashes.
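One approach worth trying, assuming the same sample model is placed several times: load the USDZ once and clone it for each additional placement instead of reloading the file. A minimal sketch (the asset name and anchor are assumptions):

```swift
import RealityKit

// Load the model once, then clone it for every placement so the instances
// share loaded resources instead of each carrying a full copy of the file.
func placeCopies(on anchor: AnchorEntity) throws {
    let template = try ModelEntity.loadModel(named: "toy_robot_vintage") // asset name assumed
    for index in 0..<10 {
        let copy = template.clone(recursive: true)
        copy.position.x = Float(index) * 0.2   // spread the copies out along x
        anchor.addChild(copy)
    }
}
```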
Posted by Dhruvi. Last updated.
Post not yet marked as solved
0 Replies
217 Views
Is it possible to add haptic feedback to a USDZ in Reality Composer? For example, if I want the phone to produce a single haptic tap after a 3D model of a block is knocked over in augmented reality, is that possible, and if so, how? If it is not possible within Reality Composer, can haptic feedback be added to a USDZ some other way? For example, if I have a USDZ model of a button in an augmented reality view, can I have the phone produce a single haptic tap when the button model is pushed? If so, how can I accomplish this? Thanks in advance for your help!
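One possible route outside Reality Composer itself: handle the tap in code on the hosting ARView and fire a system haptic when the tapped entity is the model. A sketch, assuming a RealityKit ARView and that the model has a collision component so hit-testing can find it:

```swift
import UIKit
import RealityKit

// Installs a tap recognizer on the ARView and plays a haptic tap when an
// entity (with a CollisionComponent) is hit under the touch point.
final class HapticTapHandler: NSObject {
    private let feedback = UIImpactFeedbackGenerator(style: .medium)

    func install(on arView: ARView) {
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        guard let arView = gesture.view as? ARView else { return }
        let point = gesture.location(in: arView)
        // entity(at:) returns the topmost entity with collision under the point.
        if arView.entity(at: point) != nil {
            feedback.impactOccurred()
        }
    }
}
```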
Posted by KDP. Last updated.
Post not yet marked as solved
0 Replies
236 Views
Hi, I am working on AR development using https://github.com/ProjectDent/ARKit-CoreLocation (with ARSCNView), and development has progressed to some extent. We are using a USDZ file with animations, but the animations play abnormally. The example USDZ files provided by Apple work normally; the problem only appears in the USDZ files we create ourselves. If we use an Apple-provided USDZ in the same app, it works normally again. Is there something I have to do when creating the 3D file to make its animation work like the USDZ animations provided by Apple? We are stuck on this issue and are constantly looking for information. I hope to get some help here.
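A diagnostic that sometimes helps: load both the custom USDZ and one of Apple's sample files with SceneKit and list which nodes actually carry animations, to see whether the export pipeline dropped or renamed them. A small sketch:

```swift
import SceneKit

// Print every node in the USDZ that has animations attached, so the custom
// file can be compared against one of Apple's working sample files.
func dumpAnimations(in url: URL) throws {
    let scene = try SCNScene(url: url, options: nil)
    scene.rootNode.enumerateHierarchy { node, _ in
        if !node.animationKeys.isEmpty {
            print(node.name ?? "<unnamed>", node.animationKeys)
        }
    }
}
```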
Posted by vkeldh. Last updated.
Post marked as solved
6 Replies
427 Views
I've been trying to run multiple photogrammetry sessions in parallel on different threads, but I keep getting this error: [Photogrammetry] queueNextProcessingBatchIfNeeded(): Already running a job... not starting new one. This happens even though session.isProcessing returns false. Is there someone out there who can help with this issue, please? Thanks a lot.
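On the assumption that only one photogrammetry job can run at a time in a process (which the log message suggests), one workaround is to serialize the jobs and wait for each session to finish before starting the next. A sketch, not a documented pattern:

```swift
import RealityKit

// Run each reconstruction to completion before creating the next session.
func processSequentially(jobs: [(input: URL, output: URL)]) async throws {
    for job in jobs {
        let session = try PhotogrammetrySession(input: job.input)
        try session.process(requests: [.modelFile(url: job.output, detail: .medium)])
        for try await output in session.outputs {
            // Stop waiting once this job reports completion, then move on.
            if case .processingComplete = output { break }
        }
    }
}
```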
Posted by Sadafiou. Last updated.
Post not yet marked as solved
0 Replies
325 Views
If I change some settings of a USDZ inside Xcode, I get an error saying the usdz file can't be saved automatically. Same error on two machines. Any ideas?
Posted by Kai_S. Last updated.
Post not yet marked as solved
1 Reply
445 Views
Hi, in our workflow, 3D objects of type glTF are provided to the iOS app via API endpoints. Using a different file type is not an option here. Inside the iOS app, the glb file is converted to an .scn file using a third-party framework. The nodes in the converted .scn file look as expected and match the original glb file. In the second step, the .scn file should be converted to a USDZ file for use in RealityKit. As far as I know, there is only one way to convert .scn to USDZ inside an iOS app, which is the undocumented approach of giving the destination URL a .usdz extension and calling write(to:options:delegate:progressHandler:):

```swift
let path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("model.usdz")
sceneView.scene.write(to: path, options: nil, delegate: nil, progressHandler: nil)
```

In the converted USDZ file, all the one-sided (transparent) materials are changed to non-transparent materials. We tested with a car object, and all the window glass became opaque after converting. On the other hand, if we use the Reality Converter app on macOS to convert the glb file directly to USDZ, every node converts as expected and it works fine. Is there any workaround for that issue? Or any upcoming update to let us use glb files in the app or successfully convert them to USDZ inside the iOS app? Thanks
Posted. Last updated.
Post not yet marked as solved
1 Reply
318 Views
I created a scene in Reality Composer which includes lots of different models. I then load the scene and try to find each model and place it separately. The following is my UITapGestureRecognizer handler:

```swift
guard let loadModel = loadedScene.findEntity(named: selectedPlant.selectedModel) else { return }
loadModel.setPosition(SIMD3(0.0, 0.0, 0.0), relativeTo: nil)
```

My confusion is this: once you use findEntity(named:) and place that model on the detected plane, it cannot be retrieved again. When I call this again to place a second model after placing the first one, findEntity(named:) returns nil. Does anyone know the mechanism behind this? I thought loading the scene would create a template in memory, but instead it seems to create a sort of list that pops an entity out on every call.
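If the goal is to place the same model more than once, one approach is to treat the entity found in the loaded scene as a template and add clones of it, so the original stays in place and findEntity(named:) keeps working. A sketch mirroring the handler above (the anchor parameter is a stand-in for whatever anchor the app places onto):

```swift
import RealityKit

// Clone the found entity instead of reparenting the original, so the next
// findEntity(named:) call can still find it in the loaded scene.
func placeInstance(named name: String, from loadedScene: Entity, on planeAnchor: Entity) {
    guard let template = loadedScene.findEntity(named: name) else { return }
    let instance = template.clone(recursive: true)
    instance.setPosition(SIMD3<Float>(0, 0, 0), relativeTo: nil)
    planeAnchor.addChild(instance)
}
```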
Posted by daiyukun. Last updated.
Post marked as solved
1 Reply
940 Views
I've been trying to build and use the USD command-line toolchain from source on an M1/arm64 Mac. While I've gotten it to compile, I get a crash as Python attempts to load the USD internals:

```
------------------------------ python terminated -------------------------------
python crashed. FATAL ERROR: Failed axiom: ' Py_IsInitialized() '
in operator() at line 148 of /Users/heckj/src/USD/pxr/base/tf/pyTracing.cpp
writing crash report to [ Sparrow.local:/var/folders/8t/k6nw7pyx2qq77g8qq_g429080000gn/T//st_python.79467 ] ... done.
---------------------------------------------------------------------------------
```

I found a number of issues on GitHub which hint that this is a known, ongoing problem:
https://github.com/PixarAnimationStudios/USD/issues/1620
referenced issue: https://github.com/PixarAnimationStudios/USD/issues/1466
referenced issue: https://github.com/PixarAnimationStudios/USD/issues/1736
There are some suggestions, but no clear resolutions. I tried the build commands referenced in the release details for USDPython (available on developer downloads), and with some fiddling got it to compile:

```
python build_scripts/build_usd.py \
  --build-args TBB,arch=arm64 \
  --python --no-imaging --no-usdview \
  --prefer-safety-over-speed \
  --build-monolithic /opt/local/USD
```

But I'm repeatedly hitting the crash where Python isn't initializing. I've tried Python 3 from Homebrew, an Anaconda version of Python (Intel, through Rosetta), the Python included with Xcode (universal binary), and most recently the miniforge3 arm-native Python recommended on the Metal for TensorFlow page. With the warnings about Python disappearing from the default install, I'd like to get a mechanism in place that I'm comfortable with for installing the USD tools, ideally native to the M1 processor.
Posted by heckj. Last updated.
Post not yet marked as solved
1 Reply
398 Views
Hello, I have a USDZ file placed in a SceneView, and I am searching for a way to interact with parts of that USDZ inside my app. For example, I have a person made up of 6 parts (arm_right, arm_left, leg_left, leg_right, body and head) and I would like to select arm_left and see its width. How can I achieve this inside my app?
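Assuming the USDZ keeps its part names after import, one way to get a width in SceneKit is to look the part up by name and read its bounding box. A small sketch (the part names come from the post; the scale handling is a simplification that ignores parent scaling):

```swift
import SceneKit

// Find a named part in the scene and estimate its width from its bounding box.
func widthOfPart(named name: String, in scene: SCNScene) -> Float? {
    guard let node = scene.rootNode.childNode(withName: name, recursively: true) else { return nil }
    let (minBounds, maxBounds) = node.boundingBox
    // Local-space width scaled by the node's own scale, in scene units (meters for USDZ).
    return Float(maxBounds.x - minBounds.x) * Float(node.scale.x)
}
```

Calling widthOfPart(named: "arm_left", in: sceneView.scene!) would then report the arm's width, provided that node name survived the USDZ import.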
Posted. Last updated.
Post not yet marked as solved
2 Replies
1.3k Views
Hello! I am currently developing a new app and I have some doubts about the USDZ file format. I work in Archicad and export a COLLADA (.dae) file. I then import the COLLADA file into Blender and everything looks good; the problem starts when I convert to glTF or USDZ: the details seem to be lost and my materials do not show the nice textures I set up inside Blender. I want lightweight, detailed, and very realistic 3D visualization for VR on the phone.
Posted. Last updated.
Post not yet marked as solved
2 Replies
394 Views
In this video, Apple demonstrates a GUI app for Object Capture that uses the interactive workflow. Is there anywhere we can download the code for this app? I'm referring to the GUI on top of the command-line app they provide, which lets you edit the bounding box of the preview model to do more targeted photogrammetry processing before the higher-quality reconstruction is done. Command-line app: https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app Looking for the GUI app they made that enables the interactive workflow shown in this video: https://developer.apple.com/videos/play/wwdc2021/10076/
Posted. Last updated.
Post not yet marked as solved
1 Reply
488 Views
Hi all, I have a requirement to convert USDZ to IGES files in a Swift app. I searched on Google but didn't find any library that supports this. Do you know of a library that supports it? Thanks all.
Posted by thuannh4. Last updated.