USDZ is a 3D file format that can be displayed as AR content on a website.

USDZ Documentation

Posts under USDZ tag

77 Posts
Post not yet marked as solved
5 Replies
1.3k Views
One of people's favourite features of AR Quick Look is video playback. Aside from the feature itself, it offers familiarity: people have consumed video for decades, so it places AR right next to other common formats, which is awesome! Previously, if a USDZ's animation was longer than 10 seconds, the playback controls would appear. This rule was recently changed to make the controls opt-in via USDZ metadata set with the Python toolkit. Unfortunately, most designers are still not using the Python toolkit, so the feature is now hidden in most cases, which is too bad. It would be great to be able to set that metadata from Reality Converter and Reality Composer. Better yet, exposing it as an HTML fragment identifier, similar to "allowsContentScaling", would be super helpful.
Posted
by
Post not yet marked as solved
2 Replies
823 Views
How do you get a video, such as an .mp4 or .mov, converted or embedded into a USDZ file?
Posted
by
Post not yet marked as solved
2 Replies
966 Views
When I import the robot.fbx file from the Apple developer documentation sample found here: https://developer.apple.com/sample-code/ar/Biped-Robot.zip into Reality Converter, export it to USDZ, and bring the USDZ file into Xcode 12 beta or any earlier version, the robot character loses its skin bindings and the mesh has no skeleton attached. Has anyone experienced this? Have you been able to successfully convert a rigged FBX model to a USDZ model with all bones connected? I tried this using Reality Converter beta 3, and I did not edit the FBX robot character provided with Apple's documentation sample.
Posted
by
Post not yet marked as solved
1 Reply
678 Views
Hi! I've been working with AR Quick Look a lot recently and find it really useful. However, when I'm combining two .usdz files into one, the experience isn't as great. I'm working with 3D models that are supposed to be attached to vertical surfaces, and one by one they work flawlessly. But as soon as I add more models into the same file, the objects won't "stick" as close to the wall as they do when they are separate. It's as if some sort of "margin" is applied. To create nested .usdz files, I use Apple's command-line tool:

$ usdzcreateassetlib outputFile.usdz asset1.usdz [asset2.usdz [...]]

Any idea why this might be the case? Thanks!
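One way to start debugging a combined package is to inspect what `usdzcreateassetlib` actually wrote. A USDZ package is a zip archive, so Python's standard library can list its contents without any USD tooling; a minimal sketch (the file name is hypothetical):

```python
import zipfile

def list_usdz_contents(path):
    """List (name, size) for every file packed inside a USDZ package.

    USDZ is a zip archive, so the standard zipfile module can read it.
    """
    with zipfile.ZipFile(path) as archive:
        return [(info.filename, info.file_size) for info in archive.infolist()]

# Hypothetical usage:
# for name, size in list_usdz_contents("outputFile.usdz"):
#     print(name, size)
```

For a package produced by `usdzcreateassetlib` you would expect to see a small generated asset-library layer alongside the unchanged input .usdz files, which at least confirms whether each asset survived the merge intact.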
Posted
by
Post not yet marked as solved
4 Replies
3k Views
Dear community, I downloaded the USDPython 0.64 package on my new MacBook Pro with M1 (13-inch) running macOS Big Sur 11. After installing the .pkg, I opened the folder in my Applications folder and double-clicked usd.command. I am on zsh, and in my .zshrc I inserted the paths as follows:

export PATH=$PATH:/Users/<myusername>/Applications/usdpython/USD
export PYTHONPATH=$PYTHONPATH:/Users/<myusername>/Applications/usdpython/USD/lib/python
export PATH=$PATH:/Users/<myusername>/Applications/usdpython/usdzconvert

Are these correct? I am not sure whether the Macintosh HD part should be /Users/<myusername>. I keep getting: "Error: failed to import pxr module. Please add path to USD Python bindings to your PYTHONPATH." Any help would be much appreciated.
Posted
by
Post not yet marked as solved
1 Reply
952 Views
I use usdzaudioimport to embed a simple audio file in my USDZ, but in Quick Look (iOS or macOS) the audio is never played. What I've tried:

usdzaudioimport ./myfile.usdz ./myfile.usda -a /test audio.mp3 -auralMode notSpatial -playbackMode loopFromStart
usdzip myfile.usdz myfile.usda ./0/*

where ./0 is the asset folder of myfile.usda. I tried the same with usdc:

usdzaudioimport ./myfile.usdz ./myfile.usdc -a /test audio.mp3 -auralMode notSpatial -playbackMode loopFromStart
usdzip myfile.usdz myfile.usdc ./0/*

I also tried usdz directly, without converting to usd(a/c):

usdzaudioimport ./myfile.usdz -a /test audio.mp3 -auralMode notSpatial -playbackMode loopFromStart

I tried both m4a and mp3. I also tried the sample cube from Reality Composer, adding my audio.mp3 as a behavior at the start of the scene and exporting to USDZ for Quick Look. None of these solutions worked: I'm not able to play any audio in an AR Quick Look USDZ file at the start of the scene.
Posted
by
Post not yet marked as solved
2 Replies
2.2k Views
In keynote #10076, it was mentioned at the 3:00 mark that USDZ, USDA, and OBJ are supported, but I've not been able to find details on how to make the sample command-line app export .obj files; it only exports .usdz. Does anyone have any information on that? Or does anyone have tips on how to convert a .usdz to .obj? It doesn't seem to be very easy to do.
Posted
by
Post not yet marked as solved
2 Replies
593 Views
I've been watching the 2019 developer video "Building Apps with RealityKit" and working along with it. It shows how to create a custom entity and how to load entities from .usd files. How do you either load a .usd file directly as a custom entity, convert an Entity to a custom entity, or move the model hierarchy from an Entity into a custom entity? I assume there's a way to do this.
Posted
by
Post marked as solved
1 Reply
449 Views
I am very new to USD. I have written a simple Python script to generate a basic USD file containing a blue sphere, and this part works fine. However, I then want to create a USDZ package from this file, so I used the UsdUtils CreateNewUsdzPackage function, which results in a pink and purple mess instead of the blue from the original file. I would like to know why this is occurring and how to prevent it. The simple script is as follows:

from pxr import Usd, UsdGeom, UsdUtils

stage = Usd.Stage.CreateNew('HelloWorld.usd')
xform = stage.DefinePrim('/hello', 'Xform')
sphere = stage.DefinePrim('/hello/world', 'Sphere')

extentAttr = sphere.GetAttribute('extent')
radiusAttr = sphere.GetAttribute('radius')
radiusAttr.Set(2)
extentAttr.Set(extentAttr.Get() * 2)

sphereSchema = UsdGeom.Sphere(sphere)
color = sphereSchema.GetDisplayColorAttr()
color.Set([(0, 0, 1)])

stage.GetRootLayer().Save()
UsdUtils.CreateNewUsdzPackage('HelloWorld.usd', 'HelloWorldZ.usdz')

Thank you
Posted
by
Post not yet marked as solved
0 Replies
473 Views
Hello! I want to build an app that lets devices with the LiDAR Scanner scan their environment and share their scans with one another. As of now, I can create the mesh using the LiDAR Scanner and export it as an OBJ file. However, I would like the ability to map textures and colors onto this model. How would one go about capturing the real-world texture and placing it onto the OBJ model? Thank you!
Posted
by
Post marked as solved
2 Replies
531 Views
Hi all, this is my first time trying to add an AR preview into an app (also my first time using the file system). I have been trying to implement a solution similar to the one explained here: https://developer.apple.com/forums/thread/126377. However, one key difference is that my USDZ model is not in my main bundle, as it is generated and downloaded from an external source at run time. I was wondering whether it is possible to display a file stored in the app's documents or caches directory, and how this is done. The file is downloaded and stored in the caches directory as follows:

class ModelFetcher: NSObject {
    var modelUrl: URL?

    func generateModel() {
        guard let url = URL(string: "http://127.0.0.1:5000/model.usdz") else { return }
        let urlSession = URLSession(configuration: .default, delegate: self, delegateQueue: OperationQueue())
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        let downloadTask = urlSession.downloadTask(with: request)
        downloadTask.resume()
    }
}

extension ModelFetcher: URLSessionDownloadDelegate {
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didFinishDownloadingTo location: URL) {
        print("File Downloaded Location - ", location)

        guard let url = downloadTask.originalRequest?.url else {
            return
        }
        let docsPath = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
        let destinationPath = docsPath.appendingPathComponent(url.lastPathComponent)

        try? FileManager.default.removeItem(at: destinationPath)
        do {
            try FileManager.default.copyItem(at: location, to: destinationPath)
            self.modelUrl = destinationPath
            print("File moved to: \(modelUrl?.absoluteURL)")
        } catch let error {
            print("Copy Error: \(error.localizedDescription)")
        }
    }
}

And the Quick Look preview looks like this:

import SwiftUI
import QuickLook
import ARKit

struct ARQuickLookView: UIViewControllerRepresentable {
    var allowScaling: Bool = true

    func makeCoordinator() -> ARQuickLookView.Coordinator {
        Coordinator(self)
    }

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ controller: QLPreviewController,
                                context: Context) {
        // nothing to do here
    }

    class Coordinator: NSObject, QLPreviewControllerDataSource {
        let parent: ARQuickLookView
        let destinationPath = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0].appendingPathComponent("model.usdz")
        private lazy var fileURL: URL = destinationPath

        init(_ parent: ARQuickLookView) {
            self.parent = parent
            super.init()
        }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
            return 1
        }

        func previewController(
            _ controller: QLPreviewController,
            previewItemAt index: Int
        ) -> QLPreviewItem {
            let fileURL = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0].appendingPathComponent("model.usdz")
            print(fileURL)
            let item = ARQuickLookPreviewItem(fileAt: fileURL)
            print(item)
            item.allowsContentScaling = parent.allowScaling
            return item
        }
    }
}

struct ARQuickLookView_Previews: PreviewProvider {
    static var previews: some View {
        ARQuickLookView()
    }
}

However, I get an error reading "Unhandled item type 13: contentType is: (null) #PreviewItem". I know the file is actually located at this location, as I have opened it in Finder. Any help would be much appreciated. Thanks in advance, Louis
Posted
by
Post not yet marked as solved
1 Reply
707 Views
We are able to generate a 3D mesh model, but it appears white, as we don't get texture files in the .mtl file. We found ways to generate a textured model from a set of images at the links below:

https://developer.apple.com/documentation/realitykit/creating_3d_objects_from_photographs
LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model

However, photogrammetry (the Object Capture API) works only on Mac; we want to achieve this on iPhone and iPad. We can see this happening in the "3D Scanner App" and "Polycam" apps. Please suggest how we can resolve this. Thanks in advance.
Posted
by
Post marked as solved
2 Replies
294 Views
Hi all, we have a working model with a transparent front (to simulate glass); it successfully reflects the surrounding border while remaining transparent. However, there seem to be some glitches at various angles: some objects behind the glass disappear when viewing it from the sides, and also when viewed from a low angle. They reappear whenever the AR view gets in front of the object again.
Posted
by
Post not yet marked as solved
0 Replies
280 Views
I am trying to view a house I designed in AR at 1:1 scale, and it needs to be location specific, so I think I need to use an image target to coordinate the positioning. How would this be done? I've heard to use Reality Composer, but I think this is only on iOS and I use a Mac. Do I need to use Xcode? Can I avoid Unity? Thank you.
Posted
by
Post not yet marked as solved
1 Reply
463 Views
Hello, I downloaded the Pixar kitchen scene from Pixar and opened the rug asset as a USD file. It's about 26k polygons (unsubdivided mesh) and only 352 KB in size. Converting the rug to a SceneKit or OBJ file increases the file size to 3.7 MB, about 10x larger! How did Pixar manage to optimize and export a 26k-polygon model at 352 KB? Is this only possible using their proprietary Presto software? Are there special settings we need to use to export models created in Maya or 3ds Max with the same file-size optimization? The rug model is 3.7 MB when exported as a USD file from 3ds Max.
Posted
by
Post not yet marked as solved
0 Replies
272 Views
We still have an issue with AR transparency. We tried several different versions of the mesh, but still no luck, both solid and as a single face, following the recommendation on the forum. The problem is shown in the video: objects within the object randomly disappear, and when viewed from below, all objects inside disappear. https://youtu.be/YKrkHiZYJP8
Posted
by
Post not yet marked as solved
0 Replies
377 Views
Hi, I have two questions about USDZ/USDA, Xcode, and HTML5. I have created an object in Reality Composer and animated it. The actions are started via notifications. This works perfectly when I open the Reality Composer file (.rcproject) in Xcode and program the buttons accordingly. Now my questions are: Is there a way to open USDZ/USDA files in Xcode instead of the .rcproject file? If so, how does this work? And is there a way to create a web page with buttons in HTML5 and trigger the actions in a USDZ/USDA file from those buttons, using the notifications? Thanks a lot
Posted
by
Post not yet marked as solved
2 Replies
633 Views
Hi all, I am able to create USDZ files and load them in AR Quick Look. However, for prims that contain displayColor primvar values, they all appear incorrect (pink) when opened via AR Quick Look. To repro, create a usdz from this (size too big for attaching a usdz directly):

#usda 1.0
(
    defaultPrim = "Origin"
)

def Xform "Origin" (
    apiSchemas = ["GeomModelAPI"]
    kind = "component"
)
{
    string label = "Golden Krone Hotel"
    string modern_name

    def Sphere "Volume" (
        doc = "This is the main volume for the Golden Krone"
    )
    {
        color3f[] primvars:displayColor = [(0.721156, 0.0030596028, 0.27578437)] (
            elementSize = 1
            interpolation = "constant"
        )
        double radius = 2
        float xformOp:rotateX:tilt = 12
        float xformOp:rotateZ:spin.timeSamples = {
            0: 0,
            192: 1440,
        }
        double3 xformOp:translate = (0, 2, 0)
        uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:rotateZ:spin", "xformOp:rotateX:tilt"]
    }

    def Cube "TopBackLeft" (
        doc = "This is the volume for the top back left section of the Golden Krone"
    )
    {
        color3f[] primvars:displayColor = [(0.8052342, 0.0014515291, 0.19331428)] (
            elementSize = 1
            interpolation = "constant"
        )
        double3 xformOp:translate = (-2, 6, -2)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def Capsule "TopFrontLeft" (
        doc = "This is the volume for the top front left section of the Golden Krone"
    )
    {
        color3f[] primvars:displayColor = [(0.007277872, 0.58201957, 0.41070253)] (
            elementSize = 1
            interpolation = "constant"
        )
        double3 xformOp:translate = (-2, 6, 2)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def Cylinder "TopBackRight" (
        doc = "This is the volume for the top back right section of the Golden Krone"
    )
    {
        color3f[] primvars:displayColor = [(0.20229208, 0.39510167, 0.40260625)] (
            elementSize = 1
            interpolation = "constant"
        )
        double3 xformOp:translate = (2, 6, -2)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def Cone "TopFrontRight" (
        doc = "This is the volume for the top front right section of the Golden Krone"
    )
    {
        color3f[] primvars:displayColor = [(0.08033847, 0.7920769, 0.12758464)] (
            elementSize = 1
            interpolation = "constant"
        )
        double3 xformOp:translate = (2, 6, 2)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}

Opening the file in usdview, SideFX Houdini, or Autodesk Maya displays the expected colors in the viewport, but opening it in AR Quick Look on iOS shows everything pink. Searching the internet, I have not been able to find any information about why this happens, what the requirements are, or whether it is a known issue. Only a post on the usd-interest Google group from 2019 reported the same: https://groups.google.com/g/usd-interest/c/8hD2rzRboeg/m/PSgIPlLCBgAJ with the answer:

"And I was just told that ARQL does not currently support vertex displayColors in any capacity - neither without a bound material (as in the example I provided), nor as something read via a UsdPrimvarReader shader to feed to a UsdPreviewSurface... the only primvars currently supported in ARQL are texture coordinates for feeding a UsdUVTexture shader."

I can not bake textures, as my app is meant to work with display colors only. Does anyone have links to official help or information on this? I am also wondering whether support for displayColor, as in other USD renderers, will eventually be added.
Posted
by