When I load a certain USDZ file, it crashes 100% of the time. Why?
It crashes in the Simulator, but does not crash on Vision Pro.
-[MTLDebugDevice newBufferWithBytes:length:options:]:723: failed assertion `Buffer Validation
newBufferWith*:length 0x100fff80 must not exceed 256 MB.
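A plausible explanation (not confirmed): the Simulator's Metal device enforces a smaller maximum buffer length than the physical Vision Pro, so an asset that needs a buffer just over 256 MB trips validation only in the Simulator. A minimal sketch to log the limit at runtime:

import Metal

// Sketch: log the Metal buffer-size limit on the current device or Simulator.
if let device = MTLCreateSystemDefaultDevice() {
    print("maxBufferLength: \(device.maxBufferLength / (1024 * 1024)) MB")
}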
Hello!
I'm trying to play an animation with a toggle button. When the button is toggled the animation either plays forward from the first frame (.speed = 1) OR plays backward from the last frame (.speed = -1), so if the button is toggled when the animation is only halfway through, it 'jumps' to the first or last frame. The animation is 120 frames, and I want the position in playback to be preserved when the button is toggled - so the animation reverses or continues forward from whatever frame the animation was currently on.
Any tips on implementation? Thanks!
import SwiftUI       // needed for View, @State, and onChange
import RealityKit
import RealityKitContent

struct ModelView: View {
    var isPlaying: Bool

    @State private var scene: Entity? = nil
    @State private var unboxAnimationResource: AnimationResource? = nil

    var body: some View {
        RealityView { content in
            // Specify the name of the Entity you want
            scene = try? await Entity(named: "TestAsset", in: realityKitContentBundle)
            scene!.generateCollisionShapes(recursive: true)
            scene!.components.set(InputTargetComponent())
            content.add(scene!)
        }
        .installGestures()
        .onChange(of: isPlaying) {
            if isPlaying {
                // Play forward from the first frame.
                var playerDefinition = scene!.availableAnimations[0].definition
                playerDefinition.speed = 1
                playerDefinition.repeatMode = .none
                playerDefinition.trimDuration = 0
                let playerAnimation = try! AnimationResource.generate(with: playerDefinition)
                scene!.playAnimation(playerAnimation)
            } else {
                // Play backward from the last frame.
                var playerDefinition = scene!.availableAnimations[0].definition
                playerDefinition.speed = -1
                playerDefinition.repeatMode = .none
                playerDefinition.trimDuration = 0
                let playerAnimation = try! AnimationResource.generate(with: playerDefinition)
                scene!.playAnimation(playerAnimation)
            }
        }
    }
}
Thanks!
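One approach that may work (an untested sketch; it assumes AnimationPlaybackController's time property is settable, which is worth verifying): keep the controller returned by playAnimation(_:), read its time when the button is toggled, and seek the new controller so playback continues from the same frame. This would replace the onChange(of:) block above.

// Additional state alongside `scene`:
@State private var playbackController: AnimationPlaybackController? = nil

// Sketch of the toggle handler:
.onChange(of: isPlaying) {
    guard let scene, let animation = scene.availableAnimations.first else { return }
    var definition = animation.definition
    definition.speed = isPlaying ? 1 : -1
    definition.repeatMode = .none
    guard let resource = try? AnimationResource.generate(with: definition) else { return }

    let elapsed = playbackController?.time ?? 0   // clip position before the toggle
    let controller = scene.playAnimation(resource)
    // Seek so playback resumes mid-clip. Depending on how RealityKit interprets
    // `time` for negative speeds, plain `elapsed` may be needed here instead:
    controller.time = max(0, controller.duration - elapsed)
    playbackController = controller
}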
Hi everyone,
I'm looking for a way to convert an FBX file to USDZ directly within my iOS app. I'm aware of Reality Converter and the Python USDZ converter tool, but I haven't been able to find any documentation on how to do this directly within the app (assuming the user can upload their own file). Any guidance on how to achieve this would be greatly appreciated.
I've heard about Model I/O and SceneKit, but I haven't found much information on using them for this purpose either.
Thanks!
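For what it's worth, neither Model I/O nor SceneKit reads FBX, so an in-app conversion would need a third-party step (for example the Autodesk FBX SDK or assimp) to reach an intermediate format SceneKit understands. The final hop to USDZ can then be done on device with SCNScene's write(to:) method; a minimal sketch, assuming an OBJ intermediate:

import SceneKit

// Sketch: convert a SceneKit-readable file (e.g. an OBJ intermediate) to .usdz.
func convertToUSDZ(input: URL, output: URL) throws -> Bool {
    let scene = try SCNScene(url: input, options: nil)
    // `output` must use the .usdz extension; write(to:) infers the format from it.
    return scene.write(to: output, options: nil, delegate: nil, progressHandler: nil)
}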
I am creating a 3D model from multiple images using the photogrammetry session. Now, when the session generates an OBJ file and I measure the distance between two points, the distance is displayed sporadically in different units. Sometimes it's meters, then centimeters, or another unit altogether. How can I tell the photogrammetry session to always create the model in millimeters?
I have designed a 3D object and exported it as a USDZ. I also 3D printed said object. I want to use the object as a 3D trigger for an AR experience I am building. My question: is there a process that would let me take the 3D .usdz file and convert it to a .arobject or a .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I used the "scan" option when setting up my scene, but the "resolution"/fidelity seems really low, and the results I get are just mediocre.
I would love to take the 3D USDZ I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, or is there a process to do this? I am able to take the 3D scan I make in Reality Composer (which is exported as a .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other way: .usdz > .objcap or .arobject.
I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object I built and textured in Substance Painter. I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that would start a scene placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
I have various USDZ files in my visionOS app. Loading the USDZ files works quite well; I only have problems with positioning the 3D model. For example, one USDZ file is displayed directly above me. I can't move the model or perform any other actions on it, and if I sit down on a chair or stand up again, the 3D model automatically moves with me. This is my source code for loading the USDZ files:
import SwiftUI       // assumed imports, needed for View and @State
import RealityKit

struct ImmersiveView: View {
    @State var modelName: String
    @State private var loadedModel = Entity()

    var body: some View {
        RealityView { content in
            if let usdModel = try? await Entity(named: modelName) {
                print("====> \(modelName) : \(usdModel) <====")
                // Use the visual bounds to build a matching collision box.
                let bounds = usdModel.visualBounds(relativeTo: nil).extents
                usdModel.scale = SIMD3<Float>(1.0, 1.0, 1.0)
                usdModel.position = SIMD3<Float>(0.0, 0.0, 0.0)
                usdModel.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                usdModel.components.set(HoverEffectComponent())
                usdModel.components.set(InputTargetComponent())
                loadedModel = usdModel
                content.add(usdModel)
            }
        }
    }
}
For now I just want the 3D models from the USDZ files to be displayed correctly; moving them via gestures is step 2. What have I forgotten or done wrong?
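One thing to check (a guess, not a confirmed diagnosis): in a visionOS immersive space the world origin sits on the floor at the user's launch position, so a model placed at (0, 0, 0) ends up at your feet rather than in front of you. A minimal sketch that places the model at roughly eye line instead:

// Sketch: position the model 1 m up and 1.5 m in front of the immersive-space origin.
usdModel.position = SIMD3<Float>(0.0, 1.0, -1.5)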
I have a plane with a texture that was made in Blender and then exported from USDC to USDZ using Reality Converter.
The translucency looks 100% translucent in Reality Composer Pro, but it looks a bit like glass, with some reflection, in a RealityKit scene on visionOS.
Is there a material setting that I need to change?
This question is about USD animations playing correctly in macOS Preview but not with RealityKit on visionOS.
I have a USD file created with 3D Studio Max that contains mesh-based smoke animation:
https://drive.google.com/file/d/1L7Jophgvw0u0USSv-_0fPGuCuJtapmzo/view
(5.6 MB)
Apple's macOS 14.5 Preview app is able to play the animation correctly.
However, when a visionOS app uses RealityKit to load that same USD file on visionOS 2.0 beta 4, built with Xcode 16.0 beta 3 (16A5202i), and Entity.playAnimation is called, the animation does not play as expected.
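(For reference, the playback call follows the standard pattern; this is a sketch with an illustrative entity name, not the exact app code.)

// Sketch: load the USD scene and play its first available animation.
let smoke = try await Entity(named: "smoke_scene")   // illustrative name
if let animation = smoke.availableAnimations.first {
    smoke.playAnimation(animation.repeat())
}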
This same app is able to successfully play animation of a hierarchy of solid objects read from a different USD file.
When I inspect the RealityKit entities loaded from the USD file, the ground plane entity is a ModelEntity, as expected, but the smoke entity type is Entity, with no associated geometry.
Why is it that macOS Preview can play the animation in the file, but RealityKit cannot?
Thank you for considering this question.
When I run my visionOS app, RealityKitContent reports an error:
Tool terminated by signal 'Segmentation fault: 11'
It points to a USDZ model I imported, but the model displays normally in the scene and shows no damage. Why does this error occur? How can I check for and repair the problem?
I noticed that with the 4th betas of iOS 18 and visionOS 2, some USDZ models' texture mapping looks completely broken. The issue occurs only with a device, not with the Simulator. It's a regression, the models look fine with iOS 17.5.1 and visionOS 1.2.
The issue occurs when I load a model as an Entity in a RealityView on iOS or visionOS, or in a SwiftUI Model3D view on visionOS.
Has anyone seen this too? Is there a workaround?
I filed a bug report with a minimal example project, it's FB14473756.
Screenshot on Vision Pro device:
Screenshot on Vision Pro Simulator:
As the title suggests, clicking the "Export to SceneKit" button indeed converts a USD to .scn, but it removes the normals in the process if the mesh has blendshapes.
When I export the same file without any blendshapes/morph targets, the normals stay intact as expected.
If I try to create normals in the SceneKit editor (adding them as a new geometry source), Xcode crashes (whether or not there are blendshapes).
I've tried loading the resulting scene with the createNormalsIfAbsent option:
    let scene = try SCNScene(url: scnURL, options: [SCNSceneSource.LoadingOption.createNormalsIfAbsent: true])
but this doesn't change anything either.
I suppose this is a bug?
My last resort is to load my character without any blendshapes and then add the targets from a different scene.
Thanks for any insight!
seb
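In case it helps anyone else hitting this, one possible workaround (untested, and it likely loses the morph targets on the rebuilt geometry) is to regenerate the normals through Model I/O rather than the SceneKit editor:

import SceneKit
import SceneKit.ModelIO
import ModelIO

// Sketch: rebuild missing normals on an SCNGeometry via Model I/O.
func geometryWithNormals(from geometry: SCNGeometry) -> SCNGeometry {
    let mdlMesh = MDLMesh(scnGeometry: geometry)
    // Recompute smooth vertex normals; creaseThreshold 0 smooths everything.
    mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.0)
    return SCNGeometry(mdlMesh: mdlMesh)
}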
I was trying to load an Entity with Entity(named: sceneName, in: realityKitContentBundle), which works for many of my .usda files. But this time I got an error:
Error loading asset from scene PinballTable.usda: failed to load '7058602595919186152 Scene (RealityFileAsset)Bundle/RealityKitContent-RealityKitContent-resources/RealityKitContent.reality/Scene_14.compiledscene' (Asset provider load failed: type 'RealityFileAsset' -- Failed to load compiled data for asset path 'Scene_14.compiledscene', due to error: Failed to deserialize asset data.)
Any ideas on why this won't work? I checked the size of my .usda file: it's around 42 KB, so I don't think file size is the problem. Because this scene contains many .usda references, I suspect the bundle cannot locate the referenced files. So I exported the whole scene into a .usdz file, which comes to 118 KB. I wonder if the references could be the issue affecting the loading result, but this is all I have tried so far.
visionOS: visionOS beta 2 / Simulator 1.1 (neither works)
Xcode: 15.4 / 16.0 beta (neither works)
Based on info online, I'm under the impression we can add spatial audio to USDZ files using Reality Composer Pro; however, I've been unable to hear this audio outside of the preview audio in the scene inspector. Attached is a screenshot of how I've laid out the scene.
I see the 3D object fine on mobile and Vision Pro, but I can't get the audio to loop. I have ensured the audio file is in the scene and linked as the resource for the spatial audio node. Am I setting this up wrong, is it broken, or is saving audio back to USDZ simply not a feature? In the following link they note their USDZ could "play an audio track while viewing the model", but the model isn't there anymore.
Can someone confirm where I might be off, please?
For all the AVP devs out there: what cloud service are you using to load content in your app with extremely low latency? I tried CloudKit and it did not work well at all. Latency was super bad :/
Firebase looks like the most promising at this point??
Wish Apple would create an ultra low latency cloud service for streaming high quality content such as USDZ files and scenes made in Reality Composer Pro.
Hi All,
I am using RealityKit along with ARKit and SwiftUI to develop an app where I am augmenting a USDZ model of a complex geometry, like that of a car.
I have some other USDZ files with simple plane geometry that have material properties embedded within them, which I am also loading as model entities.
I want to traverse my car USDZ file so that I can pick the material from the simple USDZ file and apply it to the car as car paint. To do this, I know the name of the mesh holding the car paint as well as the name of the applied material.
I have tried to traverse the USDZ files using both RealityKit and SceneKit, but I have not succeeded in reaching the lowest mesh and copying the material properties to it.
With RealityKit, I have tried to get the instance data from the model entity via sourceModel?.model?.mesh.contents.instances, but this returns only the instance ID, model name, and transform.
Any help will be highly appreciated.
Thank you
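A traversal sketch that may help (names like "CarPaint" and "PaintSample" are assumptions for illustration): recurse over children looking for the ModelEntity that owns the target mesh, then copy the donor's materials array across.

import RealityKit

// Sketch: find a named ModelEntity anywhere in an entity hierarchy.
func findModelEntity(named name: String, in entity: Entity) -> ModelEntity? {
    if let model = entity as? ModelEntity, model.name == name {
        return model
    }
    for child in entity.children {
        if let found = findModelEntity(named: name, in: child) {
            return found
        }
    }
    return nil
}

// Usage (illustrative names): copy the paint material from the donor plane to the car.
if let carPaint = findModelEntity(named: "CarPaint", in: carEntity),
   let donor = findModelEntity(named: "PaintSample", in: donorEntity) {
    carPaint.model?.materials = donor.model?.materials ?? []
}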
I used other software to export USDZ files, hoping to further adjust the PBR and other parameters of the models in Reality Composer Pro. Because the USDZ is a single unit, I cannot use the mouse to select a specific model inside it in the viewport; I have to find the models I want to modify one by one in the list on the left.
This way of working is too inefficient. Is there a better way?
Alternatively, is there a way to disassemble the USDZ file into its many sub-models and texture/material files, so that I can select a model with the mouse in the Reality Composer Pro viewport and then modify its PBR settings? That would be much more efficient.
Is it possible to create a room plan that includes the textures of the room's walls, or is there some way to combine ObjectCapture results with RoomPlan results?
Hello,
I have exported a USDZ file using SceneKit's write() method on the displayed scene. When I load it into another RealityKit ARView using the camera's .nonAR mode, I try to use the view's raycast(from:allowing:alignment:) method to get coordinates on the model. I applied collision components when loading the model, using the generateCollisionShapes() function, to be able to interact with the model entity.
However, the raycast result returns nothing.
Is there something I am missing for it to work?
Thanks!
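One possible explanation (hedged): raycast(from:allowing:alignment:) casts against real-world surfaces tracked by the AR session, which a .nonAR camera doesn't have; collision shapes on entities are instead hit-tested with ARView.hitTest(_:). A minimal sketch:

// Sketch: hit-test entities (with collision shapes) from a screen point in a non-AR ARView.
// `tapLocation` is an assumed CGPoint, e.g. from a tap gesture recognizer.
let hits = arView.hitTest(tapLocation, query: .nearest, mask: .all)
if let hit = hits.first {
    print("Hit \(hit.entity.name) at \(hit.position)")
}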
We have a requirement to add a USDZ file to a UIView, show its content, archive the data, and save it to a file. When the user opens the file, we need to unarchive the USDZ content, bind it to a UIView, and show it to the user. Initially, we created an SCNScene object by passing the USDZ file URL, like below.
do {
    usdzScene = try SCNScene(url: usdzUrl)
} catch let error as NSError {
    print(error)
}
Since SCNScene conforms to the NSSecureCoding protocol, we directly archive that object, save it to a file, load it back from the file, and recreate the SCNScene object using NSKeyedUnarchiver.
But for some files, we noticed high memory consumption while archiving the SCNScene object in the line below.
func encode(with coder: NSCoder) {
    coder.encode(self.scnScene, forKey: "scnScene")
}
File reference link: toy_drummer_idle.usdz
According to the Apple documentation (see the Discussion section), the .scn file extension is the fastest format for SceneKit to process, faster than .usdz.
So we used SCNScene's write(to:) feature to create a .scn file from the given USDZ file.
After that, when we archive the SCNScene object created from the .scn file URL, the archive process is much faster and no longer consumes as much memory. It is really fast compared with the previous case.
But unfortunately, the SCNScene write method takes a lot of time for this conversion, memory usage spikes as well, and it can even cause the app to crash.
I checked the output file sizes too: the given USDZ file is 18 MB, while the generated .scn file is 483 MB. But the SCNScene archive process itself is fast.
Please analyze this case and provide some guidance on how we can optimize this behavior. I really appreciate your feedback.
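One workaround worth considering (a sketch, not a confirmed fix): skip archiving the live SCNScene graph altogether, persist the original USDZ bytes alongside the rest of your archived state, and rebuild the scene from them on load. This avoids both the encoder's memory spike and the 483 MB .scn intermediate, at the cost of re-parsing the USDZ when the file is opened.

// Sketch: store the source USDZ bytes instead of an encoded SCNScene.
func saveModel(usdzUrl: URL, to destination: URL) throws {
    let data = try Data(contentsOf: usdzUrl)           // 18 MB in the case above
    try data.write(to: destination, options: .atomic)  // keep a .usdz extension on `destination`
}

func loadModel(from savedUrl: URL) throws -> SCNScene {
    // SCNScene infers the format from the file extension.
    return try SCNScene(url: savedUrl)
}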
Full Code:
import UIKit
import SceneKit

class ViewController: UIViewController {

    var scnView: SCNView?
    var usdzScene: SCNScene?
    var scnScene: SCNScene?

    lazy var exportButton: UIButton = {
        let btn = UIButton(type: .system)
        btn.tag = 1
        btn.backgroundColor = .blue
        btn.addTarget(self, action: #selector(buttonPressed(_:)), for: .touchUpInside)
        btn.setTitle("USDZ to SCN", for: .normal)
        btn.setTitleColor(.white, for: .normal)
        btn.layer.borderColor = UIColor.gray.cgColor
        btn.titleLabel?.font = .systemFont(ofSize: 20)
        btn.translatesAutoresizingMaskIntoConstraints = false
        return btn
    }()

    // Removes a previously created temp directory, if present.
    func deleteTempDirectory(directoryName: String) {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let tempDirectory = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if FileManager.default.fileExists(atPath: tempDirectory.path) {
            do {
                try FileManager.default.removeItem(at: tempDirectory)
            } catch let error as NSError {
                print(error)
            }
        }
    }

    // Creates a temp directory to hold the exported .scn file.
    func createTempDirectory(directoryName: String) -> URL? {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let toBeCreatedDirectoryUrl = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if !FileManager.default.fileExists(atPath: toBeCreatedDirectoryUrl.path) {
            do {
                try FileManager.default.createDirectory(at: toBeCreatedDirectoryUrl, withIntermediateDirectories: true, attributes: nil)
            } catch let error as NSError {
                print(error)
                return nil
            }
        }
        return toBeCreatedDirectoryUrl
    }

    // Exports the loaded USDZ scene to a .scn file in the temp directory.
    @IBAction func buttonPressed(_ sender: UIButton) {
        let scnFolderName = "SCN"
        let scnFileName = "3D"
        deleteTempDirectory(directoryName: scnFolderName)
        guard let scnDirectoryUrl = createTempDirectory(directoryName: scnFolderName) else { return }
        let scnFileUrl = scnDirectoryUrl.appendingPathComponent(scnFileName).appendingPathExtension("scn")
        guard let usdzScene else { return }
        let result = usdzScene.write(to: scnFileUrl, options: nil, delegate: nil, progressHandler: nil)
        if result {
            print("exporting process is success.")
        } else {
            print("exporting process is failed.")
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the USDZ scene from the app bundle.
        let usdzUrl: URL? = Bundle.main.url(forResource: "toy_drummer_idle", withExtension: "usdz")
        guard let usdzUrl else { return }
        do {
            usdzScene = try SCNScene(url: usdzUrl)
        } catch let error as NSError {
            print(error)
        }
        guard let usdzScene else { return }

        scnView = SCNView(frame: .zero)
        guard let scnView else { return }
        scnView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(scnView)
        self.view.addSubview(exportButton)
        NSLayoutConstraint.activate([
            scnView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
            scnView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor),
            scnView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            scnView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -30),
            exportButton.widthAnchor.constraint(equalToConstant: 200),
            exportButton.heightAnchor.constraint(equalToConstant: 40),
            exportButton.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
            exportButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
        ])

        DispatchQueue.main.asyncAfter(deadline: .now() + 0.01) { [weak self] in
            guard let self else { return }
            loadModel(scene: usdzScene)
        }
    }

    func loadModel(scene: SCNScene) {
        guard let scnView else { return }
        scnView.autoenablesDefaultLighting = true
        scnView.scene = scene
        scnView.allowsCameraControl = true
    }
}
I've been trying to get the drag gesture up and running so I can move my 3D model around in my immersive space, but for some reason I am not able to move it. The model shows up in my visionOS 1.0 Simulator, but I can't seem to get it to move around. I would love some help with this, and some resources would be helpful too. Here's a snippet of my RealityView code:
import SwiftUI
import RealityKit
import RealityKitContent

struct GearRealityView: View {
    static var modelEntity = Entity()

    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "LandingGear", in: realityKitContentBundle) {
                GearRealityView.modelEntity = model
                content.add(model)
            }
        }
        .gesture(
            DragGesture()
                .targetedToEntity(GearRealityView.modelEntity)
                .onChanged { value in
                    // Convert the 3D gesture location into the entity's parent space.
                    GearRealityView.modelEntity.position = value.convert(value.location3D, from: .local, to: GearRealityView.modelEntity.parent!)
                }
        )
    }
}
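A likely cause worth verifying: RealityKit gestures on visionOS only target entities that have both an InputTargetComponent and collision shapes, and a model loaded straight from the bundle usually has neither. A sketch of the load step with those added:

if let model = try? await Entity(named: "LandingGear", in: realityKitContentBundle) {
    // Gestures hit-test against collision shapes, so generate them for the whole hierarchy.
    model.generateCollisionShapes(recursive: true)
    // Opt the entity in to receiving input.
    model.components.set(InputTargetComponent())
    GearRealityView.modelEntity = model
    content.add(model)
}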