Hi,
I'm working on a simple visionOS app and I'm testing on device.
For one part of the app, I load an object in and place it on the user's hand. If I use a primitive shape, like a sphere or cylinder, this works fine. However, now I'm trying to load an object from my RealityKitContent package. But every time I try this, I get an error message, resourceNotFound("Stone"), where "Stone" is one of my .usda scenes.
This is what the guts of the function that should return a ModelEntity look like:
do {
    let entity = try await ModelEntity(named: "Stone", in: realityKitContentBundle)
    entity.generateCollisionShapes(recursive: true)
    return entity
} catch {
    print("Error \(error)")
}
I can see "Stone" in my Xcode sidebar as part of the RealityKitContent package, and inside that scene there is a simple sphere, but alas I always get this in the Xcode console: "Error resourceNotFound("Stone")".
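For comparison, here is a minimal sketch of the same load using the plain Entity initializer instead of ModelEntity, since the root of a Reality Composer Pro scene is not necessarily a ModelEntity; the "Stone" name and the bundle are the ones from the post above:

import RealityKit
import RealityKitContent

// Sketch only: assumes "Stone" is a .usda scene at the top level of the
// RealityKitContent package. Entity(named:in:) loads the whole scene graph
// and returns its root entity, whatever concrete type that turns out to be.
func loadStone() async throws -> Entity {
    try await Entity(named: "Stone", in: realityKitContentBundle)
}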
I'm probably doing something pretty silly, hopefully it's obvious to someone else.
Thanks for the help.
Ian
Hi, I tried to change the default size for a volumetric window, but it looks like this window has a maximum width value. Is that true?
WindowGroup(id: "id") {
    ItemToShow()
}
.windowStyle(.volumetric)
.defaultSize(width: 100, height: 0.8, depth: 0.3, in: .meters)
Here I set the width to 100 meters, but it still looks like about 2 meters.
Dear Apple Developer Forum Community,
I hope this message finds you well. I am writing to seek assistance regarding an error I encountered while attempting to create a "Hello World" application using Xcode.
Upon launching Xcode and starting a new project, I followed the standard procedure for creating a simple iOS application. However, during the process, I encountered an unexpected error that halted my progress. The error message I received was [insert error message here].
I have attempted to troubleshoot the issue (see the two attached images), but unfortunately I have been unsuccessful in resolving it.
I am reaching out to the community in the hope that someone might have encountered a similar issue or have expertise in troubleshooting Xcode errors. Any guidance, suggestions, or solutions would be greatly appreciated.
Thank you very much for your time and assistance.
Sincerely,
Zipzy games
How do I bind an MTLTexture to the Color input of a material?
I need to use something similar to VideoMaterial, so I need to make a CustomMaterial.
But RealityKit's CustomMaterial is not available on visionOS; it has been replaced by ShaderGraphMaterial.
So how can I bind a Metal resource such as an MTLTexture to a ShaderGraphMaterial directly?
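For context, here is a minimal sketch of feeding a texture input on a ShaderGraphMaterial authored in Reality Composer Pro. It goes through a TextureResource rather than a raw MTLTexture, and the scene, material, and parameter names ("Scene.usda", "/Root/MyMaterial", "ColorTexture", "SomeImage") are placeholders, not names from the post:

import RealityKit
import RealityKitContent

// Sketch only: assumes a Shader Graph material at /Root/MyMaterial inside a
// scene file "Scene.usda" in the RealityKitContent package, with a texture
// parameter named "ColorTexture". The image is routed through a
// TextureResource; this does not bind an MTLTexture directly.
func applyGraphMaterial(to entity: ModelEntity) async throws {
    var material = try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)
    let texture = try TextureResource.load(named: "SomeImage")  // image in the app bundle
    try material.setParameter(name: "ColorTexture", value: .textureResource(texture))
    entity.model?.materials = [material]
}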
I am trying to create a simple custom shader with an image as the material's color and a depth map as bump-map information. I have followed the official procedure from "Explore materials in Reality Composer Pro", but the depth map is not processed.
What am I doing wrong?
(attached is a screenshot that shows the setup. I removed the image ref for clarity)
I would like to apply different textures to the front and back faces of a 3D model. Specifically, when applying a texture that cuts the object in half through opacity, I want to be able to observe the back face of the object and apply a different color to it than to the front face.
In Unity, there is an 'isFrontFace' boolean node that allows applying different colors to the front and rear faces. However, I am unsure how to achieve the same effect in Reality Composer Pro!
The 3D model is already two-sided.
Where can I find a specification document for the displacement file "baked_mesh_disp0.exr" obtained from a Full Quality run in Reality Composer Pro?
I ran Reality Composer Pro, selected Full Quality, ran Create Model, and obtained a *.usdz file, which I renamed to *.zip and unzipped.
There I found 5 maps, including "baked_mesh_disp0.exr", and I want to know its data specification.
I create a simple primitive shape and I want to add a color to each face of the cube. I was thinking of using Shader Graph, but I have no idea how to give each face a different color. Any lead or help would be great. This tech is new, so the help documentation is sparse.
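For reference, here is a RealityKit-side sketch (not a Shader Graph one) that gives each face of a box its own color by generating the box with split faces and supplying six materials; the size and colors are arbitrary placeholders:

import RealityKit
import UIKit

// Sketch only: generateBox(..., splitFaces: true) produces a mesh with one
// material slot per face, so six materials map one-to-one onto the six faces.
func makeColoredCube() -> ModelEntity {
    let mesh = MeshResource.generateBox(width: 0.2, height: 0.2, depth: 0.2,
                                        cornerRadius: 0, splitFaces: true)
    let colors: [UIColor] = [.red, .orange, .yellow, .green, .blue, .purple]
    let materials = colors.map { SimpleMaterial(color: $0, isMetallic: false) }
    return ModelEntity(mesh: mesh, materials: materials)
}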
I have various .reality files published on a website as part of a learning product, which I deployed Feb. 2023 using the latest Reality Composer at the time.
Users informed me that none of the .reality files will open on iOS 17, which I have confirmed. They still open fine on iOS 16.
On iOS 17 the QuickLook viewer says "Object requires a newer version of iOS."
What gives? Did Apple deprecate .reality files, or are these designed to work on only one version of iOS?
Hello all -
I'm experiencing a shading error when I have two UnlitSurface shaders using images for color and opacity. When the shaders are applied to two mesh planes, one placed in front of the other, the shader in front renders, but that plane's mesh masks out and does not render what is behind it.
Basically - it looks like the opacity map on the shader in front is creating a 'mask'.
I've attached some images here to help explain.
Has anyone experienced this error? And how can I go about fixing this - thx!
Hi all,
Up until a couple of days ago I was able to open and run Reality Composer Pro on my Intel-based Mac. I tried to open it again this morning, and I now receive the notification "Reality Composer is not supported on this Mac".
I understand that I will eventually need a new computer with Apple silicon but it was nice to be able to start exploring Shader Graphs with my existing computer for now.
Any suggestions? Perhaps go back to an earlier version of the beta Xcode - maybe the latest version disabled my ability to run RCP?
I'm running Version 15.1 beta (15C5042i) of Xcode on an Intel i7 MacBook Pro.
Thanks in advance!
I've tried converting a glTF to USDZ, but it only puts out a USDZ conversion error. Under details it only says "An unexpected error occurred while converting this file to USDZ. Please fix any other errors and try again."
This error just showed up recently. I'm on macOS 12.3.1 and Reality Converter 1.0 (47.1).
Hi,
I'm struggling to find a way to get a simple Unlit Material working with Reality Composer.
With all the standard objects, it doesn't seem like we have the option of an unlit material, which seems really odd to me... This is kind of a basic feature.
The only way I was able to get an unlit material inside Reality Converter was to import a mesh without a material, which gave me a white unlit material.
I have seen that you can set an unlit material using RealityKit, but from what I saw, RealityKit builds an app in the end, right? Honestly, I'm not sure what you get when creating an AR app using RealityKit... What I'm looking for is a simple .reality file to be displayed on the web.
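For context, here is a minimal sketch of the RealityKit route mentioned above: assigning an UnlitMaterial to an entity in code. The sphere and color are arbitrary placeholders, and this produces content inside an app rather than a standalone .reality file:

import RealityKit
import UIKit

// Sketch only: UnlitMaterial ignores scene lighting, so the sphere renders
// with a flat, constant color.
func makeUnlitSphere() -> ModelEntity {
    let material = UnlitMaterial(color: UIColor.white)
    return ModelEntity(mesh: .generateSphere(radius: 0.1),
                       materials: [material])
}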