Hi,
I'm struggling to find a way to get a simple Unlit Material working with Reality Composer.
With all the standard objects, it doesn't seem like we have the option of an unlit material, which seems really odd to me... this is kind of a basic feature.
The only way I was able to get an unlit material was inside Reality Converter: importing a mesh without a material gave me a white unlit material.
I have seen that you can set an unlit material using RealityKit, but from what I saw, RealityKit builds an app at the end, right? Honestly, I'm not sure what you get when creating an AR app using RealityKit... What I'm looking for is a simple .reality file to be displayed on the web.
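For reference, the RealityKit route I found looks roughly like this - a minimal sketch, where myARView stands in for an existing ARView - but it creates the material at runtime inside an app instead of producing a .reality file:

import RealityKit

// Minimal sketch: an unlit white material applied to a box at the world origin.
// "myARView" is assumed to be an ARView created elsewhere in the app.
let material = UnlitMaterial(color: .white)
let box = ModelEntity(mesh: .generateBox(size: 0.1), materials: [material])
let anchor = AnchorEntity(world: .zero)
anchor.addChild(box)
myARView.scene.addAnchor(anchor)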
Reality Composer Pro
Prototype and produce content for AR experiences using Reality Composer Pro.
I've tried converting a glTF to USDZ, but it only puts out a USDZ conversion error. Under details it only says "An unexpected error occurred while converting this file to USDZ. Please fix any other errors and try again."
This error just showed up recently. I'm on macOS 12.3.1 and Reality Converter 1.0 (47.1).
Hi all,
Up until a couple of days ago I was able to open and run Reality Composer Pro on my Intel-based Mac. I tried to open it again this morning and I now receive the notification "Reality Composer is not supported on this Mac".
I understand that I will eventually need a new computer with Apple silicon but it was nice to be able to start exploring Shader Graphs with my existing computer for now.
Any suggestions? Perhaps I should go back to an earlier Xcode beta - maybe the latest version disabled my ability to run RCP?
I'm running Version 15.1 beta (15C5042i) of Xcode on an Intel i7 MacBook Pro.
Thanks, in advance!
Hello all -
I'm experiencing a shading error when I have two UnlitSurface shaders that use images for color and opacity. When the shaders are applied to two mesh planes, one placed in front of the other, the front shader renders but its plane masks out and doesn't render what is behind it.
Basically, it looks like the opacity map on the front shader is creating a 'mask'.
I've attached some images here to help explain.
Has anyone experienced this error? And how can I go about fixing this - thx!
I'm having trouble figuring out how to open Reality Composer from Xcode. I've tried opening Xcode, selecting "Xcode" from the menu bar, then choosing "Open Developer Tool" and "More Developer Tools" to access the download page, but Reality Composer is not listed there.
Additionally, when I try to check the details of Reality Composer through this link: https://developer.apple.com/augmented-reality/tools/, it redirects me to the iOS app page, and I can't find a download for the Mac. I would appreciate it if someone could provide guidance!
In my Reality Composer scene, I have added spatial audio. How do I play it from my Swift code?
I loaded the scene the following way:
myEntity = try await Entity(named: "grandScene", in: realityKitContentBundle)
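I was hoping something along these lines would work - a sketch, where "SpatialAudio" is the name of my audio emitter entity and the resource path is my guess at the generated USD structure:

// Sketch: find the entity that holds the spatial audio and play the resource on it.
// "SpatialAudio" and "/Root/myAudio_mp3" are guesses based on my scene's structure.
if let audioEntity = myEntity.findEntity(named: "SpatialAudio") {
    let resource = try await AudioFileResource(named: "/Root/myAudio_mp3",
                                               from: "grandScene.usda",
                                               in: realityKitContentBundle)
    audioEntity.playAudio(resource)
}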
I have various .reality files published on a website as part of a learning product, which I deployed in Feb. 2023 using the latest Reality Composer available at the time.
Users informed me that none of the .reality files will open on iOS 17, which I have confirmed. They still open fine on iOS 16.
On iOS 17 the QuickLook viewer says "Object requires a newer version of iOS."
What gives? Did Apple deprecate .reality files, or are these designed to work on only one version of iOS?
I created a simple primitive shape, and I want to add a different color to each face of the cube. I was thinking of using Shader Graph, but I have no idea how to target each face with a different color. Any lead or help would be great. This tech is new, so the help documentation is very sparse.
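The only workaround I've come up with so far is skipping Shader Graph and assembling the cube from six one-sided planes in code - a rough sketch (I'd still prefer a proper Shader Graph answer):

import RealityKit

// Rough sketch: fake a per-face-colored cube out of six one-sided planes.
// generatePlane(width:height:) lies in the XY plane facing +Z,
// so each face is rotated and offset into position.
func makeColoredCube(size: Float = 0.2) -> Entity {
    let half = size / 2
    let colors: [SimpleMaterial.Color] = [.red, .green, .blue, .yellow, .orange, .purple]
    // (position, rotation axis, angle) for front, back, right, left, top, bottom
    let faces: [(SIMD3<Float>, SIMD3<Float>, Float)] = [
        ([0, 0, half], [0, 1, 0], 0),
        ([0, 0, -half], [0, 1, 0], .pi),
        ([half, 0, 0], [0, 1, 0], .pi / 2),
        ([-half, 0, 0], [0, 1, 0], -.pi / 2),
        ([0, half, 0], [1, 0, 0], -.pi / 2),
        ([0, -half, 0], [1, 0, 0], .pi / 2),
    ]
    let cube = Entity()
    for (i, face) in faces.enumerated() {
        let plane = ModelEntity(mesh: .generatePlane(width: size, height: size),
                                materials: [SimpleMaterial(color: colors[i], isMetallic: false)])
        plane.position = face.0
        plane.orientation = simd_quatf(angle: face.2, axis: face.1)
        cube.addChild(plane)
    }
    return cube
}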
Where can I find a specification document for the displacement file "baked_mesh_disp0.exr" obtained from a Full Quality run in Reality Composer Pro?
I ran Reality Composer Pro, selected Full Quality, ran Create Model, and obtained a *.usdz, which I renamed to *.zip and unzipped.
Inside I found 5 maps, including "baked_mesh_disp0.exr", and I want to know its data specification.
I would like to apply different textures to the front and back faces of a 3D material. Specifically, when applying a texture that cuts the object in half via opacity, I want to be able to see the back face of the object and give it a different color than the front face.
In Unity, there is an 'isFrontFace' boolean node that allows applying different colors to the front and rear faces. However, I am unsure how to achieve the same effect in Reality Composer Pro!
The 3D model is already two-sided.
I am trying to create a simple custom shader with an image as the material and a depth map as the bump-map information. I have followed the official procedure from "Explore materials in Reality Composer Pro", but the depth map is not processed.
What am I doing wrong?
(Attached is a screenshot that shows the setup. I removed the image reference for clarity.)
How do I bind an MTLTexture to the Color input of a material?
I need something similar to VideoMaterial, so I need to make a CustomMaterial.
But RealityKit's CustomMaterial is not available on visionOS; it has been replaced by ShaderGraphMaterial.
So how do I bind a Metal resource such as an MTLTexture to a ShaderGraphMaterial directly?
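The closest thing I've found is pushing a TextureResource into a named input on the ShaderGraphMaterial - a sketch, assuming the graph exposes an input named "ColorImage" (and it still doesn't take an MTLTexture directly):

import RealityKit

// Sketch: swap a texture on a ShaderGraphMaterial at runtime.
// Assumes the Reality Composer Pro material promotes an input named "ColorImage".
func updateTexture(on model: ModelEntity, with texture: TextureResource) throws {
    guard var material = model.model?.materials.first as? ShaderGraphMaterial else { return }
    try material.setParameter(name: "ColorImage", value: .textureResource(texture))
    model.model?.materials = [material]
}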
Dear Apple Developer Forum Community,
I hope this message finds you well. I am writing to seek assistance regarding an error I encountered while attempting to create a "Hello World" application using Xcode.
Upon launching Xcode and starting a new project, I followed the standard procedure for creating a simple iOS application. However, during the process, I encountered an unexpected error that halted my progress. The error message I received was [insert error message here].
I have attempted to troubleshoot the issue (see the two attached images), but unfortunately, I have been unsuccessful in resolving it.
I am reaching out to the community in the hope that someone might have encountered a similar issue or have expertise in troubleshooting Xcode errors. Any guidance, suggestions, or solutions would be greatly appreciated.
Thank you very much for your time and assistance.
Sincerely,
Zipzy Games
Hi, I tried to change the default size of a volumetric window, but it looks like this window has a maximum width value. Is that true?
WindowGroup(id: "id") {
    ItemToShow()
}
.windowStyle(.volumetric)
.defaultSize(width: 100, height: 0.8, depth: 0.3, in: .meters)
Here I set the width to 100 meters, but it still looks like it's about 2 meters wide.
Hi,
I'm working on a simple visionOS app and I'm testing on device.
For one part of the app, I load an object in and place it on the user's hand. If I use a primitive shape, like a sphere or cylinder, this works fine. However, now I'm trying to load an object from my RealityKitContent package, but every time I try, I get an error message, resourceNotFound("Stone"), where "Stone" is one of my .usda scenes.
This is what the guts of my function looks like that should return a ModelEntity:
do {
    let entity = try await ModelEntity(named: "Stone", in: realityKitContentBundle)
    entity.generateCollisionShapes(recursive: true)
    return entity
} catch {
    print("Error \(error)")
}
I can see "Stone" in my Xcode sidebar as part of the RealityKitContent package, and inside that scene there is a simple sphere, but alas, I always get this in the Xcode console: Error resourceNotFound("Stone").
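One thing I haven't tried yet is loading the scene as a plain Entity and fishing the model out of it, since Entity(named:in:) is what the default visionOS template uses for scenes - a sketch, unverified on my end, where "Sphere" is a guess at the child entity's name:

do {
    let scene = try await Entity(named: "Stone", in: realityKitContentBundle)
    if let model = scene.findEntity(named: "Sphere") as? ModelEntity {
        model.generateCollisionShapes(recursive: true)
        return model
    }
} catch {
    print("Error \(error)")
}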
I'm probably doing something pretty silly, hopefully it's obvious to someone else.
Thanks for the help.
Ian
I'd like to map a SwiftUI view (in my case: a map) onto a 3D curved plane in an immersive view, so users can literally immerse themselves in the map. The user should also be able to interact with the map, by panning it around and selecting markers.
Is this possible?
I'm developing a Vision Pro application. However, when the user takes off the Apple Vision Pro device, the application goes into the background. How can I prevent this behavior programmatically?
Following this thread, I'm able to render a simple picture on a plane material; however, I'm unable to scale it to show bigger than the window itself, or to move it behind the window.
Here's my relevant code so far:
var body: some View {
    ZStack {
        RealityView { content in
            var material = UnlitMaterial()
            material.color = try! .init(tint: .white,
                                        texture: .init(.load(named: "image",
                                                             in: nil)))
            let entity = Entity()
            let component = ModelComponent(
                mesh: .generatePlane(width: 1, height: 1),
                materials: [material]
            )
            entity.components.set(component)
            let currentTransform = entity.transform
            var newTransform = Transform(scale: currentTransform.scale,
                                         rotation: currentTransform.rotation,
                                         translation: SIMD3(0, 0, -0.2))
            entity.move(to: newTransform, relativeTo: nil)
            /*
            let scalingPivot = Entity()
            scalingPivot.position.y = entity.visualBounds(relativeTo: nil).center.y
            scalingPivot.addChild(entity)
            content.add(scalingPivot)
            scalingPivot.scale *= .init(x: 1, y: 1, z: 1)
            */
        }
    }
}
It belongs to an ImmersiveSpace I'm opening directly from my main window, but I have several issues:
The texture always shows in front of the window.
I'm unable to scale it (scaling seems to affect the texture coordinates inside the material instead of scaling the mesh itself).
I can only see the texture in the canvas preview (not in the simulator).
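For the scaling issue, one thing I plan to try is scaling the entity's transform instead of anything material-side - a sketch (I suspect I also need to add the entity to the RealityView content, which my code above never does outside the commented block):

// Sketch: scale the plane via the entity's transform - scaling the material
// would only change how the texture's UVs map - and add it to the content.
entity.scale = SIMD3<Float>(repeating: 3)   // 3x larger mesh
entity.position = SIMD3<Float>(0, 1, -2)    // push it back, away from the window
content.add(entity)                         // "content" is the RealityView closure's parameter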
Hi folks!
I have been working with a team on a Vision Pro app using Reality Composer Pro. One thing we have found is that multiple developers editing the RC Pro scene is a continuous problem, similar to when multiple developers edit a storyboard.
RC Pro maintains a SceneMetadataList.json file that indexes the file contents of the project and is updated even as the scene hierarchy is opened and closed, not to mention on other changes to scene content. We are getting frequent version-control conflicts on this file as we each make changes and edits to the scene, or even just browse it without making any substantive changes.
It seems like it would be safe to add the SceneMetadataList.json file in an RC Pro project to .gitignore. Is that recommended? Any downsides to that?
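For reference, the ignore rule we're considering is just this - a sketch, assuming the file can sit anywhere inside the project tree:

# Reality Composer Pro scene index - regenerated as scenes are opened and browsed
**/SceneMetadataList.json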
I am very new to shaders and have never used one of the large systems like Unity. However, I have started exploring visionOS programming, and that led me to create some effects for materials in Reality Composer Pro.
I have been overwhelmed with the possibilities, but also feel kind of lost. I understand that RCP's shaders are based on MaterialX, so maybe there are tutorials on the web that cover how to create procedural effects (fire, wind, water, etc.)? I've stumbled through... but it's slow going. Are there any good resources that talk about how to use the various nodes to create procedural effects?
For example, it took me a while to figure out that using the “time” node allows me to animate cool color changes, especially when combined with various math and remap nodes.
Just looking for some basic resources, I think. Would the shader graph tutorials for Unity apply to using RCP? Are the node types similar enough?