Shader Graph Editor

An easy-to-use visual editor for your materials, all within Reality Composer Pro

Posts under Shader Graph Editor tag

43 Posts

Animating a RealityComposerPro shader's uniform input value
I'm trying to build a shader in Reality Composer Pro that updates from a start time. Initially I tried the following: the idea was that when startTime was 0 the output would be 0, but then I would set startTime from within code, it would be compared with the current GPU time, and the difference used to drive another part of the shader graph:

```swift
if let testEntity = root.findEntity(named: "Test"),
   var shaderGraphMaterial = testEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial {
    let time = CFAbsoluteTimeGetCurrent()
    try! shaderGraphMaterial.setParameter(name: "StartTime", value: .float(Float(time)))
    testEntity.components[ModelComponent.self]?.materials[0] = shaderGraphMaterial
}
```

However, I haven't found a reference to the time the shader would be using. So now I am trying to write an EntityAction to achieve the same effect: instead of comparing a start time to the GPU's time, I'm trying to animate one of the shader's uniform inputs. However, I'm not sure how to specify the bind target. Here's my attempt so far:

```swift
import RealityKit

struct ShaderAction: EntityAction {
    let startValue: Float
    let targetValue: Float

    var animatedValueType: (any AnimatableData.Type)? { Float.self }

    static func registerEntityAction() {
        ShaderAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let value = simd_mix(event.action.startValue,
                                 event.action.targetValue,
                                 Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(value)
        }
    }
}

extension Entity {
    func updateShader(from startValue: Float, to targetValue: Float, duration: Double) {
        let fadeAction = ShaderAction(startValue: startValue, targetValue: targetValue)
        if let shaderAnimation = try? AnimationResource.makeActionAnimation(for: fadeAction,
                                                                            duration: duration,
                                                                            bindTarget: .material(0).customValue) {
            playAnimation(shaderAnimation)
        }
    }
}
```

Currently when I run this I get an assertion failure: 'Index out of range (operator[]:line 797) index = 260, max = 8'. Furthermore, even if it didn't crash, I don't understand how to pass a binding to the custom shader value "startValue". Any clues on how to achieve this effect, even via a completely different approach, would be appreciated.
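A possible workaround is to skip the animation system entirely and drive the uniform from the CPU on every frame via a scene-update subscription. This is a minimal sketch, not a confirmed solution: "ElapsedTime" is an assumed promoted float input on the shader graph, and ShaderClock is a hypothetical helper.

```swift
import RealityKit

// Hypothetical helper: accumulates time on the CPU and pushes it into the
// entity's ShaderGraphMaterial once per frame.
final class ShaderClock {
    private var elapsed: Float = 0
    private var subscription: EventSubscription?

    func start(on entity: Entity, in scene: RealityKit.Scene) {
        subscription = scene.subscribe(to: SceneEvents.Update.self) { [weak self] event in
            guard let self,
                  var material = entity.components[ModelComponent.self]?
                      .materials.first as? ShaderGraphMaterial else { return }
            self.elapsed += Float(event.deltaTime)
            // "ElapsedTime" is an assumed input promoted in Reality Composer Pro.
            try? material.setParameter(name: "ElapsedTime", value: .float(self.elapsed))
            entity.components[ModelComponent.self]?.materials[0] = material
        }
    }
}
```

This trades GPU-side timing for a per-frame CPU write, which is usually acceptable for a single material parameter.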
1 reply · 0 boosts · 261 views · 2w
SceneKit Transparent Material Self-Overlapping Issue (Front Face Overlapping)
Description: I'm developing an AR effect using SceneKit and applying a transparent material to a face mesh. However, I'm facing an issue where the front faces of the mesh overlap each other, causing incorrect rendering.

Problem: The front faces of the mesh overlap with each other when transparency is applied. This causes areas like the cheeks to be visible through the nose, even though they should be occluded.

Expected Behavior: The material should behave as if it were opaque to itself; that is, overlapping front faces should be occluded properly, while still allowing transparency for background elements.

Actual Behavior: The mesh renders its own front faces incorrectly, making parts of the face visible through others when they should be blocked.

What I Have Tried:

```swift
testMaterial.writesToDepthBuffer = true
testMaterial.readsFromDepthBuffer = true
```

Question:
👉 How can I prevent SceneKit's transparent material from rendering overlapping front faces?
👉 Is there a way to force SceneKit to treat its own mesh as opaque for itself while still being transparent to the background?
👉 Does SceneKit support a proper depth pre-pass or an equivalent to Unity's ZWrite shaders to solve this issue?

Attached screenshots demonstrate the problem visually. Any help would be greatly appreciated! 🚀
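A common SceneKit workaround for self-overlapping transparency, offered here as a sketch rather than a confirmed fix, is a manual depth pre-pass: render the mesh once to the depth buffer only, then render the transparent pass on top of it. faceNode is an assumed reference to the face-mesh node; testMaterial is the transparent material from the question.

```swift
import SceneKit

// Pass 1 (depth only): a clone of the mesh that writes depth but no color.
let depthMaterial = SCNMaterial()
depthMaterial.colorBufferWriteMask = []        // suppress all color output
depthMaterial.writesToDepthBuffer = true

let depthNode = faceNode.clone()
depthNode.geometry = faceNode.geometry?.copy() as? SCNGeometry // clone() shares geometry
depthNode.geometry?.materials = [depthMaterial]
depthNode.renderingOrder = -1                  // draw before the visible mesh
faceNode.parent?.addChildNode(depthNode)

// Pass 2 (visible): the transparent material tests against the pre-written depth,
// so its own front faces occlude each other while the background shows through.
testMaterial.readsFromDepthBuffer = true
testMaterial.writesToDepthBuffer = false
```

Note that this sketch assumes a static mesh; a live ARKit face geometry would need both nodes' geometries updated every frame.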
0 replies · 0 boosts · 225 views · 3w
Dynamically assigning texture resource to ShaderGraphMaterial on visionOS
I implemented a ShaderGraphMaterial and tried to load it from my usda scene using ShaderGraphMaterial.init(name:in:bundle:). I want to dynamically set a TextureResource on that material, so I wanted to expose a texture as a uniform input of the ShaderGraphMaterial. But RCP's Shader Graph doesn't seem to support a texture input as a parameter, as the image shows. And at the code level, ShaderGraphMaterial doesn't expose a way to set TextureResources either: its parameterNames is an empty array if I don't set any custom input parameters. The texture comes from my backend, so it really can't be saved to a file and loaded again (that would be too weird). Is there something I am missing?
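If your Reality Composer Pro version allows promoting an Image node's input to a material parameter (support for this has varied across versions), the backend texture can be built entirely in memory, with no file round-trip. A minimal sketch under that assumption; backendData and the "BaseTexture" parameter name are hypothetical:

```swift
import RealityKit
import UIKit

// Hypothetical: backendData holds encoded image bytes fetched from the network.
func applyBackendTexture(_ backendData: Data,
                         to material: inout ShaderGraphMaterial) throws {
    guard let cgImage = UIImage(data: backendData)?.cgImage else { return }
    // Build the TextureResource directly from the in-memory CGImage.
    let texture = try TextureResource.generate(from: cgImage,
                                               options: .init(semantic: .color))
    // "BaseTexture" is an assumed promoted input; it only appears in
    // parameterNames once the input is exposed on the graph.
    try material.setParameter(name: "BaseTexture", value: .textureResource(texture))
}
```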
1 reply · 0 boosts · 318 views · 3w
ShaderGraphMaterial on entity
Hi! I'm trying to make a 360 stereo viewer, and I have made a ShaderGraphMaterial in Reality Composer Pro. I'm trying to use that material on an inverted sphere which is generated in Swift. When I try to attach the material I get this error: "Type of expression is ambiguous without a type annotation". Here is the code (sorry, I'm a noob =) ):

```swift
import SwiftUI
import RealityKit
import RealityKitContent
import PhotosUI

struct ImmersiveView: View {
    @Environment(AppModel.self) var appModel

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            guard let skyBoxEntity = await createSkybox() else { return }
            content.add(skyBoxEntity)
        }
    }
}

private func createSkybox() async -> Entity? {
    var matX = try? await ShaderGraphMaterial(named: "/Root/Mat_Stereo360",
                                              from: "360Stereo.usda",
                                              in: realityKitContentBundle)
    let sphere = await MeshResource.generateSphere(radius: 1000)
    let entity = await Entity()
    entity.components.set(ModelComponent(mesh: sphere, materials: [matX])) // ERROR HERE: Type of expression is ambiguous without a type annotation
    //entity.scale *= .init(x: -1, y: 1, z: 1)
    return entity
}
```

I hope someone can help me =) Best regards, Kim
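The ambiguity most likely comes from matX being optional: try? yields a ShaderGraphMaterial?, so [matX] is [ShaderGraphMaterial?], which doesn't fit the material array ModelComponent expects. A sketch of the unwrapped version, using the same names as the question:

```swift
private func createSkybox() async -> Entity? {
    // Unwrap so the array is [ShaderGraphMaterial], not [ShaderGraphMaterial?].
    guard let matX = try? await ShaderGraphMaterial(named: "/Root/Mat_Stereo360",
                                                    from: "360Stereo.usda",
                                                    in: realityKitContentBundle) else {
        return nil
    }
    let sphere = await MeshResource.generateSphere(radius: 1000)
    let entity = await Entity()
    entity.components.set(ModelComponent(mesh: sphere, materials: [matX]))
    //entity.scale *= .init(x: -1, y: 1, z: 1)
    return entity
}
```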
2 replies · 0 boosts · 336 views · Jan ’25
Reflection Diffuse only shows white
Sample repo: https://github.com/ckse93/VideoDiffusionIssueSHowcase. The repo has a detailed step-by-step workflow, as well as screenshots, the Python script's computed result, and parameters. After running computeDiffuseReflectionUVs.py and mapping the textures and reflection diffuse to objects, I noticed that the reflection diffuse does not produce any color. The expected result is shown below: the diffused light has color.
1 reply · 0 boosts · 300 views · Jan ’25
Issues importing Tiled Image MaterialX shader node into Reality Composer Pro
Hey, I am having issues getting the MaterialX shaders I've authored in Houdini to work properly in Reality Composer Pro. The shader is very simple: it starts with a tiled image node whose output is written to the diffuse color of the preview surface node. This node is called mtxltileimage2. When I create a tiled image node in RCP and configure it to have the same parameter values, the texture shows up correctly. This node is called TiledImage. One difference I can identify is that the second node has a grey icon whereas the first node has a blue icon; could this be related to the issue? Here is the USD viewer output for the two variants of the tiled image node. Any pointers, corrected misconceptions, and help would be greatly appreciated. My goal is to be able to author these shaders in Houdini and import them into RCP, and I'm trying to figure out the right pipeline for this workflow.
1 reply · 0 boosts · 460 views · Dec ’24
Beginner question: Keyframes?
I am new here, my name is Axel. Hi! 👋 I am both a visionOS beginner and an expert in things 3D, now finding my way into Reality Composer Pro. Most things are intuitive and easy to understand, yet there is one thing I cannot seem to figure out, and I feel really stupid because it must be there: keyframing. I would simply like to make a new Timeline and animate published parameters of a Shader Graph over time. I know how to do this via Xcode and custom components, but that can't be the intended way, because it breaks down even for simple-to-medium animations when it comes to previewing and fine-tuning; it's complete overkill for a simple value animation. Since keyframing is the most basic functionality of 3D apps, next to moving things, I am sure it is in RCP somewhere. Anyone got a pointer for me?
1 reply · 0 boosts · 539 views · Oct ’24
How to use KTX file Image3D in ShaderGraph?
I've tried importing a 3D KTX file into a Reality Composer Pro Shader Graph, but got no output. Does RealityKit not support Image3D yet? First, I use pyktx to create a 3D KTX texture file, and it looks correct in Finder's preview.

```python
import numpy as np
from pyktx import KtxTexture2, KtxTextureCreateInfo, KtxTextureCreateStorage, VkFormat

size = 32
image = np.zeros((size, size, size, 3), dtype=np.uint8)
for i in range(size):
    image[i, :, :, 0] = np.interp(i, [0, size], [50, 255])
    image[:, i, :, 1] = np.interp(i, [0, size], [100, 255])
    image[:, :, i, 2] = np.interp(i, [0, size], [150, 255])

info = KtxTextureCreateInfo(
    gl_internal_format=None,
    vk_format=VkFormat.VK_FORMAT_R8G8B8_SRGB,
    base_width=image.shape[2],
    base_height=image.shape[1],
    base_depth=image.shape[0],
    num_dimensions=3,
    num_levels=1,
    num_layers=1,
    num_faces=1,
    generate_mipmaps=False,
    is_array=False,
)
texture = KtxTexture2.create(info, KtxTextureCreateStorage.ALLOC)
for _ in range(1):
    texture.set_image_from_memory(0, 0, _, image[_].tobytes())
texture.write_to_named_file(f'{size}.ktx')
```

Then I import it into the Shader Graph like this, but nothing seems to be read. In addition, I tested a 2D KTX file and it worked. I also tested different vk_format values and KTX1/KTX2, but neither worked. I also tested Image3DPixel, Image2DArray, and Image3DRead, and none of them work as long as the texture is 3D.
3 replies · 0 boosts · 551 views · Oct ’24
Tips for developing shader graphs in Reality Composer Pro
I am new to the graph editor and was able to achieve some results. However, I am noticing that my graphs are getting very tangled, confusing, and hard to debug. I was wondering: is it possible to define variables that store the value of computations and refer to them in other parts of the graph, without having to link them graphically? This would help tidy up the tangled mess I created. In the "Explore materials in Reality Composer Pro" video, I saw that it is possible to create "instances", but I am not sure if that is what I need. For example, does the shader compiler optimize them so that there is no need to recompute each instance? Also, is there any functionality to debug the graph, feeding it inputs and seeing what the numeric outputs would be?
1 reply · 0 boosts · 575 views · Oct ’24
Creating and applying a shader to change an entity's rendering in RealityKit
Hello, I am looking to create a shader to update an entity's rendering. As a basic example, say I want to recolour an entity but leave its original textures showing through. I understand that with visionOS I need to use Reality Composer Pro to create the shader, but I'm lost as to how to reference the original colour that I'm trying to update in the node graph. All my attempts appear to completely override the textures in the entity (and its sub-entities) that I want to affect. Also, the tutorials and examples I've looked at appear to create materials, not add an effect on top of existing materials. Any hints or pointers? Assuming this is possible, I've been trying to load the material in code and apply it to an entity. But do I need to do this for all child entities, or just the topmost one?

```swift
do {
    let entity = MyAssets.createModelEntity(.plane) // Loads from bundle and performs config
    let material = try await ShaderGraphMaterial(named: "/Root/TestMaterial",
                                                 from: "Test",
                                                 in: realityKitContentBundle)
    entity.applyToChildren {
        $0.components[ModelComponent.self]?.materials = [material]
    }
    root.addChild(entity)
} catch {
    fatalError(error.localizedDescription)
}
```
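applyToChildren in the snippet above isn't a built-in RealityKit API; a recursive walk over the hierarchy is one way to implement it. A minimal sketch, assuming every descendant's materials should be replaced wholesale:

```swift
import RealityKit

extension Entity {
    // Hypothetical helper: applies `update` to this entity and every descendant.
    func applyToChildren(_ update: (Entity) -> Void) {
        update(self)
        for child in children {
            child.applyToChildren(update)
        }
    }
}
```

Only entities that carry a ModelComponent render geometry, so applying the material at just the topmost entity won't affect children that have their own ModelComponent.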
3 replies · 0 boosts · 643 views · Oct ’24
How to use CubeMap in Reality Composer Pro?
I just did what the documentation at https://developer.apple.com/documentation/shadergraph/realitykit/cube-image-(realitykit) describes. I have a .ktx file and use a CubeImage node to load it, then a Convert node, but it shows black. I checked it on my Vision Pro and it's still black; I don't know why. Is something wrong? P.S.: I also used an Image node to load the same .ktx file, and it shows one image, so I believe the .ktx file is fine. I also checked that on the Vision Pro.
2 replies · 0 boosts · 597 views · Oct ’24
Getting vector3 or vector4 per-vertex data into Shader Graph Editor
Hi, I'm trying to understand how I can get 3- or 4-channel per-vertex data into the Graph Editor. From my tests, it seems that:

- the "Geometric Property" node does not give access to 4-channel data;
- "Geometric Property (vector3)" does not give access to custom properties besides the ones defined in the MaterialX core;
- the "Texture Coordinates" node has a vector4f mode (yay!), but according to the MaterialX spec, texcoords must have 2 or 3 channels, and I can't get 4-channel data to show up there either.

My assumption so far is that I must be missing some "magic": for example, do the primvars in a file have to be in a specific order, independent of their names? Or do their names matter? (E.g., the convention would be primvars:st, primvars:st1, and so on.) Unfortunately the forum doesn't allow me to attach any USDZ or ZIP files or GDrive links; if there's a way to share a test file, I'm happy to do so!
0 replies · 0 boosts · 507 views · Sep ’24
How to remove the impact of AR real environment light sources on materials
The 3D furniture model I built uses some smooth, specular-reflective materials. I want them to reflect only the HDR image of the ImageBasedLight component I set myself, without reflecting the light sources of the real AR environment. How can I achieve this in the following scenarios? How do I avoid being affected by the real environment's lighting when using PBR materials? And when using Shader Graph, how can EnvironmentRadiance not be affected by the real environment's lighting?
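For the PBR case, RealityKit's image-based-lighting components let an entity take its lighting from a supplied environment map; whether this fully suppresses real-world reflections on visionOS isn't confirmed here. A minimal sketch, assuming an HDR environment resource named "studio" in the app bundle (hypothetical name):

```swift
import RealityKit

// Hypothetical: light the furniture entity from a bundled HDR image
// instead of the captured surroundings.
func applyCustomIBL(to furniture: Entity) async throws {
    let environment = try await EnvironmentResource(named: "studio")
    var ibl = ImageBasedLightComponent(source: .single(environment))
    ibl.inheritsRotation = true
    furniture.components.set(ibl)
    // The receiver component points the entity at the light defined above.
    furniture.components.set(ImageBasedLightReceiverComponent(imageBasedLight: furniture))
}
```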
2 replies · 0 boosts · 571 views · Sep ’24