SceneKit


Create 3D games and add 3D content to apps using SceneKit's high-level scene descriptions.

SceneKit Documentation

Posts under SceneKit subtopic

Post

Replies

Boosts

Views

Activity

Render to multiple offscreen images with SCNRenderer
I am trying to extract some built-in and custom render passes from SceneKit so that I can pass them into a Metal pipeline and do some additional work with them. I have a Metal viewport and have instantiated an SCNRenderer so that I can render an SCNScene with SceneKit to a texture as part of my Metal draw pass. This works as expected.

Now I want to output multiple textures from the SceneKit render, not just the final color. I want to extract depth, normal, lighting, color, and a custom SCNTechnique for world position. I can easily use an SCNTechnique to render one of these to the color output, but it's not clear how I would render multiple passes in one render call. Is there some way to pass a writable buffer/texture to an SCNTechnique, so that I can populate it in my SCNTechnique shader at render time with the output from the pass, similar to how one would bind a buffer for a Metal shader? SCNTechnique obfuscates things, so it's not clear how to proceed. Does anyone have any ideas?
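For context, the offscreen setup described above can be sketched roughly as follows, assuming the MTLDevice, command queue, and destination color texture already exist in the Metal pipeline. This only produces the final color; pulling out depth, normals, or a world-position pass would still require an SCNTechnique with additional targets or extra render calls.

import SceneKit
import Metal
import QuartzCore

// Minimal sketch: render an SCNScene into an offscreen Metal texture with SCNRenderer.
// The texture, device, and queue are assumed to come from the existing Metal pipeline.
func renderSceneOffscreen(scene: SCNScene,
                          device: MTLDevice,
                          commandQueue: MTLCommandQueue,
                          colorTexture: MTLTexture) {
    let renderer = SCNRenderer(device: device, options: nil)
    renderer.scene = scene

    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = colorTexture
    passDescriptor.colorAttachments[0].loadAction = .clear
    passDescriptor.colorAttachments[0].storeAction = .store

    guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }
    renderer.render(atTime: CACurrentMediaTime(),
                    viewport: CGRect(x: 0, y: 0, width: colorTexture.width, height: colorTexture.height),
                    commandBuffer: commandBuffer,
                    passDescriptor: passDescriptor)
    commandBuffer.commit()
}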
0
0
625
Dec ’24
Camera zoom in to 3D point in SceneKit scene
I would like to implement zoom functionality in my SceneKit game: when the user performs the pinch gesture on a point on the screen, the scene zooms in to make that point larger. Until now I simply changed SCNCamera.focalLength, but this only zooms in on the center of what is currently visible on screen. Is it somehow possible to implement the zoom functionality described above by interactively rotating the camera towards the pinched point at the same time? Is there a formula for this? I would like to avoid suddenly rotating the camera to face the pinched point when the pinch gesture begins and then zooming in while the pinch is in progress.
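As a rough illustration of the approach being asked about (a sketch under assumptions, not from the post): hit-test at the pinch location when the gesture begins to get a world-space target point, remember the camera's starting orientation and focal length, then blend the orientation toward the target while scaling the focal length as the pinch progresses. The function and parameter names below are hypothetical.

import SceneKit
import simd

// Sketch, assuming `cameraNode` sits directly under the root node, `targetPoint` came from a
// hit test at the pinch location, and `baseOrientation`/`baseFocalLength` were captured when
// the pinch gesture began.
func applyZoom(cameraNode: SCNNode,
               targetPoint: simd_float3,
               pinchScale: Float,
               baseOrientation: simd_quatf,
               baseFocalLength: CGFloat) {
    // Orientation that would look straight at the pinched point (ignoring roll).
    let toTarget = simd_normalize(targetPoint - cameraNode.simdWorldPosition)
    let lookAtTarget = simd_quatf(from: simd_float3(0, 0, -1), to: toTarget)

    // Blend factor: 0 when not zoomed, approaching 1 as the user zooms in further.
    let t = min(max(1 - 1 / max(pinchScale, 1), 0), 1)

    cameraNode.simdOrientation = simd_slerp(baseOrientation, lookAtTarget, t)
    cameraNode.camera?.focalLength = baseFocalLength * CGFloat(pinchScale)
}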
0
0
586
Dec ’24
SceneKit Transparent Material Self-Overlapping Issue (Front Face Overlapping)
Description: I'm developing an AR effect using SceneKit and applying a transparent material to a face mesh. However, I'm facing an issue where the front faces of the mesh overlap each other, causing incorrect rendering.

Problem: The front faces of the mesh overlap with each other when transparency is applied. This causes areas like the cheeks to be visible through the nose, even though they should be occluded.

Expected Behavior: The material should behave as if it were opaque to itself: overlapping front faces should be occluded properly, while still allowing transparency for background elements.

Actual Behavior: The mesh renders its own front faces incorrectly, making parts of the face visible through others when they should be blocked.

What I Have Tried:

testMaterial.writesToDepthBuffer = true
testMaterial.readsFromDepthBuffer = true

Question:
👉 How can I prevent SceneKit's transparent material from rendering overlapping front faces?
👉 Is there a way to force SceneKit to treat its own mesh as opaque for itself while still being transparent to the background?
👉 Does SceneKit support a proper depth pre-pass or an equivalent to Unity's ZWrite shaders to solve this issue?

Attached screenshots demonstrate the problem visually. Any help would be greatly appreciated! 🚀
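One workaround often suggested for this kind of self-transparency problem (sketched below, not taken from the post) is a manual depth pre-pass: draw a copy of the mesh first that writes only depth and no color, so the transparent pass afterwards is occluded by its own front faces. `faceNode` is a hypothetical name for the node holding the face geometry.

import SceneKit

// Sketch of a depth pre-pass, assuming `faceNode` holds the transparent face mesh.
func addDepthPrePass(to faceNode: SCNNode) {
    guard let prePassGeometry = faceNode.geometry?.copy() as? SCNGeometry else { return }

    let depthOnlyMaterial = SCNMaterial()
    depthOnlyMaterial.colorBufferWriteMask = []     // write nothing to the color buffer...
    depthOnlyMaterial.writesToDepthBuffer = true    // ...but fill the depth buffer
    prePassGeometry.materials = [depthOnlyMaterial]

    let prePassNode = SCNNode(geometry: prePassGeometry)
    prePassNode.renderingOrder = faceNode.renderingOrder - 1  // draw before the transparent mesh
    faceNode.addChildNode(prePassNode)                        // shares the face node's transform

    // The visible transparent material then only needs to respect the pre-written depth.
    faceNode.geometry?.firstMaterial?.readsFromDepthBuffer = true
    faceNode.geometry?.firstMaterial?.writesToDepthBuffer = false
}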
0
2
454
Feb ’25
Swift game sometimes runs on efficiency cores then snaps back to performance cores
I've been working on a Swift game which is not yet launched or available for preview. The game works in such a way that the CPU is idle while the user is thinking, and CPU and GPU usage is sustained at maximum on as many cores as possible when the player makes a move. Rarely, due to OS activity or something else outside of my control (for example, pulling down the OS curtain even just briefly and then dismissing it), the game or some of its threads are moved to efficiency cores, which results in major stuttering. The stuttering persists precisely until the game is idle again, at which point the game is moved back to performance cores; but if the player keeps making moves, the stuttering simply won't go away, so I guess the computation is locked onto the efficiency cores. The issue does not reproduce on Mac Catalyst on Intel. How do I tell Swift to avoid efficiency cores? BTW, Swift and SceneKit have AMAZING performance, especially when compared to others.
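Not from the post, but for illustration: the main lever an app has over performance- vs. efficiency-core placement is the quality-of-service class of the work it dispatches. A sketch of marking the move computation as user-interactive work follows; the scheduler still has the final say, and the queue label and function name are placeholders.

import Foundation

// Hypothetical queue for the move computation; .userInteractive hints that this work
// should not be demoted to efficiency cores.
let moveQueue = DispatchQueue(label: "game.move.computation",
                              qos: .userInteractive,
                              attributes: .concurrent)

func computeMoveAsync(_ work: @escaping () -> Void) {
    // .enforceQoS keeps the requested QoS even if the submitting context has a lower one.
    moveQueue.async(qos: .userInteractive, flags: .enforceQoS) {
        work()
    }
}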
0
0
91
Mar ’25
SCNTechnique clearColor Always Shows sceneBackground When Passes Share Depth Buffer
Problem Description

I'm encountering an issue with SCNTechnique where the clearColor setting is being ignored when multiple passes share the same depth buffer. The clear color always appears as the scene background, regardless of what value I set.

The minimal project for reproducing the issue: https://www.dropbox.com/scl/fi/30mx06xunh75wgl3t4sbd/SCNTechniqueCustomSymbols.zip?rlkey=yuehjtk7xh2pmdbetv2r8t2lx&st=b9uobpkp&dl=0

Problem Details

In my SCNTechnique configuration, I have two passes that need to share the same depth buffer for proper occlusion handling:

"passes": [
    "box1_pass": [
        "draw": "DRAW_SCENE",
        "includeCategoryMask": 1,
        "colorStates": [
            "clear": true,
            "clearColor": "0 0 0 0" // Expecting transparent black
        ],
        "depthStates": [
            "clear": true,
            "enableWrite": true
        ],
        "outputs": [
            "depth": "box1_depth",
            "color": "box1_color"
        ],
    ],
    "box2_pass": [
        "draw": "DRAW_SCENE",
        "includeCategoryMask": 2,
        "colorStates": [
            "clear": true,
            "clearColor": "0 0 0 0" // Also expecting transparent black
        ],
        "depthStates": [
            "clear": false,
            "enableWrite": false
        ],
        "outputs": [
            "depth": "box1_depth", // Sharing the same depth buffer
            "color": "box2_color",
        ],
    ],
    "final_quad": [
        "draw": "DRAW_QUAD",
        "metalVertexShader": "myVertexShader",
        "metalFragmentShader": "myFragmentShader",
        "inputs": [
            "box1_color": "box1_color",
            "box2_color": "box2_color",
        ],
        "outputs": [
            "color": "COLOR"
        ]
    ]
]

And the Metal shader used to display box1_color and box2_color with splitting:

fragment half4 myFragmentShader(VertexOut in [[stage_in]],
                                texture2d<half, access::sample> box1_color [[texture(0)]],
                                texture2d<half, access::sample> box2_color [[texture(1)]]) {
    half4 color1 = box1_color.sample(s, in.texcoord);
    half4 color2 = box2_color.sample(s, in.texcoord);
    if (in.texcoord.x < 0.5) {
        return color1;
    }
    return color2;
};

Expected Behavior

Both passes should clear their color targets to transparent black (0, 0, 0, 0).
The depth buffer should be shared between passes for proper occlusion.

Actual Behavior

Both box1_color and box2_color targets contain the scene background instead of being cleared to transparent (see attached image).
This happens even when I explicitly set clearColor: "0 0 0 0" for both passes.
Setting scene.background.contents = UIColor.clear makes the clearColor work as expected, but I need to keep the scene background for other purposes.

What I've Tried

Setting different clearColor values - all are ignored when sharing the depth buffer.
Using DRAW_NODE instead of DRAW_SCENE - didn't solve the issue.
Creating a separate pass to capture the background - the background still appears in the other passes.
Various combinations of clear flags and render orders.

Environment

iOS/macOS, running with "My Mac (Designed for iPad)"
Xcode 16.2

Question

Is this a known limitation of SceneKit when passes share a depth buffer? Is there a workaround to achieve truly transparent clear colors while maintaining a shared depth buffer for occlusion testing?

The core issue seems to be that SceneKit automatically renders the scene background in every DRAW_SCENE pass when a shared depth buffer is detected, overriding any clearColor settings. Any insights or workarounds would be greatly appreciated. Thank you!
0
0
119
Jun ’25
Can you reduce draw calls with flattenedClone() and PNGs with transparency?
Hello, I want to use a flattenedClone() node for the trees in my landscape to reduce the draw call count. If my texture uses a JPG file the result is good. If I use a PNG with transparency, there is no draw call reduction. Can I use PNGs with transparency with flattenedClone()?

Below is the structure I'm using:

for child in scene_temp.rootNode.childNodes {
    let newtree = tree.clone()
    sum_tree.addChildNode(newtree)
}
let sum_tree_flat = sum_tree.flattenedClone()
scene.rootNode.addChildNode(sum_tree_flat)

If the tree texture contains a JPG file (diffuse), I get a draw call reduction (< 200 for my complete map) as expected, but no transparency on my leaves. If I use a PNG file, my draw calls are not reduced (> 2000), but I do get my trees with transparent leaves. I've tried replacing tree.clone() with tree.flattenedClone() with no results. Thanks for your help.
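One commonly suggested workaround (a sketch under assumptions, not from the post) is to keep the leaf material non-blended and discard transparent pixels in a fragment shader modifier instead; because the material is then not treated as blended transparency, flattenedClone() batching may remain effective. The texture name below is hypothetical.

import SceneKit
import UIKit

// Sketch: alpha-cutout leaves via a shader modifier, so the material stays non-blended.
let leafMaterial = SCNMaterial()
leafMaterial.diffuse.contents = UIImage(named: "tree_leaves.png")  // hypothetical asset name
leafMaterial.blendMode = .replace                                  // no alpha blending
leafMaterial.shaderModifiers = [
    .fragment: """
    // Discard mostly transparent pixels instead of blending them.
    if (_output.color.a < 0.5) {
        discard_fragment();
    }
    """
]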
1
0
669
Dec ’24
SCNCamera SSAO on visionOS
Hi, looking at the documentation for screenSpaceAmbientOcclusionIntensity, I noticed that it says this is supported on visionOS 1.0+: https://developer.apple.com/documentation/scenekit/scncamera/screenspaceambientocclusionintensity Could someone enlighten me as to how that would work? As far as I know, we don't use an SCNCamera on visionOS. So, what's the idea here? Can we activate SSAO on visionOS?
1
0
392
Feb ’25
SceneKit - different behavior when debugging
Hello, I'm currently working on my first SceneKit game and have encountered an issue related to moving an SCNNode using a UIPanGestureRecognizer. When I deploy the game to my iPhone via Xcode in debug mode, all interactions are smooth. However, when I stop the debugging session and run the game directly from the device (outside of Xcode), the SCNNode movement behaves inconsistently: it sometimes works smoothly and sometimes the interaction becomes choppy. The SCNNode movement is controlled using a UIPanGestureRecognizer. Do you have any ideas what might be causing the issue?
1
0
131
May ’25
How to play Vorbis/OGG files with swift?
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. Then I was required to use Objective-C++ to fill the PCM, because this method seems to only be available in C++; that part hangs my app for a good 40 seconds to several minutes depending on the audio file, then it plays for about 2 seconds and crashes. I can't get the examples on the Vorbis site to work in Objective-C, and I tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac). I also tried using the Cricket Audio framework below.

https://github.com/sjmerel/ck

It has a Swift example and it can play their proprietary soundbank format, but it is also supposed to play OGG and it just doesn't do anything when trying to play OGG, as you can see in the posted issue: https://github.com/sjmerel/ck/issues/3

Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++. Anyway, any help/advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really, really want to stay in Swift.
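For what it's worth, once the Vorbis wrapper has produced decoded PCM samples, playback itself can stay in pure Swift via AVAudioEngine. A sketch under that assumption (the interleaved Float32 samples are whatever the decoder yields; the function name is hypothetical):

import AVFoundation

// Sketch: play interleaved Float32 PCM (e.g. from a Vorbis decoder) with AVAudioEngine.
func playDecodedPCM(samples: [Float], sampleRate: Double, channels: AVAudioChannelCount) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let frameCount = AVAudioFrameCount(samples.count / Int(channels))
    guard let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: channels),
          let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
        throw NSError(domain: "OggPlayback", code: -1)
    }
    buffer.frameLength = frameCount

    // De-interleave the samples into the buffer's planar channel data.
    for channel in 0..<Int(channels) {
        let dst = buffer.floatChannelData![channel]
        for frame in 0..<Int(frameCount) {
            dst[frame] = samples[frame * Int(channels) + channel]
        }
    }

    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: format)
    try engine.start()
    player.scheduleBuffer(buffer, completionHandler: nil)
    player.play()

    // Return the engine so the caller keeps it (and playback) alive.
    return engine
}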
2
0
4.1k
Jul ’25
The Action editor doesn't work in Xcode 16
The Actions Editor doesn't seem to work in Xcode 16.1. With a node selected, when I try to drag an action from the Library into the Actions panel, nothing happens; the action icon just disappears. Clicking the '+' button to create a new action doesn't work either. Actions do work when created in code, though. Is this a bug, or am I missing something?
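For reference, the in-code path mentioned at the end of the post looks roughly like this (a sketch; the node, movement values, and function name are placeholders):

import SceneKit

// Sketch: building an action programmatically instead of in the Xcode Actions editor.
func runBounce(on node: SCNNode) {
    let bounce = SCNAction.sequence([
        .moveBy(x: 0, y: 1, z: 0, duration: 0.3),
        .moveBy(x: 0, y: -1, z: 0, duration: 0.3)
    ])
    node.runAction(.repeatForever(bounce))
}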
2
2
660
Dec ’24
SceneKit
Hello there! Currently, I'm attempting to create an interactive learning application with a 3D view. I've discovered the SceneKit framework, but I lack the necessary knowledge to animate, load, and move objects. Could someone kindly suggest some good articles or tutorials on this topic?
2
0
570
Feb ’25
SceneKit SCNMorpher Supports SCNGeometry with Some SCNLevelOfDetail
In my project, I have several nodes (SCNNode) with multiple levels of detail (SCNLevelOfDetail) and everything works correctly, but when I add animation using morphing (SCNMorpher), the animation works correctly but without the levels of detail. Note: the entire scene is created in Autodesk 3D Studio Max and then exported in (.ASE) format. The goal is to make animations using morphing that have levels of detail. Does anyone know if SCNMorpher supports geometry with multiple levels of detail? I appreciate any information about this case. Thanks everyone!!!

Part of the code I use to load geometries (SCNGeometry) with levels of detail (SCNLevelOfDetail) using morphing (SCNMorpher):

node.morpher = [SCNMorpher new];
SCNGeometry *geometry = [self geometryWithMesh:mesh];
NSMutableArray <SCNLevelOfDetail *> *mutLevelOfDetail = [NSMutableArray arrayWithCapacity:self.mutLevelsOfDetail.count];
for (int i = 0; i < self.mutLevelsOfDetail.count; i++) {
    ASCGeomObject *geomObject = self.mutLevelsOfDetail[i];
    SCNGeometry *lodGeometry = [self geometryWithMesh:geomObject.mesh.mutMeshAnimation[i]];
    [mutLevelOfDetail addObject:[SCNLevelOfDetail levelOfDetailWithGeometry:lodGeometry worldSpaceDistance:geomObject.worldSpaceDistance]];
}
geometry.levelsOfDetail = mutLevelOfDetail;
node.morpher.targets = [node.morpher.targets arrayByAddingObject:geometry];
2
0
179
Mar ’25
Error: this SDK is not supported by the compiler - Xcode version 16.2 (16C5032a) to 16.3 (16E140)
In Xcode version 16.2 (16C5032a) I created:

One [ScenekitApp] [File/New/Project/iOS/Game/Game Technology: SceneKit].
Three custom frameworks [File/New/Project/iOS/Framework]: [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework].

In the four projects, [Minimum Deployments=iOS 15.0], [Swift version=6.0] and [BUILD_LIBRARY_FOR_DISTRIBUTION=Yes]. I installed [Zip.framework] directly in [ASCSource.framework] without [pod init/pod install], and in the [General] tab under [Frameworks, Libraries and Embedded Content] set [Zip.framework] to Embed & Sign. I installed the three frameworks [ASCSource.framework, ASCCommon.framework and ASCCoreData.framework] in [ScenekitApp] and everything works perfectly in Xcode version 16.2 (16C5032a).

I updated Xcode version 16.2 (16C5032a) to Xcode version 16.3 (16E140) and errors were reported for some of the SDKs:

[Failed to build module 'ASCSource'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK]

[Failed to build module 'Zip'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.0.3 effective-5.10 (swiftlang-6.0.3.1.10 clang-1600.0.30.1)', while this compiler is 'Apple Swift version 6.1 effective-5.10 (swiftlang-6.1.0.110.21 clang-1700.0.13.3)'). Please select a toolchain which matches the SDK]

If I recompile all the frameworks in Xcode version 16.3 (16E140) it works, but I don't think that's a good solution. I found some discussions at links like the following, without success:

https://github.com/firebase/firebase-ios-sdk/issues/13727
https://stackoverflow.com/questions/70556401/swift-version-conflict-this-sdk-is-not-supported-by-the-compiler-please-select

Any help is welcome. Thanks everyone!!!
2
0
276
Apr ’25
SceneKit minimumPointScreenSpaceRadius does not work
In the SceneKit framework, I want to render a point cloud and need to set the minimumPointScreenSpaceRadius property of the SCNGeometryElement class, but it doesn't work. I searched for related issues on the Internet, but didn't get a final solution. Here is my code:

- (SCNGeometry *)createGeometryWithVector3Data:(NSData *)vData colorData:(NSData * _Nullable)cData {
    int stride = sizeof(SCNVector3);
    long count = vData.length / stride;
    SCNGeometrySource *dataSource = [SCNGeometrySource geometrySourceWithData:vData
                                                                     semantic:SCNGeometrySourceSemanticVertex
                                                                  vectorCount:count
                                                              floatComponents:YES
                                                          componentsPerVector:3
                                                            bytesPerComponent:sizeof(float)
                                                                   dataOffset:0
                                                                   dataStride:stride];
    SCNGeometryElement *element = [SCNGeometryElement geometryElementWithData:nil
                                                                primitiveType:SCNGeometryPrimitiveTypePoint
                                                               primitiveCount:count
                                                                bytesPerIndex:sizeof(int)];
    element.minimumPointScreenSpaceRadius = 1.0; // not work
    element.maximumPointScreenSpaceRadius = 20.0;
    SCNGeometry *pointCloudGeometry;
    SCNGeometrySource *colorSource;
    if (cData && cData.length != 0) {
        colorSource = [SCNGeometrySource geometrySourceWithData:cData
                                                       semantic:SCNGeometrySourceSemanticColor
                                                    vectorCount:count
                                                floatComponents:YES
                                            componentsPerVector:3
                                              bytesPerComponent:sizeof(float)
                                                     dataOffset:0
                                                     dataStride:stride];
        pointCloudGeometry = [SCNGeometry geometryWithSources:@[dataSource, colorSource] elements:@[element]];
    } else {
        pointCloudGeometry = [SCNGeometry geometryWithSources:@[dataSource] elements:@[element]];
    }
    return pointCloudGeometry;
}
2
0
94
Apr ’25
SceneKit view and SceneKit editor color difference
Hello there, I'm having trouble matching what I see in the SceneKit editor with the output of the resulting scene in an SCNView. For a glitter effect I have set a high value on the diffuse intensity, which looks fine in the editor, but when running the game the colors are much darker. To see if the intensity value is merely capped, I have set the same multiplier on the hat below, but it is blown out, which looks to me like there is some grading going on.

I have tried to switch on HDR rendering but that didn't make a difference. I tried disabling linear rendering and that simply made everything darker still, which I expect.

Does someone have an idea what else this could be? What rendering is the SceneKit editor using and how can I match it? Interestingly, when I take a screenshot of the editor window for this post, the image is also blown out... what is going on? :)

Thanks so much for any pointers,
Seb
3
0
102
Apr ’25
SCNScene.write To USDZ, Node color fades
When I use SCNScene.write(to:) to export a scene to a ".usdz" file and then open that ".usdz" with SCNScene(url:), the colors of all nodes fade. After exporting and loading only once, the fading is not obvious. Repeating the export/load cycle 4 or 5 times makes it obvious that the colors are inconsistent with the originals and have become much lighter.
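For context, the round trip being described is roughly the following (a sketch; `scene`, `url`, and the function name are placeholders):

import SceneKit

// Sketch: export a scene to .usdz and reload it, the cycle the post says lightens node colors.
func roundTrip(scene: SCNScene, url: URL) throws -> SCNScene {
    _ = scene.write(to: url, options: nil, delegate: nil, progressHandler: nil)
    return try SCNScene(url: url, options: nil)
}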
3
0
124
May ’25
Can SceneKit be used with Swift 6 Concurrency?
I am trying to port SceneKit projects to Swift 6, and I just can't figure out how that's possible. I even start thinking SceneKit and Swift 6 concurrency just don't match together, and SceneKit projects should, hopefully for the time being only, stick to Swift 5.

The SCNSceneRendererDelegate methods are called on the SceneKit thread. If the delegate is a ViewController:

class GameViewController: UIViewController {
    let aNode = SCNNode()

    func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        aNode.position.x = 10
    }
}

then the compiler generates the error "Main actor-isolated instance method 'renderer(_:updateAtTime:)' cannot be used to satisfy nonisolated protocol requirement", which is fully understandable. The compiler even tells you those methods can't be used for protocol conformance, unless:

Conformance is declared as @preconcurrency SCNSceneRendererDelegate, like this:

class GameViewController: UIViewController, @preconcurrency SCNSceneRendererDelegate {

But that just delays the check to runtime, and therefore a crash in the SceneKit thread happens at runtime... Again, fully understandable.

Or the delegate method is declared nonisolated, like this:

nonisolated func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
    aNode.position.x = 10
}

which generates the compiler error "Main actor-isolated property 'position' can not be mutated from a nonisolated context". Again, fully understandable.

If the delegate is not a ViewController but a nonisolated class, we also have the problem that SCNNode can't be used. Nearly 100% of the SCNSceneRendererDelegate implementations I've seen do use SCNNode or similar MainActor-bound types, because they are meant for that.

So, where am I wrong? What is the solution to use SceneKit SCNSceneRendererDelegate methods with full Swift 6 compilation? Is that even possible for now?
5
0
1.1k
Nov ’24