SceneKit

Create 3D games and add 3D content to apps using SceneKit's high-level scene descriptions.

Posts under SceneKit tag

74 Posts

Using LiDAR DepthData with ARKit and SceneKit
Greetings! I have used Apple's ARKit documentation to create a simple ARKit application that uses SceneKit (I tried Metal too). I am currently unsure how to use smoothedSceneDepth (or sceneDepth) to acquire the depth data from the depth map delivered to the view. Is there a particular method or way I can access this data for displaying the depth? I would be grateful for any input or suggestions. Thanks in advance!
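For reference, a minimal sketch of the usual access pattern, assuming an ARSCNView-backed session (the DepthReader class name is illustrative):

```swift
// Enable smoothed scene depth in the configuration, then read the depth
// map from each frame in the session delegate.
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    func run(on sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
            configuration.frameSemantics.insert(.smoothedSceneDepth)
        }
        sceneView.session.delegate = self
        sceneView.session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthData = frame.smoothedSceneDepth else { return }
        let depthMap: CVPixelBuffer = depthData.depthMap // Float32 depth in meters
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        // ... sample or visualize the depth values here ...
        CVPixelBufferUnlockBaseAddress(depthMap, .readOnly)
    }
}
```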
0 replies · 0 boosts · 459 views · Dec ’23

SceneKit crash on iOS 17
I have found SceneKit crashing very frequently on iOS 17, for all devices. Here is the crash trace:

```
Crashed: com.apple.scenekit.renderingQueue.SCNView0x15878c630
0  SceneKit  0x3eee4   C3DMatrix4x4GetAffineTransforms + 344
1  SceneKit  0x30208   C3DAdjustZRangeOfProjectionInfos + 140
2  SceneKit  0x2c0a90  C3DCullingContextSetupPointOfViewMatrices + 700
```

The attachment has the whole log (Crash Log). Does anybody know how to fix it?
1 reply · 0 boosts · 433 views · Dec ’23

iOS 16 & 17 touch input stutter on ProMotion devices. Workaround?
The touch input stutter issue that has existed since iOS 16 on devices with ProMotion displays has not been fixed yet. I filed a bug report in July, but there has been no progress for months. I see the problem in all games I have tried. My game is fast paced, so the stutters are quite obvious, and I receive a lot of complaining emails. My game ran smoothly on ProMotion devices with iOS 15. Is there a known workaround? I see other developers having the same issue, but I can't find any solutions. Other threads about this issue: "iPhone 14 Pro stuttering in most games when using touch controls" and "FPS drops when tapping the screen on iPhone 13 Pro Max".
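One experiment worth trying, offered only as a hedged sketch rather than a confirmed fix: pin the SCNView to a fixed frame rate so ProMotion's variable refresh rate is taken out of the equation.

```swift
// A hedged experiment, not a confirmed fix: cap the view at 60 fps so the
// display's refresh-rate switching cannot interact with touch delivery.
import SceneKit

func pinFrameRate(of view: SCNView) {
    view.preferredFramesPerSecond = 60
}
```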
0 replies · 0 boosts · 676 views · Dec ’23

Exporting models from Maya to SceneKit with animation
I am trying to use my animated model in Xcode with SceneKit. I exported my model from Maya with animation data in .usd format, then converted it to .usdz with Reality Converter. When I open it in the Xcode viewer, it is animated and everything is fine. However, when I try to use it in my app, it doesn't animate. On the other hand, when I try the robot_walk_idle model from Apple's example models, it is animated. Maybe I am missing an option in the export settings. Thanks for any help.

```swift
import SwiftUI
import SceneKit

struct ModelView: View {
    var body: some View {
        VStack {
            SceneView(scene: SCNScene(named: "robot_walk_idle.usdz"))
        }
    }
}
```
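In case it helps, a hedged sketch that explicitly starts any animation players found in the loaded hierarchy, in case they arrive paused:

```swift
// Walk the loaded node tree and explicitly play any SCNAnimationPlayer
// attached to its nodes.
import SceneKit

func playAllAnimations(in scene: SCNScene) {
    scene.rootNode.enumerateHierarchy { node, _ in
        for key in node.animationKeys {
            node.animationPlayer(forKey: key)?.play()
        }
    }
}
```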
1 reply · 0 boosts · 639 views · Dec ’23

SceneKit rotation - rotating around X and Y axes only, causing Z rotation
I am trying to control the orientation of a box in SceneKit (iOS) using gestures. I am using the translation in x and y to update the x and y rotation of the SCNNode. After a long search, I have realised that x and y rotation will always lead to z rotation, thanks to this excellent post: https://gamedev.stackexchange.com/questions/136174/im-rotating-an-object-on-two-axes-so-why-does-it-keep-twisting-around-the-thir?newreg=130c66c673f848a7be2873bf675573a9

So I am trying to find the z rotation this causes and then remove it from my object by applying the inverse quaternion. However, when I rotate the object 90° around x and then 90° around y, it behaves VERY weirdly. It is almost behaving as if it is in gimbal lock, but I did not think that using quaternions in the way I am would cause gimbal lock like this. I am sure there is something I am missing, or perhaps I am not able to remove the z rotation this way. Thanks! I have added a video of the strange behaviour here: https://github.com/marcusraty/RotationExample/blob/main/Example.MP4 And the code example is here: https://github.com/marcusraty/RotationExample
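A hedged sketch of one common way around this: apply each pan increment as a world-axis quaternion pre-multiplied onto the current orientation, so rotations accumulate about fixed screen axes rather than the node's own axes (the sensitivity value is illustrative):

```swift
// Pre-multiplying each increment applies it about fixed world axes,
// which avoids the accumulated "twist" around z.
import SceneKit
import simd

func applyPan(dx: Float, dy: Float, to node: SCNNode) {
    let sensitivity: Float = 0.01
    let yaw   = simd_quatf(angle: dx * sensitivity, axis: SIMD3<Float>(0, 1, 0))
    let pitch = simd_quatf(angle: dy * sensitivity, axis: SIMD3<Float>(1, 0, 0))
    node.simdWorldOrientation = simd_normalize(yaw * pitch * node.simdWorldOrientation)
}
```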
0 replies · 0 boosts · 810 views · Dec ’23

supportsDirectionMeasurement returns false and I can't use SceneKit
Hi everyone, I am trying out the Nearby Interaction framework and SceneKit on iOS 17.1.1. I am testing on an iPhone 15 Pro Max and an iPhone 12 Pro Max.

```swift
if NISession.deviceCapabilities.supportsDirectionMeasurement {
    print("Interact using device distance and direction.")
} else if NISession.deviceCapabilities.supportsPreciseDistanceMeasurement {
    print("Interact using distance only.")
}
```

The iPhone 12 Pro Max works normally, but the supportsDirectionMeasurement property on the iPhone 15 Pro Max returns false and I can't use SceneKit. Is anyone experiencing the same issue? Regards, Shin
0 replies · 0 boosts · 553 views · Nov ’23

SwiftUI SceneView not receiving tap gestures on macOS
In an Xcode multiplatform app, the following works as expected (detecting tap gestures) in the iOS simulator, but not when compiled and run on "My Mac" -- on macOS the view doesn't seem to get any clicks. Does anyone know a way to get this to work? One of the reasons

```swift
import SwiftUI
import SceneKit

class RenderDelegate: NSObject, SCNSceneRendererDelegate {
    var lastRenderer: SCNSceneRenderer!

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        lastRenderer = renderer
    }
}

class Model: ObservableObject {
    let scene = SCNScene()
    let renderDelegate = RenderDelegate()
}

struct ContentView: View {
    @ObservedObject var model = Model()
    @State private var pointOfView = "distantCamera"

    init() {
        let sphereGeometry = SCNSphere(radius: 0.05)
        #if os(iOS) || os(watchOS) || os(tvOS)
        sphereGeometry.firstMaterial?.diffuse.contents = UIColor.red
        #else
        sphereGeometry.firstMaterial?.diffuse.contents = NSColor.red
        #endif
        let sphereNode = SCNNode(geometry: sphereGeometry)
        sphereNode.position = SCNVector3Make(0.0, 0.0, 0.0)
        model.scene.rootNode.addChildNode(sphereNode)
    }

    var body: some View {
        ZStack {
            SceneView(
                scene: model.scene,
                options: [.allowsCameraControl],
                delegate: model.renderDelegate
            ).onTapGesture { print("tap") }
        }
        .padding()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
```
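One possible workaround, sketched under the assumption that AppKit's gesture recognizers still receive the clicks the SwiftUI gesture system misses (the ClickableSceneView name is illustrative):

```swift
// A hedged macOS workaround sketch: wrap SCNView in an NSViewRepresentable
// and attach an NSClickGestureRecognizer to the view directly.
import SwiftUI
import SceneKit

struct ClickableSceneView: NSViewRepresentable {
    let scene: SCNScene

    func makeNSView(context: Context) -> SCNView {
        let view = SCNView()
        view.scene = scene
        view.allowsCameraControl = true
        let click = NSClickGestureRecognizer(target: context.coordinator,
                                             action: #selector(Coordinator.handleClick(_:)))
        view.addGestureRecognizer(click)
        return view
    }

    func updateNSView(_ nsView: SCNView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator() }

    final class Coordinator: NSObject {
        @objc func handleClick(_ gesture: NSClickGestureRecognizer) {
            print("tap")
        }
    }
}
```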
0 replies · 0 boosts · 400 views · Nov ’23

Rotating SceneKit IBL lighting environment
I have a spherical HDR image that is being used for environment lighting in a SceneKit scene. I want to rotate the environment image. To set the environment lighting, I use the lightingEnvironment SCNMaterialProperty. This works fine, and my scene is lit using the IBL. As with any SCNMaterialProperty, I expect that I can use the contentsTransform property to rotate or transform the HDR. So I set it as follows:

```swift
lightingEnvironment.contentsTransform = SCNMatrix4MakeRotation((45.0).degreesAsRadians, 0.0, 1.0, 0.0)
```

My expectation is that the lighting environment would rotate 45 degrees in y, but it doesn't change at all. Even if I throw in a completely random transform on all axes, there is no apparent change. To test whether there is a change, I added a chrome ball and a diffuse ball to my scene, and I'm comparing reflections on the chrome ball and lighting on the diffuse ball. There is no change on either. It doesn't matter where I set the contentsTransform; it doesn't work. I had intended to set it from the renderer(_:updateAtTime:) method of the SCNSceneRendererDelegate, so that I can rotate the IBL to match the point of view of the scene, but even if I transform the environment immediately after it is set, there is never a change. Is this a bug? Or am I doing something entirely wrong? Has anyone here ever managed to get this to work?
0 replies · 0 boosts · 701 views · Nov ’23

Save and align ARKit created and rendered geometry later in SceneKit
I am exploring ARKit and SceneKit, but I am not sure if what I want to do is possible.

In App1: Run an ARKit session using configuration.sceneReconstruction = .mesh. I am rendering the mesh in func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) and func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor), and the mesh appears correct. I have set up a button that does the following:

- Capture the mesh to an .obj file (based on the excellent answer in here)
- Capture a snapshot (SCNView snapshot)
- Store the AR camera transform
- Store the AR SCNView point-of-view projection transform

I store the transforms in a CSV, taking care to restore them in column-major order.

In App2:

- Load the geometry from the .obj file into an SCNNode, apply no transform to it, and apply a wireframe so I can visualise it
- Add a camera and apply the two saved transforms (the AR camera transform, then the AR SCNView point-of-view projection transform)
- Set the background of the scene to the image from the snapshot

I was expecting the mesh, as visualised by the wireframe in App2, to match the rendering captured at a point in time in App1 -- however, I cannot get it to match. I have two specific questions:

1. Have I missed something fundamental in my understanding of what is possible? Should this be possible, and could I be missing some step? The codebase is large, but I have put the basic outline here.
2. What is the difference between the SCNCamera projection transform and the AR camera projection matrix (see the sketch below)? They appear different in App1, which I did not expect.

Thanks very much
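On question 2, a hedged sketch of how the two matrices could be compared side by side; the orientation and clipping values are illustrative assumptions:

```swift
// Compare the SCNCamera projection with the ARCamera projection for the
// same viewport. The two can legitimately differ, because the SCNCamera
// derives its matrix from its own fieldOfView/zNear/zFar, while ARCamera
// builds one for an explicit orientation, viewport size, and clip planes.
import ARKit
import SceneKit

func compareProjections(in sceneView: ARSCNView, frame: ARFrame) {
    let scnProjection = sceneView.pointOfView?.camera?.projectionTransform
    let arProjection = frame.camera.projectionMatrix(for: .portrait,
                                                     viewportSize: sceneView.bounds.size,
                                                     zNear: 0.001,
                                                     zFar: 1000)
    print(scnProjection as Any)
    print(arProjection)
}
```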
0 replies · 0 boosts · 455 views · Oct ’23

SceneKit leaks IOSurface memory after releasing diffuse contents
I've watched this issue for a long time, but it seems it hasn't been fixed yet. My use case is assigning a UIView to the contents property of an SCNMaterialProperty. It works fine in terms of rendering, but when I assign nil to the property, the IOSurface memory allocated by SceneKit is not destroyed. I've searched around, and many other developers have been hit by this issue. I did a Game Memory profile of my toy example, and the memory allocated by SceneKit (134 MB) had not been released after I assigned nil. I'm sure I released every relevant UIView and view controller used for the contents.
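A minimal sketch of the pattern being described, with illustrative names:

```swift
// Back a material property with a UIView, then clear it. The report is that
// the IOSurface memory SceneKit allocates for the view texture is never freed.
import SceneKit
import UIKit

func reproduce(on node: SCNNode, with backingView: UIView) {
    guard let material = node.geometry?.firstMaterial else { return }
    material.diffuse.contents = backingView // SceneKit allocates IOSurface-backed memory
    // ... render for a while, then release the contents:
    material.diffuse.contents = nil         // reported: the allocation persists
}
```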
0 replies · 0 boosts · 521 views · Oct ’23

Xcode SceneKit Scene Editor (and GKEntity)
Hello, I've got a question about the Xcode Scene Editor -- the SceneKit one, NOT SpriteKit. According to this documentation: https://developer.apple.com/documentation/scenekit/scnnode/2873004-entity the entity property of a node serialised via Xcode's scene editor can be set. While Xcode's SpriteKit scene editor has this option, I cannot find anything similar in the SceneKit editor. So my question is: do .scn files produced by Xcode contain GameplayKit information, such as a GKEntity graph, or only SCNNode data? Do I have to parse the scene and programmatically create GKEntities? If that is the case, there must be an error in the documentation. Thank you!
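For what it's worth, a hedged sketch of the programmatic route, pairing each loaded node with a fresh GKEntity via GKSCNNodeComponent:

```swift
// Build GKEntities for the nodes of a .scn scene at load time, attaching a
// GKSCNNodeComponent so each entity tracks its SceneKit node.
import SceneKit
import GameplayKit

func makeEntities(from scene: SCNScene) -> [GKEntity] {
    var entities: [GKEntity] = []
    scene.rootNode.enumerateChildNodes { node, _ in
        let entity = GKEntity()
        entity.addComponent(GKSCNNodeComponent(node: node))
        entities.append(entity)
    }
    return entities
}
```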
1 reply · 0 boosts · 659 views · Oct ’23

RoomPlan wall group objects: applying texture
I am writing to seek assistance with a challenge I am facing in a 3D model rendering project. I believe your expertise in this area could be immensely helpful in resolving the issue. The problem involves difficulty displaying textures on both parent and child nodes within the 3D model. Here are the key details:

This model contains wall_grp objects (doors, windows, and walls). We are using RoomPlan data in an SCNView; this code depends on the SceneKit and RoomPlan APIs. When we comment out the child-node code it works, but then we don't have windows and doors on the wall.

```swift
func updateWallObjects() {
    if arch_grp.count > 0 {
        if !arch_grp.isEmpty {
            for obj in arch_grp[0].childNodes {
                let color = UIColor(red: 255/255, green: 229/255, blue: 204/255, alpha: 1.0)
                let parentNode = obj.flattenedClone()
                for childObj in obj.childNodes {
                    let childNode = childObj.flattenedClone()
                    let childMaterial = SCNMaterial()
                    childNode.geometry?.materials = [childMaterial]
                    if let name = childObj.name {
                        if removeNumbers(from: name) != "Wall" {
                            childNode.geometry?.firstMaterial?.diffuse.contents = UIColor.white
                        } else {
                            childNode.geometry?.firstMaterial?.diffuse.contents = color
                        }
                    }
                    childObj.removeFromParentNode()
                    parentNode.addChildNode(childObj)
                }
                let material = SCNMaterial()
                parentNode.geometry?.materials = [material]
                parentNode.geometry?.firstMaterial?.diffuse.contents = color
                obj.removeFromParentNode()
                arch_grp[0].addChildNode(parentNode)
            }
        }
    }
}
```

Please suggest a solution.
0 replies · 0 boosts · 484 views · Oct ’23

Texture not applying on RoomPlan wall object (capture data)
We are attempting to update the texture on a node. The code below works correctly when we use a color, but it encounters issues when we attempt to use an image. The image is available in the bundle, and it displays correctly in other parts of our application. This texture is being applied to both the floor and the wall. Please assist us with this issue.

```swift
for obj in Floor_grp[0].childNodes {
    let node = obj.flattenedClone()
    node.transform = obj.transform
    let imageMaterial = SCNMaterial()
    node.geometry?.materials = [imageMaterial]
    node.geometry?.firstMaterial?.diffuse.contents = UIColor.brown
    obj.removeFromParentNode()
    Floor_grp[0].addChildNode(node)
}
```
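A hedged sketch of what usually works for image textures on large captured surfaces; the image name and tiling factor are illustrative:

```swift
// Assign a UIImage instance (not a name string) and set wrap modes so the
// texture tiles across large RoomPlan surfaces instead of stretching.
import SceneKit
import UIKit

func applyTexture(named imageName: String, to node: SCNNode) {
    guard let image = UIImage(named: imageName) else { return }
    let material = SCNMaterial()
    material.diffuse.contents = image
    material.diffuse.wrapS = .repeat
    material.diffuse.wrapT = .repeat
    material.diffuse.contentsTransform = SCNMatrix4MakeScale(4, 4, 1) // illustrative 4x tiling
    node.geometry?.materials = [material]
}
```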
0 replies · 0 boosts · 461 views · Oct ’23

Does ModelIO export materials to USD?
I've got the following code to generate an MDLMaterial from my own material data model:

```swift
public extension MaterialModel {
    var mdlMaterial: MDLMaterial {
        let f = MDLPhysicallyPlausibleScatteringFunction()
        f.metallic.floatValue = metallic
        f.baseColor.color = CGColor(red: CGFloat(color.x),
                                    green: CGFloat(color.y),
                                    blue: CGFloat(color.z),
                                    alpha: 1.0)
        f.roughness.floatValue = roughness
        return MDLMaterial(name: name, scatteringFunction: f)
    }
}
```

When exporting to OBJ, I get the expected material properties:

```
# Apple ModelI/O MTL File: testExport.mtl

newmtl material_1
    Kd 0.163277 0.0344635 0.229603
    Ka 0 0 0
    Ks 0
    ao 0
    subsurface 0
    metallic 0
    specularTint 0
    roughness 0
    anisotropicRotation 0
    sheen 0.05
    sheenTint 0
    clearCoat 0
    clearCoatGloss 0

newmtl material_2
    Kd 0.814449 0.227477 0.124541
    Ka 0 0 0
    Ks 0
    ao 0
    subsurface 0
    metallic 0
    specularTint 0
    roughness 1
    anisotropicRotation 0
    sheen 0.05
    sheenTint 0
    clearCoat 0
    clearCoatGloss 0
```

However, when exporting USD I just get:

```
#usda 1.0
(
    defaultPrim = "_0"
    endTimeCode = 0
    startTimeCode = 0
    timeCodesPerSecond = 60
    upAxis = "Y"
)

def Xform "Obj0"
{
    def Mesh "_"
    {
        uniform bool doubleSided = 0
        float3[] extent = [(896, 896, 896), (1152, 1152, 1148.3729)]
        int[] faceVertexCounts = ...
        int[] faceVertexIndices = ...
        point3f[] points = ...
    }

    def Mesh "_0"
    {
        uniform bool doubleSided = 0
        float3[] extent = [(898.3113, 896.921, 1014.4961), (1082.166, 1146.7178, 1152)]
        int[] faceVertexCounts = ...
        int[] faceVertexIndices = ...
        point3f[] points = ...
        matrix4d xformOp:transform = ( (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1) )
        uniform token[] xformOpOrder = ["xformOp:transform"]
    }
}
```

There aren't any material properties. FWIW, this page specifies a set of common material parameters for USD: https://openusd.org/release/spec_usdpreviewsurface.html

(Note: there is no tag for Model I/O, so using SceneKit, etc.)
0 replies · 0 boosts · 564 views · Sep ’23

ObjectCaptureView/Session blocks ARSession sceneUnderstanding
I used ObjectCaptureView with an ObjectCaptureSession in different setups, for example nested in a UIViewController, so that I was able to deallocate the view and the session after switching to another view. But if I then use an ARSession with ARWorldTracking and scene understanding afterwards, the app won't show the overlaid mesh anymore. Using scene understanding without opening the ObjectCaptureView beforehand works fine. Has anyone faced the same issue, or how could I report this to Apple? It seems like a problem with the ObjectCaptureView/Session itself. During the start of the ObjectCaptureSession there are also some logs in the metadata telling me: "Wasn't able to pop ARFrame and Cameraframe at the same time". It shows up 10 or 15 times on every start. So I nested it in an ARSCNView, but that didn't fix it.
0 replies · 0 boosts · 472 views · Sep ’23

AR 3D object tracking using LiDAR
Hi there, I have recently started development in SwiftUI. I wanted to ask whether it is possible to design an AR app that generates and tracks a 3D model (.scn) based on a real-world 3D object whose reference data is in .usdz format. For example, I want to generate and track the movement of an aeroplane in AR; I have the .scn file, but I want a real-world object, like a pen or pencil, as an anchor, and I want to use its 3D data in .usdz format. I know you can use AR objects and object tracking, but that uses the .arobject format and does not use LiDAR. The important thing is that I want LiDAR tracking, not a point cloud. Is it possible? Point me in the right direction. Thank you. I am using Xcode 15 and the iOS 17 beta.
1 reply · 1 boost · 574 views · Jun ’24