SceneKit


Create 3D games and add 3D content to apps using high-level scene descriptions with SceneKit.

SceneKit Documentation

Posts under SceneKit tag

85 Posts
Post not yet marked as solved
1 Reply
553 Views
Hello, I've got a question about the Xcode Scene Editor, that is, the SceneKit one, not SpriteKit. According to this documentation: https://developer.apple.com/documentation/scenekit/scnnode/2873004-entity the entity property of a node serialised via Xcode's scene editor can be set. While Xcode's SpriteKit scene editor has this option, I cannot find anything similar in the SceneKit editor. So my question is: do *.scn files produced from Xcode contain GameplayKit information such as a GKEntity graph, or only SCNNode data? Do I have to parse the scene and programmatically create GKEntities? If that is the case, there must be an error in the documentation. Thank you!
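If .scn files saved from the SceneKit editor do carry only SCNNode data, one workaround is to build the entity graph in code after loading. A minimal sketch, assuming each node of interest should get its own entity (the `wantsEntity` predicate is hypothetical):

```swift
import SceneKit
import GameplayKit

// Walk the loaded scene and attach a GKEntity to each qualifying node.
// GKSCNNodeComponent links the entity to the node; afterwards node.entity
// returns the entity, matching what the documentation describes.
func buildEntities(in scene: SCNScene,
                   where wantsEntity: (SCNNode) -> Bool = { _ in true }) -> [GKEntity] {
    var entities: [GKEntity] = []
    scene.rootNode.enumerateHierarchy { node, _ in
        guard wantsEntity(node) else { return }
        let entity = GKEntity()
        entity.addComponent(GKSCNNodeComponent(node: node))
        entities.append(entity)
    }
    return entities
}
```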
Posted by
Post not yet marked as solved
0 Replies
432 Views
I am writing to seek assistance with a challenge I am facing while working on a 3D model rendering project; I believe your expertise in this area could be immensely helpful in resolving it. The problem involves difficulties displaying textures on both parent and child nodes within the 3D model. Here are the key details: the model contains wall_grp objects (doors, windows, and walls). We are using RoomPlan data in an SCNView, so this code depends on the SceneKit and RoomPlan APIs. When we comment out the child-node code it works, but then we don't get windows and doors on the walls.

```swift
func updateWallObjects() {
    if arch_grp.count > 0 {
        if !arch_grp.isEmpty {
            for obj in arch_grp[0].childNodes {
                let color = UIColor(red: 255/255, green: 229/255, blue: 204/255, alpha: 1.0)
                let parentNode = obj.flattenedClone()
                for childObj in obj.childNodes {
                    let childNode = childObj.flattenedClone()
                    let childMaterial = SCNMaterial()
                    childNode.geometry?.materials = [childMaterial]
                    if let name = childObj.name {
                        if removeNumbers(from: name) != "Wall" {
                            childNode.geometry?.firstMaterial?.diffuse.contents = UIColor.white
                        } else {
                            childNode.geometry?.firstMaterial?.diffuse.contents = color
                        }
                    }
                    childObj.removeFromParentNode()
                    parentNode.addChildNode(childObj)
                }
                let material = SCNMaterial()
                parentNode.geometry?.materials = [material]
                parentNode.geometry?.firstMaterial?.diffuse.contents = color
                obj.removeFromParentNode()
                arch_grp[0].addChildNode(parentNode)
            }
        }
    }
}
```

Please advise.
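One thing stands out in the posted loop: `childNode` (the flattened clone) is the node that gets the new materials, yet the original `childObj` is what gets re-attached to `parentNode`, so the configured clone is discarded. A hedged sketch of the inner loop with that swapped (`wallColor` stands in for the color constant above):

```swift
for childObj in obj.childNodes {
    let childNode = childObj.flattenedClone()
    childNode.geometry?.materials = [SCNMaterial()]
    // Walls get the beige color; doors and windows stay white.
    let isWall = removeNumbers(from: childObj.name ?? "") == "Wall"
    childNode.geometry?.firstMaterial?.diffuse.contents = isWall ? wallColor : UIColor.white
    childObj.removeFromParentNode()
    parentNode.addChildNode(childNode) // add the configured clone, not childObj
}
```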
Posted by
Post marked as solved
1 Reply
421 Views
We suddenly found a problem in our app after the iOS 17 release. Everything works well in the SceneKit scene below iOS 16: we can load an avatar and rotate nodes with given quaternions. But on iOS 17 it falls apart as soon as we apply any rotation. Can anyone give me a hint as to what the problem might be? The attached image shows before and after rotation.
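A hedged guess, since the post doesn't show the rotation code: iOS 17 appears stricter than earlier releases about non-normalized quaternions, which previously got away with small drift. Normalizing before assignment is cheap to try:

```swift
import SceneKit
import simd

// Normalize the quaternion before applying it; a non-unit quaternion can
// skew or collapse the node's transform on recent SceneKit versions.
func apply(_ rotation: simd_quatf, to node: SCNNode) {
    node.simdOrientation = simd_normalize(rotation)
}
```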
Posted by
Post not yet marked as solved
0 Replies
409 Views
We are attempting to update the texture on a node. The code below works correctly when we use a color, but it encounters issues when we attempt to use an image. The image is available in the bundle, and it displays correctly in other parts of our application. This texture is being applied to both the floor and the walls. Please assist us with this issue.

```swift
for obj in Floor_grp[0].childNodes {
    let node = obj.flattenedClone()
    node.transform = obj.transform
    let imageMaterial = SCNMaterial()
    node.geometry?.materials = [imageMaterial]
    node.geometry?.firstMaterial?.diffuse.contents = UIColor.brown
    obj.removeFromParentNode()
    Floor_grp[0].addChildNode(node)
}
```
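For comparison, a minimal sketch of the image path, assuming a hypothetical "floorTexture" asset and `node` as the clone above: when a UIColor works but an image doesn't, the usual suspects are a nil UIImage (wrong name or target membership) or missing texture coordinates on the flattened geometry.

```swift
if let image = UIImage(named: "floorTexture") {   // hypothetical asset name
    let imageMaterial = SCNMaterial()
    imageMaterial.diffuse.contents = image
    // Tile the texture instead of stretching it across the whole floor.
    imageMaterial.diffuse.wrapS = .repeat
    imageMaterial.diffuse.wrapT = .repeat
    node.geometry?.materials = [imageMaterial]
} else {
    print("floorTexture not found in bundle") // points at the first suspect
}
```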
Posted by
Post not yet marked as solved
0 Replies
479 Views
I've got the following code to generate an MDLMaterial from my own material data model:

```swift
public extension MaterialModel {
    var mdlMaterial: MDLMaterial {
        let f = MDLPhysicallyPlausibleScatteringFunction()
        f.metallic.floatValue = metallic
        f.baseColor.color = CGColor(red: CGFloat(color.x), green: CGFloat(color.y), blue: CGFloat(color.z), alpha: 1.0)
        f.roughness.floatValue = roughness
        return MDLMaterial(name: name, scatteringFunction: f)
    }
}
```

When exporting to OBJ, I get the expected material properties:

```
# Apple ModelI/O MTL File: testExport.mtl

newmtl material_1
Kd 0.163277 0.0344635 0.229603
Ka 0 0 0
Ks 0
ao 0
subsurface 0
metallic 0
specularTint 0
roughness 0
anisotropicRotation 0
sheen 0.05
sheenTint 0
clearCoat 0
clearCoatGloss 0

newmtl material_2
Kd 0.814449 0.227477 0.124541
Ka 0 0 0
Ks 0
ao 0
subsurface 0
metallic 0
specularTint 0
roughness 1
anisotropicRotation 0
sheen 0.05
sheenTint 0
clearCoat 0
clearCoatGloss 0
```

However when exporting USD I just get:

```
#usda 1.0
(
    defaultPrim = "_0"
    endTimeCode = 0
    startTimeCode = 0
    timeCodesPerSecond = 60
    upAxis = "Y"
)

def Xform "Obj0"
{
    def Mesh "_"
    {
        uniform bool doubleSided = 0
        float3[] extent = [(896, 896, 896), (1152, 1152, 1148.3729)]
        int[] faceVertexCounts = ...
        int[] faceVertexIndices = ...
        point3f[] points = ...
    }

    def Mesh "_0"
    {
        uniform bool doubleSided = 0
        float3[] extent = [(898.3113, 896.921, 1014.4961), (1082.166, 1146.7178, 1152)]
        int[] faceVertexCounts = ...
        int[] faceVertexIndices = ...
        point3f[] points = ...
        matrix4d xformOp:transform = ( (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1) )
        uniform token[] xformOpOrder = ["xformOp:transform"]
    }
}
```

There aren't any material properties. FWIW, this specifies a set of common material parameters for USD: https://openusd.org/release/spec_usdpreviewsurface.html (Note: there is no tag for Model I/O, so I'm using SceneKit, etc.)
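A sketch of the export path for reference, where `asset` (the MDLAsset being exported) and `model` (a MaterialModel) are assumptions; if the USD exporter drops materials, double-checking that each submesh actually carries the MDLMaterial rules out one variable:

```swift
import ModelIO

// Attach the material to every submesh before exporting; Model I/O writes
// materials per submesh, and an unset submesh.material exports as nothing.
for case let mesh as MDLMesh in asset.childObjects(ofType: MDLMesh.self) {
    if let submeshes = mesh.submeshes {
        for case let submesh as MDLSubmesh in submeshes {
            submesh.material = model.mdlMaterial
        }
    }
}

do {
    try asset.export(to: URL(fileURLWithPath: "testExport.usd"))
} catch {
    print("USD export failed: \(error)")
}
```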
Posted by
Post not yet marked as solved
0 Replies
414 Views
I used an ObjectCaptureView with an ObjectCaptureSession in different setups, for example nested in a UIViewController, so that I was able to deallocate the view and the session after switching to another view. If I then use an ARSession with world tracking and scene understanding afterwards, the app won't show the overlaid mesh anymore. Using scene understanding without having opened the ObjectCaptureView beforehand works fine. Has anyone faced the same issue, and how could I report this to Apple? It seems like a problem with the ObjectCaptureView/Session itself. During the start of the ObjectCaptureSession there are also some logs in the metadata telling me: "Wasn't able to pop ARFrame and Cameraframe at the same time"; it is shown 10 or 15 times on every start. I also nested it in an ARSCNView, but that didn't fix it.
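For anyone comparing, this is roughly the scene-understanding setup that works before (but reportedly not after) an ObjectCaptureSession has run; a sketch assuming a RealityKit ARView:

```swift
import ARKit
import RealityKit

// Standard world tracking + scene reconstruction with the mesh overlay on.
func startSceneUnderstanding(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.debugOptions.insert(.showSceneUnderstanding)
    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```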
Posted by
Post not yet marked as solved
0 Replies
471 Views
Hi there, I have recently started development in SwiftUI. I wanted to ask whether it is possible to design an AR app which generates and tracks a 3D model (.scn) based on a real-world object whose 3D data is in .usdz format. For example, I want to generate and track the movement of an aeroplane in AR, and I have the .scn file, but I want a real-world object as an anchor, like a pen or pencil, and I want to use its 3D data in .usdz format. I know you can use AR objects and object tracking, but that uses the .arobject format and does not use LiDAR. The important thing is that I want LiDAR tracking, not point-cloud tracking. Is it possible? Please point me in the right direction. Thank you. I am using Xcode 15 and the iOS 17 beta.
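For context, a sketch of what ARKit's built-in object tracking expects today: detection is driven by scanned ARReferenceObjects (.arobject), not .usdz geometry, so a .usdz anchor would require rescanning the object ("Gallery" is a hypothetical resource group name, and `sceneView` an ARSCNView from elsewhere in the app):

```swift
import ARKit

// Built-in object detection: load scanned reference objects and run world
// tracking with them; detected objects arrive as ARObjectAnchors.
let config = ARWorldTrackingConfiguration()
if let refs = ARReferenceObject.referenceObjects(inGroupNamed: "Gallery", bundle: nil) {
    config.detectionObjects = refs
}
sceneView.session.run(config)
```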
Posted by
Post not yet marked as solved
0 Replies
657 Views
Hi guys, I need to find a way to extract height information from MapKit and rebuild a selected map area in 3D using SceneKit or RealityKit. Constructing the mesh is not a problem, but I can't seem to find a way to extract bitmap and height information from MapKit. I can do it with Mapbox, but I really wanted to avoid using it. Any ideas?
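A partial sketch for the bitmap half, assuming a `region` of interest: MKMapSnapshotter renders map imagery offscreen. As far as I know, MapKit has no public API for raw elevation data, so the height field would need another source.

```swift
import MapKit

let options = MKMapSnapshotter.Options()
options.region = region          // the selected map area
options.size = CGSize(width: 1024, height: 1024)

MKMapSnapshotter(options: options).start { snapshot, error in
    guard let image = snapshot?.image else { return }
    // Use `image` as the diffuse texture on the reconstructed terrain mesh.
}
```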
Posted by
Post not yet marked as solved
0 Replies
405 Views
Is anyone else having issues with .dae and/or .scn files in the SceneKit game template? The new beta release seems very incompatible with files that worked perfectly with previous Xcode releases up to Xcode 14. I'm working on upgrading a simple stripped-down version of my chess game and ran into strange, bogus error messages and crashes:

```
/Users/helmut/Desktop/schachGame8423/schach2023/scntool:1:1 failed to convert file with failure reason: *** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[0]
```

All tools work fine as in prior Xcode releases (Reality Converter, exporting the .dae file to other graphic formats), but something is missing in the current beta release of Xcode 15.
Posted by
Post marked as solved
1 Reply
928 Views
I've been working on an app that combines CoreML and ARKit/SceneKit to detect and measure some objects, with success. Now I need to make it available to a React Native app, and I'm trying the approach here: https://github.com/riteshakya037/react-native-native-module where I can navigate to and instantiate the view controller. The problem occurs when my view gets called: I get errors because the sceneView outlet is not loaded. Is there a way to use it without the storyboard? For now, it seems to be an incompatibility.
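One way around the storyboard dependency, sketched here with a hypothetical controller name: create the ARSCNView in code so nothing relies on a nib being loaded when React Native instantiates the controller.

```swift
import ARKit
import UIKit

class MeasureViewController: UIViewController {
    private var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Build the view hierarchy in code; no storyboard or IBOutlet needed.
        sceneView = ARSCNView(frame: view.bounds)
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```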
Posted by
Post not yet marked as solved
0 Replies
539 Views
Hello everyone 👋 Occasional Apple developer yet first-time poster Flo here. I've had this idea floating around my head for a while now, to develop a little toy that would make use of Apple's XDR displays, i.e. the one in my MBP. So essentially, I'm trying to do real-time 3D graphics utilising the HDR colour space, but I don't have the motivation to learn the bare metal Metal graphics API. SceneKit, I figured, would allow me to explore the EDR rendering pipeline, since to my knowledge they all (SpriteKit, RealityKit etc.) use Metal under the hood anyway. As per the WWDC '21 "Explore HDR rendering with EDR" presentation, all I had to do was set a few properties on my view's underlying CAMetalLayer to enable EDR rendering for my macOS app. However, the SceneKit template in Xcode seems to instantiate my view with a CALayer by default, and when I try to replace it with a CAMetalLayer nothing gets rendered to the screen / window. Am I oversimplifying things? All I want to do is display a bunch of colours that are brighter than reference white :< If this is possible at all, I would appreciate any pointers. Thanks for reading 🙏
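A hedged sketch of one thing to try on macOS: give the SCNView a CAMetalLayer configured per the WWDC '21 EDR recipe via makeBackingLayer, rather than swapping the layer after the fact. Whether SceneKit's renderer then honors these settings is exactly the open question here.

```swift
import SceneKit
import QuartzCore

class EDRSCNView: SCNView {
    override func makeBackingLayer() -> CALayer {
        let layer = CAMetalLayer()
        // The three EDR knobs from "Explore HDR rendering with EDR" (WWDC21).
        layer.wantsExtendedDynamicRangeContent = true
        layer.pixelFormat = .rgba16Float
        layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
        return layer
    }
}
```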
Posted by
Post not yet marked as solved
0 Replies
593 Views
Hi all. I am new to Swift and AR. I'm trying out an AR project and ran into a problem: I can't change the material on my models. With geometry such as a sphere or a cube, everything is simple. Can you tell me what I am doing wrong? My simple code:

```swift
@IBOutlet var sceneView: ARSCNView!
var modelNode: SCNNode!

override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true
    let scene = SCNScene(named: "art.scnassets/jacket.usdz")!
    modelNode = scene.rootNode.childNode(withName: "jacket", recursively: true)
    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "art.scnassets/58.png")
    modelNode.childNodes[0].geometry?.materials = [material]
    sceneView.scene = scene
}
```
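A minimal sketch of one likely fix: USDZ files often nest the mesh deeper than childNodes[0], so enumerating the hierarchy and re-materializing every node that actually carries geometry is more robust.

```swift
let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "art.scnassets/58.png")

// Apply to every node that has geometry, wherever it sits in the tree.
modelNode.enumerateHierarchy { node, _ in
    if node.geometry != nil {
        node.geometry?.materials = [material]
    }
}
```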
Posted by
Post not yet marked as solved
0 Replies
382 Views
How do I trace a map on the floor as the user walks through their house, like a trail or heatmap, and then save this trail to Core Data? Would it be possible to load and view this map later in the same spot, or to rescan the trail in the same area?
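Not a full answer, but a rough sketch of the tracing half under ARKit, assuming an ARSession is already running: sample the camera position each frame and keep the floor-plane (x/z) coordinates as the trail. The points could then be stored in Core Data, and relocalizing later would additionally need a saved ARWorldMap.

```swift
import ARKit

// Accumulates the user's path as 2D floor-plane points, one per frame.
class TrailRecorder: NSObject, ARSessionDelegate {
    private(set) var trail: [SIMD2<Float>] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let position = frame.camera.transform.columns.3
        trail.append(SIMD2(position.x, position.z)) // y (height) is dropped
    }
}
```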
Posted by
Post not yet marked as solved
0 Replies
515 Views
I am developing a game where users can use their finger to move objects; when the user releases their finger, the game checks for overlap between objects and moves the moved object back to its original position if there is overlap. I am using SCNScene.PhysicsWorld.contactTest(with:) to check for overlap between my nodes. However, the method only works correctly when nodes have physics bodies using .convexHull; when I change it to .concavePolyhedron everything stops working and no contact is reported. I have set the physics bodies to be static, so I am at a loss as to what to do. Here is my code configuring the physics body for each node:

```swift
parentNode.physicsBody = SCNPhysicsBody(
    type: .static,
    shape: SCNPhysicsShape(node: parentNode, options: [
        .type: SCNPhysicsShape.ShapeType.concavePolyhedron,
        .collisionMargin: 0.0,
        .scale: scaleVector
    ])
)
```

Here is my code calling the contact test:

```swift
if let test = currentNode?.physicsBody {
    let list = view.scene!.physicsWorld.contactTest(with: test)
    // ...
}
```
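A hedged workaround sketch, since concave-vs-concave pairs are a known weak spot for contact testing: approximate each node with a compound shape built from per-child convex hulls, which keeps a concave overall silhouette while staying in convex territory for the solver.

```swift
let shapes = parentNode.childNodes.map {
    SCNPhysicsShape(node: $0, options: [.type: SCNPhysicsShape.ShapeType.convexHull])
}
let transforms = parentNode.childNodes.map { NSValue(scnMatrix4: $0.transform) }

// A compound of convex pieces often behaves where one concave shape won't.
let compound = SCNPhysicsShape(shapes: shapes, transforms: transforms)
parentNode.physicsBody = SCNPhysicsBody(type: .static, shape: compound)
```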
Posted by
Post not yet marked as solved
0 Replies
703 Views
I am getting 4 warnings compiling the current SceneKit game template:

```
/Users/helmut/Documents/spaceTime/scntool:1:1 Could not find bundle inside /Library/Developer/CommandLineTools
```

Any idea why the bundle is not installed? I did a fresh download twice now and am still stuck getting these warnings about bundles not being installed or found. Is there any way to correct the download/install so the base dependencies are present? Thank you.
Posted by
Post not yet marked as solved
0 Replies
663 Views
A feature my app needs is to allow users to go through their room and mark specific positions, then navigate these positions with a floor plan, like a map. Think Google Maps, but for your room, showing the user's position. I know it is possible to make a floor plan with RoomPlan, which could act as a map, but would it be possible, after the plan is made, to track a user's location in the room and show it? Is this too complex for RoomPlan? And if so, how would I tackle this problem?
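RoomPlan itself doesn't expose live user positioning, but the underlying ARSession does. A partial sketch of the persistence half, hedged: store marked positions as ARAnchors and save an ARWorldMap so the same room can be relocalized later.

```swift
import ARKit

// Serialize the session's world map (including any placed ARAnchors) so it
// can be reloaded via ARWorldTrackingConfiguration.initialWorldMap later.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { map, error in
        guard let map = map,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}
```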
Posted by
Post not yet marked as solved
1 Reply
556 Views
Understanding SCNCameraController

TL;DR: I'm able to create my own subclassed camera controller, but it only works for rotation, not translation. I made a demo repo here.

Background
I want to use SceneKit's camera controller to drive my scene's camera. The reason I want to subclass it is that my camera is on a rig where I apply rotation to the rig and translation to the camera. I do that because I animate the camera, and applying both translation and rotation to the camera node doesn't create the animation I want.

Setting up
1. Instantiate my own SCNCameraController
2. Set its pointOfView to my scene's pointOfView (or its parent node, I guess)

Using the camera controller
We now want the new camera controller to drive the scene.
1. When interactions begin (e.g. mouseDown), call beginInteraction(_ location: CGPoint, withViewport viewport: CGSize)
2. When interactions update and end, call the corresponding functions on the camera controller

Actual behavior
It works when I begin/update/end interactions from mouse-down events. It ignores any other event types, like magnification and scroll wheel, which work in e.g. the SceneKit editor in Xcode. See MySCNView.swift in the repo for a demo. By overriding the camera controller's rotate function, I can see that it is called with deltas. This is great. But when I override translateInCameraSpaceBy, my print statements don't appear and the scene doesn't translate.

Expected behavior
I expected SCNCameraController to also apply translations and rolls to the pointOfView by inspecting the currentEvent and figuring out what to do. I'm inclined to think that I'm supposed to call translateInCameraSpaceBy myself, but that seems inconsistent with how begin/continue/end interaction seems to call rotate.

Demo repo: https://github.com/mortenjust/Camera-Control-Demo
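A sketch consistent with the behavior described above, assuming the translation path really must be driven manually: forward scroll and magnification events to the camera controller yourself (`myCameraController` stands in for the subclassed controller from the post, and the class name is hypothetical).

```swift
import SceneKit
import AppKit

class EventForwardingSCNView: SCNView {
    var myCameraController: SCNCameraController!

    override func scrollWheel(with event: NSEvent) {
        // Two-finger scroll pans the camera in its own plane.
        myCameraController.translateInCameraSpaceBy(x: Float(event.scrollingDeltaX) * 0.01,
                                                    y: Float(-event.scrollingDeltaY) * 0.01,
                                                    z: 0)
    }

    override func magnify(with event: NSEvent) {
        // Pinch dollies toward the point under the cursor.
        myCameraController.dolly(by: Float(event.magnification),
                                 onScreenPoint: convert(event.locationInWindow, from: nil),
                                 viewport: bounds.size)
    }
}
```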
Posted by
Post not yet marked as solved
0 Replies
487 Views
I'm trying to create an app similar to Polycam using LiDAR. I'm using SceneKit mesh reconstruction and am able to apply some random textures, but I need real-world textures in the generated 3D model output. The few examples I found relate to MetalKit and point clouds, which were not helpful. Can you help me out with any references/steps/tutorials on how to achieve this?
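Not a full recipe, but a hedged sketch of the core of most approaches: project each reconstructed-mesh vertex into the current camera frame to derive a texture coordinate, then use the captured image as the material's diffuse texture. Production implementations also handle occlusion and blend several frames.

```swift
import ARKit
import simd

// Maps a world-space vertex to a [0, 1] UV coordinate in the camera image.
func textureCoordinate(for worldPoint: simd_float4, in frame: ARFrame) -> simd_float2 {
    let view = frame.camera.viewMatrix(for: .landscapeRight)
    let clip = frame.camera.projectionMatrix * view * worldPoint
    let ndc = simd_float2(clip.x, clip.y) / clip.w
    return simd_float2(ndc.x * 0.5 + 0.5, 0.5 - ndc.y * 0.5) // y flipped for images
}
```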
Posted by