SceneKit


Create 3D games and add 3D content to apps using SceneKit's high-level scene descriptions.

SceneKit Documentation

Posts under SceneKit tag

134 Posts
Post not yet marked as solved
1 Reply
266 Views
I tried running the Tracking and Visualizing Faces demo, and unfortunately it crashes at let url = Bundle.main.url(forResource: resourceName, withExtension: "scn", subdirectory: "Models.scnassets")! The "Models.scnassets" folder is present in the project.
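A minimal defensive-loading sketch (the helper name and the fallback logging are illustrative, not from the original post) that replaces the force unwrap so a missing or mis-bundled resource fails visibly instead of crashing:

import SceneKit

func loadFaceModel(named resourceName: String) -> SCNNode? {
    guard let url = Bundle.main.url(forResource: resourceName,
                                    withExtension: "scn",
                                    subdirectory: "Models.scnassets") else {
        // If this fires, the file is missing from the built product: check that
        // Models.scnassets has target membership and the name matches exactly.
        print("Missing \(resourceName).scn in Models.scnassets")
        return nil
    }
    guard let scene = try? SCNScene(url: url, options: nil) else { return nil }
    return scene.rootNode.childNodes.first
}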
Posted by Deenan. Last updated.
Post not yet marked as solved
3 Replies
221 Views
I am trying to align my previously built model with the real world, like making a white ball be the earth while using an AR app on an iOS device. How can I do that? Any help would be good!
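A hedged sketch of one common approach, assuming an ARSCNView-based app: raycast from a screen point onto a detected surface and move the prebuilt node to the hit transform, so the virtual ball sits on the real-world spot.

import ARKit
import SceneKit

func placeModel(_ node: SCNNode, at screenPoint: CGPoint, in sceneView: ARSCNView) {
    // Build a raycast query from the tapped screen point against estimated planes.
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }
    // Align the node's world transform with the real-world hit location.
    node.simdWorldTransform = result.worldTransform
    sceneView.scene.rootNode.addChildNode(node)
}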
Posted by Left53415. Last updated.
Post not yet marked as solved
0 Replies
71 Views
I am very new to SceneKit and have been following online tutorials. When I place objects using the scene editor and then run the app, all I can see is a blank screen. Am I missing something here?
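A minimal setup sketch (the scene file name is hypothetical): a blank screen usually means the SCNView was never given the edited scene, or there is no camera or light to reveal the objects.

import UIKit
import SceneKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let scnView = self.view as! SCNView
        scnView.scene = SCNScene(named: "MainScene.scn") // the scene built in the editor
        scnView.allowsCameraControl = true               // orbit to find off-camera objects
        scnView.autoenablesDefaultLighting = true        // visible even with no lights placed
        scnView.backgroundColor = .black
    }
}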
Posted. Last updated.
Post marked as solved
3 Replies
204 Views
In the viewer, the model looks great. I then bring it into the scene and it still looks great. But I want to change the material programmatically, and doing so makes it look ugly. It's a physically based lighting model with a metallic material, yet it renders like a relatively flat Phong. The detail of the other maps is there; the color is just flat and off. To eliminate all other variables, I started applying my programmatic material right after creating the node. The material is intended to be exactly the same as the one I set in the viewer, yet I get the same ugly result when I set the node to the programmatically created version. When I change only the diffuse contents, it looks great. Hopefully this isn't a stupid question, as I can't find any similar complaints, but what parameter is being set in the viewer that I am missing on the SCNMaterial? The following is my attempt to copy all the settings from the viewer and repeat them in code. Nothing has really changed anything compared to just setting the usual content values:

let material = SCNMaterial()
material.lightingModel = .physicallyBased
let matScale: Float = 1.0 // /25.4
material.isDoubleSided = false

material.diffuse.contents = UIImage(named: "basecolor.png")
//material.diffuse.contentsTransform = SCNMatrix4MakeScale(matScale, matScale, matScale)
material.diffuse.wrapS = .repeat
material.diffuse.wrapT = .repeat
material.diffuse.mipFilter = .nearest
material.diffuse.magnificationFilter = .linear
material.diffuse.minificationFilter = .linear
material.diffuse.mappingChannel = 0
material.diffuse.maxAnisotropy = 1.0

material.metalness.contents = UIImage(named: "scuffed_metalic.png")
material.metalness.wrapS = .repeat
material.metalness.wrapT = .repeat
material.metalness.mipFilter = .nearest
material.metalness.magnificationFilter = .linear
material.metalness.minificationFilter = .linear
material.metalness.mappingChannel = 0
material.metalness.maxAnisotropy = 1.0

material.roughness.contents = UIImage(named: "scuffed_roughness.png")
material.roughness.wrapS = .repeat
material.roughness.wrapT = .repeat
material.roughness.mipFilter = .nearest
material.roughness.magnificationFilter = .linear
material.roughness.minificationFilter = .linear
material.roughness.mappingChannel = 0
material.roughness.maxAnisotropy = 1.0

material.normal.contents = UIImage(named: "scuffed_normal.png")
material.normal.wrapS = .repeat
material.normal.wrapT = .repeat
material.normal.mipFilter = .nearest
material.normal.magnificationFilter = .linear
material.normal.minificationFilter = .linear
material.normal.mappingChannel = 0
material.normal.maxAnisotropy = 1.0

material.metalness.intensity = 1.0
material.roughness.intensity = 1.0 //0.0 //0.3
material.normal.intensity = 1.0
material.diffuse.intensity = 1.0
material.multiply.contents = UIColor.white
material.multiply.intensity = 1.0
material.transparent.contents = UIColor.white
material.transparent.intensity = 1.0
material.clearCoatNormal.contents = UIColor.white
material.clearCoatNormal.intensity = 1.0
material.locksAmbientWithDiffuse = true
material.emission.contents = UIColor.black
material.emission.intensity = 1.0
material.selfIllumination.contents = UIColor.black
material.selfIllumination.intensity = 1.0
material.clearCoatRoughness.contents = UIColor.black
material.clearCoatRoughness.intensity = 1.0
material.displacement.contents = UIColor.black
material.displacement.intensity = 1.0
material.transparencyMode = .default
material.shininess = 1.0
material.fresnelExponent = 0.0
material.cullMode = .back
material.blendMode = .alpha
material.writesToDepthBuffer = true
material.readsFromDepthBuffer = true
material.isLitPerPixel = true
Posted by Bill3D. Last updated.
Post not yet marked as solved
2 Replies
141 Views
I want to add a gobo effect to a spotlight in RealityKit. I looked at all the information I could find but could not find support for it, while I saw that it is supported in SceneKit. However, I cannot use SceneKit in combination with the RealityKit application I was already writing, and it would be a lot of effort to go back to SceneKit. Is there a solution to add gobo effects to spotlights in RealityKit, or to use a combination of RealityKit and SceneKit in an AR application?
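For reference, the SceneKit side the post alludes to: a spot light exposes its gobo as a material property whose contents you set (the image name here is hypothetical).

import UIKit
import SceneKit

let spot = SCNLight()
spot.type = .spot
spot.gobo?.contents = UIImage(named: "gobo_pattern.png") // image projected by the light
spot.gobo?.intensity = 1.0

let lightNode = SCNNode()
lightNode.light = spot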
Posted. Last updated.
Post not yet marked as solved
1 Reply
177 Views
I am using the RoomPlan API in my application. It works fine for small apartments, but when I try to scan a bigger apartment that takes more than 15 minutes, the API automatically stops and finishes the scan even though part of the apartment is still pending. I need the original structure, like a point cloud, in 3D; the USDZ model has a white mesh structure. Is there any way to get a real 3D view from the RoomPlan API or the USDZ model? How can I change the colour of the scanning lines? I am getting an RSFloorPlan class value when scanning finishes. How can I get this floor plan? I mean, I need its 2D structure matching what we are scanning. Thanks! Ramneet Singh (iOS developer)
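A small hedged sketch for the export half of the question, assuming a CapturedRoom obtained from the capture delegate (finalRoom is an assumed variable): USDZ export is one call, and the result is the parametric white mesh; RoomPlan does not hand back the raw scanned point cloud.

import Foundation
import RoomPlan

func saveRoom(_ finalRoom: CapturedRoom) throws -> URL {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("Room.usdz")
    try finalRoom.export(to: url) // parametric walls/objects, not scanned geometry
    return url
}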
Posted by Ramneet. Last updated.
Post not yet marked as solved
1 Reply
444 Views
Hi, I want to extract the real textures of the objects detected by the RoomPlan API, so that I can apply them later to the model generated by RoomPlan. I want to create an illusion of the real room so that I have control over the objects. Does anyone have any idea how I can do that?
Posted by lastsong. Last updated.
Post not yet marked as solved
3 Replies
625 Views
SceneKit has started filling my console with this log message: "Pass FloorPass is not linked to the rendering graph and will be ignored check it's input/output". It feels like I'm the only one on the planet using SceneKit, but if anyone can guess at what is happening, or the reason for this, I'm thankful.
Posted by Olof_t. Last updated.
Post not yet marked as solved
0 Replies
214 Views
I am rewriting my watch app in SwiftUI in order to make the app work with the always-on display. I have got it working well for text and images, which update even when the watch is lowered and the display is faded. However, I also use a SceneView, and this does not update whilst the watch is lowered. Is it possible to update the SceneView when the watch is lowered? And if so, how would I do this? Is there somewhere I can access the date/time from within the SceneView, which might cause it to update? Thanks.
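A hedged sketch of the usual SwiftUI pattern for always-on updates (the scene name is hypothetical): drive the view from a TimelineView so each tick forces a refresh; whether SceneKit actually re-renders while the display is dimmed is something to verify on-device.

import SwiftUI
import SceneKit

struct WatchSceneView: View {
    let scene = SCNScene(named: "Watch.scn")

    var body: some View {
        TimelineView(.periodic(from: .now, by: 1.0)) { context in
            SceneView(scene: scene, options: [])
                .id(context.date) // new identity each tick, prompting a redraw
        }
    }
}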
Posted by cfc. Last updated.
Post marked as solved
13 Replies
12k Views
Hello guys, in WWDC 2015 - Session 606 - Enhancements to SceneKit there was a Fox example project which is available to download. Can anyone say what the correct COLLADA file settings are when exporting a model with skeletal animation from a 3D tool into a .dae file? For example, in the demo project there is a "walk.scn". I used a 3D tool (Cheetah3D) to make a simple model with a simple skeletal animation and exported it as .dae, but I can't get my model to animate in the scene. If I use a non-skeletal animation in the 3D tool and export it as .dae, then the model animation in the scene works perfectly. Maybe someone has succeeded with exporting a model with skeletal animation using another 3D tool like Blender?
Posted by Nils. Last updated.
Post not yet marked as solved
0 Replies
254 Views
Is there a reason why animating the frame or bounds of an ARSCNView doesn't animate the camera layer? I can see the frame size animate, but the camera feed snaps into position without any animation before the completion handler. Am I doing something wrong, or is this the expected outcome?
Posted by McDuffman. Last updated.
Post not yet marked as solved
1 Reply
294 Views
Hello, I would like to move a character in Swift with SceneKit. I use a gamepad to move the character. It works well at first, but when the camera changes orientation the character no longer goes in the right direction at all...
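A common fix, sketched under the assumption that the stick input arrives as x/y values in -1...1: express the input in the camera's frame, flattened onto the ground plane, so stick-up always means away from the camera.

import SceneKit
import simd

func moveDirection(stickX: Float, stickY: Float, cameraNode: SCNNode) -> simd_float3 {
    // Camera forward and right, projected onto the XZ ground plane
    // (assumes the camera is not looking straight down).
    var forward = cameraNode.simdWorldFront
    forward.y = 0
    var right = cameraNode.simdWorldRight
    right.y = 0
    // Stick up (+y) moves away from the camera; stick right (+x) strafes right.
    let direction = simd_normalize(forward) * stickY + simd_normalize(right) * stickX
    return simd_length(direction) > 0.001 ? simd_normalize(direction) : .zero
}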
Posted. Last updated.
Post not yet marked as solved
0 Replies
286 Views
Hello, I'm developing an iOS app with ARKit and SceneKit where a point cloud is shown in AR. For this, I create an SCNGeometryElement with SCNGeometryPrimitiveType.point as the type and an SCNGeometrySource from the vertices. This way the resulting points are rendered in AR as small spheres, which leaves holes between the elements even for a dense point cloud. Is it possible to change the shape of those points to e.g. cubes, so that a dense cloud would close the holes? I've already looked into SCNShaderModifiers and SCNPrograms, but I could not find a simple solution to change the appearance of those points/spheres without having to create a huge number of nodes or geometries, which would lead to performance issues. I hope you can help me with this problem. Thanks!
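A hedged alternative sketch: instead of swapping the primitive for cubes, a .point geometry element can render each vertex as a screen-space splat whose radius is clamped, which can close the gaps in a dense cloud without extra nodes.

import SceneKit

func pointCloudGeometry(vertices: [SCNVector3]) -> SCNGeometry {
    let source = SCNGeometrySource(vertices: vertices)
    let element = SCNGeometryElement(indices: Array(0..<Int32(vertices.count)),
                                     primitiveType: .point)
    element.pointSize = 4.0                     // nominal point size
    element.minimumPointScreenSpaceRadius = 2.0 // never shrinks below 2 px on screen
    element.maximumPointScreenSpaceRadius = 8.0 // never grows beyond 8 px on screen
    return SCNGeometry(sources: [source], elements: [element])
}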
Posted by sunburst. Last updated.
Post not yet marked as solved
1 Reply
801 Views
Hi all, I am currently making a fitness app and want to replicate the way Apple incorporates medals in their Fitness app. They have a page called Awards, and it looks like a collection view of different medals that the user can unlock. After unlocking a medal, the user can tap on it and it opens into a full-screen interactive scene where the user can rotate the medal, etc. I have had a play around with SceneKit and managed to load a medal into a scene and display it in the app. However, my functionality does not look as smooth or as polished as Apple's. Does anyone have any idea how they have managed to present all of the different SceneKit scenes in a collection view, for example? Or how they have achieved the smooth transition between tapping on the medal in the collection view and the scene? I guess this is created using SwiftUI. Can this be replicated in Storyboards? Thanks!
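One plausible approach, sketched as an assumption rather than as Apple's actual implementation: give each cell a lightweight SCNView, then hand the same SCNScene to a full-screen view on tap, so the transition only animates the frame while the scene itself stays put.

import UIKit
import SceneKit

final class MedalCell: UICollectionViewCell {
    let medalView = SCNView()

    override init(frame: CGRect) {
        super.init(frame: frame)
        medalView.frame = contentView.bounds
        medalView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        medalView.backgroundColor = .clear
        contentView.addSubview(medalView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func configure(with scene: SCNScene) {
        medalView.scene = scene // reuse this scene in the full-screen view on tap
    }
}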
Posted. Last updated.
Post not yet marked as solved
1 Reply
315 Views
Problem: no lights get imported from USD into my scene (geometry, cameras, and materials have worked flawlessly so far). I am using Model I/O to load a USD scene (testing both usd and usdz) created in Blender, Cinema 4D, Houdini, or Maya (testing all of them to see if at least one will output something working). The scenes were cross-checked in each tool and the usda reviewed manually; at least the lights in the scenes from Blender and Maya seem to be exported correctly. I am using both Model I/O and SceneKit directly to load the scene (see the two cases in my example code). The SceneKit documentation claims that lights are supported (see Validating feature support for USD files; Using DistantLights and SphereLights). Example of my loading code:

let testScenePath = mainBundle.path(forResource: "Maya", ofType: "usdz")
let url = URL(fileURLWithPath: testScenePath!)

// Case 1: loading with SceneKit directly (simplified, omitting try/catch)
let scene = try SCNScene(url: url, options: nil)

// Case 2: using Model I/O to load the USD
let mdl = MDLAsset(url: url)
let scene = SCNScene(mdlAsset: mdl)
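A small diagnostic sketch (an assumption about where to look, not a fix), reusing the url from the code above: asking the MDLAsset directly for light objects narrows down whether Model I/O fails to parse the lights or SceneKit drops them during conversion.

import ModelIO

let asset = MDLAsset(url: url)
let lights = asset.childObjects(of: MDLLight.self)
print("Model I/O parsed \(lights.count) light(s) from the USD file")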
Posted. Last updated.
Post not yet marked as solved
1 Reply
497 Views
Is there an example of how I could relight the real world using RealityKit (or SceneKit or ARKit if possible)? What I mean is that I would like to use lights in the virtual world to shine on objects of the real world (or on its depth map). Thanks
Posted. Last updated.
Post not yet marked as solved
1 Reply
406 Views
Dear Apple Team and everyone who has experience with MapKit. I am building an app where I need to hide some of the map's 3D models and replace them with my own custom 3D meshes using SceneKit. Up until now I was using Mapbox, which allows getting the raw mesh data to reconstruct all of the map's 3D content. Is something like this possible with MapKit? Use cases: Say you navigated to Kennedy Space Center Launch Complex 39 and there is no 3D model of the actual building; I would like to be able to hide the simple massing and replace it with my model. In the 3D satellite view some areas have detailed meshes, say London's The Queen's Walk; I would like to make a specific area flat so I can place my 3D model on top of the satellite 3D view to illustrate a new structure or building. Last one: is it possible to change the colours of existing buildings? I know transparency is possible. Thank you @apple
Posted by artpen. Last updated.
Post not yet marked as solved
0 Replies
445 Views
I've set up a scene with SceneKit where a ball is rolling on a plane. I've simulated the grass friction with applyForce() in order to make it stop after a while (the damping and friction parameters do not achieve that effect). Everything works well until I play with runtime parameters like scnScene.physicsWorld.timeStep, sceneView.preferredFramesPerSecond, and scnScene.physicsWorld.speed. I did a comparison with the default gravity. Let's take a plane with a light slope and launch a ball on it:

let planeSlope = (Float.pi / 180.0) * 3.0
planeNode.rotation = SCNVector4(x: 0, y: 0, z: 1, w: planeSlope)
...
let impulse = Float(ballMass) * speed // N.s = m*v
let forceVector = simd_float3(1.0, 0.0, 0.0)
let force = SCNVector3(forceVector * impulse)
ballNode.physicsBody?.applyForce(force, asImpulse: true)

The ball climbs the slope up to a certain distance and then comes back. If I change timeStep, FPS, or speed, the ball still reaches the same point. That's great. Now I disable gravity and create a custom force to simulate it using applyForce():

scnScene.physicsWorld.gravity = SCNVector3(0.0, 0.0, 0.0)

func physicsWorld(_ physicsWorld: SCNPhysicsWorld, didUpdate physicsContact: SCNPhysicsContact) {
    ...
    let gravity = simd_float3(0.0, -9.8, 0.0)
    let mass = (ballNode.physicsBody?.mass)!
    let force = SCNVector3(gravity * Float(mass))
    ballNode.physicsBody?.applyForce(force, asImpulse: false)
}

This gives exactly the same good result with the default constants:

scnScene.physicsWorld.speed = 1.0
scnScene.physicsWorld.timeStep = 1.0/60.0
sceneView.preferredFramesPerSecond = 60

But as soon as I change one of them, the distance changes:

scnScene.physicsWorld.speed = 2.0
scnScene.physicsWorld.timeStep = 1.0/120.0
sceneView.preferredFramesPerSecond = 30

So I needed to take them into account in the force:

func physicsWorld(_ physicsWorld: SCNPhysicsWorld, didUpdate physicsContact: SCNPhysicsContact) {
    ...
    var gravity = simd_float3(0.0, -9.8, 0.0) // var, since it is scaled below
    // Compensate for timeStep, FPS, and speed!?
    gravity *= Float(scnScene.physicsWorld.timeStep) / (1.0 / 60.0)
    gravity *= (1.0 / 60.0) / (1.0 / Float(sceneView.preferredFramesPerSecond))
    gravity *= 1.0 / Float(scnScene.physicsWorld.speed)
    let mass = (ballNode.physicsBody?.mass)!
    let force = SCNVector3(gravity * Float(mass))
    ballNode.physicsBody?.applyForce(force, asImpulse: false)
}

That's weird; I guess I missed something? The risk is that if the FPS changes dynamically due to GPU overload, the result will differ.
Posted by Manu3b. Last updated.