SceneKit

RSS for tag

Create 3D games and add 3D content to apps using high-level scene descriptions with SceneKit.

SceneKit Documentation

Posts under SceneKit tag

122 Posts
Sort by:
Post not yet marked as solved
0 Replies
151 Views
Hi all, I am currently making a fitness app and want to replicate the way Apple incorporates medals in its Fitness app. There is a page called Awards that looks like a collection view of different medals the user can unlock. After unlocking a medal, the user can tap on it, and it opens a full-screen interactive scene where the user can rotate the medal, etc. I have had a play around with SceneKit and managed to load a medal into a scene and display it in the app. However, my version does not look as smooth or as polished as Apple's. Does anyone have any idea how they managed to present all of the different SceneKit scenes in a collection view, for example? Or how they achieved the smooth transition between tapping the medal in the collection view and the full-screen scene? I guess this is created using SwiftUI. Can it be replicated in Storyboards? Thanks!
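[Editor's note] A possible starting point, as an untested sketch: give each collection view cell its own SCNView showing the medal scene. All names here (MedalCell, the configure method) are hypothetical, not Apple's implementation:

```swift
import UIKit
import SceneKit

// Hypothetical cell: one lightweight SCNView per medal.
final class MedalCell: UICollectionViewCell {
    let medalView = SCNView()

    override init(frame: CGRect) {
        super.init(frame: frame)
        medalView.frame = contentView.bounds
        medalView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        medalView.backgroundColor = .clear
        medalView.allowsCameraControl = false  // rotation only in the full-screen detail
        contentView.addSubview(medalView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func configure(with scene: SCNScene) {
        medalView.scene = scene
    }
}
```

This works from Storyboards as well as SwiftUI. For the smooth tap-to-full-screen transition, one approach (an assumption about how it could be done, not how Apple did it) is a custom UIViewControllerAnimatedTransitioning that animates the tapped cell's frame into the full-screen SCNView, where allowsCameraControl is then enabled.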
Posted
by
Post marked as solved
1 Reply
239 Views
My SceneKit game failed with a com.apple.scenekit.scnview-renderer (10): signal SIGABRT. The error was marked on the @main line. Here's the log navigator output:

2022-05-25 15:24:18.829319+0800 MyWorldiOS[9022:293392] Metal API Validation Enabled
validateRenderPassDescriptor:899: failed assertion `RenderPass Descriptor Validation
MTLRenderPassAttachmentDescriptor MTLStoreActionMultisampleResolve store action for the depth attachment is not supported by device
PixelFormat MTLPixelFormatDepth32Float cannot be a MSAA resolve target'
dyld4 config: DYLD_ROOT_PATH=/Applications/Xcode-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot DYLD_LIBRARY_PATH=/Users/wangkeshijian/Library/Developer/Xcode/DerivedData/MyWorld-aayoxjgvyfzbxvgqnvylzgvlwkyr/Build/Products/Debug-iphonesimulator:/Applications/Xcode-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/usr/lib/system/introspection DYLD_INSERT_LIBRARIES=/Applications/Xcode-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/usr/lib/libBacktraceRecording.dylib:/Applications/Xcode-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/usr/lib/libMainThreadChecker.dylib:/Applications/Xcode-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib DYLD_FRAMEWORK_PATH=/Users/wangkeshijian/Library/Developer/Xcode/DerivedData/MyWorld-aayoxjgvyfzbxvgqnvylzgvlwkyr/Build/Products/Debug-iphonesimulator
validateRenderPassDescriptor:899: failed assertion `RenderPass Descriptor Validation
MTLRenderPassAttachmentDescriptor MTLStoreActionMultisampleResolve store action for the depth attachment is not supported by device
PixelFormat MTLPixelFormatDepth32Float cannot be a MSAA resolve target'
CoreSimulator 802.6 - Device: MyPhone (EBB1ECDE-8AD7-4418-84AF-0B761E0A2EA7) - Runtime: iOS 15.4 (19E5234a) - DeviceType: iPhone 12
(lldb)

I'm not sure what else I should put in here.
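[Editor's note] The assertion complains that the simulator's Metal device cannot resolve a multisampled Depth32Float attachment. A hedged workaround sketch (an assumption, not a confirmed fix): disable SceneKit's multisampling when running in the simulator, so the MSAA resolve path is never taken:

```swift
import SceneKit

// Sketch: avoid the MSAA depth resolve on simulator Metal devices.
func configureAntialiasing(for scnView: SCNView) {
    #if targetEnvironment(simulator)
    scnView.antialiasingMode = .none   // skip multisampling entirely
    #else
    scnView.antialiasingMode = .multisampling4X
    #endif
}
```

Running on a physical device, or updating to a newer Xcode/simulator runtime, may also avoid the assertion.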
Posted
by
Post not yet marked as solved
0 Replies
165 Views
I want to crop the USDZ model at runtime. I use ModelIO for this. Before: https://i.stack.imgur.com/yDXXF.jpg After: https://i.stack.imgur.com/m9ryg.jpg First of all, I get the file from the bundle:

let url = URL(fileURLWithPath: file)

(with an else branch that prints "Object not found in Bundle" when the lookup fails). Then I access the asset:

let asset = MDLAsset(url: url)

What should I do after this step? How am I supposed to use the SCNGeometrySource and SCNGeometryElement or MDLVoxelArray classes?
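[Editor's note] A hedged sketch of one possible next step: bridge the MDLAsset into SceneKit and read the vertex data you would need for a crop. The crop itself (rebuilding a smaller SCNGeometry from the vertices that survive a bounding-box test) is not shown; this only demonstrates getting at the data:

```swift
import ModelIO
import SceneKit
import SceneKit.ModelIO

// `url` is assumed to be the bundle URL obtained above.
let asset = MDLAsset(url: url)
let scene = SCNScene(mdlAsset: asset)   // ModelIO -> SceneKit bridging

scene.rootNode.enumerateHierarchy { node, _ in
    guard let geometry = node.geometry else { return }
    // Vertex positions you could filter against a crop volume:
    if let source = geometry.sources(for: .vertex).first {
        print("\(source.vectorCount) vertices, \(geometry.elements.count) elements")
    }
}
```

MDLVoxelArray is the alternative route (voxelize, edit the voxel set, then regenerate a mesh), but it is lossier than filtering the mesh's own vertex and index buffers.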
Posted
by
Post not yet marked as solved
2 Replies
195 Views
I am new to Swift. I want to display, in real time on iOS, the point cloud data collected by the lidar on a robot. Any suggestions? Thanks.
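[Editor's note] One possible approach (a sketch; the network transport from the robot is not shown, and `points` is assumed input): build an SCNGeometry of point primitives and swap it in as new frames arrive.

```swift
import SceneKit

// Sketch: turn an array of positions into a renderable point cloud node.
func pointCloudNode(points: [SIMD3<Float>]) -> SCNNode {
    let stride = MemoryLayout<SIMD3<Float>>.stride
    let vertexData = Data(bytes: points, count: points.count * stride)
    let source = SCNGeometrySource(data: vertexData,
                                   semantic: .vertex,
                                   vectorCount: points.count,
                                   usesFloatComponents: true,
                                   componentsPerVector: 3,
                                   bytesPerComponent: MemoryLayout<Float>.size,
                                   dataOffset: 0,
                                   dataStride: stride)
    // nil index data renders the vertices directly as points.
    let element = SCNGeometryElement(data: nil,
                                     primitiveType: .point,
                                     primitiveCount: points.count,
                                     bytesPerIndex: 0)
    element.pointSize = 2
    element.minimumPointScreenSpaceRadius = 1
    element.maximumPointScreenSpaceRadius = 5
    return SCNNode(geometry: SCNGeometry(sources: [source], elements: [element]))
}
```

Replace the node's geometry on the main thread each time a new lidar frame arrives; rebuilding the geometry is cheap enough for typical frame sizes.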
Posted
by
Post not yet marked as solved
0 Replies
104 Views
I am facing issues baking light probes on iOS. The same logic bakes light probes on macOS successfully. iOS throws the following exception:

[MTLDebugCommandBuffer waitUntilCompleted]:201: failed assertion `waitUntilCompleted on uncommitted command buffer'

Some specs: Runtime: iOS 15.4 - DeviceType: iPhone 13 Pro Max. I created two template apps from Xcode (one for iOS and the other for macOS). The following is the light probe bake code, added in viewDidLoad:

SCNScene *scene = [SCNScene scene];
SCNNode *ambientLight = [SCNNode node];
ambientLight.light = [SCNLight light];
ambientLight.light.type = SCNLightTypeAmbient;
ambientLight.light.color = [UIColor whiteColor];
ambientLight.light.intensity = 1000.0;
[scene.rootNode addChildNode:ambientLight];
scene.background.contents = [UIColor whiteColor];
scene.background.intensity = 2000.;
SCNNode *probe1 = [SCNNode node];
probe1.position = SCNVector3Make(-0.493530, 1.7285934, -0.150000);
probe1.light = [SCNLight light];
probe1.light.type = SCNLightTypeProbe;
[scene.rootNode addChildNode:probe1];
SCNRenderer *probeRenderer = [SCNRenderer rendererWithDevice:nil options:nil];
probeRenderer.scene = scene;
NSArray<SCNNode *> *probes = [NSArray arrayWithObjects:probe1, nil];
[probeRenderer updateProbes:probes atTime:1.0];

The crash occurs at updateProbes. I have also logged and checked the 27 floats, and they are not garbage on macOS, so the bake is working as expected there. Any help would be really appreciated!
Posted
by
Post marked as solved
1 Reply
224 Views
How can I crop a 3D model as seen in the photos? Should I use MetalKit, or can I handle it with SceneKit and ModelIO? I couldn't find any code examples on this topic. Could you share a code snippet? Before: https://i.stack.imgur.com/yDXXF.jpg After: https://i.stack.imgur.com/m9ryg.jpg
Posted
by
Post not yet marked as solved
1 Reply
356 Views
I'm not sure which combination of iOS/Xcode/macOS is causing this issue, but all of a sudden, when I try to run our SceneKit app with the Scheme -> Diagnostics -> Metal -> API Validation setting turned off, the scene won't render and the console fills with the following errors:

Execution of the command buffer was aborted due to an error during execution. Invalid Resource (00000009:kIOGPUCommandBufferCallbackErrorInvalidResource)
[SceneKit] Error: Main command buffer execution failed with status 5, error: Error Domain=MTLCommandBufferErrorDomain Code=9 "Invalid Resource (00000009:kIOGPUCommandBufferCallbackErrorInvalidResource)"

If you run the app outside of Xcode it's fine; enabling the API Validation option also stops the issue. One of my schemes has had this option disabled since the project began and never had an issue before. Just throwing this out there in case someone else has spent hours of their life trying to figure out why this is not working for them. You can also reproduce it by creating a new SceneKit project and turning that diagnostic option off: the app won't render anything.
Posted
by
Post marked as solved
1 Reply
217 Views
I need some help adding an .scn file to my submission for this year. I've tried creating an art.scnassets folder from Xcode within the project, adding the .scn file, and loading it with SCNScene(named: "scene.scn"), but I get a blank SCNView, and printing the value of the SCNScene gives nil. Please help with this ASAP.
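[Editor's note] A hedged sketch of the usual fix: when the .scn file lives inside a catalog folder such as art.scnassets, the name passed to SCNScene(named:) must include the folder, and the catalog must be a member of the app or playground target. The names below mirror the question and are otherwise assumptions:

```swift
import SceneKit

if let scene = SCNScene(named: "art.scnassets/scene.scn") {
    sceneView.scene = scene   // `sceneView` is your SCNView
} else {
    // Diagnose: nil here means the file never made it into the bundle
    // (check Target Membership), or the path inside the bundle differs.
    print(Bundle.main.url(forResource: "scene", withExtension: "scn") as Any)
}
```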
Posted
by
Post not yet marked as solved
1 Reply
204 Views
I am using scene.write(to:"dirpath\name.usdz") to get USDZ export functionality into my app (universal, macOS & iOS). My problem is that it ceases to work after the first use; quitting and restarting the app is the only way to re-enable it. I have tried reusing the same scene and instantiating a new scene (both ways with the exact same node structure), with the same results every time: the first invocation writes a file of ~14 MB, and any calls after that write 1.5-2 KB of garbage. I use a unique filename for each write and check to make sure it doesn't already exist. Any ideas?
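[Editor's note] For reference, a sketch of the export call with a unique destination and explicit error reporting; the completion checks are assumptions about where the failure might surface, not a confirmed fix:

```swift
import SceneKit

// Sketch: export to a UUID-named file and report any error from the
// progress handler, plus the resulting file size.
func exportUSDZ(_ scene: SCNScene, to directory: URL) {
    let url = directory.appendingPathComponent(UUID().uuidString + ".usdz")
    let ok = scene.write(to: url, options: nil, delegate: nil) { _, error, _ in
        if let error = error { print("export error:", error) }
    }
    let size = (try? FileManager.default
        .attributesOfItem(atPath: url.path)[.size]) ?? "unknown"
    print("write returned \(ok), size: \(size)")
}
```

If the progress handler reports an error only on the second and later calls, that narrows the bug; it is also worth testing whether calling write(to:) from the main thread changes the behavior.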
Posted
by
Post not yet marked as solved
1 Reply
198 Views
Hi, I just wanted to display a SpriteKit scene on an SCNPlane, so I set the SCNMaterial contents to my SKScene, but instead of getting the scene I'm getting a grey plane. This is my code, by the way:

var mainScene: SKScene {
    let scene = Game()
    scene.size = CGSize(width: 1024, height: 1024)
    scene.scaleMode = .resizeFill
    scene.backgroundColor = .purple
    scene.view?.backgroundColor = .purple
    scene.view?.allowsTransparency = false
    return scene
}

func initMainScene() -> SceneView {
    mainScene.view?.isPaused = false
    let scene = SCNScene()
    let mainSceneMaterial = SCNMaterial()
    mainSceneMaterial.normal.contents = mainScene
    mainSceneMaterial.isDoubleSided = true
    let planeGeometry = SCNPlane(width: 1, height: 1)
    planeGeometry.materials = [mainSceneMaterial]
    let plane: SCNNode = SCNNode(geometry: planeGeometry)
    let camera: SCNNode = SCNNode()
    camera.name = "Camera"
    camera.camera = SCNCamera()
    camera.position = SCNVector3(x: 0.0, y: 0.0, z: 4.0)
    let light: SCNNode = SCNNode()
    light.light = SCNLight()
    light.light!.type = .omni
    light.position = SCNVector3(x: 1.5, y: 1.5, z: 1.5)
    scene.rootNode.addChildNode(camera)
    scene.rootNode.addChildNode(light)
    scene.rootNode.addChildNode(plane)
    return SceneView(
        scene: scene,
        pointOfView: scene.rootNode.childNode(withName: "Camera", recursively: false),
        options: []
    )
}

Here is the screenshot. Also, my SpriteKit scene has touchesBegan and touchesMoved functions implemented; will those events still work if I embed the scene in the SCNMaterial? Thanks very much 🙏
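[Editor's note] A likely cause of the grey plane in the code above: the SKScene is assigned to the material's normal-map slot, which expects normal-vector data rather than colors. Assigning it to the diffuse slot renders the SpriteKit content:

```swift
import SceneKit
import SpriteKit

// Sketch: put the SpriteKit scene in the diffuse slot, not the normal map.
// `mainScene` is the SKScene from the question above.
let mainSceneMaterial = SCNMaterial()
mainSceneMaterial.diffuse.contents = mainScene   // was: .normal.contents
mainSceneMaterial.isDoubleSided = true
```

As for touches: they are not forwarded automatically when the SKScene is used as a material. The usual pattern (an assumption, worth verifying) is to hit-test taps on the SCNView, read the texture coordinates from the SCNHitTestResult, and deliver the event to the SKScene yourself.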
Posted
by
Post not yet marked as solved
2 Replies
342 Views
I want to remove unnecessary materials or textures in order to reduce the size of a USDZ model I have. How can I manipulate this model with Swift? I'm also open to any other advice for reducing the size of the USDZ model.
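[Editor's note] A hedged sketch of one option: load the model into an SCNScene, clear the texture slots you don't need, and re-export. Whether this shrinks the file depends on whether textures or geometry dominate its size; the slots cleared below are examples, not a recommendation for every model:

```swift
import SceneKit

// Sketch: strip selected texture slots from every material in the scene.
func stripExtraTextures(in scene: SCNScene) {
    scene.rootNode.enumerateHierarchy { node, _ in
        for material in node.geometry?.materials ?? [] {
            material.normal.contents = nil
            material.ambientOcclusion.contents = nil
            material.emission.contents = nil
        }
    }
}
```

After stripping, export with scene.write(to:options:delegate:progressHandler:) to a new .usdz and compare file sizes. Downscaling the remaining diffuse textures before export is often the bigger win.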
Posted
by
Post not yet marked as solved
1 Reply
278 Views
Hi, I want to build an ARKit + SceneKit (using Storyboard) game for this year's WWDC Swift Student Challenge. However, submissions must be in swiftpm format, and it seems that swiftpm only supports SwiftUI. I'm not too familiar with SwiftUI, so can I still use UIKit, ARKit, and SceneKit in swiftpm? Or must I build something with SwiftUI? Thanks!
Posted
by
Post not yet marked as solved
1 Reply
146 Views
I'm having the most frustrating time trying to get SceneKit to render custom geometry on Monterey on an MBP M1 Max. After some mucking about, I'm starting to suspect that the internal floating-point representation has shifted to 64-bit?!? I'm wondering if this is also the case for element buffers, because rendering with UInt32 element arrays is FUBAR. Unfortunately, SceneKit won't allow me to create 64-bit element arrays.
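[Editor's note] As a sanity check, here is a minimal custom-geometry sketch with every stride and index size spelled out explicitly, which removes any guessing about internal representations: a single triangle using Float32 vertices and UInt32 indices.

```swift
import SceneKit

// One triangle: 3 vertices of 3 Floats each, indexed by UInt32.
let vertices: [Float] = [0, 0, 0,   1, 0, 0,   0, 1, 0]
let indices: [UInt32] = [0, 1, 2]

let vertexSource = SCNGeometrySource(
    data: vertices.withUnsafeBufferPointer { Data(buffer: $0) },
    semantic: .vertex,
    vectorCount: 3,
    usesFloatComponents: true,
    componentsPerVector: 3,
    bytesPerComponent: MemoryLayout<Float>.size,
    dataOffset: 0,
    dataStride: MemoryLayout<Float>.size * 3)

let element = SCNGeometryElement(
    data: indices.withUnsafeBufferPointer { Data(buffer: $0) },
    primitiveType: .triangles,
    primitiveCount: 1,
    bytesPerIndex: MemoryLayout<UInt32>.size)

let triangle = SCNGeometry(sources: [vertexSource], elements: [element])
```

If this renders but the larger mesh does not, the mismatch is likely in a stride, offset, or bytesPerIndex in the real data, rather than a 64-bit representation change.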
Posted
by
edj
Post not yet marked as solved
0 Replies
120 Views
Hi, I'm building a SceneKit app and ran into a weird situation. From time to time I can't see any virtual objects in the AR scene, though it works perfectly in most cases. At those moments, sceneView.pointOfView.position.x/y/z is NaN, and the rotation is the same.

guard let camera = sceneView?.pointOfView else {
    print("Get phonePosition Fail")
    return nil
}

// TODO: Don't know why sceneView?.pointOfView gets nothing (NaN). Should find out why.
if camera.position.x.isNaN {
    if let session = sceneView?.session {
        interrupted?(session)
    }
    return nil
}

Has anyone handled this case before?
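[Editor's note] One way to handle it defensively (a sketch; restarting tracking when the transform goes NaN is an assumption about the right recovery, and the function name is hypothetical):

```swift
import ARKit
import SceneKit

// Sketch: treat a NaN camera transform as a broken session and reset
// tracking instead of rendering with the bad values.
func validatedCameraPosition(in sceneView: ARSCNView) -> SCNVector3? {
    guard let pov = sceneView.pointOfView else { return nil }
    let p = pov.position
    guard !p.x.isNaN, !p.y.isNaN, !p.z.isNaN else {
        sceneView.session.run(ARWorldTrackingConfiguration(),
                              options: [.resetTracking, .removeExistingAnchors])
        return nil
    }
    return p
}
```

NaN transforms typically follow a tracking interruption (app backgrounded, camera covered), so watching ARSessionObserver's interruption callbacks may reveal when the bad state begins.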
Posted
by
Post not yet marked as solved
1 Reply
230 Views
How do you set the size of an area light in SceneKit?

let areaLightNode = SCNNode()
areaLightNode.light = SCNLight()
areaLightNode.light!.type = .area
areaLightNode.position = SCNVector3(x: 0, y: 5, z: 0)
areaLightNode.light?.intensity = 1000

Nothing appears when using the above.

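[Editor's note] A hedged sketch of what is likely missing: an area light's size comes from its areaExtents (together with areaType); without them the emitter can have no usable extent. The values below are illustrative:

```swift
import SceneKit

let areaLight = SCNLight()
areaLight.type = .area
areaLight.areaType = .rectangle
areaLight.areaExtents = simd_float3(2, 1, 1)  // emitter width/height (meters)
areaLight.intensity = 1000
areaLight.drawsArea = true                    // renders the emitter shape, handy for debugging

let areaLightNode = SCNNode()
areaLightNode.light = areaLight
areaLightNode.position = SCNVector3(x: 0, y: 5, z: 0)
```

Note also (worth verifying against the docs): area lights illuminate materials using the physically based lighting model, so a scene of default-lit materials may still show nothing.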
Posted
by
Post not yet marked as solved
1 Reply
269 Views
I'm trying to populate a scene view (for 3D objects) inside a table cell as a subview. The first load works fine, but as new child nodes are added or updated, the view does not reflect them; it remains static (objects are not updated in the view). With the renderer delegate I can confirm the child node values are updated, but the view doesn't change; we have to switch between different scenes before the view loads properly. How do I update or refresh the subview continuously?
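[Editor's note] A hedged sketch of one likely cause: an SCNView pauses rendering when it believes nothing is animating, so node changes made outside its render loop never appear. Forcing continuous playback makes them visible:

```swift
import SceneKit

// Sketch: keep the per-cell SCNView rendering every frame.
func keepRendering(_ scnView: SCNView) {
    scnView.rendersContinuously = true
    scnView.isPlaying = true
}
```

Node mutations should also happen on the main thread (or inside an SCNTransaction) so the renderer picks them up consistently.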
Posted
by
Post not yet marked as solved
0 Replies
353 Views
Good morning, we are creating an AR app made with Unity and AR Foundation, and we would like to associate our app with an App Clip. Is it possible to create an App Clip from a Unity app? I understand that Unity builds may be too heavy to be used as App Clips. Otherwise, is it possible to associate a Unity app with an App Clip built with SceneKit or RealityKit and upload it to the App Store? How could we achieve that? Is it possible to add a custom App Clip to an app archive made with Unity? Thanks in advance!
Posted
by