SceneKit


Create 3D games and add 3D content to apps using SceneKit's high-level scene descriptions.

Posts under SceneKit tag

74 Posts

Post · Replies · Boosts · Views · Activity

SCNNode Pivot and Position
Hi, I am initializing an SCNNode from an OBJ file. Let's suppose the object is a sphere, and its pivot after loading it from the OBJ file is at the bottom of the sphere (where it would rest on the floor). Its default position is the zero vector. However, I must change the pivot to the center of the sphere. After doing so (based on its bounding box), since the position is still the zero vector, does that mean the object was translated so that the new pivot lies at (0,0,0)? Or should I set its position to (0,0,0), which will now be interpreted relative to the new pivot? To test whether this is needed, I am using a separate button to change the node's position to (0,0,0) after changing its pivot, but I do not see any visual change, which leads me to believe that after changing the pivot, the object is automatically moved to (0,0,0) based on its new pivot. This probably happens faster than the scene renders, which is why I do not notice any difference between the two methods. I cannot tell which of the two is correct, meaning I do not know whether I should set the position to (0,0,0) again after changing the pivot or not. Right now it seems to make no difference. Any thoughts?
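A minimal sketch of the pivot change in question, assuming `node` is the SCNNode loaded from the OBJ file (the variable name is illustrative):

import SceneKit

let (minVec, maxVec) = node.boundingBox
let center = SCNVector3((minVec.x + maxVec.x) / 2,
                        (minVec.y + maxVec.y) / 2,
                        (minVec.z + maxVec.z) / 2)
// Changing the pivot shifts where the node's content is drawn relative to its
// position; the position property itself is not rewritten by SceneKit.
node.pivot = SCNMatrix4MakeTranslation(center.x, center.y, center.z)
// If the position was already the zero vector, setting it to zero again is a no-op,
// which would explain why no visual difference is observed between the two methods.
node.position = SCNVector3Zero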
1
0
223
1w
SceneKit error
When loading a point-cloud 3D model into an SCNView created from a xib, it fails to display on an iPhone 12 but displays correctly on an iPhone 13. The error output is:
Execution of the command buffer was aborted due to an error during execution. Discarded (victim of GPU error/recovery) (00000005:kIOGPUCommandBufferCallbackErrorInnocentVictim)
2024-07-10 11:01:22.403196+0800 不愁物联网[26648:1375452] Execution of the command buffer was aborted due to an error during execution. Discarded (victim of GPU error/recovery) (00000005:kIOGPUCommandBufferCallbackErrorInnocentVictim)
2024-07-10 11:01:22.403458+0800 不愁物联网[26648:1375452] [SceneKit] Error: Resource command buffer execution failed with status 5, error: Error Domain=MTLCommandBufferErrorDomain Code=1 "Discarded (victim of GPU error/recovery) (00000005:kIOGPUCommandBufferCallbackErrorInnocentVictim)" UserInfo={NSLocalizedDescription=Discarded (victim of GPU error/recovery) (00000005:kIOGPUCommandBufferCallbackErrorInnocentVictim)} ( )
2024-07-10 11:01:22.403539+0800 不愁物联网[26648:1375452] Execution of the command buffer was aborted due to an error during execution. Caused GPU Address Fault Error (0000000b:kIOGPUCommandBufferCallbackErrorPageFault)
2024-07-10 11:01:22.403556+0800 不愁物联网[26648:1375452] Execution of the command buffer was aborted due to an error during execution. Caused GPU Address Fault Error (0000000b:kIOGPUCommandBufferCallbackErrorPageFault)
2024-07-10 11:01:22.403586+0800 不愁物联网[26648:1375452] [SceneKit] Error: Main command buffer execution failed with status 5, error: Error Domain=MTLCommandBufferErrorDomain Code=3 "Caused GPU Address Fault Error (0000000b:kIOGPUCommandBufferCallbackErrorPageFault)" UserInfo={NSLocalizedDescription=Caused GPU Address Fault Error (0000000b:kIOGPUCommandBufferCallbackErrorPageFault)} ( )
0
0
169
1w
Augmented Reality app unable to load the image from the camera
I have had an app on the App Store for many years that lets users post text into clouds in augmented reality. Yet last week, abruptly, upon installing the app on the iPhone, the screen started going totally dark and a list of barely comprehensible logs came up, of the kind: ARSCNCompositor <0x300ad0e00>: ARSCNCompositor (0, 0) initialization failed. Matting is not set up properly. many times, then ARWorldTrackingTechnique <0x106235180>: Unable to update pose [PredictorFailure] for timestamp 870.392108 ARWorldTrackingTechnique <0x106235180>: Unable to predict pose [1] for timestamp 870.392108 again several times, and then: ARWorldTrackingTechnique <0x106235180>: SLAM error callback: Error Domain=Slam Error Code=7 "Non fatal error occurred due to significant drop in a IMU data" UserInfo={NSDescription=Non fatal error occurred due to significant drop in a IMU data, NSLocalizedFailureReason=SlamEngineNodeGroup Failure: IMU issue: gyro data stream verification failed [Significant data drop]. Failed on timestamp: 870.413247, Last known timestamp: 865.350198, Delta: 5.063049, System timestamp: 870.415781, Delta between system and frame: 0.002534. } and then the pose issues again several times. I hoped the new beta version would have solved the issue, but that was not the case. Unfortunately, I do not know whether this depends on the beta version or some other issue, given that the app cannot be installed on the Mac simulator.
1
0
133
1w
How to create objects based on a list?
So I am trying to create a certain number of spheres in a SceneKit scene based on the number of objects in a list. I think I would call addChildNode in a for loop, but how would I assign them all the same type of model? Especially because sometimes the list will be empty, so the model would not need to show up at all.
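A minimal sketch of one way to do this, assuming `items` is the list driving the sphere count and `scene` is the SCNScene (both names are illustrative):

import SceneKit

let sphereGeometry = SCNSphere(radius: 0.1)   // one shared geometry keeps every sphere identical
for index in 0..<items.count {
    let sphereNode = SCNNode(geometry: sphereGeometry)
    sphereNode.position = SCNVector3(Float(index) * 0.3, 0, 0)   // spread the spheres out
    scene.rootNode.addChildNode(sphereNode)
}
// If items is empty, the loop body never runs and nothing is added to the scene.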
3
0
266
1w
SceneKit doesn't build, says it needs an output file
I am trying to make an app that uses SceneKit to display some 3D models, but I started it from the regular app template. When I try to build I get this error: shell script build rule for '/somePath/Scene.scn' must declare at least one output file. The .scn file is being provided to a view; is that not an output? Or is there some formatting issue that I need to solve?
2
0
195
1w
RealityKit, DrawableQueue, and synchronizing scene updates
I have a visionOS app that uses DrawableQueue and CADisplayLink to update an Entity, a TextureResource tied to the drawable, and a Material that uses that TextureResource. The TextureResource gets updated when a video frame is ready. Material properties can get updated from the video or from other sources. Current process: when each video frame is ready, we get the next drawable, render to it, present it, and make an Entity update (e.g. a transform). However, I'm experiencing jitter in the rendered content where the updates to the entity and the drawable being presented seem to be milliseconds off from each other. Should I be using Drawable.presentOnSceneUpdate() to ensure all updates happen in the same update cycle? And if so, do you have any additional details on how to correctly use this function (the docs are unclear)?
0
1
155
1w
Apply a SpriteKit scene as a material in Reality Composer Pro
Hello, I'm trying to move my app to visionOS. My app is used by pilots to study an airplane's systems: it is a 3D airplane cockpit built with SceneKit, and I use SpriteKit scenes to animate the cockpit instruments. SceneKit allows applying a SpriteKit scene as a material, so I could easily animate all the different instruments and indications there, but I can't find this option in Reality Composer Pro. Is this possible? Any suggestions I can look into to animate and simulate instruments?
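For reference, the SceneKit technique being described is roughly this, assuming `instrumentPanelNode` is the geometry node for an instrument and the SpriteKit content is built elsewhere (names are illustrative):

import SceneKit
import SpriteKit

let instrumentSKScene = SKScene(size: CGSize(width: 512, height: 512))
// ... add and animate SKNodes here for needles, digits, warning lamps, etc. ...
let material = SCNMaterial()
material.diffuse.contents = instrumentSKScene   // SceneKit renders the live SpriteKit scene as a texture
instrumentPanelNode.geometry?.firstMaterial = material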
2
0
270
2w
Xcode memory meter goes too high while converting a USDZ file to an SCN file programmatically
We have a requirement to add a usdz file to a UIView, show its content, archive the data, and save it to a file. When the user opens the file, we need to unarchive that usdz content, bind it to the UIView, and show it to the user. Initially, we created an SCNScene object by passing the usdz file URL, like below:

do {
    usdzScene = try SCNScene(url: usdzUrl)
} catch let error as NSError {
    print(error)
}

Since SCNScene supports the NSSecureCoding protocol, we directly archive that object, save it to a file, and load it back from the file, recreating the SCNScene object via the NSKeyedUnarchiver process. But for some files we noticed high memory consumption while archiving the SCNScene object in the line below:

func encode(with coder: NSCoder) {
    coder.encode(self.scnScene, forKey: "scnScene")
}

File reference link: toy_drummer_idle.usdz
When we analysed the Apple documentation (see the Discussion section), it says the scn file extension is the fastest format for processing, faster than usdz. So we used SCNScene's write feature to create a scn file from the given usdz file. After that, when we archive an SCNScene object created from the scn file URL, the archive process is much faster and does not consume as much memory; it is really faster than the previous case. But unfortunately, the SCNScene write method takes a lot of time for this conversion, the memory meter also goes high, and it can cause the app to crash. I checked the output file sizes as well: the given usdz file is 18 MB and the generated scn file is 483 MB, yet the SCNScene archive process is fast. Please analyse this case and provide some guidance on how we can optimise this behaviour. I really appreciate your feedback.

Full Code:

import UIKit
import SceneKit

class ViewController: UIViewController {
    var scnView: SCNView?
    var usdzScene: SCNScene?
    var scnScene: SCNScene?

    lazy var exportButton: UIButton = {
        let btn = UIButton(type: UIButton.ButtonType.system)
        btn.tag = 1
        btn.backgroundColor = UIColor.blue
        btn.addTarget(self, action: #selector(buttonPressed(_:)), for: .touchUpInside)
        btn.setTitle("USDZ to SCN", for: .normal)
        btn.setTitleColor(.white, for: .normal)
        btn.layer.borderColor = UIColor.gray.cgColor
        btn.titleLabel?.font = .systemFont(ofSize: 20)
        btn.translatesAutoresizingMaskIntoConstraints = false
        return btn
    }()

    func deleteTempDirectory(directoryName: String) {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let tempDirectory = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if FileManager.default.fileExists(atPath: URL(string: tempDirectory.absoluteString)!.path) {
            do {
                try FileManager.default.removeItem(at: tempDirectory)
            } catch let error as NSError {
                print(error)
            }
        }
    }

    func createTempDirectory(directoryName: String) -> URL? {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let toBeCreatedDirectoryUrl = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if !FileManager.default.fileExists(atPath: URL(string: toBeCreatedDirectoryUrl.absoluteString)!.path) {
            do {
                try FileManager.default.createDirectory(at: toBeCreatedDirectoryUrl, withIntermediateDirectories: true, attributes: nil)
            } catch let error as NSError {
                print(error)
                return nil
            }
        }
        return toBeCreatedDirectoryUrl
    }

    @IBAction func buttonPressed(_ sender: UIButton) {
        let scnFolderName = "SCN"
        let scnFileName = "3D"
        deleteTempDirectory(directoryName: scnFolderName)
        guard let scnDirectoryUrl = createTempDirectory(directoryName: scnFolderName) else { return }
        let scnFileUrl = scnDirectoryUrl.appendingPathComponent(scnFileName).appendingPathExtension("scn")
        guard let usdzScene else { return }
        let result = usdzScene.write(to: scnFileUrl, options: nil, delegate: nil, progressHandler: nil)
        if result {
            print("exporting process is success.")
        } else {
            print("exporting process is failed.")
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let usdzUrl: URL? = Bundle.main.url(forResource: "toy_drummer_idle", withExtension: "usdz")
        guard let usdzUrl else { return }
        do {
            usdzScene = try SCNScene(url: usdzUrl)
        } catch let error as NSError {
            print(error)
        }
        guard let usdzScene else { return }
        scnView = SCNView(frame: .zero)
        guard let scnView else { return }
        scnView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(scnView)
        self.view.addSubview(exportButton)
        NSLayoutConstraint.activate([
            scnView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
            scnView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor),
            scnView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            scnView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -30),
            exportButton.widthAnchor.constraint(equalToConstant: 200),
            exportButton.heightAnchor.constraint(equalToConstant: 40),
            exportButton.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
            exportButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
        ])
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.01) { [weak self] in
            guard let self else { return }
            loadModel(scene: usdzScene)
        }
    }

    func loadModel(scene: SCNScene) {
        guard let scnView else { return }
        scnView.autoenablesDefaultLighting = true
        scnView.scene = scene
        scnView.allowsCameraControl = true
    }
}

(Attachment: Screenshot 2024-06-20 at 12.23.40.png — https://developer.apple.com/forums/content/attachment/5ff0197b-7e0b-4efe-b295-cc205c73dc02)
0
0
248
Jun ’24
Capturing Perspective Camera View in RealityKit and Integrating SceneKit with RealityKit
Hello, I am currently developing an application using RealityKit and I've encountered a couple of challenges that I need assistance with: Capturing Perspective Camera View: I am trying to render or capture the view from a PerspectiveCamera in RealityKit/RealityView. My goal is to save this view of a 3D model as an image or video using a virtual camera. However, I'm unsure how to access or redirect the rendered output from a PerspectiveCamera within RealityKit. Is there an existing API or a recommended approach to achieve this? Integrating SceneKit with RealityKit: I've also experimented with using SCNNode and SCNCamera to capture the camera's view, but I'm wondering if SceneKit is directly compatible within a RealityKit scene, specifically within a RealityView. I would like to leverage the advanced features of RealityKit for managing 3D models. Is saving the virtual view of a camera supported, and if so, what are the best practices? Any guidance, sample code, or references to documentation would be greatly appreciated. Thank you in advance for your help!
3
1
327
Jun ’24
SceneKit or RealityKit for Non-AR Game Development
Hi everyone, I'm choosing a framework for developing a game that doesn't involve augmented reality (AR) and I'm unsure whether to use SceneKit or RealityKit. I would like to hear from Apple engineers on this matter. Which of these frameworks is better suited for creating non-AR games? Additionally, I'd like to know if it's possible to disable AR in RealityKit using the updated RealityView? Thanks in advance for your insights and recommendations!
2
0
546
Jun ’24
Wrong hitTest results in iOS 17.2
We're experiencing an issue with wrong SceneKit hit-testing results in iOS 17.2 compared with iOS 16.1, when using either the Metal or the OpenGL ES 2 rendering engine. We tap on a 3D model to place an SCNNode:

// pointInScene: tapped point
let hitResults = sceneView.hitTest(pointInScene, options: nil)
return hitResults.first { $0.node.name?.compare("node_name") == .orderedSame }
2
0
300
Jun ’24
Camera pointOfView vs. Error
Dear all, I have several scenes, each with its own camera at a different position. The scenes are loaded with transitions. If I set the pointOfView in every scene to that scene's camera, the transitions don't work properly: the active scene view jumps to the position of the camera of the scene that is fading in. If I comment the pointOfView out, the transitions work fine, but the following error message appears: Error: camera node already has an authoring node - skip Does someone have an idea how to fix this? Many thanks, Ray
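One avenue worth checking (a sketch, assuming `scnView` and `nextScene` are already set up and each scene contains a camera node named "camera"; the names are illustrative): SCNView can be handed the incoming scene's camera explicitly when presenting a transition, instead of pointOfView being changed separately.

import SceneKit
import SpriteKit

let incomingCamera = nextScene.rootNode.childNode(withName: "camera", recursively: true)
scnView.present(nextScene,
                with: SKTransition.fade(withDuration: 1.0),
                incomingPointOfView: incomingCamera,   // the transition targets the new scene's own camera
                completionHandler: nil)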
0
0
298
May ’24
Loading a .scnz file in Xcode / Displaying it in a view using Swift
Hello! I need to display a .scnz 3D model in an iOS app. I tried converting the file to a .scn file so I could use it with SCNScene but the file became corrupted. I also tried to instantiate a SCNScene with the .scnz file but that didn't work either (crash when instantiating it). After all this, what would be the best way to use this file knowing that converting it or exporting it to a .scn file with scntool hasn't worked? Thank you!
0
0
382
May ’24
What Platform to Use To Make a Game in Xcode?
I want to create a 3D video game in Xcode for macOS, iOS, iPadOS, tvOS, and visionOS. I have heard that there are a few different ways to go about this, such as MetalKit or SceneKit. These libraries seem to have few examples and little documentation, so I am wondering: Are they still being developed/supported? Which platform should I make a game in? Where are some resources to learn how to use these platforms? Are there other, better platforms that I am just not aware of? Thanks!
1
0
506
May ’24
Arranging Controller, Scenes & Views Structure
Dear all, I'm new to coding, for maybe the fifth time :), and I hope I find the right words. Right now I'm prototyping a 3D game with SceneKit for iOS devices. At the moment the prototype of the MainScene.scn (the first game scene) is ready. The 3D objects are located in the asset catalog, and the game behavior, with object movements, gyro data, 3D text and so on, is coded in GameViewController.swift. Now I'm at a point where I think I can do many things wrong, at the cost of days, weeks or months of work to restructure my app afterwards. So I want to understand what's the „sexiest" way to structure my project: scenes, views, controllers, etc. Simplified, my game will have a user interface to store the player name, change gameplay settings and so on, and it will have multiple 3D scenes where the game takes place. At first I thought it would be a good idea to arrange multiple scenes, all controlled by the GameViewController, so as not to duplicate recurring methods. But as I thought it through, I saw the GameViewController file growing bigger and bigger, and I feared losing focus more and more with every scene I added. The thought of a separate controller for each scene, or for a few scenes, isn't as „****" either, because when I change a recurring method I may have to change it in every controller. I then thought of instancing the GameViewController, but in no case did I have the feeling „that's the way to go". So, long story short: how would you arrange a game project like this? Thank you in advance, Ray
2
0
371
Apr ’24
Placing an object in AR without a hit test or plane detection
Hi, I want to place an object in 3D world space without the use of a hit test or plane detection, in iOS Swift code. Please suggest the best method. Right now I take the camera center matrix and use simd_mul to place the object; it works, but the object gets placed at the centre of the mobile screen. I want to select an x and y position in the screen's 2D coordinates and place the object there. I tried using the unprojectPoint function to get the AR scene world coordinate of the point I touch on the mobile screen. I get the x, y, z values; they are very close to the values from the camera center matrix. When I replace the values in the camera center matrix with the unprojectPoint values, I don't see a difference in the location of the placed object. The code below always places the object at the centre of the screen at the specified depth, but I need to place the object at a user-specified (x, y) position on the screen with depth, i.e. go from the 2D pixel coordinate system of the renderer to the 3D world coordinate system of the scene.

/* Create a transform with a translation of 0.2 meters in front of the camera. */
var translation = matrix_identity_float4x4
translation.columns.3.z = -0.2
let transform = simd_mul(view.session.currentFrame.camera.transform, translation)

Reference: https://developer.apple.com/documentation/arkit/arskview/providing_2d_virtual_content_with_spritekit

The code I used to replace the camera center matrix with the unprojectPoint result is:

let vpWithZ = SCNVector3(x: 100.0, y: 100.0, z: -1.0)
let worldPoint = sceneView.unprojectPoint(vpWithZ)
var translation = matrix_identity_float4x4
translation.columns.3.z = Float(Depth)
var translation2 = sceneView.session.currentFrame!.camera.transform
translation2.columns.3.x = worldPoint.x
translation2.columns.3.y = worldPoint.y
translation2.columns.3.z = worldPoint.z
let new_transform = simd_mul(translation2, translation)

/* add object name you wanted in your project */
let sphere = SCNSphere(radius: 0.03)
let objectNode = SCNNode(geometry: sphere)
objectNode.position = SCNVector3(x: transform.columns.3.x, y: transform.columns.3.y, z: transform.columns.3.z)

The image below shows an outline of my idea.
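A hedged sketch of one common way to go from a touched screen point plus a chosen distance to a world position, reusing the post's `objectNode` and assuming `sceneView` is the ARSCNView and `screenPoint` (CGPoint) and `depth` (Float, metres) come from the touch handler — these names are illustrative:

import ARKit
import SceneKit

// Unproject the touch at the near (z = 0) and far (z = 1) clip planes to build a ray.
let nearPoint = sceneView.unprojectPoint(SCNVector3(Float(screenPoint.x), Float(screenPoint.y), 0))
let farPoint  = sceneView.unprojectPoint(SCNVector3(Float(screenPoint.x), Float(screenPoint.y), 1))
var direction = SCNVector3(farPoint.x - nearPoint.x,
                           farPoint.y - nearPoint.y,
                           farPoint.z - nearPoint.z)
let length = sqrt(direction.x * direction.x + direction.y * direction.y + direction.z * direction.z)
direction = SCNVector3(direction.x / length, direction.y / length, direction.z / length)

// Walk `depth` metres along the ray and place the node there.
objectNode.position = SCNVector3(nearPoint.x + direction.x * depth,
                                 nearPoint.y + direction.y * depth,
                                 nearPoint.z + direction.z * depth)
sceneView.scene.rootNode.addChildNode(objectNode)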
0
0
382
Apr ’24
Dynamically Loading USDZ Objects from an Array into a fully immersive Scene using Reality Kit
Hello, I am currently working on a project where I am creating a bookstore visualization with racks and shelves (full immersive view). I have an array of names, each representing a USDZ object that is present in my working directory. Here's the enum I am trying to iterate over:

enum AssetName: String, Codable, Hashable, CaseIterable {
    case book1 = "B1"
    case book2 = "B2"
    case book3 = "B3"
    case book4 = "B4"
}

and the code for adding objects I wrote:

import SwiftUI
import RealityKit

struct LocalAssetRealityView: View {
    let assetName: AssetName

    var body: some View {
        RealityView { content in
            if let asset = try? await ModelEntity(named: assetName.rawValue) {
                content.add(asset)
            }
        }
    }
}

Now, when I try to add multiple objects on a button click, I get the error: Unable to present another Immersive Space when one is already requested or connected Please suggest any solutions. Also, please suggest whether anything can be done to add positions for the objects programmatically as well.
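A hedged sketch of loading every asset into a single RealityView inside one immersive space, building on the post's own AssetName enum (the view name and positions are purely illustrative):

import SwiftUI
import RealityKit

struct AllAssetsRealityView: View {
    var body: some View {
        RealityView { content in
            for (index, asset) in AssetName.allCases.enumerated() {
                if let entity = try? await ModelEntity(named: asset.rawValue) {
                    // Space the books out along the x axis, one metre in front of the origin.
                    entity.position = SIMD3<Float>(Float(index) * 0.5, 0, -1)
                    content.add(entity)
                }
            }
        }
    }
}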
1
0
497
Mar ’24
RealityKit dynamic shadows are flickering
I'm trying to add dynamic shadows by adding a directional light to the scene. I implemented a proof of concept based on the latest documentation. Basically, the way shadows are rendered in RealityKit is by adding a ModelEntity to an AnchorEntity with a target of type planes. The result is that I'm getting shadows that flicker terribly. I'd add that in SceneKit there are many more shadow-related properties that let you tweak the look and feel of the shadows, and it's not hard to get a decent shadow there. I'm wondering if accurate dynamic shadows are possible in RealityKit and, if not, whether there's a plan to fix this in the next RealityKit version.
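For comparison, the SceneKit shadow knobs the post alludes to look roughly like this (a sketch; the property values are illustrative and `scene` is assumed to exist):

import SceneKit
import UIKit

let light = SCNLight()
light.type = .directional
light.castsShadow = true
light.shadowMode = .deferred                       // render shadows in a separate pass
light.shadowRadius = 3                             // soften the shadow edge
light.shadowSampleCount = 16                       // more samples, less flicker/banding
light.shadowMapSize = CGSize(width: 2048, height: 2048)
light.shadowColor = UIColor.black.withAlphaComponent(0.6)

let lightNode = SCNNode()
lightNode.light = light
lightNode.eulerAngles = SCNVector3(-Float.pi / 3, 0, 0)   // angle the light downward
scene.rootNode.addChildNode(lightNode)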
0
0
440
Mar ’24