SceneKit

RSS for tag

Create 3D games and add 3D content to apps using high-level scene descriptions with SceneKit.

SceneKit Documentation

Posts under SceneKit tag

122 Posts
Post not yet marked as solved
2 Replies
678 Views
I've seen two crashes in the Fox sample code already for simple SCNVector functions. The enemies also don't move - I suppose that's a GameplayKit error. I can see glimpses of other possible problems too - and of course the intermittent display stutter is there. (You know, the one on every OS after High Sierra for any Metal-backed view.) I tried both M1 native and Rosetta. Unless I have some wrong setting, SceneKit's SCNVector math seems crippled at the moment. I've already sent reports... Can someone else confirm the crashes?
Posted
by
Post not yet marked as solved
6 Replies
4.6k Views
I am trying to use the new LiDAR scanner on the iPhone 12 Pro in order to gather point clouds, which will later be used as input data for neural networks. Since I am relatively new to the field of computer vision and augmented reality, I started by looking at the official code examples (e.g., Visualizing a Point Cloud Using Scene Depth - https://developer.apple.com/documentation/arkit/environmental_analysis/visualizing_a_point_cloud_using_scene_depth) and the documentation of ARKit, SceneKit, Metal and so on. However, I still do not understand how to get the LiDAR data. I found another thread in this forum (Exporting Point Cloud as 3D PLY Model - https://developer.apple.com/forums/thread/658109) and the given solution works so far. However, I unfortunately do not understand that code in detail, so I am not sure whether it really gives me the raw LiDAR data or whether some (internal) fusion with other (camera) data is happening, since I could not figure out where exactly the data comes from in the example. Could you please give me some tips or code examples on how to work with/access the LiDAR data? It would be very much appreciated!
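Not an authoritative answer, but a minimal sketch of where the LiDAR-derived depth surfaces in the API: with the .sceneDepth frame semantic enabled, every ARFrame exposes a depthMap and confidenceMap. Note that Apple describes this as depth produced by the LiDAR scanner with the help of camera data, so it is not completely raw sensor output. runDepthSession is my own helper name, not ARKit API:

```swift
import ARKit

// Sketch: enable scene depth on a world-tracking session.
func runDepthSession(in session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics = .sceneDepth
    }
    session.run(configuration)
}

// ARSessionDelegate callback: each frame now carries LiDAR-derived depth.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let sceneDepth = frame.sceneDepth else { return }
    // depthMap is a CVPixelBuffer of 32-bit floats in meters;
    // confidenceMap rates the reliability of each sample.
    let depthMap = sceneDepth.depthMap
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    print("LiDAR depth resolution: \(width)x\(height)")
}
```

To turn the depth map into a point cloud, each depth sample still has to be unprojected through the frame's camera intrinsics, which is what the linked sample's Metal shaders do.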
Posted
by
C3d
Post not yet marked as solved
2 Replies
582 Views
I'm doing an experiment integrating SwiftUI views as materials for an SCNPlane node in a SceneKit scene. It is working perfectly on iOS using UIHostingController with the following code:

func createInfoPanel() {
    let panel = SCNPlane(width: 6.0, height: 6.0)
    let panelNode = SCNNode(geometry: panel)
    let infoPanelHost = SCNHostingController(rootView: helloWorld)
    infoPanelHost.view.isOpaque = false
    infoPanelHost.view.backgroundColor = SCNColor.clear
    infoPanelHost.view.frame = CGRect(x: 0, y: 0, width: 256, height: 256)
    panel.materials.first?.diffuse.contents = infoPanelHost.view
    panel.materials.first?.emission.contents = infoPanelHost.view
    panel.materials.first?.emission.intensity = 3.0
    // [... BillBoardConstraint etc here ...]
    addNodeToScene(panelNode)
}

Yet when I tried to apply the same to macOS, I can't seem to make the view created by NSHostingController transparent. Invoking infoPanelHost.view.isOpaque = false returns an error saying isOpaque is read-only and can't be set. I tried subclassing NSHostingController and overriding viewWillAppear to make the view transparent / non-opaque, to no avail:

override func viewWillAppear() {
    super.viewWillAppear()
    self.view.wantsLayer = true
    self.view.layer?.backgroundColor = NSColor.clear.cgColor
    self.view.layer?.isOpaque = false
    self.view.opaqueAncestor?.layer?.backgroundColor = NSColor.clear.cgColor
    self.view.opaqueAncestor?.layer?.isOpaque = false
    self.view.opaqueAncestor?.alphaValue = 0.0
    self.view.alphaValue = 0.0
    self.view.window?.isOpaque = false
    self.view.window?.backgroundColor = NSColor.clear
}

I tried setting everything I could think of to non-opaque, as you can see, and still the panels are opaque, show no info, and obscure the 3D entity they should overlay... Can someone please advise?
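Two hedged observations, not a confirmed fix: setting alphaValue = 0.0 makes the hosting view fully transparent (which could explain the panel showing no info), and on macOS it may be easier to skip the controller and use NSHostingView, which is an NSView subclass you can make layer-backed directly. A sketch; helloWorld and the material wiring are assumed from the post:

```swift
import AppKit
import SwiftUI

// Sketch: build a layer-backed, clear-background SwiftUI host view
// suitable for assigning to an SCNMaterial property's contents.
func makeTransparentHostingView<V: View>(rootView: V) -> NSView {
    let hostingView = NSHostingView(rootView: rootView)
    hostingView.frame = CGRect(x: 0, y: 0, width: 256, height: 256)
    hostingView.wantsLayer = true
    hostingView.layer?.isOpaque = false
    hostingView.layer?.backgroundColor = NSColor.clear.cgColor
    // Deliberately do NOT touch alphaValue: 0.0 hides the content entirely.
    return hostingView
}
```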
Posted
by
Post not yet marked as solved
0 Replies
733 Views
Hello, I am preparing for WWDC 2021, but I encountered a small problem. I don't know how to use RealityKit to create facial geometry the way I can with SceneKit:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let faceMesh = ARSCNFaceGeometry(device: sceneView.device!)
    let node = SCNNode(geometry: faceMesh)
    node.geometry?.firstMaterial?.fillMode = .lines
    return node
}
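A rough sketch, my own rather than Apple sample code: RealityKit has no direct ARSCNFaceGeometry equivalent, but a mesh can be rebuilt from the face anchor's geometry each time it updates. This assumes the MeshDescriptor API introduced with iOS 15's RealityKit:

```swift
import ARKit
import RealityKit

// Sketch: build a RealityKit model entity from an ARFaceAnchor's geometry.
// There is no built-in wireframe fill mode here, unlike SceneKit's .lines.
func makeFaceModel(from faceAnchor: ARFaceAnchor) throws -> ModelEntity {
    let geometry = faceAnchor.geometry
    var descriptor = MeshDescriptor(name: "face")
    descriptor.positions = MeshBuffers.Positions(geometry.vertices)
    descriptor.primitives = .triangles(geometry.triangleIndices.map { UInt32($0) })
    let mesh = try MeshResource.generate(from: [descriptor])
    let material = SimpleMaterial(color: .white, isMetallic: false)
    return ModelEntity(mesh: mesh, materials: [material])
}
```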
Posted
by
Post not yet marked as solved
4 Replies
1.3k Views
My app is using ARKit and SceneKit to render some animation for user interaction. At some points it hangs with the below thread backtrace. Any way to tell what is blocking the main thread?

(lldb) thread backtrace
thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
  frame #0: 0x00000001b1dd5204 libsystem_kernel.dylib`__psynch_mutexwait + 8
  frame #1: 0x00000001cf935224 libsystem_pthread.dylib`_pthread_mutex_firstfit_lock_wait + 92
  frame #2: 0x00000001cf935174 libsystem_pthread.dylib`_pthread_mutex_firstfit_lock_slow + 216
  frame #3: 0x00000001b821e560 SceneKit`-[SCNRenderer _projectPoint:viewport:] + 220
  frame #4: 0x00000001b82c8438 SceneKit`-[SCNView projectPoint:] + 124
  frame #5: 0x000000010477d970 app`static OffScreenIndicator.updateOffScreenOn(displayView=0x00000001071c2b30, sceneView=0x00000001071d5820, target=0x0000000281a3c700, image=0x0000000281411050, color=0x0000000283c3d4c0, frame=0x000000015683e340, self=app.OffScreenIndicator) at OffScreenIndicator.swift:26:37
  frame #6: 0x000000010458d5d8 app`ARViewController.phoneIsFar(camPosition=SceneKit.SCNVector3 @ 0x000000016db94730, frame=0x000000015683e340, isTutorial=true, self=0x000000010883bc00) at ARViewController.swift:2434:48
  frame #7: 0x000000010458a0b4 app`ARViewController.scanUpdate(frame=0x000000015683e340, showTutorial=true, scanAllDoneAction=0x000000010459300c app`partial apply forwarder for closure #1 () -> () in app.ARViewController.(stateUpdate_tutorial in _ED82BCD2A98A9516DA8B452F58022553)(frame: __C.ARFrame) -> () at <compiler-generated>, self=0x000000010883bc00) at ARViewController.swift:2371:13
  frame #8: 0x0000000104560e14 app`ARViewController.stateUpdate_tutorial(frame=0x000000015683e340, self=0x000000010883bc00) at ARViewController.swift:2692:9
  frame #9: 0x00000001045609f4 app`implicit closure #22 in implicit closure #21 in ARViewController.bindStateFunctions(frame=0x000000015683e340, self=0x000000010883bc00) at ARViewController.swift:1048:34
  frame #10: 0x00000001045506fc app`ARViewController.updateState(frame=0x000000015683e340, self=0x000000010883bc00) at ARViewController.swift:1108:31
  frame #11: 0x0000000104550274 app`ARViewController.session(session=0x00000001568cbdd0, frame=0x000000015683e340, self=0x000000010883bc00) at ARViewController.swift:696:9
  frame #12: 0x0000000104550778 app`@objc ARViewController.session(_:didUpdate:) at <compiler-generated>:0
  frame #13: 0x00000001afa25248 ARKitCore`__36-[ARSession _sessionDidUpdateFrame:]_block_invoke + 128
  frame #14: 0x0000000106f37bcc libdispatch.dylib`_dispatch_call_block_and_release + 32
  frame #15: 0x0000000106f396c0 libdispatch.dylib`_dispatch_client_callout + 20
  frame #16: 0x0000000106f48f34 libdispatch.dylib`_dispatch_main_queue_callback_4CF + 1000
  frame #17: 0x0000000183e0111c CoreFoundation`__CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
  frame #18: 0x0000000183dfb120 CoreFoundation`__CFRunLoopRun + 2508
  frame #19: 0x0000000183dfa21c CoreFoundation`CFRunLoopRunSpecific + 600
  frame #20: 0x000000019b9c4784 GraphicsServices`GSEventRunModal + 164
  frame #21: 0x000000018683aee8 UIKitCore`-[UIApplication _run] + 1072
  frame #22: 0x000000018684075c UIKitCore`UIApplicationMain + 168
  frame #23: 0x00000001022766b4 app`main at AppDelegate.swift:14:7
  frame #24: 0x0000000183aba6b0 libdyld.dylib`start + 4

(lldb) thread list
Process 4264 stopped
  thread #1: tid = 0x131e7b, 0x00000001b1dd5204 libsystem_kernel.dylib`__psynch_mutexwait + 8, queue = 'com.apple.main-thread', stop reason = signal SIGSTOP
  thread #6: tid = 0x131f75, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'com.apple.uikit.eventfetch-thread'
  thread #9: tid = 0x131f7e, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'AVAudioSession Notify Thread'
  thread #11: tid = 0x131ffc, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'com.apple.NSURLConnectionLoader'
  thread #14: tid = 0x132099, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'com.apple.coreaudio.AQClient'
  thread #21: tid = 0x1320d9, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #27: tid = 0x132153, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'com.apple.CoreMotion.MotionThread'
  thread #28: tid = 0x132157, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #46: tid = 0x1321ff, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #49: tid = 0x132202, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #53: tid = 0x13228c, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #58: tid = 0x132397, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'com.apple.arkit.ardisplaylink.0x283d67a00'
  thread #59: tid = 0x132398, 0x00000001b1dd5f5c libsystem_kernel.dylib`__ulock_wait + 8, name = 'com.apple.scenekit.scnview-renderer', queue = 'com.apple.scenekit.renderingQueue.ARSCNView0x1071d5820'
  thread #60: tid = 0x1323a2, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'H11ANEServicesThread'
  thread #61: tid = 0x1323b3, 0x00000001b1db12d0 libsystem_kernel.dylib`mach_msg_trap + 8, name = 'H11ANEServicesThread'
  thread #62: tid = 0x13249f, 0x00000001b1db1324 libsystem_kernel.dylib`semaphore_timedwait_trap + 8
  thread #65: tid = 0x1324a8, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #67: tid = 0x1324df, 0x00000001b1db1324 libsystem_kernel.dylib`semaphore_timedwait_trap + 8
  thread #68: tid = 0x132505, 0x0000000198e9d150 libobjc.A.dylib`object_getClassName, queue = 'com.apple.libdispatch-manager'
  thread #69: tid = 0x132507, 0x0000000198e9d150 libobjc.A.dylib`object_getClassName
  thread #70: tid = 0x132537, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
  thread #71: tid = 0x132538, 0x00000001cf938764 libsystem_pthread.dylib`start_wqthread
(lldb)
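One plausible reading of the trace (my interpretation, not a confirmed diagnosis): the main thread calls SCNView.projectPoint(_:) from inside the ARSessionDelegate callback session(_:didUpdate:) (frames #4 and #11) and blocks on a SceneKit mutex, while the render thread 'com.apple.scenekit.scnview-renderer' is itself waiting (thread #59). Moving the projection work onto SceneKit's own render loop avoids that cross-thread contention. A sketch; ARViewController, sceneView, and updateOffScreenIndicator are stand-ins for the names in the trace:

```swift
import ARKit
import SceneKit

extension ARViewController: SCNSceneRendererDelegate {
    // Runs on SceneKit's rendering queue, so projectPoint doesn't have to
    // fight the renderer for its lock from the main thread.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let frame = sceneView.session.currentFrame else { return }
        // Project via the renderer passed into the callback,
        // not via the view from an ARSession delegate method.
        let projected = renderer.projectPoint(SCNVector3Zero)
        updateOffScreenIndicator(with: projected, frame: frame)
    }
}
```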
Posted
by
Post not yet marked as solved
2 Replies
553 Views
Hello, I'm building an AR image-tracking application that detects some images and plays a video on each image. In most cases there is no problem, but some images are not detected. So I tried these: when I init the ARReferenceImage, I set the physicalWidth value to be the same as the actual size (it is 10 cm), and I changed the contrast of the image. But it didn't work very well. I would be very grateful if you could give me a small hint. Here is my code.

// Model
struct PhotoCard {
    var imageName: String
    var image: UIImage
    var videoURL: URL
}

struct TrackingItem {
    var referenceImage: ARReferenceImage
    var videoURL: URL
}

final class CameraModel {
    private let dataManger = DataManger.shared()

    private func getPhotoCards() -> [PhotoCard] {
        // get photocard from server
        return dataManger.getPhotoCards()
    }

    func trackingItems() -> [TrackingItem] {
        let photocards: [PhotoCard] = getPhotoCards()
        var items: [TrackingItem] = []

        photocards.forEach { photoCard in
            guard let cgImage = photoCard.image.cgImage else { return }
            let referenceImage = ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.1)
            referenceImage.name = photoCard.imageName
            let item = TrackingItem(referenceImage: referenceImage, videoURL: photoCard.videoURL)
            items.append(item)
        }
        return items
    }
}

// ViewController
import SceneKit
import ARKit
import AVFoundation
import SpriteKit
import RxSwift

final class CameraViewController: VideoPlayerPresentableViewController, ARSCNViewDelegate {
    @IBOutlet private var sceneView: ARSCNView!
    private let model = CameraModel()
    private var data: [TrackingItem] = []
    private var anchorAndPlayerMap: [ARAnchor: AVPlayer] = [:] // implied by its use below

    override func viewDidLoad() {
        super.viewDidLoad()
        setupSceneView()
        setupTrackingConfiguration()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    private func setupSceneView() {
        sceneView.delegate = self
        let scene = SCNScene(named: "SceneAssets.scnassets/photocard.scn")!
        sceneView.scene = scene
    }

    private func setupTrackingConfiguration() {
        data = model.trackingItems()
        let configuration = ARImageTrackingConfiguration()
        configuration.isAutoFocusEnabled = true
        let trackingImages = data.map { $0.referenceImage }
        configuration.trackingImages = Set(trackingImages)
        configuration.maximumNumberOfTrackedImages = trackingImages.count
        sceneView.session.run(configuration)
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // (The post omits these two bindings, but the body below requires them.)
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let imageName = imageAnchor.referenceImage.name
        guard let current = data.first(where: { $0.referenceImage.name == imageName }) else { return }
        let referenceImage = imageAnchor.referenceImage
        let planeGeometry = SCNPlane(width: referenceImage.physicalSize.width, height: referenceImage.physicalSize.height)
        let plane = SCNNode(geometry: planeGeometry)
        plane.transform = SCNMatrix4MakeRotation(-.pi/2, 1, 0, 0)
        let videoSceneAndPlayer = makeVideoSceneAndPlayer(with: current.videoURL)
        planeGeometry.materials.first?.diffuse.contents = videoSceneAndPlayer.0
        node.addChildNode(plane)
        anchorAndPlayerMap[anchor] = videoSceneAndPlayer.1
        model.updatePlayYN(of: current)
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor, imageAnchor.isTracked == false else { return }
        sceneView.session.remove(anchor: anchor)
    }

    private func makeVideoSceneAndPlayer(with url: URL) -> (SKScene, AVPlayer) {
        let size = CGSize(width: 500, height: 500)
        let scene = SKScene(size: size)
        scene.scaleMode = .aspectFit
        let player = AVPlayer(url: url)
        let videoSpriteNode = SKVideoNode(avPlayer: player)
        videoSpriteNode.position = CGPoint(x: size.width/2, y: size.height/2)
        videoSpriteNode.size = size
        videoSpriteNode.yScale = -1
        scene.addChild(videoSpriteNode)
        addObserver(of: player)
        player.play()
        return (scene, player)
    }
}
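One hedged suggestion for the detection problem: ARKit can report, per image, whether a runtime-created reference image is a good tracking candidate. A small sketch (my addition, using ARReferenceImage's validate API, available on iOS 13 and later):

```swift
import ARKit

// Sketch: ask ARKit whether each runtime-created reference image is actually
// trackable. Low-contrast, repetitive, or feature-poor images produce an
// error here even if detection "sometimes" works in practice.
func validate(_ referenceImages: [ARReferenceImage]) {
    for image in referenceImages {
        image.validate { error in
            if let error = error {
                print("\(image.name ?? "unnamed") may track poorly: \(error.localizedDescription)")
            }
        }
    }
}
```

Images that fail validation usually need more visual texture or higher contrast rather than configuration changes.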
Posted
by
Post marked as solved
1 Reply
327 Views
I'm trying to create a custom ARSCNView.

class CustomARSCNView: ARSCNView, ARSessionDelegate, ARSCNViewDelegate {
    override init(frame: CGRect) {
        super.init(frame: frame)
        loadWorldMap()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override init(frame: CGRect, options: [String : Any]? = nil) {
        super.init(frame: frame, options: options)
        loadWorldMap()
    }

    public func loadWorldMap() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        config.environmentTexturing = .automatic
        self.debugOptions = [ARSCNDebugOptions.showFeaturePoints, .showWorldOrigin]
        session.run(config, options: [.resetTracking, .removeExistingAnchors])
        self.session.delegate = self
    }
}

Everything works pretty well - every attached session method is called predictably - but unfortunately the renderer methods are not called at all. Is there a mistake in my code?
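One likely cause (my diagnosis, not confirmed in the thread): ARSCNViewDelegate callbacks such as renderer(_:didAdd:for:) are delivered to the view's delegate property, which the code above never assigns; only session.delegate is set. A minimal sketch of the fix:

```swift
import ARKit

// Sketch: same setup as above, but wiring BOTH delegates.
public func loadWorldMap() {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = .horizontal
    config.environmentTexturing = .automatic
    debugOptions = [ARSCNDebugOptions.showFeaturePoints, .showWorldOrigin]
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
    session.delegate = self // ARSessionDelegate: session(_:didUpdate:) etc.
    delegate = self         // ARSCNViewDelegate: renderer(_:didAdd:for:) etc.
}
```

Also note that anchor-based renderer callbacks only fire once ARKit actually creates anchors, e.g. after plane detection finds a surface.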
Posted
by
Post not yet marked as solved
0 Replies
435 Views
Given six KTX, ASTC-compressed textures - all equal in size and attributes - a.ktx, b.ktx, c.ktx, d.ktx, e.ktx, & f.ktx, I can embed them in the bundle and then create a working cube via:

let cube = MDLTexture(cubeWithImagesNamed: ["a.ktx", "b.ktx", "c.ktx", "d.ktx", "e.ktx", "f.ktx"])

This can be assigned to background.contents and works great. If, on the other hand, I have loaded those six textures from some other source into six separate MTLTextures, I cannot provide them as an array to background.contents (it fails with "image at index 0 is NULL"). I have attempted to create a cube MTLTexture with the appropriate MTLTextureDescriptor.textureCubeDescriptor (using the pixel format and other attributes from the source textures), then copying the data via MTLBlitCommandEncoder; however, the end result, while error-free, is a cube that is wholly purple. I suspect this may be because the source textures are ASTC-compressed, but I am a bit at a loss, as the documentation is rather sparse. Everything else seems incredibly easy relative to this very simple need: creating a cube from textures that aren't named bundle items. Any guidance or hints would be greatly appreciated.
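For reference, a sketch of the blit approach described above (names are mine; not a verified fix for the purple output). One thing worth double-checking: the cube descriptor must use exactly the source's pixel format, including the sRGB vs linear ASTC variant, since a mismatch can fail silently at sample time:

```swift
import Metal

// Sketch: assemble a cube texture from six existing MTLTextures
// (slice order +X, -X, +Y, -Y, +Z, -Z). Assumes all six faces share
// size, pixel format, and mip count.
func makeCubeTexture(from faces: [MTLTexture], device: MTLDevice,
                     queue: MTLCommandQueue) -> MTLTexture? {
    guard faces.count == 6, let first = faces.first else { return nil }
    let descriptor = MTLTextureDescriptor.textureCubeDescriptor(
        pixelFormat: first.pixelFormat, // e.g. .astc_4x4_srgb - must match exactly
        size: first.width,
        mipmapped: first.mipmapLevelCount > 1)
    guard let cube = device.makeTexture(descriptor: descriptor),
          let commandBuffer = queue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else { return nil }
    for (slice, face) in faces.enumerated() {
        for level in 0..<face.mipmapLevelCount {
            let size = MTLSize(width: max(1, face.width >> level),
                               height: max(1, face.height >> level),
                               depth: 1)
            // Slice/level copy keeps the compressed blocks intact.
            blit.copy(from: face, sourceSlice: 0, sourceLevel: level,
                      sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0), sourceSize: size,
                      to: cube, destinationSlice: slice, destinationLevel: level,
                      destinationOrigin: MTLOrigin(x: 0, y: 0, z: 0))
        }
    }
    blit.endEncoding()
    commandBuffer.commit()
    return cube
}
```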
Posted
by
Post not yet marked as solved
0 Replies
445 Views
I have a custom view using ARSCNView; let's say it's something like:

import Foundation
import UIKit
import ARKit
import SceneKit

@available(iOS 11.0, *)
class ARSceneView: ARSCNView, ARSessionDelegate, ARSCNViewDelegate {
}

I need to use it in React Native, so I created a Swift view manager:

import UIKit

@objc(ARSceneViewManager)
class ARSceneViewManager: RCTViewManager {
    override func view() -> UIView! {
        if #available(iOS 11.0, *) {
            return ARSceneView(frame: .zero)
        } else {
            return UIView()
        }
    }

    override static func requiresMainQueueSetup() -> Bool {
        return true
    }
}

and an Objective-C file:

#import "React/RCTBridgeModule.h"
#import "React/RCTViewManager.h"

@interface RCT_EXTERN_MODULE(ARSceneViewManager, RCTViewManager)
@end

I used it in my App.js file:

const ARSceneView = requireNativeComponent('ARSceneView', ARSceneView);

But it does not appear on screen - I can see only a black screen. What am I doing wrong? If I use e.g. UILabel instead, everything works very well.
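One hedged guess at the black screen: nothing in the code shown ever starts an AR session, and an ARSCNView renders black until its session runs. A sketch (class name taken from the post; didMoveToWindow as the trigger is my own choice):

```swift
import ARKit

@available(iOS 11.0, *)
class ARSceneView: ARSCNView, ARSessionDelegate, ARSCNViewDelegate {
    // React Native just mounts the view; no view controller lifecycle hook
    // starts AR for us, so kick off the session once the view is on screen.
    override func didMoveToWindow() {
        super.didMoveToWindow()
        guard window != nil else { return }
        let configuration = ARWorldTrackingConfiguration()
        session.run(configuration)
    }
}
```

Also worth checking: NSCameraUsageDescription in Info.plist; without it the camera feed stays black.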
Posted
by
Post not yet marked as solved
2 Replies
423 Views
After updating iOS to 14.7 I get a SceneKit crash whenever I start an AR session: com.apple.scenekit.scnview-renderer (19): EXC_BAD_ACCESS (code=2, address=0x1d934ec04). Any idea how this can be fixed? Best, Anri02
Posted
by
Post not yet marked as solved
0 Replies
308 Views
The below code no worky... I've tried tweaking mapping UV positions, normals, etc. The Apple doc mentioned texture mapping to a point, so it sounded possible.

// Material
let material = SCNMaterial()
material.lightingModel = SCNMaterial.LightingModel.constant
material.isDoubleSided = true
material.readsFromDepthBuffer = false
material.multiply.contents = UIColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 1.0)
material.diffuse.contents = UIImage(named: "star")!

// Texture mapping
let textureMap = SCNGeometrySource(textureCoordinates: [
    CGPoint(x: 0, y: 1),
    CGPoint(x: 1, y: 1),
    CGPoint(x: 0, y: 0),
    CGPoint(x: 1, y: 0)
])

// Elements
let vertexSource = SCNGeometrySource(vertices: [SCNVector3(0, 0, 0)])

let pointsElement = SCNGeometryElement(indices: [0], primitiveType: .point) // index must be 0: the vertex source has only one vertex
pointsElement.pointSize = 10
pointsElement.minimumPointScreenSpaceRadius = 1
pointsElement.maximumPointScreenSpaceRadius = 50

let normals = SCNGeometrySource(normals: [SCNVector3](repeating: SCNVector3(0, 0, 1), count: 1))

// Geometry
let geometry = SCNGeometry(sources: [vertexSource, normals, textureMap], elements: [pointsElement])
geometry.materials = [material]

// Node
let bodyNode = SCNNode()
bodyNode.geometry = geometry
Posted
by
Post not yet marked as solved
1 Reply
413 Views
I'd like to zoom in inside an ARSCNView. It's easy to do in an SCNView by setting the fieldOfView of the SCNCamera instance; however, ARSession / ARSCNView have an iron grip on the SCNCamera (for obvious reasons). Is there a proper way to zoom the ARSession physical camera, and thus the SCNCamera? FYI: my AR experience deals with distant objects in the sky.
Posted
by
Post marked as solved
4 Replies
595 Views
Hello everyone, I'm a beginner graphics programmer. I want to use a 3D texture in Metal for my projects... but I can't, because of an error. I tried the example at this link:

fragment half4 mip_fragment (
    VertexOutput in [[ stage_in ]],
    texture2d<float> backface [[ texture(0) ]],
    texture3d<float> volume [[ texture(1) ]]
) {
    constexpr sampler s(s_address::clamp_to_edge, t_address::clamp_to_edge, min_filter::linear, mag_filter::linear);
    float3 rgb = backface.sample(s, in.pixelCoord).rgb;
    float3 lookupColor = volume.sample(s, rgb, 0).rgb;
    return half4(half3(lookupColor), 1.h);
}

But I get this error:

Fragment Function(mip_fragment): incorrect type of texture (MTLTextureType2D) bound at texture binding at index 1 (expect MTLTextureType3D) for volume[0].

And the app crashes. Please help me.
Posted
by
Post not yet marked as solved
3 Replies
444 Views
Hello everyone, I am studying applying Metal to SceneKit. I want to pass a nil texture to a fragment shader, but I don't know how... This code passes a texture named volume to the fragment shader using SCNTechnique.

Swift code:

if let path = Bundle.main.path(
    forResource: "volumerendering-metal",
    ofType: "plist")
{
    if let dico1 = NSDictionary(contentsOfFile: path) {
        let dico = dico1 as! [String: AnyObject]
        let technique = SCNTechnique(dictionary: dico)

        let screenSize = NSValue(
            cgSize: view.frame.size.applying(CGAffineTransform(scaleX: 2.0, y: 2.0)))
        technique?.setValue(screenSize, forKeyPath: "screen_size")

        technique?.setValue(nil, forKey: "volume")
        // or?
        // technique?.setObject(nil, forKeyedSubscript: "volume" as NSCopying)

        scnView.technique = technique
    }
}

Property list file:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>passes</key>
  <dict>
    <key>pass_backface</key>
    <dict>
      <key>metalVertexShader</key> <string>backface_vertex</string>
      <key>metalFragmentShader</key> <string>backface_fragment</string>
      <key>program</key> <string>doesntexist</string>
      <key>depthStates</key>
      <dict>
        <key>clear</key> <true/>
      </dict>
      <key>colorStates</key>
      <dict>
        <key>clear</key> <true/>
        <key>clearColor</key> <string>sceneBackground</string>
      </dict>
      <key>cullMode</key> <string>front</string>
      <key>includeCategoryMask</key> <string>2</string>
      <key>draw</key> <string>DRAW_NODE</string>
      <key>inputs</key>
      <dict>
        <key>aPos</key> <string>vertex-symbol</string>
      </dict>
      <key>outputs</key>
      <dict>
        <key>color</key> <string>backface</string>
      </dict>
    </dict>
    <key>pass_mip</key>
    <dict>
      <key>colorStates</key>
      <dict>
        <key>clear</key> <true/>
        <key>clearColor</key> <string>sceneBackground</string>
      </dict>
      <key>depthStates</key>
      <dict>
        <key>clear</key> <true/>
      </dict>
      <key>metalVertexShader</key> <string>mip_vertex</string>
      <key>metalFragmentShader</key> <string>mip_fragment</string>
      <key>program</key> <string>doesntexist</string>
      <key>cullMode</key> <string>back</string>
      <key>includeCategoryMask</key> <string>2</string>
      <key>draw</key> <string>DRAW_NODE</string>
      <key>inputs</key>
      <dict>
        <key>aPos</key> <string>vertex-symbol</string>
        <key>viewport</key> <string>screen_size</string>
        <key>backface</key> <string>backface</string>
        <key>volume</key> <string>volume</string>
      </dict>
      <key>outputs</key>
      <dict>
        <key>color</key> <string>COLOR</string>
        <key>depth</key> <string>DEPTH</string>
      </dict>
    </dict>
  </dict>
  <key>sequence</key>
  <array>
    <string>pass_backface</string>
    <string>pass_mip</string>
  </array>
  <key>targets</key>
  <dict>
    <key>backface</key>
    <dict>
      <key>type</key> <string>color</string>
    </dict>
  </dict>
  <key>symbols</key>
  <dict>
    <key>vertex-symbol</key>
    <dict>
      <key>semantic</key> <string>vertex</string>
    </dict>
    <key>screen_size</key>
    <dict>
      <key>type</key> <string>vec2</string>
    </dict>
    <key>volume</key>
    <dict>
      <key>type</key> <string>sampler3D</string>
    </dict>
  </dict>
</dict>
</plist>

Shader code:

fragment float4 mip_fragment (
    VertexOutput in [[stage_in]],
    texture2d<float, access::sample> backface [[texture(0)]],
    texture2d<float, access::sample> volume [[texture(1)]]
) {
    ...
    if (is_null_texture(volume)) return float4(1, 0, 0, 0);
    else return float4(0, 1, 0, 0);
}

I want to get a red object, but I get a green one... I want to set the 3D texture at runtime, so before it is set I want it to be nil, because otherwise an error occurs (https://developer.apple.com/forums/thread/688841).
Posted
by
Post marked as solved
1 Reply
439 Views
I'm trying to run the Fox game sample SceneKit code from WWDC 2017, but it crashes when calculating the collision bounding box:

// Setup collision shape
let (min, max) = model.boundingBox

Thread 1: EXC_BAD_ACCESS (code=1, address=0x7fffc0000020)

This error only occurs on my M1 iMac; it runs fine on Intel (using Rosetta). I really want to get started on SceneKit development for M1 Macs, so any help would be appreciated.
Posted
by
Post not yet marked as solved
0 Replies
214 Views
I am using SceneKit to make a type of block-pushing game, and I am trying to access a specific node that I interact with. I know I can use the node's name, but I'm trying to access a specific node that has the same name as other reference nodes: it's a pushable block, and I need to build an array of its past movement so that I can undo the player's actions. I have printed the node's details to the console and each one has a unique "ID" - <SCNNode: 0x6000039ef300 'wall' pos(0.000000 0.500000 4.000000) - but I can't seem to get access to the node using this "ID". I can't find this documented anywhere, so I don't know if this is even possible.
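A hedged suggestion: the hex value in that console line is just the object's memory address, not an API-level identifier you can look nodes up by. Since SCNNode is a reference type, one approach is to hold references to the nodes themselves and compare identity with ===, or key the history by ObjectIdentifier. A sketch with hypothetical names:

```swift
import SceneKit

// Sketch: per-node movement history keyed by object identity,
// so same-named blocks never collide.
final class MoveHistory {
    private var history: [ObjectIdentifier: [SCNVector3]] = [:]

    // Call just BEFORE moving the block, so the stored value is
    // the position to return to on undo.
    func record(_ node: SCNNode) {
        history[ObjectIdentifier(node), default: []].append(node.position)
    }

    func undo(_ node: SCNNode) {
        guard var moves = history[ObjectIdentifier(node)],
              let previous = moves.popLast() else { return }
        node.position = previous
        history[ObjectIdentifier(node)] = moves
    }
}
```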
Posted
by
Post not yet marked as solved
0 Replies
315 Views
I created an SCNBox in real time based on a group of anchors whose locations I do not know in advance. I create the SCNBox using the following code:

let nodes: [SCNNode] = getMyNodes()
for node in nodes {
    if (sceneView.anchor(for: node)?.name != nil) && (sceneView.anchor(for: node)?.name != "dot") {
        parentNode.addChildNode(node)
    }
}
let width = (abs(parentNode.boundingBox.0.x) + abs(parentNode.boundingBox.1.x))
let height = (abs(parentNode.boundingBox.0.y) + abs(parentNode.boundingBox.1.y))
let length = (abs(parentNode.boundingBox.0.z) + abs(parentNode.boundingBox.1.z))
let box = SCNBox(width: CGFloat(width), height: CGFloat(height), length: CGFloat(0.3), chamferRadius: 0)
let boxNode = SCNNode(geometry: box)
box.firstMaterial?.diffuse.contents = UIColor.white
box.firstMaterial?.transparency = 0.4
boxNode.position = parentNode.boundingSphere.center
boxNode.name = "box"
boxNode.addChildNode(parentNode)
sceneView.scene.rootNode.addChildNode(boxNode)
boundingBox = boxNode

So how do I get the vertex positions of the box? The geometry of the node contains the position, length, width and height, but it does not contain the locations of its vertices. I also obtain the location of the finger using the Vision framework. I need to know the location of each vertex so I can enlarge or shrink the box with respect to the finger location. I tried to calculate a vertex from the position of the center plus the length and width, but the result does not add up to the finger location. I think this has something to do with different coordinate systems.
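A sketch of one way to get the corners (my own helper, not from the post): a node's boundingBox is in the node's local coordinates, so each corner has to be converted through the node's transform chain with convertPosition(_:to:) before comparing it against world-space points like a Vision-derived finger location:

```swift
import SceneKit

// Sketch: the eight corners of a node's bounding box, in world coordinates.
// Passing nil as the destination converts to world space.
func worldCorners(of node: SCNNode) -> [SCNVector3] {
    let (minV, maxV) = node.boundingBox
    let xs = [minV.x, maxV.x], ys = [minV.y, maxV.y], zs = [minV.z, maxV.z]
    var corners: [SCNVector3] = []
    for x in xs { for y in ys { for z in zs {
        corners.append(node.convertPosition(SCNVector3(x, y, z), to: nil))
    }}}
    return corners
}
```

This also explains why adding half-extents to the center didn't line up: that arithmetic happens in local space and ignores the node's (and parent's) position, rotation, and scale.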
Posted
by
Post not yet marked as solved
1 Reply
373 Views
I know it's uncool to ask vague questions here, but what do you call it when you create a world and follow it with a camera in Swift? Like an RPG? Like Doom? I want to try and learn that now. And more importantly, can it be done without using the Xcode scene editor? Can it all be done in code? Thanks, as always. Without the forum I would never have gotten much farther than "Hello World!"
Posted
by