Posts

Post not yet marked as solved
16 Replies
7.6k Views
Hi,

Is there a performant way to record the ARKit view as a video?

- I've tried ReplayKit, but its callback methods are just not getting called.
- I've tried the ARSessionDelegate callbacks to get a CVPixelBuffer, but it's not a fully rendered frame.

Has anyone got a solution?
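One approach worth sketching: feed pixel buffers from the ARSessionDelegate into an AVAssetWriter. Note that `ARFrame.capturedImage` is the raw camera feed only, without rendered AR content (as observed above), so for the composited view you would need to render frames yourself first. The class name, output settings, and timing handling below are illustrative assumptions, not a definitive recording pipeline.

```swift
import ARKit
import AVFoundation

/// Sketch: append camera pixel buffers from ARSession callbacks to a video file.
final class FrameRecorder: NSObject, ARSessionDelegate {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var started = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        if !started {
            writer.startWriting()
            writer.startSession(atSourceTime: time)
            started = true
        }
        guard input.isReadyForMoreMediaData else { return }
        // capturedImage is the camera feed only; rendered AR content is not included.
        adaptor.append(frame.capturedImage, withPresentationTime: time)
    }
}
```

Call `input.markAsFinished()` and `writer.finishWriting(completionHandler:)` when done.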
Posted by nthState.
Post marked as solved
4 Replies
1.3k Views
Hi,

Previously with ARKit/ARSCNView I could use an SCNTechnique to apply a shader to my scene, but I can't find out how to do something similar in RealityKit.

- Is there something like this in RealityKit, where I can apply a fragment/vertex shader to the scene?
- Or a way to get the rendered scene as a texture?

Kind regards,
Chris
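For what it's worth, later RealityKit releases (iOS 15+) added a post-process render callback on ARView, which is the closest SCNTechnique analogue I'm aware of: you get the rendered scene as a Metal texture and encode your own work on it. A sketch (the pass-through blit is a placeholder for a real effect):

```swift
import RealityKit
import Metal

// Sketch: hook RealityKit's post-process callback (iOS 15+) to run Metal work
// on the rendered scene texture before it reaches the screen.
func installPostProcess(on arView: ARView) {
    arView.renderCallbacks.postProcess = { context in
        // context.sourceColorTexture holds the rendered scene;
        // results must be written into context.targetColorTexture.
        guard let blit = context.commandBuffer.makeBlitCommandEncoder() else { return }
        blit.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
        blit.endEncoding()
        // Replace the blit with a compute or render pass to apply a real shader.
    }
}
```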
Posted by nthState.
Post not yet marked as solved
2 Replies
975 Views
Hi,

I am setting up an NWBrowser as follows:

```swift
NWBrowser(for: .bonjour(type: bonjourType, domain: nil), using: parameters)
```

When running on iOS 13 everything works great, but on iOS 14 I keep getting:

```
nw_browser_fail_on_dns_error_locked [B3] DNSServiceBrowse failed: NoAuth(-65555)
```

Is there something that I need to specifically set for iOS 14?

Kind regards,
Chris
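In case it helps others hitting this: iOS 14 introduced local network privacy, and Bonjour browsing now requires declaring the service types in Info.plist alongside a usage description. Something like the following, where the service type string and description are examples to be replaced with your own:

```xml
<key>NSLocalNetworkUsageDescription</key>
<string>Browses for devices on your local network.</string>
<key>NSBonjourServices</key>
<array>
    <string>_example._tcp</string>
</array>
```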
Posted by nthState.
Post not yet marked as solved
1 Reply
395 Views
Hi,

I'm not sure if I should be doing this, but anyway: I'm using model files created in Blender, whose axis system is Z-up. When I download a model from my server into my app, the model is rotated incorrectly, as RealityKit uses Y-up.

- Should I change the axis system in Blender so that it matches RealityKit (as I can't figure out how to change RealityKit's axis)?
- Or should I incorporate some sort of transformation step when I load my model?

Kind regards,
Chris
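One possible workaround for the second option (a sketch; the function and entity names are placeholders): leave the export as-is and rotate the loaded entity -90° about X after loading, so Z-up content lands in RealityKit's Y-up space.

```swift
import RealityKit
import simd

// Sketch: convert a Z-up model to RealityKit's Y-up convention by rotating
// it -90 degrees around the X axis after loading.
func applyZUpFix(to entity: Entity) {
    entity.orientation = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))
}
```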
Posted by nthState.
Post not yet marked as solved
10 Replies
1.6k Views
Hi,

I just installed Big Sur Beta 5 on my Mac mini with Apple Silicon, and now it's in a reboot loop. I hear the boot chime, the LED is white, two seconds later it goes orange, and ten seconds after that it flickers and goes back to the chime.

Does anyone know how to get past this, or is the machine now bricked?

Kind regards,
Chris
Posted by nthState.
Post not yet marked as solved
1 Reply
432 Views
Hi,

I'm debugging my app on an Apple Developer Transition Kit running Big Sur Beta 8. I have an NSToolbarItem.Identifier.space in my toolbar, and when I hover over it, it changes color... this is news to me. Previously, on Catalina, you would get zero interaction with it.

Is there any way to stop it having a rollover state?

Thanks,
Chris
Posted by nthState.
Post not yet marked as solved
0 Replies
291 Views
Hi,

I have an SCNTechnique pass `scene_only`, and I was expecting the camera feed to be rendered to a texture `scene_out` (snippet below), but when I debug the GPU, the pass is just black. Do I have to specify another parameter to capture the camera feed too?

Seemingly related: https://stackoverflow.com/questions/59569525/scntechnique-on-arscnview-does-not-affect-camerafeed-scene-background-on-ios-1

```xml
<key>scene_only</key>
<dict>
    <key>colorStates</key>
    <dict>
        <key>clear</key>
        <true/>
    </dict>
    <key>excludeCategoryMask</key>
    <string>1</string>
    <key>draw</key>
    <string>DRAW_SCENE</string>
    <key>inputs</key>
    <dict/>
    <key>outputs</key>
    <dict>
        <key>color</key>
        <string>scene_out</string>
    </dict>
</dict>
```
Posted by nthState.
Post not yet marked as solved
0 Replies
595 Views
Hi,

Is there a concept of atomic read/write for an MTLBuffer? For instance, say I have an MTLBuffer of structs and want to change one of the values: can I atomically write a value that another GPU thread can't read until it's been updated? I have seen you can do this with Ints directly, but what about an Int that is part of a struct in an MTLBuffer?
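For what it's worth, the Metal Shading Language does allow a struct in device memory to contain atomic members, which you access through the explicit atomic functions. Two caveats: MSL only supports `memory_order_relaxed`, so you get tear-free reads and writes of that field but no ordering guarantees relative to other fields; and blocking another thread from reading "until it's been updated" would need a different synchronization scheme, not a plain atomic. A sketch, with an illustrative struct layout and kernel:

```metal
#include <metal_stdlib>
using namespace metal;

// Sketch: a struct stored in an MTLBuffer with an atomic member.
struct Item {
    float value;
    atomic_uint counter; // atomic field inside the struct
};

kernel void bump(device Item *items [[buffer(0)]],
                 uint gid [[thread_position_in_grid]])
{
    // Atomic read-modify-write: other threads always see a complete value,
    // but memory_order_relaxed gives no ordering relative to the
    // non-atomic 'value' field.
    atomic_fetch_add_explicit(&items[gid].counter, 1u, memory_order_relaxed);
}
```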
Posted by nthState.
Post not yet marked as solved
5 Replies
1.2k Views
Hi,

I can't find any documentation on how to train a model for use with VNRecognizedObjectObservation. If you look at ObjectDetector.mlmodel, its type is Pipeline, whereas standard Inception model types are Image Classifier. For object detection, I'm guessing I need to create a Pipeline, but I can't find any documentation for it...

Are there any other constructors I can use other than:

```swift
let datasource: CreateML.MLImageClassifier.DataSource = .labeledDirectories(at: root)
let classifier = try MLImageClassifier(trainingData: datasource)
try classifier.write(to: modelURL)
```

For example:

```swift
let datasource: CreateML.MLPipelineClassifier.DataSource = .labeledDirectories(at: root, withOther: data)
let classifier = try MLPipelineClassifier(trainingData: datasource)
try classifier.write(to: modelURL)
```

Kind regards,
Chris
Posted by nthState.
Post not yet marked as solved
1 Reply
1.2k Views
Hi,

I'm running an old MacBook Pro (2014) and was training an image classifier on 40,000 images. It kept crashing at around 27,000 images. I'm wondering: if I had a newer machine, or even a machine with a dedicated GPU, would my training complete faster?

- Is the training done on the CPU or the GPU?

Kind regards,
Chris
Posted by nthState.
Post marked as solved
2 Replies
4.5k Views
Hi,

I'm getting the following message in Beta 5:

```
'ARSessionConfiguration' was deprecated in iOS 11.0: renamed to 'ARConfiguration'
```

However, ARConfiguration has no public init(). What should I use instead?

```swift
if ARWorldTrackingConfiguration.isSupported {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = ARWorldTrackingConfiguration.PlaneDetection.horizontal
    configuration.isLightEstimationEnabled = true
    sceneView.session.run(configuration, options: [ARSession.RunOptions.resetTracking, ARSession.RunOptions.removeExistingAnchors])
} else {
    // WARNING HERE
    let configuration = ARSessionConfiguration()
    sceneView.session.run(configuration)
}
```
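For anyone hitting the same warning: in iOS 11, ARConfiguration is the abstract base class, so you instantiate one of its concrete subclasses instead. A sketch of the fallback branch using AROrientationTrackingConfiguration (rotation-only tracking) in place of the deprecated ARSessionConfiguration; the function and view names are placeholders:

```swift
import ARKit

// Sketch: fall back to a concrete ARConfiguration subclass that tracks
// device orientation only, for devices without world-tracking support.
func runFallbackSession(on sceneView: ARSCNView) {
    let configuration = AROrientationTrackingConfiguration()
    sceneView.session.run(configuration)
}
```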
Posted by nthState.