This is the first I've heard of TabletopKit, so I may be clueless. But assuming it's a specialization of RealityKit, shouldn't those bounding boxes be a color other than white? White would signify static bodies, whereas I suspect you want kinematic and dynamic: the object you are moving would be kinematic, and the one you want to react to it would be dynamic.
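Roughly what I mean, as an untested sketch using plain RealityKit components (the entity setup here is just a placeholder, not TabletopKit-specific API):

import RealityKit

// Placeholder entities; the point is the body mode, not the setup.
let pushedPiece = ModelEntity(mesh: .generateBox(size: 0.05))
let reactingPiece = ModelEntity(mesh: .generateBox(size: 0.05))

// The piece you drag: kinematic, so you drive its transform directly.
pushedPiece.components.set(PhysicsBodyComponent(massProperties: .default,
                                                material: .default,
                                                mode: .kinematic))

// The piece that should get knocked around: dynamic, so the physics solver moves it.
reactingPiece.components.set(PhysicsBodyComponent(massProperties: .default,
                                                  material: .default,
                                                  mode: .dynamic))

// Both need collision shapes for contacts to register.
pushedPiece.components.set(CollisionComponent(shapes: [.generateBox(size: [0.05, 0.05, 0.05])]))
reactingPiece.components.set(CollisionComponent(shapes: [.generateBox(size: [0.05, 0.05, 0.05])]))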
Thank you for the update and for working on this issue. RealityKit is like SceneKit: both are 3D frameworks that use ARKit as the AR framework, so this may actually help isolate the issue to the ARKit/Game Center interaction. Plus, I swear it was all working for me months ago. However, I also just realized this is breaking a SceneKit AR app I already have in the store. Granted, not a very popular one, but still an annoying mess. I'll file the TPS.
Sounds like something that could be imitated by animating a UV map across the surface.
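Something like this, as a rough sketch (the texture name is a placeholder, and the material needs a texture set to repeat):

import SceneKit
import UIKit

let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "flowTexture")   // placeholder asset name
material.diffuse.wrapS = .repeat
material.diffuse.wrapT = .repeat

// Scroll the diffuse map across the surface by animating its contentsTransform.
let scroll = CABasicAnimation(keyPath: "contentsTransform")
scroll.fromValue = NSValue(scnMatrix4: SCNMatrix4Identity)
scroll.toValue = NSValue(scnMatrix4: SCNMatrix4MakeTranslation(1, 0, 0)) // one full tile per cycle
scroll.duration = 2
scroll.repeatCount = .infinity
material.diffuse.addAnimation(scroll, forKey: "uvScroll")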
Any updates on this? I wasted several hours trying to figure out why my code suddenly didn't work before running across this post. I had been using ARView with RealityKit in non-AR mode and on the simulator for several weeks with no issue. Flipped it back to use AR again on device and suddenly the camera was black. I tend to be a little behind on updates while grinding through a project, but this had all been working as of a few months ago. An update to the latest version of Xcode today did not fix it. I already had GKAccessPoint.shared.isActive set to false before moving to that view controller. But if I avoid Game Center altogether, the camera works.
To create a unique instance, you need to do a "deep clone" or otherwise use recursion to instance everything as you see fit.
import SceneKit

extension SCNGeometry {
    /// Returns an independent copy of the geometry (and its materials), so edits
    /// to the clone don't affect the original.
    func deepClone() -> SCNGeometry? {
        do {
            // Archive the geometry to Data...
            let archivedData = try NSKeyedArchiver.archivedData(withRootObject: self,
                                                                requiringSecureCoding: true)
            // ...then unarchive it as a brand-new SCNGeometry instance,
            // specifying the expected class to ensure type safety.
            let clonedGeometry = try NSKeyedUnarchiver.unarchivedObject(ofClass: SCNGeometry.self,
                                                                        from: archivedData)
            return clonedGeometry
        } catch {
            print("Error cloning geometry: \(error)")
            return nil
        }
    }
}
That way you can change the material without changing the original. Each deep clone takes up more memory, though, so you want to use a plain copy whenever possible.
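For example (untested sketch; baseNode is just a stand-in for whatever node you duplicated):

// clone() still shares the geometry and materials with the original.
let baseNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
let copyNode = baseNode.clone()

// Give the copy its own geometry so a material change doesn't bleed over.
if let ownGeometry = baseNode.geometry?.deepClone() {
    copyNode.geometry = ownGeometry
    let highlight = SCNMaterial()
    highlight.diffuse.contents = UIColor.red
    copyNode.geometry?.materials = [highlight]   // baseNode keeps its original material
}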
Yes, especially at the start, I've been after more users and feedback even at a loss. I didn't think the app was that niche. But I can't say I know anyone that actually finds apps via the store. So that all makes sense.
Yes, that wasn't the issue. It doesn't like the index order, and there is overlap. It's easier to see when you reduce nProfiles and print the arrays to visually inspect them. I don't even remember why I started using .makeVerticesUniqueAndReturnError(), but it's not even needed here.
// Body of a function that builds the knob and returns an SCNNode.
let controlPoints: [(x: Float, y: Float)] = [
    (0.728, -0.237), (0.176, -0.06), (0.202, 0.475), (0.989, 0.842),
    (-0.066, 1.093), (-0.726, 0.787)
]
let pairs = bsplinePath(controlPoints)
var knobProfile = [SCNVector3]()
for (x, y) in pairs {
    knobProfile += [SCNVector3(x: Float(x), y: Float(y), z: 0)]
}

// Sweep the profile around the Y axis to build the full surface.
let nProfiles = 20
let aIncrement: CGFloat = 2 * CGFloat.pi / CGFloat(nProfiles)
var knobVertices: [SCNVector3] = []
for i in 0..<nProfiles {
    let angle = aIncrement * CGFloat(i)
    let rotatedProfile = knobProfile.map { $0.rotate(about: .y, by: Float(angle)) }
    knobVertices.append(contentsOf: rotatedProfile)
}
let source = SCNGeometrySource(vertices: knobVertices)

// Stitch neighboring profiles together with two triangles per quad.
var indices = [UInt16]()
let profileCount = knobProfile.count
for i in 0..<nProfiles {
    let nextProfileIndex = (i + 1) % nProfiles
    for j in 0..<profileCount - 1 {
        let currentIndex = UInt16(i * profileCount + j)
        let nextIndex = UInt16(nextProfileIndex * profileCount + j)
        let nextIndexNext = UInt16(nextProfileIndex * profileCount + j + 1)
        let currentIndexNext = UInt16(i * profileCount + j + 1)
        indices.append(contentsOf: [currentIndex, nextIndex, currentIndexNext])
        indices.append(contentsOf: [currentIndexNext, nextIndex, nextIndexNext])
    }
}
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let surfaceGeometry = SCNGeometry(sources: [source], elements: [element])
let modelMesh = MDLMesh(scnGeometry: surfaceGeometry)

let aluminum = SCNMaterial()
aluminum.lightingModel = .physicallyBased
aluminum.diffuse.contents = UIColor.green
aluminum.roughness.contents = 0.2
aluminum.metalness.contents = 0.9
aluminum.isDoubleSided = true
surfaceGeometry.materials = [aluminum]
//let knobNode = SCNNode(geometry: surfaceGeometry)
//return knobNode

do {
    try modelMesh.makeVerticesUniqueAndReturnError()
    modelMesh.addNormals(withAttributeNamed: "normal", creaseThreshold: 0.1)
    let flattenedGeom = SCNGeometry(mdlMesh: modelMesh)
    let flattenedNode = SCNNode(geometry: flattenedGeom)
    flattenedNode.geometry?.materials = [aluminum]
    return flattenedNode
} catch {
    fatalError("mesh vert error")
}
} // end of the enclosing builder function (signature not shown above)
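One thing to note if you try to run that as-is: besides bsplinePath, it relies on a rotate(about:by:) helper on SCNVector3 that isn't shown. A rough sketch of what that helper could look like, assuming a simple axis enum (only the .y case matters for the sweep above):

import SceneKit

extension SCNVector3 {
    enum Axis { case x, y, z }

    // Rotate the vector about a principal axis by an angle in radians.
    func rotate(about axis: Axis, by angle: Float) -> SCNVector3 {
        let c = cos(angle)
        let s = sin(angle)
        switch axis {
        case .x: return SCNVector3(x, y * c - z * s, y * s + z * c)
        case .y: return SCNVector3(x * c + z * s, y, -x * s + z * c)
        case .z: return SCNVector3(x * c - y * s, x * s + y * c, z)
        }
    }
}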
It's been a while since I worked with this. I ultimately switched to calculating the normals myself for more control. Not that I'm a whiz with that stuff; a lot of hacking away. But the vertices need to be referenced uniquely for each face, and there is a method for that.
do {
    try modelMesh.makeVerticesUniqueAndReturnError()
    modelMesh.addNormals(withAttributeNamed: "normal", creaseThreshold: 0.9)
    let flattenedGeom = SCNGeometry(mdlMesh: modelMesh)
    let flattenedNode = SCNNode(geometry: flattenedGeom)
    flattenedNode.geometry?.materials = [greenMaterial]
    return flattenedNode
} catch {
    print("mesh vert error")
    return node
}
I didn't try your code as it doesn't include bsplinePath.
I copied the exact same previews and screenshots from the base language to the Brazil localization and the error went away. So yes, there certainly seems to be a bug there. Although it seems that had nothing to do with why Basic doesn't reliably spend.
On the plane, try using .firstMaterial?.lightingModel = .constant
I didn't have this exact scenario but I remember solving a display issue with SKTextures in a 3D scene by doing this.
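In my case it was roughly this shape (sketch; the texture name is a placeholder):

import SceneKit
import SpriteKit

let plane = SCNPlane(width: 1, height: 1)
plane.firstMaterial?.diffuse.contents = SKTexture(imageNamed: "overlayTexture") // placeholder asset
plane.firstMaterial?.lightingModel = .constant   // unlit, so scene lighting can't wash it out
let planeNode = SCNNode(geometry: plane)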
Yes, it does crash on the iPad simulator. It didn't crash for me on the iPhone simulator. Try this:
let node = SKShapeNode(rect: CGRect(x: 100, y: 100, width: 100, height: 100))
node.fillColor = .yellow

// Defer the removal to the main queue rather than removing mid-action.
let removeAction = SKAction.run {
    DispatchQueue.main.async {
        node.removeFromParent()
    }
}
node.run(.sequence([.fadeOut(withDuration: 5), removeAction]))
renderer.overlaySKScene!.addChild(node)
It doesn't crash for me on simulator or device.
Well, the issue at least helped me spot and resolve a problem with my resolveConflictingSavedGames implementation. It also works correctly on an iPad simulator, just not on the iPhone simulators I've tried.
I tried to reply to the comment but hit the character limit. I think addNormals just literally adds the data for the normal vectors and computes them in a uniform way across the mesh; it's just iterating through the vertices. You can sharpen or soften a custom mesh with it via the crease threshold if you don't want to create an SCNGeometrySource for normals. I wouldn't say programmatic normal maps are all that easy a solution, just easier than writing your own boolean geometry routines.
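If you do go the source route instead, it's roughly this shape (sketch; the vertex, normal, and index arrays are placeholders for your real data):

import SceneKit

let vertices: [SCNVector3] = []                    // your mesh vertices
let normals: [SCNVector3] = vertices.map { _ in    // placeholder: one normal per vertex,
    SCNVector3(0, 1, 0)                            // computed however you like
}
let indices: [UInt16] = []                         // your triangle indices

let vertexSource = SCNGeometrySource(vertices: vertices)
let normalSource = SCNGeometrySource(normals: normals)
let element = SCNGeometryElement(indices: indices, primitiveType: .triangles)
let geometry = SCNGeometry(sources: [vertexSource, normalSource], elements: [element])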
Have you tried applying the image to the model? I imagine the image will be squeezed at the top and bottom and ballooned out at the equator.
The best way to visualize this may still be to use Blender and create a mockup. See what the resulting maps look like for the look you want. Then that is what you would need to shoot for programmatically. I'm sure there are algorithms for this that can be adapted to Swift without reinventing the wheel.
Boolean mesh functions are a little involved and not part of the frameworks last I checked. Not unless you get into voxels. You might look into applying this as a map instead. Normal maps can give the appearance of depth. It's easier to do this in 3D modeling software such as Blender and import the results. You'll need to understand the relationship of the UV map to the 3D vertices to do this programmatically.
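Applying one in SceneKit is roughly this (sketch; the image name is a placeholder for a map authored in Blender or similar, and the sphere is a stand-in for your model):

import SceneKit
import UIKit

let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.diffuse.contents = UIColor.gray
material.normal.contents = UIImage(named: "grooveNormalMap")   // placeholder asset name
material.normal.intensity = 1.0                                // dial the effect up or down

let node = SCNNode(geometry: SCNSphere(radius: 0.1))           // stand-in for your model
node.geometry?.firstMaterial = material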