Accessing the colored mesh generated by LiDAR scan

I love how the LiDAR scan generates a beautifully colored mesh.

Is it possible to retain that coloring when exporting (such as to an .OBJ file)? The examples I've seen so far convert the LiDAR scan and create the .OBJ file, but none of those files include any of the coloring from the original scan.

Is this even feasible?

Replies

Also wondering this. I have spent so long trying to do it.
I saw an application that creates a texture for a scanned 3D model: https://apps.apple.com/us/app/id1419913995

But the mesh question is also of interest.

I have been able to successfully export the ARMeshGeometry with a color texture as a USDZ file (I've not succeeded in doing so as an .obj; the .obj ends up looking very peculiar and the colors do not appear as expected). My understanding of 3D modeling and working with 3D technologies is cursory at best, but I was pretty interested in understanding how to do this. There are a few things to consider, especially if your starting point is the Visualizing and Interacting with a Reconstructed Scene sample project:
  • The sample project shows the beautiful, colored mesh by leveraging the arView.debugOptions.insert(.showSceneUnderstanding) call. This renders the mesh, with colors, using a debug method that is not publicly accessible, nor representative of any particular color scheme relevant for export.

  • As far as I can tell, the sample project leverages RealityKit for rendering AR content, which has neither a method to generate and texture a mesh from the ARMeshGeometry, nor a built-in way of exporting a 3D model, as SceneKit does.

My approach was to set up a new AR project leveraging the SceneKit content technology, rather than RealityKit. This then assumes that you (a minimal sketch of this setup follows the list):
  • Set up an ARWorldTrackingConfiguration with the configuration's sceneReconstruction property set to .meshWithClassification.

  • Set up your ARSCNViewDelegate to receive delegate calls related to your ARSCNView.
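
A minimal sketch of that setup, assuming an ARSCNView property named arView and a view controller that adopts ARSCNViewDelegate (the startScanning name is just for illustration):

Code Block
func startScanning() {
    // Scene reconstruction requires a LiDAR-equipped device.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else {
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .meshWithClassification
    arView.delegate = self
    arView.session.run(configuration)
}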

In my ARSCNViewDelegate, I have the following methods configured:

Code Block
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let meshAnchor = anchor as? ARMeshAnchor else {
        return nil
    }
    // Build a SCNGeometry from the anchor's ARMeshGeometry (see the extensions below).
    let geometry = SCNGeometry(arGeometry: meshAnchor.geometry)
    // Sample the classification of the first face (a caveat raised later in the thread).
    let classification = meshAnchor.geometry.classificationOf(faceWithIndex: 0)
    let defaultMaterial = SCNMaterial()
    defaultMaterial.fillMode = .lines
    // colorizer is a Colorizer property on the view controller (defined below).
    defaultMaterial.diffuse.contents = colorizer.assignColor(to: meshAnchor.identifier, classification: classification)
    geometry.materials = [defaultMaterial]
    let node = SCNNode()
    node.geometry = geometry
    return node
}


Code Block
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let meshAnchor = anchor as? ARMeshAnchor else {
        return
    }
    // Rebuild the geometry from the anchor's updated mesh and reapply the saved color.
    let newGeometry = SCNGeometry(arGeometry: meshAnchor.geometry)
    let classification = meshAnchor.geometry.classificationOf(faceWithIndex: 0)
    let defaultMaterial = SCNMaterial()
    defaultMaterial.fillMode = .lines
    defaultMaterial.diffuse.contents = colorizer.assignColor(to: meshAnchor.identifier, classification: classification)
    newGeometry.materials = [defaultMaterial]
    node.geometry = newGeometry
}
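
Both delegate methods call classificationOf(faceWithIndex:), which is not a built-in ARKit API but a helper along the lines of the one in the sample project's Extensions.swift. A sketch of it, assuming the classification source stores one unsigned byte per face:

Code Block
extension ARMeshGeometry {
    func classificationOf(faceWithIndex index: Int) -> ARMeshClassification {
        // Meshes without classification data simply report .none.
        guard let classification = classification else { return .none }
        // Read the single byte that classifies the face at this index.
        let pointer = classification.buffer.contents()
            .advanced(by: classification.offset + (classification.stride * index))
        let value = Int(pointer.assumingMemoryBound(to: UInt8.self).pointee)
        return ARMeshClassification(rawValue: value) ?? .none
    }
}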


In general, I am using the ARMeshGeometry to create a SCNGeometry, classifying the "type" of mesh this is (table, floor, door, etc.), setting a color based on that classification, then returning (or updating) the relevant node. Once you are ready to export your scene to a model, you can leverage SceneKit's write(to:options:delegate:progressHandler:) method. For example, if your ARSCNView is a property called arView, you could call:

Code Block
let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
let documentsDirectory = paths[0]
// The file extension (.usdz) determines the export format.
let filename = documentsDirectory.appendingPathComponent("mesh.usdz")
self.arView.scene.write(to: filename, options: nil, delegate: nil, progressHandler: nil)
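
Note that write(to:options:delegate:progressHandler:) returns a Bool indicating whether the export succeeded, so it is worth checking that result rather than discarding it.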


The general idea is to recreate the ARMeshGeometry using a SCNGeometry, set texture colors using whatever methodology works for your needs, then save the scene. Per the above code samples, there are some extensions necessary to convert the ARMeshGeometry to a SCNGeometry, which I will share in a follow-up post, as I believe there is a length limit here.
Converting an ARMeshGeometry to a SCNGeometry requires several steps:

Code Block
extension SCNGeometry {
    convenience init(arGeometry: ARMeshGeometry) {
        // Re-express ARKit's mesh buffers as SceneKit geometry sources and elements.
        let verticesSource = SCNGeometrySource(arGeometry.vertices, semantic: .vertex)
        let normalsSource = SCNGeometrySource(arGeometry.normals, semantic: .normal)
        let faces = SCNGeometryElement(arGeometry.faces)
        self.init(sources: [verticesSource, normalsSource], elements: [faces])
    }
}

extension SCNGeometrySource {
    convenience init(_ source: ARGeometrySource, semantic: Semantic) {
        // Wrap the ARGeometrySource's Metal buffer directly, avoiding a copy.
        self.init(buffer: source.buffer, vertexFormat: source.format, semantic: semantic, vertexCount: source.count, dataOffset: source.offset, dataStride: source.stride)
    }
}

extension SCNGeometryElement {
    convenience init(_ source: ARGeometryElement) {
        let pointer = source.buffer.contents()
        let byteCount = source.count * source.indexCountPerPrimitive * source.bytesPerIndex
        // Wrap the index buffer without copying; ARKit owns the underlying memory.
        let data = Data(bytesNoCopy: pointer, count: byteCount, deallocator: .none)
        self.init(data: data, primitiveType: .of(source.primitiveType), primitiveCount: source.count, bytesPerIndex: source.bytesPerIndex)
    }
}

extension SCNGeometryPrimitiveType {
    static func of(_ type: ARGeometryPrimitiveType) -> SCNGeometryPrimitiveType {
        switch type {
        case .line:
            return .line
        case .triangle:
            return .triangles
        @unknown default:
            return .line
        }
    }
}
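
One caveat worth flagging: SCNGeometryElement wraps the anchor's index buffer with Data(bytesNoCopy:count:deallocator:), so no copy is made and ARKit may keep updating that buffer on later frames. If you see corrupted geometry, copying the bytes instead (e.g. with Data(bytes:count:)) is a safer, if slower, alternative.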


Subsequently, I've used a class called Colorizer to determine the ideal color for each mesh and save those colors per node:

Code Block
class Colorizer {
    struct StoredColor {
        var id: UUID
        var color: UIColor
    }

    var savedColors = [StoredColor]()

    init() { }

    // Return the color already saved for this anchor, or create and save a new one.
    func assignColor(to: UUID, classification: ARMeshClassification) -> UIColor {
        return savedColors.first(where: { $0.id == to })?.color ?? saveColor(uuid: to, classification: classification)
    }

    func saveColor(uuid: UUID, classification: ARMeshClassification) -> UIColor {
        // classification.color comes from the sample project's extension (see below).
        let newColor = classification.color.withAlphaComponent(0.7)
        let stored = StoredColor(id: uuid, color: newColor)
        savedColors.append(stored)
        return newColor
    }
}
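
The renderer methods shown earlier assume a single shared instance of this class, e.g. a let colorizer = Colorizer() property on the view controller, so that each anchor keeps the same color across delegate updates.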


In my case, I am using the colors found in the Visualizing and Interacting with a Reconstructed Scene sample project, where the color is determined by the ARMeshGeometry's classification (you will find this in the Extensions.swift file).
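That classification.color property is an extension on ARMeshClassification from the sample project. A sketch of its shape (the specific colors here are illustrative, not necessarily the sample's exact choices):

Code Block
extension ARMeshClassification {
    var color: UIColor {
        switch self {
        case .ceiling: return .cyan
        case .door:    return .brown
        case .floor:   return .red
        case .seat:    return .purple
        case .table:   return .yellow
        case .wall:    return .green
        case .window:  return .blue
        case .none:    return .lightGray
        @unknown default: return .gray
        }
    }
}
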
@brandonK212, your example is very helpful.
I am wondering about one point, though.
Code Block Swift
let classification = meshAnchor.geometry.classificationOf(faceWithIndex: 0)


This assumes that all the faces in this anchor share the same classification, which in practice is not the case.

Is there any easy/straightforward way to do this colorization based on the individual faces' classification?
Hi @brandonK212,

Do you have a GitHub repo for your solution? I was trying to implement what you proposed but ran into some issues. If you could provide access to the code you used, it would help me a lot. Thank you.

@aditiSwaroop

https://github.com/indrajitv/Mesh-With-Color — I have implemented @brandonK212's idea.

BTW, it shows flat colors only; I was expecting a colored object that looks like a real-life image, not a gray-scaled one.

@indinind did you manage to solve the colored object issue?

Has anyone been able to export the coloured object?

I referred to https://github.com/indrajitv/Mesh-With-Color but I would like to see the real colours instead of setting them up through code. Thanks :)

The OBJ format can only store geometric data: points in space, how they are arranged, whether a material is assigned to them, which polygons that material is assigned to, and per-vertex normals.

What you are wanting to serialise into the file format (texture data) is not possible. It would be possible in:

  • USDZ, as the zip would contain the texture data.
  • Alembic, where you could write out that metadata per point in space.

I have been trying to 3D-scan my room, just like Polycam does. I referred to https://github.com/indrajitv/Mesh-With-Color but I want natural object textures rather than flat colours. I also tried https://github.com/TokyoYoshida/ExampleOfiOSLiDAR, but it exports a plain white 3D model, and scanning with texture captures only a single frame. Has anyone been able to scan a whole room with textures using SceneKit?

I have the same problem. Did anyone find the answer?
