Add a UIImage to a Plane in RealityKit

I'm trying to dynamically add a UIImage that is in memory (dynamically created at runtime and changed over time) to a RealityKit scene. I see that TextureResource makes it easy to create a material from an image file in my bundle, but I don't see any way to do this with just a UIImage. Is there another way to go about this that I'm missing? Any help would be appreciated!

Replies

A UIImage is effectively image data that has been loaded into a UIKit resource. With that in mind, the biggest question is where your image data is coming from and how you are turning it into a UIImage. Your best bet is to write that UIImage to a local file, which you can then load as a TextureResource and apply to your model.

For example, let's say you already have a UIImage prepared in your app, dynamically generated as you indicated in your question (my example assumes your UIImage is named myImage):

Code Block
let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
if let data = myImage.pngData() {
    // Write the image data to a local file so it can be loaded as a TextureResource.
    let filePath = documentsDirectory.appendingPathComponent("sky.png")
    try? data.write(to: filePath)
    DispatchQueue.main.async {
        self.texture = try? TextureResource.load(contentsOf: filePath)
    }
}


In this example, I am converting my UIImage to Data, saving it locally, then loading it as a TextureResource which could be applied to a model.
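
To round out the example, here is a minimal sketch of applying that texture to a plane. This assumes self.texture holds the TextureResource loaded above, arView is your ARView, and the 0.3 m plane size is arbitrary:

Code Block
if let texture = self.texture {
    // Use the loaded texture as the material's base color.
    var material = PhysicallyBasedMaterial()
    material.baseColor = PhysicallyBasedMaterial.BaseColor(texture: .init(texture))

    // A small plane anchored to a horizontal surface.
    let plane = ModelEntity(mesh: .generatePlane(width: 0.3, depth: 0.3), materials: [material])
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(plane)
    arView.scene.addAnchor(anchor)
}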

Subsequently, if your UIImage is being downloaded from the web, you could take an approach using Combine:
Code Block
var loadTexture: Cancellable?
let url = URL(string: "https://theimageurl.com/image.png")

loadTexture = URLSession.shared.dataTaskPublisher(for: url!)
    .receive(on: RunLoop.main)
    .map { UIImage(data: $0.data) }
    .sink(receiveCompletion: { completion in
        loadTexture?.cancel()
    }, receiveValue: { image in
        // Save the downloaded image locally, then load it as a TextureResource.
        let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let filePath = documentsDirectory.appendingPathComponent("sky.png")
        if let data = image?.pngData() {
            try? data.write(to: filePath)
            self.texture = try? TextureResource.load(contentsOf: filePath)
        }
    })


It is worth noting that these examples do not take proper error handling or performance considerations into account (you would likely want to save the UIImage data somewhere other than the documents directory if it can be purged and won't be reused again). Alongside that, if your UIImage is a relatively large file, you may want to consider using TextureResource.loadAsync(...) rather than TextureResource.load(...), as it may provide your users a more responsive app experience. You could also consider saving the data as a JPEG representation rather than PNG, but this is all dependent on your use case.
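
As a rough sketch of the loadAsync(...) route, reusing the same filePath and self.texture from above and assuming a stored property (here called textureLoad) to keep the Combine subscription alive:

Code Block
var textureLoad: AnyCancellable?

textureLoad = TextureResource.loadAsync(contentsOf: filePath)
    .receive(on: RunLoop.main)
    .sink(receiveCompletion: { completion in
        if case .failure(let error) = completion {
            print("Texture load failed: \(error)")
        }
    }, receiveValue: { texture in
        // Assign the texture once loading has finished off the main thread.
        self.texture = texture
    })
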
Thanks for the reply. This strategy might work for me with some major adjustments, but unfortunately isn't ideal. To add a little more detail, the image is created by the user drawing on the screen, which produces a UIImage (hidden from the user) that would then ideally be passed on to a 3D plane in AR space. Writing the image to disk and then reloading it for the texture won't fare too well performance-wise.

Hey Barrylium. Why don't you try it with SceneKit? Using SceneKit for that purpose is easier.

func placeImageModelEntity(imageAnchor: ARImageAnchor, width: Float, height: Float) {
    let imageAnchorEntity = AnchorEntity(anchor: imageAnchor)
    imageAnchorEntity.name = imageAnchor.name!

    let screenMesh = MeshResource.generatePlane(width: width, depth: 1.33333 * width, cornerRadius: 0.5)

    var myMaterial = PhysicallyBasedMaterial()
    if let baseResource = try? TextureResource.load(named: "MyImage") {
        // Create a material parameter and assign it.
        let baseColor = MaterialParameters.Texture(baseResource)
        myMaterial.baseColor = PhysicallyBasedMaterial.BaseColor(texture: baseColor)
    }

    let imageModelEntity = ModelEntity(mesh: screenMesh, materials: [myMaterial])
    let anchorWidth = imageAnchor.referenceImage.physicalSize.width
    imageModelEntity.setPosition(SIMD3(x: Float(anchorWidth), y: 0, z: 0), relativeTo: imageAnchorEntity)
    imageAnchorEntity.addChild(imageModelEntity)

    // Add anchor to scene
    arView.scene.addAnchor(imageAnchorEntity)
}
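
For context, a minimal sketch of how this function might be called. This assumes your ARSessionDelegate is set up and that the placeImageModelEntity(imageAnchor:width:height:) function above is in scope:

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        // Only react to detected reference images.
        guard let imageAnchor = anchor as? ARImageAnchor else { continue }
        let size = imageAnchor.referenceImage.physicalSize
        placeImageModelEntity(imageAnchor: imageAnchor,
                              width: Float(size.width),
                              height: Float(size.height))
    }
}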