How to use `EnvironmentLightEstimationProvider` to capture an environment texture and apply it to a model entity?

I am a newbie to spatial computing. I am learning how to use ARKit to capture the environment texture and apply it to a RealityKit ModelEntity on Vision Pro, but I cannot find a demo of how to use EnvironmentLightEstimationProvider. After checking the documentation, I also have some questions:

  1. EnvironmentProbeAnchor.environmentTexture is an MTLTexture, but EnvironmentResource needs a CGImage. How do I convert an MTLTexture to a CGImage? (Forgive me, I don't know much about Metal or the other frameworks involved, so it would be great if there is code I can copy and paste directly.)
  2. It seems that EnvironmentProbeAnchor can only get the light information around the device. What should I do if I want to get the light information around the ModelEntity so that I can apply the environment texture to it?

It would be great if you could provide a code demo showing how to use the new API.

Thank you!

Hi @YaShiho

I don't have code to convert an MTLTexture to an EnvironmentResource, but you may not need EnvironmentLightEstimationProvider to accomplish your goal if you are working with a ModelEntity in a RealityView. By default, every entity with a material that responds to light responds to environment lighting; in other words, it "just works" when you add the entity to a RealityView's content. If you want to apply a custom lighting effect to an entity, consider using Shader Graph to take input from an EnvironmentRadiance node and apply it to an UnlitSurface node.
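For reference, here's a minimal sketch of that default behavior (the view name and the sphere's size, position, and material values are just placeholder choices, not from a specific sample): a metallic ModelEntity added to a RealityView picks up the real-world environment lighting on Vision Pro without any extra ARKit code.

```swift
import SwiftUI
import RealityKit

struct ReflectiveSphereView: View {
    var body: some View {
        RealityView { content in
            // A shiny metallic material responds to light, so the sphere
            // automatically reflects the surrounding environment lighting.
            var material = PhysicallyBasedMaterial()
            material.metallic = 1.0
            material.roughness = 0.0

            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [material]
            )
            // Place the sphere roughly at eye level, half a meter in front of the viewer.
            sphere.position = [0, 1.2, -0.5]

            // Adding the entity to the RealityView's content is all that's needed;
            // no EnvironmentLightEstimationProvider or manual texture conversion required.
            content.add(sphere)
        }
    }
}
```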
