Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)

Hello,

On iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after I reassign the updated material.

My use case is fading in a ModelEntity through a custom System, which requires repeatedly reassigning the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue.
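For context, the fade system follows roughly this pattern (a minimal sketch, not my actual code – FadeComponent, FadeSystem and the hard-coded duration are just illustrative names and values):

import RealityKit

// Illustrative component that marks an entity as fading in.
struct FadeComponent: Component {
    var targetOpacity: Float = 1
    var duration: TimeInterval = 1
    var elapsed: TimeInterval = 0
}

struct FadeSystem: System {
    static let query = EntityQuery(where: .has(FadeComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let model = entity as? ModelEntity,
                  var fade = entity.components[FadeComponent.self] as? FadeComponent,
                  var modelComponent = model.model else { continue }

            fade.elapsed += context.deltaTime
            let progress = Float(min(fade.elapsed / fade.duration, 1))
            let opacity = progress * fade.targetOpacity

            // Read the existing materials back from the model, update their blending,
            // and reassign them. On the iOS 16 beta this is where the color texture is lost.
            modelComponent.materials = modelComponent.materials.map { material -> Material in
                guard var unlit = material as? UnlitMaterial else { return material }
                unlit.blending = .transparent(opacity: .init(floatLiteral: opacity))
                return unlit
            }
            model.model = modelComponent
            entity.components[FadeComponent.self] = fade
        }
    }
}

// Both types are registered once at app start:
// FadeComponent.registerComponent()
// FadeSystem.registerSystem()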

On iOS 15 this works as expected. Please let me know if there is a workaround, as this looks like a major regression to me, and I need this to work by the time iOS 16 is released to the public.

The radar number (the report includes a sample project) is FB11420976.

Thank you!

Replies

Hello,

One workaround would be to create a new UnlitMaterial each time (rather than copy the old one), then set its texture and opacity, and then replace the old material with the new one.
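Roughly like this (just a sketch – the fadedMaterial helper is only for illustration, and textureResource stands in for whatever texture the material should keep):

import RealityKit

// Build a fresh material instead of mutating the one read back from the model.
func fadedMaterial(texture textureResource: TextureResource, opacity: Float) -> UnlitMaterial {
    var material = UnlitMaterial()
    material.color = .init(texture: .init(textureResource))
    material.blending = .transparent(opacity: .init(floatLiteral: opacity))
    return material
}

// Then replace the old material on the model, e.g.:
// modelEntity.model?.materials = [fadedMaterial(texture: textureResource, opacity: 0.5)]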

That being said, please continue following up with your bug report!

Yeah, I took a look at your sample – love all the cat pics you send us! When you first set the blending mode, did you see the texture disappear? That's what I was seeing, and I was wondering whether that was expected behavior or also a bug.

  • Hehe, glad it's appreciated 🙏 :) Interestingly, I just noticed that I need to set the blend mode after assigning the texture. If I do it before, the blending is ignored. I'm not sure anymore how that behaved on iOS 15 – I can test that tomorrow. After that, I can update the opacity one more time, and then suddenly the texture resource is gone.


Hi,

Thanks for the reply! I tried that, but unfortunately it doesn't really work in my case (or only works once): when I create a new UnlitMaterial for an animation step, assign the opacity value, and set it on the model, the texture is already gone by the time I create the next UnlitMaterial for the following animation step. The only approach that worked was permanently storing the texture resource somewhere and assigning it manually. But I would really like to avoid that, because it forces my fade system to make a lot of assumptions about the material it's working on; a more generic approach would be better. Maybe you can check out the sample project from my bug report in case I overlooked something? I appreciate the support!

Alright, my workaround for now is to store the initial materials for my model in a dedicated animation component, define a target opacity, and then derive a new material for each animation step from the initial material and assign that to the model. This way the texture persists. If I instead dynamically query the current material from the model itself and adjust that, the texture is purged after assigning a new blending value. A rough sketch of this approach is below.
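In case it helps others – a simplified sketch of that workaround (FadeAnimationComponent and applyOpacity are just names I use here for illustration):

import RealityKit

// Keeps the untouched materials around so every animation step derives from them.
struct FadeAnimationComponent: Component {
    var initialMaterials: [Material]   // captured once, before any blending is applied
    var targetOpacity: Float
}

func applyOpacity(_ opacity: Float, to model: ModelEntity, from initialMaterials: [Material]) {
    model.model?.materials = initialMaterials.map { material -> Material in
        // Deriving from the stored initial material keeps the texture;
        // deriving from model.model?.materials loses it after the first update on iOS 16.
        guard var unlit = material as? UnlitMaterial else { return material }
        unlit.blending = .transparent(opacity: .init(floatLiteral: opacity))
        return unlit
    }
}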

Nevertheless, I feel this is a bug that should be fixed.

Another observation: on both iOS 15 and 16, the order in which blending is assigned is crucial:

Has no effect (the blending is ignored):

var unlitMaterial = UnlitMaterial()
unlitMaterial.blending = .transparent(opacity: 0.1)
unlitMaterial.color = .init(texture: .init(textureResource))

Works:

var unlitMaterial = UnlitMaterial()
unlitMaterial.color = .init(texture: .init(textureResource))
unlitMaterial.blending = .transparent(opacity: 0.1)