LeakyReLU Bug on A12/A13 iPhone Devices when using ANE

When running the mlmodel's prediction on the ANE, the LeakyReLU layer produces output feature maps that differ drastically from those produced with CPUAndGPU or CPUOnly. The results are completely wrong when using the ANE, but correct when using the GPU or CPU.
If I simply replace the LeakyReLU with ReLU in the mlmodel, the output feature maps show only small differences and the outputs are all correct.
This problem occurs on iPhones with A12 and A13 chips, regardless of whether the system is iOS 12, iOS 13, or iOS 14. On A14 devices the problem is gone.
Is this a bug in the Core ML framework on A12/A13 devices?
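For context, this is roughly how I run the same input under each compute unit setting to compare the feature maps. It is only a minimal sketch; the model URL, the input provider, and the output feature name "output" are placeholders for my own model.

import CoreML

// Minimal sketch: load the same compiled mlmodel with different compute
// units and return one output feature map for comparison.
// "output" is a placeholder feature name, not the real one from the model.
func runPrediction(modelURL: URL,
                   input: MLFeatureProvider,
                   units: MLComputeUnits) throws -> MLMultiArray? {
    let config = MLModelConfiguration()
    config.computeUnits = units   // .cpuOnly, .cpuAndGPU, or .all (ANE eligible)
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    let prediction = try model.prediction(from: input)
    return prediction.featureValue(for: "output")?.multiArrayValue
}

// The three runs being compared:
// let cpuOut = try runPrediction(modelURL: url, input: features, units: .cpuOnly)
// let gpuOut = try runPrediction(modelURL: url, input: features, units: .cpuAndGPU)
// let aneOut = try runPrediction(modelURL: url, input: features, units: .all)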

Replies

Hello, I ran into a similar problem, but in my case it fails on A14 devices (I don't have A12/A13 devices to test). Even after changing LeakyReLU to ReLU, the result with 'cpuAndGPU' still differs a lot from the result with 'all' (using the ANE). The only difference in my project is switching the compute units between MLComputeCPUAndGPU and MLComputeAll. Is there any experience you can share with me?
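To quantify "differs a lot", I compare the two outputs element-wise. A minimal sketch, assuming both runs return MLMultiArrays of the same shape:

import CoreML

// Minimal sketch: maximum absolute element-wise difference between two
// output feature maps (e.g. the .cpuAndGPU result vs. the .all / ANE result).
// Assumes both arrays have the same shape and element count.
func maxAbsDifference(_ a: MLMultiArray, _ b: MLMultiArray) -> Double {
    precondition(a.count == b.count, "outputs must have the same element count")
    var maxDiff = 0.0
    for i in 0..<a.count {
        maxDiff = max(maxDiff, abs(a[i].doubleValue - b[i].doubleValue))
    }
    return maxDiff
}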

Hello, could you please submit a bug report on http://feedbackassistant.apple.com/ along with the version of the OS you are observing this on?