I am curious about XLA support. Is this planned for tensorflow-metal/tensorflow-macos?
That would likely work seamlessly for JAX and PyTorch as well.
Hi @ngam!
Unfortunately we can't comment here on any future plans we have, but I will pass along that there is interest in XLA support within the developer community.
Thank you. I understand.
If a lengthier explanation would help, I would be happy to provide one. Both TensorFlow and JAX would likely gain a lot in performance if XLA were supported here, and it would give developers (and users) exciting tools for building their models. It goes without saying that open-sourcing the work would be great. :)