I have a TensorFlow 2.x object detection model (SSD ResNet50 v1) that was trained on an Ubuntu 20.04 box with a GPU.
The model's predictions perform as expected on Linux CPU & GPU, Windows 10 CPU & GPU, the Intel MacBook Air CPU, and the M1 MacBook Air CPU.
However, when I install the tensorflow-metal plugin on the M1, I can see the GPU is being used but the predictions are garbage.
I followed these install instructions:
https://developer.apple.com/metal/tensorflow-plugin/
Which gives me:
- tensorflow-macos 2.6.0
- tensorflow-metal 0.2.0
and
- Python 3.9.5
Anyone have insight as to what may be the problem? The M1 Air is running the public release of Monterey.
UPDATE: It may be something specific to the SSD Resnet50 v1 architecture. I have several other models built with the same pipeline and data which do seem to be working.
Hi AdkPete, I have the same problem here. I compared my results with the same model on Windows 10, an Intel MacBook, and Linux CPU. The predictions became very bad after I installed the tensorflow-metal plugin. For now I have created another environment without the plugin and use the CPU only to train the model. Do you have any idea how to deal with this? Many thanks.
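One way to stay on the CPU without maintaining a separate environment is to hide the GPU from TensorFlow before any ops run. This is a minimal sketch using the standard TF 2.x `tf.config.set_visible_devices` API (it assumes tensorflow-macos 2.6, where that call is available; the try/except is only so the snippet degrades gracefully if TensorFlow isn't installed):

```python
# Hide all GPU devices from TensorFlow so the model runs CPU-only,
# even with tensorflow-metal installed. This must execute before the
# first TensorFlow op, or it raises a RuntimeError.
try:
    import tensorflow as tf
    tf.config.set_visible_devices([], "GPU")
    visible_gpus = tf.config.get_visible_devices("GPU")
except ImportError:
    visible_gpus = []  # TensorFlow not installed in this environment

# With the GPU hidden, no GPU devices remain visible.
print(len(visible_gpus))
```

This only hides the device at runtime; it doesn't uninstall the plugin, so you can flip back to the GPU path for testing by removing the `set_visible_devices` line.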
Mona190: I don't have a fix, and the other models I thought were working are not actually working. I looked at the models' outputs: scores are numbers like 90000 when they should be between 0 and 1, and a prediction can also produce wacky, non-existent class values. Something is very wrong with the plugin.
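For anyone hitting this, a quick programmatic sanity check can flag the garbage outputs described above (scores outside [0, 1], class ids outside the label map). This is just an illustrative helper, not part of any API; it assumes the usual TF Object Detection convention that class ids start at 1:

```python
import numpy as np

def detections_look_sane(scores, classes, num_classes):
    """Heuristic check that detection outputs are plausible:
    every score in [0, 1] and every class id within the label map."""
    scores = np.asarray(scores, dtype=float)
    classes = np.asarray(classes, dtype=int)
    scores_ok = bool(np.all((scores >= 0.0) & (scores <= 1.0)))
    classes_ok = bool(np.all((classes >= 1) & (classes <= num_classes)))
    return scores_ok and classes_ok

# Outputs like the 90000-scale scores reported above fail the check:
print(detections_look_sane([90000.0, 0.5], [1, 2], num_classes=90))   # False
print(detections_look_sane([0.91, 0.45], [17, 3], num_classes=90))    # True
```

Running this against both the CPU environment and the metal environment on the same image makes it easy to demonstrate that only the GPU path misbehaves.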