How to convert a TensorFlow™ Lite model to ONNX using tf2onnx


To convert a .tflite model to the ONNX format, ONNX provides a tool named tf2onnx [1], which is very simple to use.

The first step is to install TensorFlow on the host computer. For test purposes, it can also be useful to install ONNX Runtime. The tf2onnx tool uses the versions of TensorFlow and ONNX Runtime that are already installed; if it does not find any, it installs the most recent versions.
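
For example, both packages can be installed from PyPI with pip (the exact TensorFlow package and version may depend on the target environment):

 pip install tensorflow
 pip install onnxruntime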

The second step is to install tf2onnx:

- Install from PyPI:

 pip install -U tf2onnx

or

- Install the latest version from GitHub:

 pip install git+https://github.com/onnx/tensorflow-onnx

After the installation, the .tflite model can be converted directly with the following command line:

  python -m tf2onnx.convert --opset 16 --tflite path/to/tflite/model.tflite --output path/to/onnx/model/model.onnx
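
After a successful conversion, the generated ONNX model can be checked and exercised with ONNX Runtime. The short Python sketch below is only an illustration: it assumes that the onnx and onnxruntime packages are installed and that path/to/onnx/model/model.onnx is the file produced by the command above; the input name, shape, and data type depend on the original .tflite model.

 import numpy as np
 import onnx
 import onnxruntime as ort
 # Check that the converted model is structurally valid
 model = onnx.load("path/to/onnx/model/model.onnx")
 onnx.checker.check_model(model)
 # Create an inference session and inspect the model input
 session = ort.InferenceSession("path/to/onnx/model/model.onnx")
 input_meta = session.get_inputs()[0]
 print("Input name:", input_meta.name, "shape:", input_meta.shape)
 # Build dummy input data (dynamic dimensions are replaced by 1);
 # adapt the dtype if the model does not expect float32
 dims = [d if isinstance(d, int) else 1 for d in input_meta.shape]
 dummy_input = np.random.rand(*dims).astype(np.float32)
 outputs = session.run(None, {input_meta.name: dummy_input})
 print("Output shape:", outputs[0].shape)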

Native ONNX models are also available in the ONNX Model Zoo [2].
