How to measure performance of your NN models using the Coral Edge TPU


This article describes how to benchmark a Neural Network model on STM32MP1 using the Coral Edge TPU.

Information
There are many ways to benchmark a NN model, this article provides a simple example using an application provided in the X-LINUX-AI package. You are free to explore other methods that are better adapted to your development constraints.

1. Installation

1.1. Install from the OpenSTLinux AI package repository

Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.

After having configured the AI OpenSTLinux package, you can install the X-LINUX-AI components for this application. The minimum package required is:

 apt-get install tflite-edgetpu-benchmark

In this example, the model used comes from the following package:

 apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu

Then restart the demo launcher:

 systemctl restart weston@root

2. How to use the Benchmark application

2.1. Executing with the command line

The "tflite-edgetpu-benchmark" application is located in the userfs partition:

/usr/local/bin/demo-ai/benchmark

It accepts the following input parameters:

Usage: ./tflite-edgetpu-benchmark

        -m --model_file <.tflite file path>:  .tflite model to be executed
        -l --loops <int>:                     number of times the inference will be executed
                                              (by default nb_loops=1)
        --help:                               show this help

2.2. Testing with COCO SSD MobileNet V1

The model used for testing is detect_edgetpu.tflite, a COCO SSD MobileNetV1 model used for object detection.

Information
This model is used only as an example; you should benchmark your own model to evaluate its performance on the Coral Edge TPU.

This model is located here:

/usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/

To launch the application, use the following command:

  ./tflite-edgetpu-benchmark -m <model .tflite> -l <number of loops>
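For example, assuming the packages above are installed in their default locations, a concrete run on the target board against the COCO SSD MobileNetV1 model could look like this (the loop count of 10 is an arbitrary choice for illustration):

```shell
# Run from the benchmark directory on the target board.
cd /usr/local/bin/demo-ai/benchmark

# Benchmark the Edge TPU model over 10 inference loops.
./tflite-edgetpu-benchmark \
    -m /usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/detect_edgetpu.tflite \
    -l 10
```

A higher loop count smooths out run-to-run variation in the reported min/max/avg figures.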

As output, the benchmark application returns a line similar to the following:

inference time: min=65734us  max=77319us  avg=74377.3us
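If you script repeated benchmark runs, the figures can be extracted from this line with standard tools. A minimal sketch, assuming the exact output format shown above (the sample line is hardcoded here for illustration):

```shell
# Sample output line from tflite-edgetpu-benchmark (format shown above).
line='inference time: min=65734us  max=77319us  avg=74377.3us'

# Extract the average inference time (in microseconds) with sed.
avg_us=$(printf '%s\n' "$line" | sed -n 's/.*avg=\([0-9.]*\)us.*/\1/p')

echo "average inference time: ${avg_us} us"
```

In a real script, the `line` variable would instead capture the output of the benchmark command.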

This gives you an idea of your model's performance.
