How to measure performance of your NN models using the Coral Edge TPU
This article describes how to measure the performance of a neural network model compiled for the Coral Edge TPU on an STM32MP1x platform.
1 Installation
1.1 Installing from the OpenSTLinux AI package repository
After having configured the OpenSTLinux AI package repository, you can install the X-LINUX-AI components for this application. The minimum required package is tflite-edgetpu-benchmark, which can be installed directly on your board using the following command:
apt-get install tflite-edgetpu-benchmark
The model used in this example can be installed from the following package:
apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu
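Before running the benchmark, you can optionally check that the Coral USB Accelerator is detected. This verification step is not part of the original procedure and assumes the lsusb tool (from the usbutils package) is available on the target. Once the device has been initialized by the Edge TPU runtime, it enumerates with the Google ID 18d1:9302, which also appears in the benchmark output shown later in this article:
# Check that the Coral USB Accelerator is visible on the USB bus
lsusb | grep 18d1:9302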
2 How to use the Benchmark application
2.1 Executing with the command line
The tflite_edgetpu_benchmark application is located in the userfs partition:
/usr/local/bin/coral-edgetpu-2.0.0/tools/tflite_edgetpu_benchmark
It accepts the following input parameters:
Usage: ./tflite-edgetpu-benchmark
-m --model_file <.tflite file path>: .tflite model to be executed
-l --loops <int>: provide the number of time the inference will be executed (by default nb_loops=1)
--help: show this help
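For repeated measurements, the two documented options can be wrapped in a small shell script. The sketch below is illustrative only: MODEL is a placeholder path to adapt, the loop counts are arbitrary, and the grep pattern relies on the "inference time" summary line shown in the next section.
#!/bin/sh
# Illustrative sketch: run the benchmark with increasing loop counts and
# keep only the timing summary line, to check that the average stabilizes.
# MODEL is a placeholder; point it at any Edge TPU compiled .tflite file.
BENCH=/usr/local/bin/coral-edgetpu-2.0.0/tools/tflite_edgetpu_benchmark
MODEL=/path/to/your/model_edgetpu.tflite

for loops in 10 50 100; do
    echo "--- nb_loops=$loops ---"
    "$BENCH" -m "$MODEL" -l "$loops" | grep "inference time"
done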
2.2 Testing with COCO SSD MobileNet V1
The model used for testing is detect_edgetpu.tflite, a COCO SSD MobileNet V1 model used for object detection.
On the target, the model is located here:
/usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/
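As a quick check (not part of the original procedure), you can verify that the model file is present on the target before launching the benchmark:
ls -l /usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/detect_edgetpu.tflite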
To launch the application, use the following command:
/usr/local/bin/coral-edgetpu-2.0.0/tools/tflite_edgetpu_benchmark -m /usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/detect_edgetpu.tflite -l 50
Console output:
model file set to: /usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/detect_edgetpu.tflite
This benchmark will execute 50 inference(s)
Bus 002 Device 004: ID 18d1:9302 Google Inc.
Loaded model /usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/detect_edgetpu.tflite
resolved reporter
inferences are running: # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
inference time: min=58315us max=109714us avg=66009.4us
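The last line gives the timing summary. As a rough conversion, the average of 66009.4 us (about 66 ms) corresponds to roughly 15 inferences per second (1 / 0.066 s). The following minimal sketch, assuming the output format shown above, extracts the average inference time and prints the equivalent throughput:
# Extract the average inference time (in microseconds) from the summary line
AVG_US=$(/usr/local/bin/coral-edgetpu-2.0.0/tools/tflite_edgetpu_benchmark \
    -m /usr/local/demo-ai/computer-vision/models/coco_ssd_mobilenet/detect_edgetpu.tflite \
    -l 50 | sed -n 's/.*avg=\([0-9.]*\)us.*/\1/p')
# Convert microseconds per inference to inferences per second
echo "average: ${AVG_US} us (~$(awk -v t="$AVG_US" 'BEGIN { printf "%.1f", 1000000 / t }') inferences/s)"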