How to build an example using libcoral API
Applicable for: STM32MP13x lines, STM32MP15x lines
This article explains how to build an example using the libcoral API[1]. libcoral is a C/C++ API used for three main purposes:
- Inferencing: facilitates the implementation of neural network inference on the Coral Edge TPU™
- Pipelining: provides functions to pipeline a model on multiple Coral Edge TPUs
- Transfer learning: enables on-board transfer learning
Several examples can be found on the libcoral GitHub[2]. This article only shows how to build a simple image classification example from scratch. The method used here can easily be applied to other examples.
1 Description
This simple example is based on two image classification models, which allow the identification of the subject represented by an image.
Beyond the image classification aspect, the purpose of this example is to demonstrate how to use the libcoral API to run inferences of two models alternately on a single Google Coral Edge TPU™[3].
This example depends only on the TensorFlow™ Lite[4] interpreter and the libraries associated with the Google Coral Edge TPU™[3].
The models used in this example are two MobileNet v2 variants, Inat_bird and Inat_plant, for bird and plant classification respectively, downloaded from the Coral GitHub testing models[5].
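To give an idea of how the libcoral API is used, the snippet below is a minimal sketch of the two-models-one-TPU flow: both interpreters share a single EdgeTpuContext so the two models run alternately on the same device. It is not the actual two_models_one_tpu.cc; the helper names are taken from the libcoral headers (coral/tflite_utils.h and coral/classification/adapter.h), and the file names, loop count and 224x224x3 input size are assumptions kept for brevity. Refer to the source file downloaded below for the exact implementation.

// Minimal sketch only (not the real two_models_one_tpu.cc): two classification
// models sharing a single Edge TPU through one EdgeTpuContext.
// Helper names follow coral/tflite_utils.h and coral/classification/adapter.h;
// the file names and the fixed 224x224x3 input size are assumptions.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <vector>

#include "coral/classification/adapter.h"
#include "coral/tflite_utils.h"
#include "tensorflow/lite/model.h"

int main() {
  // Open the Edge TPU once; the shared context lets both interpreters
  // run alternately on the same device.
  auto tpu_context = coral::GetEdgeTpuContextOrDie();

  auto bird_model =
      tflite::FlatBufferModel::BuildFromFile("inat_bird_edgetpu.tflite");
  auto plant_model =
      tflite::FlatBufferModel::BuildFromFile("inat_plant_edgetpu.tflite");

  auto bird = coral::MakeEdgeTpuInterpreterOrDie(*bird_model, tpu_context.get());
  auto plant = coral::MakeEdgeTpuInterpreterOrDie(*plant_model, tpu_context.get());
  bird->AllocateTensors();
  plant->AllocateTensors();

  // Raw 224x224 RGB images, as produced by the 'convert' commands later in
  // this article (left empty here for brevity).
  std::vector<uint8_t> bird_rgb(224 * 224 * 3);   // fill from bird.rgb
  std::vector<uint8_t> plant_rgb(224 * 224 * 3);  // fill from plant.rgb

  for (int i = 0; i < 1000; ++i) {
    // Alternate the two models: copy each image into its input tensor,
    // then invoke the corresponding interpreter.
    auto bird_in = coral::MutableTensorData<uint8_t>(*bird->input_tensor(0));
    std::copy(bird_rgb.begin(), bird_rgb.end(), bird_in.begin());
    bird->Invoke();

    auto plant_in = coral::MutableTensorData<uint8_t>(*plant->input_tensor(0));
    std::copy(plant_rgb.begin(), plant_rgb.end(), plant_in.begin());
    plant->Invoke();
  }

  // Top-1 result for each model: class index (id) and confidence (score).
  auto bird_top = coral::GetClassificationResults(*bird, 0.0f, 1);
  auto plant_top = coral::GetClassificationResults(*plant, 0.0f, 1);
  std::cout << "bird index: " << bird_top[0].id << " score: " << bird_top[0].score
            << "\nplant index: " << plant_top[0].id << " score: "
            << plant_top[0].score << std::endl;
  return 0;
}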
2 Installation
2.1 Install from the OpenSTLinux AI package repository
Warning: The software package is provided AS IS, and by downloading it, you agree to be bound by the terms of the software license agreement (SLA). The detailed content licenses can be found here.
After having configured the AI OpenSTLinux package, you can install the X-LINUX-AI components for this example:
apt-get install libcoral
2.2 Source code location
The source code and the header files of this example are located in the libcoral API examples GitHub [6].
3 How to use the example
3.1 Installation of the X-LINUX-AI SDK
First of all, the installation of the X-LINUX-AI SDK is required to be able to cross-compile AI applications for STM32 boards.
3.2 Start the SDK
Info: The SDK environment setup script must be run once on each new working terminal on which you cross-compile.
Once the OpenSTLinux SDK is installed, go to the installation directory and source the environment:
cd $HOME/STM32MPU_workspace/STM32MP1X-Ecosystem-v4.1.0/Developer-Package/SDK
source environment-setup-cortexa7t2hf-neon-vfpv4-ostl-linux-gnueabi
Warning: The path to the SDK must be adapted depending on your working configuration (board used).
3.3 Set up the file tree
This example requires the following file tree to operate out of the box:
mkdir -p sources/coral/examples
cd sources/coral/examples
3.4 Download the example
As mentioned before, the source and header files must be downloaded from the libcoral API examples GitHub [6]:
wget https://raw.githubusercontent.com/google-coral/libcoral/master/coral/examples/two_models_one_tpu.cc
wget https://raw.githubusercontent.com/google-coral/libcoral/master/coral/examples/file_utils.cc
wget https://raw.githubusercontent.com/google-coral/libcoral/master/coral/examples/file_utils.h
3.5 Create the Makefile
Create the following Makefile in the sources/coral/examples directory:
# Copyright (c) 2022 STMicroelectronics
TARGET_BIN = two_models_one_tpu
CXXFLAGS += -Wall $(shell pkg-config --cflags absl_base libglog)
CXXFLAGS += -I../../
# Fix undefined reference during the link
CXXFLAGS += -std=c++11
LDFLAGS = $(shell pkg-config --libs absl_base libglog)
LDFLAGS += -lcoral -ledgetpu -ltensorflow-lite -lglog
LDFLAGS += \
-labsl_flags_internal \
-labsl_flags_marshalling \
-labsl_flags_reflection \
-labsl_statusor \
-labsl_flags_parse \
-labsl_strings \
-labsl_throw_delegate
SRCS = two_models_one_tpu.cc file_utils.cc
OBJS = $(SRCS:.cc=.o)
all: $(TARGET_BIN)

$(OBJS): $(SRCS)
	$(CXX) $(CXXFLAGS) -c $^

$(TARGET_BIN): $(OBJS)
	$(CXX) $(LDFLAGS) $^ -o $@

clean:
	rm -rf $(OBJS) $(TARGET_BIN)
The minimal required libraries (Abseil, glog, TensorFlow Lite, the Edge TPU runtime, and libcoral) are linked during the cross-compilation.
3.6 Download and prepare test data
An additional package is necessary on the host PC to preprocess data for this example: imagemagick
apt-get install imagemagick
First create the directory to store test data:
mkdir edgetpu_cpp_example
Next download the models and the test pictures:
wget -O edgetpu_cpp_example/inat_plant_edgetpu.tflite https://github.com/google-coral/edgetpu/raw/master/test_data/mobilenet_v2_1.0_224_inat_plant_quant_edgetpu.tflite
wget -O edgetpu_cpp_example/inat_bird_edgetpu.tflite https://github.com/google-coral/edgetpu/raw/master/test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite
wget -O edgetpu_cpp_example/inat_plant_labels.txt https://github.com/google-coral/edgetpu/raw/master/test_data/inat_plant_labels.txt
wget -O edgetpu_cpp_example/inat_bird_labels.txt https://github.com/google-coral/edgetpu/raw/master/test_data/inat_bird_labels.txt
wget -O edgetpu_cpp_example/bird.jpg https://farm3.staticflickr.com/8008/7523974676_40bbeef7e3_o.jpg
wget -O edgetpu_cpp_example/plant.jpg https://c2.staticflickr.com/1/62/184682050_db90d84573_o.jpg
Finally, preprocess the pictures to make them fit the input shape of the models:
cd edgetpu_cpp_example && convert bird.jpg -resize 224x224! bird.rgb && convert plant.jpg -resize 224x224! plant.rgb
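The resulting .rgb files are plain byte dumps whose size must match the 224x224x3 uint8 input tensor of the models. As an illustration only (the real example uses its own file_utils.cc routines), a raw image of this kind could be loaded and size-checked as follows:

// Illustrative sketch only: load a raw 224x224 RGB image produced by
// 'convert ... -resize 224x224! image.rgb' and check that its size matches
// the 224*224*3 bytes expected by the models' input tensor.
#include <cstdint>
#include <fstream>
#include <iterator>
#include <stdexcept>
#include <string>
#include <vector>

std::vector<uint8_t> ReadRawRgb224(const std::string& path) {
  constexpr size_t kExpectedSize = 224 * 224 * 3;  // width * height * channels
  std::ifstream file(path, std::ios::binary);
  if (!file) throw std::runtime_error("cannot open " + path);
  std::vector<uint8_t> data((std::istreambuf_iterator<char>(file)),
                            std::istreambuf_iterator<char>());
  if (data.size() != kExpectedSize)
    throw std::runtime_error(path + ": unexpected size, expected 224x224x3 bytes");
  return data;
}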
3.7 Cross-compilation and launch
Run the cross-compilation:
cd ..
make
Once the compilation is finished, a binary file named two_models_one_tpu has been created.
Copy the binary file and the test data directory onto the board:
scp -r edgetpu_cpp_example/ root@<board_ip>:/path/
scp two_models_one_tpu root@<board_ip>:/path/
Info: The Coral Edge TPU™ must be plugged into the board before launching the application.
Connect to the board and launch the example:
./two_models_one_tpu
After 2000 inferences the result is:
Running model: edgetpu_cpp_example/inat_bird_edgetpu.tflite and model: edgetpu_cpp_example/inat_plant_edgetpu.tflite for 2000 inferences
[Bird image analysis] max value index: 659 value: 0.652344
[Plant image analysis] max value index: 1680 value: 0.964844
Using one Edge TPU, # inferences: 2000 costs: 106.278 seconds.
The max value index represents the index of the detected class and the value represents the confidence. On these particular pictures, the bird detected is a Poecile atricapillus (black-capped chickadee) and the plant is a Helianthus annuus (sunflower). The index and the name of each class are available in the inat_bird_labels.txt and inat_plant_labels.txt files stored in the edgetpu_cpp_example directory.
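As an illustration, a class index can be translated into its name by parsing the corresponding labels file. The helper below is only a sketch, not part of the example, and it assumes that each line of the labels file starts with the numeric class index followed by the class name; adjust the parsing if the actual file format differs.

// Illustrative sketch only: map a class index reported by the example to its
// name using a labels file. Assumes lines of the form "<index> <class name>";
// adjust the parsing if the labels file format differs.
#include <fstream>
#include <istream>
#include <map>
#include <sstream>
#include <string>

std::map<int, std::string> ReadLabels(const std::string& path) {
  std::map<int, std::string> labels;
  std::ifstream file(path);
  std::string line;
  while (std::getline(file, line)) {
    std::istringstream iss(line);
    int index;
    if (iss >> index) {
      std::string name;
      std::getline(iss >> std::ws, name);  // remainder of the line is the name
      labels[index] = name;
    }
  }
  return labels;
}

// With the output above, ReadLabels("edgetpu_cpp_example/inat_bird_labels.txt")[659]
// should correspond to the black-capped chickadee entry.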
4 References
[1] libcoral API: https://coral.ai/docs/reference/cpp/
[2] libcoral GitHub: https://github.com/google-coral/libcoral/
[3] Coral Edge TPU™: https://coral.ai/
[4] TensorFlow™ Lite: https://www.tensorflow.org/lite
[5] Coral GitHub testing models: https://github.com/google-coral/edgetpu/blob/master/test_data/
[6] libcoral API examples GitHub: https://github.com/google-coral/libcoral/tree/master/coral/examples