
X-LINUX-AI OpenSTLinux Expansion Package

AI Expansion Package inserted in the STM32MPU Embedded software distribution

X-LINUX-AI is an STM32 MPU OpenSTLinux Expansion Package that targets artificial intelligence for STM32MP1 Series devices.

It contains Linux AI frameworks, as well as application examples to get started with some basic use cases such as computer vision (CV).

It is composed of an OpenEmbedded meta layer, named meta-st-stm32mpu-ai, to be added on top of the STM32MP1 Distribution Package.

It provides a complete, coherent, and easy-to-build/install environment to take advantage of AI on STM32MP1 Series devices.

1. Versions

1.1. X-LINUX-AI v2.1.0

Information
This version is compatible with the Yocto Project® build system releases Thud and Dunfell, and has been validated against the OpenSTLinux ecosystem release v2.1.0, ecosystem release v2.0.0 and ecosystem release v1.2.0, on STM32MP157x-DKx, STM32MP157x-EV1 and STM32MP157 Avenger96[1] boards.

1.1.1. Contents

Warning
The face recognition binary is available on demand. Please contact your local STMicroelectronics support for more information about this application, or send a request to edge.ai@st.com.

1.1.2. Validated hardware

Like any software expansion package, X-LINUX-AI is supported on all STM32MP1 Series devices and has been validated on the following boards:

  • STM32MP157C-DK2[8]
  • STM32MP157C-EV1[9]
  • STM32MP157A-EV1[10]
  • STM32MP157 Avenger96 board[1]

1.1.3. Software structure

X-LINUX-AI v2.1.0 Expansion Package Software structure

1.2. X-LINUX-AI v2.0.0

Information
This version has been validated against the OpenSTLinux ecosystem release v2.0.0, on STM32MP157x-DKx and STM32MP157x-EV1 boards.

1.2.1. Contents

1.2.2. Validated hardware

Like any software expansion package, X-LINUX-AI is supported on all STM32MP1 Series devices and has been validated on the following boards:

  • STM32MP157C-DK2[8]
  • STM32MP157C-EV1[9]
  • STM32MP157A-EV1[10]

1.2.3. Software structure

X-LINUX-AI v2.0.0 Expansion Package Software structure

2. Install from the OpenSTLinux AI package repository

Information
The STMicroelectronics packages repository service is provided for evaluation purposes only; its content may be updated at any time without notice and is therefore not approved for use in production.

All the generated X-LINUX-AI packages are available from the OpenSTLinux AI package repository service hosted at the non-browsable URL http://extra.packages.openstlinux.st.com/AI.

This repository contains AI packages that can simply be installed using apt-* utilities, which are the same as those used on a Debian system:

  • the main group contains the selection of AI packages whose installation is automatically tested by STMicroelectronics
  • the updates group is reserved for future use, such as package revision updates.

You can install them individually or by package group.

2.1. Prerequisites

ST boards prerequisites:

  • Flash the Starter Package on your SD card
For OpenSTLinux ecosystem release v2.1.0 and ecosystem release v2.0.0:
  • Your board must have an internet connection, either through the network cable or through a WiFi connection.
Information

If your internet access depends on a proxy server, define the http_proxy environment variable with the following command before running any apt-* commands:

 export http_proxy='http://<proxy url>:<proxy port>/'

Avenger96 board prerequisites:

  • The Avenger96 board starter image supporting OpenSTLinux v2.1.0 must be flashed onto your SD card:
OpenSTLinux-2.1 based on Yocto Dunfell LTS and Linux 5.4.56 - v6.5 Starter Image
  • Your board must have an internet connection, either through the network cable or through a WiFi connection.
Information

If your internet access depends on a proxy server, define the http_proxy environment variable with the following command before running any apt-* commands:

 export http_proxy='http://<proxy url>:<proxy port>/'

2.2. Configure the AI OpenSTLinux package repository

Once the board has booted, execute the following commands in the console to configure the AI OpenSTLinux package repository:

For ecosystem release v2.1.0:

 wget http://extra.packages.openstlinux.st.com/AI/2.1/pool/config/a/apt-openstlinux-ai/apt-openstlinux-ai_1.0_armhf.deb
 dpkg -i apt-openstlinux-ai_1.0_armhf.deb

For ecosystem release v2.0.0:
 wget http://extra.packages.openstlinux.st.com/AI/2.0/pool/config/a/apt-openstlinux-ai/apt-openstlinux-ai_1.0_armhf.deb
 dpkg -i apt-openstlinux-ai_1.0_armhf.deb


For ecosystem release v1.2.0:
 wget http://extra.packages.openstlinux.st.com/AI/1.2/pool/config/a/apt-openstlinux-ai/apt-openstlinux-ai_1.0_armhf.deb
 dpkg -i apt-openstlinux-ai_1.0_armhf.deb

Then synchronize the AI OpenSTLinux package repository:

 apt-get update
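Once the update completes, you can check that the AI repository is now visible to apt and which package groups it provides. The commands below are standard apt utilities; the package name used as a search pattern is taken from the install commands in the next section:

 apt-cache search packagegroup-x-linux-ai
 apt-cache policy packagegroup-x-linux-ai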

2.3. Install AI packages

Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.

2.3.1. Install all X-LINUX-AI packages

  • apt-get install packagegroup-x-linux-ai
    Install all the X-LINUX-AI packages (TensorFlow Lite, Edge TPU, armNN, application samples and tools)

2.3.2. Install AI framework-related packages

  • apt-get install packagegroup-x-linux-ai-tflite
    Install X-LINUX-AI packages related to the TensorFlow Lite framework (including application samples)
  • apt-get install packagegroup-x-linux-ai-tflite-edgetpu
    Install X-LINUX-AI packages related to the Edge TPU framework (including application samples)
  • apt-get install packagegroup-x-linux-ai-armnn-tflite
    Install X-LINUX-AI packages related to the armNN framework (including application samples)
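After installing a package group, you can list what it pulled in with the standard dpkg tool; the filter pattern below is only an example:

 dpkg -l | grep -E 'tflite|tensorflow|armnn'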

2.3.3. Install individual packages

X-LINUX-AI v2.1.0 packages

  • apt-get install arm-compute-library
    Install Arm Compute Library (ACL)
  • apt-get install arm-compute-library-tools
    Install Arm Compute Library utilities (graph examples and benchmarks)
  • apt-get install armnn
    Install Arm Neural Network SDK (armNN)
  • apt-get install armnn-tensorflow-lite
    Install armNN TensorFlow Lite parser
  • apt-get install armnn-tensorflow-lite-examples
    Install armNN TensorFlow Lite examples
  • apt-get install armnn-tfl-cv-apps-image-classification-c++
    Install C++ image classification example using armNN TensorFlow Lite parser
  • apt-get install armnn-tfl-cv-apps-object-detection-c++
    Install C++ object detection example using armNN TensorFlow Lite parser
  • apt-get install armnn-tools
    Install armNN utilities such as unitary tests
  • apt-get install python3-tensorflow-lite
    Install Python TensorFlow Lite inference engine
  • apt-get install python3-tensorflow-lite-edgetpu
    Install Python TensorFlow Lite inference engine for Edge TPU
  • apt-get install tensorflow-lite-edgetpu
    Install Edge TPU libraries and the USB rules
  • apt-get install tensorflow-lite-tools
    Install TensorFlow Lite utilities
  • apt-get install tflite-cv-apps-edgetpu-image-classification-c++
    Install C++ image classification example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-image-classification-python
    Install Python image classification example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-object-detection-c++
    Install C++ object detection example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-object-detection-python
    Install Python object detection example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-image-classification-c++
    Install C++ image classification example using TensorFlow Lite
  • apt-get install tflite-cv-apps-image-classification-python
    Install Python image classification example using TensorFlow Lite
  • apt-get install tflite-cv-apps-object-detection-c++
    Install C++ object detection example using TensorFlow Lite
  • apt-get install tflite-cv-apps-object-detection-python
    Install Python object detection example using TensorFlow Lite
  • apt-get install tflite-edgetpu-benchmark
    Install benchmark application for Edge TPU models
  • apt-get install tflite-models-coco-ssd-mobilenetv1
    Install TensorFlow Lite COCO SSD Mobilenetv1 model
  • apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu
    Install TensorFlow Lite COCO SSD Mobilenetv1 model for Edge TPU
  • apt-get install tflite-models-mobilenetv1
    Install TensorFlow Lite Mobilenetv1 model
  • apt-get install tflite-models-mobilenetv1-edgetpu
    Install TensorFlow Lite Mobilenetv1 model for Edge TPU

X-LINUX-AI v2.0.0 packages

  • apt-get install arm-compute-library
    Install Arm Compute Library (ACL)
  • apt-get install arm-compute-library-tools
    Install Arm Compute Library utilities (graph examples and benchmarks)
  • apt-get install armnn
    Install Arm Neural Network SDK (armNN)
  • apt-get install armnn-tensorflow-lite
    Install armNN TensorFlow Lite parser
  • apt-get install armnn-tensorflow-lite-examples
    Install armNN TensorFlow Lite examples
  • apt-get install armnn-tfl-benchmark
    Install armNN benchmark application for TensorFlow Lite models
  • apt-get install armnn-tfl-cv-apps-image-classification-c++
    Install C++ image classification example using armNN TensorFlow Lite parser
  • apt-get install armnn-tfl-cv-apps-object-detection-c++
    Install C++ object detection example using armNN TensorFlow Lite parser
  • apt-get install armnn-tools
    Install armNN utilities such as unitary tests
  • apt-get install libedgetpu1
    Install Edge TPU libraries and the USB rules
  • apt-get install python3-tensorflow-lite
    Install Python TensorFlow Lite inference engine
  • apt-get install python3-tensorflow-lite-edgetpu
    Install Python TensorFlow Lite inference engine for Edge TPU
  • apt-get install tensorflow-lite-tools
    Install TensorFlow Lite utilities
  • apt-get install tflite-cv-apps-edgetpu-image-classification-c++
    Install C++ image classification example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-image-classification-python
    Install Python image classification example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-object-detection-c++
    Install C++ object detection example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-object-detection-python
    Install Python object detection example using Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-image-classification-c++
    Install C++ image classification example using TensorFlow Lite
  • apt-get install tflite-cv-apps-image-classification-python
    Install Python image classification example using TensorFlow Lite
  • apt-get install tflite-cv-apps-object-detection-c++
    Install C++ object detection example using TensorFlow Lite
  • apt-get install tflite-cv-apps-object-detection-python
    Install Python object detection example using TensorFlow Lite
  • apt-get install tflite-edgetpu-benchmark
    Install benchmark application for Edge TPU models
  • apt-get install tflite-models-coco-ssd-mobilenetv1
    Install TensorFlow Lite COCO SSD Mobilenetv1 model
  • apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu
    Install TensorFlow Lite COCO SSD Mobilenetv1 model for Edge TPU
  • apt-get install tflite-models-mobilenetv1
    Install TensorFlow Lite Mobilenetv1 model
  • apt-get install tflite-models-mobilenetv1-edgetpu
    Install TensorFlow Lite Mobilenetv1 model for Edge TPU
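Whichever individual packages you install, dpkg can show where a given package puts its files; a quick check using one of the packages listed above as an example:

 dpkg -L tflite-cv-apps-image-classification-python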
Information
If you need more information about how to use apt-* utilities, check the Package repository for OpenSTLinux distribution article.

3. Re-generate X-LINUX-AI OpenSTLinux distribution

With the following procedure, you can re-generate the complete distribution with the X-LINUX-AI Expansion Package enabled.
This procedure is mandatory if you want to update the frameworks yourself, or if you want to modify the application samples.

3.1. Download the STM32MP1 Distribution Package

For ecosystem release v2.1.0:
Install the STM32MP1 Distribution Package v2.1.0, but do not initialize the OpenEmbedded environment (do not source the envsetup.sh).

For ecosystem release v2.0.0:

Install the STM32MP1 Distribution Package v2.0.0, but do not initialize the OpenEmbedded environment (do not source the envsetup.sh).

For ecosystem release v1.2.0:

Install the STM32MP1 Distribution Package v1.2.0, but do not initialize the OpenEmbedded environment (do not source the envsetup.sh).

3.2. Install X-LINUX-AI environment for ST boards

  • Clone the meta-st-stm32mpu-ai git repository
Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.
For X-LINUX-AI v2.1.0:

 cd <Distribution Package installation directory>/layers/meta-st
 git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.1.0

For X-LINUX-AI v2.0.0:
 cd <Distribution Package installation directory>/layers/meta-st
 git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.0.0

  • Set up the build environment
 cd ../..
 DISTRO=openstlinux-weston MACHINE=stm32mp1 BSP_DEPENDENCY='layers/meta-st/meta-st-stm32mpu-ai' source layers/meta-st/scripts/envsetup.sh
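Once the environment script has been sourced, you can verify that the AI layer is part of the build configuration; bitbake-layers is a standard OpenEmbedded tool available after sourcing:

 bitbake-layers show-layers | grep meta-st-stm32mpu-ai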

3.3. Install X-LINUX-AI environment for Avenger96 board

  • Clone the meta-av96 and meta-st-stm32mpu-ai git repositories
Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.
For X-LINUX-AI v2.1.0:

 cd <Distribution Package installation directory>/layers
 git clone https://github.com/dh-electronics/meta-av96.git -b av96_v65
 cd <Distribution Package installation directory>/layers/meta-st
 git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.1.0

For X-LINUX-AI v2.0.0:
 cd <Distribution Package installation directory>/layers
 git clone https://github.com/dh-electronics/meta-av96.git -b av96_v62
 cd <Distribution Package installation directory>/layers/meta-st
 git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.0.0

  • Set up the build environment
 cd ../..
 META_LAYER_ROOT=layers DISTRO=openstlinux-weston MACHINE=stm32mp1-av96 BSP_DEPENDENCY='layers/meta-st/meta-st-stm32mp-addons layers/meta-st/meta-st-stm32mpu-ai' source layers/meta-st/scripts/envsetup.sh

3.4. Build the image

 bitbake st-image-ai
Information
Note that building the image can take a long time, depending on the host computer performance.
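If you only modify an application sample or an AI framework recipe, you do not have to rebuild everything from scratch: you can rebuild the affected recipe and then regenerate the image. A sketch, assuming the recipe shares the name of the corresponding package listed earlier (actual recipe names may differ):

 bitbake python3-tensorflow-lite -c cleansstate
 bitbake python3-tensorflow-lite
 bitbake st-image-ai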

3.5. Flash the built image

Follow this link to see how to flash the built image.

4. How to use the X-LINUX-AI Expansion Package

4.1. Material needed

To use the X-LINUX-AI OpenSTLinux Expansion Package, choose one of the following hardware configurations:

  • STM32MP157C-DK2[8] + a UVC USB webcam
  • STM32MP157C-EV1[9] with its built-in OV5640 parallel camera
  • STM32MP157A-EV1[10] with its built-in OV5640 parallel camera
  • STM32MP157 Avenger96 board[1] + a UVC USB webcam or the OV5640 CSI camera mezzanine board[11]

Optional:

  • Coral USB Edge TPU[3] accelerator

4.2. Boot the OpenSTLinux Starter Package

At the end of the boot sequence, the demo launcher application appears on the screen.

Demo launcher appearance when X-LINUX-AI is not installed

4.3. Install the X-LINUX-AI

Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.

After having configured the AI OpenSTLinux package repository, you can install the X-LINUX-AI components:

 apt-get install packagegroup-x-linux-ai

Then restart the demo launcher:

 systemctl restart weston@root
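To confirm that the demo launcher service restarted correctly, you can query its status (the service name is the one used in the command above):

 systemctl status weston@root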

4.4. Launch an AI application sample

Once the demo launcher has restarted, notice that it looks slightly different because new AI application samples have been installed.
The demo launcher has the following appearance, and you can navigate through the different screens by using the NEXT or BACK buttons.

Demo launcher appearance when X-LINUX-AI is installed

Screens 2, 3 and 4 contain AI application samples that are described in dedicated articles available on the X-LINUX-AI application samples zoo page.

4.5. Enjoy running your own NN models

The examples above are application samples that demonstrate how to easily execute neural network models on the STM32MP1.

You are free to update the C/C++ applications or Python scripts for your own purposes, using your own NN models.

Source code locations are provided in the application sample pages.
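As a starting point, the snippet below is a minimal Python inference sketch. It assumes that the python3-tensorflow-lite package exposes the tflite_runtime module and that my_model.tflite is a placeholder for your own model file; adapt the input preprocessing and output decoding to your network:

 # Minimal TensorFlow Lite inference sketch (tflite_runtime module assumed to be
 # provided by the python3-tensorflow-lite package; my_model.tflite is a placeholder)
 import numpy as np
 from tflite_runtime.interpreter import Interpreter

 interpreter = Interpreter(model_path="my_model.tflite")
 interpreter.allocate_tensors()

 input_details = interpreter.get_input_details()
 output_details = interpreter.get_output_details()

 # Dummy input matching the model's expected shape and type;
 # replace it with your real preprocessed data (for example, a camera frame)
 input_data = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
 interpreter.set_tensor(input_details[0]['index'], input_data)

 # Run the inference and read back the first output tensor
 interpreter.invoke()
 output_data = interpreter.get_tensor(output_details[0]['index'])
 print(output_data)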

5. References