How to install X-LINUX-AI v2.0.0 on Avenger96 board

This article explains how to use and rebuild X-LINUX-AI v2.0.0 on an OpenSTLinux v1.2.0[1] distribution using the Avenger96 board[2].

For more detailed information about the X-LINUX-AI Expansion Package, refer to X-LINUX-AI OpenSTLinux Expansion Package.

Information
OpenSTLinux v2.0.0[3] is not yet supported on the Avenger96 board.

1 Validated hardware

As with any software expansion package, X-LINUX-AI is supported on all STM32MP1 Series devices. It has been validated on the following boards:

  • STM32MP157 Avenger96 board[2] + a UVC USB webcam or the OV5640 CSI Camera mezzanine board[4]

Optional:

  • Coral USB Edge TPU[5] accelerator

2 Installing X-LINUX-AI from the OpenSTLinux AI package repository

Information
STMicroelectronics package repository service is provided for evaluation purposes only. Its content can be updated at any time without notice. It is therefore not approved for use in production.

All the generated X-LINUX-AI packages are available from the OpenSTLinux AI package repository service hosted at the non-browsable URL http://extra.packages.openstlinux.st.com/AI.

This repository contains AI packages that can be simply installed using apt-* utilities. These utilities are the same as those used on a Debian system:

  • the main group contains the selection of AI packages whose installation is automatically tested by STMicroelectronics
  • the updates group is reserved for future use, such as package revision updates

You can install these packages individually or by package group.
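
For example, once the repository is configured (see the following sections), a single package can be installed on its own, or pulled in together with related packages through a package group. The two commands below use package names that are described later in this article:

 apt-get install python3-tensorflow-lite
 apt-get install packagegroup-x-linux-ai-tflite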

2.1 Prerequisites

  • The Avenger96 board starter image supporting OpenSTLinux v1.2.0 must be flashed onto your SD card:
    OpenSTLinux-4.19-thud v5.0.0 Starter Image
  • Your board must have an internet connection, either through a network cable or through a Wi-Fi connection.
Information

If your internet access depends on a proxy server, define the http_proxy environment variable with the following command before running any apt-* command:

 export http_proxy='http://<proxy url>:<proxy port>/'
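
For instance, with a hypothetical proxy reachable at proxy.example.com on port 8080 (replace both values with your actual proxy settings), the variable would be set as follows:

 export http_proxy='http://proxy.example.com:8080/'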

2.2 Configuring the AI OpenSTLinux package repository

Once the board is booted, execute the following command from the console to configure the AI OpenSTLinux package repository:

 wget http://extra.packages.openstlinux.st.com/AI/1.2/pool/config/a/apt-openstlinux-ai/apt-openstlinux-ai_1.0_armhf.deb
 apt-get install ./apt-openstlinux-ai_1.0_armhf.deb
Information
This command may issue a warning message similar to the following:
N: Can't drop privileges for downloading as file '/home/root/apt-openstlinux-ai_1.0_armhf.deb'
couldn't be accessed by user '_apt'. - pkgAcquire::Run (13: Permission denied)

You can safely ignore it.


Then synchronize the AI OpenSTLinux package repository:

 apt-get update
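
Optionally, you can then list the AI package groups made available by the repository; this is only a quick sanity check, not part of the official procedure:

 apt-cache search packagegroup-x-linux-ai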

2.3 Installing AI packages

Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA0048). The detailed content licenses can be found here.

2.3.1 Installing all X-LINUX-AI packages

  • apt-get install packagegroup-x-linux-ai
    This command installs all the X-LINUX-AI packages (TensorFlow Lite, Edge TPU, armNN, application samples and tools)
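
Once the package group is installed, you can optionally check which AI-related packages ended up on the board; this is an informal check using standard dpkg tooling, not part of the official procedure:

 dpkg -l | grep -E 'tensorflow|armnn|edgetpu'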

2.3.2 Installing AI framework related packages

  • apt-get install packagegroup-x-linux-ai-tflite
    This command installs X-LINUX-AI packages related to the TensorFlow Lite framework (including application samples)
  • apt-get install packagegroup-x-linux-ai-tflite-edgetpu
    This command installs X-LINUX-AI packages related to the Edge TPU framework (including application samples)
  • apt-get install packagegroup-x-linux-ai-armnn-tflite
    This command installs X-LINUX-AI packages related to the armNN framework (including application samples)
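
If needed, several package groups can be installed with a single command; for example, the TensorFlow Lite and armNN groups can be combined as shown below (any combination of the groups above works the same way):

 apt-get install packagegroup-x-linux-ai-tflite packagegroup-x-linux-ai-armnn-tflite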

2.3.3 Installing individual packages

  • apt-get install arm-compute-library
    This command installs the Arm Compute Library (ACL)
  • apt-get install arm-compute-library-tools
    This command installs the Arm Compute Library utilities (graph examples and benchmarks)
  • apt-get install armnn
    This command installs the Arm Neural Network SDK (armNN)
  • apt-get install armnn-tensorflow-lite
    This command installs the armNN TensorFlow Lite parser
  • apt-get install armnn-tensorflow-lite-examples
    This command installs armNN TensorFlow Lite examples
  • apt-get install armnn-tfl-benchmark
    This command installs the armNN benchmark application for TensorFlow Lite models
  • apt-get install armnn-tfl-cv-apps-image-classification-c++
    This command installs a C++ image classification example using the armNN TensorFlow Lite parser
  • apt-get install armnn-tfl-cv-apps-object-detection-c++
    This command installs a C++ object detection example using the armNN TensorFlow Lite parser
  • apt-get install armnn-tools
    This command installs armNN utilities such as unitary tests
  • apt-get install libedgetpu1
    This command installs Edge TPU libraries and the USB rules
  • apt-get install python3-tensorflow-lite
    This command installs the Python TensorFlow Lite inference engine
  • apt-get install python3-tensorflow-lite-edgetpu
    This command installs the Python TensorFlow Lite inference engine for Edge TPU
  • apt-get install tensorflow-lite-tools
    This command installs TensorFlow Lite utilities
  • apt-get install tflite-cv-apps-edgetpu-image-classification-c++
    This command installs a C++ image classification example using the Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-image-classification-python
    This command installs a Python image classification example using the Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-object-detection-c++
    This command installs a C++ object detection example using the Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-edgetpu-object-detection-python
    This command installs a Python object detection example using the Coral Edge TPU TensorFlow Lite API
  • apt-get install tflite-cv-apps-image-classification-c++
    This command installs a C++ image classification example using TensorFlow Lite
  • apt-get install tflite-cv-apps-image-classification-python
    This command installs a Python image classification example using TensorFlow Lite
  • apt-get install tflite-cv-apps-object-detection-c++
    This command installs a C++ object detection example using TensorFlow Lite
  • apt-get install tflite-cv-apps-object-detection-python
    This command installs a Python object detection example using TensorFlow Lite
  • apt-get install tflite-edgetpu-benchmark
    This command installs a benchmark application for Edge TPU models
  • apt-get install tflite-models-coco-ssd-mobilenetv1
    This command installs the TensorFlow Lite COCO SSD Mobilenetv1 model
  • apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu
    This command installs the TensorFlow Lite COCO SSD Mobilenetv1 model for Edge TPU
  • apt-get install tflite-models-mobilenetv1
    This command installs the TensorFlow Lite Mobilenetv1 model
  • apt-get install tflite-models-mobilenetv1-edgetpu
    This command installs the TensorFlow Lite Mobilenetv1 model for Edge TPU
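
As an illustration, a minimal Python TensorFlow Lite image classification setup can be assembled from three of the packages above (one inference engine, one model and one application sample):

 apt-get install python3-tensorflow-lite tflite-models-mobilenetv1 tflite-cv-apps-image-classification-python
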
Information
For more information about how to use apt-* utilities, check the Package repository for OpenSTLinux Distribution article.

3 Re-generating X-LINUX-AI OpenSTLinux Distribution

Use the following procedure to regenerate the complete distribution with the X-LINUX-AI Expansion Package enabled.
This procedure is mandatory if you need to update the frameworks yourself or to modify the application samples.
The procedure is detailed in the following sections:

3.1 Downloading the STM32MP1 Distribution Package v1.2.0

Install the STM32MP1 Distribution Package v1.2.0, but do not initialize the OpenEmbedded build environment yet (do not source the envsetup.sh script at this stage).

3.1.1 Cloning the meta-av96 and meta-st-stm32mpu-ai git repositories

Warning
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA0048). The detailed content licenses can be found here.
 cd <Distribution Package installation directory>/layers
 git clone https://github.com/dh-electronics/meta-av96.git -b av96_v50
 cd <Distribution Package installation directory>/layers/meta-st
 git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.0.0_thud
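
At this point, both layers should be present on the disk; a quick listing (purely a sanity check) confirms that the clones succeeded:

 ls <Distribution Package installation directory>/layers/meta-av96
 ls <Distribution Package installation directory>/layers/meta-st/meta-st-stm32mpu-ai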

3.2 Setting up the build environment

 cd ../..
 META_LAYER_ROOT=layers DISTRO=openstlinux-weston MACHINE=stm32mp1-av96 source layers/meta-st/scripts/envsetup.sh

3.3 Adding the new layers to the build system (in that order)

 bitbake-layers add-layer ../layers/meta-av96
 bitbake-layers add-layer ../layers/meta-st/meta-st-stm32mpu-ai
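
You can optionally verify that both layers are now part of the build configuration with the standard bitbake-layers command (informational only):

 bitbake-layers show-layers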

3.4 Building the image

 bitbake st-image-ai
Information
Building the image can take a long time, depending on the host computer performance.
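
Once the build completes, the generated image and flash layout files are typically located in the deploy directory of the build folder; the exact path depends on your build configuration, tmp-glibc being the usual OpenSTLinux default:

 ls tmp-glibc/deploy/images/stm32mp1-av96/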

3.5 Flashing the built image

Follow this link to see how to flash the built image.

4 References