How to install X-LINUX-AI v2.0.0 on Avenger96 board
This article explains how to use and rebuild X-LINUX-AI v2.0.0 on an OpenSTLinux v1.2.0[1] distribution using the Avenger96 board[2].
For more detailed information about the X-LINUX-AI Expansion Package, refer to X-LINUX-AI OpenSTLinux Expansion Package.
OpenSTLinux v2.0.0[3] is not yet supported on the Avenger96 board.
1 Validated hardware
Like any other software expansion package, X-LINUX-AI is supported on all STM32MP1 Series devices. It has been validated on the following board:
- Avenger96 board
Optional:
- Coral USB Edge TPU[5] accelerator
2 Installing X-LINUX-AI from the OpenSTLinux AI package repository
The STMicroelectronics package repository service is provided for evaluation purposes only. Its content can be updated at any time without notice. It is therefore not approved for use in production.
All the generated X-LINUX-AI packages are available from the OpenSTLinux AI package repository service hosted at the non-browsable URL http://extra.packages.openstlinux.st.com/AI.
This repository contains AI packages that can be installed with the apt-* utilities, the same utilities used on a Debian system. The packages are organized in two groups:
- the main group contains the AI packages whose installation is automatically tested by STMicroelectronics
- the updates group is reserved for future use, such as package revision updates
Packages can be installed individually or by group.
2.1 Prerequisites
- The Avenger96 board starter image supporting OpenSTLinux v1.2.0 must be flashed onto your SD card.
- Your board must have an internet connection, either through an Ethernet cable or through a WiFi connection (a quick connectivity check is shown below).
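As an optional sanity check before configuring the repository, you can verify that the board actually reaches the package server. This is a convenience step, not part of the official procedure; the ping utility is assumed to be present in the starter image, and any reachable host works equally well:
Board $> ping -c 3 extra.packages.openstlinux.st.com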
2.2 Configuring the AI OpenSTLinux package repository
Once the board has booted, execute the following commands from the board console to configure the AI OpenSTLinux package repository:
Board $> wget http://extra.packages.openstlinux.st.com/AI/1.2/pool/config/a/apt-openstlinux-ai/apt-openstlinux-ai_1.0_armhf.deb
Board $> apt-get install ./apt-openstlinux-ai_1.0_armhf.deb
Then synchronize the package database with the AI OpenSTLinux package repository:
Board $> apt-get update
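As an optional check (not part of the official procedure), you can confirm that the AI packages are now visible to apt; apt-cache is a standard apt utility, and the exact output depends on the repository content:
Board $> apt-cache search packagegroup-x-linux-ai
The command should list the X-LINUX-AI package groups described in the next sections.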
2.3 Installing AI packages
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.
2.3.1 Installing all X-LINUX-AI packages
| Command | Description |
|---|---|
| apt-get install packagegroup-x-linux-ai | Installs all the X-LINUX-AI packages (TensorFlow Lite, Edge TPU, armNN, application samples and tools) |
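Once the installation completes, you can confirm that the package group is installed using standard apt tooling (a quick check shown here for convenience; the 2>/dev/null redirection simply hides apt's CLI warning):
Board $> apt list --installed 2>/dev/null | grep packagegroup-x-linux-ai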
2.3.2 Installing X-LINUX-AI packages per framework
| Command | Description |
|---|---|
| apt-get install packagegroup-x-linux-ai-tflite | Installs the X-LINUX-AI packages related to the TensorFlow Lite framework (including application samples) |
| apt-get install packagegroup-x-linux-ai-tflite-edgetpu | Installs the X-LINUX-AI packages related to the Edge TPU framework (including application samples) |
| apt-get install packagegroup-x-linux-ai-armnn-tflite | Installs the X-LINUX-AI packages related to the armNN framework (including application samples) |
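For example, if you plan to use the optional Coral USB Edge TPU accelerator alongside the CPU TensorFlow Lite runtime, one possible combination (an illustration, not a mandatory selection) is to install both related groups in a single command:
Board $> apt-get install packagegroup-x-linux-ai-tflite packagegroup-x-linux-ai-tflite-edgetpu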
2.3.3 Installing individual packages
| Command | Description |
|---|---|
| apt-get install arm-compute-library | Installs the Arm Compute Library (ACL) |
| apt-get install arm-compute-library-tools | Installs the Arm Compute Library utilities (graph examples and benchmarks) |
| apt-get install armnn | Installs the Arm Neural Network SDK (armNN) |
| apt-get install armnn-tensorflow-lite | Installs the armNN TensorFlow Lite parser |
| apt-get install armnn-tensorflow-lite-examples | Installs the armNN TensorFlow Lite examples |
| apt-get install armnn-tfl-benchmark | Installs the armNN benchmark application for TensorFlow Lite models |
| apt-get install armnn-tfl-cv-apps-image-classification-c++ | Installs a C++ image classification example using the armNN TensorFlow Lite parser |
| apt-get install armnn-tfl-cv-apps-object-detection-c++ | Installs a C++ object detection example using the armNN TensorFlow Lite parser |
| apt-get install armnn-tools | Installs the armNN utilities, such as unitary tests |
| apt-get install libedgetpu1 | Installs the Edge TPU libraries and the USB rules |
| apt-get install python3-tensorflow-lite | Installs the Python TensorFlow Lite inference engine |
| apt-get install python3-tensorflow-lite-edgetpu | Installs the Python TensorFlow Lite inference engine for Edge TPU |
| apt-get install tensorflow-lite-tools | Installs the TensorFlow Lite utilities |
| apt-get install tflite-cv-apps-edgetpu-image-classification-c++ | Installs a C++ image classification example using the Coral Edge TPU TensorFlow Lite API |
| apt-get install tflite-cv-apps-edgetpu-image-classification-python | Installs a Python image classification example using the Coral Edge TPU TensorFlow Lite API |
| apt-get install tflite-cv-apps-edgetpu-object-detection-c++ | Installs a C++ object detection example using the Coral Edge TPU TensorFlow Lite API |
| apt-get install tflite-cv-apps-edgetpu-object-detection-python | Installs a Python object detection example using the Coral Edge TPU TensorFlow Lite API |
| apt-get install tflite-cv-apps-image-classification-c++ | Installs a C++ image classification example using TensorFlow Lite |
| apt-get install tflite-cv-apps-image-classification-python | Installs a Python image classification example using TensorFlow Lite |
| apt-get install tflite-cv-apps-object-detection-c++ | Installs a C++ object detection example using TensorFlow Lite |
| apt-get install tflite-cv-apps-object-detection-python | Installs a Python object detection example using TensorFlow Lite |
| apt-get install tflite-edgetpu-benchmark | Installs a benchmark application for Edge TPU models |
| apt-get install tflite-models-coco-ssd-mobilenetv1 | Installs the TensorFlow Lite COCO SSD MobileNet v1 model |
| apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu | Installs the TensorFlow Lite COCO SSD MobileNet v1 model for Edge TPU |
| apt-get install tflite-models-mobilenetv1 | Installs the TensorFlow Lite MobileNet v1 model |
| apt-get install tflite-models-mobilenetv1-edgetpu | Installs the TensorFlow Lite MobileNet v1 model for Edge TPU |
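As an illustration, a minimal selection for running the Python TensorFlow Lite image classification sample could combine the inference engine, the sample application and a model, all taken from the table above (one possible combination among others):
Board $> apt-get install python3-tensorflow-lite tflite-cv-apps-image-classification-python tflite-models-mobilenetv1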
For more information about how to use the apt-* utilities, check the Package repository for OpenSTLinux Distribution article.
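Since the updates group is reserved for future package revision updates, the installed packages can later be refreshed with the standard Debian-style workflow (standard apt commands, shown here for convenience):
Board $> apt-get update
Board $> apt-get upgrade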
3 Re-generating the X-LINUX-AI OpenSTLinux distribution
Use the following procedure to regenerate the complete distribution with the X-LINUX-AI expansion package enabled.
This procedure is mandatory if you need to update the frameworks yourself or to modify the application samples.
The following sections detail each step:
3.1 Downloading the STM32MP1 Distribution Package v1.2.0
Install the STM32MP1 Distribution Package v1.2.0, but do not initialize the OpenEmbedded environment (do not source the envsetup.sh script yet).
3.1.1 Cloning the meta-av96 and meta-st-stm32mpu-ai git repositories
The software package is provided AS IS, and by downloading it, you agree to be bound to the terms of the software license agreement (SLA). The detailed content licenses can be found here.
PC $> cd <Distribution Package installation directory>/layers
PC $> git clone https://github.com/dh-electronics/meta-av96.git -b av96_v50
PC $> cd <Distribution Package installation directory>/layers/meta-st
PC $> git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.0.0_thud
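To confirm that the expected branches are checked out, you can use standard git commands (an optional check; git branch --show-current requires git 2.22 or newer, otherwise git status gives the same information):
PC $> git -C <Distribution Package installation directory>/layers/meta-av96 branch --show-current
PC $> git -C <Distribution Package installation directory>/layers/meta-st/meta-st-stm32mpu-ai branch --show-current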
3.2 Setting up the build environment
PC $> cd ../..
PC $> META_LAYER_ROOT=layers DISTRO=openstlinux-weston MACHINE=stm32mp1-av96 source layers/meta-st/scripts/envsetup.sh
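Sourcing the script should leave you in the newly created build directory. Assuming the environment script records the machine and distro selection in conf/local.conf (an assumption that may not hold on every setup), you can double-check them with:
PC $> grep -E "^(MACHINE|DISTRO)" conf/local.conf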
3.3 Adding the new layers to the build system (in that order)
PC $> bitbake-layers add-layer ../layers/meta-av96
PC $> bitbake-layers add-layer ../layers/meta-st/meta-st-stm32mpu-ai
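You can verify that both layers are now registered in the build configuration (an optional check; bitbake-layers show-layers is a standard BitBake tool):
PC $> bitbake-layers show-layers
The output should list meta-av96 and meta-st-stm32mpu-ai among the configured layers.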
3.4 Building the image
PC $> bitbake st-image-ai
Building the image can take a long time, depending on the host computer performance.
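Once the build completes, the generated image is available in the deploy directory of the build tree. The path below assumes the default tmp-glibc temporary directory; adjust it to your own build configuration if it differs:
PC $> ls tmp-glibc/deploy/images/stm32mp1-av96/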
3.5 Flashing the built image
Follow this link to see how to flash the built image.