How to install X-LINUX-AI v2.0.0 on OpenSTLinux v1.2.0


X-LINUX-AI v2.0.0 is officially delivered on top of OpenSTLinux v2.0.0, but it also supports OpenSTLinux v1.2.0.

This article explains how to use and rebuild X-LINUX-AI v2.0.0 on an OpenSTLinux v1.2.0 distribution. For more detailed information about the X-LINUX-AI Expansion Package, refer to the X-LINUX-AI OpenSTLinux Expansion Package article.

1. Validated hardware

Like any software expansion package, X-LINUX-AI is supported on all STM32MP1 Series devices. It has been validated on the following boards:

  • STM32MP157C-DK2[1]
  • STM32MP157C-EV1[2]
  • STM32MP157A-EV1[3]

2. Installing X-LINUX-AI from the OpenSTLinux AI package repository

Information: The STMicroelectronics package repository service is provided for evaluation purposes only. Its content can be updated at any time without notice. It is therefore not approved for use in production.

All the generated X-LINUX-AI packages are available from the OpenSTLinux AI package repository service hosted at the non-browsable URL http://extra.packages.openstlinux.st.com/AI.

This repository contains AI packages that can simply be installed using the apt-* utilities, which are the same as the ones used on a Debian system:

  • the main group contains the selection of AI packages whose installation is automatically tested by STMicroelectronics;
  • the updates group is reserved for future use, such as package revision updates.

You can install the packages individually or by package group, as illustrated below.
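For example, once the repository is configured (see section 2.2), either of the following styles works; the package names used here are detailed in the tables of section 2.3:

 # install a complete package group
 apt-get install packagegroup-x-linux-ai-tflite
 # or install a single package
 apt-get install python3-tensorflow-lite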

2.1. Prerequisites

  • The Starter Package must be flashed on your SD card: follow the STM32MP157x-DKx Starter Package procedure or the STM32MP157x-EV1 Starter Package procedure, depending on your board.
  • Your board must have an internet connection, either through the network cable or through a WiFi connection.
Information: If your internet access depends on a proxy server, define the http_proxy environment variable with the following command before running any apt-* commands:

 export http_proxy='http://<proxy url>:<proxy port>/'
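For example, with a hypothetical proxy host and port (replace these with the values for your own network):

 # hypothetical values for illustration only
 export http_proxy='http://proxy.example.com:8080/'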

2.2. Configuring the AI OpenSTLinux package repository

Once the board is booted, execute the following command from the console to configure the AI OpenSTLinux package repository:

 wget http://extra.packages.openstlinux.st.com/AI/1.2/pool/config/a/apt-openstlinux-ai/apt-openstlinux-ai_1.0_armhf.deb
 apt-get install ./apt-openstlinux-ai_1.0_armhf.deb
Information: This command may issue a warning message similar to the following, which you can safely ignore:

 N: Cannot drop privileges for downloading as file '/home/root/apt-openstlinux-ai_1.0_armhf.deb'
 could not be accessed by user '_apt'. - pkgAcquire::Run (13: Permission denied)


Then synchronize the AI OpenSTLinux package repository:

 apt-get update
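As an optional sanity check, you can verify that the AI repository is now visible to apt by searching for the X-LINUX-AI package groups described in the next section:

 apt-cache search packagegroup-x-linux-ai

If the repository is configured correctly, the X-LINUX-AI package groups appear in the output.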

2.3. Installing AI packages

Warning: The software package is provided AS IS, and by downloading it, you agree to be bound by the terms of the software license agreement (SLA). The detailed content licenses can be found here.

2.3.1. Installing all X-LINUX-AI packages

To install all the X-LINUX-AI packages (TensorFlow Lite, Edge TPU, armNN, application samples and tools), run:

 apt-get install packagegroup-x-linux-ai

2.3.2. Installing AI framework related packages

 apt-get install packagegroup-x-linux-ai-tflite
     Installs the X-LINUX-AI packages related to the TensorFlow Lite framework (including application samples)
 apt-get install packagegroup-x-linux-ai-tflite-edgetpu
     Installs the X-LINUX-AI packages related to the Edge TPU framework (including application samples)
 apt-get install packagegroup-x-linux-ai-armnn-tflite
     Installs the X-LINUX-AI packages related to the armNN framework (including application samples)

2.3.3. Installing individual packages

 apt-get install arm-compute-library
     Installs the Arm Compute Library (ACL)
 apt-get install arm-compute-library-tools
     Installs the Arm Compute Library utilities (graph examples and benchmarks)
 apt-get install armnn
     Installs the Arm Neural Network SDK (armNN)
 apt-get install armnn-tensorflow-lite
     Installs the armNN TensorFlow Lite parser
 apt-get install armnn-tensorflow-lite-examples
     Installs the armNN TensorFlow Lite examples
 apt-get install armnn-tfl-benchmark
     Installs the armNN benchmark application for TensorFlow Lite models
 apt-get install armnn-tfl-cv-apps-image-classification-c++
     Installs the C++ image classification example using the armNN TensorFlow Lite parser
 apt-get install armnn-tfl-cv-apps-object-detection-c++
     Installs the C++ object detection example using the armNN TensorFlow Lite parser
 apt-get install armnn-tools
     Installs the armNN utilities such as unit tests
 apt-get install libedgetpu1
     Installs the Edge TPU libraries and the USB rules
 apt-get install python3-tensorflow-lite
     Installs the Python TensorFlow Lite inference engine
 apt-get install python3-tensorflow-lite-edgetpu
     Installs the Python TensorFlow Lite inference engine for Edge TPU
 apt-get install tensorflow-lite-tools
     Installs the TensorFlow Lite utilities
 apt-get install tflite-cv-apps-edgetpu-image-classification-c++
     Installs the C++ image classification example using the Coral Edge TPU TensorFlow Lite API
 apt-get install tflite-cv-apps-edgetpu-image-classification-python
     Installs the Python image classification example using the Coral Edge TPU TensorFlow Lite API
 apt-get install tflite-cv-apps-edgetpu-object-detection-c++
     Installs the C++ object detection example using the Coral Edge TPU TensorFlow Lite API
 apt-get install tflite-cv-apps-edgetpu-object-detection-python
     Installs the Python object detection example using the Coral Edge TPU TensorFlow Lite API
 apt-get install tflite-cv-apps-image-classification-c++
     Installs the C++ image classification example using TensorFlow Lite
 apt-get install tflite-cv-apps-image-classification-python
     Installs the Python image classification example using TensorFlow Lite
 apt-get install tflite-cv-apps-object-detection-c++
     Installs the C++ object detection example using TensorFlow Lite
 apt-get install tflite-cv-apps-object-detection-python
     Installs the Python object detection example using TensorFlow Lite
 apt-get install tflite-edgetpu-benchmark
     Installs the benchmark application for Edge TPU models
 apt-get install tflite-models-coco-ssd-mobilenetv1
     Installs the TensorFlow Lite COCO SSD MobileNet V1 model
 apt-get install tflite-models-coco-ssd-mobilenetv1-edgetpu
     Installs the TensorFlow Lite COCO SSD MobileNet V1 model for Edge TPU
 apt-get install tflite-models-mobilenetv1
     Installs the TensorFlow Lite MobileNet V1 model
 apt-get install tflite-models-mobilenetv1-edgetpu
     Installs the TensorFlow Lite MobileNet V1 model for Edge TPU
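As an illustrative combination (a minimal sketch, using only package names from the table above), you can set up the Python TensorFlow Lite image classification sample together with its inference engine and a model in a single command:

 apt-get install python3-tensorflow-lite tflite-cv-apps-image-classification-python tflite-models-mobilenetv1

Note that apt-get also resolves dependencies automatically, so installing the application sample alone may already pull in the packages it requires.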
Information: For more information about how to use the apt-* utilities, check the Package repository for OpenSTLinux distribution article.

3. Re-generate the X-LINUX-AI OpenSTLinux distribution

The following procedure re-generates the complete distribution with the X-LINUX-AI expansion package enabled.
This procedure is mandatory if you want to update some of the frameworks yourself, or if you want to modify the application samples.

3.1. Download the STM32MP1 Distribution Package v1.2.0

Install the STM32MP1 Distribution Package v1.2.0, but do not initialize the OpenEmbedded environment yet (do not source envsetup.sh; this is done in the next step).

3.1.1. Clone the meta-st-stm32mpu-ai git repository

Warning: The software package is provided AS IS, and by downloading it, you agree to be bound by the terms of the software license agreement (SLA). The detailed content licenses can be found here.
 cd <Distribution Package installation directory>/layers/meta-st
 git clone https://github.com/STMicroelectronics/meta-st-stm32mpu-ai.git -b v2.0.0_thud

3.2. Set up the build environment

 cd ../..
 DISTRO=openstlinux-weston MACHINE=stm32mp1 source layers/meta-st/scripts/envsetup.sh

3.3. Add the new layers to the build system

 bitbake-layers add-layer ../layers/meta-st/meta-st-stm32mpu-ai
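You can optionally check that the layer has been taken into account by listing the layers known to the build system:

 bitbake-layers show-layers

The meta-st-stm32mpu-ai layer should appear in the resulting list.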

3.4. Build the image

 bitbake st-image-ai
Information: Note that building the image can take a long time, depending on the host computer performance.
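Once the build completes, the generated image is located in the deploy area of the build tree. The exact path depends on your build configuration, but with the default settings it is typically similar to the following, relative to the build directory:

 ls tmp-glibc/deploy/images/stm32mp1/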

3.5. Flash the built image

Follow this link to learn how to flash the built image.

4. References