1. unknown revision
1.1. Contents
- TensorFlow™ Lite[1] 2.11.0 with the XNNPACK delegate enabled
- Coral Edge TPU™[2] accelerator native support
- libedgetpu 2.0.0 (Grouper) aligned with TensorFlow™ Lite 2.11.0
- libcoral 2.0.0 (Grouper) aligned with TensorFlow™ Lite 2.11.0
- PyCoral 2.0.0 (Grouper) aligned with TensorFlow™ Lite 2.11.0
- ONNX Runtime™[3] 1.14.0 with the XNNPACK execution provider enabled
- OpenCV[4] 4.7.x
- Python™[5] 3.11.x (with the Pillow module enabled)
- Support for the Sony™ IMX335 5Mpx sensor using the DCMIPP and the internal ISP
- TensorFlow™ Lite application samples:
  - GPU Python human pose estimation using TensorFlow™ Lite, based on the MoveNet SinglePose Lightning quantized model
  - NPU Python semantic segmentation using TensorFlow™ Lite, based on the DeepLabV3 quantized model
  - NPU C++ / Python image classification using TensorFlow™ Lite, based on the MobileNet v3 quantized model
  - NPU C++ object detection using TensorFlow™ Lite, based on the COCO SSD MobileNet v1 quantized model
  - NPU Python object detection using TensorFlow™ Lite, based on the Tiny YOLOv4 quantized model
  - NPU C++ face recognition using TensorFlow™ Lite models, capable of recognizing the face of a known (enrolled) user (available on demand)
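As a rough illustration of how a Python sample typically drives TensorFlow™ Lite on the target, the sketch below loads a quantized model and runs one inference; the function names, thread count, and the dequantization helper are assumptions for illustration, not code from the package. In TensorFlow Lite 2.11 builds that ship XNNPACK, the delegate is applied to the CPU path automatically.

```python
def run_tflite_inference(model_path, input_data, num_threads=2):
    """Run one inference with TensorFlow Lite (hypothetical helper).

    XNNPACK-enabled TFLite 2.11 builds use the delegate on the CPU path
    without explicit configuration.
    """
    # Imported lazily so the helper can be defined on hosts without tflite_runtime.
    from tflite_runtime.interpreter import Interpreter

    interpreter = Interpreter(model_path=model_path, num_threads=num_threads)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], input_data)
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])


def dequantize(values, scale, zero_point):
    """Map quantized uint8 outputs back to real values: r = (q - z) * s."""
    return [(int(v) - zero_point) * scale for v in values]
```

For example, with scale 0.5 and zero point 128, `dequantize([130, 128], 0.5, 128)` yields `[1.0, 0.0]`.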
- Coral Edge TPU™ based application samples:
  - C++ / Python image classification using the Coral Edge TPU™, based on a MobileNet v1 quantized model compiled for the Coral Edge TPU™
  - C++ / Python object detection using the Coral Edge TPU™, based on a COCO SSD MobileNet v1 quantized model compiled for the Coral Edge TPU™
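A hedged sketch of how a Python classification sample is typically wired through PyCoral (the helper names and parameters below are illustrative assumptions; the image is expected to be already resized to the model input):

```python
def classify_on_edgetpu(model_path, image, top_k=3):
    """Classify one preprocessed image on the Coral Edge TPU (hypothetical helper)."""
    # Imported lazily so the helper can be defined on hosts without PyCoral.
    from pycoral.utils.edgetpu import make_interpreter
    from pycoral.adapters import common, classify

    interpreter = make_interpreter(model_path)  # binds the Edge TPU delegate
    interpreter.allocate_tensors()
    common.set_input(interpreter, image)
    interpreter.invoke()
    return classify.get_classes(interpreter, top_k=top_k)


def top_k_indices(scores, k):
    """Pure-Python helper: indices of the k highest scores, best first."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
```

`make_interpreter` fails if no Edge TPU device is attached, so the model must be one compiled for the Coral Edge TPU™, as in the samples above.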
- ONNX Runtime™ application samples:
  - Python image classification using ONNX Runtime™, based on the MobileNet v3 quantized model
  - C++ / Python object detection using ONNX Runtime™, based on the COCO SSD MobileNet v1 quantized model
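For the ONNX Runtime™ samples, an inference session can request the XNNPACK execution provider and fall back to the default CPU provider. The sketch below is an assumption about the general pattern, not code extracted from the samples:

```python
def run_onnx_inference(model_path, input_array):
    """Run one inference with ONNX Runtime (hypothetical helper),
    preferring XNNPACK and falling back to the default CPU provider."""
    # Imported lazily so the helper can be defined on hosts without onnxruntime.
    import onnxruntime as ort

    session = ort.InferenceSession(
        model_path,
        providers=["XnnpackExecutionProvider", "CPUExecutionProvider"],
    )
    input_name = session.get_inputs()[0].name
    return session.run(None, {input_name: input_array})


def softmax(logits):
    """Pure-Python softmax for turning classifier logits into probabilities."""
    import math
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

Providers are tried in list order, so the session still runs on platforms where the XNNPACK execution provider was not built in.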
- OpenVX™ based application samples:
  - NPU C++ image classification using a Network Binary Graph (NBG) based on the MobileNet v3 quantized model
- Tools:
  - STM32AI MPU offline tool dedicated to deploying neural networks on the STM32MP25
  - NBG benchmark application
1.2. Validated hardware
X-LINUX-AI STM32MP25 beta is supported on all STM32MP25 series devices and has been validated on the following boards: