{| class="st-table" style="text-align:left; width:100%; border-style: hidden; margin:auto;"
|- style="height:100px; vertical-align:middle;"
|style="width:25%; border-width: 1px; border-style: solid; border-color: #f3f6f4" | [[File:X-LINUX-AI_Machine_code.png|link=|185px|center]]
|style="width:75%; border-width: 1px; border-style: solid; border-color: #f3f6f4" |
* [[How to measure the performance of NBG-based models|How to measure the performance of NBG-based models]]
|- style="height:100px; vertical-align:middle;"
|style="width:25%; border-width: 1px; border-style: solid; border-color: #f3f6f4" | [[File:X-LINUX-AI_TFLite.png|link=|200px|center]]
|style="width:75%; border-width: 1px; border-style: solid; border-color: #f3f6f4" |
* [[How to measure performance of your NN models using TensorFlow Lite runtime|How to measure performance of your NN models using TensorFlow Lite runtime]]
|- style="height:100px; vertical-align:middle;"
|style="width:25%; border-width: 1px; border-style: solid; border-color: #f3f6f4" | [[File:X-LINUX-AI_Onnx.png|link=|180px|center]]
|style="width:75%; border-width: 1px; border-style: solid; border-color: #f3f6f4" |
* [[How to measure the performance of your models using ONNX Runtime|How to measure performance of your models using ONNX Runtime]]
* [[How to convert a Tensorflow Lite model to ONNX using tf2onnx|How to convert a Tensorflow™ Lite model to ONNX using tf2onnx]]
|- style="height:100px; vertical-align:middle;"
|style="width:25%; border-width: 1px; border-style: solid; border-color: #f3f6f4" | [[File:X-LINUX-AI_coral.png|link=|140px|center]]
|style="width:75%; border-width: 1px; border-style: solid; border-color: #f3f6f4" |
* [[How to compile model and run inference on Coral Edge TPU|How to compile model and run inference on Coral Edge TPU]]
* [[How to build an example using libcoral API]]
* [[How to reproduce an example using PyCoral API]]
|}
Latest revision as of 09:22, 3 July 2024