eIQ® ML Software Development Environment

The NXP® eIQ® machine learning (ML) software development environment enables the use of ML algorithms on NXP EdgeVerse microcontrollers and microprocessors, including i.MX RT crossover MCUs and i.MX family application processors. eIQ ML software includes an ML workflow tool called eIQ Toolkit, along with inference engines, neural network compilers and optimized libraries. This software leverages open-source and proprietary technologies and is fully integrated into our MCUXpresso SDK and Yocto development environments, allowing you to develop complete system-level applications with ease.

eIQ Software Ecosystem

[Image: eIQ ML software development environment]

DeepViewRT

A platform-optimized runtime inference engine that enables compact code size for resource-constrained devices.

eIQ-Enabled Devices:
i.MX RT1050, i.MX RT1060, i.MX RT1064, i.MX RT1160, i.MX RT1170, i.MX 8M Plus, i.MX 8M, i.MX 8M Nano, i.MX 8M Nano UL, i.MX 8M Mini.

TensorFlow Lite

Faster and smaller than TensorFlow, TensorFlow Lite enables inference at the edge with lower latency and a smaller binary size, using open-source libraries.

eIQ-Enabled Devices:
i.MX 8M Plus, i.MX 8M, i.MX 8M Nano, i.MX 8M Nano UL, i.MX 8M Mini.
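For illustration, here is a minimal sketch of loading and running a .tflite model with the standard TensorFlow Lite C++ interpreter API, as you might on an i.MX application processor. The model path and tensor types are placeholders, and delegate configuration for NPU/GPU offload is omitted.

```cpp
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load a flatbuffer model from disk (the path is a placeholder).
  std::unique_ptr<tflite::FlatBufferModel> model =
      tflite::FlatBufferModel::BuildFromFile("model.tflite");

  // Build an interpreter with the built-in operator set.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Allocate tensor buffers, fill the first input, run inference.
  interpreter->AllocateTensors();
  float* input = interpreter->typed_input_tensor<float>(0);
  // ... copy preprocessed input data into `input` here ...
  interpreter->Invoke();

  // Read back the first output tensor.
  const float* output = interpreter->typed_output_tensor<float>(0);
  (void)output;
  return 0;
}
```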

TensorFlow Lite for Microcontrollers (TF Micro)

Faster and smaller than TensorFlow, TF Micro is optimized for running machine learning models on resource-constrained devices.

eIQ-Enabled Devices:
i.MX RT500, i.MX RT600, i.MX RT1050, i.MX RT1060, i.MX RT1064, i.MX RT1160, i.MX RT1170, i.MX 8M Plus.
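As a sketch of how TF Micro is typically used on an MCU such as an i.MX RT device: the model is compiled into the firmware image as a C array, only the operators the model needs are registered, and all tensors live in a statically allocated arena. The model symbol, operator list and arena size below are placeholders, and the MicroInterpreter constructor arguments can vary slightly between TFLM releases.

```cpp
#include <cstddef>
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Model flatbuffer compiled into the image (e.g. generated with xxd);
// the symbol name is a placeholder.
extern const unsigned char g_model_data[];

// Statically allocated working memory for all tensors (size is model-dependent).
constexpr size_t kArenaSize = 20 * 1024;
alignas(16) static uint8_t tensor_arena[kArenaSize];

int RunOnce(const float* features, int num_features, float* result) {
  const tflite::Model* model = tflite::GetModel(g_model_data);

  // Register only the operators the model actually uses to keep code size small.
  static tflite::MicroMutableOpResolver<2> resolver;
  resolver.AddFullyConnected();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena, kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) return -1;

  // Copy input features, run inference, read the first output value.
  TfLiteTensor* input = interpreter.input(0);
  for (int i = 0; i < num_features; ++i) input->data.f[i] = features[i];
  if (interpreter.Invoke() != kTfLiteOk) return -1;
  *result = interpreter.output(0)->data.f[0];
  return 0;
}
```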

Glow Neural Network Compiler

A machine learning compiler that enables ahead-of-time compilation by converting neural networks into object files, which are then linked into binary images for increased performance and a smaller memory footprint.

eIQ-Enabled Devices:
i.MX RT600, i.MX RT1050, i.MX RT1060, i.MX RT1064, i.MX RT1160, i.MX RT1170.
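To make the ahead-of-time workflow concrete, here is a sketch of calling a compiled bundle from application code. Glow's model compiler emits an object file plus a header that declares memory-size constants, input/output offsets and an entry function, all named after the model, so every identifier here (model.h, the MODEL_* constants, model(), the offsets) is hypothetical and will differ for a real model.

```cpp
#include <cstddef>
#include <cstdint>

// Header emitted by the model compiler for a bundle named "model"
// (all names below are placeholders derived from the model name).
#include "model.h"

// Weight blob produced by the compiler, linked or loaded into memory.
extern const uint8_t model_weights[];

// Statically allocated memory regions sized by the generated constants;
// aligned generously, since compiled bundles expect aligned buffers.
alignas(64) static uint8_t mutable_mem[MODEL_MUTABLE_MEM_SIZE];
alignas(64) static uint8_t activations_mem[MODEL_ACTIVATIONS_MEM_SIZE];

int RunModel(const float* input, size_t input_len, float* output, size_t output_len) {
  // Place the input tensor at its offset inside the mutable memory region.
  float* in = reinterpret_cast<float*>(mutable_mem + MODEL_input_offset);
  for (size_t i = 0; i < input_len; ++i) in[i] = input[i];

  // Generated entry point: runs the whole compiled graph.
  int status = model(const_cast<uint8_t*>(model_weights), mutable_mem, activations_mem);
  if (status != 0) return status;

  // Read the output tensor from its offset in mutable memory.
  const float* out = reinterpret_cast<const float*>(mutable_mem + MODEL_output_offset);
  for (size_t i = 0; i < output_len; ++i) output[i] = out[i];
  return 0;
}
```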

ONNX Runtime

A cross-platform accelerator for ML inference and training.

eIQ-Enabled Devices:
i.MX 8M Plus, i.MX 8M, i.MX 8M Nano, i.MX 8M Nano UL, i.MX 8M Mini.
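As a minimal sketch of running a model through the ONNX Runtime C++ API: the model path, tensor names and shape below are placeholders, and configuration of hardware-specific execution providers is omitted.

```cpp
#include <vector>

#include <onnxruntime_cxx_api.h>

int main() {
  // Create the runtime environment and load the model (path is a placeholder).
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "eiq-example");
  Ort::SessionOptions options;
  Ort::Session session(env, "model.onnx", options);

  // Build a dummy input tensor; the shape is a model-specific placeholder.
  std::vector<int64_t> shape{1, 3, 224, 224};
  std::vector<float> input_data(1 * 3 * 224 * 224, 0.0f);
  Ort::MemoryInfo mem_info =
      Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value input = Ort::Value::CreateTensor<float>(
      mem_info, input_data.data(), input_data.size(), shape.data(), shape.size());

  // Input/output names depend on the model; these are placeholders.
  const char* input_names[] = {"input"};
  const char* output_names[] = {"output"};

  // Run inference and read the first output tensor.
  auto outputs = session.Run(Ort::RunOptions{nullptr}, input_names, &input, 1,
                             output_names, 1);
  const float* out = outputs[0].GetTensorData<float>();
  (void)out;
  return 0;
}
```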