eIQ® Inference with TensorFlow Lite


  • Delivered as middleware in NXP Yocto BSP releases
  • NXP eIQ software support available for i.MX applications processors
  • Runs inference on Arm® Cortex®-M and Cortex-A cores, VeriSilicon GPUs, and NPUs
  • Faster and smaller than TensorFlow, enabling inference at the edge with lower latency and a smaller binary size
  • Uses the open-source Eigen and GEMMLOWP libraries to accelerate matrix and vector arithmetic
  • NXP's optimized GEMMLOWP implementation uses Arm® Cortex®-M7 SIMD instructions, yielding a 2-3x performance increase over the out-of-the-box implementation
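As an illustration of the inference workflow the features above describe, the following is a minimal sketch using the TensorFlow Lite Python interpreter. The toy Dense model, the (1, 8) input shape, and the in-memory conversion are placeholder assumptions for a self-contained example; on an i.MX board you would instead load a .tflite file exported from your own training pipeline.

```python
import numpy as np
import tensorflow as tf

# Build a trivial Keras model and convert it to a TFLite flatbuffer in
# memory so the example is self-contained. (Placeholder model: on a real
# target you would load a pre-converted .tflite file from disk.)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Create the interpreter and allocate input/output tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed a dummy input matching the model's expected shape and dtype,
# run inference, and read back the result.
x = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(out["index"])
print(y.shape)  # (1, 4)
```

On device, the same interpreter API is used; acceleration on GPU/NPU targets is selected through the delegate mechanism rather than through code changes to this loop.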

Supported Devices

  • i.MX8: i.MX 8 Family – Arm® Cortex®-A53, Cortex-A72, Virtualization, Vision, 3D Graphics, 4K Video
  • i.MX8M: i.MX 8M Family – Arm® Cortex®-A53, Cortex-M4, Audio, Voice, Video
  • i.MX8MMINI: i.MX 8M Mini – Arm® Cortex®-A53, Cortex-M4, Audio, Voice, Video
  • i.MX8MNANO: i.MX 8M Nano Family – Arm® Cortex®-A53, Cortex-M7
  • i.MX8X: i.MX 8X Family – Arm® Cortex®-A35, 3D Graphics, 4K Video, DSP, Error Correcting Code on DDR


Software Downloads


  • Examples and Quick Start Software

    MCUXpresso SDK Builder

  • Software Development Resources

    eIQ Machine Learning Software for i.MX Linux 4.14.y


