eIQ® Inference with TensorFlow Lite


Software Details



eIQ® TensorFlow Lite


  • Delivered as middleware in NXP Yocto BSP releases
  • NXP eIQ software support available for i.MX applications processors
  • Provides the ability to run inference on Arm® Cortex®-M and Cortex-A cores, and on VeriSilicon GPUs and NPUs
  • Faster and smaller than TensorFlow — enables inference at the edge with lower latency and smaller binary size
  • Uses open source libraries to accelerate matrix and vector arithmetic (Eigen and GEMMLOWP)
  • NXP-optimized implementation of GEMMLOWP uses Arm® Cortex®-M7 SIMD instructions, yielding a 2-3x performance increase compared to the out-of-the-box implementation
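As a minimal sketch of the inference flow described above, the snippet below loads a TensorFlow Lite model and runs it through the `tflite_runtime` interpreter, as typically shipped in the eIQ Yocto BSP. The model path and the image-classification use case are assumptions for illustration; the NumPy preprocessing helper is hypothetical and only stands in for whatever input preparation a real model needs.

```python
import numpy as np

def preprocess(image, input_shape):
    """Resize a HxWxC uint8 image to the model's expected NHWC input tensor.

    Uses a simple nearest-neighbour resize in pure NumPy to keep the
    sketch dependency-free; a real pipeline would use OpenCV or PIL.
    """
    h, w = int(input_shape[1]), int(input_shape[2])
    ys = np.arange(h) * image.shape[0] // h   # source row for each output row
    xs = np.arange(w) * image.shape[1] // w   # source column for each output column
    resized = image[ys][:, xs]
    return np.expand_dims(resized.astype(np.uint8), axis=0)  # add batch dim

def run_inference(model_path, image):
    """Run one inference pass on a quantized uint8 classification model."""
    # tflite_runtime is the standalone interpreter package; on a desktop,
    # tf.lite.Interpreter from the full TensorFlow package works the same way.
    from tflite_runtime.interpreter import Interpreter
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    input_detail = interpreter.get_input_details()[0]
    output_detail = interpreter.get_output_details()[0]
    interpreter.set_tensor(input_detail["index"],
                           preprocess(image, input_detail["shape"]))
    interpreter.invoke()
    return interpreter.get_tensor(output_detail["index"])
```

On i.MX targets, hardware acceleration (GPU/NPU) is selected through the delegate mechanism configured in the BSP rather than through this basic interpreter API.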

Supported Devices

  • i.MX8: i.MX 8 Family – Arm® Cortex®-A53, Cortex-A72, Virtualization, Vision, 3D Graphics, 4K Video
  • i.MX8M: i.MX 8M Family - Arm® Cortex®-A53, Cortex-M4, Audio, Voice, Video
  • i.MX8MMINI: i.MX 8M Mini - Arm® Cortex®-A53, Cortex-M4, Audio, Voice, Video
  • i.MX8MNANO: i.MX 8M Nano Family - Arm® Cortex®-A53, Cortex-M7
  • i.MX8X: i.MX 8X Family – Arm® Cortex®-A35, 3D Graphics, 4K Video, DSP, Error Correcting Code on DDR



  • BSP, Drivers and Middleware: MCUXpresso SDK Builder
  • BSP, Drivers and Middleware: eIQ Machine Learning Software for i.MX Linux 4.14.y

Note: For better experience, software downloads are recommended on desktop.



Design Resources




Related Software


