Documentation (2 documents): Application Note (1), Fact Sheet (1)
eIQ Software Fact Sheet [EIQ-FS]
eIQ Inference with DeepViewRT™ is a platform-optimized runtime inference engine that scales across a wide range of NXP devices and neural network compute engines. Provided free of charge, the inference engine offers a compact code size for resource-constrained devices, including i.MX RT crossover MCUs (Arm® Cortex®-M cores) and i.MX applications processors (Cortex-A and Cortex-M cores, dedicated neural processing units (NPUs), and GPUs).
The inference engine is delivered through the standard NXP Yocto BSP release for Linux® OS-based development and through the MCUXpresso SDK release for RTOS-based embedded MCU development.
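To make the integration model concrete, the sketch below shows the usage pattern an application typically follows with an embedded inference runtime: create an engine instance over a fixed memory region, load a model blob, copy in input data, run inference, and read back the outputs. All rt_* names are hypothetical placeholders stubbed here so the example compiles and runs on a host; they are not the DeepViewRT API, which is documented in the Yocto BSP and MCUXpresso SDK releases mentioned above.

/*
 * Illustrative sketch only. The rt_* functions are hypothetical stand-ins
 * for a generic embedded inference API, NOT the DeepViewRT interface.
 * They are stubbed so this file builds and runs on a host PC; on a real
 * target the engine shipped with the BSP or SDK would be used instead.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct {
    float input[4];   /* toy "tensor" buffers */
    float output[2];
} rt_engine_t;

/* Hypothetical: bind an engine instance to caller-provided static storage. */
static rt_engine_t *rt_engine_create(rt_engine_t *storage)
{
    return storage;
}

/* Hypothetical: load a model blob (e.g., from flash) into the engine. */
static int rt_engine_load_model(rt_engine_t *eng, const void *blob, size_t len)
{
    (void)eng; (void)blob; (void)len;
    return 0; /* pretend the model parsed and validated correctly */
}

/* Hypothetical: run one inference pass; here a fixed toy computation. */
static int rt_engine_run(rt_engine_t *eng)
{
    eng->output[0] = eng->input[0] + eng->input[1];
    eng->output[1] = eng->input[2] * eng->input[3];
    return 0;
}

int main(void)
{
    /* Resource-constrained targets typically use static storage, not malloc. */
    static rt_engine_t storage;
    rt_engine_t *eng = rt_engine_create(&storage);

    static const unsigned char model_blob[] = { 0x00 }; /* placeholder blob */
    if (rt_engine_load_model(eng, model_blob, sizeof(model_blob)) != 0) {
        fprintf(stderr, "model load failed\n");
        return EXIT_FAILURE;
    }

    const float sample[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
    memcpy(eng->input, sample, sizeof(sample));

    if (rt_engine_run(eng) != 0) {
        fprintf(stderr, "inference failed\n");
        return EXIT_FAILURE;
    }

    printf("outputs: %f %f\n", eng->output[0], eng->output[1]);
    return EXIT_SUCCESS;
}

The load-once, run-many pattern over static storage is typical for resource-constrained targets, where dynamic allocation and filesystem access may be limited or unavailable.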
Download: EIQ-INFERENCE-DEEPVIEWRT
Hardware: 8 hardware offerings are available, with additional hardware available through featured partner solutions (12 hardware offerings).
Software: 2 additional software offerings are available. To find a complete list of our partners that support this software, please see our Partner Marketplace.