This demo showcases NXP's eIQ® Auto software development environment, which supports optimized, automotive-grade inferencing across S32 devices. The environment provides tools to import a pre-trained neural network from multiple frameworks (e.g., PyTorch, TensorFlow) and optimize it for execution on S32 devices. In this demo, a pre-trained battery state-of-charge (SOC) estimation algorithm based on an LSTM neural network is compiled with eIQ Auto optimized layers for deployment on the S32Z/E Arm® Cortex®-R52 core and DSP/ML accelerator. The results demonstrate how eIQ Auto enables optimized execution of the LSTM network on the S32E target hardware while maintaining maximum accuracy.
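To illustrate the kind of model involved, below is a minimal sketch of an LSTM-based battery SOC estimator in PyTorch, exported to a framework-neutral format that a deployment tool chain can then import and optimize for the target. The input features (voltage, current, temperature), layer sizes, and the ONNX export step are assumptions for illustration only; the demo's actual model and the eIQ Auto import/compile flow are not shown here.

```python
# Hypothetical sketch: LSTM battery state-of-charge (SOC) estimator.
# Model structure and export format are assumptions, not the demo's actual code.
import torch
import torch.nn as nn


class SocLstm(nn.Module):
    def __init__(self, num_features=3, hidden_size=32, num_layers=1):
        super().__init__()
        # The LSTM consumes a window of measurements
        # (e.g., voltage, current, temperature per time step).
        self.lstm = nn.LSTM(num_features, hidden_size, num_layers, batch_first=True)
        # Regression head maps the final hidden state to a single SOC value in [0, 1].
        self.head = nn.Sequential(nn.Linear(hidden_size, 1), nn.Sigmoid())

    def forward(self, x):
        # x: (batch, time_steps, num_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # SOC estimate from the last time step


model = SocLstm().eval()
dummy = torch.randn(1, 100, 3)  # one 100-step window of measurements

# Export to ONNX so an offline optimizer/compiler can take the pre-trained
# network and map it onto the target's cores and accelerators.
torch.onnx.export(
    model,
    dummy,
    "soc_lstm.onnx",
    input_names=["measurements"],
    output_names=["soc"],
)
```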