Embedded Artificial Intelligence (AI) is quickly becoming an essential capability for edge processing, giving 'smart' devices the ability to become 'aware' of their surroundings and make decisions on the inputs they receive with little or no human intervention. NXP's ML environment enables fast-growing machine learning use cases in vision, voice, and anomaly detection.

Vision-based ML applications use cameras as inputs to machine learning algorithms, of which neural networks are the most popular. These applications span most market segments and perform functions such as object recognition, identification, and people counting. Voice-activated devices (VADs) are driving the need for machine learning at the edge for wake-word detection, natural language processing, and 'voice as the user interface' applications. Machine learning-based anomaly detection (based on vibration/sound patterns) will revolutionize Industry 4.0 by recognizing imminent failures and dramatically reducing downtime.

NXP offers its customers several approaches for integrating machine learning into their applications. The NXP ML environment includes free software that allows customers to import their own trained TensorFlow or Caffe models, convert them to optimized inference engines, and deploy them on NXP's breadth of scalable processing solutions, from MCUs to highly integrated i.MX and Layerscape processors.
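The import-convert-deploy workflow described above can be sketched with standard TensorFlow Lite tooling; NXP's own conversion and deployment software is not shown here, and the tiny Keras model below is purely an illustrative stand-in for a customer's trained network:

```python
import numpy as np
import tensorflow as tf

# Stand-in for a customer's trained TensorFlow model
# (in practice this would be loaded from a SavedModel or checkpoint).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Convert the trained model into a compact flat-buffer suitable
# for an embedded inference engine.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Run the converted model with the lightweight TFLite interpreter,
# as an edge device would.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

On a real target, the converted flat-buffer would be flashed or deployed alongside the device firmware rather than executed in the same Python process.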
“When it comes to machine learning in embedded applications, it’s all about balancing cost and the end-user experience. For example, many people are still amazed that they can deploy inference engines with sufficient performance even in our cost-effective MCUs,” said
Another critical requirement in bringing AI/ML capability to the edge is easy, secure deployment and upgrades from the cloud to embedded devices. NXP's EdgeScale platform enables secure provisioning and management of IoT and edge devices. EdgeScale delivers an end-to-end continuous development and delivery experience by containerizing AI/ML learning and inference engines in the cloud and securely, automatically deploying the containers to edge devices.
To support a broad range of customer needs, NXP has also created a machine learning partner ecosystem that connects customers with technology vendors who can accelerate time-to-revenue with proven ML tools, inference engines, solutions, and design services. Members of the ecosystem include Au-Zone Technologies and Pilot.AI. Au-Zone Technologies provides the industry's first end-to-end embedded ML toolkit and run-time inference engine, DeepView, which enables developers to deploy and profile CNNs across NXP's entire SoC portfolio, including its heterogeneous mixtures of Arm Cortex-A and Cortex-M cores and GPUs. Pilot.AI has built a framework to enable a variety of perception tasks, including detection, classification, tracking, and identification, across customer platforms ranging from microcontrollers to GPUs, along with data collection/annotation tools and pre-trained models for drop-in model deployment.
NXP and the NXP logo are trademarks of
For more information, please contact:
| Americas | Europe | Greater China / Asia |
| --- | --- | --- |
| Tate Tran | Martijn van der Linden | Esther Chang |
| Tel: +1 408-802-0602 | Tel: +31 6 10914896 | Tel: +886 2 8170 9990 |
| Email: email@example.com | Email: firstname.lastname@example.org | Email: email@example.com |