NXP Integrates Facebook's Glow Neural Network Compiler into eIQ ML Software Development Framework
August 11, 2020
Blog
NXP combined the target-specific optimization capability of the Glow compiler with Arm Cortex-M and Cadence Tensilica HiFi 4 DSP neural network operator libraries within the eIQ IDE.
NXP has added support for the open-source Glow neural network compiler to its eIQ machine learning software development framework. NXP engineers combined the target-specific optimization capability of the Glow compiler with Arm Cortex-M and Cadence Tensilica HiFi 4 DSP neural network operator libraries within the eIQ IDE. This helps optimize inferencing performance on i.MX RT685, RT1050, and RT1060 crossover MCUs.
Originally developed by Facebook, Glow generates optimized code from unoptimized neural networks, significantly reducing processing and memory requirements compared to just-in-time compilation.
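To make the ahead-of-time flow concrete, the sketch below shows roughly what calling a Glow-compiled bundle looks like from MCU application code: Glow emits an object file plus a header for the model, and the application simply provides the memory regions and calls the generated entry point, so no interpreter runs on the device. The symbol names, offsets, and buffer sizes here are illustrative placeholders rather than the exact API Glow generates for a given model.

```c
/* Minimal sketch of invoking a Glow ahead-of-time compiled bundle on an MCU.
 * All names and sizes below are hypothetical placeholders; the real ones are
 * declared in the header that Glow emits for the compiled model. */

#include <stdint.h>
#include <string.h>

/* Normally provided by the Glow-generated bundle for a model called "cifar10"
 * (hypothetical names for illustration). */
extern const uint8_t cifar10_constant_weights[];   /* weights fixed at compile time */
extern int cifar10(uint8_t *constantWeight, uint8_t *mutableWeight,
                   uint8_t *activations);          /* generated entry point */

#define CIFAR10_MUTABLE_MEM_SIZE     (4 * 1024)    /* placeholder sizes; real values */
#define CIFAR10_ACTIVATIONS_MEM_SIZE (48 * 1024)   /* come from the generated header */
#define CIFAR10_INPUT_OFFSET         0             /* offset of the input tensor     */

static uint8_t mutable_mem[CIFAR10_MUTABLE_MEM_SIZE];
static uint8_t activations_mem[CIFAR10_ACTIVATIONS_MEM_SIZE];

int run_inference(const uint8_t *image, size_t image_len)
{
    /* Copy the input image into the bundle's mutable memory region. */
    memcpy(mutable_mem + CIFAR10_INPUT_OFFSET, image, image_len);

    /* Execute the compiled network; outputs are written back into mutable_mem. */
    return cifar10((uint8_t *)cifar10_constant_weights, mutable_mem, activations_mem);
}
```

Because the buffers are statically sized and the weights are baked in at compile time, this pattern avoids dynamic allocation and a runtime interpreter, which is what keeps processing and memory requirements low on resource-constrained MCUs.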
“The standard, out-of-the-box version of Glow from GitHub is device-agnostic to give users the flexibility to compile neural network models for basic architectures of interest, including the Arm Cortex-A and Cortex-M cores, as well as RISC-V architectures,” says Dwarak Rajagopal, Software Engineering Manager at Facebook. “By using purpose-built software libraries that exploit the compute elements of their MCUs and delivering a 2-3x performance increase [over the standard Glow implementation], NXP has demonstrated the wide-ranging benefits of using the Glow NN compiler for machine learning applications, from high-end cloud-based machines to low-cost embedded platforms.”
Running an eIQ/Glow-generated CIFAR-10 inference model on an i.MX RT685, whose integrated 600 MHz Cadence Tensilica HiFi 4 DSP delivers 4.8 GMACs of performance, NXP achieved a 25x workload acceleration.
The eIQ ML toolkit provides building blocks, including the Glow compiler and TFLite support, to help developers create voice, vision, and sound-based AI applications. eIQ is now freely available as part of the NXP MCUXpresso SDK.
For more information, visit www.nxp.com/eiq and www.nxp.com/eiq/glow.
Additional reading on the Glow compiler: www.embedded-computing.com/dev-tools-and-os/neural-network-optimization-with-sparse-computing-and-facebook-glow