We did it; we made the world a better place!
November 13, 2018
Blog
The goal for the Better Place Challenge design contest was a simple one: to have our embedded engineering community design something that makes the world a better place. The parameters were fairly vague, but the engineers responded in a huge way. The main criterion was that the design had to be powered by a Renesas RZ microprocessor, with IAR's Embedded Workbench tool used to fine-tune the software. The challenge was sponsored by Renesas Electronics, IAR Systems, and Embedded Computing Design, and the top three winners received $6,000, $3,000, and $1,500, respectively.
I’m pleased to announce the three winners here. First prize went to the designer of a skin-cancer detector: a device that uses artificial intelligence (AI) and machine learning to detect skin cancers such as melanoma in their early stages, improving survival rates. The platform runs a convolutional neural network (CNN), pre-trained in TensorFlow, to identify cancerous tissue on the Renesas RZ/A1 Stream-it! development kit.
Second place was awarded to a real-time water-quality monitoring system. It also uses AI and machine learning to minimize the consumption of contaminated water by detecting harmful microbes and bacteria. The system uses a CIFAR-10-based convolutional neural network trained to detect the bacteria, quantized to 8 bits so it can run on the RZ/A Stream-it! development kit. The Arm CMSIS-NN software library, ported to the RZ/A software package, lets the network execute within the available memory.
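Quantizing the network to 8 bits is what makes it fit on a microcontroller-class part. As a rough illustration only (not the winning entry's code), here is a minimal sketch of symmetric post-training quantization to the signed 8-bit "q7" format that Arm's CMSIS-NN kernels consume; the weight values are made up:

```python
def quantize_q7(weights):
    """Symmetric per-tensor quantization of float weights to signed 8-bit
    integers (the "q7" fixed-point format used by Arm CMSIS-NN kernels)."""
    scale = max(abs(w) for w in weights) / 127.0
    quantized = [max(-127, min(127, round(w / scale))) for w in weights]
    return quantized, scale

def dequantize_q7(quantized, scale):
    """Recover approximate float weights, e.g. to check accuracy loss."""
    return [q * scale for q in quantized]

# Example: a toy weight tensor (hypothetical values).
weights = [0.5, -1.27, 0.01, 1.0]
q, scale = quantize_q7(weights)
error = max(abs(a - b) for a, b in zip(dequantize_q7(q, scale), weights))
```

In practice, a full deployment would quantize activations as well and pick scales per layer, but the core trade is the same: each weight shrinks from 32 bits to 8, cutting memory by roughly a factor of four at a small cost in precision.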
Nearly as impressive is our third-place winner, with an environment narrator for the blind. This design describes the surrounding physical environment to visually impaired users to help them avoid potentially dangerous objects (such as hot stoves) or trip hazards. It works by sending time-of-flight data from an optical sensor over I2C to the Renesas RZ/A MPU. The processor then calculates the size and shape of the object before outputting verbal notifications.
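The distance math behind a time-of-flight reading is simple enough to sketch: the light pulse travels to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of light. This is a generic illustration, not the winning entry's code, and the 0.5 m warning threshold is an invented parameter:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0
WARN_DISTANCE_M = 0.5  # hypothetical alert threshold, not from the entry

def tof_to_distance_m(round_trip_s):
    """One-way distance from a time-of-flight round trip.
    The pulse travels out and back, so halve the round-trip path."""
    return round_trip_s * SPEED_OF_LIGHT_M_S / 2.0

def should_warn(round_trip_s):
    """True when the detected object is closer than the alert threshold."""
    return tof_to_distance_m(round_trip_s) < WARN_DISTANCE_M

# A 6.67 ns round trip corresponds to roughly 1 m of distance.
d = tof_to_distance_m(6.67e-9)
```

The nanosecond-scale timing is handled inside the sensor itself; the host processor only reads back the result over I2C and applies logic like the threshold check above.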
I’m proud to be a part of such a challenge as we do our part to make the world a better place.