Embedded Insiders: How to Integrate AI into the Embedded Enterprise
July 02, 2020
AI is all the rage these days and poised to disrupt nearly every industry. In fact, it already is, though less because of the technology itself than because organizations are struggling to integrate new AI personnel, processes, tools, and workflows alongside their existing infrastructure. It's gotten bad enough that a recent IDC survey reported that the majority of companies are failing in their AI initiatives.
In this episode of the Embedded Insiders, Brandon and Rich interview Michael Grant, Vice President of Services at Anaconda, an open-source-centric data science company that maintains the Anaconda distribution of the Python and R programming languages. Michael explains some of the obstacles, from licensing issues and security vulnerabilities to technical strategy, that organizations looking to enter the AI space need to watch out for before they get started. He then discusses how his company's recent collaboration with the IBM Watson team can help such organizations integrate, organize, and manage their AI solution stacks, from model development to endpoint inferencing, using a package-centric architecture.
Later, Jean Labrosse is back with more “Things That Annoy a Veteran Software Engineer,” this time ranting about the use of lengthy macros in the C language.
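The episode itself doesn’t walk through code, but for context, here is a minimal C sketch (our own illustration, not one of Labrosse’s examples) of the kind of lengthy function-like macro at issue, alongside the static inline function that usually replaces it more safely:

```c
#include <stdio.h>

/* Hypothetical lengthy function-like macro: its arguments are pasted in
 * textually, so an argument with side effects can be evaluated more than
 * once, and the backslash continuations make it hard to read and debug. */
#define MAX3(a, b, c)                  \
    (((a) > (b)) ?                     \
        (((a) > (c)) ? (a) : (c))      \
      : (((b) > (c)) ? (b) : (c)))

/* The same logic as a static inline function: each argument is evaluated
 * exactly once, the compiler type-checks it, and a debugger can step in. */
static inline int max3(int a, int b, int c)
{
    int m = (a > b) ? a : b;
    return (m > c) ? m : c;
}

int main(void)
{
    int i = 1;
    printf("%d\n", MAX3(i++, 2, 3)); /* i++ may be evaluated more than once */
    printf("%d\n", max3(4, 5, 6));   /* prints 6 */
    return 0;
}
```

The macro version also produces no symbol for a debugger to break on, which is part of why veteran embedded developers tend to prefer the function form.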
Tune in for more.