Executive Viewpoint: Migrating to the Intelligent Edge
January 24, 2020
The Intelligent Edge is gaining momentum in the embedded development community and beyond. To get a better understanding of what that means exactly, Embedded Computing Design spent some time with Wind River’s Gareth Noyes, the company’s Chief Strategy Officer.
ECD: The Intelligent Edge is something that’s getting a lot of attention these days. Explain what that term means.
Noyes: The simple answer is that the Intelligent Edge is the span from the device edge to the infrastructure edge where data is both collected and analyzed. This means that data is processed, analyzed, and acted on before it is sent to the cloud. The result is that costs are lower, network impact and latencies are minimized, and security risks are reduced. Ultimately, business is conducted in a more efficient manner.
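A minimal C sketch of that filter-and-forward pattern may help make it concrete: samples are aggregated and acted on locally, and only summaries leave the edge. The sensor values and the send_to_cloud() stub are hypothetical illustrations, not a Wind River API.

```c
/* Sketch: process sensor data at the edge, send only summaries upstream.
 * Values, window size, and send_to_cloud() are illustrative. */
#include <stdio.h>

#define WINDOW 8            /* samples aggregated per cloud message */
#define ALERT_THRESHOLD 90.0

/* Stand-in for a real uplink; here it just prints. */
static void send_to_cloud(double avg, int alerts)
{
    printf("cloud <- avg=%.1f alerts=%d\n", avg, alerts);
}

int main(void)
{
    /* Simulated raw sensor stream (e.g., temperature readings). */
    double samples[] = { 71, 72, 70, 95, 73, 71, 96, 72,
                         70, 71, 72, 70, 71, 73, 72, 71 };
    int n = sizeof samples / sizeof samples[0];

    double sum = 0;
    int alerts = 0;

    for (int i = 0; i < n; i++) {
        sum += samples[i];
        if (samples[i] > ALERT_THRESHOLD)
            alerts++;                 /* acted on locally at the edge... */

        if ((i + 1) % WINDOW == 0) {  /* ...only summaries make the trip */
            send_to_cloud(sum / WINDOW, alerts);
            sum = 0;
            alerts = 0;
        }
    }
    return 0;
}
```

Sixteen raw samples become two small cloud messages, which is the bandwidth and latency saving Noyes is pointing at.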
ECD: Why would traditional enterprise-based vendors want to create an Intelligent Edge?
Noyes: Companies that rely on highly centralized cloud and hybrid-cloud infrastructures realize the importance of the edge and the value it offers. Growing quantities of raw data now originate from IoT devices and sensors, and new classes of devices—from autonomous vehicles to industrial robots—require real-time access to operational data. In other words, data is being both generated and consumed at the edge, far from the centralized computing power of cloud-scale infrastructure. For all that data to make round trips to the cloud would devour too much bandwidth and take too much time for the operational needs of the edge devices. That’s why edge computing is becoming so important.
ECD: Today, embedded systems and cloud computing represent two different worlds. How do we make them intersect?
Noyes: The border between things (the edge) and the cloud is not easy to cross. Software development in the two worlds requires different skillsets, and largely uses different tools, languages, and methods. Expectations diverge about how long a development project will take and what its lifecycle should be.
And cloud solutions don’t port easily to the edge. Companies run into unfamiliar and incompatible environments when they try to move processing closer to the data at the edge. The reasons for this are that edge computing resources are more limited, physical access and security present new challenges, and virtualization is not the norm. Resiliency, quality of service, and high availability are cornerstone data center requirements, but they can be far more challenging to deliver at the edge. Moreover, devices at the edge today are not standardized and interchangeable the way servers in a data center are. All of this can make the edge a hostile environment for the cloud-native way of doing things.
ECD: What challenges can a developer expect to confront when migrating intelligence to the edge?
Noyes: Edge computing devices that interact with real-time embedded systems can have specific requirements that are unfamiliar to cloud computing. Determinism is a good example. The deterministic model at the root of embedded systems is the expectation that a device will always complete the same task in the same way and in the same amount of time. Anything less than 100% determinism can result in catastrophic failure.
This is foreign territory to the data center, which is all about parallel processes that typically complete their tasks within the target timeframe. In the data center, you expect and accept a certain amount of jitter, a long tail of latency with outliers that miss the target. Deterministic embedded devices at the edge don’t have this tolerance.
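A minimal POSIX C sketch of the kind of deadline-driven loop he describes is below; the 1 ms period and 50 µs jitter budget are illustrative, and a production system would also pin the thread, lock memory, and use a real-time scheduling class.

```c
/* Sketch: a periodic loop that measures wake-up jitter against a budget.
 * Period and budget are illustrative, not from any Wind River product. */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

#define PERIOD_NS        1000000L  /* 1 ms control period */
#define JITTER_BUDGET_NS   50000L  /* 50 us tolerated wake-up jitter */

static long ns_between(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1000000000L + (b.tv_nsec - a.tv_nsec);
}

int main(void)
{
    struct timespec next, now;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int cycle = 0; cycle < 1000; cycle++) {
        /* Advance the absolute deadline; absolute sleeps avoid drift. */
        next.tv_nsec += PERIOD_NS;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec++;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);

        clock_gettime(CLOCK_MONOTONIC, &now);
        long jitter = ns_between(next, now);
        if (jitter > JITTER_BUDGET_NS)
            fprintf(stderr, "cycle %d over budget by %ld ns\n",
                    cycle, jitter - JITTER_BUDGET_NS);

        /* do_control_step(); -- the fixed-time work would run here */
    }
    return 0;
}
```

The difference in mindset lives in the over-budget branch: a hard real-time system treats that event as a fault to be engineered away, while a data center treats the same event as routine tail latency.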
ECD: What other differences are there between what’s traditionally found at the edge versus the cloud?
Noyes: Embedded systems have historically been about fixed-function, monolithic, cost-sensitive, compute-limited physical devices with long development cycles and long lifecycles. Embedded device builders struggle with how to accelerate development, scale production, and reimagine edge devices more broadly to be relevant and valuable to the world of cloud-scale infrastructure.
The pressure for device original equipment manufacturers (OEMs) to change comes both from “above” and from “below.” Below are the hardware platforms that devices are built on. The availability and affordability of powerful new processing and storage options challenge device makers to take advantage of these capabilities. With today’s fast, multi-core, power-efficient CPUs, it no longer makes sense to spend tremendous effort optimizing a custom, single-purpose piece of software to run on a specific processor.
The pressure from above comes from customers who want flexible, multipurpose, interchangeable edge devices that are compatible with their cloud-like infrastructures. Cloud computing is expanding from the data center out to the edge, and it needs a place to land there. Pressure from above also comes from cloud developers who want to run their applications on edge devices. They don’t want to have to learn new development languages or worry about the constraints of the system.
ECD: Wind River refers to a concept called the landing zone. Can you describe how that works?
Noyes: Sure. We look at the landing zone concept as enabling the development of applications that can be deployed anywhere in the intelligent edge, irrespective of whether it’s a physical (embedded) edge device or part of the virtualized cloud-scale infrastructure.
Wind River has deep and broad experience delivering solutions for the intelligent edge. We understand the embedded systems that make up the intelligent edge, we have experience deploying robust cloud-scale edge infrastructure, and we know how both can become more compatible with cloud-native applications. To create a landing zone architecture, OEMs will need to build new kinds of edge devices consisting of a few critical, yet standard, building blocks that let them use the same modern application development processes to deploy new services at the edge.
For example, new devices and systems must decouple monolithic embedded systems with layers of abstraction, and there must be containers both for traditional real-time operating-system (RTOS) applications and for a new class of cloud-native edge applications. It’s also a good idea to adopt Agile development practices. As embedded development moves to a foundation of DevSecOps and continuous integration/continuous delivery (CI/CD), the skills and cultural gaps will narrow between OEMs and the customers they serve.
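A minimal C sketch of that decoupling idea: application logic written against a small abstraction layer, so the same code can land on an RTOS target or in a Linux container. The led_set() interface and the BUILD_FOR_RTOS switch are hypothetical, chosen only to illustrate the layering.

```c
/* Sketch: decouple application logic from the underlying OS with a
 * thin abstraction layer. Interface and build flag are hypothetical. */
#include <stdio.h>

/* --- portable interface the application sees --- */
void led_set(int on);

/* --- one backend per target, chosen at build time --- */
#if defined(BUILD_FOR_RTOS)
void led_set(int on)
{
    (void)on;  /* e.g., write a GPIO register directly on the RTOS */
}
#else
void led_set(int on)
{
    printf("led -> %s\n", on ? "on" : "off");  /* Linux/container stand-in */
}
#endif

/* --- application logic: identical on every target --- */
int main(void)
{
    for (int i = 0; i < 4; i++)
        led_set(i % 2);  /* blink, regardless of what runs underneath */
    return 0;
}
```

Because only the backend changes per target, the same application can be built, tested, and shipped through one CI/CD pipeline, which is what narrows the gap Noyes describes.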
ECD: With all that said, why should a developer choose Wind River over competitive offerings?
Noyes: Wind River is the only company with a robust and comprehensive embedded software portfolio to deliver this vision. We can help bring embedded OEMs and cloud-native industries together to architect a new intelligent edge infrastructure. This would be accomplished through a variety of software and tools already in the Wind River portfolio, including the VxWorks RTOS, Wind River Linux, the Helix Virtualization Platform that allows VxWorks and Linux to run concurrently with or without containers, Wind River Cloud Platform, Wind River’s family of products for cloud-scale infrastructure, and Simics for system simulation.