How fog computing is driving Industrial IoT systems evolution
April 05, 2018
In the last century, transportation, medical, power, and industrial systems were built from individual devices, usually programmed one at a time. The Industrial Internet of Things (IIoT) is changing all of that, transforming isolated programmable devices into intelligent networks of connected machines, such as autonomous cars, intelligent drone delivery systems, smart grid power systems, automated air traffic control, connected medical devices, robotic oil drilling, and more. As a result, these systems have unique computing requirements including real-time processing, complex data interconnectivity, integrated security, high performance, reliability, and scalability.
Although the IoT today is largely driven by connecting field devices to the cloud, cloud-only architectures have limits. Most IoT implementations connect devices to the cloud and do all of the processing there. That can work for the consumer IoT, but in Industrial IoT systems not everything can take place in the cloud. Most IIoT systems today focus on simple cloud-based use cases and value propositions, but there is a rapidly accelerating shift toward more advanced and valuable use cases. These advanced use cases require data interconnectivity and advanced analytics running from the edge to the cloud. Fog computing enables these core functions, driving the evolution of the IIoT.
The three levels of IIoT evolution
The range of IIoT use cases can be roughly categorized into three levels: monitoring, optimization, and autonomy (Figure 1). Each successive level is characterized by increasing data interconnectivity and artificial intelligence (AI), or analytics.
Monitoring
Monitoring covers use cases that fall under asset performance management, such as predictive maintenance and asset tracking. Data is collected from sensors on assets and transmitted to a backend server for analysis. There is no real peer-to-peer data connectivity; data moves from the edge to the backend cloud or control center, and AI and advanced analytics run solely in the backend. One of the most popular and valuable monitoring use cases is predictive maintenance. For a wind turbine, for example, machine-state measurements such as bearing temperature and vibration amplitude are gathered by sensors and communicated to a backend server running a predictive maintenance analytics application. The data is gathered at the edge, and the computation happens primarily in the backend/cloud.
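To make this monitoring pattern concrete, here is a minimal sketch of a turbine-side script that reads sensor values and publishes them to a backend broker over MQTT, a lightweight protocol commonly used for this kind of edge-to-cloud telemetry. It uses the paho-mqtt client library; the broker address, topic name, and payload fields are hypothetical.

```python
# Minimal monitoring-level telemetry sketch: read turbine sensor values and
# publish them to a backend MQTT broker for analysis. Assumes paho-mqtt >= 2.0;
# the broker host, topic, and field names are hypothetical.
import json
import time

import paho.mqtt.client as mqtt


def read_turbine_sensors() -> dict:
    """Stand-in for real sensor I/O on the turbine."""
    return {
        "turbine_id": "T042",
        "bearing_temp_c": 68.4,
        "vibration_mm_s": 2.1,
        "timestamp": time.time(),
    }


client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect("broker.example.com", 1883)  # hypothetical backend broker
client.loop_start()

# Periodically push readings; all analytics run in the backend/cloud.
for _ in range(10):
    client.publish("windfarm/T042/telemetry", json.dumps(read_turbine_sensors()))
    time.sleep(60)

client.loop_stop()
client.disconnect()
```

Note that all intelligence lives in the backend here; the edge device only samples and transmits.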
Optimization
Optimization includes use cases focused on process optimization. Going beyond instrumenting individual assets, adding sensors that capture the state of an entire process makes it possible to monitor and improve that process. For example, a natural gas pipeline benefits from predictive maintenance of the turbines that pressurize the gas and move it along the pipeline. However, adding sensors to other assets in the pumping stations and pipeline, and sharing data along the pipeline, creates the opportunity to optimize gas flow, monitor for faults and failures, and learn where to remove bottlenecks. The assets in the process often carry growing numbers of sensors, driving the need for edge analytics to reduce data transmission to the backend. In turn, edge analytics benefit from results and data from peers along the process line, driving more peer-to-peer data connectivity. The IIoT system now has layers of compute from edge to backend, analytics running throughout, and data connectivity between the applications running across those layers. These data connectivity and layered compute functions are the core capabilities enabled by fog computing.
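The sketch below illustrates the kind of local processing an optimization-level edge node might run: it keeps a rolling window of pressure readings from a pumping station and forwards only a compact summary, or an anomaly alert, upstream instead of every raw sample. The window size, thresholds, and the forward_to_backend() helper are hypothetical.

```python
# Sketch of optimization-level edge analytics: aggregate raw sensor samples
# locally and forward only summaries or anomalies to the backend.
# Thresholds, window size, and forward_to_backend() are hypothetical.
from collections import deque
from statistics import mean, pstdev


def forward_to_backend(message: dict) -> None:
    # Placeholder for the actual uplink (MQTT, DDS, HTTP, ...).
    print("uplink:", message)


class EdgeAggregator:
    def __init__(self, window_size: int = 600, anomaly_sigma: float = 3.0):
        self.window = deque(maxlen=window_size)
        self.anomaly_sigma = anomaly_sigma

    def ingest(self, pressure_kpa: float) -> None:
        """Process one raw sample at the edge; decide what (if anything) to send."""
        if len(self.window) >= 30:
            mu, sigma = mean(self.window), pstdev(self.window)
            if sigma > 0 and abs(pressure_kpa - mu) > self.anomaly_sigma * sigma:
                forward_to_backend({"event": "pressure_anomaly", "value": pressure_kpa})
        self.window.append(pressure_kpa)

    def flush_summary(self) -> None:
        """Called periodically: send one summary instead of thousands of samples."""
        if self.window:
            forward_to_backend({
                "event": "pressure_summary",
                "mean_kpa": mean(self.window),
                "min_kpa": min(self.window),
                "max_kpa": max(self.window),
                "samples": len(self.window),
            })
```

In practice the summaries and alerts would also be shared with peer nodes along the pipeline, which is where the data connectivity layer of fog computing comes in.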
Autonomy
Autonomy removes humans from the loop and uses AI to manage more and more of the system, as in an autonomous vehicle. Decision authority and control are delegated to different layers of the IIoT system depending on the response-time requirements and the scope of data needed to make a decision (monitoring and controlling a single high-speed machine in a process versus supervising a subsystem with multiple machines). Advanced control and AI reduce or eliminate the need for a human in the loop, and real-time data moves between compute nodes in a mesh-like manner. Without fog computing, autonomy is nearly impossible to implement; with fog computing, we can develop more flexible, resilient, and capable autonomy-level systems.
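One way to picture that delegation of decision authority is the sketch below: a fog node handles a control decision itself when the required response time is too tight to involve a higher layer, and escalates decisions with looser deadlines and wider scope to a supervisory application. The layers, deadline threshold, and function names are hypothetical.

```python
# Sketch of autonomy-level decision delegation across fog layers: decisions
# with hard real-time deadlines are taken locally, slower and wider-scope
# decisions are escalated. Deadline values and handler names are hypothetical.
from dataclasses import dataclass


@dataclass
class Decision:
    name: str
    deadline_ms: float   # how quickly the system must react
    scope: str           # "machine", "subsystem", or "fleet"


LOCAL_DEADLINE_MS = 50.0  # anything faster than this never leaves the edge node


def act_locally(decision: Decision) -> str:
    # e.g. close a valve or feather a blade -- no human, no backend round trip.
    return f"edge node executed '{decision.name}'"


def escalate_to_supervisor(decision: Decision) -> str:
    # e.g. re-plan a subsystem; the supervisory layer has more data and more time.
    return f"'{decision.name}' sent to supervisory layer"


def handle(decision: Decision) -> str:
    if decision.deadline_ms <= LOCAL_DEADLINE_MS and decision.scope == "machine":
        return act_locally(decision)
    return escalate_to_supervisor(decision)


print(handle(Decision("emergency blade feather", deadline_ms=20, scope="machine")))
print(handle(Decision("rebalance farm output", deadline_ms=5000, scope="fleet")))
```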
Where does the fog fit in?
Fog computing provides flexible compute environments well beyond the data center, down to the edge of the IIoT. It offers pervasive data connectivity between applications running anywhere in the system and helps with the management and orchestration of the applications and compute nodes. IIoT systems with Monitoring use cases can use simple sensors wired to gateways that in turn send data to backend analytics over simple protocols like MQTT. Fog computing is overkill for these applications unless the system is particularly large.
With edge compute and increased data connectivity, fog computing is ideal for IIoT systems implementing Optimization use cases and is absolutely necessary for Autonomy. Elastic compute frameworks like Linux-based containers simplify the deployment and management of complex AI applications, and connectivity standards like DDS underpin the mesh-like data connectivity needed for Optimization and Autonomy use cases.
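To illustrate the container side of this, the sketch below uses the Docker SDK for Python to launch a hypothetical edge-analytics container on a fog node; in a real fog deployment an orchestration layer would handle placement, restarts, and updates across many nodes rather than a one-off script. The image name, environment variables, and volume mapping are assumptions.

```python
# Sketch: launch a hypothetical edge-analytics container on a fog node using
# the Docker SDK for Python. Image name and settings are assumptions; a real
# fog deployment would rely on an orchestrator, not a one-off script.
import docker

client = docker.from_env()

container = client.containers.run(
    "example.com/edge/turbine-analytics:1.4",  # hypothetical analytics image
    detach=True,
    name="turbine-analytics",
    restart_policy={"Name": "on-failure", "MaximumRetryCount": 5},
    environment={"TURBINE_ID": "T042", "UPLINK": "dds"},
    volumes={"/var/lib/turbine": {"bind": "/data", "mode": "rw"}},
)

print(container.id, container.status)
```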
Consider an offshore wind farm with one hundred very large wind turbines, each with a nacelle the size of a bus and blades nearing 100 meters long. Implementing this system requires layers of compute and pervasive data communications. Inside each turbine, a databus shares data between the analytics and control applications running across multiple compute nodes. Numerous sensors gather data on the state of the machine and its subsystems, and control signals drive actuators to manage the turbine's operation.
Above the turbine, from a compute-layer perspective, are a databus and various applications that manage the health and operations of the entire one-hundred-turbine farm. A turbine on the edge of the farm may detect a sudden change in wind direction or speed and quickly alert turbines downwind to change direction or blade pitch to compensate, keeping the power output of the entire farm stable. Finally, data from the farm is exchanged with a backend control center that manages multiple wind farms. Applications in the control center provide dashboard-level information for human supervisors, connect with outside services like weather, integrate business systems, and provide long-term prognostics and analytics. Monitoring-level use cases like predictive maintenance still exist in this complex, autonomy-level IIoT system, but they run on the underlying fog compute infrastructure.
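The farm-level alert described above would typically ride on a DDS databus. The sketch below, using the Eclipse Cyclone DDS Python binding (cyclonedds) as one possible implementation, shows an upwind turbine publishing a wind-change alert; downwind turbine controllers would subscribe with a matching DataReader on the same topic. The topic and field names are hypothetical.

```python
# Sketch: an upwind turbine publishes a wind-change alert on a DDS databus so
# downwind peers can adjust blade pitch. Uses the Eclipse Cyclone DDS Python
# binding (cyclonedds); the topic name and fields are hypothetical.
from dataclasses import dataclass

from cyclonedds.domain import DomainParticipant
from cyclonedds.topic import Topic
from cyclonedds.pub import DataWriter
from cyclonedds.idl import IdlStruct


@dataclass
class WindAlert(IdlStruct, typename="WindAlert"):
    turbine_id: str
    wind_speed_mps: float
    wind_direction_deg: float


participant = DomainParticipant()                  # joins the farm-level DDS domain
topic = Topic(participant, "WindAlert", WindAlert)
writer = DataWriter(participant, topic)

# Publish peer-to-peer: every subscribed turbine controller receives this sample
# directly over the databus, without a round trip through the backend.
writer.write(WindAlert(turbine_id="T007", wind_speed_mps=22.5, wind_direction_deg=310.0))
```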