Predictive analytics for maintenance and business intelligence (BI) is one of the most promising goals driving investment in digital transformation and the industrial internet of things (IIoT). However, the artificial intelligence engines underlying these systems require large volumes of data for training, validation, and testing.
In response to this and other Industry 4.0 drivers, industrial control systems are evolving to facilitate heavier data processing and transmission. The result is a new class of operational technology (OT) products that function as first-class citizens on information technology (IT) networks, making it possible to move data from the field or plant floor directly to on-premises or cloud-based databases and applications.
Let’s look at some of the limiting factors in the current generation of industrial controllers and devices as well as the design principles underlying the latest advances.
Limiting Factors in Traditional Operational Technology
Legacy control systems, like programmable logic controllers (PLCs), have a limited range of communication. They rely on point-to-point connections designed mostly to move data from slave devices to a single master control device or application. More sophisticated data processing and distribution require additional hardware and software, typically PCs or servers providing open platform communications (OPC) connectivity and interfaces to other applications, databases, and networks (Figure 1).
These limitations inhibit large-scale data acquisition of the kind needed to support new AI-based projects. Point-to-point communications require each new data-consuming application to create additional connections to field devices, and the traditional communication protocols they use typically gather data through an inefficient polling process. Both of these attributes eat up bandwidth without yielding new data. Meanwhile, the additional hardware and software required to connect disparate network segments and external resources increase system complexity, adding time and cost during integration and throughout the system's lifecycle.
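To make the bandwidth cost concrete, here is a minimal Python sketch (with simulated sensor values and message counts of my own choosing, not measurements from any particular device) comparing a fixed-interval polling cycle against report-by-exception delivery for a slowly changing signal:

```python
# Compare message volume: fixed-interval polling vs. report-by-exception.
# Simulated data: a temperature that changes only once over 100 samples.

readings = [20.0] * 50 + [21.5] * 50  # 100 one-second samples

# Polling: the master requests every sample, whether or not it changed.
polled_messages = len(readings)

# Report-by-exception: the device transmits only when the value changes.
reported_messages = 0
last_sent = None
for value in readings:
    if value != last_sent:
        reported_messages += 1
        last_sent = value

print(polled_messages)    # 100 request/response cycles
print(reported_messages)  # 2 transmissions carry the same information
```

The gap widens as signals get steadier and polling rates get faster, which is why change-based delivery scales so much better for large tag counts.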
Forward-looking users are moving beyond these limitations using methods drawn from consumer and enterprise IT, where networks have already evolved to handle the demands of big data.
Reducing Complexity with Edge Computing
In response to the explosion in the number of smartphones and other connected devices, as well as the general growth of e-commerce in the last decade, experts in the consumer and enterprise IT space have developed a new approach to handling large data volumes, called edge computing. This distributed architecture places lightweight computing resources at the local network level that help to process and transmit data at its source, improving local responsiveness and increasing the efficiency of data transfer to central computing resources and data consumers.
What this means for industrial equipment is that it no longer requires a deep technology hierarchy to move data across the organization. Industrial edge computing devices can process data from sensors and transmitters in the field, then send it directly to on-premises or cloud-based applications and databases without intermediary hardware or software (Figure 2). This approach frees up network resources to handle larger volumes of data, makes it easier to integrate new devices, and reduces the burden of normalizing large data sets.
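A common form of that edge-side processing is aggregating high-rate raw samples locally and forwarding only summaries upstream. The sketch below illustrates the pattern with hypothetical current-loop readings; the window size and values are assumptions for illustration:

```python
# Edge-side aggregation: average each window of raw samples locally,
# then forward only the summary values upstream.

def summarize(samples, window):
    """Reduce raw samples to per-window averages (a hypothetical
    edge-processing step; real devices might also filter or scale)."""
    summaries = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        summaries.append(round(sum(chunk) / len(chunk), 2))
    return summaries

raw = [4.0, 4.2, 3.8, 4.0, 7.5, 7.7, 7.3, 7.5]  # e.g., mA readings
payload = summarize(raw, window=4)
print(payload)  # two averages travel upstream instead of eight raw points
```

The same idea extends to min/max capture, unit conversion, or filtering, all performed before the data ever crosses the plant network.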
Many edge computing offerings are appearing in the industrial automation space, most commonly in the form of edge I/O and communication gateways. These provide a variety of methods for connecting devices and equipment, including traditional wired sensors, to form unified data networks.
Edge programmable industrial controllers (EPICs) are a more powerful option that combines the real-time control and I/O sensing capabilities of PLCs and programmable automation controllers (PACs) with a broad array of connectivity tools. These tools include efficient communication options, like MQTT (formerly MQ Telemetry Transport), which supports one-to-many connectivity using report-on-change data delivery. In addition to traditional programming languages for control, EPICs also include flow-based development tools like Node-RED, designed for combining, transforming, and delivering data from many different sources.
Importantly, edge computing devices also embed modern security standards to protect equipment from outside intrusion and safely transport data across public networks. Embedded features like user authentication, device firewalls, and data encryption can reinforce existing network protections or be used to further consolidate the technology stack. Rarely found in traditional automation controllers, these features also provide a means of protecting existing, unsecured equipment and data.
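As a rough illustration of the transport security such devices embed, Python's standard-library ssl module can build a client context that enforces encryption, certificate verification, and hostname checking, the same kinds of protections applied when an edge device publishes data across a public network. This is a generic TLS client sketch, not any vendor's configuration:

```python
import ssl

# Build a TLS client context of the kind an edge device would use when
# connecting out to a broker or cloud endpoint over a public network.
context = ssl.create_default_context()

# Verification and hostname checking are on by default; set them
# explicitly so a later misconfiguration can't silently weaken them.
context.verify_mode = ssl.CERT_REQUIRED
context.check_hostname = True
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols
```

Wrapping a connection with this context encrypts the data in transit and rejects servers that cannot present a valid certificate, which is what allows field data to traverse untrusted networks safely.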
Industrial control is evolving to meet the data demands of modern applications. EPICs combine real-time control and I/O sensing with IT-oriented networking, security, and data processing capabilities.
Building an OT solution on edge computing lets engineers design control networks that deliver more data to predictive maintenance and BI systems without the burden of additional connectivity layers. Combined with IoT-oriented technologies like MQTT and Node-RED, these networks can generate and deliver more data, more securely, than traditional technologies allow.