Internet of Things and AI Will Form Fourth Industrial Revolution - Coffee with CIS - Latest News & Articles


IoT has come to the factory floor with the power of Kool-Aid Man exploding through walls.

Big data, analytics, and machine learning are starting to feel like anonymous business buzzwords, but they are not just overused abstract concepts--those buzzwords represent huge changes in much of the technology we deal with in our daily lives. Some of those changes have been for the better, making our interactions with machines and information more powerful and more natural. Others have given companies a window into consumers' relationships, behaviors, locations, and innermost thoughts in powerful and often disturbing ways. And the technologies have left their mark on everything from our highways to our homes.

It is no surprise, then, that the idea of "information about everything" has been aggressively applied to manufacturing contexts. Smart, inexpensive, sensor-laden devices have been altering the industrial world over the past decade, just as they have transformed consumer goods. The "Internet of Things" has arrived on the factory floor with all the force of a giant digital Kool-Aid Man exploding through a cinderblock wall.

Tagged as "Industry 4.0" (hey, at least it's better than "Internet of Things"), this fourth industrial revolution has been unfolding over the past decade in fits and starts--chiefly because of the huge cultural and structural differences between the information technology that fuels the change and the "operational technology" that has been at the heart of industrial automation for decades.

As with other marriages of technology and artificial intelligence (or the more limited machine-learning algorithms we are all currently calling "artificial intelligence"), the potential payoffs of Industry 4.0 are huge. Organizations are already seeing more precise manufacturing with lower operational costs; less downtime in the supply chain because of machine intelligence and predictive maintenance; and fewer injuries on factory floors because of more adaptable equipment. And outside the factory, other industries could benefit from having a nervous system of sensors, analytics to process "lakes" of data, and just-in-time responses to emergent issues--aviation, energy, logistics, and many other businesses that rely on dependable, predictable operations could also get a boost.

But this new way of doing things comes with significant challenges, not the least of which is the security and resilience of the digital nervous systems stitching all of this new magic together. When human safety is on the line--the safety of workers and of people who live in proximity to industrial sites--those concerns can't be set aside as easily as they are with operating system patches or application upgrades.

And then there's always that whole "robots are stealing our jobs" thing. (The truth is much more complicated--we'll touch on it later this week.)

Sensors and Sensibility

The term "Industry 4.0" was coined by Acatech (the German government's academy of engineering sciences) in a 2011 national roadmap for the use of embedded systems technologies. Intended as a way to describe industrial "digitization," the term was applied to mark the shift away from simple automation with largely standalone industrial robots toward networked "cyber-physical systems"--information-based orchestration between machines and the people working with them, based on a variety of sensor and human inputs.

As a promotional document for the roadmap from the German Federal Ministry of Education and Research put it, "Machines that communicate with each other, inform each other about defects in the production process, identify and re-order scarce material inventories... this is the vision behind Industry 4.0."

In the Industry 4.0 future, smart factories using additive manufacturing--such as 3D printing via selective laser sintering--and other computer-driven manufacturing systems can adaptively manufacture parts on demand, direct from digital designs. Sensors keep track of needed components and order them based on patterns of demand and other algorithmic decision trees, taking "just-in-time" manufacturing to a new level of optimization. Machine-learning-driven sensors and systems track the quality of components with more consistency and precision than exhausted and potentially bored humans on the product line. Industrial robots work in synchronization with humans, handling the more taxing tasks--or replacing the humans outright. Entire supply chains can pivot with the introduction of new products, changes in consumption, and economic fluctuation. And the machines can tell people when they need to be repaired before they actually break, or teach people better ways to organize the line--all thanks to machine intelligence processing the massive amounts of data created by the manufacturing process.
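The "order based on patterns of demand" piece can be illustrated with a classic reorder-point calculation. This is a minimal sketch, not taken from any real Industry 4.0 platform; the quantities and thresholds are invented for illustration.

```python
def reorder_quantity(current_stock, daily_usage, lead_time_days,
                     safety_stock, lot_size):
    """Return how many units to order now, or 0 if stock is sufficient."""
    # Reorder point: demand expected during the resupply lead time,
    # plus a safety buffer against spikes.
    reorder_point = daily_usage * lead_time_days + safety_stock
    if current_stock > reorder_point:
        return 0
    # Order enough to cover another lead time on top of the buffer,
    # rounded up to the supplier's lot size.
    shortfall = reorder_point + daily_usage * lead_time_days - current_stock
    lots = -(-shortfall // lot_size)  # ceiling division
    return lots * lot_size

# A bin holding 120 units is still above the 80-unit reorder point...
print(reorder_quantity(120, 30, 2, 20, 50))  # -> 0
# ...but at 70 units, the system orders two 50-unit lots.
print(reorder_quantity(70, 30, 2, 20, 50))   # -> 100
```

In a sensor-driven plant, `current_stock` would come from a bin-level sensor rather than a manual count, which is what makes the "just-in-time" loop automatic.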

That vision has driven a 1.15 billion euro (roughly $1.3 billion) European Union effort called the European Factories of the Future Research Association.

Similar "factory of the future" efforts are funded by the US government--in particular, by the Department of Defense, which sees the technology as crucial to the defense industrial base.

The Defense Advanced Research Projects Agency (DARPA) has used research programs such as the Adaptive Vehicle Make program to seed development of advanced, information-integrated manufacturing projects, and it continues to look at Industry 4.0-enabling technologies such as effective human-machine teaming (the ability of machines to adapt and work alongside humans as partners rather than as tools) and smart supply chain systems based on artificial intelligence technology--an effort known as LogX. Researchers at MITRE Corporation's Human-Machine Social Systems (HMSS) Lab have also been focusing on ways to improve how robotic systems interact with humans.

As part of that work, MITRE has partnered with several robotics startups--including American Robotics, which has developed a fully automated drone platform for precision agriculture. Called Scout, the system is a self-contained unit that sits adjacent to fields. All a farmer must do is program in drone flight times; the AI handles drone flight planning and managing the flight itself, as well as the collection and processing of imagery and data, uploading everything to the cloud as it goes.

That level of autonomy allows farmers to simply look at data about crop health and other metrics on their personal devices and then act on that information--selectively applying pesticides, herbicides, or additional fertilizer as needed. With some machine learning juice, those are tasks that could be passed off to robotic farm equipment or other drones once rules and patterns for their use are established.

Scout mirrors the way human-machine teaming could work in the factory--with autonomous machines passing information to humans via augmented vision or other displays, letting humans make decisions based on their skills and understanding of the domain, and then having machines and humans act on the required tasks together. But that level of integration is still in its infancy.

Every sensor tells a story

One place where an embryonic form of human-machine teaming is already taking place is the world of retail: Walmart uses robots to scan store shelves for inventory levels and has automated truck unloading (via a system called the "Fast Unloader") at several stores--using sensors and conveyor belts to sort shipments onto stocking carts. And autonomous systems have already taken over the role of warehouse "picking" at Amazon, working with people to retrieve and ship purchases.

Conversely, one element of Industry 4.0 that has evolved well past the embryonic stage is the use of sensor data to drive plant operations--particularly for the task of predictive maintenance. Unexpected equipment downtime is the bane of all industries, especially when the failure of a single part leads to the total failure of an expensive asset.

By some estimates, about 80 percent of the time currently spent on industrial maintenance is purely reactive--time spent fixing things that broke. And nearly half of unscheduled downtime in industrial systems is the result of equipment failures late in the equipment's life cycle. Being able to predict failures and schedule maintenance or replacement of hardware for when it will have the least effect on operations is the Holy Grail of plant operators.

It is also a goal that industry has been chasing for a very long time. The concept of computerized maintenance management systems (CMMS) has been around in some form since the 1960s, when early implementations were built around mainframes. But CMMS has almost always been a largely manual process, relying on maintenance reports and data collected and fed into computers by humans--not capturing the full breadth and depth of sensor data being generated by increasingly instrumented (and pricey) industrial systems.

Doing something with that data to predict and prevent system failures is becoming increasingly important. As explained by MathWorks industry manager Philipp Wallner, the mounting urgency is because of "[t]he growing sophistication that we're seeing with electronic components in devices and machines, and the growing amount of software inside them." And as industrial systems provide more data about their operations on the plant floor or in the field, that data needs to be processed to be useful to the operator--not only for determining when maintenance should occur but also for optimizing the way equipment is operated.

Predictive maintenance applications--such as IBM's Maximo, General Electric's Predix, and MATLAB's Predictive Maintenance Toolbox--are an effort to harness machine learning and simulation models to make that level of smarts possible. "Predictive maintenance is the leading application in taking advantage of that data in the field," Wallner said, "particularly in areas where components are extremely costly, such as wind energy. For equipment operators, it's a no-brainer."
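The core idea these tools share can be shown in a toy form: fit a trend to a slowly drifting health indicator and estimate when it will cross a failure threshold. The data, the failure threshold, and the linear wear model below are all invented for illustration; commercial products use far richer models.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit, returning (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def remaining_useful_life(hours, vibration, threshold):
    """Hours until the fitted wear trend crosses the failure threshold."""
    slope, intercept = linear_fit(hours, vibration)
    if slope <= 0:
        return None  # no measurable degradation trend
    crossing = (threshold - intercept) / slope
    return max(0.0, crossing - hours[-1])

hours = [0, 100, 200, 300, 400]
vibration = [2.0, 2.3, 2.55, 2.9, 3.2]  # mm/s RMS, drifting upward
print(round(remaining_useful_life(hours, vibration, threshold=4.5), 1))
# -> 436.7
```

With an estimate like "roughly 437 operating hours left," a planner can fold the replacement into the next scheduled outage instead of reacting to a failure.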

It is a harder sell to equipment makers in some cases--particularly because implementing the concept frequently involves providing comprehensive (and therefore proprietary and closely guarded) modeling data for their products. And some equipment manufacturers may see predictive maintenance as a threat. But some companies have begun building their own lines of business based on predictive maintenance--General Electric among them.

GE initially used Predix for internal purposes, such as planning maintenance for its own fleet of jet engines--using "data lakes" of engine telemetry readings to help determine when to schedule aircraft for maintenance in a way that minimized the impact on GE's customers. With a library of information for each piece of supported equipment and a stream of sensor data, GE Software's data scientists built models--"digital twins" of the systems themselves--that can be used to detect early signs of part wear before things progress to part failure.
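The "digital twin" idea can be sketched very simply: compare what a model of the healthy system predicts against what the sensors actually report, and flag sustained deviation as early wear. The engine model, numbers, and thresholds below are invented for illustration, not GE's.

```python
def expected_egt(thrust_pct):
    """Hypothetical healthy-engine exhaust gas temperature model (deg C)."""
    return 400 + 4.5 * thrust_pct

def wear_alert(readings, tolerance=25.0, run_length=3):
    """Flag when measured EGT exceeds the twin's prediction by more than
    `tolerance` for `run_length` consecutive samples (so one-off sensor
    spikes are ignored)."""
    streak = 0
    for thrust_pct, measured_egt in readings:
        residual = measured_egt - expected_egt(thrust_pct)
        streak = streak + 1 if residual > tolerance else 0
        if streak >= run_length:
            return True
    return False

healthy = [(80, 765), (85, 780), (80, 770)]   # close to model predictions
worn    = [(80, 795), (85, 815), (80, 800)]   # running consistently hot
print(wear_alert(healthy), wear_alert(worn))  # -> False True
```

The real systems replace the one-line physics stand-in with detailed simulation models and learned behavior, but the residual-tracking logic is the same in spirit.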

But GE has also applied the same techniques to other, less mechanical inputs--including using models of weather and tree-growth data to predict when trees could become a threat to Hydro-Québec's power lines. And the role of Predix has expanded into modeling power plant output, the energy economy, and other factors to give electricity traders a tool to help them make financial decisions. Predictive systems are also having a direct impact on logistics at Amazon, which uses predictive models to power the pre-staging of goods for Amazon Prime.

There are other approaches to prognostication, some of which bleed into managing the overall operation of the plant itself. IBM's Maximo APM--built on IBM's Watson IoT platform--establishes its baseline from sensor and other data gathered from the equipment on the factory floor, using that data to refine its algorithms. Another Maximo package focuses on identifying process bottlenecks and other problems that can drive operational costs up. (L'Oreal has had success implementing Maximo and the Watson IoT platform as part of its own Industry 4.0 effort.)

Bridging the gap between knowledge and data

But there are several challenges that firms face in making predictive systems effective--the old computing proverb of "garbage in, garbage out" definitely still applies. MathWorks' Wallner noted that the main obstacle is bridging the gap between the two knowledge domains needed to make predictive maintenance work. "How do you enable the domain experts to work closely together with the data scientists, or have one person do both? That's quite often the tension," Wallner explained. "You have two silos of understanding, one with the pure data scientists and the other with domain experts who have knowledge of the equipment they build, not talking to each other." The tools used to create the models must facilitate collaboration between those two camps, he said.

Even when there is good collaboration, there is another difficulty for many predictive models: while there's plenty of data available, most of it is about normal operations rather than failures (which is how it should be--a smoothly functioning plant shouldn't be suffering a lot of failures). "Often there's not enough failure data to train algorithms," Wallner said. "How do you train algorithms that need lots of data with a lack of failure data?"
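One common workaround for scarce failure data, sketched here with the standard library, is to model only the plentiful "normal" readings and treat large deviations from that baseline as anomalies--no labeled failure examples are needed for training. The sensor values and threshold are invented for illustration.

```python
from statistics import mean, stdev

def fit_baseline(samples):
    """Summarize healthy-operation readings as (mean, standard deviation)."""
    return mean(samples), stdev(samples)

def is_anomaly(value, mu, sigma, z_threshold=4.0):
    """Flag readings more than z_threshold standard deviations from
    the healthy baseline."""
    return abs(value - mu) > z_threshold * sigma

# Train on healthy-operation temperatures only (simulated here).
normal_temps = [70.2, 69.8, 70.5, 70.1, 69.9, 70.3, 70.0, 69.7]
mu, sigma = fit_baseline(normal_temps)
print(is_anomaly(70.4, mu, sigma), is_anomaly(78.0, mu, sigma))
# -> False True
```

Real systems use multivariate versions of this idea (and autoencoders or one-class classifiers), but the principle is the same: learn "normal" and alert on departures from it.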

In some cases, manufacturers run "run to fail" tests to collect data about how their equipment behaves as components start to push out of their normal operating parameters. But "run to fail" tests involve creating failures, and purposely breaking costly and complicated manufacturing hardware is rare. "You don't want to run a case where you break your wind turbine," Wallner explained. "It's too expensive and dangerous." In such cases, the manufacturers' domain experts may have built simulation models to test such conditions computationally--and those models can be integrated, with a bit of adaptation, into predictive maintenance systems.

The last gap to be bridged is how and where to process device data. In some cases, for safety or speed of response, the data from equipment needs to be analyzed very close to the industrial equipment itself--even having algorithms run on the embedded processor or programmable logic controller (PLC) that drives the machine. Other parts of the analysis that are less time-critical might run on nearby hardware. But predictive analysis usually requires a lot of computing power and access to lots of supporting data, and that typically means software running on a cloud computing platform or in a company's data center. Both GE's and IBM's predictive systems run in the cloud, while MathWorks' algorithms can be run locally or in various clouds (including GE's Predix cloud).

In some cases, companies may run combinations of all of the above strategies, or start off with "edge" systems handling predictions until they're more comfortable with using cloud services. "It makes sense to have part of the algorithm as close as possible to the equipment, to do things like data filtering," explained Wallner, "but have the predictive algorithm in the cloud." That gets you the best of all worlds.
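The split Wallner describes can be sketched as two small functions: a lightweight filter that runs at the edge next to the equipment, and a stand-in for the heavyweight cloud-side predictor that only sees the condensed, interesting windows. All names, baselines, and thresholds here are hypothetical.

```python
def edge_filter(window, baseline, deadband=0.5):
    """Runs near the machine: forward a compact summary only if the
    window's average drifts outside the deadband around the baseline."""
    avg = sum(window) / len(window)
    if abs(avg - baseline) <= deadband:
        return None  # nothing noteworthy; save the bandwidth
    return {"mean": round(avg, 2), "peak": max(window), "n": len(window)}

def cloud_predict(summary, alert_mean=52.0):
    """Stand-in for the cloud-side predictive model: flag summaries
    whose mean exceeds an alert level."""
    return summary is not None and summary["mean"] > alert_mean

quiet = [50.1, 49.9, 50.2, 50.0]  # steady readings: filtered out at the edge
drift = [52.8, 53.1, 53.4, 53.0]  # drifting readings: forwarded and flagged
for window in (quiet, drift):
    summary = edge_filter(window, baseline=50.0)
    print(summary, cloud_predict(summary))
```

The quiet window never leaves the plant; only the drifting one consumes uplink bandwidth and cloud compute, which is the point of putting the filter at the edge.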

The hazards of digitizing

While there's vast potential in the mix of information technology and operational technology that makes Industry 4.0 concepts like predictive maintenance possible, realizing that potential does not come without risks--especially if appropriate security measures are not taken. While there have been few credible cyber-threats to industrial systems thus far, new threats are emerging--including the "Triton" malware attacks that aimed to disable safety systems at several industrial sites, and the "BlackEnergy" cyber-attacks against Ukraine that temporarily took down parts of that country's energy grid.

Predictive modeling systems pose a lesser threat than those with direct control over equipment, but there's still reason for concern about potential access to raw analytics data from the factory floor. Such data won't immediately yield the blueprints for proprietary manufactured components, but if it's subjected to "big data" analytics techniques, it might give an adversary (or a competitor) a wealth of information about the patterns of manufacturing operations, plant efficiency, and production process details that could be used for other purposes--including outright industrial espionage. Officials from the German Ministry of Education and Research noted in the ministry's Industry 4.0 report that "The most prevalent concern, especially among [subject matter experts], is that Industry 4.0 data isn't protected, business secrets are lost, and closely guarded company knowledge is revealed to the competition."

There are far greater threats, however, that may come from mixing operational technology with traditional IT, especially as autonomous systems are connected to existing industrial networks. Ransomware and other destructive malware could bring down control networks, as happened in Baltimore when a ransomware attack destroyed data from red light and speed camera sensors and shut down the CityWatch camera system. And there's the threat that the control systems themselves could be targeted, subverted, or manipulated.

Much of what has shielded operational technology from attack so far has been "security through obscurity"--industrial control protocols vary widely across equipment manufacturers. But blending the Internet of Things and other information technology with operational technology will require a great deal more focus on security, especially in safety-critical applications. In the chemical, energy, and other industries where a failure may place the general public at risk, a malicious attack on safety systems could have "cyber-physical" consequences far beyond lost productivity or broken equipment.

GE and many others have tried to shield networks by isolating control systems from sensor data networks and by placing firewalls in front of older systems to block unwanted traffic. Industrial cloud computing is usually partitioned from the Internet by private networks and other measures. But before industries hand over more tasks to robots and autonomous software, a thorough evaluation of the security of the commands and data flowing to and from them is a very good idea.

We'll be looking at a number of these issues throughout this week--stay tuned.