I’ve made the argument before that the Internet of Things is nothing new. In the logistics space, for example, we’ve been taking RF scans and using that data to improve warehouse processes since at least 1975 when McHugh Freeman, an early warehouse management system supplier, began business. Internet protocols make it easier to communicate that sensor data to applications. This is a big advance, but I don’t think we are seeing fundamentally new types of applications; just better and cheaper ones.
The process manufacturing industry has been at this for a long time as well. There was an interesting article published on Forbes.com by Peter Zornio, the Chief Strategic Officer of Emerson Process Management. Emerson Process Management is a leader in helping businesses automate their production, processing, and distribution processes across several process industries. Process industries make their products by using formulas and manufacturing recipes; they include industries like chemicals and oil & gas.
Process Manufacturing is Complex
I’d like to review some of the points made in this article because I think they provide insight not just into how IoT could evolve in the process industries, but also broader lessons that apply to logistics professionals.
According to Mr. Zornio, in the process industries, IoT has been used for the past 25 years – “ever since the development of microprocessors and network-based instruments… Many of these enterprises work with products and materials that can be readily measured as they flow through pipes.”
Monitoring the performance of individual pieces of equipment with sensors and wireless communications is relatively simple and can provide a good ROI based on better machine uptime. The payback is only getting better. “Driving these changes are increasingly inexpensive sensors, the maturation of the Internet, and the beginnings of enhanced analytics. Sensors are now so cheap and easy to install – no drilling, no screws, practically “lick ‘n’ stick” in many cases – that we now refer to ‘pervasive sensing,’ and use them especially in locations described within our industries as the four ‘d’s’ – dull, dangerous, dirty and distant.”
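To make that concrete, here is a minimal sketch of the kind of report-by-exception check a cheap vibration sensor might feed. The alarm limit, readings, and names below are all hypothetical, invented for illustration rather than taken from any real monitoring product:

```python
# Minimal sketch of sensor-based equipment monitoring: flag readings
# that drift past an alarm limit so maintenance can act before a failure.
# The limit and the readings below are invented for illustration.

VIBRATION_ALARM_MM_S = 7.1  # hypothetical alarm limit (mm/s RMS)

def alarms(readings, limit=VIBRATION_ALARM_MM_S):
    """Yield (timestamp, value) pairs that exceed the alarm limit."""
    for ts, value in readings:
        if value > limit:
            yield ts, value

pump_feed = [("08:00", 3.2), ("08:10", 4.0), ("08:20", 7.8)]
for ts, value in alarms(pump_feed):
    print(f"ALERT {ts}: vibration {value} mm/s exceeds {VIBRATION_ALARM_MM_S} mm/s")
```

Even something this simple captures the uptime argument: the sensor is cheap, the check is trivial, and the alert arrives before the pump fails.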
In the process industries, these changes are leading to an expansion of IoT’s role. “Until recently, only process control and safety functions were monitored and connected. Now, with costs plummeting, areas like plant and equipment reliability, energy management, personnel safety and environmental compliance are increasingly being addressed.”
But the next step forward – monitoring an entire process or operation – is much bigger than a focus on individual assets.
Big assets like industrial plants are a lot like human beings: They’re complicated, mercurial, and different – every single one. In any given plant, the equipment keeps changing as it wears or gets replaced. The supervisor who ran the operations yesterday is off today and has been replaced by one who runs things differently.
Even the weather has an impact; when a warm front blows in, performance changes.
As a result, modeling most complex processes or operations requires subject matter experts with a really deep and comprehensive understanding of how everything works, separately and together. Analyzing the resulting data is no easy task either. It’s often both science and art – not unlike a doctor’s interpretation of a patient’s chart and own words. These kinds of interpretive skills do not grow on trees – and certainly not within most companies.
The upshot: Unless they’re willing to outsource the modeling of their operations as well as the collection and interpretation of their data, many industries will be limited in what they can derive from the IoT by their own in-house skills – at least until applications can be made more sophisticated.
The most complex models we have in logistics are probably supply chain network models, provided by software companies like LlamaSoft and JDA, which help us understand where our warehouses and factories should be located and how we should route our transportation.
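Commercial network design tools solve large optimization problems, but the underlying idea can be illustrated with the classic center-of-gravity calculation, a far simpler method than what LlamaSoft or JDA actually use. The coordinates and demand figures here are invented:

```python
# Hypothetical sketch of the classic center-of-gravity method for
# siting a warehouse: take the demand-weighted average of customer
# coordinates. All coordinates and demand figures are invented.

customers = [
    # (x, y, annual demand in loads)
    (2.0, 8.0, 120),
    (9.0, 3.0, 300),
    (5.0, 5.0, 80),
]

total_demand = sum(d for _, _, d in customers)
cx = sum(x * d for x, _, d in customers) / total_demand
cy = sum(y * d for _, y, d in customers) / total_demand
print(f"Candidate warehouse location: ({cx:.2f}, {cy:.2f})")
```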
My process automation colleagues argue that models in the process industries are far more real-time, which adds additional layers of complexity. These models have to adapt to changing conditions on an ongoing basis. For example, the models change set points across a variety of machines and instruments so that when “a warm front blows in” yields can still be optimized. Or, if one set point drifts outside of a specified control limit, other set points change to compensate for that deviation.
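As a toy illustration of that compensating logic, the sketch below shifts one set point when another drifts outside its control limits. The variable names, limits, and gain are hypothetical, not Emerson’s actual control scheme; real supervisory control is far more involved:

```python
# Hypothetical sketch of set-point compensation: when a monitored
# variable drifts outside its control limits, shift a compensating
# set point to offset the deviation. Names, limits, and the gain
# are invented for illustration.

def compensate(setpoints, limits, compensator="coolant_flow", gain=0.5):
    deviation = 0.0
    for name, value in setpoints.items():
        if name == compensator:
            continue
        low, high = limits[name]
        if value > high:
            deviation += value - high   # amount over the upper limit
        elif value < low:
            deviation -= low - value    # amount under the lower limit
    # Proportional correction: e.g., raise coolant flow when temps run hot
    setpoints[compensator] += gain * deviation
    return setpoints

# A warm front pushes reactor temperature above its limit...
sp = {"reactor_temp": 105.0, "coolant_flow": 40.0}
lim = {"reactor_temp": (90.0, 100.0)}
print(compensate(sp, lim))  # coolant_flow rises from 40.0 to 42.5
```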
One conclusion is that, to fully leverage IoT across extended supply chains, we may find ourselves building far more real-time models than we have ever seen before. As one small example, during last year’s Long Beach labor strikes, shippers shifted volume to other ports. They might book on a ship headed to Vancouver, only to find out that a slew of other shippers had done the same. Delays by port were changing on a day-to-day basis. What if models of port throughput were combined with AIS vessel tracking data? If the model understood how many ships were sitting outside a port, how many were headed to that port, the relative sizes of those ships, and the throughput capacity of the port, shippers could have made much better port scheduling decisions.
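A toy version of such a model might rank ports by expected backlog. The ports, queue sizes, and throughput figures below are invented; real inputs would come from AIS feeds and port operators:

```python
# Hypothetical sketch: rank ports by expected backlog, combining
# AIS-style inputs (TEU queued outside and TEU inbound) with each
# port's daily throughput capacity. All figures are invented.

def backlog_days(queued_teu, inbound_teu, daily_throughput_teu):
    """Rough queueing estimate: work ahead of you / service rate."""
    return (queued_teu + inbound_teu) / daily_throughput_teu

ports = {
    # port: (TEU queued outside, TEU inbound, daily throughput in TEU)
    "Long Beach": (180_000, 90_000, 30_000),
    "Oakland":    (60_000, 45_000, 15_000),
    "Vancouver":  (40_000, 80_000, 12_000),
}

for name in sorted(ports, key=lambda p: backlog_days(*ports[p])):
    print(f"{name}: ~{backlog_days(*ports[name]):.1f} days of backlog")
```

In this invented scenario, the popular diversion port (Vancouver) actually carries the longest backlog once inbound ships are counted, which is exactly the trap the shippers above fell into.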
A key theme at the 20th Annual ARC Industry Forum, taking place next week in Orlando, will be the Internet of Things. I will be moderating a panel called Building a Supply Chain Control Tower for IoT Visibility. Our panelists have built very advanced and interesting visibility solutions that leverage IoT. The panelists are Jeff Tazelaar, Global Leader – Auto ID, RFID, GPS and Telemetry Expertise Center at Dow Chemical; Jan Theissen, Director Strategy and Methods, Global Purchasing and Materials Management at AGCO; and Tom Moroney, VP Wells and Facilities Technologies at Shell. Let me know if you will be at our forum and would like to meet up with me (sbanker@arcweb.com).
Cimetrics says
Agree. We’ve been doing big data analytics for more than 15 years, but only now is it trendy and called the Internet of Things.
Dennis Palko-Columbus Ohio says
Back in 1983, I started a personal movement around having all the parts collaborating, speaking, and sharing. I was teaching others that, like the human body, systems can and need to change based on what you’re exposing them to, e.g., bad food reactions. Antibodies come out as called for. Insulin is produced in the amounts needed. We can continue to fashion what has been given us in our own body as a sample or template for how to build other successful processes.
And consider how we have produced outside parts to repair those that get broken or worn out. “Robot-able” and “3D-able” are new terms I have started using more regularly. So maybe we call it “The Body of Things”.
Dennis Palko
Tim says
If the scope of IoT is logistics and process control, then cheaper sensors/actuators will start out as a relatively straightforward extension. However, even for such a small problem domain, there are some obvious future challenges, such as:
– the economics of installing and configuring the sensors/actuators will need to move with the price of the assets
– dynamic models will introduce issues of stability and, for cross-organisational data, opportunities for gaming the system
– existing security assumptions will break, e.g. the more that’s connected, the more juicy the target and the larger the attack surface.
For other IoT domains, such as consumer-facing applications (e.g. my fridge, HVAC, solar panels and diary cooperate), there are additional issues about:
– integration (at many abstraction levels),
– security (privacy) for aggregated data (used to build and calibrate the models that become the intelligence of iot),
– delegation of authority (e.g. lend the lawn mower to a neighbour, who can allow his son to cut his lawn),
– the scale is much larger and more complex (a dimension of integration is the user, who may consume from many suppliers, but should have one view, so that sensors in the car can be integrated with worn and at-home ‘things’),
– requirements cannot be predicted and will depend on unexpected combinations of Things, so build/test/deploy/measure cycles become rolling processes,
– the system should behave as the untrained user expects,
– the challenges around configuration management, testing, and legacy hardware are even more complex than they were when PCs were rolled out into enterprises,
– hardware reliability and sensor consistency are low,
– validation and testing of integrated sets of Things presents a combinatorial challenge,
– the distributed computing models are challenging to write reliable code for, more so when connectivity changes rapidly,
– emergent behaviour of combinations cannot be predicted, and
– the economics of ‘big data’ for these types of use case are less trivial: much of the data just isn’t worth keeping.