Monthly Archives: July 2015

Minuteness and power — learning to perceive IoT


for those about to do physics, we salute you

Perceiving, working with, and managing risk around the Internet of Things (IoT) requires a new way of thinking. The vast and growing number of sensing-networked-computing devices, the great variability of device types, the ambient, almost invisible nature of many of these devices, and the emergent and unpredictable behavior arising from network effects put us into a whole new world. I don’t believe that, at present, we have the mindset or tools even to broadly perceive this cultural and technological change, much less manage risk around it. In many ways, we don’t know where to start. The good news, though, is that there is historical precedent for changing how we perceive the world around us.

Physics rock stars of the 2nd millennium

While a lesser-known name, one of the top three rock star physicists of the past half millennium or so is James Clerk Maxwell, sharing the stage with Isaac Newton and Albert Einstein. While known primarily for his contributions to electromagnetic theory, he made contributions in several other areas as well, including governors, color photography, and the kinetic theory of gases. It was this latter work on the distribution of particle velocities in a gas that helped evolve the notion of entropy and led to bold new ways of perceiving the world around us. And he did all of this in his short 48 years of life.

In his book, Maxwell’s Demon: Why Warmth Disperses and Time Passes, author and physics professor Hans Christian von Baeyer describes Maxwell as particularly adept at marrying theory with what was observable in the lab or elsewhere in the natural world. As Maxwell worked to come to terms with the work that Joule, Rumford, and Clausius had done toward the development of the 2nd law of thermodynamics, he realized that he was entering a world he could no longer directly observe. This was a world of unseeable atoms and molecules that could not be counted and whose individual behavior could not be known. As von Baeyer describes,

“What troubled him was the fact that molecules were too small to be seen, too numerous to be accounted for individually, and too quick to be captured … he no longer had the control that he had become accustomed to …”

In short, he had to abandon his expectations of certainty and find a new way to understand this realm.

To me, this has some similarities to what we’re dealing with in IoT:

  • [molecules were] many IoT devices are too small to be seen
  • [molecules were] IoT devices are too numerous to be accounted for individually
  • [molecules were too quick to be captured] individual IoT devices change state too often to track

The growing invisibility of IoT devices, the sheer number of devices, the potentially nondeterministic behavior of individual devices, and network effects create a high probability of emergent and completely unpredictable effects and outcomes.

“… building up in secret the forms of visible things …”

In Maxwell’s own words, he says,

“I have been carried … into the sanctuary of minuteness and of power, where molecules obey the laws of their existence, clash together in fierce collision, or grapple in yet more fierce embrace, building up in secret the forms of visible things.”

Of this, in particular, I believe that the phrase,

“… molecules … clash together in fierce collision or grapple in yet more fierce embrace … building up in secret the forms of visible things …”

speaks to emergent, unpredictable, and possibly even unknowable things. I’m not saying this is bad in itself, but rather that we’ll have to look at it, perceive it, and attempt to manage it differently than we have in the past.

Minuteness and power

In none of this am I trying to suggest that the Internet of Things is exactly like gas molecules in a balloon, or that all of the properties and formulas of molecules in a gas apply to all systems of IoT devices. However, the required change in thinking and perception of the world around us might be similar. The numbers and variability of IoT devices are huge and rapidly growing, the behavior of any particular device is uncertain and possibly not knowable, and network effects will contribute to the unpredictability of systems as a whole. I’m hoping that in the onrush of excitement about IoT, as well as the unseen subtlety and nuance of IoT, we’ll acknowledge and respect the effects of that minuteness and power and work to adjust our perceptions and approaches to managing risk.

Who’s building your IoT data pipeline?


IoT data pipelines require labor too

There is a lot of excitement around IoT systems for companies, institutions, and governments that sense, aggregate, and publish useful, previously unknown information via dashboards, visualizations, and reporting. In particular, there has been much focus on the IoT endpoints that sense energy and environmental quantities and qualities in a multitude of ways. Similarly, everybody and their brother has a dashboard to sell you. And while visualization tools are not yet as numerous, they too are growing in number and availability. What is easy to miss, though, in IoT sensor and reporting deployments is the pipeline of data collection, aggregation, processing, and reporting/visualization. How does the data get from all of those sensors, through the processing steps along the way, to the point where it can be reported or visualized? This part can be more difficult than it initially appears.

Getting from the many Point A’s to Point B

While ‘big data’, analysis, and visualization have been around for a few years and continue to evolve, the new possibilities brought on by IoT sensing devices are more recent. Getting an individual’s IoT data from the point of sensing to a dashboard or similar reporting tool is generally not an issue, because the data comes from only one (or a few) sensing points and will (in theory) only be used by that individual. However, for companies, institutions, and governments that seek to leverage IoT sensing and aggregating systems to bring about increased operating effectiveness and ultimately cost savings, this is not a trivial task. Reliably and continuously collecting data from hundreds, thousands, or more points is not a slam dunk.

Companies and governments typically have pre-existing network and computing infrastructure. For these new IoT systems to work, the many sensing devices (IoT endpoints) need to be installed by competent and trusted professionals. Further, their data needs to be collected and aggregated by a system or device that knows how to talk to the endpoints. After that (possibly before), the data needs to be processed to handle sensing anomalies across the array of sensors. From there, generating operational/system-health data and indicators is highly desirable so that the new IoT system can be monitored and maintained. Finally, data analysis and massaging is completed so that the data can be published in a dashboard, report, or visualization. The question is: who does this work? Who connects these dots and maintains that connection?
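To make those stages concrete, here is a minimal sketch of such a pipeline. Everything in it is hypothetical: the sensor names, the plausible-range bounds, and the simple averaging stand in for whatever collection, anomaly handling, health monitoring, and publishing a real deployment would need.

```python
# Hypothetical pipeline stages: collect -> clean -> health check -> publish.
# Real deployments would talk to actual endpoints and dashboards; here we
# simulate endpoints with simple callables.

def collect(sensors):
    """Gather raw readings from each endpoint (simulated here)."""
    return [{"id": s["id"], "value": s["read"]()} for s in sensors]

def clean(readings, lo=-40.0, hi=85.0):
    """Drop sensing anomalies, e.g. values outside a plausible range."""
    return [r for r in readings if lo <= r["value"] <= hi]

def health(raw, cleaned):
    """Operational indicators so the pipeline itself can be monitored."""
    return {
        "received": len(raw),
        "usable": len(cleaned),
        "dropped": len(raw) - len(cleaned),
    }

def publish(cleaned):
    """Aggregate for a dashboard or report (here, a simple average)."""
    values = [r["value"] for r in cleaned]
    return {"avg": sum(values) / len(values)} if values else {"avg": None}

# Simulated endpoints: two plausible temperature readings and one anomaly.
sensors = [
    {"id": "t1", "read": lambda: 21.5},
    {"id": "t2", "read": lambda: 22.1},
    {"id": "t3", "read": lambda: 999.0},  # faulty sensor
]

raw = collect(sensors)
cleaned = clean(raw)
print(health(raw, cleaned))  # {'received': 3, 'usable': 2, 'dropped': 1}
print(publish(cleaned))
```

Even in this toy form, the middle stages (cleaning and health indicators) are the ones with no off-the-shelf vendor, which is precisely the labor question raised above.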

who supplies the labor to build the pipeline?


The supplier of the IoT endpoint devices won’t be the one to build this data pipeline. Similarly, the provider of the visualization/reporting technology won’t build the pipeline. The job will probably default to the company or government purchasing the new IoT technology. In turn, to meet this additional demand, the labor will need to be contracted out or diverted from internal resources, both of which incur costs, whether as direct cost or opportunity cost.

Patching IoT endpoint devices – Surprise! it probably won’t get done

Additional effects of implementing large numbers of IoT devices and maintaining their health include:

  1. patching the devices, or
  2. accepting the risk of unpatched devices, or
  3. some hybrid of the two.

Unless the IoT endpoint vendor supplies the ability to automatically patch endpoint devices in a way that works on your network, those devices probably won’t get patched. From a risk management point of view, the best approach is probably to identify the highest-risk endpoint devices and try to keep those patched. But I suspect even that will be difficult to accomplish.
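That triage idea can be sketched in a few lines. The device names, risk scores, and patch "budget" below are all illustrative stand-ins; a real program would draw risk scores from an actual assessment and the budget from available labor.

```python
# Hypothetical risk-based patch triage: rank endpoints by risk score
# and patch only as many as the labor budget allows.

def patch_priority(devices, budget):
    """Return the IDs of the highest-risk devices, up to the budget."""
    ranked = sorted(devices, key=lambda d: d["risk"], reverse=True)
    return [d["id"] for d in ranked[:budget]]

devices = [
    {"id": "camera-lobby", "risk": 9.1},  # internet-facing, rich feature set
    {"id": "hvac-ctrl-3", "risk": 7.4},
    {"id": "temp-node-17", "risk": 2.0},  # low risk, likely stays unpatched
]

print(patch_priority(devices, budget=2))  # ['camera-lobby', 'hvac-ctrl-3']
```

The point of the sketch is the explicit budget: whatever falls below the cutoff is, by decision rather than by accident, accepted unpatched risk.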

Also, as endpoint devices become increasingly complicated and gain richer feature sets to remain competitive, they have an increased ability to do damage to your network and assets, or to those of others on the Internet. Any one of the above options increases cost to the organization, and yet that cost typically goes unseen.

Labor leaks

Anticipating labor requirements along the IoT data pipeline is critical for IoT system success. However, this part often goes unseen, and the effort leaks away. We tend to get caught up in the IoT devices at the beginning of the data pipeline and the fancy dashboards at the end, and forget about the hard work of building a quality pipeline in the middle. In our defense, that is where the bulk of the marketing and sales efforts are aimed: at each end of the pipeline. The effort of building a reliable, secure, and continuous data pipeline is part of the socket concept that I mentioned in an earlier post, Systems in the Seam – Shortcomings in IoT systems implementation.

With rapidly evolving new sensing technologies and new ways to integrate and represent data, we are in an exciting time and have the potential to be more productive, profitable, safe, and efficient. However, it’s not all magical — the need for labor has not gone away. The requirement to connect the dots, the devices and points along the pipeline, is still there. If we don’t capture this component, our ROI from IoT systems investments will never be what we hoped it would be.