Tag Archives: uncertainty

Minuteness and power — learning to perceive IoT

for those about to do physics, we salute you

Perceiving, working with, and managing risk around the Internet of Things (IoT) requires a new way of thinking. The vast and growing numbers of sensing-networked-computing devices, the great variability of types of those devices, the ambient-almost-invisible nature of many of the devices, and the emergent and unpredictable behavior from networked effects put us into a whole new world. I don’t believe that, at present, we have the mindset or tools to even broadly perceive this cultural and technological change, much less manage risk around it. In many ways, we don’t know where to start. The good news, though, is that there is historical precedent for changing how we perceive the world around us.

Physics rock stars of the 2nd millennium

While a lesser-known name, one of the top 3 rock star physicists of the past half millennium or so is James Clerk Maxwell, sharing the stage with Isaac Newton and Albert Einstein. While known primarily for his contributions to electromagnetic theory, he made contributions in several other areas as well, including governors, color photography, and the kinetic theory of gases. It was this latter work on the distribution of particle velocities in a gas that helped evolve the notion of entropy and led to bold new ways of perceiving the world around us. And he did all of this in his short 48 years of life.

In his book, Maxwell’s Demon: Why Warmth Disperses and Time Passes, author and physics professor Hans Christian von Baeyer describes Maxwell as particularly adept at marrying theory with what was observable in the lab or elsewhere in the natural world. As Maxwell worked to come to terms with the work that Joule, Rumford, and Clausius had done toward the development of the 2nd law of thermodynamics, he realized that he was entering a world that he could no longer directly observe. This was a world of unseeable atoms and molecules that could not be counted and whose individual behavior could not be known. As von Baeyer describes,

“What troubled him was the fact that molecules were too small to be seen, too numerous to be accounted for individually, and too quick to be captured … he no longer had the control that he had become accustomed to …”

In short, he had to abandon his expectations of certainty and find a new way to understand this realm.

To me, this has some similarities to what we face with IoT:

  • [molecules were] many IoT devices are too small to be seen
  • [molecules were] IoT devices are too numerous to be accounted for individually
  • [molecules were too quick to be captured] individual IoT devices change state too often to track

The increasingly frequent lack of visibility of IoT devices, the large numbers of devices, the potentially nondeterministic behavior of individual devices, and network effects create a high probability of emergent and completely unpredictable effects and outcomes.

“… building up in secret the forms of visible things …”

In his own words, Maxwell says,

“I have been carried … into the sanctuary of minuteness and of power, where molecules obey the laws of their existence, clash together in fierce collision, or grapple in yet more fierce embrace, building up in secret the forms of visible things.”

In particular, I believe that the phrase,

“… molecules … clash together in fierce collision or grapple in yet more fierce embrace … building up in secret the forms of visible things …”

speaks to emergent, unpredictable, and possibly even unknowable things. And I’m not at all saying this is all bad on its own, but rather that we’ll have to look at it, perceive it, and attempt to manage it differently than we have in the past.

Minuteness and power

In none of this am I trying to suggest that the Internet of Things is exactly like gas molecules in a balloon or that all of the properties and formulas of molecules in a gas (or wherever) apply to all systems of IoT devices. However, the change in thinking and perception of the world around us that is required might be similar. The numbers and variability of IoT devices are huge and rapidly growing, the behavior of any particular device is uncertain and possibly not knowable, and network effects will contribute to the unpredictability of systems as a whole. I’m hoping that, amid the onrush of excitement about IoT and the unseen subtlety and nuance of IoT, we’ll acknowledge and respect the effects of that minuteness and power and work to adjust our perceptions and approaches to managing risk.

In search of a denominator

Winston Churchill, 1942

Winston Churchill is frequently quoted as saying,  ‘Democracy is the worst form of government, except for all the other ones.’  I feel a similar thing might be said about risk management: ‘Risk management is the worst form of predictive analysis for making IT, Information Management, and cybersecurity decisions, except for all the other ones.’

Because there is so much complexity and so many factors and systems to consider in IT, Information Management, and cybersecurity, we resort to risk management as a technique to help us make decisions. As we consider options and problem-solve, we can’t logically follow every possible approach and solution through and analyze each one on its relative merits.  There are simply way too many.  So we try to group things into buckets of like types and analyze them very generally with estimated likelihoods and estimated impacts.

We resort to risk management methods because we know so little about the merits of future options that we have to fall back on this guessing game. That said, it’s the best game in town. We have to remind ourselves that the reason we do risk management in the first place is that we know so little.

In some ways, it is a system of reasoned guesses — we take our best stabs at the likelihood of events and what we think the impacts of those events might be. We accept the imprecision and approximations of risk management because we have no better alternative.

Approximate approximations

It gets worse. In the practice of managing IT and cybersecurity risk, we find that we have trouble doing the very things that would let us make the approximations that are supposed to guide us. Our approximations are themselves approximations.

Managing risk is about establishing fractions. In theory, we establish the domain over which we intend to manage risk (the denominator), and we consider various subsets of that domain as we group things into buckets for simpler analysis (the numerator). The denominator might be all of the possible events in a particular domain, or it might be all of the pieces of equipment upon which we want to reflect, or maybe it’s all of the users or some other measure.

When we manage risk, we want to do something like this:

  1. Identify the stuff, the domain — what’s the scope of things that we’re concerned about? How many things are we talking about? Where are these things? Count these things.
  2. Figure out how we want to quantify the risk of that stuff — how do we figure out the likelihood of an event, and how do we want to quantify the impact?
  3. Do that work — quantify it, prioritize it.
  4. Come up with a plan for mitigating the risks that we’ve identified, quantified, & prioritized.
  5. Execute that mitigation (do that work).
  6. Measure the outcome (we did some work to make things less bad; were we successful at making things less bad?)

The problem is that in this realm of increasingly complex problems, the 1st step, the one in which we establish the denominator, doesn’t get done. We’re having a heck of a time with #1 — identifying and counting the stuff — and this is supposed to be the easy step. Often we’re surprised by this; in fact, sometimes we don’t even realize that we’re failing at step #1, precisely because it’s supposed to be the easy part of the process. However, the effects of BYOD, hundreds of new SaaS services that don’t require substantial capital investment, regular fluctuations in users, evolving and unclear organizational hierarchies, and other factors all contribute to this inability to establish the denominator. A robust denominator, one that fully establishes our domain, is missing.
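
To make the fraction concrete, here is a minimal sketch, with entirely hypothetical asset names and no particular methodology implied, of tracking how much of an ever-shifting denominator has actually been assessed:

```python
# Toy sketch of the numerator/denominator idea (hypothetical asset names,
# not a real inventory tool). The "denominator" is everything we believe is
# in scope; the "numerator" is the subset we have actually assessed.

known_assets = {"web-server-01", "db-server-01", "laptop-017", "saas-crm"}
assessed_assets = {"web-server-01", "db-server-01"}

def coverage(assessed, known):
    """Fraction of the known domain that has been risk-assessed."""
    return len(assessed & known) / len(known) if known else 0.0

print(f"coverage: {coverage(assessed_assets, known_assets):.0%}")  # 50%

# The denominator itself keeps moving: BYOD devices and new SaaS services
# show up after the fact, so we backfill the domain as we discover them.
known_assets |= {"byod-phone-042", "saas-analytics"}
print(f"coverage after discovery: {coverage(assessed_assets, known_assets):.0%}")  # 33%
```

The point of the sketch is only that the fraction is always provisional: each newly discovered device or service grows the denominator and shrinks our apparent coverage.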

Solar flares and cyberattacks

Unlike industries such as finance, shipping, and some health sectors, which can sometimes have hundreds of years of data to analyze, the IT, Information Management, and cybersecurity industries are very new and are evolving very rapidly. There is no substantial body of historical data to assist with predictive analysis.

However, even firms such as Lloyd’s of London must deal with emerging risks for which there is little to no historical data. Speaking on attempting to underwrite risks from solar flare damage (of all things), emerging risks manager Neil Smith says, “In the future types of cover may be developed but the challenge is the size of the risk and getting suitable data to be able to quantify it … The challenge for underwriters of any new risk is being able to quantify and price the risk. This typically requires loss data, but for new risks that doesn’t really exist.” While this is about analyzing solar flare risk, it applies readily to the risks we face in cybersecurity and Information Management.

In an interview with Lloyds.com, Oliver Dlugosch, head of global markets at reinsurer Swiss Re Corporate Solutions, says that he asks the following questions when evaluating an emerging risk:

  1. Is the risk insurable?
  2. Does the company have the risk appetite for this particular emerging risk?
  3. Can the risk be quantified (particularly in the absence of large amounts of historical data)?

Whence the denominator?

So back to the missing denominator. What to do? How do we do our risk analysis?  Some options for addressing this missing denominator might be:

  1. Go no further until that denominator is established. (However, that is not likely to be fruitful as we may never get to do any analysis if we get hung up on step 1) OR
  2. Make the problem space very small so that we have a tractable denominator (but then our risk analysis is probably so narrow that it is not useful to us) OR
  3. Move forward with only a partial denominator and be ready to continually backfill that denominator as we learn more about our evolving environment.

I think there is something to approach #3. The downside is that with it we may feel we are departing from what we perceive as intellectual honesty, the scientific method, or possibly even ‘truthiness.’ However, it is better than #1 or #2, and better than doing nothing. I think we can resolve the intellectual-honesty question for ourselves by 1) acknowledging that this approach will take more work (i.e., constantly updating the quantity and quality of the denominator) and 2) acknowledging that we’re going to know even less than we were hoping for.

So, more work to know less than we were hoping for. It doesn’t sound very appetizing, but as the world becomes increasingly complex, I think we have to accept that we will know less about the future and that this approach is our best shot.

 

“Not everything that can be counted counts. Not everything that counts can be counted.”

William Bruce Cameron (probably)

 

[Images: Wikimedia Commons]

Horseshoe Irony & Uncertainty

As we manage the online existence of our enterprises, we have discussed that we must rid ourselves of the illusion of total control, certainty, and predictability.  Ironically, however, we often must move forward as if we’re certain or near-certain of the outcome.  No shortage of irony, contradiction, and paradox here.

While playing horseshoes over Memorial Day Weekend with neighbors, I had a minor epiphany regarding uncertainty. I realized that once I tossed my horseshoe and it landed in the other pit (a shallow three-sided box made up of sand, dirt, some tree roots, and the pin itself),

I had no idea where that horseshoe was going to bounce.

I had some control over the initial toss — I could usually get it in the box — but once it hit the ground, I had zero certainty about which direction it was heading. It might bury in the sand, bounce high on the dirt, or hit a tree root and fly off to parts unknown.

Unpredictable bounce

Because I couldn’t control that bounce, my best opportunity for winning was to get the horseshoe to land in the immediate vicinity of the pin as often as possible, creating as many chances as I could for that arbitrary bounce to end up on the pin.

The more often I could place the horseshoe near the pin on its first landing, the more likely it was to end up on, near, or around the pin for points or even the highly coveted ringer.

Here’s the irony. One of the best ways to achieve this goal of landing the shoe near the pin as often as possible is to aim for the pin every time. That is, toss the horseshoe like you expect to get a ringer with every throw.
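
To illustrate that reasoning, here is a toy Monte Carlo sketch. The throw scatter, bounce size, and scoring radius are made-up numbers, and everything is collapsed to one dimension, so it is only an illustration of the idea, not a model of real horseshoes:

```python
import random

# Toy Monte Carlo sketch of the horseshoe argument (all numbers are invented).
# A throw lands near its aim point with some scatter, then takes a random
# bounce; we count how often it ends up within "scoring" distance of the pin.

def fraction_near_pin(aim_offset, throws=100_000, scatter=1.0,
                      bounce=1.0, score_radius=0.5):
    hits = 0
    for _ in range(throws):
        landing = aim_offset + random.gauss(0, scatter)  # where the shoe first lands
        final = landing + random.gauss(0, bounce)        # the arbitrary bounce
        if abs(final) <= score_radius:
            hits += 1
    return hits / throws

random.seed(1)
print(f"aim at the pin:   {fraction_near_pin(0.0):.1%}")
print(f"aim 1 unit off:   {fraction_near_pin(1.0):.1%}")
print(f"aim 2 units off:  {fraction_near_pin(2.0):.1%}")
```

No single bounce is predictable, but centering the throws on the pin maximizes the fraction of them that end up close to it.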

So, we’re simultaneously admitting to ourselves that we can’t control the outcome while proceeding as if we can control the outcome.  

This logical incongruity is precisely the sort of thing that can make addressing uncertainty and managing risk so challenging.

Similar things happen with the management of our information systems.  We want to earnestly move forward with objectives like 100% accurate inventory and 100% of devices on a configuration management plan.  Most of us know that’s not going to happen in practice with our limited resources.  However,

    • by choosing good management objectives (historically known as ‘controls’) 
    • executing earnestly towards those objectives while
    • thoughtfully managing resources,

we increase the chances of things going our way even while living in the middle of a world that we ultimately can’t control.

Those enterprises that do this well not only increase survivability, but also increase competitive advantage because they will use resources where they are most needed and not waste them where they don’t add value.

Ringer

So, yes, I’m trying to get a ringer with every throw, but at the same time, I know that is unlikely. But while shooting for the ringer every time, I increase the opportunity for that arbitrary bounce to go in a way that’s helpful to me.

Much like horseshoes, in Information Risk Management, I can’t control what is going to happen on every single IT asset, be it workstation, server, or router, but I can do things to increase the chances that things move in a helpful way.

The opportunity for growth before us, then, is to have that self-awareness of genuinely moving towards a goal, simultaneously knowing that it is unlikely that we will reach it, and being ready to adjust when and if we don’t.

 

How do you create opportunity for taking advantage of uncertainty in your organization?

Wrestling With Risk & Uncertainty

We have a love-hate relationship with risk and uncertainty. We hate uncertainty when it interferes with getting things done, such as projects we are trying to complete, programs we are developing, or things that break. We love it for the excitement it brings when we go to Las Vegas, enjoy a ball game, or watch a suspenseful movie. There’s uncertainty and risk in all of these, but that is part of what makes them fun and attracts us to them.

When we think of risk and uncertainty in terms of our IT and Information Management projects, programs, and objectives, and the possibility of unpleasant surprises, we are less enamored with uncertainty.

Why do we have a hard time managing uncertainty and incorporating risk-based analysis into our day-to-day work? We’ve already talked some about resource constraints that can make information risk management challenging, particularly for small and medium-sized businesses.  However, there is a deeper issue that stems from over 300 years of human thought. In a nutshell, we just don’t think that way.  Why is this?  Who got this ball rolling?  Isaac Newton.

In 1687 Isaac Newton lowers uncertainty and increases predictability with the publication of Principia

Isaac Newton (1642–1727) published Principia in 1687. In this treatise, Newton presented his three laws of motion as well as the law of universal gravitation. This was a huge step forward in human knowledge and capability. Application of these laws provided unprecedented predictive power. The laws split physics and philosophy into separate professions and formed the basis of modern engineering. Without the predictive power provided by Principia (say that three times real fast), the industrial and information revolutions would not have occurred.

Now, for the sneaky part. The increased ability to predict motion and other physical phenomena was so dramatic, and the rate of knowledge change so rapid, that the seeds were planted for the belief that we must be able to predict anything. Further, the universe must be deterministic: we need only apply Newton’s powerful laws and we can always figure out where we’re going to end up. Some refer to this uber-ordered, mechanical view as Newton’s Clock.

A little over a century later, Pierre-Simon Laplace (1749-1827) takes this a step further.

Pierre-Simon Laplace ups the ante in 1814 and says that we can predict anything if we know where we started

In his publication, A Philosophical Essay on Probabilities (1814), he states:

We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.

In other words, Laplace is saying:

      • if you know where you started
      • if you have enough intellectual horsepower
      • then if you apply Newton’s laws
      • you’ll always be able to figure out where you’ll end up

Now, almost 200 years later, we’ve got enough experience to know that things don’t always work out as planned. That is, as much as one may desire certainty, you can’t always get what you want.

In 1969 Jagger-Richards revisit uncertainty and remind us that we can’t always get what we want

The problem is that we largely still think the Newtonian/Laplacian way.

Even after the practical and theoretical developments of the last century (e.g., Heisenberg’s Uncertainty Principle, the unknowability in Schrödinger’s wave equation, and Gödel’s incompleteness theorems), we still think the old-school way on a day-to-day basis.

This gets reaffirmed in our historical IT service experience as well.  We are used to systems having strong dependencies.  For example, to set up a traditional content management system, there had to be supporting hardware, there had to be a functional OS on top of that, there had to be a functional database, and there had to be a current CMS framework with supporting languages.  This is a very linear, dependent approach and configuration. And it was appropriate for simpler systems of the past.

Complexity Rising

The issue now is that our systems are so complex, with ultimately indeterminate interconnectivity, and are changing so rapidly, that we can’t manage them with hard dependency mapping. Yet our basic instinct is to keep trying to do so. This is where we wrestle with risk management. It is not natural for us. Newton and Laplace still have a big hold on our thinking.

Our goal is to train ourselves to think in terms of risk, of uncertainty. One way to do this is to force ourselves to regularly apply risk management techniques, such as maintaining a Risk Register. Here we write down things that could happen — possibilities — and estimate both how likely we think they are to occur and what their impact on us might be. When we add to and regularly review this list, we begin to get familiar with thinking and planning in terms of uncertainty.
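
As a minimal sketch of what that could look like, with illustrative entries and an arbitrary 1-to-5 scale rather than any prescribed format:

```python
from dataclasses import dataclass

# Minimal risk-register sketch (illustrative entries and scale, not a standard).
# Each entry records a possibility plus rough estimates of likelihood and impact;
# the product gives a crude score we can use to prioritize and review regularly.

@dataclass
class Risk:
    description: str
    likelihood: int  # 1 (rare) .. 5 (almost certain) -- a rough estimate
    impact: int      # 1 (negligible) .. 5 (severe)   -- a rough estimate

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("Unmanaged BYOD device exposes customer data", likelihood=4, impact=4),
    Risk("Key SaaS provider outage halts order processing", likelihood=2, impact=5),
    Risk("Stale server inventory hides an unpatched host", likelihood=3, impact=3),
]

# Review the register from the highest score down, revisiting estimates as we learn.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:2d}  {risk.description}")
```

The value is less in the numbers themselves than in the habit of re-estimating them and re-sorting the list as conditions change.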

We have to allow for and plan for uncertainty.  We need to create bounds for our systems as opposed to attempting explicit system control.  The days of deterministic control, or perceived control, are past.  Our path forward, of managing the complexity before us, is to learn to accept and manage uncertainty.

 

How do you think about uncertainty in your planning? Do you have examples of dependency management that you can no longer keep up with?

What Floyd the Barber knew about information risk management

The Mayberry Model

Watch the till and lock the door at night. If you were opening a small business 30 years ago, your major security concerns were probably to keep an eye on the till (cash register) during the day and to lock the door at night. It reminds me a little bit of The Andy Griffith Show, which ran in the 1960s and was set in the small fictional town of Mayberry, North Carolina. Mayberry enterprises included Floyd’s Barbershop, Emmett’s Fix-It Shop, and Foley’s Grocery.


Floyd didn’t need a risk management program, much less an IT risk management program to run his business.  It was pretty easy to remember — watch the till and lock the door.   He could also easily describe and assign those tasks to someone else if he wasn’t available.    Further, it was fairly easy to watch the till:  Money was physical — paper or metal — and it was transferred to or from the cash drawer. He knew everyone that came into his shop.  Same for Emmett and his Fix-It Shop.  Plus they had the added bonus of a pleasant bell ring whenever the cash drawer opened.  This leads us to the MISRMP (Mayberry Information Security & Risk Management Plan).


Mayberry Information Security and Risk Management Plan:

  • Watch the till 
  • Lock the door at night
  • Make sure the cash register bell is working

Today’s model

Fast forward to a small business today, however, and we have a different story.  Today, in our online stores selling products, services, or information, there is no physical till and probably little to no physical money.  There are online banks, credit cards, and PayPal accounts and we really don’t know where our money is.  We just hope we can get it when we need it.

There are not actual hands in the till nor warm bodies standing near the till when the cash drawer is opened. There is no soft bell ring to let us know the cash drawer just opened.  We don’t know the people in the store and they don’t go away when the front door is locked.  Our customers shop 24/7.

Further, instead of a till with a cash drawer, our businesses rely on very complex and interconnected equipment and systems — workstations, servers, routers, and cloud services — and we don’t have the time to stop and understand how all of this works because we’re busy running a business.  Floyd’s only piece of financial equipment was the cash register (and Emmett could fix that if it broke).

This new way of doing business has happened pretty fast. It is not possible to manage and control all the pieces that make up our financial transactions.  We also have a lot more financial transactions.  While the Internet has brought many more customers to our door, it has also brought many more criminals to our door.  Making the situation even more challenging, we largely don’t have the tools in place to manage our information risks.

Floyd the Barber

What Floyd knew (and we don’t): 

  • who his customers were (knew them by face and name)
  • what their intentions were (they wanted to purchase a haircut or a shave, or to steal from the till)
  • where his money was (in the till, in the bank, or in his pocket while being transferred from the shop to the bank)
  • when business transactions occurred (9:00 – 5:00, but closed for lunch and closed on Sundays)
  • what was happening in his store after hours (nothing)

That is to say, Floyd had much less business uncertainty than we must contend with today. He could handle most of his uncertainty by watching the till and locking the door at night. Our small and medium-sized businesses today, though, are much more complex, have much higher levels of uncertainty, and need to be risk-managed to allow us to operate and grow.

As Floyd managed his security and risk to operate a successful business, so must we — ours is just more complicated.

What are the 3 biggest IT & Information Management risks that you see affecting your business?