The Parable of the Exploding Apples


I thought up the following parable in order to show the value of particular sorts of formal completeness during hazard analysis (HazAn). Contemporary HazAn strikes me as a procedure, or procedures, in which clever, knowledgeable people sit down together, think about all the things which can go wrong, list them, and stop when they think they have thought of them all. Various rituals with their various names go with this, but the results are consensual and not necessarily objective, in that important properties of the analysis, such as forms of completeness, are not assessed or assured. Our technique, Ontological Hazard Analysis (OHA), performs HazAn in a formal way which ensures certain objective properties of the resulting analysis, rather than just the consensus arrived at by traditional techniques.

There’s a big apple orchard in the countryside, not far from a big housing estate, and families love to go there and picnic under the trees. But there is a problem.

When the apples fall to the ground, people pick them up and eat them, and sometimes take them home. Every so often, one explodes, and people are harmed.

We do have some sort of a test for good apples, ones that will definitely not explode, but most apples we just don’t know about, and it’s not practical to insist that the ones which may explode are cleared up, because there are just, well, way too many. It’s impossible to collect them all up and check them. For one thing, we don’t seem to be able to tell when we have got them all.

The ownership of the apple trees is scattered over many people, and each person’s trees are distributed over the field, so one can’t, say, put a fence around Joe’s trees, and a fence around Mary’s, and so on.

There is a safety assessor. He (for he is male) comes around at harvest time, and asks all the apple tree owners what they have done to ensure safety. He asks Mary. Well, says Mary, I have this list of exploding-apple experts with decades of experience and CVs longer than the road you took to get here, who came last Friday, looked around in the vicinity of my trees, debated at length using special words I am not very familiar with, and picked up some apples consensually thought to be possibly bad. He talks to Joe. Similar story. Indeed, he talks to all the apple owners.

Then he comes to me. Well, I said, I injected my trees at the beginning of the season with a special substance which gets into the apples, and which I can sense using this magic wand. The wand isn’t very sensitive, but when it pings there is an apple of mine within two meters of it. Now, I used this to pick up all my apples and inspect them. The ones that are obviously good I put back on the ground for people to enjoy. The ones I couldn’t tell, I painted blue and put them all in a basket. Then I put the basket in that bunker over there, so when one explodes no one will be harmed.

That sounds pretty good, agrees the assessor, but how, he asks me, could I possibly have picked up all my apples, for there just seem to be lots and lots? Well, I said, I went to contraption-school and not apple-growing school, so I devised this contraption with 40 arms and sensors. It goes to wherever my magic wand says there’s an apple. Because, you see, where my magic wand says there is an apple, I know from experience that there are somewhere between 3 and 37 apples within two meters; there are never more than 37. My contraption picks them all up simultaneously, waves them fast one after the other at the wand, and when the wand peeps loudly, it puts that apple in the big basket over there. So we do four square meters, accurately, in just a little bit longer than the time it takes to pick up one apple. And we go over all the places near to my trees. So that solves the collection problem.

Then we take the big basket, and do the good/don’t-know test, paint the don’t-knows blue and put them in that blue basket over there. And we put the good ones in that yellow basket. Then we take the blue basket to the bomb bunker, the one over there with “Safety Requirement” painted on the lintel, and then we chuck the contents of the yellow basket back into the orchard.

That all sounds pretty good, says the assessor, but do I have documentation that all this happened as I said? Sure I do, say I: here it is. And after five minutes looking through it he is convinced.

But why bother doing this? he asks me finally. There are sooooo many apples in this orchard, and yours are only a few of them. You don’t do all of them. I say there are a few reasons. First, if an apple explodes this season and someone gets hurt, then you know right now it’s not one of mine, so that will save you time and resources reinterviewing me about it, and it saves me the cost of possible recompense, because we know in advance ’tain’t one of mine: you’re holding the proof in your hand. Second, you’ve gone round the other growers interviewing them at length and satisfying yourself whether the experts they hired really did an adequate job. Whereas, with me, it’s taken you ten minutes and you have proof of adequacy of my measures. That has saved you time, it’s saved me time, so we can go off and do something else with the time we saved, and the evidence you have in hand is better than what you get from the others.

So, he says, why don’t we insist that everyone do what you do? Well, you probably can’t, say I, because there are sooooooo many apples, and besides not everyone has this lucky property that my trees have, that there are never more than 37 apples per four square meters round a tree. But maybe we can hope, says he, that at least some other people will be able to use your technique, and then we can all save a little more time, be a little more sure that more apples are safe, have maybe a couple fewer accidents this year, and all in all improve the quality of our product!

Yes, and why not? say I.

The assessor returns to headquarters and tells his chief, Mr. Golden-Apple-Guru, what he has just seen. Oh, don’t be silly! says Mr. GAG. That’s all nonsense, what Ladkin told you. He can’t have done that. Everyone knows that Completeness Is Impossible! You’re fired!

Key

Many definitions of hazard construe hazards as states. The apples represent states. Whether they are system states, or states of the system+environment, depends on your definition of hazard. MIL-STD-882 and Leveson’s Safeware define hazards as system states.

Hazards are supposed to be conditions in which the chances of an accident are increased. For many conceptions, it is also important (although unstated) that for an accident to happen, the system must pass through a hazard state. (If this condition is not fulfilled, then one exposes oneself to possibly mistaken calculations of risk, as shown in my book chapter Problems Calculating Risk Via Hazard.) The exploding apples, or, rather, their immediate consequences, represent accidents that happen through a hazard state. It is important to note in the parable that all that explodes is an apple.
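
To make the unstated condition explicit (my formulation here, not taken from the chapter): suppose the hazard states h_1, …, h_n are mutually exclusive and every accident trajectory passes through exactly one of them. Then the law of total probability lets one calculate risk via the hazards:

    \[
      P(\mathrm{accident}) \;=\; \sum_{i=1}^{n} P(\mathrm{accident} \mid h_i)\, P(h_i)
    \]

If some accident trajectory bypasses every hazard state, the right-hand side omits the term P(accident ∧ ¬h_1 ∧ … ∧ ¬h_n), and the calculation underestimates the risk.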

“My” trees represent those yielding hazards at a particular development stage. In Ontological Hazard Analysis (OHA), these development stages are formal refinements of (earlier) stages. The apples “my” trees shed are the hazards identified at this stage. Daniel Jackson has proposed, in a note to the York safety-critical list, that HazAn may be performed entirely before one starts on the software development. Whether or not this turns out to be so, in the typical OHA there will be many refinement stages before one gets to the point of proceeding with software development.
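
As a toy illustration of what a refinement stage obliges one to show (my own sketch in Python, with invented state names, not a rendering of any particular OHA), here is one such obligation checked by brute force: every state of the finer vocabulary must abstract to a state of the coarser one, and no state classified as a hazard at the coarser level may silently become unclassified at the finer level.

    # Toy sketch of a refinement obligation in an OHA-like analysis.
    # All names are invented for illustration.

    # Stage n: a coarse vocabulary.
    ABSTRACT_STATES = {"pressure_ok", "pressure_abnormal"}
    ABSTRACT_HAZARDS = {"pressure_abnormal"}

    # Stage n+1: a finer vocabulary refining the same system.
    CONCRETE_STATES = {"p_low", "p_nominal", "p_high", "p_sensor_fault"}
    CONCRETE_HAZARDS = {"p_low", "p_high", "p_sensor_fault"}

    # The abstraction function: every concrete state maps to an abstract
    # one, so no behaviour escapes the vocabulary of the earlier analysis.
    def abstraction(state: str) -> str:
        return "pressure_ok" if state == "p_nominal" else "pressure_abnormal"

    def check_refinement() -> None:
        for s in CONCRETE_STATES:
            # Obligation 1: the abstraction function is total on the states.
            assert abstraction(s) in ABSTRACT_STATES, f"{s} escapes the abstract vocabulary"
            # Obligation 2: no hazard disappears under refinement. (A real
            # analysis may instead discharge such a state explicitly.)
            if abstraction(s) in ABSTRACT_HAZARDS:
                assert s in CONCRETE_HAZARDS, f"hazard lost under refinement: {s}"

    check_refinement()
    print("refinement obligations discharged")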

Daniel has queried the point about knowing there are at most 37 apples to pick up around “my” trees (private communication). This was my attempt to indicate a level of control imposed during OHA: a deliberate restriction of vocabulary so that the system states (or their relevant equivalence classes) may be enumerated and sorted into hazards and non-hazards, as happens in the parable. My special picking tool is intended to show that one applies a specific technique for this sorting at a specific refinement level; we have developed no effective general techniques for so doing, and doubt that there are any.
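
The parable’s sorting step might be rendered as follows (again a toy sketch of my own in Python, not one of our tools): restrict the vocabulary to a few state variables with finite domains, enumerate every state that vocabulary can express, and classify each one. Completeness over the chosen vocabulary is then a by-product of the enumeration, just as the contraption guarantees that no apple near “my” trees is overlooked.

    from itertools import product

    # A deliberately restricted vocabulary of state variables, each with a
    # finite domain, so that the states can be enumerated exhaustively.
    # The variable names and the classifier are invented for illustration.
    VOCABULARY = {
        "valve": ("open", "closed", "stuck"),
        "pressure": ("low", "nominal", "high"),
        "sensor": ("live", "dead"),
    }

    def classify(state: dict) -> str:
        """Sort a state as the parable sorts apples: good, or don't-know."""
        if state["valve"] == "stuck" or state["pressure"] == "high":
            return "dont_know"   # paint it blue: into the bunker
        if state["sensor"] == "dead":
            return "dont_know"   # can't tell, so treat it as a hazard
        return "good"            # back on the ground for people to enjoy

    names = list(VOCABULARY)
    states = [dict(zip(names, values)) for values in product(*VOCABULARY.values())]
    hazards = [s for s in states if classify(s) == "dont_know"]

    # The point of the restriction: we know exactly how many states there
    # are (3 * 3 * 2 = 18), so the question "have we considered them all?"
    # has an objective answer.
    assert len(states) == 18
    print(f"{len(hazards)} of {len(states)} states sorted into the bunker")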

I sent a version of this parable to the University of York safety-critical systems mailing list on 01 November 2010 at 08.33. It was intended to illustrate the value of formal analysis, contrary to claims that it is “impossible” to establish completeness of the high-level safety requirements unless one ignores most of the relevant factors, a suggestion which appears in a response of Nancy Leveson to a question of mine in a note in the York thread of 31 October 2010 at 11.15. I believe such claims to be mistaken, but they are also widely believed, as may be seen by reading more widely in the thread. Whether the parable helps to counter such claims is, of course, up to the reader to judge.

