Showing posts with label time. Show all posts

a man called Ove

is the title of an excellent book by Fredrik Backman. As usual I'm going to quote from a few pages:
He felt one should not go through life as if everything was exchangeable. As if loyalty was worthless. Nowadays people changed their stuff so often that any expertise in how to make things last was becoming superfluous. Quality: no one cared about that any more. Not Rune or the other neighbours and not those managers in the place where Ove worked. Now everything had to be computerised, as if one couldn't build a house until some consultant in a too-small shirt figured out how to open a laptop.
'They've bumped up the electricity prices again,' he informs her as he gets to his feet. He looks at her for a long time. Finally he puts his hand carefully on the big boulder and caresses it tenderly from side to side, as if touching her cheek. 'I miss you,' he whispers. It's been six months since she died. But Ove still inspects the whole house twice a day to feel the radiators and check that she hasn't sneakily turned up the heating.
'Now you listen to me,' says Ove calmly while he carefully closes the door. 'You've given birth to two children and quite soon you'll be squeezing out a third. You've come here from a land far away and most likely you fled war or persecution and all sorts of other nonsense. You've learned a new language and got yourself an education and you're holding together a family of obvious incompetents. And I'll be damned if I've seen you afraid of a single bloody thing in this world before now.' ...
'I'm not asking for brain surgery. I'm asking you to drive a car. It's got an accelerator, a brake, a clutch. Some of the greatest twits in world history have sorted out how it works. And you will as well.'
And then he utters seven words, which Parvaneh will always remember as the loveliest compliment he'll ever give her.
'Because you are not a complete twit.'
Men like Ove and Rune were from a generation in which one was what one did, not what one talked about.
Ove has probably known all along what he has to do, but all people are time optimists. We always think there's enough time to do things with other people. Time to say things to them. And then something happens and then we stand there holding on to words like 'if'.
'But serious, man. You do this every morning?' Jimmy asks cheerfully.
'Yes, to check if there have been any burglaries.'
'For real? Are there a lot of burglaries round here?'
'There are never a lot of burglaries before the first burglary,' Ove mutters and heads off towards the guest parking.
'There is no hope for these boys and girls,' the headmaster soberly explained in the interview. 'This is not education, this is storage.' Maybe Sonja understood how it felt to be described as such. The vacant position only attracted one applicant, and she got the boys and girls to read Shakespeare.


beyond culture

is an excellent book by Edward T. Hall (isbn 0385124740). As usual I'm going to quote from a few pages:
The investigation of out-of-awareness culture can be accomplished only by actual observation of real events in normal settings and contexts.
Research with business groups, athletic teams, and even armies around the world has revealed there is an ideal size for a working group. The ideal size is between eight and twelve individuals.
All theoretical models are incomplete. By definition, they are abstractions and therefore leave things out. What they leave out is as important as, if not more important than, what they leave in, because it is what is left out that gives structure and form to the system.
Paradoxically, studying the models that men create to explain nature tells you more about the men than about the part of nature being studied.
All bureaucracies are oriented inward, but P-type are especially so.
High context actions are by definition rooted in the past, slow to change, and highly stable.
High context communications are frequently used as art forms. They act as a unifying, cohesive force, are long-lived, and are slow to change. Low context communications do not unify; however, they can be changed easily and rapidly.
Nothing happens in the world of human beings that is not deeply affected by linguistic forms.
If there is anything that can change the character of life, it is how time is handled.
M-time emphasizes schedules, segmentation, and promptness. P-time systems are characterized by several things happening at once. They stress involvement of people and completion of transactions rather than adherence to preset schedules.
By scheduling, we compartmentalize; this makes it possible to concentrate on one thing at a time, but it also denies us context.
In many forms, culture designates what we pay attention to and what we ignore.
The natural act of thinking is greatly modified by culture.
Low-context cultures seem to resist self-examination.
Alfred Korzybski and Wendell Johnson, founders of semantics, identified the Extension Transference factor in the use of words and published extensively on the profound effects of mistaking the symbol for the thing symbolized while endowing the symbol with properties it does not possess.
Environments are not behaviorally neutral.
For some reason, people reared in the European tradition feel more comfortable if they have a rule to fall back on, even if it doesn't fit.


software blending

Here's a story from the book Dr Deming by Rafael Aguayo.
In the early 1950s, American coffee roasters faced a dilemma. The price of coffee beans had risen dramatically, and they were faced with two distasteful choices: either absorb the increased price, partially or totally, hurting profitability; or pass the cost on to their customers and risk losing market share or having customers switch to other beverages.

They came up with an innovative alternative. The coffee roasters' business consists of buying, ageing, roasting, and blending coffee beans to achieve the desired taste and smell. Coffee beans, like all agricultural commodities, are highly variable. Two beans can be quite distinct. Even beans picked at the same time from the same tree can be different. A bean picked from the top of the tree, which receives more sunshine, tastes different than a bean from the bottom of the tree.

Blending is a critical part of the process. The leading roaster of the time tried experimenting with different formulations. It found that gradually changing the formulation, substituting lower quality beans, was unnoticeable to the customer. It began to slowly change the blend. Every two weeks a few more of the less expensive beans were substituted for the heartier, more expensive ones. Most consumers couldn't notice the difference in coffees bought two weeks apart. But if they had tasted, side by side, two batches made six weeks apart, they would have noticed a slight difference.

In effect, the roaster started training customers to accept an inferior blend of coffee. The other roasters noticed what was going on and responded in kind to avoid losing market share. In a few years the American consumer's standards for a decent cup of coffee were radically altered. The managers did their jobs and enjoyed their bonuses. At the time this response was viewed as a triumph of ingenuity. But a funny thing began to happen. Per capita consumption of coffee began a slow but steady decline. The business stopped growing. The roaster that started it all began to experience profitability problems. It is now part of a huge conglomerate and is still experiencing problems in its coffee business. Consumers have discovered gourmet coffees. But ironically, the coffee drunk by Americans during the 1940s was on a par with what we now call gourmet coffees.
Isn't it great! It reminds me of the Fast Food Fallacy from Jerry Weinberg's The Secrets of Consulting:
No difference plus no difference plus no difference plus ... eventually equals a clear difference
I emailed the story to Jerry and he replied:
Great story. Even though I've never had a cup of coffee in my life, I can appreciate the dynamic. Indeed, the same dynamic has occurred over and over. I've written about white bread's deterioration - and now we have "gourmet" breads.

And, I suspect, the same thing is now happening in software - gradually lowering the quality of apps, as expectations lower. Then we will have software selling on the basis of freedom from those annoying bugs. Same thing in security, I think.
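The Fast Food Fallacy lends itself to a tiny simulation. Here's a throwaway Ruby sketch with made-up numbers (quality drifting 1% per blend change, and a taster only able to detect gaps of 5% or more) — the specific constants are my invention, not from the story:

```ruby
# Each fortnightly blend change is imperceptible on its own,
# but the changes accumulate into a clear difference.
DETECTION_THRESHOLD = 0.05  # smallest quality gap a taster notices (made up)
DRIFT_PER_STEP      = 0.01  # quality lost at each blend change (made up)

quality = (0..26).map { |step| 1.0 - step * DRIFT_PER_STEP }

noticeable = ->(a, b) { (quality[a] - quality[b]).abs >= DETECTION_THRESHOLD }

puts noticeable.call(0, 1)   # batches a fortnight apart: false
puts noticeable.call(0, 26)  # batches a year apart: true
```

No difference plus no difference plus no difference eventually equals a clear difference.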

time

Back to quotes table-of-contents

From Adrenaline Junkies and Template Zombies
On most development projects, time is a scarcer resource than money.

From Consilience
Consilience among the biological sciences is based on a thorough understanding of scale in time and space.

From Nudge
Self-control issues are most likely to arise when choices and their consequences are separated in time.

From The Principles of Product Development flow
Opportunities get smaller with time, and obstacles get larger.

From Zen and the art of motorcycle maintenance
Impatience is close to boredom but always results from one cause: an underestimation of the amount of time the job will take.

From Kluge
What can evolve at any given point in time is heavily constrained by what has evolved before.

People are more likely to accept falsehoods if they are distracted or put under time pressure.

Organisms tend to value the present far more than the future.

From The End of Certainty
There is a necessary trade-off between certainty at a given time and continuity through time.

From How Buildings Learn: Chapter 2 - Shearing Layers
Hummingbirds and flowers are quick, redwood trees slow, and whole redwood forests even slower. Most interaction is within the same pace level.

The dynamics of the system will be dominated by the slow components, with the rapid components simply following along. Slow constrains quick; slow controls quick.

From Agile Development in the Large
Quick feedback should be the first thing you introduce.

From Beating the System
Time is our only absolutely nonrenewable and, thus, most highly valued resource. To place a low value on another's time is to show a lack of respect for that person.

From Hackers and Painters
Paying attention is more important to reliability than moving slowly.

mind and nature

is an excellent book by Gregory Bateson (isbn 1-57273-434-5). As usual I'm going to quote from a few pages:
If you want to understand mental processes, look at biological evolution and conversely if you want to understand biological evolution, go look at mental processes.
How is the world of logic, which eschews "circular argument," related to a world in which circular trains of causation are the rule rather than the exception?
Perception operates only on difference. All receipt of information is necessarily the receipt of news of difference, and all perception is necessarily limited by threshold. Differences that are too slight or too slowly presented are not perceivable.
The universe is characterized by an uneven distribution of causal and other types of linkage between its parts; that is, there are regions of dense linkage separated from each other by regions of less dense linkage.
We should define "stability" always by reference to the ongoing truth of some descriptive proposition.
Notoriously it is very difficult to detect gradual change because along with our high sensitivity to rapid change goes also the phenomenon of accommodation. Organisms become habituated. To distinguish between slow change and the (imperceptible) unchanging, we require information of a different sort; we need a clock.
Stability may be achieved either by rigidity or by continual repetition of some cycle of smaller changes, which cycle will return to a status quo ante after every disturbance.
Every given system embodies relations to time, that is, is characterized by time constants determined by the given whole. These constants are not determined by the equations of relationship between successive parts but are emergent properties of the system.
The shape of what it deposits is determined by the shape of the previous growth.
What characterizes those adaptations that turn out to be disastrous, and how do these differ from those that seem to be benign and, like the crab's claw, remain benign through geological ages?
Above all, in sexual reproduction, the matching up of chromosomes in fertilization enforces a process of comparison. What is new in either ovum or spermatozoon must meet with what is old in the other, and the test will favour conformity and conservation. The more grossly new will be eliminated on grounds of incompatibility.
It is very easy to fall into the notion that if the new is viable, then there must have been something wrong with the old. This view, to which organisms already suffering the pathologies of over rapid, frantic social change are inevitably prone, is, of course, mostly nonsense. What is always important is to be sure that the new is not worse than the old.

the pleasure of finding things out

is an excellent book by Richard Feynman (isbn 978-0-141-03143-9). As usual I'm going to quote from a few pages:
Looking at the bird he says, "Do you know what that bird is? It's a brown throated thrush; but in Portuguese it's a … in Italian a …, " he says "in Chinese it's a …, in Japanese a …," etcetera. "Now," he says, "you know in all the languages you want to know what the name of the bird is and when you've finished with all that," he says, "you'll know absolutely nothing whatever about the bird. You only know about humans in different places and what they call the bird. Now," he says, "let's look at the bird."
I said, "Say, Pop, I noticed something: When I pull the wagon the ball rolls to the back of the wagon, and when I'm pulling it along and I suddenly stop, the ball rolls to the front of the wagon," and I says, "why is that?" And he said, "That nobody knows," he said. "The general principe is that things that are moving try to keep on moving and things that are standing still tend to stand still unless you push on them hard." And he says, "This tendency is called inertia but nobody knows why it's true." Now that's a deep understanding - he doesn't give me a name, he knew the difference between knowing the name of something and knowing something, which I learnt very early.
To do high, real good physics work you do need absolutely solid lengths of time.
You cannot expect old designs to work in new circumstances.
If you are in a hurry, you must dissipate heat.
We had lots of fun.
The people underneath didn't know at all what they were doing. And the Army wanted to keep it that way; there was no information going back and forth... I felt that you couldn't make the plant safe unless you knew how it worked… I said that the first thing there has to be is that the technical guys know what we're doing. Oppenheimer went and talked to the security people and got special permission. So I had a nice lecture in which I told them what we were doing, and they were all excited. We're fighting a war. We see what it is. They knew what the numbers meant. If the pressure came out higher, that meant there was more energy released and so on and so on. They knew what they were doing. Complete transformation! They began to invent ways of doing it better. They supervised the scheme. They worked all night. They didn't need supervising at night. They didn't need anything. They understood everything. They invented several of the programs that we used and so forth. So my boys really came through and all that had to be done was to tell them what it was, that's all. It's just, don't tell them they're punching holes. As a result, although it took them nine months to do three problems before, we did nine problems in three months.
Most of the trouble was the big shots coming all the time and saying you're going to break something, going to break something.
We used to go for walks often to get rest.
Advertising, for example, is an example of a scientifically immoral description of the products.
The magnetic properties on a very small scale are not the same as on a large scale.
But what we ought to be able to do seems gigantic compared with our confused accomplishments. Why is this? Why can't we conquer ourselves?
Erosion and blow-by are not what the design expected. They are warnings that something is wrong. The equipment is not operating as expected, and therefore there is a danger that it can operate with even wider deviations in this unexpected and not thoroughly understood way… The O-rings of the Solid Booster Rockets were not designed to erode. Erosion was a clue that something was wrong. Erosion was not something from which safety can be inferred.
We have also found that certification criteria used in Flight Readiness Reviews often develop a gradually decreasing strictness.
The computer software checking system and attitude is of highest quality. There appears to be no process of gradually fooling oneself while degrading standards so characteristic of the Solid Rocket Booster or Space Shuttle Main Engine safety systems. To be sure, there have been recent suggestions by management to curtail such elaborate and expensive tests as being unnecessary at this late date in Shuttle history. This must be resisted for it does not appreciate the mutual subtle influences, and sources of error generated by even small changes of one part of a program on another. There are perpetual requests for changes as new payloads and new demands and modifications are suggested by the users. Changes are expensive because they require extensive testing. The proper way to save money is to curtail the number of requested changes, not the quality of testing for each.
Official management, on the other hand, claims to believe the probability of failure is a thousand times less. One reason for this may be an attempt to assure the government of NASA perfection and success in order to ensure the supply of funds. The other may be that they sincerely believe it to be true, indicating an almost incredible lack of communication between themselves and their working engineers.
It is presumptuous if one says, "We're going to find the ultimate particle, or the unified field laws," or "the" anything.

systems thinking slide deck





Here are the slides for the talk on Systems Thinking that Niklas Bjornerstedt and I did in the Scotsman pub in Oslo yesterday. The point about "Your wife is very beautiful" is that it is very easy to read that in a static sense. To get a sense of reading it in a more dynamic (relative) sense, imagine if someone replies "compared to who?!"

The books mentioned in the talk are:
The Law of Unintended Consequences slide has three images for these three stories:

the (de)composition fallacy

You've probably heard the saying "the whole is greater than the sum of the parts". You can think about this saying in various ways. For example, if you remove the brain from a man you no longer have a man minus a brain. You have two dead things. A brain is more than just a "part" of a man. A part has relationships to a whole that contribute to the essential wholeness of the whole.

Another way is at a much simpler level - where all the parts are of the same type aggregating together over time. Time patterns them. Human beings are pretty poor at seeing effects over time. We tend to think things are more permanent than they are. We think that the way things are now is how they've always been and how they'll continue to be. I recall showing my children some old Victorian pennies and telling them that's what pennies used to look like. They didn't believe me! Before street lighting it was apparently very common in England to have two sleeps a day. The short one after a midday meal was called the "small sleep". The time we eat our main meal has changed. Eating several courses at a meal is a relatively recent phenomenon. The cutlery we eat with has changed. And what we wear. Before the 17th century your left and right shoe (if you had shoes at all) were the same and were called "straights".

But things do change, and this matters because

Things that don't matter in isolation often matter in composition.

Suppose you compile some code and you get a few warnings. Do these warnings matter? You might think not, since there are only a few, but they do. Tomorrow you'll be writing some more code, and the day after that some more too, and so will your colleagues. After six months those few warnings have turned into 3000 warnings. That's the composition fallacy.

3000 warnings is a big problem for at least two reasons. But the two reasons I'm thinking of are really the same reason. Let me explain.

The first reason is that if you've got 3000 warnings then any new warnings aren't even noticed in the comparative vastness of the existing 3000. You've become completely desensitized to warnings. The number of warnings inexorably grows but no one notices - you've just got "a lot of warnings" that is always "a lot of warnings".

The second reason is the same reason but in reverse.

Suppose you see what you couldn't see before - that a lot of warnings is a composition problem - and you try to do something about it. You've now got to work hard learning how to do something you don't know how to do - namely how to write code without warnings. That takes effort. And what do you get for all your effort? Almost nothing it seems! The number of warnings goes down a tiny bit but there are so many warnings you've hardly made any difference at all. A lot of warnings remains a lot of warnings.

One thing you can do in this situation is to stop reporting the total number of warnings and switch to reporting only the number of warnings either added or removed. This is an example of switching from a static measure (the number of warnings is now 2981) to a more dynamic measure (in the last 24 hours the number of warnings has gone down by 19).
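A minimal sketch of that switch in Ruby (the warning counts here are invented; in practice you'd parse them out of the nightly build logs):

```ruby
# Static measure: the running total - numbing, "a lot" stays "a lot".
# Dynamic measure: the day-to-day delta - small efforts become visible.
daily_warning_counts = [3000, 2993, 2981]  # one total per nightly build

deltas = daily_warning_counts.each_cons(2).map { |before, after| after - before }

deltas.each do |delta|
  trend = delta.negative? ? "down" : "up"
  puts "warnings #{trend} by #{delta.abs} in the last 24 hours"
end
```

The total hides the effort; the delta rewards it.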

Consilience


is an excellent book by Edward O. Wilson (isbn 0-349-11112-X). As usual I'm going to quote from a few pages:
The first step to wisdom, as the Chinese say, is getting things by their right names.
The cost of scientific advance is the humbling recognition that reality was not constructed to be easily grasped by the human mind.
Analysis and synthesis, he [Goethe] liked to say, should be alternated as naturally as breathing in and breathing out.
Nothing in science - nothing in life, for that matter - makes sense without theory.
Complexity is what interests scientists in the end, not simplicity.
Consilience among the biological sciences is based on a thorough understanding of scale in time and space.
Complexity theory can be defined as the search for algorithms used in nature that display common features across many levels of organisation.
In a system containing perfect internal order, such as a crystal, there can be no further change.
The brain is a machine assembled not to understand itself, but to survive.
The biologist S. J. Singer has drily expressed the matter thus: I link, therefore I am.
No example of bias-free mental development has yet been discovered.

6000 degree-minutes

If you use the same recipe you get the same bread.

That's the White Bread Warning from Jerry Weinberg's truly excellent The Secrets of Consulting.

I was thinking about that the other day and I realized something important. I realized that when I read the word recipe I thought about the ingredients but not really about the non-ingredient related instructions in the recipe. About time. A recipe doesn't just tell you what to mix with what, and in what order, it tells you how long to apply heat. And how much heat. These two things matter just as much as the ingredients. If you change the ingredients you'll get different bread. But if you change the time or the amount of heat you'll also get different bread. Although it might not look much like bread.

Suppose the recipe says to heat the oven to 200 degrees and then cook for 30 minutes. That's 6000 degree-minutes. Now 1200 degrees for 5 minutes is also 6000 degree-minutes. But the bread will be predictably black. Similarly 1 degree for 6000 minutes is also 6000 degree-minutes. But the bread will still be ingredients. Or rather it won't. You see 6000 minutes is 100 hours. Which is 4 days as near as makes no difference. That matters because ingredients are organic. They have a shelf life. A sell-by/eat-by expiry date. They decay. And even if baking the ingredients for 4 days at 1 degree did produce something vaguely bread-like the extra time would create extra cost. In lots of ways. Extra time does that.
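A throwaway Ruby sketch makes the point: degree-minutes is easy to compute, but it predicts nothing about the bread. The outcome labels are mine, not the recipe's:

```ruby
# Three "recipes" with identical degree-minutes and very different outcomes.
degree_minutes = ->(degrees, minutes) { degrees * minutes }

recipes = {
  "bread"           => [200, 30],   # the recipe as written
  "charcoal"        => [1200, 5],   # predictably black
  "stale raw dough" => [1, 6000]    # 100 hours; the ingredients decay first
}

recipes.each do |outcome, (degrees, minutes)|
  puts "#{outcome}: #{degree_minutes.call(degrees, minutes)} degree-minutes"
end
```

Multiplying the two numbers together throws away exactly the structure that matters.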

In the brain of me

Here's a video of the SkillsMatter talk I did on Thursday, titled "Stuff I'm starting to know now that I really wish I'd known 20 years ago". It's loosely based on the theme of Making The Invisible More Visible, one of my entries from the book, 97 Things Every Programmer Should Know. I completely botched what I was trying to say about courage. What I was trying to say was that courage is not the absence of fear.

My other SkillsMatter talk was based on Do More Deliberate Practice, my other entry in the 97 Things book.

This was the first run of quite a lot of new material so I was quite nervous, but I felt most of it went very well. Here's some of the feedback.
  • Fun, informative, useful
  • An interesting brain dump
  • Entertaining and enlightening
  • Very good. Thoughtful and interesting
  • Great content - very interesting
  • Great laid back presentation
  • Interesting ideas and great presentation to go with it
  • Well planned presentation, not just your standard powerpoint
  • Very good talk. Inspiring
  • Very interactive and well explained
  • Clear explanations, good analogies, funny
  • Fantastic
  • Very good. Passionate speaker. Good insights


Why do car drivers brake?

Recently I wrote a small blog entry Why do cars have brakes? It's getting a lot of hits. Well, a lot for me. So naturally in my quest for even more hits here's a follow up...

Imagine you're in a car. Driving along. Why do you brake? I'm not asking what happens when you press the brake pedal. That's too easy (the car stops). I'm asking why…?


.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
.
Here are two reasons:

Reason one: because there's danger ahead

This corresponds to a failing test. More specifically, it's when a test runs and asserts. A red. Perhaps there's a queue of traffic ahead (often a problem on the orbital car park known as the M25). Perhaps the road is closed off because workmen are working on it. Or maybe the local council has been meddling and the road is now a one-way-but-not-the-way-you-want-to-go road. Whatever the reason, something is not as expected.

Reason two: because you've arrived!

This corresponds to a passing test. A green. Wherever it is you're going, you've got there. Nothing unexpected happened. No traffic queues. No closed off roads. No meddling local council. Incredible!

In my entirely unscientific sampling almost everyone answers with the first reason. The second reason is not nearly as common. I find this interesting. I think maybe it's a reflection of the thinking statically vs thinking dynamically thing again. A perception that tests are most useful when they fire red. That tests which run to completion without incident are not so useful. But they are. Maybe more so.

Brakes help me stop. And stopping implies I'm already moving - I'm already going somewhere. But where? If I don't know where I'm going why am I moving at all? If I don't know where I'm going I'm just as lost as if I don't know where I am.

Tests are useful not just because of the I-didn't-arrive effect when they fail, they're also useful because of the I-did-arrive effect when they pass.

Another reason passing tests are really useful is the elves. But the elves will have to wait for another time.

Micro refactoring - from here to there

One of my Mastery book snippets from a few days ago reads:

Masters ... are zealots of practice, connoisseurs of the small, incremental step.

I was doing the roman numerals kata in ruby and something related to small incremental steps occurred to me. Here's three tests:

def test_to_roman
  assert_equal "I", to_roman(1)
  assert_equal "II", to_roman(2)
  assert_equal "III", to_roman(3)
end

and here's the first code snippet:

def to_roman(n) 
  roman = ''
  if n == 3
    roman = 'III' 
  end
  if n == 2
    roman = 'II'
  end
  if n == 1
    roman = 'I'
  end
  roman
end

I can refactor these three if statements into a single while statement, in very small steps, as follows:

def to_roman(n) 
  roman = ''
  if n >= 3
    roman += 'I'
    n -= 1
  end
  if n >= 2
    roman += 'I'
    n -= 1
  end
  if n >= 1
    roman += 'I'
    n -= 1 
  end
  roman
end

and then:

def to_roman(n) 
  roman = ''
  if n >= 1
    roman += 'I'
    n -= 1
  end
  if n >= 1
    roman += 'I'
    n -= 1
  end
  if n >= 1
    roman += 'I'
    n -= 1 
  end
  roman
end

and then:

def to_roman(n) 
  roman = ''
  while n >= 1
    roman += 'I'
    n -= 1
  end
  roman
end

Ok. Now here's the original again:

def to_roman(n) 
  roman = ''
  if n == 3
    roman = 'III'
  end
  if n == 2
    roman = 'II'
  end
  if n == 1
    roman = 'I'
  end
  roman
end

And this time I refactor in a different direction, towards an array lookup:

def to_roman(n) 
  units = [ '', 'I', 'II', 'III' ]
  roman = ''
  if n == 3
    roman = units[3]
  end
  if n == 2
    roman = units[2]
  end
  if n == 1
    roman = units[1]
  end
  roman
end

And then:

def to_roman(n) 
  units = [ '', 'I', 'II', 'III' ]
  roman = ''
  if n == 3
    roman = units[n]
  end
  if n == 2
    roman = units[n]
  end
  if n == 1
    roman = units[n]
  end
  roman
end

And then:

def to_roman(n) 
  units = [ '', 'I', 'II', 'III' ]
  roman = ''
  if n == 3
    roman = units[n]
  end
  if n == 2
    roman = units[n]
  end
  if n == 1
    roman = units[n]
  end
  roman
  units[n]
end

And then:

def to_roman(n) 
  units = [ '', 'I', 'II', 'III' ]
  units[n]
end

In this case I can refactor from a sequence of if statements in two directions equally easily - towards a while or towards an array lookup. Ok. Now consider the situation if I was starting from a switch instead of a sequence of ifs:

def to_roman(n) 
  case n
  when 3 then 'III'
  when 2 then 'II'
  when 1 then 'I'
  end
end

From this I can easily refactor towards an array lookup but refactoring towards a while is perhaps not quite so straightforward.

I'm suggesting that a construct is useful not just in its own right but also in relation to all the other constructs it might get refactored to or from. How easily can I move from one construct to another? Do some constructs live only on one-way roads? Do some lead you down more dead-ends than others?

I'm reminded of the family on holiday in the West Country who got thoroughly lost. Spotting a farmer leaning on a gate they stop the car, wind down the window and ask

Excuse me. Can you tell me how to get to Nempnett Thrubwell?

The old farmer looks at them and says

Well.... you don't want to start from here.


Test-gunpowder-pudding driven development

I was rereading chapter 3, Systems and Illusion in Jerry Weinberg's excellent An Introduction to General Systems Thinking yesterday. On page 56-57 Jerry writes:

If I say: "The exception proves the rule" in front of a large class, there will be a division in understanding... Some will believe I have uttered nonsense, while others will understand "The exception puts the rule to the test"

I've read the book four times. I'm a slow learner but this time something clicked and I immediately understood the earlier passage:

...the exception does not prove the rule, it teaches it.

Jerry goes on:

"Proof" in its original sense was "a test applied to substances to determine if they are of satisfactory quality."

I was struck by two thoughts when I read this. One was the parallel with testing. Of a test as a "proof". The other was the word original. I realized that when I hear the word "proof" I have a strong association with its noun meaning rather than its verb meaning. I tend to think of a proof as a finished proof that completely proves something. It's the noun-verb thing I've blogged about before. I wondered if there were any old dictionaries online so I could get a feel for how the generally accepted meaning of the word proof might have changed over time. There are. http://machaut.uchicago.edu/websters has two Webster's dictionaries. The 1828 dictionary came back with:

Proof [noun] 1. Trial; essay; experiment; any effort, process or operation that ascertains truth or fact. Thus the quality of spirit is ascertained by proof; the strength of gun-powder, ...

The 1913 one came back with:

Proof [noun] 1. Any effort, process, or operation designed to establish or discover a fact or truth; an act of testing; a test; a trial.

and a modern dictionary http://www.thefreedictionary.com/proof said:

Proof [noun]. 1. The evidence or argument that compels the mind to accept an assertion as true.

I find the difference fascinating. The 1828 and the 1913 definitions define the noun as the process whereas the modern one defines the noun as the evidence resulting from the process.

Jerry continues:

We retain this meaning in the "proofs" of printing and photography, in the "proof" of whiskey, and in "the proof of the pudding." Over the centuries, the meaning of the word "prove" began to shift, eliminating the negative possibilities to take on an additional sense: "To establish, to demonstrate, or to show truth or genuineness."

At first I didn't understand the bit about "eliminating the negative possibilities". I think it's partly to do with my ITA spelling at school. But I am persistent. Slowly it came to me. The word that did it for me in...

"Proof" in its original sense was "a test applied to substances to determine if they are of satisfactory quality."

...was the word if. To determine if they are of satisfactory quality. The proof was an act. There was the possibility of failure.

I started thinking about the word proof a bit more. I googled the phrase "the proof of the pudding". If you think this phrase is pretty meaningless then you're right - it's a shortened version of:

The proof of the pudding is in the eating.

Again it's about the possibility of failure. It reminds me of the scene in the film The Cat in the Hat (another film Patrick and I love watching) where the Cat has just made some cupcakes (with the amazing kupcake-inator). He tries one and says:

"Yeuch. They're horrible. Who wants some?"

I love that line. I also googled the word proof as related to alcohol content. The history behind the phrase is just wonderful. In the 18th century spirits were graded with gunpowder. Imagine you're buying some spirits. How would you know if an unscrupulous merchant had watered it down? You couldn't tell just by looking. What they did was pour a sample of the solution onto a pinch of gunpowder. If the wet gunpowder could still be ignited then the solution had proved itself. Don't you just love that?

So let's hear it for pudding and for spirits and for gunpowder and for tests. And for the possibility of failure.

Nudge

is an excellent book by Richard Thaler and Cass Sunstein (isbn 978-0-141-04001-1). As usual I'm going to quote from a few pages:
School children, like adults, can be greatly influenced by small changes in the context.
There is no such thing as a 'neutral design.'
Roughly speaking, losing something makes you twice as miserable as gaining the same thing makes you happy. In more technical language, people are 'loss averse'... Loss aversion helps produce inertia.
Most teachers know that students tend to sit in the same seats in class, even without a seating chart.
Eating turns out to be one of the most mindless activities we do. Many of us simply eat what is put in front of us.
Social scientists generally find less conformity, in the same basic circumstances as Asch's experiments, when people are asked to give anonymous answers.
On average, those who eat with one other person eat about 35 percent more than they do when they are alone; members of a group of four eat about 75 percent more; those in groups of seven or more eat 96 percent more.
Self-control issues are most likely to arise when choices and their consequences are separated in time.
Even hard problems become easier with practice.
The best way to help Humans improve their performance is to provide feedback.
For most of their time on earth, Humans did not have to worry much about saving for retirement, because most people did not live long enough to have much of a retirement period.

Things are the way they are because they got that way

If you're working on a complex codebase and you're trying to understand the complexity by looking at the codebase then you're looking in the wrong place. That's like the man in Peopleware who loses his keys in a dark street and looks for them in the adjacent street because, as he explains, "the light is better there". A codebase is the way it is because it got that way. Slowly. Over time. If you're looking at the code you're looking at the effect and not at the cause. It was the developers that did it!

Things are the way they are because they got that way.


Recency

I've written before that:

Human beings have evolved a very strong association that cause and effect are simple and linear; that cause and effect are local in space and time.


One example of this is when contestants on Who Wants To Be a Millionaire phone a friend. It's easy to think that the friend could simply watch the program and start trying to find the answer before they're called. They can't. What you're watching happened some time in the past. The program is not live.

Another example is this blog! At the recent ACCU conference several people commented on what a fast reader I was. I often blog three or four book snippets in a week. And I am learning to increase my reading speed. And I do read a lot. But it's an illusion. I have a stock of hundreds of books I've already read and all the best bits from them are already marked. I simply take a book from my shelf and blog the best of its best bits as a book snippet. And then put the book onto the ACCU charity book-stall pile.