Rules of thumb and unreliable models

We were discussing, a few posts below, the scourge of U.C.H. I am referring, of course, to Dan Kahan's unintentionally humorous criticism of the "unreliable cognitive heuristics" of the unwashed masses. We just cannot get them to take our word for stuff anymore. They keep relying on their guts to decide whether we're crying wolf and trying to dazzle them with B.S. Where's the trust?

What’s funny is the idea that your average smart Yalie uses something better than rules of thumb to weigh essentially unquantifiable risks for political purposes.  If you’re of a rigorous turn of mind, you can get a pretty good handle on risks in repetitive situations that are susceptible to statistical analysis.  You can’t get anything like a rigorous handle on risks from models of the behavior of chaotic systems that have never met the gold standard of predictions confirmed by observations (and no fair back-fitting with previously unidentified critical factors).  The best anyone could ever get out of an emerging science of prediction is a gut feel, an instinct for where to focus future research.
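You can see the difference in one screen of code. Here is a minimal sketch (every parameter invented for illustration, nothing more): repeated independent trials let you pin a failure rate down with an error bar that shrinks as the data pile up, while a textbook chaotic system, the logistic map at r = 4, amplifies a billionth of uncertainty in the starting point into total ignorance within fifty steps.

```python
import math
import random

# (a) A repetitive, statistically tractable risk: estimate a failure
# rate from independent trials. The standard error shrinks like
# 1/sqrt(n), so more data genuinely buys more precision.
def estimate_rate(p_true=0.03, n=10_000, seed=1):
    rng = random.Random(seed)
    failures = sum(rng.random() < p_true for _ in range(n))
    p_hat = failures / n
    stderr = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, stderr

# (b) A chaotic system: the logistic map at r = 4. Two trajectories
# a hair's breadth apart diverge completely within a few dozen
# steps, so no amount of pencil-sharpening on the initial data
# yields a precise long-range prediction.
def logistic_divergence(x0=0.4, eps=1e-9, steps=50, r=4.0):
    a, b = x0, x0 + eps
    for _ in range(steps):
        a, b = r * a * (1 - a), r * b * (1 - b)
    return abs(a - b)

if __name__ == "__main__":
    p_hat, stderr = estimate_rate()
    print(f"repetitive risk: p ~ {p_hat:.4f} +/- {stderr:.4f}")
    print(f"chaotic gap after 50 steps: {logistic_divergence():.3f}")
```

The first number comes with an honest, shrinking error bar; the second shows an initial discrepancy of one part in a billion grown to order one, which is the whole game with chaotic dynamics.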

Richard Feynman analyzed the loss of the shuttle Challenger.  He found that people were sharpening their pencils to an absurd degree and fooling themselves into thinking they had pinpointed risk out to a number of decimal places.  In fact, they were piling probability assumption on probability assumption, when no single assumption had a solid empirical basis:

It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life.  The estimates range from roughly 1 in 100 to 1 in 100,000.  The higher figures come from the working engineers, and the very low figures from management.  What are the causes and consequences of this lack of agreement?  Since 1 part in 100,000 would imply that one could put a Shuttle up each day for 300 years expecting to lose only one, we could properly ask “What is the cause of management’s fantastic faith in the machinery?”

. . .

There is nothing much so wrong with this as believing the answer!  Uncertainties appear everywhere. . . . When using a mathematical model careful attention must be given to uncertainties in the model.

. . .

There was no way, without full understanding, that one could have confidence that conditions the next time might not produce erosion three times more severe than the time before.  Nevertheless, officials fooled themselves into thinking they had such understanding and confidence, in spite of the peculiar variations from case to case.  A mathematical model was made to calculate erosion.  This was a model based not on physical understanding but on empirical curve fitting.
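Feynman's sanity check is simple arithmetic, and the pencil-sharpening problem he describes is just as easy to demonstrate: multiply a stack of guessed probabilities, each honest only to within a factor of a few, and the precision of the product evaporates. A rough sketch (the ten component rates and the factor-of-3 honesty band are assumed for illustration, not taken from the report):

```python
# Feynman's sanity check: at 1-in-100,000 per flight, a launch per
# day for 300 years should cost about one vehicle.
launches = 300 * 365            # ~109,500 flights
print(launches * 1e-5)          # expected losses ~ 1.1

# Stacked assumptions: multiply ten guessed per-component failure
# rates, each known only to within a factor of 3 either way.
# (The rates and the factor of 3 are invented for illustration.)
guesses = [1e-3] * 10
nominal = 1.0
low = 1.0                       # every guess pessimistic by 3x
high = 1.0                      # every guess optimistic by 3x
for g in guesses:
    nominal *= (1 - g)
    low *= (1 - 3 * g)
    high *= (1 - g / 3)
print(f"nominal failure rate: {1 - nominal:.6f}")
print(f"honest range:         {1 - high:.6f} to {1 - low:.6f}")
```

The nominal answer prints to six decimal places, but the honest range spans nearly an order of magnitude around it. Quoting the middle number to that precision is exactly the self-deception Feynman was describing.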

He concluded with one of my favorite statements, a truly reliable rule of thumb:  “For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”
