Dave Pollard picks up on Kai Krause's response to The Edge 2006 Question, and adds quite a detailed take of his own.
The original Krause analogy is a good one. If you take a memetic view of the cultural ideas that drive our day-to-day decision making, then the crises that occur (those that don't arise from unpredictable natural causes we couldn't have prepared for) are pretty akin to system crashes: conflicts that interrupt the smooth running of the program, irrespective of whether the current program was actually productive, valuable or in any sense good.
Generally, despite my fundamentally informational / computational view of physical reality, I don't subscribe to everyday "computer" analogies of behaviour. Social "programmes" are quite different to typical digital computer software. That said, the cultural bases of decision making are certainly software of a kind.
Dave makes a good point. He says:
“We have ‘forgotten’ how to [respond to quantitative objective evidence] because we have been taught to ignore and suppress our instincts [until it gets personal].”
This is the failure of objectivity I've been banging on about. Without the subjective, the objective is (literally) meaningless. It's also what I believe lies behind the Brunsson "hypocrisy" of management decision making: provided we can show evidence of an objective rationale, we can justify doing the downright wrong thing.
Perhaps, as MoQ-Discuss has suggested, it's not that objectivity is for the birds, but rather that we need to learn a "new objectivity", one that incorporates the meaning of instinctive, personal, subjective involvement alongside the merely (otherwise deliberately detached) objective.
When the road is paved with good (but soft and subjective) intentions, rather than hard objective facts, perhaps we're more likely to notice when we're sleepwalking towards the freight train at the rail crossing?