The problem is that we are also in the era of The Brand Named Me, which means we are rather selective about those facts we wish to believe and support versus those we tend to find unconvincing or inconvenient. We have so much data on so many subjects that the very nature of the word 'fact' is being called into question.
Just how sustainable, supportable and even relevant facts may be is becoming a serious question in business and public policy. As we have seen in any number of debates, one side's incontrovertible fact is another's distortion.
We pick and choose, challenge and change, argue endlessly - and then agree to disagree. The current situation with cabinet officers in Germany may provide an instructive glimpse of the future: your positions matter less than whether you have plagiarized or lied about the legitimacy of your academic degrees.
In business we see proponents of differing 'visions' battling it out over resource allocation and strategy. It's like 'High Noon' with PowerPoint rather than six-shooters. Everyone involved marshals 'the facts' as they see them, but then a curious thing happens: the facts become incidental. It is the presentation, explanation and interpretation of those facts, rather than the data itself, that usually wins the day. The reality is not just data selectivity but that we have become all too aware that facts change. The verities of the industrial age have given way to the intangibles of the knowledge era. Traditionalists decry the weakening of accounting's rigid guidelines, yet those guidelines neither exposed the fraudulent misstatements of an Enron nor could explain Apple's $7 billion 'look and feel' intellectual property victory over Samsung.
The reality is that in the contemporary economy, facts are a starting point, not an end in themselves. Our ability to identify, organize and project them begins the discussion rather than settles it. Describing our methods, our interpretation of meaning and our view of potential impact is more important than the data themselves. And we must expect that little is sustainable. With the power of information at our fingertips, everything can and will be challenged. All of which should make for better analysis - if not always assuring better decisions. JL
Greg Satell comments in Digital Tonto:
In the classic TV show Dragnet, Sergeant Joe Friday famously admonished witnesses to give him “just the facts.” Generations of business executives have adopted the same approach, demanding substantiation rather than conjecture.
The problem is that the world is a confusing place and there are plenty of facts to go around. A quick Google search is all that is required to find the facts to support any argument. Studies conflict with other studies, contexts shift and the game goes on.
Yet even that understates the problem. Even truths borne out by rigorous analysis are often undone by a rapidly changing world. Last year's truths are often today's red herrings. As rapid technological change transforms politics, culture and economics, we need a new approach that is based less on false certainty and more on simulation.
During World War II, the fighting in the Pacific theater was fierce. Small islands became improvised bases and large amounts of supplies were airdropped to feed the war machine. Food, medicine, weapons and even vehicles appeared from the sky, as if by magic. Once the conflict ended, the flow of manufactured goods mysteriously disappeared.
Alarmed by this sudden turn of events, some of the indigenous island peoples sought to replicate the conditions that had brought the visiting troops such bounty. They built makeshift airfields and offices, fashioned radios and headsets from wood and coconuts and even marched with makeshift rifles in imitation of soldiers' drills.
Alas, no cargo ever came. Anthropologists have named these groups cargo cults and it's fun to laugh at their naiveté. They confused correlation with causation. Clearly, mimicking superficial behaviors achieves nothing and those who think it does are simply fooling themselves.
However, similar rituals are alarmingly common in the corporate world. Executives worship their own gods (like Steve Jobs, or whoever else is the darling of the business press at any given moment), hoping that by emulating their idols' superficial behavior, fortune will smile upon them as well. I've come to call these people cargo cult marketers.
Looking For Support Rather Than Illumination
We like to think we’re rational, but we’re really not. In fact, the basis of our beliefs is often strictly irrational.
We’re very susceptible to previous suggestions (a phenomenon that psychologists call priming) and will pay much more attention to easily available information (the availability heuristic) when making decisions. For instance, a recent pile-up on the news will affect our driving behavior more than comprehensive statistics will.
Once we’ve hit on a belief, we will tend to focus on facts that seem to confirm it (i.e. confirmation bias). The physicist Richard Feynman had this to say about the problem with facts in a famous speech about cargo cults:
The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.
So how do you avoid fooling yourself? You put the data first. You search for illumination rather than support. You begin with doubt, rather than certainty. You act, in other words, scientifically, which unfortunately is a term so often misused that it bears some explaining.
Faith and Science
“Science” is a word that gets thrown around a lot. We’re told that “scientists” have “proven” global warming and then that other “scientists” have expressed doubt. We’re told that evolution is real science, but intelligent design is not. Why is that?
The fundamental value of science is that it is falsifiable. It makes predictions that can be refuted. The predictions of climate scientists can be compared against observable data and, as temperatures rise, we gain confidence in the hypothesis. Darwin gave us more than explanations of the past; he described a testable mechanism that we can verify.
Creationists and advocates of the paranormal give us none of those things. They merely state assertions that we may or may not find plausible and which may or may not be true. We can choose to believe them, gain strength from our faith in them and they may even offer lessons that make us better people. Faith can be a very positive thing.
What we can’t do is disprove them, and that’s the difference between science and pseudoscience. Scientific claims can always be disproven; matters of faith cannot be.
So if we can never be sure that we’re right, only that we’re wrong, what do we do?
Tim O’Reilly, long a fixture in Silicon Valley, likes to talk about perpetual beta. The idea is that products should be constantly updated. Google’s Gmail, for example, was in “beta” until 2009, five years after it had been launched. It had already become one of the most popular email services in the world and still wasn’t considered finished!
Anybody who has been involved with developing technology products knows what a painstaking process this can be. Seemingly endless development meetings and usability testing along with exasperating feature launches and redesigns can certainly inflame passions. However, there is simply no other way to build a great product.
“Fail cheap and fail fast” has become a mantra among start-ups, but larger organizations have problems adopting the same approach. When thousands of jobs are at stake, you have to be careful about what you experiment with. Fortunately, there is another way.
Way back in the 1740s, the Presbyterian minister Thomas Bayes suggested that we should not be shy about guessing: no matter how wrong our initial guess, subsequent evidence will correct it and we will become less wrong over time. The method, called Bayesian inference, was long abandoned in favor of more controlled methods, but it’s coming back.
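Bayes' idea of starting with a guess and becoming less wrong can be sketched in a few lines. This is a minimal illustration, not any particular tool's implementation: we estimate an unknown success rate, starting from a deliberately ignorant prior, and let each new observation (all made up here) nudge the estimate toward the evidence.

```python
# Bayesian updating sketch: estimating an unknown success rate.
# We start with a vague Beta(1, 1) prior (admitting we know nothing)
# and fold in each observation; the posterior mean becomes "less wrong"
# as evidence accumulates.

def update(alpha, beta, successes, failures):
    """Conjugate Beta-Bernoulli update: add observed counts to the prior."""
    return alpha + successes, beta + failures

alpha, beta = 1, 1                        # uniform prior over the rate
observations = [1, 0, 1, 1, 0, 1, 1, 1]   # hypothetical trial outcomes

for outcome in observations:
    alpha, beta = update(alpha, beta, outcome, 1 - outcome)
    mean = alpha / (alpha + beta)
    print(f"posterior mean so far: {mean:.3f}")
```

The point is the shape of the process, not the numbers: a wrong first guess is cheap, because every observation moves the estimate, and the more data arrives the less the initial guess matters.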
In far reaching and varied fields, we are learning to simulate failure in order to succeed in the real world. Digital marketers reserve a small part of their campaigns for A/B testing so that they can improve results in real time. Logistics operations run millions of simulated routes on computers before choosing the best one.
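The A/B-testing pattern described above is easy to see in miniature. The traffic split and conversion rates below are entirely hypothetical; this is a toy simulation of the workflow, not a real marketing platform's API.

```python
import random

# Toy A/B test: reserve a slice of traffic for variant B, then compare
# observed conversion rates. All rates here are invented for illustration.

def simulate_visits(n, true_rate, rng):
    """Count conversions among n visits with a given underlying rate."""
    return sum(1 for _ in range(n) if rng.random() < true_rate)

rng = random.Random(0)
n_a, n_b = 9000, 1000                       # 10% of traffic goes to the test
conv_a = simulate_visits(n_a, 0.030, rng)   # assumed baseline conversion rate
conv_b = simulate_visits(n_b, 0.036, rng)   # assumed variant conversion rate

rate_a, rate_b = conv_a / n_a, conv_b / n_b
print(f"A: {rate_a:.4f}  B: {rate_b:.4f}")
if rate_b > rate_a:
    print("variant B looks better; shift more traffic toward it")
```

In practice the decision would also weigh sample size and statistical significance, but the structure is the same: risk a small slice of the campaign, measure, and reallocate.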
Probably part of the reason that Bayes’ method is regaining popularity is that the power of modern computers makes testing easy. We have a variety of tools, such as Markov chains, agent-based models and other forms of sequential analysis, which allow us to simulate extremely cheaply. Nate Silver has successfully predicted three elections this way.
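Election forecasting of the Silver variety reduces, at its core, to running many simulated elections against polling uncertainty. Here is a deliberately tiny sketch with made-up states, made-up poll leads and an assumed polling error; it shows the shape of the technique, not any forecaster's actual model.

```python
import random

# Toy Monte Carlo election forecast: sample each state's outcome from its
# (invented) polling lead plus Gaussian noise, repeat many times, and report
# how often the candidate clears 270 electoral votes.

polls = {                   # hypothetical state: (poll lead, electoral votes)
    "StateA": (0.04, 29),
    "StateB": (-0.01, 18),
    "StateC": (0.02, 20),
}
BASE = 250                  # electoral votes assumed safe, for illustration

def simulate_once(rng, polling_error=0.03):
    """One simulated election: perturb each lead by noise, tally the votes."""
    won = BASE
    for lead, votes in polls.values():
        if lead + rng.gauss(0, polling_error) > 0:
            won += votes
    return won

rng = random.Random(7)
trials = 10_000
wins = sum(simulate_once(rng) >= 270 for _ in range(trials))
print(f"estimated win probability: {wins / trials:.2%}")
```

Each individual simulation is almost free, which is exactly the point of the passage above: the computer absorbs thousands of cheap simulated failures so the real-world decision doesn't have to.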
However, beyond the technology, we need a change in mindset. Good strategy is always becoming, never being. The mindless quest for absolute substantiation leads to false certainty, not greater rigor.
In the end, all you can really do is try to improve your odds as best you can, manage your risk through good portfolio strategies and adapt to changes in the marketplace when they come about. If you can survive, you can thrive.