
Peter Cochrane's Uncommon Sense: Science and belief
Let's not create "a house of cards standing on a foundation of sand", says Dr Peter Cochrane
The business world would do well to learn from the kind of verification and peer group review that is central to science. Peter Cochrane spells out the dangers of being slapdash...

I can't recall the point in my education and training when I became so enamoured of the scientific method, but lifelong exposure to it over many projects has made me a strong advocate. What has been surprising is the difficulty that I, and others, experience in trying to explain and defend the inherent power and surety of science to those outside the community.

As a student I had the good fortune to have professors who insisted I solve problems using at least three different techniques, starting from different assumptions, to achieve the same answer. When the same or very similar answers to a given problem were realised in this manner, then and only then would they be confident we were on the right track. However, at the leading edge of science, engineering and business we often find ourselves with a very limited set of options. The techniques and starting conditions available are often limited and incomplete.
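In software terms the same discipline might look like the following minimal sketch, which checks a deliberately trivial problem by three independent routes (the example is mine, purely for illustration):

    # Three independent routes to the same answer: the sum 1 + 2 + ... + n.
    n = 100

    # Technique 1: direct iteration over every term.
    by_loop = sum(range(1, n + 1))

    # Technique 2: Gauss's closed-form formula, n(n + 1) / 2.
    by_formula = n * (n + 1) // 2

    # Technique 3: pair the ends inward: (1 + n), (2 + n - 1), ...
    by_pairing = (1 + n) * (n // 2)  # assumes n is even

    # Then, and only then: agreement across techniques builds confidence.
    assert by_loop == by_formula == by_pairing == 5050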

Ultimately we rely upon others independently achieving the same or similar results. In an ideal world, experimenters should be free of bias in their selection of technique, measurement and subject. Where human subjects are involved, they too should be unaware of what is being done, how it is being done and what the outcome might be.

In the case of testing drugs, for example, it is best that the clinicians involved have no idea whether they are administering a given drug or a placebo, and the same should apply to the patients. The real experimenters are the observers, who play no part in the administration or the measurement. When this is repeated across at least three different locations around the planet and a good correlation of results has been achieved, then and only then are the results deemed credible.

Ideally, practitioners and patients should be blind to the experiment, and the observers analysing the outcomes should be the only people aware of all the variables. This 'double blindness' is the most powerful safeguard we have devised against human fallibility.
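As a rough sketch of how such an allocation might be organised in software (the patient labels, pack codes and trivial randomisation below are illustrative assumptions, not any standard clinical system):

    import random

    # A minimal sketch of double-blind allocation; all labels are illustrative.
    patients = ["P001", "P002", "P003", "P004", "P005", "P006"]

    # Balanced allocation: half drug, half placebo, randomly shuffled.
    treatments = ["drug"] * 3 + ["placebo"] * 3
    random.shuffle(treatments)

    # The key is held only by the observers, never by clinicians or patients.
    blinding_key = dict(zip(patients, treatments))

    # Clinicians and patients see only opaque pack codes.
    pack_codes = {p: f"PACK-{i:03d}" for i, p in enumerate(patients, start=1)}

    for patient, code in pack_codes.items():
        print(patient, code)  # what the clinician sees: no hint of contents
    # blinding_key is consulted only by the observers, at analysis time.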

This approach turns out to be our ultimate method of establishing truth over belief and achieving real understanding. Finally, we employ publication to make all techniques and results public, so that others can repeat, confirm or criticise our experiments, and go on to refine and polish techniques, theories and models. Not surprisingly, a large percentage of experiments turn out to be unrepeatable in the wider scientific community, and their results are therefore dismissed or parked for later investigation.

Such a gatekeeper function is of immense value as it guards against fraud, human error, bigotry, personal bias, greed, graft and belief. It is absolutely essential in science, engineering and technology, as all three advance by standing on the shoulders of previous generations and their assured results. If we did not build our knowledge in this way, we would be creating a house of cards standing on a foundation of sand.

Outside the reaches of science we find a multiplicity of belief systems and practices leading to wrong assumptions, bad decisions and, often, catastrophes. In the past few years, for example, we have seen the accounting profession discredited in the shape of Arthur Andersen, Enron and WorldCom. According to press reports, the US Generally Accepted Accounting Principles (GAAP) were flouted, cross-checks ignored, and results massaged and interpreted for reasons essentially down to human greed and/or ignorance.

Legal and governmental systems are also prone to cause grief, as they too are dominated by human fallibility rather than rigorous and testable frameworks. We see governments adjusting figures to show improvements in education and healthcare year on year to appease an electorate. But very quickly a few percentage points of improvement each year lead to an unbelievable state where the disparity between the political mirage and practical experience becomes gross. The downside of such violations of trust centres on the raised expectations of the individual and society.
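A quick sketch shows how fast such claims compound (the 3 per cent annual figure is a purely illustrative assumption, not any official statistic):

    # Illustrative only: a hypothetical 3% claimed improvement per year.
    claimed_annual_improvement = 0.03

    level = 1.0  # index the starting level of the metric at 1.0
    for year in range(1, 21):
        level *= 1 + claimed_annual_improvement
        if year % 5 == 0:
            print(f"After {year:2d} years: {level:.2f}x the original level")

    # After 20 years the claimed compound improvement is roughly 1.81x,
    # a level few voters would recognise from practical experience.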

Unfortunately the same is true in many circles of human activity and is deeply embedded in all management and political systems. Individuals choose to ignore demand, technology, human nature, practical and theoretical evidence and experience. Why? Belief systems, human ignorance, stupidity and vested interests are powerful allies and extremely resilient in all cultures, past and present.

If education systems are improving as advertised, we should perhaps be witnessing a huge improvement in governance and decision-making, but I suspect they are getting worse. How come? Fewer people are being educated to appreciate any form of science, and technology is regarded by more and more as some form of magic.

And worse, in an IT-dominated world that is moving faster, and becoming increasingly non-linear, belief systems are even more damaging. Bad decisions now create even worse results in a much shorter time.

Having worked all day in London, I have just walked from Pall Mall to Liverpool Street station because the city is at a virtual standstill. This column was typed on my Apple laptop on a train that was delayed for 35 minutes. Copy was dispatched from my home over a dial-up 56Kbps modem connected at 48Kbps.