This sequence from two years ago still seems relevant.
My recent post on psi, together with a positive reference to it in Irreducible Mind, prompted me to return to a book I read many years ago, before I started blogging: Dean Radin’s The Conscious Universe. This is the second of two posts; the first was published yesterday.
Processes of Distortion
Radin devotes a whole chapter to explaining the various processes that contribute to our persistent distortion of reality. These processes apply to all of us and are not unique to scientists, but scientists should theoretically be the group of people least susceptible to them. Unfortunately, as the evidence cited in the previous post indicates, this is very far from being the case.
Some of what Radin explores we have already looked at in some detail on this blog, for example Kahneman’s analysis of what he terms System 1 thinking, which I have described as our gut instinct: quick to reach strongly held conclusions about reality, but frequently wrong on complex matters. These are important aspects of the problem, but I don’t intend to repeat my exploration of them here. Those who are interested should click on the links.
He also includes, in his list of factors, filtering processes such as suppression, repression and dissociation (something existential approaches term ‘discounting’). While these are intriguing, I’m not convinced they are among the most important.
To my mind social factors, which he deals with at length in other parts of his book, carry more weight. Scientists concerned about their careers, or those for whom the opinion of colleagues carries excessive weight, will clearly be motivated to deny what they see. They will probably, though, be aware at some level that this is what they are doing. I want to focus here on some of the unconscious processes at work among biased scientists who are convinced that they are merely being objective.
An over-arching consideration is what in lay terms is called ‘wishful thinking.’ Radin writes (page 229):
We do not perceive the world as it is, but as we wish it to be. . . . . Essentially, we construct mental models of the world which reflect our expectations, biases, and desires, a world that is comfortable for our egos, that does not threaten our beliefs, and that is consistent, stable, and coherent.
This process creates a degree of commitment (page 235) that mobilises us to defend our egos when the picture we believe in is threatened. It also links with what he terms ‘expectancy effects’ (page 234).
He also brings into focus (page 237) a well-attested problem of a similar nature known as confirmation bias. Our brains are wired to accept information that fits with our existing views of the world and to reject out of hand and almost automatically anything that clashes.
All of these aspects combine to create the phenomenon demonstrated in a possibly apocryphal story from the past. A sceptical version from the North Coast Journal goes like this:
Have you heard of the invisible ships phenomenon, cited in several new-age books and movies? It goes like this: When Captain Cook/Columbus/Magellan (depending on the version of the story you’re hearing) arrived at the coast of Australia/Cuba/South America, the native people completely ignored them, presumably because huge ships were so alien to their experience that “… their highly filtered perceptions couldn’t register what was happening, and they literally failed to ‘see’ the ships.” (Quoting here from JZ Knight’s What the Bleep Do We Know?)
The story seems to have originated with Joseph Banks, botanist on Captain James Cook’s 1770 voyage. On several occasions while they were off the coast of Australia, he commented that the natives paid virtually no attention to the 106-foot long Endeavour. On April 28, sailing north along the east coast of Australia, he recorded in his diary that fishermen “… seemd to be totaly engag’d in what they were about: the ship passd within a quarter of a mile of them and yet they scarce lifted their eyes from their employment … ”
Banks seemed to be troubled by not being the star attraction: “Not one was once observd to stop and look towards the ship; they pursued their way in all appearance intirely unmovd by the neighbourhood of so remarkable an object as a ship must necessarily be to people who have never seen one.”
Radin gives us a well-replicated example of the power of ‘prior convictions’ that is far closer to home (page 231).
Bruner and Postman created a deck of normal playing cards, except that some of the suit symbols were colour-reversed. For example, the queen of diamonds had black-coloured diamonds instead of red. The special cards were shuffled into an ordinary deck, and then as they were displayed one at a time, people were asked to identify them as fast as possible.
At short exposures people made the obvious mistakes and failed completely to spot the anomalies: a colour-reversed card was labelled either the queen of spades or the queen of diamonds, with no sign that the dissonance had registered. Even increased exposure times did not necessarily produce accurate perceptions.
When the researchers increased the amount of time that the cards were displayed, some people eventually began to notice that something was amiss, but they did not know exactly what was wrong. One person, while directly gazing at a red six of spades, said, “That’s the six of spades but there’s something wrong with it – the black spade has a red border.”
As the display time increased even more, people became more confused and hesitant. Eventually, most people saw what was before their eyes. But even when the cards were displayed for 40 times the length of time needed to recognise normal playing cards, about 10 per cent of the colour-reversed cards were never correctly identified by any of the people!
If something so simple can create such confusion, it’s not entirely surprising, even if it is disappointing, that scientists, most of whom are committed in advance to the impossibility of psi, a far more complex phenomenon, should fail to recognise it even when they have fallen over it.
Radin concludes this exploration (page 246):
The expectations of the scientific elite actually put them more at risk of being swayed by perceptual biases than the general public. After all, the scientific elite have lifelong careers and their credibility on the line. They are strongly motivated to maintain a certain belief system. By contrast, most members of the general public do not know or care about the expectations of science.