. . . . psychotic symptoms exist on a continuum even in healthy individuals (Stefanis et al., 2002). This, too, seems to be explicable if psychosis is a way to cope with existential distress – as psychosis would be quantitatively, rather than qualitatively, different from normal.

(Psychosis as Coping by Grant S Shields – page 146 in Existential Analysis 25.1: January 2014)

There is growing interest in the idea that ‘psychotic’ crises can sometimes be part of, or related to, spiritual crises, and many people feel that their crises have contributed to spiritual growth. A number of clinical psychologists have also explored the interface between psychosis and spirituality. Some believe that at least some ‘psychotic’ episodes can be transformative crises that contain the potential for personal, including spiritual, growth. Many people who believe that there is a spiritual element to their experiences find support from others with similar beliefs invaluable, for example within faith communities.

(From Understanding Psychosis and Schizophrenia published by the British Psychological Society – page 55)

In the last post I began to look at a paper (British Journal of Clinical Psychology, 2012, 51, 37-53: the findings are discussed on pages 41-49) by Charles Heriot-Maitland, Matthew Knight and Emmanuelle Peters on the subject of what they call Out-of-the-Ordinary-Experiences or OOEs.

Where their findings became even more intriguing from my point of view was when their discussion used terminology with clear spiritual implications that are held in common across NDEs, mystical states and meditative practices. They write:

Another subjective phenomenon reported by both [clinical] and [nonclinical] participants was the sensation of ego loss, what essentially seemed to be a breakdown of the normal psychological relationships between mind-body and/or self-others.

A fear reaction was frequently reported and ‘is likely to have largely come from the unfamiliarity of [the] experience . . . . It is possible that more prolonged absorption was caused by the emotionally fulfilling role of the OOE in a psychological problem-solving process.’

This was followed in their report by more of a spiritual nature concerning the discovery of deeper meaning:

This symbolic, deeper meaning perhaps reflects the quality of awareness that is not filtered or confined by the conceptual boundaries of ordinary day-to-day experience… If the ego breaks down, then it may be that perception of the world becomes unbounded and limitless . . . .

This, in their view, paves the way for a shift in consciousness:

Following on from the previous theme, which conveys an awareness that is free from the influences of a ‘conditioned’ conceptual framework, this theme suggests the implementation of a new conceptual framework, or a new way of looking at the world.

Where their work maps onto that of Jenny Wade is in the idea that, when our old models of reality cease to work in new situations, a state of uncomfortable dissonance is created that leads to a breakthrough to new levels of understanding:

It could be that the initial psychological crisis arose in many participants due to an inadequacy of their existing conceptual framework in making sense of their emotional experience. . . . . . It may be that a new way of thinking was the necessary, adaptive ‘solution’ to the crisis; that the old conceptual framework had to be replaced by a new one for the emotional experience to become integrated.

Wade’s model maps closely onto Dabrowski’s Theory of Positive Disintegration in key respects. She analyses, in a more fine-grained fashion than Dabrowski, which kinds of conflict and discomfort spur us to move up from the comfort zone of our present level of consciousness to the next step up the ladder of awareness. Dabrowski, as I have explored elsewhere, correlates this most strongly with an intensity best described as suffering.

The next point the paper makes is crucial:

[T]he fact that, apart from existential questioning, there has been no notable difference up to this point in the OOEs of [clinical] and [non-clinical] groups implies that this problem-solving process is neither pathological nor indicative of clinical psychosis.

The real issue lies somewhere else altogether. They explain in a particularly important passage:

More of the [nonclinical] participants received validating/accepting responses from others, and more of the [clinical] group received invalidating responses, as these quotes illustrate:

‘[I] relayed this experience to psychiatrists in the [hospital] and was sent for EEG tests, was told that I was hallucinating – this guy just didn’t listen to, just obviously haven’t heard anything really that I’d said . . .’

‘Somebody came up to me and said “well, you know, we really need to hear from you. That’s a very powerful message to people, and they need to hear that message.” And that did matter to me.’

For the individual who is, perhaps, already slightly hesitant about how best to incorporate their experience into their social worlds, the difference between these two social interactions could be immense.

All non-clinical participants demonstrated some prior understanding or interest in their OOEs, which are generally described as ‘life-enhancing.’ Furthermore, ‘These life-enhancing qualities, which were reported by the majority of participants, add further support to the psychological problem-solving hypothesis. Not only did the OOEs provide many participants with relief from emotional suffering, but they also added a dimension that enriched other life domains. . . . . The medical (illness) explanation clearly presented barriers to similar reflections in the clinical population . . .’

The blame for the fact that some people’s experiences eventually come to feel dark, negative and ultimately inescapable seems to lie with the negative approach adopted by others, especially the medical profession:

More [non-clinical] than [clinical] participants viewed their experience as a temporary stage or process. . . . . . [I]f the causes and subjective nature of OOEs are no different between [non-clinical] and [clinical] groups, then it seems misleading for professionals to inform one group that their OOEs signal ‘the end,’ [ie they are stuck with them] while the other group continue with their (enhanced) lives.’

This has echoes for me of how the reaction of others determines how the experiencer responds to distressing NDEs, which also has an impact on their future mental well-being. Nancy Evans Bush writes (Dancing Past the Dark: Kindle reference 2502-05):

Experiencers have told many sad stories of going to a professional for help in understanding their NDE, only to find themselves caught up in the medical model, pathologized by a diagnostic label and the NDE dismissed as meaningless. . . . . . . People have also told of being dismissed by their rabbi or pastor as well, for in a secular society much awareness of deep spiritual process is lost or distorted, even within religious institutions themselves.

Stephanie Beards and Helen Fisher, in a 2014 paper (Social Psychiatry and Psychiatric Epidemiology, 49: 1541–1544), shed further light on the dynamics of this. They write (page 1542):

It has been proposed that negative core schemas [ingrained patterns of thought or behaviour that affect experience] are formed early in life and may result from adverse experiences in childhood. If an individual experiences further trauma later in life, these schemas could become (re)activated, leading to emotional changes which may not only cause the development of psychotic experiences, but alter the appraisal of these anomalous occurrences, further increasing distress, and preventing a benign explanation from being concluded.

Even so, such experiences do not need to cast a shadow over the rest of a person’s life. The experiences themselves, as the current British Journal of Clinical Psychology study demonstrates, are not significantly different between the two groups, nor are the potential explanations they develop. Nearly all participants gave some acknowledgement of the link between psychotic and spiritual experience.

Because the OOEs of all participants seemed, at some level, to fulfil a psychological purpose, they were interpreted as being a part of an adaptive psychological problem-solving process, which frequently involved the breakdown of conceptual ego boundaries, and the formation of a new conceptual outlook.

However, regarding group differences (my emphases), they write:

[T]here was a sense that [non-clinical] participants were better able to incorporate their OOEs into their personal and social world. This was partly due to more [non-clinical] participants having prior conceptual knowledge of, and in some cases, open attitudes towards, their OOEs; however, the more prominent reason seemed to be that more [non-clinical] participants received validation and acceptance from others.

The saddest point of all perhaps is this:

It would seem that the more OOEs are associated with clinical psychosis, the less chance people have of recognising their desirability, transiency, and psychological benefits, and the more chance they have of detrimental clinical consequences.

They draw some very strong conclusions from this:

An important clinical implication is that psychotic experiences should be normalised, and people with psychosis should be helped to re-connect the meaning of their OOEs with the genuine emotional and existential concerns that preceded them. . . . . . However, the current findings suggest that the argument for normalisation goes far deeper than just its clinical usefulness; they imply that a more ‘radical normalisation’ approach is needed, when normalising OOEs becomes an intrinsic formulation and treatment principle.

During my decluttering, I also came across a number of journals which describe current approaches to creating psychological descriptions of a patient’s problems, known as formulations in psychobabble. Nowhere, for any patient group, did I find reference to any kind of spiritual dimension, though the word ‘cultural’ was thrown in from time to time, and might have concealed an entrance through which such considerations could possibly have infiltrated the consultation process.

When it comes to psychosis, where the default first-line treatment is medication rather than therapy (or meditation), there is an additional problem:

Unlike antipsychotic drugs, which can suppress the emotional expression, this approach [of accepting the validity of the emotions underlying the OOEs] would validate and encourage the emotional expression, whilst working on building a more helpful conceptualisation or narrative about the emotional concerns.’

The authors do not regard their paper as definitive. They are all too aware of its possible limitations, shown, for example, by their reference to methodological caveats concerning small sample size and possible confounding variables not picked up at screening and subsequently controlled for.

I do not think those caveats constitute reasons for ignoring or minimising the significance of their findings, but rather they should be a motivating factor for the generation of further work on this issue. In the meantime, even in advance of further findings, we should be spurred to introduce into the clinical setting a far greater sensitivity to the emotional and spiritual meaning of such experiences.

Elizabeth Gould in her Lab

Given my recent repeat rant about neuroscience’s resistance to the facts of neuroplasticity, reposting this short sequence seemed a good idea.

Going Back over Old Ground

From Lehrer’s account in his book Proust Was a Neuroscientist, we can pick up the story from the point in the previous post on this topic when Elizabeth Gould, much to her surprise, had found long-discounted evidence of neuroplasticity in the literature. She began to explore it in depth.

She read papers by Altman, Kaplan and Nottebohm (page 41):

She realised [they] all had strong evidence for mammalian neurogenesis. Faced with this mass of ignored data, Gould abandoned her earlier project and began investigating the birth of neurons.

She published new data over the next eight years (ibid.):

Gould’s data shifted the paradigm. More than thirty years had passed since Altman first glimpsed new neurons, but neurogenesis had become a scientific fact.

By 1998 ‘even Rakic admitted that neurogenesis was real.’

So what?

The implications of neurogenesis are of extreme importance (page 42):

What does the data mean? The mind is never beyond redemption, for no environment can extinguish neurogenesis. As long as we are alive, important parts of the brain are dividing. The brain is not marble, it is clay, and our clay never hardens.

Norman Doidge

It is Norman Doidge‘s book, though – the one I bought at the same time as Lehrer’s, completely unconscious of their close correspondences – that expands upon the human cost of the arrogance that buried the evidence for thirty years. His whole book The Brain That Changes Itself is an accessible but authoritative explanation of the multitude of ways that neuroplasticity impacts upon us – how belief in it promotes healing and scorn of it has prolonged suffering.

I got the heads up about this fascinating book from a friend at a Bahá’í meeting. Standing absolutely upright at over six feet in height, he looked me straight in the eye and said with absolute conviction, ‘You really must read this book. I’m 75 years old now and I’m functioning better mentally than I was at the age of 60, purely as a result of doing the exercises it talks about.’ After a recommendation like that, how could I resist? I’m 73 now, trying to pretend I can keep up with myself at 50. It was a no-brainer.

I’ll just take a few points from one chapter – Redesigning the Brain – to illustrate just how poisonous the dogmatism of science has been in this critical area.

He discusses the work of Michael Merzenich (page 49):

In a series of brilliant experiments he showed that the shape of our brain maps changes depending upon what we do over the course of our lives.

At first his interest in brain plasticity had to go on the back burner. After remaining underground for a few years with his ideas, he had an opportunity in 1971 to research them using adult monkeys. His findings were dismissed: they could not possibly be true. He was opposed by the most influential figures in the field. This was not just frustrating at a personal level (page 62):

“The most frustrating thing,” says Merzenich, “was that I saw that neuroplasticity had all kinds of potential implications for medical therapeutics – for the interpretation of human neuropathology and psychiatry. And nobody paid attention.”

People previously seen as beyond help could form new maps in the brain and live more normal lives (page 63) – ‘people with learning problems, psychological problems, strokes, or brain injuries’ – but only if the idea was accepted and became the basis for widespread interventions.

It wasn’t until the late 1980s that Merzenich was able to develop a deep and accurate understanding of how positive changes could be facilitated. He teased out the importance of motivation (page 66), how individual neurones got more selective with training (page 67), how they came to operate more quickly (ibid) and perhaps most importantly of all (page 68) ‘that paying close attention is essential to long-term plastic change.’

It wasn’t until 1996 that he, along with a number of colleagues (page 70), ‘formed the nucleus of a company . . . that is wholly devoted to help people rewire their brains.’

A Costly Case of Dogma

Even if you only date the start of a belief in neuroplasticity at 1962 – and there is some evidence it could fairly be backdated earlier than that – 34 years seems a long time to wait for such a clinically vital concept to surface into general practice.

I can testify to that from personal experience. From when I first studied psychology in 1975 until I qualified as a clinical psychologist in 1982, the conventional wisdom was that the adult brain had virtually no capacity to change itself. I cannot exactly remember when it became respectable to doubt that dogma, but I am fairly sure it was well into the 90s. And even then it was a qualified scepticism only. We were into the new century before I became aware of the wide-ranging and radical possibilities that people like Schwartz (see Mind over Matter link below) have written about.

It is horrifying to contemplate the human cost of such resolute intransigence in the face of compelling data. It testifies, in McGilchrist’s terms, to the power of the left-brain to shut out the evidence of experience in order to keep faith with its often misguided maps. If a huge body of carefully accumulated and completely credible evidence such as this took so long to make a dent in this particular dogma, how long is it going to be before science takes serious steps to investigate spiritual realities? Anyone who attempts any such thing at the moment has almost certainly killed their career and will have their evidence subjected to an onslaught of nit-picking that no findings could ever survive.

Thus does science make it impossible even for its own practitioners to investigate, let alone to understand, what it has decided in advance is impossible. So much for its spirit of genuine enquiry. This has to change if human thought and society are to grow beyond the current straitjacket of materialism.

Related Material:

Mind over Matter

The Master and his Emissary by Iain McGilchrist

Michael Merzenich TED talk (I had to grit my teeth every time he referred to the brain as a machine – a fascinating, if slightly breathless talk none the less):



George Eliot


Kick-Started by Coincidence

Till just over a fortnight ago I would not have connected George Eliot with neuroplasticity. Why on earth should I have?

Well, I can now think of at least one reason.

During the same period as I was spending part of my book token loot on ‘Zero Degrees of Empathy‘ (see previous post), I had the chance to spend what was left. Not surprisingly I went to the same shelf (‘Popular Science’) in the same Waterstones. I was on the hunt for two books, one by Norman Doidge – The Brain That Changes Itself – and one by Jonah Lehrer – Imagine: How Creativity Works. I ended up buying the Doidge but a different one by Lehrer – Proust Was a Neuroscientist.

Bear with me a bit longer – this is going to get interesting in a minute. I had no idea that I was setting up a remarkable juxtaposition of events.

It’s my month for such things I suspect. Just recently a couple I had not seen for many many years came to visit us from London. On arrival at our house the wife was in a state of astonishment. ‘You two,’ she said incredulously to her husband, ‘have been exchanging Christmas and Bahá’í New Year cards for decades but I never saw the address you were sending our cards to. This is the very road I used to live in when I was a child. How unlikely is that?’

I had no idea, when I was paying for the two books, that I’d just bought into another improbable coincidence, maybe not so dramatic but a touch more generally significant. Sheldrake’s notion of morphic resonance was gaining credibility by the second.

Marshalling Middlemarch in his Argument

When I got home I kicked off by starting to read Lehrer’s book. There was a mildly interesting chapter on Walt Whitman – no offence to his memory but I’ve always found Emily Dickinson, though secretive to the point of invisibility, far more impressive. But I would prefer the introverted poet of the two, wouldn’t I?

Then Lehrer moved on to one of my all-time favourite writers and her greatest novel: a chapter on George Eliot using Middlemarch as a springing-off point. He quoted her as follows (page 38):

Dorothea – a character who, like Eliot herself, never stopped changing – is reassured that the mind “is not cut in marble – it is not something solid and unalterable. It is something living and changing.”

I don’t think I’m being pedantic to point out that this is not the exact quote. At the beginning of Chapter 72 Dorothea is talking to Mr Farebrother and the conversation goes like this:

“Besides, there is a man’s character beforehand to speak for him.”

“But, my dear Mrs. Casaubon,” said Mr. Farebrother, smiling gently at her ardour, “character is not cut in marble – it is not something solid and unalterable. It is something living and changing, and may become diseased as our bodies do.”

Lehrer isn’t making the mistake of absolutely identifying Eliot with a statement of one of her characters, albeit a vicar and a man of some integrity. He is looking at how Eliot examines the capacity for change we all have through a variety of lenses. She is part of his body of evidence that science and art are not irrevocably at odds. It is unfortunate that he strengthens his case for her as an early proponent of neuroplasticity by this substitution of ‘mind’ for ‘character.’ It does make the subsequent shift to talking about brains easier though. And she is as ‘anti’ any form of reductionism as Lehrer is.

Science or Scientism?

And when he does start talking about brains the discussion becomes fascinating. One reason for this is that the other book I had bought is entirely focused on aspects of neuroplasticity. The second completely unexpected reason is that both discussions of this issue contain page after page that vindicate Sheldrake’s contention that science in its current form is ‘dogmatic.’ Sheldrake wrote in The Science Delusion (page 4):

I have written this book because I believe that the sciences will be more exciting and engaging when they move beyond the dogmas that restrict free enquiry and imprison imaginations.

His main aim is to attack its materialism as a creed not a fact (page 6):

Contemporary science is based on the claim that all reality is material or physical. There is no reality but material reality. Consciousness is a by-product of the physical activity of the brain. Matter is unconscious. Evolution is purposeless. God exists only as an idea in human minds, and hence in human heads.

However, dogma also operates within science even where there is no such violation of this creed. The mind/brain problem is a powerful example of where dogma within science has clearly impeded our proper understanding of the brain and the mind/brain relationship for decades, to the detriment of huge numbers of people. It illustrates how assumptions can and often do become fossilised, useful only for stoning into oblivion all those who disagree.

A Heretic at Work?

Lehrer, in his book, looks at the career of Elizabeth Gould, for example (page 39). In 1985 Pasko Rakic had proclaimed he possessed conclusive proof from experiments with rhesus monkeys that neurones are only generated ‘during pre-natal and early post-natal life.’ So the adult brain could not grow neurones: end of story. Everyone who mattered seemed to have believed him. No resurrection then for the notion of neurogenesis lying discredited in its mausoleum.

In 1989, during a completely different piece of research, Gould had found the unexpected: ‘the brain also healed itself.’  Gould assumed she must have been mistaken. She went back to the literature expecting to find that this was the case. To her astonishment she found the opposite. She found a wealth of evidence dating back to 1962 that all sorts of fully grown creatures were capable of growing neurones in their adult brains. All the evidence had been arbitrarily dismissed out of hand and entombed out of sight on neglected library shelves.

What happened next will have to wait until the next post.

A particularly shocking demonstration of the limitations of the genetic argument is an epidemiological analysis of the prevalence and incidence of schizophrenia in Nazi Germany, wherein it is estimated between 220,000 and 269,500 citizens with the diagnosis were forcibly sterilized or murdered by the Nazi regime (Read & Masson, 2013; Torrey & Yolken, 2010). Contrary to everything that is known about genetic, heritable conditions, the rates of schizophrenia diagnoses in Germany did not diminish after the war but increased. The analysis showed this atrocity provided proof against the very reasoning used to instigate it.

(The Role of Social Adversity in the Etiology of Psychosis by
Eleanor Longden and John Read – page 11)

Some time ago on this blog I addressed the issue of neuroplasticity. I shared my frustration at how the neuroscientific community’s resistance to the idea that the mature brain could change had been a damaging doctrine for decades.

As I wrote in 2012, even if you only date the start of a belief in neuroplasticity at 1962 – and there is some evidence it could fairly be backdated earlier than that – 34 years seems a long time to wait for such a clinically vital concept to surface into general practice.

I can testify to that from personal experience. From when I first studied psychology in 1975 until I qualified as a clinical psychologist in 1982, the conventional wisdom was that the adult brain had virtually no capacity to change itself. I cannot exactly remember when it became respectable to doubt that dogma, but I am fairly sure it was well into the 90s. And even then it was a qualified scepticism only. We were into the new century before I became aware of the wide-ranging and radical possibilities that people like Schwartz have written about.

It is horrifying to contemplate the human cost of such resolute intransigence in the face of compelling data.

I have expressed equal frustration, if not more, at the obdurate dogmatism with which mainstream materialistic science denies validity to spiritual experiences of almost any kind.

Not even once in my entire experience of being taught psychology did I ever hear of Frederic William Henry Myers, a resolute explorer of the borderland between mind and spirit. The closest encounter I ever had of this kind was with William James. He was mentioned in asides with a dismissive and grudging kind of respect. The implication was that he was an amazing thinker for his time but nowadays very much old hat. I gave him a quick glance and moved on.

Looking back now I realise I was robbed.

Kelly and Kelly capture it neatly and clearly in the introduction to their brave, thorough and well-researched book, Irreducible Mind (pages xvii-xviii):

[William] James’s person-centered and synoptic approach was soon largely abandoned . . . in favour of a much narrower conception of scientific psychology. Deeply rooted in earlier 19th-century thought, this approach advocated deliberate emulation of the presuppositions and methods – and thus, it was hoped, the stunning success – of the “hard” sciences especially physics. . . . Psychology was no longer to be the science of mental life, as James had defined it. Rather it was to be the science of behaviour, “a purely objective experimental branch of natural science”. It should “never use the terms consciousness, mental states, mind, content, introspectively verifiable, imagery, and the like.”

And, sadly, in some senses nothing much has changed. Too many psychologists are still, for the most part, pursuing the Holy Grail of a complete materialistic explanation for every aspect of consciousness and the working of the mind.

I have a comparable, perhaps even greater, sense of frustration about a similarly destructive dogmatism that bedevils the clinical/psychiatric approach to so-called psychotic experiences. This is far more damaging, for reasons that will become clear in a moment, than the a priori rubbishing of psi or near death experiences, unhealthy as that undoubtedly is.

My recent decluttering process triggered the feeling all over again. I’ve been sorting through back issues of my psychology journals. In the process, I found one article of particular interest on this theme. Sadly it was the only one I found in the dozens of journals I have checked through for items of interest before deciding whether to discard them. (As I later discovered through trawling the web and my British Psychological Society website in particular, there are others sailing against the hitherto prevailing current of dogmatic biodeterminism, but they are still the exception rather than the rule. The BPS as a body, to its credit, is getting on board as well, as quotes I use in later posts will testify.)

The journal[1] was dated 2012 and contained a paper by Charles Heriot-Maitland, Matthew Knight and Emmanuelle Peters on the subject of what they call Out-of-the-Ordinary-Experiences or OOEs. The focus of the study was to use a phenomenological interview process that enabled them to compare the experiences of two small groups of people, one group who had been diagnosed as psychotic, labelled the clinical (C) group, and the other who had not, labelled the non-clinical (NC) group.

Their operating assumption from the start was that voice-hearing prevalence, which runs at 10-15%, (page 38) ‘suggests that OOEs do not inevitably lead to psychiatric conditions, and that people can experience psychotic-like phenomena whilst continuing to function effectively.’

They also refer to two other pieces of research from this sparsely populated field of investigation.

First of all, they quote Brett et al (2007) as finding that ‘while [their Diagnosed] group were more likely to appraise their experiences as external and caused by other people, the [Undiagnosed] group made more psychological, spiritual and normalising appraisals, and reported higher perceived understanding from others. . . . . They . . . did find trauma levels in both groups to be higher than in the general population.’

Jackson and Fulford (1997), which they describe as the only known published qualitative study of clinical and nonclinical populations with OOEs, also found that psychotic-like experiences were triggered in both groups by intense stress in the context of existential crises, and that the subsequent group distinction depended on ‘the way in which psychotic phenomena are embedded in the values and beliefs of the person concerned.’

Later work has expanded on this. For instance, Eleanor Longden and John Read in their review of the evidence concerning the role of social adversity in the etiology of psychosis (American Journal of Psychotherapy, Vol. 70, No. 1, 2016: pages 21-22) summarise a wealth of data that suggests that, not only is trauma a clear factor in the incidence of psychosis, but also psychotic experiences relate strongly to the nature of the trauma experienced. For example, work with 41 patients experiencing a first episode of psychosis found that attributes of stressful events in the year preceding psychosis onset were significantly associated with core themes of both delusions and hallucinations (Raune, Bebbington, Dunn, & Kuipers, 2006).

Where the OOE work is particularly significant is in the emphasis it places on the potentially positive function of the psychotic experience in and of itself, a rare perspective indeed. Even a paper on the existential approach (Grant S Shields – Existential Analysis 25.1: January 2014 – page 143) takes a somewhat darker view of such experiences, seeing psychosis as ‘a mechanism for coping with existential distress – a way of being that allows an individual to escape existential realities when that individual cannot avoid these things otherwise.’ I will be returning to a more detailed consideration of his valuable but different position in a later post.


Later in this sequence I will refer back to other thinking and data that expand on the relationship between levels of consciousness or understanding, and the stress caused by experiences that challenge the models of reality we have so far developed. In the remainder of this first post in the sequence I’ll just focus on the basics of what this study found (pages 41-49). Please bear in mind as you read that we should do our best to see the experiences labelled ‘psychotic’ not as some alien state remote from anything we might ever have to undergo ourselves, but as simply part of a continuum, a dimension, along which we are all placed and to a similar extreme of which we could therefore, given the wrong circumstances, at some point also be thrust. I’ll be returning to this theme in a later sequence as well.

Nearly all participants in both groups reported a period of emotional suffering before their first OOE. There was a sense, therefore, that the first OOE was a direct expression of emotional concerns at the time. For details of what some of the OOEs were like, see the table above.

A process of existential questioning came into the mix. Similar to the emotional suffering, there also seemed to be some direct relevance of OOEs to the context of participants’ existential questioning. From this, it could be interpreted that the OOE actually emerged as a direct expression of, or indeed solution to, some kind of psychological crisis.

Isolation, which was reported equally across both groups, was caused either by intentional social withdrawal or by private preoccupation with other activities. It may therefore be that isolation has more of a causal role in triggering the experience itself, perhaps because it encourages introspective focus on the kinds of emotional and/or existential concerns mentioned above.

At first I thought the authors might be operating on an implicit assumption that isolation is generally undesirable, but revised that view in the light of the paper as a whole.

One of their most striking findings was the powerful language used by participants to describe the emotionally fulfilling and euphoric qualities of their experiences.

Next Monday I’ll be looking more directly at the spiritual implications of this.


[1] British Journal of Clinical Psychology (2012) 51, pages 37-53.


Based on: Hotel Room, by Edward Hopper (1931). Photo: Kim Dong-kyu


My good friend Barney sent me the link today. I have just finished reading Andrew Sullivan’s article in the New York Magazine. It is moving, penetrating and manages, to my envy, to be both deep and wide-ranging. It’s a long read but once I started it I wasn’t going to stop till I came to the end. There are many beautiful insights, of which this is only one:

Attached to my phone, I had been accompanied for so long by verbal and visual noise, by an endless bombardment of words and images, and yet I felt curiously isolated. Among these meditators, I was alone in silence and darkness, yet I felt almost at one with them.

Below is a short extract: for the full post see link.

An endless bombardment of news and gossip and images has rendered us manic information addicts. It broke me. It might break you, too.

I was sitting in a large meditation hall in a converted novitiate in central Massachusetts when I reached into my pocket for my iPhone. A woman in the front of the room gamely held a basket in front of her, beaming beneficently, like a priest with a collection plate. I duly surrendered my little device, only to feel a sudden pang of panic on my way back to my seat. If it hadn’t been for everyone staring at me, I might have turned around immediately and asked for it back. But I didn’t. I knew why I’d come here.

A year before, like many addicts, I had sensed a personal crash coming. For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. And at times, as events took over, I’d spend weeks manically grabbing every tiny scrap of a developing story in order to fuse them into a narrative in real time. I was in an unending dialogue with readers who were caviling, praising, booing, correcting. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long.

I was, in other words, a very early adopter of what we might now call living-in-the-web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone — connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. Twitter emerged as a form of instant blogging of microthoughts. Users were as addicted to the feedback as I had long been — and even more prolific. Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never-stopping, this always-updating. I remember when I decided to raise the ante on my blog in 2007 and update every half-hour or so, and my editor looked at me as if I were insane. But the insanity was now banality; the once-unimaginable pace of the professional blogger was now the default for everyone.

If the internet killed you, I used to joke, then I would be the first to find out. Years later, the joke was running thin. In the last year of my blogging life, my health began to give out. Four bronchial infections in 12 months had become progressively harder to kick. Vacations, such as they were, had become mere opportunities for sleep. My dreams were filled with the snippets of code I used each day to update the site. My friendships had atrophied as my time away from the web dwindled. My doctor, dispensing one more course of antibiotics, finally laid it on the line: “Did you really survive HIV to die of the web?”

A relevant blast from the past in the light of yesterday’s post.

Forget about jam and Jerusalem. Marmalade and meditation is the real deal.

Bruce made a significant comment on my review of Iain McGilchrist‘s book about the need for a proper balance between the way the two halves of our brain work together, the left with its word-dependent logic and the right with its creative intuition:

At the time I “found” McGilchrist’s book I was reading concurrently a history of Greek philosophers, a narrative of the development of the “western mind” and a quirky travelogue of discovery of “the psyche of Persia we really don’t know”, searching for something I wasn’t sure existed – a unified view, a coherence that McGilchrist just dropped into my lap . . . .  Best of all, when I put the book down, I find myself more inclined to seek out a wetlands forage for watercress than a newscheck on the internet!

When I read it I felt a twinge of envy at the idea of foraging for watercress in a wetlands habitat. It was only fleeting though. My connection with nature has always tended to be passive rather than active. My interest in gardens, for example, extends only as far as sitting in them with immense pleasure: any actual gardening tends to result in injury or accidental damage. I end up lacerated by thorns or by cutting the trimmer cable in half. This takes the edge off any slight pleasure I might have felt and tips me well over the cliff into aversion.

So, I came to feel, perhaps with a slight sense of smug complacency, that the impact on me of McGilchrist’s insights, though considerable, might extend no further than a bit of meditation laced with poetry. And those who have been following this blog will testify there’s been a lot of poetry recently. I never thought I’d sink to practicalities.

Until, that is, a friend of ours gave my wife a hefty bag of plums. It looked like there were millions of them and they were very small. My wife mentioned something about making jam so I made some excuse about needing to answer a load of emails and disappeared into my study. I was there for what felt like several hours and thought the whole thing would have blown over by the time I came downstairs to make a cup of coffee.

As even Basil Fawlty at his most obtuse would have realised, making coffee requires going into the kitchen, and going into the kitchen, when jam making is in the air, is not a smart move for those who don’t want to make jam. As soon as I walked in I knew I had made a fundamental error. There on the table was a mountain of plums piled carefully in a massive bowl. Within seconds – I’m still not sure how it happened – I was back on my computer looking for recipes for plum jam. One of the drawbacks of Google is that you can find exactly what you don’t want if you make the mistake of looking for it. And I did.

Initially I emailed three of the recipes to my wife and came back downstairs to continue making the coffee.

‘Have you got the recipes, love?’ my wife asked quietly.

‘I’ve emailed them to you,’ I said defensively.

‘Couldn’t you print one off?’ came the response.

It was at that point I knew the game was up.

. . . . . . . . . . . . . .

What I didn’t yet realise was how fulfilling the making of plum jam would be. And how my decision to resume regular and disciplined meditation two months previously had made it possible for me to take pleasure in exactly the kind of fiddly repetitive task that would have driven me to complete distraction just a few short weeks before. Meditation had enabled me to maintain focus far better, accept the repetition in good spirit, notice with genuine surprise and pleasure the way each rounded fruit was subtly different from the last one, and learn by stealth rather than conscious effort how to become more efficient and dextrous at getting every last piece of flesh off even the tiniest stone. Preparing the plums in this way became a form of meditation in itself, a spiritual discipline that changed my consciousness, heightened my awareness and developed new skills. It changed me in a way that generalises to many things I do, from emptying the dishwasher to replying to emails.

And it paved the way for making marmalade. My favourite form of jam.

But before I come onto that perhaps I’d better explain why I started to meditate again so much in earnest.


Meditation from Article by Sam Harris


It’s true that I have always done some meditation ever since I first learnt at the London Buddhist Society in the early 70s. But it had been a long time since I had done so with the discipline of those early days. It’s also true that the emphasis psychology now places on mindfulness had, for some years, rekindled my interest a little. But I had of late been much more interested in reading about it than really doing it. And the McGilchrist book, while it drew me back to music and poetry, left my pattern of meditation very much as it found it.

In truth, I felt I was far too busy to make the time for anything more than a perfunctory gesture at the task. I had far more important things to do and I raced around doing them until the warnings from my interactions with the world became too strong for me to ignore.

First, in spite of my lip-service to mindfulness, I became so ungrounded by the pace I was keeping up that I spilt coffee on my laptop and destroyed it. That jolted me more than a little, but I still did not fully wake up to my need to change something radically until, late at night a month later, in a haze of fatigue, with my whole close family in the car, convinced I was already on the dual carriageway which was in fact still half a mile down the road, I moved out to pass the slow-moving car and trailer ahead of me. I was alerted to my mistake when I saw, with initial incredulity, the headlights of an oncoming car heading straight for me in the distance. I pulled back inside with time to spare, more by good luck than good judgement. What shocked me most about this incident was that fatigue had warped my perception of reality so much that what I believed about where I was completely overrode the cues telling me otherwise, cues that were plainly there for me to see and respond to.

I remembered the story about a well-known Bahá’í, Dorothy Baker, who had a serious and almost fatal car accident on a steep mountain road.

She mused aloud to a friend: ‘I wonder what God is trying to tell me.’

To which the reply came: ‘Dorothy, you drive too fast!’

The same kind of answer came to me in a flash, in the aftermath of this near collision: ‘Pete, you’re driving yourself too fast.’

Carl Jung used to say something like, ‘When life has a message for you, it first of all taps you gently on the shoulder, maybe more than once. Then, if you don’t notice, it will slap you in the face. If you still don’t pay attention it will bang you hard on the head.’ This moment was my bang on the head.

It became clear to me that I had to take meditation seriously, slow down and trust that I would still be able to do all that was truly important to do.

So, at the start of every day since then, for half an hour at least, I have practised a form of meditation. (I won’t bore you with the details here but for anyone interested I’ve posted the basic model, as used in a group exercise, at this link Turning the Mirror to Heaven. It also explains how the method can be used alone.)

Initially I found it almost impossible to step back from a very disempowering belief. I believed that making time to meditate, and then using the calm I had generated to slow down my pace of work, would in fact make the whole situation much worse by leaving a trail of neglected tasks in my wake for others to trip over.

And it’s true I’ve had to decline some requests to take on more than I could do, and that was hard. But to my astonishment, almost all the major projects I’ve taken on continue to progress, though it still is hard to trust that the pace is fast enough – but as far as I know there’s no great harm done (‘yet’ says the voice I have to fight every time I meditate or do things mindfully).

And the strangest thing of all is that there has been time to make my own marmalade. I never thought I’d see the day when I would take pleasure in slicing orange peel into thin strips as though I had all the time in the world, my enjoyment marred by only the faintest suspicion that in doing so I must be neglecting something more important.

So my present unprecedented state of mind seems to be thanks to marmalade, McGilchrist and meditation. I still find myself wondering quite often, though, how long it will be before life pricks this bubble too. Some people are never satisfied.

Oh and, by the way, we gave a jar of plum jam to the friend who’d set this whole jam thing going and to my surprise she seemed to love it. Perhaps she was just being polite.