Robert Whitaker writes in the current issue of the RSA Journal that psychotropic medication is less safe and effective than is commonly believed. The use of antidepressant and antipsychotic medication is widespread, and Whitaker presents troubling evidence that taking these drugs can increase the frequency of relapse and reduce the chances of getting well and staying well. He also shows that patients who never take medication fare much better in the long term than those who accept medication early on. Whitaker is not alone in drawing attention to the unstable foundations of psychiatric prescribing.
The psychologist Richard Bentall and the psychiatrist Joanna Moncrieff have both urged us to recognise that the notion that psychiatric drugs correct imbalanced brain chemistry is a myth, albeit one that has taken a firm hold in the public consciousness. Moncrieff, like Whitaker, explains that these drugs in fact do no such thing: they act on our brains in ways that make them function differently, and in doing so change the way we feel. These medications, she argues, produce drug-induced states which mask or suppress emotional problems. She does not say that we should stop using them, but suggests instead that patients should decide whether to use them with greater awareness of how they actually work.
This is tricky territory for non-experts like me to negotiate. The public misconception that mental illness is caused by brain chemistry being out of kilter is one problem. The poor long-term outcomes for patients prescribed psychiatric medication are another. Yet it cannot be denied that some patients’ severe suffering is ameliorated in the short term by the use of such drugs. The issues raised by conflicting evidence in this field are deeply concerning, and the voices drawing attention to them are becoming louder and more numerous.
The overall effect is of an attack on the orthodox methods of contemporary Western psychiatry. The principal labour of psychiatrists has become to diagnose and prescribe, and medication is the first-line treatment for the majority of patients who come under their care. As evidence builds that such medication is neither a cure nor always a benign intervention, there is a danger that psychiatrists are demonised as unthinking peddlers of poison. A couple of weeks ago I wrote about the problem of the stigma of mental illness. It might not make me popular, but I have some sympathy for psychiatrists, and wonder whether they too are becoming stigmatised.
It is important to remember that psychiatry is a relatively young discipline. There remains an awful lot that is yet to be discovered about what really causes mental illness, and some of the exciting action in the psychiatric field is in neurobiological and genetic research. But some of it is also in the psychosocial arena, as the importance of social connectivity, mindfulness and physical exercise becomes ever more apparent. Although the drugs being routinely prescribed are in some ways rather clumsy, don’t work for everyone, and bring with them unwanted side effects, it is not the case that they are exclusively bad, or that they are dished out in bad faith.
The evidence that the drugs might worsen long-term outcomes is worrying, but we must remember that this evidence is relatively new, and that it takes time to acquire and properly analyse. However, there is a danger that psychiatry has already painted itself into a corner whereby it is only capable of regarding mental illness as a set of neurobiological components, the driving aim being to separate and identify them and then develop the correct psychopharmacological intervention. So while it is an exciting time for psychiatry, it is imperative that psychiatrists take this opportunity to extricate themselves from the clutches of Big Pharma, and open themselves up to the possibility that drug-based treatment should no longer be the first port of call. If they do, then I can envisage a future in which patients are sufficiently informed and reflective to confidently demand to be supported through episodes of mental illness without medication, and psychiatrists become more holistic, discerning and flexible in their approach to treating their patients.
People with mental health problems are the last minority group against whom it’s socially acceptable to discriminate. Sometimes this discrimination comes about accidentally or covertly, Lisa Appignanesi’s recent piece in the Guardian being a case in point. Appignanesi writes that the mental illness ‘industry’ is medicalising normality to a greater extent than ever before. She raises the question of whether the apparent increased prevalence of mental illness is genuinely down to a rising toll of suffering, or whether we have collectively learned to complain more. Appignanesi suggests that the more evidence there is about the increase in mental disorder in the public domain, the more likely we are to label our own problems of living as requiring the attention of a doctor. She goes on to suggest that attending reading groups or going running might do more for sufferers of depression than taking medication and questions the usefulness of psychiatric classification in helping people deal with the problems of their lives.
While I’m sure Appignanesi does not intend to cause offence to people with serious mental health problems, there is a dangerously stigmatising undercurrent to her argument. A distillation of the points she makes might roughly translate as “There’s nothing much wrong with you, you don’t need any pills, pull yourself together.” This might be a useful message for someone who’s struggling slightly with a mild case of the blues, and has the wherewithal and capacity to make a few positive changes in their life. But for someone with a seriously debilitating mental illness, it is a potentially very damaging message.
A serious problem which Appignanesi does not attend to is that the category of ‘mental illness’ is extremely broad. ‘Mental illness’ is no more a discrete entity than is ‘physical illness’, and no physician would deign to lump diabetes in with cancer when trying to understand patients’ ways of dealing with their illness. So when we talk about mental illness, we might be referring to depression, anorexia, schizophrenia, or any of the other 300 or so disorders in the DSM. Within any one of those diagnostic categories lies a huge variation of patient experience, and no two cases of any one of these conditions are ever the same. Just as we all have different pain thresholds, we all have differing levels of resilience to mental distress. But whatever your threshold, there is a level of serious mental suffering which is as intolerable as the most excruciating physical pain. Within the classification of depression there exists a whole spectrum of experience, ranging from unpleasant but bearable gloom which allows one to continue functioning, right down to crippling despair which makes it impossible to get dressed in the morning or to sleep at night. For those at the dark end of the spectrum, attending a reading group or going for a run are utterly inconceivable activities, and no substitute for proper medical intervention.
Appignanesi is caustic about the use of antidepressants, and it seems to me that this might be because she has in her mind people who are just a bit down in the dumps rather than those who have a serious mental health difficulty. The ‘definite lift’ Appignanesi tells us participating in a reading group provides would certainly not have helped Sandra, a woman I met some years ago, who at that moment was desperately waiting for her annual ECT treatment. She told me that ECT was her lifeline, the only thing that lifted her depression sufficiently to make her life liveable, and that without it she would have killed herself ‘several times over’.
I think the point Appignanesi is really trying to make is that it has become very easy for pretty much anyone to walk into the doctor’s, have a bit of a moan, and leave with a diagnosis of depression and a prescription for Prozac. Cultural factors have made it possible for mental illness to be a lifestyle choice. If you can’t be bothered to exercise, eat well, or engage in wholesome activities like reading groups, you don’t have to take responsibility any more, because you can just opt for the convenient excuse that you’re ill. Once your GP has agreed that you’re ill, you can slip into the role of patient and passively wait for treatments to work and experts to make you better. The overlapping agendas of pharmaceutical companies, the health service and government have come together to feed this situation.
Although Appignanesi’s attack on the usefulness of psychiatric classification is understandable, we need to understand that there is a difference between everyday, normal suffering and serious mental illness which requires specialist intervention. It is true that the cut-off point at which normal suffering becomes mental illness can only be determined subjectively, and that the boundary is inevitably arbitrary. I agree with Appignanesi that there is something crazy about a world in which literally any kind of idiosyncrasy can be identified as a symptom of mental illness, and that a complex range of reasons explains the apparent increase in the prevalence of mental disorder. But we need to exercise caution when drawing attention to these problems, because there are real dangers associated with arguing against the medicalisation of ‘normality’: firstly, that people who are really suffering and genuinely need help are not taken seriously; and secondly, that the advantages that come with understanding that mental health is on a spectrum which we all occupy are lost. In other words, that the stigma of mental illness is encouraged. People with mental health problems are routinely discriminated against at all levels, and normalising mental illness is a far more urgent priority for social progress than is preventing the medicalisation of normality.
Senior Researcher, Social Brain
Salary: £29,000 pa
Contract Type: Permanent; applications for both full-time and part-time hours are being considered for this post
Location: London, WC2N 6EZ
The Social Brain project is a core part of our research identity at the RSA, underpinning our view of human capability, and informing our approach to behaviour change.
An opportunity has arisen for a creative and skilled researcher to join our team. Working closely with the Associate Director of Social Brain in this newly created role, you will undertake and manage research, analysis and reporting on major strands of Social Brain work. You will also assist with fundraising and be responsible for horizon scanning and maintaining an engaging online presence for the project.
You will have the opportunity to contribute to the future scope of this innovative project by assisting with its development into a wider programme of work.
The ideal candidate will have an active interest in brains and behaviour, an analytical mind, and experience of successful fundraising. You will have a confident approach to your work and strong interpersonal skills, enabling you to communicate and engage effectively with a range of different people and audiences.
For over 250 years the RSA has been a cradle of enlightenment thinking and a force for social progress. Our approach is multi-disciplinary, politically independent and combines cutting edge research and policy development with practical action. This work is supported by our 27,000 Fellows around the world.
To apply for this role please submit the following to firstname.lastname@example.org:
- Your CV
- Covering letter explaining how you fit the requirements of this role and the RSA’s broader mission
- Your preference regarding working full or part time. If part-time, please state how many days or hours you would ideally like to work
Nudging, as Jonathan Rowson points out in a recent post on this blog, is already the flavour of the month and looks like being at the top of the menu for the rest of 2011. The government has recently announced that in the coming year we will be ‘nudged’ towards paying our taxes, quitting smoking, insulating our houses and signing up to be an organ donor. The media is lavishing attention on the idea. And the term is gaining such traction that it’s being misapplied to behaviour change measures which are rather more ‘shove’ than ‘nudge’, such as the decision to increase tax on high-strength beer and reduce it on low-alcohol brews.
At the moment, all this publicity and attention seems a bit ironic, given that nudges are meant to be minor interventions which operate unnoticed in the background. It’s perhaps unsurprising, given this is a new idea – in UK policy terms, at least. But for a number of reasons, it risks causing problems in the long run.
First, there’s the point I’ve just made: if nudges are meant to go unnoticed, will they work if we are looking out for them? One of the arguments made in favour of nudges is that they are the antithesis of overt approaches to behaviour change, like didactic communication, education and regulation. Apparently, in the past we have ignored, misinterpreted or reacted against these measures. We seem to have an innate antipathy to being told what to do, but because we are not very good at making behavioural choices that serve our own best interests, we have been making poor decisions in contexts ranging from healthy eating to financial planning.
Nudges are designed to circumvent this active rejection of good advice, and overcome our inability to choose well, by changing the environments in which we make subconscious decisions and thereby influencing our actions. Essentially, they work by making us passive reactors to suggestion rather than active decision makers responding to stimulus.
If nudges are to succeed, then, it’s surely better that we don’t recognise them for what they are and what they are trying to do. Otherwise we might be tempted to ignore or react against them, just as we have with direct communication. HMRC’s plan to nudge people into paying their tax by rewording its tax letters might be more effective if we respond to the suggestive wording without thinking about it than if we are looking out for it when we open the letter. So perhaps they should just go ahead and do it without telling us all about it.
Second, the current focus on nudges attracts the vocal attention of cynics and sceptics, many of whom are arguing that there is something underhand about nudging, that it is just another form of the ‘nanny state’, and/or that it involves ‘playing with people’s brains’. (There’s a wonderful example here, which includes a total misunderstanding of the RSA’s Social Brain project.) It seems to me that much of this criticism stems from a lack of understanding of the idea of ‘choice architecture’ which should underpin nudges – a sensible theory that is not exactly Big Brother and the Thought Police. Still, the negative commentary makes for a good story, and it can’t help.
Third, all this attention risks giving the impression that nudges are the government’s sole response to the problems facing society today. There’s certainly a place for them, but there’s no way they can address deep-seated issues such as obesity, social isolation and binge drinking on their own. They’re more paracetamol than radiotherapy – they might have an impact on the surface and around the edges, but they won’t address the causes of more serious and long-term problems.
I can see why nudges are attractive at the moment – they’re cheap and light-touch, which is just what the government wants. But while they’re useful, they’re clearly not a panacea, and giving the impression that they are risks undermining support for them.
Nudging seems to me to be a good idea, and certainly worth a try. So perhaps the government should stay quiet about what it is planning, and just get on with nudging. If it works, they can tell us all about it afterwards.
Oh, and if I come across another blog post titled ‘Nudge, nudge, wink, wink’ I think I’ll scream!
Nudge is fast becoming to behaviour change what Google is to search engines. We have started to use the term as a catch-all shorthand for a patchy understanding of something like ‘all those fancy psycho-social tricks that alter how people think and act’. In doing so we are in danger of squeezing out all the other approaches we have to changing behaviour that might be more powerful or appropriate, including Think, Steer, and Mindfulness.
Tim Harford is one of many who argue that ‘nudge’ is being overused, and suggests that we should be wary of applying a concept designed for markets to the way government changes the behaviour of citizens.
‘Nudge’ was the title of the book by Thaler and Sunstein that created the excitement around behaviour change, but as Richard Thaler indicated while speaking at the RSA, Nudge is really just a catchy term for the much more complex notion of ‘Libertarian Paternalism’ that is supposed to underpin nudge interventions. The idea is that you don’t undermine people’s freedom by choosing for them, but merely encourage them to make certain choices by altering the environment in certain ways, changing feedback mechanisms and shifting defaults.
An intelligent use of choice architecture makes good sense to me, but the paternalism is not unproblematic. For instance, speaking at the RSA, Anna Coote referred to “the whole ghastly nudge business which is actually about encouraging conformity”.
Moreover, it is not really transformative, which is perhaps, darkly, why people like it. Nudge changes the environment in such a way that people change their behaviour, but it doesn’t change people at any deeper level in terms of attitudes, values or motivations. And, as Clive Gross argued on RSA Comment, we risk oversimplifying why we change our behaviour.
In any case, you can hardly open a page these days without being ‘nudged’, and Downing Street’s ‘Behavioural Insights Team’ is known as ‘the nudge unit’.
I was prompted to share these thoughts after reading quite a detailed piece on the work of this team in The Independent. However, what I really wanted to say is that, if I am absolutely honest, I think one of the main reasons ‘nudge’ has become so popular is that it sounds like ‘fudge’.
I am not being entirely facetious. For instance, social psychology has taught us that we prefer people with names similar to our own, and although I can’t prove that we are thinking of fudge when we say nudge, my gut feeling is that this tacit association needs to be part of the story…
Over the last decade, I have read a lot of non-fiction books, most of them broadly related to human development, from the technical end of popular science to the facile end of self-help. Highlights have been Hare Brain, Tortoise Mind by Claxton, The Happiness Hypothesis by Haidt, Ethical Know-How by Varela, Connected by Christakis and Fowler, Into the Silent Land by Broks, and Immunity to Change by Kegan and Lahey. All of these books (alas, mostly written by middle-aged white men) marshalled evidence to elegantly describe and develop a core thesis about human nature, and all of them answered the ‘so what?’ question about practical implications very powerfully.
However, with all due respect to these immensely impressive contributions, none of them compares with The Master and His Emissary, the book I was blessed to read in preparation for chairing last week’s RSA Keynote Event.
The best books are usually those that could only have been written by a particular person. In this case, Iain McGilchrist has a distinguished pedigree in both arts and sciences, having been an All Souls prize fellow in literature before training in medicine and becoming an accomplished psychiatrist. He therefore writes with authority in natural science and humanities, and the abundant links that lie between them for those few who know how to look. In addition to this polymathic erudition, one can also sense, between the lines, an old soul with a dry wit who is immensely generous in spirit.
The book is about the profound significance of the fact that the left and right hemispheres of our brains have radically different ‘world views’. The hidden story of Western culture, told here, is about how the abstract, instrumental, articulate and assured left hemisphere has gradually usurped the more contextual, humane, systemic, holistic but relatively tentative and inarticulate right hemisphere. The thesis is as strong on science as it is on narrative, replete with nuances, caveats and references.
If you have ever had the feeling that the world is deeply screwed up in a way that you can’t quite articulate, this book will help you to make your case. If you want some insight into why we might be stupid enough to destroy our own planet, or why the slashing of funds for arts and humanities is even more tragic than you might think, read the book.
I might come to regret being quite so effusive, and there are certainly challenges to the core thesis and its implications that need to be entertained, hopefully in future blogs. Nonetheless, I would currently say it is one of the most important books of the 21st century. It is a grand theory for our times. If properly understood and acted upon, it has the potential to transform our view of our selves and our cultures, and prevent us from making a huge number of mistakes that might otherwise seem like sensible decisions.
For those who can’t wait to hear more, go to the end to watch or listen, but for those who prefer to read, my understanding of the argument goes as follows:
1) The left and right brain hemispheres are both involved in almost everything we do, such that crude dichotomies like the left being the logical side and the right being the creative side are a great disservice to public understanding of the brain.
2) However, if we cease to ask what the hemispheres do (language, reasoning, creativity, forecasting) and instead ask how they do it (contextualise or decontextualise, focus on lived experience or abstract models, instrumental or affective feedback, receptivity to counter-evidence, preference for the old or the new), we find very significant differences between the two hemispheres. The evidence for these differences is meticulously unpacked in the book in a compelling inductive argument: there is no killer fact, but a gradual unfolding of evidence, carefully tied together with an eye for counter-evidence.
3) The hemispheres are divided for good reason, because they perform different functions. The left is broadly about focussing, and the right is broadly about contextualising. These are compatible but occasionally competing aspects of our cognition and they are both essential. McGilchrist uses the example of a bird that can only focus on finding grain with its beak if it ignores surrounding context, but still needs some background awareness of surrounding context, and a capacity to respond to it, to avoid attacks by predators. The genius of the brain is its ability to switch between these modes in response to the environment.
4) The right hemisphere should be the dominant hemisphere, ‘The Master’, because it shapes the context, meaning and purpose of our experience of the world. The left hemisphere, ‘The Emissary’, should help us to achieve within this contextual, meaningful, purposeful perspective. The right hemisphere keeps us in touch with lived experience, keeping us deeply aware and responsive, while the left hemisphere is more like a very powerful computer that makes use of familiar schemas to achieve familiar ends. Cognition at its best differs slightly from army marching orders in that it should go ‘right-left-right’, i.e. context-focus-context, when in fact it often goes ‘left-left-left’: focus, focus, focus, with insufficient attention to the basis for the focussing, what is at stake, what might be different, and what is trying to be achieved.
5) There is insufficient evolutionary time for these changes to take place at a structural level of the brain. It is not that the left hemisphere is getting bigger or denser or better connected than the right. The point is that slowly but surely the left hemisphere shapes our culture in such a way that it makes its own perspective the dominant one, until we reach what McGilchrist calls ‘a hall of mirrors’, in which the explicit, instrumental, defined, abstract voice is the only one we believe in, and the implicit, intrinsic, fluid, visceral perspective sounds diminished and foreign. This perspective speaks to, inter alia, the Art, Drama and Music therapists currently struggling to make the case for their immense social value against cruel and blinkered market logics that want to measure their impact in numerical terms.
6) The mechanism for increased left-hemisphere dominance is imitation, a subject close to our hearts at the RSA. Crudely, the cultural ‘stuff’ of the left hemisphere is more contagious than the cultural ‘stuff’ of the right hemisphere. Have you heard the expression “What gets measured gets done”? Or “If you can’t say it, you don’t really understand it”? Both are examples of the ‘emissary’ overstepping his mark, but doing so in a compelling way that is hard to fight back against.
7) Through epigenetic cultural evolution, the left hemisphere gradually colonises our experience. The good news is that the left hemisphere tends to be optimistic, giving us a feel-good factor, but the bad news is that it is remarkably unaware of how partial and/or deluded its view of the world can be, and scarily unreceptive to unfamiliar perspectives. In one of the best lines of the book, McGilchrist writes:
“If I am right, that the story of the Western World is one of increasing left-hemisphere domination, we would not expect insight to be the key note. Instead, we would expect a sort of insouciant optimism, the sleepwalker whistling a happy tune as he ambles towards the abyss.”
I think this is a truly wonderful book, and it has certainly given me a new set of tools to think about the direction the world is taking and what we might do about it.
Thanks for reading this far, and before we amble into the abyss together, please listen to the audio of the event, which includes the avalanche of questions expertly fielded by Iain McGilchrist, or watch the video below, with just one question posed by me at the end, namely: if this colossal idea is true, which I now believe it to be, how do we guard it against widespread simplification or distortion?
Since our event earlier this month, I’ve been banging on about mindfulness to anybody who will listen. Nobody has yet asked me for a definition, but I can see in their eyes that they want to.
So what is mindfulness? My quick answer is that it’s the surprisingly difficult and strangely liberating experience of paying attention to what is going on inside us.
And yet, that’s a bit of a trap, as any definition would be. There is a place for verbal definitions, because unless you pin an idea down there is a meaning vacuum, and it is difficult to say anything useful without fear of talking at cross purposes. However, verbal definitions are not entirely benign. If I ask you ‘What is water?’, you could give me its chemical composition, H2O, but I won’t really know what it is until I have taken a drink. Some things have to be experienced to be understood, and mindfulness is one of them.
A verbal description prior to a taste of the experience may not be harmful, but it is probably not helpful either. Your best definition is to find a good teacher and start practising, but if that seems daunting, just try to sit still, perhaps with eyes closed, and continue to breathe, but now with an ongoing awareness of your breath. You will quickly discover, if you didn’t know already, that our everyday minds are rather chaotic, and that we struggle to hold our attention on any single thing, even something as simple as our breath, the basis of our existence, for more than a few seconds.
I will come back later to why such an experience should be important, or useful, but for now I want to stick with the definition. If the injunction to ‘just do it’ sounds evasive, consider the following definition of ‘definition’, which suggests why the experience is necessary:
Definition: The vivisection tray upon which a word is splayed; while the gist may be clearly labelled with coloured pins, resuscitation becomes problematic. (Ambrose Bierce, The Devil’s Dictionary)
The Devil’s Dictionary is deliberately subversive, but the point is powerful. Definitions can bring ideas to life, but if the conditions of that life are strictly demarcated, one’s idea of what the word means is needlessly limited.
Yes, yes, but what is it? What is it? Well, it’s partly an attempt to free ourselves from the narcissistic verbal chattering that goes on in our heads for about 16 hours a day. So in a way, words and thoughts are part of the challenge.
Above all, I would say mindfulness is honest. It’s about facing up to our own minds, in all their complexity, and everything that follows from that honesty. The acclaimed author Tim Parks describes one aspect of this experience beautifully towards the end of his disarming book, Teach Us to Sit Still.
“But as words and thought are eased out of the mind, so the self weakens. There is no narrative to feed it…. ‘Self’, it turns out, is an idea we invented, a story we tell ourselves. It needs language to survive. The words create meaning, the meaning purpose, the purpose narrative. But here, for a little while, there is no story, no rhetoric, no deceit. Here is silence and acceptance; the pleasure of a space that need not be imbued with meaning. Intensely aware, of the flesh, the breath, the blood, consciousness allows the ‘I’ to slip away.”
This idea may sound terrifying to some, and suggests quite an advanced level of practice, but while the ‘I’ may ultimately slip away, our lives, and our personal roles and identities, very much go on, as Tim Parks suggests with reference to his wife, his daughter and his dog:
“So if I can recount the first minutes, I can’t tell the rest. There are deepenings. There is a liquefaction of some kind, the things flowing into the calves, the head into the breast. And there are resistances: stones, obstructions, pains. The mind goes back and back to them. An ankle. A shoulder. Maybe they will shift, and maybe not. I am absolutely awake. I hear Rita pad downstairs with the dog behind her. I hear a scooter straining up the hill. And I am not there. I am in the stream.”
It still sounds intense, but his responsibilities are unaffected:
“Then the alarm sounds and I must move. I’m up, dressed and getting Lucy into the car in just a few minutes. By ten past seven we are speeding down the hill, trying to beat the traffic light at San Felice.”
(Tim Parks, Teach us to Sit Still, p331)
According to a story in today’s Guardian, a serious migraine can leave you with a permanent French accent.
I have to confess that I found this idea very funny. Then it occurred to me that for the person concerned it was probably fairly devastating. So much of our identity is tied up in what we say, and how we sound. Such a condition could make you unrecognisable to yourself and to others, relationships might change, depression could set in, and your life could spiral out of control. It is serious.
And yet, does it cease to be funny? As George Bernard Shaw once put it: “Life doesn’t cease to be funny when people die any more than it ceases to be serious when people laugh.” And while foreign accent syndrome is serious, it is surely not as serious as death.
In this respect, like many others, I experience some degree of polyphasia. I recognise that the condition is serious, and yet I still find it funny. Is that OK? Or am I being unforgivably insensitive to the woman concerned?
The Great Room was packed last night for Mindfulness: The Key to a Healthier Society. The speakers were Ed Halliwell and Dr Jonty Heaversedge, authors of The Mindful Manifesto, and Tim Parks, an author once shortlisted for the Booker Prize, who recently wrote ‘Teach us to Sit Still’, a darkly humorous and profound examination of an embarrassing medical condition, and an improbable journey back to health through meditation.
I enjoyed chairing the event, and although I think we had a sympathetic ‘home crowd’, I hope we managed to critically engage with the theory and practice of mindfulness as well as possible within the time and format constraints. I was particularly pleased that we managed to do a two minute guided meditation at the end of Ed’s speech, because the main message of the evening was ‘just do it’, and it’s important to walk the talk.
For those who have never heard of mindfulness, I would encourage you not to settle for the first verbal definition that comes your way, because the heart of the approach is to loosen the hold of concepts and make our perception as concept-free as we can. If that seems too evasive, mindfulness is broadly concerned with paying attention, in particular paying attention in a gentle but systematic way to things we normally take for granted, including our bodies, our minds and our breathing.
I will come back to highlights of the event when we can link to the recording, but for now I offer a couple of choice quotes from the books:
“The faster we go, the more we tend to react impulsively, following our unconscious, habitual patterns. It is a nasty vicious circle. And in order to release ourselves from it, we need help. We need a powerful antidote to speed. We need a method.” (The Mindful Manifesto, p9)
“The more we threaten thought and language with silence, or simply seek to demote them in our lives from the ludicrous pedestal on which our culture and background have placed them, then the more fertile, in their need to justify and assert themselves, they become. Reflection is never more exciting than when reflecting on the damage reflection does, language never more seductive than when acknowledging its own unreality.” (Teach us to Sit Still, p244)
I want to write a few different posts on Mindfulness, to clarify in my own mind how we might best make it a part of our Social Brain project, including thoughts on the discomfort around the word ‘spirituality’, the role of science in making the case for mindfulness, and the relationship between mindfulness and other forms of meditation. For now I want to simply state what I think is the heart of the matter.
You can talk about ‘behaviour change‘ until you are blue in the face, without changing your own behaviour in any meaningful way. You can also craft elegant words about 21st century enlightenment, and lay out a roadmap to a better future by describing the kinds of people we need to become. For instance, you can argue, as Matthew Taylor does, that we need to become more empathetic, more autonomous and more other-regarding. But the burning question is how? Intelligent discourse is important, as far as it goes, but it remains at the level of theory, aspiration and proto-policy. In this sense it is as much a part of the problem as it is part of the solution. What we need are methods, ongoing practices that are tested and inclusive, and which will allow us to change ourselves for the better in deep and enduring ways.
No less than Albert Einstein said: “We cannot solve a problem from the same level of consciousness that created it. We must learn to see the world anew.” Alas, this rarely happens, because as no less than Tolstoy said: “Everybody thinks about changing the world, but nobody thinks about changing themselves.”
The RSA believes in collaboration, a message embodied in our fellowship. We rarely make social progress without people working together. In this respect we like to ‘do stuff’, and the point of mindfulness is that we ‘do stuff’ a lot more effectively when we invest some time in just being. We find it incredibly, surprisingly, revealingly hard to ‘just be’. Hence the apposite title of Tim Parks’s wonderful book: ‘Teach us to Sit Still’.
If we are serious about transformative social change, we need to at least be open to the idea that transformation begins at the level of consciousness. Perhaps our first step should simply be to understand ourselves at a more fundamental level, not just from the perspective of science and reason, but experientially and viscerally. We problematize institutions, people, social systems and structures, policies and places, but we rarely problematize our own wayward minds, and typically take them for granted. We see, think and act through our minds, but rarely look at them directly with deep curiosity and discernment.
The injunction to ‘know thyself’ needs to be taken seriously, even though it is not ‘cool’, or mainstream. We typically resist this kind of self-knowledge because we think it is too difficult, or we are too busy being busy. Even if we periodically glimpse what Tim Parks calls ‘the clamour’ inside our minds, something that meditation shows you, we are usually too scared to look more closely, and keep the disquieting insight at bay through denial and distraction.
I am not advocating casual introspection, navel gazing or self-indulgence. Progress often does require ‘doers’ who communicate and collaborate well, and people need to get together to discover and serve their common good. This much all remains true, but first such ‘doers’ would benefit from sitting still, sometimes alone, and watching their minds at work. If we don’t do this sort of work on ourselves, we remain strangers to our true influences, motivations and potential. Without some form of practice or method to know our own minds, we carry on acting habitually, reactively and busily. We may think we are helping the world through our acts, but often we are merely acting out conditioned behaviour, driven by vexed desires, restlessness, and various forms of denial, not least the denial of our own mortality.
I don’t think mindfulness is a panacea. Indeed, Ed Halliwell indicates that to think this is just to create another unhelpful attachment. What is clear to me, however, is that unless we learn to look more deeply and honestly at ourselves, in ways that are not always easy, comfortable or socially sanctioned, we will not fulfil our potential, either individually or collectively.
There is a lovely line in the film Good Will Hunting where the character played by Robin Williams ends a college class on psychotherapy by saying: “See you Monday. We’ll be talking about Freud, and why he did enough cocaine to kill a small horse.”
Freud is widely quoted, often derided, and rarely appreciated. If you want to understand his work, don’t surf the net or buy a secondary text. Try reading him in the original, and you’ll realise why, small horses aside, he was such a heavyweight.
One of his major ideas concerns the structure of the psyche. I remember Robert Rowland Smith gave a particularly lucid overview of the idea when he spoke at the RSA, but my dramatised version goes as follows (analytical psychologists look away now).
The psyche comprises das Es (Id), das Ich (Ego) and das Über-Ich (Super-ego). Crudely, the Id is your libidinous, desire-ridden, status-conscious self. The Super-ego is your conscience, your tempered reason, your empathetic other-regarding self. And the Ego is your everyday sense of self, the part of you that you refer to when you say “I”.
One way of looking at the psyche is that the Ego tries to navigate through life while facing upward pressure from the Id and downward pressure from the Super-ego. In RSA terms, we might say the Id is anti-social (disregarding the feelings of others), the Super-ego is pro-social (actively supporting and shaping positive social norms) and the Ego is typically a-social (permissively accepting pervasive social norms but doing nothing to pro-actively create them).
So far, so contentious, but let’s try to diagnose the world’s problems with the following argument. The reason anti-social and a-social behaviour are so pervasive is that the Id has a natural ally in the advert. The super-ego, meanwhile, has lost its institutional moorings and needs an alternative to the church. The ego has become imbalanced because while the Id is bombarded by the Ad, the super-ego lacks an equivalent medium through which to communicate.
I offer a hat-tip to Michael Foley for linking the Ad to the Id, and commend his extremely funny and insightful book, The Age of Absurdity: Why Modern Life Makes It Hard to Be Happy. The following extract from page 19 gives some idea of the basic claim:
“So the ad woos the id in the traditional way- by impressing, flattering and stimulating…
The AD: Regard the mighty vault soaring to Heaven.
The ID: SHEEZ!
The AD: Now regard the many shiny prizes.
The ID: WANT!
The AD: All of this is for you.
The ID: ME!
The AD: You are indeed uniquely wonderful.
The ID: Lights! Cameras! Put me on Prime-time!
The AD: No need to concern yourself with others, but be an infant till you die.
The ID (scowling): Don’t you mean, be an infant forever?
The AD: I said, be an infant for eternity.
The ID: WHOOP-DE-DOO!
The AD: Never shall your desires diminish or your appetites abate.
The ID: MORE!”
Foley wisely counsels that rather than trying to defeat the ad (not easy, given that the average American, for instance, is subject to 3,000 ads a day) we would be wiser to work on controlling the id, but:
“This is not easy either. The contemporary Id is rampant and in no mood to be tamed. Never have so many wanted so much so badly. Never has the Id been so flattered and indulged. This is the golden age of the id.” (p21)
1) As Matthew Taylor recently reiterated, a large part of the RSA’s new mission is to foster pro-social behaviour. We cannot be a substitute church. Instead, we must be the kind of institution that actively promotes collaborative pro-social behaviour that might act as a counterweight to the relatively selfish desires of the Id.
2) As spiritual traditions have recognised for thousands of years, what is required to tame the Id is not ideas, projects or policy, but sustained personal/spiritual practice. We need to be still, and look closely at our own natures from the inside. Hence the need for techniques like Mindfulness, an issue we will be discussing at the RSA on Thursday.