If you are late, you have to pay a price. Normally it’s the social price of mild shame, but what happens when you are asked to pay an economic price instead?
The front page of yesterday’s Metro announced a £60 fine imposed on parents if their children are late for school. At first blush this might seem like an obvious solution to a simple problem: to deter an unwanted behaviour, make it less attractive by imposing a monetary fine on it. But research from behavioural science shows that this model of change does not always pan out in real life.
The question is whether and how this £60 fine will affect parents’ actions; to this end research by Uri Gneezy and Aldo Rustichini detailed in their paper “A Fine is a Price” offers a potential answer. The researchers tested the effect of imposing a fine on parents for late collection of their children from a child care centre, and found, perhaps surprisingly, that late pick-ups increased under the fine.
Whereas prior to the implementation of the fine policy parents would typically feel guilty about coming late, the monetary penalty served as a way to “pay” for their tardiness, thus absolving them of their guilt. It seems that for many people simply paying a fee is preferable to the emotional penalty of feeling ashamed or guilty. The take-home message from Gneezy and Rustichini’s research is that introducing a monetary penalty can change a context from a social transaction to a market transaction, and once this change occurs, it is very hard to revert to the original relationship guided by social norms.
According to the Metro article, at least someone is aware of this risk. “Chris McGovern, of the Campaign for Real Education, warned: ‘They could undermine relations between school and parents – the cornerstone of any school’s success.’”
It is possible, if not likely, that the £60 fine policy being imposed by three schools in Milton Keynes would fall prey to the same surprising results as the child care centre experiment, where the intrinsic motivation to be on time is crowded out by extrinsic drivers. But the £60 policy may have other surprising consequences too, due to the conditions of the fine. According to the article by Le Marie, the fine is imposed on parents for each child who is late 10 or more times in a 12-week term, payable within 21 days. If the fine goes unpaid it doubles to £120 payable within 28 days.
Firstly, the policy changes the norm. Since a child needs to be late 10 times to get the fine, those parents or carers who are frequently late – say 6 or 7 times in a 12-week term – might change their point of reference. Perhaps they will no longer compare themselves to the ideal (always on time) but instead to the most salient marker (which now is being late 10 times), so rather than feeling relatively bad about their tardiness they may start to feel “better than average” or at least “better than the worst”.
Secondly, one could question the efficacy of such steep non-payment (or late-payment) penalties. A 100% penalty would be considered heavy even compared to the oft-vilified payday loans (which on average charge a £12-£25 late fee on a £100 loan). Behavioural economist Sendhil Mullainathan and psychologist Eldar Shafir explain in their book Scarcity that humans’ cognitive resources are limited, and when we are struggling to deal with not having enough of something, such as time or money, our decision-making ability is impaired.
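To put the scale of that penalty in context, a back-of-the-envelope comparison, using only the figures quoted in this post, might look something like this:

```python
# Illustrative comparison of late-payment penalties, using only
# the figures quoted above (not authoritative market data).

school_fine = 60            # initial fine (GBP)
school_fine_doubled = 120   # fine if unpaid within 21 days (GBP)

# Penalty rate for paying the school fine late: (120 - 60) / 60 = 100%
school_penalty_rate = (school_fine_doubled - school_fine) / school_fine

payday_loan = 100                    # typical loan principal (GBP)
payday_late_fees = (12, 25)          # quoted average late-fee range (GBP)
payday_penalty_rates = tuple(fee / payday_loan for fee in payday_late_fees)

print(f"School fine late-payment penalty: {school_penalty_rate:.0%}")
print(f"Payday loan late fees: {payday_penalty_rates[0]:.0%}-{payday_penalty_rates[1]:.0%}")
```

On these numbers, paying the school fine late costs proportionally four to eight times more than the late fees on the loans usually held up as the archetype of punitive lending.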
In effect, we become so busy trying to juggle certain pressures that we don’t have the mental energy left to deal with other challenges, such as, for example, bringing our child to school on time. One way to mitigate this sub-optimal decision making in the peripheral domains is to relieve some of the pressure on the major problem (e.g. lack of time or money) – exactly the opposite of slapping an expensive fine onto a parent, potentially further exacerbating the underlying issue.
This point was echoed by Netmums founder Siobhan Freegard’s statement that “Children who are frequently late to school are often from chaotic family backgrounds. Taking money away from struggling parents could just make a bad situation worse.”
Last November an article in The Guardian noted that parents wishing to take their children out of school for a holiday during term time would face a similar fine. The initial results of this policy show that applications to take children out of school have in fact increased since its introduction.
It remains to be seen whether Milton Keynes’s borough-wide £60 fine policy will change rates of tardiness. But both the behavioural science research and the failure of the term-time holiday penalty suggest that the policy will not work. These effects, surprising to many, illustrate why it is so important to question our underlying assumptions about human nature, and to trial interventions on a small scale before rolling out a large-scale policy change.
Article on term-time holiday penalty h/t Chris Gaskell.
Most of us recognise that climate change is both serious and caused by human activity, but few of us are managing to turn that recognition into behaviour change to reduce our impact. While this is a multi-dimensional issue, I suggest two crucial factors are that:
- We know that many of the things we do are ‘bad’, but can’t see any way of making constructive changes that don’t require a drastic and unrealistic transformation in how we live
- We know that most changes won’t make a difference unless other people do the same, so acting seems like a pointless sacrifice
I think that a bit more information might change our minds on both these points, and make it a bit easier to motivate positive behavioural changes.
Revolution vs Evolution
To take just one area where these factors apply, consider what we eat. It is widely acknowledged that eating meat is probably not the most ethical thing one can do. In addition to animal rights concerns, the production of meat is a major contributor to climate change through greenhouse gas emissions and land degradation. To summarize:
- Livestock accounts for almost 1/6 of anthropogenic greenhouse gas emissions
- An area of the world’s rainforests 2/3 the size of the UK is destroyed each year to create grazing land
- Over 2/3 of global agricultural land is used to grow crops for animals in feed lots while a billion people go hungry
With population growth and the rise of new meat-eating middle-classes in developing countries, all of these problems are set to multiply. It is therefore clear that current Western levels of meat consumption are completely unsustainable.
The implication is that we should all be vegetarians. But personally, though I had long accepted the moral argument, I simply couldn’t envisage changing my behaviour so drastically. The end result was that I didn’t change at all.
But that was until I made an interesting discovery. To paraphrase Orwell, while all animals are equal, it seems some are more equal than others. Red meat (lamb and beef) is by far the biggest offender, requiring many times more land, feed and fossil energy to produce. This is partly because these animals are such inefficient converters of feed into meat. Cows require about seven kilograms of feed to produce one kilogram of meat, compared to around three kilograms for pork and less than two kilograms for chicken. NPR made this useful infographic to illustrate just how resource and emissions-intensive beef is:
What It Takes To Make A Quarter-Pound Hamburger
The above doesn’t even include the copious quantities of methane these animals produce – a gas which has 23 times the impact of carbon dioxide. Factoring that in, it becomes even clearer that acting on climate change doesn’t necessarily require a radical change like vegetarianism; just cutting out red meat can make a huge difference.
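As a rough, purely illustrative sketch of the figures quoted above (the feed-conversion ratios and the 23x methane multiplier come from this post, not from an authoritative dataset):

```python
# Illustrative arithmetic using the figures cited in this post.
# kg of feed required to produce 1 kg of meat:
feed_per_kg_meat = {"beef": 7, "pork": 3, "chicken": 2}

# Feed intensity relative to chicken, the least intensive option.
baseline = feed_per_kg_meat["chicken"]
relative_intensity = {meat: kg / baseline for meat, kg in feed_per_kg_meat.items()}
# On feed alone, beef comes out at 3.5x chicken.

# Methane is cited as having 23 times the warming impact of CO2,
# so each tonne of methane counts as 23 tonnes of CO2-equivalent.
METHANE_GWP = 23

def co2_equivalent(tonnes_methane: float) -> float:
    """Convert tonnes of methane into tonnes of CO2-equivalent."""
    return tonnes_methane * METHANE_GWP

print(relative_intensity)
print(co2_equivalent(1.0))  # -> 23.0
```

Even before methane is counted, the feed figures alone put beef several times ahead of pork or chicken; the methane multiplier only widens the gap.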
Or can it? The second part of the dilemma described at the outset concerned the link between this kind of individual action and the kind of collective action that will be required to avert dangerous climate change.
Individual vs Collective Action
Game theory describes a classic example of the collective action problem in the form of the Prisoner’s Dilemma. In this thought experiment, two prisoners held in separate cells each inform on the other in the hope of a reduced sentence; the end result is that both receive heavier sentences than if both had stayed silent. Its logic applies, to a certain extent, to acting on climate change. No-one wants to be in the situation where they act but others do not, making them both absolutely and relatively worse off, and rendering their sacrifice meaningless.
But that is where the analogy ends. We do not live in solitary confinement. On the contrary, our decisions are influenced more than anything by social values, social norms and social judgements. A big part of our decision not to change our diet is the fact that hardly anyone else is doing it. But if the social landscape can cause negative outcomes, then it can also engender positive ones. If most people were making personal sacrifices for the sake of others, it would be much easier to make (and much more difficult to resist making) those same sacrifices ourselves.
So how do we get from this social landscape to that one? Work by Nobel Prize winner Elinor Ostrom on the tragedy of the commons has highlighted the need for institutions, rules and incentives for behaviour. But whilst these structures will be vital to global action, we must not lose sight of the role of individuals. Your decisions do not just change your own tiny contribution to climate change; they also change the social landscape for those around you.
I am lucky that a significant proportion of my peers are genuinely altruistic, and their leadership made it much easier to motivate my own behaviour change in cutting out red meat. And hopefully my decision adds a tiny bit more momentum to that movement, making it a fraction easier for the next person to prioritise the common good over personal interests.
If you do not have such role models around you, you can become one yourself. If just one other person decides to follow your lead then you’ve doubled your impact. If the two of you share two more close friends, suddenly each of them is confronted with the fact that two-thirds of their friends are making personal sacrifices, massively altering that social landscape and turning it from an inhibiting into an enabling force for change.
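The arithmetic behind that “two-thirds of their friends” claim is simple enough to sketch (a toy illustration of the scenario above, not a model of real social networks):

```python
# Toy illustration of the ripple effect described above.
# You act; one friend follows your lead; the two of you share
# two more close friends who have not (yet) changed anything.
acting = {"you", "first_follower"}
friend_circle = {"you", "first_follower", "friend_a", "friend_b"}

# From friend_a's point of view: what fraction of their three
# friends are making personal sacrifices?
friends_of_a = friend_circle - {"friend_a"}
fraction_acting = len(acting & friends_of_a) / len(friends_of_a)
print(fraction_acting)  # two of three friends, i.e. two-thirds
```

One follower doubles your direct impact; but for the bystanders in the circle, the visible social norm has flipped from a lone eccentric to a clear majority.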
This ripple-effect of individual action can (and will need to) play a major role in overcoming collective action problems like climate change. So if you are put off acting because you don’t want to change your entire life, or are discouraged by the collective action problem, it may be time to reconsider. There is probably something much more manageable you can do, and it might have a bigger impact than you think.
Saving money can be hard to do, especially given the current economic climate and falling real wages. It can be difficult emotionally, too, with a recent report published by the Money Advice Service finding that many people prefer to spend their money “more on the here and now than on planning for the future.” But with the right help, maybe saving can even be fun.
Picture the scene: you are browsing online, just about to purchase a t-shirt. You don’t really need another t-shirt – after all, it looks exactly like all the others in your closet – but it is 30% off, so for £10 why not? But just before you check out, a message pops up asking whether you’d like to add £10 – instead of (or in addition to) making the purchase – to a savings account named “new computer”, “honeymoon fund”, or, perhaps more important but somewhat less motivating, “unforeseen emergencies”. Or, as you are waiting for the barista to hand you your coffee, you eye up the croissants on the café counter. One looks tasty, so you consider adding it to your order, but instead you use your phone to transfer the £2 you would have spent on the croissant into your savings pot designated for a gourmet foodie weekend in Paris.
This is what ImpulseSave, a small Boston-based organisation, is helping savers to do. Its motto, “go on a saving spree!”, reflects its basic function of replacing spending with saving. According to this article, ImpulseSave allows you to transfer money into a savings account via text or app, and provides prompts to save while you are shopping online. Similar to some other savings tools, your savings account is named for a specific goal, so you always have in mind what your savings are building towards. SmartyPig, another savings tool, also uses named accounts to keep the goal salient, but rather than making impulse saves, you set up automatic transfers from an existing account. Where it differs from more conventional bank accounts is that you can share your progress online via various social media, and friends or family can actually contribute to your savings pot to help you towards your goal.
The Social Brain Centre has argued elsewhere that saving money can be hugely beneficial to people; having a financial buffer can influence upward social mobility, effective decision making, and psychological wellbeing. But despite its benefits, many people find it hard to save.
So what do savings tools like ImpulseSave and SmartyPig offer to help people save more that more traditional tools, such as budget planners, don’t seem to provide? Traditional tools assume that as long as people understand their incomings and outgoings, they will behave in such a way as to stay within their means. But just knowing the budget, while necessary, is not sufficient for many people to actually achieve their savings goals. Instead, we are often side-tracked by impulse purchases (the ImpulseSave website claims that a staggering 15-20% of our take-home pay goes on impulse purchases “that we don’t need or even remember buying”!), short-sightedness, or a lack of social support.
These newer tools, however, use insights about human nature and what drives our behaviour to help us (once we know our budget) stick to our savings goals. For example, given our busy lifestyles and our tendency to conserve mental energy, we are more likely to do something the easier it is to do. These tools make saving easy, either through automatic transfers or via simple digital prompts. Naming the accounts brings our savings goals to the front of our attention, and helps keep us motivated by reminding us what we are working towards, even if that is to be spared the stress and anxiety of an unexpected expense (think a broken boiler or car repairs). And the social aspects of these tools may improve the motivation to save by evoking the desire to remain consistent with your publicly stated commitments, and also perhaps by changing social norms around openly discussing what may still be somewhat of a taboo subject.
This is not to say that financial literacy is not important, but rather that beyond learning how to budget we may need some extra help along the way to achieve our savings goals. Tools like those discussed above seem to be using behavioural insight to reposition saving from being something onerous to being something fun. So go ahead and try going on a saving spree, and comment below; we’d love to hear how it goes.
A trained psychologist myself, I took great interest in today’s call by the British Psychological Society for a departure from the biomedical model of mental illness. And, to my delight, so did other colleagues – read a great blog post from Social Brain’s Emma Lindley here, where she writes that we might right now be witnessing a bona fide revolution that may change mental health services so radically ‘they will be unrecognisable to the children of my generation.’ As Emma points out, the debate is as much driven by differing concepts of human nature as it is by politics and the struggle for professional relevance and power. It is the latter aspect that I want to focus on in this blog post.
The RSA has long taken an interest in professions and their future (including this project in the early 2000s), and is currently managing an independent review of the Police Federation. Further international projects with other professions may follow soon.
Interestingly, even though Psychiatry is the younger term, it is arguably the older science, and literally means ‘the medical treatment of the soul’, whereas Psychology means ‘study of the soul’. Psychology and, specifically, its subdomain Clinical Psychology have always had a hard time standing up to their medical cousin. Part of the reason can be found in the etymology: isn’t medical treatment just so much more tangible than mere study? Thus, in more than one hospital in the world (including one I interned in a long, long time ago), psychologists have not been much more than overeducated sidekicks to doctors. This may change soon.
The main reason for this is that over the last decade, and particularly since 2008, Psychology has arrived in the scientific establishment. It did so by using a strategy applied by underdogs since the advent of mankind: collaboration. (And, of course, the emergence of discipline rockstars like Steven Pinker has helped.)
Not having enough leverage itself, Psychology entered functional marriages with up-and-coming disciplines like neuroscience and traditional ones like economics, a process that led to the creation of new interdisciplinary fields like behavioural science. A prominent victim of this process was homo economicus – the notion that humans are wholly rational and narrowly self-interested. Homo biomedicus (not an official term, my own inadequate coinage), the similarly reductionist paradigm underlying present-day psychiatry that acknowledges only the physical side of human existence and leaves aside the social and psychological aspects, may very well be next.
There are two reasons to be concerned about the potential revolution of mental health services given that professional battle lines are drawn:
Firstly, while for Psychology there was the possibility of a non-threatening, complementary relationship of mutual interest with economics or neuroscience, with Psychiatry it is different. Here the question is ‘who runs the show?’ – or, if you will, one of professional hegemony. Still, one hopes that the critical voices on both sides steer the process away from the zero-sum game it is in danger of becoming, which would certainly leave everyone worse off.
Secondly, the homo biomedicus model is not entirely wrong, just as the homo economicus model is not completely off the mark. The concept has its merit and adequate areas of application, and it will need to be taken into account when designing future services based on a richer, more complex understanding of man as Homo biopsychosocialis that is embedded in a capabilities-based approach. Throwing out the baby with the bath water would be just as wrong.
Josef Lentsch is Director of RSA International – follow him at @joseflentsch
A lamentable aspect of friendship involves watching movies.
Specifically, when one friend has seen the film and the other hasn’t. For me, it’s a simple truth: not because I’m (that) disagreeable, moody or withdrawn, but because some of my friends succumb to the temptation of announcement. You know who you are.
We’ve all been there:
A living room, two friends, X and Y are midway through a film:
X: Oh, this part- mate, this, this is such a funny bit- watch this:
Event happens in the movie.
X reels with laughter, seeks confirmation from Y that it was, truly, oh so funny.
Y is silent.
X: Wasn’t it funny? I love that part.
In Hitch-22, the late Christopher Hitchens cited Theodor Adorno on this point:
Adorno made a beautiful corkscrew or double-helix-shaped aphorism about the Hays Office, which was then the headquarters of moralistic and ideological invigilation of the movie industry. Under its unsmiling rules, no double beds could be shown, no “race-mixing,” no untoward conduct or risqué speech. Nonetheless, ventured Adorno, an intellectually and aesthetically satisfying film could be made, observing all the limitations prescribed by the Hays Office, on the sole condition that there was no Hays Office.
In other words: by enforcing, endorsing or applying a proposition or plan, one defeats or perverts the intended outcome. Y almost certainly would have found the part in the movie funny if only X hadn’t announced its funniness. Or consider the deflation New Year’s Eve revellers feel after making overzealous declarations of debauchery at the start of the night. A Parisian friend swears the least romantic thing a handholding date can do is swoon at the Eiffel Tower, pout beneath his fringe and say ‘oh, isn’t it romantic?’ Feel free to add your own.
New technologies and social media, however, influence behaviour subtly. They permeate our lives ubiquitously, allowing us to snap, update and share wherever we like. So frequent is our online activity that we’re amateur photographers, reporters or commenters without even realising it. Of course, activity and accomplishment are not the same thing; I’m not suggesting that a photo of dinner or giving the peace sign before a monument will be recognised by the WPO.
Why do we do it? Well, precisely because there is no imperative to do so. People seem to prefer doing things without overt prompting or advertising. I’m certain the Social Brain team here in the RSA can explain or refute this with something more than mere assertion. But consider it: using a social network is not normally the result of a heavy-handed advertising campaign. Indeed, all David Dimbleby has to say during Question Time is ‘if you’re following us on Twitter, the hashtag is-”. If.
Therefore, you don’t need me to tell you that you can interact with the RSA through social media. In fact, given Adorno’s gorgeous thought-problem, you’d probably cringe with shame if another human being told you that it was ‘really great to upload your photos to our Flickr group’.
So I’m not. I’m just going to leave a link here, which you can choose to click or not: RSA Flickr Group.
Whether or not you click on the first group in the list, I promise not to tell you how fun it is, or how much I love it or how it’s my absolute favourite site…..
Gurmeet Singh is a Fellowship Researcher. You can contact him on email@example.com
Everyone has his or her own prejudice-drenched and ideology-affirming reason for why Finland appears to have such fantastic schools. Better trained, higher status teachers; a later school starting age; less time at school; ‘curriculum coherence’; an atmosphere of collaboration; no external testing or inspection; a monocultural population; dark boring winter nights with no choice but to study.
Pasi Sahlberg did a terrific job at an event at the House of Commons last week to tie most of these strands together whilst refuting a few of them. He outlined a historical set of values and a particular set of agendas and priorities that came together in the 1970s, and enabled the government to nurture and trust teacher professionalism and collaboration. His book, Finnish Lessons, is a beautifully written history and exposition of a Finnish miracle.
However, I have one more reason he didn’t mention, although it’s probably more symbolic than causal.
In the 1970s, Finland became the only country in the world to enter both the Eurovision Song Contest and its Communist-bloc rival, the Intervision Song Contest. In 1980, Finnish singer Marion Rung won the Intervision with Hyvästi yö (Where is the Love?).
What might such a promiscuous attitude to song contests tell you about Finland’s approach to education?
First, Finland’s careful but precarious neutrality during the Cold War may have forced the government to adopt a different, more enlightened attitude to its people and public service professionals. Not trusting either market or state control, it had to rely on citizen control, despite being in the same, or even worse, economic mess as the rest of us.
Second, Finland’s deliberately pragmatic political ideology left itself able to borrow valuable educational ideas and practices from anywhere in the world, and adapt them to its own, well-understood context. Sahlberg acknowledged how much of Finland’s success had been inspired by the adaptation of initiatives from England and elsewhere. Picking and mixing policies from abroad carries obvious dangers, but doing this without ideology-tinted glasses gives you a greater chance of finding the right ideas, and adapting them in the right way. This pragmatism remains at the core of Finnish education. The most incredible mini-story Sahlberg told was of a group of hockey players whose school allows them to study more in the dark winter so that they can spend more time training and competing during the summer. The Finnish for ‘personalisation’ is, Google thinks, ‘mukautus’.
Finally, during a time of economic crisis and real global fears about nuclear war, Finland managed to think and act long term – its education strategies worked because it stuck to them.
Finland’s education system is far from perfect, and new challenges are emerging. Although Finland is not resting on its, er, fir branches, the next OECD PISA results might have different winners (and maybe, like the Intervision, someone should create an alternative to PISA and TIMSS). Thousands of education tourists might soon be flocking to Japan, or Poland, or Turkey (if PISA results could be disaggregated by region, they might even come to London). However, as the RSA develops our family of academies based on a set of values that foreground trust, collegiality, innovation and school-to-school support, maybe we are trying to create a piece of England that is forever Finland.
Yesterday, straight from an energising discussion with our Projects team about future RSA approaches to public services issues, I rushed to deal with something more current and tangible. My twelve-year-old daughter has a long-term health condition, which means regular appointments and occasional bouts of hospitalisation. After twelve years navigating a Victorian monolith, we now have the airy complexity of a brand new PFI building. We’ve gone straight from Dickens to Huxley.
My daughter has always been intense and feisty – most people who spend a few hours with her need to come up for air at some point – but in her regular interactions with medical people and places, this is amplified. And adolescence is now adding to the mix. Yesterday, she refused to answer questions that weren’t using the correct medical terms on the piece of paper in front of the physiotherapist. She asked irritating questions, gave cryptic answers, and her body language was moody, sullen and horizontally sprawled – she looked like she was on our sofa watching something excruciatingly boring on TV.
Like any parent would, I often plead for her to be more polite to a group of people that definitely want her to be as well as possible. At the same time, I know that her assertive games are a form of resilience – a way of coping with loss, setbacks and change, and steeling herself for future battles and disappointments. She is an expert patient now, and her attitude in some ways ensures that the system treats her as such.
I remember Maria Balshaw, now Director of Manchester City Galleries, arguing that ‘arsiness’ was a key attribute of creativity, so should possibly be taught in schools. I doubt if this idea will catch on, but we do need to accept the need to develop qualities in our young people that aren’t always pleasant. Whether it’s the liberal perspective on social and emotional learning, or the more traditional approach through character education, both emphasise qualities and attitudes that, in essence, make children easier for us adults to deal with. Just be nice. Even our Opening Minds framework, which includes ‘coping with change’ as a key aspect of the ‘managing situations’ competency, might not be quite ready to develop and assess approaches which elicit and celebrate the nasty.
This links to an emerging idea for a broader RSA project: can we harness new insights into the teenage brain and other research to ask how can schools and society relish rather than fear the teenage years? What kinds of behaviour change do we need to promote, in both teenagers and the adults and institutions which deal with them, to ensure a happy, productive adolescence?
Being reflective means never having to say ‘sorry if these comments come a bit late’. So, after my recommended twenty minutes of reflection, here are some thoughts inspired by Monday’s excellent Reflexive Coppers report. The Social Brain team said reflexive, I’ll say reflective, even though I am sure that the difference matters to someone out there.
The report demonstrated a real appetite from the police to engage in new kinds of thinking and conversations, and also outlined the barriers, both cultural and institutional, that prevent reflectivity. These barriers are far from unique to the police force. Even the teaching profession, which by its nature you might expect to embrace reflection as a key pedagogy for pupil and adult learning, finds this difficult. It’s often a case of ’Teach first, ask questions later, if at all’.
In my previous job with Creative Partnerships, although the excitement came during the classroom projects themselves, most teachers and practitioners recognised that the most important, sustainable learning came through the reflective processes we built into the programme’s design and values – ‘question, connect, imagine, reflect’. This was often tough stuff, but ultimately it was the reflection that changed teachers’ practices when our circus left town.
My own experience of Action Learning as a powerful tool for solution-focused reflection was that it worked best with people who weren’t only outside your own workplace, but from different professions. Common Purpose‘s model is partly built on this cross-professional approach, but their operation can appear too evangelical and assertive to encourage genuine reflection. It is also expensive. Are there cheaper, more self-facilitated ways for professionals across different public services to reflect collaboratively, possibly based on the TeachMeet DIY approach, and possibly on particular themes (for instance, children and young people)?
When my sister was training to be a nurse, during one of her first lectures her class of sixty students was told that “half of you will end up marrying policemen”. She neither became a nurse nor married a policeman; but if her lecturer was right, then reflective, cross-professional pillow talk may already be happening, off-duty, in various rooms of various homes.
Whose responsibility is it to support people with mental health problems in/to employment? Later this month, the OECD will publish what looks like an important piece of work examining the myths and realities about mental health and work. The issue is a spectacular tangle of grey areas, discrepancies and imprecision.
There’s the matter of looking after the mental health needs of those currently in employment. There’s the challenge of supporting people with long term mental health conditions in getting off benefits and into work both sustainably and without exacerbating their illness. There’s a complex relationship between recovery (which in the case of mental illness is rarely a linear process), therapeutic occupation and the pressure of responsibility.
There are underlying problems around the hindering of aspiration and ambition as a result of the onset of mental illness. There’s the confusing business of common mental disorders, such as anxiety and depression, and severe mental disorders, all of which can be chronic, transitory and, very likely, fluctuating in degree of severity.
Dealing with these challenges falls neatly between health and social care provision, the welfare system, and the big bad world of the free market. The very idea of mental illness takes us to extremely uncomfortable places, and the fact that employers don’t really want to know and employees don’t really want to talk about their mental health is only one indication of the deep social stigma that still exists.
So even though we know that the costs of mental ill health are enormous (estimated at as much as 4% of GDP in the EU), it does not seem to be an issue that people outside the mental health sector are engaging with very seriously. Employers and corporations are, so far, not a real partner to the mental health care system.
The government is ploughing money into the Work Programme, which pays specialist providers to get people with long term conditions into work, but without any kind of systematic approach towards employers and the workplace. The fact is that employers do not like the idea of mentally ill staff (only four in ten employers say they would hire someone with a mental disorder) and we cannot simply sidestep or overlook this when trying to encourage people off benefits and into work.
And, although there is overwhelming evidence that employment is an important element of continuous, ongoing recovery from mental illness, the mental health care system takes minimal responsibility for the employment status of its patients. Part of the problem here is that so much of the mental health care system is geared up towards severe mental disorders, and a long-held lack of expectation of recovery. Once a person falls into the system of mental health care, it is far easier to slip into becoming a career mental health patient than to follow any alternative path.
The OECD’s report will argue that policy can and must respond more effectively to these challenges, but to do so will require a co-ordinated approach and a multi-level shift. This will require a level of integration that is hard to see emerging from what the coalition is up to.
For example, we know that prevention and early intervention are crucial, so we need to find ways to join up vocational support with the first-line health care response to mental distress. To make working life compatible with long-term mental disorder, we need to stop trying to shoehorn people with mental health conditions into inflexible and conventional models of working, and to encourage employers to accept variations in people’s productivity and a more diverse and creative view of what it means to get the job done.
Within all of this I’m sure there are real opportunities for social enterprises to play a role in bringing about shifts not only in attitudes to mental health at work, but also in terms of matchmaking people with employers and helping employers to respond proactively to the mental health needs of the workforce. The Social Brain and Enterprise teams at the RSA are currently interested in exploring this area, so if you’re working in this space or have ideas you’d care to share, do get in touch.
Facts are so last century. In the Internet-dominated world, networked facts have pretty much taken over. The old-fashioned view of the fact is that it is an irreducible atom of knowledge. The way information is organised on the Web means that everything is connected and it is only as a result of the links between elements of information that facts come into being.
This is one of the points that David Weinberger puts across in his new book, Too Big to Know, launched yesterday in the US (not out in the UK until 19th January). Weinberger calls these configurations of linked data, in which two ideas are connected by a relationship, ‘triples’. In an interview given to Thomas Rogers for Salon, Weinberger elaborates:
OK, so, if the triple is “Edmonton is in Canada,” ideally each of those should link to some other spot on the Web that explains exactly which Edmonton, because there’s probably more than one, along with which Canada (though there’s probably only one). And “is in” is a very ambiguous statement, so you would point to some vocabulary that defines it for geography. Each of these little facts is designed not only to be linked up by computers, but in itself consists of links. It’s a very different idea than that facts are bricks that lay a firm foundation. The old metaphor for knowledge was architectural and archaeological: foundations, bricks. Now we have clouds.
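Weinberger’s ‘triples’ echo the subject–predicate–object structure used in linked-data formats such as RDF, where each element is itself a link rather than a bare string. As a purely illustrative sketch of his Edmonton example (the URIs below are hypothetical placeholders, not identifiers from the book or any real vocabulary):

```python
# A "networked fact" as a triple: each part is a link (URI) that points to
# something else on the Web, rather than a self-contained atom of knowledge.
# All URIs here are hypothetical examples for illustration only.

triple = (
    "http://example.org/place/Edmonton_Alberta",  # subject: *which* Edmonton
    "http://example.org/vocab/geo#isLocatedIn",   # predicate: what "is in" means
    "http://example.org/place/Canada",            # object: *which* Canada
)

subject, predicate, obj = triple
# The fact only disambiguates itself by following its links.
print(subject.rsplit("/", 1)[-1])  # Edmonton_Alberta
```

The point of the structure is exactly what Weinberger describes: the fact is not a brick but a node, and its meaning lives in what it links to.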
Now, I think I get this, and when we think about the ubiquity of the hyperlink, it’s pretty clear that Weinberger is absolutely right. But, even before the Internet, information was still linked, and it was still necessary to reference one idea in order to construct a basis for another. Aristotle, Darwin and Newton all did it. It was just a slower process. You had to have located and read the relevant source, be it a book, paper or article, and access to these things was far more restricted than it is now. But the basic principle was the same. I think it’s reasonable to say that Weinberger’s point about metaphors rings true not because of a fundamental shift in what facts are, but because the Internet age has sped everything up and made data (almost) universally accessible.
The title of the book, Too Big To Know, implies that the volume of information we now have access to could be leading to a kind of overload, and there is a genuinely important (and unanswered) question about the impact of this on our brains. Are we getting cleverer or stupider as a result? Our burgeoning taste for punchy, sound-bitten data is obvious – if you can’t express an important idea in 140 characters, you’ll struggle to be listened to in some circles. Indeed, this review of Weinberger’s book on Inc.com is designed to give you the top line messages in about the time it takes to write a tweet. And this very blog post indicates that I’m clearly as much a sucker for this as anyone.
Nevertheless, I’m pretty sure that Weinberger expresses some important ideas, not least that it isn’t individual cleverness that really matters, but the collective cleverness of the networks in which we operate. In his interview for Salon he says:
With the new medium of knowledge — the Internet — knowledge not only takes on properties of that medium but also lives at the level of the network. So rather than simply trying to cultivate smart people, we also need to be looking above the level of the individual to the network in which he or she is embedded to see where knowledge lives.