Saving money can be hard to do, especially given the current economic climate and falling real wages. It can be difficult emotionally, too, with a recent report published by the Money Advice Service finding that many people prefer to spend their money “more on the here and now than on planning for the future.” But with the right help, maybe saving can even be fun.
Picture the scene: you are browsing online, just about to purchase a t-shirt. You don’t really need another t-shirt; after all, it looks exactly like all the others in your closet. But it is 30% off, so for £10 why not? Just before your online checkout, a message pops up asking whether you’d like to add £10 – instead of (or in addition to) making the purchase – to a savings account named “new computer”, “honeymoon fund”, or perhaps the more important but somewhat less motivating “unforeseen emergencies”. Or, as you are waiting for the barista to hand you your coffee, you eye up the croissants on the café counter. They look tasty, so you consider adding one to your order, but instead you use your phone to transfer the £2 you would have spent on the croissant into your savings pot designated for a gourmet foodie weekend in Paris.
This is what ImpulseSave, a small Boston-based organisation, is helping savers to do. Their motto, “go on a saving spree!”, reflects its basic function of replacing spending with saving. According to this article, ImpulseSave allows you to transfer money into a savings account via text or app, and provides prompts to save while you are shopping online. Similar to some other savings tools, your savings account is named for a specific goal, so you always have in mind what your savings are building towards. SmartyPig, another savings tool, also uses named accounts to keep the goal salient, but rather than making impulse saves, you set up automatic transfers from an existing account. Where it differs from more conventional bank accounts is that you can share your progress online via various social media, and friends or family can actually contribute to your savings pot to help you towards your goal.
The Social Brain Centre has argued elsewhere that saving money can be hugely beneficial to people; having a financial buffer can influence upward social mobility, effective decision making, and psychological wellbeing. But despite its benefits, many people find it hard to save.
So what do savings tools like ImpulseSave and SmartyPig offer to help people save more, that more traditional tools such as budget planners, while helpful, don’t seem to provide? Traditional tools assume that as long as people understand their incomings and outgoings, they will behave in such a way as to stay within their means. But just knowing the budget, while necessary, is not sufficient for many people to actually achieve their savings goals. Instead, we are often side-tracked by impulse purchases (the ImpulseSave website claims that a staggering 15-20% of our take-home pay is spent on impulse purchases “that we don’t need or even remember buying”!), short-sightedness, or lack of social support.
These newer tools, however, use insights about human nature and what drives our behaviour to help us (once we know our budget) stick to our savings goals. For example, given our busy lifestyles and our tendency to conserve mental energy, we are more likely to do something the easier it is to do. These tools make saving easy, either through automatic transfers or via simple digital tools. Naming the accounts brings our savings goals to the front of our attention, and helps keep us motivated by reminding us what we are working towards, even if that is to be spared the stress and anxiety of an unexpected expense (think a broken boiler or car repairs). And the social aspects of these tools may improve the motivation to save by evoking the desire to remain consistent with your publicly stated commitments, and also perhaps by changing social norms around openly discussing what may still be somewhat of a taboo subject.
This is not to say that financial literacy is not important, but rather that beyond learning how to budget we may need some extra help along the way to achieve our savings goals. Tools like those discussed above seem to be using behavioural insight to reposition saving from being something onerous to being something fun. So go ahead and try going on a saving spree, and comment below; we’d love to hear how it goes.
A trained psychologist myself, I took great interest in today’s call by the British Psychological Society for a departure from the biomedical model of mental illness. And, to my delight, so did other colleagues – read a great blog post from Social Brain’s Emma Lindley here, where she writes that we might right now be witnessing a bona fide revolution that may change mental health services so radically, ‘they will be unrecognisable to the children of my generation.’ As Emma points out, the debate is as much driven by differing concepts of human nature as it is by politics and the struggle for professional relevance and power. It is the latter aspect that I want to focus on in this blog post.
The RSA has long taken an interest in professions and their future (including this project in the early 2000s), and is currently managing an independent review of the Police Federation. Further international projects with other professions may follow soon.
Interestingly, even though Psychiatry is the younger term, it is arguably the older science, and literally means ‘the medical treatment of the soul’, whereas Psychology means ‘study of the soul’. Psychology and, specifically, its subdomain Clinical Psychology have always had a hard time standing up to their medical cousin. Part of the reason can be found in the etymology: isn’t medical treatment just so much more tangible than mere study? Thus, in more than one hospital of the world (including one I interned in a long, long time ago), Psychologists have not been much more than overeducated sidekicks to doctors. This may change soon.
The main reason for this is that over the last decade, and particularly since 2008, Psychology has arrived in the scientific establishment. It did so by using a strategy applied by underdogs since the advent of mankind: collaboration. (And, of course, the emergence of discipline rockstars like Steven Pinker has helped.)
Not having enough leverage itself, Psychology entered functional marriages with up and coming disciplines like neuroscience and traditional ones like economics, a process that led to the creation of new interdisciplinary fields like behavioural science. A prominent victim of this process was homo economicus – the notion that humans are wholly rational and narrowly self-interested. Homo biomedicus (not an official term, my inadequate creation), the similarly reductionist paradigm underlying present day psychiatry that acknowledges only the physical side of human existence, but leaves aside the social and psychological aspects, may very well be next.
There are two reasons to be concerned about the potential revolution of mental health services given that professional battle lines are drawn:
Firstly, while for Psychology there was the possibility of a non-threatening, mutually beneficial complementary relationship with economics or neuroscience, with Psychiatry it is different. Here the question is ‘who runs the show?’ – one, if you will, of professional hegemony. Still, one hopes that the critical voices on both sides steer the process away from the zero-sum game it is in danger of becoming, which would certainly leave everyone worse off.
Secondly, the homo biomedicus model is not entirely wrong, just as the homo economicus model is not completely off the mark. The concept has its merits and adequate areas of application, and it will need to be taken into account when designing future services based on a richer, more complex understanding of man as homo biopsychosocialis, embedded in a capabilities-based approach. Throwing out the baby with the bath water would be just as wrong.
Josef Lentsch is Director of RSA International – follow him at @joseflentsch
A lamentable aspect of friendship involves watching movies.
Specifically, when one friend has seen the film and the other hasn’t. For me, it’s a simple truth: not because I’m (that) disagreeable, moody or withdrawn, but because some of my friends succumb to the temptation of announcement. You know who you are.
We’ve all been there:
A living room, two friends, X and Y are midway through a film:
X: Oh, this part- mate, this, this is such a funny bit- watch this:
Event happens in the movie.
X reels with laughter, seeks confirmation from Y that it was, truly, oh so funny.
Y is silent.
X: Wasn’t it funny? I love that part.
In Hitch-22, the late Christopher Hitchens cited Theodor Adorno on this point:
Adorno made a beautiful corkscrew or double-helix-shaped aphorism about the Hays Office, which was then the headquarters of moralistic and ideological invigilation of the movie industry. Under its unsmiling rules, no double beds could be shown, no “race-mixing,” no untoward conduct or risqué speech. Nonetheless, ventured Adorno, an intellectually and aesthetically satisfying film could be made, observing all the limitations prescribed by the Hays Office, on the sole condition that there was no Hays Office.
Or: by enforcing, endorsing or applying a proposition or plan, one defeats or perverts the intended outcome. Y almost certainly would have found the part in the movie funny if only X hadn’t announced its funniness. Or consider the deflation New Year’s Eve revellers feel after making overzealous declarations of debauchery at the start of the night. A Parisian friend swears the least romantic thing a handholding date can do is swoon at the Eiffel tower, pout beneath his fringe and say, ‘oh, isn’t it romantic?’ Feel free to add your own.
New technologies and social media, however, influence behaviour subtly. They permeate our lives ubiquitously, allowing us to snap, update and share wherever we like. So frequent is our online activity that we’re amateur photographers, reporters or commenters without even realising it. Of course, activity and accomplishment are not the same things; I’m not suggesting that a photo of dinner or giving the peace sign before a monument will be recognised by the WPO.
Why do we do it? Well, precisely because there is no imperative to do so. People seem to prefer doing things without overt prompting or advertising. I’m certain the Social Brain team here at the RSA can explain or refute this with something more than mere assertion. But consider it: using a social network is not normally the result of a heavy-handed advertising campaign. Indeed, all David Dimbleby has to say during Question Time is ‘if you’re following us on Twitter, the hashtag is –’. If.
Therefore, you don’t need me to tell you that you can interact with the RSA through social media. In fact, given Adorno’s gorgeous thought-problem, you’d probably cringe with shame if another human being told you that it was ‘really great to upload your photos to our Flickr group’.
So I’m not. I’m just going to leave a link here, which you can choose to click or not: RSA Flickr Group.
Whether or not you click on the first group in the list, I promise not to tell you how fun it is, or how much I love it, or how it’s my absolute favourite site...
Gurmeet Singh is a Fellowship Researcher. You can contact him on firstname.lastname@example.org
Everyone has his or her own prejudice-drenched and ideology-affirming reason for why Finland appears to have such fantastic schools. Better trained, higher status teachers; a later school starting age; less time at school; ‘curriculum coherence’; an atmosphere of collaboration; no external testing or inspection; a monocultural population; dark boring winter nights with no choice but to study.
Pasi Sahlberg did a terrific job at an event at the House of Commons last week of tying most of these strands together whilst refuting a few of them. He outlined a historical set of values and a particular set of agendas and priorities that came together in the 1970s, and enabled the government to nurture and trust teacher professionalism and collaboration. His book, Finnish Lessons, is a beautifully written history and exposition of a Finnish miracle.
However, I have one more reason he didn’t mention, although it’s probably more symbolic than causal.
In the 1970s, Finland became the only country in the world to enter both the Eurovision Song Contest and its Communist bloc rival, the Intervision Song Contest. In 1980, Finnish singer Marion Rung won the Intervision with Hyvästi yö (Where is the Love?).
What might such a promiscuous attitude to song contests tell you about Finland’s approach to education?
First, Finland’s careful but precarious neutrality during the Cold War may have forced the government to adopt a different, more enlightened attitude to its people and public service professionals. Not trusting either market or state control, it had to rely on citizen control, despite being in the same, or even worse, economic mess as the rest of us.
Second, Finland’s deliberately pragmatic political ideology left itself able to borrow valuable educational ideas and practices from anywhere in the world, and adapt them to its own, well-understood context. Sahlberg acknowledged how much of Finland’s success had been inspired by the adaptation of initiatives from England and elsewhere. Picking and mixing policies from abroad carries obvious dangers, but doing this without ideology-tinted glasses gives you a greater chance of finding the right ideas, and adapting them in the right way. This pragmatism remains at the core of Finnish education. The most incredible mini-story Sahlberg told was of a group of hockey players whose school allows them to study more in the dark winter so that they can spend more time training and competing during the summer. The Finnish for ‘personalisation’ is, Google thinks, ‘mukautus’.
Finally, during a time of economic crisis and real global fears about nuclear war, Finland managed to think and act long term – its education strategies worked because it stuck to them.
Finland’s education system is far from perfect, and new challenges are emerging. Although Finland is not resting on its, er, fir branches, the next OECD PISA results might have different winners (and maybe, like the Intervision, someone should create an alternative to PISA and TIMSS). Thousands of education tourists might soon be flocking to Japan, or Poland, or Turkey (if PISA results could be disaggregated by region, they might even come to London). However, as the RSA develops our family of academies based on a set of values that foreground trust, collegiality, innovation and school-to-school support, maybe we are trying to create a piece of England that is forever Finland.
Yesterday, straight from an energising discussion with our Projects team about future RSA approaches to public services issues, I rushed to deal with something more current and tangible. My twelve-year-old daughter has a long-term health condition, which means regular appointments and occasional bouts of hospitalisation. After twelve years navigating a Victorian monolith, we now have the airy complexity of a brand new PFI building. We’ve gone straight from Dickens to Huxley.
My daughter has always been intense and feisty – most people who spend a few hours with her need to come up for air at some point – but in her regular interactions with medical people and places, this is amplified. And adolescence is now adding to the mix. Yesterday, she refused to answer questions that weren’t using the correct medical terms on the piece of paper in front of the physiotherapist. She asked irritating questions, gave cryptic answers, and her body language was moody, sullen and horizontally sprawled – she looked like she was on our sofa watching something excruciatingly boring on TV.
Like any parent would, I often plead with her to be more polite to a group of people who definitely want her to be as well as possible. At the same time, I know that her assertive games are a form of resilience – a way of coping with loss, setbacks and change, and of steeling herself for future battles and disappointments. She is an expert patient now, and her attitude in some ways ensures that the system treats her as such.
I remember Maria Balshaw, now Director of Manchester City Galleries, arguing that ‘arsiness’ was a key attribute of creativity, so should possibly be taught in schools. I doubt if this idea will catch on, but we do need to accept the need to develop qualities in our young people that aren’t always pleasant. Whether it’s the liberal perspective on social and emotional learning, or the more traditional approach through character education, both emphasise qualities and attitudes that, in essence, make children easier for us adults to deal with. Just be nice. Even our Opening Minds framework, which includes ‘coping with change’ as a key aspect of the ‘managing situations’ competency, might not be quite ready to develop and assess approaches which elicit and celebrate the nasty.
This links to an emerging idea for a broader RSA project: can we harness new insights into the teenage brain and other research to ask how can schools and society relish rather than fear the teenage years? What kinds of behaviour change do we need to promote, in both teenagers and the adults and institutions which deal with them, to ensure a happy, productive adolescence?
Being reflective means never having to say ‘sorry if these comments come a bit late’. So, after my recommended twenty minutes of reflection, here are some thoughts inspired by Monday’s excellent Reflexive Coppers report. The Social Brain team said reflexive, I’ll say reflective, even though I am sure that the difference matters to someone out there.
The report demonstrated a real appetite from the police to engage in new kinds of thinking and conversations, and also outlined the barriers, both cultural and institutional, that prevent reflectivity. These barriers are far from unique to the police force. Even the teaching profession, which by its nature you might expect to embrace reflection as a key pedagogy for pupil and adult learning, finds this difficult. It’s often a case of ’Teach first, ask questions later, if at all’.
In my previous job with Creative Partnerships, although the excitement came during the classroom projects themselves, most teachers and practitioners recognised that the most important, sustainable learning came through the reflective processes we built into the programme’s design and values – ‘question, connect, imagine, reflect’. This was often tough stuff, but ultimately it was the reflection that changed teachers’ practices when our circus left town.
My own experience of Action Learning as a powerful tool for solution-focused reflection was that it worked best with people who weren’t only outside your own workplace, but from different professions. Common Purpose’s model is partly built on this cross-professional approach, but their operation can appear too evangelical and assertive to encourage genuine reflection. It is also expensive. Are there cheaper, more self-facilitated ways for professionals across different public services to reflect collaboratively, possibly based on the TeachMeet DIY approach, and possibly on particular themes (for instance, children and young people)?
When my sister was training to be a nurse, during one of her first lectures her class of sixty students was told that “half of you will end up marrying policemen”. She neither became a nurse nor married a policeman; but if her lecturer was right, then reflective, cross-professional pillow talk may already be happening, off-duty, in various rooms of various homes.
Whose responsibility is it to support people with mental health problems in/to employment? Later this month, the OECD will publish what looks like an important piece of work examining the myths and realities about mental health and work. The issue is a spectacular tangle of grey areas, discrepancies and imprecision.
There’s the matter of looking after the mental health needs of those currently in employment. There’s the challenge of supporting people with long term mental health conditions in getting off benefits and into work both sustainably and without exacerbating their illness. There’s a complex relationship between recovery (which in the case of mental illness is rarely a linear process), therapeutic occupation and the pressure of responsibility.
There are underlying problems around the hindering of aspiration and ambition as a result of the onset of mental illness. There’s the confusing business of common mental disorders, such as anxiety and depression, and severe mental disorders, all of which can be chronic, transitory and, very likely, fluctuating in degree of severity.
Dealing with these challenges falls neatly between health and social care provision, the welfare system, and the big bad world of the free market. The very idea of mental illness takes us to extremely uncomfortable places, and the fact that employers don’t really want to know and employees don’t really want to talk about their mental health is only one indication of the deep social stigma that still exists.
So even though we know that the costs of mental ill health are enormous (estimated at as much as 4% of GDP in the EU), it does not seem to be an issue that people outside the mental health sector are engaging with very seriously. Employers and corporations are, so far, not a real partner to the mental health care system.
The government is ploughing money into the Work Programme, which pays specialist providers to get people with long term conditions into work, but without any kind of systematic approach towards employers and the workplace. The fact is that employers do not like the idea of mentally ill staff (only four in ten employers say they would hire someone with a mental disorder) and we cannot simply sidestep or overlook this when trying to encourage people off benefits and into work.
And, although there is overwhelming evidence that employment is an important element of continuous, on-going recovery from mental illness, the mental health care system takes minimal responsibility for the employment status of its patients. Part of the problem here is that so much of the mental health care system is geared up towards severe mental disorders, and a long held lack of expectation of recovery. Once a person falls into the system of mental health care, the path towards becoming a career mental health patient is a lot easier to fall onto than any alternative paths.
The OECD’s report will argue that policy can and must respond more effectively to these challenges, but to do so will require a co-ordinated approach and a multi-level shift. This will require a level of integration that is hard to see emerging from what the coalition is up to.
For example, we know that prevention and early intervention are crucial, so we need to find ways to join up vocational support with first line health care response to mental distress. To make working life compatible with long term mental disorder, there is a need to stop trying to shoehorn people with mental health conditions into inflexible and conventional models of working and encourage employers to accept variations in people’s productivity, and a more diverse and creative view of what it means to get the job done.
Within all of this I’m sure there are real opportunities for social enterprises to play a role in bringing about shifts not only in attitudes to mental health at work, but also in terms of matchmaking people with employers and helping employers to respond proactively to the mental health needs of the workforce. The Social Brain and Enterprise teams at the RSA are currently interested in exploring this area, so if you’re working in this space or have ideas you’d care to share, do get in touch.
Facts are so last century. In the Internet-dominated world, networked facts have pretty much taken over. The old-fashioned view of the fact is that it is an irreducible atom of knowledge. The way information is organised on the Web means that everything is connected and it is only as a result of the links between elements of information that facts come into being.
This is one of the points that David Weinberger puts across in his new book, Too Big to Know, launched yesterday in the US (not out in the UK until 19th January). Weinberger calls these configurations of linked data, in which two ideas are connected by a relationship, ‘triples’. In an interview given to Thomas Rogers for Salon, Weinberger elaborates:
OK, so, if the triple is “Edmonton is in Canada,” ideally each of those should link to some other spot on the Web that explains exactly which Edmonton, because there’s probably more than one, along with which Canada (though there’s probably only one). And “is in” is a very ambiguous statement, so you would point to some vocabulary that defines it for geography. Each of these little facts is designed not only to be linked up by computers, but in itself consists of links. It’s a very different idea than that facts are bricks that lay a firm foundation. The old metaphor for knowledge was architectural and archaeological: foundations, bricks. Now we have clouds.
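The structure Weinberger describes can be sketched in a few lines of code. This is my illustrative sketch, not anything from the book, and the URIs below are invented stand-ins (hence example.org) for whatever real linked-data vocabularies one might use; the point is simply that subject, predicate and object are each links rather than bare strings.

```python
# Illustrative sketch of a linked-data "triple" for the fact
# "Edmonton is in Canada". Each element is a URI (invented here),
# so a machine could follow it to further linked data.
triple = (
    "http://example.org/resource/Edmonton",   # subject: which Edmonton?
    "http://example.org/ontology/isIn",       # predicate: what "is in" means
    "http://example.org/resource/Canada",     # object: which Canada?
)

subject, predicate, obj = triple

# A human-readable rendering: take the last path segment of each URI.
summary = "{} --[{}]--> {}".format(
    subject.rsplit("/", 1)[-1],
    predicate.rsplit("/", 1)[-1],
    obj.rsplit("/", 1)[-1],
)
print(summary)  # Edmonton --[isIn]--> Canada
```

A fact, on this view, is not a brick but a small cloud of links.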
Now, I think I get this, and when we think about the ubiquity of the hyperlink, it’s pretty clear that Weinberger is absolutely right. But, even before the Internet, information was still linked, and it was still necessary to reference one idea in order to construct a basis for another. Aristotle, Darwin and Newton all did it. It was just a slower process. You had to have located and read the relevant source, be it a book, paper or article, and access to these things was far more restricted than it is now. But the basic principle was the same. I think it’s reasonable to say that Weinberger’s point about metaphors rings true not because of a fundamental shift in what facts are, but rather because the Internet age has speeded everything up and made data (almost) universally accessible.
The title of the book, Too Big To Know, implies that the volume of information we now have access to could be leading to a kind of overload, and there is a genuinely important (and unanswered) question about the impact of this on our brains. Are we getting cleverer or stupider as a result? Our burgeoning taste for punchy, sound-bitten data is obvious – if you can’t express an important idea in 140 characters, you’ll struggle to be listened to in some circles. Indeed, this review of Weinberger’s book on Inc.com is designed to give you the top line messages in about the time it takes to write a tweet. And this very blog post indicates that I’m clearly as much a sucker for this as anyone.
Nevertheless, I’m pretty sure that Weinberger expresses some important ideas, not least that it isn’t individual cleverness that really matters, but the collective cleverness of the networks in which we operate. In his interview for Salon he says:
With the new medium of knowledge — the Internet — knowledge not only takes on properties of that medium but also lives at the level of the network. So rather than simply trying to cultivate smart people, we also need to be looking above the level of the individual to the network in which he or she is embedded to see where knowledge lives.
Today is my 99th day of working at the RSA and I have just realised that a new habit seems to have formed in this time. Until now, I hadn’t thought of my lunchtime behaviour as either habitual or particularly routine. In my head, there’s all manner of lunching possibilities on my doorstep, and really I’m only temporarily buying lunch every day, as what I usually do is bring lunch from home. I tell myself I just haven’t got into the habit of it since I started this job and relocated to London but I’ll get back into the rhythm of it soon.
But, today, I have to admit that my old habit of bringing lunch from home may have had its day, having been usurped by a new one. When I went to get my lunch today, the woman behind the counter said two things which took me by surprise: “You’re a bit late today,” and “See you tomorrow”.
Ok, I’d already recognised that I tend to opt for that particular establishment, and the woman in question is a lovely, smiley person who at some point not long ago started recognising me and greeting me as a familiar customer. So, her talking to me was no surprise. What I had not realised is that, from her perspective at least, I have a ‘normal time’ for going for lunch, which is sufficiently predictable for her to notice and comment that I was ‘late’. And although, like I say, I recognise that I go to the same place more often than not, I wouldn’t have thought it was regular enough for a member of staff to expect to see me every day.
Habits are behaviours which we perform automatically because they have been performed many times in the past. The repetition of a particular behaviour creates an association between a situation and an action. The situation acts as a cue which prompts a behaviour to be performed automatically. Doing something automatically means doing it without thinking.
Phillippa Lally and her colleagues at UCL found that breaking habits is very difficult. The easiest way of breaking a habit is to control your environment so that you don’t encounter the cue which triggers it. They also found that being highly motivated to change a habit doesn’t help much, although it is even harder if you are ambivalent.
What does this mean for my lunchtime behaviour? Well, I can’t really remove the cue from my environment, however you conceive of the cue – which could be my need to eat at lunchtime, my being at the office at lunchtime, or the specific food outlet that I habitually go to. I need to control the environment in a different way. Maybe if I consider the cue as the combined situation of my needing to eat, being at the office and not having brought anything with me, there is scope to change. Essentially what I need to do differently is to bring my lunch in from home.
However, Lally’s team point out that new habits don’t stop old habits from existing. Although new habits can trump old habits once they become stronger influences on behaviour, the old habit is still in place. So, for fear of getting lost in a habits hall of mirrors, my old habit of bringing lunch from home must be lurking somewhere, and although it has latterly been replaced by this new habit of buying lunch, it’s still the older habit, and therefore might stand a good chance of displacing the new one and getting back into pole position.
The season for making resolutions is almost upon us, so after Christmas I’ll have a go at reinstating my old lunchtime habit. We’re back in the office on 3rd January; 66 days after that takes us to 9th March by which time I’ll be able to say whether, in the case of what I have for lunch, new habits die as easily as old ones.
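For what it’s worth, the date arithmetic checks out, at least under my assumption that the year in question is 2012 (the post doesn’t say, but 2012 is a leap year and is the only way 3rd January plus 66 days lands on 9th March):

```python
from datetime import date, timedelta

# 66 days after returning to the office -- assuming the year is 2012.
back_to_work = date(2012, 1, 3)
verdict_day = back_to_work + timedelta(days=66)

print(verdict_day.isoformat())  # 2012-03-09
```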
Following from my previous post on the Grandparent index, an attempt to add some fresh perspective on the key indicators of wellbeing, I would now like to add another: The haircut index.
A key indicator of wellbeing, I believe, is the temporal gap between deciding you need or want a haircut and actually getting round to having one. The longer this gap, the less perceived control you have over your own circumstances, which is a key predictor of wellbeing.
I’m having a haircut on Friday, and I feel well because of that fact: it is some sort of breakthrough after a month of putting it off due to perpetually imminent deadlines at home and work.
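As a toy sketch, the proposed index is nothing more than the difference between two dates. The dates below are invented for illustration (roughly matching "a month of putting it off"):

```python
from datetime import date

# Toy calculation of the "haircut index": days between deciding you
# need a haircut and actually having one. Both dates are made up.
decided_on = date(2011, 11, 14)
haircut_on = date(2011, 12, 16)   # "a haircut on Friday"

haircut_index = (haircut_on - decided_on).days
print(haircut_index)  # 32
```

The higher the number, the less perceived control over your own circumstances, on the argument above.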
You might think this is a trivial matter of personal tidiness, but I suspect it goes much deeper. Haircuts are a modern ritual in which we suspend our role as productive agents, and surrender ourselves to the tender care of a skilled stranger – a kind of secular shaman – who treats us as much with their benign attention as their manual dexterity.
And if that doesn’t convince you, here is the ‘blind them with science’ bit from our new secular oracle, Wikipedia:
“Hair is a filamentous biomaterial that grows from follicles found in the dermis. Found exclusively in mammals, hair is one of the defining characteristics of the mammalian class. The human body, apart from its glabrous skin, is covered in follicles which produce thick terminal and fine vellus hair. Most common interest in hair is focused on hair growth, hair types and hair care, but hair is also an important biomaterial primarily composed of protein, notably keratin.”
So if that’s what hair is (I particularly like the ‘notably’), surely cutting it off must be some sort of symbolic act?
So I propose the ONS should ask people about their capacity to follow through on their desire to have a haircut as a proxy for their wellbeing, and I am beginning to wonder if we could establish a whole new wellbeing index based on similar factors.
Proximity of grandparents, capacity to achieve haircut…what next?