Skills versus Qualifications


I attended an interesting debate last week regarding UK skills policy – specifically how to ensure that young people leave school with the skills employers need. It was one of those events where you arrive ready to give all the answers, only to find that everyone else already knows the answers too. It's just that nobody is doing anything about it.

And who can blame them? It’s a problem much bigger than any of us.

The setting for the debate was rather apt, given the subject. Beamish is an open-air museum that recreates life in North East England 100 years ago. Travelling to the meeting location on an old-fashioned tram, we passed the pit village where the school, moved brick by brick from nearby East Stanley, provides a model of education in the Victorian era.

In those days, manual work was as much a feature of children's daily lives at home as it was in the classroom. Alongside the "3 Rs", practical subjects such as woodwork, baking and needlework were taught in school. At home, children were expected to undertake housework and chores daily from an early age. While school was already compulsory at that time, many left between the ages of 12 and 14 to start work and earn money, or to help run the family business.

Practical skills have all but disappeared from the modern curriculum. There is little time for them in a system focussed on only one outcome – not skills, but qualifications. We recognise, anecdotally at least, that the traditionally academic route through university isn't for everyone. Yet our culture continues in its attempt to force everyone into the same mould, inevitably leaving those who are unable to conform excluded and branded as failures.

While it is no doubt a good thing to open up the possibility of a university place to everyone, to make it in essence compulsory is a mistake. The degree has unwittingly become the minimum entry requirement of employers who have little understanding of the constantly changing education system.

The lack of alternatives (or the stigmatisation of alternatives), coupled with the removal of compulsory work experience and careers advice, on top of the requirement to stay in education longer, means that many young people have no opportunity to find out what they are good at and little idea of what the world of work is actually like. Our teachers do an amazing job within those constraints, but they don't understand the world of work either.

I could say that I don't expect things to change with the general election approaching, but in truth I'm not at all optimistic that things will ever change. It's all very well placing more emphasis on outcomes and destinations after education, but without more fundamental changes performance isn't going to improve.

There isn’t a simple fix but ceasing to confuse qualifications with skills would be a good start.

Keep the debate going – let me know your thoughts below.

Why Employees Really Fake Sickness


There's something innately human that seeds doubt in our minds about whether people's illnesses are genuine, especially when it comes to absence in the workplace. Maybe it's the UK's history of generous health benefits coupled with a selfish reluctance to cover other people's backs, but unless someone is displaying clearly observable, severe symptoms there's often suspicion. The Confederation of British Industry tells us that just over 20% of the workforce see paid sickness absence as an entitlement rather than an unfortunate emergency (CBI survey, 2013). Whether this is matched by the same level of fraudulent activity, we'll probably never know.

Absence levels increase with age. On the face of it, this would indicate declining health as people get older. Yet contrary to this theory is the fact that absence falls after state pension age. That may be because only the most dedicated workers remain in employment when they could retire, but it is probably not the whole picture. Many of today's pensioners will remember the UK before the introduction of the National Health Service, alongside a welfare system that was much tougher and more sporadically provided than today's. Only those who worked were entitled to minimal care from a doctor, via unemployment and illness insurance that was fifty per cent funded by employers. This excluded the majority of women and children, because they didn't work. Medicines and care were expensive and far less advanced than today. The very poor might receive some charitable assistance, or end up in a workhouse where they were deemed in need of reform. Illness was often the start of the slippery slope towards death, and there was simply no incentive to fake it.

I'm not suggesting we return to those days in order to strike out the (hopefully) small minority. Non-genuine sickness is always going to be a difficult subject because it's unfortunate for those who really are sick. Human Resources practitioners aren't medically qualified and are forced to rely on those who are to help them get the answers they need to make tough decisions. Yet more than three quarters of GPs feel under pressure to issue sick notes (DWP survey). With the need for more practical answers, I think it's helpful to understand the root causes, and with that in mind I propose that the two main issues are upbringing and culture.

Upbringing

My sickness absence history is minimal, and a large part of that is down to my parents' attitude. While they never explicitly told me as much, I know for certain it was their philosophy that if you're well enough to stand, you go to work as normal. The guilt I felt at my parents' disappointment would still stop me in my tracks today.

There's no denying that welfare affects people's behaviour. This is demonstrated throughout the history of the welfare state and, most recently, by the decline in sickness absence alongside welfare reform. Putting aside the stressful effects of the recession, many employees under threat of redundancy were no doubt just grateful to keep a job, and the question of taking a sick day didn't come into it. Yet underlying all this is people's general attitude to work, absence and perhaps even honesty. While we can't undo a person's upbringing or unravel their complex personality traits, there's one key element employers can influence. This brings me to my second cause.

Culture

Employee perks such as the "unlimited duvet days" being proffered by companies like Virgin and Netflix are enough to strike fear into the heart of even the most laissez-faire manager. In theory, they imagine deserted offices and employees pushing the policy as far as they can. In practice, the pressures of performance targets and presenteeism likely mean that those employees take less leave than average, a fact glossed over in a big PR exercise about employee trust. Underlying it all is the knowledge that if you prove yourself untrustworthy you can forget the privileges, along with your job.

It's always going to be a difficult balance to strike between protecting the genuinely ill and weeding out those who abuse the scheme. Like upbringing, culture is a very personal thing, and it depends how organisations want to play it. There will always be an element of carrot and stick, of give and take. It's not trust alone, but a complicated mix of inputs, outputs and management relationships. Employees who respect that culture will feel emotions like guilt and the desire to avoid disappointment, and in the absence of workhouses and moral correction, I'm afraid that's the only option we have left.

Image credit: from Time Out article “Fake a sick-day call”

I Am Brand


I’m starting to think that branding is developing a brand. Suddenly it’s everywhere. I yearn for the good old days when a brand meant what kind of cola you drank and comparisons between “big brands” and “supermarket brands” warranted half page spreads in women’s magazines.

Employer branding I get. We need a consistent message to tell people what we're about and attract them to work for us. But the rise of the "personal brand" is a mystery to me. Sorry, but where I'm from we call that personality.

I guess it fits with the fakery and gimmickry of the digital age. Online we can be a traffic cone or a unicorn if we want to (yes, both of these really followed me on Twitter). The pressure to present ourselves as witty superhumans in just 140 characters is immense. If only so that when our former boyfriend/boss/partner stalks us online, they're gutted they let us go.

Advice about developing a personal brand is readily available on the Internet, and is all contradictory nonsense. “Be consistent” yet also “be authentic”. “Let people know your personality”, “let people see you” and “be good at something”. Really? And my personal favourite “it’s not just enough to know that you know what you know – others have to know you know it”.

Sounds to me like describing to a complete outsider what it means to be human. Except not quite. It’s too formulaic. Too stale. A poor reflection of itself. If I know anything about branding, it’s this. If that’s what it’s become, then it’s time to change.

Modern Day Mastery


You could be forgiven for thinking that nothing ever endures amongst the fast changing fads and fashions of modern living. It’s not just talk that’s cheap. Clothes, cars, cash are all consumable clutter used to communicate status. Money, people and even love, are a means to an end. Everything is temporary and we can always get more, more, more.

But there's one thing we can't control. It's infinite, yet we can never get enough. We try to apportion it, manage it, record it and analyse it, yet it continues on regardless. It is a harsh mistress, yet it heals all ills. We spend it more freely than money. We can't stop it, or turn it back. Yes, our time is precious, ticking away towards an unknown end as surely as the sun rises. Yet how many of us can say we use our time wisely, doing something worthwhile?

Of course, defining a good use of time is extremely subjective. Time is constant, yet how we use it is infinitely variable. Some of these uses are dictated by government policy, employers, family and cultural norms. The vast majority of us attend school, college and work as regularly as clockwork in order to avoid punishment for non-compliance. Whether we engage fully in that time by concentrating, learning and performing is a different matter. There's always choice.

Just as one person’s trash is another’s treasure, one person’s time well spent is to another wasted. As employers we want our staff to conform to our definition of a good use of time, and we’re paying them to do so. As recruiters we pass judgement on whether a candidate’s life up until that point has been used wisely. We question whether their time spent in a job is too long or too short. Whether they’ve spent enough time doing academic training. Whether they’ve spent any time doing nothing and what that means. If we conclude they’ve made good use of that time, we ask them to commit their future working time to us and reward them with a job and salary.

It takes a significant investment of time to master a skill – according to Malcolm Gladwell, 10,000 hours. This is much longer than a typical modern day apprenticeship or degree and implies sustained dedication towards a singular goal. It's easy to see how such a lengthy period could lead to mastery of a physical skill, with measurable, tangible improvements demonstrating competence. But for many modern job roles, focussed on gathering and assimilating knowledge, time is a poor measure. Nor is mastery much better, since whether what we know is correct is subjective, may have little impact and changes quickly. Think of technology – we might invest a significant amount of time mastering certain digital processes, only for them to be obsolete tomorrow.

So modern day mastery is not just about learning; it's about looking ahead to see what we don't know now but might need to know in the future. But how have our recruitment and development processes evolved to identify indicators of this behaviour, such as potential and talent? The answer is, they haven't. And that's a problem.

Can HR help the homeless?


“If you have a good idea and people don’t like it, don’t take it away and tone it down so that more people like it. Take it away and amend it so it pokes people in the eyes”

David Orr, Chief Executive of the National Housing Federation

I just had to share this great quote, spoken last week at the fantastic "Creating Homes and Futures" event, organised by Youth Homelessness North East.