We live in a world where the skills needed to thrive are changing fast. The rise of digital technology and artificial intelligence, combined with the move to a global economy, means that many previously ‘safe’ middle-income administrative jobs have either been offshored or computerised. Consequently, commentators predict a hollowing out of the economy, with a greater number of low wage, low skill roles at the bottom and a smaller number of highly paid jobs at the top of the pyramid. This growing imbalance – and the potential social issues it brings – has been analysed and written about by a number of leading economists, such as Thomas Piketty, in his surprise bestseller, Capital in the Twenty-First Century.
Despite what the nostalgic might think, this process is irreversible. Globalisation is accelerating and we can’t put the genie of artificial intelligence back into the bottle. So, how do we ensure that the UK workforce, and UK companies, are able to cope?
Go.On and On?
The key starting point is to understand that the traditional model of learning a particular trade or profession and then spending your entire life working at it is no longer valid. Kids at school today will have multiple jobs during their careers, many of which may not even have been invented yet. Given that you can’t teach someone about a profession that doesn’t exist, the best approach is to provide the skills for lifelong, independent learning, such as self-reliance, adaptability, collaboration and other thinking skills.
The other vital element is an understanding of, and mastery over, technology. To be fair, most children are miles ahead of their parents in this regard, and initiatives such as re-introducing programming to the school curriculum and low cost machines such as the Raspberry Pi are helping to drive these digital skills.
But the risk is that the current adult generation is falling behind. Research by the charity Go.On UK has found that 12 million people (roughly a quarter of the adult population) lack the basic digital skills required today. 23% of small businesses also lack these skills. Go.On defines these skills across five areas:
- Managing information (finding, storing and managing online information)
- Communicating (communicating digitally, interacting online)
- Transacting (shopping/selling online, managing finances digitally, registering for government services)
- Problem-solving (using online resources to learn and solve problems)
- Creating (basic content creation, such as writing a social media post)
For many of us, these skills are not particularly complex or challenging, but failing to learn them not only hurts people’s chances of a good job, it also impoverishes them financially. Those unable to buy goods online may well end up paying more, and will be increasingly cut off from family and friends. At the same time a significant number of people are held back by factors such as slow internet access speeds, poverty and a lack of access to technology.
To show the scale of the problem, Go.On has created a digital heatmap of the country, which combines local factors (infrastructure, education, demographics) with the percentage of people who have digital skills. This shows the areas that are at risk of being left behind – “digitally excluded” – in the future. What is stark when looking at the map is how few regions and local authorities are safe – the vast majority have a medium to high likelihood of exclusion.
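Go.On’s actual methodology is not public, but the underlying idea – combining several local factors into a single exclusion-risk score – can be sketched as a simple weighted average. The factor names, values and weights below are purely illustrative:

```python
def exclusion_risk(factors, weights):
    """Combine normalised local factors (0-1, higher = worse) into a
    single digital-exclusion risk score between 0 and 1.

    This is a toy illustration; it does not reproduce Go.On's model.
    """
    weighted_sum = sum(factors[name] * weights[name] for name in weights)
    return weighted_sum / sum(weights.values())

# Hypothetical local authority: each factor normalised to 0-1.
area = {"poor_infrastructure": 0.7, "low_education": 0.5,
        "age_profile": 0.6, "lacking_basic_skills": 0.8}
# Weight the skills gap itself more heavily than the contextual factors.
weights = {"poor_infrastructure": 1, "low_education": 1,
           "age_profile": 1, "lacking_basic_skills": 2}

print(round(exclusion_risk(area, weights), 2))
```

A score near 1 would mark an area as highly likely to be digitally excluded; mapping such scores across local authorities is essentially what the heatmap visualises.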
The Go.On findings must act as a wake-up call and a way of focusing efforts on increasing digital skills. My concern is that there doesn’t seem to be one body responsible for this – it is left to a combination of local authorities, central/regional government, schools, colleges, charities and even the BBC. While everyone should be responsible for learning basic digital skills, it needs a co-ordinated effort to level the playing field. Otherwise the imbalance shown in the Go.On map will actually widen, rather than shrink, hurting both individual prospects and the overall economy. It is time for rapid, government-led action.
There have been a number of recent pieces about the rise of self-learning technology that uses artificial intelligence (AI) to carry out tasks that would previously have been too complex for a machine. From stock trading to automated translations and even playing Frogger, computers will increasingly take on roles that used to rely on people’s skills.
Netflix used an algorithm to analyse the most watched content on its service, and found that it included three key ingredients – Kevin Spacey, director David Fincher and BBC political dramas. So when it commissioned original content, it began with House of Cards, a remake of a BBC drama, starring Spacey and directed by (you’ve guessed it) Fincher.
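Netflix’s real models are proprietary, but the basic idea – spot which attributes its most-watched titles share – can be sketched as a toy tag-counting exercise. The titles, viewing figures and tags below are invented for illustration:

```python
from collections import Counter

# Hypothetical viewing data: (title, hours watched, attribute tags).
viewing = [
    ("Title A", 120, {"kevin spacey", "thriller"}),
    ("Title B", 95,  {"david fincher", "thriller"}),
    ("Title C", 90,  {"bbc political drama"}),
    ("Title D", 15,  {"sitcom"}),
]

def top_attributes(viewing, top_n_titles=3):
    """Count which attribute tags appear across the most-watched titles."""
    ranked = sorted(viewing, key=lambda row: row[1], reverse=True)
    counts = Counter()
    for _title, _hours, tags in ranked[:top_n_titles]:
        counts.update(tags)
    return counts

print(top_attributes(viewing).most_common())
```

The attributes that recur among the top titles become the “ingredients” for a commission – in Netflix’s case, Spacey, Fincher and BBC political drama.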
This rise of artificial intelligence is worrying a lot of people – and not just Luddites. The likes of Stephen Hawking, Bill Gates and Elon Musk have all described it as a threat to the existence of humanity. They worry that we’ll see the development of autonomous machines with brains many thousands of times more powerful than ours, and whose interests (and logic) may not square with our own. Essentially the concern is that we’re building a future generation of Terminators without realising it.
They are right to be wary, but a couple of recent stories made me think that human beings actually have several big advantages – we’re not logical, we don’t follow the facts and we don’t give up. The Nobel laureate psychologist Daniel Kahneman has shown that the human mind is made up of two systems, one intuitive and one rational. The emotional, intuitive brain is the default for decision making – without people realising it. So in many ways AI-powered computers do the things we don’t want to do, leaving us free to be more creative (or lazy, depending on your point of view).
Going back to the advantages that humans have over systems, the first example I’d pick is the UK general election. All the polls predicted a close contest, and an inevitable hung parliament – but voters didn’t behave logically or according to the research and the Tories trounced the opposition. While you might disagree with the result, it shows that you can’t predict the future with the clarity that some expect.
Humans also have an in-built ability to try and game a system and find ways round it, often with unintended consequences. This has been dubbed the Cobra effect after events in colonial India. Alarmed by the number of cobras on the loose, the authorities in Delhi offered a bounty for every dead cobra handed in. People began to play the system, breeding snakes specifically to kill and claim their reward. When the authorities cottoned on and abandoned the programme, the breeders released the now worthless snakes, dramatically increasing the wild cobra population. You can see the same attempt to rig the system in the case of Navinder Singh Sarao, the day trader who is accused of causing the 2010 ‘flash crash’ by spoofing – sending sell orders that he intended to cancel but that tricked trading computers into thinking the market was moving downwards. Despite their intelligence, trading systems cannot spot this sort of behaviour – until it is obviously too late.
The final example is when humans simply ignore the odds and upset the form book. Take Leicester City. Rock bottom of the English Premier League, the Foxes looked odds-on to be relegated. Yet the players believed otherwise, stayed confident and continued to plug away. The tide now looks as if it has turned, and the team is just a couple of points away from safety. A robot would have long since given up…
So artificial intelligence isn’t everything. Giving computers the ability to learn and process huge amounts of data in fractions of a second does threaten the jobs of workers in the knowledge economy. However it also frees up humans to do what they do best – be bloody-minded and subversive, think their way around problems, and use their intuition rather than the rational side of their brain. And of course, computers still do have an off switch…
The industrial revolution mechanised previously craft-based activities, and since then machines have become more and more involved in creating the world around us. But until a few years ago, this mechanisation didn’t affect those of us in the creative industries – after all, our imagination and skills couldn’t be replicated by a machine.
The internet has changed all of that. In some cases it has allowed computers to take on tasks that were previously only done by humans, by applying artificial intelligence and machine learning and breaking the work into discrete tasks. You can now get computer-written journalism, which uses algorithms to bring together data and organise it into a rudimentary article. In the US, reports about minor earthquakes are now routinely created and published, based on information supplied by the US Geological Survey. It isn’t much of a stretch to see short sports reports written based on player data and profiles, avoiding the need to send a reporter out to lower league matches.
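At its simplest, this sort of report generator just fills a fixed sentence template from structured event data. The field names below echo the kind of information the USGS publishes, but the template and values are hypothetical, not taken from any real system:

```python
def quake_report(quake):
    """Fill a fixed news template from structured earthquake data.

    `quake` is a hypothetical dict; a real system would draw similar
    fields from a live feed such as the USGS earthquake data service.
    """
    return (
        "A magnitude {magnitude} earthquake struck {distance} miles "
        "from {place} on {date}. The quake occurred at a depth of "
        "{depth} miles."
    ).format(**quake)

event = {
    "magnitude": 3.2,
    "distance": 4,
    "place": "Westwood, California",
    "date": "Monday morning",
    "depth": 5.6,
}

print(quake_report(event))
```

The “journalism” here is entirely mechanical: the creative work went into designing the template once, after which every new data feed produces a publishable paragraph in milliseconds.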
However the biggest threat or opportunity to the creative industries is that the internet and digital technology have broken down the barriers around previously specialist occupations. Take photography. In the past only professional photographers could afford the equipment needed to create (and manually develop) arresting images. Now, similar levels of performance are available in a smartphone, and Photoshop can do the rest. News stories frequently use amateur shots from bystanders who happened to be in the right place at the right time, adding extra depth to articles. Design and PR are both equally affected. Anyone can set up as a web designer or copywriter, without necessarily needing to undergo lengthy training.
In many ways this is a good thing – the internet has democratised creative industries that were previously off limits to most of us and enables more people to share their thoughts, feelings and ideas. It uncovers real talents who never previously would have been spotted, whether that is musicians on YouTube or specialist bloggers with a passion for their subject. But what it also does is amateurise previously professional occupations. How can a portrait photographer compete on cost with a bloke and an iPhone? Again, a copywriter on eLance charges much less than a professional. And the overall effect is that there is more stuff out there (words, pictures, videos of cute cats), but quality is far more hit or miss.
Before people start complaining, as someone who makes a living through PR and copywriting I obviously do have a vested interest here. But that doesn’t mean I don’t welcome more competition and the chance for more people to be creative. Far from it. However businesses need to understand that you get what you pay for – in the same way that fixing your car yourself is inherently riskier than going to a garage (unless you are a mechanic), working with amateurs opens you up to potential issues. Do they have insurance if something goes wrong, do they understand copyright, are they using legal images on your new website? There are 101 questions that you need answered before handing over your money. And it can be pretty obvious when a website has been put together by the managing director’s teenage son or daughter. Businesses therefore need to strike a balance between democratisation and working with amateurs if they are to stand out in an increasingly crowded global market.
The world of work has changed immeasurably over the last ten years, not just in the UK but across all developed countries. Repetitive, process driven jobs have been automated, with technology replacing paper-based workflows. In many cases this has led to a hollowing out of sectors and companies, with the remaining workforce split between menial roles and higher level management.
And these changes are accelerating. A report in The Economist points to new technological disruption in the workplace, driven by computers getting cleverer and becoming self-learning. Lightweight sensors, more powerful cameras, cloud computing, the Internet of Things, big data and advances in processing power are all contributing to helping computers do brain work. Innovations such as driverless cars and household robots don’t require human intervention to operate, and can do more than traditional machines.
Research from Oxford University suggests that 47% of today’s jobs could be automated within the next 20 years. Many of these roles are in previously ‘safe’ middle class professions such as accountancy, the law and even journalism.
So, this raises two questions. What skills do people need if they are going to thrive in this new world – and are we teaching them to children quickly enough?
The employees of the future will require skills that complement machine intelligence, rather than mirror it. Empathy, the ability to motivate, and being able to think outside the box will all be needed. Essentially these are soft skills, backed up by specialist, experience-based knowledge that cannot be replicated by machines. Professionals such as therapists, dentists, personal trainers and the clergy are all seen as being relatively safe from replacement by robots. Interestingly, entrepreneurs often possess these talents, so expect them to thrive as they use technology such as the cloud to bring their innovations to market quickly.
As a knock-on effect, there will be a change in the size of companies people work for. Before the Industrial Revolution most people worked either for themselves or in small organisations (the village carpenter and his apprentice, for example). Industrialisation required scale, so vast mega companies grew up. These won’t disappear, but the number of people working for them will shrink dramatically as intelligent machines take over. We’ll move to a larger proportion of the population being self-employed, providing their services on a personal basis.
Looking at education, schools will also need to change. Pupils need to understand the world around them, so they have to be taught a certain number of facts and dates, but rote learning of what made the British Empire great is going to be useless for a large proportion of people’s careers. What is needed is to teach skills for learning and adapting, thinking for yourself and how to motivate and show empathy to others. Essentially, children starting school today will be going into careers that may not even exist yet – so lifelong learning and flexibility are critical.
The predictions of the havoc that technology will cause to the world of work may be overstated – just because something is technically possible, it doesn’t mean it will quickly become mass market. And governments, worried about massive social change, are likely to step in to mitigate the worst impact through legislation. But changes are coming, and we need to think more like entrepreneurs and less like machines if we’re going to thrive.
The New Year has seen a ton of hype about Quora, the community-based question and answer site. In part this is due to its pedigree, with a management team that left Facebook to set the company up; in part to a lack of news prior to CES; and in part a reflection of the basic human interest in asking questions (and giving answers). Just look at the number of failed attempts to provide intelligent question answering – Ask Jeeves being a prime example.
But while the ability for anyone to provide answers is interesting, I think it could be the site/network’s Achilles heel. Essentially, rather than using smart technology to catalogue existing content on the net, like Cambridge-based True Knowledge, Quora relies on it all being resubmitted. Great if you are a journalist looking for a range of responses for an article, but difficult to scale and not helpful if you are in a real hurry. How many Quora replies could have been handled through a Google or Wikipedia search and a bit of lateral thinking?
Essentially Quora is point technology – a cynical take on the current hype is that it has a smallish window to build subscribers and sell itself to another social network before they build the technology themselves. Like the rather sparse Future of Quora topic, what happens next to the company is difficult to predict.