50 years ago, engineer Gordon Moore wrote an article that has become the bedrock of computing. Moore’s Law, as first described in the article, states that the number of elements that could be fitted onto the same size piece of silicon doubles every year. It was then revised to every two years, and elements changed to transistors, but has basically held true for five decades. Essentially it means that computing power doubles every two years – and consequently gets considerably cheaper over time.
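The compounding is easy to underestimate, so here is a back-of-the-envelope sketch of what doubling every two years does over five decades. The starting count of 64 elements is illustrative only, not real chip data:

```python
# Back-of-the-envelope Moore's Law arithmetic. The starting count of 64
# elements is illustrative only, not taken from any real chip.

def transistors_after(years, start=64, doubling_period=2):
    """Element count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years // doubling_period)

# Fifty years at one doubling every two years is 25 doublings:
growth = transistors_after(50) // transistors_after(0)
print(f"{growth:,}x")  # 25 doublings compound to a 33,554,432-fold increase
```

That 33-million-fold increase is why a phone in your pocket can outstrip a mainframe that once filled a room.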
What is interesting is to look back over the last 50 years and see how completely different the IT landscape is today. Pretty much all the companies that were active in the market when Moore’s Law was penned have disappeared (with IBM being a notable exception and HP staggering on). Even Intel, the company Moore co-founded, didn’t get started until after he’d written the original article. At the same time IT has moved from a centralised mainframe world, with users interacting through dumb terminals, to a more distributed model of a powerful PC on every desk. Arguably, it is now heading back to an environment where the Cloud provides the processing power and we use PCs, tablets or phones that, while powerful, cannot come close to the speed of Cloud-based servers. This centralised model works well when you have fast connectivity but doesn’t function at all when your internet connection is down, leaving you twiddling your thumbs.
Looking around and comparing a 1960s mainframe with today’s smartphone you can see Moore’s Law in action, but how long will it continue to hold? The law’s demise has been predicted for some time, and as chips become ever smaller the processes and fabs needed to make them become more complex and therefore more expensive. This means that the costs have to be passed on somehow – at the moment high-end smartphone users are happy to pay a premium for the latest, fastest model, but it is difficult to see this lasting for ever, particularly as the whizzier the processor, the quicker batteries drain. The Internet of Things (IoT) will put chips in everything, but size and power constraints, and the fact that the majority of IoT sensors will not need huge processing power, mean that Moore’s Law isn’t necessary to build the smart environments of the future.
Desktop and laptop PCs used to be the biggest users of chips, and the largest beneficiaries of Moore’s Law, becoming increasingly powerful without the form factor having to be changed. But sales are slowing, as people turn to a combination of tablets/phones and the processing power of the Cloud. Devices such as Google Chromebooks can use lower spec chips as they use the Cloud for the heavy lifting, making them cheaper. At the same time, the servers within the datacentres that are running these Cloud services aren’t as space constrained, so miniaturisation is less of a priority.
Taken together these factors probably mean that while Moore’s Law could theoretically carry on for a long time, the economics of a changing IT landscape could finish it off within the next 10 years. However, its death has been predicted many times before, so it would take a brave person to write its epitaph just yet.
After announcements last year, this week saw the launch of the first Apple Watches, although they won’t go on sale until 24 April. The cutely named Spring Forward event saw the tech giant reveal all 38 models, which will range in price from £299 (for the sport model) to £8,000+, depending on screen size, design and whether you want it in 18 carat gold.
More importantly, Apple showed a selection of the apps that it expects to drive demand for the device. You can make touchless payments, receive phone calls, open a compatible hotel room door (rather than using a keycard), and remotely open an internet-connected garage door (no, I don’t have one of those either). However, for a large number of functions, such as messaging, GPS tracking and making phone calls, you’ll need an iPhone 5 or later running alongside your new watch.
Apple is not a stupid company and has grown to be the biggest quoted business in the world by market value through reinventing the music and smartphone markets. It hired former Burberry chief executive Angela Ahrendts to head up its online and physical stores, partly to help its move from technology into fashion with watches. I remember loudly proclaiming that the iPad would never catch on due to its innate pointlessness, and now I rely on it every day. But I still see some serious challenges to the Apple Watch attaining critical mass. Here are four of them:
1. Price

The cost of the Sport model begins at £299, with prices for the mid-tier Watch version starting at £479. To me, this is a lot of money to spend on a watch, even one that looks as sleek as the Apple device. And for £900+ you can buy a low-end TAG Heuer that you know will last for a long time without needing to be upgraded as software advances. Yes, millions of people have iPhones, but the vast majority got them on subsidised deals that meant they didn’t have to fork out close to the real sales price. A better comparison is the similarly priced iPad, which has seen sales slow as the market becomes saturated over time. Therefore predictions of sales of 60 million seem excessive, with the market much more limited than that.
2. Does it do anything different?
Anyone of a certain age who saw or read Dick Tracy loves the idea of using their watch to make a call, even if it is to the office rather than for police back-up. But Dick Tracy didn’t have a smartphone, which can do pretty much everything a watch can do – and more besides. And as Apple has said, you’ll need to retain your iPhone to provide many of the functions that can’t be squeezed into the watch. Admittedly the iPhone is getting bigger, making it more difficult to use for things such as contactless payments, but equally the watch could be seen as too small for many other activities.
3. A whole new market
Apple has always been known for its design excellence, and the Watch appears to be equally stunning, admittedly with a bulkier face than a traditional wristwatch. Hiring Ahrendts also points to a desire to bring in luxury marketing nous to help it move into a different sector, where factors outside technology excellence and cool apps could be more important. Can it become the fashion accessory that everyone wants? In the ultra-competitive watch market it will be difficult, though expect Apple to try to cross the chasm from geek to cool.
4. Battery life
Watch batteries traditionally last for years. In contrast iPhones provide just hours of charge, depending on how much Candy Crush you are actually playing. So the news that the Apple Watch will keep going for 18 hours is disappointing to say the least (although the company says that it will continue to show the time for up to 72 hours after that). Essentially consumers will need to charge the watch every night, plugging it in alongside their iPhone ready for the morning. It just reinforces that this is a technology product, rather than something you wear, and is bound to put some people off.
I could be as wrong about the Apple Watch as I was about the iPad, but to me, despite the hype, it won’t move beyond being a niche product for fanboys and girls who want to pair it with their latest iPhones. For me, if I had the spare cash I’d buy a TAG instead and leave technology to my phone……….
If you needed evidence of the growth of the smartphone market and its move into every part of our lives, then this week’s Mobile World Congress (MWC) provides it. It wasn’t that long ago that the event was dominated by network infrastructure companies, but now it is essentially a consumer electronics show in all but name. And one that looks far beyond the handset itself. Ford launched an electric bike, Ikea announced furniture that charged your smartphone and a crowdfunded startup showed a suitcase that knows where it is and how much it weighs.
Five years ago none of these companies would have even thought of attending MWC – and it is all down to the rise of the smartphone. It is difficult to comprehend that the first iPhone was only launched in 2007, at a time when Apple was a niche technology player. It is now worth more than any other company in the world and 2 billion people globally have an internet-connected smartphone. By 2020 analysts predict that 80% of the world’s adults will own a smartphone.
As any honest iPhone owner will freely admit, they may be sleek, but they are actually rubbish for making and receiving calls. What they do provide is two things – a truly personal computer that fits in your pocket, and access to a global network of cloud-based apps. It is the mixture of the personal and the industrial that makes smartphones central to our lives. We can monitor our own vital signs, and the environment around us, through fitness and health trackers and mapping apps, and at the same time access any piece of information in the world and monitor and control devices hundreds or thousands of miles away. Provided you have a signal……….
So, based on what is on show at MWC, what are the next steps for the smartphone? So far it seems to split into two strands – virtual reality and the Internet of Things. HTC launched a new virtual reality headset, joining the likes of Sony, Microsoft, Samsung and Oculus Rift, promising a more immersive experience. Sensors to measure (and control) everything from bikes and cars to tennis racquets are also on show. The sole common denominator is that they rely on a smartphone and its connectivity to get information in and out quickly.
It is easy to look at some of the more outlandish predictions for connected technology and write them off as unlikely to make it into the mainstream. But then, back in 2007, when Steve Jobs unveiled the first iPhone, there were plenty of people who thought it would never take off. The smartphone revolution will continue to take over our lives – though I’m not looking forward to navigating streets full of people wearing virtual reality headsets who think they are on the beach, rather than on their way to work…………
This week the election campaign has been focusing on education, with the Conservative Education Secretary, Nicky Morgan, promising that every child leaving primary school must know their times tables up to 12 and be able to use correct punctuation, spelling and grammar. It follows her predecessor, Michael Gove, revamping the history curriculum to ensure that pupils know about key dates in British history – a move that some saw as a return to Victorian rote learning of facts.
Morgan complains that Britain has slumped in international education league tables, and has vowed to move the country up in rankings for maths and English. But ignoring the fact that children are already tested on times tables, I think she’s missing the point about modern education and the skills it teaches. Of course, children should know their times tables, and be able to read and write. These are basic skills that everyone should have.
But we are in an era of enormous change, and the skills that the workforce of tomorrow requires will be very different to those of today. Increased globalisation, the advent of the knowledge economy and greater use of technology are having an impact on all jobs. Previously safe, middle-income management occupations will be broken into smaller chunks and either computerised or outsourced, hollowing out the workforce so that what remains are high-end, knowledge-based roles or more menial tasks.
What we need to do is prepare our children by helping them to develop the skills they will require in this brave new world. A large proportion of today’s pupils will end up working in jobs that don’t currently exist, so we need to focus on three areas:
1. Learning to learn
Rather than simply teaching facts and tables, we need to instil in children the skills they need to keep learning. These range from problem solving, resilience and working as a team, to ensuring they have inquiring minds and are always pushing themselves.
2. Lifelong learning
Alongside learning to learn, everyone needs to understand that education doesn’t stop when you leave school or university. Whatever field you are in, you’ll need new skills as your career evolves, so it has to be seen as natural to keep learning. The days of working for the same company for ever are long gone, and the days of working in the same role throughout your career are going the same way. So, people will have to make radical moves into new industries and careers, and that will require ongoing investment in learning new skills.
3. Coding

The UK government has re-introduced coding to the school curriculum, which is a major step forward in ensuring that everyone has the basic skills needed to understand and work with technology. While most jobs have required IT skills for a while, the spread of software into every corner of our lives means that those who understand and can program computers will have a big advantage over those that just use them to type emails or surf the net. I’d like to see more government investment in coding for all, alongside schools, so that everyone learns the skills they need.
Don’t get me wrong, it is a laudable aim that every child should leave primary school knowing that 12×12 is 144 and how to use an apostrophe. But we need to be teaching our children a lot more than that if we want to nurture a workforce of self-starting, motivated and problem-solving adults that can drive innovation and wealth for the country and wider society.
The internet has radically changed how we bank, removing the need to physically visit a branch and turning a thousand and one redundant ones into All Bar Ones and Wetherspoons. But the actual mechanics of transferring money around haven’t really changed. Through a combination of regulation and the sheer complexity of the financial world, most of us still entrust our money to a bank and use its systems to move it around. There are some notable new entrants, such as PayPal, and smaller banks, like Metro Bank, have been launched, but the majority of transactions still go through the same channels as before. The only change is that we do the work ourselves online rather than queuing up for hours in a draughty branch behind the man from the arcade paying in his weekly takings one penny at a time.
But most people recognise that the banking system doesn’t deliver the flexibility or mobility that technology could enable. So how do you do banking without the banks? One way would be to make it simple to transfer money from person to person using a web-based platform that the majority of the world is a member of. Step forward Facebook, which has applied to the regulator in Ireland to launch e-money across Europe. This would allow people to transfer money to others on the social network as well as to buy things online. The combination of Facebook’s reach and brand could provide stiff competition to the likes of Western Union. However, those worried about privacy may baulk at giving Facebook access to their bank details in any way, shape or form.
A second way is to change the currency altogether and allow payments and transfers through new forms of money, such as Bitcoin. However, the danger of an unregulated market has come back to haunt Bitcoin, with exchanges mysteriously emptied of money and government concern that the currency is used to pay for drugs, arms and sundry Bad Things.
Now the banking industry itself has come up with a third way. Paym, created by umbrella body the Payments Council, enables money to be transferred by simply typing in the phone number of the recipient, provided they are also registered on the service. Fast, direct and no need to give out your bank details to other people through insecure channels such as email. However, it looks like the banks themselves are unconvinced by the possibility of doing themselves out of a job. Twenty million account holders at RBS (and its subsidiaries NatWest and Ulster Bank), along with customers of Clydesdale, Yorkshire Bank and First Direct, won’t be able to use the scheme until later in the year, while Nationwide’s five million customers will have to wait until 2015. RBS says it is prioritising getting its IT systems straight, after several high-profile meltdowns, before joining.
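At its heart a Paym-style service is a proxy directory: a registered phone number resolves to the recipient’s bank details, so the sender never has to handle account numbers. The sketch below is purely hypothetical – all names, fields and numbers are invented for illustration, and it is not the Payments Council’s actual system:

```python
# Hypothetical sketch of a Paym-style proxy lookup. A central directory
# maps a registered phone number to underlying bank details, so senders
# never handle account numbers directly. Everything here is invented
# for illustration; this is not the real Payments Council service.

directory = {
    "+447700900123": {"sort_code": "12-34-56", "account": "87654321"},
}

def send_payment(phone_number, amount):
    """Resolve a phone number to bank details; fail if unregistered."""
    details = directory.get(phone_number)
    if details is None:
        raise ValueError("recipient is not registered with the service")
    # In the real scheme the sender's bank would now make an ordinary
    # transfer to the resolved sort code and account number.
    return {"to": details["sort_code"] + " " + details["account"],
            "amount": amount}

payment = send_payment("+447700900123", 25.00)
```

The design point is that the directory, not the sender, holds the sensitive details – which is exactly why registration is a prerequisite on both sides.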
With more and more of our money transferred online to friends and relatives who are further and further away from us, we need options that let us move money simply and quickly. But given our previous bad experiences with banks, will it be Facebook that steals a march and becomes the new financial hub for the internet age? Either way, consumers should benefit through genuine choice and hopefully better service, whoever they pick.
The world of work has changed immeasurably over the last ten years, not just in the UK but across all developed countries. Repetitive, process driven jobs have been automated, with technology replacing paper-based workflows. In many cases this has led to a hollowing out of sectors and companies, with the remaining workforce split between menial roles and higher level management.
And these changes are accelerating. A report in The Economist points to new technological disruption in the workplace, driven by computers getting cleverer and becoming self-learning. Lightweight sensors, more powerful cameras, cloud computing, the Internet of Things, big data and advances in processing power are all contributing to helping computers do brain work. Innovations such as driverless cars and household robots don’t require human intervention to operate, and can do more than traditional machines.
Research from Oxford University suggests that 47% of today’s jobs could be automated within the next 20 years. Many of these roles are in previously ‘safe’ middle class professions such as accountancy, the law and even journalism.
So, this raises two questions. What skills do people need if they are going to thrive in this new world – and are we teaching them to children quickly enough?
The employees of the future will require skills that complement machine intelligence, rather than mirror it. Empathy, the ability to motivate, and being able to think outside the box will all be needed. Essentially soft skills, backed up by specialist knowledge based on experience that cannot be replicated by machines. Professions such as therapy, dentistry, personal training and the clergy are all seen as relatively safe from replacement by robots. Interestingly, entrepreneurs often possess these talents, so expect them to thrive as they use technology such as the cloud to bring their innovations to market quickly.
As a knock-on effect, there will be a change in the size of the companies people work for. Before the Industrial Revolution most people worked either for themselves or in small organisations (the village carpenter and his apprentice, for example). Industrialisation required scale, so vast mega-companies grew up. These won’t disappear, but the number of people working for them will shrink dramatically as intelligent machines take over. We’ll move to a larger proportion of the population being self-employed, providing their services on a personal basis.
Looking at education, schools will also need to change. Pupils need to understand the world around them, so they have to be taught a certain number of facts and dates, but rote learning of what made the British Empire great is going to be useless for a large proportion of people’s careers. What is needed is to teach the skills of learning and adapting, thinking for yourself, and motivating and showing empathy towards others. Essentially, children starting school today will be going into careers that may not even exist yet – so lifelong learning and flexibility are critical.
The predictions of the havoc that technology will cause to the world of work may be overstated – just because something is technically possible, it doesn’t mean it will quickly become mass market. And governments, worried about massive social change, are likely to step in to mitigate the worst impact through legislation. But changes are coming, and we need to think more like entrepreneurs and less like machines if we’re going to thrive.
It’s CES time again, and one of the key trends at the world’s largest consumer electronics show is wearable tech. That, and the need to make sure your autocue is working in the case of film director Michael Bay.
I’ve previously talked about how devices such as smart watches (and Google Glass) only really have a market in monitoring vital signs and providing information on how our bodies are performing, and the products on show at CES bear this out. Intel launched a $1.3m competition to find the best uses for wearable tech (as well as its own take on the ARM-based Raspberry Pi) while Sony unveiled life-logging software and a device called the Core that monitors time spent on various activities, from jogging to watching movies. So if you’re bored with your life you can revisit what you were doing ten minutes ago, which sounds like a recipe for getting trapped in an infinite, ultra-mundane loop.
But the best uses of wearable tech are all linked to fitness. There’s a very good reason for this – as someone who runs a bit (and the husband of a hardened marathoner) I’ve seen how lucrative the market is. After all, like paying for gym membership, money spent on clothes/equipment used for keeping fit is guilt-free – so you can invest in five pairs of trainers and 23 different tops that may make you go faster without worrying too much. Add in the obsession runners have with stats and there is a large, and affluent, sector to target. Products on show at CES include headbands that monitor heart rates and core body temperature, as well as smart socks that check your running style. Copying a project seen at Idea Transform several years ago, there’s a sensor that attaches to your baseball bat, golf club or tennis racquet to provide real-time information on your swing to enable you to improve. Pity it came too late for England’s batsmen.
For once, I can see some of the wackier products announced at CES as having a future. Everything is there to make wearables work – sensors have been miniaturised, communication technologies such as Bluetooth are widespread and most of us have smartphones to provide control. This offers the chance to provide unprecedented, real-time information on your body that can either improve your fitness or guard your well-being. There are definitely going to be privacy issues that need to be tackled, but the market is there. Whether that’s the same for the connected toothbrush that ticks you off for not brushing properly, I somewhat doubt……….
Very few of us like paying tax, but there’s a fine line between legitimately reducing your tax bill and actively avoiding paying the tax that is due. And at a time of austerity where everyone is tightening their belts, there’s obviously a push by governments to close loopholes and maximise the revenues they receive.
Given their high profile and obvious success, Starbucks and Amazon have both been the subject of widespread condemnation of their tax avoidance methods, and I’ve covered Starbucks’ inept PR response in a previous blog. Google was up before a House of Commons Select Committee last week (for the second time), defending its claims that, despite revenue of £3 billion in the UK, all its advertising sales actually take place in the lower-tax environment of Ireland. Google boss Eric Schmidt has countered that the company invests heavily in the UK with its profits, including spending £1 billion on a new HQ that he estimates will raise £80m per year in employment taxes and £50m in stamp duty.
Apple is the next company caught in the public spotlight, with CEO Tim Cook appearing before a US Senate committee that had accused it of ‘being among America’s largest tax avoiders’. Meanwhile, the loophole that sees Amazon and other big US ecommerce companies avoid paying local sales taxes is being challenged by a new law passing through Congress, with estimates that between $12 billion and $23 billion extra could be collected.
Given the close links between Google and UK politicians (Ed Miliband is appearing at a Google event this week and Schmidt is expected to meet David Cameron on his current UK trip), the cynical view is that this is a lot of sound and fury, signifying nothing. But it does create an image problem for the companies involved, particularly at a time when we’re all meant to be in it together.
Obviously the most popular thing for companies to do would be to re-organise their tax affairs so that they meet the spirit as well as the letter of the law. But that’s not likely to happen given the enormous sums at stake. Instead expect increased calls for global tax reform (so that the organisations involved don’t have to operate the way they are currently ‘forced’ to) and a slew of feel-good announcements that demonstrate the level of investment and support for the UK economy by the companies concerned. Being ultra-cynical, perhaps the whole tax situation explains the huge support by big tech companies for Tech City – it is simply an elaborate way of diverting attention from their financial affairs…………..