Following last year's announcement, this week saw the full launch of the Apple Watch, although it won't go on sale until 24 April. The cutely named Spring Forward event saw the tech giant reveal all 38 models, which range in price from £299 (for the Sport model) to £8,000+, depending on screen size, design and whether you want one in 18-carat gold.
More importantly, Apple showed a selection of the apps that it expects to drive demand for the device. You can make contactless payments, receive phone calls, open a compatible hotel room door (rather than using a keycard), and remotely open an internet-connected garage door (no, I don't have one of those either). However, for a large number of functions, such as messaging, GPS tracking and making phone calls, you'll need an iPhone 5 or later running alongside your new watch.
Apple is not a stupid company, and has grown to be the biggest quoted business in the world by market capitalisation through reinventing the music and smartphone markets. It hired former Burberry chief executive Angela Ahrendts to head up its online and physical stores, partly to help its move from technology into fashion with watches. I remember loudly proclaiming that the iPad would never catch on due to its innate pointlessness, and now I rely on it every day. But I still see some serious challenges to the Apple Watch attaining critical mass. Here are four of them:
1. Price
The cost of the Sport model begins at £299, with prices for the mid-tier Watch version starting at £479. To me, this is a lot of money to spend on a watch, even one that looks as sleek as the Apple device. And for £900+ you can buy a low-end TAG Heuer, which you know will last for a long time without needing to be upgraded as software advances. Yes, millions of people have iPhones, but the vast majority got them on subsidised deals that meant they didn't have to fork out close to the real sales price. A better comparison is the similarly priced iPad, which has seen sales slow as the market has become saturated. Predictions of 60 million sales therefore seem excessive; the market is much more limited than that.
2. Does it do anything different?
Anyone of a certain age who saw or read Dick Tracy loves the idea of using their watch to make a call, even if it is to the office rather than for police back-up. But Dick Tracy didn't have a smartphone, which can do pretty much everything a watch can do – and more besides. And as Apple has said, you'll need to keep your iPhone to provide many of the functions that can't be squeezed into the watch. Admittedly the iPhone is getting bigger, making it more awkward to use for things such as contactless payments, but equally the watch could be seen as too small for many other activities.
3. A whole new market
Apple has always been known for its design excellence, and the Watch appears to be equally stunning, admittedly with a bulkier face than a traditional wristwatch. Hiring Ahrendts also points to a desire to bring in luxury marketing nous to help it move into a different sector, where factors beyond technological excellence and cool apps could be more important. Can it become the fashion accessory that everyone wants? In the ultra-competitive watch market it will be difficult, though expect Apple to try to cross the chasm from geek to cool.
4. Battery life
Watch batteries traditionally last for years. In contrast, iPhones provide just hours of charge, depending on how much Candy Crush you are actually playing. So the news that the Apple Watch will keep going for only 18 hours is disappointing to say the least (although the company says it will continue to show the time for up to 72 hours after that). Essentially consumers will need to charge the watch every night, plugging it in alongside their iPhone ready for the morning. It just reinforces that this is a technology product, rather than something you wear, and is bound to put some people off.
I could be as wrong about the Apple Watch as I was about the iPad, but to me, despite the hype, it won't move beyond being a niche product for fanboys and girls who want to pair it with their latest iPhones. If I had the spare cash I'd buy a TAG instead and leave technology to my phone…
This week BBC director general Tony Hall launched a slew of initiatives designed to reposition the beleaguered broadcaster. The aim is to show that the BBC is central to meeting the needs of consumers now and in the future, and to draw a line under an annus horribilis for the corporation, which has been plagued by scandals from Jimmy Savile to excessive payoffs for senior managers.
Amongst the news of a BBC One + 1 channel (by my maths that’s BBC Two), and expansion of iPlayer, one thing that caught my eye was a pledge to “bring coding into every home, business and school in the UK”. As someone who grew up in the 1980s it made me misty-eyed with nostalgia for the last time the BBC got involved in technology, with the original BBC Micro. Essentially the BBC put up the money for the machine to be given to every school in the UK, as well as producing TV programmes and courses on coding.
While I never had a BBC (I was a Sinclair Spectrum diehard), we used them at school and they did help me learn to code. It really was a golden age for UK computing, as it introduced a generation to computers they could not only play games on but also program and learn with. Programming your own creations was a viable alternative to just treating these machines as games consoles – particularly as a Spectrum game took about 10 minutes to load (and often mysteriously crashed just before it should have started). I was incredibly proud of my amazing horse racing game (complete with betting and flickering graphics), even if my programming days are now long behind me.
Not only did the BBC/Spectrum age produce a generation that wasn’t afraid of coding, but it also helped shape the UK IT industry. Acorn, the makers of the BBC Micro, spawned ARM, now a world leader in chip design, while countless games companies developed from bedrooms into multi-million pound concerns. You could easily argue that Cambridge wouldn’t be the technology powerhouse it is today if it wasn’t for the BBC.
But then IT became marginalised as a school subject – essentially replaced with learning to use desktop applications rather than program. In a global economy where companies compete on knowledge, the need to rekindle that interest in coding has never been greater. The BBC is not the first to understand this – the Cambridge-designed Raspberry Pi has become a global phenomenon as it brings back the spirit of adventure and exploration to children weaned on iPads and Wiis. There’s also a new computer science curriculum for schools and coding courses are becoming increasingly popular across the UK.
So where does the BBC fit into this? There’s a lot of hyperbole in the announcement about “using world class TV, radio and online services to stimulate a national conversation about digital creativity”, but very little detail. The challenge for the BBC is to pitch whatever it offers in a way that doesn’t replicate what is being done in the private sector and doesn’t dumb down coding to a simple point and click level. As seen in the 1980s, the backing of the BBC can be a major force for good, but it could equally stifle the innovation and creativity that it is trying to encourage. The jury’s out, but I hope it can turn the undoubted niche success of the coding revival into a mainstream movement – working with the industry to create the Acorns and ARMs of tomorrow.
The PC market has obviously been having a tough time of it recently, with sales plummeting 14 per cent in the first quarter of 2013, according to analyst firm IDC. The combination of the rise of tablets and smartphones, the global recession and the resurgence of Mac sales at the top end has put a dent in sales figures. And this has hurt the divisions of Microsoft that make most of their money from PCs, particularly the Windows operating system.
At the same time Microsoft has realised that it needed to up its game in the faster growing smartphone and tablet market to compete with the likes of Apple and Android. But then someone somewhere decided that solving these problems required a single solution. The result? Windows 8, a new universal operating system that would work across PCs, tablets and smartphones, giving the same look and feel whatever device was being used.
Unsurprisingly for something that tries to appeal to everyone, Windows 8 is dreadful. Its completely new, tile-based interface may work well on tablets and smartphones – though given Microsoft has sold fewer than a million of its Surface tablets (compared to 19.5 million iPads) it is difficult to make valid comparisons. But it has flummoxed traditional PC users, who have to learn a completely new interface that seems very much focused on consumer needs, with fast links to music and videos, rather than business requirements. No wonder companies are putting off PC purchases in the current climate – why splash out on something that will require a lot of training when Windows 7 works perfectly well?
The talk is now of a redesign for Windows 8, but my concern is how it has got to this stage. Microsoft has never really had a company-wide culture of innovation – from the original Windows it has tended to improve upon what is out there and deliver it well. Yes, it has areas of innovative research (the Cambridge office responsible for the Kinect for example), but (business) people buy Microsoft because it is the safe option.
Instead of following that path this time, it has thrown out everything that came before and decided to reinvent the user interface. Not just on one device, but across three – PCs, tablets and smartphones. Neither Apple nor Android has attempted that, because there are significant differences between small-screen mobile devices and PCs/laptops. Given that lots of people (including myself) still moan about the changes made in the last version of Microsoft Office, this has resulted in perplexed users and falling sales.
Microsoft can still fix Windows 8, but what it really needs to address are the issues that led to its development direction. People (and their devices) aren't ready for a universal operating system, and the fall in PC sales means that Microsoft isn't in the position of power it occupied five years ago. No-one seemed to realise that, hence trying to force-feed the PC market a completely new concept that seemed doomed from the start. Everyone wants to be Apple the stylish innovator, but Microsoft needs to take a step back and come to terms with its role as the boring bloke in the suit who makes things tick. After all, there's nothing worse than Bill Gates trying to look cool…
Video games are big business. Whether you measure it by the industry's £1 billion contribution to UK GDP or by the amount of time my children spend playing Angry Birds, the impact is enormous. In Cambridge alone, companies such as Jagex and Frontier Developments employ hundreds of staff – an estimated 10% of the UK's games developers.
But the era of the blockbuster console game is coming to an end. Despite the recent announcement of the Sony PlayStation 4, more and more games are now played casually on smartphones, tablets or simply online. As the current furore about in-app charges run up on iPhones and iPads demonstrates, all of these small payments add up to a big (and ongoing) windfall for developers. Rovio, the creator of Angry Birds and king of the casual game companies, is reportedly worth as much as fellow Finnish tech company Nokia.
Handheld consoles have already suffered – now analysts predict it could be the turn of big-budget gaming devices such as the Microsoft Xbox or Nintendo Wii. Ouya, a new Android-based console, is now shipping at the knockdown price of $99 following an $8m Kickstarter funding round. As any gamer/parent will know, it isn't just the cost of the console but the price of the games that adds up. And Ouya's games are expected to be low-cost apps as seen on Android devices, but beefed up to use the power of the console. Ouya's not alone, with UK-based PlayJam launching its own portable GameStick Android device.
But there's a big marketing challenge for these low-cost consoles. Casual gamers with a tablet or smartphone need persuading that they should shell out for a separate device, as well as investing in new games, particularly as many already have a PC. Serious gamers will compare the quality of the games on offer with the blockbusters on big-brand consoles, while children (a key market for games) want to be able to play the same games as their friends. Additionally, the likes of Microsoft and Sony have been working to turn their consoles into home entertainment hubs, acting as the bridge between the living room TV and the internet to cement their position in the market. Essentially it is chicken and egg – people won't buy a console until they know there are sufficient games available, while serious developers won't invest until there's a big enough target market.
I can see two ways for the likes of Ouya to get round this dilemma – and it'll take bravery and a bit of radical thinking. Firstly, adopt the same business model as casual games themselves – give away the hardware and charge for anything beyond the basics, either as a one-off or on a subscription basis. Risky, but it gets consoles into people's houses, and if they then take 30-40% of each £1.99 spent on a game they will build a subscriber base and some revenues. The second way is to partner with big-brand companies to bring the hardware price down to under a tenner. Whether it is a telecoms company (Sky, BT or Virgin Media), a retailer (Amazon, Tesco) or even an Angry Birds-badged console, it would widen the audience beyond the early adopter. The worry here is that, as we move to a cloud-based future, traditional console makers will go down the same route – and they already have major brand recognition.
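The give-away-the-hardware arithmetic is easy to sketch. A minimal illustration, where every figure is an assumption for the sake of the example: a £99-equivalent hardware cost, £1.99 casual games and a 30-40% platform cut:

```python
import math

# Illustrative break-even sketch for the "give away the hardware" model.
# All figures are assumptions for this example: a £99 console cost,
# £1.99 casual games, and a 30-40% platform cut on each sale.

HARDWARE_COST = 99.00  # assumed cost of one given-away console, in pounds
GAME_PRICE = 1.99      # assumed typical casual-game price, in pounds

def games_to_break_even(platform_cut: float) -> int:
    """How many £1.99 game sales recoup the cost of one free console."""
    revenue_per_game = GAME_PRICE * platform_cut
    # Round up: you can't sell a fraction of a game
    return math.ceil(HARDWARE_COST / revenue_per_game)

for cut in (0.30, 0.40):
    print(f"{cut:.0%} cut: {games_to_break_even(cut)} game sales per console")
```

On these assumed numbers, each free console needs roughly 125-166 game sales to pay for itself – which is why the model only works if it builds an ongoing base of paying players rather than relying on one-off purchases.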
However the gaming wars play out, the old market of monolithic consoles is under serious pressure – now is the time for new business models and smart use of subscription and cloud-based ideas if newcomers are going to emulate Rovio, rather than follow the likes of Atari into bankruptcy.
Last week Google’s executive chairman Eric Schmidt gave the prestigious MacTaggart lecture at the Edinburgh Television Festival, making him the first person from outside the broadcasting industry to do so.
As well as some crowd-pleasing attacks on Alan Sugar and a call for greater UK focus on technology innovation, he used the platform to talk about the forthcoming launch of Google TV. Already out in the US, this allows viewers to access the internet while watching TV programmes and search content across both.
Google TV is a logical move for the search giant, and the desire to be the gateway between the TV and the internet is a major reason for the recent purchase of Motorola, which has a big business in set-top boxes. Google exists because it is able to collect and analyse vast amounts of data and use the outputs to deliver targeted content and adverts; given that bugging people is illegal, the TV is the one untapped area of our lives that it doesn't currently have access to.
But I don’t think Google TV is going to have as easy a ride as some may think, for a number of reasons.
Firstly, the TV experience is still about content, and this is something produced by broadcasters first and foremost, whether it is live, accessed through catch-up services like iPlayer, or streamed and rented from the likes of LoveFilm or Hulu. So Google needs to take the industry with it – hence the partnership tone of Schmidt's Edinburgh speech.
Secondly, the TV market is still conservative and slow-moving. In my experience people buy PCs/tablets more often than they change their TV and, even then, they normally buy from trusted brands. The aim for Google is therefore to become part of the ecosystem, such as through Motorola set-top boxes and inclusion in new TVs. However, this will take time, particularly to reach a critical mass of mainstream consumers.
Thirdly, there is a lot of competition. At a basic level you can have your laptop on your knee to access programme information while you are watching, or hook your PC up to your TV to see downloaded or streamed programmes. The advent of the iPad has given you the chance to add a second screen to find out more information and share it with family and friends quickly and easily. And a whole range of other manufacturers – both technology players such as Apple and Microsoft and existing electronics brands – are bidding to be the portal linking your TV to the internet. Winning the trust of consumers and getting onto as many platforms as possible will dictate who wins this war.
Finally, there are a number of differences between how people watch TV and how they surf the net. One is public and the other is (we like to think) very personal. If Google combines your online search history with your TV viewing habits to serve up personalised ads, it may not make for harmonious family viewing. Just imagine your partner's reaction if Coronation Street is interrupted by adverts for dubious sites that you've 'accidentally' surfed to while on your own…
Overall, this is shaping up to be an intriguing struggle to control the TV, but Google will need to think smart if it wants to win its place in our living rooms…