Everyone understands that the bigger a company gets, the more difficult it is to create and nurture ideas. There are a number of reasons. The sheer size of the organisation militates against change – it is incredibly difficult to get everyone to understand a game-changing idea and align themselves behind it. You end up with a fragmented approach, and the whole thing can get mired in bureaucracy and finger-pointing.
Large organisations are inherently conservative, with people not wanting to rock the boat, while fierce rivalry between different divisions/departments can lead to ideas being squashed if they seem to tread on someone else’s turf. There’s also a fine line between a strong company culture and too inward-looking a focus. Even successful companies such as Facebook have been accused of a lack of perspective – because they solely use (and love) their own products, they assume everyone else finds them equally awesome. Step outside the organisation and your obsession is just a minor part of the lives of your customers.
The good news is that the majority of organisations do understand the need for a stream of fresh ideas. After all, the world today is dominated by companies such as Google, Facebook and Amazon that either didn’t exist twenty years ago, or were considerably smaller. Competition in every market is increasing and no-one wants to go the way of Nokia or Woolworths.
So how do you align your company to create the best forum for generating ongoing ideas? I’m no management consultant, but I’ve seen a few attempts over the last twenty years and it boils down to three broad types:
1 Innovation silos
In many industries (such as pharmaceuticals), where innovation relies on expensive capital equipment, it makes sense to create separate, concentrated research labs. These have the intellectual muscle and resources but can suffer from their sheer size and distance from the business, hitting the same problems as any other big organisation, with divisional rivalry and a static corporate culture. Alternatively, businesses have focused innovation in standalone business units – skunkworks operations that are locked away from the rest of the organisation, incubators that support promising ideas at arm’s length, or even smaller companies that have been bought and are run as ideas factories. All of these can work, provided management stay true to their word not to meddle or demand fast results, but there’s still no connection with the wider business and its needs.
2 The campus
You break up your monolithic organisation into a campus-style environment, with different divisions occupying their own buildings, but close together. Splitting into smaller teams is good for creativity, and you get the economies of scale of having everyone on a single, but large, site. However, the ability to cross-pollinate between groups can be limited – unless you happen to bump into someone over lunch, you might be completely in the dark about what other sections of the company are working on.
3 The college
What I think is really interesting about the campus model is that it deliberately mimics the university campus structure. While this makes for a good working environment, it doesn’t help spread ideas. So I think companies need to look at a more collegiate model, similar to that of universities like Cambridge. You have two allegiances/bases – your division (essentially your college) and your actual project (your faculty). So you get the chance to mix with people from other divisions and collaborate on joint projects. Some people may find it disorienting, but if projects are scheduled to last 2-3 years the goal is never that far away.
Innovation is vital in every industry, and the size and structure depends on the sector and the market each company operates in. But I think it is time for more organisations to look at the college structure if they want to nurture and develop a stream of ideas that take their business forward over the long term.
There’s nothing as embarrassing as a politician trying to explain complex technology and completely missing the point. And as IT is increasingly seen as ‘cool’ and therefore something they want to be associated with, you can see a growing number of our elected leaders showing their ignorance in public. Following George Osborne’s cringeworthy appearance in the Year of Code video, and David Cameron’s attempted hijacking of Silicon Roundabout, the PM is at it again.
Speaking at CeBIT in Germany this week Cameron started off well, lauding the potential of the Internet of Things, praising the innovation of UK companies such as ARM and Neul, talking about Anglo-German co-operation and doubling funding for the area. But then what example did he trot out to show what it means to the general public? That our fridges can talk to each other, and order a pint of milk when we’re running low. Hardly a New Industrial Revolution.
The internet-enabled fridge has been around as long as I’ve been in PR (nearly 20 years) and, despite regular press appearances (and some actual products), it has failed to take off. Primarily because it is a stupid idea. Most of us (apart from one of my old housemates) can see when we are running low on food or milk and visit the shops accordingly. If not, are supermarkets expected to drop everything and rush you a single pint of semi-skimmed because your fridge told them to? Hardly economically viable. And what happens if you bought something and didn’t like it – will your fridge keep ordering more until your house is full of Dairylea cheese triangles? How will privacy be managed? Will the device send your eating habits direct to FMCG companies, like a giant ClubCard? What about security?
Poking fun at ill-advised politicians is easy, but the danger with the fridge fixation is it masks the real benefits of the Internet of Things and paints the wrong picture to the general population. We are talking about the ability to monitor our health, reduce the need for hospital stays as patients can be treated in their own homes, better manage our energy use, save us money and create smart cities that share information to make our travel and lives easier and more fulfilling. It is probably true to say that the real innovations of the Internet of Things haven’t even been thought of yet – but will develop on the platform once it becomes prevalent.
Due to the combination of the UK’s existing strengths in low-power chip design, Bluetooth and the availability of radio spectrum, plus the efforts of pioneers in the smart home space, there is a real chance that the country (and Cambridge) can become a major player in the Internet of Things. And given that the sector is expected to be worth £8.7 trillion globally when it hits maturity (according to Cisco), it is definitely the right market to target.
To succeed, UK companies need support, a well-defined technology plan that maximises investment in the right areas and a long term vision, rather than lazy examples of machines that no-one wants that will put people off the entire concept, raise potential privacy concerns and stifle acceptance. Install an internet-enabled fridge in Number 10 by all means, but do a better job of explaining the bigger benefits to the wider population if you want the Internet of Things to really take off.
I’m a passionate believer in getting more people to learn to code. Like a lot of those my age I grew up with a ZX Spectrum and learned basic programming on a BBC Computer at school. Not only did it reap benefits then (my horse racing game was a triumph, albeit not a financial one), but it gave me an idea of how computers worked that removed any fear of them when I went into the workplace.
And, as my career in PR has progressed, more and more of what I do has a technical element to it – whether that is getting a WordPress site up and running or stitching together data from different tools to measure the impact of campaigns. Not understanding technology or being unable to use it would significantly impact my productivity and my overall job prospects.
When I look back, comparing my childhood to now, the world has changed dramatically. On the plus side we’re now in an era where geekiness is cool and entrepreneurs are celebrated for their ideas. But the opportunities we have to code have been lessened – rather than ZX Spectrums we have gaming consoles that cannot be programmed, except by studios with multi-million pound budgets. Yes, we have the iOS and Android ecosystems where anyone can create an app, but the majority of us are consumers, not programmers.
Clearly there’s a need for change, and initiatives such as the Raspberry Pi and the inclusion of coding in the National Curriculum from September are helping accelerate this. However the fiasco that is the government-backed Year of Code project is an unwelcome bump in the road to the future. For those that haven’t heard of it, the Year of Code is supposed to be an umbrella organisation to encourage everyone to learn to code in 2014.
Unfortunately, so far it appears to be a PR-led initiative to muscle in on the work that is already being done. Backed by venture capitalists and the TechCity community, its main claim to fame is the ill-fated appearance of its executive director Lottie Dexter on Newsnight, where she earned the ire of Jeremy Paxman by admitting that she didn’t actually know how to code. More importantly, it appears to have alienated many people who have been working in the space for years by simply not recognising what has already been done.
And, judging by its website, apart from a promotional film (warning – contains footage of George Osborne) and a commitment to “banging the drum for all the fantastic coding initiatives taking place over the course of year and helping many more people engage with technology and access important training opportunities,” it isn’t actually going to do much that is concrete. Essentially it is PR spin on a serious subject, trying to take the lead in the same way as the government has decreed that TechCity is the only viable tech cluster in the UK. It is jumping on a bandwagon and trying to take the reins from those that know what they are doing.
Coding is essential to our competitiveness and the future of our children – it is simply too important to be left to a slick marketing machine imposed from the top down. Time for the Year of Code to be switched off and then on again to remove the bugs from the system.
The world of work has changed immeasurably over the last ten years, not just in the UK but across all developed countries. Repetitive, process-driven jobs have been automated, with technology replacing paper-based workflows. In many cases this has led to a hollowing out of sectors and companies, with the remaining workforce split between menial roles and higher level management.
And these changes are accelerating. A report in The Economist points to new technological disruption in the workplace, driven by computers getting cleverer and becoming self-learning. Lightweight sensors, more powerful cameras, cloud computing, the Internet of Things, big data and advances in processing power are all contributing to helping computers do brain work. Innovations such as driverless cars and household robots don’t require human intervention to operate, and can do more than traditional machines.
Research from Oxford University suggests that 47% of today’s jobs could be automated within the next 20 years. Many of these roles are in previously ‘safe’ middle class professions such as accountancy, the law and even journalism.
So, this begs two questions. What skills do people need if they are going to thrive in this new world – and are we teaching them to children quickly enough?
The employees of the future will require skills that complement machine intelligence, rather than mirror it. Empathy, the ability to motivate, and being able to think outside the box will all be needed. Essentially soft skills, backed up by specialist knowledge based on experience that cannot be replicated by machines. Professionals such as therapists, dentists, personal trainers and the clergy are all seen as relatively safe from replacement by robots. Interestingly, entrepreneurs often possess these talents, so expect them to thrive as they use technology such as the cloud to bring their innovations to market quickly.
As a knock-on effect, there will be a change in the size of companies people work for. Before the Industrial Revolution most people worked either for themselves or in small organisations (the village carpenter and his apprentice, for example). Industrialisation required scale, so vast mega companies grew up. These won’t disappear, but the number of people working for them will shrink dramatically as intelligent machines take over. We’ll move to a larger proportion of the population being self-employed, providing their services on a personal basis.
Looking at education, schools will also need to change. Pupils need to understand the world around them, so they have to be taught a certain number of facts and dates, but rote learning of what made the British Empire great is going to be useless for a large proportion of people’s careers. What is needed is to teach the skills of learning and adapting, thinking for yourself, and motivating and showing empathy to others. Essentially, children starting school today will be going into careers that may not even exist yet – so lifelong learning and flexibility are critical.
The predictions of the havoc that technology will cause to the world of work may be overstated – just because something is technically possible, it doesn’t mean it will quickly become mass market. And governments, worried about massive social change, are likely to step in to mitigate the worst impact through legislation. But changes are coming, and we need to think more like entrepreneurs and less like machines if we’re going to thrive.
In many ways the news that Google has bought smart home company Nest Labs shouldn’t be a surprise. It has been talking to the company for some time and apparently lots of Google employees had installed the company’s sensor-based thermostat in their own homes.
More to the point I think it fits in with Google’s overall objectives. As analysts have pointed out, Google isn’t a search engine company (and hasn’t been for some time), but is about data – collecting it (analysing search results, Google Glass, StreetView) and then using it to either sell you things (through adverts) or make your life better in some way.
With billions of sensors embedded in previously dumb objects that will be communicating in real-time, the Internet of Things promises to create a tidal wave of data. Each piece will be tiny, but if you can bring it together and analyse it you can get an even deeper view of the world around us, and the people in it. Nest’s products are much more than thermostats, and provide Google with the sensor/Internet of Things expertise it needs to add to its product portfolio. It already has Android-based smartphones/tablets to act as controllers, the mapping technology to show where sensors are located and the technology to analyse billions of events in real-time. And with Google Fiber rolling out in several US cities, it has a network to send the data through as well.
A simple example – your Nest thermostat notifies you that your boiler has gone wrong via your smartphone while you are at work. And suggests a registered tradesman that can fix it by trawling the web and any recommendations in your Google+ circles. Or alternatively gives you the address of the nearest clothing shop, so you can stock up on thick jumpers.
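The boiler scenario above is really just a simple rule engine sitting on top of sensor data. Here is a minimal sketch of that logic – every class and function name here is invented for illustration, not any real Nest or Google API:

```python
# Hypothetical sketch of the smart-home alert scenario above.
# The names (ThermostatReading, check_boiler) are invented for
# illustration only -- they are not a real Nest/Google API.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ThermostatReading:
    room_temp_c: float
    target_temp_c: float
    boiler_responding: bool

def check_boiler(reading: ThermostatReading) -> Optional[str]:
    """Return an alert message if the boiler appears to have failed."""
    if not reading.boiler_responding:
        return "Boiler not responding - notify owner's phone"
    # A large gap between target and actual temperature is another
    # sign the heating has failed, even if the boiler still answers.
    if reading.target_temp_c - reading.room_temp_c > 5.0:
        return "Home far below target temperature - possible boiler fault"
    return None  # all fine, stay quiet

print(check_boiler(ThermostatReading(12.0, 20.0, False)))
```

The hard part, of course, isn’t the rule itself – it’s joining the alert to everything else (tradesman recommendations, your social graph, nearby shops), which is exactly the data-stitching advantage Google has.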
Many people (myself included) would find this a bit creepy, but it is potentially possible if you can knit all the technology together. What I think is interesting is how utilities will respond to the future entry of Google into the market. After all, as publishers and others have found, Googlification can squeeze out incumbents through sheer scale and by engaging more closely with customers. Utilities have to decide whether to partner with the likes of Google, risk losing the customer relationship and become commodity suppliers of gas and electricity, or take a stand and build stronger engagement with customers. In current circumstances that’ll be difficult – people are at best ambivalent about their utility supplier, and in an era of rising prices and poor customer service many actively dislike them.
So there’s a big opportunity here – and something that Cambridge’s cluster of smart home/green tech companies could exploit. For example, AlertMe already has a partnership with British Gas, while Sentec is working with metering companies to make their products smarter. If energy companies don’t want to work with Google then they have two choices – do it themselves (teaming up with smaller tech companies), or partner with larger industrial tech companies, such as Siemens or Bosch. And these industrial giants will need the specialist expertise that smart home companies can provide.
The utility market doesn’t move fast, so don’t expect to see Google running your home in the next year, but the Nest acquisition should actually spur the whole sector on, attracting both interest and investment. The world just got more interested in smart homes, which is good news for relevant startups in Cambridge and beyond.
As I’ve said before, startup clusters are springing up all over the place and that’s great. There’s even one in my village (population 3,000) – well, two startups and a group of support services, including myself.
Clusters encourage innovation, particularly through external economies of scale – i.e. by providing access to the people, resources and infrastructure that startups need but don’t have themselves. And the more startups there are in an area, the lower the price of these services as they are shared across a greater number of companies.
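That cost-sharing effect is simple arithmetic – a toy illustration with invented figures, not data about any real cluster:

```python
# Toy arithmetic behind external economies of scale in a cluster:
# a fixed cost (a shared lab, legal retainer or co-working space,
# say) split across more startups means a lower cost per company.
# All figures are invented purely for illustration.

def cost_per_startup(shared_fixed_cost: float, num_startups: int) -> float:
    """Equal share of a cluster's fixed cost for each startup."""
    return shared_fixed_cost / num_startups

for n in (2, 10, 50):
    print(f"{n} startups -> {cost_per_startup(100_000, n):,.0f} each")
```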
A lot of these clusters seem to be driven from outside, particularly as both central and local governments realise that startup clusters are (a) sexy and (b) cheap. Why not give them a small pot of money/some space/a patronising visit to show you’re supporting innovation?
So, putting cynicism aside, what does a startup cluster require – and how does Cambridge measure up? I’ve been looking at Brad Feld’s work on building a startup ecosystem, based on 20 years’ experience in Boulder, Colorado. The aptly named Boulder Thesis highlights four things that these communities must have:
- They should involve entrepreneurs and feeders (people/institutions like universities, government, venture capitalists, lawyers, PR people). BUT they have to be led by entrepreneurs if they are to truly take off.
- They need to be long term. It can take 20 years to build a community, so entrepreneurs need to stick around, even if they’ve built and sold their company long ago. And the same goes for those that fail – encourage them to stick around too.
- They need to be inclusive, welcoming anyone, no matter what their skills or ideas.
- They need to be active, with a range of events and accelerator programmes to help encourage and nurture startups.
That’s Boulder. Let’s compare it to Cambridge.
Firstly, there’s a large community of entrepreneurs and feeders in the city (so a tick there) and entrepreneurs are taking a leading role. And given the longevity of the Silicon Fen success story, there are plenty of long term entrepreneurs who have stuck around, from Hermann Hauser to Mike Lynch.
It’s the third and fourth points where I believe Cambridge has issues. Don’t get me wrong, there are some incredibly welcoming people in the Cambridge community and some great events/accelerators that nurture startups. But, perhaps because of the size and depth of the community, spanning everything from medtech to green IT, groups can appear disconnected, with everyone focused on their niche. Some of this may come from the research-led nature of many Cambridge innovations, but, even in academia, cross-discipline working is becoming more normal after centuries of specialism. Compare this to places such as Norwich, which has a smaller (but still substantial) startup community that seems more cohesive, with greater communication between disparate companies with radically different ideas.
What Cambridge does have, and that I think is missing from Feld’s thesis, is the combination of new and old blood. The universities, and increasingly tech businesses, attract talent, much of which stays on and contributes to the ecosystem. But enough leaves to make space for new ideas so that things don’t go stale.
So, in true end of term report style, Cambridge needs to try harder when it comes to building a cohesive, overarching supercluster. It has the constituent parts, but what is needed is stronger glue to stick it together and help connect the bigger picture. Let’s see if 2014 brings a solution to this long term problem.
There can be a tendency in Cambridge to think that innovation ends at the city limits, and particularly that we’ve got the monopoly on tech startups in East Anglia.
Proof positive that this isn’t the case was on show last week at SyncNorwich, where more than 300 entrepreneurs, developers and members of the Norwich tech cluster talked about their diverse successes. This included market leaders such as FXhome, which produces special effects software for both Hollywood blockbusters and amateur filmmakers, Liftshare.com, the world’s most popular car sharing site and mobile interaction/payment firm Proxama. A whole host of newer startups, such as targeted mobile advertising company Kuoob, music community site SupaPass and educational software provider Wordwides (set up by a 16 year old) also talked about what they could offer.
There’s obviously been lots of activity in Norwich for quite a while (FXhome has been going for 10 years and Liftshare.com for 15), but what the evening did was give outside endorsement to the cluster. Mike Butcher from TechCrunch came along, and it gave everyone present belief that they were on the right road and that they should be shouting about it. In the days since, I’ve seen emails offering co-working spaces and there’s even a cluster name (Silicon Broads) being bandied about, along with a startup map.
Norwich isn’t the only cluster rising to prominence across Europe – the growth of cloud-based technologies, new agile development methodologies and a focus on entrepreneurship mean they are springing up everywhere. Some people see this as a bad thing – they point to the size of Silicon Valley and wonder how hundreds of disparate European cities can compete or scale. But as Butcher pointed out, the Valley has a 60 year head start and what is needed is to build bridges between the different hubs – after all, it takes two hours to drive from London to Norwich (or Cambridge), the same time it takes to get from one end of Silicon Valley to the other.
What Europe needs to do is to use the nimbleness of having multiple centres to its advantage and turn disparateness into diversity. I’m reminded of the story of the ‘discovery’ of America. At the same time as Christopher Columbus was touting his plans around the courts of Europe, the Chinese Emperor was assembling a great fleet to explore the same area. Given the scale and backing put into the expedition it would have been likely that the first non-native settlers in the present day United States would have been Chinese, not European. However the Emperor died and his plans died with him – there was no alternative power that could take them on. In contrast Columbus, originally Genoese, travelled round Europe for years until he found a backer in the Spanish monarchy. The result? The world we know today.
So it is time that European startups (and political leaders) stopped dreaming of a single super hub that on its own can rival Silicon Valley. It’ll never happen and what we need to do is build bridges between the enormous variety of hubs across Europe. Making everyone aware of what is going on up the road (or further afield) is crucial to driving collaboration, unlocking opportunities and building a successful pan-European tech ecosystem that can break down barriers and silo working and deliver jobs and growth.
This week BBC director general Tony Hall launched a slew of initiatives designed to reposition the beleaguered broadcaster. The aim is to show that the BBC is central to meeting the needs of consumers now and in the future, and to draw a line under an annus horribilis for the corporation, which has been plagued by scandals from Jimmy Savile to excessive payoffs for senior managers.
Amongst the news of a BBC One + 1 channel (by my maths that’s BBC Two), and expansion of iPlayer, one thing that caught my eye was a pledge to “bring coding into every home, business and school in the UK”. As someone who grew up in the 1980s it made me misty-eyed with nostalgia for the last time the BBC got involved in technology, with the original BBC Micro. Essentially the BBC put up the money for the machine to be given to every school in the UK, as well as producing TV programmes and courses on coding.
While I never had a BBC (I was a Sinclair Spectrum diehard), we used them at school and they did help me learn to code. It really was a golden age for UK computing, as it introduced a generation to computers they could play games on, but equally program and learn with. Programming your own creations was a viable alternative to just treating these machines as games consoles – particularly as a Spectrum game took about 10 minutes to load (and often mysteriously crashed just before it should have started). I was incredibly proud of my amazing horse racing game (complete with betting and flickering graphics), even if my programming days are now long behind me.
Not only did the BBC/Spectrum age produce a generation that wasn’t afraid of coding, but it also helped shape the UK IT industry. Acorn, the makers of the BBC Micro, spawned ARM, now a world leader in chip design, while countless games companies developed from bedrooms into multi-million pound concerns. You could easily argue that Cambridge wouldn’t be the technology powerhouse it is today if it wasn’t for the BBC.
But then IT became marginalised as a school subject – essentially replaced with learning to use desktop applications rather than program. In a global economy where companies compete on knowledge, the need to rekindle that interest in coding has never been greater. The BBC is not the first to understand this – the Cambridge-designed Raspberry Pi has become a global phenomenon as it brings back the spirit of adventure and exploration to children weaned on iPads and Wiis. There’s also a new computer science curriculum for schools and coding courses are becoming increasingly popular across the UK.
So where does the BBC fit into this? There’s a lot of hyperbole in the announcement about “using world class TV, radio and online services to stimulate a national conversation about digital creativity”, but very little detail. The challenge for the BBC is to pitch whatever it offers in a way that doesn’t replicate what is being done in the private sector and doesn’t dumb down coding to a simple point and click level. As seen in the 1980s, the backing of the BBC can be a major force for good, but it could equally stifle the innovation and creativity that it is trying to encourage. The jury’s out, but I hope it can turn the undoubted niche success of the coding revival into a mainstream movement – working with the industry to create the Acorns and ARMs of tomorrow.