Revolutionary Measures

Big Brother is manipulating you?

As anyone who has read George Orwell’s 1984 knows, the ability to rewrite history and manipulate information is at the heart of controlling behaviour. As Soviet Russia showed, people could simply be airbrushed out of the official record and would vanish from public consciousness.

Of course, in the age of social media, the web, and 24-hour global media, this ability to control the news should have disappeared. If a government blocks a site or a mobile phone network, there are ways around it that spread information quickly, bypassing attempted censorship.

However, I’d argue that the reverse has happened, and that Big Brother can now operate stealthily in two ways. Firstly, rumours can start and spread unchecked, with the majority of us not taking the time to find the original source, instead believing something that has been retweeted or shared on Facebook. I’ve had people swear blind to me that a major incident took place ‘because I saw it on Facebook’ – though I can’t believe they’d be as credulous if a random stranger told them the same story down the pub. By the time the truth is out, immeasurable damage can be done – to a company’s brand or share price, or to a person’s reputation.

Secondly, we believe what our computers tell us, and act accordingly, particularly when it chimes with our own preconceptions. Essentially, we assume that the complex algorithms controlling what appears on our screens are unbiased, rather than reflecting choices the site owner has made.

This leaves us open to manipulation, whether by marketers trying to sell us things or by more sinister experiments. Facebook received justified criticism for running an experiment in which it tampered with the stories in people’s timelines to see what impact this had on what users themselves wrote. Unsurprisingly, the proportion of negative or positive posts shown had a direct effect on the tone and language people used in their own posts.

Now dating site OKCupid has admitted that it experimented on its users. This included deliberately pairing up unsuitable couples and telling them that they were a perfect match to see what would happen. Now, there’s nothing wrong with a little serendipity, but deliberate meddling risks breaking the trust between a site and its users. Throwing in a wildcard of “here’s someone completely unlike you, but why not see what happens if you meet?” is one thing if it is advertised, but quite another if it is hidden behind the veil of computer processing.

Some might argue that this is just the next step in ‘nudge’ techniques, where choices are ordered in a way that drives particular outcomes, supposedly for the greater good. For example, if diners reach the salad bar first in a cafeteria they eat more healthily, and if you automatically enrol people in pensions, they tend not to take the opportunity to opt out. But I’d say this goes much further than that, and is about trust.

In many ways, breaches of trust are similar to security breaches – something that the user relied upon unthinkingly has been removed, calling into question their entire relationship with a company. And as with trust in any relationship, rebuilding it is a time-consuming and difficult process.

So, anyone involved in marketing, media or technology has a responsibility to be as open and transparent as they can be. At the very least there are legal safeguards (such as the Data Protection Act) that need to be obeyed, but I think companies need to go further than that. We live in a world where people want a genuine relationship with brands they respect and trust, rather than the transactional, one-sided versions of the past. Organisations therefore need to think first about the consequences of experimenting on their users before playing Big Brother with their lives.

July 30, 2014 | Marketing, Social Media, Startup