If you’re too tired to finish your sentence, Google Autocomplete will take the baton from you. For ‘I like to-’, it helpfully throws out ‘I like to think of Jesus as a mischievous badger’. Asking ‘why is there’ used to yield ‘why is there a dead Pakistani on my couch?’, but now the internet’s big questions range from ‘an Easter bunny?’ to ‘a war in Syria?’
Normally it’s good friends who can finish each other’s sentences, but autocomplete has been making a lot of enemies of late.
Autocomplete can show a sad slice of our world: ‘does he…’ becomes ‘does he like me’, ‘does Gordon Brown…’ becomes ‘does Gordon Brown owe you a grand?’, and the suggestion after ‘is it wrong to eat meat’ is ‘is it wrong to sleep with your mom?’ But autocomplete does more harm than merely suggesting that incest is up for debate, and now a German Federal Court has ruled that libellous autocompletes by Google’s algorithm are actually a violation of privacy.
This is a kind of authorless libel, and one which Google refuses to take much responsibility for. They say that the algorithm works by filling in blanks based on the frequency of our searches – we’re all defaming each other, hanging question marks over people’s heads. They said, “Google does not determine these terms manually – all of the queries shown in Autocomplete have been typed previously by other Google users.”
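Google’s description of the mechanism — filling in blanks based on the frequency of past users’ queries — can be sketched with a toy illustration. This is not Google’s actual system; the sample queries and ranking here are invented purely to show the frequency-based idea:

```python
from collections import Counter

# Invented sample of past user queries (not real Google data).
past_queries = [
    "why is there a war in syria",
    "why is there a war in syria",
    "why is there an easter bunny",
    "does he like me",
    "does he like me",
    "does he love me",
]

def autocomplete(prefix, queries, k=3):
    """Return the k past queries starting with the prefix,
    ranked by how often users have typed them."""
    counts = Counter(q for q in queries if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(autocomplete("why is there", past_queries))
```

The point of the sketch is Google’s defence in miniature: the algorithm manufactures nothing, it only surfaces whatever phrases users have already typed most often — including the defamatory ones.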
This case, brought by a businessman angry at the association of his name with ‘scientology’ and ‘fraud’ in autocomplete, overturned the precedent set by previous decisions in favour of Google. The world’s largest search engine was told to change its autocomplete function when it was made aware of this kind of ‘unlawful violation’.
In April this year the Tokyo District Court ordered the search giant to pay out 300,000 yen ($3,100) for the pain and personal anguish of one man whose name brought up autocompletes of criminal activity of which he was innocent.
Google has been hit continually with these claims: in December of last year, Australian cancer surgeon Guy Hingston sued Google for suggesting that he was bankrupt, and in September, Google was hit with a lawsuit from former German first lady Bettina Wulff for autocompleting her name with terms like ‘escort’ or ‘prostitute’.
But if you don’t win your case, or if you never get your day in court with the behemoth, then the cruel irony is that, by attracting media attention to the falsehoods, you will only make it more likely that they will appear next to your name in the search engine.
There was a time when it was the nameless internet which was frightening. Everyone functioned as a coded, numbered and abbreviated version of themselves, with a nod to their love of anime, gaming or date of birth. Benji232G4G4mepl4y and DarkMoon94 would chatter all night on video comments without knowing how to find each other in the phonebook, and stories of teenage abductions and trolling were laced with fears about a world without names or culpability.
But now, as careers implode in 140 characters and people attempt to construct social media identities with far more care than they take over their real-life demeanour, it is the naming of the internet that is attracting problems. No wig, sunglasses or hasty move to another town is a defence against the lingering damage of google-able misbehaviours, and the falsely accused suffer both from the explicit internet naming and the faceless perpetrators of libel.
On one level, suing Google for what its users type into the search bar seems like calling the OED offensive for including swear words. It is a record of use, an account of the discourse surrounding a name in the same way that any linguistic corpus brings up instances of use and words that collocate with the search term.
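The corpus analogy can be made concrete. A concordancer counts which words appear next to a search term — its ‘collocates’ — and autocomplete behaves similarly over query logs. A minimal sketch, using an invented toy corpus:

```python
from collections import Counter

# Invented toy corpus, tokenised into words (purely illustrative).
corpus = ("the badger slept the badger ate the fox ran "
          "the badger slept").split()

def collocates(term, tokens, k=2):
    """Count the words appearing immediately after each
    occurrence of term in the token list."""
    following = Counter(
        tokens[i + 1]
        for i, word in enumerate(tokens[:-1])
        if word == term
    )
    return following.most_common(k)

print(collocates("badger", corpus))
```

As with autocomplete, the counts record what the texts (or users) already say; the tool neither endorses nor invents the associations it reports.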
It might seem that cause and effect have been wrongly construed; the MailOnline recently screamed ‘Is Google Making Us RACIST?’ after a research team from Lancaster University said that the search giant ‘perpetuates prejudices’. But actually all that the study, which used 2,600 questions on the search tool, showed was that people have already shaped the internet in their image, teaching stereotypes to search engines which now come up with negative evaluations for groups such as males, black people and homosexuals.
While it seems ridiculous that the search engine should take responsibility for this discourse, it has arguably brought it on itself by branding the function ‘autocomplete’. It finishes your sentence, it implies it is correct; it says ‘this is what you meant, isn’t it?’
Aptly, googling ‘Google autocomplete’ brings up the suggestion ‘Google autocomplete not working’.