Google: You (auto)complete me

If you’re too tired to finish your sentence then Google Autocomplete will take the baton from you. For ‘I like to-’, it helpfully throws out ‘I like to think of Jesus as a mischievous badger’. Asking ‘why is there’ used to yield ‘why is there a dead Pakistani on my couch?’, but now the internet’s big questions range from ‘an Easter bunny?’ to ‘a war in Syria?’

Normally it’s good friends who can finish each other’s sentences, but autocomplete has been making a lot of enemies of late.

Autocomplete can show a sad slice of our world: ‘does he…’ becomes ‘does he like me’, ‘does Gordon Brown…’ changes to ‘does Gordon Brown owe you a grand?’, and the suggestion after ‘is it wrong to eat meat’ is ‘is it wrong to sleep with your mom?’ But autocomplete does more harm than suggesting that incest is up for debate, and now a German Federal Court has ruled that libellous autocompletes by Google’s algorithm are actually a violation of privacy.

This is a kind of authorless libel, and one which Google refuses to take much responsibility for. The company says that the algorithm works by filling in blanks based on the frequency of our searches – we’re all defaming each other, hanging question marks over people’s heads. In its words: “Google does not determine these terms manually – all of the queries shown in Autocomplete have been typed previously by other Google users.”

This case, brought by a businessman angry at the association of his name with ‘scientology’ and ‘fraud’ in autocomplete, overturned the precedent set by previous decisions in favour of Google. The world’s largest search engine was told to change its autocomplete function when it was made aware of this kind of ‘unlawful violation’.

In April this year the Tokyo District Court ordered the search giant to pay out 300,000 yen ($3,100) for the pain and personal anguish of one man whose name brought up autocompletes of criminal activity of which he was innocent.

Google has been hit continually with these claims: in December of last year, Australian cancer surgeon Guy Hingston sued Google for suggesting that he was bankrupt, and in September, the company faced a lawsuit from former German first lady Bettina Wulff for autocompleting her name with terms like ‘escort’ or ‘prostitute’.

But if you don’t win your case or if you never get your day in court with the behemoth then the cruel irony is that, by attracting media attention to the falsehoods, you will only make it more likely that they will appear next to your name in the search engine.

There was a time when it was the nameless internet which was frightening. Everyone functioned as a coded, numbered and abbreviated version of themselves, with a nod to a love of anime or gaming, or a date of birth. Benji232G4G4mepl4y and DarkMoon94 would chatter all night on video comments without knowing how to find each other in the phonebook, and stories of teenage abductions and trolling were laced with fears about a world without names or culpability.

But now, as careers implode in 140 characters and people attempt to construct social media identities with far more care than they take over their real-life demeanour, it is the naming of the internet that is attracting problems. No wig, sunglasses or hasty move to another town is a defence against the lingering damage of google-able misbehaviours, and the falsely accused suffer both from the explicit internet naming and the faceless perpetrators of libel.

On one level, suing Google for what its users type into the search bar seems like calling the OED offensive for including swear words. It is a record of use, an account of the discourse surrounding a name in the same way that any linguistic corpus brings up instances of use and words that collocate with the search term.

It might seem that cause and effect have been wrongly construed; the MailOnline recently screamed ‘Is Google Making Us RACIST?’ after a research team from Lancaster University said that the search giant ‘perpetuates prejudices’. But all that the study, which put 2,600 questions to the search tool, actually showed was that people have already shaped the internet in their image, teaching stereotypes to search engines which now come up with negative evaluations for groups such as males, black people and homosexuals.

While it seems ridiculous that the search engine should take responsibility for this discourse, it has arguably brought it on itself by billing the function as ‘autocomplete’. It finishes your sentence, it implies it is correct; it says ‘this is what you meant, isn’t it?’

Aptly, googling ‘Google autocomplete’ brings up the suggestion ‘Google autocomplete not working’.
