Oxford philosopher Nick Bostrom put forward a case for mass government surveillance in a TED Talk last week.
Speaking in Vancouver on 17th April, Bostrom argued in favour of mass surveillance, claiming that it may be a necessary step in preventing the destruction of humanity.
Sharing insights from his latest publication, The Vulnerable World Hypothesis, Bostrom argued that humanity’s demise is likely to come at the hands of a technology of our own design.
To counter this, he argued, we may require a more effective global government that could quickly outlaw any potentially civilization-destroying ideas or technologies.
To illustrate this idea, Bostrom employed the metaphor of humanity standing in front of a giant urn filled with balls, each ball representing a different idea.
Based on their effect on humanity, the balls are of different colours: white for beneficial ideas, grey for moderately or possibly harmful ones, and black for civilization-destroying ones.
According to Bostrom, we have not yet drawn a black ball only because we have been “lucky”. However, if “scientific and technological research continues, we will eventually reach it and pull it out”, he writes.
Speaking to Cherwell, Bostrom elaborated on his theory, saying, “this paper doesn’t exactly argue for mass surveillance; rather it observes that there are two structural features of the current world order that would make it vulnerable to the extraction of a technological black ball, and that to have a general capacity to stabilize civilization against this kind of vulnerability would require a capacity for extremely effective preventive policing, supported by mass surveillance, and sufficiently effective ways of resolving the worst global coordination problems.”
In conversation with Chris Anderson, the head of TED, Bostrom suggested a system of mass government surveillance in which each person would be fitted with a necklace-like “freedom tag” carrying multi-directional cameras.
Information gathered by these “freedom tags” would be sent to “freedom centers”, where artificial intelligence would monitor the data, alerting human officers if it detected signs of a possible “black ball” idea.
“Obviously there are huge downsides and indeed massive risks to mass surveillance and global governance”, Bostrom said, acknowledging criticisms of the idea.
“I’m just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.”
This is not the first time Nick Bostrom has made controversial claims. In 2003, his paper Are You Living in a Computer Simulation? argued for the statistical likelihood that human existence is a technological simulacrum.
Since then, his work has focused mainly on artificial intelligence; he co-authored an open letter with Stephen Hawking establishing “23 principles of AI safety”.
He is the founder of the Oxford Martin Programme on the Impacts of Future Technology and the founding director of the Future of Humanity Institute.