
Hawking calls for AI regulation in posthumously published essays


In a collection of essays published posthumously by Penguin, the late cosmologist Stephen Hawking lays out his predictions for the future of humanity, including the rise of an elite ‘superhuman’ species.

Excerpts from the book of essays (‘Brief Answers to the Big Questions’) were published in the Sunday Times this weekend.

Before his death in March 2018, Hawking was famed not just for his influential work in cosmology but also for his commentary on subjects beyond his academic field, such as the acceleration of machine learning technology, the destructive nature of climate change and the impact of Brexit on UK science.

In ‘Brief Answers’, Hawking warns of the potential dangers of unchecked artificial intelligence (AI) being trusted with important responsibilities. “In the future, AI could develop a will of its own, a will that is in conflict with ours,” he comments, adding that it is necessary to regulate AI, particularly smart autonomous weapons systems.

“In short, the advent of super-intelligent AI would be either the best or the worst thing ever to happen to humanity. The real risk with AI isn’t malice, but competence,” he says. “You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green-energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants.”

Other high-profile scientists and innovators who have spoken publicly about AI include Jim Al-Khalili, who has encouraged discussion about AI regulation, and Elon Musk, who has described AI as a “fundamental risk to the existence of human civilisation”.

On the subject of other existential threats, Hawking predicts that an asteroid impact, environmental catastrophe or nuclear war will “cripple” the planet within the next 1,000 years. He describes how a rise in global average temperature could transform Earth’s climate to the extent that it becomes comparable with that of Venus.

Despite his certainty that calamity would arrive within a millennium, Hawking also writes that humans would find a way to “slip the surly bonds of Earth and will therefore survive the disaster” in the case of nuclear or environmental meltdown.

He comments that while other species – and many humans – would not survive these outcomes, the humans who set up civilisation elsewhere would likely belong to their own elite ‘superhuman’ subspecies capable of expanding into space colonies. These humans would use technology – such as CRISPR gene editing – to fix genetic defects and eventually augment their natural cognitive and physical characteristics, such as by boosting memory capacity and moderating aggression.

“We are now entering a new phase of what might be called self-designed evolution, in which we will be able to change and improve our DNA,” he writes. “We have now mapped DNA, which means we have read the ‘book of life’, so we can start writing in corrections […] Laws will probably be passed against genetic engineering with humans. But some people won’t be able to resist the temptation to improve human characteristics, such as size of memory, resistance to disease, and length of life.”

While acknowledging the severe competition this would pose to ordinary humans, Hawking does not present this transhumanist future as entirely negative, arguing that there is “no time to wait for Darwinian evolution to make us more intelligent and better natured”.

Other subjects covered in Hawking’s book include his take on why aliens have not yet been observed (humans have overlooked different types of intelligent life), whether a personal God exists (no) and the best idea humanity has yet to realise (nuclear fusion power).
