Stephen Hawking’s ideas continue to influence and inspire millions across the world. In one of his last writings before his death in March 2018, the renowned scientist warned of a future so dominated by AI (artificial intelligence) and gene editing that ordinary human beings would be rendered useless and left to face extinction. The gloomy predictions appear in the book Brief Answers to the Big Questions, published posthumously in October 2018.
The rise of AI
On the subject of AI, Hawking believed that the real risk of artificial intelligence is not malice but competence. He also stressed the importance of imposing strict regulations on AI weapons, warning that failure to do so could have catastrophic results.
“A super-intelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble. You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green-energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants,” Hawking wrote, as quoted by Newsweek.
According to a survey of more than 350 machine-learning experts led by Katja Grace of the Future of Humanity Institute at the University of Oxford, there is a 50 percent chance that AI will become capable of performing nearly all human tasks within about 45 years. By that measure, humans could effectively become “useless” in less than five decades. And, as Hawking warned, there would then be nothing stopping AI from treating human beings like ants to be stepped on.
Superhumans are coming
Hawking also wrote about the dangers posed by gene editing technologies like CRISPR. The scientist predicted that gene editing would initially be used to repair genetic defects in human beings. But eventually, the technology would be used to enhance human capabilities to the extent that the population would be divided into two groups — superhumans and normal humans.
He also foresaw a class divide in who becomes superhuman and who remains “normal.” Hawking predicted that even if laws against genetic engineering were passed, the rich would not be able to resist the temptation to enhance characteristics such as memory, length of life, and immunity to disease.
“Once such superhumans appear, there are going to be significant political problems with the unimproved humans, who won’t be able to compete… Presumably, they will die out, or become unimportant. Instead, there will be a race of self-designing beings who are improving themselves at an ever-increasing rate,” he warned, as quoted by Business Insider.
In addition to AI and gene editing, Hawking also addressed a range of topics in the book, from the existence of aliens to the question of God. He believed that the reason we have not discovered extraterrestrial organisms is that humans have “overlooked” the different forms intelligent life outside Earth might take.
Hawking also stated that the biggest threats to Earth are an asteroid collision and climate change, warning that a combination of the two could end up making Earth’s climate resemble that of Venus, with temperatures of up to 250°C.