Stephen Hawking's predictions of how we will all die

When Stephen Hawking was in his early 20s, he was told he only had two years to live - but survived another 55.

His unlikely longevity is something the Earth itself needs if it's to survive the various calamities the genius cosmologist has predicted.

In his later years, Prof Hawking regularly warned about the dangers of artificial intelligence, alien life, environmental degradation and short-sighted leaders.

In 2016, Prof Hawking said it was a "near certainty" that sometime in the next 10,000 years, humanity would be faced with a global disaster.

"We face a number of threats: nuclear war, global warming and genetically engineered viruses," he told the Radio Times.

"By that time we should have spread out into space, and to other stars, so it would not mean the end of the human race.

"However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period."

Here are some of the threats Prof Hawking expected the human race to face.

Aliens

"If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans," Prof Hawking told the Discovery Channel in 2010.

It's estimated more than 80 percent of indigenous Americans were wiped out after Europeans made contact, thanks to disease, warfare and displacement. Prof Hawking said any aliens that reach Earth would be by definition more advanced than we are, making humanity the Native Americans in this scenario.

A scene from Independence Day. Photo credit: 20th Century Fox

"Such advanced aliens would perhaps become nomads, looking to conquer and colonise whatever planets they can reach.

"If so, it makes sense for them to exploit each new planet for material to build more spaceships so they could move on. Who knows what the limits would be?"

In fiction: Star Trek: Enterprise, Stargate, Independence Day

Artificial intelligence

"The development of full artificial intelligence could spell the end of the human race," Prof Hawking said in 2014.

A popular concept amongst futurists is the 'technological singularity' - the point at which computers become so powerful that it's quicker for them to design their own upgrades than to rely on humans. The result would be an exponential growth in computing power that far exceeds the breakneck pace set over the last 70 years, producing an artificial intelligence that surpasses our own.

"It would take off on its own, and re-design itself at an ever increasing rate," said Prof Hawking.

"Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."

A number of estimates put the date of the technological singularity somewhere between 2040 and 2050. Tesla and SpaceX boss Elon Musk has issued similar warnings in recent years.

The artificially intelligent Borg Queen captures Captain Jean-Luc Picard in Star Trek: The Next Generation. Photo credit: Paramount

Prof Hawking said we need to integrate computers into our bodies and minds in a way that suits humanity, before an artificial intelligence does it for us.

"We must develop as quickly as possible technologies that make possible a direct connection between brain and computer, so that artificial brains contribute to human intelligence rather than opposing it."

In fiction: Blade Runner, The Matrix, Metropolis

Nuclear warfare

If aliens don't kill us, the millions of years of evolution that made us willing to fight at the drop of a hat just might.

"The human failing I would most like to correct is aggression," Prof Hawking said in 2015.

"It may have had survival advantage in caveman days, to get more food, territory or a partner with whom to reproduce, but now it threatens to destroy us all. A major nuclear war would be the end of civilisation, and maybe the end of the human race."

In fiction: Beneath the Planet of the Apes, Mad Max, Dr Strangelove, Threads, The Terminator

Climate change and Donald Trump

"We are close to the tipping point where global warming becomes irreversible," Prof Hawking warned last year.

"Mr Trump's action could push the Earth over the brink, to become like Venus, with a temperature of 250degC, and raining sulphuric acid."

Mr Trump pulled the US out of the Paris Agreement, promising to boost the use of fossil fuels that directly contribute to climate change - which the President once called a Chinese hoax.

Donald Trump. Photo credit: Getty

Earlier this year, Prof Hawking said those who deny the reality of climate change should "take a trip to Venus. I will pay the fare."

In fiction: The Day After Tomorrow, The Simpsons, Back to the Future Part II

Asteroid strike

"One of the major threats to intelligent life in our universe is the high probability of an asteroid colliding with inhabited planets," Prof Hawking said in 2016.

"We only know about 15 or 20 percent of the objects which are larger than a few hundred metres in size. If these bodies impact Earth, they can cause regional damage across a whole country or even a continent."

Last year Oumuamua, a bizarre asteroid from interstellar space, flew close by the Earth, and no one even noticed until it had already passed.

Artist's impression of how the asteroid Oumuamua might have looked. Photo credit: ESO

In fiction: Armageddon, Deep Impact

Genetically engineered virus

Prof Hawking once warned that a virus created in a lab could be unleashed on humanity, killing us all - perhaps even accidentally.

Gary Sinise in the 1994 miniseries adaptation of Stephen King's The Stand. Photo credit: Laurel Entertainment/Greengrass Productions

In fiction: The Stand, I Am Legend

Inequality

One of Prof Hawking's final warnings was about the growing divide between rich and poor, and how it contributed to the Brexit vote.

"I believe that wealth, the way we understand it and the way we share it, played a crucial role in [the public's] decision. If we fail, then the forces that contributed to Brexit, the envy and isolationism not just in the UK but around the world that spring from not sharing, of cultures driven by a narrow definition of wealth and a failure to divide it more fairly, both within nations and across national borders, will strengthen.

"If that were to happen, I would not be optimistic about the long term outlook for our species."

A scene from Snowpiercer. Photo credit: Moho Film/Opus Pictures/Stillking Films

In fiction: Elysium, The Great Gatsby, Snowpiercer

How humanity might survive

Even if none of the above calamities come to pass, eventually the Sun will swell into a red giant and envelop the Earth in its fiery embrace.

"We must... continue to go into space for the future of humanity," he told Oxford University students in 2016. "I don't think we will survive another 1,000 years without escaping beyond our fragile planet."

He later revised that timeframe down to 100 years - perhaps within our own lifetimes.

Newshub.