Stephen Hawking has warned the Earth could become a "ball of fire" or a planet dominated by misanthropic artificial intelligence.
The wheelchair-bound physics genius gave the grim warnings at a technology summit in Beijing, China, earlier this week.
In the first nightmare scenario, Prof Hawking said a growing population combined with ever-increasing energy consumption will lead to catastrophe.
To avoid it, he said, humanity will need to "boldly go where no one has gone before", borrowing a phrase from Star Trek.
Prof Hawking then spoke about Breakthrough Starshot, a US$100 million mission to send a tiny light-propelled 'nanocraft' into interstellar space.
The craft would weigh about a gram, and using a metre-wide sail only atoms thick, would travel at 20 percent the speed of light.
"Such a system could reach Mars in less than an hour, or reach Pluto in days, pass Voyager in under a week and reach Alpha Centauri in just over 20 years," Prof Hawking said.
NASA's Voyager 1 launched in 1977 and took 35 years to leave the solar system. It's not expected to pass near another star for 40,000 years.
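Those travel times can be sanity-checked with a quick back-of-the-envelope calculation. The distances below are rough assumptions of ours (average Earth-Mars and Earth-Pluto separations, Voyager 1's approximate range in 2017, and 4.37 light-years to Alpha Centauri), not figures from the speech:

```python
# Sanity check of the quoted travel times for a craft cruising at
# 20 percent of the speed of light. Distances are rough assumptions.

C = 299_792_458          # speed of light, m/s
V = 0.2 * C              # nanocraft cruise speed

MARS = 225e9             # average Earth-Mars distance, metres (varies widely)
PLUTO = 5.9e12           # average Earth-Pluto distance, metres
VOYAGER1 = 21e12         # approximate Voyager 1 distance, metres (circa 2017)
LY = 9.4607e15           # one light-year in metres
ALPHA_CEN = 4.37 * LY    # distance to Alpha Centauri

hours_mars = MARS / V / 3600
days_pluto = PLUTO / V / 86400
days_voyager = VOYAGER1 / V / 86400
years_alpha = ALPHA_CEN / V / (86400 * 365.25)

print(f"Mars: {hours_mars:.1f} h, Pluto: {days_pluto:.1f} d, "
      f"Voyager 1: {days_voyager:.1f} d, Alpha Centauri: {years_alpha:.1f} y")
```

On these assumptions the results come out at roughly an hour to Mars, about a day to Pluto, around four days to pass Voyager 1, and just under 22 years to Alpha Centauri, consistent with the figures Prof Hawking quoted; at Mars's closest approach (about 55 million km) the trip would take only about 15 minutes.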
Prof Hawking also touched on a regular theme of his - the threat of artificial intelligence. In the past he's warned of killer robots, and this week tried - and failed - to take an optimistic view.
"Perhaps with the tools of this new technological revolution, we will be able to undo some of the damage done to the natural world by the last one, industrialisation. We will aim to finally eradicate disease and poverty. Every aspect of our lives will be transformed," he said, before reverting to type.
"Success in creating effective AI could be the biggest event in the history of our civilisation. Or the worst. We just don't know. So we cannot know if we will be infinitely helped by AI, or ignored by it and sidelined, or conceivably destroyed by it.
"Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilisation. It brings dangers like powerful autonomous weapons, or new ways for the few to oppress the many."
A number of high-profile futurists have posited the year 2045 as when, if historical rates of technological progress continue, computers will overtake humanity in brainpower.