Asteroids, Robots And Deadly Viruses Could Kill Millions, Report Warns

Scientists Think We're Not Taking The Apocalypse Seriously Enough

A robot apocalypse, a deadly virus or a giant asteroid might all sound like plots for B-list movies, but scientists have warned that they're far more likely than humanity probably realises.

Climate change, nuclear war and huge natural disasters such as supervolcanoes are also on the list in case you're wondering.

This bleak outlook comes courtesy of Global Catastrophic Risks, a report compiled by the Global Challenges Foundation and the Global Priorities Project at Oxford University.

The Foundation's job is to analyse and highlight global risks that could potentially wipe out more than 10 per cent of Earth's human population.

It warns that while most generations never experience a catastrophe, such events are far from fanciful, as the bouts of plague and the 1918 Spanish flu that wiped out millions illustrate.

Sebastian Farquhar, director at the Global Priorities Project, told the Press Association: "There are some things that are on the horizon, things that probably won't happen in any one year but could happen, which could completely reshape our world and do so in a really devastating and disastrous way.

Video games like The Division portray scenarios that are far more realistic than we might have imagined (Image: Ubisoft).

"History teaches us that many of these things are more likely than we intuitively think.

"Many of these risks are changing and growing as technologies change and grow and reshape our world. But there are also things we can do about the risks."

Over the next five years, asteroids, supervolcanic eruptions and unknown risks are ranked as the biggest threats to humanity.

In the longer term, the rise of artificial intelligence (AI) has been listed alongside catastrophic climate change, nuclear war and pandemics as a threat to humanity.

AI might offer us a brave new world, but if left unchecked it could bring about a 'Terminator'-style uprising (Image: Mike Agliolo via Getty Images).

Mr Farquhar said: "There is really no particular reason to think that humans are the pinnacle of creation and the best thing that is possible to have in the world.

"It seems conceivable that some AI systems might at some point in the future be able to systematically out-compete humans in a bunch of different domains and if you have a sufficiently powerful form of that kind of artificially intelligent system, then it might be the case that if its goals don't match with what humanity's values are then there might be some sort of adverse consequences.

"So this doesn't depend on it becoming conscious, it doesn't depend on it hating humanity, it is just a matter of it being powerful, its objectives being opaque or hard to determine for its creators, and it being in some sense indifferent to at least some of the things we find valuable."

The biggest long-term threats to civilisation are natural and engineered pandemics and nuclear war, according to the report.

Nuclear war remains one of the greatest risks for humanity (Image: US Air Force via Getty Images).

Strides have been taken to cut the number of nuclear weapons in the world, but the rise of synthetic biology could open the door to the creation of "off-the-shelf" deadly viruses.

A potent form of avian flu mutating to spread rapidly among humans could also kill many millions, researchers said.

Mr Farquhar said that while there is no evidence to suggest that in the "very near term" militant groups such as Islamic State (IS) will be able to manufacture their own viruses, this could be a future threat.

He said: "What we want to worry about in the future though is as it becomes easier and cheaper to do a lot of these things in an almost off-the-shelf kind of way, or to order the parts for say a smallpox virus off the internet, that might start to change.

"We have seen that in the field of synthetic biology and genetic manipulation of small organisms or things like viruses, the cost has come down unbelievably in the last decade.

"It is still too expensive to worry about rogue groups trying to use the technology, but that might not remain true."

Although rare, a supervolcanic eruption has the potential to be an extinction event (Image: IPGGutenbergUKLtd via Getty Images).

The report calls for the international community to improve planning for pandemics and health systems, investigate the possible risks of AI and biotechnology and continue to cut the number of nuclear weapons.

Mr Farquhar said ameliorating these risks "definitely requires international co-ordination".

He said: "What is really important to remember is that many of these risks don't stop at the borders and wait patiently for their passports to be checked, they are truly global in nature.

"This is not the sort of thing where one country can say 'Oh well we are prepared and the rest of the world can fend for itself'. That is one of the things we saw with the Ebola crisis is how this thing spilled over national borders."