We certainly seem to have no shortage of crises.
When you consider Ebola, ISIS, ecological disaster, and the potential for economic collapse, human extinction can almost seem at times to be a given. If that's not enough of a cheery thought for you, the fine folks over at io9 put an even sharper point on it. There are likely more crises coming up "around the hood ornament," so to speak, that carry the capability to wipe out all of humanity. What's more, they will all be of our own doing.
Yet we don't seem that concerned. As the article states:
"Yet, these risks remain understudied. There is a sense of powerlessness and fatalism about them. People have been talking apocalypses for millennia, but few have tried to prevent them. Humans are also bad at doing anything about problems that have not occurred yet (partially because of the availability heuristic – the tendency to overestimate the probability of events we know examples of, and underestimate events we cannot readily recall)."
So what does the author consider to be the five biggest culprits for extinction on the horizon? Let's take a look at them. I will be treating them in ascending order of "what scares me the most."
-Nanotechnology. Face it. Robots on the molecular or even atomic scale scare people. It's that threat of the unseen, of tiny mechanisms that can enter your body while you are none the wiser. A maniacal mind could use such micro-sized devices to poison someone, perhaps control them, or even just drive them crazy by making them think they have a poltergeist in the house. This is to say nothing of the "grey goo" scenario, where self-replicating nanobots get out of control and devour everything in sight, thus sending humanity into extinction.
Then again, this technology could aid us in getting climate change under control or defending our nation. Stop thinking about what could go wrong and consider what could go right.
-Superintelligence. This covers everything from enhanced human cognition to artificial intelligence. The concern stems from the fact that high intelligence does not always come with a high sense of ethics. A highly intelligent person...or machine...in a position of authority or control might see a situation in terms of pure logic and not be sensitive to the side consequences of a decision.
While I can share a bit of concern over this possibility, I again see this as another case of "rise of the machines" Luddite reactionary fatalism. See above.
-Human-created pandemic. Now I'm getting scared. While pandemics such as Ebola have killed many, they generally aren't favored by nature as extinction tools, because wiping out their hosts is problematic to a pathogen's own survival. Someone eventually demonstrates resistance. Human ingenuity can overcome that defect, however. We can make diseases more contagious and more robust against resistance. A study on bird flu demonstrated that the contagious quality of that virus could be deliberately boosted. Bioweapons. If we can turn nature into a weapon, we will.
-Unknown unknowns. You might wonder why I place this Donald Rumsfeld-esque entry second to last and not first, as the article lists it. Well, again, as the writer states, I suppose I fall prey to the "availability heuristic." It's hard for me to be afraid of something I don't know about. That being said, I know that the laws of averages and probability suggest that humans can only face so many catastrophes before our number is up.
-Nuclear war. I've written about it on here so many times that it isn't even funny, but this Cold War child is still scared to death of it (thank you, John Chancellor). While a full-tilt nuclear exchange between the armed nations seems unlikely at this time, it still isn't far from our minds. Putin is sending nuclear-armed bombers and submarines closer and closer to the United States as a means of waving his genitalia about. There is always the threat that a terrorist organization such as ISIS will get its hands on a nuclear device left roaming about after the fall of the Soviet Union. Even a regional exchange between, say, India and Pakistan would have drastic global consequences. We're certainly good at finding ways of killing ourselves.
In the end, that might be the factor that belongs at the top of this list. Human beings carry quite a streak of avarice, selfishness, and short-sightedness.
That may be the biggest threat of all.
Follow me on Twitter: @Jntweets