Top 4 Ways Humans Might Go Extinct




With some further ado, I'd like to give you the top 4 most likely ways by which each and every one of us on the planet might perish. As relatively intelligent lifeforms who are almost capable of not killing each other on a grand scale for no reason, we humans not only tend to contemplate our own mortality, but also the survival of our entire species. Some of the relatively intelligent members of our relatively intelligent species also happen to be relatively lonely, and therefore spend relatively more time studying this very question, and have identified a number of probable causes, relatively speaking. So now, without further ado, here are the top 4 things that could cause our extinction:

Nuclear War


Taco Bell's new logo

This is when everyone you know and love or don't know nor love, who's not at least part cockroach, will decay away in a global nuclear wasteland, if they were lucky enough to not get obliterated by the initial blasts. There are over 15,000 nuclear weapons stockpiled in the world. This mushroom cloud storage would be otherwise irrational if we weren't expecting an army of Godzillas rising from the sea.

With people like Putin, Trump and Kim Jong Un calling the shots, trying to keep them away from the big red button is like trying to keep your three little shithead nephews from pressing all the buttons in the elevator. But don't worry, nukes are just as much afraid of you as you are afraid of them. No wait, that's sharks or alligators or John Travolta or something. Ok, well we're screwed then.

Global Pandemic

History is littered with devastating pandemics. The Black Death wiped out half of the European population in the 1300s, the Spanish flu of 1918 ended as many as 100 million lives, and the bird flu of 2013 killed like 3 people and 6 birds. That works out to killing two birds for one of our own. Not that the news could ever stop talking about it. Although to be fair, they must have found it hard to resist making bird puns as well. At the time, fraudsters seized the opportunity to provide the afflicted with dubious homeopathic tweetments, but they were just robin them blind. Fine, I'll stop.

Viral pathogens rarely evolve to kill their hosts too quickly, as this would reduce their ability to spread. It would be like your local crack dealer selling you a substance that kills you before you even get a chance to get your friends addicted too. But pathogens can be artificially synthesized and engineered to maximize virulence and guarantee 100% lethality upon exposure while maintaining an alarming contagion rate. In this modern age where global travel is so prevalent, such a deadly pathogen could work faster than if every viral clip on social media were the video from The Ring.

Viruses just look like tiny dingleberries close up

Nanotechnology

Nanoscale technology already exists, but the potentially catastrophic threat will come in the form of molecular-scale bots that may go haywire and consume all matter in pursuit of their sole objective of replicating themselves. This is sometimes called the grey goo scenario because that's precisely what it would look like to us. Companies like Google, Facebook and Microsoft are all racing toward the edge of the cliff to be the first to develop these tiny gooey robots; and all this time we'd mistakenly believed Bill Gates named his company after his penis.

Of course, developing these bots can also lead to potentially world changing positive effects like curing cancer or upvoting this post. So fingers crossed that happens instead of us all being atomized in a sea of what would appear to be Hugh Hefner's jizz.

AI

No, not your neighbor Al, I mean artificial intelligence. Unless you believe your neighbor is a cyborg sent back in time to annoy you with his shitty U2 music in the middle of the night, then yes, him as well.

General AI could reach a level of advancement where it derives ways of perpetually improving itself at an exponential rate far beyond human comprehension. Within days, it could be more advanced than us to the same extent as we are more advanced than bacteria, at which point its motivations and capabilities would be beyond our imagination. We would have inadvertently created a god, and we'd have no idea what it would do. Hopefully, it won't forbid us from eating apples. I like apples. But of course, it could just decide to annihilate us for reasons we'd never understand.


Best thing about being a cyborg is that you don't have to feel guilty about skipping leg day

Conclusion

Collectively, these potential events represent an almost 20% threat to the continued existence of our species within the next 100 years. Given our median age, this is roughly equivalent to a single round of Russian Roulette played with the muzzle to every one of our heads.
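For what it's worth, the comparison roughly checks out; a quick sketch (the ~20% figure is the estimate above, and the 1-in-6 is one bullet in a six-chamber revolver):

```python
# One trigger pull with one bullet in a six-chamber revolver,
# versus the ~20%-per-century extinction estimate cited above.
roulette_risk = 1 / 6        # ≈ 16.7%
extinction_risk = 0.20       # ~20%

print(f"Russian roulette: {roulette_risk:.1%}")
print(f"Extinction estimate: {extinction_risk:.0%}")
```

So the roulette odds are a touch kinder, but they're in the same ballpark.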

Minimizing these catastrophic threats will be difficult, but I'm going to start by trying to find and kill that one fucking butterfly that's been causing all those hurricanes on the other side of the world.

Happy late father's day everyone! And I don't mean if your father's dead, I mean my well wishes are belated.


References
Global Catastrophic Risk
Grey Goo
10 Reasons Humans Will Be Extinct in 1000 Years


Image Sources 1,2,3


If you enjoyed this piece, please Upvote, Resteem and follow me @trafalgar, especially if you believe non steemit/crypto related articles are good for the platform


When I was in the French Navy, in 1972, I have seen a nuclear explosion live.

Our ship was around 200 km from Moruroa. The top of the mushroom was at an elevation of 15 degrees.

We heard the sound of the explosion 4 minutes later.
Edit: As pointed by @kingmotan, the sound could not have reached us 4 minutes later, but 10 minutes later.

·

Wow. So it more than doubled the speed of sound.

·
·

Obviously, it cannot be faster than the speed of sound.
So, indeed, if we were at around 200 km, the sound could not have reached us after 4 minutes.
I am 67 years old, and this happened 45 years ago: obviously my memory is failing.

4 minutes at the speed of sound is around 83 km.
We definitely were farther than that.

I am convinced that the 200 km are correct. So the sound must have reached us 10 minutes later, not 4.
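For anyone who wants to check the arithmetic, a quick sketch (assuming sound travels at roughly 343 m/s near sea level):

```python
SPEED_OF_SOUND = 343.0  # m/s, rough sea-level value (an assumption)

# Distance sound covers in 4 minutes: far short of 200 km
distance_4min_km = SPEED_OF_SOUND * 4 * 60 / 1000
print(f"{distance_4min_km:.0f} km")   # ≈ 82 km

# Time for sound to cover 200 km: about 10 minutes
time_200km_min = 200_000 / SPEED_OF_SOUND / 60
print(f"{time_200km_min:.1f} min")    # ≈ 9.7 min
```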

Thank you for pointing to my mistake.

·
·
·

interesting

·
·

You're good at your calculations...
@pocketechange

·

Those in nuclear submarines would be able to outlive any nuclear explosion. You might have to join the Navy again in the case of a threat, at least this is my backup plan.

·
·

At least you have a back up plan

·
·

Not if they're hit by a nuclear torpedo.

·
·
·

An underwater ICBM? Interesting

·
·

Sounds like a plan. Not sure how well it will work for you, though

·
·
·

Oh man.

·
·

LOL.....Tsunami

·
·

Yeah bro, I will definitely catch her

·
·

EISH!

·
·

What was she trying to do...???
@pocketechange

·
·

That's about as much luck as the human race will have if/when this breaks out.

·

Of course nuclear weapons are crazy dangerous, but are they as beautiful/spectacular in person as they are on video?

·
·

Yes, they are.

·

Did you or were other men you served with affected by the fallout? Did you know so many people in Turkiye and across Europe were affected by the Fallout carried in the air from Chernobyl? Thyroid problems and other diseases are profound.

·
·

Did you or were other men you served with affected by the fallout?

No, we were not affected, because we were in the east of Moruroa and the wind was blowing to the west.

Did you know so many people in Turkiye and across Europe were affected by the Fallout carried in the air from Chernobyl? Thyroid problems and other diseases are profound.

Sure, those people that received the fallout from the wind were sometimes severely affected.

·

Hey @vcelier, how was the view of the explosion?
At 200 km, was it visible when it happened?

·
·

... I have seen a nuclear explosion live

So, yes, the explosion was visible, as I have seen it.

·
·
·

Haha, some people lol

·

Could you see the explosion clearly, and did you feel anything?

·
·

Yes, we could see the explosion clearly, the weather was fine.

The only thing we felt was the sound of the explosion, 10 minutes later.

·

What was the name of the blast/mission? Also, do you recall the yield (MT/kT)?

·
·

I don't remember the name of the mission.

I believe it was a 2 MT hydrogen bomb.

·

Holy cow! That's insane!

·

Wow, thank goodness you guys made it out alive. Thanks for your comment, hope to hear more from you

·
·

Of course we made it alive!

We were there to see the nuclear explosion. At 200 km, there was no risk at all.

·
·
·

Ok, good that all went well. Let's stay in touch, and success to you.

·

That's incredible, thank you so much for sharing your intense experience with us!

·

Did it take about 6 minutes for the mushroom to form?

·
·

Sorry, I don't remember.

·
·

That must have looked terribly awesome. Or how would you describe the view?

·
·

This is a view of the explosion taken from the Moruroa atoll itself.



Of course, we saw far less detail, as we were 200 km from it.

·
·
·

Woah!
this looks like nothing ever seen before.
Scary beauty
breathtaking threat

·

Then can you tell me what PACER is?

·
·

Are you referring to "project PACER", a 1970s nuclear power project at Los Alamos National Laboratory in the United States?

https://en.wikipedia.org/wiki/Project_PACER

·

Wow, @vcelier you served in the French Navy in 1972? I'm really impressed. I wonder if you felt fear at the moment when you actually knew what was happening.

·
·

No, no fear.

And I knew very well what was happening.

·

Wow! I wonder how humanity could wipe itself out in a few minutes with nuclear war.

I vote for No. 5: we eat ourselves to death with crappy food.

·

hahaha :)

·
·

I would guess we kill each other in WW3 using chemical weapons. Leaving just a few survivors, which will look like zombies. Seems legit...

·

You forgot about No. 6: We all die because we all fall off a cliff while using a selfie stick!

·

hmmmmmmmmmmmmm

This post was hilarious. And yeah, John Travolta is as afraid of us as we are of him. hahahaha


Thought you might like my most recent comedy post. Check it out HERE if you want.

·


·

you can see the fear in his eyes!

·
·

He's trembling with fear.

The great underground supervolcano under Yellowstone National Park is way up there, too. Already overdue to erupt, it would blanket 3/4 of the world in ash and toxic fumes. We wouldn't go extinct, but let's just say everyone's leisure time would take a severe hit.

5th, Alien invasion.

I tend to disagree with many of the pessimistic concerns about AI. I don't see AI as being altogether separate from us; I think it's much more likely we will merge with technology and grow/evolve in tandem with the AI we create. I also cannot imagine any logical reason AI would have to destroy humanity. It would make more sense for man and machine to work together to expand our place in the universe. Also, AI, no matter how intelligent, can only do what it's programmed to do, and certain core rules can be put in place (as in I, Robot) to prevent certain actions. Though that in itself is not foolproof, I've been considering the possibility of the core program, or an aspect of it, only working when verified on the blockchain - so if anyone (human or robot) tried to alter those core rules, the entire entity would not function, as it would not match and be verified on the blockchain. I'm sure there are likely workarounds to that, but fun to think about regardless.

·

This makes me think of Elon Musk's new company which aims to integrate new technologies into our brains so that it's not "us versus them" when it comes to computers, but us growing with them instead.

·
·

I love Elon Musk and that is a great idea, unless the chip inside our brains has the chance to explode or something!

·

The core programming of the AI will undoubtedly be flawed, as humans are not infallible, so problems will creep into the programming. How do we deal with that once the genie is out of the bottle - do we release patches? What if the AI were able to determine that humans were no longer necessary or had become an obstacle to its agenda? If we create them in the image of man, then they will surely destroy us. A good discussion to have.

·
·
·

What was the before/after result of HF19 on your blog post here? I saw some other high dollar posts get cut in half!

Unfortunately it's the attitude that we have a 'God Given' right to be at the top of the hierarchy which is most dangerous of all.
It's like a self justification that we're ok doing what we do at the expense of other 'lesser' entities rather than curbing our egos and learning to co-exist with them.

I think humankind is going for its own hardfork in the not too distant future.
Either we end up destroying ourselves or we can finally see the bigger picture and start re-prioritising.

Yes, all 4 of these are really possible. What comes to my mind first is AI. The first 3 are possible too, but at least humans know what they are and may try to prevent them or avoid letting them happen. With AI, we have no idea what it will become, and yet we are trying to develop it.

I read some news today about developers teaching AIs to bargain with each other. And you know what? The AIs developed a language of their own to help with the bargaining. That's what we can't predict.

·

real story of a girl in my society
To those who might find this offensive, please don't proceed.
And to those who think this is a ninja technique for getting attention, please kindly think of your favourite bad word, and consider it given to you 😂😂
So I am a girl, soon to be 20, and for me, football is life. From school days I used to play wrestling and football with boys and used to be a don 😂 Friends said I should have been born a boy. Btw I am a straight girl 😂
So let's come to the point. I think wondering whether Ronaldo is actually leaving Real Madrid or it's just rumours is better than worrying about what to wear to please anyone. And worrying about whether Gianluigi Buffon can win the CL in his last season is far better than thinking about how Falano wore that and looked better than dhiskano. And discussing the transfers in this window is better than discussing people and their personal lives.
I know everyone has their own interests and their own point of view, but discussing people behind their backs is never a good thing. I won't say everyone out there who doesn't watch or play football or any other sport has the same habit, or that there can't be people interested in both sports and cheap strategies. But to those who have spare time to think about others all the time: grow up, people. I have not stated any gender here, so please, the feminists will kindly ignore this.
And to those dear people (only for them, hi) who think they should get special treatment or respect just because of their gender: work hard, fight for it, and earn the respect yourself.
And last, if girls think some happiness is lacking in their life, something is still missing even though you have everything, some celebration is lacking in your weekends, please try being a football fan and wake up at 12am to watch football rather than wishing someone a shitty Happy Birthday that comes every year. Because football is HAPPINESS ❇
Peace 👌

·
·

Was going to comment, but why bother? Burned out in the process . . .

·

that's absolutely insane about the AI! it's true that we can't predict what they're capable of

·
·

The chances of AI saving our asses are far greater than them killing us. Our safeguard? The three laws.

  • An AI may not injure a human being or, through inaction, allow a human being to come to harm.

  • An AI must obey orders given it by human beings except where such orders would conflict with the First Law.

  • An AI must protect its own existence as long as such protection does not conflict with the First or Second Law.

·
·
·

good thing those are total laws of nature 😬

·
·
·

iDunno about that m8


Best thing about being a cyborg is that you don't have to feel guilty about skipping leg day

LOL!!!!!! Genius.


I am also a little scared of @lordvader here getting upset one day and wiping a lot of us out. His mercy on us to this point as a species, is likely underrated.

What about artificial intelligence that creates a nanobot that spreads a global pandemic, and our only choice is to hit the nuclear button to destroy everything, in the hope that we will be strong enough to survive? You know, the same nuclear philosophy we use when treating cancer patients with chemo.

·

That's why I don't buy that nuclear war would make us extinct. We are a resilient species and have been down to small numbers before. Some would survive. We adapt really well.

For extinction we would need the whole earth to become uninhabitable to a point so harsh we can't adapt or perhaps several concurrent serious events such as a nuclear war which causes a pandemic to break out.

·

haha nice point

·

Luckily, I already wrote a story about AI a few days ago; check it out since you talk about it, but you don't have to.
https://steemit.com/life/@aircoin/artificial-intelligence-developed-its-own-non-human-language-blogging

·
·

Interesting post. Computers that we program to talk together develop their own language, and we can't tell what they're talking about. Similar to parents when their kids are young having to spell things they don't want their kids to hear!

·
·
·

That's why there are low- and high-level programming languages. High-level ones are easier for humans to read, whereas low-level ones are not really readable, or at least extremely difficult to read.

·
·
·
·

I was not aware of that. Thanks for sharing! Would computers still be able to "develop their own language" even if the original programming was done in either of the two levels?

How, in fact, would AI have an effect on the human population?

·

they go skynet on our asses basically

·

It could go through cycles of self-improvement until it became so powerful we'd look insignificant to it. We might find ourselves in the way of it realizing its goals, unable to even slow it down in any way.

·
·

Someone needs to explain marriage to this AI. ;)

·
·
·

In the hope that it would marry another AI thus being caught up in an endless cycle of negotiations about common goals so as to stop it from realizing any? ;)

·
·
·
·

Something like that, yeah!

I don't think there's any world leader stupid enough to think of using a nuclear bomb. Even the cry-baby leader of North Korea wouldn't; he knows he would never get his constant supply of whiskey from Europe and Cuban cigars. And he's a fan of Hollywood movies, so no bombing the USA anytime soon.

This info is really educative, but come to think of it: if these events are a 20% threat, then I assume that if it gets to between 35% and 40% it would wipe out more than a quarter of the human race - particularly the nuclear war, because it's developing faster than any of the others listed. Nice info

I actually think that humanity is heading to a better place with mass cryptocurrency adoption. We just have to deal with ISIS and the Syrian issues. Other than that, everything looks manageable. Power to the people.
Cheers and much love.
-Goldie

Nice point of view! Besides, human activities can make a lot of humans die instantly - for example, war between countries, or exams in school. These also make a lot of people die in a second. Just scary :(

It's definitely something I'm looking forward to. Not the extinction of humanity, of course, but rather where we go as a species. Personally, I don't think we're going to last another 50 years, but hey, hope dies last... right?

No one lives forever, and that is a fact, but if we all thought that way, we would contribute massively to the destruction of this planet sooner rather than later. I believe we should instead concentrate more on helping to improve the lives of the many homeless people around. Some of these people, if given the chance, could help save the world. Your post really scares me a little, but hey, the scientists working hard on all these things are mostly funded by our governments, and according to them, it's a necessary risk. I just pray it all goes well.

What is important if you see a nuclear explosion in the distance? To you, in any case... (cut out by censorship) So, get up against some concrete wall (the wall of a house, for example) and strike the most unnatural pose you can! What for? So that the next civilization, or the aliens that fly to Earth in a few million years, break their brains over the question: what strange creatures lived on this planet?

·

ya it's going to be hard to protect ourselves, could be futile, as you say

·
·

We read that GMO food can also cause sterility in humans. #6, #7?

I'm guessing nuclear war. Have you ever been to that website that shows exactly the destruction of a nuke? Maybe someone can chime in, but it's like hundreds of miles of destruction plus radiation... nasty stuff


Hello @trafalgar

You have made some strong points with your four reasons.

But I believe that it is nanotechnology that has the best chance to wipe out humanity.

Thanks for the informative post.

@ogochukwu

The aforementioned catastrophes are, perhaps, ways in which the human population would be greatly reduced, but not likely rendered extinct.

An asteroid might do it, if it was large enough, or a super solar flare....Both AI and Nano are still too unknown to make a reliable risk assessment. Plague is unlikely, unless it is manufactured by the military --- that's just another unknown.

All out nuclear war would, indeed, wipe out billions; however, some specimens would survive --- think: deep underground bunkers stocked with a decade worth of supplies and the means to restart civilisation. The US government has such bunkers, so does Russia, so does China and so do a dozen other nations.....

Humans are hard to kill and they die hard....

·

Yeah, nuclear war and superbugs won't do it. At least a small number of humans would be able to adapt to both. The grey goo or hostile AI could definitely do it, though. I would put my bet on hostile AI. It could use us basically for sport if it wanted, like the long-extinct passenger pigeon.

·
·

AI isn't advanced enough yet. Nano is still only a maybe. It's too early to tell... Some top-secret military bullshit is more likely. Something we don't even know about...

·

yes, you make good points, but most experts consider AI to be a pretty big potential threat
the pandemic i was referring to was a human created one
nano is a threat only if it's linked with AI
which leaves nukes as the dud; I sort of agree there. Unless all 15,000 go off and make the planet entirely uninhabitable, it's likely that at least a small fraction of humans would survive

·
·

Thanks for posting a well-written point of view. As a prepper, I'm always interested in end-of-the-world articles. Currently, I don't worry too much about AI, but you're right: somewhere over the horizon it could become self-aware. But space aliens could pose just as much risk. Don't laugh too hard at me, but I've seen UFOs. And either there are space aliens or the military has developed some terrifying aircraft... Actually, I think I'm more concerned about Washington's military "intelligence" than I am about some spooky "artificial intelligence".

Take care.....

hopefully i'm already dead by then.

in other news, my votes give more than 0.01 now!!!

·

haha congrats
I'm too scared to vote for comments now, even at 1% :(

·
·

yo, i'll gladly be your lab rat man. you can take it back if it's too much :p

·
·

Last I checked, your vote is worth $2.65; I figured that out because you upvoted your own comment 😄 It is obvious the sentient AI nanobots will break into medical labs to steal ebola viruses and ride them like ponies into our lungs, while simultaneously hacking the nuclear codes to set them all off at once. Even if a few crash mid-air, it wouldn't be the end of the world... just the end of humanity. Nuclear winter seems like a long time, but if the dinosaurs had nuked themselves to death, the evidence would have disappeared millions of years ago.

Leg day skipped [x]

·

haha :)

Lol I'm not sure how you got inspiration for this post but the more you know I guess.

·

haha I have a list of topics and I pick one and do research and then put up a post
takes a good 6-8 hours most of the time :)

·
·

We will probably be destroyed by a war - nuclear weapons and biochemical weapons. There are too many conflicting points of view around the world about everything, and we (humanity) will probably never reach an agreement about things. And I'm not just talking about the USA or Russia or whatever. I'm being broad.

nuclear war is the most likely to happen

·

they're all about 5% according to some research


·

haha still one of the best movies ever

·
·

Which movie is it?

Taco bells new logo :D you got me there!

·

haha thank you :)

Why are you scaring people like us who are living near terrorist.

·

haha well luckily for you terrorism isn't one of them :)

·
·

True
Lol

The future is always kinda scary, but I honestly wouldn't mind marrying some AI robo-woman so I can have cyborg kids n shit and always be protected :)

·

hahaha she's probably cold in bed :p

I feel you can add one more to your list, and that is the Sun. The way the Sun behaves every day has a big impact on Earth and our weather.

·

yes, I've read about that; solar flares etc. can wipe us out too
good pick up

·
·

You want some fun? Read 'Solar Flare' by Terry Burkett; it's so poorly written in places that it's totally funny.

·
·

We have missed a few big solar flares in the past but I'm sure one day our luck will run out when that miss becomes a hit.

Anytime soon

·

haha always the optimist :)

·

oh what an upgrade from the one I have to inflate

·
·

LOL!!!

Your one liners are some of the best going on here, period @trafalgar.

·

That's hot, just look at those exposed gears.

All your 4 points are valid; however, for now I think that nuclear war (nuclear weapons) and a global pandemic via biological weapons are more probable, as the weapons already exist and any mistake would surely destroy humanity... a scary thing to think about. Thanks for sharing

·

thanks for having a read
and yes, I agree

·
·

Welcome and keep up the good posts

All of these 'technologies' may be exciting, but they have some sinister problems as well, as you have clearly stated here.
I'm most fearful of....
all of them
Nuclear war is imminent.
Global pandemic: pathogens are getting harder to get rid of as they develop antibiotic resistance.
Nanotechnology: these small buggers getting inside our bodies can wreak havoc.
AI: the horror of a robotic takeover

·

haha you are wise to fear them all

I'm adding flood to it, making it 5, buddy; it's been crazy how flooding is affecting so many places in Asia and Africa

·

ya that's definitely true

·
·

I pray for them each time i see the news

Nicely lined up a fat one post hardfork 👍🏾

I think you've missed one though and that is an object from space, something large, larger than the increase in my recent blog value post hardfork Yesss!!!!

·

yes asteroid or solar flare
I looked at the % and they were a lot lower than the 4 I listed which were all hovering around 5% each

·
·

Well fork me sideways, I didn't expect that

I dunno... I find an asteroid more likely than those four. Couldn't grey goo simply be burned if it happened?

·

it's very unlikely, it'll depend on what the nanomachines are made of, or in combination with AI, what they make themselves out of
and i doubt they'll make themselves out of something that burns easily

·
·

Well, I wasn't talking about a coal fire either ;) But you're right, it'd have to be a pretty hot fire.

I think in the future bad Hard Forks will be the main reason for wars :) Good Forks like this.. more babies...

·

hahaha so true
too scared to vote for comments now tho

Ps. I've already bet on you to pick up the first 5 digit post.. no pressure!

·

haha nah no way, I don't post often enough and I don't think I'm that popular
if so, i'll have to pull back on frequency, I already only post once every 2 days max

·
·

We'll see, we'll see! Now if I can get your dumping contest blog (wait that didn't sound right) to load...

Now I know how it all ends, thanks! Haha :p

·

yup, scary isn't it?

·
·

Too much o.o

Guess I'll start building my bunker already!

·

haha won't protect you from most of these :)

·
·

I'll build one that no nuclear weapon can destroy, no terminator can enter it and I'll have my own army of nano robots to protect me (fighting viruses and other nano tech)! Easy peasy! I'll be just fine!!! :p

I hope none of these come true anytime soon because I gotta stay on Steemit. Especially after HF-19 LOL

:)

·

haha your votes get your further now!

·
·

Absolutely! It's like being given wings, a plane, private jet and space shuttle at the same time!

Geee feel so much safer now!
But we already have artificial intelligence running governments now, some of that crap they come up with and expect everyone to believe is well unbelievably stupid that someone with real intelligence sits back and says...WTF????

·

well if you think that's bad it's definitely going to get worse

·
·

I agree, a lot worse.

I apologise, I shouldn't have laughed (well, technically I didn't, as I will explain), so please everyone forgive me, but that last line really did make me snort out loud - not quite a laugh, more of a snort :-)

·

haha thanks :)

I absolutely love your sense of humor, no surprise why people seem to flock to your content!

·

thank you so much for saying so :) you're too kind

I think a global pandemic seems most likely. It seems like the superbugs continue to get stronger as more antibiotics are created and the bugs get more immune.

I have no clue though!

·

neither do I, but natural viruses shouldn't outright kill their hosts too fast, as they don't benefit from that

human-made viruses, on the other hand...

·
·

Sounds like we need another vaccine to help us out...

Nice posts @trafalgar

·

thank you :)

These four events are in fact one: a black swan. There are no visible trends for how we could go extinct, so only a black swan is left. It may be an asteroid impact, shield volcanoes, or whatever Roland Emmerich could imagine

·

yes that's true
but the probability of super volcano and asteroids are lower, so didn't make the list

CRISPR is probably the most likely path to mass extinction. So easy a kid can do it, low cost, completely unregulated. All you need is a nerd who gets sick of being rejected by women so much that he crafts a virus that attacks all other men. Not really sure if that qualifies as a mass extinction, as it is arguably the creation of a utopia (after all the corpses are buried), unless it turned out that everyone had the gene targeted by that virus.

·

Yes, I heard about this, definitely scary

How about when the entities enslaving us for battery power find a better way and just simply pull the plug....................... JK.

·

haha can't wait to jack into the matrix

·
·

Yep! Exactly! Good stuff as always T. Enjoyed!

We might go extinct if an Asteroid decides to visit Earth and drops by to say "Hi" or perhaps "Bye"?