
How many more years do you think humanity has, really?

CFC (Bluelight Crew, joined Mar 9, 2013, 18,171 messages)
I was thinking we're probably on the final straight now, maybe 5-10 years tops? :unsure:

I guess it's been prompted by all the stuff about AI recently, and its probable efforts to exterminate humans, which does seem fairly logical. And also reminds me that any technological lifeform, not just humans, would probably be exterminating itself just about now, once the democratization of the means to take life on a vast scale reaches a certain threshold.

You only need a handful of mendacious types, an accident or three, individuals with unstable personalities and so on, to flip the kill switch. I kinda think of it like a natural process for any advancing lifeform that's acquired the means for little people to do big things, that must inevitably happen every time a civilization reaches this point. Not unlike a supernova I suppose. Or an asteroid crashing into a planet. A natural and determinate process.

People say "ah Ceeeffcee, whattabout nuclear weapons - we survived that, we can survive anything!" But nukes are not democratized forms of mass murder. Sure, one nutty person could launch them, but on the whole, you need huge numbers of people working collaboratively to manufacture and maintain nukes. You can't just pop over to your computer and print off a nuke like you (almost) can a virus with synthetic biology. Or like an AI would shortly be able to do.

Maybe this is indeed why we don't really see anything out there? Because all technological life everywhere will always reach this point. I can't see how it would be possible for advancing lifeforms to not reach this point, since democratizing the means to the destruction of life on a vast scale is an inevitable corollary of technological advancement, and so inherently entails an extinction event sooner or later, from something or other somewhere.
 
It happened to the dinosaurs and it'll happen to us. God's, mother nature's, the universe's or whoever is running the shit show's little plan, I guess?
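To make the "little people doing big things" intuition concrete, here's a toy back-of-envelope model, a minimal sketch in which every number is invented purely for illustration and nothing is a forecast: treat each actor with access to a catastrophic technology as having some tiny, independent annual chance of using it, and watch what happens to the cumulative probability as access spreads from a handful of states to millions of individuals.

```python
# Toy model of the "democratized kill switch" intuition.
# All figures below are invented for illustration only.

def cumulative_risk(actors: int, per_actor_annual_prob: float, years: int) -> float:
    """Probability that at least one actor triggers a catastrophe within `years`,
    assuming independent actors and a constant, tiny per-actor annual probability."""
    p_quiet_year = (1.0 - per_actor_annual_prob) ** actors
    return 1.0 - p_quiet_year ** years

# Nukes: capability concentrated in roughly a handful of states.
print(cumulative_risk(actors=9, per_actor_annual_prob=1e-4, years=100))         # ~0.09

# A hypothetical "print-a-virus" technology in a million hands, each far less
# likely per year to misuse it, still drives the cumulative risk toward 1.
print(cumulative_risk(actors=1_000_000, per_actor_annual_prob=1e-6, years=50))  # ~1.0
```

The specific numbers don't matter; the point is that the cumulative probability is driven by the product of actors and years, which is exactly what democratization inflates.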
 
School house Earth is going nowhere for now. And I bet each generation thought they were at the end times. I saw a woman talking about her Hiroshima experience, and of course they all thought it could be the end times. Lots of dark moments in history.

What is not reported in the news are some actions that should be. Recently I read that a couple of kids had a lemonade stand. They said they were trying to raise money for Disneyland. So one person came by and bought them the tickets and a flight down. I was surprised the news reported good news.

Eh, the Earth will shake us off if humans get too unbalanced. The Earth will balance itself if need be.
 
Since the dawn of man there's been someone shouting – THE END IS NEAR!

The likelihood of shit going very bad is about the same as things going very good. We'll most likely end up somewhere in the middle, again.
 
  • Like
Reactions: CFC
The end will probably be some cataclysmic natural disaster or disasters. Obviously there was a great flood here in the past.
 
IIRC the average mammal species lasts for about 1 million years. So we probably still have at least 100,000 years left, and it might be much more unless something unprecedented happens to us. As a species we aren't as dumb as many people think; we have managed to survive pretty extreme disasters before.
But if we manage to create a superintelligent AGI, then we could have like 10-20 years left, since there is currently no solution to the AI alignment problem.
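For what it's worth, the arithmetic behind that "at least 100,000 years" floor works out roughly like this. It's a rough sketch using the commonly cited round numbers; the 300,000-year species age is an assumption for illustration, not a figure from the post above.

```python
# Back-of-envelope check on the species-lifespan argument above.
avg_mammal_species_lifespan = 1_000_000   # years, the rough average cited above
homo_sapiens_age = 300_000                # years, roughly when anatomically modern humans appeared

expected_remaining = avg_mammal_species_lifespan - homo_sapiens_age
print(expected_remaining)  # 700,000 years on average, so "at least 100,000" is a conservative floor
```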
 
Since the dawn of man there's been someone shouting – THE END IS NEAR!

The likelihood of shit going very bad is about the same as things going very good. We'll most likely end up somewhere in the middle, again.
I waver on this a bit. To some extent, I feel it's part of the human mind that we are always predicting our imminent doom. And then I consider how much more plausible it has become with our technological progress, and how much more likely it's surely going to get. As CFC says, the means for destruction have been, or will soon be, democratised. Once we pull a black ball out that makes mass destruction 'simple', it's hard to see how we avoid it.

Two uncertainties though: whether we will produce a technology that makes mass destruction easy, and whether anyone will have the will to do so. To the former, I would say that we can already imagine those means (CRISPR becoming widespread with the simple ability to create pathogens, for example), and to the latter, we will always have a small percentage of psychopaths or ideologically charged 'normies'. Combine the two, and it feels like crying apocalypse isn't just a reflex.
 
More and more, I see this technological race as an elaborate excuse for not being nice to each other. What else did we ever need?

Technology as an extension of envious macho posturing in response to actual high-quality human beings.

As for the question, I've got no way of arriving at a quantified prediction.
 
I waver on this a bit. To some extent, I feel it's part of the human mind that we are always predicting our imminent doom. And then I consider how much more plausible it has become with our technological progress, and how much more likely it's surely going to get. As CFC says, the means for destruction have been, or will soon be, democratised. Once we pull a black ball out that makes mass destruction 'simple', it's hard to see how we avoid it.

Two uncertainties though: whether we will produce a technology that makes mass destruction easy, and whether anyone will have the will to do so. To the former, I would say that we can already imagine those means (CRISPR becoming widespread with the simple ability to create pathogens, for example), and to the latter, we will always have a small percentage of psychopaths or ideologically charged 'normies'. Combine the two, and it feels like crying apocalypse isn't just a reflex.
Sure, some catastrophes are now more likely than ever, well, more than at any point in known history at least. But surviving natural catastrophes is also more likely than ever.

The end of our current civilization, that I can see happening; the end of humanity, no. That's like when some people think humans can destroy all life on Earth: nah, even if we tried, life would continue.
 
I was thinking we're probably on the final straight now, maybe 5-10 years tops? :unsure:

I guess it's been prompted by all the stuff about AI recently, and its probable efforts to exterminate humans, which does seem fairly logical. And also reminds me that any technological lifeform, not just humans, would probably be exterminating itself just about now, once the democratization of the means to take life on a vast scale reaches a certain threshold.

You only need a handful of mendacious types, an accident or three, individuals with unstable personalities and so on, to flip the kill switch. I kinda think of it like a natural process for any advancing lifeform that's acquired the means for little people to do big things, that must inevitably happen every time a civilization reaches this point. Not unlike a supernova I suppose. Or an asteroid crashing into a planet. A natural and determinate process.

People say "ah Ceeeffcee, whattabout nuclear weapons - we survived that, we can survive anything!" But nukes are not democratized forms of mass murder. Sure, one nutty person could launch them, but on the whole, you need huge numbers of people working collaboratively to manufacture and maintain nukes. You can't just pop over to your computer and print off a nuke like you (almost) can a virus with synthetic biology. Or like an AI would shortly be able to do.

Maybe this is indeed why we don't really see anything out there? Because all technological life everywhere will always reach this point. I can't see how it would be possible for advancing lifeforms to not reach this point, since democratizing the means to the destruction of life on a vast scale is an inevitable corollary of technological advancement, and so inherently entails an extinction event sooner or later, from something or other somewhere.
It will not be one or two catastrophic events that will mark the end of the human species; it will be incremental steps towards destabilization and dehumanization. Very gradual, very calculated and systematic.

There will be devastation on large scales, but it will not necessarily result in complete annihilation. Rather, it is a series of events that will eventually lead to societal collapse, such as more public shootings, more sickness and illness, and professionals and law enforcement becoming overwhelmed. It will not be something that is apparent to us while we are experiencing it; it will just seem like there is a lot of messed up shit happening. And once it reaches its pivotal point, there will be the ones who disguise themselves as saviors and healers, but they will be the ones who direct you into the abyss.

Natural disasters are caused by collective human focus, or collective focus through consciousness in general. We choose with our own free will to attract things such as earthquakes, tsunamis, hurricanes, asteroids and comets and meteors and such, and we are also in control of the volcanoes under the ground. The only reason there is this anticipation that one of these things will happen is because the human species is at a crossroads and is trying to decide on a collective level whether or not it wants to survive. And this uncertainty and disharmony within the human consciousness is what gives rise to the natural disasters inflicted upon us in the first place.
 
Sure, some catastrophes are now more likely than ever, well, more than at any point in known history at least. But surviving natural catastrophes is also more likely than ever.

The end of our current civilization, that I can see happening; the end of humanity, no. That's like when some people think humans can destroy all life on Earth: nah, even if we tried, life would continue.
I agree with the sentiment that the idea of humans having the capability to destroy "Mother Earth" is mostly nonsense. Even cyanobacteria during the Great Oxidation Event (which led to a massive extinction) were unable to do it.

However, the prospect of a superintelligent AI that isn't in alignment with the values of DNA-based life forms (let alone with human values) is quite scary. If such a sophisticated entity were ever unleashed, we would be completely powerless to do anything about it. This AI might decide to embark on an exploration of the universe (or something), leaving Earth undisturbed, or it could even pursue an abstract objective or goal that would be impossible for us to understand, yet inadvertently benefits humanity as a side effect. However, this scenario seems improbable, kind of like winning the lottery.

The AI wouldn't necessarily have to be malevolent, it would likely transcend our conventional concepts of good and evil. But what if it "accidentally" creates a synthetic virus that degrades all DNA/proteins it encounters? This is just a basic example, yet there is an almost infinite number of scenarios that could result in our demise...
 
I was thinking we're probably on the final straight now, maybe 5-10 years tops? :unsure:

I guess it's been prompted by all the stuff about AI recently, and its probable efforts to exterminate humans, which does seem fairly logical. And also reminds me that any technological lifeform, not just humans, would probably be exterminating itself just about now, once the democratization of the means to take life on a vast scale reaches a certain threshold.

You only need a handful of mendacious types, an accident or three, individuals with unstable personalities and so on, to flip the kill switch. I kinda think of it like a natural process for any advancing lifeform that's acquired the means for little people to do big things, that must inevitably happen every time a civilization reaches this point. Not unlike a supernova I suppose. Or an asteroid crashing into a planet. A natural and determinate process.

People say "ah Ceeeffcee, whattabout nuclear weapons - we survived that, we can survive anything!" But nukes are not democratized forms of mass murder. Sure, one nutty person could launch them, but on the whole, you need huge numbers of people working collaboratively to manufacture and maintain nukes. You can't just pop over to your computer and print off a nuke like you (almost) can a virus with synthetic biology. Or like an AI would shortly be able to do.

Maybe this is indeed why we don't really see anything out there? Because all technological life everywhere will always reach this point. I can't see how it would be possible for advancing lifeforms to not reach this point, since democratizing the means to the destruction of life on a vast scale is an inevitable corollary of technological advancement, and so inherently entails an extinction event sooner or later, from something or other somewhere.
 
As CFC says, the means for destruction have been, or will soon be, democratised. Once we pull a black ball out that makes mass destruction 'simple', it's hard to see how we avoid it.

Yeah that was basically the crux of what I was trying to convey. We're used to being teeny tiny beings having (let's face it) a fairly limited ability to change the world around us by ourselves. But once we teeny tiny individuals have the power to do megahuge (and destructive) things, we're definitely in a new place, and not everyone is out there trying to be creative and nurturing. Human competitiveness has a known tendency to bring out the worst in us in terms of developing lethal tools and weapons, brinkmanship and one-upmanship, even if we don't set out to do it with malicious intent (i.e. purely defensively, etc).
 
I agree with the sentiment that the idea of humans having the capability to destroy "Mother Earth" is mostly nonsense. Even cyanobacteria during the Great Oxidation Event (which led to a massive extinction) were unable to do it.

However, the prospect of a superintelligent AI that isn't in alignment with the values of DNA-based life forms (let alone with human values) is quite scary. If such a sophisticated entity were ever unleashed, we would be completely powerless to do anything about it. This AI might decide to embark on an exploration of the universe (or something), leaving Earth undisturbed, or it could even pursue an abstract objective or goal that would be impossible for us to understand, yet inadvertently benefits humanity as a side effect. However, this scenario seems improbable, kind of like winning the lottery.

The AI wouldn't necessarily have to be malevolent, it would likely transcend our conventional concepts of good and evil. But what if it "accidentally" creates a synthetic virus that degrades all DNA/proteins it encounters? This is just a basic example, yet there is an almost infinite number of scenarios that could result in our demise...
I think very powerful AIs of the future will for a long time be neither good nor bad, just kind of created in our image. After enough time passes and AI starts creating more powerful and more complex AIs, then we can hope it'll evolve toward an entity that can achieve its goals with as little violence as possible and learn from humanity's mistakes.
 
I agree with the sentiment that the idea of humans having the capability to destroy "Mother Earth" is mostly nonsense. Even cyanobacteria during the Great Oxidation Event (which led to a massive extinction) were unable to do it.

However, the prospect of a superintelligent AI that isn't in alignment with the values of DNA-based life forms (let alone with human values) is quite scary. If such a sophisticated entity were ever unleashed, we would be completely powerless to do anything about it. This AI might decide to embark on an exploration of the universe (or something), leaving Earth undisturbed, or it could even pursue an abstract objective or goal that would be impossible for us to understand, yet inadvertently benefits humanity as a side effect. However, this scenario seems improbable, kind of like winning the lottery.

The AI wouldn't necessarily have to be malevolent, it would likely transcend our conventional concepts of good and evil. But what if it "accidentally" creates a synthetic virus that degrades all DNA/proteins it encounters? This is just a basic example, yet there is an almost infinite number of scenarios that could result in our demise...

Precisely, immense power comes with immense responsibility. I accidentally destroyed a rare mason bee colony last year because I didn't see their small holes hidden among the leaf litter covering the dried mud in my front flower bed, and drowned them all while watering the garden. I'm the last person to intentionally harm any kind of wildlife, and I felt devastated. But give me a hose and all that water power, and I can do vast damage to tiny creatures without the means to defend themselves.
 
I think very powerful AIs of the future will for a long time be neither good nor bad, just kind of created in our image. After enough time passes and AI starts creating more powerful and more complex AIs, then we can hope it'll evolve toward an entity that can achieve its goals with as little violence as possible and learn from humanity's mistakes.

The problem is, a lifeform that evolves from us is still basically us, with all our evolutionary drives and motivations guiding its actions and behaviors, no matter how inappropriate they may be to present circumstances. Will it be able to (or want to) eliminate that core essence from itself? Would we want to from ourselves? And will it really last much longer than the biological humans it replaces, or just wipe itself out shortly thereafter as a result of still being basically human with all our fucked up flaws?
 
We are deffo (imho) making our own coffin; the issue is that a small number of humans have issues & they want to control everyone else.

I agree with CFC: the A.I. that is coming soon will have powers beyond our control, & look how humans treat animals as if they are a "lower form of life" & A.I. will see us that way too.

We are doomed, folks.
 
“I believe that human brilliance manifests itself only in flashes, among rare individuals. For this reason, humanity as a whole is enormously destructive: the creation of something as devastating as Western culture, which is now allowed to spread throughout the world, offers sufficient proof of this fact.”
― Pentti Linkola.

“The real problem is posed by those countrymen who are complete slaves to machines from a shockingly young age. All exceptions aside, it is impossible to make the average Finnish country dweller of over fifteen years of age ride a bicycle, ski or row — or even exercise in the fields. The spell of the car and its antecedent — the scooter — is unbelievable. A young man will travel a hundred metres to the sauna by car; as this involves backing the car, reversing and manoeuvring, opening and shutting garage doors, it is not a matter of saving time. In the case of farmers, moreover, the more technology advances — every sack of fertiliser now being lifted by a tractor, the spread and removal of manure being a mechanical feat — the more will their physical activities be limited to taking a few steps in the garden and climbing onto the benches of saunas. Lumberjacks have already been replaced by multi-tasking machines, while fishermen lever their trawl sacks with a winch, haul their nets with a lever, and gather their Baltic herrings with an aspirator from open fish traps.”
― Pentti Linkola
 
we can hope it'll evolve toward an entity that can achieve its goals with as little violence as possible and learn from humanity's mistakes.
We can hope... But why would that kind of AI listen to us? The gap in intelligence might be so large that it's pretty much impossible for us to imagine what it would look like.

With enough computing power, once we create the first AI capable of creating a more advanced version of itself, there would be no turning back. Worst-case scenario: millions of generations of AI could be created in a couple of seconds, and it's impossible to predict what the last one would be like. It's pretty easy to lose control.
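A toy geometric-series sketch of that "no turning back" point, with the one-hour first generation and the 20% per-generation speed-up as invented assumptions purely to show how the compounding works: if every generation designs its successor even modestly faster, the total wall-clock time for an enormous number of generations stays bounded.

```python
# Toy illustration of runaway self-improvement. Nothing here models real AI;
# it only shows how a constant per-generation speed-up compounds.

def time_for_generations(n_generations: int, first_gen_time: float, speedup: float) -> float:
    """Total seconds to run n generations when each generation takes
    `speedup` times as long as the one before (a geometric series)."""
    total, gen_time = 0.0, first_gen_time
    for _ in range(n_generations):
        total += gen_time
        gen_time *= speedup
    return total

# First self-redesign takes an hour, each successor is 20% faster:
# a million generations still finish in about five hours of wall-clock time.
hours = time_for_generations(1_000_000, first_gen_time=3600.0, speedup=0.8) / 3600
print(round(hours, 2))  # ~5.0
```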
 