Current Events, Politics & Science

What happens if your mind lives forever on the internet?

JessFR

Moderator: CEPS
Staff member
Joined
Oct 22, 2012
Messages
6,317
That sure seems disrespectful to all the hard work people have put into advancing technology, suggesting it just happens on its own, uncontrollably.

Seems to me to be twisting the generally accepted meanings.
 

andyturbo

Moderator: AADD, MDMA, TL; Administrator: PR.net
Staff member
Joined
Dec 12, 2006
Messages
1,916
Location
Melbourne - Under the lasers
That's not correct, Shady. At the current rate of technological advances we are looking at almost spot-on 2030. Definition below.

[Attached images: technological singularity growth projections, source: Kurzweil 2006]




The technological singularity (also, simply, the singularity)[1] is a hypothetical point in the future when technological growth becomes uncontrollable and irreversible, resulting in unfathomable changes to human civilization.[2][3]

According to the most popular version of the singularity hypothesis, called intelligence explosion, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a "runaway reaction" of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.

The first to use the concept of a "singularity" in the technological context was John von Neumann.[4] Stanislaw Ulam reports a discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".[5] Subsequent authors have echoed this viewpoint.[3][6]

I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.[7]

The concept and the term "singularity" were popularized by Vernor Vinge in his 1993 essay The Coming Technological Singularity, in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate. He wrote that he would be surprised if it occurred before 2005 or after 2030.[7]

Four polls, conducted in 2012 and 2013, suggested that the median estimate was a 50% chance that artificial general intelligence (AGI) would be developed by 2040–2050.[8][9]

Public figures such as Stephen Hawking and Elon Musk have expressed concern that full artificial intelligence could result in human extinction.[10][11] The consequences of the singularity and its potential benefit or harm to the human race have been intensely debated.
 

Shady's Fox

Bluelighter
Joined
Jul 12, 2017
Messages
2,626
Location
Palm trees & sea, just walkin along the beach.
What you said there is useless, and fuck technology; if I had a button to destroy it, BAM, you know, I would save humanity. Also, fuck technology in what circumstances? In the way that you already see what happened in the world with Facebook/Instagram/Snap/Tumblr/etc. Everyone is glued to their screen, which is nothing new. Now, on topic, the so-called ''Brain Cloud'' is similar to the idea where the hosts have a device implanted in them and so everyone is controlled.

I can see people's efforts in this thread. They are trying to bring up new ideas to discuss, but honestly I don't understand why all these generic answers? If I had nothing to say I would shut up, right? It's logical, haha.
 

JessFR

Moderator: CEPS
Staff member
Joined
Oct 22, 2012
Messages
6,317
Well, good news. You don't have a button, and we're all gonna continue enjoying the fruits of technology. And apparently so will you; if you weren't, you wouldn't be here bitching about them. ;)
 

Xorkoth

Sr. Mod: PD, TR, TDS, P&S
Staff member
Joined
Feb 8, 2006
Messages
41,035
Location
Shadowmeister v0
LOL, Shady, you so dark. I thought there were some pretty good ideas tossed about in here; well, they were definitely better than "everyone should have just shut the fuck up" ;) Guess you're entitled to your opinion.
 

JessFR

Moderator: CEPS
Staff member
Joined
Oct 22, 2012
Messages
6,317
Vastness said:
I think it's important to note that this is only a perception - none of us have any way of knowing whether the past actually happened, or whether the future we expect will ever happen. It just feels that way. The illusion, if it is that, is obviously very convincing, of course.

I would be interested to get your opinion, since you seem to hold the viewpoint I alluded to earlier - that consciousness exists as a separate entity from the accumulated collection of memories, both conscious and unconscious, that we generally use to define who we are. My own opinion is that this is generally problematic, at least partly because, as far as I can see, without this assortment of memories there does not seem to be anything left to differentiate one person from another (I will use the term "memories" as a catch-all from here onwards that includes the impact of all our experiences as well as innate genetic tendencies - "genetic memory", if you like - to save constantly having to elaborate on this definition).

You mentioned that someone could have been born into an evil neonazi murderer - let's just say Hitler for simplicity - I have no doubt myself that this is the case, but it's difficult for me to extrapolate that this implies that consciousness does exist as a separate entity to the collection of memory and experience that makes up a person. It would seem to me that this implies the opposite. If we take you, me, and Hitler, say from the perspective of an omnipotent god, extract these souls from their incorporated state, wipe the slate clean of every memory of who and what we are - then what is it that you suppose is left? If consciousness is separate from these things - and even if it is always dormant except when it is temporarily imbued into matter - then what is the difference between any one unit of consciousness and the next? Or do you suppose that there is no difference?

Following on from that, do you have any opinion on what the "cut off point", so to speak, is for matter, specifically biological matter - or, perhaps, let's just say sufficiently complex matter, since you did not rule out the possibility of consciousness inhabiting an artificial substrate - to be suitable for housing one of these souls, or "units of consciousness"? Would a worm be conscious, for example? Or a mouse... or a monkey?

To me it seems that any "cut off point" is a fairly arbitrary line - and it seems intuitively strange to argue that the same consciousness could jump from, say, a fruit fly, or wherever you'd put the lower bound for a conscious being, up to a human being, with all memories of being a fruit fly previously wiped clean. Of course the fact that it is intuitively strange does not rule it out, but I won't bother to elaborate on that line of reasoning for now... In any case, it seems if you draw the line at any point below present day human, then you have to allow for the possibility of different "flavours" of this consciousness, or that, essentially, it is possible for one thing to be "less conscious" than another thing - do you think that this makes sense to say?

But as soon as consciousness ceases to be a collection of discrete units, or individual, separate souls, then it becomes instead a fluid property, that something can have more of, or less of, and once you get into this territory, in my view, the more likely conclusion is the one Xorkoth expressed and which I generally share - that there is no "line" where inanimate matter suddenly becomes conscious, but that consciousness is an intrinsic property of the universe existing on a sliding scale from the smallest vacuum fluctuation in the void, to the unfathomably complex system that is the human brain, and potentially, to other complex systems that we would generally consider to be unconscious, such as stars - even if the experience of being a star would be very different to the experience of being a human.



Other than that, "continuity" of consciousness, I think, is something that people get hung up on - but I'd like to propose a thought experiment I think is interesting. I feel like perhaps I've mentioned it before on this or another forum, but anyway - say that mind uploading can be done gradually, while the biological entity is awake.

Say in the future we have the technology to gradually replace sections of the brain - with no interruption in consciousness to the conscious being - or, even at the level of the individual neuron, say tiny nanobots begin to augment neurons with specialised hardware equipped with the most high-bandwidth future WiFi we have available - but the whole time this is happening, the individual being uploaded is awake and conscious, speaking to their friends and family, perhaps some of them incorporated and biological, others inhabiting a virtual world. They start to become aware gradually of different senses and abilities that they didn't have before that allow them to interface with this virtual world, and they can switch their focus at will - but they still maintain control over their physical body because of the replacement hardware being installed in their skull.

Finally the process is almost complete: they can still control their physical body, but also interact at will with the virtual world. At this point, their biological brain is preserved but they also have the capacity to interact with the virtual. Eventually, to finish the process, these nanobots start to destroy the original neurons. This is done gradually; at all points the person being uploaded remains conscious and awake, and they can stop the process at any point if they feel uncomfortable. But because of the perception of continuity, and because the computer hardware takes over the function of each deleted neuron one by one, they do not experience an interruption of consciousness and the process is eventually complete. At this point, their biological brain has been completely destroyed. Have they actually been killed, without them knowing?

What if instead the same process occurs, but this time they are asleep - rendered unconscious via anaesthesia, and their virtual self is not actually running, but just being written into data - so when they go to sleep, their mind is biological, and when they "wake up", they are virtual. In terms of the "transference of the soul", is there really any difference between these two processes?
Hey, sorry for the delay in replying to you, but this was a big post, so I wanted to wait till I had an opportunity, with a proper computer keyboard rather than the phone I usually post from, to reply to you.

I'll try to reply to your points in the order you raised them, starting with differentiating consciousness from other aspects of the mind.

Yes, I tend to believe that consciousness is separate from much of the rest of the mind. I don't know how much. It stands to reason to me that, assuming there exists "consciousness" as a phenomenon independent of things like memory, personality, etc., we don't have it simply by chance. It stands to reason to me that whatever the underlying physical basis for consciousness is, we have it because it serves some kind of biological function beyond simply providing consciousness. But since I don't know what that function might be, for the purposes of discussion I don't assume that it does anything beyond providing consciousness.

You asked, "as far as I can see without this assortment of memories there does not seem to be anything left to differentiate one person from another" and "what is the difference between any one unit of consciousness and the next?". This is where we again run into the problem of not having a word to define the difference, and the difficulty of defining the word without already having an accurate word to define it with.

What is the difference between one consciousness and the next? Between yours and mine, say? Separate from memories, personality, etc.? What separates them, in my mind, is simply that they each have their own unique experience of being "them". For example, let's say you somehow lost all your memories, as well as all your personality - everything that generally makes you a unique person, different from others. It stands to reason that even without that, you are still experiencing consciousness. You would still be 'you' even without those defining characteristics, separate from being somebody else.

The problem with discussing this is that I have to use the word "you" in multiple different ways: the you as in all the things outside observers can say make you, you; and you as in your own internal perception of existing. For instance, from your perspective, when you see other people in your life, you (the latter meaning of 'you') can see those people, but you are not experiencing existence from their perspective. For that matter, there's no clear reason why whatever it is that makes you consciously aware need be experiencing existence from any perspective. Presumably the whole world could be the same as it is now, with all the same people doing the same things, but with none of them truly conscious. None of them having a soul, so to speak. And because of that, it not mattering what harm comes to them, because no matter what suffering they experience, there would be no true consciousness to experience that suffering.

That's what I mean by the soul here: that distinction between somebody who is experiencing existence, and thus can experience suffering, which presumably applies to all humans and probably a bunch of other animals, and a hypothetical somebody who might have all the same memories, personality, desires, etc., but lacks true consciousness - only having apparent consciousness from the perspective of an observer, but not truly being conscious. For such a hypothetical person, causing suffering to them would not matter, because no matter how much they might seem conscious, they aren't.

So what is the difference between one unit of consciousness and the next? From the perspective of an external observer, possibly nothing whatsoever. The difference could only be perceived in the sense that each unit is experiencing ongoing consciousness, distinct from any other unit of consciousness.

What is the cutoff? Are worms conscious? In my opinion, based on my thinking about this: unknown. It stands to reason that these units of consciousness don't JUST provide distinct consciousnesses. If they did, why do we have them? Why aren't we just a race that has all the apparent behavior of conscious life, but with no individual truly experiencing it? So I suspect this unit of consciousness, or soul, whatever you want to call it, also has some other function, presumably involving other aspects of cognition. In which case, there may well be some animals that don't have this - animals whose suffering doesn't matter because there's no true consciousness to experience it. Though if I had to guess, I'd guess that at the very least all mammals have it, and probably a bunch of other species too.

As for different flavors of consciousness, different levels: I suspect (I keep saying suspect because all of this is deduced from my thought experiments, which is hardly scientific evidence) that a soul/unit of consciousness can experience its awareness from brains with a wide range of cognitive capabilities. So while a dog, for instance, doesn't have the same cognitive reasoning capabilities as a human does, it probably does have true consciousness; it's just that it isn't capable of comprehending its experience of existence. Hence why it would be possible for a soul to be reborn into a new host after the death of its old one, even if one has vastly different cognitive capabilities from the other. So if, for example, you died, and one day were reborn into some species similar to a dog, you would have none of your old memories or old personality, or even the same ability to comprehend the world and form rational thoughts and opinions. But you would still be the same underlying consciousness, experiencing existence, and capable of experiencing suffering from your unique awareness, as opposed to simply being a creature that only appears to observers to be experiencing suffering.

As for your hypothetical involving the transition to a technological brain from an organic one: without knowing how the underlying unit of consciousness really works, what it is, it's impossible to say if you would still be truly conscious at the end, or if that consciousness were at some point lost. Is there a difference in the end? Maybe, maybe not. But as with the Swampman thought experiment, it seems to me that there would be a real danger that by the end you would only appear conscious, rather than truly being conscious.

Sorry for such a long reply. But this is deep philosophical stuff, which we don't have satisfactory words to explain.
 

Vastness

Moderator: PD
Staff member
Joined
Mar 10, 2006
Messages
1,384
Location
iterating through cyclic eternities
Thanks for taking the time to respond! No need to apologise for the long reply, my own post was obviously a long one also and this topic is surely one that justifies a little depth of discussion.

I wasn't aware of the "Swampman" thought experiment; I just read up about it now. Very interesting, and I guess that's a very similar scenario to the one I proposed, except that the "causal link", so to speak, is broken entirely, as Swampman was an entirely random if highly improbable occurrence rather than a deliberate creation with reference to the original...

I guess there is some critical information about the nature of consciousness missing, that means this debate will for the time being always remain somewhat undecided.

I appreciate what you're saying regarding the difference between one "unit of consciousness" and another being the current experience of that consciousness, and I guess the difference here is kind of self-evident, and in some respects the absolute tangibility of consciousness as an unmeasurable "substance", if you like, is kind of irrelevant... like if we had two rocks, made of the same material, completely identical, one at the north pole and one at the south pole... although they might be made of the same stuff, and if we were to swap them round no-one would know the difference and it would not make any difference to anything - they would still not actually be the same rock, as each one would have a different causal chain of events that caused it to be where it is. And in the same sense, whether or not on some level we are all one, all different iterations of a single fundamental consciousness pervading the universe, this still doesn't do much to answer the age-old question of WHY and HOW we are WHO WE ARE AT THIS MOMENT in time, even if we won't always be us, haven't always been us, and even if, had we found ourselves to be someone else, it wouldn't make any difference to anything else in the universe... or would it? :unsure:


JessFR said:
For that matter, there's no clear reason why whatever it is that makes you consciously aware need be experiencing existence from any perspective. Presumably the whole world could be the same as it is now, with all the same people doing the same things, but with none of them truly conscious. None of them having a soul, so to speak.
I agree with the first part of this sentence, of course - but personally I'm not sure it would make sense for the world, or indeed, life, to be alive but unconscious. If we look at a brain at its most basic level as something that records past events in order to predict future events, then it seems this inherently introduces a layer of "self-awareness" that maybe unavoidably manifests as an experience of some sort... Language, of course, is a relatively recent evolution which adds a layer of recursive and fascinating complexity by allowing self-analysis of an especially rigorous and specific nature, by allowing us to assign words to things that seem to exist only within our internal experience of self... but we can still look at this as an especially high-level abstraction of the kind of far more basic survival-oriented self-analysis that occurs in every life form down to single-celled organisms...

The bacterium thinks without words, "I sense food, therefore I drift this way!" ... which, in fact, is probably a sentiment echoed all the way up the chain with ever-increasing complexity: to larger animals who think, or perhaps just feel, "I am hungry, therefore I must hunt!", to the human whose internal monologue has become so convoluted that we aren't even sure what our goals are anymore and can endlessly discuss the nature of what exactly we are and what it is to feel anything: "I feel hungry, but I am not hunger itself. What am I? Do "I" exist? What is food, really?" ... Would it make sense for zombie-humans with no inner world to be having these same simulated discussions endlessly?

I'm not sure it would... in which case the chain of wordless experience can maybe be extended to pre-biological things - the electron thinks/feels/experiences without words, "I see that proton and I bind to it" ... although the word "I" here is maybe somewhat dubious, to say the least...

Of course, even if it's not possible to have soulless zombie humans, this still doesn't explain anything about the mystery of experience - specifically WHY experience is like what it is, and the nature of qualia... and obviously, even if the lines between different types of self-awareness are arbitrary on some level, some lines still need to be drawn if we are ever to have any hope of reaching a greater understanding of ourselves and the nature of reality - because "experience" seems to be a vast and varied space which is ever changing... or at least, it feels that way... It's a fascinating mystery for sure...
 

Hannah Capps

Bluelighter
Joined
Jan 29, 2006
Messages
405
If this were possible I'd have to say no: it wouldn't be myself or my thoughts representing who I was. And secondly, I wouldn't want to live eternally in any form on this plane; it's just too tainted with muck.
 

Captain.Heroin

Sr. Moderator: H&R, Words
Staff member
Joined
Nov 3, 2008
Messages
78,940
Location
Dead
how do people know memories have no physical basis in the cns?
I didn't say that. I said they don't take up physical space, i.e. in a traditional binary-computer-model way.

It's more about configurations. Not literal 0s and 1s taking up a "void", so to speak.
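The "configurations, not literal 0s and 1s" idea can be sketched with a toy Hopfield-style associative memory, a standard textbook model in which a stored pattern lives at no particular address but is distributed across the whole configuration of pairwise connection weights. Everything below (function names, pattern sizes, values) is illustrative, not a claim about how the brain actually stores memories.

```python
def train(patterns):
    """Hebbian learning: weight[i][j] accumulates p[i] * p[j] (no self-links)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, probe, steps=10):
    """Repeatedly update every unit from its weighted inputs until it settles."""
    s = list(probe)
    n = len(s)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

pattern = [1, -1, 1, -1, 1, -1, 1, -1]   # the "memory", as +/-1 values
w = train([pattern])                      # the memory now lives in the weights
noisy = list(pattern)
noisy[6] = -1                             # corrupt one element of the probe
restored = recall(w, noisy)               # the network settles back to the pattern
```

Corrupting one element and letting the network settle recovers the original pattern, even though no single weight "contains" the memory; it is the overall configuration that does.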
 

jabaqb

Greenlighter
Joined
May 15, 2016
Messages
6
Location
philippines, samar island
Granting, for the sake of the infinite possibilities, that the mind lives forever on the internet, so be it. My mind, for me, is dispensable; I can readily give it up, but not my consciousness. I am my consciousness; it is me. Would I upload my consciousness to the internet? I need not to; maybe the internet uploads itself onto my consciousness. The way consciousness conducts its existence is mystical in nature, unlike the brain that generates the mind, which is easy to manipulate. But if ever I may upload my brain to the internet, that may be my own design to get rid of my sinful mind.
 

JessFR

Moderator: CEPS
Staff member
Joined
Oct 22, 2012
Messages
6,317
I wonder what it would be like to meet digital moi. Would we be friends? What if I didn’t like me? Therapeutic possibilities (of a non-traumatic nature 😝)? It would definitely be worth doing, imo, although I’d definitely do it privately.
I think I'd hate digital me. She's such a smug jackass always acting like she knows everything about everything. And don't even get me started about how she randomly capitalizes words... I DON'T think it works very well. I DO think it's stupid. And then there's the way she constantly reuses phrases like... "honestly, I think" or "if you ask me", sometimes several times within a couple paragraphs.

Yeah... No, honestly, if you ask me, digital me is ANNOYING. :D
 

jabaqb

Greenlighter
Joined
May 15, 2016
Messages
6
Location
philippines, samar island
The "Soul" or "Spirit" simply wouldn't be there.
The "data" may well be in a digital form but there is nothing in it that truly makes us human.

I've said it now so many times on BL & I'll say it again: you can ALWAYS tell the people that have smoked DMT from those who haven't.
That "death" is an illusion as we think of it, "God" is real & he/she/it is NOT the nice & loving being Christians think it is. "God" can see everything we do & "it" has many entities that are around us all the time & can crush us faster than a blink of the human eye. The human ego is given to us by free choice & it is up to you how you use it, BUT if you think you are better than anyone else on Earth, at the moment your body & "consciousness" as you know it "dies" you are in for one hell of a HUGE fucking shock.

Be kind to animals and to people worse off than you; don't go around like you are something, as you may get away with it for decades, but what we call "death" isn't the end. It is the next stage, & that place is way beyond anything we understand in our normal day, & if you are a prick on Earth as a human, something is waiting to show you where you went wrong & the "punishment" waiting for you will be HORRIFIC!!!!!

@TripSitterNZ knows what I'm saying I feel.
On the bird in the cage: the best bird must be the bird that thinks of (since the subject bird can think) curing the seemingly life-threatening flying ailment.
 

jabaqb

Greenlighter
Joined
May 15, 2016
Messages
6
Location
philippines, samar island
Would you upload your brain so the virtual you could “live” forever?

Remarks 👹: in my insight, the computer-generated brain/mind is an impostor, a fraud and an aper. By the way, granting it's valid, would you practice the same spiritual conduct?
 

JessFR

Moderator: CEPS
Staff member
Joined
Oct 22, 2012
Messages
6,317
Captain.Heroin said:
I didn't say that. I said they don't take up physical space, i.e. in a traditional binary-computer-model way.

It's more about configurations. Not literal 0s and 1s taking up a "void", so to speak.
While the brain doesn't work exactly like a modern digital computer, it does work in the same fundamental way, which is to say, it is built out of electrically controlled switches.

I don't think simulating the brain is as much a question of having a computer that works exactly like it, but one that can accurately map what it does.

And you can simulate the behavior of a neuron in software, even if that software isn't running on the same kind of hardware our minds run on.

I would presume that if you could simulate our entire neural net in software and connect it to simulated senses, there's no reason it wouldn't behave exactly like the person it came from.

Modern computers aren't nearly fast enough to simulate a full human brain, and we can't realistically make a completely accurate map of a human brain in a particular slice of time. But in theory I see no problems.
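As a sketch of the claim above that a neuron's behavior can be simulated in software regardless of the underlying hardware, here is a toy leaky integrate-and-fire neuron, a standard simplified model from computational neuroscience. The function name and all parameter values are illustrative defaults for the sketch, not measurements of real neurons.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=-65.0,
                 v_reset=-70.0, v_threshold=-50.0, r_m=10.0):
    """Toy leaky integrate-and-fire neuron.

    input_current: one input value per time step (arbitrary units).
    Returns (membrane voltage trace, time steps at which the neuron spiked).
    """
    v = v_rest
    voltages, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Voltage leaks back toward rest while integrating the input current.
        v += (-(v - v_rest) + r_m * i_in) * (dt / tau)
        if v >= v_threshold:   # threshold crossed: record a spike...
            spikes.append(t)
            v = v_reset        # ...and reset the membrane voltage
        voltages.append(v)
    return voltages, spikes

# A constant suprathreshold input makes the simulated neuron fire repeatedly.
voltages, spikes = simulate_lif([2.0] * 200)
```

The point isn't biological fidelity; it's that the input/output behavior is captured entirely in software, with no dependence on what hardware runs the loop.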
 

Hylight

Bluelighter
Joined
Jan 4, 2019
Messages
1,817
JessFR said:
. . . consciousness is separate from much of the rest of the mind. I don't know how much.


Consciousness might be the short-term awareness and the mind might be the long-term awareness.

RAM and ROM

😁
just don't let the mind be blind.
try to stay kind.
 