
Why extropianism is unscientific

24 Comments

  • I'm a transhumanist in the "cyborgs are awesome" sense. I do totally agree with you on the brain-uploading thing and the singularity thing, by the way.


    Sure, that is in no way what you're talking about, but it was your choice to start using labels.


    This thread should instead be: "Why brain-uploading is unscientific."

  • if u do convins fashist akwaint hiz faec w pavment neway jus 2 b sur

    ^^ Good article. Thanks for sharing.


    ^ I renamed it to something more appropriate.

  • Internet debates about extropianism -- the only place you'll ever hear the term "extropianism".

  • if u do convins fashist akwaint hiz faec w pavment neway jus 2 b sur

    ^ Which is a good thing, as we know that they are probably never going to be anything more than an annoying fringe group.

  • Worth noting is that Vernor Vinge COINED the term "singularity", still considers it merely a thought experiment, and even he's weirded out by some of the thinking the topic has spawned.

  • "you duck spawn, refined creature, you try to be cynical, yokel, but all that comes out of it is that you're a dunce!!!!! you duck plug!"

    So I see I can agree with Vinge here. I mean, all these Internet nerds who buy that, they seem to believe it works like this:


    1) invent an AI


    2) ???


    3) I BE GOD


    Like, as if the existence of a computer you can have a conversation with means the laws of physics are suddenly reduced to, like, optional footnotes.

  • yea i make potions if ya know what i mean

    Only transhumanist I'm familiar with is the guy who writes Dresden Codak. So uh, no comments here other than why did you write the OP as if it were addressed to everyone reading it?

  • if u do convins fashist akwaint hiz faec w pavment neway jus 2 b sur

    Are you talking about this:



    You will often see many transhumanists try to defend their beliefs by claiming that your opposition to transhumanism is unscientific or grounded in religion. Well, you'd be surprised to see that they live under an illusion: their views are anything but scientific.



    Or this?



    It is impossible. Face it.



    Regardless, I sometimes write parts of essays/posts like this one as if I'm talking to some indefinite person, because the alternative would look awkward.

  • yea i make potions if ya know what i mean

    Either or. I dunno, it just struck me as odd.


    But I have nothing to really add to this conversation, so y'all can carry on. I'll go back to watching my animoo.

  • if u do convins fashist akwaint hiz faec w pavment neway jus 2 b sur

    Meh, it's just a stylistic thing.

  • I doubt brain uploading will ever happen but I don't think your counterargument is nearly as strong as you think it is.

    In fact, brain uploading doesn't rely on dualism at all. It relies on your mind being information stored in your brain; that is to say, it relies on MATERIALISM being true. If you had a soul, there would be no way to upload it to a physical object.

    Now, strong monism of the sort where you're not just information in your brain but literally your brain itself would indeed mean brain uploading is impossible, but that's no more supported by science than "information within the brain" is.

  • No rainbow star
    I still don't get how one could think uploading brains is impossible since tech keeps advancing



    Impossible in our lifetimes? Probably. Impossible in the long run? No way to know yet
  • if u do convins fashist akwaint hiz faec w pavment neway jus 2 b sur

    ^^ Hmm, that's a very sound argument, I'll give you that.


    But I believe that the mind is far more than simple information stored in your brain. In fact, I believe that, in a sense, we, as in our biological selves, are our minds. Our brains function dynamically, and are constantly affected by internal and external stimuli, bodily chemicals, etc. The information stored within our brain is only a part of a wider network that rules our mental processes; this may be a kinda weak analogy, but saying that uploading the information stored within our brain to an artificial body would completely upload our "selves" into it would be like copying all files from your old computer to a new one and saying that you are still using your old computer (a concrete sketch of this follows at the end of this post).


    The point is that information in the brain, independent of other factors, is passive and thus can't produce consciousness, as consciousness requires stimuli and an active understanding of the world. Some mental traits are determined by genetics - again, biology. There's much, much more.


    The reason the idea of brain uploading is dualistic is that it assumes that cognitive processes are independent of biological processes (false), which again creates the dissonance between the material and the mental.


    ^ It's as much a question of technology as it is a question of philosophy and the nature of the human mind and consciousness.
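
    To put the analogy in concrete terms, here is a rough sketch (the file and directory names are made up): copying every file onto a new machine gives you identical content, but afterwards you are plainly operating a different computer.

        import shutil
        from pathlib import Path

        old_computer = Path("old_computer")
        new_computer = Path("new_computer")
        old_computer.mkdir(exist_ok=True)
        new_computer.mkdir(exist_ok=True)

        # Something stored on the old machine.
        (old_computer / "memories.txt").write_text("everything the old machine knew")

        # The "upload": copy every file over to the new machine.
        for f in old_computer.iterdir():
            shutil.copy(f, new_computer / f.name)

        same_content = (old_computer / "memories.txt").read_text() == \
                       (new_computer / "memories.txt").read_text()
        same_machine = (old_computer / "memories.txt").samefile(new_computer / "memories.txt")

        print(same_content)  # True  -- the information carried over intact
        print(same_machine)  # False -- but it now lives on a different machine entirely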

  • OOOooooOoOoOOoo, I'm a ghoOooOooOOOost!

    ^In theory, the solution to that would be to just replicate and/or gradually replace the entire body.


    I say in theory because accomplishing that assumes the discovery of technology that runs on wishes.

  • edited 2012-12-28 03:21:35

    I don't really see how brain damage is an argument against the existence of souls. If we posit that there is some kind of eternal soul which is the basis of our mind, personality and rationality then the brain is simply a tool which translates this into a form which is usable in the physical world. It's kind of like how a television can receive a signal from somewhere, which is then the true origin of what you see on the screen. However if one just naively looks at the screen one might think that the screen is in fact the source of what is being conveyed. In the same sense one might naively think that the brain is the source of our selves, rather than simply a tool which can receive and translate the soul. If this tool breaks down, obviously the reception and decrypting of the soul breaks down and we get things like Alzheimer's disease.


     


    Not saying that this is the case (I really don't know if we have a soul or not), just that I'm not entirely convinced by your opening reductionist statements.


     


    Though it should be noted that in terms of natural science, the postulation of an immortal soul is indeed not scientific. After all it is pretty much a completely untestable hypothesis, meaning that it can't really be falsified.


  • No rainbow star

    ^ Well, it's untestable insofar as ghosts are


    Unfortunately, there's the tricky issue of the goal posts constantly moving

  • BeeBee
    edited 2012-12-28 04:11:53

    I'm willing to argue under the assumption that there isn't a soul, as transhumanist wank fuel tends to fall apart just as easily on its own terms.


    Like, best case scenario you'd create an AI that thinks it's you while the real you is still sitting around in your brainpan.  Even under assumptions of weak monism all you're doing is making a copy of information.

  • if u do convins fashist akwaint hiz faec w pavment neway jus 2 b sur

    ^In theory, the solution to that would be to just replicate and/or gradually replace the entire body.


    I say in theory because accomplishing that assumes the discovery of technology that runs on wishes.



    Though, assuming that it's possible to create a perfectly functioning artificial replica of a biological body and all its processes, it also raises the question of whether your consciousness could be preserved in a replica body. We don't yet know what consciousness is, exactly, and what processes govern it, but it's most probably a very complex phenomenon that can't be boiled down to passive memory storage.



    I don't really see how brain damage is an argument against the existence of souls. If we posit that there is some kind of eternal soul which is the basis of our mind, personality and rationality then the brain is simply a tool which translates this into a form which is usable in the physical world. It's kind of like how a television can receive a signal from somewhere, which is then the true origin of what you see on the screen. However if one just naively looks at the screen one might think that the screen is in fact the source of what is being conveyed. In the same sense one might naively think that the brain is the source of our selves, rather than simply a tool which can receive and translate the soul. If this tool breaks down, obviously the reception and decrypting of the soul breaks down and we get things like Alzheimer's disease.



    Hmmm. Still, a "soul" or a "mind" is supposed to be the sole source of our personality and mental qualities, and it's been proven that information is physically stored in the brain. If that's the case, it kinda invalidates the whole point of a transcendental soul, doesn't it?


    And even if we take the existence of a Cartesian mind as a given, can something that exists only in the transcendental, idealistic sense really be uploaded into a machine? Even computer data operates in the form of electrical signals.


  • This may be a kinda weak analogy, but saying that uploading the information stored within our brain to an artificial body would completely upload our "selves" into it would be like copying all files from your old computer to a new one and saying that you are still using your old computer.



    From the point of view of the computer, you would be. Or well, you'd be using your "old computer" with all the hardware replaced, but otherwise you would be.



    Like, best case scenario you'd create an AI that thinks it's you while the real you is still sitting around in your brainpan.  Even under assumptions of weak monism all you're doing is making a copy of information.



    No, under the assumptions of weak monism that IS you. ALL copies of the information in your brain are you. Even if there is more than one of them, they are all still you, because you are exactly the information stored in your brain and nothing more.

    Yes this is weird. But it's not inconsistent, just unintuitive, so that's not a counterargument.
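
    A rough sketch of what's being argued, with made-up data: the two objects below hold exactly the same information at the moment of copying, even though they are two separate instances, and they diverge as soon as one has an experience the other doesn't. Whether "same information" is enough to count as "you" is the actual point of disagreement.

        import copy

        # Made-up stand-in for "the information stored in your brain".
        original = {"memories": ["childhood", "yesterday"], "traits": ["curious"]}

        # An exact copy of that information.
        upload = copy.deepcopy(original)

        print(upload == original)   # True  -- identical content at the moment of copying
        print(upload is original)   # False -- nevertheless two separate instances

        # The copy then accumulates "experiences" of its own and the contents diverge.
        upload["memories"].append("waking up as an upload")
        print(upload == original)   # False -- no longer holding the same information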

  • OOOooooOoOoOOoo, I'm a ghoOooOooOOOost!
    ^You're failing to account for perspective.
  • If you must eat a phoenix, boil it, do not roast it. This only encourages their mischievous habits.

    It just considers all your thought processes, memories and such to be what makes up 'you', not your body.

  • BeeBee
    edited 2012-12-29 04:14:54

    No, under the assumptions of weak monism that IS you. ALL copies of the information in your brain are you. Even if there is more than one of them, they are all still you, because you are exactly the information stored in your brain and nothing more.



    It is a copy of you.  It is not specifically you.  Unless you are actively sharing thought processes with that copy, it's not your instance of you.  In fact, the moment it's created, it starts having experiences that you don't -- so within seconds, it's not really even a copy of you either.

  • If you must eat a phoenix, boil it, do not roast it. This only encourages their mischievous habits.

    It is a copy of you.  It is not specifically you.



    Not really. It depends on what you think 'you' are, again. You don't have to be limited to one mind, one body, etc; at least, to some people.


    To some, for all intents and purposes, a copy of you which shared all your experiences, emotions, thought patterns, etc., is you, as there are no meaningful differences. There is simply you, shared among two instances.


    Gah, my head hurts now.

  • Except as soon as it starts existing independently, it's now building its own set of experiences, emotions, and thought patterns.  At the very latest, from the moment you turn it on, it stops being you in any meaningful sense.

  • If you must eat a phoenix, boil it, do not roast it. This only encourages their mischievous habits.

    Not really. 'You' aren't just a sum of thoughts, experiences and everything.

  • BeeBee
    edited 2012-12-29 05:24:32

    Except that's the whole fucking basis of it counting as you!


    Either you are a replicatable collection of thoughts, memories, and experiences, or you aren't.  If you aren't, the copy never counted.  If you are, it stopped counting at t>0 if it even did in the first place.

  • If you must eat a phoenix, boil it, do not roast it. This only encourages their mischievous habits.

    Or, it's you in the sense that it meets the definition of "you" and has the basis of your mind.


    That is, it depends on whether someone defines "you" as something that can be measured/defined/whatever, or whether it's a concept that can be applied to any instance of you, even if that instance continues to grow outside of the first instance.


  • It is a copy of you.  It is not specifically you.  Unless you are actively sharing thought processes with that copy, it's not your instance of you.  In fact, the moment it's created, it starts having experiences that you don't -- so within seconds, it's not really even a copy of you either.




    The concept of "your instance of you" is meaningless. It's you, it continues being you. You can have multiple different "you"s at the same time having multiple different experiences; none of them is the single real "you" and so none of them are "copies".

    I realize this is terribly unintuitive, but again: weird, not inconsistent.

  • If you must eat a phoenix, boil it, do not roast it. This only encourages their mischievous habits.

    Basically, what he said.

  • edited 2012-12-29 07:37:29
    OOOooooOoOoOOoo, I'm a ghoOooOooOOOost!
    The difference between "a copy of me" and "me" is quite meaningful on the basis that if you wanted to make a precise copy of me and destroy the original, I would be dead and there would simply be another entity with a mind identical to mine. Quite different from the transfer of consciousness that would be the goal here.