Is a machine with a human brain still human?


The situation is this: in the future, or some hypothetical universe, every part of the human brain can be simulated perfectly by a computer. It is easy to do this. Let's say it costs a thousand dollars (you can choose any arbitrary cost, or make it free if you want). Any human can upload their brain onto a computer, and it can be copied onto as many computers as you like. As for what sort of computer it is, I'll leave that up to you. It could be capable of moving around and interacting, or just an uploaded consciousness.

EDIT: The machine would be capable of making new connections and new thoughts in exactly the same way that a normal brain would. (I said it was hypothetical.)

My first thought was: well, it's a being deserving of human rights by my own definition (it has consciousness and intelligence equivalent to a human's).

But what about an exact copy of me? Does my uniqueness give me any additional rights? Am I the 'parent' of this computer? Is it my property, or do I not own it the second it finishes uploading my brain? Should it be able to vote? If I were rich I could create an army of clones to decide the election for me. Does it become a citizen? Does it receive benefits? Does it pay taxes? Think of your own problem, kids!

So...discuss!

P.S. I got the idea for this thread from another one about abortion; hopefully this thread doesn't go anywhere near that subject, as it's supposed to be a philosophical problem. But I feel the subjects are linked somewhat, so maybe it's not totally pointless to talk about this.

Thread about this subject:

http://www.escapistmagazine.com/forums/read/528.351391-Silicon-racism

Thread about this future:

http://www.escapistmagazine.com/forums/read/528.340737-Why-youll-be-immortal-if-youre-still-alive-in-2040

My thoughts on the subject:

Substrate is morally irrelevant, assuming it doesn't affect functionality or consciousness. It doesn't matter, from a moral point of view, whether somebody runs on silicon or biological neurons (just as it doesn't matter whether you have dark or pale skin). On the same grounds that we reject racism and speciesism, we should also reject carbon-chauvinism, or bioism.
-NICK BOSTROM, "ETHICS FOR INTELLIGENT MACHINES: A PROPOSAL," 2001

BILL GATES: That's going to be silicon intelligence, not biological intelligence.
RAY KURZWEIL: Well, yes, we're going to transcend biological intelligence. We'll merge with it first, but ultimately the nonbiological portion of our intelligence will predominate. By the way, it's not likely to be silicon, but something like carbon nanotubes.
BILL: Yes, I understand - I'm just referring to that as silicon intelligence since people understand what that means. But I don't think that's going to be conscious in the human sense.
RAY: Why not? If we emulate in as detailed a manner as necessary everything going on in the human brain and body and instantiate these processes in another substrate, and then of course expand it greatly, why wouldn't it be conscious?
BILL: Oh, it will be conscious. I just think it will be a different type of consciousness.
RAY: Maybe this is the 1 percent we disagree on. Why would it be different?
BILL: Because computers can merge together instantly. Ten computers - or one million computers - can become one faster, bigger computer. As humans, we can't do that. We each have a distinct individuality that cannot be bridged.
RAY: That's just a limitation of biological intelligence. The unbridgeable distinctness of biological intelligence is not a plus. "Silicon" intelligence can have it both ways. Computers don't have to pool their intelligence and resources. They can remain "individuals" if they wish. Silicon intelligence can even have it both ways by merging and retaining individuality at the same time.

Once you have created that fully sapient being, it will within seconds start to drift from what you were. It is now its own sapient being, deserving of the rights of a sapient being. Assuming it is truly recreated as sapient (mind-numbingly advanced technology that allows for unique personalities and human-esque thoughts required), then it should be treated as an equal, with the same rights. Now if the fake is not actually capable of sapient thought... then we get into one huge greyish area that I do not want to touch. Too grey.

Hmm, that depends on how you define "human", though for most definitions the answer is no. A human is the total package, possibly including a soul (depending on whether or not you believe in a soul, which is irrelevant to this conversation). A machine with a human brain is still a machine with a human brain.

Incidentally, why are we starting the civil rights for machines discussion this early? I thought the whole point of machines was to replace illegal immigrants and do the jobs that nobody really wants to do? (half-joke)

Depends on how the machine works.
Part of what makes us human is our neurons constantly forming new connections and reshaping our brains, i.e. creating memories and having these memories affect what kind of person we are afterwards.

The question is whether the computer also acts like this. If it doesn't, it is basically a still image of the uploader's mind at the moment of uploading, and I doubt it would be able to function like a normal human. If it does, it will quickly turn into a separate person, as it has its own experiences.

Is the point here that the machine functions in every way like a human brain?
Would the scenario remain unchanged if I just uploaded my mind to a synthetic brain?
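The "still image versus evolving copy" distinction here can be sketched as a toy program (purely illustrative; the `Mind` class and its state are invented for the sketch, under the obviously huge assumption that a mind could be reduced to copyable state at all):

```python
import copy

# Toy stand-in for an uploaded mind: nothing but copyable state,
# plus an update rule that rewires that state on new experience.
class Mind:
    def __init__(self, memories):
        self.memories = list(memories)

    def experience(self, event):
        # New input reshapes the state, loosely like neurons
        # forming new connections.
        self.memories.append(event)

original = Mind(["childhood", "first job"])
upload = copy.deepcopy(original)  # the copy at the moment of uploading

# Identical at the instant of upload...
assert original.memories == upload.memories

# ...but if the copy can also update itself, divergent sensory
# input quickly makes the two into different "persons".
original.experience("rainy Tuesday")
upload.experience("server room hum")
assert original.memories != upload.memories
```

If the upload lacked the `experience` method, it really would just be a frozen snapshot; the moment it has one, the two minds diverge with their first differing input.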

brandon237:
Now if the fake is not actually capable of sapient thought... then we get into one huge grayish area that I do not want to touch. Too grey.

Almost like...
Grey matter!

image
Thank you, thank you. I'll be here all night.

Jonluw:
Depends on how the machine works.
Part of what makes us human is our neurons constantly forming new connections and reshaping our brains, i.e. creating memories and having these memories affect what kind of person we are afterwards.

The question is whether the computer also acts like this. If it doesn't, it is basically a still image of the uploader's mind at the moment of uploading, and I doubt it would be able to function like a normal human. If it does, it will quickly turn into a separate person, as it has its own experiences.

Is the point here that the machine functions in every way like a human brain?
Would the scenario remain unchanged if I just uploaded my mind to a synthetic brain?

Why is everyone saying everything so much better than I am these days? >.<

And as for your joke, I laughed :D

brandon237:
And as for your joke, I laughed :D

Did you notice I actually managed to attach a link to the picture itself?
I'm pretty satisfied that it worked.

Jonluw:
Depends on how the machine works.
Part of what makes us human is our neurons constantly forming new connections and reshaping our brains, i.e. creating memories and having these memories affect what kind of person we are afterwards.

The question is whether the computer also acts like this. If it doesn't, it is basically a still image of the uploader's mind at the moment of uploading, and I doubt it would be able to function like a normal human. If it does, it will quickly turn into a separate person, as it has its own experiences.

Is the point here that the machine functions in every way like a human brain?
Would the scenario remain unchanged if I just uploaded my mind to a synthetic brain?

Added this to the OP:
The machine would be capable of making new connections and new thoughts in exactly the same way that a normal brain would. (I said it was hypothetical.)

Jonluw:

brandon237:

And as for your joke, I laughed :D

Did you notice I actually managed to attach a link to the picture itself?
I'm pretty satisfied that it worked.

Oh Come On! I feel like you people are omniscient today :P
Okay, you win, very well played my good sir :D

asacatman:

Jonluw:
Depends on how the machine works.
Part of what makes us human is our neurons constantly forming new connections and reshaping our brains, i.e. creating memories and having these memories affect what kind of person we are afterwards.

The question is whether the computer also acts like this. If it doesn't, it is basically a still image of the uploader's mind at the moment of uploading, and I doubt it would be able to function like a normal human. If it does, it will quickly turn into a separate person, as it has its own experiences.

Is the point here that the machine functions in every way like a human brain?
Would the scenario remain unchanged if I just uploaded my mind to a synthetic brain?

Added this to the OP:
The machine would be capable of making new connections and new thoughts in exactly the same way that a normal brain would. (I said it was hypothetical.)

So it's basically the same as uploading your mind to an artificially grown brain, except it's not organic.

In that case, my answer is yes. To me, that's a human.
I like to imagine a scenario like this:
Robotic prosthetics have come far enough to be superior to our original human bodies. Deus ex style.
I, being me, would naturally start ripping off body parts and replacing them with harder, better, faster, 20% cooler Jonluw spare parts™.
At some point, my entire body, save for my brain, is robotic. To me, I am clearly still myself (of course, to a Christian with a "body and soul is one" view, I would no longer be a complete human).
Now, my brain is getting old and tired, so I decide to make a new one and transfer the data that composes my current mind onto that brain (or computer), and attach it to my "body".
The old brain is at its breaking point and dies shortly afterwards.
In my opinion, I am still a human.
That is: I am not fully human. I am not a human in the flesh. However, I should still have the same rights and be treated as the same human as before.
To me, the body is nought but a vessel to transport the part of us that we treasure: our mind.
i.e. I may not be completely human, but the important bits of my humanity are still left.

brandon237:

Jonluw:

brandon237:

And as for your joke, I laughed :D

Did you notice I actually managed to attach a link to the picture itself?
I'm pretty satisfied that it worked.

Oh Come On! I feel like you people are omniscient today :P
Okay, you win, very well played my good sir :D

image

Jonluw is pleased. Sadly, I cannot just post that image, for fear of low-content wrath.
OT: I guess the point is that "I"/"me" only exist as electrical signals being transferred between neurons, and if I can copy those electrical signals... why not? For science!

If the simulation is absolutely perfect & identical to a human, then it can be considered sentient, and equivalent to a human. I'm not sure if we should actually call any such mind a human though, purely because the substrate differs. If I had the choice between these two systems, I'd choose the simulation...far more options for mental exercise.

No, but it has personhood, and hence deserves the same rights and obligations as a human would get.

Its creator doesn't own it any more than he does his kids. Once it's sapient, it's not property.

As for "uniqueness", that's found in any dog as well, and warrants nothing in terms of legal rights. Anyway, if you managed to transfer a copy of your mind to a robot, it'd only be an exact mental replica in the very second the transfer was complete, before the branching perspectives and experiences would once again make it two unique - if to begin with extremely similar - minds with different sensory inputs.

Jonluw:
snip.

To be honest, what I was getting at was what problems that definition of humanity (or human-rights-givingity) would cause; like the one I mentioned in the OP, you could copy yourself.

It seems obvious to me that a machine like this would have human rights, at first. But after thinking about it there are complications.

There should be rules about the uniqueness of brains. But what if you did copy yourself, and after two months, or six months, or whatever, the copy was discovered? Should it be deleted? It is now a legitimately different being (if it acts like a normal brain, which was in the definition of this machine).

I have no answer, really.

I was hoping someone in this thread would come up with one...

asacatman:
To be honest, what I was getting at was what problems that definition of humanity (or human-rights-givingity) would cause; like the one I mentioned in the OP, you could copy yourself.

It seems obvious to me that a machine like this would have human rights, at first. But after thinking about it there are complications.

There should be rules about the uniqueness of brains. But what if you did copy yourself, and after two months, or six months, or whatever, the copy was discovered? Should it be deleted? It is now a legitimately different being (if it acts like a normal brain, which was in the definition of this machine).

I have no answer, really.

I was hoping someone in this thread would come up with one...

I see no reason an entity should be considered less valuable because it's identical to another.
That would be a little insulting to identical twins, don't you think?

Now, here's a question I want to run by you folks.

Why does "Sentient and sapient" have to imply "human" at all?

Are we really that self-centered?

Jonluw:

asacatman:
To be honest, what I was getting at was what problems that definition of humanity (or human-rights-givingity) would cause; like the one I mentioned in the OP, you could copy yourself.

It seems obvious to me that a machine like this would have human rights, at first. But after thinking about it there are complications.

There should be rules about the uniqueness of brains. But what if you did copy yourself, and after two months, or six months, or whatever, the copy was discovered? Should it be deleted? It is now a legitimately different being (if it acts like a normal brain, which was in the definition of this machine).

I have no answer, really.

I was hoping someone in this thread would come up with one...

I see no reason an entity should be considered less valuable because it's identical to another.
That would be a little insulting to identical twins, don't you think?

Well, what if someone made an army of clones? And they attacked you? It would be an attack of the clones! There might even be some kind of war, and by this point star travel might be possible, so it would be a...ok I'll stop.

I actually have a point to make here... What if someone did make a load of clones, and every clone is essentially the same person, since they have forty or fifty years of development already ticked off for them? I think it's safe to say they would all vote the same, and could recognize one another as the same person enough to form a sort of gang. And this is not just two or three people either; you could get millions of them with enough time and money.

Oh, and about the twins thing: they develop differently from the moment they're born, and have no prior experience, so they are much more different than the copied computers I'm talking about.

Imperator_DK:
As for "uniqueness", that's found in any dog as well, and warrants nothing in terms of legal rights. Anyway, if you managed to transfer a copy of your mind to a robot, it'd only be an exact mental replica in the very second the transfer was complete, before the branching perspectives and experiences would once again make it two unique - if to begin with extremely similar - minds with different sensory inputs.

The other bit of my post works as a response to this too. How do you reconcile the problems giving rights to replicas would cause?

Depends on how you want to define human. Classifying living things is a great example of trying to apply simple, orderly, somewhat black-and-white, and arbitrary bounds on a chaotic and gradational subject. In fact, the definition of life and nonlife is still under debate, and this is a question that is likely at least as old as writing. At some point you just have to stand back and say "OK, knowing what to call things is convenient, but ultimately the pursuit of a complete classification system in biology is futile."

While a constructed human wouldn't have had the same origin as other humans, functionally there'd be no difference, so yes, for the purposes of rights and legal definitions, they are human. Not that it'll matter to most people. Remember, there are still people who think that the amount of melanin in your skin determines whether or not you are human, so you can bet that artificial humans will not be considered human by most people.

Also, like Seekster mentioned, some people believe souls exist, which would be an even more convenient justification for discrimination, because no matter how much we learn about consciousness, people will always be able to simply claim that souls are a thing and these "abominations" don't have them, and there will never be any way to convince them otherwise. While I would call that outlook itself soulless and inhumane, there are lots of people, including otherwise intelligent ones, who believe in souls without any reason whatsoever.

I very much expect artificial humans, if they are ever made, to share the experiences of the AIs in The Second Renaissance. I don't expect the Matrix or Judgement Day, but I can imagine revolts, riots, and ethnic (biotechnic?) cleansing, perhaps some Apartheid thrown in here and there, before the more shortsighted people withdraw to their insular communities that are joked about by the rest of the world while everyone else just gets on with life in a new culture that includes both organic and artificial humans. Perhaps there'll even be techsploitation artificial humans too.

asacatman:
The situation is this: in the future, or some hypothetical universe, every part of the human brain can be simulated perfectly by a computer. It is easy to do this. Let's say it costs a thousand dollars (you can choose any arbitrary cost, or make it free if you want). Any human can upload their brain onto a computer, and it can be copied onto as many computers as you like. As for what sort of computer it is, I'll leave that up to you. It could be capable of moving around and interacting, or just an uploaded consciousness.

EDIT: The machine would be capable of making new connections and new thoughts in exactly the same way that a normal brain would. (I said it was hypothetical.)

My first thought was, well, it's a being deserving of human rights by my own definition (it has a consciousness and intelligence equivalent to a human.)

But what about an exact copy of me? Does my uniqueness give me any additional rights? Am I the 'parent' of this computer? Is it my property, or do I not own it the second it finishes uploading my brain? Should it be able to vote? If I were rich I could create an army of clones to decide the election for me. Does it become a citizen? Does it receive benefits? Does it pay taxes? Think of your own problem, kids!

So...discuss!

P.S. I got the idea for this thread from another one about abortion; hopefully this thread doesn't go anywhere near that subject, as it's supposed to be a philosophical problem. But I feel the subjects are linked somewhat, so maybe it's not totally pointless to talk about this.

Well, even if it's not human, it's still a sapient person, in need of rights and dignity and what not. Which means that it's not your property anymore.

I think it'd be treated as any other kid or person, depending on its level of development.

The idea that 'the rich will clone themselves for elections' is silly. That's like saying that people have children just to win elections.

To be honest, though, I'd say if it was a human brain, it's... still human. For the same reason that if you lost your leg and got it replaced by a machine, it's not like you're... not human anymore. And if it's a brain that's not human, but built after one... then it's a synthetic life form that still deserves respect and blah blah.

asacatman:
I think it's safe to say they would all vote the same, and could recognize one another as the same person enough to form a sort of gang. And this is not just two or three people either; you could get millions of them with enough time and money.

We already have something like that. It's called organized religion.
image

In all seriousness though. I don't see the problem with the clone army thing. Yes, the technology can be abused, but that doesn't make the products it produces less human.

Sure, it's weird to think of a heap of people who are all practically the same person, but to me that doesn't make them anything else than just that: extremely similar people.

Oh, and about the twins thing: they develop differently from the moment they're born, and have no prior experience, so they are much more different than the copied computers I'm talking about.

Two twins of age are indeed quite different. Two newborns (or newly copied minds) aren't, though. They're basically the same individual.
That doesn't mean that a newborn twin is worth less than any other human.

Jonluw:

Sure, it's weird to think of a heap of people who are all practically the same person, but to me that doesn't make them anything else than just that: extremely similar people.

So you just accept it. That's an interesting point. I still think it could mess up society, but...yeah, you could be right.

Jonluw:
Two twins of age are indeed quite different. Two newborns (or newly copied minds) aren't, though. They're basically the same individual.
That doesn't mean that a newborn twin is worth less than any other human.

Well, my argument that the copies would be bad is (or should be; if it wasn't before, it is now) based around the fact that having a lot of them would cause problems, not anything morally wrong with the act of copying or with the copies themselves; so the few identical twins that are around are as valuable as any other human under my moral system.

Damien Granz:
snip.

Having children isn't really the same as being able to copy yourself.
1st: They take 18 years and 9 months before they get to vote, and during most of that time you have to care for them somehow.
2nd: There's no guarantee they'll vote the same as you. (I know there's not with copying either, but it's more of a guarantee.)
3rd: You can't make a lot of them without finding plenty of women willing to carry your baby, which, if you've just done the same thing to fifty other women, they might not want to do.

By the time this becomes possible, whether or not something is "human" becomes largely irrelevant, especially since I suspect that genetic modifications will become fairly common before this can occur. An uploaded brain is not human, the same holds true for a human that has swapped out all original body parts for a mechanical upgrade. In both cases however, they are still perfectly deserving of the same or at least similar protected rights.

In the end, sentience, general capability, and personality traits are the only means by which we as a community can judge an individual. A definition that I would choose to support would be that any sophont, regardless of heritage or basic nature, that is reasonably sane and not otherwise a direct threat to others, shares at least a core set of basic rights with all others in that category.

Short answer:
No, human consciousness shouldn't be thought of as working that way, and it would be silly to act as though it did. So I don't think your scenario is remotely possible in the first place, and I don't care to speculate about impossibilities. What you'd own is a computer, and that's all.

Marginally longer answer:
Humans are beings that arise organically and depend largely upon language and their ability to distinguish themselves from others, and humans from non-humans in general. Our intelligence is not purely cognitive (something based on logic, which a computer could be made to mimic), but is tied up in all our actions, the whole of our existential situation. Any computer will be unable to mimic that - at best you could build a network of computers that would be able to "learn" and converse with each other, but would have no means of meaningfully communicating with us.

All your other questions come down to laws in a social system, which mean as much for defining something philosophically as the law's arbitrary time mark for when a fetus becomes a human, or how scientists randomly announced the other day that dolphins were intelligent enough to be treated like humans.

(P.S. If we still have the concept of human rights by that time, we really won't have advanced all that much - it's already an outmoded conception for a moral/ethical system.)

Isn't that kind of what a cyborg is?

In a biological sense, no. Psychologically? Ummm... well, seeing as it is hypothetical, maybe it could be. But personally, if something like that ever actually happened, I would see it as nothing more than a perverse abomination of frankenscience. BURN IT BURN IT BURN IT.

Humans ARE machines.

We have every basic kind of lever in our muscles/bones/joints, our digestive system is a chemical stripping facility, and our nervous system is electrical wiring connected to the central computer, the brain. By definition, we already are a machine with a human brain.

Just because you replace the heart with a pump, the muscles with electrical hydraulics, and the bones with titanium doesn't change the fact that it is human.

Vegosiux:
Now, here's a question I want to run by you folks.

Why does "Sentient and sapient" have to imply "human" at all?

Are we really that self-centered?

My thoughts exactly.

---

Anyways, I'm agreeing with Bill Gates on this one. It wouldn't be biological intelligence; therefore, it wouldn't be considered human or be deserving of human rights.

It would be us playing God; not that I'm suggesting, if we had the capabilities, we shouldn't do it. What I'm saying is, it wouldn't be wise to integrate them into our society like that. They'd be given the shit-kicker jobs and made into soldiers, perhaps even teachers, doctors and extraterrestrial miners/explorers. It's not like we'd begin to date them, unless of course you're really desperate. They certainly wouldn't be able to reproduce like we do, unless we also gave them fully artificial biological bodies. In which case, DOOOOOOOOOOOOOM!

You'd be waiting 200-500+ years for any of this to happen anyway, in my opinion. You're talking about some seriously advanced technology. You're talking about recreating something that took however many thousands of years (EDIT: Did I say thousands? Oops). Would it even be theoretically possible? Well, I'd like to think most things are possible.

Not G. Ivingname:
Humans ARE machines.

We have every basic kind of lever in our muscles/bones/joints, our digestive system is a chemical stripping facility, and our nervous system is electrical wiring connected to the central computer, the brain. By definition, we already are a machine with a human brain.

Just because you replace the heart with a pump, the muscles with electrical hydraulics, and the bones with titanium doesn't change the fact that it is human.

This. If it has a brain capable of conscious decisions, it should be regarded as equal to a living being. If it can make moral decisions and think in abstract concepts, it's equal to a human. If it's a human in an artificial body, it's still a human.

If it has 22 inch spinners and an over sized bass speaker however, it's less than human :P

It won't be truly human until it understands Shakespeare emotionally; until it picks its own nose; and until it masturbates to internet porn.

Otherwise, its just an abstraction.

Regards

Nightspore

asacatman:
So...discuss!

Personhood is not a fixed thing, what is life and non-life is a very problematic distinction to make. The best solution is a pragmatic one; does it talk, walk and get its freak on like a human? Then it's probably a human.

This is just a matter of switching out hardware; yes, performance and personality change, but it's still a person. It would be great to finally get some finality in the whole "soul" debate, by the way.

All this said, I don't think we're going to be seeing any of this in our lifetimes, if at all.

Cheers.

Looks like I'm basically alone in this thread in thinking that humans are not just collections of material parts/machines. Sad.

Seekster:
Hmm that depends on how you define "human" though for most definitions the answer is no. A human is the total package possibly including a soul (depending on whether or not you believe in a soul which is irrelevant to this conversation). A machine with a human brain is still a machine with a human brain.

Actually, everything that is you is in your brain: your thoughts, your personality, your emotions, your memories; if you had a soul, it would be here. Your physical flesh, blood and body make you a Homo sapiens, a highly evolved member of the primate family. What makes you human as we philosophically and ethically understand it lies within your brain; a human consciousness in a machine would still be capable of experiencing emotions and conscious thought. They are still a sentient being; they are still everything they were in terms of 'who they are'. A human brain in a machine is still human.

Let's say your brain was downloaded into a robot, and you were then forced to become a second-class citizen, a slave effectively. You would still have your free will, your memories, your emotions, your ideas. Would you accept this new life just because you didn't have a soft squidgy body anymore? No, of course not; you would be greatly unhappy with your new life, sad and angry, and have dreams of regaining your old life. Sounds pretty human to me. The human body is actually little more than a machine anyway.

Rage19:

Seekster:
Hmm that depends on how you define "human" though for most definitions the answer is no. A human is the total package possibly including a soul (depending on whether or not you believe in a soul which is irrelevant to this conversation). A machine with a human brain is still a machine with a human brain.

Actually, everything that is you is in your brain: your thoughts, your personality, your emotions, your memories; if you had a soul, it would be here. Your physical flesh, blood and body make you a Homo sapiens, a highly evolved member of the primate family. What makes you human as we philosophically and ethically understand it lies within your brain; a human consciousness in a machine would still be capable of experiencing emotions and conscious thought. They are still a sentient being; they are still everything they were in terms of 'who they are'. A human brain in a machine is still human.

Let's say your brain was downloaded into a robot, and you were then forced to become a second-class citizen, a slave effectively. You would still have your free will, your memories, your emotions, your ideas. Would you accept this new life just because you didn't have a soft squidgy body anymore? No, of course not; you would be greatly unhappy with your new life, sad and angry, and have dreams of regaining your old life. Sounds pretty human to me. The human body is actually little more than a machine anyway.

Hmm, that's a good point in the last paragraph. I suppose if you are voluntarily having your brain moved into a robot... yes, it would still be you, just in a new body. I considered whether the same thing would apply to involuntarily moving a human brain into a robot body, for the sake of argument, but I can't think of any reason why it would be different. Besides, by the time we have the technology to do that, I would assume we could develop a CPU that works faster and stores more information than a human brain, so the only reason to move a human brain into a robot would be if the individual volunteers for it.

Honestly, I don't think I would agree to do that, but it's unlikely I will live long enough to even have the option.

If it's an intelligent being then it should have all the rights, etc. given to human beings, but I don't think it's human as such. I'd tend to view it as another species: different but equal.
