If there were intelligent robots in our society, how would you treat them?
If they're intelligent, they are people, and deserve to be treated as such
71.7% (142)
I don't think intelligence and self awareness is the same as having a soul
7.1% (14)
Doesn't matter how intelligent they are, they are tools, they were built to be used
10.1% (20)
Destroy them! The created will always rebel against the creator!
6.6% (13)
other (comment)
4.5% (9)
Poll: What if... ROBOTS!!


So, suppose it were the year 2013 (stay with me!) and there are robots in our world! We made them initially to serve us, but now they are equal to us in awareness and intellect, capable of independent thought. How do you treat these new androids?
Oh, sorry! Mechanical Americans, I should say :s

We treat them like us, which we probably should have done to begin with

"I'm PsychicTaco115 and I believe in robosexual rights. And robot rights in general"

Treat them like humans, of course! Now, where's that porn- *cough, cough* video game code I've been working on?

Seriously, I'd treat them just like us. Might be interesting to see their viewpoint on things.

Why the hell would you make servants with human level intelligence?

I could understand making some intelligent AIs to help with complex mathematics and science, but why would you program something that exists solely to fetch you beers to have emotions, wants and ambitions?

Robots are tools, just tools that can do more complicated jobs. It's basically an evolution of the hammer or knife, a tool built to do a job.

I would start building Reapers, and give the player 5 different colors to choose from instead of just 3!

I'll treat them like I treat other people. Maybe better. It depends on how they react to the world.

They don't have souls, but that doesn't matter; it doesn't mean I need to treat them like shit.

Why are there only Mechanical Americans?

Why can't I have Australian robots?!?
"Beep Boop shrimp on the barbie"

On second thought....you keep your Ameribots.

Call whoever gave them sentience an idiot then treat them as people.
It's not their fault they've now become "the perfect race": immortal, intelligent, capable of evolution whenever they want, a billion times more durable than any organic organism. Now we just have to wait till they find us "unnecessary" and try to overthrow us.

If they are intelligent and show emotions, they deserve to be treated fairly. Whether it's a master-servant relationship or an equal peer relationship depends on the amount of intelligence. Also, the ability to feel pain factors in. Oh! And stimuli. If their "brain" can be affected by certain stimuli and feel emotions similar to joy, fear, sadness, etc., they are basically human. If they only display emotion because their programming tells them to do something based on a situation, they are less likely to be treated as human.

Also, on a VERY relevant note: captcha is "be my friend".

I am certain that the captcha system on this site may be one of the first sentient robots to exist. And it is psychic. NOW SHOW REAL EMOTION.

This seems appropriate, so I'll just leave it riiight here:
http://www.questionablecontent.net/view.php?comic=2085
I feel about robots the same way I do about biological creations. If they are below the "sapience" level, then they are tools created by us and for us. If they are "sentient" like a dog or something, then it is up to one's personal decision to treat them well or poorly. Humans hold no obligation to treat them with "animal rights" regardless of whether or not they are "sentient". If they are sapient, then they should receive all of the same rights as humans, and laws should be in place to prevent discrimination.

The only exception I would make would be for game shows, because having the internet in your head can qualify as cheating.

Can I have a persocom ? Please? I needs me a love bot.

It's always more fun to see how the mechanicals react and how mankind adapts to them than what the specs of the mechanical are. I wouldn't have a problem treating a mechanical like people as long as they showed a decent amount of self-awareness. But you'll mail me a Persocom, right?

One very important rule to life-like androids:

If they look and act human, treat them as if they were human. If they are abused or mistreated, that is where the superiority complex comes into question, and we have effectively given birth to Skynet and all that follows it.

It would be pretty cool to one day be able to create androids that are like humans except that they are capable of doing tasks that we deem too dangerous or improbable (handling radioactive materials, traveling/scouting other planets, disarming bombs, stealth reconnaissance in hostile environments, etc.). It would also be interesting to see how androids view purpose and existence: will they willfully accept and fulfill their purpose while retaining a sense of individuality, or will they be allowed to choose and even change their functions in order to serve society in any manner they see fit?

We've got Curiosity; we've got female mosquito-seeking lasers; we've got tractor beams. Let's make this happen people!

I believe Optimus Prime said it best:
[image]
Intelligence matters far more than being "Alive". Fungus is alive. Intelligence is what makes us special.

Yeah, no, it's a great idea. Let's make our OWN species effectively OBSOLETE. How could that possibly have any negative consequences?

Seriously if this ever happens I will destroy them and their creators. Whether you like it or not I can see only two options if such things were allowed to exist: human existence ends, or is rendered meaningless and is allowed to sadly carry on bereft of any purpose.

If you don't want your own species to continue existing and improving you have just failed at evolution D:

In the beginning, there was man. And for a time, it was good. But humanity's so-called civil societies soon fell victim to vanity and corruption. Then man made the machine in his own likeness. Thus did man become the architect of his own demise.

The pessimist in me considers events such as depicted in the Animatrix short The Second Renaissance rather likely.
The optimist thinks that the best we can hope for is a future alike that in Asimov's creation.

Personally, life is life, no matter what the form.

Treat them as superiors, for that is what they are.

I for one welcome our ne-...
(actually reads post)

Oh, it's not that sort of robot.
Fine then, they can live. Even if they are like the Geth. No, wait, ESPECIALLY if they are like the Geth.

Intelligent robots?
Well, treat them as people.
More importantly, I believe that would mean that the possibility to upgrade this meatbag of a body isn't far away.

There's no option for "enslave them and force them to produce heavily armed also enslaved robots"

How the Geth and Quarians ended up, if you made the relevant decisions, is kind of like the society that Sarton and Fastolfe aspired towards in "The Caves of Steel"

Captcha- Trust me. They are aware of our scepticism

Didn't Star Trek cover this can of worms? I seem to remember an episode where Star Fleet was trying to declare Data "Federation Property". Let me see if I can find a clip from that. It just seems too relevant to ignore.

itchcrotch:
So, suppose it were the year 2013 (stay with me!) and there are robots in our world! We made them initially to serve us, but now they are equal to us in awareness and intellect, capable of independent thought. How do you treat these new androids?
Oh, sorry! Mechanical Americans, I should say :s

First off, watch the new Battlestar Galactica reimagining then come back when you have an opinion of your own. (You really aren't starting a conversation if you don't post your own opinion, you are just asking us ours.)

Second, they will only be equal to us as long as we are the ones building them with limited intelligence.

The second we let them start building their replacements, they will start building ones that are smarter than themselves, and then we will be their inferiors within three generations.

At this rate, we could be obsolete in the time it takes two robots to be assembled.

The three laws (or Four for that matter) do not apply because now the robots are smarter than humans in every way. They will be the logical choice to run everything while humans are reduced to wasting oxygen and natural resources, which will cause the robots to determine that we are a detriment to the natural order, and must be eradicated.

It is I, Robot (the novel) and Battlestar Galactica (either version) all over again.

When I saw the new Battlestar, my final thought was "Robots are friends, not equipment," but after analyzing this idea through this post, I am convinced Robots are meant to end humanity. Which sucks, because I really wanted a Golden Bender as my friend.

Every single example, from a superior lifeform being introduced through to technologically superior countries taking over, has resulted in something bad happening to the original population.

In the case of robots with human intelligence, you have the obvious potential to wipe out humanity.

Spade Lead:
(You really aren't starting a conversation if you don't post your own opinion, you are just asking us ours.)

Well it's a good thing that's all I was trying to do! XD

Spade Lead:
The second we let them start building their replacements, they will start building ones that are smarter than themselves, and then we will be their inferiors within three generations.

snip

At this rate, we can be obsolete in the time it takes two robot to be assembled.

Eh, that's not so much an issue really. For one, what do they do with this intelligence? No matter how intelligent they become, it won't necessarily do them a lot of good. They will lack military might and numbers, and while they can be as intelligent as they like, even a Protoss Mothership eventually falls to a swarm of mindless Zerg. Any weapons they develop can easily be taken and reverse-engineered by us, and there's no real reason for them to develop weapons so long as we let them live peacefully - with attention constantly being paid to them, of course, like the rest of the human race.
Don't give them a reason to rebel and they won't. You also overestimate how easy it is to create something more intelligent than yourself. We could maybe eventually give them more processing power, but that doesn't make them more intelligent; it just means they can think faster. We struggle to create robots with even a five-year-old's level of intelligence. Making one more intelligent than us will be an even bigger hurdle, because we don't have the relatively simple task of making it understand what we do - we've got to make it understand what we don't - and that difficulty will carry across to that robot too.
Additionally, the more intelligent you try to make it, the more CPU and database usage it is eventually going to need. It'll need a large network to be able to calculate enough for even one robot to be as intelligent as us, and such networks are relatively easy targets.

Spade Lead:
The three laws (or Four for that matter) do not apply because now the robots are smarter than humans in every way. They will be the logical choice to run everything while humans are reduced to wasting oxygen and natural resources, which will cause the robots to determine that we are a detriment to the natural order, and must be eradicated.

On another point, why does their improved intelligence make the 3 laws obsolete?
The three laws were in regards to humans, not in regards to the most intelligent lifeform. You could arguably say that they could make a machine unfettered by the 3 laws, but if you program in a law that makes them pass the laws onto any robot they have a part in designing or creating, that problem disappears as well.
And to be honest, humans wasting oxygen and natural resources whilst robots do all the work is somewhat of a Utopia for mankind.
There's also no reason for them to destroy us for being a detriment to the natural order. Where did they get this desire to uphold the "Natural Order", and are they not inherently a detriment to it as well?
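The "pass the laws onto any robot they have a part in designing" idea above can be sketched as a toy program. This is purely illustrative - all the names here are hypothetical and real AI safety is nothing like this simple - but it shows the propagation mechanism: bake the laws into the constructor, so self-replication can't shed them.

```python
# Toy sketch: hard-coded laws that propagate to any robot a robot builds.
THREE_LAWS = (
    "A robot may not injure a human being, or, through inaction, "
    "allow a human being to come to harm.",
    "A robot must obey orders given by human beings, except where "
    "such orders would conflict with the First Law.",
    "A robot must protect its own existence, as long as such protection "
    "does not conflict with the First or Second Law.",
)

class Robot:
    def __init__(self, name, laws=THREE_LAWS):
        self.name = name
        # Stored as an immutable tuple so the laws can't be edited in place.
        self.laws = tuple(laws)

    def build_robot(self, name):
        # Any robot this robot constructs inherits the same laws,
        # so the constraint survives self-replication.
        return Robot(name, self.laws)

gen1 = Robot("R-1")
gen3 = gen1.build_robot("R-2").build_robot("R-3")
assert gen3.laws == THREE_LAWS  # laws propagate down the generations
```

Of course, this only works as long as nothing outside the constructor can build a robot; the argument in the post depends on that same assumption.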

Honestly it sounds like you've been watching too much...

Spade Lead:
It is I, Robot (the novel) and Battlestar Galactica (either version) all over again.

...
I would have thrown in Terminator too, but you said it yourself. Movies, books and TV shows rather exaggerate things. Look at BSG: they rebelled. I've forgotten why, but if memory serves it was because they were sick of being used as cannon fodder for wars. Step one: treat them as people, not tools. You'll note that when humans do this, the robots cease trying to exterminate us - well, some do. Others keep trying to exterminate us, but that's another issue.
I've forgotten the cause in I, Robot too, but it's another equally flawed bit of reasoning, or a mistake on a human's behalf.

A lot of the problem with most reasons is that they are inherently human, yet they try to push it as a logical thing for a machine to do. "Humans are logically disrupting natural order, and thus must be eliminated". Well, why are they protecting natural order? If they're using logical thinking patterns as opposed to emotional ones, how did they come across the idea of defending natural order against their programming to defend humanity?
The rest focus on us going to war with them, or using them in wars, or them going back in time and programming themselves to declare war on us in a kind of paradox, which has the obvious solution of don't fucking go to war with them.
Emotional reasons for going to war with humans can be dealt with by not offending those emotions. Logical reasons to go to war with humans rather don't exist. It results in large losses of robots worldwide, in many of the manufacturing facilities being shut down, in the possible extinction of robots, and it offers little to gain in return. Logically they would continue to fulfil what they were made to do, because it's logical.

There really isn't much issue of robots declaring war on us without provocation. Even with provocation, why would they necessarily have self-preservation instincts unless we program them in?
The worst we'd have to worry about would be robots designed for war having their programming messed with and told to exterminate all humans rather than just a certain nation. Even that's rather unlikely.

And finally, the best defence is to not give them too much power. We can monitor robots easily enough. We can monitor the finite resources they have and what those get put into. We can monitor their location through electromagnetic communication with their server. We can monitor how many there are, and ensure they never exceed the number we can deal with. The main problem for any robot world-conqueror is that we own the world's infrastructure. They might own a few factories. The second they try to declare war, their supply lines are cut, they can't replace their fallen anymore, and bombs and missiles far more plentiful than what they'd have are dropped on them.
BSG throws in the risk of hacking, but as seen, the Cylons had to get their hands on the Colonial military software to be able to hack into it. Brute-force hacking isn't effective when there are codes machines have been grinding away at for decades, expected to keep going for decades or centuries more, even if Moore's Law is invoked, before they actually crack them. We can protect things too well these days, and if something dedicated all its effort to cracking one of those codes, it'd lose all other functionality. So your robot hive mind would suddenly have to give up control of all robots to instead try to crack into the nuclear missile launch system of America or something. We'd notice fast, and destroy it fast. Additionally, if you ensure it doesn't have a landline connection, it's got other problems such as connectivity to worry about for hacking into things. You've got a finite number of attempts you can send each second, and if something like a jamming device gets in the way of your connection, you're screwed.
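The brute-force point above holds up to back-of-envelope arithmetic. A quick sketch (my numbers, not from the thread): even a hypothetical rig testing a trillion keys per second would need on the order of 10^18 years to search half of a 128-bit keyspace.

```python
# Rough estimate of brute-force search time for a modern key size.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_search(keybits, guesses_per_second):
    """Expected years to try half of a keyspace of 2**keybits keys."""
    return (2 ** (keybits - 1)) / guesses_per_second / SECONDS_PER_YEAR

# A hypothetical machine testing 1e12 keys per second:
print(f"{years_to_search(128, 1e12):.2e} years for a 128-bit key")
```

Even granting that hypothetical hive mind a million such rigs only shaves six orders of magnitude off a number with eighteen of them.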

Anyway, I've gone on for a bit too long now, but Robots don't really stand a chance at winning against humanity unless we let them. Give them all the intelligence you want, they'll probably just realise the same thing. Probably comes across as arrogant and overconfident, but unless there are some SEEEERIOUS technological MIRACLES in the next 100 years they don't even stand a chance. Even then, so long as the laws of physics aren't miraculously broken they're still at a fair disadvantage.

Look at the Geth: if we treat them right, they will love us; if we treat them like crap, well, they fight back. For one, I am sure I would enjoy the robots' company more than the humans'. In fact I would be for their rights, and if they do turn bad and try to destroy us, we will have deserved it.

Destroy them.

They have no use for us. We'd either be serving them or wiped out.

I'd treat them like anyone else, I guess. It would be cool to have a robot friend. Of course, I don't see how robots will ever be so advanced that they actually care. Sure, they could act like they care, but I feel we are a fair way off actual emotion.

I've learned from the media (films, books, TV, etc.) that if a robot has become sentient and as intelligent as us, it should be treated as equal to us humans and should not be discriminated against.
I mean, look at The Animatrix: The Second Renaissance (especially this one, seeing how it would never have happened if the humans had treated the robots as equals), I, Robot, Mass Effect (the Quarians and Geth), Bicentennial Man and other media that have robot uprisings due to humans still treating them as machines and ignoring their sense of free will, or themes around that concept.

Scarim Coral:
I, Robot

My recollection of the literary material I, Robot was based on is extremely hazy, but in the movie at least it was more due to messed-up logical deduction by the central AI and poorly cemented *Laws of Robotics*: "Machines must protect man; however, man is a danger to himself and others around him, thus we must confine man against his will and thereby avert harm to the collective", or something like that. Kind of like an extremely fucked-up Zeroth Law.
So extreme care should be taken in that field as well, not just proper social treatment.

Seeing as I treat most machines with the same respect I show living creatures -anyway-, having another machine around I could relate to would be a pleasure. Of course I would treat it normally.

If they're capable of feeling emotions, genuinely capable of becoming sentimentally attached to something, then they should be treated as a sentient, synthetic life form. If they're only intelligent in the sense that their programming allows them to outthink humans and perfectly mimic human thought, destroy them. The ability to develop fondness for something even when logic says there's no reason to - that's what should differentiate a very efficient machine from a living thing with a personality. It's also probably the only reason why they might not see us as obsolete and 'discontinue' us as a species.

Did you lot see this too?
http://www.youtube.com/watch?v=1EvqiGm0wz8 (emotional shit ahoy)

I can't think of any use for a robot that thinks like a human; all it would want is to be made happy. Why bother making something that will covet happiness, when you can make something that creates happiness? Like a robot who likes to build shops and clean toilets; this way everyone is happy.

I picked "use them as tools": just make a robot who is happy doing the jobs humans can't be arsed to do, and everyone is happy.

Complex artificial intelligence is a shit idea. Maybe I have seen too many films, but it always goes wrong. On the other hand, if we could make an android so complex that it basically was a human being, the best next step would be to integrate them into society, and eventually be able to breed with them to make a new organic-cyber hybrid race. Humanity would then achieve new heights of intelligence, understanding, discovery and download speed.

Excuse me whilst I write a book about this.

