Do you plug in?
Yes. Happiness is the ultimate goal of life.
20.9% (32)
Yes, I am sick of this life.
10.5% (16)
Yes. (Other)
6.5% (10)
No, I want to live a 'real' life.
34% (52)
No, I don't want to be a vegetable in the real world.
7.2% (11)
No, I don't want to be limited to a man-made reality.
10.5% (16)
No, (Other)
10.5% (16)
Poll: The Experience Machine.

 

Assume there is a machine capable of being plugged directly into your brain. This machine can simulate for you the happiest life possible: all the luxuries life can offer, experienced exactly as if you were actually there.

Do you plug into the machine? I know I would.

Edit: Oh, and you can never unplug. Assume you are a brain floating in a tank, also assume that you'll live for as long as your brain can provide you happiness.

Edit 2: For those of you who are effectively saying 'I wouldn't be happy in the experience machine', you are wrong. You absolutely would be happy in the experience machine; otherwise it wouldn't be the experience machine discussed in this example.

No. In fact, I'm sure anyone who found themselves in such a machine would fight to get out. Batman did.

madwarper:
No. In fact, I'm sure anyone who found themselves in such a machine would fight to get out. Batman did.

The machine Batman was hooked up to is not a true experience machine in the sense of the hypothetical. He would feel total happiness if it were.

Arakasi:
The machine Batman was hooked up to is not a true experience machine in the sense of the hypothetical. He would feel total happiness if it were.

Oh, really?

Arakasi:
This machine can simulate for you the happiest life possible: all the luxuries life can offer, experienced exactly as if you were actually there.


Bruce had his parents. He had Selina Kyle. He had a normal (well, about as normal as a billionaire can have) life. He rejected it.

To truly experience life, you must experience pain. Else, you might as well be dead.

madwarper:

Arakasi:
The machine Batman was hooked up to is not a true experience machine in the sense of the hypothetical. He would feel total happiness if it were.

Oh, really?

Arakasi:
This machine can simulate for you the happiest life possible: all the luxuries life can offer, experienced exactly as if you were actually there.


Batman had his parents. He had Selina Kyle. He had a normal (well, about as normal as a billionaire can have) life. He rejected it.

To truly experience life, you must experience pain. Else, you might as well be dead.

That was everything he thought he wanted. The machine does not take into account what you think you want; it takes into account only what will make you maximally happy.
The reason he was discontent with his existence in the machine is that:
1. He knew he was in the machine because he was simply tossed in without it making any sense to him.
2. It didn't make him happy.
3. What makes him happy is being Batman, the simulation didn't even try to take that into account.

Arakasi:
That was everything he thought he wanted. The machine does not take into account what you think you want; it takes into account only what will make you maximally happy.

How? How can it give you what would make you happy, when you don't know what would make you happy?

3. What makes him happy is being Batman, the simulation didn't even try to take that into account.

I'd disagree. While just speculation, being Batman doesn't seem like a source of happiness for Wayne, more like just a compulsion. Much like Dexter's "dark passenger".

Hmm, can you pick and choose with it? Or is it a 'for life' kind of thing? I'd love to experience sex with some celebrities or whatever with this machine, but many other things, such as video games or whatever, I'd much rather just do in real life. If that makes sense.

madwarper:

Arakasi:
That was everything he thought he wanted. The machine does not take into account what you think you want; it takes into account only what will make you maximally happy.

How? How can it give you what would make you happy, when you don't know what would make you happy?

Science, magic, it doesn't matter. It's a hypothetical.

madwarper:

3. What makes him happy is being Batman, the simulation didn't even try to take that into account.

I'd disagree. While just speculation, being Batman doesn't seem like a source of happiness for Wayne, more like just a compulsion. Much like Dexter's "dark passenger".

The compulsion was not adequately taken care of, leading to unhappiness. So it is a failure of the machine.

BathorysGraveland2:
Hmm, can you pick and choose with it? Or is it a 'for life' kind of thing? I'd love to experience sex with some celebrities or whatever with this machine, but many other things, such as video games or whatever, I'd much rather just do in real life. If that makes sense.

That makes sense, but yes it is a for life thing.

The problem with this is I think most people have been conditioned to believe there is no such thing as a perfect reality.
If you woke up one day in a world where there was no war, no conflict, and everyone was nice, etc., what would be your gut impulse?

Mine would be that something is wrong here.

It's like Agent Smith said in The Matrix: the first version of the Matrix was a perfect construct, free of violence, disease and strife, but humans, or the human brain, just couldn't accept it.

Arakasi:
That makes sense, but yes it is a for life thing.

Then I choose no. While it would be fun to feel the pleasure of all my fantasies, I'd feel much better in the knowledge that the pleasures I'm receiving are from real actions I am really performing/taking part in.

BathorysGraveland2:

Arakasi:
That makes sense, but yes it is a for life thing.

Then I choose no. While it would be fun to feel the pleasure of all my fantasies, I'd feel much better in the knowledge that the pleasures I'm receiving are from real actions I am really performing/taking part in.

If you would feel better for it, the machine would make you think that they are real.

Arakasi:
If you would feel better for it, the machine would make you think that they are real.

Perhaps, but it would ultimately be deceiving me, I'd be living a lie. That doesn't sit comfortably with me, at all. No matter how great the feeling of said celebrity sex would be.

BathorysGraveland2:

Arakasi:
If you would feel better for it, the machine would make you think that they are real.

Perhaps, but it would ultimately be deceiving me, I'd be living a lie. That doesn't sit comfortably with me, at all. No matter how great the feeling of said celebrity sex would be.

So you're saying that you wouldn't sacrifice your comfort now for a lifetime of happiness?

Okay. Let me ask you this then.
Hypothetical:
This current world isn't real; there is a world beyond this one, and this is just a simulation like The Matrix. Outside this simulation lies a much worse life: you will die young, and it will be painful, dirty and lonely.
Would you want to know that? Would you choose to leave the current life you have? Would it invalidate all you've worked for?

This actually reminds me of a concept from the Red Dwarf universe, a videogame called Better Than Life which basically gave you everything you wanted.

People got so addicted to the videogame that their physical bodies decayed while playing it. They refused to stop playing even to sustain themselves, preferring instead to remain submerged in the alternate reality. Most of them died, unsurprisingly (despite their families' best efforts to keep them alive).

I wouldn't plug into the machine because I imagine something like that would be highly addictive. Also, surely all the challenge/sense of achievement is taken out of it if you're handed everything you want on a silver platter?

Mind you, it'd make an excellent morale boost for people who are in palliative care.

Arakasi:
snip

To be honest, this is starting to get a little silly, and it feels like you're taking your own thread off the rails. What I said is that I'd rather live a real life than a lie. So to answer your first question: yes, I wouldn't sacrifice my current, real comfort for a lifetime of falsely-acquired happiness, no matter how real it felt.

As for the weird question. I can't answer that. You're saying this very life we're living is a lie, so everything I've ever done is a lie. How can I possibly answer that with any semblance of accuracy or sanity?

No, because I actually like my life, and like madwarper said: if you experience pain, only then can you experience true happiness. /lamequote

BathorysGraveland2:

Arakasi:
snip

To be honest, this is starting to get a little silly, and it feels like you're taking your own thread off the rails. What I said is that I'd rather live a real life than a lie. So to answer your first question: yes, I wouldn't sacrifice my current, real comfort for a lifetime of falsely-acquired happiness, no matter how real it felt.

Okay then. It is hardly off the rails though.

BathorysGraveland2:

As for the weird question. I can't answer that. You're saying this very life we're living is a lie, so everything I've ever done is a lie. How can I possibly answer that with any semblance of accuracy or sanity?

Well, that's the thing, isn't it? If you were in the happiness machine, that's what it would feel like if someone told you that you were living a lie. It doesn't seem like it could be possible, it seems stupid, and you'd have to wonder why you'd even try to escape it, especially if it is worse.

R4ptur3:
No, because I actually like my life, and like madwarper said: if you experience pain, only then can you experience true happiness. /lamequote

If that were true, the machine would incorporate the least amount of pain necessary to get the most amount of happiness.

Ahri:
This actually reminds me of a concept from the Red Dwarf universe, a videogame called Better Than Life which basically gave you everything you wanted.

People got so addicted to the videogame that their physical bodies decayed while playing it. They refused to stop playing even to sustain themselves, preferring instead to remain submerged in the alternate reality. Most of them died, unsurprisingly (despite their families' best efforts to keep them alive).

I wouldn't plug into the machine because I imagine something like that would be highly addictive. Also, surely all the challenge/sense of achievement is taken out of it if you're handed everything you want on a silver platter?

Mind you, it'd make an excellent morale boost for people who are in palliative care.

That reminds me of the experiment with rats, where they hooked the rats' pleasure centres up to a button so they could push it and experience maximum pleasure; they also provided food for the rats.
The rats pushed the button until they died of starvation, with the food next to them.

After reading this I made sure to clarify in the OP that you cannot unhook from this, but keep in mind that the sustenance of your brain would be provided for, and you would live a relatively long life.

Arakasi:
snip

Yes, but that isn't how I would feel in the period before being hooked up to this machine. I would know I'm giving up my life to live in an illusion. That alone is a choice I could not make of my own free will.

BathorysGraveland2:

Arakasi:
snip

Yes, but that isn't how I would feel in the period before being hooked up to this machine.

The machine would presumably ensure you forgot that, if remembering it would cause you unhappiness.

BathorysGraveland2:

I would know I'm giving up my life to live in an illusion.

Well, say that the instant you say 'Yes' you are knocked unconscious until you are plugged in, at which point you forget you ever said yes anyway. So there is no discomfort whatsoever in the transaction.

BathorysGraveland2:

That alone is a choice I could not make of my own free will.

Free will is an illusion.

Arakasi:
snip

Yes, yes. But if I was told all this and was asked to make a decision, the first thing that would enter my mind is exactly what I have spoken of: illusion and deceit. Which is why I would say no, so the machine would never get the chance to have any of these effects in the first place.

BathorysGraveland2:

Arakasi:
snip

Yes, yes. But if I was told all this and was asked to make a decision, the first thing that would enter my mind is exactly what I have spoken of: illusion and deceit. Which is why I would say no, so the machine would never get the chance to have any of these effects in the first place.

But if it is truly the illusion and deceit that matter, I have to ask this question again:
Hypothetical:
This current world isn't real; there is a world beyond this one, and this is just a simulation like The Matrix. Outside this simulation lies a much worse life: you will die young, and it will be painful, dirty and lonely.
Would you want to know that? Would you choose to leave the current life you have? Would it invalidate all you've worked for?

Am I the only one who thinks that having a machine that can basically mess with everything you know is pretty risky?
What if someone messes with it?

Arakasi:
snip

I guess I would remain in the simulation, because the world you speak of sounds like an irredeemable nightmare. The difference is that the world we live in right now is anything but that.

BathorysGraveland2:

Arakasi:
snip

I guess I would remain in the simulation, because the world you speak of sounds like an irredeemable nightmare. The difference is that the world we live in right now is anything but that.

In comparison to the pleasures you could receive, I'm sure this would seem like an irredeemable nightmare too. But there's no real way to prove that until we build an experience machine.

Edit: But wait, doesn't that show that you don't really care if it's real or not? So long as it isn't incredibly painful? Why not make an exception for incredible happiness also?

King Aragorn:
Am I the only one who thinks that having a machine that can basically mess with everything you know is pretty risky?
What if someone messes with it?

That'd work outside the hypothetical.
In the hypothetical it is perfectly safe.

In fact, it'd be safer than your life currently is.

Arakasi:
In comparison to the pleasures you could receive, I'm sure this would seem like an irredeemable nightmare too. But there's no real way to prove that until we build an experience machine.

That may be so, but this "matrix world" you spoke of lacks one thing our world has: something worth living for. In such a hellish world, why would anyone even want to hang on? Which is why I would remain in the simulation in that scenario, but would choose our real world in the other.

BathorysGraveland2:

Arakasi:
In comparison to the pleasures you could receive, I'm sure this would seem like an irredeemable nightmare too. But there's no real way to prove that until we build an experience machine.

That may be so, but this "matrix world" you spoke of lacks one thing our world has: something worth living for. In such a hellish world, why would anyone even want to hang on? Which is why I would remain in the simulation in that scenario, but would choose our real world in the other.

See my edit on the last post.

Also, what is worth living for other than happiness?

Arakasi:
Also, what is worth living for other than happiness?

Well, things such as excitement and thrills, the feeling of satisfaction, helping others, etc. I like happiness, and the feeling it brings, but I'd prefer it to be real rather than simulated. That's my personality and character speaking. I would experience certain things with this machine if it wasn't a permanent, no-turning-back choice.

BathorysGraveland2:

Arakasi:
Also, what is worth living for other than happiness?

Well, things such as excitement and thrills, the feeling of satisfaction, helping others, etc. I like happiness, and the feeling it brings, but I'd prefer it to be real rather than simulated. That's my personality and character speaking. I would experience certain things with this machine if it wasn't a permanent, no-turning-back choice.

One such as myself would include feelings such as satisfaction, helping others (maybe), and excitement and thrills under the category of 'things that lead to happiness'. So really the machine would account for that too.

But your choices do not match. If it truly matters whether it is real or not, you should pick the real world in the dystopia scenario.

My choices do not match because they are different realities. The dystopia scenario is a hellish world, not unlike a post-apocalyptic wasteland, in which it would be very difficult, perhaps impossible, to experience happiness at all. The only thing you'd have to live for is to survive another day in horrible conditions. I think most would choose a happy illusion over that reality.

Whereas in our current world, it is very easy to obtain happiness, pleasure, and passion. There is much to live for, and life is, for most of us, good. So fewer would choose an illusion over this reality.

You see what I mean?

*scratches head*

I see a different angle from which this hypothetical falls flat...

...basically, if you wouldn't know the difference, then it doesn't even matter what I'd choose now, so the choice itself carries no meaning at all. I'm not a fan of solipsism, but if we accept the assumption that "your mind makes it real" and that there's no way to be sure there's really the reality we perceive outside our minds, we have to treat any potential simulated reality that's rendered completely acceptable and "real" by our minds as equal to the one our minds are rendering and experiencing as we speak.

And at this point it becomes a meaningless question, since the "current" reality suddenly stopped holding a unique position in opposition to all other potential ones.

madwarper:
No. In fact, I'm sure anyone who found themselves in such a machine would fight to get out. Batman did.

I think we'd be more likely to go insane, since the hypothetical assumes that there's no way we'd even know we are in some such thing. Basically, our mind would be subjected to conflicting information on a subconscious level, that there's something seriously not right about the world, and that it's a perfectly fine real world. Since that'd not even enter our conscious processes, we couldn't really make a conscious, rational decision to "break out", either.

Actually, I think this scenario isn't as close to Batman as it is to Vanilla Sky. Only that in that case it also involved some kind of suspended animation/life extension shtick. And it's ultimately the subconscious that messes it all up; because parts of our minds we're not aware of will interfere, for example, with our "wants", as was mentioned, what we want as opposed to what we think we want.

I see what you mean, but in comparison to the pleasures that could be achieved in the hypothetical machine, this life contains barely any pleasure. Consider also that it has no pain, no premature death, and no suffering for anyone (unless you enjoy those things); what more could you ask for?

Why is reality such a deal-breaker?
