Immersion in Games: Are You Into It?


First it was the subjectivity of beauty in games. Now Yahtzee riffs on what it means to be immersed in a game.


"Even the [Amazing Spider-Man 2] video game had its supporters, although that might have had more to do with the comfortable padding of brain distance."

I feel this needs some explanation. On one front, it's a new open-world game on the Wii U that isn't 10 years old and isn't a Lego game. In fact, the first time I heard about it was from Nintendo's YouTube channel. And thus, people on Miiverse praise it.
Also, the game's developers had a gameplay preview that screamed, "Guys, we swear it's good this time. Spidey has different moves and everything!"

But now it's out, and it's wank, and we Wii U gamers without strong PCs or new consoles might as well just wait for Watch_Dogs to satiate our sandbox desires. I've got my fingers crossed that it works well.

"Suddenly it's not Lord Carstairs aggressively seducing Dolly the parlor maid and her faithful sheepdog, now it's just two actors in costume on a sound stage with a very perplexed border collie."

This made my day.

Regarding the argument on the first page, Yahtzee is falling into the same trap many people use when defending their subjective opinions - not realizing there is a difference between saying "I like this thing" and "This is a quality thing."

Some people are so arrogant as to think that they ONLY like quality things, but this is not the case. You can like something that is bad. It's OK. It doesn't mean you're an idiot, and it also doesn't make the thing you like good.

For example, I like Cross Edge. However, I would never suggest that it's a good game. It's a terribly designed game.

Similarly, I despise Halo and everything it stands for, but I admit it is a well designed quality game.

And so on and so forth with everything subjective in the world. YES, your opinion about the quality of something can be wrong. Just having an opinion doesn't make it as valid as other more informed opinions.

>YES, your opinion about the quality of something can be wrong.

To a point, but not always. The quality of a game is subjective, since you cannot measure it with the scientific method and gather enough data to support a conclusion.

I think the best part of this article was the picture of the messed-up Jesus painting labeled "bad immersion Jesus."

duwenbasden:
>YES, your opinion about the quality of something can be wrong.

To a point, but not always. The quality of a game is subjective, since you cannot measure it with the scientific method and gather enough data to support a conclusion.

Objective quality does exist. Yahtzee made some points about it in this very article. A TV show with a boom mic floating into the frame is not high quality, regardless of whether you like the show or not.

There are shared standards of quality that we as a collective consciousness have decided are "good" or "bad" in any given medium.

Thanatos2k:
Regarding the argument on the first page, Yahtzee is falling into the same trap many people use when defending their subjective opinions - not realizing there is a difference between saying "I like this thing" and "This is a quality thing."

Some people are so arrogant as to think that they ONLY like quality things, but this is not the case. You can like something that is bad. It's OK. It doesn't mean you're an idiot, and it also doesn't make the thing you like good.

Well, in order to separate things that are good from things which are bad but enjoyable, you first have to nail down what 'quality' means. If we're comparing adhesives we can just figure out which one holds more weight. If we're comparing printers we can look at how fast they print, the precision of printing, and so on.

How do you propose to objectively measure the quality of games?

Falterfire:

Thanatos2k:
Regarding the argument on the first page, Yahtzee is falling into the same trap many people use when defending their subjective opinions - not realizing there is a difference between saying "I like this thing" and "This is a quality thing."

Some people are so arrogant as to think that they ONLY like quality things, but this is not the case. You can like something that is bad. It's OK. It doesn't mean you're an idiot, and it also doesn't make the thing you like good.

Well, in order to separate things that are good from things which are bad but enjoyable, you first have to nail down what 'quality' means. If we're comparing adhesives we can just figure out which one holds more weight. If we're comparing printers we can look at how fast they print, the precision of printing, and so on.

How do you propose to objectively measure the quality of games?

Again, we have shared metrics of quality in games, even if they are not entirely concrete. We know what good and bad writing is, we know what good and bad pacing is, we know what good and bad graphics are, we know what good and bad voice acting is, we know what a good or bad save system is (see: Shadowrun Returns), and so on. We even know what makes a good or bad tutorial.

Some stuff is more subjective than others (What makes a good battle system in an RPG?) but many things are not.

Thanatos2k:

Falterfire:

Thanatos2k:
Regarding the argument on the first page, Yahtzee is falling into the same trap many people use when defending their subjective opinions - not realizing there is a difference between saying "I like this thing" and "This is a quality thing."

Some people are so arrogant as to think that they ONLY like quality things, but this is not the case. You can like something that is bad. It's OK. It doesn't mean you're an idiot, and it also doesn't make the thing you like good.

Well, in order to separate things that are good from things which are bad but enjoyable, you first have to nail down what 'quality' means. If we're comparing adhesives we can just figure out which one holds more weight. If we're comparing printers we can look at how fast they print, the precision of printing, and so on.

How do you propose to objectively measure the quality of games?

Again, we have shared metrics of quality in games, even if they are not entirely concrete. We know what good and bad writing is, we know what good and bad pacing is, we know what good and bad graphics are, we know what good and bad voice acting is, we know what a good or bad save system is (see: Shadowrun Returns), and so on. We even know what makes a good or bad tutorial.

Some stuff is more subjective than others (What makes a good battle system in an RPG?) but many things are not.

Then again, there are things where the 'context' is important; in fact, I'd say 'context' is most important, or rather 'cohesion', i.e. how all the different aspects of the game (or piece of art in general) work together to create an experience (whether it's the one the producers wanted or not).
To use your example of save systems: being able to save whenever I want, as often and in as many slots as I want, and then continuing exactly where I left off might seem, in theory, the best possible way.
But in a shooting game, disallowing that in favor of save points creates far more tension (which is why Half-Life 2 didn't quite 'click' with me; if I took too much damage, I could just reload and try again).
Or in a game with multiple branching paths, only allowing for one, continuous save instead of 50 save spots gives each decision much more weight.

The same goes for pacing: not only the genre of the game but also the story demands a certain kind of pacing, so when assessing the pacing of a game one would need to consider both of these aspects before passing judgment; just saying 'I know good pacing when I see it' is not really going to help.

When dealing with art, everything is more or less subjective; there are things that one perceives as 'objective factors', but given the right context / interpretation, those things that break or bend the rules might end up the most interesting.
Showing 'the works', destroying immersion on purpose, for example, is a stylistic device used by Brecht to make the audience aware that they were indeed just watching two actors and a sheepdog; he did this to make the audience reflect on what they saw before them.

The most important thing is to know that there are no 'right' or 'wrong' opinions; there are simply opinions, and between conflicting opinions may lie the truth.
The important thing is not WHAT your opinion is - or indeed how 'subjective' it may seem - but how it is presented.
Saying 'I did (not) like this' is not helpful.
Saying 'I did not like this because it didn't follow the standard formula for save points' allows for discussion.
Saying 'I did like this because the bad voice acting evoked a feeling of loneliness because I was the only sane character in the game, an experience that filled me with existential dread' is highly subjective, but well presented.

With the obvious exception of the lead actors, every person involved in a created work is doing their job perfectly well if you never think about them or know that they're there.

I feel like the line is a bit blurrier than that. In Men in Black 3, Josh Brolin's performance as young K was so spot-on that my mind instantly accepted him as the same character, without actively noticing that he wasn't Tommy Lee Jones. Yeah, even lead actors can go unnoticed in that way. On the flip side of the coin, some filmmakers like to get particularly artistic with their cinematography. Breaking Bad, The Empire Strikes Back, Pulp Fiction, and Beauty and the Beast all had some extremely memorable shots; the camerawork really did stand out in those films... but that doesn't mean they weren't doing their job right.

I don't think that what matters is necessarily that the audience doesn't think about certain aspects of a film/game/work of art, but that whatever the audience notices is intentional; it's okay if a boom mic pops into frame as long as it's on purpose and for a good reason. Much like a good magic trick, the goal above all else is to control the audience's attention. That's where suspension of disbelief comes from. Frankly, you can do whatever you want, and as long as you can make sure your audience is looking the right way, where they're looking is the only part that has to be well-crafted. That's why enemies spawn behind walls or off-camera. That's how games like Antichamber and The Stanley Parable are even possible.


>There are shared standards of quality that we as a collective consciousness have decided are "good" or "bad" in any given
>medium.
>we know what good and bad graphics are

What do you mean by "our shared standards of quality"? How do we collectively judge "good" graphics from "bad" graphics? Polygon count? Texture res? The appearance of 8-bit sprites / graphic-novel / photo-realistic / ASCII? What about brown vs. colourful vs. monochrome palettes? Film grain? The similarity between a real-life photo and the CGI? Organic vs. artificial?

And what if it is done deliberately, e.g. a voice actor phoning it in during a scene where something absurd happens? Is it a "good" performance because the VA's acting complements the scene, or is it a terrible performance because our collective metrics said so?

>A TV show with a boom mic floating into the frame is not high quality.

What if said TV show is a satire, where the boom mic floating into the frame is followed immediately by the director yelling "cut" and chastising the boom operator?

Yes, you can judge something as objectively as possible; you can say some metrics of quality are objectively measurable, but you will never get to a point where there is an absolute, objective measurement of quality.

ConjurerOfChaos:
Then again, there are things where the 'context' is important; in fact, I'd say 'context' is most important, or rather 'cohesion', i.e. how all the different aspects of the game (or piece of art in general) work together to create an experience (whether it's the one the producers wanted or not).
To use your example of save systems: being able to save whenever I want, as often and in as many slots as I want, and then continuing exactly where I left off might seem, in theory, the best possible way.
But in a shooting game, disallowing that in favor of save points creates far more tension (which is why Half-Life 2 didn't quite 'click' with me; if I took too much damage, I could just reload and try again).
Or in a game with multiple branching paths, only allowing for one, continuous save instead of 50 save spots gives each decision much more weight.

Well, sure. Context is important; I thought that was assumed. Though context doesn't always save something. Something can still be bad despite the context being intentional - you can intend for your save system to not allow people to take back their choices and still mess it up objectively. The Souls games' save system is a great example of intentionally preventing you from save/loading your way to success, and it works perfectly. Half-Life allows you to quicksave/quickload everywhere, and that works perfectly at its intended purpose as well (and lets the player choose how shamelessly they abuse it).

The same goes for pacing: not only the genre of the game but also the story demands a certain kind of pacing, so when assessing the pacing of a game one would need to consider both of these aspects before passing judgment; just saying 'I know good pacing when I see it' is not really going to help.

To be fair, pacing is more of a "I know bad pacing when I see it" thing than the opposite. You don't notice good pacing. You definitely notice bad pacing.

When dealing with art, everything is more or less subjective; there are things that one perceives as 'objective factors', but given the right context / interpretation, those things that break or bend the rules might end up the most interesting.
Showing 'the works', destroying immersion on purpose, for example, is a stylistic device used by Brecht to make the audience aware that they were indeed just watching two actors and a sheepdog; he did this to make the audience reflect on what they saw before them.

Well, you have specific qualifiers for each subsection you're looking at. In gaming we call them "genres", and what works for one definitely won't always work for another. There are standards of quality for racing games, fighting games, shooters, etc.

You can counter that there's an infinitely fragmenting amount of sub genres that make such statements useless, but I don't think they're useless.

Additionally, this is not to say there are never exceptions to the rules.

The most important thing is to know that there are no 'right' or 'wrong' opinions; there are simply opinions, and between conflicting opinions may lie the truth.
The important thing is not WHAT your opinion is - or indeed how 'subjective' it may seem - but how it is presented.
Saying 'I did (not) like this' is not helpful.
Saying 'I did not like this because it didn't follow the standard formula for save points' allows for discussion.
Saying 'I did like this because the bad voice acting evoked a feeling of loneliness because I was the only sane character in the game, an experience that filled me with existential dread' is highly subjective, but well presented.

The more you explain your opinion, the less subjective it becomes.

Riffing off of what some commenters are saying, I think we need to redefine "opinion" as applying only to things you can legitimately have a subjective view of. I hear/see way too many news sources giving voices to people because "everyone has an opinion" while not realizing that things like homeopathy are not only provably wrong, but actively dangerous when promoted as somehow equal to the real world. You can't give legitimacy to things that are definitely bullshit.

duwenbasden:
>There are shared standards of quality that we as a collective consciousness have decided are "good" or "bad" in any given
>medium.
>we know what good and bad graphics are

What do you mean by "our shared standards of quality"? How do we collectively judge "good" graphics from "bad" graphics? Polygon count? Texture res? The appearance of 8-bit sprites / graphic-novel / photo-realistic / ASCII? What about brown vs. colourful vs. monochrome palettes? Film grain? The similarity between a real-life photo and the CGI? Organic vs. artificial?

Intentional or not, artistic or not, graphics still look good or bad to us, and there are definite reasons why. Play a game with an inverted color palette filter on your TV/monitor and boggle at how much worse things usually look. This isn't a coincidence.

You can intentionally make your graphics bad for whatever reason, maybe even a specific reason you're trying to evoke through your game, but they still look good or bad. There is huge leeway for graphical styles that one person may like and another will not (cell shading anyone?) but there are still technical things you can do right or wrong.

And what if it is done deliberately, e.g. a voice actor phoning it in during a scene where something absurd happens? Is it a "good" performance because the VA's acting complements the scene, or is it a terrible performance because our collective metrics said so?

>A TV show with a boom mic floating into the frame is not high quality.

What if said TV show is a satire, where the boom mic floating into the frame is followed immediately by the director yelling "cut" and chastising the boom operator?

Many of the rules change significantly once comedy is involved. And there are indeed a whole new set of rules for what makes good comedy or not.

duwenbasden:
>There are shared standards of quality that we as a collective consciousness have decided are "good" or "bad" in any given
>medium.
>we know what good and bad graphics are

What do you mean by "our shared standards of quality"? How do we collectively judge "good" graphics from "bad" graphics? Polygon count? Texture res? The appearance of 8-bit sprites / graphic-novel / photo-realistic / ASCII? What about brown vs. colourful vs. monochrome palettes? Film grain? The similarity between a real-life photo and the CGI? Organic vs. artificial?

And what if it is done deliberately, e.g. a voice actor phoning it in during a scene where something absurd happens? Is it a "good" performance because the VA's acting complements the scene, or is it a terrible performance because our collective metrics said so?

Graphics are judged by whether they are aesthetically appealing and whether they succeed in creating the effect they are going for, whether that effect is good or bad.

It's not just the effects breaking down before our eyes that affects our immersion; it's also whether we can feel the mechanics used to move the experience along. In games this often takes the form of the feeling that the game is railroading you to where it wants you to be, with Call of Duty and Medal of Honor: Warfighter (ahahahaha) being prime examples. Narratively this can also be a problem if the plot devices are too obvious, so that we are taken out of the story by waiting for them to happen rather than being surprised when they do. I had that problem with The Last of Us, where I spent several cutscenes just yelling, "Give her the gun already so we can get to the bit where she saves your life and the two of you become closer as a result!"

I dunno, saying that immersion is paramount to the quality of something seems a little douche-y to me, considering the examples Yahtzee uses. I mean, what if you didn't notice the boom mic, had watched that show repeatedly, and later had someone point out the boom mic in a scene you'd already seen several times? Does knowing that you can see a boom mic invalidate the emotional investment you previously put into the series? Do you have contradictory feelings that are a pain to reconcile?

Video games and movies are artificial constructions, and we know it. We also know that mistakes can and are being made, and some of them are overlooked during editing. Having issues with a visible boom mic, or the ass of a film crew walking into a corner of the frame, or a minor glitch in a video game, is like being upset that someone told you Santa Claus doesn't exist, way after you've found it out and have gotten over it.

I don't feel like immersion is jeopardised by glitches in gameplay. I can still get absorbed into Halo's (monolithic) universe even if the Grunt I just shot zips off into space because it got stuck between two rocks. We go into games and expect there to be errors or stuff which doesn't seem right. To me an interface or HUD (unless it's explained like in Dead Space) is something the character wouldn't see, but it doesn't get in the way of my immersion.

That last sentence of the column, that is gold right there!

Immersion, huh?

Well, I can't say I was immersed in this article. Tight deadline? ;)
There is a lot to be said about this topic, as there ARE very objective factors that contribute to the presence or absence of immersion[1]. And I'm quite annoyed at how the word is thrown around, e.g. in places like this, until it loses all meaning. That's exactly why boiling it down to a boom mic and concluding that "bad art is art in which you can see the workings" is lazy, careless and - applied to the entirety of art - even wrong.

[1] How individual consumers react to it, subjectively, is a different story

Catasros:
"Suddenly it's not Lord Carstairs aggressively seducing Dolly the parlor maid and her faithful sheepdog, now it's just two actors in costume on a sound stage with a very perplexed border collie."

This made my day.

Hehehe, yeah that was a nice one :D

I also loved how after all that posh talk, it all came down to a lemon covered in wank.

Thanatos2k:

duwenbasden:
>YES, your opinion about the quality of something can be wrong.

To a point, but not always. The quality of a game is subjective, since you cannot measure it with the scientific method and gather enough data to support a conclusion.

Objective quality does exist. Yahtzee made some points about it in this very article. A TV show with a boom mic floating into the frame is not high quality, regardless of whether you like the show or not.

There are shared standards of quality that we as a collective consciousness have decided are "good" or "bad" in any given medium.

A collective consciousness is still a subjective one. The world has plenty of cultures, all with completely different 'collective consciousnesses' about what artistic quality means. To take it to an extreme, if an alien came down from another planet, their culture could be based around boom mics in the shot being an incredibly powerful artistic statement - and they wouldn't be objectively wrong.

I feel that's where a lot of people get mixed up on the difference between objectivity and subjectivity - the only objective statements you can make are facts, i.e. 'there is a boom mic in the shot'. To go any further in analysing it, you have to inject an element of subjectivity - even if everyone in the world agreed with the statement that 'boom mics in the shot detract from artistic quality', there could still be a theoretical person who holds a valid opinion that is the complete opposite. However, it's important to note that this doesn't detract from the importance or validity of critical analysis - it just means that it will be tailored to the general opinion within a certain culture, not a facade of 'objectivity'.

I hate the word "immersion". Any time someone plays a game, particularly a horror game, doesn't like it, and says so, someone will inevitably say, "You didn't play it right; you have to immerse yourself in the game." I got this when I criticized Lone Survivor: it's supposed to be a horror game, but its art style and some bad artistic choices killed any horror the game had. Because I went into the game (which is otherwise a good game) with the expectation of horror, I was disappointed, and because of that disappointment I do not like it (even though I "should"). What makes it worse is that their idea of "immersion" is setting up an environment that would make any game scary. If you played a My Little Pony game alone, in the dark, at night, with headphones on, with the expectation of horror (which is what they consider "immersing" oneself in a game), it would seem scary - not because the game is scary, but because of the environment and expectations the player set up.

The thing about immersion is that it's not the player who immerses himself in the game; the game immerses the player. If the player did not find the game immersive, it's because the game failed to immerse them, not because the player didn't "immerse" himself in it. This is a your-mileage-may-vary thing: different people have different tolerances. Just as some people are easily amused, some people are easily immersed, and some people have high standards.

Someone has probably already thought of this years ago, but I'll go with it anyway. I'm going to call this the "immersion fallacy". It has two meanings: 1. The idea that a player did not like a game, or did not think the game was "X", "Y", or "Z", because the player did not immerse themselves in the game - rather than the player simply not enjoying the game, or the game failing to evoke "X", "Y", or "Z" in them. 2. The idea that immersion is an important part of a game.
With 1, one person could like a game and have it invoke an emotion in them, while others may dislike the game or have it fail to evoke that emotion. With 2, for many games and many people, immersion is not important; it's how engrossing the game is that matters. Just as one can enjoy a movie, and have it invoke emotions, without suspending disbelief, so can one with a game. It's not always about suspension of disbelief or immersion; it's often about how compelling and interesting a game, movie, or book is. If you play a game that is enjoyable, it doesn't matter whether you are immersed in it; if you read a book that is compelling, suspension of disbelief doesn't matter.

I apologize for the long post.

I think FPS games have a bit of an easier time, since the "window into the world" through the protagonist's eyes is easier to sell than the player as a creepy stalker following a character and watching his bum as he runs around the world.

The two most immersive experiences I had were Aliens vs Predator (2000, not that abortion that came out recently) and STALKER. In STALKER I was actually scared to go into X-19 because of how frightening the other underground areas were.

I think that if we could come up with an objective measure of quality in art, it would be staying power. People still read "The Count of Monte Cristo" over 150 years after its publication. How many people believe Twilight will still be read by our descendants in the 2150s?

Obviously this is a pretty crap way of judging the quality of something new and it is still basically popularity (although a slightly more nuanced measure of it) but I think it could work.

The reason people mock "MY IMMERSION" is because "immersion" has become a vapid buzzword that's poorly defined and usually just used to say "things I don't like."

I mean, if we saw the same level of complaints in other media, we'd think it was whiny crap. "Oh man, this is only in 1080p? My immersion is ruined!"

mrdude2010:
Riffing off of what some commenters are saying, I think we need to redefine "opinion" as applying only to things you can legitimately have a subjective view of.

I think we already do:

a view or judgment formed about something, not necessarily based on fact or knowledge.

Opinions are inherently subjective, but for some reason the only time that comes up is when people disagree or say something unpopular. Suddenly, it's "That's just, like, your opinion, man!" or "it's my opinion and I'm entitled to it." You never see the former when someone agrees with you (except possibly in jest), and you never see the latter when it's something like "I think slavery is bad, but that's just my opinion." The latter being in a modern context, of course.

The problem isn't so much the definition, but the connotation that goes along with it.

I hear/see way too many news sources giving voices to people because "everyone has an opinion" while not realizing that things like homeopathy are not only provably wrong, but actively dangerous when promoted as somehow equal to the real world. You can't give legitimacy to things that are definitely bullshit.

No, the reason news sources give voices to people like homeopaths is that telling people what they want to hear means money. Homeopathy has become a huge business with a lot of people who believe in it. Piss them off, and they might not buy your paper/watch your broadcast/click your links anymore. And that's the problem with media in a corporate environment: when you're expected to follow the same profit rules as Halliburton and McDonald's, you have no incentive for truth and every incentive to come up with shocking headlines and bland, possibly indecisive conclusions.

This doesn't apply to all media, however: I mean, Fox has gotten where it is by accusing the rest of the media of being TEH BIAS (despite the reality that most news networks are bland, inoffensive paste spoon fed to us to avoid upsetting us), proclaiming themselves the answer, and then telling one specific and profitable group what they want to hear.

Where was I?

Oh yeah. It's really not about everyone having an opinion and more about us not wanting to offend the plebes or our corporate sponsorship (Which is why you were more likely to see anti-tobacco articles in publications not based on advertising than in magazines with Philip Morris as a sponsor).

It's all about the money, or often enough to count.

K12:
I think that if we could come up with an objective measure of quality in art, it would be staying power. People still read "The Count of Monte Cristo" over 150 years after its publication. How many people believe Twilight will still be read by our descendants in the 2150s?

Obviously this is a pretty crap way of judging the quality of something new and it is still basically popularity (although a slightly more nuanced measure of it) but I think it could work.

It's also a pretty crap way of judging in general. Who's to say Twilight WON'T be read in 2150, as our great American novel of the period? If that happens, is Twilight now validated?

I would be wary of claiming that immersion is essential to good art.
After all, there is a world of difference between that accidental boom mic and a deliberate revelation of artifice. The Russian Formalists, for one, declared that a truly great work of art is one that advertises its own nature as an artificial construction, and they championed Tristram Shandy for that reason.

The only opinions that I consider bad are ones that are factually wrong, like claiming that a game that's very buggy and crashes a lot works fine. Your copy might work fine, but if other people have issues, there's a 95% chance that they actually have issues.

Zachary Amaranth:

K12:
I think that if we could come up with an objective measure of quality in art, it would be staying power. People still read "The Count of Monte Cristo" over 150 years after its publication. How many people believe Twilight will still be read by our descendants in the 2150s?

Obviously this is a pretty crap way of judging the quality of something new and it is still basically popularity (although a slightly more nuanced measure of it) but I think it could work.

It's also a pretty crap way of judging in general. Who's to say Twilight WON'T be read in 2150, as our great American novel of the period? If that happens, is Twilight now validated?

Y'know, if Wuthering Heights managed it, anything can.

P.S. Thanks

Look, this "Objective vs Subjective" thing has run its course, and it's pretty easy to settle anyway.

If a review tells me whether the reviewer enjoyed the game, it's subjective.

If a review tells me whether *I* would enjoy the game, it's objective.

I know those aren't anything close to dictionary definitions of the words or anything, but dammit, can YOU come up with a more succinct definition of "objective" and "subjective" as they relate to videogame reviews?

I think that an important factor to be analyzed is, why do people make the choice to like what they do?

Using ASM2 as an example even further: Why did some of my friends walk out of the theater satisfied, while other friends and I walked out almost disgusted?
What accounts for the difference between our reactions?

As Yahtzee points out, there is sometimes a cultural difference; other countries might see ASM2 simply as a distraction to break up the night.

But for some people who take pop culture a bit more seriously? The script issues, blatant disregard for morality and subtext, and awful character arcs make the film less of a distraction and more of an insult. Those who watch and dissect video games, movies, TV, and books tend to develop their own personal standards for what is good and bad. And distinguishing between the "good" and the "garbage" is how such standards for us "nerds" are established.

The issue is that the mass audience really doesn't seem to care. And they also have a right to not care. So many moments in ASM2 didn't make sense and the movie ended with a pointless cliffhanger... so WHAT? It's not as if their lives will be impacted by the poor decisions made by Peter "Sn"arker (hehehehe). They come for the cheap romance and the high-production-value SFX. And they get just that.

I have recently chosen to abide by a "right to idiocy" policy. And this isn't the same thing as "everyone has a right to his/her own opinion." You can make a thought-provoking argument in favor of anything.
But if an average Joe film-goer decides, without much to back it up, that the latest piece of Adam Sandler's shriveled-up corpse of a career is the funniest flick in theaters... then that's fine by me.

I'll deride it, bash it, and leave it for dead, but you have the right to enjoy it regardless. It would be wrong for me to try to change your opinion forcefully.

Yeah, this was a long one. And kinda rant-y. Apologies.

Thanatos2k:

ConjurerOfChaos:
Then again, there are things where the 'context' is important; in fact, I'd say 'context' is most important, or rather 'cohesion', i.e. how all the different aspects of the game (or piece of art in general) work together to create an experience (whether it's the one the producers wanted or not).
To use your example of save systems: being able to save whenever I want, as often and in as many slots as I want, and then continuing exactly where I left off might seem, in theory, the best possible way.
But in a shooting game, disallowing that in favor of save points creates far more tension (which is why Half-Life 2 didn't quite 'click' with me; if I took too much damage, I could just reload and try again).
Or in a game with multiple branching paths, only allowing for one, continuous save instead of 50 save spots gives each decision much more weight.

Well, sure. Context is important; I thought that was assumed. Though context doesn't always save something. Something can still be bad even when the context is intentional: you can intend for your save system not to let people take back their choices and still mess it up objectively. The Souls games' save system is a great example of intentionally preventing you from save/loading your way to success, and it works perfectly. Half-Life lets you quicksave/quickload everywhere, and that works perfectly at its intended purpose as well (and lets the player choose how shamelessly to abuse it).

That's not really the case. Subconsciously, humans will cheat more the more ways you give them to cheat, and the easier you make it. For example, take the Skyrim DLC Dawnguard and the Vampire Lord ability. It is blatantly overpowered: unless you let enemies kill you, it is simply not possible to die while a Vampire Lord. There are no penalties to becoming one, and there is no cooldown on its use or activation.

I immediately noticed this and decided to only use it in dire circumstances. However, no matter how much I tried to ignore it, I had trouble not becoming a Vampire Lord in situations I could previously handle with a little pain and suffering; what once made the dungeons more enjoyable now simply made them tedious and easy. (I have more difficulty with Skyrim than most, who think every part can be beaten with fists and that Alduin can die in 3 hits.)

Suddenly the game just wasn't enjoyable to me. So to fix this I started the Companions quest line, which overrides your ability to become a Vampire Lord. Harder dungeons were once again the pain and suffering I wanted, and more enjoyable. I felt no need to become a Vampire Lord anymore SOLELY because I couldn't anymore.

This is a real psychological thing that happens with most people. Give them an easier path and they will have trouble intentionally choosing the harder one, unless you give them a greater reward for doing so. Remove the easier path, and gamers won't care anymore, because they no longer have to intentionally make the game harder. Half-Life 2 has an awful autosave system, so you practically have to use the quicksave and regular saves to avoid 20-minute gameplay losses. By the end of the game, it becomes very hard not to quicksave around every corner.

"Like the humble fedora (which as any man of class can tell you, should never be worn with short trousers)..."

That's why they invented the Tyrolean hat.
