Death to Good Graphics!


Graphics don't necessarily make a game fun, but they do make the game more engaging. I personally believe that if companies started to cut down on graphics, they would run a great risk of creating games unable to compete with the games that keep revamping their graphics. This is probably especially true for PC games, seeing that the majority of the PC gamers I know (including me) seem to believe that finding new tech that makes games look better is worth spending several hundred dollars on.

ender214:
Graphics don't necessarily make a game fun, but they do make the game more engaging. I personally believe that if companies started to cut down on graphics, they would run a great risk of creating games unable to compete with the games that keep revamping their graphics. This is probably especially true for PC games, seeing that the majority of the PC gamers I know (including me) seem to believe that finding new tech that makes games look better is worth spending several hundred dollars on.

I agree with this, but I think the point is not to make games that compete with each other on which one is better visually or which one uses the latest tech, but on which one is more fun to play and which one is more entertaining. When the game itself *does* benefit from having good graphics, then the developers should assign more resources to the graphics, but for games that do not really benefit from them, I think it is better to focus on the other aspects of the game and use graphics to augment the gameplay, not the other way around.

I believe the genre best placed to make good use of graphics and related technologies is pure adventure games (not action/adventure, just adventure). Graphics tend to be used for realism and cinematic effects, and no genre relies on storytelling as heavily as adventure games do, so they stand to benefit from them more than other genres (such as action games).

Combat Arms is a great example of old graphics, but amazing gameplay.

Shamus, I love your articles; I read them whenever they update. I agree with you on this one, but I really don't think the industry will have much of a choice soon but to follow the Valve and Blizzard model and make smart use of older technology. Moore's Law has been slowing down for a couple of years now - technology doesn't grow exponentially anymore - and we're getting close to the physical limitations on how small you can make chipsets. Once we reach that point, it'll take a REALLY HUGE technological advance to get past it (akin to discovering some Prothean ruins on the surface of Mars kind of "REALLY HUGE").

So yeah, the well will dry up eventually as far as the next technological graphical advance goes.

I'm probably wrong - I'm not a pundit of gaming technology; my computer is years old and looks it - but I have been watching the advances for some time and I can see the writing on the wall.

I like games that have a good, long story, good gameplay, are engaging, and are fun. Graphics are a plus, but three years into the future, what will be more important: the graphics or the game? Look at KOTOR: the graphics are mediocre compared to COD4, but it has a great story and so is remembered. In a few years Crysis will just be a footnote in game development. In a few years I expect the first holograms will start to be available. Or something.

Do you get what I am saying?

I know that graphics are pretty damn good as they are already. But I think better graphics, models, movements and such can add more appeal to a game, making it more enjoyable, with moments where you can just feel downright epic, or feel like you're actually there in the game.

I think people are too over the top about it, both ways.

People who always want better: Yes, graphics are nice and add appeal to a game experience. But that doesn't mean you always have to talk about them and wait for the next upcoming game just for the graphics.

People who think graphics are done: Yes, graphics are at a higher point than they've ever been, and they might not go much higher. But that doesn't mean you need to rant about it and bash people.

Bravo! Nicely put ^_^

This is basically what was going through my head when I got to the premium mods for Neverwinter Nights 2: the graphics looked awesome, but effects like shadows caused sluggish gameplay during battles. The money spent on upgrading their game engine could have been spent on better voice actors & maybe giving Storm of Zehir some actual gameplay & a deeper plot, so I didn't stop every 5 minutes to wish I was playing Neverwinter Nights 1 again.

I'm also awaiting Serious Sam 3. I still like the graphics from SS2 & don't think they really need to be improved upon.

Ooh, another reason graphics don't need to advance: players with ADD. Do we really need to stop & watch the wind blow through the grass? Do I have to let my enemy rail on me because I want to get a good look at what he's wearing? Do I have to get ambushed in a survival game because the house is full of shiny objects? No! But I do anyway, & that's why I suck at shooters.

Personally, I think developers should use whatever kind of graphics that they feel is appropriate for the kind of game they want to make and the kind of story that they want to tell.

The game that does not require me to buy a new PC will always have a better chance of being bought by me than the game that demands a ludicrous machine.

fck you crysis, not gonna buy you.

Credossuck:
The game that does not require me to buy a new PC will always have a better chance of being bought by me than the game that demands a ludicrous machine.

fck you crysis, not gonna buy you.

I think this brings up another interesting aspect of this discussion. Crysis and many of the other PC-melting graphical feasts are oftentimes optimized to scale quite well to older machines, but despite being capable of that, they are simply written off as something you will never buy. It's believed they can only run on some theoretical supercomputer, or there is a mystique that you are not playing the true experience if the resolution is not in the five-digit range.

I actually played the first FEAR back on a system that was only capable of running it at 800x600 on low or medium graphic settings. I still had a lot of fun playing it and was drawn in regardless of the fact that it could be played at a much higher graphic level.

During Mike Capps' keynote at TGC he mentioned having a small team working on the latest Unreal Tournament whose sole purpose was to get the game running on previous generations of technology. So developers are at least beginning to understand it's important to make their game available on the widest range of systems possible. One section of the gaming industry where you see this heavily is MMOs. MMOs make money by having lots of users, so having your game require, for instance, Pixel Shader 2.0 can drastically cut your user base. So they are designed graphically from the bottom up: what's the bare minimum we want this to run on, and how can we improve on that as the user's system improves?
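In rough code terms, that bottom-up approach comes down to something like this. This is just a sketch; the tier names and thresholds are invented for illustration, not taken from any real engine:

```cpp
// Pick the graphics quality tier "bottom up": start from the minimum spec the
// game must run on, then step up only as far as the detected hardware allows.
#include <cstdint>

enum class QualityTier { Minimum, Low, Medium, High };

struct GpuCaps {
    int pixelShaderMajor;      // e.g. 2 for Pixel Shader 2.0
    std::uint32_t videoMemMB;  // dedicated video memory in MB
};

QualityTier PickTier(const GpuCaps& caps) {
    QualityTier tier = QualityTier::Minimum;  // the bare-minimum floor
    if (caps.pixelShaderMajor >= 2 && caps.videoMemMB >= 128) tier = QualityTier::Low;
    if (caps.pixelShaderMajor >= 3 && caps.videoMemMB >= 256) tier = QualityTier::Medium;
    if (caps.pixelShaderMajor >= 4 && caps.videoMemMB >= 512) tier = QualityTier::High;
    return tier;
}
```

Each tier then just maps onto a preset (texture resolution, shadow quality and so on) that the player can still override by hand.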

I do agree with the sentiments expressed in the article. A good solid art design can make up for a few years of additional pixel count, and many games would be better off taking the time they invested in all those pixels and reinvesting it in core game design and gameplay.

Why can't we just accept that graphics are getting better in games and that technology is advancing?
I paid the price for a next-gen console, so I expect next-gen graphics, nothing less, nothing degenerative.

Great read!

Slycne:

Credossuck:
The game that does not require me to buy a new PC will always have a better chance of being bought by me than the game that demands a ludicrous machine.

fck you crysis, not gonna buy you.

I think this brings up another interesting aspect of this discussion. Crysis and many of the other PC-melting graphical feasts are oftentimes optimized to scale quite well to older machines, but despite being capable of that, they are simply written off as something you will never buy. It's believed they can only run on some theoretical supercomputer, or there is a mystique that you are not playing the true experience if the resolution is not in the five-digit range.

I actually played the first FEAR back on a system that was only capable of running it at 800x600 on low or medium graphic settings. I still had a lot of fun playing it and was drawn in regardless of the fact that it could be played at a much higher graphic level.

During Mike Capps' keynote at TGC he mentioned having a small team working on the latest Unreal Tournament whose sole purpose was to get the game running on previous generations of technology. So developers are at least beginning to understand it's important to make their game available on the widest range of systems possible. One section of the gaming industry where you see this heavily is MMOs. MMOs make money by having lots of users, so having your game require, for instance, Pixel Shader 2.0 can drastically cut your user base. So they are designed graphically from the bottom up: what's the bare minimum we want this to run on, and how can we improve on that as the user's system improves?

I do agree with the sentiments expressed in the article. A good solid art design can make up for a few years of additional pixel count, and many games would be better off taking the time they invested in all those pixels and reinvesting it in core game design and gameplay.

Crysis is about looking pretty. If the game won't look pretty on your machine, there is no point buying it.

Buy a Wii; those have crap graphics most of the time.

Slycne:
I do agree with the sentiments expressed in the article. A good solid art design can make up for a few years of additional pixel count, and many games would be better off taking the time they invested in all those pixels and reinvesting it in core game design and gameplay.

But that's assuming that the Graphics, Gameplay, and Story departments are all intertwined. While some form of connection between cubicles is inevitable, lowering the cost of graphics production doesn't automatically mean that the money cut from it will go over to the other departments (Gameplay, Story).

Credossuck:

Slycne:
snip

Crysis is about looking pretty. If the game won't look pretty on your machine, there is no point buying it.

Crysis was merely my jumping-off point. The game had a few interesting bits, like being able to change aspects of your suit and the open-world gameplay, but it didn't bring much else to the table.

Jumplion:

Slycne:
I do agree with the sentiments expressed in the article. A good solid art design can make up for a few years of additional pixel count, and many games would be better off taking the time they invested in all those pixels and reinvesting it in core game design and gameplay.

But that's assuming that the Graphics, Gameplay, and Story departments are all intertwined. While some form of connection between cubicles is inevitable, lowering the cost of graphics production doesn't automatically mean that the money cut from it will go over to the other departments (Gameplay, Story).

I certainly agree that game development is not a simple process, but I think it's safe to break it down like this: if X is a typical development cycle and you remove Y (the additional time spent working on your graphics suite), you could fill that time with a number of other things. That time could very well be spent on things other than gameplay, though. It's just as likely to go into sound, releasing a demo that you didn't previously plan on, or simply shipping earlier. The hope is that it would be used to polish the core design and gameplay, though.

Perfect article, perfect perfect perfect.
I agreed with everything said, EXCEPT...

I personally don't find the old-age pixel/block graphics offensive. Not as attractive as new graphics, but not offensive. I think if people weren't so head-over-heels for good graphics, they'd realize that the best of these old 'crappy-looking' games are at least as much fun as today's best titles.

'PC gaming is dying' only because every single new title A. is a female dog to install, partly due to DRM and mandatory account creation, and B. won't run unless you have a GeeForce9XJZROJGK Turbo II SLI setup.

There are very few gaming PCs, but there are a lot of COMPUTERS around. Make a game that will run on a mid-range netbook and the world is yours. World of Warcraft runs on anything sold in the past three years that doesn't have integrated video, and it rivals the best-selling console titles in popularity despite the monthly fee. Hell, 30K people online on *Diablo 2* beats most modern online games.

Why can't anyone just make a game that will run on a typical laptop (if at low settings), puts more effort into art direction than into anisodynamic megabloomshadows and allows anyone who doesn't know computers to install the game, create an account and download DLC without screwing up? Oh wait, the reviewers will call it a budget title. Never mind.

......

P.S. It is impossible to simulate reality properly because a lot of stimuli (like surround vision, sound, wind) are missing, so you just get a boring-looking game instead of a realistic-looking one. There's a reason why movies use colour filters and why Need for Speed Underground is more popular than Live for Speed.

Why not improve on reality through art design instead of attempting to copy reality? The medium is MADE for artistic creative worlds, not for perfectly recreating a drab looking battlefield.

I've believed graphics were overrated since Tomb Raider 2. I had a hard time getting others to see my point in high school.

Shibito091192:
Why can't we just accept that graphics are getting better in games and that technology is advancing?
I paid the price for a next-gen console, so I expect next-gen graphics, nothing less, nothing degenerative.

Well it's your money and you've got every right to expect to get your money's worth, but you missed the point. The graphics arms race is actually hurting game development and the industry at large, with ballooning budgets and more on the line with each game but less and less actual product being delivered--not to mention smaller profit margins for each sale. Combined with a poor global economy the current model is, frankly, a bit daft. Lower-graphics games don't have to sell more copies than their big-budget competition in order to be more profitable--or even near as many copies. On the PC especially.

While you're paying for better graphics, you're probably paying for less actual game while you're at it. Every dollar spent on graphics (and as an aside, celebrity voice actors) is a dollar not spent on producing more in-game content or ironing out bugs.

Valve's games run on a 5-year old graphics engine and they're still making a killing.

You might not like that people are looking at the economics of graphics bloat and shrinking back, but it doesn't change the fact that there's too much being spent for too little profit and every developer knows it. (Even Crytek, makers of Crysis, are stepping back from the bleeding edge.) The reason why they have to keep pushing graphics is because their publishers demand it, mostly to cater to gamers who keep buying the best graphics cards or leap onto the newest consoles only for the graphics, all of whom are becoming an increasingly less profitable source of income.

(There's more money to be had with casuals. Sad but true. Cheaper to make a casual game and more people are likely to buy them. The hardcore market has been effectively shooting itself in the foot for almost a decade now.)

The point most people seem to be making is that there has to be a happy medium between spending on graphics and spending on The Rest Of The Game. Good enough graphics to look competitive, at least when paired with competent art direction, but not so good that developers blow over half their budget on it.

Dorian Cornelius Jasper:

Shibito091192:
Why can't we just accept that graphics are getting better in games and that technology is advancing?
I paid the price for a next-gen console, so I expect next-gen graphics, nothing less, nothing degenerative.

Well it's your money and you've got every right to expect to get your money's worth, but you missed the point. The graphics arms race is actually hurting game development and the industry at large, with ballooning budgets and more on the line with each game but less and less actual product being delivered--not to mention smaller profit margins for each sale. Combined with a poor global economy the current model is, frankly, a bit daft. Lower-graphics games don't have to sell more copies than their big-budget competition in order to be more profitable--or even near as many copies. On the PC especially.

While you're paying for better graphics, you're probably paying for less actual game while you're at it. Every dollar spent on graphics (and as an aside, celebrity voice actors) is a dollar not spent on producing more in-game content or ironing out bugs.

Valve's games run on a 5-year old graphics engine and they're still making a killing.

You might not like that people are looking at the economics of graphics bloat and shrinking back, but it doesn't change the fact that there's too much being spent for too little profit and every developer knows it. (Even Crytek, makers of Crysis, are stepping back from the bleeding edge.) The reason why they have to keep pushing graphics is because their publishers demand it, mostly to cater to gamers who keep buying the best graphics cards or leap onto the newest consoles only for the graphics, all of whom are becoming an increasingly less profitable source of income.

(There's more money to be had with casuals. Sad but true. Cheaper to make a casual game and more people are likely to buy them. The hardcore market has been effectively shooting itself in the foot for almost a decade now.)

The point most people seem to be making is that there has to be a happy medium between spending on graphics and spending on The Rest Of The Game. Good enough graphics to look competitive, at least when paired with competent art direction, but not so good that developers blow over half their budget on it.

I disagree to an extent with you saying that developers focus on the graphics more than the actual gameplay or other aspects of the game. I agree, however, that games need to find a balance between graphics and gameplay, and I also believe that games are starting to do so.
For example, Fallout 3 had good graphics and fantastic gameplay.
However, Fallout 3's developer, Bethesda, no doubt spent a fortune on graphics and does have at least one celebrity voice actor in the game.

So Fallout 3 could be used as an example against your point that the more developers blow on superficial aspects of a game, the lower the quality of the gameplay and of the game itself.
Fallout 3 has all of the things you said make a game less good but it is still an award winning game...

Shibito091192:
Fallout 3 has all of the things you said make a game less good but it is still an award winning game...

Award-winning yes. But I'm something of a Fallout 1 and 2 grognard and F3 didn't impress me as much as it did others. However, compared to many other big-budget games, especially other first-person games, Fallout 3 did have much, much more content. It's less proof against my point and more proof for Shamus' point that developers need to find a sensible balance between spending on presentation and spending on content.

Note that even Bethesda admits that their visual production values are relatively low compared to other games in the latter half of this decade--ironic considering they earned a reputation for bleeding-edge graphics when Oblivion was released. Graphically speaking, Fallout 3 is actually a little behind the curve and still manages to look presentable, which lines up rather nicely with some points made in the thread.

Dorian Cornelius Jasper:

Shibito091192:
Fallout 3 has all of the things you said make a game less good but it is still an award winning game...

Award-winning yes. But I'm something of a Fallout 1 and 2 grognard and F3 didn't impress me as much as it did others. However, compared to many other big-budget games, especially other first-person games, Fallout 3 did have much, much more content. It's less proof against my point and more proof for Shamus' point that developers need to find a sensible balance between spending on presentation and spending on content.

Note that even Bethesda admits that their visual production values are relatively low compared to other games in the latter half of this decade--ironic considering they earned a reputation for bleeding-edge graphics when Oblivion was released. Graphically speaking, Fallout 3 is actually a little behind the curve and still manages to look presentable, which lines up rather nicely with some points made in the thread.

I agree that Fallout 3's graphics were not groundbreaking and I was actually quite disappointed at first... But then I realised and began to appreciate the effort that Bethesda had gone to to make the game detailed and full of content that keeps you interested and playing (graphics alone could not do this).
The slight downgrade in graphics was a worthwhile sacrifice that made Fallout 3 a great game. I think that more game developers should pay attention to Bethesda's infinite wisdom when it comes to making great games.

badsectoracula:
I fully agree with the article (and I'm a graphics programmer myself; I have worked in a games company and I've made a bunch of "indie" games).

However, it isn't the technology that is the problem. Really. If you think about it, making an engine doesn't take much time. What *does* take time is using the engine. This includes using the tools to create content. And by "tools" I mean both the engine-specific tools (such as the world editor) and the generic tools such as the 3D modelling program and the 2D content creation program (Max, Maya, Blender, Photoshop, etc.).

In the company I worked at we had some great art. Technically our engine wasn't anything spectacular or edge-pushing. In fact most of the art worked with your standard diffuse + normal + specular + lightmap rendering, plus some shadows, particles and other "common" effects. Yet the results we had (back then) were amazing.
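To give a rough idea of what that "standard" combination boils down to per pixel, here is a heavily simplified sketch, written as plain C++ rather than shader code; every name and number in it is made up:

```cpp
// Combine diffuse texture, normal map, specular map and lightmap for one pixel.
// Real engines do this in a fragment shader; this is only the general idea.
struct Vec3 { float x, y, z; };

static Vec3  Mul(Vec3 a, Vec3 b)    { return {a.x * b.x, a.y * b.y, a.z * b.z}; }
static Vec3  Add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  Scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float Dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }

// albedo   - diffuse texture sample (surface colour)
// normal   - per-pixel normal from the normal map (unit length)
// specMask - specular texture sample (how shiny this pixel is, 0..1)
// baked    - lightmap sample (precomputed static lighting)
// lightDir - direction towards the dynamic light (unit length)
Vec3 ShadePixel(Vec3 albedo, Vec3 normal, float specMask, Vec3 baked, Vec3 lightDir) {
    float ndotl = Dot(normal, lightDir);
    if (ndotl < 0.0f) ndotl = 0.0f;                             // only light the facing side
    Vec3 light  = Add(baked, Scale({1.0f, 1.0f, 1.0f}, ndotl)); // static + dynamic light
    Vec3 colour = Mul(albedo, light);                           // tint by surface colour
    float spec  = specMask * ndotl * ndotl;                     // crude specular highlight
    return Add(colour, Scale({1.0f, 1.0f, 1.0f}, spec));
}
```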

Why? Because we had about five times more artists than programmers.

Content creation is the bottleneck in game development, and this isn't always tied to technology. Sure, in the Wolf3D era one could make a good level in a single afternoon. Doom expanded this to a week; Quake... well, some people over at func_msgboard need at least a month to create a good-looking level. And if you notice, every single one of these games removed some limitation. As we get fewer and fewer limitations, we reach more "life-like" results for our virtual worlds, and these virtual worlds become more real.

The problem is, the real world took millions of years to become what it is today.

I believe today we have enough technology to display a very convincing imitation of a real world. But to display something, we first have to create it. And our technology on that front is far from ideal. For the last 20 years (and more) we have used the same principles developed by SGI to create world content. Triangles. 3D mesh models. Our 3D modelling tools are just extensions of the same tools people used on old SGI workstations. The Quake 1 models were made the same way the Crysis models were made: pushing triangles around. Sure, we now have ZBrush and similar tools, but these are just extensions of the basic idea of texture-mapped triangles.
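Concretely, underneath all those tools a mesh is still just this kind of data; the field names here are only illustrative, not any real engine's format:

```cpp
// A mesh as texture-mapped triangles: vertices with positions, normals and UVs,
// plus an index list where every three indices form one triangle.
#include <cstdint>
#include <vector>

struct Vertex {
    float px, py, pz;  // position
    float nx, ny, nz;  // normal
    float u, v;        // texture coordinates: where on the texture this vertex maps
};

struct Mesh {
    std::vector<Vertex>        vertices;
    std::vector<std::uint32_t> indices;           // 3 indices per triangle
    int                        diffuseTextureId;  // which texture is mapped onto the triangles
};
```

Everything from the Quake 1 tools to ZBrush ultimately ends up as data shaped more or less like this; the tools differ in how you push it around, not in what it is.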

Currently the industry tries to solve the problem in a "brute-force" manner: spending more money for more detailed content. I believe this is a short-term solution which has already started showing its drawbacks. The solution lies more in creating new technology designed to allow faster content creation than in pushing more detail. Currently the goal is to push more detail onto the screen without much thought for content creation speed. I believe this must be reversed: the goal should be to create content faster, with detail as a secondary goal. If we solve the fast content creation issue, adding detail will be a natural evolution of that.

I enjoyed reading that, and I agree with you. I believe this solution needs to be made quickly, because games are beginning to take years at a time to be created, and if they keep on the same track it will only take longer. If FFXII took four years to make, and Killzone 2 took five, what stops the next generation of titles from taking longer with higher risk due to the higher cost, with games costing around 60-65 pounds/dollars the way they are going now?

Shamus is right, optimise what you have, don't venture too far forward. I think it's likely that Sony will never take such a risk of such high calibre again.

Shibito091192:
Why can't we just accept that graphics are getting better in games and that technology is advancing?
I paid the price for a next-gen console, so I expect next-gen graphics, nothing less, nothing degenerative.

I'll play the snide bastard once again. You paid for a current-gen console, a games console. Hopefully you bought it for the games, in which case graphics come second in line to gameplay, and surely it should be forgiven if the graphics don't look great, as long as the gameplay is.

Thank you. Finally somebody agrees with me; I've only been saying this for about 2-3 years. Better graphics do nothing, really, to improve how much a game is enjoyed (unless you just enjoy the view, and in that case I want you to stay away from my house). The only thing they do is make most gamers nowadays spoiled on good graphics, and that causes them to hate the classics because "oh, this game sucks because I can't see the sweat on his face." Fuck you! Get over it; just because it doesn't have top-of-the-line graphics doesn't make it bad. Get over it, you spoiled Paris Hilton reject cell-phone-using fuck bags.

/rant

ChromeAlchemist:
I enjoyed reading that, and I agree with you. I believe this solution needs to be made quickly, because games are beginning to take years at a time to be created, and if they keep on the same track it will only take longer. If FFXII took four years to make, and Killzone 2 took five, what stops the next generation of titles from taking longer with higher risk due to the higher cost, with games costing around 60-65 pounds/dollars the way they are going now?

Shamus is right, optimise what you have, don't venture too far forward. I think it's likely that Sony will never take such a risk of such high calibre again.

But the only real way to improve and evolve gaming is to take risks of high calibre. It doesn't matter what the risk is.

Again, I bring up the difference in "visuals" and "graphics".

I'm not sure if you've played Killzone 2, but KZ2 has amazing visuals as well as awesome graphics. Everything has been paid attention to and has been given detail to the most outstanding degree.

While, yes, it doesn't do too much to innovate in the gameplay or story department, it's still as innovative as Mirror's Edge in its own way. How many games have paid so much attention to detail and atmosphere? Killzone 2's aesthetic feel is its innovation, and I hope more games continue to develop ways to make the game feel that much more immersive.

Of course, that is mostly my personal opinion of Killzone 2, but nobody can deny its attention to detail and graphical/visual achievement, whether they see it or not.

Jumplion:

ChromeAlchemist:
I enjoyed reading that, and I agree with you. I believe this solution needs to be made quickly, because games are beginning to take years at a time to be created, and if they keep on the same track it will only take longer. If FFXII took four years to make, and Killzone 2 took five, what stops the next generation of titles from taking longer with higher risk due to the higher cost, with games costing around 60-65 pounds/dollars the way they are going now?

Shamus is right, optimise what you have, don't venture too far forward. I think it's likely that Sony will never take such a risk of such high calibre again.

But the only real way to improve and evolve gaming is to take risks of high calibre. It doesn't matter what the risk is.

Again, I bring up the difference in "visuals" and "graphics".

I'm not sure if you've played Killzone 2, but KZ2 has amazing visuals as well as awesome graphics. Everything has been paid attention to and has been given detail to the most outstanding degree.

While, yes, it doesn't do too much to innovate in the gameplay or story department, it's still as innovative as Mirror's Edge in its own way. How many games have paid so much attention to detail and atmosphere? Killzone 2's aesthetic feel is its innovation, and I hope more games continue to develop ways to make the game feel that much more immersive.

Of course, that is mostly my personal opinion of Killzone 2, but nobody can deny its attention to detail and graphical/visual achievement, whether they see it or not.

But that's just it, though: why are developers in general going to take risks to evolve gaming if they could go bankrupt on a single game a la Haze? (You wouldn't believe I had to Google that; I had forgotten that title already.) If games get more expensive, things like this will become the exception and not the rule.

Detail-wise, yes, it is painstakingly detailed, but it's hardly the peak of this generation. Optimisation means you can get even more than that without venturing away from this generation's hardware. It's all well and good to improve and evolve gaming when you're a developer with a bottomless budget, but that idea isn't so appealing when you're a business trying not to go bankrupt on a single title.

This, in my opinion, could affect the kind of games these devs are putting out, hence the great deal of crap/similarity in content recently, because the grizzled space marine worked for the guy before them.

P.S. it's 4:47 here so forgive me if that didn't make sense, please let me know if it didn't.

This is something I've been saying for years.

Hell I still think Quake 2 looks good. (especially since I can finally run it on the highest settings.)

Kiutu:
If it ain't fun, it ain't fun. Graphics do not MAKE games fun. They can only augment it, but graphics that augment fun are usually considered bad graphics.

then there would be more people buying Wiis >_>

Shibito091192:
The slight downgrade in graphics was a worthwhile sacrifice that made Fallout 3 a great game. I think that more game developers should pay attention to Bethesda's infinite wisdom when it comes to making great games.

Then in the end we've come to an agreement.

Dorian Cornelius Jasper:

Shibito091192:
The slight downgrade in graphics was a worthwhile sacrifice that made Fallout 3 a great game. I think that more game developers should pay attention to Bethesda's infinite wisdom when it comes to making great games.

Then in the end we've come to an agreement.

I suppose in a way we have... That's a good thing.

Shamus Young:
Death to Good Graphics!

Shouldn't we all just get over the graphics thing, already?

Read Full Article

I'm sorry, but am I the only person who thinks we don't need to advance graphics any further at all? I mean, I think they're absolutely excellent now; why do people persist in raising the bar? I just want to keep things level: don't create new software, stick with the kit you have and make a good story...

(Congratulations Mass Effect >.>)

I don't understand why everyone seems to be obsessed with game length.

Having longer hallways, unnecessary cut-scenes, and more of the same enemy in the same rooms doesn't make a game "better", nor does it imply more "value".

Fredrick2003:
I don't understand why everyone seems to be obsessed with game length.

Having longer hallways, unnecessary cut-scenes, and more of the same enemy in the same rooms doesn't make a game "better", nor does it imply more "value".

But having a variety of levels and missions, more options and improved game mechanics does make a game better and gives it more value for money.

ChromeAlchemist:

Fredrick2003:
I don't understand why everyone seems to be obsessed with game length.

Having longer hallways, unnecessary cut-scenes, and more of the same enemy in the same rooms doesn't make a game "better", nor does it imply more "value".

But having a variety of levels and missions, more options and improved game mechanics does make a game better and gives it more value for money.

Agreed, but it seems that all too often "professional" reviewers will only mention the length of the game, not what it entails.

For example, a first person shooter could be around 3 hours long, and they will say something along the lines of "for 3 hours, this game is not worth your $60".

On the other hand, they could be reviewing an RPG, and say something along the lines of "the main story clocks in at around 80 hours, so you are definitely getting a lot of bang for your buck". But... They don't point out that 60 of those hours are pointless grinding and 10 more are annoying minigames.

I guess I just want more specifics in my reviews in general, honestly.
