Hardware Stagnation (Semi-Rant)

Last week I installed my first SSD to use instead of a regular hard drive, mainly because it was convenient; I was reformatting my hard drive at the same time anyway.

When I was doing research into SSDs, I noticed quite a lot of comments along the lines of 'x is dead/obsolete, x is becoming the new standard'. For example, some were saying that 4K is becoming standard, that platforms such as AM3+ are 'dead', and that 8GB of RAM isn't enough. I was tempted to upgrade, which got me thinking:

There seems to be a general opinion that you need ever more powerful hardware to be able to game, when in reality there has been little real change in the last five or so years.
Certain parts are more power efficient now, which is a good thing, but that seems about it.

My current specs:
CPU: AMD A8-6600K
MOBO: Gigabyte F2A55M-HD2
RAM: 8GB
GPU: Nvidia GTX 1050

I'm still using hardware from 2013 with no issues. Last year I upgraded my monitor (1680x1050 to 1080p) and my GPU (GTX 750 Ti to GTX 1050). The older parts went into another PC, which is the main reason I decided to upgrade; it didn't really seem necessary otherwise.

It's as though people are anticipating a big development of game technology in the next few years.
This year I played through games such as RE7 and didn't have issues.
I feel like there was an obvious 'explosion' of technological advancement around 2007, when it became a 'requirement' to upgrade your hardware to play newly released games like Crysis and other similarly demanding titles. I haven't really noticed another game like that recently, one that pushes the boundaries of hardware and prompts the playerbase to upgrade.

The only games that seem to struggle because of hardware are mostly poorly optimised ones.

TL;DR

The point I'm trying to make is that it doesn't seem necessary to write off hardware and upgrade components every two or so years just because it's "outdated".
8GB of RAM is still seemingly fine- I haven't played a game yet which has justified the use of any more than that. A fancy motherboard and so on seem useless if you're only using your PC for gaming.

Even if there were a sudden increase in games with more demanding requirements like in 2007 (using technology like this tech demo from 2007, which features techniques that still aren't really used: https://www.youtube.com/watch?v=3bKphYfUk-M ), there are so many games released recently that you'd probably have more than enough entertainment until newer hardware comes way down in price.
I currently have a backlog of over 50 games to get to at some point, mostly recent releases from the last two years.

Well, the key statement is "4k is the standard". It really isn't.

Most of the rest is just downstream from that initial flawed statement.

If you have a 4K setup, cool and all, but dropping a thousand bucks on a giant TV is still not the go-to for most folks (whether due to space concerns, budget concerns, or power-use concerns). The monitors, while cheaper, are also smaller and won't really show off the difference as much as a result.

On top of that, only the AAAest of AAA games even have 4K textures and assets to take advantage of it anyway.
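As a rough illustration of the resolution gap (pixel counts only, not a claim about actual GPU benchmarks), here's a quick comparison of some common resolutions:

# Rough pixel-count comparison of common resolutions, relative to 1080p.
# Per-frame GPU work scales roughly with pixel count, so this is only a
# ballpark illustration, not a benchmark.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "1600p (2560x1600)": 2560 * 1600,
    "4K (3840x2160)": 3840 * 2160,
}

baseline = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x 1080p)")

4K is roughly four times the pixels of 1080p, while a 1440p monitor sits at under twice, which is part of why the jump to 4K is such a big ask of a GPU.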

I'm using a lower resolution monitor now than a decade ago (in 2007, I had a 2560x1600 monitor; now it's 2560x1440).
CPUs have been long stagnant.
SSDs were a big leap forward... One that I made in 2009.
The move from dual-core to quad core on the desktop was a big one... Which was in 2007.

I'm a computer hardware enthusiast, and the last decade has been boring. 2007-2009 was probably the *end* of big gains on any front. Modern CPUs and GPUs are faster, sure, but when I compared my old 2500K to the then-shiny-new 7700K and saw little to distinguish the two, despite the six-year gap between their releases, it was hard to get excited.
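To put some rough numbers on that 2500K vs 7700K comparison, here's a back-of-envelope sketch using only the published core counts and base/boost clocks (it deliberately ignores IPC gains and the 7700K's hyper-threading, so treat it as illustrative only):

# Back-of-envelope CPU comparison using published core counts and clocks only.
# IPC improvements and hyper-threading are not modelled here.
cpus = {
    "i5-2500K (2011)": {"cores": 4, "base_ghz": 3.3, "boost_ghz": 3.7},
    "i7-7700K (2017)": {"cores": 4, "base_ghz": 4.2, "boost_ghz": 4.5},
}

old = cpus["i5-2500K (2011)"]
new = cpus["i7-7700K (2017)"]
print(f"Cores: {old['cores']} -> {new['cores']} (unchanged)")
print(f"Base clock: +{new['base_ghz'] / old['base_ghz'] - 1:.0%} over six years")
print(f"Boost clock: +{new['boost_ghz'] / old['boost_ghz'] - 1:.0%} over six years")

Same core count, and clocks only about a quarter higher after six years; real-world gains are bigger once IPC and hyper-threading are counted, but nothing like the leaps of earlier decades.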

And 1999-2009 was incredibly slow compared to 1989-1999, when the tech world was exploding (and kinda did at the end of 1999... in less good ways)

Up until less than a month ago I was still using my rig from 2008, upgraded to the gills. I could still play some new releases with mixed settings, but it finally died (don't ever clean your PC components with one of those electronic "data vacs", even with the PSU switched off). My precious motherboard conked out somewhere, but I was due for a rebuild within the next year anyway. Yeah, I really milk it haha.

If I could've stretched it out a few more months I might've gone for Coffee Lake, which moves to 6 cores (whenever programs start taking advantage of that, who knows), but I "settled" for a Z270 + Kaby Lake i7. I'm hoping to get as much life out of this build as the last one, possibly longer. I haven't even had time to install a game yet, but I'm looking forward to seeing what I've been missing lately on PC. I must admit my board is more tricked out than I'll probably ever need, but I bought it more for the durability rating than anything. I still have a backlog of games from the previous generation too.

In the meantime, with games like Horizon Zero Dawn, I've been pleasantly satisfied with current gaming visuals even on console, even though I'm sure the highest settings of some PC games technically outgun it. The law of diminishing returns is increasingly apparent, and I'm actually kind of relieved. I spent a lot of money last "gen" chasing frames in Crysis. Now I'll be happy just to play The Witcher 3 on high settings. I'm still using my old GPU for the time being though, waiting for some possible deals this month before digging into that upgrade properly. It's the one I've been waiting on the most these last couple of years.

It's kinda funny how console games are better graphically than PC games. Before I get a mob of PCMR after me, what I mean by that is that PC exclusive games aren't big on graphics compared to console games. Of course, a multiplatform game on PC can look better than the console version with a higher resolution and framerate. But an actual PC-only game like Divinity: Original Sin 2 (which will most likely end up on consoles like the first one) or League of Legends doesn't require much hardware to run. You can easily get a gaming desktop for around $100 because finding a refurb with a decent-to-good processor is so cheap, then just put in a somewhat decent video card and you can play pretty much any game worth playing that's only on PC. Even then, so many PC-style games that were once only found on PC, like the aforementioned Divinity along with Pillars of Eternity, Shadow Tactics, Cities: Skylines and PUBG, are on or coming to consoles. So you can have a gaming PC + a console for $300 (as PS4s were just $199), which is cheaper than building that "console-killing" potato masher PC.

Most of the most graphically intensive console games are first-party titles directly funded by Sony or Microsoft, as they serve as advertising for the platform. Because PC is a decentralised platform, there is no "platform holder" to pour huge sums of money into making tech-showcase games, so PC exclusives tend to be made under tighter budget constraints and are less graphically intensive.

Somebody's satisfied with a 1050! This can't be! Quick! Sound the PC Shill Alarms!

You have to remember that this is the internet. People with particular interests are going to find each other and talk about the topics they're passionate about, away from the opinions of less zealous participants in the same hobby. In internet forums and communities built around PC gaming, you're going to find more people than average who are interested in staying on the cutting edge of technology and pushing their games at high resolutions and high framerates. To them, what they're saying about outdated hardware is 100% true. Your mileage may vary, however, based on your desired gaming experience and budget. I for one would personally not consider building a PC with your hardware, but there is nothing wrong with you being happy with it.

Phoenixmgs:
It's kinda funny how console games are better graphically than PC games. Before I get a mob of PCMR after me, what I mean by that is that PC exclusive games aren't big on graphics compared to console games. Of course, a multiplatform game on PC can look better than the console version with a higher resolution and framerate. But an actual PC-only game like Divinity: Original Sin 2 (which will most likely end up on consoles like the first one) or League of Legends doesn't require much hardware to run. You can easily get a gaming desktop for around $100 because finding a refurb with a decent-to-good processor is so cheap, then just put in a somewhat decent video card and you can play pretty much any game worth playing that's only on PC. Even then, so many PC-style games that were once only found on PC, like the aforementioned Divinity along with Pillars of Eternity, Shadow Tactics, Cities: Skylines and PUBG, are on or coming to consoles. So you can have a gaming PC + a console for $300 (as PS4s were just $199), which is cheaper than building that "console-killing" potato masher PC.

Not trying to stir up any PCMR nonsense here, but you also chose some very particular examples. Divinity is made by Larian, which is by no means a big studio. As an indie, it makes sense to prioritize systems and mechanics over graphics for budgetary reasons. Likewise with Cities and Shadow Tactics. And LoL is meant to appeal to as large a population as possible to create a thriving competitive community. It makes no sense to push the graphics on a game like that because you want it to be as accessible as possible. Counter-Strike is another great example: there's a reason it doesn't look like ARMA, and it's because Valve wants basically anyone on Steam to be able to run it.

If you look at more niche PC exclusives like the Total War series, Project CARS, Star Citizen, the upcoming MechWarrior 5, and even smaller titles like The Vanishing of Ethan Carter, you see them taking advantage of PC hardware. And as you mentioned, multiplats benefit as well. I would argue that multiplats designed for PC first end up looking better. The Witcher 3, Prey, Wolfenstein, Doom, Shadow of Mordor, Deus Ex: these are all games with a PC pedigree, and I think everyone benefits from the bump in visuals that brings.

Phoenixmgs:
It's kinda funny how console games are better graphically than PC games. Before I get a mob of PCMR after me, what I mean by that is that PC exclusive games aren't big on graphics compared to console games. Of course, a multiplatform game on PC can look better than the console version with a higher resolution and framerate. But an actual PC-only game like Divinity: Original Sin 2 (which will most likely end up on consoles like the first one) or League of Legends doesn't require much hardware to run. You can easily get a gaming desktop for around $100 because finding a refurb with a decent-to-good processor is so cheap, then just put in a somewhat decent video card and you can play pretty much any game worth playing that's only on PC. Even then, so many PC-style games that were once only found on PC, like the aforementioned Divinity along with Pillars of Eternity, Shadow Tactics, Cities: Skylines and PUBG, are on or coming to consoles. So you can have a gaming PC + a console for $300 (as PS4s were just $199), which is cheaper than building that "console-killing" potato masher PC.

You could theoretically, but even that's a stretch imo. I don't know who would ever want to game on a $100 PC, even with an upgraded GPU. What kind of GPU would even work in a $100 PC anyways? Curious to see where you're getting that number from.

The beauty of PC hardware is that, if it still works for your needs, it's not really outdated.

Sure, it's older and often surpassed by later generations of hardware, but I have a spare PC still using a GTX 670 that plays Overwatch just fine.
Consoles get outdated because eventually software stops being released for them, or at least dramatically slows down. On PC, as long as you have an OS from roughly the last decade, you can play almost any modern title on a wide range of hardware.

If we're talking strictly about gaming, there are more games getting close to using 8GB of RAM and above, but not nearly enough that you'd need to worry about it, and that's really only if you're planning on playing at the highest settings.

I've noticed that a lot of PC ports lately have settings that don't affect visuals nearly as much as they do performance. A lot of games look nearly as good on Medium as they do on High unless you're REALLY looking for the difference, with texture resolution often being the one option that makes the biggest visual impact.

It was late when I posted so I didn't really construct my post as well as I could've and some of my thoughts came out a bit randomly. I'll try to organise them a little more to make more sense.

I think one of the points I'm trying to make is that the game industry as a whole seems to have reached an 'impasse'.
It's expensive to make 'Crysis'-type games that push the boundaries of what hardware is capable of. Thus, there are only a few such games, like Deus Ex: Mankind Divided.

With the shift towards 'fewer AAA companies, more indie companies', the average game has a relatively lower budget compared to a few years ago. Companies that are capable of producing 'Crysis'-type titles are more concerned about the advertising budget than trying to make a hardware-demanding title, which makes sense.

There are advanced physics engines such as PhysX and Euphoria, but these are mostly tied to certain companies.

So, I believe that the average system requirements have reached a peak, and won't really be any higher for some time.

Anyway,

I'm far from a graphics enthusiast. If it plays smoothly without stuttering then it works for me. I played through Fallout (1997) before writing that post, and thought it still looked OK.

Some people are saying "X is becoming standard" which makes it sound like games in general are becoming more demanding of hardware than they actually are.

There's absolutely nothing wrong with having a 4K monitor, high-end CPU, fancy motherboard and so on. If you actually have a need for it, like using your PC as a workstation, then that's perfectly justified. I mean, my motherboard doesn't have USB 3.0/SATA3 support, which would be nice, but I don't need to upgrade until I know I'll have a use for it.

Although I've heard some people are upgrading every three or so years, which seems wasteful to me. Hardware manufacturers obviously need to stay in business, but there's a point where hardware gets classed as 'obsolete' too quickly. With computer components being such a commodity, upgrading your hardware just because of what people are saying can be a harmful mindset.

However, for the average consumer, I feel there's a fair amount of misinformation.
I've been told before that it was a bad decision to get a 1050 instead of a 1050 Ti or better. But I haven't really noticed any situations yet where having a more expensive GPU would've made my experience better.
For myself, out of the 50+ games in my backlog, only around 10 of them actually have 8GB RAM in the recommended specs.

(I actually used my old 750 Ti and a Phenom II X4 955 CPU to fix up an old PC from 2008. It runs perfectly fine and plays pretty much every modern game I wanted it to. The only limitation is the motherboard being AM2+, so the X4 955 is the best CPU it can take, and it's capped at 8GB of RAM.)

StupidNincompoop:

I think one of the points I'm trying to make is that the game industry as a whole seems to have reached an 'impasse'.
It's expensive to make 'Crysis'-type games that push the boundaries of what hardware is capable of. Thus, there are only a few such games, like Deus Ex: Mankind Divided.

Yeah, I think pretty much the same. This gen has been the smallest graphical leap of any generation so far. I think that's the reason console fans and companies have latched onto FPS and resolution and won't stop talking about them, as at the end of the day games don't look massively different today compared to those of the 360/PS3 gen. The level of graphical fidelity we currently have seems to be the peak in terms of cost to make versus profit returned.

JUMBO PALACE:
Not trying to stir up any PCMR nonsense here, but you also chose some very particular examples. Divinity is made by Larian, which is by no means a big studio. As an indie, it makes sense to prioritize systems and mechanics over graphics for budgetary reasons. Likewise with Cities and Shadow Tactics. And LoL is meant to appeal to as large a population as possible to create a thriving competitive community. It makes no sense to push the graphics on a game like that because you want it to be as accessible as possible. Counter-Strike is another great example: there's a reason it doesn't look like ARMA, and it's because Valve wants basically anyone on Steam to be able to run it.

If you look at more niche PC exclusives like the Total War series, Project CARS, Star Citizen, the upcoming MechWarrior 5, and even smaller titles like The Vanishing of Ethan Carter, you see them taking advantage of PC hardware. And as you mentioned, multiplats benefit as well. I would argue that multiplats designed for PC first end up looking better. The Witcher 3, Prey, Wolfenstein, Doom, Shadow of Mordor, Deus Ex: these are all games with a PC pedigree, and I think everyone benefits from the bump in visuals that brings.

I was just pointing out that some of the most popular PC exclusives don't really push the graphics at all, which doesn't make the games any worse or better really. I'm just saying it's not like there are PC exclusives that completely shit all over the graphics you normally find on consoles (outside of the resolution and fps bumps). Chances are high that if there is a PC exclusive you do want to play, the PC required to run it at least decently will actually cost less than a console. I'm playing the first Divinity OS on a desktop that doesn't even have a video card. Of course, I get that a game that releases on the three main platforms will sell more, bringing in more revenue, meaning the publisher can put more money towards those titles. But as the TC has said, you don't really need a "gaming PC" to game on PC, because you can get by with "outdated" hardware much longer than you used to.

hanselthecaretaker:
You could theoretically, but even that's a stretch imo. I don't know who would ever want to game on a $100 PC, even with an upgraded GPU. What kind of GPU would even work in a $100 PC anyways? Curious to see where you're getting that number from.

I worked for a company that recycled computers and sold the working ones on eBay as refurbs, and you can easily get a fully working desktop refurb with a pretty decent processor for around $50. You might need to add an extra RAM stick, but mainly you'd need to add a decent video card, and then you can play basically any PC exclusive just fine. $100 was probably a bit low, but you could easily put together a good enough PC for $150 with very little effort. It's been quite a while since video cards used a different slot (AGP has been pretty much extinct for ages), so the mobo isn't going to have an incompatible slot for the video card. The only thing to really look out for is getting a small, compact business-class case, where you can run into space issues, but that's really it. That option is mainly for a console gamer who wants to play the occasional PC game, versus a gamer who primarily games on PC.
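As a rough tally of the kind of build being described (the line items and prices below are just illustrative assumptions in line with the ballpark figures above, not real listings):

# Illustrative parts tally for a refurb-based budget gaming PC.
# Prices are assumptions matching the ballpark figures in the post above.
build = {
    "refurbished office desktop": 50,
    "extra stick of RAM": 20,
    "used/budget video card": 80,
}

for part, price in build.items():
    print(f"{part}: ${price}")
print(f"Total: ${sum(build.values())}")  # lands around the ~$150 estimate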

Myeah, I mean that's the thing. Because most games are made for consoles, and consoles are basically a snapshot in time, you really don't need a powerhouse PC to get a great gaming experience.

I'm looking at upgrading my girlfriend's PC as a gift, and looking at some benchmarks, a PC with a Pentium G4560 and a GT 1030 has some surprisingly strong legs for such a budget build.

In the end, people see PCs as these super expensive money sinks, but in reality they're getting more and more approachable.

 
