Mark Rein: Intel Held Back Innovation On the PC

Epic Games president Mark Rein claims that he tried for years to convince Intel to fix its graphics.

Unless you are playing Crytek's latest tech-demo-disguised-as-a-game, chances are you aren't stressing your PC to the absolute maximum. Very few PC games these days force gamers to upgrade their graphics cards, whereas just a few years ago it was quite common for games to really push the boundaries of PC graphics technology. I rode out an Nvidia 8800 GT for something ridiculous like five years before seeing the need to upgrade. Epic Games president Mark Rein says there is more at work here than just low-system-requirement indie games, blaming Intel for directly holding back innovation on the PC.

"For years we tried to convince Intel to fix their graphics," claims Rein via Twitter, "but their data said they were good enough. PC innovation suffered for it." Rein didn't clarify the exact timeframe these attempts took place in, but later replied to a user that it was a time when "Intel still owned the lions' share of the graphics market with integrated. That's why their data said it was good enough."

This is a very interesting perspective to anyone who knows anything about PC building. Intel dominates installations thanks to integration, and for everyday desktop activities like browsing and video playback, Intel's hardware is more than adequate for the majority of users. Simply put, even though graphics giants Nvidia and AMD keep pushing each other with more and more powerful cards, it doesn't amount to much when the majority of users are sitting on Intel's integrated chips, which means PC developers have to scale games back to the "lowest common denominator."
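To make that "lowest common denominator" point concrete, here is a purely illustrative sketch (not taken from any real engine; the tier names and presets are invented for the example) of how a game might pick a default quality preset from a rough GPU tier detected at startup. The bottom tier, i.e. integrated graphics, is what every asset and effect still has to look acceptable on.

```cpp
#include <iostream>
#include <string>

// Hypothetical GPU tiers a launcher might derive from the detected adapter.
enum class GpuTier { IntegratedBasic, IntegratedModern, DedicatedMidRange, DedicatedHighEnd };

// Map a tier to a default quality preset. The lowest tier defines the floor
// that every asset and effect in the game still has to support.
std::string ChoosePreset(GpuTier tier) {
    switch (tier) {
        case GpuTier::IntegratedBasic:   return "Low (reduced resolution, no dynamic shadows)";
        case GpuTier::IntegratedModern:  return "Medium";
        case GpuTier::DedicatedMidRange: return "High";
        case GpuTier::DedicatedHighEnd:  return "Ultra";
    }
    return "Low";  // fallback
}

int main() {
    // Pretend the detected adapter is an older integrated chip.
    GpuTier detected = GpuTier::IntegratedBasic;
    std::cout << "Selected preset: " << ChoosePreset(detected) << "\n";
    return 0;
}
```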

These days, Intel has picked up the slack a bit, doubling and tripling the power of its integrated GPUs, but who knows what kind of super crazy realistic graphics we could be looking at if they had gotten their act together back in the day?

Source: Twitter


Most people who care won't be playing games on intelgrated graphics though, they'll seek out a dedicated chip or use a card in their desktop.

If you must use an integrated graphics system, why would you use Intel and not AMD? The performance difference is night and day; the recent APUs give really solid performance for the budget...

"PC innovation suffered for it"
Did it Mark? Did it? How, exactly? Innovation isn't graphics.

Personally, I'm glad for this. There was a time when it felt like games forced a requirement to upgrade your graphics card ridiculously often, but I've had the same graphics card for much longer and it's still very high end, letting me stick everything on max settings without worrying or paying out large sums.

I think it's better for developers to work under some level of constraint, asking for some engine optimization rather than just upping minimum requirements.

I also think that the biggest thing holding graphics back is actually that most PC games with high graphics are multi-platform with consoles.

But what is so great about pushing so fast anyway? I had an 8800GT that I got when it first came out and only replaced it about 6 months ago. And even then, I felt it was not because the games were really pushing the envelope, but because developers have gotten lazy about optimizing their games. Every CoD game has run at 60FPS on the same console hardware with practically the same engine. My rig ran CoD4 and World at War beautifully, and it was all downhill from there. The initial release of Black Ops was almost unplayable until it was patched a half dozen times to fix CPU optimization problems. The same engine on the same hardware: the console version is optimized to run flawlessly, and my PC just gets a messy pile of unoptimized code. I stopped caring after Black Ops, haven't touched the series since...

Polygons are emotions, sorry that slipped out.

While it might be a shame that we could have a lot better graphics by now. It does mean people get more out of their hardware before having to upgrade.

There are enough successful games out there that only work on powerful dedicated cards. Sounds like a weak excuse.

If it's because he just wants to attract the widest audience possible, he can quit Epic and go work for Rovio and design the next Angry Birds. I'm sure he won't be missed.

I don't see why non-gamers should have to pay for an expensive GPU that they will never use.

Also: Graphics != Innovation

Why would Epic Games care about Intel integrated anyway? You can't play anything on those chips. No joke, I got an AMD laptop when I was sixteen, nearly five years ago. It could run Mass Effect 2 at about 20 fps. I lent my copy to my cousin, who had an Intel laptop he'd just got that year (2010), and it was unplayable. I suppose it is a shame for people who may want to play games on their regular PCs (i.e. teenagers on their laptops) that the prebuilt PC space is dominated by Intel.

Coming from Epic Games... the guys who haven't done anything of note on PC since 2004.

But anyway, anyone who wanted to play higher-profile games knew damn well the cheap Intel shit would not do it, so it really had no influence on this. Intel was only the prime graphics producer because most PCs aren't used for games at all.

I can't wait to see Jim's take on this.

9thRequiem:
"PC innovation suffered for it"
Did it Mark? Did it? How, exactly? Innovation isn't graphics.

Personally, I'm glad for this. There was a time when it felt like games forced a requirement to upgrade your graphics card ridiculously often, but I've had the same graphics card for much longer and it's still very high end, letting me stick everything on max settings without worrying or paying out large sums.

I think it's better for developers to work under some level of constraint, asking for some engine optimization rather than just upping minimum requirements.

I also think that the biggest thing holding graphics back is actually that most PC games with high graphics are multi-platform with consoles.

What he said. I also don't mind games not advancing in graphics and requiring an upgrade every damn year. Whether it's because of Intel or consoles, it's a GOOD thing. Lets more people jump into PC gaming. Again, that's GOOD.

Graphics =/= innovation and I freaking wish more people got that through their heads.

We could have better graphics?! Srs?!

We've all seen the result of games pushing the graphics envelope (with the possible exception of Crysis)... by and large they end up being drastically over-budgeted, resulting in massive minimum-sale targets, shallower story/design/mechanics, etc., and often a boatload of bugs because they ran out of money before they had time to polish.

I, for one, am grateful that some studios are starting to realise that there's more to making awesome games than photorealistic graphics!

Pretty sure every single person who bought Battlefield 3 on PC, for example, would have had non-integrated graphics. Intel didn't hold back innovation with underpowered integrated chips; it just reduced the potential market share of the PC platform. Did that hold back innovation in gaming? Well, yes, it will have. Now that PC is suddenly a big player, almost certainly the biggest overall (LoL alone dwarfs CoD on all 3 platforms combined in concurrent player count, for example), we're seeing tons of innovation from a thriving indie scene and even kickstarted AAA titles.

Interestingly, even without most PC users being capable of playing AAA games, the PC is still on par with the 360 in market share a lot of the time in financial reports. So, with Intel really gunning for good integrated graphics now, there are going to be a LOT of PCs out there which will be fully capable of playing games. I'll give it 3 years before the upper echelons of integrated graphics (like AMD A8 APUs and Intel's highest-end variant) are on par with a PS4.

That didn't happen last time around, and it's only recently (2 years or so) that integrated graphics beat the consoles, so the PS3 and 360 could grow their market share nicely. That shift will provide an interesting marketing problem for console manufacturers. How can they convince people to buy a PS4/Infinity if their supermarket-bought laptop - the one they needed anyway for work etc. - is more powerful, with cheaper games, and can be played on the train?

SkarKrow:
Most people who care won't be playing games on intelgrated graphics though, they'll seek out a dedicated chip or use a card in their desktop.

If you must use an integrated graphics system, why would you use Intel and not AMD? The performance difference is night and day; the recent APUs give really solid performance for the budget...

All of the above. Seriously, nobody interested in gaming (or system building) is going to fork out the beans for an i7-3770K processor and then say "you know what, the integrated HD4000 graphics are fine".

AMD's APUs have the low end of the market locked up right now. The built-in graphics on even a relatively lowly A4 mean that they totally whomp Intel up to about the mid-i3. As soon as you get past that though, the raw CPU power of the Intels takes over; and Intel are fine with lesser integrated performance because they know that virtually nobody will use it.

All PC builders learn very early that a machine is only as fast as its weakest component. Historically that always meant the hard drive (Windows 7 Experience Index 5.9 GO GO GO), but SSDs have come down in price enough now for that not to be the issue. Now everything else is fair game, and that means the graphics are in the mix. Integrated is fine for home theater and other mini PCs where space for additional cards and cooling is at a premium. Want to do anything worthwhile? You need dedicated graphics. It's hardly Intel's fault that they've recognised this and aimed at the CPU Power end of the market rather than the "just enough graphics to play Angry Birds" end.
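To put the "weakest component" rule in frame-time terms: a frame can't finish faster than whichever of the CPU or GPU takes longer on it. A toy illustration with made-up numbers (nothing here is benchmarked, it's just the arithmetic):

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Made-up per-frame costs in milliseconds for an integrated-graphics system.
    double cpu_ms = 8.0;    // game logic and draw-call submission
    double gpu_ms = 33.0;   // rendering on a weak integrated GPU
    double frame_ms = std::max(cpu_ms, gpu_ms);  // the slower side sets the pace

    std::printf("Frame time: %.1f ms (~%.0f fps) -- the GPU is the bottleneck here\n",
                frame_ms, 1000.0 / frame_ms);
    return 0;
}
```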

All that said, of course, if reining back the graphics for so long has enabled the indies to get on with doing their thing and reduce the number of games that emphasise graphics over, you know, game, then Intel can keep on reining.

I believe the point, though, is this: think of it from a manufacturing standpoint. If Intel's integrated graphics had been kept at an accelerated pace, the Intel HD 3000 would be about equivalent to an Nvidia 8800GT and considerably cheaper to manufacture to boot. Imagine being able to game at mid-range settings on a 400 laptop instead of a 1000 one; it would have revolutionised the market.

Which I believe is the point: from a purely speculative standpoint it's interesting to consider how things could have gone.

Not to mention, if they could get those results cheaply enough it would have also been good for the console market, good for the goose good for the gander etc.

Of course all this is purely speculation on my part.

So why would the president of Epic Games care about this? Because it would mean that the installed user base of people able to game on PC without spending a fortune would have skyrocketed. Meaning a potential increase in sales, and more filthy casuals turning to hardcore experiences.

Hypothetically of course... and as someone who does long-haul travel a lot, being able to get in DX:HR at a decent clip and settings on my lappy would have been welcomed.

Rellik San:
I believe the point, though, is this: think of it from a manufacturing standpoint. If Intel's integrated graphics had been kept at an accelerated pace, the Intel HD 3000 would be about equivalent to an Nvidia 8800GT and considerably cheaper to manufacture to boot. Imagine being able to game at mid-range settings on a 400 laptop instead of a 1000 one; it would have revolutionised the market.

Which I believe is the point: from a purely speculative standpoint it's interesting to consider how things could have gone.

Not to mention, if they could get those results cheaply enough it would have also been good for the console market, good for the goose good for the gander etc.

Of course all this is purely speculation on my part.

So why would the president of Epic Games care about this? Because it would mean that the installed user base of people able to game on PC without spending a fortune would have skyrocketed. Meaning a potential increase in sales, and more filthy casuals turning to hardcore experiences.

Hypothetically of course... and as someone who does long-haul travel a lot, being able to get in DX:HR at a decent clip and settings on my lappy would have been welcomed.

The bread and butter of PC sales is the business market. The cost of upping the integrated chipset would have been passed on to the consumer, but better graphics aren't a selling point to the majority of the market. If you are buying a laptop or desktop for a business, a high fps in games isn't a consideration.

Awww man! You mean all this time PC developers could have been even BIGGER graphics whores? All those poor companies who never got the chance to run themselves out of business by over-developing their games :(

What? By far the biggest improvement (in terms of performance) in Intel's chips over the last few generations has been the iGPU.

If they had pushed for better Intel graphics years ago, Nvidia and AMD would be exactly where they are now. They have been pushing each other hard; Intel having better integrated graphics wouldn't change that.

What better integrated graphics means practically is that every computer is a gaming computer. If you can play a game like TF2 at medium settings on integrated graphics, it can really boost PC gaming.

Steven Bogos:
Soruce: Twitter

Spelling error, Steven.

Kinitawowi:

SkarKrow:
Most people who care won't be playing games on intelgrated graphics though, they'll seek out a dedicated chip or use a card in their desktop.

If you must use an integrated graphics system, why would you use Intel and not AMD? The performance difference is night and day; the recent APUs give really solid performance for the budget...

All of the above. Seriously, nobody interested in gaming (or system building) is going to fork out the beans for an i7-3770K processor and then say "you know what, the integrated HD4000 graphics are fine".

AMD's APUs have the low end of the market locked up right now. The built-in graphics on even a relatively lowly A4 mean that they totally whomp Intel up to about the mid-i3. As soon as you get past that though, the raw CPU power of the Intels takes over; and Intel are fine with lesser integrated performance because they know that virtually nobody will use it.

All PC builders learn very early that a machine is only as fast as its weakest component. Historically that always meant the hard drive (Windows 7 Experience Index 5.9 GO GO GO), but SSDs have come down in price enough now for that not to be the issue. Now everything else is fair game, and that means the graphics are in the mix. Integrated is fine for home theater and other mini PCs where space for additional cards and cooling is at a premium. Want to do anything worthwhile? You need dedicated graphics. It's hardly Intel's fault that they've recognised this and aimed at the CPU Power end of the market rather than the "just enough graphics to play Angry Birds" end.

All that said, of course, if reining back the graphics for so long has enabled the indies to get on with doing their thing and reduce the number of games that emphasise graphics over, you know, game, then Intel can keep on reining.

An A8 or A10 with some 1600+ memory will mop the floor with an i3 system, and you can build it dirt cheap; you could have an A10 5800K system for under 300 easily. I really can't recommend the i3 to anyone over even AMD's higher options, though: a true quad-core such as a later Phenom picked up on the cheap will serve you better.

Honestly, it's not so clear-cut even in the mid-range to just go with Intel. The FX chips are very well priced against the competition; you can have an 8350 for as much as 40 less than a 3570K and it's neck and neck with it in most things. Some games favour Intel and some favour AMD, but AMD tends to have the edge in multithreading, so streaming or compressing high-def video is easier. Hence I'd recommend the 6350 or 8350 if you plan to do any streaming: it's cheaper, gives you a 40-80 saving on the CPU and around 70 on a motherboard with the same feature set, and you could spend that on something much better for gaming - the graphics card.

Buuuuut those i5s are better for word processing, browsing, etc., and single threads in general. I do think the current Piledriver chips will age a bit better than Ivy Bridge though, since console ports will soon be using those extra threads.

If you've got the cash to burn, though, nothing touches Socket 2011.

I think we long passed the point where higher fidelity, higher texture resolution, and higher polygon count graphics really make any difference to the gaming experience. Nowadays, it seems actual human art direction, as opposed to raw calculating prowess, is more important. Also, as someone above pointed out, sometimes the need for higher-performing hardware is not because the game engine is really doing anything that spectacular; it's because the developers were so butt-fucking lazy/incompetent about optimizing their algorithms and code. Come to think of it, relying on higher-end hardware to make great-looking imagery is also lazy, because you're just trying to calculate your way to pretty pictures rather than having any real artistic skill, aesthetics, or sensibilities (any trained monkey can make a highly detailed, pretty picture, but only an artist can make something that truly moves you).

SkarKrow:
An A8 or A10 with some 1600+ memory will mop the floor with an i3 system, and you can build it dirt cheap; you could have an A10 5800K system for under 300 easily. I really can't recommend the i3 to anyone over even AMD's higher options, though: a true quad-core such as a later Phenom picked up on the cheap will serve you better.

Honestly, it's not so clear-cut even in the mid-range to just go with Intel. The FX chips are very well priced against the competition; you can have an 8350 for as much as 40 less than a 3570K and it's neck and neck with it in most things. Some games favour Intel and some favour AMD, but AMD tends to have the edge in multithreading, so streaming or compressing high-def video is easier. Hence I'd recommend the 6350 or 8350 if you plan to do any streaming: it's cheaper, gives you a 40-80 saving on the CPU and around 70 on a motherboard with the same feature set, and you could spend that on something much better for gaming - the graphics card.

Buuuuut those i5s are better for word processing, browsing, etc., and single threads in general. I do think the current Piledriver chips will age a bit better than Ivy Bridge though, since console ports will soon be using those extra threads.

If you've got the cash to burn, though, nothing touches Socket 2011.

Personally I still do a lot with emulation, so the raw clock on single threads is more of a concern (last I checked, MAME still hadn't found a decent way to split to multiple cores). But yeah, AMD have always been more interested in the graphics processing performance than Intel have so maybe it'll cope better with video processing and multicore in the long run, but... eh.

I'm planning on an LGA 1150 build sometime next month - my old Core 2 Duo build isn't quite cutting it any more (curse my new monitor!)...

Kinitawowi:

SkarKrow:
An A8 or A10 with some 1600+ memory will mop the floor with an i3 system, and you can build it dirt cheap; you could have an A10 5800K system for under 300 easily. I really can't recommend the i3 to anyone over even AMD's higher options, though: a true quad-core such as a later Phenom picked up on the cheap will serve you better.

Honestly, it's not so clear-cut even in the mid-range to just go with Intel. The FX chips are very well priced against the competition; you can have an 8350 for as much as 40 less than a 3570K and it's neck and neck with it in most things. Some games favour Intel and some favour AMD, but AMD tends to have the edge in multithreading, so streaming or compressing high-def video is easier. Hence I'd recommend the 6350 or 8350 if you plan to do any streaming: it's cheaper, gives you a 40-80 saving on the CPU and around 70 on a motherboard with the same feature set, and you could spend that on something much better for gaming - the graphics card.

Buuuuut those i5s are better for word processing, browsing, etc., and single threads in general. I do think the current Piledriver chips will age a bit better than Ivy Bridge though, since console ports will soon be using those extra threads.

If you've got the cash to burn, though, nothing touches Socket 2011.

Personally I still do a lot with emulation, so the raw clock on single threads is more of a concern (last I checked, MAME still hadn't found a decent way to split to multiple cores). But yeah, AMD have always been more interested in the graphics processing performance than Intel have so maybe it'll cope better with video processing and multicore in the long run, but... eh.

I'm planning on an LGA 1150 build sometime next month - my old Core 2 Duo build isn't quite cutting it any more (curse my new monitor!)...

Pentium Dual-Core E5700 still here... it works alright, I guess; I can run Far Cry 3 between it and an HD 4770...

I'm planning a build based around the FX 6350 and a 7870 Tahiti LE card, maybe get me some 2400MHz Kingston Beast in there. Y'know, for overkill... I really need a job before I can do that though...

SkarKrow:
Pentium Dual-Core E5700 still here... it works alright, I guess; I can run Far Cry 3 between it and an HD 4770...

I'm planning a build based around the FX 6350 and a 7870 Tahiti LE card, maybe get me some 2400MHz Kingston Beast in there. Y'know, for overkill... I really need a job before I can do that though...

My E6850 on an HD 6770 has served me great... right up until the moment I upgraded from a (nine-year-old and finally deceased) 17" to a 24" monitor, and in turn from 1280x1024 to 1920x1200. Then it started struggling a bit. ;-) Didn't help that it had to drop from 8GB to 4GB because of dead sticks (and there's no real point buying replacement DDR2 for it).

i5-4670K and 7870 is my likely route, although I'll check some specifics nearer the time.

Kinitawowi:

SkarKrow:
Pentium Dual-Core E5700 still here... it works alright, I guess; I can run Far Cry 3 between it and an HD 4770...

I'm planning a build based around the FX 6350 and a 7870 Tahiti LE card, maybe get me some 2400MHz Kingston Beast in there. Y'know, for overkill... I really need a job before I can do that though...

My E6850 on an HD 6770 has served me great... right up until the moment I upgraded from a (nine-year-old and finally deceased) 17" to a 24" monitor, and in turn from 1280x1024 to 1920x1200. Then it started struggling a bit. ;-) Didn't help that it had to drop from 8GB to 4GB because of dead sticks (and there's no real point buying replacement DDR2 for it).

i5-4670K and 7870 is my likely route, although I'll check some specifics nearer the time.

The 7870 Tahiti cards are utter monsters for the price; they're the default recommendation these days. I'll be considering Haswell chips when they hit, but I'm pretty wary of the platform cost: I can have a great AMD build for under 700, but the equivalent Ivy Bridge build is a good 850 or so.

If Haswell is just a 10-15% performance boost, I'll be ignoring it and waiting to see what Steamroller brings.

This is completely stupid. We aren't going to be seeing super-realistic games by getting better integrated graphics; that's decided by the top-end cards, and when a dev has a vision for a game that will not be scalable down to integrated PCs, they ignore that demographic. Here's a fact: integrated graphics will NEVER be a match for a dedicated graphics card. If there are devs holding back their own games to accommodate integrated setups, those devs are the problem, not the state of the hardware, and there will always be devs who don't bother and are happy to push ahead with games that are for graphics cards only.

Kinitawowi:

SkarKrow:
Most people who care won't be playing games on intelgrated graphics though, they'll seek out a dedicated chip or use a card in their desktop.

If you must use an integrated graphics system, why would you use Intel and not AMD? The performance difference is night and day; the recent APUs give really solid performance for the budget...

All of the above. Seriously, nobody interested in gaming (or system building) is going to fork out the beans for an i7-3770K processor and then say "you know what, the integrated HD4000 graphics are fine".

AMD's APUs have the low end of the market locked up right now. The built-in graphics on even a relatively lowly A4 mean that they totally whomp Intel up to about the mid-i3. As soon as you get past that though, the raw CPU power of the Intels takes over; and Intel are fine with lesser integrated performance because they know that virtually nobody will use it.

All PC builders learn very early that a machine is only as fast as its weakest component. Historically that always meant the hard drive (Windows 7 Experience Index 5.9 GO GO GO), but SSDs have come down in price enough now for that not to be the issue. Now everything else is fair game, and that means the graphics are in the mix. Integrated is fine for home theater and other mini PCs where space for additional cards and cooling is at a premium. Want to do anything worthwhile? You need dedicated graphics. It's hardly Intel's fault that they've recognised this and aimed at the CPU Power end of the market rather than the "just enough graphics to play Angry Birds" end.

All that said, of course, if reining back the graphics for so long has enabled the indies to get on with doing their thing and reduce the number of games that emphasise graphics over, you know, game, then Intel can keep on reining.

Haha, completely agree with this, especially the Windows index part; had to laugh at that XD

9thRequiem:
"PC innovation suffered for it"
Did it Mark? Did it? How, exactly? Innovation isn't graphics.

Graphics may not be innovation - but graphics certainly is emotions.

Also: Erm... Intel builds processors, right? Their onboard GPUs are basically only placeholders in case a system doesn't have a dedicated GPU, right? Now, show of hands: if you build or buy a PC for gaming purposes, how many of you would actually buy one without a dedicated GPU?

*chirp* *chirp*

Yes. I thought so.

Mark has been using this excuse for years and years. It's simple, we kill the Batman.. no, wait, what I meant to say is, if someone won't dish out money for even the lowest end dedicated video adapter (they're quite cheap), odds are they don't really care enough about games that use advanced graphics to actually be interested in them in the first place.

What has killed innovation in the PC medium is game designers constantly obsessed with the size of their penis instead of with making innovative video games.

Intel does not make graphics cards. (Well, OK, they do have integrated chips, but that is not a graphics card; that is the ability to see a desktop on a work laptop.) Even laptops use separate graphics cards most of the time.
It is true that Intel does own Nvidia, but I don't think that this is what was meant.

I hope that guy realises he basically said "Graphics = innovation". Which is a remarkably stupid sentiment to have in regards to gaming.

And aside from that, PC gamers in my experience generally care enough to upgrade beyond the integrated graphics.

I'm confused.

When did 'graphics = innovation' happen? Did I miss a memo, or are they letting the stupid people talk again?

 
