Oddworld Creator: Xbox One Already "Getting Comparable" to PS4

According to the man behind Oddworld, this generation of consoles is approaching a technological convergence after just a few months in developers' hands.

The last console generation was heavy on exclusives, partly because it took serious effort to port a game from the PS3's unique architecture to the Xbox 360's more standardized format. The latest generation is pretty similar under the hood, however, and we may already be nearing the point where performance is the same across both Sony's and Microsoft's platforms. Lorne Lanning, co-founder of Oddworld Inhabitants and creator of the Oddworld series, says that in its first few months, the Xbox One has quickly closed the gap with the slightly better-specced PS4.

"I would say, months ago, there was a wider gap," Lanning says. "Part of it was the development systems." Previously there were horror stories from developers claiming that it would take twice as long to get assets onto the Xbox One as onto the PlayStation 4. As Lanning tells it, Microsoft has been working to pick up the slack, and the improvement is visible. "I think they've been improving the toolset really fast, improving the development environment and shaving that curve down ... I think they're getting comparable."

Oddworld Inhabitants is developing an Oddworld title for the PS4 at the moment, and Lanning is thinking about porting it to the Xbox One as well. "I have to say, the PlayStation 4 has been pretty amazing and that's where we've been spending a lot of our time. But I don't see a huge gap like there used to be."

If Microsoft really has caught up to Sony on the development side, the only major difference between the consoles will be their built-in features, and even those are nearly identical. We're still early in this console generation, but the playing field looks even for now.

Source: Xbox Achievements

I still think it's a bit early to be making that assumption, as both consoles are not even a year old and we don't start getting the stuff that truly looks the best until about a year or two in. For right now, I will say that the games pretty much look identical, but I still say give it a year or two before we can see the real differences.

Hell look at how it was last gen:

2 quick spelling errors I think.

"partially because it took seriously effort to port a game from the PS4's unique architecture to the Xbox 360's"

I think seriously is meant to be serious and PS4 is meant to be PS3.

I agree with Neronium; I would wait until enough time has passed that developers are able to utilise the PS4's and Xbox One's power.

Yea, it's sort of easy for these consoles to play on the same level for now, but if there is a difference in hardware it'll be seen within a few years when companies try to tap its potential. Games like The Last of Us show just how much they can squeeze out of a PS3.

Perhaps so... but the XBox One was sold on at least a vague implication that it would be comparable to its competition coming out of the gate. The "doesn't reach 1080p" complaints and the "Oh, stop bitching" responses haven't really helped give Microsoft the best impression for anyone who's still on the fence with regard to buying a new console.

Quite frankly, I think both consoles are still waiting for their "killer app"- and shiny as Titanfall might be, its cross-platform nature doesn't convince me that it fills that gap for the XB1. Microsoft is still playing catch-up; Sony just needs not to drop the ball.

Cool, this can only be a good thing for all gamers, I own a PS4 but would hate for it to have a monopoly as Sony would then stop giving a crap. More competition can only be a good thing...

Now if only Microsoft would hire someone to follow its employees around and stop them from saying stupid shit on Twitter.

It could be the most powerful machine in the world and it'd be all for naught if they didn't have any good games. At the moment neither of the consoles have convincing system sellers. I have both the XBone and The PissPoor and I haven't actually used either of them since launch.

*Snerk*

Oh, I'm sorry. Abe comes out to speak on behalf of Microsoft? This is hilarious. HEY BUDDY!

How's that exclusive of yours working out?

That's a real shame, fella. Listen, let's not be Gamespeaking for Microsoft, man. They're not doing you any favors and people aren't exactly going to respect the opinion at this time. Looks pretty hokey to me.

Aw, isn't that adorable. Little Timmy is catching up.

captcha: no dice

Exactly.

To people who keep bringing up "Optimisation".

Yeah, not so much.

Last generation was using new and quite "Out there" hardware.

This generation is more "Slightly modified PC parts".

So, no, optimisation is not going to lead to the gains in graphical fidelity that previous generations saw.

Instead, we're going to see consoles using lower resolutions and fewer effects such as AA and all that jazz, but having better model quality and better textures.

Given that the platforms are essentially the same core architecture, there really isn't much optimization that can be done to close the rather blatant hardware gap between them. The big thing is the memory size and speed: the XBone's DDR3 vs the PS4's DDR5. Pretty much any path of optimization you use to squeeze more out of the Xbox One will also work on the PS4, so the gap will remain. The best MS can really hope for is to somehow optimize enough to where their console can actually run games at the full promised 1080p HD. Then at least the performance differences might be a bit less obvious to the casual consumer.

The main thing about optimisation is that most of the tricks that make the xbone faster will make the ps4 faster still, due to the similar hardware. So no, optimisation will not close the gap.

I think it stands to reason that performance will even out somewhat over the next few years, maybe with the PS4 edging it slightly, much like the Xbox 360 edged it slightly in the last generation.

Oddworld Inhabitants is developing an Oddworld title for the PS4 at the moment, and Lanning is thinking about porting it to the Xbox One as well.

W-w-w-wait hang on! Oddworld Inhabitants are working on a new Oddworld game? Do they mean the Abe New N' Tasty or a whole new one? Because if it's the former screw Xbox One/PS4 hardware comparisons and I wanna hear about more Oddworld damn it.

Just want to harp on it again: $70 games and the $50 extra mean that in Canada the XBone is better than the PS4 now.

The Artificially Prolonged:

Oddworld Inhabitants is developing an Oddworld title for the PS4 at the moment, and Lanning is thinking about porting it to the Xbox One as well.

W-w-w-wait hang on! Oddworld Inhabitants are working on a new Oddworld game? Do they mean the Abe New N' Tasty or a whole new one? Because if it's the former screw Xbox One/PS4 hardware comparisons and I wanna hear about more Oddworld damn it.

He means New 'n' Tasty, yeah. But Lanning said that if the game sells a certain number of copies (I think 100,000) they would have more than enough money to go on and remake Abe's Exoddus, and if it sells roughly 5x that amount then they'd have enough to start working on Exoddus AND the next true Oddworld title in the series. I'll try my best to find where I read that for you.

FalloutJack:

How's that exclusive of yours working out?

You know, the guy made a mistake by trusting a publisher that pretty much sabotaged the marketing campaign behind the game so it would sell like shit and they could try to buy out his IP. And instead he decided to rebuild it himself rather than see his IP rot in IP hell like Crash Bandicoot.

I always thought the people that make the whole "how's the exclusive" joke are just butthurt that they didn't get to play Munch and Stranger when they were new. Which is fair enough, you guys missed out on two amazing games of the Xbox/PS2 era. But rather than being all butthurt and making snide remarks you could always buy the HD re-releases of those games off Steam and support new Oddworld titles. Or you could continue to make a 10-year-old joke. Yeah, just do that.

The Lunatic:
To people who keep bringing up "Optimisation".

Yeah, not so much.

Last generation was using new and quite "Out there" hardware.

This generation is more "Slightly modified PC parts".

So, no, optimisation is not going to lead to the gains in graphical fidelity that previous generations saw.

Instead, we're going to see consoles using lower resolutions, less effects such as AA, and all that jazz, but, having better model quality and better textures.

Even with modified PC parts, game engines on consoles and PC leave a large amount of room to make amazing leaps in fidelity given enough time.

VanQ:
Zoop

Pretty sure I could do both of those, but the point being made is that this is kind of a weird spokesman to draw out for the subject in question. Oh, and a bit of comedic advice: A joke only gets old with a high frequency of it being repeated. Since I sure as hell don't use that one often and rarely see anyone else pulling it, the joke is effectively a spring chicken. The duration of a joke's overall existence has little bearing on a situation if it is relevant in the now, which in this case it is.

FalloutJack:

VanQ:
Zoop

Pretty sure I could do both of those, but the point being made is that this is kind of a weird spokesman to draw out for the subject in question. Oh, and a bit of comedic advice: A joke only gets old with a high frequency of it being repeated. Since I sure as hell don't use that one often and rarely see anyone else pulling it, the joke is effectively a spring chicken. The duration of a joke's overall existence has little bearing on a situation if it is relevent in the now, which in this case it is.

I thought it was funny.

Neronium:
I still think it's a bit early to be making that assumption, as both consoles are still not even a year old and we don't start getting that stuff that truly looks the best until about a year or two in. For right now, I will say that the games pretty much look identical, but I still say give it at least a year to two years to when we can see the real differences.

Hell look at how it was last gen:

Why do those four particular games get to represent? To make this sort of point requires a large number of examples. Larger than 1. Not to mention, the Xbox games you've shown are of a radically different artistic style to those from the Playstation. The comparison lacks clarity.

144:

Neronium:
I still think it's a bit early to be making that assumption, as both consoles are still not even a year old and we don't start getting that stuff that truly looks the best until about a year or two in. For right now, I will say that the games pretty much look identical, but I still say give it at least a year to two years to when we can see the real differences.

Hell look at how it was last gen:

Why do those four particular games get to represent? To make this sort of point requires a large number of examples. Larger than 1. Not to mention, the Xbox games you've shown are of a radically different artistic style to those from the Playstation. The comparison lacks clarity.

I wanted to choose games that were from the same series on their console, that got sequels later on those consoles within a 2-3 year span. Both of the first games were launch titles on their systems, and the second image of each is the sequel from later on. You can notice that there is more detail in the second image than in the first. I wasn't comparing the two systems together; the point of my post is that both systems have not been out for a year yet, and we'll see more detail in games later on than at their launch. Plus, after looking at the launch list for the 360, none of them were really cartoony, and if they were they didn't get a sequel or I didn't see them.

As for more examples, yes I could have posted more, but those two different series were what I was searching for at first.

Neronium:

144:

Neronium:
I still think it's a bit early to be making that assumption, as both consoles are still not even a year old and we don't start getting that stuff that truly looks the best until about a year or two in. For right now, I will say that the games pretty much look identical, but I still say give it at least a year to two years to when we can see the real differences.

Hell look at how it was last gen:

Why do those four particular games get to represent? To make this sort of point requires a large number of examples. Larger than 1. Not to mention, the Xbox games you've shown are of a radically different artistic style to those from the Playstation. The comparison lacks clarity.

I wanted to choose games that were from the same series on their console, that got sequels later on those consoles within a 2-3 year span. Both the first games on the system were launch titles, and the second image of both were the sequels later. You can notice in the second image that there is more detail than the one in the first image. I wasn't comparing the two systems together, the point of my post is that both systems have not been out for a year yet, and we'll see more detail in games later on then at the launch of them. Plus after looking at the launch list for 360 none of them were really cartoony, and if they were they didn't get a sequel or I didn't see them.

As for more examples, yes I could have posted more, but those two different series were what I was searching for at first.

Fair enough, and your explanation is valid. But the examples required the explanation - otherwise, we get four pictures with dubious implications, and online misinformation is easy to procure. A screenshot of Mario Galaxy could fool someone into thinking that the Wii has power, when in fact it was careful creation on the part of the developers. Now that I hear your reasoning, I can see it in the examples, but a range of screenshots varying in artistic styles and developer budgets would be far stronger. Perhaps it bothers me more than most, but especially in light of recent console warring, I see too many examples of too few examples.

144:

Fair enough, and your explanation is valid. But the examples required the explanation - otherwise, we get four pictures with dubious implications, and online misinformation is easy to procure. A screenshot of Mario Galaxy could fool someone into thinking that the Wii has power, when in fact it was careful creation on the part of the developers. Now that I hear your reasoning, I can see it in the examples, but a range of screenshots varying in artistic styles and developer budgets would be far stronger. Perhaps it bothers me more than most, but especially in light of recent console warring, I see too many examples of too few examples.

It was honestly my fault really, because as you said I should have explained it more thoroughly instead of just posting the images. Really, for the best results I should have recorded the footage myself and then spliced together a video like I usually do, but I was feeling particularly lazy... well, that and my editor is currently busy with Kingdom Hearts Re: Chain of Memories. XD

But in the end, you can give first, second, and third party developers all the power they want, but if they don't know how to use it properly then there is no point. First party developers usually get it down, though there are times in which it falls flat (did you know that not a single Nintendo EAD game on the GameCube actually has 16:9 support?), but generally they learn how to max out a console (Naughty Dog usually proves this as well).

Second party is pretty hit or miss, but usually turns out good, and with repeated development they tend to learn how to utilize a console properly (Insomniac is actually second party, as is HAL Labs). Third party is the same as second party in whether they use the console fully or not, and it varies depending on the developer. SEGA, whether you like their games or not, has consistently proved to me that they know how to push a system's limits; on the Wii they did Sonic Colors, and Starlight Carnival really pushed the system. Square Enix actually pushed the PS2 to its limits with FF XII, as the game has very little load time and all enemies are on screen, and it really taxes a PS2 when you look at it internally. Rockstar also pushes a system's limits with the GTA franchise, along with L.A. Noire technically, so they know how to use power properly.

Yeah it's true that once you get a better understanding of the machine you can probably squeeze more out of the thing.
But isn't that also the case for PS4?

So doesn't it become a situation where you crank the XBone up to be more comparable, while you crank up the PS4 to be, well, just more?

FalloutJack:

VanQ:
Zoop

Pretty sure I could do both of those, but the point being made is that this is kind of a weird spokesman to draw out for the subject in question. Oh, and a bit of comedic advice: A joke only gets old with a high frequency of it being repeated. Since I sure as hell don't use that one often and rarely see anyone else pulling it, the joke is effectively a spring chicken. The duration of a joke's overall existence has little bearing on a situation if it is relevent in the now, which in this case it is.

I see the joke made in every Oddworld thread ever here on the Escapist. I dare you to go back through any Oddworld thread that was in the News Room and find me a single one that doesn't have at least one person make that joke.

VanQ:

FalloutJack:

VanQ:
Zoop

Pretty sure I could do both of those, but the point being made is that this is kind of a weird spokesman to draw out for the subject in question. Oh, and a bit of comedic advice: A joke only gets old with a high frequency of it being repeated. Since I sure as hell don't use that one often and rarely see anyone else pulling it, the joke is effectively a spring chicken. The duration of a joke's overall existence has little bearing on a situation if it is relevent in the now, which in this case it is.

I see the joke made in every Oddworld thread ever here on the Escapist. I dare you to go back through any Oddworld thread that was in the News Room and find me a single one that doesn't have at least one person make that joke.

And I'm probably the culprit at least a couple times. Can't possibly be frequent enough. Oddworld doesn't get in the news that often. But I'm busy right now, so if you have a hankering, you go find 'em.

Neronium:
I still think it's a bit early to be making that assumption, as both consoles are still not even a year old and we don't start getting that stuff that truly looks the best until about a year or two in. For right now, I will say that the games pretty much look identical, but I still say give it at least a year to two years to when we can see the real differences.

You are making the incorrect assumption that this console cycle will work like the last. The reason previous consoles had a warm-up period is because they used unique, in the PS3's case never-seen-before, hardware that needed developers to re-learn how to program for it. Thus games got better with time as their skill improved.
This is not true with current gen consoles. They use standard x86 architecture. You know, the one that has been used in PCs for decades now. So any developer that knows how to program for PC already knows how to program for consoles. The Xbone even supports regular DirectX, so games run pretty much identically.
The reason new games look like shit and run badly is not because developers didn't have time to get used to the hardware. It's just that the hardware is crap (seriously, a console this powerful should have been released in 2010, not 2013).

happy_turtle:
Cool, this can only be a good thing for all gamers, I own a PS4 but would hate for it to have a monopoly as Sony would then stop giving a crap. More competition can only be a good thing...

Now if only Microsoft would hire someone to follow it's employees around and stop them from saying stupid shit on twitter.

Sorry, when did the gaming market shrink to just Xbox and PS4? Last I looked we also had the Wii, PCs, handhelds, tablets and phones for gaming. The market is too diverse for a monopoly to exist.

The Lunatic:

This generation is more "Slightly modified PC parts".

You got the gist of it, but I'd like to add that the processing unit they are using isn't even "modified PC parts"; they are releasing cell phones with it. It's a phone processor.

faefrost:
The big thing is the memory size and speed. XBone's DDR3 vs the PS4's DDR5. pretty much any path of optimization you use to squeeze more out of the XBox One will also work on the PS4. So the gap will remain.

DDR5 what? DDR5 does not exist. What they use is VDDR5, which is different. It has higher speed but lower bandwidth. It is good for holding graphics, terrible for processing. There is a reason all previous systems had both regular RAM and video RAM.

Parshooter:
Just want to harp on it again: $70 dollar games and the $50 extra means in Canada the XBone is better than the PS4 now.

So a console is faster because you pay less?

AzrealMaximillion:

Even with modified PC parts, game engines on consoles and PC leave a large amount of room to make amazing leaps in fidelity given enough time.

Any leap made on the new consoles will be applied to PCs as well. And the game engines run on the same architecture as PCs, so there isn't any "room" left.

Strazdas:

The Lunatic:

This generation is more "Slightly modified PC parts".

you got the gist of it but id like to add that the processing unit they are using isnt even "modified PC parts", they are releasing a cell phone with it. its a phone processor.

faefrost:
The big thing is the memory size and speed. XBone's DDR3 vs the PS4's DDR5. pretty much any path of optimization you use to squeeze more out of the XBox One will also work on the PS4. So the gap will remain.

DDR5 what? DDR5 does not exist. What they use is VDDR5, which is different. It has higher speed, but lower bandwidth. it is good for holding graphics, terrible for processing. There is a reason all previous systems had both regular RAM and Video RAM.

Being x86, the Jaguar architecture is more a netbook APU than a phone APU, though phones with 'em are in the works... thing about the PS4 and Xbone versions, though, is that they're essentially doubled. Each has a huge die containing two quad-core CPUs and a double-sized GPU compared to a normal Jaguar... so they're rather like two netbooks strapped together.

Also, it's GDDR5, which has higher bandwidth than DDR3 (up in the 5+GHz range, as opposed to 2-3GHz tops), but also higher (slower, ~CAS15 to DDR3's ~CAS10) latency... which means the PS4 can load new textures and such at north of twice the speed the Xbone can... and because real latency numbers in nanoseconds depend on the memory clock... they both work out to be in the 10-12ns range. The advantage the XBone's (2133MHz, 68GB/s) DDR3 holds over the PS4's (effectively ~5500MHz, 176GB/s) GDDR5 in terms of latency is completely negligible, while the GDDR5 in question has over twice as much bandwidth.

...memory bandwidth is the one thing I'm actually kind of jealous of in the PS4's hardware... that's the same as you get in very high-end video cards. The Xbone, on the other hand, runs its video on the same memory bandwidth as my system RAM (which, to be fair, also runs my video overflow... but I'm running a budget, crossfired APU PC).
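As a quick sanity check on the latency arithmetic above, here's a rough back-of-the-envelope sketch in Python. The CAS figures come from the post above, and treating GDDR5's command clock as a quarter of its transfer rate (versus half for DDR3) is a common rule of thumb, not an official spec:

```python
# Rough absolute-latency estimate: CAS cycles divided by the command clock.
# DDR3 transfers twice per command clock; GDDR5 effectively four times.
def latency_ns(cas_cycles, transfer_rate_mtps, transfers_per_clock):
    command_clock_mhz = transfer_rate_mtps / transfers_per_clock
    return cas_cycles * 1000.0 / command_clock_mhz

ddr3 = latency_ns(10, 2133, 2)    # XBone's DDR3-2133 at ~CAS10
gddr5 = latency_ns(15, 5500, 4)   # PS4's ~5500MT/s GDDR5 at ~CAS15

print(f"DDR3:  {ddr3:.1f} ns")    # prints "DDR3:  9.4 ns"
print(f"GDDR5: {gddr5:.1f} ns")   # prints "GDDR5: 10.9 ns"
```

Both land in roughly the same ballpark, which is the point being made: the DDR3 latency "advantage" is negligible next to GDDR5's bandwidth lead.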

loc978:
Being x86, the Jaguar architecture is more a netbook APU than a phone APU, though phones with 'em are in the works... thing about the PS4 and Xbone versions, though, is that they're essentially doubled. Each has a huge die containing two quad core CPUs and a double-sized GPU to a normal Jaguar... so they're rather like two netbooks strapped together.

Also, it's GDDR5, which has higher bandwidth than DDR3 (up in the 5+Ghz range, as opposed to 2-3Ghz tops), but also higher (slower, ~CAS15 to DDR3's ~CAS10) latency... which means the PS4 can load new textures and such at north of twice the speed the Xbone can... and because real latency numbers in nanoseconds are dependent on memory bandwidth... they both work out to be in the 10-12ns range. The advantage the XBone's (2133Mhz, 68Gbps) DDR3 holds over the PS4's (effectively ~5500Mhz, 172Gbps) GDDR5 in terms of latency is completely negligible, while the GDDR5 in question has over twice as much bandwidth.

...memory bandwidth is the one thing I'm actually kind of jealous of in the PS4's hardware... that's the same as you get in very high-end video cards. The Xbone, on the other hand runs its video on the same memory bandwidth as my system RAM (which, to be fair, also runs my video overflow... but I'm running a budget, crossfired APU PC).

Well, they are releasing cell phones with these APUs.

All are based on the same Jaguar cores as the AMD-powered next-gen consoles as well as the new Graphics Core Next (GCN) GPU architecture.

They may be in netbooks too, sure, I never denied that.

It may be two of them strapped together, but that hardly makes it twice the power. Processors don't work like that, as I'm sure you know. Besides, they are underpowered, and Microsoft already overclocked theirs.

Yes, GDDR5, not sure why my brain farted on that one. Thanks for expanding on it.

If the console were built properly it would have both GDDR5 and DDR3 and it would use both, making it run much faster.

You are wrong about being jealous. Well, you can be jealous, but you seem to be jealous for the wrong reasons. It's not a high-end video card. A 160 dollar 750 Ti beats the console in graphical power. You can build a 500 dollar PC that will be faster than an Xbox One or PS4. Consoles have a reason for their popularity, but power is certainly not one of them.

Strazdas:

loc978:
Being x86, the Jaguar architecture is more a netbook APU than a phone APU, though phones with 'em are in the works... thing about the PS4 and Xbone versions, though, is that they're essentially doubled. Each has a huge die containing two quad core CPUs and a double-sized GPU to a normal Jaguar... so they're rather like two netbooks strapped together.

Also, it's GDDR5, which has higher bandwidth than DDR3 (up in the 5+Ghz range, as opposed to 2-3Ghz tops), but also higher (slower, ~CAS15 to DDR3's ~CAS10) latency... which means the PS4 can load new textures and such at north of twice the speed the Xbone can... and because real latency numbers in nanoseconds are dependent on memory bandwidth... they both work out to be in the 10-12ns range. The advantage the XBone's (2133Mhz, 68Gbps) DDR3 holds over the PS4's (effectively ~5500Mhz, 172Gbps) GDDR5 in terms of latency is completely negligible, while the GDDR5 in question has over twice as much bandwidth.

...memory bandwidth is the one thing I'm actually kind of jealous of in the PS4's hardware... that's the same as you get in very high-end video cards. The Xbone, on the other hand runs its video on the same memory bandwidth as my system RAM (which, to be fair, also runs my video overflow... but I'm running a budget, crossfired APU PC).

Well, they are releasing cell phones with these APUs

All are based on the same Jaguar cores as the AMD-powered next-gen consoles as well as the new Graphics Core Next (GCN) GPU architecture.

They may be in netbooks, sure, i never denied that.

that may be two of the strapped together, but that hardly makes it twice the power. Processors dont work like that as im sure you know. besides, they are underpowered and microsoft already overclocked theirs.

Yes, GDDR5, not sure why my brain farted about that one. Thansk for exapanding on it.

If the console was built properly it owuld have both GDDR5 and DDR3 and it would use both making it run much faster.

You are wrong about being jelous. well, you cna be jelous, but you seem t be jealous for wrong reasons. Its not a high end video card. a 160 dollar 750ti beats the console in graphical power. You can build a 500 dollar PC that will be faster than Xbox one or PS4. Consoles have a reason for their popularity, but power is certainly not one of them.

Actually... twice the cores at the same speed is twice the power... if the software running on it is designed to utilize all of the processing threads available to it. We've had game engines capable of utilizing up to 17 threads since back in 2008, yet most games are still developed only able to utilize 2, occasionally 4... some brand new games, mind you, can only utilize 1. That's the primary reason my 4GHz quad-core outperforms the PS4's 2GHz octo-core... and it's also the reason that those new consoles are going to see some optimization that PCs by and large won't be able to utilize. We may see the old FX Visheras outperforming everything short of a hyperthreaded i7 as a result.

...and the only appreciable advantages DDR3 has over GDDR5 are availability, socket compatibility and price. As I went over before, the latency pans out the same, and GDDR5 gives you roughly twice the bandwidth. I never said I was jealous of their on-chip GPU's processing power, just the bandwidth of the memory in the PS4. I'm aware that $400 worth of new parts I slapped into my old case a few months ago (an A10-7850k and R7-250 with accompanying RAM and mobo) outruns the PS4 by a fair margin... I just wish it was running the 8GB of GDDR5 the PS4 has (hard to find an affordable video card with half of that... the 750ti typically comes with 2GB of the same stuff. My video card has 1, but I got it for under $100). It could be running circles around the thing.
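The core-count argument here is essentially Amdahl's law: doubling cores doubles throughput only if the workload is actually parallel. A minimal sketch; the parallel fractions below are made-up illustrative numbers, not measured game data:

```python
# Amdahl's law: speedup on n cores when fraction p of the work parallelizes.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A mostly serial game engine barely benefits from 8 slow cores...
print(round(speedup(0.30, 8), 2))  # prints 1.36
# ...while a well-threaded one gets much closer to the ideal 8x.
print(round(speedup(0.95, 8), 2))  # prints 5.93
```

With only 30% of the work parallelized, eight slow cores barely beat one, which is why a fast quad-core can outrun an octo-core on poorly threaded games.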

loc978:

Actually.. twice the cores at the same speed is twice the power... if the software running on it is designed to utilize all of the processing threads available to it. We've had game engines capable of utilizing up to 17 threads since back in 2008, yet most games are still developed only able to utilize 2, occasionally 4... some, brand new games, mind you, can only utilize 1. That's the primary reason my 4Ghz quad-core outperforms the PS4's 2Ghz Octo-core... and it's also the reason that those new consoles are going to see some optimization that PCs by and large won't be able to utilize. We may see the old FX Visheras outperforming everything short of a hyperthreaded i7 as a result.

...and the only appreciable advantages DDR3 has over GDDR5 are availability, socket compatibility and price. As I went over before, the latency pans out the same, and GDDR5 gives you roughly twice the bandwidth. I never said I was jealous of their on-chip GPU's processing power, just the bandwidth of the memory in the PS4. I'm aware that $400 worth of new parts I slapped into my old case a few months ago (an A10-7850k and R7-250 with accompanying RAM and mobo) outruns the PS4 by a fair margin... I just wish it was running the 8GB of GDDR5 the PS4 has. It could be running circles around the thing.

Meanwhile, no software does, except programs specifically designed to hog your whole processor for stuff like video rendering.
We have modern games coming out that struggle to use a second core, let alone 16. They would have to reprogram the whole thing to use that APU, and why would they, when a single i7 core beats all 16 of those underpowered Jaguar cores? It's just a very stupid choice for a processor, especially when multitasking can run on separate cores while you only need to run a single demanding game.

And I seriously doubt you will see anything outperforming i7s soon. They may not look great on the surface, but 3GHz in an i7 is much more than 3GHz in an Athlon. It's the reason you need 8 AMD cores to do the same task 4 Intel cores can handle. Except the problem here is that game developers don't use those 8 cores, and as a result you get 4 cores sitting idle while the game is bottlenecked. That architectural gap is the reason AMD processors never really got on par with Intel's since the i-series generation, and as a result they almost went bankrupt, but then the mining craze started and their GPUs saved the day.

Yes, I can see being jealous of 8GB of GDDR5; looks like I misunderstood your point there.

DDR3 has its uses, but DDR4 is around the corner, and who knows what that will bring (well, we know what it will theoretically, but how it will work in practice, especially for early adopters, is another thing).
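The disagreement in these two posts is really about software: extra cores only help if the work is explicitly split across them. A minimal sketch of fanning a CPU-bound job out to every core the OS reports (Python's `multiprocessing` used purely as an illustration; the workload is a stand-in, not anything from an actual game engine):

```python
import os
from multiprocessing import Pool

def busy_sum(n):
    # Stand-in for a CPU-bound slice of game work (physics, AI, etc.).
    return sum(i * i for i in range(n))

def run_parallel(chunks):
    # Fan the chunks out across every core the OS reports; a game engine
    # has to be structured this way before core count buys it anything.
    with Pool(processes=os.cpu_count()) as pool:
        return sum(pool.map(busy_sum, chunks))

if __name__ == "__main__":
    work = [100_000] * 8  # eight equal slices of work
    print(run_parallel(work))
```

If the engine instead runs `busy_sum` on one thread, seven of the eight Jaguar cores sit idle, which is the scenario both posts describe.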


I feel like we're having two different conversations here... I thought it was pretty clear I knew all that, it's the why of it that I was trying to discuss... and the why of it is pretty silly. We've been at the point where advancements in hardware need to go the route of "more cores, more memory channels" for several years now, unless we find a replacement for silicon with better heat tolerance... yet very few software designers are building for that future.

I'm pretty sure these new octa-core consoles are going to represent a paradigm shift in the way games are optimized... which means more cores are going to start being more relevant than single-thread performance across a multi-core CPU, which is why, as you said, Intel's 3GHz is faster than AMD's 3GHz. With an application that can utilize 8 processing threads, a 2GHz octa-core is going to outrun a 3GHz quad-core, to say nothing of the old 4.4GHz octa-core Vishera. The i7 Extreme will still be king, with those 12 threads and insane cache... but the plain quad-cores? Not so much.
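Whether the 8-thread 2GHz part actually outruns the 3GHz quad-core depends on how much of the workload parallelizes, which Amdahl's law captures. A rough sketch, using clock speed as a stand-in for per-core performance and ignoring IPC differences (which is exactly the caveat about Intel's 3GHz vs AMD's 3GHz raised above):

```python
def throughput(cores, clock_ghz, parallel_fraction):
    """Effective throughput in 'GHz-equivalents' under Amdahl's law.
    parallel_fraction: share of the workload that can use extra cores."""
    serial = 1 - parallel_fraction
    speedup = 1 / (serial + parallel_fraction / cores)
    return clock_ghz * speedup

# Fully parallel workload: the 2GHz octa-core wins (16 vs 12 effective GHz).
print(throughput(8, 2.0, 1.0), throughput(4, 3.0, 1.0))

# Mostly serial workload (say 30% parallel): the faster quad-core wins.
print(throughput(8, 2.0, 0.3), throughput(4, 3.0, 0.3))
```

So the "paradigm shift" claim amounts to betting that console-era engines push the parallel fraction high enough for the crossover to happen.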


Well, while silicon remains silicon, we've already managed to invent processors that do twice the job of same-frequency processors from a decade ago, so it's not like the frequency wall is so deadly. While going more cores/channels is definitely the way things seem to be going, software will always be behind in that department, especially as long as a few cores can still do the same job, provided those cores aren't, ahem, underpowered mobile processors at low frequency.

Octa-core processing will be relevant. Weak 1.6GHz octa-core processing will not. The programmers who need to go multi-core because a single core isn't enough will look at the powerful multi-cores, not the bottom of the barrel. Therefore console APUs will never be "the thing" for maximum performance. The movement towards multi-core will be looking at the high-end cores, not the low-end ones, and as long as 2 Intel cores can do the job there is no reason to spend extra money on programming time to make it run on 16 console cores.

It's a long stretch, but I think we may even see CPUs slow down significantly as more and more processing is done on GPUs, and they haven't hit a frequency wall yet. Of course they can't replace CPUs, but GPUs may become so much more important that any old single-core CPU is enough if your GPU is good. We already see some of that in games, and it's the GPUs that are mining bitcoins, not the CPUs.

 
