8-Core CPU Heads Xbox One's Hardware Specs


The Xbox One will have an 8-core CPU, 8 gig of RAM, a 500GB hard drive and a Blu-ray disc drive.

Unlike Sony's PS4, which was a bit camera shy when it came to getting a shot of the actual console and its innards, Microsoft has been pretty forward with how the Xbox One looks both inside and out. Heading its internal hardware will be an impressive 8-core CPU, supported by 8 gig of RAM, a 500GB hard drive and a Blu-ray disc drive. It will also come with three 802.11n Wi-Fi radios, Ethernet, USB 3.0, and HDMI in and out.

The graphics card, arguably the most important piece of hardware, will actually be combined with the 8-core CPU. It will be a heavily customized AMD GPU tailored for DirectX 11.1 graphics, with 32MB of high-bandwidth eSRAM. Microsoft says that the CPU/GPU will consume around 100 watts of power, which is slightly higher than the current Xbox Slim and PS3, but it promises noise from the cooling fans will be "four times quieter."

Oh, and that new Kinect thing, sporting a 250,000-pixel infrared depth sensor as well as a regular 720p webcam, will also be packaged with every Xbox One.

These stats seem pretty impressive, but without the clock rates and other specifics, it's pretty hard to place them in terms of similarly priced PC components. For instance, "8-core processors" range from as low as $200 for AMD's FX-8150 up to $2,000 for Intel's Xeon E5-2687W. If we are using modest comparisons to components already on the market, we are looking at around $500 worth of hardware.

It is also worth noting that the 8 gig of RAM in the Xbox One is standard DDR3 RAM instead of the fancy-pants GDDR5 that the PS4 is boasting. The console will also be switching to a much more PC-friendly x86 architecture.

Microsoft also claims that three separate 802.11n radios are included to allow the console to communicate with its controller (over a form of WiFi Direct) as well as other devices, such as phones or tablets, without losing its connection to the internet.


As a PC player who has never used an Xbox and never will, the specs are actually a lot better than I feared. Finally game devs can start making games for the next gen, or the "current gen" of PCs. I'm glad about the jump from 512MB to 8GB of RAM, though the difference between DDR3 and GDDR5 remains to be seen.

It is too bad all 8 cores are being wasted on a completely locked down system with some obscene DRM.

Well, it sounds like a lot of games today will be using more than four cores, at the very least. There's an actual reason to buy AMD CPUs then. I have a home theatre PC I built with an AM3+ motherboard. Probably now is a good time to buy a Vishera.

I've kind of wanted an excuse to buy a Vishera, but it hasn't really made much sense, since almost nothing uses 8 cores and my Intel 2500k is superior for most things. The XBOX One is doing nothing to change that, though, since nobody is actually going to develop games for the XBOX One.

Steven Bogos:

It is also worth noting that the 8 gig of RAM in the Xbox One is standard DDR3 RAM instead of the fancy-pants GDDR5 that the PS4 is boasting. The console will also be switching to a much more PC-friendly x86 architecture.

Either I'm being stupid or this is the worst gobbledygook I've heard in a while.

There are two kinds of RAM, system and graphical. While you can have system RAM on the GPU, you cannot have GDDR as system RAM. Quite frankly, my old 9800GT had GDDR3; the standard for most DX11 GPUs is GDDR5, and the standard for system RAM is DDR3.

It's switching to a 32-bit system? How can it use more than 4GB of RAM?

Welp, I have little to no idea what most of that means, but it sure as hell sounds impressive. I think. Maybe. Is it? I'm gonna remain mildly impressed until a few comments down when a few members of the Master Race talk about how pathetic it is compared with their own machines.

Well, the only thing that actually interests me, as a PC gamer, is the x86 architecture, which should mean cheaper, better PC ports.

A'tuin:
As a PC player who has never used an Xbox and never will, the specs are actually a lot better than I feared. Finally game devs can start making games for the next gen, or the "current gen" of PCs. I'm glad about the jump from 512MB to 8GB of RAM, though the difference between DDR3 and GDDR5 remains to be seen.

Except I don't want shinier, I want better. Sound design, gameplay testing, story writing, and creative game mechanics have fallen by the wayside for 'MOAR PIXELS', which is the primary reason the new console generation is underperforming in terms of hype compared to last gen; even the average person wants more than just shinier. We're on HD TVs; we could justify going from analog to digital, but can't justify going to ever-higher resolutions. Give us something creative.

The CPU is going to be based on the AMD Jaguar, an 8-core CPU with a Radeon HD 7000-series GPU on the chip. This is the same as the PS4. As yet, this line of CPUs has not been released into the consumer marketplace.

mad825:

Steven Bogos:

It is also worth noting that the 8 gig of RAM in the Xbox One is standard DDR3 RAM instead of the fancy-pants GDDR5 that the PS4 is boasting. The console will also be switching to a much more PC-friendly x86 architecture.

Either I'm being stupid or this is the worst gobbledygook I've heard in a while.

There are two kinds of RAM, system and graphical. While you can have system RAM on the GPU, you cannot have GDDR as system RAM. Quite frankly, my old 9800GT had GDDR3; the standard for most DX11 GPUs is GDDR5, and the standard for system RAM is DDR3.

It's switching to a 32-bit system? How can it use more than 4GB of RAM?

For a start, it's the AMD Jaguar, which is 64-bit, not 32-bit. As I said above, the GPU is on the chip and is without native memory. So using 8 gigs of GDDR5 as the main memory will produce better graphics performance than using 8 gigs of DDR3.

albino boo:

For a start, it's the AMD Jaguar, which is 64-bit, not 32-bit. As I said above, the GPU is on the chip and is without native memory. So using 8 gigs of GDDR5 as the main memory will produce better graphics performance than using 8 gigs of DDR3.

Re-read my post.

I never stated it was 32-bit but questioned it. I understand the difference between DDR and GDDR, bonus points; GDDR5 does not process 64 bits but rather circa 32 bits.

edit: I'm pretty excited about the new Xbox, just need some concerns addressed before I buy it.

albino boo:
For a start, it's the AMD Jaguar, which is 64-bit, not 32-bit. As I said above, the GPU is on the chip and is without native memory. So using 8 gigs of GDDR5 as the main memory will produce better graphics performance than using 8 gigs of DDR3.

First part of this is correct, then you mixed your consoles up.

The PS4 with its dedicated graphics chip [if memory serves] is using 8GB of GDDR5. The Xbox One with its CPU-integrated graphics is using DDR3. Really, if they each wanted the best performance, they should be swapped around. Or, better yet, 2-3GB of GDDR5 on the Xbox and 5-6GB of DDR3.

Snow Fire:

mad825:
It's switching to a 32-bit system? How can it use more than 4GB of RAM?

x86 comes in many flavors: x86 16-bit, x86 32-bit, x86 64-bit (also known as x64 for short; this is where the Xbox One will be, used in most PCs), x86 128-bit, x86 256-bit, and x86 512-bit. So, no worries, all that RAM will be used.

I don't know where you're from, but x86 refers to 32-bit (or less). Hell, even my OS is telling me this.

mad825:

albino boo:

For a start, it's the AMD Jaguar, which is 64-bit, not 32-bit. As I said above, the GPU is on the chip and is without native memory. So using 8 gigs of GDDR5 as the main memory will produce better graphics performance than using 8 gigs of DDR3.

Re-read my post.

I never stated it was 32-bit but questioned it. I understand the difference between DDR and GDDR, bonus points; GDDR5 does not process 64 bits but rather circa 32 bits.

Long story short, it transfers faster, hence why it's used in VRAM for modern GPUs (part of the reason GPUs are no longer a PC's bottleneck).

Both seem to be using the collective pool concept (where RAM and VRAM share the same memory pool) that the Xbox 360 uses (512MB split) instead of the PS3's 256MB each.

The higher transfer speed means that RAM won't be this gen's bottleneck for a while; processor speed will be.

The difference for the user is probably texture load times and level load times being longer on the Xbone.

edit: a 64-bit PC processor is an x86-64 processor. The actual design of the core is called x86.

mad825:

I don't know where you're from, but x86 refers to 32-bit (or less). Hell, even my OS is telling me this.

x86 goes from 16-bit up to 512-bit; the current Windows line, from XP to 8, all support x86 64-bit.

oliver.begg:

the actual design of the core is called x86

That depends on how old you are...

OT: Does anyone want to go back to my original question? Do any more people want to lecture me on things I already know?

Well, it was already rumored that the XO would be a tad slower than the PS4. They have yet to unveil the GPU specs, but since both systems are more or less built around AMD technology and probably aim at the same price tag, it is fair to say that they are probably very similar.
I guess the end result will be that in this generation multi-platform titles will indeed look and perform the same on XO and PS4.
Anyway, the hardware seems competent enough to me. Let's see what games they have up their sleeves.

> 8 cores. Yay.

>> made by AMD, and most likely a Jaguar.

Oh, so it's a useless waste of resources, then? It looks like the consoles will be exactly the same, and this gen will be so boring it may kill consoles entirely.

Snow Fire:

mad825:

I don't know where you're from, but x86 refers to 32-bit (or less). Hell, even my OS is telling me this.

x86 goes from 16-bit up to 512-bit; the current Windows line, from XP to 8, all support x86 64-bit.

Well done, but x86 isn't x64. All that says is that it runs on x64 systems, not necessarily with the benefits that x64 comes with.

oliver.begg:
the actual design of the core is called x86

Somewhat annoyingly, this guy is actually kinda right. Looks like the x86 instruction set has undergone multiple word-length extensions over its lifetime, with the AMD64 instruction set (also known as x64, and hilariously Intel 64) being x86-64.

Given the 8 gigs of RAM, it's probably just sloppy writing on the part of either Mr. Bogos or the Microsoft marketing wonk he got it from, although they could have done the same with some odd (and custom) processor design while still running 32-bit worded code.

Actually, thinking about it, given the integrated graphics, my guess would be that the memory's just divided up between the GPU and CPU. Yeah that makes more sense.

Leaving all that to one side though, and bearing in mind that I'm an embittered, purebred PC gamer, I'm overjoyed by these specs, especially the fact that the graphics are a Radeon chipset. Finally! The AAA games market is no longer held in total obeisance to the rip-off merchants at Nvidia!

mad825:

Steven Bogos:

It is also worth noting that the 8 gig of RAM in the Xbox One is standard DDR3 RAM instead of the fancy-pants GDDR5 that the PS4 is boasting. The console will also be switching to a much more PC-friendly x86 architecture.

Either I'm being stupid or this is the worst gobbledygook I've heard in a while.

There are two kinds of RAM, system and graphical. While you can have system RAM on the GPU, you cannot have GDDR as system RAM. Quite frankly, my old 9800GT had GDDR3; the standard for most DX11 GPUs is GDDR5, and the standard for system RAM is DDR3.

It's switching to a 32-bit system? How can it use more than 4GB of RAM?

You can have GDDR being used as system RAM. No one does it and you can't do it in the normal way in a PC because DDR is better as system RAM while GDDR is optimised for highly parallel operations (e.g. graphics). The reason the PS4 does it is because Sony decided having RAM that is optimised for graphics would be better. Microsoft have gone for a more traditional setup.

x86 does not necessarily mean 32 bit. x86 is the name of the instruction set architecture that Intel designed. Of which there is a 64 bit version.
From Wikipedia:

x86-64 (also known as x64) is a 64-bit extension of IA-32, the 32-bit generation of the x86 instruction set... The original specification was created by AMD, and has been implemented by AMD, Intel, VIA, and others. It is fully backwards compatible with 16-bit and 32-bit x86 code... Prior to launch, "x86-64" and "x86_64" were used to refer to the instruction set. Upon release, AMD named it AMD64. Intel initially used the names IA-32e and EM64T before finally settling on Intel 64 for their implementation. Some in the industry, including Apple, use x86-64 and x86_64, while others, notably Sun Microsystems (now Oracle Corporation) and Microsoft, use x64, while the BSD family of OSs and the Debian Linux distribution use AMD64.
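The "more than 4GB" question upthread comes down to simple arithmetic: an n-bit pointer can name at most 2^n distinct byte addresses. A quick illustrative sketch (the function name here is my own, not any real API):

```python
def max_addressable_bytes(pointer_bits):
    """A pointer that is n bits wide can address at most 2**n distinct bytes."""
    return 2 ** pointer_bits

# 32-bit x86: 2**32 bytes = 4 GiB, hence the famous 4GB RAM ceiling.
print(max_addressable_bytes(32) // 2**30)  # 4 (GiB)

# x86-64: 2**64 bytes = 16 EiB of address space, so 8GB is no problem at all.
print(max_addressable_bytes(64) // 2**60)  # 16 (EiB)
```

So "x86" on a spec sheet doesn't by itself imply a 4GB cap; it depends on whether the 64-bit extension is in use.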

Thamian:

Given the 8 gigs of RAM, it's probably just sloppy writing on the part of either Mr. Bogos or the Microsoft marketing wonk he got it from,

I resent that. x86 refers to the architecture, which is all Microsoft have said.

Furthermore, it should be assumed that it is running 64-bit, because that is the standard these days, and, like you said, the 8 gigs of RAM.

mad825:

albino boo:

For a start, it's the AMD Jaguar, which is 64-bit, not 32-bit. As I said above, the GPU is on the chip and is without native memory. So using 8 gigs of GDDR5 as the main memory will produce better graphics performance than using 8 gigs of DDR3.

Re-read my post.

I never stated it was 32-bit but questioned it. I understand the difference between DDR and GDDR, bonus points; GDDR5 does not process 64 bits but rather circa 32 bits.

DDR3 uses a 64-bit memory controller per channel (so, a 128-bit bus for dual channel, 256-bit for quad channel), whereas GDDR5 is paired with controllers of a nominal 32 bits (16 bits each for input and output). But whereas the CPU's memory controller is 64-bit per channel, a GPU can utilise any number of 32-bit I/Os (at the cost of die size) depending upon application (2 for a 64-bit bus, 4 for 128-bit, 6 for 192-bit, 8 for 256-bit, 12 for 384-bit, etc...).
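That bus-width arithmetic can be sketched numerically. The controller counts come from the post above; the data rates (DDR3-1600 at 1.6 GT/s, GDDR5 at 5.5 GT/s) are illustrative assumptions, not confirmed console figures:

```python
def peak_bandwidth_gbs(bus_bits, transfer_rate_gts):
    """Peak bandwidth in GB/s = bus width in bytes * effective transfer rate."""
    return bus_bits / 8 * transfer_rate_gts

# DDR3, dual channel: 2 x 64-bit controllers = 128-bit bus, assumed 1.6 GT/s.
ddr3 = peak_bandwidth_gbs(2 * 64, 1.6)   # 25.6 GB/s
# GDDR5: 8 x 32-bit controllers = 256-bit bus, assumed 5.5 GT/s.
gddr5 = peak_bandwidth_gbs(8 * 32, 5.5)  # 176.0 GB/s
print(ddr3, gddr5)
```

The point being that GDDR5's advantage comes from both the faster transfer rate and the GPU's freedom to gang up controllers.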

A'tuin:
I'm glad about the jump from 512MB to 8GB of RAM, though the difference between DDR3 and GDDR5 remains to be seen.

Theoretically the difference is huge: on AMD's Fusion PC chips, going from DDR3-1333 to DDR3-1600 equates to a 10%-ish jump in frames per second when gaming, and you can bet Microsoft won't be speccing higher than 1333. GDDR5 is made more or less specifically for graphics cards, so even if they use identical chips the PS4 will have a speed advantage.

The lack of hardware specifics is hugely disappointing; it probably means the specs will be too.

Two other things that bug me hugely: the HDD and controller batteries are no longer replaceable, so eBay is going to flood with dead-on-arrival controllers, and as game install sizes balloon, that 500GB disk is going to become a huge frustration. That's not even 20 installs of the PC version of BF3 (or 25 hours of 1080p video), since I assume that's the level they'll be shooting for (and hopefully surpassing). Prepare for frustration, Xbox users.
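The storage math there roughly checks out, assuming a BF3-sized install of about 25GB as the post does (real install sizes vary, and the console OS will claim its own slice):

```python
# Drives are advertised in decimal gigabytes; usable space reports in binary GiB.
advertised_gb = 500
usable_gib = advertised_gb * 10**9 / 2**30   # about 465.7 GiB
install_gib = 25                             # assumed BF3-sized install
print(int(usable_gib // install_gib))        # 18 -- indeed "not even 20 installs"
```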

mad825:

Steven Bogos:

It is also worth noting that the 8 gig of RAM in the Xbox One is standard DDR3 RAM instead of the fancy-pants GDDR5 that the PS4 is boasting. The console will also be switching to a much more PC-friendly x86 architecture.

Either I'm being stupid or this is the worst gobbledygook I've heard in a while.

There are two kinds of RAM, system and graphical. While you can have system RAM on the GPU, you cannot have GDDR as system RAM. Quite frankly, my old 9800GT had GDDR3; the standard for most DX11 GPUs is GDDR5, and the standard for system RAM is DDR3.

It's switching to a 32-bit system? How can it use more than 4GB of RAM?

GDDR5 is standard memory in the PS4 because it's an APU, not a GPU+CPU combo like a PC. To simplify it, the PS4 is an 8GB GeForce Titan that does all the processing.
BTW, I wonder if you can actually get an 8GB Titan? Or is it 1GB tops? I'm kinda behind the curve with the PC market.

Well, fuck.

Here I thought my nice 6 core was going to carry me through the entire next generation.

Well, fingers crossed we don't run into many games requiring an 8 core CPU for a couple years at least.

Krantos:
Well, fuck.

Here I thought my nice 6 core was going to carry me through the entire next generation.

Well, fingers crossed we don't run into many games requiring an 8 core CPU for a couple years at least.

The 8 cores in the Xbone and PS4 are low-power, low-performance netbook cores (Jaguar). A dual-core Sandy Bridge will be faster than either of them. Any recent CPU, from the Phenom II X4 onwards, won't break a sweat beating the PS4/Xbone handily.

TheComfyChair:

Krantos:
Well, fuck.

Here I thought my nice 6 core was going to carry me through the entire next generation.

Well, fingers crossed we don't run into many games requiring an 8 core CPU for a couple years at least.

The 8 cores in the Xbone and PS4 are low-power, low-performance netbook cores (Jaguar). A dual-core Sandy Bridge will be faster than either of them. Any recent CPU, from the Phenom II X4 onwards, won't break a sweat beating the PS4/Xbone handily.

You're right, but you know how uninformed people are.
"8 cores is faster than 6 cores."
"A 13-megapixel camera makes better images than an 8-megapixel camera."
...

TheComfyChair:

Krantos:
Well, fuck.

Here I thought my nice 6 core was going to carry me through the entire next generation.

Well, fingers crossed we don't run into many games requiring an 8 core CPU for a couple years at least.

The 8 cores in the Xbone and PS4 are low-power, low-performance netbook cores (Jaguar). A dual-core Sandy Bridge will be faster than either of them. Any recent CPU, from the Phenom II X4 onwards, won't break a sweat beating the PS4/Xbone handily.

Oh.

Well, that will teach me for not looking more at the type of CPU they're using. I admit to being slightly confused when I read they were Jaguar, as I had never heard of it. Should have done more research.

On the plus side, that makes me considerably happier. Suck it, Sony and Microsoft! My free computer is still better than your consoles!

Lord_Gremlin:

mad825:

Steven Bogos:

It is also worth noting that the 8 gig of RAM in the Xbox One is standard DDR3 RAM instead of the fancy-pants GDDR5 that the PS4 is boasting. The console will also be switching to a much more PC-friendly x86 architecture.

Either I'm being stupid or this is the worst gobbledygook I've heard in a while.

There are two kinds of RAM, system and graphical. While you can have system RAM on the GPU, you cannot have GDDR as system RAM. Quite frankly, my old 9800GT had GDDR3; the standard for most DX11 GPUs is GDDR5, and the standard for system RAM is DDR3.

It's switching to a 32-bit system? How can it use more than 4GB of RAM?

GDDR5 is standard memory in the PS4 because it's an APU, not a GPU+CPU combo like a PC. To simplify it, the PS4 is an 8GB GeForce Titan that does all the processing.
BTW, I wonder if you can actually get an 8GB Titan? Or is it 1GB tops? I'm kinda behind the curve with the PC market.

APUs are highly inefficient, and the Jaguar is pretty weak as it is.

It's a tablet APU, and there is NO WAY it can ever match up to a Titan. The GDDR5 means nothing if it has no processing power. It's more like a moped that somehow has 1,000 gallons of gas and expecting it to beat a muscle car.

mad825:

Well done, but x86 isn't x64. All that says is that it runs on x64 systems, not necessarily with the benefits that x64 comes with.

x86 has had no successor; x64 is just short for x86 64-bit. I guess I'm weird in that, given today's technology, I assume if x86 is mentioned, they are automatically referring to the 64-bit version of x86.

So, basically, what they are giving us is a high-end current PC (which is fair, they can't invent some kind of future technology) using the old x86 architecture.

Well, it seems better than the PS4, with as little information as we are given. Still, PCs will get ahead soon, but that is to be expected. Though I think my 4-year-old one won't be able to run games made for the new generation; well, I plan to upgrade this year anyway.

Captcha: lunchtime
Funny, I actually was eating with colleagues while reading this.

A lot of misinformation on this from people who don't study technology.

8 cores does not mean better. A low-to-mid-grade 8-core can be matched by a mid-to-high-grade 4-core. Considering that they are using parts meant for smartphones and tablets, it really isn't all that impressive. Really, there are quite diminishing returns after 4 cores.

Microsoft seemed to be a little smarter using DDR3 instead of GDDR5.

DDR3 might have lower bandwidth, but it has very little latency. It is quick to respond and will be great for things like AI and physics.

GDDR5 has much higher bandwidth but medium latency. That means it has the potential to look prettier, but it won't handle things that require a lot of fast communication very well. I'd take better physics and AI over graphical fidelity any day, personally.

Overall, a midrange custom PC will be able to run anything that is put out this generation, while doing a whole bunch more applications and uses.
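The latency-versus-bandwidth trade-off being argued here can be shown with a toy model; every number below is invented purely for illustration and is not a real DDR3 or GDDR5 spec:

```python
def request_time_us(size_bytes, latency_ns, bandwidth_gb_s):
    """Crude model: total time = fixed access latency + transfer at peak bandwidth."""
    return latency_ns / 1000 + size_bytes / (bandwidth_gb_s * 1e9) * 1e6

# A 64-byte cache-line read (AI/physics-style): latency dominates.
small_low_lat  = request_time_us(64, 50, 25)    # low latency, modest bandwidth
small_high_bw  = request_time_us(64, 120, 170)  # higher latency, huge bandwidth

# A 4MB texture stream (graphics-style): bandwidth dominates.
big_low_lat = request_time_us(4 * 2**20, 50, 25)
big_high_bw = request_time_us(4 * 2**20, 120, 170)

print(small_low_lat < small_high_bw)  # True: the low-latency RAM wins tiny reads
print(big_low_lat > big_high_bw)      # True: the high-bandwidth RAM wins big streams
```

Which workload dominates real games is exactly the open question in this thread.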

oliver.begg:

mad825:
the actual design of the core is called x86

for the slow people out there that can't read:

gddr5 = faster than ddr3

faster = better

Faster may seem better, but latency matters, and so many complex mathematical computations will be harder on the PS4. Expect a difference in gameplay, with the Xbox handling greater physics and AI and the PS4 handling better graphics.

Strazdas:
So, basically, what they are giving us is a high-end current PC (which is fair, they can't invent some kind of future technology) using the old x86 architecture.

Well, it seems better than the PS4, with as little information as we are given. Still, PCs will get ahead soon, but that is to be expected. Though I think my 4-year-old one won't be able to run games made for the new generation; well, I plan to upgrade this year anyway.

Captcha: lunchtime
Funny, I actually was eating with colleagues while reading this.

No. A midrange PC will beat it while doing a lot more.

In fact, PCs from 2010 already beat this. With a nice quad-core and a decent GPU you will be able to run anything that is put out this generation.

If you spend about the same amount as a PS4 costs on an upgrade to a PC, you will outdo them.

taciturnCandid:
A lot of misinformation on this from people who don't study technology.

8 cores does not mean better. A low-to-mid-grade 8-core can be matched by a mid-to-high-grade 4-core. Considering that they are using parts meant for smartphones and tablets, it really isn't all that impressive. Really, there are quite diminishing returns after 4 cores.

Microsoft seemed to be a little smarter using DDR3 instead of GDDR5.

DDR3 might have lower bandwidth, but it has very little latency. It is quick to respond and will be great for things like AI and physics.

GDDR5 has much higher bandwidth but medium latency. That means it has the potential to look prettier, but it won't handle things that require a lot of fast communication very well. I'd take better physics and AI over graphical fidelity any day, personally.

Overall, a midrange custom PC will be able to run anything that is put out this generation, while doing a whole bunch more applications and uses.

Well, I'd take pretty graphics first and better physics second; matter of taste. What I will say is that we have no idea just how effective those custom APUs are. We can only guess based on APUs used in PCs. I'll have to see games to make an informed opinion. What I would say is that Sony showed stuff that a mid-range PC can't run. Those were stage demos, of course; we'll need to see them in action.
