Your Intel Haswell-E CPU, X99 Chipset Review Roundup

The first round of Haswell-E reviews is live!

The latest crop of Intel CPUs is upon us, complete with octa-core power and a new X99 chipset.

The short version? Intel has three new Core i7 CPUs: Core i7-5960X ($999), i7-5930K ($583), and i7-5820K ($389). The i7-5960X is the Extreme Edition CPU (hence the high price), while the $389 i7-5820K is the new chip that most enthusiasts will target. You can expect to see the 5820K in plenty of $1,200-$1,500 build lists (and maybe even a few $1,000 builds). $750 PCs, though? Not so much.

Along with the new chips come a new chipset (X99) and a new socket (LGA 2011-3), though the platform is more of a refinement than a revolution. The next big leap is being saved for Intel's Broadwell chips, it seems.

The one major new addition? DDR4 memory, and the X99 chipset is the first major platform to support the new standard.

So what are the bigger names in hardware saying about the new Haswell-E parts?

Tom's Hardware (Chris Angelini):

Intel is already buzzing about Broadwell. But it's technically taking the wraps off of Haswell-E while Haswell is still relevant. The distinction may seem trivial, but I guarantee that enthusiasts care. And although X99 Express doesn't introduce any groundbreaking functionality, it at least integrates thorough USB 3.0 and SATA 6Gb/s support.

That may sound like a tepid assessment of Haswell-E, but the truth is I'm giddy to have my hands on real high-end hardware again. Imagine a mixing bowl. Sift in the idea of Intel's first desktop-oriented eight-core CPU based on its most modern architecture. Add a new memory technology. An updated chipset. Solder-based thermal interface material improving your chances of a solid overclock. And sprinkle in LGA 2011-3, which we're told will support Intel's next-gen high-end desktop chip. Folded all together, those ingredients are actually quite tasty.

AnandTech (Ian Cutress):

X99 comes up to par with Z97 in terms of PCIe storage implemented into the RST along with a full array of SATA 6 Gbps ports and USB 3.0. In fact, the additional PCIe 3.0 lanes of the extreme CPUs (40 on all but the i7-5820K) make more sense for PCIe storage on X99, especially when it is most likely prosumers taking advantage of the newer standards.
...
The i7-5820K is on par with the i7-3960X at just over a third of the release cost. These two processors have the same core count and same frequency, but differ in their architecture, PCIe lane count and price. With the i7-5820K being two generations newer, it should afford a 10-15% performance improvement in CPU limited benchmarks. This is quite amazing if we consider the release price of the i7-3960X was $990 and the release price of the i7-5820K is $389.

HardOCP (Kyle Bennett):

With these big CPU launches, the cold hard fact of the matter is that if you primarily use your PC for gaming and overclocking, the Haswell-E is not likely for you. We live in a gaming world that is still very thread-limited and most gamers are primarily GPU-limited well before we are CPU limited. That said, gaming has made some fairly good leaps in the processor threading department, but it is hard to argue that you need more than a 4 core processor currently and most likely into the next few years. A one or two core processor will hold your gaming back, but still in the world of gaming, processor clocks are much more important than anything else on the CPU side of the argument. Pretty much any Intel four core processor running at 4GHz or above is going to serve you extremely well gaming.

Additional reviews can also be found at Hot Hardware and PC Perspective.

Planning on scooping up Haswell-E/X99 hardware? Or are you holding out for Broadwell? Let us know in the comments.


The first Intel consumer octa-core?

Iwanit!

Too bad it's way out of my budget.

Considering I just upgraded my rig (4770k), I will be holding out until much later. Modern games rarely use above 40% of what 4th gen Intel processors have to offer. There is no reason to upgrade unless you deal with video rendering or game engine lighting compilation.

Oh, so now you're copying and pasting others' work in order to get your hands on some of that tasty AdSense? Don't let the Escapist become the next Metacritic please!

Shame the lot are too expensive for me; guess I'll have to wait 5 years until they become cheap enough and out of date once more.

Alexander Kirby:
Oh, so now you're copying and pasting others' work in order to get your hands on some of that tasty AdSense? Don't let the Escapist become the next Metacritic please!

Oh...oh that was a joke! Sorry, let me prepare myself, then read your joke again.


-Devin Connors
Tech Editor, Click Accumulator

Devin Connors:
Oh...oh that was a joke! Sorry, let me prepare myself, then read your joke again.

Er, ya, I wasn't joking. There's a fine line between so-called 'Review Roundups' and plagiarism.

Alexander Kirby:

Devin Connors:
Oh...oh that was a joke! Sorry, let me prepare myself, then read your joke again.

Er, ya, I wasn't joking. There's a fine line between so-called 'Review Roundups' and plagiarism.

So you're accusing me of plagiarism... how am I taking someone else's work and passing it off as my own?

I cited three Haswell-E/X99 reviews. I took one to two paragraphs of each review, put them in blockquotes, and cited both the publication and the author before each blockquote. Each cited opinion is supposed to offer a different perspective on the new chips; one is optimistic, one is factual, one is pragmatically dissenting.

I then included two other review links below those citations. I preceded the three cited reviews with my own short summation of the new hardware Intel introduced.

Your misplaced accusation of plagiarism, if we are going with a generally accepted definition of plagiarism, claims that I took one or several of the above reviews, and said I was the author of the copy. I didn't even come close to doing such a thing.

-Devin Connors
Tech Editor, Master Plagiarizer

I'll be looking into it. If it really has Quad Channel Memory that works, then I'm interested. I have a thing against initial releases, and these would be initial releases. I don't want to be burned by the alpha builds of the firmware as I have been in the past.

I've been looking for something that has the memory bandwidth to keep multi-core processors properly fed so I'm hoping these pan out.

If Quad Channel DDR4 doesn't work, then I'll have to see if anyone's considering the other, more daring setups. 1GB of eDRAM would work, but that's well out of my price range, and finding the chips is impossible ATM. Then there's the possibility of GDDR6 as system memory, but no one's even considering that viable for the PC market ATM.

I really could use the improved memory bandwidth. If only Intel or AMD would go for insane cache sizes, my problem would be solved.
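For what it's worth, the peak-bandwidth math on quad-channel DDR4 is easy to sanity-check; a rough sketch, assuming DDR4-2133 (the launch-typical speed; actual kits and real-world throughput vary):

```cpp
#include <cstdio>

int main() {
    // Theoretical peak = transfers/s * 8 bytes per 64-bit channel * channels.
    const double ddr4_2133 = 2133e6 * 8 * 4 / 1e9;  // quad-channel DDR4-2133
    const double ddr3_1600 = 1600e6 * 8 * 2 / 1e9;  // common dual-channel DDR3
    std::printf("quad-channel DDR4-2133: %.1f GB/s\n", ddr4_2133);  // ~68.3
    std::printf("dual-channel DDR3-1600: %.1f GB/s\n", ddr3_1600);  // 25.6
}
```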

Devin Connors:
-Devin Connors
Tech Editor, Master Plagiarizer

Your words, not mine!! *runs to cover*

I never thought you were plagiarising; that would be if you hadn't credited the original authors. But you're still essentially making money off other people's writing (let's face it, if it weren't for their articles this one wouldn't exist). Hence a fine line.

It's like if I took a book and abridged it for children or something: it doesn't matter whether I gave the original author credit or just outright stole it; if none of the money I'm making is going to them, there's only a small difference. There's a chance my customers might google the original author, but that's about it. If all the money is still going to me, then I'm still going to get sued out of the ass.

And hey, discussing it like this really makes it sound like I took a lot more issue with it than I actually did. Heck, plagiarism is a fragile concept anyway; if I were to tell you that the fundamental particles of the universe are quarks, leptons and force carriers, would I be plagiarising my high school teacher? Or the countless scientists that came up with and named them?

Alexander Kirby:

...

1) This isn't about making money as much as it's about keeping readers informed as best I can.

2) The reviews those snippets are from? They vary in length from eight to 15 pages. They are incredibly in-depth looks at a new piece of tech. The entire point of including them in such a post is so you, the potentially uninformed readers, go to those reviews and educate yourselves on new hardware before making a purchase.

3) Accusing, or implying, or walking "the fine line" of accusing-someone-of-plagiarism-then-walking-it-back is dangerous business, and it's not something any writer takes lightly. Please keep that in mind going forward.

3b) I'm not particularly interested in having a philosophical debate on plagiarism. "Taking someone else's work and calling it your own," is good enough for me and most.

That's it from my end, folks. Have a good holiday weekend!

-Devin Connors
Tech Editor, Labor Day Enthusiast

Alexander, surely you're aware of the inflammatory nature of an accusation of plagiarism? Your initial comment didn't mention it, but your follow-up did. And then your comment above claims that you never thought Devin was plagiarizing. Your words certainly gave that impression. While your initial point may have legs (using snippets of others' work, along with references and citations, to create low-effort articles on hot topics, the motivation being to generate income from advertising), your inept delivery of that comment completely undermined your message.

Full disclosure: Devin Connors and I are real life friends (even though we've had our share of locked horns), so I'm predisposed to take his side in disputes.

Edit: And then Devin posted between us. Devin's post is better than mine (that's probably a good indication that he should be the professional writer!)

Russell Kent

As expected, really: this is great if you have to render stuff like video. Video LOVES threads; the more, the better. But yes, games just don't spread as easily and will usually stick to about 3 cores, and that is mostly due to offloading. The problem with gaming and multiple threads is that you have to keep them not only synchronized but also well fed.

If thread 1 requires data from thread 2, and thread 1 is faster than thread 2, then thread 1 has to wait, yes wait, until thread 2 has given thread 1 its data. Only then can it go further. So no, you can't just break up a game into 10 different modules; they have to be synchronized and depend on each other.
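To picture that stall, here's a minimal C++ sketch (the "simulation" and "render" roles are hypothetical stand-ins): the faster thread has no choice but to block on the slower one's data, idle cores or not.

```cpp
#include <chrono>
#include <future>
#include <iostream>
#include <thread>

int main() {
    std::promise<int> frame_data;
    std::future<int> incoming = frame_data.get_future();

    // Thread 2: the slower producer (think: game simulation tick).
    std::thread simulation([&frame_data] {
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
        frame_data.set_value(42);  // stand-in for entity positions, etc.
    });

    // Thread 1 (here, the main thread): the faster consumer must block
    // until thread 2 delivers -- no matter how many cores sit idle.
    int data = incoming.get();
    std::cout << "render side finally got frame data: " << data << "\n";

    simulation.join();
}
```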

Unlike games, when encoding video each thread just gets a block of data and works on it; the blocks are later merged into the final video output. Games have more trouble breaking work up like this.
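Compare that with a sketch of the encoding pattern, where each worker chews on its own block and results only merge at the end (the "encode" step here is a stand-in sum):

```cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> frames(1024, 1);  // stand-in for raw frame data
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<long> partial(workers, 0);
    std::vector<std::thread> pool;

    const std::size_t chunk = frames.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            std::size_t begin = w * chunk;
            std::size_t end = (w + 1 == workers) ? frames.size() : begin + chunk;
            // "Encode" the block: purely local work, no waiting on other threads.
            for (std::size_t i = begin; i < end; ++i) partial[w] += frames[i];
        });
    }
    for (auto& t : pool) t.join();

    // The merge happens once at the end, like muxing encoded chunks.
    long total = std::accumulate(partial.begin(), partial.end(), 0L);
    std::cout << "merged result: " << total << "\n";
}
```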

Or maybe you have a beefy server that runs game servers: those game server threads themselves won't use up all the cores, but more threads means smoother gameplay while running more servers.
And of course the big thing for business: running virtual systems and virtual servers. The more threads you have, the better those will run.

Couple this with a PCI-Express SSD or two and SPEEEEED. Probably as quick as the memory can handle. Oh yes, SSDs on PCI-Express are pretty quick indeed.

I don't think memory swaps or thread counts are my current bottleneck so this really isn't an improvement for my gaming. This really appears as a half-step. Pretty much still waiting for the GTX 800 non-mobile series to launch, then I still gotta wait another six months for the prices to settle some.

masticina:
As expected, really: this is great if you have to render stuff like video. Video LOVES threads; the more, the better. But yes, games just don't spread as easily and will usually stick to about 3 cores, and that is mostly due to offloading. The problem with gaming and multiple threads is that you have to keep them not only synchronized but also well fed.

The reason why games don't go multicore, but rather demand fast single cores (which is why Intel CPUs are MUCH better for gaming even if AMD theoretically provides more for the same price), is that it requires parallel coding to utilize all cores to the best of their potential. According to the devs, parallel coding is bloody hard. Which is why they turned to offloading a lot onto the GPU instead of the CPU, since games are still (mostly) single-core things. This in turn meant that CPU bottlenecks stopped being a thing, and now any decent i5 will do well for all your gaming; the only reason to buy an i7 is if you're recording/streaming or working with video editing or some such.

Couple this with a PCI-Express SSD or two and SPEEEEED. Probably as quick as the memory can handle. Oh yes, SSDs on PCI-Express are pretty quick indeed.

Uhhh, PCI-Express? Isn't that connection kinda obsolete nowadays? From what I understand everyone uses SATA3 for SSDs now, even if it's limited to 5Gbit/s.

Strazdas:

Couple this with a PCI-Express SSD or two and SPEEEEED. Probably as quick as the memory can handle. Oh yes, SSDs on PCI-Express are pretty quick indeed.

Uhhh, PCI-Express? Isn't that connection kinda obsolete nowadays? From what I understand everyone uses SATA3 for SSDs now, even if it's limited to 5Gbit/s.

Is that sarcasm? SATA3 runs at 6Gbit/s, whereas PCI-Express 2.0 is specced at 5GT/s per lane (approx. 4Gbit/s) and 3.0 is specced at 8GT/s per lane (just under 8Gbit/s). SATA Express and M.2 are both PCI-Express 2.0-based connections (using 2x PCIe 2.0 lanes) that let new SSDs go up to 10Gbit/s, since the 6Gbit/s limit of SATA3 is already bottlenecking the SSD market.

Then there's "Ultra M.2", which ASRock supports on their Z97 and X99 motherboards, using 4x PCIe 3.0 lanes to theoretically allow 32Gbit/s. But that's a different story, since only the Samsung XP941 benefits from it for now.
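Those figures are easy to reproduce. A back-of-the-envelope sketch, assuming the standard line codes (8b/10b for PCIe 2.0, 128b/130b for PCIe 3.0); note the 10Gbit/s and 32Gbit/s figures above are raw rates before encoding overhead:

```cpp
#include <cstdio>

int main() {
    // Effective per-lane rates after line-code overhead.
    const double pcie2_lane = 5.0 * 8.0 / 10.0;     // 4.00 Gbit/s
    const double pcie3_lane = 8.0 * 128.0 / 130.0;  // ~7.88 Gbit/s
    std::printf("PCIe 2.0 x1: %.2f Gbit/s effective\n", pcie2_lane);
    std::printf("PCIe 2.0 x2 (SATA Express / M.2): %.2f Gbit/s effective (10 raw)\n",
                2 * pcie2_lane);
    std::printf("PCIe 3.0 x4 (\"Ultra M.2\"): %.2f Gbit/s effective (32 raw)\n",
                4 * pcie3_lane);
    std::printf("SATA3: 6.00 Gbit/s raw, ~4.80 Gbit/s after 8b/10b\n");
}
```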

Strazdas, that is true; the i5 is more than enough for gaming. The i7 is for when you want to do more, for instance grabbing game footage.

And see what Originality has posted: PCI-Express, even 1 or 2 lanes wide, can offer more bandwidth than SATA 600/SATA 3 can. And the SSDs reaching for the top speeds are now saturating the SATA connections. Let's just say you might want to switch to PCI-Express for your system/boot SSD. They are also used in servers, as they are screaming fast.

SATA3? Ha, oh no. You'll have to wait for SATA Express to get PCI-Express 1x speeds. For a gamer with $1,000 to spend, all this might sound too expensive. But if you are a gamer with $2,500 to spare, you might just go for it. After all, you gotta be the best, right?

grigjd3:
I don't think memory swaps or thread counts are my current bottleneck so this really isn't an improvement for my gaming. This really appears as a half-step. Pretty much still waiting for the GTX 800 non-mobile series to launch, then I still gotta wait another six months for the prices to settle some.

You do know that the GTX desktop line skips 800 for exactly that reason and goes straight to 900?

Originality:

Strazdas:

Couple this with a PCI-Express SSD or two and SPEEEEED. Probably as quick as the memory can handle. Oh yes, SSDs on PCI-Express are pretty quick indeed.

Uhhh, PCI-Express? Isn't that connection kinda obsolete nowadays? From what I understand everyone uses SATA3 for SSDs now, even if it's limited to 5Gbit/s.

Is that sarcasm? SATA3 runs at 6Gbit/s, whereas PCI-Express 2.0 is specced at 5GT/s per lane (approx. 4Gbit/s) and 3.0 is specced at 8GT/s per lane (just under 8Gbit/s). SATA Express and M.2 are both PCI-Express 2.0-based connections (using 2x PCIe 2.0 lanes) that let new SSDs go up to 10Gbit/s, since the 6Gbit/s limit of SATA3 is already bottlenecking the SSD market.

Then there's "Ultra M.2", which ASRock supports on their Z97 and X99 motherboards, using 4x PCIe 3.0 lanes to theoretically allow 32Gbit/s. But that's a different story, since only the Samsung XP941 benefits from it for now.

I was not arguing that SATA is faster; of course it isn't. Merely that PCIe is a rarity nowadays, as the old reasons of the 90s are all but obsolete, and the new SSDs that can use it are really not popular enough to be reckoned with.

Auberon:

grigjd3:
I don't think memory swaps or thread counts are my current bottleneck so this really isn't an improvement for my gaming. This really appears as a half-step. Pretty much still waiting for the GTX 800 non-mobile series to launch, then I still gotta wait another six months for the prices to settle some.

You do know that the GTX desktop line skips 800 for exactly that reason and goes straight to 900?

I'm not really concerned with whatever magical mystery number they label the card with. The point is, the 700 series is not enough of an improvement over the 600 series for me to consider upgrading. Sure, a Titan is much better than my 660 Ti, but I'm not dropping a thousand dollars or more on a video card. Until I see an option that presents a >50% improvement in throughput and a >100% improvement in memory at a price <$300, I'm not interested. They could call it the 8-zillion-bajillion GTXXXX TTTIIII for all I care.

Thankfully this was all announced before I went and bought a motherboard. Now I can remake my list of parts I need to get and watch it jump in price by a few hundred dollars, hahahaha.

Dying_Jester:
Thankfully this was all announced before I went and bought a motherboard. Now I can remake my list of parts I need to get and watch it jump in price by a few hundred dollars, hahahaha.

I don't think the price jump will be that ridiculous, assuming you weren't planning on building a PC for under $1,000.

Pricing on a solid X99 trio...
Motherboard: ASRock Extreme 4 -- $224 (http://www.newegg.com/Product/Product.aspx?Item=N82E16813157544)
Memory: Crucial DDR4-2400 16 GB (4x 4 GB) -- $234 (http://www.newegg.com/Product/Product.aspx?Item=N82E16820148866)
CPU: Intel Core i7-5820K -- $390 (http://www.newegg.com/Product/Product.aspx?Item=N82E16819117402)

Sub-$1,000 builds aside, the big jump here is in RAM pricing. $225 for a mobo isn't bad, and if you're interested in a $390 Haswell-E CPU, chances are you were already looking at the $335 Core i7-4770K.

16 GB of DDR3-2400 RAM (2x 8 GB) starts around $150-$160, so the jump to $230 or so might be hard to swallow. Other than that, you're not jumping up in price all too much.
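For anyone tallying it up, a quick sketch of the delta (the X99 prices are the Newegg quotes above; the ~$150 Z97 board and ~$155 DDR3 kit are assumed ballpark figures, not quotes):

```cpp
#include <cstdio>

int main() {
    // X99/Haswell-E core platform, per the prices listed above.
    const int x99 = 224 + 234 + 390;  // mobo + DDR4 + i7-5820K = $848
    // Rough Z97/Haswell comparison; board price is an assumption.
    const int z97 = 150 + 155 + 335;  // mobo + DDR3 + i7-4770K = $640
    std::printf("X99 trio: $%d, Z97-class trio: ~$%d, delta: ~$%d\n",
                x99, z97, x99 - z97);  // delta ~$208
}
```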

-Devin Connors
Tech Editor, The Wire Enthusiast

 
