Developers Say Memory Is Faster on PS4 Than Xbox One

Zeckt:
Microsoft knows that they can go on brand alone and sell cheap-ass systems doomed to break after the warranty expires to their legions of Halo and COD fans.

After the warranty expires? Well damn, now that would be a nice improvement over the last console.

tdylan:
The joke's on all of you because even though the XBONE's power will actually be quadrupled by the use of the cloud

http://www.escapistmagazine.com/news/view/124325-Microsoft-Claims-Cloud-Will-Quadruple-Power-of-Xbox-One

M$ is already well aware that being more powerful is not in and of itself an advantage:

http://www.escapistmagazine.com/news/view/127561-Microsoft-Disputes-PS4-Power-Advantage

Albert Penello, Microsoft's director of product planning, himself has said

the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe.

So there you have it! Even with the quadrupling power of the cloud, "the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe." Do with that information what you will.

Wow, fanboy alert... you do know there is no real proof that the cloud will help, as it is based on connection! And why argue with what the developers say when they are the ones who make the damn games.

pandorum:

tdylan:
The joke's on all of you because even though the XBONE's power will actually be quadrupled by the use of the cloud

http://www.escapistmagazine.com/news/view/124325-Microsoft-Claims-Cloud-Will-Quadruple-Power-of-Xbox-One

M$ is already well aware that being more powerful is not in and of itself an advantage:

http://www.escapistmagazine.com/news/view/127561-Microsoft-Disputes-PS4-Power-Advantage

Albert Penello, Microsoft's director of product planning, himself has said

the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe.

So there you have it! Even with the quadrupling power of the cloud, "the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe." Do with that information what you will.

Wow, fanboy alert... you do know there is no real proof that the cloud will help, as it is based on connection! And why argue with what the developers say when they are the ones who make the damn games.

.... Please tell me you realize he's being sarcastic and are replying with similar irony.

Even if the Xbox None was the bestest and most powerful console there ever was and makes the PS4 look like a Sinclair ZX Spectrum in comparison, it don't mean jack if I can't play my games whilst my 'ternet is down.

[Captcha: Describe "Netflix"? I put "American"]

Strazdas:

Lightknight:

"30FPS in 19201080 on PS4, but it'll run at "20-something" FPS in 1600900 on Xbox One" is a much bigger difference that I thought it'd be. I wonder why they brought the resolution down though. What if they were both the same resolution?

Maybe the Xbone refused to run properly at 1080p? Or the framerate was so unstable they couldn't properly test it? Yeah, the resolution change here makes a huge difference and most people will miss that.

That's probably something they would mention. This is a lower resolution, and the XBO is having trouble running it at even the same frame rate that the PS4 manages at the higher one. This makes me suspect the nature/reliability of the test (and I'm firmly in the PS4 camp on the power disparity).

neppakyo:
Haha you make me laugh. As soon as you mentioned "the cloooooud" any valid point you had disappeared... not that you had any. Research a bit about hardware and software optimizations first. Ignorance may be bliss, but knowledge is power.

Also, I laughed more when you linked MS PR babble from Microsoft morons.

pandorum:
Wow, fanboy alert... you do know there is no real proof that the cloud will help, as it is based on connection! And why argue with what the developers say when they are the ones who make the damn games.

I know Tdylan already responded, but you should read his post again. To spell it out: he's saying that Microsoft is claiming the power difference doesn't matter while simultaneously claiming that their machine is so awesome because of cloud computing that makes things more powerful.

Regarding the topic, the thing is that cloud computing isn't anything special. It's non-local machines (aka servers stored somewhere else) doing the computing for you. Microsoft isn't the only company that has that; anyone with a server and an internet connection can have that. That doesn't make the XBO more powerful, it just means other machines can process things in always-online games to augment the processing going into the game. Which is funny, because servers already do that for any online console gaming. To say that other machines doing the processing instead will somehow make the XBO more powerful is a laugh. It's like saying a lazy man works like five men when you hire five more men to do the work for him.

If cloud computing is being implemented in single-player games, welcome to the world of obligatory always-online single-player games that finally introduce shitty lag to the storylines of what would otherwise be immersive titles. Because God only knows that what all the main storylines of BioShock, The Last of Us, Super Mario Bros., Fable, or any other popular series need is the introduction of lag, because even the most powerful server in the world can't do shit about an internet connection that has trouble moving all those bits of data instantly.
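
To put rough numbers on that lag concern, here's a quick sketch; the frame rates and pings below are illustrative assumptions, not measurements from either console:

```python
# How many rendered frames go by while waiting on one network round trip.
# Ping and frame-rate values are example figures, not measurements.

def frames_of_delay(ping_ms: float, fps: float) -> float:
    frame_budget_ms = 1000.0 / fps      # time available per frame
    return ping_ms / frame_budget_ms

for ping in (30, 100, 200):
    print(f"{ping:>3} ms ping: {frames_of_delay(ping, 30):.1f} frames late at 30 fps, "
          f"{frames_of_delay(ping, 60):.1f} at 60 fps")
```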

SeventhSigil:
.... Please tell me you realize he's being sarcastic and are replying with similar irony.

Oh, thank goodness, someone else figured it out.

Lightknight:

Strazdas:

Lightknight:

"30FPS in 19201080 on PS4, but it'll run at "20-something" FPS in 1600900 on Xbox One" is a much bigger difference that I thought it'd be. I wonder why they brought the resolution down though. What if they were both the same resolution?

Maybe the Xbone refused to run properly at 1080p? Or the framerate was so unstable they couldn't properly test it? Yeah, the resolution change here makes a huge difference and most people will miss that.

That's probably something they would mention. This is a lower resolution, and the XBO is having trouble running it at even the same frame rate that the PS4 manages at the higher one. This makes me suspect the nature/reliability of the test (and I'm firmly in the PS4 camp on the power disparity).

True enough, they should have mentioned that, but this is kind of "secret leaked" stuff, so it could be they were covering their ass / unable to access the machines enough to test them on equal footing / etc. I mean, sure, developers probably already got the machines to develop for, but just how much they're regulated is a question to be asked. For all we know, they could have reported the problem with the Xbox to MS, and that would be a dead giveaway of who did it.

Regarding the topic, the thing is that cloud computing isn't anything special. It's non-local machines (aka servers stored somewhere else) doing the computing for you. Microsoft isn't the only company that has that; anyone with a server and an internet connection can have that. That doesn't make the XBO more powerful, it just means other machines can process things in always-online games to augment the processing going into the game. Which is funny, because servers already do that for any online console gaming. To say that other machines doing the processing instead will somehow make the XBO more powerful is a laugh. It's like saying a lazy man works like five men when you hire five more men to do the work for him.

Erm, no, not exactly. Cloud computing is kinda special, since it does not even require servers to exist. Basically, in the cloud you have 100 (example number) Xboxes connected together sharing resources. So, for example, when you are at work Xbox owner A uses your Xbox's power to help with calculations, and when you are home you get to use Xbox owner A's power if he's not using it, and that would be shared between many Xboxes. Usually it needs a server to find other Xboxes, but you can program it to find them on its own by trying possible address ranges and then asking the Xboxes it finds for a list. This is how P2P downloading works in principle, and it can survive without a master server (tracker) thanks to the sharing of IP lists.
Thing is, no company has ever done a proper cloud yet. At least no one has announced it; who knows, maybe there's some guy in his basement who made it (the "guy" in this case being some military system that created a cloud for local use in military operations). What is quite popular now is cluster servers: basically, you take a bunch of computers, put them next to each other and make them work together. But that's just a server anyway.
Cloud storage, however, does exist; it basically stores the file in all the places where it's available and then uses the least loaded server to send you the data, so it sort of uses the cloud principle of load sharing. But cloud computing and cloud storage are a long way apart.
That being said, I do not believe Microsoft will even attempt a real cloud; they are only throwing around buzzwords. More likely they'll do server computing, where they try to offload some of the computation to the server (hence their 300 million investment in servers), and likely fail due to horrific American internet connections.
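
For the curious, here's a toy, in-memory sketch of that tracker-less peer-exchange idea: once you know one peer, you ask it for the peers it knows and keep growing your list. The addresses and network layout are made up for the example:

```python
from collections import deque

class Peer:
    def __init__(self, address):
        self.address = address
        self.known = set()           # addresses this peer already knows about

    def share_peer_list(self):
        # What a real peer would send back over the wire.
        return set(self.known)

def discover(start_addr, network):
    """Breadth-first peer exchange starting from a single known address."""
    found, queue = {start_addr}, deque([start_addr])
    while queue:
        addr = queue.popleft()
        for neighbour in network[addr].share_peer_list():
            if neighbour not in found:
                found.add(neighbour)
                queue.append(neighbour)
    return found

# Tiny fake swarm: A knows B, B knows C and D, D knows E.
network = {name: Peer(name) for name in "ABCDE"}
network["A"].known = {"B"}
network["B"].known = {"C", "D"}
network["D"].known = {"E"}

print(discover("A", network))   # finds all five peers without a tracker
```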

If cloud computing is being implemented in single-player games, welcome to the world of obligatory always-online single-player games that finally introduce shitty lag to the storylines of what would otherwise be immersive titles. Because God only knows that what all the main storylines of BioShock, The Last of Us, Super Mario Bros., Fable, or any other popular series need is the introduction of lag, because even the most powerful server in the world can't do shit about an internet connection that has trouble moving all those bits of data instantly.

Very much true, and this is why cloud computing can only work when we all have fiber optics. So go bug your ISP to get you one.

P.S. The sarcasm in that post was quite obvious; not sure why people took it literally. Then again, you never know, some people would post something like that and be serious...

inb4 hydrofire?

inb4 hydrofire.

Anyways: Heh. I wonder if they're going to have a manufacturing rearrangement done quietly, or if they'll just take their lumps (if they can find a non-lumpy part of their body at this point).

I really do love how the PS4 has a 50% FPS advantage in this situation... at a higher resolution.
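
For what it's worth, the rough pixel-throughput math behind that: "twenty-something" FPS is vague, so the Xbox One frame rate below is an assumption.

```python
# Pixels pushed per second, using the figures quoted in the article.
ps4 = 1920 * 1080 * 30    # 1080p at 30 fps
xbo = 1600 * 900 * 25     # 900p at an assumed ~25 fps ("20-something")
print(f"PS4: {ps4:,} px/s  XBO: {xbo:,} px/s  ratio: {ps4 / xbo:.2f}x")
```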

SeventhSigil:

pandorum:

tdylan:
The joke's on all of you because even though the XBONE's power will actually be quadrupled by the use of the cloud

M$ is already well aware that being more powerful is not in and of itself an advantage:

Albert Penello, Microsoft's director of product planning, himself has said

So there you have it! Even with the quadrupling power of the cloud, "the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe." Do with that information what you will.

Wow, fanboy alert... you do know there is no real proof that the cloud will help, as it is based on connection! And why argue with what the developers say when they are the ones who make the damn games.

.... Please tell me you realize he's being sarcastic and are replying with similar irony.

I need to write my sarcasm better.... I just wanted to pay homage to the nonsense that was the last fanboy war between the Xbox and PS3.

So, can we all agree that "the clooooud!" will make no difference in processing power? As has been stated before, it depends on your connection; even the fastest non-fiber-optic connection can't do anything but basic stuff (maybe it will in a decade or so). Mainly the cloud will be used for storage. Also, latency plays a huge role. You could have a 60 Mbps connection but lousy latency (i.e., an average of 200 ms+ ping).
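
A quick sketch of why a fat pipe doesn't fix latency, using the example figures above (60 Mbps link, 200 ms ping); the numbers are illustrative:

```python
link_mbps = 60
ping_ms = 200
frame_ms = 1000 / 60                                     # ~16.7 ms per frame at 60 fps

kb_per_frame = link_mbps * 1000 / 8 * frame_ms / 1000    # data that fits in one frame
print(f"~{kb_per_frame:.0f} KB can cross the link during a single frame,")
print(f"but a {ping_ms} ms ping still costs ~{ping_ms / frame_ms:.0f} frames of waiting.")
```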

And with the PS4 being basically a PC now, it will be immensely easier to program for than the older PlayStations. You'll notice a graphical difference between the two with PS4 exclusives, as the devs are not limited to the weaker Xbone platform. Hell, even cross-platform games will probably run smoother on the PS4, unless some devs are incompetent programmers.

And yes, developers would know a bit more than us forum trolls, as all we can do is conjecture and moan.

pandorum:

SeventhSigil:

pandorum:

Wow, fanboy alert... you do know there is no real proof that the cloud will help, as it is based on connection! And why argue with what the developers say when they are the ones who make the damn games.

.... Please tell me you realize he's being sarcastic and are replying with similar irony.

I need to write my sarcasm better.... I just wanted to pay homage to the nonsense that was the last fanboy war between the Xbox and PS3.

Speaking as someone who was poking around the official Xbox forums during the week after E3, you both needed to use the word Future waaaaaay more. >> First letter capitalization is key!

EDIT: Anyway, my bad, but without tone of voice, there really wasn't that much in the message to suggest anything but an earnest reply. xD

Well, either way, the important thing is that Sony fanboys have found another reason to feel superior to others.

Kross:
If they cared about speed, they'd put an SSD and larger quantity of RAM in the console first.

RAM speed doesn't help much when you are bottlenecked by disk I/O from swapping and such.

Given the current failure rates of even the best SSDs right now, this is not a good plan for a system that is meant to last several consecutive years without failure. Price would also be an issue, and the unit would have to be user-replaceable in the event of failure. SSD failure often happens quickly with no warning whatsoever, whereas traditional disk media combined with SMART gives you plenty of advance notice.
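
To put rough orders of magnitude on the disk I/O point quoted above (ballpark latency figures, not benchmarks):

```python
# Approximate random-access latencies; once a game is swapping to disk,
# the storage device dominates no matter how fast the RAM is.
latency_ns = {
    "RAM access":          100,          # ~100 ns
    "SSD random read":     100_000,      # ~0.1 ms
    "HDD seek + rotation": 10_000_000,   # ~10 ms
}
ram = latency_ns["RAM access"]
for name, ns in latency_ns.items():
    print(f"{name:20s} ~{ns:>12,} ns  ({ns // ram:,}x RAM latency)")
```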

SkarKrow:
GDDR5 is faster than DDR3?

I, for one, refuse to believe this.

Fiz_The_Toaster:
So, MS just basically went "nuh uh" and made excuses.

Sounds about right for them as of late.

To truly sound like recent Microsoft, they would have to re-hire Orth to tell people that if they want speed, they should buy the PS4. I mean, I don't think Larry Hryb is capable of bringing those A-game epic fail moments.

search_rip:

inb4 "fanboys killed the Xbone and it's revolutionary way of digital distribution and sharing games and blah blah blah"

That is logical. After all, removing the daily check-in slowed things down by at least 40%.

Adam Jensen:

They still need to sell the console. Putting an SSD would increase the price significantly. They'd have to get rid of the Kinect...

...Oh, you might be on to something here!

Slow down, son[1]! Just think of all the things we'd lose without Kinect!

...When I can think of one, I'll totally bring it up!

Captcha: Last Straw. Well, not quite, but I give you effort for trying, captcha.

[1] possibly due to a lowering of your clock speed

Ensayia:

Given the current failure rates of even the best SSDs right now, this is not a good plan for a system that is meant to last several consecutive years without failure.

Especially if Microsoft insists on going the "you can't swap out your HDD" route they're on right now.

Zachary Amaranth:

Fiz_The_Toaster:
So, MS just basically went "nuh uh" and made excuses.

Sounds about right for them as of late.

To truly sound like recent Microsoft, they would have to re-hire Orth to tell people that if they want speed, they should buy the PS4. I mean, I don't think Larry Hryb is capable of bringing those A-game epic fail moments.

If memory serves, Mr. Hryb had one about a week or so ago over Twitter(?) when a fan asked him why a loyal fan should bother with the new Xbox, and his response was pretty fail-worthy.

Still, Microsoft can do all those things by themselves without really needing a major person involved here.

Fiz_The_Toaster:

If memory serves, Mr. Hryb had one about a week or so ago over Twitter(?) when a fan asked him why a loyal fan should bother with the new Xbox, and his response was pretty fail-worthy.

Still, Microsoft can do all those things by themselves without really needing a major person involved here.

It's quite possible. I don't follow social media that much. Especially for celebrities and corporations. I find this to be better for my sanity.

Hryb was also doing the "TEH FEWCHUR!" thing alongside some of the other luminaries of Microsoft, but it wouldn't seem right unless such stupid was coming from Orth.

A lot of these guys just seem to be toeing the corporate line, is what I'm getting at. Orth was like that one guy we all know who takes the joke a little too far so reliably you can count on crickets AND set your watch to them.

I mean, we can all say dumb things, especially if we're working for a corporation. This is one reason I like being self-employed. I'm not mandated to say anything stupid. My stupidity is organic!

And yeah, I get that Orth's bit started out as a joke, but he did what I like to call[1] doubling down on stupid.

[1] I bet other people do, but I like to pretend this is something original because it makes me feel special

Zachary Amaranth:

Fiz_The_Toaster:

If memory serves, Mr. Hryb had one about a week or so ago over Twitter(?) when a fan asked him why a loyal fan should bother with the new Xbox, and his response was pretty fail-worthy.

Still, Microsoft can do all those things by themselves without really needing a major person involved here.

It's quite possible. I don't follow social media that much. Especially for celebrities and corporations. I find this to be better for my sanity.

Hryb was also doing the "TEH FEWCHUR!" thing alongside some of the other luminaries of Microsoft, but it wouldn't seem right unless such stupid was coming from Orth.

A lot of these guys just seem to be toeing the corporate line, is what I'm getting at. Orth was like that one guy we all know who takes the joke a little too far so reliably you can count on crickets AND set your watch to them.

I mean, we can all say dumb things, especially if we're working for a corporation. This is one reason I like being self-employed. I'm not mandated to say anything stupid. My stupidity is organic!

And yeah, I get that Orth's bit started out as a joke, but he did what I like to call[1] doubling down on stupid.

I view Hryb as nothing more than their cheerleader, and I tend not to take him seriously because of it. I mean, obviously he has to say the things he says because of who he works for, but even in the face of criticism, saying the same things over and over again really isn't going to help.

I found Orth's comment to be one part dumb and one part ballsy, but hey, he found out what happens when you take your own jokes too far and people aren't amused.

See, for myself I just stay quiet because I don't play politics, nor am I asked to answer such things. The joys of blending into the background. :D

[1] I bet other people do, but I like to pretend this is something original because it makes me feel special

Zachary Amaranth:

SkarKrow:
GDDR5 is faster than DDR3?

I, for one, refuse to believe this.

The correct answer is: it depends. DDR3 will be faster for general computation needs, as that's what it's designed for, after all. GDDR5 was tailored to the specific needs of a GPU (texture sampling/writing). There's actually a very high chance that a central game loop will have a pathological hotspot for GDDR5 somewhere inside it.

Statements like this are misleading. The developer should get a slap on the wrist.
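
To put numbers on the bandwidth side of that trade-off, here's a rough sketch using the widely reported launch specs (256-bit buses, GDDR5-5500 vs DDR3-2133), so treat the figures as approximate; it says nothing about the latency point above:

```python
# Theoretical peak bandwidth = bus width (bytes) x effective transfer rate.
def peak_gb_per_s(bus_bits, megatransfers_per_s):
    return bus_bits / 8 * megatransfers_per_s * 1e6 / 1e9

print(f"PS4 GDDR5-5500: {peak_gb_per_s(256, 5500):.0f} GB/s")   # ~176 GB/s
print(f"XBO DDR3-2133:  {peak_gb_per_s(256, 2133):.0f} GB/s")   # ~68 GB/s (plus 32 MB eSRAM)
```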

Fiz_The_Toaster:

See, for myself I just stay quiet because I don't play politics, nor am I asked to answer such things. The joys of blending into the background. :D

I lack the appropriate social filters to blend into the background. On the other hand, the same lack of filters lets me do work for people regardless of politics, so it's not an entirely bad thing. Except I run the risk of others not reciprocating, as I've found a lot of people seem to have issues working with me. On the other hand, I don't think I could do the cheerleader thing. I'm trying to find an agent for a novel I wrote, and I have trouble mustering up hype for that, so you can imagine how lame I'd be at hyping the XBone.

"Well...Uhhh...My mom thinks it's cool...."

On the other hand, I'm a nerd so nobody cares. In some elements, I guess I DO blend in. >.>

Zachary Amaranth:

Fiz_The_Toaster:

See, for myself I just stay quiet because I don't play politics, nor am I asked to answer such things. The joys of blending into the background. :D

I lack the appropriate social filters to blend into the background. On the other hand, the same lack of filters lets me do work for people regardless of politics, so it's not an entirely bad thing. Except I run the risk of others not reciprocating, as I've found a lot of people seem to have issues working with me. On the other hand, I don't think I could do the cheerleader thing. I'm trying to find an agent for a novel I wrote, and I have trouble mustering up hype for that, so you can imagine how lame I'd be at hyping the XBone.

"Well...Uhhh...My mom thinks it's cool...."

On the other hand, I'm a nerd so nobody cares. In some elements, I guess I DO blend in. >.>

Well, so do I, but I think I've gained some secret stealthy powers or something, I dunno.

But perhaps a lack of social filters is the new Xbone's problem too. I mean, it's pretty bad when there are jokes for it already and the thing isn't out yet.

I think at this point any attempt to make the console seem appealing to more of the audience would be lame.[1] :/

[1] That's just my opinion anyways, for what it's worth. Of which I would imagine it's worth nothing. >.>

Fiz_The_Toaster:

Well, so do I, but I think I've gained some secret stealthy powers or something, I dunno.

To be fair, I don't register on people's minds most of the time. It's rather annoying. I got several people telling me I needed to walk more when I had a big hole in my foot.

Yay.

But perhaps a lack of social filters is the new Xbone's problem too. I mean, it's pretty bad when there are jokes for it already and the thing isn't out yet.

Doesn't hurt that they put zero thought into their branding.

I think at this point any attempt to make the console seem appealing to more of the audience would be lame.[1] :/

Lame though it might be, it would still beat this passive-aggressive, emo "we totally had an awesome system, we just failed to explain it so you fucking idiots could get it through your tiny pea brains" crap they've been pulling.

[1] That's just my opinion anyways, for what it's worth. Of which I would imagine it's worth nothing. >.>

I'm going mostly off of memory here, but

Simalacrum:
3rd gen: NES less powerful than competitors, slaughtered competition.

I'm not sure what competition there was in the NES days, unless you mean the Atari. And I'm pretty certain that the Atari wasn't the more powerful of the two.

4th gen: SNES less powerful than Mega Drive, out-sold it. (just)

You mean the Genesis? I'm pretty certain the SNES was more powerful than it. Here's a link supporting my suspicions: http://web.ics.purdue.edu/~dherring/cgt141/project1/comparison.html. In pretty much everything except clock speed, the SNES had better or comparable specs.

5th gen: PS1 less powerful than N64, outsells it.

I think this had way more to do with CDs being a much, much better medium for games than cartridges than with any relative power of the systems. Note that all of the current consoles use similar media (there's no "quantum leap" like there was in the PS1 days).

6th gen: PS2 least powerful of the Big 3 (sorry Dreamcast), massacred competition.

Again, I suspect this had more to do with backwards compatibility with an already massive library and market share, which neither the PS4 nor the XB1 will have.

7th gen: Wii almost last-gen quality graphics, outsells everyone.

But mostly thanks to their motion control gimmick. If the Kinect works out for Microsoft as well as the Wii-mote did for the Wii, then yes, the XB1 will have a huge advantage. Otherwise, not so much.

I guess what I'm saying is, except for cases in which one competitor does something significantly better than the others, the more powerful machine has actually held a pretty big advantage. Even the PS3 caught up eventually, once they realized that customers' pockets were not actually bottomless. If the PS4 is in fact significantly more powerful than the XB1, it does not bode well for Microsoft. (Then again, not much does, at this point.)

Ensayia:

Kross:
If they cared about speed, they'd put an SSD and larger quantity of RAM in the console first.

RAM speed doesn't help much when you are bottlenecked by disk I/O from swapping and such.

Given the current failure rates of even the best SSDs right now, this is not a good plan for a system that is meant to last several consecutive years without failure. Price would also be an issue, and the unit would have to be user-replaceable in the event of failure. SSD failure often happens quickly with no warning whatsoever, whereas traditional disk media combined with SMART gives you plenty of advance notice.

Current SSDs are both SMART-capable [1] and tend to fail less than hard drives with multiple moving parts - consoles especially tend to have read-heavy loads, as you only install once and write out save games. You may be thinking of some combination of SSDs from years ago, when they were first released to consumers, and the various lower-quality models littering the market (which are really easy to stumble into).

Over the last 2 years or so, almost all of the reliability/wear considerations have been addressed relative to traditional hard drives. Now more focus is on reducing price points (which would be the real concern for a console of course - though judging by the price premium consoles have loved to charge for replacement/external hard drives for years now...).

One thing to keep in mind with online save-game and license storage (which is already something they track with the 360, even if they don't leverage it as much as they plan to for next gen) is that the console disk doesn't need to be recoverable in a failure scenario. You just RMA a new one and log in to your profile/re-install whatever you're playing. They aren't going to want to do more serious recovery than that on any disk. If you uploaded your music collection to your console, you'd be boned regardless, unless you were willing to do recovery on your own.

[1] Though they tend not to have the advance warning that comes from picking up mechanical issues, as you mentioned, and I can't find a recent discussion of this to see what's changed.
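
On the SMART point, here's a minimal sketch of polling a drive's self-assessment with smartmontools; it assumes smartctl is installed, that /dev/sda is the right device, and it usually needs admin rights:

```python
import subprocess

def smart_health(device="/dev/sda"):
    # 'smartctl -H' prints the drive's overall health self-assessment.
    result = subprocess.run(["smartctl", "-H", device],
                            capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    print(smart_health())
```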

Strazdas:
Erm, no, not exactly. Cloud computing is kinda special, since it does not even require servers to exist. Basically, in the cloud you have 100 (example number) Xboxes connected together sharing resources. So, for example, when you are at work Xbox owner A uses your Xbox's power to help with calculations, and when you are home you get to use Xbox owner A's power if he's not using it, and that would be shared between many Xboxes. Usually it needs a server to find other Xboxes, but you can program it to find them on its own by trying possible address ranges and then asking the Xboxes it finds for a list. This is how P2P downloading works in principle, and it can survive without a master server (tracker) thanks to the sharing of IP lists.

Cloud computing is virtual machines.

Dynamically sharing computing problems with a client network is a form of Distributed Computing.

Cloud computing operates on a similar resource-allocation principle to airlines overbooking flights (maximum hardware utilization against uneven demand), using the theory that 100% of your RAM/CPU/disk I/O capacity is rarely used by one operating system plus its processes. So you put another virtual machine on the same system that can use the idle resources.

Distributed computing solves large or ongoing problems, so it will always use 100% of whatever resources you allocate to it, which is TERRIBLE in a "cloud" environment, as it means a single instance will choke out every other instance on the same machine for a particular resource (usually CPU in traditional distributed computing; more typically things like disk I/O when running a database in a VM).

From a user's perspective, "cloud" anything is EXACTLY the same as any other sort of "online service". They are hosted services at a data center somewhere that the administrators can spin up faster than they could hook up new hardware. There's no end-user difference from something being in "the cloud", other than that companies can offer online services for a cheaper outlay than they could previously.
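
A toy illustration of that overbooking idea: several VMs are promised more CPU than the host physically has, on the bet that they won't all peak at once. The numbers are invented for the example:

```python
host_cores     = 16
vm_allocations = [4, 4, 4, 4, 4, 4]                # vCPUs promised to each VM
typical_usage  = [1.0, 0.5, 2.0, 0.5, 1.5, 1.0]    # cores each VM actually uses on average

print(f"overcommit ratio: {sum(vm_allocations) / host_cores:.1f}x")
print(f"typical load:     {sum(typical_usage):.1f} of {host_cores} cores")
# A distributed-computing client pinned at 100% of its allocation would break
# this assumption, which is the point made above.
```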

I don't understand how the company that made Windows can be so bad at hardware. It's baffling. You know how this stuff works, Microsoft! Christ...

Strazdas:
Erm, no, not exactly. Cloud computing is kinda special, since it does not even require servers to exist. Basically, in the cloud you have 100 (example number) Xboxes connected together sharing resources. So, for example, when you are at work Xbox owner A uses your Xbox's power to help with calculations, and when you are home you get to use Xbox owner A's power if he's not using it, and that would be shared between many Xboxes. Usually it needs a server to find other Xboxes, but you can program it to find them on its own by trying possible address ranges and then asking the Xboxes it finds for a list. This is how P2P downloading works in principle, and it can survive without a master server (tracker) thanks to the sharing of IP lists.
Thing is, no company has ever done a proper cloud yet. At least no one has announced it; who knows, maybe there's some guy in his basement who made it (the "guy" in this case being some military system that created a cloud for local use in military operations). What is quite popular now is cluster servers: basically, you take a bunch of computers, put them next to each other and make them work together. But that's just a server anyway.
Cloud storage, however, does exist; it basically stores the file in all the places where it's available and then uses the least loaded server to send you the data, so it sort of uses the cloud principle of load sharing. But cloud computing and cloud storage are a long way apart.
That being said, I do not believe Microsoft will even attempt a real cloud; they are only throwing around buzzwords. More likely they'll do server computing, where they try to offload some of the computation to the server (hence their 300 million investment in servers), and likely fail due to horrific American internet connections.

The real cloud would be functionally identical to the type of server processing you and I both assume they'll be doing.

Real cloud computing also introduces an entire host of problems including making some already existing issues (like internet connection) even more problematic.

So when companies say cloud computing, I assume they are lying to us until proven otherwise and that they just mean server-side processing. As of right now, server-side processing is just so much more efficient. In fact, for the most part, cloud computing would only benefit companies like console manufacturers, so that they don't have to host servers.

Very much true, and this is why cloud computing can only work when we all have fiber optics. So go bug your ISP to get you one.

Cloud computing would cause some other issues that really shouldn't be the customer's responsibility. I don't want other people to be utilizing any of my bandwidth or my console's power unless I'm in the game with them. I don't think cloud computing should be the way of the future. Not unless it's for some altruistic ventures or something like that where we are giving some of our machines face time willingly for a good cause or if we're being compensated accordingly.

P.S. The sarcasm in that post was quite obvious; not sure why people took it literally. Then again, you never know, some people would post something like that and be serious...

That's the primary reason why Poe's law now applies to any extreme position, rather than just fundamentalism. Got to throw that ol' winking smiley on everything or expect it to be missed. But I felt like the context of this one was quite strong.

Micalas:
I don't understand how the company that made Windows can be so bad at hardware. It's baffling. You know how this stuff works, Microsoft! Christ...

It's important to remember two things:

1. Microsoft did not make a bad piece of hardware. They only made a machine that is weak when compared to its competitor, Sony's PS4. The XBO is likely to be a powerful machine capable of doing quite a bit, even if it falls short of the PS4. So don't think that they created crap. They just created something that fails to impress thanks to the competition doing a better job.

2. Sony is a hardware company. They actually should have an advantage here. You'd probably be surprised how badly an OS designed by Sony for a PC would perform.

As someone who knows very little about GDDR5 as opposed to DDR3, I was sufficiently confused by people arguing that they would be roughly equivalent to stop having an opinion. I am glad to now have some professional opinion to set the record straight.

Kross:
snip

First of all, I'd like to say I'm honored the master has spoken to me. I've been stalking some of your comments and it's really nice you took the time.
OK, now onto arguing.

Current SSDs are both SMART capable

But SMART is a lie. It's designed to show very good health for as long as possible so that warranty costs are lower. It hides broken clusters from you as long as it can and even keeps reserve clusters to masquerade as good ones in place of broken ones. If SMART is showing problems, it's time to start panicking, because the hardware has already failed to disguise its own failure. Doom is imminent at this point, and yet Windows STILL tries to hide broken clusters and compensate for failures. The whole HDD failure-detection scheme is designed to hide it as well as possible from the end user.
Though I find that most failures of my hard drives happened simply because the motor wore out and started giving up after 7 years until it eventually crashed. SSDs, having no moving parts, do get around this problem. Still, they are way too expensive for the benefit unless they are used for something like a web server, which is great for sites like the Escapist but not really that great for a regular gamer.

Cloud computing is virtual machines.

Dynamically sharing computing problems with a client network is a form of Distributed Computing.

Cloud computing operates on a similar resource-allocation principle to airlines overbooking flights (maximum hardware utilization against uneven demand), using the theory that 100% of your RAM/CPU/disk I/O capacity is rarely used by one operating system plus its processes. So you put another virtual machine on the same system that can use the idle resources.

Distributed computing solves large or ongoing problems, so it will always use 100% of whatever resources you allocate to it, which is TERRIBLE in a "cloud" environment, as it means a single instance will choke out every other instance on the same machine for a particular resource (usually CPU in traditional distributed computing; more typically things like disk I/O when running a database in a VM).

From a user's perspective, "cloud" anything is EXACTLY the same as any other sort of "online service". They are hosted services at a data center somewhere that the administrators can spin up faster than they could hook up new hardware. There's no end-user difference from something being in "the cloud", other than that companies can offer online services for a cheaper outlay than they could previously.

From an end-user perspective it's online services regardless, true, but that's the problem to begin with. Apparently most people don't want online services running their SP games. Not that it could even work with the centralized servers MS is trying to do, but it could in theory with distributed computing, if you have enough machines in the neighborhood.
Will distributed computing always use 100%, though? What if the supply of resources is greater than the demand? P2P does not always utilize all the resources allocated to it, even though it technically could, because there is not enough demand. Wouldn't that be the same with other calculations if we had supply larger than demand?
Currently, distributed computing solves large and ongoing problems[1], but it can be used for other stuff too.

Lightknight:
The real cloud would be functionally identical to the type of server processing you and I both assume they'll be doing.

Real cloud computing also introduces an entire host of problems including making some already existing issues (like internet connection) even more problematic.

So when companies say cloud computing, I assume they are lying to us until proven otherwise and that they just mean server-side processing. As of right now, server-side processing is just so much more efficient. In fact, for the most part, cloud computing would only benefit companies like console manufacturers, so that they don't have to host servers.

Very much true; they usually do just mean server processing. Cloud is just a popular buzzword now that doesn't seem to mean anything anymore anyway.

Cloud computing would cause some other issues that really shouldn't be the customer's responsibility. I don't want other people to be utilizing any of my bandwidth or my console's power unless I'm in the game with them.

Yes, and people like you ARE THE PROBLEM. You want to utilize resources from other people, but don't want to give yours in return, hence the choke point: no one is sharing.
If you've got fiber optics, your bandwidth is a non-issue, though. Proper fiber optics give you so much bandwidth that you can do anything with your internet and it does not choke, and there are no bandwidth caps from ISPs on fiber anyway (if yours imposes one, they are terrible people and should never sell a contract again). Processing power, however, does raise your electricity bill and wear down your hardware, so it's understandable that you don't want to give any, but then you won't receive any either, and we're back to local hardware only.

1. Microsoft did not make a bad piece of hardware.

I disagree. While they try to cover it up, there were leaks showing they had massive problems with their attempt at a different CPU design that was supposed to be cheaper but failed, so they wasted a lot of money, and given the instability in testing we should expect high failure rates.
Now, unless multiple sources flat-out lied about it, they did make a bad piece of hardware. Unless they hired a magician who fixed all those problems in the last few months.

[1] Hence my comment about it not being used for many things at all yet

Strazdas:

Lightknight:
Cloud computing would cause some other issues that really shouldn't be the customer's responsibility. I don't want other people to be utilizing any of my bandwidth or my console's power unless I'm in the game with them.

Yes, and people like you ARE THE PROBLEM. You want to utilize resources from other people, but don't want to give yours in return, hence the choke point: no one is sharing.
If you've got fiber optics, your bandwidth is a non-issue, though. Proper fiber optics give you so much bandwidth that you can do anything with your internet and it does not choke, and there are no bandwidth caps from ISPs on fiber anyway (if yours imposes one, they are terrible people and should never sell a contract again). Processing power, however, does raise your electricity bill and wear down your hardware, so it's understandable that you don't want to give any, but then you won't receive any either, and we're back to local hardware only.

I don't want to use the resources from other people either. I don't think any product on the market is good enough at load balancing if one of the processors augmenting my game has low bandwidth or gets shut off in the middle of a computation. I want companies to stop trying to shift all the costs onto consumers and to buy their own damn servers, just like we've done for decades. If they're going to use consumers' machines, then you're talking about making our hardware do their work. If that's the case, we should be compensated.

If I got fiber-optic bandwidth, it would still be an issue. Just because I have a fast internet connection does not mean these other servers do. In internet traffic, the weakest link sets the speed you get.

But yes, living in Google Fiber areas, or even areas that Google Fiber is threatening to enter, would be nice. I don't even care about 1 GB up/down speeds. 100 MB up/down would already be fantastic for everything but downloading the largest files, and even then you'd be cutting download times by several hours.

1. Microsoft did not make a bad piece of hardware.

I disagree. While they try to cover it up, there were leaks showing they had massive problems with their attempt at a different CPU design that was supposed to be cheaper but failed, so they wasted a lot of money, and given the instability in testing we should expect high failure rates.
Now, unless multiple sources flat-out lied about it, they did make a bad piece of hardware. Unless they hired a magician who fixed all those problems in the last few months.

Another round of high failure rates? If that happened then yes, I would absolutely agree at that point. But as of now, with the numbers we have, I think they just made a weaker machine, and forcing the Kinect (which they admitted costs almost as much as the console) is what has inflated costs. So what we're really looking at is something like a $300 machine compared to Sony's $400 model, which by all accounts is looking like quite the machine for that price tag. Except MS is charging us a mandatory $500 for it.

Kross:
Cloud computing is virtual machines.

Dynamically sharing computing problems with a client network is a form of Distributed Computing.

Cloud computing operates on a similar resource-allocation principle to airlines overbooking flights (maximum hardware utilization against uneven demand), using the theory that 100% of your RAM/CPU/disk I/O capacity is rarely used by one operating system plus its processes. So you put another virtual machine on the same system that can use the idle resources.

Distributed computing solves large or ongoing problems, so it will always use 100% of whatever resources you allocate to it, which is TERRIBLE in a "cloud" environment, as it means a single instance will choke out every other instance on the same machine for a particular resource (usually CPU in traditional distributed computing; more typically things like disk I/O when running a database in a VM).

From a user's perspective, "cloud" anything is EXACTLY the same as any other sort of "online service". They are hosted services at a data center somewhere that the administrators can spin up faster than they could hook up new hardware. There's no end-user difference from something being in "the cloud", other than that companies can offer online services for a cheaper outlay than they could previously.

That's what I was thinking. There was a period of time when everyone, even the cloud computing leaders, was using the term for any kind of online service at all, including the idea of load balancing across multiple machines (distributed computing). I work with VMs all day long. I'm the sole manager of them for an international corporation and it isn't even part of my job description. Heh.

Of all the companies, Sony's PS3 volunteer computing for the Folding@home project is the best example of successful distributed computing. I really hope they allow us to do things like that this generation too. I loved going to bed at night or leaving for work in the morning knowing that my PS3 was actually doing something useful. I'd like the ability to set it to work more autonomously.

I'm glad you agree that this would cause a host of other problems where gaming is concerned. As far as I can tell, the optimum scenario would be a large datastore with a very fast internet connection so that users are only bottlenecked by their own speeds. We haven't seen anything better than that aside from the client machine being powerful enough to handle it.

So I think it's a safe assumption that MS is doing what we traditionally think of as the datastore model.

Lightknight:
I don't want to use the resources from other people either.

Then you don't want clouds. Fair enough, but we need there to be a desire for the cloud for this theoretical situation. Without such a need, we should just go without one, quite obviously.

I don't think any product on the market is good enough at load balancing if one of the processors augmenting my game has low bandwidth or gets shut off in the middle of a computation.

hence "it has never been done in reality yet".

I want companies to stop trying to shift all the costs onto consumers and to buy their own damn servers, just like we've done for decades. If they're going to use consumers' machines, then you're talking about making our hardware do their work. If that's the case, we should be compensated.

But your hardware has been doing the work for the whole history of video games. You always covered the cost of the local computer and your hardware did all the work, locally. If anything, given perfect conditions, distributed computing is cheaper for the end user because you need cheaper hardware. But conditions are not perfect, obviously. Granted, there is OnLive, which does shift work to servers, but it isn't popular at the moment (mostly because publishers refuse to work with it, not due to a lack of demand from users).

If I got fiber-optic bandwidth, it would still be an issue. Just because I have a fast internet connection does not mean these other servers do. In internet traffic, the weakest link sets the speed you get.

Yes. EVERYONE needs to have fiber optics in this case. Sadly, there are plenty of reasons this is intentionally being sabotaged.

But yes, living in Google Fiber areas, or even areas that Google Fiber is threatening to enter, would be nice. I don't even care about 1 GB up/down speeds. 100 MB up/down would already be fantastic for everything but downloading the largest files, and even then you'd be cutting download times by several hours.

Yes, 100 Mbps is good enough. Personally I chose that over 300, since I don't really need 300. They do offer up to 1 Gbps here. But you seem to be using GB and MB instead of Gb and Mb; not sure if that's intentional. If it is intentional, then you don't really need 800 Mbps (100 MB/s), and 8 Gbps (1 GB/s) is only accessible to industry as of yet.
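
For anyone mixing the two up, the conversion is just a factor of eight; a quick sketch using the figures mentioned in this thread:

```python
def mbps_to_mbytes_per_s(mbps):
    return mbps / 8     # 8 bits per byte

for link in (10, 60, 100, 1000):     # Mbps figures quoted above
    print(f"{link:>4} Mbps line  ~= {mbps_to_mbytes_per_s(link):>6.1f} MB/s of actual data")
```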

Another round of high failure rates? If that happened then yes, I would absolutely agree at that point. But as of now with the numbers we have, I think they just made a weaker machine and forcing the kinect (which they admitted costs almost as much as the console) is what has inflated costs.

Like I said, if the leaked info was correct we will see that; if someone was just trying to make the company look bad, then we don't know. I guess we will have to wait and see.

As for Kinect, I had a theory that the Kinect does some of the processing to compensate for the weak hardware in the box itself, hence the mandatory connection, since it does have to have its own processor for the visual recognition, and quite a powerful one. But if the statement that it can be disconnected and the machine will still work is true, then this theory is bust.

Strazdas:
Then you don't want clouds. Fair enough, but we need there to be a desire for the cloud for this theoretical situation. Without such a need, we should just go without one, quite obviously.

I do want... "clouds". I want these companies to have centrally located server farms that do the necessary processing for my online games.

I specifically do not want ANY cloud-based anything for single-player titles. Screw EA for trying it with SimCity (and lying about it), and screw publishers who would offload menial tasks that the XBO could otherwise handle just to shoehorn in always-online gameplay under the guise of "but servers process necessary functions", without any statement as to whether or not the XBO could have done it.

hence "it has never been done in reality yet".

There is no proven advantage to distributed computing, but there is a multitude of known shortcomings. Even if bandwidth became 100 times better, it still wouldn't be better to have distributed computing rather than dedicated servers, because that bandwidth would improve the dedicated servers too.

Do you think there's some kind of serious problem with how servers work now? As a COD gamer, I can tell you there's nothing that distributed computing, done perfectly, would contribute to gaming as we know it. The demands of gaming would have to become astronomically more advanced than they are now, and I don't see that happening out of sync with current tech. We are rapidly approaching a day when graphics and physics engines are better than necessary. Just as word processing became trivial for a PC, video games will eventually arrive at a degree of realism for which additional processing just isn't necessary. I imagine a future where every game that wants to look realistic can, thanks to common and cheap game engines built in the years after we reach that point. Imagine a future where a game can only succeed on its story, since every game has access to the most advanced engines necessary to make things as realistic as possible. Heh, we'll probably even see a renewed move towards artistic games too.

An entirely new kind of application would have to be created that demands far more than modern processing seems capable of meeting without vast amounts of computing. Something like a brain interface that adds to the senses in some way, requiring sensory input and output at a highly complex level. Even then, there is no reason why datastore servers wouldn't still do a better job than distributed computing.

So I reject the premise that distributed computing would benefit videogames in any way that a server or cloud computing as we know it doesn't currently meet.

But your hardware has been doing the work for the whole history of video games. You always covered the cost of the local computer and your hardware did all the work, locally. If anything, given perfect conditions, distributed computing is cheaper for the end user because you need cheaper hardware. But conditions are not perfect, obviously. Granted, there is OnLive, which does shift work to servers, but it isn't popular at the moment (mostly because publishers refuse to work with it, not due to a lack of demand from users).

My hardware has done the work for ME. This is no different from saying that my blender blends food for me. I paid for it to do so and that shouldn't mean that my neighbor can walk in and borrow it to use when I'm not using it. This shift would mean that my XBO would never not be working. When I'm at work or on vacation, if my XBO was left in a sleep state it would be processing work for other gamers, thereby incurring wear and tear on my console at no benefit to me. Even if I don't play the online multiplayer titles. I'm sorry but that simply isn't my responsibility. If I had consoles that did this, I would kill the power to them after each and every play session.

Yes. EVERYONE needs to have fiber optics in this case. Sadly, there are plenty of reasons this is intentionally being sabotaged.

I would love to spend my days unplugging my XBO from the internet every few hours. Just imagining someone somewhere seeing lag at just the wrong moment. Regardless, me having fiber optics is relatively simple. Getting the entire gaming community to move to it on any reasonable timeframe in the next ten years? Not likely. That $70 Google Fiber charges for the excellent bandwidth is just $10 less than what I pay per month for 10 Mbps plus cable in two rooms of my house. I doubt the average gamer would see a need to upgrade to that unless they're doing a ton of downloading. I only download huge files once every few months. Those would be the games I purchase on Steam, and I have no qualms about waiting half a day for those to download.

Yes, 100 Mbps is good enough. Personally I chose that over 300, since I don't really need 300. They do offer up to 1 Gbps here. But you seem to be using GB and MB instead of Gb and Mb; not sure if that's intentional. If it is intentional, then you don't really need 800 Mbps (100 MB/s), and 8 Gbps (1 GB/s) is only accessible to industry as of yet.

Nope, just stuck in RAM notation. Common mistake from what I've seen. What's funny is I had it right but "corrected" myself.

Like I said, if the leaked info was correct we will see that; if someone was just trying to make the company look bad, then we don't know. I guess we will have to wait and see.

If the leaked information was correct, we MAY see that. The testing done previously wouldn't have been just about the CPU. It would have covered all of the components, and replacing the CPU wouldn't necessarily have undone that work. These are also relatively old CPUs. The benefit of living in a world where CPUs are now just glorified switchboard operators that offload all the work to the RAM and GPU is that the CPUs don't have to be cutting-edge i7s or something. I still regret having sprung for the extra $100 i7 in my PC, as my processor is never over 10% utilization. Could have grabbed another video card to bridge instead.

As for Kinect, I had a theory that the Kinect does some of the processing to compensate for the weak hardware in the box itself, hence the mandatory connection, since it does have to have its own processor for the visual recognition, and quite a powerful one. But if the statement that it can be disconnected and the machine will still work is true, then this theory is bust.

Well, that statement was from an MS representative's own mouth so we can assume it is bust.

Additionally, having the Kinect process some of the work would make the hardware proprietary again, stealing the advantage of being x86. So I would have contested that postulation before as well. Not that MS hasn't been making huge mistakes anyway.

Lightknight:

Strazdas:
Then you don't want clouds. Fair enough, but we need there to be a desire for the cloud for this theoretical situation. Without such a need, we should just go without one, quite obviously.

I do want... "clouds". I want these companies to have centrally located server farms that do the necessary processing for my online games.

What you want is streaming, not clouds.

I specifically do not want ANY cloud-based anything for single-player titles. Screw EA for trying it with SimCity (and lying about it), and screw publishers who would offload menial tasks that the XBO could otherwise handle just to shoehorn in always-online gameplay under the guise of "but servers process necessary functions", without any statement as to whether or not the XBO could have done it.

Very much in agreement here.

There is no proven advantage to distributed computing, but there is a multitude of known shortcomings. Even if bandwidth became 100 times better, it still wouldn't be better to have distributed computing rather than dedicated servers, because that bandwidth would improve the dedicated servers too.

Costs. That's what it all boils down to anyway, right? It's cheaper to use distributed computing than server farms if distributed computing works as intended, which so far it doesn't.

Do you think there's some kind of serious problem with how servers work now? As a COD gamer, I can tell you there's nothing that distributed computing, done perfectly, would contribute to gaming as we know it.

I don't think there is a problem with how servers work. I do not advocate cloud gaming at all; I'm just saying that such a thing is theoretically possible and efficient. Key word here, though: theoretically.

You play COD, so you are probably going to be buying a machine powerful enough to run the newest title, right? That is an expenditure on hardware. If you could use your old hardware, helped by five other old machines via distributed computing, to play the title, that saves spending on hardware (and is better for the environment, because you don't make new hardware). I'd say that's a fair contribution.

The demands of gaming would have to become astronomically more advanced than they are now, and I don't see that happening out of sync with current tech. We are rapidly approaching a day when graphics and physics engines are better than necessary. Just as word processing became trivial for a PC, video games will eventually arrive at a degree of realism for which additional processing just isn't necessary. I imagine a future where every game that wants to look realistic can, thanks to common and cheap game engines built in the years after we reach that point. Imagine a future where a game can only succeed on its story, since every game has access to the most advanced engines necessary to make things as realistic as possible. Heh, we'll probably even see a renewed move towards artistic games too.

But the demands of gaming have become astronomically more advanced than they were, say, 20 years ago, when top games were made by three-man teams.
I would dispute you on the word necessary. What is necessary? For some, 10-year-old graphics is the necessary limit; personally, I won't stop pushing till we get reality simulation. I'm not a sucker for graphics, heck, they don't even come into consideration when I'm choosing a game, but I have a dream that one day we will have a virtual reality game with a whole city or similar landscape simulated with realistic physics, destruction mechanics, etc. Our graphical and physics engines aren't even close to that. Our server farms aren't even close to that. We have nothing that could do this. But maybe, some day, we will. Necessary is intangible. Crytek boasted about how Crysis 1 was photorealistic; it turns out it wasn't even close to what, say, Crysis 3 put out. Realism won't exist till we get reality simulation.
So yes, it's a nice dream to have a world where every game has access to realistic looks, but it's one that's not coming soon and one that our computers are far from being able to handle.

So I reject the premise that distributed computing would benefit video games in any way that servers or cloud computing as we know them don't already cover.

So you reject the idea that more accessible computing power closer to your location would be beneficial?

My hardware has done the work for ME. This is no different from saying that my blender blends food for me. I paid for it to do so, and that shouldn't mean that my neighbor can walk in and borrow it when I'm not using it. This shift would mean that my XBO would never not be working. When I'm at work or on vacation, if my XBO were left in a sleep state it would be processing work for other gamers, thereby incurring wear and tear on my console at no benefit to me, even if I don't play the online multiplayer titles. I'm sorry, but that simply isn't my responsibility. If I had consoles that did this, I would kill the power to them after each and every play session.

That's the kind of thinking that makes clouds impossible. And that's basically the old local-hardware processing model, which I perfectly understand and want to use as well, but that kind of thinking will have to die before we even attempt clouds.

I would love to spend my days unplugging my XBO from the internet every few hours. Just imagining someone somewhere seeing lag at just the wrong moment.

The point of always-online is that you don't unplug it. Now you're just being sinister.

Getting the entire gaming community to move to it on any reasonable timeframe in the next ten years? Not likely.

Oh, I completely agree; we won't see this any time soon.

The $70 Google Fiber charges for that excellent bandwidth is just $10 less than what I pay per month for 10 Mbps and cable in two rooms of my house. I doubt the average gamer would see a need to upgrade to that unless they're doing a ton of downloading. I only download huge files once every few months; those would be the games I purchase on Steam, and I have no qualms about waiting half a day for those to download.

Ech, Google fucked up again, it seems. 70 dollars, are they serious? Any company asking that here would be laughed out of the market. I pay 15 dollars for 100/100.
Then again, you'd still get more bandwidth AND cheaper than your previous plan. Why not upgrade?

Personally, around 30 GB of traffic comes through my PC every day, but I guess I download more than the average person.

Nope, just stuck in RAM notation. Common mistake from what I've seen. What's funny is I had it right but "corrected" myself.

Fair enough, that's what I thought. But just to be sure.

If the leaked information was correct, we MAY see that. The testing done previously wouldn't have been regarding only the CPU; it would have covered all of the components, and replacing the CPU wouldn't necessarily have undone that work. These are also relatively old CPUs. The benefit of living in a world where CPUs are now just glorified switchboard operators that offload the work to RAM and the GPU is that the CPUs don't have to be cutting-edge i7s or something. I still regret having sprung for the extra $100 i7 in my PC, as my processor is never over 10% utilization. Could have grabbed another video card to bridge instead.

But you see, the tests they've done have shown that this new CPU design (although I don't think we can call it just a CPU anymore) is actually failing, and they had trouble keeping it below the failure rates they expect. And if the CPU fails, your console still isn't going to work anyway.
And yes, sadly CPUs are not utilized as much as they should be; I often get even my old dual-core loaded to 30% while my GPU is choking. I'm going to go for an i5 instead when buying a new PC (soon). Makes me wish someone would write a program that would force drivers to offload some processing onto the CPU instead (I know, that would never work, it's just me being stupid); I could do so much more then (my CPU is more powerful than my GPU, sadly).

Strazdas:
What you want is streaming. not clouds.

I thought Kross already went over this with you. Cloud computing is just a more efficient method using the exact same structure of server farms in conjunction with virtualization to make the maximum use of available resources. So yes, I want clouds. Give me all of the clouds. It's not strictly streaming, as streaming is (thanks to Google's definition I don't have to make one up) a method of relaying data (esp. video and audio material) over a computer network as a steady continuous stream, allowing playback to proceed while subsequent data is being received.

This requires a lot of uploading (information you're sending to the server to trigger it running specific calculations) and downloading (sending back the response). That's what is already being done.
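
(To make that loop concrete, here's a minimal, purely illustrative Python sketch of the upload/download pattern I'm describing. The function names and payload fields are invented for the example, no real game or networking library is involved, and the "server" is just a local function standing in for the farm.)

import json

# Hypothetical server-side handler: receives a small input payload,
# does the expensive calculation, returns a small result payload.
def server_process(request_json):
    request = json.loads(request_json)
    # Stand-in for the heavy work done on the server farm, e.g. resolving
    # a player's shot against the authoritative world state.
    hit = request["aim"] > 0.5
    return json.dumps({"tick": request["tick"], "hit": hit})

# Client side: upload a tiny input, download a tiny response.
def client_frame(tick, aim):
    upload = json.dumps({"tick": tick, "aim": aim})   # what you send
    download = server_process(upload)                 # what comes back
    return json.loads(download)

if __name__ == "__main__":
    print(client_frame(tick=1, aim=0.7))   # {'tick': 1, 'hit': True}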

Costs. That's what it all boils down to anyway, right? It's cheaper to use distributed computing than server farms if distributed computing works as intended, which so far it doesn't.

It'll always be the same cost. You're just talking about offloading the cost onto console owners for a method that has no benefits for gaming and nothing but disadvantages compared to the current server farm method.

I don't think there is a problem with how servers work. I do not advocate cloud gaming at all; I'm just saying that such a thing is theoretically possible and efficient. Key word there, though: theoretically.

I must have missed it; did you disagree with Kross's correction of terminology, where distributed computing is what you're actually describing? It is not theoretically efficient. It is definitionally less efficient than server farm or cloud computing. Cloud computing, as in a server farm filled with multiple VMs to get the most out of the resources, is not just possible; it's already here.

You play COD, so you're probably going to be buying a machine powerful enough to run the newest title, right? That is hardware expenditure. If you could use your old hardware, helped by five other pieces of old hardware in distributed computing, to play the title, that saves hardware expenditure (and helps ecology, because you don't make new hardware). I'd say that's a fair contribution.

No. Terrible idea. I do not want distributed computing for gaming. I also do not want my own machines to be used thus. My obligation to COD or any other title ends when I buy the game unless they require an online subscription in which case my ONLY obligation is to pay the fee to continue playing.

I do not owe COD the use of my machine for processing games I'm not playing in. I especially don't owe the use of my machine to companies whose games I don't even play.

For something like Folding@home? Sure, distributed processing is great. But not real-time processing for MMOs or FPS titles, and certainly not single-player games.

But the demands of gaming have become astronomically more advanced than they were, say, 20 years ago, when top games were made by three-man teams.

So? Computing power has also grown exponentially (doubling every few years). By complexity or demands I'm not talking about the complexity of development, I'm talking about the complexity required to process the game. Most games don't even make use of more than 4GB of RAM. We've only recently been freed by x64 environments and it's going to take a long time for games to really catch up as the average computer is still at 4GB of RAM.

In my house, I have a 16GB RAM machine that I can easily upgrade to 32GB. I've got a relatively new i7 with a decent video card that I can likewise bridge if I need to any time soon. The available hardware has vastly outstripped gaming demands, and that's only going to get more and more apparent as newer machines become the norm.

I would dispute you on the word necessary. What is necessary? For some, 10-year-old graphics is the necessary limit; personally, I won't stop pushing till we get reality simulation. I'm not a sucker for graphics, heck, they don't even come into consideration when I'm choosing a game, but I have a dream that one day we will have a virtual reality game with a whole city or similar landscape simulated with realistic physics, destruction mechanics, etc. Our graphical and physics engines aren't even close to that. Our server farms aren't even close to that. We have nothing that could do this. But maybe, some day, we will. Necessary is intangible. Crytek boasted about how Crysis 1 was photorealistic; it turns out it wasn't even close to what, say, Crysis 3 put out. Realism won't exist till we get reality simulation.
So yes, it's a nice dream to have a world where every game has access to realistic looks, but it's one that's not coming soon and one that our computers are far from being able to handle.

Necessary: what is needed to accomplish a goal. What I mean is that computing power will eventually outstrip the feasible demands of even the most demanding video games. There may be a day when the most powerful video games utilize 32GB of RAM but the average machine has 64GB or something higher.

I'm not saying "necessary" as some subjective term. I'm saying that video games require X and the average computer can provide X+1. Purely mathematical.

Take word processing, for example. I have a Word document open right now. It's using around 20MB of RAM. You've got to understand that there was a time when that wasn't even possible on computers, and word processors at the time had to be designed to require significantly fewer resources. Movies and other media have fallen under those demands now too. Video editing is getting close, and that can be just as demanding as video games.

In that same way, game demands will eventually reach a threshold where processing power far outstrips them. You can only go so far towards realism before you're there. Then there is no additional step to take unless we discover some kind of ultra reality. Even virtual reality (if it's possible to input false senses into the brain) would eventually reach that threshold. Computer technology is already capable of creating 3D environments, as we've seen with the Oculus Rift. Existing games can already use the Rift because games no longer process only what you're looking at and actually render the entire room/setting for faster transitions. So the Rift, while providing VR, isn't much more than a screen strapped to your head that functions as the camera controller via your head movement.

We already have beautiful games. Games that are getting very close to realistic. Yet this console generation has pushed forward something like 10x the power of the previous console. Understand that this is 10x the machine that is currently capable of games like Skyrim. As the consoles get more and more powerful, each leap will be smaller but not necessarily less impressive (doubling 100 isn't less impressive than the 5x jump from 20 that got us to 100; relative to that original 20, it's a 10x jump). I don't even think the next big step is much more than a firming up of graphics. I think the next big step is in NPC AI and physics.

So you reject the idea that more accessible computing power closer to your location would be beneficial?

Yes. I categorically reject that premise as it would be implemented. Right now, there are only two connections we are concerned with: mine to the internet and their server's to the internet. This setup is remarkably simple and is already good enough to let people from across the world play games like COD in real time, where even a second off is a big difference.

Having much weaker machines nearby doesn't matter. These large multiplayer games benefit greatly from sharing common servers that process the data in real time. Splitting them out would be terrible.

And again, to what benefit? What need isn't being met? We've got some very technically advanced titles that our systems are well within the realm of handling. There's no reason to involve outside processing in single-player games, and the big online titles are already more than served by the cloud/server processing that's already in place.

That's the kind of thinking that makes clouds impossible. And that's basically the old local-hardware processing model, which I perfectly understand and want to use as well, but that kind of thinking will have to die before we even attempt clouds.

Don't care. Not my responsibility. There is currently no need that isn't being met by already existing servers. What you're not explaining is any reason at all why we'd benefit from this. We'd get a much less efficient setup with a myriad of issues, all to save the developers money on our dime. That's bullshit.

Again, there is NO benefit to this. I don't know what you think distributed computing does, but it isn't good for this sort of processing. A server farm with a good internet connection is perfect. For the kind of traffic and processing demands we have, there is zero need to change things up, let alone the fact that distributed computing would be one of the least efficient ways to do this kind of processing. You would be adding a round trip for every machine involved, with the slowest connection, the weakest link, deciding the overall response time. And for what? We wouldn't see any advantages from it. Any game that would need the likes of a supercomputer to participate in wouldn't be something the masses could partake in, and it would undoubtedly only suffer from distributed computing.
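
(A rough way to see that weakest-link point, with invented numbers and the simplifying assumption that every peer's piece of the work has to come back before a frame can proceed:)

# Back-of-envelope latency comparison: one dedicated server vs. a handful
# of volunteer peers that all have to answer before the result is usable.
# All numbers are made up for illustration only.

def central_server(round_trip_ms=40.0):
    # One hop to a well-connected server farm and back.
    return round_trip_ms

def distributed_peers(peer_round_trips_ms):
    # If the work is split across peers and you need every piece back,
    # the slowest peer sets the pace (the "weakest link").
    return max(peer_round_trips_ms)

if __name__ == "__main__":
    peers = [35.0, 60.0, 45.0, 180.0, 50.0]   # one peer on a bad connection
    print("dedicated server:", central_server(), "ms")
    print("distributed:     ", distributed_peers(peers), "ms")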

Ech, Google fucked up again, it seems. 70 dollars, are they serious? Any company asking that here would be laughed out of the market. I pay 15 dollars for 100/100.
Then again, you'd still get more bandwidth AND cheaper than your previous plan. Why not upgrade?

You have 100/100 Mbps? Hot damn.

To put this in context, I recently switched from Comcast to CenturyLink because Comcast tried to charge me $70 for 16 Mbps, but CenturyLink was offering me 10 Mbps along with a full cable selection with HD DVRs, all for around the same price. What's more, the 16 Mbps was something I almost never got close to, and I functionally had less than 10 Mbps, whereas CenturyLink in my area seldom drops below 9 Mbps and is even willing to increase my speed if my cable TV isn't also running at the time.

$15 for any kind of internet isn't common. I do know that Google Fiber has a 5/5 offer for a one-time payment of $300 (which can be broken up into multiple payments of $25). $70 for the 1 Gbps is the standard charge right now.

But you see, the tests they've done have shown that this new CPU design (although I don't think we can call it just a CPU anymore) is actually failing, and they had trouble keeping it below the failure rates they expect. And if the CPU fails, your console still isn't going to work anyway.

I have not seen any evidence that this is the case.

And yes, sadly CPUs are not utilized as much as they should be; I often get even my old dual-core loaded to 30% while my GPU is choking. I'm going to go for an i5 instead when buying a new PC (soon). Makes me wish someone would write a program that would force drivers to offload some processing onto the CPU instead (I know, that would never work, it's just me being stupid); I could do so much more then (my CPU is more powerful than my GPU, sadly).

Why sadly? RAM and GPUs are far cheaper than CPUs. Advances in CPUs have been seriously bottlenecked, whereas RAM and GPUs keep exploding.

I hope you don't take this the wrong way; I just like discussing things, and thus I discuss.
...

Lightknight:

Strazdas:
What you want is streaming, not clouds.

I thought Kross already went over this with you. Cloud computing is just a more efficient method using the exact same structure of server farms in conjunction with virtualization to make the maximum use of available resources. So yes, I want clouds. Give me all of the clouds. It's not strictly streaming, as streaming is (thanks to Google's definition I don't have to make one up) a method of relaying data (esp. video and audio material) over a computer network as a steady continuous stream, allowing playback to proceed while subsequent data is being received.

This requires a lot of uploading (information you're sending to the server to trigger it running specific calculations) and downloading (sending back the response). That's what is already being done.

Fair enough.

It'll always be the same cost. You're just talking about offloading the cost onto console owners for a method that has no benefits for gaming and nothing but disadvantages compared to the current server farm method.

No. It would be the same cost if you always used your local hardware 100% effectively (or close to it). Since you don't, your hardware stands unused a lot of the time. If it were used during that time, you would need much less overall hardware across the whole community, so it would be cheaper.
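
(Back-of-envelope, with numbers I'm making up purely for illustration, and ignoring the latency, coordination, and reliability costs, the utilization argument looks like this:)

# Invented numbers: if every console sits idle most of the day, pooling
# that idle time could, in theory, cover the same total workload with
# fewer boxes. This sketch ignores everything that makes it hard in practice.

consoles = 1_000_000          # hypothetical install base
hours_played_per_day = 2      # average active use per console
hours_in_day = 24

utilization = hours_played_per_day / hours_in_day       # ~8% local-only use
total_work_hours = consoles * hours_played_per_day      # demand per day

# If shared hardware could run at, say, 80% utilization instead:
shared_utilization = 0.8
boxes_needed_if_pooled = total_work_hours / (hours_in_day * shared_utilization)

print(f"local-only utilization: {utilization:.0%}")
print(f"boxes needed if pooled: {boxes_needed_if_pooled:,.0f}")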

No. Terrible idea. I do not want distributed computing for gaming. I also do not want my own machines to be used thus. My obligation to COD or any other title ends when I buy the game unless they require an online subscription in which case my ONLY obligation is to pay the fee to continue playing.

I do not owe COD the use of my machine for processing games I'm not playing in. I especially don't owe the use of my machine to companies whose games I don't even play.

For something like Folding@home? Sure, distributed processing is great. But not real-time processing for MMOs or FPS titles, and certainly not single-player games.

You don't owe anything because you only use local hardware now. If you used their hardware, it would be fair for them to use yours as well. If that situation helps both of you save money on hardware expenditures, that's a real advantage.
You don't owe anything only as long as you do not use it.

Necessary: what is needed to accomplish a goal.

What goal? To make a game? You can do that with NES graphics. To make the game look good? How good is good? You want the game to look exactly how you imagine it? Then you need graphics as good as your imagination.
Intangible.

What I mean is that computing power will eventually outstrip the feasible demands of even the most demanding video games. There may be a day when the most powerful video games utilize 32GB of RAM but the average machine has 64GB or something higher.

I disagree. I think games will always keep up with the machines they are played on. Remember what happened when DVD-based games came? Games stopped optimizing. When they met hardware limitations (the current consoles), they learned the need to optimize again. Games will always find a way to demand the hardware of the average computer. You have a high-end one, so it feels like games aren't catching up, when the reality is that game makers limit themselves in order to allow the average person to run the thing.

I'm not saying "necessary" as some subjective term. I'm saying that video games require X and the average computer can provide X+1. Purely mathematical.

And how do you know that games require X when the average computer can provide X+1?

Take word processing, for example. I have a Word document open right now. It's using around 20MB of RAM. You've got to understand that there was a time when that wasn't even possible on computers, and word processors at the time had to be designed to require significantly fewer resources. Movies and other media have fallen under those demands now too. Video editing is getting close, and that can be just as demanding as video games.

Video editing is more demanding than video games. Word processing didn't evolve, though; you still write words like you used to. Heck, most people still use the same exact formats from 10-15 years ago. If you had 10 new Offices every year, you would see many more 365-style monstrosities that can hog your 16GB of RAM just because they can.
Other media, ah, I remember when MP3s lagged... good days.
Anyway, the thing is, movies and audio ARE catching up to hardware. Not as well, but they try. We've got lossless audio formats that are much more demanding than previous ones, and we've got video formats that need a gaming PC to run at all (for example, 4K). Sure, if you take a format from 15 years ago and run it on a modern PC, it won't use a lot of resources; neither would a 15-year-old game.
It's just that those are also limited by "average hardware", in this case televisions (which are statistically still on the SD side) and MP3 players (sure, you can have much better lossless sound, but you still put MP3s on your iPod).
The devices are what limit us; they will never become more powerful than necessary, because we will always find ways to utilize them. Unless we're talking about Multivac from science fiction, and even that one had a job it had to calculate until the heat death of the universe to complete.

In that same way, game demands will eventually reach a threshold where processing power far outstrips them. You can only go so far towards realism before you're there.

You are aware that so far the most powerful computers can simulate less than a cubic centimeter of matter if we go for atom-level realism, right? We're still very far away from realism.
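
(A rough order-of-magnitude check, assuming about 10^22 atoms in a cubic centimeter of solid matter, a single floating-point operation per atom per step, 60 steps per second, and a petascale machine at roughly 10^16 FLOPS; every one of those assumptions is generous to the computer:)

# Orders-of-magnitude sketch only; real atomic-scale simulation needs far
# more than one operation per atom per step, so the gap is even bigger.

atoms_per_cm3 = 1e22          # roughly, for solid matter
flops_per_atom_step = 1       # absurdly optimistic lower bound
steps_per_second = 60         # one simulation step per "frame"
supercomputer_flops = 1e16    # petascale-class machine

required = atoms_per_cm3 * flops_per_atom_step * steps_per_second
shortfall = required / supercomputer_flops

print(f"required: {required:.1e} FLOPS, shortfall: about {shortfall:.0e}x")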

Existing games can already use the Rift because games no longer process only what you're looking at and actually render the entire room/setting for faster transitions.

Actually, they always used to render the entire room/setting. It's just that when developers were forced to use eight-year-old hardware that couldn't handle it, they had to find shortcuts, such as rendering only what can be seen from the camera's perspective and instantly removing things you went past.
Now, as we get new hardware, we may start doing what we did before: render the whole damn room.
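
(For what it's worth, a toy sketch of that "render only what the camera can see" shortcut; this is a made-up visibility test, not any engine's actual API:)

import math

# Toy view-frustum test in 2D: keep only objects within the camera's field
# of view and draw distance. Real engines also do occlusion culling, etc.
# camera_dir is assumed to be a unit vector.

def visible(camera_pos, camera_dir, obj_pos, fov_deg=90.0, max_dist=100.0):
    dx, dy = obj_pos[0] - camera_pos[0], obj_pos[1] - camera_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return True
    if dist > max_dist:
        return False
    cos_angle = (dx * camera_dir[0] + dy * camera_dir[1]) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= fov_deg / 2

scene = [("crate", (5, 1)), ("tree", (-10, 0)), ("tower", (200, 5))]
cam, facing = (0, 0), (1, 0)   # camera at origin, looking along +x
print([name for name, pos in scene if visible(cam, facing, pos)])  # ['crate']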

We already have beautiful games. Games that are getting very close to realistic.

Beautiful, perhaps; that's subjective. Realism, umm, no. The only game where I've seen jeans even come close to looking like actual jeans is The Last of Us, for example. We're still very far from realism.

Understand that this is 10x the machine that is currently capable of games like Skyrim.

10x the machine capable of console Skyrim; less than 1x the machine capable of PC Skyrim.

I think the next big step is in NPC AI and physics.

I'd love to think that as well, but then reality comes back and we've got the Crytek CEO saying that graphics are the most important part of any game. AI has been at quite a standstill, and physics has only been attempted by a few, but graphics seem to be pushed by everyone...

Don't care. Not my responsibility. There is currently no need that isn't being met by already existing servers. What you're not explaining is any reason at all why we'd benefit from this. We'd get a much less efficient setup with a myriad of issues, all to save consumers some money. That's bullshit.

Fixed the quote.

You have 100/100 Mbps? Hot damn.

100/100 is a "slow" plan here, but since that's enough for me, I'd rather take the cheaper one.

To put this in context, I recently switched from Comcast to CenturyLink because Comcast tried to charge me $70 for 16 Mbps, but CenturyLink was offering me 10 Mbps along with a full cable selection with HD DVRs, all for around the same price. What's more, the 16 Mbps was something I almost never got close to, and I functionally had less than 10 Mbps, whereas CenturyLink in my area seldom drops below 9 Mbps and is even willing to increase my speed if my cable TV isn't also running at the time.

So if we count the other services out, the internet was in fact much cheaper than Comcast's. Still, it's sad to hear they try to charge so much for such slow speeds. The providers here who offered 10 Mbps I didn't even look at twice.

I have not seen any evidence that this is the case.

And we won't until launch.

Why sadly? RAM and GPUs are far cheaper than CPUs. Advances in CPUs have been seriously bottlenecked, whereas RAM and GPUs keep exploding.

Because I've got a more powerful CPU than GPU. I know, petty reasons.
