PS4 Architect: Cloud Computing Won't Make Graphics Better


I won't pretend I fully understand how this cloud computing works. As best I can figure, it's a method of enhancing the processing power of the system, and the process relies on the internet. On paper this sounds very advantageous, yet some points puzzle me. The first is the need to be online. As someone mentioned, if the majority of the work is done by the cloud, wouldn't this cause problems for people with slow internet service or limited bandwidth? What about online games? I'm not sure how much extra strain cloud computing would add. But it does shed light on the always-on requirement that was coming with the Xbox One: they needed you to be online to fully access the cloud and its features. So is the cloud a primary system for MS's console, or a secondary one? Is the Xbox so weak that it needs the cloud to make up for it, or merely to supplement its abilities? In either case, aren't consumers paying $499 for something that won't work at 100% unless they're online? In essence, doesn't that mean they paid half a thousand dollars for half a console? Without definite answers it's hard to tell how well it will work compared to how it should work.

Because Everyone Knows that Mark Cerny is an expert on the capabilities and limitations of cloud computing technology, especially seeing as Sony has /so much experience/ in this field.

Rainforce:
When exactly did everyone stop calling them servers (i.e. what they actually are) and started referring to them as "the cloud" ?

When a PR team realized they could make more money with a name like "The Cloud", I think.

UnnDunn:
Whatever you think about cloud computing in games, the bottom line is that Microsoft is at least trying to make it work. There are studios making Xbox One games asking "how can we use cloud computing to make our game better." That's a good thing.

Welcome back UnnDunn. Now onto the real reply, Sony are trying to make it work too. The only difference is that Microsoft proclaim it as a mystical cure-all while Sony are giving us honest facts about what it can and cannot do.

PoolCleaningRobot:
Not sure if Mark Cerny is a legitimately cool guy...

... Or he's just been browsing threads to find out what we've been complaining about the most

Regardless, at least he's not telling us bullshit. Streaming and cloud services can give us cool things like cloud saves and streaming games instantly. It'll be useful for things like demoing games, because who has the patience to download a game you're going to test for 15 minutes? Microsoft's calculation magic has already been shown to be physically impractical because of bandwidth.

I think he's a legitimately cool guy.

It's true that the "Cloud" bullshit Microsoft is pushing is simply not a solution compared to simply having a more powerful machine. It makes everything more complicated, and I won't call it viable until we have a quantum internet.

I love how people were clearly being paid to promote the XBOX One. Even before Microsoft retracted the always-on DRM and the used-games restrictions, people were still saying things like "Wait for E3! Don't count Microsoft out!", plus all this clear and obvious paid damage control. The American market is so owned by Microsoft that nobody in the gaming press can say anything honestly bad about Microsoft products. I was amazed at how many people proved they were being paid by the way they rushed to the defense of Microsoft and damage-controlled the XBOX One.

Now their job is even easier: act like the playing field is level and "don't jump to conclusions" about the "console war", now that Microsoft has ditched the two biggest things that would have ruined them this generation. But really, Sony is still leagues ahead.

The fact of the matter is that cloud gaming isn't ready yet, and throwing out hardware for it won't be worthwhile. Boosting graphics could, in theory, work in the cloud, and Mark isn't saying that it couldn't. But it isn't a good use of the technology right now. Honestly, given how powerful the PlayStation 4 and XBOX One are, and how weak internet connections are in most of the world, it wouldn't be worthwhile at all. If you want to play a game that won't run on current consoles, just get a gaming PC that can handle the workload. It isn't worth being always connected to the internet in order to play a game the XBOX One can't technically handle, or to get slightly better resolution or AA, for that matter.

"Cloud" was just a buzzword excuse. If cloud enhancements like that were already worthwhile, they would have been used in PC gaming long ago. The fact of the matter is, everything will "come to the cloud" when the cloud is good and ready, meaning when fast internet is readily available. And when that happens, people won't want it in a console anyway. They'll want it in a smart TV, in a cheap TV box that costs less than $100, in their Oculus Rift, in their Google Glass, in their tablet and in their ultrabook. Using cloud computing for graphical capabilities over the internet right now is not worthwhile. People don't want to buy a big, expensive console only to rely on the internet for boosted computing performance.

Microsoft has indeed been able to negate some of the damage to their product by backtracking on all their "the cloud! the cloud!" junk, like this was Final Fantasy VII. But they're still in a horrible PR position. Sony is doing all of the right things, and has been telling consumers what they've actually wanted to hear from the beginning. Things like this, too, are why people are a lot happier with Sony and the PlayStation 4 than with the XBOX One. People want the PlayStation 4 and like Sony right now because they're being down to earth, no-nonsense, and consumer-friendly.

I am getting a PlayStation 4. I have no current plans to get an XBOX One at all, and that goes for most people. Though I at least appreciate the gesture of Microsoft being less ridiculous with the XBOX One.

So what I understand from that article is that you can use this to add features down the road that the consoles themselves maybe aren't capable of at launch,

like how the PS3 doesn't have an equivalent to the Xbox 360's party chat.

vallorn:
Welcome back UnnDunn. Now onto the real reply, Sony are trying to make it work too. The only difference is that Microsoft proclaim it as a mystical cure-all while Sony are giving us honest facts about what it can and cannot do.

I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.

Sony's "attempt" at cloud computation is nowhere near as seamless or comprehensive as Microsoft's. With Microsoft's solution, developers simply do not have to worry about servers or capacity or even cost. Those things are handled by the platform. All developers have to worry about is how best to use this powerful extra CPU with tons of RAM they have access to.

Sony has nothing like that, so of course Mark Cerny is going to downplay it. That said, developers can come up with their own implementations of the technology for use on PS4, but they have to bear the costs of developing and maintaining it, and they'll have to run to a company like Microsoft to get a decent cloud infrastructure, because Sony certainly doesn't have one.

Bottom line: this is like when Microsoft decided to put an Ethernet port, real-time Dolby Digital and a hard drive in every original Xbox: back then, everyone criticized them for making the box too expensive, saying things like "no-one has broadband" and "give us memory cards". But laying that groundwork in the beginning gave Xbox the ability to do things no-one else could match. The same thing is true today with baked-in cloud processing and built-in NUI.

Frostbite3789:
Anytime I hear the phrase ~*the cloud*~ during a press conference or anything, my brain immediately replaces it with "GODDAMN WIZARDS" because it's essentially the same.

I just think of this every time I read a Microsoft article about their Xbox One.

Although I will say one thing in support of the "cloud": whatever advertiser that was given the task of rebranding remote server farms did a fantastic job. Seriously give that guy (or girl) a promotion.

I think they just shouldn't bother with cloud computing for games, like at all.

Either the game will be so reliant on it that it requires a constant internet connection, with all the downsides that brings (lag, server issues, the servers eventually being taken down etc.), or it will add next to nothing to the game, making it pointless.

Scars Unseen:

shameduser:
The latency from console to server is way too long to be of any use to the local hardware. In the time it takes the console to send the data it needs processed to the server, have the server process it and send it back, the console could have done it much, much faster and far more reliably. You would also need enormous speed and bandwidth: the internal connections between the CPU, GPU and RAM are all measured in gigabytes per second, whereas internet speed is measured in megabits per second. The two are several orders of magnitude apart. Cloud computing to offload stuff like graphics, AI or physics is so impractical it makes no sense to even consider it.

I would say that depends on the type of game. FPS? Not a good idea. Turn based strategy? Much more plausible, though one wonders why you would need to. I think my Oblivion example is a good happy middle for this sort of thing. Don't use the cloud to handle the stuff happening on screen, but rather to calculate AI and simulation stuff that you can't see, yet can see the effects of.

Imagine a Dwarf Fortress game that had modern graphics(blasphemy, yes I know) without having to sacrifice its exhaustive detail in simulation. Some very few might have systems beefy enough to handle that, but most PCs(and all consoles if you were to somehow port the game) would need some of the work handled off client. Again, that wouldn't have to mean the cloud; a networked Raspberry Pi would likely be enough to augment the processing load. But off client processing could have a place in the gaming world.

Just not the place Microsoft wants you to think it does.

The Dwarf Fortress game is a bad example. Let's say you attack a giant boar (I haven't played much Dwarf Fortress). The boar would either react slowly (while the game waits for the AI to be processed by the server) or react like any other video game enemy and then act very intelligent once the server-processed data arrived, by which point it might no longer be relevant. The speed of an internal interconnect like PCIe 2.0 is 8 gigabytes/second. The speed between RAM and CPU (on 1600 MHz DDR3) is around 16 GB/s, and data moved around inside the CPU is even faster. There is simply too much data to send to a server (it would be at the very least several megabytes) for this to be practical in anything other than the slowest grand strategy game (which no one would pay for a server for anyway).

This doesn't even account for things like the server going down, the server being shut down by a company like EA, a bazillion people trying to play on launch day, etc.
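The orders-of-magnitude argument above is easy to put into concrete numbers. A rough back-of-envelope sketch in Python (the figures are illustrative ballparks, not measurements of any real console or connection):

```python
# Rough comparison of local interconnect bandwidth vs a home internet link.
# Ballpark figures: PCIe 2.0 x16 ~8 GB/s; a decent 2013-era connection ~10 Mbit/s.

PCIE2_X16_BYTES_PER_SEC = 8e9                       # ~8 GB/s
INTERNET_BITS_PER_SEC = 10e6                        # 10 Mbit/s
INTERNET_BYTES_PER_SEC = INTERNET_BITS_PER_SEC / 8  # 1.25 MB/s

ratio = PCIE2_X16_BYTES_PER_SEC / INTERNET_BYTES_PER_SEC
print(f"Local bus is ~{ratio:,.0f}x faster than the internet link")  # ~6,400x

# Time to ship a few megabytes of per-frame state to a server and back,
# ignoring latency entirely:
payload_bytes = 4e6  # 4 MB, the "several megabytes" mentioned above
transfer_s = 2 * payload_bytes / INTERNET_BYTES_PER_SEC
print(f"Round-trip transfer alone: {transfer_s:.1f} s")  # ~6.4 s
```

Even before any server processing or network latency, moving that much state would take whole seconds per update, which is why per-frame offloading is a non-starter.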

nathan-dts:

Abomination:
My best example of seeing how it would work is when I play on a 32-person server and then load up my own 31-bot server of some game.

The 32 person server ran perfectly when I played on it but the 31 bot server running on my own machine ran at about 60% FPS. Clearly AI requires processing power and if that can be off-loaded to another location it can cause the immediate machine to run faster.

I, however, do not like the idea of my computer relying on another computer for a single player game. That's all Cloud computing is, making your single player game a multiplayer game without multiple players. We had a really shitty version of that attempted recently, it was called Sim City.

I wonder how much a company will be able to lie and say "Cloud Powered" when really next to nothing is cloud powered and the game is just using another form of invasive DRM... on the XBone with your always on Kinect.

Tinfoil hat, sure... but I've learned you give these corporations an inch they'll piss all over it then try and feed it to you.

Too slow. If you have AI partially thrown into the cloud, then you're going to be uploading all of the variables and downloading all of the processed information. It wouldn't work cohesively; the internet is too slow.

No... You'd need roughly the same amount of data that a multiplayer client needs to render a player's point of view for most AI (in some cases, AI and human player code is interchangeable). That's a few hundred kilobits a second at most if the data stream is well designed.

On top of that, AI in games only needs response times about on par with human reaction speed, which in some cases can be as bad as two seconds or more. AI can have pretty high latency without seeming obviously slow. In fact, in many games you can get away with running AI updates quite a bit less often than every frame.

AI can run on the cloud pretty easily relative to some other tasks, for the same reasons that it's possible to run multiplayer games online: the technical challenges and delay issues are pretty similar.
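The "few hundred kilobits a second" estimate can be sanity-checked with quick arithmetic. The agent count, state size and tick rate below are hypothetical, not taken from any real game's netcode:

```python
# Hypothetical numbers for syncing AI state with a server.
agents = 50
bytes_per_agent = 100   # position, velocity, current goal, a few flags
updates_per_sec = 10    # AI ticks far less often than a 60 Hz render loop

bits_per_sec = agents * bytes_per_agent * 8 * updates_per_sec
print(f"{bits_per_sec / 1000:.0f} kbit/s")  # 400 kbit/s before compression
```

That lands in the "few hundred kilobits" range, which is why AI state sync is plausible on ordinary connections in a way that per-frame rendering data is not.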

I think Microsoft is over-selling the idea of the cloud, but I think a lot of people here are under-selling it. Has anyone here tried something like On-Live or Gaikai before?

Sometime last year I played a bit of Saints Row: The Third in a browser through Facebook using Gaikai. Now, the graphics were a muddy mess, but it actually worked and was playable. I tried out one of the FEAR games on OnLive. Again, the graphics were a bit muddy, but the latency was amazingly low. Those games were completely running off the cloud servers. On top of that, I was in the Cincinnati area and using a server somewhere like Atlanta for OnLive. I pay for a 10 Mb/s connection that isn't always that fast.

Now, if OnLive actually had a server in Cincinnati, things would have been even better. So if MS actually works with local ISPs (and MS has tons of money for this sort of thing), you could see them running a cloud network that delivers some real improvements to graphics, AI, etc. in their games without introducing problematic latency. Sure, you could not use a system like that at all for an extremely twitchy game like COD, but for a huge swath of games it wouldn't be a problem.

However, there are a ton of problems. The main one is that not everyone has even a mediocre connection. The idea that your game looks worse because your internet is slow is pretty awful. Although MS is talking about this now, I don't think they intend for games to use it yet. I guess they're hoping for a major upswing in solid broadband availability.

Note: I have no dog in this race. I'm a PC gamer. I do own a PS3 and Wii though. I don't see anything so far that would incline me to purchase a next-gen console.

subtlefuge:
I don't see any reason why 2-3 years down the line you wouldn't be able to offload some basic graphical tasks like shadows or reflections to cloud computing.

If you're offloading anything that the player can see to 'the cloud', then those things will always be several tenths of a second behind what the player is actually doing. I can see no way that could be annoying at all. Even on my 20 Mb/s connection an average ping to a server is 30-50 ms; double that, add in processing time, and you're north of a tenth of a second even on a faster-than-average connection. It's a very poor application for cloud computing.

Cloud computing is great for crunching numbers in things like Folding@home, and it's great for mass-access file servers too, whether that be saves, replays, stats, etc. The things Microsoft are claiming are ridiculous; you should view them in the same light as EA claiming that SimCity and Battlefield use cloud processing.
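The latency budget in the post above works out like this (the 30 ms of server processing is an assumption; real workloads vary):

```python
# Latency budget for a cloud-computed visual effect.
ping_ms = 40                   # average ping from the post (30-50 ms range)
round_trip_ms = ping_ms * 2    # request out, result back
server_processing_ms = 30      # assumed; depends entirely on the workload
total_ms = round_trip_ms + server_processing_ms

frame_ms = 1000 / 60           # one frame at 60 fps (~16.7 ms)
print(f"Result arrives after {total_ms} ms = "
      f"{total_ms / frame_ms:.1f} frames late")
# A cloud-computed shadow or reflection trails several frames behind
# what's actually on screen.
```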

Frostbite3789:
Anytime I hear the phrase ~*the cloud*~ during a press conference or anything, my brain immediately replaces it with "GODDAMN WIZARDS" because it's essentially the same.

I prefer https://github.com/panicsteve/cloud-to-butt

"Cloud" computing is just a bunch of virtual machines. It's cheap servers for them to do cheap server things with (unless you churn disk a lot like a database, then good luck with a shared environment). This means file hosting or background calculations. They don't offer anything beyond what anyone who hosts a server in a data center can already do, except cheap computers.

Of course you can do many more things when your resources are flexible enough to dedicate to an individual player on demand, but that's really just saving them effort and money. Which is nice when you want to offer more features.

Kross:
Of course you can do many more things when your resources are flexible enough to dedicate to an individual player on demand, but that's really just saving them effort and money. Which is nice when you want to offer more features.

And to have an extra layer of DRM.

It might just be me, but extra DRM that offers some features to the player sounds right up MS's alley, and they might even get away with it if presented right. They could, for example, launch an open-world MMO with much more complex AI and environments.

nathan-dts:
-snip-

Ah, okay. I just saw his title and thought he was a senior person. It's good to have him so in touch with the real world.

UnnDunn:

vallorn:
Welcome back UnnDunn. Now onto the real reply, Sony are trying to make it work too. The only difference is that Microsoft proclaim it as a mystical cure-all while Sony are giving us honest facts about what it can and cannot do.

I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.

I was talking in hyperbole for entertainment purposes.

Sony's "attempt" at cloud computation is nowhere near as seamless or comprehensive as Microsoft's. With Microsoft's solution, developers simply do not have to worry about servers or capacity or even cost. Those things are handled by the platform. All developers have to worry about is how best to use this powerful extra CPU with tons of RAM they have access to.

Do you have any proof of this? That's not quite how cloud computing works, UnnDunn. The data to be computed has to be sent over a high-speed internet line or you get weird lag in your single-player game. And besides, it won't make that much of a difference if it can only do "latency-insensitive computation".

Sony has nothing like that, so of course Mark Cerny is going to downplay it. That said, developers can come up with their own implementations of the technology for use on PS4, but they have to bear the costs of developing and maintaining it, and they'll have to run to a company like Microsoft to get a decent cloud infrastructure, because Sony certainly doesn't have one.

Again, do you have proof that Sony hasn't got a cloud infrastructure for devs, or are you just pulling things out of your arse? Mark Cerny gave us a pretty comprehensive interview where he highlighted the good and bad points of cloud technology, so unlike MS, who just talk up the positives constantly (even when they're untrue), Sony is actually treating us like adults.

Bottom line: this is like when Microsoft decided to put an Ethernet port, real-time Dolby Digital and a hard drive in every original Xbox: back then, everyone criticized them for making the box too expensive, saying things like "no-one has broadband" and "give us memory cards". But laying that groundwork in the beginning gave Xbox the ability to do things no-one else could match. The same thing is true today with baked-in cloud processing and built-in NUI.

You're forgetting that the Xbox was the last-released console of its generation and was absolutely crushed by its competition.

UnnDunn:
I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.

http://www.develop-online.net/news/44318/Microsoft-Cloud-makes-Xbox-One-four-times-more-powerful

Four times as powerful. FOUR FUCKING TIMES MORE POWER WHEN IT'S ONLINE. Microsoft has been claiming that for quite a while now, and last time I checked, saying you multiply your console's power with 'da internets' counts as claiming cloud computing is a 'magic bullet' to gaming.

Would you like to explain how quadrupling a console's power, by offloading three quarters of the calculations to a server, is supposed to be 'latency-insensitive', or indeed not some 'mystical cure-all' claim? I mean, we ARE talking about a connection of, say, 1 MB/s, when the Xbone's DDR3 is rated at around 68,000 MB/s? (I think they may have upgraded it, actually, but that only serves to further prove my point.)

OT: Good on Sony for giving us hard facts, though I guess this is another one of those instances where they get undue credit for just doing what everyone really should be doing. I'd much rather know for certain that a feature on a console is pretty cool, than be bombarded with PR words and empty promises such as 'awesome' and 'the experience' all fucking day. By all means, advertise your shit, but eventually we just want to know what this supposedly awesome thing of yours can and cannot do.

Infernal Lawyer:

UnnDunn:
I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.

http://www.develop-online.net/news/44318/Microsoft-Cloud-makes-Xbox-One-four-times-more-powerful

Four times as powerful. FOUR FUCKING TIMES MORE POWER WHEN IT'S ONLINE. Microsoft has been claiming that for quite a while now, and last time I checked, saying you multiply your console's power with 'da internets' counts as claiming cloud computing is a 'magic bullet' to gaming.

Do you read more than the headline, ever?

The actual MS statement is this:

"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."

Read it and comprehend it. Then read it again. And one more time for good measure.

Nowhere does it say "the cloud makes Xbox One 'four times more powerful'". They said "here's the amount of CPU power we're allocating for cloud processing." That is all.

Vivi22:

Cognimancer:
Some have advertised that the Cloud will be able to offload processing power in handling game elements like lighting, physics, and even AI.

I assume by some you mean Microsoft because I haven't seen anyone else stupid enough to make that claim and actually expect people to believe it.

Theoretically, it can be done, but it would require a large amount of bandwidth and throughput*, and even if it were done the resulting improvement would be MINUSCULE, since most rendering effects provide diminishing returns.

The problem with cloud processing is that the data payload for graphical rendering is large and, worse, time-critical, since most relevant rendering MUST be done in real time (on a per-frame basis).

Calculate the amount of time each frame gets from the frames-per-second target; that is the amount of time you have to send information to the cloud, process it, and send it back. Any longer and it will noticeably degrade performance in some way (and if your network latency alone is greater than that amount of time, then anything calculated on a per-frame basis in the cloud is useless at best).

This is why I didn't believe for one moment what Microsoft was braying about the Xbone being "future proof" since their claims were not only largely inaccurate but also rendered meaningless by the trappings of the present. (namely, the US lagging 8 years behind a good chunk of the developed world in high speed internet deployment. Thanks ISP oligopoly!)

(*much greater than what the average home in the US has at its disposal consistently)
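The per-frame time budget described above is simple to compute. A small sketch, with an assumed 100 ms round trip in line with the latency figures quoted elsewhere in this thread:

```python
def frame_budget_ms(fps: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# An assumed internet round trip (ping plus server processing) vs that budget:
round_trip_ms = 100.0
late_by = round_trip_ms / frame_budget_ms(60)
print(f"Round trip covers {late_by:.0f} frames at 60 fps")
# Any per-frame work sent to the cloud comes back several frames too late.
```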

Cloud computing does not do much for normal people, and considering bandwidth issues it will be practically worthless.

PoolCleaningRobot:
Not sure if Mark Cerny is a legitimately cool guy...

... Or he's just been browsing threads to find out what we've been complaining about the most

Regardless, at least he's not telling us bullshit. Streaming and cloud services can give us cool things like cloud saves and streaming games instantly. It'll be useful for things like demoing games, because who has the patience to download a game you're going to test for 15 minutes? Microsoft's calculation magic has already been shown to be physically impractical because of bandwidth.

Or it could just be a pot shot at Microsoft.

Atmos Duality:
This is why I didn't believe for one moment what Microsoft was braying about the Xbone being "future proof"

Anyone who wants to make a "future proof" console just needs to come up with a new "consumer electronics" level form factor that has several "carts" as part of the case, which lock in and hold certain potentially upgradable system components (memory, CPU, GPU, hard disc etc.), while the rest of the connectivity/functionality follows long-term conventions (USB, HDMI, "HD", Blu-ray/DVD, and the various networking and wifi standards).

You then aim games at the most common "system rating" and let market forces and consumer desire drive the thing forward.

Yes, that is basically a new sub-type of PC, but it would need to be a new PC form factor with "consumer electronics" design considerations around robustness, size, noise, heat, ease of use etc.

The "ultimate console" is not an unreachable dream: the "ultimate console" is basically a PS2 with modern graphics/media standards and "the net" on it.

That's all people really want.

But they keep trying to sell us something else.

It's a moot point though, as to be honest I think Smart TVs (and also potentially "game streaming") are probably going to kill stand-alone consoles in a generation or two...

I think "game streaming" holds great potential for the future. Lifting the burden of game processing and rendering from the target device has huge potential to move things forward, just as the industry is very swiftly moving towards a brick wall in terms of what it can squeeze out of mobile devices in particular (largely due to the reluctance of the fundamental chemistry behind battery tech to move at anything approaching Moore's law).

I don't think Microsoft's plans will amount to much.

I do think that awareness of "cloud computing" will drive developments, both technological and political, that in future will actually be able to support "game streaming" as a widespread and viable technology, and at that point... well, no one will need to buy a separate little box from Microsoft. Although it's quite clear MS sees itself as being on the server side of things, the thing is the publishers don't really need MS if that is the future...

Sleekit:

It's a moot point though, as to be honest I think Smart TVs (and also potentially "game streaming") are probably going to kill stand-alone consoles in a generation or two...

"PCs"* are more likely to kill consoles long before Smart TVs will at the rate things are currently going.
Gamers go where the games go, so barring some massive surge in game production for Smart TVs, I seriously doubt Smart TVs will take over any significant share of gaming.

(*including towers, laptops and maybe some of the stronger tablets)

As for "game streaming". I've seen it and am not impressed. At all.

Well, I think you underestimate them, good sir!

:P

Smart TVs will soon be ubiquitous.

Every TV sold from pretty much now on is going to be "Smart".

And they are all going to run Android on pumped-up smartphone tech plus added low-cost PC bits (like a hard disc and/or memory).

You can play OnLive (which carries, for example, Darksiders 1 & 2, The Witcher 1 & 2, the Assassin's Creed games and Deus Ex: Human Revolution) on Android at the moment via the Ouya.

Yes, the technology is immature, but even vague public awareness of the potential of "cloud computing" will add weight to the need to make it mature.

For example, I fully expect that the UK will move towards a full fibre-optic network within a few decades (we already quietly give BT public money on the side to make that happen), and once that happens (and there are similar advancements in other countries) it won't take long for some bright spark in a suit to realise there are tens of millions of televisions out there, and that with just a few meetings and a software update or two a lot of people could get very rich selling games to people who don't actually have an inbuilt desire to spend 5-6 hundred bucks on a separate box of electronics they might not need.

"Gamers" may go where the games go, but the games go where the customers are, and if there's one thing the publishers want it's a huge, near-ubiquitous platform/"install base" they can just settle down and take aim at.

Sleekit:

Atmos Duality:
This is why I didn't believe for one moment what Microsoft was braying about the Xbone being "future proof"

Anyone who wants to make a "future proof" console just needs to come up with a new "consumer electronics" level form factor that has several "carts" as part of the case, which lock in and hold certain potentially upgradable system components (memory, CPU, GPU, hard disc etc.), while the rest of the connectivity/functionality follows long-term conventions (USB, HDMI, "HD", Blu-ray/DVD, and the various networking and wifi standards).

That really doesn't accomplish anything other than losing the performance-to-cost advantage that consoles have over PCs.

You can't "future proof" old hardware when performance is the reason it will become obsolete. When the time comes, your hardware is going to be old, and the consumer will have to spend money on a new one. Having to spend $600 over seven years to keep updating your Xbox instead of $500 to buy a new one after six years isn't helpful to the consumer or the industry.

masticina:
If you require cloud computing to get your game to run a certain way, and let's say a gamer's internet goes down... doesn't that strip out a lot of functionality from the game?

The more you offload, the more is broken!

And will there be games that are utterly broken if they can't reach the cloud?

In short, what I'm asking is: yes, using the cloud to do things the current next-gen hardware can't do optimally will extend its life and make it more powerful. But where is the fallback point?

If a game requires the cloud to run 50% of its functionality, isn't it technically broken?

Yes and no. Yes, because in theory, developers will (especially later in the generation's cycle), be trying to optimize and suck out every bit of system performance possible, and your going to design features based on the resources available to you, meaning those features are not practical if those resources do not exist.

And no, because in practive, everything you said is true. The cloud will not be consistent or reliable enough for games to be designed to the extent that they may rely on it, so they won't. This drops cloud usage to a support position, where it's main benefit will be speeding up certain processing tasks when available. But, since gameplay won't be able to rely on it and will have to perform just as well without the cloud, this makes it largely meaningless. Maybe loading screens will be better, whatever.

I'm sure some clever devs will find great ways to capitalize on the cloud when it's available, but the simple fact that the worst the game can be while offline is "completely functional" means the advantages are really limited to a support role.

Angelous Wang:

masticina:
If you require cloud computing to get your game to run a certain way, and let's say a gamer's internet goes down... doesn't that strip out a lot of functionality from the game?

The more you offload, the more is broken!

And will there be games that are utterly broken if they can't reach the cloud?

In short, what I'm asking is: yes, using the cloud to do things the current next-gen hardware can't do optimally will extend its life and make it more powerful. But where is the fallback point?

If a game requires the cloud to run 50% of its functionality, isn't it technically broken?

No, the game just runs worse/slower.

The cloud lets you offload certain processes/tasks, so your console can focus its own processors on the remaining ones, and those can then run faster/better.

The console's processors have limited resources to spend among processes/tasks. If they don't have to handle certain ones themselves, they can spend more resources on what's left.

If there is no cloud, the console's processors do all the processes/tasks themselves, which is worse/slower.

Of course, if you have no internet connection you do lose any extra cloud-based functions (like cloud saves). But these are not essential to playing the game.

That is the idea, anyway. Of course, you have to take things like latency into account, which could theoretically make cloud processing worse/slower than doing it locally.
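The offload-with-fallback pattern described above can be sketched in a few lines. This is a minimal illustration, not any console's actual API: `simulate_ai_in_cloud` and `simulate_ai_locally` are hypothetical stand-ins for a remote call and a cheap local computation, and the latency budget is what decides whether the cloud result is worth waiting for.

```python
import concurrent.futures
import time

def simulate_ai_locally(world_state):
    # Hypothetical local fallback: cheap, always available.
    return {"plan": "local-basic", "state": world_state}

def simulate_ai_in_cloud(world_state):
    # Hypothetical remote call; the sleep stands in for network + compute time.
    time.sleep(0.02)
    return {"plan": "cloud-detailed", "state": world_state}

def plan_turn(world_state, budget_s=0.25):
    """Try the cloud within a latency budget; fall back to local compute."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(simulate_ai_in_cloud, world_state)
        try:
            return future.result(timeout=budget_s)
        except concurrent.futures.TimeoutError:
            # Connection too slow (or down): the game keeps working, just worse.
            return simulate_ai_locally(world_state)
```

The point of the sketch is the `except` branch: the game degrades to "worse/slower" instead of breaking, which is exactly the support-role position described above.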

My concern would be the developers' willingness to embrace this technology. We saw how they dragged their feet last time around with multi-processing, so developing a game with the mindset of "what can we offload to the cloud, and how do we handle variable (or nonexistent) latency?" sounds like a whole lot of extra development and design time. The much better solution would be to have your local machine handle all the processing and memory requirements for your game; outsourcing to the cloud is something for multiplayer, or a workaround if the hardware can't handle the software.

Depending on cloud computing means that if your connection is down you can't play, and when the provider discontinues the service or shuts down the servers, your game will be useless.

UnnDunn:

Infernal Lawyer:

UnnDunn:
I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.

http://www.develop-online.net/news/44318/Microsoft-Cloud-makes-Xbox-One-four-times-more-powerful

Four times as powerful. FOUR FUCKING TIMES MORE POWER WHEN IT'S ONLINE. Microsoft has been claiming that for quite a while now, and last time I checked, saying you multiply your console's power with "da internets" counts as claiming cloud computing is a "magic bullet" for gaming.

Do you read more than the headline, ever?

The actual MS statement is this:

"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."

Read it and comprehend it. Then read it again. And one more time for good measure.

Nowhere does it say "the cloud makes Xbox One 'four times more powerful'". They said "here's the amount of CPU power we're allocating for cloud processing." That is all.

Actually, reading further down, the article says:

A spokesperson for Xbox Australia went on to reiterate the claims to Stevivor, and stated that the Xbox One would effectively be "40 times greater than the Xbox 360 in terms of processing capabilities" using the cloud.

Da Orky Man:
I've always thought that cloud computing in games would only really be useful in certain processes. I can see an RTS using the cloud for a particularly complicated overall-strategy AI that doesn't need a quick response time, while running the individual unit/local squad AI on the machine itself. As far as I know, graphics need a fairly quick response time, so don't respond well to latency.

Yeah, it could work for niche applications like that (things like determining random encounters, or even the Director in L4D), but not for real-time rendering or whatever.

Though if they were to offload RTS battle AI, we could take down the entire XBL network by playing a single Empire: Total War siege battle[1].
Gotta love pathfinding bugs that just consume ever more CPU resources until the game crashes!

[1] Granted, it might take a while
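The split being proposed here, latency-tolerant strategy in the cloud, reactive per-unit decisions local, can be sketched like this. All the names are illustrative (there is no such API in any console SDK); the key idea is that the unit keeps acting on a cached, possibly seconds-old strategy while cloud replies trickle in asynchronously.

```python
class StrategicAI:
    """Sketch of the cloud/local AI split: high-level strategy may arrive
    late from a hypothetical cloud service; per-frame unit reactions run
    locally every tick and never wait on the network."""

    def __init__(self):
        # Safe default used until (or in case) the cloud ever answers.
        self.cached_strategy = "defend"

    def on_cloud_reply(self, strategy):
        # Called whenever a (possibly stale) cloud result arrives.
        self.cached_strategy = strategy

    def unit_action(self, unit, nearby_enemies):
        # Latency-critical: must run locally, every frame.
        if nearby_enemies:
            return "attack"
        # Latency-tolerant: acting on a stale strategy is acceptable.
        return "move_toward_" + self.cached_strategy

ai = StrategicAI()
ai.unit_action("tank", nearby_enemies=[])   # "move_toward_defend"
ai.on_cloud_reply("flank_east")             # cloud answer arrives late
ai.unit_action("tank", nearby_enemies=[])   # "move_toward_flank_east"
```

Graphics can't be split this way because a frame is due every 16 ms, which is why the posters above keep landing on AI and world simulation as the only plausible cloud workloads.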

Awesome forum

I think it is nice to see a forum like this where we all think the situation through, and we all seem to agree: it sounds cool, but it won't work.

And yes, I would love to see a full-size ETW siege battle being run on said cloud. Hey, at least it isn't your own console that's groaning and belching black clouds.

And yes, what if in 5-6 years the specific cloud service running your game goes down? Again, this would make your single-player game worth less. Already single-player games are being stuck with online requirements and online activations. Already single-player games are turning into MMOs where you are alone.

Cloud services going down would just mean that our single-player games, yes, even our single-player games, would end up worse.
