Rumor: "Very Affordable" PS4 Based on AMD's A10 APU


Magichead:
Well that's depressing; PC gamers have been waiting for years for the console designers to get off their arses and bring an offering that will finally move game developers on from graphics that have to run on 5+ year old hardware, and now we find out that the PS4/Orbis/Whatever is going to be running on an AMD A10, a CPU with a 3.8GHz base clock based on the horrible Piledriver architecture?

My "budget" gaming rig from two years ago with an overclocked Phenom II quad-core CPU can happily keep up with Piledriver rigs built today, and they get chewed to pieces by even the midrange of Intel's designs. And that's assuming they use the A10-5800K as the basis for the Orbis; there's the 3.4GHz 5700K model to think about as well.

You can put as much RAM in the thing as you like, if it's running what amounts to a last-gen midrange CPU and GPU it's still going to be rubbish.

You're probably still disheartened by the performance of the Bulldozer architecture. I know I am; I'm still using a Phenom II X4 Black that I bought 5 years ago in my gaming system. The improvements that were made for Piledriver place it ahead of AMD's previous chips, though not by much, and fix most of what was wrong with Bulldozer, especially the price.

Regardless, the biggest problem with the Bulldozer architecture in the first place was its almost one-sided focus on multithreading, at a time when most applications still don't take full advantage of 2 cores, let alone 4-8. It also didn't have the throughput to maximize all its cores. With devs writing directly for the chip, Piledriver should perform great.

Treblaine:
Maybe it's a matter of coding, but after assembling so many gaming PCs and benchmarking them, I've yet to find a game that gets a consistent advantage from 16GB of system RAM, even the highest-speed kind. Even though it makes us more money, we argue against customers requesting 16GB of RAM for a gaming rig, as we know we'd be taking advantage of them.

16GB only really comes into its own for processes that don't need to be fast but deal with a lot, like Photoshop, video editing and making 3D models and animations. So basically game development.

8GB is just about ideal for even the most demanding games.

I think a likely scenario is that the console is planned to launch with 8GB, but the dev-kit models have double the RAM (16GB) just to make it easier to tweak, create and combine elements, with the goal of getting system memory usage on the dev-kit down to only 8GB, which will be the launch version.

Though it may be even half that; 4GB of RAM would be very affordable yet very capable on a console with refined specs and no operating-system overhead. The Xbox 360 has done so well on only 512MB of RAM shared between CPU and GPU. 4GB would be 8 times that: three Moore's Law doublings, which is what you'd expect over 6 years. 2013 is 6 years since the PS3 launched in Europe.
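(As a rough back-of-envelope sketch of that doubling claim, assuming the usual rule-of-thumb doubling period of about two years; the figures below are just that rule applied to the 360's 512MB.)

# Back-of-envelope check (assumes the usual ~2-year doubling as a rule of thumb)
base_mb = 512            # Xbox 360's RAM, shared between CPU and GPU, in MB
years = 6                # roughly the PS3's European launch (2007) to 2013
doublings = years // 2   # one doubling every ~2 years -> 3 doublings
projected_mb = base_mb * 2 ** doublings
print(projected_mb)      # 4096 MB, i.e. 4GB: eight times the 360's 512MB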

Absolutely a matter of coding. If next-gen console games could cache huge amounts of game assets like textures and models, then they could potentially eliminate faults like texture pop-in and long load times. PCs don't do this yet because not enough PCs have the hardware to enable it; developers have to use methods that will work for the majority. With consoles being uniform, it would be easy to implement. If it is, then you can expect future PC games to have a similar feature.
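(A minimal sketch of what such asset caching could look like; the function name and the RAM budget below are hypothetical, purely for illustration, not anything Sony has described.)

# Hypothetical sketch only: preload a level's assets into a fixed RAM budget
# so nothing has to stream in mid-game (which is what causes pop-in).
def preload_level_assets(asset_paths, ram_budget_bytes):
    cache = {}
    used = 0
    for path in asset_paths:
        with open(path, "rb") as f:
            data = f.read()                      # pull the whole asset off the disc
        if used + len(data) > ram_budget_bytes:  # stop once the budget is spent
            break
        cache[path] = data
        used += len(data)
    return cache

# e.g. reserve 3GB of a hypothetical 4GB console for cached assets:
# level_cache = preload_level_assets(texture_and_model_files, 3 * 1024**3)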

Moore's Law relates to the number of transistors on an integrated circuit, not the memory use of software.

Lord_Gremlin:
So, 8 and 16 mean that they're trying to decide between 4 and 8 GB in a console. And since they will make it in DDR5, the current top RAM for video cards, they will want to organize 512MB chip production to get an 8GB model out.

GDDR5, not DDR5.
DDR5 doesn't exist. GDDR5, like GDDR4, is based on DDR3 but is more optimised for high-bandwidth, parallel tasks like graphics processing and GPGPU work. GDDR5 might not be appropriate for use as system RAM as far as I know, but we will have to wait and see.

Magichead:

You can talk about Moore's Law and how amazeballs it is compared to the old gen consoles all you like, the fact remains that a budget gaming PC from two years ago will equal or beat an A10-based system in any benchmark you care to name, more often the latter if you give the budget rig a decent overclock. Even accounting for the moderate performance boost that comes from developing games on a closed system like a console, you're still not going to be seeing any DX11 features running on this thing.

When you consider that PC users have access to the i5, the i7, and soon the new Haswell architecture; plus, graphics-wise, the 7970/GTX 670 now, and soon the new AMD HD 8xxx series (and with that the rumoured Nvidia cards that were held back from the current crop because they were so ludicrously powerful compared to AMD's 7xxx-series cards) - these new consoles will be obsolete at launch, and within a year or two we'll be back to where we are today, with PCs capable of truly outstanding performance being held back by the limits of another platform.

You're ignoring a few key points here.

Firstly, direct comparisons between consoles and PCs are always iffy at best. That's because PC architecture is a bloated mess of background software, operating systems, and drivers compared to consoles. PCs are designed to do a whole host of things, gaming being one of them. Consoles are designed purely around gaming. This means that, when running games that are properly optimised, they can do a whole lot more with a whole lot less than PCs with comparable or higher-end components. The reason your PC needs something like an i5 or an i7 is that it still needs to be able to back up all your documents, save states and settings should your PC suddenly die while playing Battlefield 3. On consoles, that's not an issue.

Secondly, you're seemingly forgetting or ignoring that the global economy is currently in the toilet. The 360 and PS3 launched during the last years of the global boom, when people still had record amounts of disposable income. Times have changed. More and more people are having to get by on less and less money, and if Sony and Microsoft want to stay in business, they're going to need to keep their consoles at a reasonable price. That means using components that cost a reasonable amount. Cutting edge PC gaming components are marketed towards a niche market of PC gamers who have the disposable income to spend hundreds upon hundreds of pounds building their dream machines. That is not the same market as the console gaming market. Console gamers want an easy to use, reliable machine that lets them play all the most recent games without costing a small fortune. Whether the console has a Radeon 7000 or 8000 is irrelevant, unless it impacts the end price.

Thirdly, consoles will not be obsolete if they remain the platform of choice of developers. Visual standards are not set by hypothetical tech demos run on super-high-end machines; they're set by the games that are released to the public. The majority of developers are still releasing their games on consoles, meaning that is the standard at which the industry currently operates. Some studios like to focus purely on PC gaming in order to get more spectacular visuals. Fair enough for them. They're still not the standard; they're above the standard.

If the PS4 and 720 release with tech specs significantly less than high-end gaming PCs, they won't be obsolete, because most developers will still design their games around them. Until Activision and EA start pulling in the majority of their billions in revenue from PC sales as opposed to console sales, that's the way it's going to be.

Treblaine:

RhombusHatesYou:

Owyn_Merrilin:
And welcome to America, where the minimum wage is only $7.25 an hour, and even "real" jobs don't pay as much as equivalent jobs in Australia.

Yet the average American wage is about 15-20% higher than the Aussie average wage, and that's calculated in 'international dollars' which are based on the purchasing power of the US dollar.

Average isn't necessarily median.

Yeah, I was being just a little bit intellectually dishonest there but it did make a few people stop and think about it, so I'll take what I can get.

[explanatory rant deleted]

SpAc3man:

Treblaine:
Maybe it's a matter of coding, but after assembling so many gaming PCs and benchmarking them, I've yet to find a game that gets a consistent advantage from 16GB of system RAM, even the highest-speed kind. Even though it makes us more money, we argue against customers requesting 16GB of RAM for a gaming rig, as we know we'd be taking advantage of them.

16GB only really comes into its own for processes that don't need to be fast but deal with a lot, like Photoshop, video editing and making 3D models and animations. So basically game development.

8GB is just about ideal for even the most demanding games.

I think a likely scenario is that the console is planned to launch with 8GB, but the dev-kit models have double the RAM (16GB) just to make it easier to tweak, create and combine elements, with the goal of getting system memory usage on the dev-kit down to only 8GB, which will be the launch version.

Though it may be even half that; 4GB of RAM would be very affordable yet very capable on a console with refined specs and no operating-system overhead. The Xbox 360 has done so well on only 512MB of RAM shared between CPU and GPU. 4GB would be 8 times that: three Moore's Law doublings, which is what you'd expect over 6 years. 2013 is 6 years since the PS3 launched in Europe.

Absolutely a matter of coding. If next-gen console games could cache huge amounts of game assets like textures and models, then they could potentially eliminate faults like texture pop-in and long load times. PCs don't do this yet because not enough PCs have the hardware to enable it; developers have to use methods that will work for the majority. With consoles being uniform, it would be easy to implement. If it is, then you can expect future PC games to have a similar feature.

Moore's Law relates to the number of transistors on an integrated circuit, not the memory use of software.

" could potentially eliminate faults like texture pop-in and long load times."

Hmm, kinda done that already on 8GB. I told you, 8GB is peak performance even in the most demanding PC games; 16GB is only useful for creative software and 3D/video rendering up to Pixar level. There is a reason to avoid 16GB in a release model: it will HUGELY increase the cost of the machine yet have very marginal utility for the development projects in mind.

Doesn't system RAM actually use a "number of transistors on an integrated circuit"?

Reaper195:
I really hope it's not called 'Orbis'. PS4/PlayStation 4 suits the console much more, especially since it's been through 1 to 3.

Do YOU think you know better than the marketing team? No you don't. Go huff paint for a month straight and tell me this idea does not make sense.

On the other hand, I would *consider* getting this if it had full backwards compatibility.

Treblaine:
Doesn't system RAM actually use a "number of transistors on an integrated circuit"?

Well, sure, if you just want to take away all the mystery and magic from the world. :P

I prefer to think of RAM as using colonies of nanometer scale ants... or when I'm feeling really whimsical, cities of anthropomorphic 1s and 0s.

edit: And yes, I did once mod a case (for someone else, alas) to include an antfarm mounted to the windowed side-panel.

Well ok...as long as it's still a beast. That's pretty much all I expect Playstations to be - powerful, game-playing beasts. No gimmicks, no BS involved, just set it up, put in a disc, use a controller and play a game. And as long as I get that, and PSN is still free, I'm happy to get behind whatever the next Sony console is. I just want the same experience I got from the PS3 but better. Not asking for much.

Treblaine:

" could potentially eliminate faults like texture pop-in and long load times."

Hmm, kinda done that already on 8GB. I told you, 8GB is peak performance even in the most demanding PC games; 16GB is only useful for creative software and 3D/video rendering up to Pixar level. There is a reason to avoid 16GB in a release model: it will HUGELY increase the cost of the machine yet have very marginal utility for the development projects in mind.

Doesn't system RAM actually use a "number of transistors on an integrated circuit"?

Current Xbox 360 games are about 8GB max when compressed onto a single dual-layer DVD. The next generation of games will undoubtedly grow much bigger than that, so caching large amounts of resources will quite easily make use of at least 8GB of memory. A good example of where things are going is Rage: the HDD space needed in the system requirements is 25GB on PC, mostly due to supersized textures. 16GB of RAM would make a huge difference over 8GB if all that space was used for caching assets. This is about using RAM in new ways, not the same way PCs have been doing for years. However, I agree that 8GB is more likely than 16GB. They are using an SSD, so load times from the installed game files rather than cached data will be "fast enough" in terms of balancing performance and price.
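(To put rough numbers on the difference, here's a sketch using illustrative figures of my own: uncompressed 2048x2048 RGBA textures at about 16MiB each; real games use compressed formats, so treat this as an upper bound per texture.)

# Illustrative figures of my own, not from the thread: how many uncompressed
# 2048x2048 RGBA textures (4 bytes per pixel, ~16MiB each) fit in a RAM cache?
texture_bytes = 2048 * 2048 * 4
for ram_gib in (8, 16):
    budget = ram_gib * 1024 ** 3
    print(ram_gib, "GiB holds about", budget // texture_bytes, "such textures")
# 8 GiB -> 512 textures, 16 GiB -> 1024: double the cache, double the cached assets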

Yes, it does, but it is not the same as software/OS memory usage, as you implied. Just because 4GB would follow Moore's Law doesn't mean it wouldn't be hugely beneficial to use at least double that.

j-e-f-f-e-r-s:

PCs are designed to do a whole host of things, gaming being one of them. Consoles are designed purely around gaming.

Things like that cut both ways; for example, I want to do MORE with my integrated circuit technology than JUST play video games licensed by Sony/Microsoft/etc.

A PC does more than a console: it's a robust web browser, photo editor and social interface, not within one gaming network like Xbox Live but across many different networks. It's got the ultimate backwards compatibility for games and the ultimate variety, not just in the highest-fidelity graphics but in scaling each element according to your preference, be it speed, resolution, fidelity or whatever.

If you get a home videogame console... you'll probably still need to get a home computer. But if you get a gaming capable computer, you don't really need a console. Especially with how games that are exclusive to home consoles are few and far between these days and more than ever they are going multiplatform with a PC release.

A console I consider much more of a "disposable luxury" than my gaming PC. My PC is EVERYTHING in home electronics, from Facebook for my job, editing and posting videos, word processing, emails, file storage and backup, to THIS very website, which I wouldn't like to depend on a console browser to access. Then of course there's taking photos off devices, scaling, sorting and uploading them, and uploading music and movies to media players like iPods. Netflix still needs a PC to sort your choices as far as I know. So many essential things are PC-browser based.

When budgets get tight, if I have to choose between marginalising my console or my PC, you know which one all of us would choose.

A single PC may be expensive, but not as expensive as investing in a console (and $60 per game) and discovering you still need to fork out again for a low-spec PC for all your non-gaming needs. Then you'd just be wasting processing power; you COULD just have one processor and graphics card etc. for both gaming and social computing, switching between the two. And each processor is not "specifically made for gaming"; they are general-purpose processors, and it's the operating system and interface (gamepad vs mouse + keyboard) that's the deciding factor.

All I'm saying is when my budget gets tight, the console goes and the PC stays. You don't need a super-powerful killer PC. Remember, console games settle for much lower settings than the maximum settings on PC; to compensate for the lack of a console, just lower the settings to console level and it runs as smoothly as on a console (still not that smooth).

I applaud Sony going for a more budget A10 system, as it recognises their place in the technology landscape: it is NOT the centre of my life or really anyone's life, and it is not an indispensable component of being a modern, connected and technologically capable person. The home PC is that. Consoles are a luxury we cannot so easily afford any more.

What I'd be most impressed with is if Sony's new A10-based console takes a leaf out of the PC's book on affordable games.

That's another reason why I couldn't abandon PC: PC has affordable games to an extent that consoles do not. So many PC games are free-to-play or fan-made mods, and Steam sales are such good deals. I've been playing Brutal Doom on Zandronum recently, a free mod in a free source-port engine for Doom (which I got for pennies in a Steam sale). IT IS LITERALLY AWESOME!!!

PC is cheaper to run, more varied and dynamic and essential to remaining connected.

That's the way I see it. I'd be very interested, if you have a difference of opinion, to hear why I should - when faced with financial limitations - choose to invest primarily or exclusively in a console at the cost of marginalising my PC.

Rayken15:
Everything sounds good except the RAM. Isn't 16GB a bit of an overkill?

Not really. RAM is really cheap nowadays with 8GB being standard for a middle-of-the-road laptop or a regular desktop. 16GB sounds like a good place to be for a next-gen console.

scott91575:

Owyn_Merrilin:

RhombusHatesYou:

Yet the average American wage is about 15-20% higher than the Aussie average wage, and that's calculated in 'international dollars' which are based on the purchasing power of the US dollar.

Have you got a source on that? Because I've always understood Australians made more in general. They have to, because the cost of living over there is through the roof compared to what it is in the US. Video games are hardly the only thing that you guys get charged more for.

http://en.wikipedia.org/wiki/List_of_countries_by_average_wage

It is wiki, but links are provided. The page simply puts the OECD numbers in an easy to read chart.

Here is another one. I believe this too is based on the OECD numbers and expressed in PPP (money is expressed as 1 US dollar spent in the US). This is a more simplistic number: simply the average monthly salary with cost of living accounted for.

http://1-million-dollar-blog.com/average-monthly-salary-for-72-countries-in-the-world/

The OECD has a site, but it's not easy to understand or compare numbers.

edit: Note, these numbers are adjusted for living expenses (the PPP figure does that). I think that is what the other poster was probably using. If you go simply by actual dollars, Australia is considerably higher, yet the cost of living more than eats up that difference. So it really depends on how you look at things. From a video game perspective, 33% higher costs in Australia should be expected and aren't really any different from US prices considering the wage differences.

Okay, your edit confirms what I was thinking, then. My point was that Australians weren't getting screwed over as much as they tend to think they are, because they have higher wages to match the higher cost of living. Although apparently they /are/ being screwed over more than I realized. If you listen to the Australians saying "$60? Ha! We pay $120, quit complaining!", you'd think they made half as much money as Americans do when adjusted for cost of living. The reality is more like 80-something percent.

Rayken15:
Everything sounds good except the RAM. Isn't 16GB a bit of an overkill?

I'm not sure why sleetblind says what they say. 16GB is EXTREMELY high for system memory. Even the most bloated, demanding and poorly optimised PC games don't gain any benefit beyond 8GB. I've seen a rig run Team Fortress 2 on max settings in Mann vs Machine (dozens of bots) SIMULTANEOUSLY with EVE Online; that is BALLS LOADS of system memory usage, and I saw no stutter with 8GB. For a time he was using 4GB and no single game had a problem. I'm on 8GB and regularly leave demanding programs running simultaneously, like Chrome with 20 tabs open, Photoshop and a Steam game.

I asked my boss, who is in the business of assembling and repairing PCs, and you can ask an expert yourself; they'll tell you the same thing.

The only home computers that vaguely benefit from 16GB of RAM are those that are for MAKING games, not playing them. Things like animating and rendering Pixar quality movies or assembling and testing assets for a 3D game.

I think 16GB is just for the "developer kit" that publishers give to coders to design games on; the actual home console may have half or a quarter of that system memory capacity. 4GB is very workable: with 4GB today you can play at very high quality settings on PC, and 8GB is overkill on PC.

Treblaine:
I think 16GB is just for the "developer kit" that publishers give to coders to design games on; the actual home console may have half or a quarter of that system memory capacity. 4GB is very workable: with 4GB today you can play at very high quality settings on PC, and 8GB is overkill on PC.

Hrrmmm... you know, 16GB would be just about right if they were emulating a RISC-based environment on a CISC architecture system... but the CPU seems a bit underpowered for that. Then again, current console OSes aren't exactly resource intensive, so it might work well in that regard. Of course, trying to discern the specs of a console from the specs of its devkit is just idle speculation.

Oh yeah, 16GB is not enough to render Pixar-level animation... but then again, it doesn't matter what sort of kit you drop into a single system, it won't be enough for that. Renderfarms exist for a reason. ;)

Treblaine:

RicoADF:

if they're smart they will offer a 'premium' edition that supports full backwards compatibility..... for a price

It might defeat the purpose if the premium is too high a price.

For example, if the Premium version costs more than a "core" PS4 PLUS the price you'd get from selling your PS3... then it makes more sense just to keep your PS3 and get a core PS4.

Backwards compatibility made sense with PS1 to PS2. I sold my PS1 then used the money I made from that to help pay for PS2 yet I could still play all my PS1 games. Gamecube didn't have backwards compatibility but it sold for $99 when PS2 sold for $299.

Personally, for an all-in-one system that works, I'd pay any price :-)
One console that plays all my games from PSX, PS2, PS3 and PS4 would be worth it. Remember, the PS2 and 3 will eventually be unrepairable/irreplaceable one day.

RhombusHatesYou:

Treblaine:
I think 16GB is just for the "developer kit" that publishers give to coders to design games on; the actual home console may have half or a quarter of that system memory capacity. 4GB is very workable: with 4GB today you can play at very high quality settings on PC, and 8GB is overkill on PC.

Hrrmmm... you know, 16GB would be just about right if they were emulating a RISC-based environment on a CISC architecture system... but the CPU seems a bit underpowered for that. Then again, current console OSes aren't exactly resource intensive, so it might work well in that regard. Of course, trying to discern the specs of a console from the specs of its devkit is just idle speculation.

Well, it's been pretty well established that devkits have more memory than the release consoles, and memory seems to be scaled by a factor of 2, so 2x or 4x the home-console variant.

http://xna360console.blogspot.co.uk/

The Xbox 360 developer version here has 1GB each for the GPU and CPU, compared to the release console that has 512MB shared between CPU and GPU. That establishes that the developer version may have 4x the system memory of the release version.

I don't know why you think a home video game console would do something like try to emulate a RISC-based environment on a CISC architecture system; it seems like such a waste and, as you say, the CPU isn't up to it.

I think we can do more than "idle speculation" and draw some safe conclusions as well.

RicoADF:

Personally, for an all-in-one system that works, I'd pay any price :-)
One console that plays all my games from PSX, PS2, PS3 and PS4 would be worth it. Remember, the PS2 and 3 will eventually be unrepairable/irreplaceable one day.

I wish I was as rich as you, to be able to honestly say "I'd pay any price" for as trivial a convenience as an "all-in-one" device while willingly making your current technology redundant.

By the time - so far in the future - that most PlayStation 3 consoles are irreparable, computer hardware will have advanced to the point where PC emulation is practical if not preferable for PS3 games, as is the case with N64 games today.

And realise, Sony only stopped making PlayStation 2 consoles last year; there are PLENTY of consoles out there capable of playing PS1 and PS2 games, and the PS3 looks like it will remain in production as long as there is any demand, they just released the PS3 mini. And PS1 games are increasingly available via PSP/PS3 emulation or source ports. The games are NOT lost forever. I played through the entire classic Tomb Raider series on my PSP, and I wasn't limited to paying through the nose for an original disc at quite a high price (due to rarity); I got a digital download instead.

Treblaine:
I don't know why you think a home video game console would do something like try to emulate a RISC-based environment on a CISC architecture system; it seems like such a waste and, as you say, the CPU isn't up to it.

A consumer version wouldn't want to run emulated environments.

However, an early version dev-kit released before any custom hardware has been fabricated, just might want to do that if said custom hardware is RISC based.

I guess I just have a problem accepting that a console manufacturer is going CISC instead of RISC, as RISC-based computing is more efficient when it comes to limited-task usage than the more flexible CISC. Not to mention that going CISC makes a system far more open to being emulated on PCs, as it cuts out one layer of emulation being needed, and the most problematic layer at that.

cerebus23:
first to the market, like the ps2

Sega Dreamcast was first to the market, not PS2.

CortexReaver:

cerebus23:
first to the market, like the ps2

Sega Dreamcast was first to the market, not PS2.

Sony today is not like Sega of 1999. Sony has had significant and continuing success with the PS3, and the PS2 has been selling very well up till recently, with plenty of software sales of course.

The Megadrive/Genesis died painfully with many badly managed peripherals, then the Saturn hardly made a ripple; most forget it even existed and it can be completely overlooked.

When the Dreamcast came to the market early, big corps were able to do things to it that couldn't be done to Sony in 2013, just like Sony couldn't bully Microsoft after their early entry. The "bullying" was mainly giving ultimatums to publishers to port exclusives, and also to marginalise

Also, the Dreamcast made the huge blunder of not having a right thumbstick. WTF?!? It had far fewer buttons and analogue sticks than the PS2, Xbox or even GameCube. Severe limitations. Can you imagine Halo without a right thumbstick?!?

Eclipse Dragon:
I have a hard time believing anything created by Sony is "very affordable".
It might be like the Vita, where the system price seems reasonable, but you need to pay extra for essentials.

Orbis basic system for $399.99.
Includes 256 GB hard drive and 1 month free Playstation Plus subscription.
Backwards compatibility available only in $499.99 models. Controller sold separately.
Price for controller: $99.99
Price for games at launch: $80.00

SHeeeiiiit. I hope not. That sounds like something that could kill a platform stone dead. There's no better way to anger your customers than hidden costs.

OT: I'd rage about the people who whine that this doesn't have the specs of a maxed-out PC in a desperate attempt to justify their latest ultra-expensive graphics card, but CBA. The platform developers don't listen to them, so why should I?

Kumagawa Misogi:
The GPU in AMD's top A10 APU (which costs $122 on its own) can get 48fps on PC in Battlefield at 1280x720 with all graphics settings at their lowest, and 32fps in Crysis 2; that is not good now, let alone in 5 years.

Consoles get far more performance out of less hardware because of the software. I don't know all the specifics, but you can look it up online. If I recall correctly, they do something called direct coding, which allows the software to take full advantage of the hardware, unlike on a computer.

Wow. The first page alone has at least 5 people I wanna quote. Fuck it, you're not wasting three hours of my precious time.

It's not going to be more than 8 gigs of RAM. I bet my huge swollen face on that.

Sony's ultimate goal is to have the hardware run 1080p60 in 3D with "no problem."

Completely meaningless, vapid statement. The PS3 is powerful enough to run things at 60FPS, 1080p and in 3D; the question is what it's going to be capable of running at those specs. Because if it's the exact same thing but finally caught up to industry-standard video specs and, say, shaders worth a crap, I will not be impressed.

There's one thing I'm actually curious about here, and that's how much they'll be able to pull out of whatever A10 processor they ultimately decide to go for, because this APU business is weak as fucking shit to begin with, so it seems to me like what they'll be able to get out of it is "halfway passable" instead of what they could do with an actual dedicated GPU. Wait and see, I guess.

WaitWHAT:

Eclipse Dragon:
I have a hard time believing anything created by Sony is "very affordable".
It might be like the Vita, where the system price seems reasonable, but you need to pay extra for essentials.

Orbis basic system for $399.99.
Includes 256 GB hard drive and 1 month free Playstation Plus subscription.
Backwards compatibility available only in $499.99 models. Controller sold separately.
Price for controller: $99.99
Price for games at launch: $80.00

SHeeeiiiit. I hope not. That sounds like something that could kill a platform stone dead. There's no better way to anger your customers than hidden costs.

The cheapest Xbox 360 (arcade) came without a hard drive or wifi support.
If you wanted those things, you needed to buy them separately, and they weren't cheap.

If it ends up being in the $400 range I'll probably get it. But only IF it's backwards compatible with PS3.

Treblaine:

YES, THIS is exactly the way I see PC gaming and gaming in general. In my country, legal console gaming is more than a luxury and not many can simply afford it, unless you just want a game or two.

I'm in the "non-elitist" crowd that doesn't brag about better graphics (although sometimes it's fun to do so :P). I love PC gaming because I can afford it: between Steam sales, GOG, the many indie bundles and free-to-play games, I have more games than I could've ever imagined having, and although I can't run them all at max settings with 60 fps, blah, blah, blah, I'm more than happy with the graphics quality and performance of my current setup. Heck, it could even be comparable to "console quality" graphics.

I built this PC in 2008 and it's still going strong, without signs of ever slowing down. I bought an Xbox from a friend back in 2009 and that thing only lasted me a single year. I sold it, because I already had a PC and a shitton of games for it; I didn't want to spend $75 on each new game (that's the price we get in this wonderful country), plus an extra fee to play online, an overly expensive hard drive, not to mention a ludicrously expensive wireless adapter because I couldn't hook it up directly to my modem.

So yeah, like this guy said, the Xbox needed to go and I don't miss it one bit.

Adam Jensen:
If it ends up being in the $400 range I'll probably get it. But only IF it's backwards compatible with PS3.

For $400... hell to the nope.

But why so much love for BC? You'd still have your PS3, right? Don't tell me you own a load of PS3 games yet don't own a PS3. What do you add by having a second console redundantly play the same games?

I really don't get how people can be so attracted to redundant access to the prior generation that they will reject the next generation of console developments.

Warped_Ghost:

Kumagawa Misogi:
The GPU in AMD's top A10 APU (which costs $122 on its own) can get 48fps on PC in Battlefield at 1280x720 with all graphics settings at their lowest, and 32fps in Crysis 2; that is not good now, let alone in 5 years.

Consoles get far more performance out of less hardware because of the software. I don't know all the specifics, but you can look it up online. If I recall correctly, they do something called direct coding, which allows the software to take full advantage of the hardware, unlike on a computer.

I think John Carmack talked about it in his keynote as "getting right down to the metal", but it is balls-hard to do as you have to be so much more careful; you can't just drop things in, you have to work in far more complex code. It's no magic wand: it gives performance, but only after putting a LOT in.

What Kumagawa misses is that Crysis 2 runs at 30fps at 720p on an Xbox 360 using tech from 2005, a graphics chipset far inferior even to what the A10 offers. That's an example of where optimisation can give performance boosts, but Crysis 2 then became much more expensive to develop.

But I prefer the PC way: even though it isn't as efficient, it's more flexible. It's easier to change design elements late in the development cycle, or with mods or other extra content/optimisation. It keeps development costs down in an industry where costs are cripplingly high, which results in cutbacks in other parts of production and pushes out the indie developers.

It's no secret that indie development is king on PC. If it's on console, it can't hope to be graphically intensive and it's platform-exclusive. And then they get neglected, while Steam has been a good home to indie developers, where they can actually have a reliable business opportunity.

Treblaine:

It's no secret that indie development is king on PC. If it's on console, it can't hope to be graphically intensive and it's platform-exclusive. And then they get neglected, while Steam has been a good home to indie developers, where they can actually have a reliable business opportunity.

Erm... Trine/Trine 2 and Journey are all indie titles available on console, and they're all pretty damn stunning in the graphics department.

j-e-f-f-e-r-s:

Treblaine:

It's no secret that indie development is king on PC. If it's on console, it can't hope to be graphically intensive and it's platform-exclusive. And then they get neglected, while Steam has been a good home to indie developers, where they can actually have a reliable business opportunity.

Erm... Trine/Trine 2 and Journey are all indie titles available on console, and they're all pretty damn stunning in the graphics department.

I didn't say exclusive to PC, I said king on PC. There may be a few counterexamples, but the trend is clear.

Trine was limited to a narrow, fixed-angle perspective; that makes assets very easy to manage compared to an FPS game, where the camera may suddenly point in any direction and need to render something very different.

Journey also was not an indie title; it is a budget title, but it was funded, supported and published by Sony Computer Entertainment, and it had the same support as Killzone or Uncharted. It's not indie. Small and low-key =/= indie. Indie means a small independent company with few financial backers (no strings attached). When an "indie" grows big, then it's capital-I "Independent". Mojang is independent. Valve is independent.
