How Massive is Wolfenstein: The New Order?


Don't forget that 7-8GB patch they released not long after the game came out. I have no idea what it was for; I had already beaten the game and was working on the second timeline when the mandatory update became known to me, and even pre-update I had only experienced one game freeze on the Xbone version. I had hoped that the patch would make getting out of water look better, but it's still janky as all hell. That's really my only problem with the game, though.

I was under the impression that most of the 40GB that makes up The New Order is padding deliberately added by Bethesda to annoy pirates.

It is not about the overall size, but whether that size is warranted. Rage, for example, didn't look that great considering its large texture sizes, but since I could not find a tool to decompress them I could not determine whether they were badly optimized or just badly created. At 40GB, this game just sounds *horribly* optimized. More likely the devs were trying to wow everyone with their ridiculous numbers, hoping that no one would ever get under the hood and find out what is driving it.

Now I am really interested in borrowing a copy just to look at the textures - if we have tools to access them, that is.

EDIT

If it is padding, then Beth is open to internet users billing them for having to go over their internet caps to download more data than they needed - as in class-action lawsuits (one per country). It would also be one of the first things that pirates strip out (once again - if tools exist).

4Aces:
It is not about the overall size, but whether that size is warranted.

Totally agree with this. I actually like where this thread is going: it's moved from an (amusing but trivial) discussion about the point in gaming's development timeline at which the total sum of historic code became less than the largest available single game, into something very relevant: how far can download/install sizes keep inflating, given (a) the predominance of the digital purchase and (b) the fact that, as several commenters have pointed out, even in Europe/USA most of us are on sub-par connections compared to what is actually possible?

I think there is some confusion about 'optimisation', though. When I think of that term, I think of the most efficient piece of code to perform a given task in processor clock cycles. Even when you are doing something simple like sorting a set of data into alphabetical or numerical order, there are many different ways it can be accomplished. Some do it more efficiently in terms of speed (my definition of optimisation these days) and some more efficiently in terms of how many bytes the instructions take up in memory (my old definition of optimisation back in the '80s and '90s!).
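
To make that concrete, here is a throwaway sketch in modern Python (nothing to do with any actual game's code). Both functions sort correctly; the first trades speed for tiny code, the second trades code size and a working buffer for speed:

def sort_small(xs):
    # Insertion sort: very little code (cheap in memory), O(n^2) time.
    for i in range(1, len(xs)):
        j = i
        while j > 0 and xs[j - 1] > xs[j]:
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return xs

def sort_fast(xs):
    # Merge sort: more code and working memory, but O(n log n) time.
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = sort_fast(xs[:mid]), sort_fast(xs[mid:])
    out = []
    while left and right:
        out.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return out + left + right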

I think it is a bit disingenuous to compare 'install size' to optimisation, as the actual code that runs the game is a very small percentage of the total data in any install these days. I suppose if you are talking about something like using a 2048x2048 texture when a 256x256 would do then, yes, I'd agree to a certain extent, but I still maintain there is no point in compressing data any more, apart from at the download stage.
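
For a sense of scale on the texture point, the uncompressed footprints work out like this (a quick illustrative calculation, assuming 4 bytes per pixel for RGBA):

def texture_bytes(side, bytes_per_pixel=4):
    # Uncompressed footprint of a square texture.
    return side * side * bytes_per_pixel

print(texture_bytes(256))    # 262,144 bytes, about 256KB
print(texture_bytes(2048))   # 16,777,216 bytes, about 16MB: 64x the memory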

Back in the '80s I wrote a lot of those text-only adventure games for the Spectrum. We used quite a clever little system that scanned through all the text the writer had created, looking for repeating patterns or common words: the, ing, you, look, walk, etc. We then replaced those multi-character words or partial words with a one-character code, so that when the game read through the text data and found a non-alphabetical code, it looked it up in an array and substituted the letters back in. Clever use of this system could get more text into 48K than should actually have been possible, but damn, it was slow come time to throw it up onto the screen:

Because we didn't know how long a given line of text was going to be at any one time due to these codes, and because we had a fixed-width screen, we had to generate the line in memory and then work out whether the last word would wrap onto the next line. If it did, we saved it, backtracked, deleted that word and added it to the buffer for the next line, and so on. This is no big deal on your 2.5GHz PC. It is a massive deal on a 3.5MHz Spectrum!
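
For the curious, the scheme looks something like this as a modern Python sketch - purely illustrative, since the real thing was hand-rolled Z80 assembly and the dictionary here is made up:

TOKENS = {128: "the", 129: "ing", 130: "you", 131: "look", 132: "walk"}

def decode(data):
    # Expand each byte: codes of 128+ index the dictionary, the rest are plain ASCII.
    return "".join(TOKENS.get(b, chr(b)) for b in data)

def wrap(text, width=32):
    # Greedy word wrap: the backtrack-and-carry step described above.
    line = ""
    for word in text.split():
        candidate = word if not line else line + " " + word
        if len(candidate) > width:
            yield line       # line is full, so emit it...
            line = word      # ...and carry the offending word forward
        else:
            line = candidate
    if line:
        yield line

encoded = bytes([131]) + b" around before walk" + bytes([129]) + b" through " + bytes([128]) + b" door"
for row in wrap(decode(encoded)):
    print(row)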

I know that was a bit rambling, but I thought the more technically-minded (and perhaps older!) readers would appreciate an explanation of why 'TEH OPTIMIZINGS!' means different things at different times to different people, and isn't necessarily the reason installs sometimes seem bloated.

Why go through all that hassle just to display some words these days? Well, we don't. We leave it 'unoptimised' in terms of space allocation, but very optimised in terms of speed.

As game worlds get larger, as graphics data becomes more complex, and as voice acting and a huge variety of music, effects and soundtracks become de rigueur for even small indie projects, it is inevitable that there is only one way installation sizes are going to go, and as I see it, it is most certainly not downwards.

I wonder if there is a Moore's Law equivalent for this or if there is a critical point at which it all falls apart due to lack of infrastructure for all but the lucky few? If I hadn't had this half bottle of Merlot, I feel quite sure I would already have worked out the equation :)

Kieve:

Shamus Young:
After I turned this in, I re-read it and felt that the size I came up with for DOS games was just way too small. On the other hand, the number was a really wild guess and I don't know how to come up with a more solid number. I didn't want to submit a re-write with one arbitrary number replacing another simply because the new number seemed "better" in some ill-defined gut-sense of the word. What I really needed was a better way to extrapolate an answer, and I didn't have one. I'm content to leave the DOS stuff as a weak spot in the article and see if readers have any better answers. Even if I was off by a factor of ten, the main thrust of the article stands: Wolfenstein: The New Order is BIG.

Looking forward to what other numbers people come up with, if anyone wants to take a crack at it.

From a technical standpoint, I find it interesting that you account for 5.25" floppies, then jump right to CD-ROM, forgetting completely about the 3.5" disks. Most of the DOS/PC games I ever knew came on 1.44MB disks, up until CDs replaced them.

I think it's because he realised how futile it is to add up the space usage of all the old games after he had made his point with consoles.
Even if they do add up to a lot, I don't think there are 24,000 floppies' worth of different DOS games out there.
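
A quick back-of-envelope check supports that (assumed numbers, using 3.5" high-density disks):

install_gb = 40         # Wolfenstein: TNO install, roughly
floppy_mb = 1.44        # one 3.5" HD disk
print(install_gb * 1024 / floppy_mb)    # ~28,444 disks to hold ONE game

So even 24,000 floppies of DOS games would still be less data than a single copy of The New Order.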

IndieForever:
Back in the '80s I wrote a lot of those text-only adventure games for the Spectrum. We used quite a clever little system that scanned through all the text the writer had created, looking for repeating patterns or common words: the, ing, you, look, walk, etc. We then replaced those multi-character words or partial words with a one-character code, so that when the game read through the text data and found a non-alphabetical code, it looked it up in an array and substituted the letters back in. Clever use of this system could get more text into 48K than should actually have been possible, but damn, it was slow come time to throw it up onto the screen:

Because we didn't know how long a given line of text was going to be at any one time due to these codes, and because we had a fixed-width screen, we had to generate the line in memory and then work out whether the last word would wrap onto the next line. If it did, we saved it, backtracked, deleted that word and added it to the buffer for the next line, and so on. This is no big deal on your 2.5GHz PC. It is a massive deal on a 3.5MHz Spectrum!

I know that was a bit rambling, but I thought the more technically-minded (and perhaps older!) readers would appreciate an explanation of why 'TEH OPTIMIZINGS!' means different things at different times to different people, and isn't necessarily the reason installs sometimes seem bloated.

It's always a speed-vs-memory trade-off, and the Spectrum illustrated that like no other machine (even within the ROM itself - witness the infamous seven-byte square root routine). Obviously certain more plodding games (strategy, text adventures, etc.) relied less on speed and more on size, hence Speccy programmers being able to see a joke in this XKCD that I'm pretty sure wasn't intended (short version: 'INT PI' was tokenised and expressible in two bytes, while the number 3 would have needed five bytes of floating point to store). More action-oriented games couldn't have used those sorts of optimisations, because 'INT PI' is, after all, a calculation, and every calculation adds more time.
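
If I'm remembering the Spectrum's storage format correctly (treat the exact byte counts below as assumptions), the arithmetic looks like this:

# Rough model of ZX Spectrum BASIC program storage, from memory.
def literal_cost(digits):
    # ASCII digits + hidden 0x0E marker + 5-byte floating-point copy.
    return digits + 1 + 5

keyword_token = 1            # INT, PI etc. are one-byte tokens

print(literal_cost(1))       # the literal 3: 7 bytes in the program
print(2 * keyword_token)     # INT PI: 2 bytes, paid for with a runtime calculation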

Talking of time, I don't know if you've seen the disassembly of the Spectrum classic Knight Tyme; there are two sections (here and here) that illustrate your word-encoding idea perfectly.

Kinitawowi:
Yes, tiny numbers add up, but the original point remains: compared to entire system libraries, Wolfenstein: TNO is probably still bigger.

Sure, I'm not arguing that there is necessarily enough stuff that he missed to make it bigger than Wolfenstein, just that it's a pretty poor effort when he misses several very significant factors in trying to make that point. BBC, Acorn, Spectrum, 3.5" floppies, CD, Windows - there are multiple, hugely significant platforms, storage media and OSes that he completely failed to mention, despite apparently thinking it necessary to take into account tiny niche systems that had fewer than 30 games in total. It may not be enough to catch up to Wolfenstein, but the things he missed out are almost certainly much more significant than the things he included.

Considering how small the difference between the PS3 and PS4 is (same goes for PC and everything else), it seems ridiculous that we went from 6GB games to 40GB.

I remember an Amstrad CPC collection of almost all its games; I think the zipped archive was 100MB.
One way to estimate is to search for packages like 'all SNES ROMs', C64 game collections, etc.
Sometimes these collections, even for pre-1992 systems, can be bigger - like the Amiga/Atari stuff where you have the same ROMs five times over from different pirated versions - and they could be a few GB for each of the 16-bit machines.
Still, I think if we gathered all these collections they would hardly reach 40GB; maybe 10-20GB.

P.S. I've now found a 250MB archive, but it also contains demos, applications and other stuff, and possibly many different pirated versions of the same games.
https://archive.org/details/Amstrad_CPC_TOSEC_2012_04_23
P.P.S. Actually, this archive would help to estimate the size; even then, divide by a theoretical half or more because it also includes applications and other stuff (see the rough sketch below): https://archive.org/details/tosec
P.P.P.S. And yes, 40GB is too much. And the problem here is gameplay: while Wolfenstein seems fun for a while, I've read it's quite short, and it's no longer like the old days, with lots of exploration and interesting gameplay. Too much size for too little gameplay.
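
As promised above, a rough aggregation sketch; every figure here except the 250MB archive is a placeholder to swap for real collection sizes:

collections_gb = {
    "Amstrad CPC": 0.25,   # the ~250MB TOSEC archive above
    "SNES": 2.0,           # placeholder
    "C64": 1.0,            # placeholder
    "Amiga": 4.0,          # placeholder
    "Atari ST": 3.0,       # placeholder
}
useful_fraction = 0.5      # strip demos, apps and duplicate cracked copies
print(sum(collections_gb.values()) * useful_fraction)   # vs ~40GB for TNO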
