Perfection’s not quite the big deal it used to be, at least according to gaming’s foremost critics. Eyebrows have been raised across the industry at the news that, with its score for New Super Mario Bros. Wii, Japanese gaming magazine Famitsu has now given four 40/40 “perfect” scores this year – more than in any other year to date.
Famitsu’s review scores have been trending upward for years. This excellent analysis from Siliconera of the past decade of Famitsu reviews puts the trend into easily digestible graph form – and it would be even more damning if the graph went back further, given that Famitsu failed to award a single perfect score in its first twelve years of publication. I can only imagine the kind of “hockey stick” graph that would create.
Western gamers have taken a cynical stance, accusing Famitsu of selling its perfect scores. I can only suppose the geographical distance from Famitsu’s homeland gives them the ability to notice trends they would otherwise ignore at home – for there is very little comment about the equivalent runaway inflation of review scores in the West.
Consider the UK’s Edge magazine – not the West’s equivalent of Famitsu by any means, but an influential publication that, like Famitsu, is read by both gamers and developers. Despite its reputation for strictness, six of the 11 titles ever to score a “perfect” 10/10 in Edge have come in the last three years alone – and Edge has been in publication for 16 years.
Add it up, and both Famitsu and Edge have given more perfect scores in the last three years than in their entire previous histories: for Famitsu, 6 perfects in the 20 years up to 2006, followed by 7 in the last three; for Edge, 5 in the 13 years to 2006, versus 6 in the following three.
The question many of you are probably asking is why anybody should give a damn. And it’s a valid question. Games reviews are just numbers, after all. The number of titles that deserved perfect scores but never bagged them is enough to make you doubt the whole enterprise – notable gamer favorites like GoldenEye 007, Silent Hill 2, Resident Evil 4 and Shadow of the Colossus are conspicuous by their absence from both magazines’ lists (Zelda: Ocarina of Time and Bayonetta are the only games the two agree on). It’s just an argument for fanboys, right?
Not quite. For no matter how many times we are told that reviews do not matter anywhere near as much to sales as marketing or word of mouth does, the industry is obsessed with them to the point of addiction.
It’s been amazing to see how, in just a few short years, Metacritic has utterly pervaded this industry. There are publishers who select their development studios based on their average Metascore; Metascore can mean the difference between bonuses received or canceled, between royalties or the pink slip. A game’s Metascore can even affect a game company’s stock price.
Personally, I think part of the reason is that, in the absence of a single recognized awards ceremony to rival the Oscars or the Golden Globes, we have let Metascore become the validation that we seek. It’s so convenient, too – you put a hard-to-understand, multi-million-dollar product in one end, and a convenient number out of 100 comes out the other. Anybody can understand that.
But are games actually getting better, or are our reviews just getting more corrupt? I think it’s a little of both. The really revealing figure in Siliconera’s Famitsu analysis was not the increase in 40s but “the sharp rise in games rated 36, which appears to have started in 2002.”
This, at least, I can get behind. While I am as cynical as hell about the number of perfect scores in the industry, the overall standard of games has seen a marked improvement over the past decade. So much so that the sight of a game like Rogue Warrior – ostensibly a big-budget game from a major publisher – being savaged by IGN with a 1.5 out of 10 was almost nostalgic. It’s rare to see that kind of treatment dished out anymore.
What we have seen is a real increase in the solid-but-unspectacular game, the one that covers all the bases and does almost everything right, but which just doesn’t do anything brilliantly. You used to see a lot more games at either end of the bell curve – spectacular duds and audacious successes. But despite what Edge and Famitsu would have you believe, I feel we’re seeing a lot of congregation in the middle these days.
I recently saw the music critic Simon Reynolds make the same argument concerning music through the decades. “I reckon that if you were to draw up a top 2,000 albums of every pop decade and compare them, the noughties [2000-2009] would win,” Reynolds wrote last week. “But I also reckon that if you were to compare the top 200 albums, it’d be the other way around… the 70s would slightly less narrowly beat the 80s, the 80s would decisively beat the 90s, and the 90s would leave the noughties trailing in the dust.”
Something similar is happening to gaming at a much faster pace, a change that has accelerated over the past decade. As the general quality of games has improved, the standout hits are becoming fewer and fewer. The same basic trend was mentioned by the CFO of Electronic Arts, who was reported by Kotaku as noting that “the very top games are garnering more sales than ever, making the top-20 far and away the largest hits, a shift from a few years ago when games in the top-40 could all boast grand sales figures”.
Perhaps this change is the inevitable result of an industry growing up and taking fewer risks. Others would say that gaming is better than ever and all this is just opinion. And fair enough – overlooking for a moment the problems of using an algorithm seemingly compiled by one man with zero oversight, the idea of using Metacritic as an aggregator for the industry is fine. The one glaring problem is with the mix of critics themselves.
Picking at random a blockbuster movie from Metacritic (I chose 2012, as it’s about as smart as the average videogame), look at the reviews being collated: they come from the San Francisco Chronicle, the Washington Post, Slate, Time, Salon, the Village Voice. Looking at an equally random videogame (I chose The Saboteur on Xbox 360), we have Destructoid, IGN, GameDaily, Gamespot, 1Up. Spot the difference.
Metacritic’s game reviews are dominated by the specialist gaming press in the way other Metacritic categories are not. The specialist gaming press is massively dependent on advertising from publishers to stay in business. Meanwhile, publishers are dependent on the specialist gaming press to advertise and hype their product, both in the ad pages and in the previews. This leads to a relationship between the two groups that can be described as symbiotic at best and parasitic at worst.
Given the state of our own review culture, perhaps we in the West should put the house of games journalism in order before pointing too many fingers abroad. For Japanese publishers, the Famitsu average is the equivalent of our Metascore, so perhaps it’s not too surprising that, in an ever-more competitive industry, both these ratings are rising.
This is not to say that good, independent games journalism doesn’t exist; there just isn’t nearly as much of it as there should be. But as hype, hyperbole and hits all chip away at the credibility of our media outlets, the likes of Famitsu and Edge may wake up one day to find that credibility is a currency easily spent but very difficult to earn back.
Christian Ward works for a major publisher. Silent Hill 2 has a Metascore of only 89? Sacrilege.