Editor’s Choice

Failing sucks. It’s humiliating. It’s proof that you’re not good enough. But, as the opposite of success, it’s bound to, isn’t it? Success is like morphine to the soul. Success means you’re better than all the losers who failed, which is a nice place to be, but you can’t get there without driving through Failure Town, gunning the motor, praying you don’t blow a tire.

But there’s another side to failure. Apart from the personal pride bounce, failing opens a certain kind of door. If you’ve ever failed big, muffed on a galactic scale, then you probably already know this. If Amelia Earhart were still with us, she’d tell you about it. To stand up and pronounce, “I can do this,” whatever “this” may be, is to start that door opening. If you actually succeed, then the door closes. But if you fail – Heaven forfend – then it opens wide, and what pours out is the worst-tasting medicine of all: public scorn.

There’s a saying in sociological circles that most people never travel any farther than 30 miles from where they were born. I used to think this was a myth. Then I woke up one morning and realized roughly 19 in 20 of the people I know fit that sociological axiom to a T, and that I was the odd man out.

Inertia is a powerful force, especially on the human scale. You make friends, you learn shortcuts, you become familiar with the layout of the local supermarket. And so you don’t want to leave. It makes sense, and it’s right. Familiarity is comforting after all, and striking out is hard. It’s a risk, and like every risk, there’s as much a chance of failure as there is success. And so, say those who stay, why bother?

“Why bother” is the pseudo-nihilistic, cynical rationalization of mediocrity that provides a warm blanket of contentment for those who are not inclined to strive for greatness. It’s a Snuggie of flatline Prozac comfort, from within which they feel comfortable judging others for trying to be something better than what they are. And if those who try succeed? Well, they just got lucky. Anyone who grew up in a small town will recognize this phenomenon, yet never has it been more in the fore of social discourse than now – thanks to the internet.

Due to the miracle of Web 2.0 information sharing, your crazy ideas can be squashed by cynical mediocrities from around the globe at near light speed, and you don’t even have to know who they are. Internet pioneer Jaron Lanier, who coined the term “virtual reality,” calls this “drive-by anonymity”:

On one level, the Internet has become anti-intellectual because Web 2.0 collectivism has killed the individual voice. It is increasingly disheartening to write about any topic in depth these days, because people will only read what the first link from a search engine directs them to, and that will typically be the collective expression of the Wikipedia. Or, if the issue is contentious, people will congregate into partisan online bubbles in which their views are reinforced.

In other words, people tend to try to squash what they don’t understand, and the internet, instead of becoming a beacon for intellectual free-thinking and collective artisanship (as visionaries like Lanier imagined in the ’90s), has become awash with lowest-common-denominator bull-headedness, an environment in which creativity and individualism find it difficult to take root.

I call this the tragedy of the forums. The internet has become a digital schoolyard, where the smart, sensitive or just plain “not normal” are punished for standing out from the crowd, and everyone who feels the least bit uncomfortable about their own standing joins in for fear they’ll be next if they don’t. From my own experience being a “not normal” kid in an actual schoolyard, I can tell you it’s a tough place to foster any kind of creativity. But it can be done.

In days of yore, people needed talent or connections in order to gain access to a platform from which they could make their voices heard to the multitude. These days one needs neither, only an email address. Even a fake one will do. Think about it: Every book, every magazine article, every television show, every movie, every comic book produced prior to the late, late 20th century, was vetted by someone whose job it was to evaluate the work of others based on its quality. These people were envied and revered. They held, quite literally, the keys to the kingdom.

Now, owing to the rise of “Web 2.0,” those people are on the verge of becoming obsolete. Anyone can publish anything, thanks to the internet, and anyone else can review it. Vetted media like television and books are in danger of becoming extinct. Even the collections of our most treasured wisdom, encyclopedias, are no longer relevant. Truth-seekers, instead of cracking open a World Book or Britannica, now turn to Wikipedia, where the facts they find may not be entirely accurate, but who cares? Media went rushing after the crowds, hurrying in a feverish fury to capture the gold at the frontier, and the crowds responded by destroying them. It’s like Orpheus torn apart by the Maenads.

Imagine if Major League Baseball worked this way. Currently players are vetted by a talent scout, usually while playing for a college varsity team, after having been vetted by the coaching staff of their college. If they’re deemed worthy, they’ll be farmed to a minor league team before maybe, eventually, being called up to the big leagues to play for millions of adoring fans. But imagine if you could join a Major League team as easily as picking up a bat. Imagine that everyone who wanted to play could play, whether they were any good or not. Would the game get better or worse, do you think? Would you still pay the same amount to see it?

Or what if, perhaps, the game industry worked like this? What if, instead of waiting for some company to make a game, you could just make your own and play games made by other people? Considering how vocal and critical gamers are as a subculture, one would think they’d have plenty of good ideas to offer such a creative nexus, and would be willing to support it, right? Wrong.

Three or so years ago, when industry veteran Raph Koster announced his latest venture, Metaplace (having coded the back-end in his bedroom just like the good old days), he was universally hailed as a genius. Metaplace was to be the YouTube of MMO gaming. It was, in a nutshell, a system whereby anyone could create an MMO based on roughly anything. It was expected to be a great success. And yet it failed. As of this year, Metaplace has closed its doors and let go all of its staff. Turns out people care enough to bitch, but not enough to do any more than that. Who knew?

Taking a closer look at YouTube, it becomes clear that model isn’t doing all that well, either. As the frantic rush settles down, it seems there isn’t actually any gold in them thar hills. It was just an illusion. And Metaplace, in spite of generous funding, didn’t have Google dollars to throw at the problem of feeding a horse everyone wants to ride, but nobody wants to pay for. If you’re not looking at this as the canary in the coal mine for Web 2.0 sustainability, you’re fooling yourself.

Why, then, is the Web 2.0 model so attractive? Because gold or no, it’s where the people are. And why shouldn’t they be? Free is free, after all, and any supermarket sample wrangler will tell you that people will take almost anything if it’s free.

Have a free sample of this new albino rhinoceros sausage? Sure, why not! Want to buy some? No thank you.

Radiohead figured this out to their dismay last year, with the release of their latest album, In Rainbows. The plan was to offer the album as a download to anyone who wanted it. You didn’t even have to pay for it if you didn’t want to. So most people, over two thirds, didn’t pay a dime. The rest paid, on average, about half what an album usually costs. Considering Radiohead distributed over a million copies of the album in its first month of release, it’s estimated they still made something of a profit, but how much? And how many bands can realistically move enough albums for that model to make sense?

The answer, in spite of the fact many are calling the move a success, is very few. And yet bands all over the world are flocking to follow in Radiohead’s footsteps, with YouTube and MySpace clogged with paeans from wannabe Web 2.0 supergroups. Tough luck if their landlords aren’t as liberal-minded about the virtues of paying for services as their adoring fans.

The television medium is suffering, too. Studios are scrambling to save their dying business model as viewers by the millions time-shift their viewing schedules, watch pirated versions of their favorite shows on the internet or skip commercial advertisements altogether. The real tragedy is that television was free to begin with, but apparently not “free enough.”

In order to survive, studios have turned to crowd-satisfying entertainments, broadcasting the narcissistic convulsions of the lowest common denominator to your living room via “reality” television. If you’ve tired of watching actors and actresses with talent tell stories written by writers with experience, you can watch people who have neither attempt to outdo one another in classlessness and poor taste. It’s a response befitting the situation, I suppose. If viewers won’t support quality product, then they can have the cheap stuff. Let them eat sewage.

Why bother to create entertainment at all, much less entertainment media of any significant quality? Because, if you’re like me, you believe that quality matters. That creating something good is a challenge worth the attempt, in spite of the chance of failure. In spite of the very real possibility that an internet full of critics will tear you apart, like Orpheus, for little more than the crime of bothering. Because to succeed, to float an idea, to sing a song, to write a funny web video series and connect with even one audience member who takes something of meaning from the content, is a precious and life-changing opportunity.

There’s a scene in a third-season episode of the television drama The West Wing in which the President, played by Martin Sheen, is speaking to his Director of Communications, played by Richard Schiff. Schiff’s character, in a rare breach of etiquette, lambasts the president for running for re-election based on what he thinks the country wants him to be, i.e. “nice” and “plain-spoken,” instead of what he actually is, a Nobel Prize-winning economist and statesman.

“Make this election about smart and not,” Schiff says. “Make it about engaged, and not. Qualified, and not. Make it about a heavyweight. You’re a heavyweight.” In other words, why pretend to be anything but what you are when what you are is what the world desperately needs? Why apologize for being smart? For being better?

Why indeed.

And so, in the spirit of not apologizing for being better, we bring you Issue 237 of The Escapist, the first videogame-related website to try being better in the first place and, in the humble opinion of this Editor-in-Chief (and the Webby Award committee), still the best. This week, it’s time again for “Editor’s Choice,” when we select a few articles from among the many we receive each week to represent the magazine at its finest. Unbound from the constrictions of our editorial calendar, these articles are the very best we have to offer, chosen from among the best the entire internet has to offer.

We hope you enjoy it. But if you don’t, we’re sure you’ll tell us.

/Fingergun

Russ Pitts
