Don't stretch this political analogy too far, though. Compared to citizens in real-life constitutional democracies, players have very little political power and rarely get real input on design. It's still the developers who sort feedback to determine what is signal and what is noise, and they ultimately do the design.
Plus, there's another reason not to base your game design on player feedback: Players often change their opinions or stop playing entirely.
Let's return briefly to the empirical approach and quantitative data, with the mindset that "social liberal" player feedback isn't actually shared governance but rather just more data - qualitative data.
How do you know that a particular set of data or interpretation will hold true in the future? Many players could suddenly start playing as Spies for some reason. Maybe one day your entire player-run economy starts using Stones of Jordan as currency instead of gold or gems. Or tomorrow, gravity could suddenly cease to exist.
This is, more or less, the core problem of empiricism as posed by David Hume: How do we know that observable phenomena will continue to act that way, consistently, in the future?
People are much more unstable than the laws of nature, whether in their feedback and rants on forums or in their erratic playstyles, which could abruptly change upon reading a guide or watching a strategy video on YouTube.
We can't collect more player statistics or solicit more player feedback in order to decide whether collecting statistics or feedback is good; that is, we can't use induction to prove the validity of induction because that's circular logic.
However, that very reasoning is itself a form of deduction from a set of premises - and we can't use deduction to prove the validity of deduction, because that's circular logic too. So we must use induction to prove the validity of deduction ... but we just used deduction to argue for the fallibility of induction!
It's okay if you're confused - so was Hume. In the end, he adopted a kind of common sense "wait and see" approach, a type of practical skepticism. "Don't worry about whether it will hold true forever, but just worry about whether it holds true for now."
Compare this attitude to the classic Aristotelian conception of player-centrism - players might've complained that "Mega Man is too hard because of Cold Fusion Man" - and Capcom's response probably would've been, "How did you get this number?"
Suddenly Aristotle doesn't look so pluralistic and liberal anymore - instead, his approach seems immovable, static, and unresponsive.
Perhaps we must accept that a "good" game design is only good for a while, until the player data indicates it isn't good anymore - and then you redesign and rebalance it until it's good again. This is distinctly a player-centric notion, the idea that a developer must "do right" by the community of players.
So what makes a good game?
Perhaps it's the willingness to change it.
Robert Yang is currently an MFA student studying "Design and Technology" at Parsons, The New School for Design. Before, he studied English and taught game design at UC Berkeley. If he's famous for anything, it's probably for his artsy-fartsy Half-Life 2 mod series "Radiator" that's still (slowly) being worked on.