In his 2004 book, The Paradox of Choice: Why More Is Less, American psychologist Barry Schwartz argued that, in general, the more choice we as consumers are given, the less happy we become. It turns out the psychological burden that comes with an array of choices actually outweighs the hypothetical gain of the optimal decision, leading to anxiety and distress. Making choices makes us, by and large, miserable, it seems.
What bothers us in making choices are the trade-offs and, consequently, the feelings of loss we have to deal with when evaluating the options at hand. For some reason, as Schwartz writes, we tend to feel that we lose all the alternatives as a whole when we have to pick just one, which is obviously an erroneous line of thinking. Still, it hurts to make a decision whenever there’s plenty to choose from, and one only has to observe a kid at McDonald’s choosing a Happy Meal toy to see how painful it can sometimes be when there are mutually exclusive but equally enchanting options on the table.
The McDonald’s kid conveniently leads us to the agony most PC gamers face every time they launch a freshly installed game: the host of sliders and drop-down menus found in the graphics options. Of course, the graphics options are a non-issue for players who happen to possess a top-of-the-line PC on which everything can be maxed out, quite joyfully so, I’d assume. But for the rest of us, the graphics sliders can be a serious source of misery and distress.
The agonizing trade-off here is the classic performance vs. image-quality dichotomy, something that has defined real-time rendering throughout its existence. On any given piece of hardware, we simply can’t have maximum performance and maximum image quality at the same time, both notions being, of course, theoretical ideals in and of themselves. The moment a single pixel is drawn on the screen, we are trading performance for image quality.
The key question, then, arises: to what extent are we willing to sacrifice performance for image quality? One thing I love about consoles is that, thanks to their fixed hardware specs, it’s the developers who solve the performance/image-quality equation for the player, which a) makes console games more auteur-driven, and b) means the console gaming experience is identical across the platform, so no one feels left out.
However, as said, the PC player with lower-end hardware isn’t that fortunate. Adjusting the graphics settings to “Low” or “Medium” isn’t the source of anxiety per se; rather, it’s the sheer awareness that “High”, “Very High”, or even “Ultra” settings are also available yet unattainable performance-wise. It’s amusing, and sad, how the high-end settings can make the current ones seem worse by their mere existence. What’s worse, labels like “Low” and “High” bear no absolute value whatsoever. I believe people would’ve been much happier with Crysis (2007) if the “Medium” setting had simply been renamed “High”, just as at Starbucks “Small” is called “Tall”, I hear. It’s all relative.
In the end, the true agony of graphics options comes down to the heartbreaking optimization process in which the player struggles to find some magical combination of tens of sliders that hurts performance the least while still keeping the image quality acceptable. Personally, I’m all about performance, so anything below 60 frames per second is a compromise in my eyes, which keeps me jumping between the gameplay and the settings for quite some time.
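To make that struggle concrete, here is a small, purely illustrative sketch, in Python, of what the player is implicitly doing: a brute-force search for the best-looking combination of settings that still fits the roughly 16.7 ms frame budget of 60 fps. The setting names, quality scores, and frame-time costs are invented for illustration, and real frame times don’t add up this neatly; no game exposes anything like this.

```python
from itertools import product

# Hypothetical settings: each level is a (quality score, frame-time cost in ms) pair.
# All numbers are made up for illustration; real costs don't combine this linearly.
SETTINGS = {
    "shadows":       [(1, 2.0), (2, 3.5), (3, 5.5)],   # Low, Medium, High
    "textures":      [(1, 1.0), (2, 1.5), (3, 2.5)],
    "anti_aliasing": [(1, 0.5), (2, 2.0), (3, 4.0)],
    "post_fx":       [(1, 1.0), (2, 2.5), (3, 4.5)],
}
BASE_FRAME_TIME_MS = 8.0     # assumed cost of everything else in the frame
BUDGET_MS = 1000.0 / 60.0    # 60 fps target, roughly 16.7 ms per frame

best = None
for combo in product(*SETTINGS.values()):
    quality = sum(q for q, _ in combo)
    frame_time = BASE_FRAME_TIME_MS + sum(cost for _, cost in combo)
    # Keep the highest-quality combination that still meets the frame budget.
    if frame_time <= BUDGET_MS and (best is None or quality > best[0]):
        best = (quality, frame_time, combo)

print(best)
```

Even this toy version has 3^4 = 81 combinations to check; ten settings with three levels each would already mean 3^10 = 59,049, which is why in practice the whole thing becomes trial and error by feel rather than an exhaustive search.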
Indeed, for many playing on dated PC hardware, there are usually tens of choices and trade-offs to be made before the game can really start. Don’t get me wrong, though: I’m all for options, tweaking, fine-tuning, and all that. Yet the freedom of choice tends to come with a considerable psychological price that console gamers are generally free of.
A price that keeps us mulling over what we can’t have instead of enjoying what we do have.