Thursday 21 August 2014

The Resolution Wars - We all lose.

It seems we can't go longer than 24 hours without a new article asking whether a new game is 1080p or not, and what that means for the game or the system it's on. And when a game is 1080p, we talk about frame rate, upscaling vs. native, and whatever other numbers we can think of. But why? Is it really important? Do more pixels make a better game? I tried to examine this trend in the gaming industry as objectively as possible, without making it about the consoles themselves. This is what I found.
Smoke and Mirrors
What's the name of the regulating body that tests a game when it claims to be "1080p, 60FPS"? What's the agreed-upon standard for how much of the game needs to be at that resolution and frame rate? What's to prevent someone from saying a game is 1080p/60 because a pre-rendered cut scene is in the game at that resolution, or because at one point there are two 1920x1080 frames drawn 1/60th of a second apart, even if it never happens again? Nothing at all, obviously. As with any unregulated marketing buzzword, the more we tell publishers that saying a game is 1080p matters more than any aspect of the game itself, the more we encourage shady marketing. Sony is already facing a lawsuit (which recently got the green light to proceed as a class action) over the use of "temporal reprojection", a tactic to make a game look like it's producing a solid 1080p image when it really isn't. A lawsuit they will almost certainly win by simply arguing what I said above: 1080p is a meaningless claim they are under no obligation to deliver on.

Diablo's console port takes this creative resolution labeling to the next level, as the game was built from the ground up to be able to claim 1080p/60 without ever having to draw 1920x1080 pixels 60 times a second. Instead, the game renders the camera control image at 1080p/60 independent of the actual game play. This allows the game to effectively drop frames without slowing down. It also prioritizes drawing the pixels closer to the player's focus (the character) and will sometimes fail to refresh the sides of the screen rather than dropping frames completely. I'm not calling Blizzard out as deceptive for this, and these tricks are used fairly infrequently in the final build of the game, but it's clear they devoted a significant amount of time, money, and tech wizardry to ensuring the 1080p claim above any other priority. And was it worth it? The answer looks to be a definitive no.
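To make the trick concrete, here's a minimal, purely hypothetical sketch of the general approach described above: draw the HUD/camera layer at a fixed 1920x1080 every frame, and let the 3D scene underneath render at whatever resolution the frame budget allows before being upscaled and composited. The function names and thresholds below are mine, not Blizzard's; this illustrates the technique, not their implementation.

    # Hypothetical sketch: fixed-resolution UI layer over a dynamically scaled scene.
    TARGET_W, TARGET_H = 1920, 1080
    FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms per frame for 60 FPS

    def pick_scene_scale(last_scene_ms):
        """Shrink the 3D scene when the previous frame ran close to or over budget."""
        if last_scene_ms > FRAME_BUDGET_MS * 0.9:
            return 0.83   # drop the scene to roughly 1600x900
        return 1.0        # otherwise render the scene at full 1080p

    def render_frame(last_scene_ms):
        scale = pick_scene_scale(last_scene_ms)
        scene_w, scene_h = int(TARGET_W * scale), int(TARGET_H * scale)
        # scene = render_scene(scene_w, scene_h)        # the expensive part (hypothetical call)
        # scene = upscale(scene, TARGET_W, TARGET_H)    # stretch back to native resolution
        # ui    = render_ui(TARGET_W, TARGET_H)         # always crisp 1080p
        # present(composite(scene, ui))                 # the output is always "1080p/60"
        return scene_w, scene_h

    print(render_frame(last_scene_ms=18.0))   # busy frame  -> (1593, 896) scene
    print(render_frame(last_scene_ms=12.0))   # light frame -> (1920, 1080) scene

Nothing about this changes what the marketing gets to say: every frame that reaches the TV is 1920x1080, even when most of those pixels were rendered at something smaller.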
We live in a world of day one patches, which sometimes gives us a window into the game that could have been. For Diablo, both a 900p and a 1080p version of the game were available to the media and review sites for almost a month before the day one patch upped the resolution (I'm deliberately ignoring the fact that they were on different consoles, because that's not what this is about). The 1080p version saw game play frame rates drop as low as 40 FPS along with notable visual hiccups, while the 900p version ran a flawless 60 FPS. Upon release, a patch brought both versions to 1080p and significantly reduced the frame rate problems at that resolution ... but didn't eliminate them. From an objective, numbers-to-numbers standpoint, the 900p game simply ran better. So why aren't we playing that? Why didn't the day one patch LOWER the resolution of the flawed 1080p version to the flawless 900p resolution, instead of focusing on fixing as much as possible at the higher resolution? Microsoft and Sony knew that would never cut it. They need to listen to the players, and the players didn't want the game at a lower resolution simply because it ran better. We want the game in the resolution we've made our holy grail. And why wouldn't we? The 1080p experience is visually superior to 900p, and that's a good thing. Unfortunately for that argument, almost every site that had access to both the 900p and 1080p game says there is no noticeable difference in visuals, with a few even saying the 900p game looked better because of the way the tech wizardry I mentioned earlier interacted with dense particle effects and busy fights. Some did their best to point out subtle differences that justified the focus on resolution, but no one talked of dramatic, earth-shattering differences. This shouldn't surprise us, however; we've known that for years.
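A quick back-of-the-envelope check puts numbers on that. Taking the 40 FPS worst case quoted for the pre-patch 1080p build against the locked 60 FPS of the 900p build (and assuming 900p means the usual 1600x900), the "higher resolution" version was, at its lowest dips, actually pushing fewer pixels per second:

    # Rough illustration only: 40 FPS is the worst-case dip cited above, not an average.
    def pixels_per_second(width, height, fps):
        return width * height * fps

    p900_locked   = pixels_per_second(1600, 900, 60)    # 86,400,000
    p1080_dipping = pixels_per_second(1920, 1080, 40)   # 82,944,000

    print(f"900p  @ 60 FPS: {p900_locked:,} pixels/s")
    print(f"1080p @ 40 FPS: {p1080_dipping:,} pixels/s")

In its worst moments, the 1080p build wasn't even winning the raw numbers game it was built to win.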
Meaningless Numbers
1080p is shorthand for "one thousand and eighty lines of resolution, scanned progressively", and based on that alone we already have a problem. Progressive refers to the way a television creates an image by drawing every line when it refreshes, as opposed to "interlacing", which only draws half the lines on every refresh. It's almost meaningless in discussing next gen console resolution, as DX 11 and the Xbox One don't support interlaced output (the PS4 has a 1080i option, but it simply exports the image at 720p). What we are really talking about is 1920x1080 resolution, which seems a lot less "next gen" and important given that's the resolution 90% of our PC monitors have been running at for the last 5 years. Yet we'll continue to say 1080p even though the "p" is meaningless and we are all more familiar with just the numeric resolution, because the buzzword is far more important than the reality. It's an attempt to make this number seem like a magic sweet spot where things are either 1080p or horrible. To the shock of absolutely no one, all the evidence we have says otherwise.
The most popular study on the benefits of higher resolution is this handy chart, compiled and sponsored by the manufacturers of high resolution TVs, a group that would have no reason whatsoever to mislead you.
[Chart: the viewing distance and screen size at which each resolution's benefits become visible]
Even they show restraint, noting that it would be difficult to see the difference between 720p and 1080p (let alone 900p and 1080p) if you are further than 7 feet away from your TV, or it's smaller than 50 inches. Blind tests of this theory are almost impossible to find, with PC Magazine and Consumer Reports being the only two major media publications I could find that put people in a room with unlabeled TVs at different resolutions and asked them to pick which one looked better. In both cases, the result was statistically insignificant*, with only 56% and 54% of the group correctly identifying the 1080p TV as having the higher resolution. Blind tests of screenshots on blogs are more common (I've even run one myself), and time and time again they give us the same message: the results are no better than guessing. More importantly, when a control is added (some people are given screenshots that are in fact identical), the split between people who report a difference and people who don't doesn't change**. In a phrase I never thought I would hear myself saying, it looks like web comments might give us all the insight we need into why this is happening. On gaming sites, where the comments are generally defending why the console they bought is better than the one they didn't, the vast majority say they can easily see a difference. On TV websites, where the comments are generally defending the low cost TV someone bought instead of a higher priced alternative, the vast majority say they can't see any difference at all. The bottom line is that for most people in most situations the difference between 720p and 1080p is at best negligible and most likely simply confirmation bias. This isn't to say there is no difference. Right now my face is a few inches from my monitor and I'm sure I would notice if someone set my resolution to 800x600 while I wasn't looking. However, the numbers tell us that in any given conversation, the guy telling you the difference is obvious is much more likely to be doing so based on bias than fact. As damning as it looks, this isn't even the final nail in the resolution coffin.
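To see what "statistically insignificant" means here in concrete terms: neither publication's panel size is quoted above, so the group of 50 viewers below is a purely hypothetical assumption, but under that assumption an exact binomial test shows results like 56% or 54% correct are entirely consistent with blind guessing.

    # Hypothetical panel size (n = 50); swap in the real sample sizes if you have them.
    from math import comb

    def p_at_least(successes, n, p=0.5):
        """One-sided chance of getting at least this many correct picks by pure guessing."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(successes, n + 1))

    n = 50
    for rate in (0.56, 0.54):
        correct = round(n * rate)
        print(f"{rate:.0%} correct out of {n}: p = {p_at_least(correct, n):.2f}")

Both p-values land far above the usual 0.05 cutoff; a coin-flipping audience would produce numbers like these all the time.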
1080p is a measure of graphical density, not graphical fidelity. I could use 2,073,600 pixels (1920x1080) to draw a stick figure, but few people would say it was as visually pleasing or artistically sound as a 921,600 pixel (1280x720) image of the Mona Lisa. In video games, the developer gets to make this choice based on what they think is important. You could make a simple-looking game run fantastically on any hardware, or a great-looking game by lowering the resolution or frame rate. For high end, AAA games where we demand photo-realism, a developer can still "dumb down" the fidelity in subtle ways to hit 1080p/60 rather than focusing on making games that actually look better and push the limits of the hardware. This should be obvious to anyone who's ever used a computer. Games run at the same resolution as your desktop, and you pick the frame rate by playing around with graphical options. The Wii U has a wide range of 1080p/60 games despite being vastly inferior to the Xbox One or PS4 from a raw power perspective, so even people who game exclusively on consoles should understand this point. Yet gaming sites are constantly reporting the resolution of minimalist games like The Binding of Isaac or Spelunky without any context or explanation, when not only would these games look identical at a lower resolution (due to using low resolution textures) but they are so visually simple they could achieve 60FPS on a 386SX-66***. In doing so they are sending a clear message: we don't care about HOW games look. We just care about the number of pixels. I don't see how this can possibly end well for us.
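For reference, here are the raw pixel budgets behind the stick figure vs. Mona Lisa comparison above (plus 900p for completeness):

    # Pixel counts only measure how many pixels you get to spend, not how well they're spent.
    resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

    base = 1280 * 720
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name:>5}: {pixels:>9,} pixels ({pixels / base:.2f}x 720p)")
    # 1080p: 2,073,600 pixels (2.25x 720p) -- the same budget whether it's spent
    # on a stick figure or the Mona Lisa.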
The Final Word
In 2016, Nintendo will release a console that reproduces 8-bit Nintendo games in 4K resolution at 240 frames a second, and will declare itself the winner of the console wars. Next year, Sony will redesign the PS4 game package to include "1080P" in 800-point Rockwell Extra Bold at the top, with the game's title in superscript. At E3 2015, Microsoft will show off "Halo EXTREME", a re-skinned port of the original "Quake" running at 4K with the tag line "It's got more pixels than Killzone, shut up and buy it". Soon trailers will just be voice-over against a black background, while frame rate charts and pixel counts are shown in place of game play.

This is the world we seem to be creating, a world where being able to say the graphics from the console you bought or the game you developed are objectively bigger than others is far more important than how anything actually looks. We need to take a step back, and stop reporting on games to appeal to the vocal minority that cares about what system a game is on more than the game itself, because that's all this is about. The Wii, one of the most successful consoles of all time, didn't even have HD output because it played all games at 480p (a resolution you might remember from 1970s VCRs). Not once in all the years it co-existed beside the 360 and PS3 do I remember a single article highlighting 720p resolution as a feature, a benefit, or a reason that a game was "better" on a different console. We still had flame wars back then, but we compared the features of the Xbox 360 with the features of the PS3, talked about the value of 1st and 3rd party exclusives, and generally focused on how to improve game-play overall by trying to prove our system had the better games (and better ways to play them). I never thought I would look back at this as the "good old days", but a spade's a spade. Back then, the gamer community was unified in a message that we needed more innovation, more investment in studios, and better online play. In return, we got franchises like Uncharted, Gears of War, and Killzone, as MS and Sony both spent millions investing in the studios they thought would put them ahead. Today we get 1080p ports of games you can get for $1.99 on Steam.
We are sending the wrong message, and we need to stop. We need to tell developers we don't want games that are 1080p if that resolution comes at too high a price, and the compromise in Diablo should be a rallying cry. We want games that are fantastic to play, have great frame rates, and look beautiful. If that means a lower resolution, that's fine. All games start to look better as the console generation gets older anyways, and winning the "resolution war" will be a shallow and short-lived victory regardless. But for now, I guess I'll go back to playing a game made deliberately worse to make fanboys happy. This would make me sad but ... hey ... it's Diablo. It's still freaking awesome.
*The number of people who correctly identified 1080p as the higher resolution is no higher than what we should expect from a random method, like drawing from a hat.
**Only 20% of people will report they see no difference between the two images, even when there is actually no difference between the images. In other words, 80% of people will report there is a difference when there isn't one. This is called confirmation bias, the human tendency to report any "result" over reporting no result.
***Citation needed
