Thursday, 21 August 2014

The Resolution Wars - We all lose.

It seems we can't go longer than 24 hours without a new article asking if a new game is 1080p or not, and what that means for the game or the system it's on. And when a game is 1080p, we talk about frame rate, upscaling vs. native, and whatever other numbers we can think of. But why? Is it really important? Do more pixels make a better game? I tried to examine this trend in the gaming industry as objectively as possible, without making it about the consoles themselves. This is what I found.
Smoke and Mirrors
What's the name of the regulating body that tests a game when it claims to be "1080p, 60FPS"? What's the agreed-upon standard for how much of the game needs to run at that resolution and frame rate? What's to prevent someone from saying a game is 1080p/60 because a pre-rendered cut scene is in the game at that resolution, or because at one point two 1920x1080 frames were drawn 1/60th of a second apart, even if it never happens again? Nothing at all, obviously. As with any unregulated marketing buzzword, the more we tell publishers that saying a game is 1080p matters more than any aspect of the game itself, the more we encourage shady marketing. Sony is already facing a lawsuit (which recently got the green light to proceed as a class action) over the use of "temporal reprojection", a tactic to make a game look like it's producing a solid 1080p image when it really isn't. It's a lawsuit they will almost certainly win by simply arguing what I said above: 1080p is a meaningless claim they are under no obligation to deliver on. Diablo's console port takes this creative resolution labeling to the next level, as the game was built from the ground up to be able to claim 1080p/60 without ever having to draw 1920x1080 pixels 60 times a second. Instead, the game renders the camera control image at 1080p/60 independent of the actual gameplay. This allows the game to effectively drop frames without slowing down. It also prioritizes drawing the pixels closest to the player's focus (the character) and will sometimes fail to refresh the sides of the screen rather than drop frames completely. I'm not calling Blizzard out as deceptive for this, and these tricks are used fairly infrequently in the final build of the game, but it's clear they devoted a significant amount of time, money, and tech wizardry to the 1080p claim above any other priority. And was it worth it? The answer looks to be a definitive no.
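Before getting to that answer, here is a minimal sketch of the general shape of that kind of trick: meet a 60FPS frame budget by always redrawing the camera/UI layer, then filling in the gameplay layer from the player's focus outward and abandoning the screen edges when time runs out. Every name and number below is invented for illustration; this is not anything resembling Blizzard's actual engine code.

```python
import time

FRAME_BUDGET = 1.0 / 60  # seconds available per frame if we want to claim "60FPS"

# Gameplay layer split into regions, drawn from the player's focus outward.
WORLD_REGIONS = ["character", "inner ring", "outer ring", "screen edges"]

def render_camera_layer():
    """Camera/UI layer: always redrawn at full 1920x1080, every single frame."""
    pass  # stand-in for the real draw call

def render_world_region(region):
    """One chunk of the actual gameplay image."""
    pass  # stand-in for the real draw call

def render_frame():
    start = time.perf_counter()
    render_camera_layer()                 # this part is genuinely 1080p/60
    for region in WORLD_REGIONS:          # player's focus first, edges last
        if time.perf_counter() - start > FRAME_BUDGET:
            break                         # out of budget: leave the edges stale
        render_world_region(region)

for _ in range(600):                      # simulate ten seconds of "60FPS"
    render_frame()                        # the frame counter never dips
```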
We live in a world of day-one patches, which sometimes gives us a window into the game that could have been. For Diablo, both a 900p and a 1080p version of the game were available to the media and review sites for almost a month before the day-one patch upped the resolution (I'm deliberately ignoring the fact that they were on different consoles, because that's not what this is about). The 1080p version saw gameplay frame rates drop as low as 40FPS along with notable visual hiccups, while the 900p version ran a flawless 60FPS. Upon release, a patch brought both versions to 1080p and significantly reduced the frame rate problems at that resolution ... but didn't eliminate them. From an objective, numbers-to-numbers standpoint, the 900p game simply ran better. So why aren't we playing that? Why didn't the day-one patch LOWER the resolution of the flawed 1080p version to the flawless 900p resolution, instead of focusing on fixing as much as possible at the higher resolution? Microsoft and Sony knew that would never cut it. They need to listen to the players, and the players didn't want the game in a lower resolution simply because it was better. We want the game in the resolution that we've made our holy grail. And why wouldn't we? The 1080p experience is visually superior to 900p, and that's a good thing. Unfortunately for that argument, almost every site that had access to both the 900p and 1080p game says there is no noticeable difference in visuals, with a few even saying the 900p game looked better because of the way the tech wizardry I mentioned earlier interacted with dense particle effects and busy fights. Some did their best to point out subtle differences that justified the focus on resolution, but no one talked of dramatic, earth-shattering differences. This shouldn't surprise us, however; we've known it for years.
Meaningless Numbers
1080p is shorthand for "one thousand and eighty lines of resolution, scanned progressively", and based on that alone we already have a problem. Progressive refers to the way a television creates an image by drawing every line when it refreshes, as opposed to "interlacing", which only draws half the lines on each refresh. It's almost meaningless in discussing next-gen console resolution, as DX11 and the Xbox One don't support interlaced output (the PS4 has a 1080i option, but it simply exports the image at 720p). What we are really talking about is 1920x1080 resolution, which seems a lot less "next gen" and important given that's the resolution 90% of our PC monitors have been running for the last five years. Yet we'll continue to say 1080p even though the "p" is meaningless and we are all more familiar with just the numeric resolution, because the buzzword is far more important than the reality. It's an attempt to make this number seem like a magic sweet spot where things are either 1080p or horrible. To the shock of absolutely no one, all the evidence we have says otherwise.
The most popular study on the benefits of higher resolution is a handy chart compiled and sponsored by the manufacturers of high-resolution TVs, a group that would have no reason whatsoever to mislead you.
[Chart: visible benefit of each resolution by screen size and viewing distance]
Even they show restraint, noting that it would be difficult to see the difference between 720p and 1080p (let alone 900p and 1080p) if you are more than 7 feet away from your TV, or if it's smaller than 50 inches. Blind tests of this theory are almost impossible to find, with PC Magazine and Consumer Reports being the only two major media publications I could find that put people in a room with unlabeled TVs at different resolutions and asked them to pick which one looked better. In both cases, the result was statistically insignificant*, with only 56% and 54% of the group correctly identifying the 1080p TV as having the higher resolution. Blind tests of screenshots on blogs are more common (I've even run one myself) and time and time again they give us the same message: the results are no better than guessing. More importantly, when a control is added (some people are given screenshots that are in fact identical), the ratio of people who report a difference to people who don't doesn't change**. In a phrase I never thought I would hear myself saying, it looks like web comments might give us all the insight we need into why this is happening. On gaming sites, where the comments are generally defending why the console someone bought is better than the one they didn't, the vast majority say they can easily see a difference. On TV websites, where the comments are generally defending the low-cost TV someone bought instead of a higher-priced alternative, the vast majority say they can't see any difference at all. The bottom line is that for most people in most situations the difference between 720p and 1080p is at best negligible and most likely simply confirmation bias. This isn't to say there is no difference. Right now my face is a few inches from my monitor and I'm sure I would notice if someone set my resolution to 800x600 while I wasn't looking. However, the numbers tell us that in any given conversation the guy who is telling you the difference is obvious is much more likely to be doing so based on bias than fact. As damning as it looks, this isn't even the final nail in the resolution coffin.
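(A quick aside on what "statistically insignificant" means here, for anyone who wants the arithmetic: a minimal sketch of an exact binomial test. Neither publication's panel size is something I could track down, so the group of 50 below is purely my assumption; for a panel of that size, 56% correct is indistinguishable from coin-flipping.)

```python
from math import comb

def p_value(correct, total):
    """Chance of at least `correct` right answers out of `total` if everyone
    is guessing 50/50 (an exact one-sided binomial test)."""
    return sum(comb(total, k) for k in range(correct, total + 1)) / 2 ** total

# Hypothetical panel of 50 viewers: 56% correct is 28 people, 54% is 27.
print(p_value(28, 50))   # ~0.24
print(p_value(27, 50))   # ~0.34 -- nowhere near the usual 0.05 threshold
```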
1080p is a measure of graphical density, not graphical fidelity. I could use 2,073,600 pixels (1920x1080) to draw a stick figure, but few people would say it was as visually pleasing or artistically sound as a 921,600-pixel (1280x720) image of the Mona Lisa. In video games, the developer gets to make this choice based on what they think is important. You could make a simple-looking game run fantastically on any hardware, or a great-looking game by lowering the resolution or frame rate. For high-end, AAA games where we demand photo-realism, developers can still "dumb down" the fidelity in subtle ways to hit 1080p/60 rather than focusing on making games that actually look better and push the limits of the hardware. This should be obvious to anyone who's ever used a computer. Games run at the same resolution as your desktop, and you pick the frame rate by playing around with the graphical options. The Wii U has a wide range of 1080p/60 games despite being vastly inferior to the Xbox One or PS4 from a raw power perspective, so even people who game exclusively on consoles should understand this point. Yet gaming sites are constantly reporting the resolution of minimalist games like The Binding of Isaac or Spelunky without any context or explanation, when not only would these games look identical at a lower resolution (due to using low-resolution textures), but they are so visually simple they could achieve 60FPS on a 386SX-66***. In doing so they are sending a clear message: we don't care about HOW games look. We just care about the number of pixels. I don't see how this can possibly end well for us.
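To put hard numbers on "density, not fidelity", here is the arithmetic behind those pixel counts. This is a trivial sketch, and the percentages say nothing about how good anything actually looks:

```python
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
full_hd = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0%} of 1080p)")

# 720p:  921,600 pixels (44% of 1080p)
# 900p:  1,440,000 pixels (69% of 1080p)
# 1080p: 2,073,600 pixels (100% of 1080p)
```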
The Final Word
In 2016, Nintendo will release a console that reproduces 8-bit Nintendo games in 4K resolution at 240 frames a second, and will declare itself the winner of the console wars. Next year, Sony will redesign the PS4 game package to include "1080P" in 800-point Rockwell Extra Bold at the top, with the game's title in superscript. At E3 2015, Microsoft will show off "Halo EXTREME", a re-skinned port of the original "Quake" running at 4K with the tag line "It's got more pixels than Killzone, shut up and buy it". Soon trailers will just be voice-over against a black background, while frame rate charts and pixel counts are shown in place of gameplay. This is the world we seem to be creating, a world where being able to say the graphics from the console you bought or the game you developed are objectively bigger than everyone else's is far more important than how anything actually looks. We need to take a step back and stop reporting on games to appeal to the vocal minority that cares about what system a game is on more than the game itself, because that's all this is about. The Wii, the most successful console of its generation, didn't even support HD output because it rendered every game at 480p (a resolution you might remember from the VCR era). Not once in all the years it co-existed beside the 360 and PS3 do I remember a single article highlighting 720p resolution as a feature, a benefit, or a reason that a game was "better" on a different console. We still had flame wars back then, but we compared the features of the Xbox 360 with the features of the PS3, talked about the value of first- and third-party exclusives, and generally focused on how to improve gameplay overall by trying to prove our system had the better games (and better ways to play them). I never thought I would look back at this as the "good old days", but a spade's a spade. Back then, the gamer community was unified in a message that we needed more innovation, more investment in studios, and better online play. In return, we got franchises like Uncharted, Gears of War, and Killzone, as MS and Sony both spent millions investing in the studios they thought would put them ahead. Today we get 1080p ports of games you can get for $1.99 on Steam.
We are sending the wrong message, and we need to stop. We need to tell developers we don't want games that are 1080p if that resolution comes at too high a price, and the compromise in Diablo should be a rallying cry. We want games that are fantastic to play, have great frame rates, and look beautiful. If that means a lower resolution, that's fine. All games start to look better as the console generation gets older anyway, and winning the "resolution war" will be a shallow and short-lived victory regardless. But for now, I guess I'll go back to playing a game made deliberately worse to make fanboys happy. This would make me sad but ... hey ... it's Diablo. It's still freaking awesome.
*The number of people who correctly identified 1080p as the higher resolution is not meaningfully higher than what we should expect from a random method, like drawing answers from a hat.
**Only 20% of people will report that they see no difference between the two images, even when there is actually no difference between the images. In other words, 80% of people will report a difference when there isn't one. This is confirmation bias at work: the human tendency to report some "result" over reporting no result.
***Citation needed

Saturday, 2 August 2014

Against the Flow: July

At the end of the month I like to take a look at what the industry is talking about and see where I'm going against the flow. These are opinions that seem to be prevalent in the industry with both fans and insiders but that I just can't seem to get my head around, important questions that are not being asked, or opinions that run counter to what everyone else is saying. This month, I'm going to skip how I disagree with pretty much everything anyone is saying about DOTA and leave that to two articles I plan on posting later this month. As for the rest of the "big news" ...
I don’t understand why we care about “PlayStation Now”
I've already posted a breakdown of the pricing and will be recapping that, along with the marketing Sony is doing, later this month, but on a much simpler level I don't understand why any "gamer" cares in the slightest about the service. It's a streaming service, and as such requires a constant internet connection to use. If I remember anything from the months that followed E3 2013, it's that no gamer alive would EVER use a service that required a constant online connection, regardless of the benefits it offered. This became the defining ideology of the PlayStation nation and nothing short of a battle cry: anyone who would put up with an always-online requirement to play games was an idiot, and any company that would ask gamers to do that was worse than Hitler. There are hundreds of thousands of comments on tens of thousands of websites archiving this universal belief … but now being online 24/7 is all hunky-dory? What happened to the "what if my internet is down" arguments? With PSNow, an outage not only means you can't play, but you lose the money you paid for the timed rental. What happened to the arguments about internet quality? Without fast internet, PSNow is a horrible service, and when we were talking about an online check it sounded like everyone's internet was so slow and unreliable it would be impossible to send a few megabytes every month or so without major headaches. Yes, it's clear that what Microsoft was doing with aggressive DRM was different (and subjectively much more "evil") than a service with a technical requirement to be online all the time … but the arguments, if valid against one, are valid against both. Unless everyone, including top-tier media sites, was just making up objections to make something look worse than it was and feed the console wars to drive up clicks … but that would never happen. In fact, the gaming community should find so little value in this service that I'll even take it a step further …
I don’t understand why “PlayStation Now” is being offered on PlayStation.
With the exception of Colin, who would respond to Sony forcing you to kill your firstborn child to continue using PlayStation products with an article about how wonderful it was that Sony was doing its part to deal with overcrowding, there is not a lot of positive being said about this service. And that's sad … because not only is it great, it's great in the "big f'ing deal" kind of way, and might just be the most significant thing to happen in the gaming industry in years. The next time I tell my non-gamer friends about a game like "The Last of Us", which has universal appeal well beyond its strength as a game, they'll have an option to try it out without having to invest $300 in a new system. With PlayStation Now, they can just stream it to a tablet or web TV. PlayStation Now isn't only going to make Sony a lot of money on its own; it's a window to show the non-gamer how much gaming has evolved, and how the medium can be enjoyed by anyone. It's going to solidify in the minds of millions the idea that Sony PlayStation is where they should go to become "new gamers", a strategy that worked out rather well when Nintendo tried it with the Wii. Microsoft has no counter; they tried to do the same thing by making the Xbox One the "living room device" and offering TV functionality and original programming, but failed miserably. PSNow is not only going to be a huge win for Sony, it's going to beat Microsoft at their own game … or at least it would have. Instead, by releasing on PS3 and PS4 before showing it to the non-gaming world, the perception of an overpriced service that's WORSE than just buying games is going to be so prominent in the media that by the time it gets into the hands of the general population it will already be a bust.
For 80% of the population, trying the one game a year you're going to be interested in on PSNow is a much better option than investing in hardware, but Sony asked the other 20% to test it out for them. Not learning the lesson they so brutally taught Microsoft just last year (most non-gamers still think the Xbox One requires an online connection even to this day), Sony is going to be stuck with the label the internet gives PSNow in its infancy, and it's not looking good. Objectively, a low-cost alternative to console ownership for people who are only going to play a few games a year is a very nice thing, and the internet can't have nice things. Especially when they are subscription services. On that note …
I think EA’s subscription service is fantastic (even without games)
When EA announced they would be offering a $5-a-month or $30-a-year service on the Xbox One that would come with $10 off all games and DLC from EA, up to 7 days of "early access" to EA video games (without having to pay for them), and a small collection of free games, it was met with cautious optimism. Given the internet's relationship with EA (voted worst company in America two years in a row), cautious optimism is pretty darn good. The uncertainty everyone seems to come back to is which games will be included and how they will work (will they expire and only be playable for limited periods of time?). To that I counter … who cares? Just doing some simple math, a $30-a-year service pays for itself as soon as you've claimed $30 in discounts, which at $10 off per purchase means just three full-priced EA games (roughly $180 at retail). That might seem like a lot, but three top-tier EA games a year is a perfectly reasonable amount for anyone into gaming enough to be weighing a $30-a-year subscription in the first place. More importantly, up to 7 days to try out FULL RETAIL COPIES of new games before they are released, with progress carrying over to the game if you buy it, has so much added value it could be the whole service and I would still pick it up day one. Games, like movies, are all about being the first to experience them … and you already know that, internet. When Sony said PS4 owners would get to play the BETA of Destiny a few days early, it was the death of the Xbox One, and any time a game offers a "limited edition" that comes with a day or two of early access for $20 or so, it sells out. This isn't a new thing. We want early access, it's important and newsworthy every time it's offered, and we are willing to pay crazy amounts of money for it in the rare cases it's offered.
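Since the break-even point is nothing but arithmetic, here is the one-line version of it; the ~$60 retail price is my assumption, not a figure EA quoted:

```python
annual_fee = 30             # $30 a year for the subscription
discount_per_purchase = 10  # $10 off every EA game or DLC purchase
assumed_game_price = 60     # my assumption, not EA's number

purchases_to_break_even = -(-annual_fee // discount_per_purchase)  # ceiling division
retail_spend = purchases_to_break_even * assumed_game_price

print(purchases_to_break_even, retail_spend)  # 3 purchases, about $180 at retail
```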
EA needs to rebrand this service "EA Early Access" and call the other two features a bonus. Maybe then we'll be able to see how fantastic it really is (and maybe Sony will let us decide for ourselves if we want it).
Did I miss anything?  What did YOU notice this month that everyone else seemed to miss?  Let me know in the comments!