Wednesday 25 March 2015

The Order: 1886 - Ready at Dawn's Triumph, Sony's Failure

The Order: 1886 was supposed to be the first TRUE next gen experience.  Instead, we got a two-hour movie that forced us to walk down hallways shooting at people for three hours in order to see the next scene.  But are we being fair to the game in our criticism?  Does the interactive experience not have a place in gaming?  This week The Head Pirate takes a closer look.
How we got here
Historically, entertainment and expressive media relied so heavily on tradition that both stagnation and solidification had a chance to set in before variation took place.  By the time we got bored of writing the same old stories and decided it might be neat to do the whole thing in meter and rhyme, it was already crystal clear what made a story a story or a novel a novel, so it was instantly apparent what made a poem a poem.  By the time plays moved to moving pictures, we already knew that a comedy was a completely different thing from a drama.  Rarely in history do we find these categories confused; there are no "short books that just happen to rhyme" or people who write them; there are authors and poets, stories and poems.  Society always had the time, and the artists the motivation, to explore all the new ways media would be used long before the next round of experimentation came along.  The drumbeat of human progress generally skips over art, with few exceptions.  Stone tablets from 4000 BCE are not drastically different from the books of today, and a painting from last year might utilize the same style, pigments, and canvas as some of the oldest paintings ever made.  Even the games we play haven't changed much; today's Hand Egg and Stick Ball (I'm pretty sure that's what they're called.  Not a sports guy) wouldn't look that different to fans of the sports played by the Romans or Aztecs thousands of years ago.
Video games, however, break this mold.  With less than a century of pedigree, there is no clear cultural understanding of variation within video games.  Where someone who has never seen a movie will still be able to identify a comedy and differentiate it from a western, try asking your grandmother to pick which of two video games is an isometric top-down game and which is a twin-stick shooter.  Even more dramatic, a child who has played video games on mobile, console, and PC might look at a Pong box, or even a game from the Atari 2600, and not immediately recognize them as video games at all.*  As such we tend to talk about the "gamer" when there really is no such thing.  You're unlikely to find someone who equally enjoys JRPGs, MOBAs, fighters, CCGs, and Pokémon.  Instead people tend to stick to a few closely connected genres, branching out only a few times a year when the hype associated with an individual game makes it seem worth it.  This reason is important; the JRPG fan doesn't spontaneously decide to try a brawler, he decides to try a great-looking, hyped game that happens to be a brawler.  The same is true of traditional media.  The action movie fan doesn't wake up one day, decide he's got it all wrong, and go looking for romantic comedies to watch, although he might see any given romantic comedy if the trailer caught his interest.  Obviously, this makes it very important that the trailer doesn't misrepresent the film.  The Dark Knight is one of the highest rated films on IMDb, yet it would still disappoint someone who went to see it thinking it was a buddy cop movie because the trailer focused on a few choice interactions with the commissioner.
Haters gonna hate.  And sometimes not hate.
Telltale's "The Walking Dead" (TWD) is basically The Order: 1886 with toned down visuals and fewer things to pick up and look at.  Given these were cited as the high points of The Order, it seems pretty logical that TWD, a similar game that lacks The Order's strong points, should have a Metacritic rating in the low 50s and receive universal panning from the gaming media.  So how did it end up winning a slew of Game of the Year awards, spawning a sequel and three almost identical series, and scoring over 90?  It seems obvious that this type of game, the interactive story, is enjoyed by a large number of people and respected by the gaming media for the experience it provides.  In fact, it could easily be argued that the Telltale games offer LESS in the way of game play, as they never break away from quick time events and story decisions into, for example, a shooting or exploration sequence.  Other games like Gone Home, Dear Esther, and The Stanley Parable also received almost universal praise as games despite having no game-play elements at all.  How does a story where the only interaction is hitting "up" on your keyboard end up being accepted as a better game than The Order, which at least tries to be an actual game?  It turns out we learned that lesson a long time ago, when we rejected the earliest attempts at the interactive movie.
Back in the 90s, CD technology gave rise to the popular trend of putting every-freaking-thing under the sun on a CD even if it had no reason to be there.  Maps, books, encyclopedias … someone, somewhere thought they belonged on a CD.  By the time DVDs came around, the trend of buying discs of just about anything was in full swing, and many tried to capitalize on the next big thing.  Most agreed this would be the interactive movie.  By using the remote, the viewer could make choices which would send the DVD to a given bookmark and somehow change the experience.  The first were nothing more than glorified "choose your own adventure" books, where some choices continued a fixed story, and some led to the story ending in some form of horrible death for the main character.  They were marketed as family experiences; parents would think they were watching a movie, but the kids would think they were playing a video game.  Despite dozens of companies trying dozens of approaches, they saw little success because they disappointed both parties involved.  To the parents, or movie lovers, it was just a bad movie, and to the kids, it was just a bad video game.  It couldn't deliver on both promises at once, and in trying to do so it ended up delivering neither.  So the parents went back to watching real movies and the kids went back to playing real video games and everything was right in the universe.  That is, until, for some still unknown reason, someone thought it would be a good idea to give the interactive movie another go.  We were promised pretty much the same thing as before: innovative use of technology, a movie that changes based on the choices you made, and a custom experience that added a higher level of immersion.  What we got was Heavy Rain.
Heavy Rain, Beyond: Two Souls, and now The Order all fail where Telltale games succeed because of misrepresentation and over-ambition, and The Order fails the most because it was the most ambitious.  Like the long haired (we assume) game designers of the 90s, they seek to unify the movie watching and video game playing experience into something that appeals to both sides, while Telltale is simply trying to tell a story on the computer.  The activity, where it exists, is more like page turning than video game playing, and they do nothing to hide that.  The only true interactivity comes in choosing how a character will deliver a line, or whether person A will die, leaving person B to deliver their lines for the rest of the story.  And it's perfect!  It's what movie lovers have always wanted: to be the main character.  To choose how they react and what to say.  To decide if the hero is a brooding anti-hero or a happy-go-lucky trickster.  We've spent 50 years screaming "no, don't kill him, kill her!" at the screen because we liked one character more than the other, and now we get to make the call.  They are not games.  They are not for gamers.  They are stories, for people who love stories, and they are marketed as such.  The Order, in its focus on additional game play elements over branching story and control over the main character, offers none of these things to fans of stories, and to fans of video games it's simply not a very good game.  Worse, Sony told gamers that they would love it, and showed trailer after trailer of game-play, trying to fool us into believing we would get a different experience than the one we got.
The Final Word
There is nothing wrong with an interactive story.  It's not a lesser experience than playing a game, and if the non-stop Telltale releases are any indication, there is a huge market for it.  Ready at Dawn clearly had a product to sell that would have done extremely well with this market.  They obviously have talented people who understand emotion and storytelling, and artists that are second to none.  But Sony didn't want them to make that game.  They wanted a system seller.  A blockbuster exclusive with universal appeal, and they told the marketing people to go out and make EVERYONE want to play it.  So instead of The Walking Dead, which absolutely delighted everyone who bought it by delivering exactly what was expected (an electronic comic where sometimes you need to hit buttons), The Order delivered too much game play for people who came for the story, and too little for people who came for the game play.  It will be remembered as a failure, and rightly so … but it's not Ready at Dawn's failure.  They delivered.  They delivered the right people, the right parts, and every component necessary to make something great, but the publisher put it together without looking at the instructions, and marketing didn't bother to read the brochure.
*I think it's safe to say the inverse is also true; if you were to somehow grab a Pong fan from the 1970s and show him games on the Xbox One or PS4, I don't think he would be able to identify them as future versions of Pong.

Tuesday 24 March 2015

Off Streaming – Bloodborne

It’s Tuesday, and the absolute best type of Tuesday; one that delivers a highly anticipated release.  But sometimes the mainstream isn’t for everyone.  Is there something off-stream that might offer a similar experience at a lower price or on more systems?
Bloodborne is a fantastic game that doesn’t just continue the traditions of From Software but somehow improves on them in every possible way.  It’s an absolute masterpiece, and one of the best next gen games we’ve been given so far.  But what if you don’t have a PS4?  What if you’re still working on Dark Souls 2 and don’t want something too similar?  Or maybe $60 is simply too rich.  Where can you get a Bloodborne-like (or From Software-like) experience from something other than a From Software game?
Enclave
Developer: Starbreeze Studios
Publishers: TopWare Interactive, Vivendi Games, Conspiracy Entertainment
Platforms: Xbox, Microsoft Windows, Wii
Released: July 19, 2002
Availability and price: Less than $1 on Steam until March 30th, $5.99 on GOG.com

Like From Software games, Enclave is an extremely unforgiving and dark experience.  Although combat is more direct and less nuanced, a reaction-based defense system which is far from optional will keep you on your toes.  Absolutely brutal encounter design and enemy placement makes every decision count as soon as you get past the first few levels, while failure comes at a heavy price.  The long, open levels have only a few deviously placed checkpoints, but Enclave turns it up to 11 by setting a gold cost for each checkpoint used.  If you’re not able to survive long enough and collect enough gold between deaths, you might find yourself restarting a level even with all the checkpoints unlocked.

Game play is complex and satisfying.  Although the absence of controller support seems a bit odd on the PC, the keyboard-and-mouse control setup works well enough, and controller support can be added with 3rd party software like Pinnacle Game Profiler.  Rather than build up a single character, you start each level by choosing a class to play and equipment to start with, based on what you have unlocked and the gold you have collected throughout previous levels.  This allows you to customize your loadout to the task at hand or experiment with different play styles on the fly.  There is a good mix of magic-using, melee, and ranged classes to keep things interesting, and each feels just similar enough that you can play a single level as them without being lost, but different enough that the experience is fresh.

Where Enclave really shines is in how it takes you through a dark and morally ambiguous story from both sides.  You play as evil classes like lich or assassin in the dark campaign, or your standard do-gooders like druid or knight in the light campaign.  Both stories offer something completely different, with only a few shared levels between them.  Even these are mixed up and modified from one side to the other, with you starting at a different location and fighting to a different goal.  The maps and environments have aged nicely, although the visuals overall are obviously dated (with the cut scenes being particularly painful).  And although the bosses don’t hold a candle to what you see in Dark Souls or Bloodborne, they are nonetheless enjoyable and entertaining, and the occasional puzzle sequence to break up the action is handled well.

All in all, Enclave certainly isn’t Bloodborne, but it offers a lot of the same things for less than a cup of coffee.  If you’re sitting at home today with nothing new to play, green with envy of your PS4-owning friends … you really can’t go wrong with this forgotten gem.

Monday 23 March 2015

We Should Stop: Simply Calling Games "Difficult"

In We Should Stop, The Head Pirate looks at the things we all say and do as gamers, and picks apart the ones we really shouldn't.
Defining difficult is … what’s the word for it …
When you put a simple game of chance under the microscope, you can learn quite a bit about the most common mistakes we make when speaking to a game’s difficulty.  Picture a game where you roll a 20-sided die and “win” on a roll of 1-10 while you lose on a roll of 11-20.  In mathematics we call this type of game “fair”; you win as often as you lose.  There are a number of ways we could modify the game to be “unfair”, the most obvious being to change the range of winning and losing values.  We could also use a “loaded” die where the probability of landing on any given value is not uniform, changing the game’s mechanics, or we could add additional rules to the game, like saying you can never win on the same number twice in a row.  But have we made the game any more or less difficult?
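If you want to see just how little those tweaks touch the actual task, here’s a minimal sketch in Python (the faces and ranges are the arbitrary ones from above):

```python
import random

def win_rate(win_faces, sides=20, trials=100_000):
    """Play the dice game many times; the player's 'task' never
    changes (roll the die), only the set of winning faces does."""
    wins = sum(random.randint(1, sides) in win_faces for _ in range(trials))
    return wins / trials

print(f"win on 1-10: {win_rate(set(range(1, 11))):.1%}")  # ~50%, the 'fair' game
print(f"win on 1-5:  {win_rate(set(range(1, 6))):.1%}")   # ~25%, an 'unfair' game
```

Both versions ask exactly the same thing of the player, which is the whole point.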
Although we've modified the chance of winning and losing, we haven't changed what has to be done to complete the game.  The TASK required to reach a result is the same, and this task serves as a meaningful definition of game-play.  The only thing you have any control over is whether you successfully manage to roll a die such that it lands on a face and can be read.  As anyone who has ever had friends over for board games on far too small a table can attest, it’s hardly a guarantee … but few would call this task difficult.  As such, we need language to separate a game that makes you win less frequently by requiring a high level of skill, like tossing the die through a series of rings for the roll to count, from a game that makes you win less frequently by saying you only win if you roll three 20s in a row.  In the first instance, game-play is far more difficult, while in the second, who wins or loses is still determined entirely by chance.*
All your base(line)
When we move over to video games it’s often less obvious which task we have control over and what is left to chance, and frequently they overlap.  To understand this, we’ll take a look at an action sequence that requires reactive button inputs, like a boss fight in Devil May Cry, Bayonetta, or Dark Souls (or any other brawler type game).  In this scenario the “boss” will use a combination of abilities with various telegraphs which require you to use correct counters, each mapped to a different button on the controller.  If you are successful, the boss takes damage, and if you fail you take damage instead.  The encounter plays out in a set order; that is to say, the boss uses the same attacks in the same order every time it is attempted.  This task is entirely skill based and has no random component, which allows us to assign it an arbitrary rating of difficulty: let’s say 10.  We’ll very quickly notice an interesting anomaly.  Although the encounter never changes, people who have played through it a number of times will have a higher success rate, as they are able to apply both reflex and memory to the task.  Therefore any meaningful definition of difficulty must allow for that success rate to increase with repetition.  Practice might not make perfect, but for a task to be defined as “difficult”, practice has to have some measurable effect.
With a baseline defined, we can look at ways of changing the experience and what effect that has on the difficulty, using the arbitrary scale given above.  We already understand that difficulty refers to a task, and that the task here is reacting to a telegraphed attack with the correct input using both reflex and memorization.  By lowering the amount of time we are given to input the correct response, we can make that task more challenging without changing the structure of the task itself.  If, for example, the game we rated a difficulty of 10 gives us 1 second to react, the same encounter with only half a second might have a difficulty of 15, while a version that gave you 2 seconds to react would have a difficulty of 5.  This limits a meaningful discussion of “difficulty” to platformers, brawlers, fighting games, and other games where the primary challenge comes from some primarily reflex-based task.  Within these categories, the difference is rather obvious.  It’s easy to argue that Super Meat Boy is more difficult than Super Mario Brothers, or that Street Fighter is more difficult than Mortal Kombat or Skullgirls.  But what about changes that affect something other than difficulty but still make it less likely we’ll complete the encounter?  What can we call them?
Getting Subjective
Most commonly, we would see this type of encounter move away from scripted attacks towards random patterns that are different every time it’s attempted.  Although this gives a perception of making the encounter more challenging by forcing the players to use reflex alone, that’s not really what happens.  The player is still building and applying memory to the individual telegraphs, and that’s good news for our definition; the encounter is still difficult.  Only now, instead of a set difficulty of 10, it would be a range of values based on what exactly the random sequence ends up being.  This can be a problem when trying to review or talk about a game that utilizes this technique.  One reviewer might end up getting a random difficulty (using our arbitrary scale) of 5 and say the encounter was too easy, while another might get a random difficulty of 15 and say it’s too hard.  Both are correct, as the experience is now subjective.  CCGs and puzzle games are the most common examples of this type of subjective difficulty, although it can be quite prevalent in RPGs, particularly in boss fights.  It’s clear that the type of gamer who enjoys playing Magic: The Gathering is generally different from a gamer who enjoys competitive Street Fighter, so this distinction from games with objective or set difficulty is extremely important.
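Here’s a quick sketch of why two honest reviewers can disagree, using the same arbitrary 5-15 range as above (the attempt counts are made up):

```python
import random

def felt_difficulty(seed, attempts=8, low=5, high=15):
    """Each attempt at the randomized encounter rolls its own
    difficulty; a reviewer's impression is the average they saw."""
    rng = random.Random(seed)
    return sum(rng.randint(low, high) for _ in range(attempts)) / attempts

print("Reviewer A experienced difficulty:", felt_difficulty(seed=7))
print("Reviewer B experienced difficulty:", felt_difficulty(seed=42))
# Same game, different rolls ... and both reviews are "correct".
```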
The Unforgiven
Adding a life bar, or some amount of variability in the number of times we need to succeed at entering the correct inputs or the number of times we can fail to do so, is another common way a developer might modify difficulty.  By forcing the player to “pass” a given challenge (set of tasks) 3 times in a row, for example, the developer will significantly lower the number of attempts that complete the encounter successfully.  In addition, the developer could force you to replay content before you get the opportunity to retry the encounter you failed, both increasing the amount of time that needs to be invested in each attempt and limiting your mind’s ability to form muscle memory (as you are forced to shift to other tasks while fighting your way back to the failed encounter).  Some games even take it further: a single mistake forces you to restart the game from the beginning.  Even in these extreme cases, however, the difficulty of the gameplay remains unchanged; we perceive the game as harder because the time investment needed to successfully complete it is increased.  It’s quite literally the oldest trick in the developer’s handbook.**
Going back to our definition of difficulty, we were careful to separate things the player has control over from random events.  The trick in this type of modification is that it misrepresents the player’s short-term (immediate) skill level as controllable when it isn't.  Although we can get better at things over time, how good we are at something right now is a set number which can be defined as a probability of succeeding at a given task.  For example, if you are able to complete an encounter with a difficulty of 10 (using our arbitrary scale from above) 50% of the time, how often you succeed at that task 3 times in a row is a simple function of probability outside your control.***  More importantly, we've learned nothing about this player’s ability or skill.  He may simply have “lucked out” and performed the task at the absolute limit of his abilities multiple times, and be completely unable to complete a task with a higher difficulty.  On the plus side, this can lead to an extreme feeling of gratification when a player achieves a sequence of successes which, to them, seems far more difficult than it actually was.
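The drop-off is steeper than intuition suggests, and it’s worth seeing the numbers.  A short Python sketch (the 50% success rate is the hypothetical one from above):

```python
def streak_chance(p, streak):
    """Chance of succeeding `streak` times in a row when each
    independent attempt succeeds with probability p."""
    return p ** streak

p = 0.5  # a player who clears the encounter half the time
for n in (1, 2, 3, 5):
    print(f"{n} in a row: {streak_chance(p, n):.1%}")
# 1 in a row: 50.0%, 2: 25.0%, 3: 12.5%, 5: 3.1%
```

Nothing about the player changes between those lines; only the required streak does.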
Games that utilize this technique are so popular that they’ve spawned an entire genre (the roguelike), and games like Bloodborne and Dark Souls have received almost universal praise for their use of it.  This is why it’s so important to be able to talk about them using the right language.  Given the vast variation in core difficulty (or difficulty in task) of different roguelike games which are enjoyed by the same type of gamer, it’s obvious the sought-after property is something independent of difficulty.  In the business and legal worlds we call being able to fail (or default) at something without penalty “forgiveness”, and so it makes sense that we label these games as unforgiving.
Is this getting too complex?
Last but not least, additional rules and button combinations could be added to the encounter.  There is an appreciable difference between pressing “x” to react to an attack and pressing x, y, x, x, x while holding left, a, and b at the same time.  Alternately, the rules could change throughout the encounter: if you pressed “x” three times in a row, the next time you should press “x” you need to press “y” instead.  Unlike difficulty, which focuses primarily on reflex but allows for memory as well, this type of game focuses almost entirely on memory and cognitive ability, defining a new way an encounter can present a challenge.  This is even clearer when you look at turn-based and strategy games, which eliminate reflex skill entirely while still maintaining meaningful differences from each other in terms of challenge.  I don’t think anyone would say that Civilization is any easier than Crusader Kings, but Crusader Kings has a much larger number of systems which the player needs to keep track of.  These games are best evaluated by the number of systems and mechanics the player is required to track and master, best labeled as the level of complexity.
The Final Word
In a perfect world, we’d call Super Meat Boy a game with extremely high objective difficulty that is very forgiving and not complex, while we’d say Bloodborne has moderate subjective difficulty and complexity while being extremely unforgiving.  Civilization has low subjective difficulty and a high level of complexity.  Only then do we all mean the same thing when we talk about a game’s challenge.  As it is now, we boast about clearing a game on nightmare or making it through hardcore mode when that doesn't necessarily measure skill.  More importantly, reviews and media lack a powerful tool in matching games with the people who enjoy them.  I personally don’t enjoy unforgiving games or subjective difficulty, but I love complexity and games that are objectively difficult, while my wife will spend hours perfecting a subjectively difficult, insanely unforgiving game, but can’t stand complexity.  As far as any gaming review site is concerned, however, we both like “hard” games.
I think we can do better.
* We seem to understand this and use different language outside of games.  Although the chances of winning the lottery are far smaller than the chances of becoming a millionaire by going to school to be a doctor and developing a successful practice, no one would ever say winning the lottery is more difficult.  We clearly see the “task” of becoming a doctor as being more difficult than the “task” of buying a lottery ticket, and don’t confuse chance of success with difficulty the way we do with games.
**When video games first came into being, this was the first trick developers used to make games more challenging in order to increase their length.  A game which offered no continues was perceived as “harder” than one that did, regardless of the difficulty of the tasks the player was being asked to do.
***You would succeed 0.50 × 0.50 × 0.50 = 0.125 of the time, or 12.5%.

Thursday 11 December 2014

My Completely Erroneous Opinion: Best Final Fantasy Game

Hey everyone, Head Pirate here.  I hope you enjoyed “We Should Stop”.  I've been working on smaller, more periodical-type articles and it’s been going over well, so here is another to share with you.  Should the positive response continue into the New Year, I hope to put together a release schedule of 2 to 3 small themed pieces a week.
I spend a lot of my time debunking popular opinion in “Against the Flow” or fact checking it in “In Perspective”, but in My Completely Erroneous Opinion it’s time to look at what I think: the opinions I hold that are not simply different or unpopular, but are flat out wrong.
The Best Final Fantasy game
Every time a new Final Fantasy game comes out, everyone talks about the glory that is Final Fantasy 7.  But I don’t think it’s the best Final Fantasy game ever made; it’s not even in my top 5!  So below are 5 games in the series that, in my completely erroneous opinion, are better than Final Fantasy 7.
Number 5: Final Fantasy Adventure
Long before Pokémon, at a time when I still thought Game & Watch was the cutting edge of portable entertainment, this Game Boy classic offered limitless hours of adventure and monster catching, and it’s the first Game Boy game I can remember that bothered with a story.  It somehow took what was lovable about the original NES classic and gave it infinite replay value, back in a time when that wasn’t really a thing.  After 23 years and countless games, this is still one of the most memorable and enjoyable Final Fantasy games I have ever played, and that’s good enough to land it in my number 5 spot.
Number 4: Final Fantasy 13-2
I don’t understand why this game gets so much hate.  It was a huge improvement over FF13: it added exploration, an open world, and side quests, and switched to a non-linear format; everything the fans were asking for.  The visuals were ahead of their time, the soundtrack was varied, memorable, and appropriate*, and game play was improved in every possible way.  It was also very true to what makes Final Fantasy what it is … but these points are better covered in my number 3 entry, a game which in itself defines the series and perfectly captures its uniqueness.
Number 3: Final Fantasy X-2
No, I’m serious.  Look, Final Fantasy is more than just a great RPG, and it does a lot to define itself as something different than Dragon Quest, Shin Megami Tensei, Tales, or any of the dozen or so similar series.  Final Fantasy games have a clearly identifiable theme: a coming of age or end of innocence story revolving around crystals and a chosen one.  Equally important is the way the story is told, focusing on the importance of connections between people and the different roles everyone plays in how the events unfold.  It’s a world where not everyone gets to be the hero, and some of the main characters might need to make the ultimate sacrifice before the end.  The games often shift the characters you are playing to force you to see another side of the same conflict, or to experience an unforeseen consequence of your actions.  Final Fantasy can also be defined by a strange balance between the critical and the absurd; one moment you’re fighting the Lord of Chaos in an almost futile attempt to prolong the end of days, the next you’re singing in an opera or betting on Chocobo races in the middle of a floating city.  If you could quantify what makes a game “Final Fantasy”, then X-2 is without question the most “Final Fantasy” game ever made.  It focuses exclusively on the things that make Final Fantasy what it is.  This game in itself is the unforeseen consequence and the change in perspective from X, which is why it’s the first true sequel in the franchise.  It was an experiment in stretching that core experience of storytelling over two games, so even the mechanics could shift to aid the change in paradigm.  For all the questionable ways this changed the gameplay and mood of the first game, I think it worked perfectly in doing exactly that.
Also, dresses!
Number 2: Final Fantasy 14 Online: A Realm Reborn
The vocal minority of gamers is known for hating a lot of things, but perhaps none has a higher hate-to-lack-of-merit ratio than online play being added to a traditionally single player experience.  No matter how often the publishers try to explain how budgets based on sales potential and human resources work, they continue to believe the only way to add multi-player is by taking something away from the single player.  The cardinal sin of online play is when a single player franchise decides it wants to build an MMO, an offence Final Fantasy has committed against its fans not once, but 3 times with FF11, 14, and “A Realm Reborn”.
FF11 was underwhelming, but came at a time when MMOs were all the rage and we were more willing to forgive.  Besides, we had just come off the X and X-2 high, so it’s not like we had gone a long time without a great Final Fantasy game to play.  14, on the other hand, was given to us not only after the poorly received 13, but at a point where MMO saturation had reached critical mass; every MMO other than WoW was failing, we had gotten Star Wars: The Old Republic instead of Knights of the Old Republic 3, and we had just found out Elder Scrolls Online would get to us before the next single player game in the series.  Worse, the game itself suffered from a serious case of the suck: content without context and activity without any fun.  In a surprising move for a publisher (less surprising for a Japanese publisher, to be honest), Square acknowledged the game was garbage and invited everyone to keep playing it for free until they made a new one.  2 years later we got “A Realm Reborn”, and it delivered.
With a theme and characters that focus on a perfect mix of the serious, the cute, and the absurd, it feels just like a Final Fantasy game should.  The single player focused plot-line does a great job of giving you the heroic feeling missing from most MMOs, as well as something to do when you’re alone, while cut scenes and character interaction build real connections between the player and the world around him (although you have to suspend disbelief a bit and not question how these characters had the same deeply personal relationship with everyone else in your party).  Beyond that, it’s just a great MMO that constantly motivates you to play and rewards you for doing so.  For that, FF14RR deserves praise on every level; it’s a great single player Final Fantasy game, it’s a great MMO, and it’s the first time a game has given us both without completely messing up one or the other.
Number 1: Bravely Default
Are you honestly going to argue that because a game doesn't have the words “Final Fantasy” in the title, we should ignore its crystal focused adventure where you use X-Potions and Ethers while casting Fira or Esuna and switching jobs?  I don’t think so.  Bravely Default is more a Final Fantasy game than half the numbered sequels, and flawlessly strikes the balance between what we expect from a modern video game and what our nostalgia demands of the series.  Add to that an incredibly strong end of innocence story, fantastic characters you can’t help but care about, and a half naked fairy that follows you around FOR NO GOOD REASON WHATSOEVER and you have, without question, the best Final Fantasy game ever made.
Although I like to encourage intelligent and thoughtful discussion with most of my blog posts, in this case feel free to flame me about how wrong I am in the comments below!  I already know that.  Or tell me about your opinions that are way out there.

* Gas ‘em up with the greens and let him go / Stand back, stand clear as he puts on a show / So cute yet fierce, is he from hell? / I cannot tell, yet I don’t even want to know / So you wanna be a trailblazer? / Kickin’ dirt like a hell raiser? / Take the reins, but don’t react slow / It’s time to feel the force of the chocobo
So you think you can ride this chocobo? / Got Chocobucks? You better put them on this chocobo! / Saddle up, if you think you can ride in this rodeo / Are we in hell? I don’t know… to the dirt, let’s roll! / You’re loco if you think you’re gonna hide this chocobo / Everybody’s gonna wanna ride your chocobo / It’s choco-loco style in a choco-rodeo / Gonna ride him straight through hell in this chocobo rodeo! Yeah, let’s ride!

Tuesday 2 December 2014

We Should Stop: Calling Steam DRM

(Hey everyone following me on IGN.  First, I just wanted to say thank you.  I also wanted to apologize for not being as active over here as I would have liked.  Some real life stuff got in the way, but I’m hoping to be back with a vengeance, and with frequent updates, very soon.  This is the first in a new series I’m going to be running every week or two.  My monthly “Against the Flow” will also return within the next few days.)
In We Should Stop, The Head Pirate looks at the things we all say and do as gamers, and picks apart the ones we really shouldn't be doing.
Nothing New
The only thing new about DRM (which stands for digital rights management) is the “D”.  All video games, whether they come on a cartridge or a CD or you download them from the internet, are software, and software ownership has always been an inherently tricky thing.  When I (read: my parents) paid $40 for Super Mario Brothers for the NES, it sure looked like I was “buying” something.  I had a physical copy that came in a nice box and I could trade it, resell it, or do anything I wanted with it.  However, unlike my 10-speed BMX bike or my sweet racing car bed, there was something different about my NES cartridge.  I didn’t understand it at the time, but it was very easy to make a copy of the software on the cartridge.  Because of this, when Nintendo sold it to me they needed to be clear they were not transferring ownership of the code itself and that I was simply licensing the software for my own personal use.  I was expressly forbidden from reproducing it for profit.  Simply asking nicely (and having me agree to a mostly non-binding EULA) wasn’t the only tool Nintendo had to keep people from copying games, however; there was a small chip in the NES that would detect unlicensed or duplicated games and prevent them from running.  This is how, way back in 1985, “Rights Management” for video games was born: a physical system that managed the end user’s right to use the software on NES cartridges, as well as developers’ rights to publish games on the system.
While copying an NES cartridge was “easy”, the cost of bootleg cartridges and the hardware needed was prohibitive enough that it wasn't much of a problem.  That changed significantly when games on the PC started to get popular.  Anyone could easily make a copy of a computer game and distribute it on a floppy disk or CD.  While the cost of a single copy had gone down, the cost of mass production was still expensive enough to keep large scale pirating limited to organized groups.  Then the internet changed everything; now anyone with the time and bandwidth could mass distribute a game with minimal effort.  Something needed to be done, but what?
Theft prevention good, rights reductions bad
I’ve never met anyone who complains about how unreasonable it is for a store to expect you to make your way over to the checkout and pay for something.  Sure, they could just use the honor system and we could toss money on the floor, but we understand that there are some bad people in the world and this minor step is justified by the retailer trying to protect their investment.  This should be just as true for digital video games, and to me this is the biggest area where we all get DRM wrong.  DRM is not theft prevention.  Theft prevention is the idea that you need to take some minor step to prove you've paid for a product at the store, and we all agree it’s pretty reasonable (or at the very least do it every day without much fuss).  In the 90s and early 2000s most games had some form of copy protection that required you to use a code wheel or a CD key to verify you had a legal copy.  This isn't DRM because it’s not managing your rights to the software but simply checking to see if you bought it.
DRM, as we know it today, started with Sony Music.  Beyond trying to stop people from selling illegal copies, they didn't like the idea of someone buying a CD, ripping it to a computer, and leaving the original in the car.  They saw a real potential for lost income.  In the past, if you had two children who both liked the same artist, you needed to get them each a CD.  Game developers loved this line of thinking and set about including software that would limit the number of times a game could be installed, require an online check, or add any number of other annoyances to not just ensure the person playing the game had paid for it, but that the digital rights in the EULA were being followed to the letter*.  This is the DRM we all know and hate, and with good reason.  Instead of trying to prevent some people from stealing, companies had started to assume we were all stealing and that they needed to limit our rights in order to mitigate the damage we could do with our ill-gotten products.  At its peak we saw games like “Spore”, which shipped with a hard cap of just a few installs, no matter the reason.  Change your video card, format your PC, or even suffer a hardware meltdown enough times and you’re out of luck.  You need to go buy a new copy.
Failure to launch
Any time Steam makes its way into a conversation about DRM, the fault is always directed at the launcher.  Without it installed you can’t play your Steam games.  This looks a lot like DRM: even though an offline mode is offered, you have to be online to install every game before you can use it without being connected to the internet.  But Steam is a digital store front, and being online to buy the game is a requirement.  It’s like going up to the counter and paying for your product in a brick and mortar shop, and it is simply a method of theft prevention.  If you need to install the game again, you have to be online again, in the same way you need to show a receipt in a physical store.  There is no attempt by Steam to limit the number of computers the game is installed on, no activation limit, or anything of the sort.  My Steam account has active copies of games I only bought once on 7 PCs right now, and I can play them “offline” on all 7 at the same time.  Although I’m not doing anything illegal (they are all my PCs and I don’t use them at the same time), this is exactly the type of thing DRM exists to prevent.  While there are games on Steam that require an online connection, or use 3rd party DRM, Steam itself is not the reason.
So why does Steam have a launcher at all, and why is it required?  Turns out it isn't.  Valve offers a DLL to developers that helps them do a number of things, mostly related to updating games and adding support for the Valve servers and Steam Workshop.  It’s an investment on their part; they know an old game is more likely to sell if it is updated to include modern resolutions and supports a game pad, and they make more money the more copies sell, but they also know most developers are not going to put the time and money into writing that code themselves.  Games that require this code require the launcher**.  Games that don’t utilize this code or other online features don’t require the launcher to run.
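If you're curious what "requiring steam_api.dll" boils down to, here's a rough illustration in Python (SteamAPI_Init and SteamAPI_Shutdown are real exports of that DLL, but treat the rest as a sketch, not how a shipped game is actually built):

```python
import ctypes

# steam_api.dll is the small Steamworks stub a game ships beside its exe.
# SteamAPI_Init() looks for a running Steam client and reports failure
# if it can't find one.
steam = ctypes.CDLL("./steam_api.dll")
steam.SteamAPI_Init.restype = ctypes.c_bool

if steam.SteamAPI_Init():
    print("Steam client found: updates, Workshop, and servers available.")
    steam.SteamAPI_Shutdown()
else:
    print("No Steam client running.")

# A game that never loads the DLL never performs this check at all,
# which is why such games run fine without the launcher present.
```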
The Final Word
DRM is a horrible thing, and we are all right to oppose it.  We also need to be mindful of the message we send in opposition.  While it’s reasonable for us to reject any reduction in the rights we have when buying software, it’s also reasonable for the companies we are buying from to do what they can to prevent theft.  Steam is a digital distribution service that does what it can to keep people from stealing games, but does nothing to manage digital rights.  It’s not perfect, but it’s the best of two imperfect worlds.  By opposing it we are telling the music and gaming industries they are right: we are all thieves, and we want to be able to steal things.  We should stop doing that.
*And more so.  For the last 10 years EULAs have been used increasingly to challenge long-held rights, like the first sale doctrine in the US, or force users to forfeit them, with overwhelming success.  While most of the blame lies with the music and motion picture industries, game companies are more than happy to take advantage of the latest legal wins.
**They require steam_api.dll to be loaded.  Although this is possible to do without the launcher present, for the sake of the average user this is a fair statement.

Thursday 21 August 2014

The Resolution Wars - We all lose.

It seems we can't go longer than 24 hours without a new article asking if a new game is 1080p or not, and what that means for the game or the system it's on.  And when a game is 1080p, we talk about frame rate, up-scaling vs. native, and whatever other numbers we can think of.  But why?  Is it really important?  Do more pixels make a better game?  I tried to examine this trend in the gaming industry as objectively as possible, without making it about the consoles themselves.  This is what I found.
Smoke and Mirrors
What’s the name of the regulating body that tests a game when it claims to be “1080p, 60FPS”?  What’s the agreed upon standard for how much of the game needs to be at that resolution and frame rate?  What’s to prevent someone from saying a game is 1080p/60 because a pre-rendered cut scene is in the game at that resolution, or because at one point there are two 1920x1080 frames drawn 1/60th of a second apart, even if it never happens again?  Nothing at all, obviously.  As with any unregulated marketing buzzword, the more we tell publishers that saying a game is 1080p is more important than any aspect of the game itself, the more we are going to encourage some shady marketing.  Sony is already facing a lawsuit (which recently got the green light to proceed as a class action) over the use of “temporal reprojection”, a tactic to make a game look like it’s producing a solid 1080p image when it’s really not.  A lawsuit they will almost certainly win by simply arguing what I said above: 1080p is a meaningless claim they are under no obligation to deliver on.  Diablo’s console port takes this creative resolution labeling to the next level, as the game was built from the ground up to be able to claim 1080p/60 without ever having to draw 1920x1080 pixels 60 times a second.  Instead, the game renders the camera control image at 1080p/60 independent of the actual game play.  This allows the game to effectively drop frames without slowing down.  It also prioritizes drawing the pixels closer to the player’s focus (the character) and will sometimes leave the sides of the screen un-refreshed rather than dropping frames completely.  I’m not calling Blizzard out as being deceptive for this, and these tricks are used fairly infrequently in the final build of the game, but it’s clear they devoted a significant amount of time, money, and tech wizardry to ensuring the 1080p claim above any other priority.  And was it worth it?  The answer looks to be a definitive no.
We live in a world of day one patches, which sometimes gives us a window into the game that could have been.  For Diablo, both a 900p and a 1080p version of the game were available to the media and review sites for almost a month before the day one patch upped the resolution (I’m deliberately ignoring the fact they were on different consoles, because that’s not what this is about).  The 1080p version saw game play frame rates drop as low as 40FPS along with notable visual hiccups, while the 900p version ran a flawless 60FPS.  Upon release, a patch brought both versions to 1080p and significantly reduced the frame rate problems at that resolution ... but didn't eliminate them.  From an objective, numbers-to-numbers standpoint, the 900p game simply ran better.  So why aren't we playing that?  Why didn't the day one patch LOWER the resolution of the flawed 1080p version to the flawless 900p resolution, instead of focusing on fixing as much as possible at the higher resolution?  Microsoft and Sony knew that would never cut it.  They need to listen to the players, and the players didn't want the game in a lower resolution simply because it was better.  We want the game in the resolution that we've made our holy grail.  And why wouldn't we?  The 1080p experience is visually superior to 900p, and that’s a good thing.  Unfortunately for that argument, almost every site that had access to both the 900p and 1080p games says there is no noticeable difference in visuals, with a few even saying the 900p game looked better because of the way the tech wizardry I mentioned earlier interacted with dense particle effects and busy fights.  Some did their best to point out subtle differences that justified the focus on resolution, but no one talked of dramatic, earth shattering differences.  This shouldn't surprise us, however; we've known it for years.
Meaningless Numbers
1080p is shorthand for “one thousand and eighty lines of resolution, scanned progressively”, and based off that alone we already have a problem.  Progressive refers to the way a television creates an image by drawing every line when it refreshes, as opposed to “interlacing”, which only draws half the lines every refresh.  It’s almost meaningless in discussing next gen console resolution, as DX11 and the Xbox One don’t support interlaced output (the PS4 has a 1080i option, but it simply exports the image at 720p).  What we are really talking about is 1920x1080 resolution, which seems a lot less "next gen" and important given that's the resolution 90% of our PC monitors have been running at for the last 5 years.  Yet we'll continue to say 1080p even though the "p" is meaningless and we are all more familiar with just the numeric resolution, because the buzzword is far more important than the reality.  It's an attempt to make this number seem like a magic sweet spot where things are either 1080p or horrible.  To the shock of absolutely no one, all the evidence we have says otherwise.
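Since "1080p" is ultimately just a pixel count, the actual numbers being argued over are worth a look (a quick Python sketch; the resolutions are the standard ones):

```python
# The pixel counts behind the buzzwords, relative to full 1920x1080.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / base:.0%} of 1080p)")

# 720p:  1280x720  =   921,600 pixels (44% of 1080p)
# 900p:  1600x900  = 1,440,000 pixels (69% of 1080p)
# 1080p: 1920x1080 = 2,073,600 pixels (100% of 1080p)
```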
The most popular study on the benefits of higher resolution is this handy chart, compiled and sponsored by the manufacturers of high resolution TVs, a group that would have no reason whatsoever to mislead you.
[Chart: recommended viewing distances at which resolution differences become visible, by screen size]
Even they show restraint, noting that it would be difficult to see the difference between 720p and 1080p (let alone 900p and 1080p) if you are further than 7 feet away from your TV, or it’s smaller than 50 inches.  Blind tests of this theory are almost impossible to find, with PC Magazine and Consumer Reports being the only two major media publications I could find that put people in a room with unlabeled TVs at different resolutions and asked them to pick which one looked better.  In both cases, the result was statistically insignificant*, with only 56% and 54% of the group correctly identifying the 1080p TV as having a higher resolution.  Blind tests of screen shots on blogs are more common (I’ve even run one myself) and time and time again they give us the same message: the results are no better than guessing.  More importantly, when a control is added (some people are given screenshots that are in fact identical), the number of people who report a difference vs. the number who don’t doesn't change**.  In a phrase I never thought I would hear myself saying, it looks like web comments might give us all the insight we need into why this is happening.  On gaming sites, where the comments are generally defending why the console someone bought is better than the one they didn't, the vast majority say they can easily see a difference.  On TV websites, where the comments are generally defending the low cost TV someone bought instead of a higher priced alternative, the vast majority say they can’t see any difference at all.  The bottom line is that for most people in most situations the difference between 720p and 1080p is at best negligible and most likely simply confirmation bias.  This isn't to say there is no difference.  Right now my face is a few inches from my monitor and I’m sure I would notice if someone set my resolution to 800x600 while I wasn't looking.  However, the numbers tell us that in any given conversation, the guy telling you the difference is obvious is much more likely to be doing so based on bias than fact.  As damning as it looks, this isn't even the final nail in the resolution coffin.
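To put a number on "statistically insignificant": neither publication reported its sample size, so assume a hypothetical panel of 100 viewers.  The textbook check of whether 56 correct picks beats coin-flipping looks like this:

```python
from math import comb

def p_value(successes, n, p=0.5):
    """One-sided binomial p-value: the chance of at least this many
    correct picks if every viewer were guessing at random."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(successes, n + 1))

# 56 of a (hypothetical) 100 viewers pick the 1080p set correctly.
print(f"p-value: {p_value(56, 100):.3f}")  # ~0.136
# Well above the usual 0.05 cutoff: indistinguishable from guessing.
```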
1080p is a measure of graphical density, not graphical fidelity.  I could use 2,073,600 pixels (1920x1080) to draw a stick figure, but few people would say it was as visually pleasing or artistically sound as a 921,600 pixel (1280x720) image of the Mona Lisa.  In video games, the developer gets to make this choice based on what they think is important.  You could make a simple-looking game run fantastically on any hardware, or a great looking game by lowering the resolution or frame rate.  For high end, AAA games where we demand photo-realism, a developer can still “dumb down” the fidelity in subtle ways to hit 1080p/60 rather than focusing on making games that actually look better and push the limits of the hardware.  This should be obvious to anyone who’s ever used a computer.  Games run at the same resolution as your desktop, and you pick the frame rate by playing around with graphical options.  The Wii U has a wide range of 1080p/60 games despite being vastly inferior to the Xbox One or PS4 from a raw power perspective, so even people who game exclusively on console should understand this point.  Yet gaming sites are constantly reporting the resolution of minimalist games like The Binding of Isaac or Spelunky without any context or explanation, when not only would these games look identical at lower resolutions (due to using low resolution textures) but they are so visually simple they could achieve 60FPS on a 386SX-66***.  In doing so they are sending a clear message: we don’t care about HOW games look.  We just care about the number of pixels.  I don’t see how this can possibly end well for us.
The Final Word
In 2016, Nintendo will release a console that reproduces 8-bit Nintendo games in 4K resolution at 240 frames a second, and will declare itself the winner of the console wars.  Next year, Sony will redesign the PS4 game package to include “1080P” in 800 point Rockwell Extra Bold at the top, with the game’s title in superscript.  At E3 2015 Microsoft will show off “Halo EXTREME”, a re-skinned port of the original “Quake” running at 4K with the tag line “It’s got more pixels than Killzone, shut up and buy it”.  Soon trailers will just be voice-over against a black background, while frame rate charts and pixel counts are shown in place of game play.  This is the world we seem to be creating, a world where being able to say the graphics from the console you bought or the game you developed are objectively bigger than others is far more important than how anything actually looks.  We need to take a step back, and stop reporting on games to appeal to the vocal minority that cares about what system a game is on more than the game itself, because that's all this is about.  The Wii, one of the most successful consoles of all time, didn't even offer HD output because it played all games at 480p (a resolution you might remember from DVDs).  Not once in all the years it co-existed beside the 360 and PS3 do I remember a single article highlighting 720p resolution as a feature, a benefit, or a reason that a game was “better” on a different console.  We still had flame wars back then, but we compared the features of the Xbox 360 with the features of the PS3, talked about the value of 1st and 3rd party exclusives, and generally focused on how to improve game-play overall by trying to prove our system had the better games (and better ways to play them).  I never thought I would look back at this as the "good old days", but a spade's a spade.  Back then, the gamer community was unified in a message that we needed more innovation, more investment in studios, and better online play.  In return, we got franchises like Uncharted, Gears of War, and Killzone, as MS and Sony both spent millions investing in the studios they thought would put them ahead.  Today we get 1080p ports of games you can get for $1.99 on Steam.
We are sending the wrong message, and we need to stop.  We need to go tell developers we don’t want games that are 1080p if that resolution comes at too high a price, and the compromise in Diablo should be a rallying cry.  We want games that are fantastic to play, have great frame rates, and look beautiful.  If that means a lower resolution, that’s fine.  All games start to look better as the console generation gets older anyways, and winning the "resolution war" will be a shallow and short-lived victory regardless.  But for now, I guess I'll go back to playing a game made deliberately worse to make fanboys happy.  This would make me sad but ... hey ... it’s Diablo.  It’s still freaking awesome.
*The number of people who correctly identify 1080p as being higher resolution is no higher than what we should expect if we used a random method, like drawing from a hat.
**Only 20% of people will report they see no difference between the two images, even when there is actually no difference between the images.  In other words, 80% of people will report there is a difference when there isn’t one.  This is called confirmation bias, the human tendency to report any “result” over reporting no result.
***Citation needed

Saturday 2 August 2014

Against the Flow: July

At the end of the month I like to take a look at what the industry is talking about and see where I'm going against the flow.  These are opinions that seem to be prevalent in the industry with both fans and insiders but that I just can't get my head around, important questions that are not being asked, or positions that run counter to what everyone else is saying.  This month, I'm going to skip how I disagree with pretty much everything anyone is saying about DOTA and leave that to two articles I plan on posting later this month.  As for the rest of the "big news" ...
I don’t understand why we care about “PlayStation Now”
I've already posted a breakdown of the pricing and will be recapping that, and the marketing Sony is doing, later this month, but on a much simpler level I don’t understand why any “gamer” cares in the slightest about the service.  It’s a streaming service, and as such, requires a constant internet connection to use.  If I remember anything from the months that followed E3 2013, it’s that no gamer alive would EVER use a service that required a constant online connection, regardless of the benefits it offered.  This became the defining ideology of the PlayStation nation and nothing short of a battle cry; anyone who would put up with an always online requirement to play games was an idiot, and any company that would ask gamers to do that was worse than Hitler.  There are hundreds of thousands of comments on tens of thousands of websites archiving this universal belief … but now being online 24/7 is all hunky-dory?  What happened to the “what if my internet is down” arguments?  With PSNow an outage not only means you can’t play, but that you lose the money you paid for the timed rental.  What happened to the arguments about internet quality?  Without fast internet, PSNow is a horrible service, and when we were talking about an online check it sounded like everyone's internet was so slow and unreliable it would be impossible to send a few megabytes every month or so without major headaches.  Yes, it’s clear that what Microsoft was doing with aggressive DRM was different (and subjectively much more “evil”) than a service with a technical requirement to be online all the time … but the arguments, if valid for one, are valid for both.  Unless everyone, including top tier media sites, was just making up objections to make something look worse than it was and feed the console wars to drive up clicks … but that would never happen.  In fact, the gaming community should find so little value in this service that I’ll even take it a step further …
I don’t understand why “PlayStation Now” is being offered on PlayStation.
With the exception of Colin, who would respond to Sony forcing you to kill your first-born child to continue using PlayStation products with an article about how wonderful it was that Sony was doing its part to deal with overcrowding, there is not a lot of positive being said about this service.  And that’s sad … because not only is it great, it’s great in the “big f’ing deal” kind of way, and might just be the most significant thing to happen in the gaming industry in years.  The next time I tell my non-gamer friends about a game like “The Last of Us”, which has universal appeal well beyond its strength as a game, they’ll have an option to try it out without having to invest $300 in a new system.  With PlayStation Now, they can just stream it to a tablet or Web TV.  PlayStation Now isn’t only going to make Sony a lot of money on its own; it’s a window to show the non-gamer how much gaming has evolved, and how the media can be enjoyed by anyone.  It’s going to solidify in the minds of millions the idea that Sony PlayStation is where they should go to become “new gamers”, a strategy that worked out rather well when Nintendo tried it with the Wii.  Microsoft has no counter; they tried to do the same thing by making the Xbox One the “living room device” and offering TV functionality and original programming, but failed miserably.  PSNow is not only going to be a huge win for Sony, it’s going to beat Microsoft at their own game … or at least it would have.  Instead, by releasing on PS3 and PS4 before showing it to the non-gaming world, the perception of an overpriced service that’s WORSE than just buying games is going to be so prominent in the media that by the time it gets into the hands of the general population it will already be a bust.
For 80% of the population, trying the one game a year you’re going to be interested in on PSNow is a much better option than investing in hardware, but Sony asked the other 20% to test it out for them.  Not learning the lesson they so brutally taught Microsoft just last year (most non-gamers still think the Xbox One requires an online connection even to this day), Sony is going to be stuck with the label the internet gives PSNow in its infancy, and it's not looking good.  Objectively, a low cost alternative to console ownership for people who are only going to play a few games a year is a very nice thing, and the internet can’t have nice things.  Especially when they are subscription services.  On that note …
I think EA’s subscription service is fantastic (even without games)
When EA announced they would be offering a $5 a month or $30 a year service on the Xbox One that would come with 10% off all games and DLC from EA, up to 7 days of “early access” to EA video games (without having to pay for them), and a small collection of free games, it was met with cautious optimism.  Given the internet’s relationship with EA (voted worst company in America two years in a row), cautious optimism is pretty darn good.  The uncertainty everyone seems to come back to is what games will be included and how they will work (will they expire and only be playable for limited periods of time).  To that I counter … who cares?  Just doing some simple math, with 10% off every purchase, a $30 a year service pays for itself as soon as you buy $300 worth of EA games; see the sketch below.  That might seem like a lot, but it’s a perfectly reasonable amount for anyone into gaming enough to pick up a few top tier EA titles a year.  More importantly, up to 7 days to try out FULL RETAIL COPIES of new games before they are released, with progress carrying over to the game if you buy it, has so much value-add that it could be the whole service and I would still pick it up day one.  Games, like movies, are all about being the first to experience them … and you already know that, internet.  When Sony said PS4 owners would get to play the BETA of Destiny a few days early it was the death of the Xbox One, and any time a game offers a “limited edition” that comes with a day or two of early access for $20 or so, it sells out.  This isn't a new thing.  We want early access, it’s important and newsworthy every time it’s offered, and we are willing to pay crazy amounts of money for it in the rare cases it’s offered.
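The back-of-the-napkin version, if you want to play with the numbers yourself (subscription price and discount from the announcement; the yearly spend is hypothetical):

```python
# EA Access math: $30/year subscription, 10% off every EA purchase.
SUBSCRIPTION = 30
DISCOUNT = 0.10

for spend in (100, 200, 300, 400, 500):
    net = spend * DISCOUNT - SUBSCRIPTION
    print(f"${spend} spent on EA games: ${net:+.0f} vs. not subscribing")

# $300 a year (roughly five full-priced games) is the break-even point.
```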
EA needs to rebrand this service “EA Early Access” and call the other two features a bonus.  Maybe then we'll be able to see how fantastic it really is (and maybe Sony will let us decide for ourselves if we want it).
Did I miss anything?  What did YOU notice this month that everyone else seemed to miss?  Let me know in the comments!