August 21, 2015

Patching our way to lower quality?

![Patching our way to lower software quality? - title slide](/content/images/2015/08/patchingtitle.png)

Well, that was a blast! I've just finished touring (two venues counts as a tour, right?) my presentation "Patching our way to lower software quality?", and - in an interesting last-minute twist - the kind folks in Brighton videoed my talk, meaning it'll be winging its way onto the internet in the near future! Update: The video is now available on The Dojo!

You can view the slides online, but the nature of the presentation (glossy image prompts only) means that it won't make a lot of sense without context. For those people who (like me) can't stand the thought of watching 45 minutes of me on video, I've summarised the talk below.

Digging up the past...

In April 2014, a group of archaeologists and games enthusiasts met in the New Mexico desert to tackle one of the biggest urban myths in the history of gaming - and found it to be true. During the home gaming crash of 1983, Atari were left with so much unsellable stock that they literally buried it in the ground.

In the early eighties, Atari were riding high. On the back of a string of successful arcade-to-console conversions (Space Invaders, Pac-Man, Defender et al.), the company made a brash move which signalled its intent: it paid $25m for the rights to make a tie-in game for the film E.T.: The Extra-Terrestrial. Eager to recoup this investment quickly, Atari doubled down, aiming to have five million cartridges on store shelves in time for the Christmas rush. The problem was that this left only five weeks to make the game.

Unsurprisingly, the game was a car-crash. It frequently appears near the top of "Worst Game of All Time" lists, though this is perhaps more because of its notoriety than its actual quality. Most of the game's clunkiness boiled down to one key problem: dubious collision detection, which caused ET to fall into traps too easily, creating a frustrating gameplay experience. But it was too late: five million copies were made, five million copies were shipped, and only 1.5 million were ever sold. Worse still, people were actually returning the game because they didn't like it - a bitter blow for such a high-profile release.

ET wasn't the sole reason for the collapse of Atari. The ease with which so-called "bedroom programmers" could produce games for home consoles was a double-edged sword, as it led to the market becoming flooded with low-quality imitations of popular games, eroding consumer confidence while the arcades continued to flourish.

Fast-forward to the mid-nineties, and very little had changed: as I explained in detail in my article about Frontier: First Encounters, the graphics were getting better, but the same business problems remained. Commercial pressures led to games being shipped before they were ready, with little ability to correct mistakes post-release. Patches were now at least possible, though: this was an era of customer service hotlines delivering floppy-disk patches through the mail, and magazine cover-mounted CDs allowing patches to be delivered proactively to the front line.

Some companies, afraid of making an uncorrectable mis-step, instead chose to delay their release until they were completely happy with their game. Some went too far. One of the most notorious is the aptly-named Duke Nukem Forever: originally previewed in 1997, it finally surfaced in 2011 (and only after the original team were bankrupted and their IP sold to the highest bidder). The main problem was the speed of graphical improvements in the late nineties; the team at 3D Realms went through several cycles of beginning development using the hottest graphics engine, showcasing a trailer, and then throwing away their work when an even hotter graphics engine came to market. After a few years, they stopped projecting release dates, and insisted the game would be "done when it's done": an admirable aspiration of quality, but it was eventually inscribed onto 3D Realms' tombstone.

What we needed was a high-speed, mainstream method to deliver updated content to the masses at negligible cost. It arrived, and it was named The Internet.

The patching superhighway

Things didn't change overnight with the arrival of the world wide web. Dial-up was slow and inconvenient; having to unplug your phone to use your modem (and hope that nobody tried to call while you were on the internet) meant that it wasn't until the days of broadband that patching became the norm. (Mobile phones continue to lag behind; you can get 4G in many cities if you're prepared to pay, but even 3G isn't as prevalent as many would like. And if you're roaming overseas, you'd better have deep pockets or access to a Starbucks.)

Once companies became comfortable releasing 100MB patches rather than 10MB or 1MB, many new opportunities opened up. Patches didn't have to be restricted to show-stoppers any more. New features could be patched in post-release, and sticky situations (such as the GTA Hot Coffee controversy) could be side-stepped with a targeted patch. Fast-forward to the present, when even a 1GB patch isn't a big deal, and games are much more like living entities: console shooter Destiny is currently replacing all of Peter Dinklage's dialogue with a new performance by Nolan North, after the team were unable to get Dinklage to record any new content for their forthcoming update. Such sweeping changes would have been unthinkable even 10-15 years ago.

This can be a challenging time for those who are still on capped-bandwidth data plans, whether on their home internet or their mobiles. Large updates are issued as if they're no big deal, and many new games receive what's known as a "Day One Patch", which allows for last-minute bugfixes and feature additions made after discs have been sent to manufacturing but before release day. The recent Borderlands collection had a 16GB day one patch; new Wii U owners are subjected to a patch which can take several hours to download - particularly frustrating if you're a child who's just unwrapped a shiny new console on Christmas Day.

Update volume can be a problem, but so too can frequency. Mobile apps which update at semi-regular intervals can be the sign of a healthy development team, but update too often and your customers (particularly your less-engaged customers) can be pushed away. ("Another update? I only use this once a month, it's not worth the constant bother.")

With so many games and apps to update, consumers tend to fall back on the magic "Update All" button, failing to notice (or care about) the new features unless they're negatively impacted by them. Worse, if you look at the "Recent changes" section of many apps, it often simply says "What's new: Bug fixes". It's surprising that there isn't more consumer demand for openness here, though I suspect it's only going to take one major breach (that app which asks for a few too many permissions, and then turns itself malicious on a time-bombed date) before there's a call for clarity in patches.

Customers can also become riled when you remove features with a patch. This is particularly challenging when you're dealing with consumer electronics: in the pre-connected age, the device you bought would continue to function in the same way until the day it died. These days, features can be easily removed or deprecated on a business whim. For example, people who bought a first or second-generation Apple TV were able to watch YouTube through their TV, but Apple decided to remove the YouTube channel from these devices rather than convert it to support the latest YouTube API version.

Another annoyance is when features are added which are perceived to have more business benefit than consumer benefit. In recent years, Diablo 3 and SimCity have both aggravated PC gamers by requiring an internet connection even for their single-player game modes, compounded by flaky game servers during launch week which prevented players from being able to access the game that they'd purchased.

And sometimes, customers are just too comfortable with their current lot. Microsoft discovered this in the worst possible way when they announced their Xbox One console at the E3 2013 games conference. Its plans for a focus on digital downloads and disc-less gameplay were similar to what PC users have been happy with for years, but the implied loss of other features (an inability to buy or sell second-hand games, or lend games to friends) was too much to bear for Xbox fans, who rebelled and forced a rethink. (And not just for Microsoft - it's rumoured that Sony were to announce very similar features for the Playstation 4 the following day, but instead flipped their position to take advantage of the backlash against Microsoft.)

The current state of play

Thirty years after ET, has there been a noticeable shift in quality? If we set aside the obvious technological improvements (such as graphics), the answer seems to be: not really. The recent Lego Jurassic World tie-in game was criticised by some reviewers for being bug-ridden and rushed to release alongside the movie, exactly as Atari did all those years ago. The main difference now is that if companies want to do something to fix this, patching gives them the ability to do so.

But it can be something of a paradox. The ongoing saga of Batman: Arkham Knight (the PC version being pulled from shelves shortly after release due to massive performance problems) is the perfect example of this. On the one hand, it's great that Warner Brothers are going to be able to address the issues and release a (sizable) patch straight to the desktops of affected gamers. But on the other hand, if we were still in the 80s or 90s, would the company have been forced to adopt a "right-first-time" approach? Indeed, there are plenty of conspiracy theories (apparently reinforced by sources within WB) suggesting that the company were well aware of the PC version's problems, but chose to release-then-withdraw rather than delay the release, presumably to technically fulfil some contractual obligation.

As testers, we may feel largely powerless about this. (This interview with one of Arkham Knight's testers is a very literal example.) These are largely business decisions, enacted by higher powers, where a lone voice would struggle to make their opposition heard. However, as individuals, we can still strive to keep our team honest and customer-focused, and I propose three simple methods to help maintain this focus.

Three tips to avoid being a slave to the patch culture

#1: Know your MVP (no, really)

If you're not familiar with the concept of a Minimum Viable Product, now's a really good time to learn more, particularly if you're working with an organisation which has agile or lean aspirations. Unlike traditional definitions of iterative development (which focus solely on delivering self-contained units of work within each iteration), the V of MVP says that we should focus on delivering something that's viable (or, as Dan Ashby posted today, valuable) with each iteration. This is a great way of eliminating waste and reducing time-to-market.

MVP can work well, but it's important that you really know what you're building, and so do your stakeholders. In my slides, I showed the classic Spotify MVP example (it's echoed in Dan's blog post above) of building a car in iterative vs MVP development:

![MVP at Spotify](/content/images/2015/08/mvp.png)

Although this neatly illustrates one of the core concepts behind MVP, it aggravates me whenever I see it, because it can lead to some dangerous assumptions. Firstly, it's important that your stakeholders understand the approach that you're taking: if they've asked you for a vehicle which can get four people across the country, and you deliver a skateboard and a scooter in your first two iterations, I don't think you'd get the smiley faces that the illustration suggests! Secondly, if you've misunderstood the original requirements - for instance, you actually need to get 40 people across the country - this would reveal itself very early on in the first example, whereas it might not be realised until too late in the second ("what do you mean there's no iteration 6??").

The example I gave here was Grand Theft Auto V: a massively complex game which boldly chose not to launch all of its features on day one. At launch, only the single-player experience was made available, with the team focusing on a successful release; two weeks later, a solid multiplayer mode was launched; and later still, the mission-based multiplayer mode ("Heists") arrived to much acclaim.

#2: Know your mission

Well-oiled teams will usually have a lot of information at their fingertips. Sometimes, though, this can be the wrong information. It can be too vague (for instance, a list of features which are targeted for the next release) or too granular (such as the detail of a particular user story). There's often a vacuum in the middle, which it's good to fill with a "mission".

Your mission is the reason that your team is doing what it's doing. What does success look like? What tangible benefit will users feel? It's useful to agree upon this mission and display it prominently, keeping it brief enough to be understood at a glance, so that you can review from moment to moment whether you're focused on a valuable activity. There's a great example in the opening credits of Star Trek:

  • To explore strange new worlds
  • To seek out new life and new civilizations
  • To boldly go where no one has gone before

One big benefit of having a "mission" is that it makes it easy to recognise when you're "off-mission". Going off-mission is okay, if you know you're doing it, and you know when you're going to get back to it. This makes it relatively easy to incorporate learning opportunities and investigations into your everyday work: if it's going to deliver benefit down the line, and it's not jeopardising your main mission, it's valuable.

The example I gave for this was the mobile game Crossy Road. Unlike most of the latest mobile games, which are built around monetisation (forcing players to part with cash if they want to play more than X times a day, or if they want to speed-up the unlocking of features), Crossy Road is a fun game at heart, which never throws up a paywall. There are paid-for options available, but these primarily consist of additional characters which offer only aesthetic differences; players are able to unlock these characters through standard gameplay, and by using in-game currency which is gifted liberally to players throughout the game. As a result, despite not pushing its premium elements front-and-centre, the game was a $10m smash hit within its first three months.

#3: Fix what "matters"

In agile development we're constantly reviewing whether a newly-discovered problem should be addressed "now" or "later". Never underestimate the significance of classifying something as "later". Your "later" issues will be forever doomed to sit beneath future incoming high-priority fixes, and today's "later" is often tomorrow's "never".

There are some (and I'm increasingly swayed towards this view) who'll say that if it's not important enough to fix now, it's probably never going to be important enough, and there's little benefit in recording the bug at all. Whether this would work for your team is a decision that only your team can make; if you're working in tightly restricted or regulated environments, this approach could be wholly inappropriate.

My current team doesn't have a bug repository at all (although it's important to note that we're not working with released production code yet). When an issue occurs, we get team consensus on whether it's an important problem, and if so then we fix it - now (or as soon as possible).

As for deciding what "matters"? Again, this comes back to delivering value or user benefit. A fix or change which will deliver noticeable benefit to a large number of users will normally be more important than a cosmetic change. Depending on your product, you may be able to gain insight through user studies/interviews, or via in-application metrics. The example I gave here was the game Civilization V, which was actively supported for over four years after release, with small gameplay tweaks to address both perceived imbalances (e.g. users complaining that Unit X was too powerful) and observed imbalances (e.g. reviewing thousands of gameplay logs and noticing imbalances in the raw data).
