The Death of the Day One Patch
On Monday, CD Projekt Red announced that The Witcher 3 has been delayed until May 2015. That’s the third release window for the game since it was announced. The original 2014 window was narrowed down to Fall 2014, then revised to February 2015 and, most recently, to May 19, 2015.
At one point in time, a second delay would have been met with torches and pitchforks by the internet gaming community, especially for a game as hotly anticipated as The Witcher 3. However, reaction to the move has been mostly favourable.
After being burned too many times in a row by games that are broken or technically faulty at launch, gamers are starting to say enough is enough with the now-standard practice of the day one patch and are looking for a return to the good old days when games were finished when they were released.
Like many gamers, I come from a gaming background where patching didn’t really exist. Sure, I could get patches for some PC games, but they weren’t necessary, let alone forced on you. I came from an era of console gaming where you didn’t connect your console to the internet. The game you picked up on cartridge or disc was the game you were going to play, and there was nothing that could be done to change that.
With the last console generation, we saw a massive shift in how consoles worked. While there was some online multiplayer and content in the PS2/Xbox generation, online wasn’t really pushed as a core feature of consoles until the PS3/360 generation. Until then, online was the domain of the PC. Consoles picked up not only the popularity of online multiplayer from the PC but also the use of patch and update downloads for console games. Consoles went from pop in a disc and go to pop in a disc, install it to your hard drive, patch it and, some ten minutes after taking it out of the case, finally play.
Patches should be a good thing for gaming. After all, all the QA work in the world won’t find every bug or every optimization trick for a PC or console, so it would be reasonable to think that patches would make gaming better because the games themselves would become better. While it’s true that patches are a good addition across the whole spectrum of platforms, they haven’t led to better games.
As we enter the holiday shopping season, publishers are bringing out their big guns to take advantage of the year’s biggest sales period. There’s an enormous amount of pressure on developers from their publishers to get games out in the holiday release window at all costs. The prevailing thought is that a delayed release would cost more money than a poor one. Unfortunately for gamers, this has resulted in a shift from the classic way that games were launched to an abuse of the new way that games are released.
If I were to list Assassin’s Creed Unity, Halo: The Master Chief Collection and Sonic Boom and ask what they all have in common, what would the answer be? Sure, they’re all big triple-A releases that their respective publishers are relying on to drive revenue from now through Christmas. More importantly, they’re buggier than the floor under the bed of a sleazy motel.
While the games vary in bugginess from aggravating to nearly broken, they all desperately need patches to get up to the standard of quality expected from a full-price game. It’s not even about making changes to turn them into good games; it’s patching to the point where they’re consistently playable. Ten years ago, developers would never have had a second chance to fix games to the point where they were playable. Gamers wouldn’t be hoping for a patch to make a game better; it would simply be a terrible game on a disc that critics and gamers would pan.
Now, we’re living in an age where publishers are forcing developers to abuse patches to make games functional after they hit store shelves, console manufacturers are raking in money from patch certification fees, and the press is becoming complicit, with the likes of IGN now electing to run updated reviews and Polygon seemingly changing review scores on a whim with patches factored in. While no game is ever going to be released 100% bug-free, there’s a difference between trying your best and clearly rushing to market.
The backlash from the launches of ACU, Halo: MCC and Sonic Boom shows that while gamers want gaming to advance with the more powerful hardware of next-gen consoles, they aren’t going to give up quality now for the possibility of quality later. As the old adage goes, you only get one chance to make a first impression. Many publishers are forgetting the power of that first impression on gamers and how long their memories are. Ubisoft, Microsoft and Sega are all making money now off their big releases but have put future sales at risk, as smart gamers will wait for reviews and sales before buying the sequels to those games.
What we’re seeing now is the death of the day one patch. It was once a nice-to-have, but its use as a need-to-have by developers and publishers is an abuse of the system. In the last month, we’ve seen the backlash come swiftly and grow significantly.
What the likes of Ubisoft, Microsoft and Sega are doing is creating the conditions for a return to the days when games shipped complete. No, patches aren’t going away, but shipping buggy games on disc that need patches to be functional is a practice that will quickly be forced to an end. Gamers have to stop pre-ordering games, reviewers have to stop forgiving major bugs and publishers have to stop abusing the system.
Posted on December 10, 2014, in Games, Long Read and tagged Business of Gaming, Patches, Ubisoft, WPLongForm. Bookmark the permalink. Leave a comment.