
Shadow of War and Why Buying Games on Day One is Losing Its Charm

What’s the one thing Star Wars Battlefront II and Middle-earth: Shadow of War have in common? You know, aside from the fact that they’re both AAA sequels from two of the industry’s biggest publishers, Electronic Arts and Warner Bros. Interactive, respectively?

Microtransactions, of course! Though in saying that, unless you’ve had hands-on experience with either title (or both!), you’d be forgiven for losing track of how Shadow of War and Battlefront II handle in-game purchases.

For one, DICE’s shooter sequel drew the ire of the video game community back in October, when beta testers discovered that loot boxes, which yielded randomized materials and Star Cards for classes and hero characters, were the lifeblood of Battlefront II’s progression system – a system that appeared to actively encourage players to purchase crates with real money in order to get a leg up on the competition.

A Tricky Balancing Act

For instance, at one stage Darth Vader cost a whopping 60,000 credits to unlock, which equates to around 40 hours of game time for those unable or unwilling to shell out real-life money for an in-game boost. But that all changed soon after release, when EA slashed the price of its prime heroes by 75 percent, and it wasn’t until March 21 that microtransactions returned – albeit alongside a progression system that is markedly different from the one that shipped with Battlefront II late last year.

So much so, in fact, that real-world money can now only be used to purchase purely cosmetic items – not unlike Overwatch before it. Middle-earth: Shadow of War is poised to execute a similar reversal early next month: Monolith and Warner Bros. plan to remove the ability to spend real cash on gold on May 8, before the somewhat contentious Market closes its doors in July, taking all gold and War Chests with it.

And that’s just it: at a fundamental level, both Battlefront II and Shadow of War will soon be totally different from the base games that launched back in 2017, which raises the question: is it better to wait until a big-budget AAA game has found its feet before taking the plunge?

Is There Such a Thing As a Happy Medium?

There are exceptions to the rule, of course; Sony Santa Monica’s God of War is currently riding the crest of a wave ahead of its PS4 launch next week, and is unlikely to be affected by any major issues on day one beyond some performance niggles. That means no microtransactions, no loot boxes, and definitely no issues connecting to the Internet, as Sony’s franchise refresh has been built from the ground up to be a single-player experience. And an epic one at that.

So perhaps it’s best to judge each title on a case-by-case basis. For solo adventures like God of War, it’s fair to say that early adopters can feel quite confident paying full price for their shiny new purchase, whereas a title touting in-game purchases, like Shadow of War, calls for caution.

For the sake of perspective, here’s how Monolith explained its rationale behind the high-profile U-turn:

It allows you to bypass the awesome player stories you would have otherwise created, and it compromises those same stories even if you don’t buy anything. Simply being aware that they are available for purchase reduces the immersion in the world and takes away from the challenge of building your personal army and your fortresses.


To be fair, Battlefront II and Shadow of War are by no means the only offenders; in recent years, we’ve seen everything from Final Fantasy XV to NBA 2K18 bend to a similar model, during which time the whole “games as a service” mantra has become the trendy thing to hate.

Extending the life expectancy of a game is one thing, but not when it risks exploiting those who queued up to play said title on day one – literally or figuratively – only to be left feeling like a glorified beta tester. Or worse, ripped off, because a particularly aggressive microtransaction system (read: Battlefront II) spoils the core gameplay experience.

The Age-Old Question

Prior to EA’s dramatic upheaval, early adopters of the DICE sequel were forced to persevere in a digital arena devoid of balance and fairness. Sure, loot boxes could be earned through in-game credits, but when another player’s health regenerated faster than your own – or they were able to inflict more damage simply because they’d spent real money – it became all too easy to justify some in-game spending of your own.

But frankly, after dumping hours into both Battlefront II and Shadow of War soon after release, I’ve lost all desire to pay hand over fist for an experience that could be decidedly different, and likely more complete, five or ten months down the line. And Star Wars Battlefront II is perhaps the textbook example of a game that limped out of the gate, only to receive a second wind thanks to an impassioned response from the community.

This is completely unheard of when it comes to Nintendo, whose products typically adhere to an old school philosophy that deems a game ought to be of the highest standard before it’s released into the wild. If only other studios would adopt a similar approach.

Buying games on day one has been, and always will be, a major draw, both for the developers toiling away behind the pixels and those brave consumers willing to slap down $60 at their local brick-and-mortar store. Or on the PlayStation Store, provided you can get the damn thing to work.

Point is, early adopters help drive revenue for video game studios, particularly in the cutthroat business of AAA experiences, where 1-2 million sales can be the difference between “Game X” receiving post-launch support or a potential sequel and being shelved entirely, as was the case with Visceral’s Dead Space series prior to the studio’s closure. Oh Dead Space, how I miss you so.

It’s for this reason that microtransactions and in-game purchases are here to stay, as developers look to mitigate development costs by any means necessary. The question, really, is how they’re implemented.

The views expressed in this article are those of the author and don’t necessarily represent PSLS as a whole.
