

Does the Right Level of Microtransaction Integration Exist?

Whether you’re a hardliner or more open to discussion, there is one overriding truth about microtransactions – they’re here to stay. These systems are simply too useful to developers and publishers for the market to ever abandon them. That said, not all forms of microtransaction integration are created equal. Some have been well received, while others are near-universally derided, but why does this happen, and how did we get here?

Opening the Floodgates

When looking back at the history of microtransactions in AAA gaming, one game stands out as an all-too-appropriate Trojan horse. That title was The Elder Scrolls IV: Oblivion, which hit shelves back in March of 2006. A huge step forward for the series, Oblivion introduced many controversial mechanics, the most hotly debated of which was the now legendary horse armor pack.

Still listed on the official Xbox marketplace website, this pack costs $2.50 for a new design to keep your trusty steed safe – or as safe as an animal can be in a game from the notoriously glitch-prone Bethesda. At the time, the gaming audience hadn’t seen this kind of paid add-on before, and viewing horse armor as content that should have shipped in the base game, players revolted. Bethesda, however, would not relent, and the pack set the stage for what was yet to come.

Inclusion and Exclusion

Developers who include microtransactions tend to claim that the systems are either easy to ignore or necessary, citing the rising cost of game development. Sometimes this argument holds water, but not always. If a game is otherwise free to play, for example, most players find it perfectly reasonable for the title to support itself with cosmetic-only microtransactions. Without this system, games like Fortnite and Dota 2 would have to fundamentally reshape their entire business strategy.

On the other hand, microtransactions added to titles where they were never necessary have been widely criticized. Star Wars Battlefront 2 is the most famous example, especially considering the game was always going to be a significant financial success. An attempt by an EA representative to defend the system on Reddit still sits as the most downvoted comment in the site’s history, with well over 600,000 downvotes.

Honesty and Fairness

When it comes to unacceptable microtransactions in video games, the most extreme are perhaps those labeled ‘pay-to-win’. These give players who put down money an advantage over those who don’t, and it’s here that Battlefront 2, on top of its full-priced release cost, flew too close to the sun.

The concept for publishers and developers is simple: if people want to win, they pay more money. It is akin to letting chess players pay to replace their pawns with rooks or queens, a move that breaks the balance of the game and ruins it for anyone without the cash to take part.

Another poorly regarded practice is on-disc DLC. This is where content already ships with the game you buy, but you have to pay extra to unlock it. Locking away finished content at launch is naturally seen as dishonest, so this is another case where distaste is near-universal. Just as dishonest are the games that launch without microtransactions, knowing they hurt review scores, only to add them after the reviews are in.

Ultimately, it is only cosmetic systems that players broadly accept, and even this comes with an asterisk. In full-priced games, paid cosmetics are still seen as greedy, and players take issue with some cosmetic systems in free-to-play games too. Poor implementations have poisoned the well to the point that the audience as a whole is wary of any game that includes microtransactions today.

All in all, no form of microtransaction will please everyone. There are, however, degrees of integration that players will find acceptable. Going too far risks dragging a game down and tarnishing its legacy, while overly timid monetization can leave money on the table. It’s a difficult balancing act, and one where CEOs and others disconnected from gaming culture can miss the forest for the trees. At least, in the age of patches, missteps can be addressed after launch, though for many players such changes come as too little, too late.