
Microtransactions and Subscriptions


For nearly two decades, video game publishers have experimented with new ways to squeeze revenue from players. Microtransactions, subscriptions, and battle passes have all been hailed as innovations at different points, but over time their flaws have become painfully obvious. While monetization itself isn’t inherently bad—studios need revenue to keep making games—the aggressive and often exploitative ways these models have been deployed have eroded player trust, damaged longtime brands, and even killed promising IPs. These tactics have always had a shelf life, and we may finally be reaching the end of it.








Nickel-and-diming was always going to wear thin



Charging players for content already built into a game but locked behind a paywall, or constantly upselling microtransactions through intrusive in-game prompts, was never sustainable. These practices turn what should be a game into a storefront. Subscriptions add another layer of frustration: many players are sold on the promise of “premium access” without clear communication of what they’re paying for, or worse, with content disappearing or being reclassified after purchase. What starts as “extra value” often feels like a bait-and-switch. A fair number of franchises have become synonymous with these practices, burning their reputations in the process.








The value rarely matches the price



One of the main criticisms of loot boxes—and why regulators in some regions treat them like gambling—is that players are paying real money for the chance at a reward, not the reward itself. It’s actually worse than that: content that would once have shipped as a normal part of a game is now shaved back and sold through loot boxes instead.




Battle passes and subscription models operate on the same principle, though with slightly more structure: their value varies with how much time a player can invest. For someone with hours each night, the pass might be worthwhile; for someone who plays casually, it feels like wasted money. A fairer model might tie subscription costs directly to in-game currency or tangible goods, giving players clarity and control over the value they receive. This would mitigate any sense of loss when a subscription goes unused.








Subscriptions provide no real protection for players



Another issue is permanence, or rather, the lack of it. Gamers who pay upfront for content often find themselves stranded when studios shut down servers, discontinue a franchise, or quietly remove promised features. Unlike buying a book or DVD, where the product remains with you, digital game subscriptions evaporate the moment the provider decides to pull the plug. This isn’t unique to gaming—magazines fold, streaming services cancel shows—but the difference is that games are interactive, and games with online elements or anti-piracy countermeasures often rely entirely on servers to function. Players aren’t just losing access to future content; they’re losing the product they already paid for.








The worst practices became the norm



Part of the problem lies in how these monetization strategies were introduced. When microtransactions first gained traction, a few major studios rushed to set the tone, often relying on aggressive upselling, grind-gating, or cosmetic inflation. Because those early adopters were financially successful, their methods were copied across the industry, even after these ill-suited features reached saturation and it became clear that most players weren’t on board with being pumped for cash. Instead of refining or improving the system, publishers doubled down on short-term gains, normalizing practices that frustrated gamers and damaged long-term trust.








Free-to-play isn’t a blank check for exploitation



The rise of free-to-play seemed like a win for players: anyone could try a game without an upfront cost. But the catch was clear—revenue still had to come from somewhere, usually microtransactions. That structure gives publishers every incentive to churn out endless new items, skins, and boosters, while keeping development costs low by sunsetting games once engagement dips. What players are left with is a cycle of investment in games that may not survive... an investment that can expire at the whims of others, effectively turning what was supposed to be “free” into one of the costliest ways to play.








Paying for FOMO is worse than pay-to-win



Pay-to-win models have long been criticized, and for good reason—they undermine competitive balance. But in many ways, the modern obsession with limited-time exclusives and artificially inflated scarcity is even worse. Instead of competing on skill, players are competing against the clock, pressured into buying short-term offers before they vanish forever. Companies manufacture FOMO (fear of missing out) as a business model, conditioning players to pay not for fun or fairness, but out of fear of exclusion. Outright coercion through underhanded marketing is still the minority of cases, but that hardly excuses the exploitation. The result is a system that manipulates psychology more than it rewards gameplay.









What ties all these arguments together is a fundamental mismatch between how players experience games and how publishers monetize them. Players want consistent value, permanence, and fairness; publishers have leaned on models that offer none of those things. The cracks are already showing—lawsuits, government regulation, and declining player patience are all signs that the current era of monetization is running out of steam. The industry will need to adapt, not just by finding the next cash-cow model, but by re-earning trust. Because when the game feels less like play and more like a sales pitch, players eventually walk away.