The video game industry and video gamers are in a peculiar bind. Depending on who you ask, gaming is either more popular and profitable than ever, in spite of the Covid-19 pandemic, or on the brink of collapse. Any quick online search will give non-gamers the impression of a horribly polarized landscape in which entire communities are in an uproar. Not a week seems to pass without fresh controversies about major publishers: whether it’s the furore over the use of randomized loot box microtransactions or the myriad sexual harassment charges that plagued Activision Blizzard even as it was being acquired by Microsoft. Popular YouTube channel Upper Echelon Gamers has even declared 2021 the “Worst Year in Gaming History,” thanks to both subpar launches and corporate hijinks.
Hyperbolic as that sounds, they’re not alone in this opinion. There’s a recurring sentiment, espoused by Nathan “Optimus,” Richard “ReviewTechUSA” Masucci and other commentators, that gaming is not what it used to be.
Longing for the good old days, before the ills associated with the modern industry and its fandom, is on the rise. This warped nostalgia goes beyond reminiscences about childhood gaming or the supposed perks of older titles and consoles. It has also helped foster an atmosphere of outrage and toxicity that, ironically, threatens the very things the most vocally nostalgic claim to be passionate about.
Among the most persistent gripes are those relating to microtransactions and downloadable content (DLC). The practice of monetizing digitally distributed products can be traced back to the GameLine dial-up service for the Atari 2600 in the 1980s, and physical expansion packs have been around for just as long. It wasn’t until the turn of the twenty-first century and the rise of high-speed Internet, however, that paid DLC became an industry norm: both on game consoles like the Sega Dreamcast and on computers—for example, through Valve’s popular Steam platform. The phenomenon attracted further attention after gamers were invited to purchase horse armour for The Elder Scrolls IV: Oblivion (2006). Although ridiculed at the time, such practices have now become ubiquitous. You can purchase everything from cosmetic skins for character models to full-blown expansion packs.
Microtransactions also include more dubious anti-consumer practices. These include “on-disc DLC,” in which players have to pay a sometimes exorbitant fee to unlock content already present in the game’s code. One of the most notorious forms of microtransaction is the loot box system (aka gacha), derived from mobile gaming, which incentivizes players to pay real-world currency for a randomized selection of virtual items or game-breaking advantages. The use of such features—which, projections suggest, will generate $50 billion worldwide by the end of this year—has often been compared to gambling, and there have been rising concerns over their presence in titles aimed at children.
However, things were as bad, if not worse, in the past. As Reuters reported in 2005, Microsoft’s then-upcoming Xbox 360 console encouraged gamers to “spend a few cents here and there to enhance their gameplay,” giving publishers “a way to create a continuing revenue stream.” There were effectively no bounds as to what could be monetized.
Meanwhile, public outcry in the 2010s led many governments, notably in Japan and the EU, to introduce anti-loot-box regulation, and the industry also adopted voluntary standards. People are still arguing over whether such measures are in the public interest or interfere unduly with the free market. There are also concerns about censorship prompted by a moral panic, whipped up, perhaps, by opportunistic politicians. But these measures, together with the public backlash that has impacted sales of games like Star Wars Battlefront II (2017), have led some publishers and developers to reassess how they implement microtransactions.
Another recurring refrain is that modern games are often broken or unfinished and rely too heavily on DLCs and post-release patches. Given the hasty releases of titles like Grand Theft Auto: The Trilogy – The Definitive Edition (2021) and Battlefield 2042 (2021), which arrived bug-riddled or lacking basic features, it’s unsurprising that many look back fondly to a time when games came out feature-complete.
Many people erroneously believe that in the ’90s and early 2000s, before DLC, games had to come out with all their features already programmed in. Some now claim that the technical limitations of those halcyon days forced developers to be more creative and to make their initial products as polished as possible, since no add-ons were available. However, this supposed golden age never existed.
Unfinished and bug-ridden games have not only been around for as long as there have been games but have always been the norm. Technological constraints didn’t prevent the proliferation of shovelware, from the hastily released trash that contributed to the Great Video Game Crash of 1983 to the cheaply made schlock of the 2000s. In fact, the technical limitations of older consoles meant that bugs and glitches were often permanent. Gamers often put up with these as part of the overall experience.
This is even true of games that many consider classics. The seminal role-playing game Fallout 2 (1998) was notoriously bug-riddled on its release and required an extensive patch—at a time when dial-up connections were commonplace. The otherwise excellent Command & Conquer: Tiberian Sun (1999), as former Westwood Studios producer Rade Stojsavljević has recounted, suffered from feature creep and launched with unfinished content. Pre-release advertising for the innovative shooter Half-Life (1998) touted gameplay elements and levels that never made it into the final product. Even The Elder Scrolls V: Skyrim (2011), a more recent role-playing mainstay, is notorious for its myriad glitches.
This hasn’t stopped such titles from attaining long-term popularity. Yet, given all their initial bugs and gamers’ higher expectations about launch standards, those classics would probably not have fared well had they been released today. Many recent titles have launched with few issues, and developers are now able to update even a poorly launched game well beyond its initial state, as in the case of the indie hit No Man’s Sky (2016), which has received years of post-launch updates.
The discourse around gaming in the Anglophone world has become politically polarized, and this, too, affects the way gamers invoke the past.
Progressives generally denigrate the old days. Older games, like the original God of War trilogy from the 2000s, are often lambasted as immature and misogynistic. In 2018, Dean Takahashi commented that in those days the industry had been the domain of bigoted white men who repressed “herstory.” Josh Tucker has described gaming and gamers as embodying an “inherently conservative worldview” that must be contested.
The reality is far less clear-cut. Although most gamers are male, women make up around 45% of players in the US alone. In addition to the so-called girl gamers, who have been a significant presence since at least the 1990s, video games have always transcended borders and ideology, often offering a haven for social misfits. Within the industry itself, as Loni Reeder has commented, many people went out of their way, even in the ’80s and ’90s, to counter discrimination and provide equal opportunities. While people from racial and other minorities are more visible in gaming communities nowadays, the notion that they have only lately barged into a white boys’ club is, at best, misleading Americo-centrism and, at worst, disregards the contributions of minorities: the very people activists claim to represent.
Attempts to polarize the gaming community aren’t confined to progressives. Some anti-woke commentators believe that woke developers have tainted modern games. They hark back to an age before social justice concerns were injected into gaming in order to feed audience outrage, encourage paranoia and contempt towards political opponents, and attempt to gatekeep gaming to an excessive degree.
It’s going to be difficult to please everyone, especially when nostalgia becomes weaponized by the combatants in the culture wars.
It’s not simply a matter of giving gamers what they want, since the demands of those who claim to speak for them can be muddled, if not contradictory. Modern-day outrage culture leaves little room for tolerance or nuance—seemingly vindicating those who say things are getting worse and worse—but attempts to place restrictions on what can be discussed online in order to combat toxicity can easily fuel accusations of censorship, as we saw in the controversy surrounding the temporary lockdown of the official Halo subreddit.
Gamers have found their own solutions to some of these problems. For example, as Matthew Gault has observed, there’s been a rise in so-called low sodium communities: alternative online groups in which players can “see memes, talk about the story, troubleshoot technical problems, and figure out the best graphics settings” without having to put up with outrage culture.
Such communities, which practice what Ollie Barder has described as “soft gatekeeping” (deterring ideologues and trolls), could provide opportunities for less heated discussions of topics such as the merits of preserving older titles and the future of physical versus digital game releases, or simply for sharing fun experiences. These can help foster a more realistic appreciation for gaming as it was, and is.
And that would be salutary. Because, for those of us who are just trying to have fun, it’s time to put the past behind us.