“Forget history and it will repeat itself” is a line you often hear in history lessons, and it endures because of how accurate it is. People routinely forget that certain events have happened more than once. It’s scarier (and frankly scummier) when people deliberately try to scrub history from our lives. Much like how some people don’t even believe historical events occurred, it’s troubling how quickly people dismiss the past of video games, which is why backwards compatibility is an incredibly useful tool for players and the industry.
It’s amazing how people in the industry like to downplay the benefits of past games and mechanics. Sony recently announced that multiple PlayStation storefronts would be shut down, including those for the PS3 and PS Vita. That means entire generations of console games could disappear in an instant, with only a few survivors from each generation moving on through remasters and sequels. The rest will be lost to everyone except the few who own both the console and the game (unless you emulate).
Despite how CEOs and industry think tanks have tried to make people forget the positives of backwards compatibility, the argument for its inclusion remains strong. Franchises that tell a continuous story can have entries on previous consoles that are no longer compatible with current ones, meaning people have to either play emulated versions or buy an old console just for those games. Either way, it is an additional barrier preventing people from playing games they should be allowed to try out, if not to learn from.
Each generation of games learns from the previous one. What works and what doesn’t has served as a decent filter, weeding out ideas that were less effective than the ones that survived the transition. Cameras, for example, have improved with every generation, yet developers keep forgetting how to make a good camera in a video game. There should be a way to see what hasn’t improved, and there’s no better way than to play the games of the past to get a better idea of what they did and why they did it.
Briefly going back to that example: nowadays, the camera typically either follows the player from behind or offers a first-person view controlled by the player. Even just a decade ago, it was the Wild West for cameras. Some cameras were fixed to certain parts of a room and would instantly cut to another angle to give a different perspective, much like the cameras in the older “Resident Evil” games. It was a terrible scheme for that genre, but it ended up being effective in other games like “Republique.” Likewise, the precise movement executions of fighting games were adapted, albeit toned down, into spectacle fighters like “Devil May Cry” and “Super Smash Bros.” The message is clear: the mechanics of the past shape the future of our video games.
It’s not only mechanics; entire genres of games have been written off by an industry that neglects the past. Play any older game and you’ll find yourself immersed in rich stories that have aged well. Yet just a few years ago, people were saying that gamers no longer wanted single-player, story-rich games. Just look at what EA CFO Blake Jorgensen said in 2017 after the closure of Visceral Games, creator of “Dead Space” and developer of a now-defunct single-player “Star Wars” game: “As we kept reviewing the game, it continued to look like a much more linear game [which] people don’t like as much today as they did five years ago or ten years ago.”
If he said that today, he would get questionable looks and inquiries into his sanity, as games like “The Witcher 3,” “Spider-Man” and “God of War” have all come out since that statement, and all have sold incredibly well. “The Witcher 3” went on to become its publisher’s most profitable game, making CD Projekt Red, the game’s developer, the biggest game developer in all of Europe. They even beat out gaming veterans such as Ubisoft, which had also shifted toward the multiplayer experience, with many of its games gaining multiplayer modes and features.
Of all the reasons game companies push players away from older hardware and games, the simplest answer is the likeliest. The grim reality of erasing the history of video games is that the industry benefits financially when people forget how it used to be.
One of the best ways to see this is with DLC. DLC used to mean, “Here’s an entire map of content and other things you can do.” Now, it can be as simple as, “Here’s one new character,” or “Here’s a skin for a character you already have.” Without the context of the past, it becomes harder and harder to remember a time when games simply came out finished, and developers did not have the luxury of shipping a subpar experience only to reap the benefits of patching it into a tolerable one later.
Games like “Dota 2” feel like relics in terms of their monetization because money gets you nowhere in the game in terms of play. All you can get is a stupid hat or costume to wear as you get pummeled into the dirt.
People look at “Overwatch,” a game you pay real money just to get into, only to be met with loot boxes, as a progressive move for monetization; compared to older games, it’s pathetic. Remember when people used to riot over pay-to-win, and now it’s literally everywhere? Remember when paying ten dollars got you an entire new map, but now it gets you a gun skin? Remember when remasters of older games used to include all the content, but now even they are getting loot boxes and other scummy practices? Remember when games just came out?
You do? Good! Then the games industry hates you and everyone like you. And not only that, but bafflingly, other gamers hate you, too, when you bring it up.
Edward Park is a third-year student on the BSEd writings track. EP909767@wcupa.edu