The glitch is typically an unintentional artifact of software development gone wrong. Recently, however, it has become a major component in gaming culture.
The “glitch” used to be a bad thing.
To most gamers, it is seen as an imperfection. The glitch is undesirable, a “mistake” that is detrimental to the state of the game. It is an accident, an unintentional result of perhaps carelessness by the developer or a lack of thoroughness by the QA team. Regardless of its origin, the glitch is unwanted, because it can have unpredictable consequences that at best generate mild irritation in the player and at worst wipe hundreds of hours of progress from a game save. No one wants the glitch; it is never listed as a “feature” on the back of the box, nor is it glorified in promotional material. Warner Bros. infamously set up pre-release brand deals with popular YouTube stars for Middle-earth: Shadow of Mordor on the condition that they “not show bugs or glitches that may exist.” And in the 2012 animated film Wreck-It Ralph, the glitch, personified as little Vanellope von Schweetz, is ostracized by her peers because of her unpredictable nature.
Yet, as of late, things have changed. In a strange twist, what used to be undesirable is now the pinnacle of dozens of gaming communities around the world dedicated to testing the limits of video games both new and old. From wall clips to sequence breaks, whistle sprinting to MissingNo., the glitch has always been embedded in gaming culture, but only recently has it risen to the forefront of the community with the increased exposure of the speedrunning and competitive gaming scenes. With the interconnectedness of the gaming community today, the discovery of a new glitch can send gamers into a frenzy, as our fascination with the unexpected takes hold of us, our eyes glued to the screen as our brains attempt to process just what is occurring before us.
Before I continue, it is important to first define the glitch. Merriam-Webster defines it as “a usually minor malfunction,” but that is not very useful in our context. Interestingly, Wikipedia synonymizes it with the phrase “software bug” and describes it as “an error, flaw, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways.” While seemingly adequate, this definition is still open to debate, especially when it comes to the key words “incorrect,” “unexpected,” and “unintended.” It is this debate that has led to a newer term, the “exploit,” which generally refers to events or actions that, while unintended, are nevertheless expected and therefore not technically incorrect (for example, the advantageous wavedash technique in Super Smash Bros. Melee produces an unintended sliding effect that still follows the physics of the game and is therefore not banned in most tournaments). In the speedrunning community, the existence of “glitchless” runs has resulted in heated debate as the community attempts to reconcile these competing definitions of the glitch.
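To make that distinction concrete, here is a toy sketch of why a wavedash reads as an exploit rather than a glitch. Every number and name below is invented for illustration, not taken from Melee’s actual engine: an angled air dodge sets a velocity, landing discards the vertical component, and ordinary ground friction is the only thing slowing what remains.

```python
import math

# Toy sketch of why a wavedash "follows the physics" (all values are
# hypothetical; this is not Melee's real code). An air dodge imparts speed
# along the input angle; landing keeps only the horizontal part, and the
# same friction that slows a normal run slows the resulting slide.

DODGE_SPEED = 3.0   # speed imparted by the air dodge (made-up units)
FRICTION = 0.06     # traction applied per frame (made-up value)

def wavedash_slide(angle_deg: float) -> float:
    """Distance slid after air-dodging into the ground at the given angle."""
    vx = DODGE_SPEED * math.cos(math.radians(angle_deg))  # vertical part lost on landing
    distance = 0.0
    while vx > 0:
        distance += vx
        vx -= FRICTION  # no special case: plain ground friction every frame
    return distance

# a shallower dodge keeps more horizontal speed, so the slide is longer
print(wavedash_slide(17) > wavedash_slide(45))
```

Nothing here breaks the simulated rules; the slide is just an input the designers never anticipated, which is exactly why communities file it under “exploit.”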
For the sake of this discussion, I will treat glitches and exploits as one, because regardless of how you define these terms, our fascination with them is the same.
One of my favorite speedruns of all time is the AGDQ 2016 Any% run of Mirror’s Edge by SasukeAnimator. As I mentioned earlier, the glitch is an important part of the speedrunning community, especially in Any% runs, in which players try to complete the game as quickly as they can through any means possible. In this run, many movement-based exploits are used to increase the maximum speed of Faith, the player character, and other glitches allow Faith to go “out of bounds” and skip entire portions of levels. This particular run is interesting because, aside from how impressive some of the techniques look visually, the couch commentary provides some insight into how the game was developed and why many of the physics exploits work the way they do. All in all, the use of these glitches has brought the completion time down from what would normally take about 6 hours to a little under 40 minutes.
The following is a short blurb from a blog post I wrote for a university film class about new media. Unlike most other media forms, games require user input. No other medium is so grounded in interaction that it would cease to exist without someone directly manipulating or controlling it. There is a degree of freedom in games that other media simply do not have. In this way, games are intelligent, expecting action and reaction, adapting to the user’s style of play… until that style of play breaks them. In the freedom that games afford, noise continues to slip through the channels, and glitches await the creative “hacker” who will exploit them. Software, however, depends on control; it is meant to be used in ways that are predictable, and it is unpredictability that manifests itself as the glitch. The console application is designed to accept only input that follows the strictest syntax, turning away those too ignorant to understand its language. The advent of graphical user interfaces was supposed to alleviate this issue, striking a balance between control and freedom; yet even with the limited number of buttons one can press, software can still break under unexpected circumstances.
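That breakage usually comes down to a value escaping the range a programmer assumed. As a made-up illustration (not drawn from any particular game), consider health stored in a single unsigned byte, as was common on 8-bit consoles: the arithmetic never stops obeying its rules, yet the result looks impossible to the player.

```python
# Hypothetical example: a health value stored as one unsigned byte (0-255),
# as on many 8-bit consoles. Subtracting past zero wraps around instead of
# clamping. The machine does exactly what it was told, and a glitch is born.

def take_damage(hp: int, dmg: int) -> int:
    """Apply damage, emulating 8-bit unsigned wraparound arithmetic."""
    return (hp - dmg) % 256

print(take_damage(100, 30))  # 70: the case the developer tested
print(take_damage(3, 5))     # 254: underflow turns near-death into near-full health
```

The second call is “correct” by the machine’s rules and “incorrect” by the designer’s intent, which is precisely the gap the definitions above are fighting over.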
While some developers continue to go to great lengths to patch out glitches and exploits (especially in online games, where they may give some players an advantage over others), others have embraced the glitch as an inherent part of any game, going so far as to “re-patch” previously removed glitches back in, as developers D-Pad Studio recently did with Owlboy, or to leave existing glitches alone. It is a tough balance between delivering a polished product and preserving the original experience of a game as an artifact, especially when it comes to re-released versions of older games. Even for games still in development, whether some glitches are worth leaving in makes for an interesting debate within the development community.
One of the more famous video game glitches to garner attention in the social sciences is World of Warcraft’s Corrupted Blood incident, in which an in-game debuff spell spread across the game’s world via a bug introduced in a software update. The source of the epidemic was a new raid boss, introduced in 2005, that could inflict Corrupted Blood on players and pets, causing them to slowly lose hit points over time. The developers intended the spell to be contagious, spreading to other players in proximity to those affected, but a programming oversight allowed the spell to be carried outside its intended areas, causing the famous outbreak and a widespread panic that was resolved only after several patches and world resets. The event became a case study for epidemiologists, who took interest in how players reacted to the plague (some used their skills to help weaker players, while others deliberately contracted the “disease” in order to harm them). Just three years later, a similar event occurred, the Great Zombie Plague of ’08… but this time, the developers had introduced it intentionally as part of a world event.
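The dynamics of the outbreak can be sketched as a toy contagion model. Everything below is invented for illustration (the `step` function, its probabilities, and the population size are mine, not Blizzard’s code); the point is only that a contagious debuff plus one unchecked carrier is enough to seed an epidemic.

```python
import random

# Toy contagion sketch of the Corrupted Blood dynamic. All names and
# numbers are hypothetical; this illustrates the mechanism, nothing more.

def step(infected, population, spread_chance, cure_chance):
    """One game tick: each infected player may pass the debuff to one
    random other player; some infected players are cured (or die)."""
    new = set(infected)
    for _ in infected:
        if random.random() < spread_chance:
            new.add(random.randrange(population))  # debuff jumps to a neighbor
    return {p for p in new if random.random() >= cure_chance}

random.seed(7)
infected = {0}  # the oversight analogue: one carrier leaves the raid zone
for tick in range(60):
    infected = step(infected, population=1000,
                    spread_chance=0.5, cure_chance=0.05)
print(len(infected))
```

Because the per-tick spread rate comfortably exceeds the cure rate, the single carrier typically snowballs into a city-wide outbreak, which is roughly what the epidemiologists found so compelling about the real incident.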
Our fascination with the glitch extends far beyond modern-day games. Indeed, it is much easier to share discoveries now, and developers can quickly patch out anything they deem game-breaking. Obviously, that was not always the case. After the release of the original Super Mario Bros. on the NES, players found a glitch that allowed them to access what has become known as “the Minus World”: a seemingly normal world, reached by clipping through a wall in the game’s second level… with the difference being that it can never be finished. In Japan, some magazines reported that there were 255 other “glitched” worlds much like the Minus World, a claim that has since been confirmed and can be reproduced through RAM manipulation, swapping cartridges while the Famicom remains powered on. It seems like a complicated trick, the kind of thing a schoolyard bully might feed you to get you to break your console at home, but it was real. And to the many kids (and adults) who had no idea how computers worked, it was magic.
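The commonly cited technical explanation for the Minus World is a stale read: the wall clip lets the player reach the warp pipes before the routine that fills in their destinations has run, so the game reads a leftover tile value (0x24, a blank sky tile, decimal 36) as the world number, and the HUD renders it as a blank, hence “ -1” on screen. The sketch below is a hypothetical reconstruction of that failure mode, not the actual Super Mario Bros. source; the destination numbers are illustrative.

```python
# Hypothetical reconstruction of the Minus World failure mode; not actual
# Super Mario Bros. code. A routine fills the warp-pipe destination table,
# but a wall clip lets the player use a pipe before that routine has run.

BLANK_SKY_TILE = 0x24  # decimal 36; the HUD draws this tile as a blank

warp_destinations = [BLANK_SKY_TILE] * 3  # stale contents, not yet initialized

def init_warp_zone():
    """Normally runs when the warp zone scrolls properly into view."""
    warp_destinations[:] = [4, 3, 2]  # intended worlds (ordering hypothetical)

def use_pipe(pipe_index):
    return warp_destinations[pipe_index]  # trusts the table blindly

print(use_pipe(0))   # 36: drawn as a blank, so the screen reads world " -1"
init_warp_zone()
print(use_pipe(0))   # 4: the intended destination once initialization runs
```

The bug is not the table or the pipe but the unstated assumption that one always runs before the other, which is the same shape as most sequence breaks speedrunners hunt for.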
Today, the glitch has made its way into other creative forms. In music, what used to be deemed irritating noise is now central to a genre known as glitch hop, which has gained popularity as of late through its extensive use in electronic dance music. In many ways, glitch music parallels the video game world: in both cases, the medium travels outside the confines of normality and embeds itself in the culture. In art, the glitch is seen as a representation of chaos or instability, which many modern artists have used to communicate ideas of disruption and manipulation in their pieces. One of my favorite examples is the beginning of Eminem’s music video for “Rap God,” in which the rapper, imitating the artificial intelligence Max Headroom, speaks through an ’80s television screen in an uncomfortably glitchy manner (the whole video, in fact, is a good example of the glitch aesthetic in mainstream culture). Some video games use glitch art in much the same way, as a means of pushing the player out of their comfort zone, as Undertale does with its final boss.
There is a lot more I could say about the glitch: how it plays a big yet controversial role in the fighting game community, how TAS-ing takes data manipulation to the extreme and performs ungodly glitches not replicable by any human being, and how certain corners of the modding community have embraced the glitch as a means of sharing some rather horrifying creations. Like many things, the glitch is just a small part of gamer culture that makes our industry so intriguing and unique, and in the grand scheme of things, it is also a small reminder that not everything always works, and that there remains some beauty in imperfection.