The Culture behind the Glitch

The glitch is typically an unintentional artifact of software development gone wrong. Recently, however, it has become a major component of gaming culture.

The “glitch” used to be a bad thing.

To most gamers, the glitch is an imperfection. It is undesirable, a “mistake” that is detrimental to the state of the game. It is an accident, the unintentional result of carelessness on the developer’s part or a lack of thoroughness from the QA team. Regardless of its origin, the glitch is unwanted, because it can have unpredictable consequences that at best generate mild irritation and at worst wipe hundreds of hours of progress from a game save. No one wants the glitch; it is never listed as a “feature” on the back of the box, nor is it glorified in promotional material. Warner Bros. infamously set up pre-release brand deals with popular YouTube stars for Middle-earth: Shadow of Mordor only if they agreed to “not show bugs or glitches that may exist.” And in the 2012 animated film Wreck-It Ralph, the glitch, personified as little Vanellope von Schweetz, is shunned by her peers because of her unpredictable nature.


Yet, as of late, things have changed. In a strange twist, what used to be undesirable is now the centerpiece of dozens of gaming communities around the world dedicated to testing the limits of video games both new and old. From wall clips to sequence breaks, whistle sprinting to MissingNo., the glitch has always been embedded in gaming culture, but only recently has it risen to the forefront with the increased exposure of the speedrunning and competitive gaming scenes. With the gaming community as interconnected as it is today, the discovery of a new glitch can send gamers into a frenzy, as our fascination with the unexpected takes hold of us, our eyes glued to the screen as our brains try to process exactly what is happening before us.

Before I continue, it is important to first define the glitch. Merriam-Webster defines it as “a usually minor malfunction,” but that is not very useful in our context. Interestingly, Wikipedia treats it as a synonym for “software bug” and describes it as “an error, flaw, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways.” While seemingly adequate, this definition is still up for debate, especially around the key words “incorrect,” “unexpected,” and “unintended.” It is this debate that has produced the newer term “exploit,” which generally refers to events or actions that are unintended yet expected, and therefore not technically incorrect (for example, the advantageous wavedash technique in Super Smash Bros. Melee produces an unintended sliding effect that still follows the physics of the game and is therefore not banned in most tournaments). In the speedrunning community, the existence of “glitchless” runs has led to heated debate as the community attempts to reconcile these divisive definitions of the glitch.

For the sake of this discussion, I will treat glitches and exploits as one, because regardless of how you define the terms, our fascination with them is the same.

[Image: Mirror’s Edge PC screenshot]

One of my favorite speedruns of all time is the AGDQ 2016 Any% run of Mirror’s Edge by SasukeAnimator. As I mentioned earlier, the glitch is an important part of the speedrunning community, especially in Any% runs, in which players try to complete the game as quickly as they can through any means possible. In this run, many movement-based exploits are used to increase the maximum speed of Faith, the player character, while other glitches allow her to go “out of bounds” and skip entire portions of levels. This particular run is interesting because, aside from how impressive some of the techniques look, the couch commentary provides insight into how the game was developed and why many of the physics exploits work the way they do. All in all, the use of these glitches has brought the completion time down from the roughly 6 hours a normal playthrough would take to a little under 40 minutes.

The following is a short blurb from a blog post I wrote about new media for a film class I took in university. Unlike most other media forms, games require user input. No other medium is so ingrained in interaction that it would cease to exist without someone directly manipulating or controlling it. There is a degree of freedom that games allow that other media simply do not have. In this way, games are intelligent, expecting action and reaction, adapting to the user’s style of play… until that style of play breaks them. In the freedom that games afford, noise continues to slip through the channels, and glitches await the creative “hacker” to exploit them. Software, however, depends on control; it is meant to be used in ways that are predictable, and it is unpredictability that manifests itself as the glitch. The console application is designed to accept only input that follows the strictest of syntax, turning away those too ignorant to understand its language. The advent of graphical user interfaces was supposed to alleviate this issue, striking a balance between control and freedom; yet even with the limited number of buttons one can press, software can still break under unexpected circumstances.

While some developers continue to go to great lengths to patch out glitches and exploits (especially in online games, where they may give some players an advantage over others), other developers have embraced the glitch as an inherent part of any game, going so far as to “re-patch” previously removed glitches, as D-Pad Studio recently did with Owlboy, or simply to leave existing glitches alone. It is a tough balance between delivering a polished product and retaining the original experience of a game as an artifact, especially when it comes to re-released versions of older games. Even for games still in development, the possibility that some glitches are worth leaving in makes for an interesting debate within the development community.

[Image: Owlboy]

One of the more famous video game glitches, and one that has garnered much attention in the social sciences, is World of Warcraft’s Corrupted Blood incident, in which an in-game debuff spell spread across the game’s world through a bug introduced by a software update. The cause of the epidemic was a new raid boss, introduced in 2005, that could inflict Corrupted Blood on players and their pets, causing them to slowly lose hit points over time. The developers intended the spell to be contagious (it could spread to other players in proximity to those affected), but a programming oversight allowed it to be carried outside its intended areas, causing the famous outbreak and widespread panic that was only resolved after several patches and world resets. The event became a case study for epidemiologists, who were drawn to how players reacted to the plague (some used their skills to help weaker players, while others deliberately contracted the “disease” in order to harm them). Just three years later, a similar event occurred, the Great Zombie Plague of ’08… but this time the developers had introduced it intentionally as part of a world event.
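To make the mechanics a little more concrete, here is a minimal, hypothetical sketch in Python of how a contagious damage-over-time debuff might be modeled, together with the kind of save-and-restore oversight that could let it escape its intended zone. Every name and number below is invented for illustration (this is nowhere near Blizzard’s actual code), and the pet-dismissal detail is simply one commonly cited vector for how the debuff traveled out of the raid.

```python
import random

# Purely illustrative names and numbers; this is not Blizzard's code.
CORRUPTED_BLOOD = "corrupted_blood"

class Unit:
    def __init__(self, name, is_pet=False):
        self.name = name
        self.hp = 100
        self.is_pet = is_pet
        self.debuffs = set()

def combat_tick(units_in_zone):
    """Infected units lose hit points and pass the debuff to nearby units
    (random sampling stands in for spatial proximity here)."""
    infected = [u for u in units_in_zone if CORRUPTED_BLOOD in u.debuffs]
    for unit in infected:
        unit.hp -= 5                                            # damage over time
        for neighbor in random.sample(units_in_zone, min(2, len(units_in_zone))):
            neighbor.debuffs.add(CORRUPTED_BLOOD)               # contagion

def on_player_leaves_instance(player):
    """Intended cleanup: raid-only debuffs are stripped when a player zones out."""
    player.debuffs.discard(CORRUPTED_BLOOD)

def dismiss_pet(pet):
    """The oversight: a dismissed pet's state is saved verbatim, active
    debuffs included; nothing is stripped on the way out of the raid."""
    return {"name": pet.name, "debuffs": pet.debuffs.copy()}

def resummon_pet(saved_state):
    """Restore the pet exactly as it was saved, wherever the owner now stands."""
    pet = Unit(saved_state["name"], is_pet=True)
    pet.debuffs = set(saved_state["debuffs"])
    return pet

# A pet picks up the debuff inside the raid, is dismissed, and is later
# resummoned in a crowded capital city, reintroducing the "disease" there.
raid_pet = Unit("hunter_pet", is_pet=True)
raid_pet.debuffs.add(CORRUPTED_BLOOD)
saved = dismiss_pet(raid_pet)

city = [Unit(f"bystander_{i}") for i in range(20)]
city.append(resummon_pet(saved))
for _ in range(10):
    combat_tick(city)
print(sum(CORRUPTED_BLOOD in u.debuffs for u in city), "of", len(city), "units infected")
```

The point is less the specifics and more the shape of the bug: the contagion logic and the cleanup logic live in different places, and it only takes one forgotten path between zones for a raid mechanic to become a world event.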

Our fascination with the glitch goes far beyond modern-day games. True, it is much easier to share your discoveries now, and developers can quickly patch out anything they deem game-breaking. That was obviously not always the case. After the release of the original Super Mario Bros. on the NES, players found a glitch that allowed them to access what has become known as “the Minus World,” a seemingly normal world reached by clipping through a wall in the game’s second level… with the difference being that it can never be finished. In Japan, some magazines reported that there were 255 other “glitched” worlds much like the Minus World, a claim that has since been confirmed true and can be reproduced through RAM manipulation, swapping cartridges while the Famicom remains powered on. It sounds like a complicated trick, the kind of thing a schoolyard bully might tell you in the hope that you break your console at home, but it was real. And to the many kids (and adults) who had no idea how computers worked, it was magic.
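As a loose illustration of why stale memory can produce “worlds” nobody designed, here is a hypothetical sketch in Python (the real game is 6502 assembly, and the address and values below are invented for the example, not the actual Super Mario Bros. memory map). The general idea: if the routine responsible for writing a warp destination never runs, whether because you clipped past it or because the RAM still holds another cartridge’s leftovers, the engine reads whatever byte happens to be there and tries to build that level anyway.

```python
# Loose, hypothetical illustration in Python (the real game is 6502 assembly).
# The address and values below are invented, not the actual SMB memory map.

RAM = bytearray(2048)        # stand-in for the console's 2 KB of work RAM
WARP_DEST_ADDR = 0x0300      # invented address for "warp pipe destination world"

def load_warp_zone_text(ram):
    """Intended path: reaching the warp zone normally writes a valid destination."""
    ram[WARP_DEST_ADDR] = 4  # e.g., the pipe that leads to World 4

def enter_pipe(ram):
    """The pipe blindly trusts whatever byte sits at the destination address."""
    return ram[WARP_DEST_ADDR]

# Glitched path: clip through the wall and enter the pipe before
# load_warp_zone_text() ever runs. Whatever stale byte is already in RAM
# (left over from earlier gameplay, or from a hot-swapped cartridge)
# becomes the world number, and the engine tries to build that "world" anyway.
RAM[WARP_DEST_ADDR] = 0x24   # stale value; world 36 was never designed
print(f"Now loading world {enter_pipe(RAM)}...")
```

The failure mode, not the specific address, is what matters: nothing validates the destination, so any byte the player can smuggle into that slot becomes a level the game will dutifully attempt to load.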


Today, the glitch has made its way into other creative forms. In music, what used to be dismissed as irritating noise is now central to a genre known as glitch hop, which has gained popularity as of late through its extensive use in electronic dance music. In many ways, glitch music parallels the video game world: in both cases the medium travels outside the confines of normality and embeds itself in the culture. In art, the glitch is seen as a representation of chaos or instability, which many modern artists have used to communicate ideas of disruption and manipulation in their pieces. One of my favorite examples is the beginning of Eminem’s music video for “Rap God,” in which the rapper, imitating the artificial intelligence Max Headroom, speaks through an ’80s television screen in an uncomfortably glitchy manner (the whole video, in fact, is a good example of the glitch-like aesthetic in mainstream culture). Some video games use glitch art in much the same way, as a means of pushing the player out of their comfort zone, as Undertale does with its final boss.

There is a lot more I could say about the glitch: how it plays a big yet controversial role in the fighting game community, how TAS-ing takes data manipulation to the extreme and performs ungodly glitches not replicable by any human being, and how certain corners of the modding community have embraced the glitch as a means of sharing some rather horrifying creations. Like many things, the glitch is just a small part of gamer culture that makes our industry so intriguing and unique, and in the grand scheme of things, it is also a small reminder that not everything always works, and that there remains some beauty in imperfection.


E3 and the Evolution of Gaming Industry Fandom

E3 has changed over the years, but it has failed to keep pace with other conventions of its kind. What role does E3 play today?

When the Electronic Entertainment Expo first opened its doors to attendees at the Los Angeles Convention Center in 1995, only those with a proven connection to the gaming industry were allowed in. The show would, for the first time, offer unprecedented access to hundreds of video game developers and publishers, all in one place. The Interactive Digital Software Association wanted a convention focused solely on video games, specifically to attract and connect with retailers who saw the recent successes of fourth-generation consoles as a gateway to an untapped but growing market of gamers. The result was the largest industry gathering for video games ever organized, with over 40,000 attendees there to play and discuss the latest games from Sony, Nintendo, and Sega.

Today, E3 has evolved into a much different beast. The IDSA has since changed its name to the Entertainment Software Association, attendance has grown well past its original size, and the advent of fan-organized conventions like PAX has challenged the relevance of an industry-only event like E3, prompting the ESA to open the convention to the public for the first time this year. Yet walking the show floor myself last month, I could not help but wonder: what role does E3 play in a world as internationally and socially connected as today’s, especially when many would argue that other conventions offer an experience better suited to the modern gaming industry fandom?


I arrived at E3 a tad later than most other attendees. In fact, I had missed the entire first day due to other plans, and I would end up missing the entire third day as well. Not that I minded too much, though; I knew the moment I purchased my $250 ticket that this whole E3 thing would make absolutely zero financial sense for me (and honestly, given that most other conventions are cheaper, I doubt it would make financial sense for anyone). I was not there to wait in line all day, nor did I care to play early demos of unfinished games. Of the two demos I ended up playing, one was too short to give a good grasp of what the game was about (Super Mario Odyssey) and the other played exactly as I expected it to (Sonic Mania). In both cases, I felt the games would be better off played on my TV at home, from the comfort of my own couch (moreover, with online stores on virtually every current-generation console, many publishers release their E3 demos to the public anyway).

Some would argue that E3 is not for playing game demos (unless you are a journalist, in which case I suppose it is your job to do that), but rather for watching all the fancy new game announcements from the big publishers. Who can forget the legendary E3 2004 reveal of The Legend of Zelda for the GameCube, or when Sony pushed out a fever dream sequence of trailers for The Last Guardian, Final Fantasy VII Remake, and Shenmue III at E3 2015? It was all hype… and unfortunately, little substance. Therein lies the crux of E3 announcements: the early hype of clearly nowhere-near-finished games that seems laughable in hindsight. That Zelda demo from 2004 eventually became Twilight Princess, which did not arrive until late 2006 (though it ended up becoming one of the most critically acclaimed games of all time), and of the three games Sony showed, only The Last Guardian has thus far seen the light of day. Even when companies try to curb early announcements to prevent overhype, as they did this year, the public sees it as a major letdown (to further argue this point: people seem to be more excited for Metroid Prime 4, which literally only saw a logo reveal, than for any other game at E3… and you definitely do not need a convention to reveal a logo).


The biggest problem with E3 seems to be that it is still branded as an industry-only event, even as the requirements to get in grow laxer by the year. This special yet misleading “exclusivity” only sets attendees up for disappointment, especially at an event that was clearly not designed for large crowds (Nintendo needed to completely reorganize its queuing system for Super Mario Odyssey on Day 2 to deal with the surplus of attendees). Even for people at home, E3 does not seem as special as it once did, with companies opting to announce new games through social media or, in Nintendo’s case, video broadcasts dropped seemingly at random that feel more personal than the typical big-stage presser. It certainly does not help that other conventions, ones that have kept fans in mind from the beginning, have grown into juggernauts in the gaming convention space (Gamescom, for example, started in 2009 as an open-to-all gaming convention and currently boasts five times the attendance of E3).

Yet despite all this, I do not regret going to E3 this year. It sounds trite, but your experience at E3 really comes down to understanding what E3 is for and setting your expectations accordingly. I came in with three goals: to have fun, to get lots of swag, and to play Super Mario Odyssey (that last one only because I love the Mario franchise too much). I accomplished the latter two handily (it helped that I only wanted to play a single game), but the first goal is easy to forget, especially when you are surrounded by obnoxiously loud noises and drowning in a sea of sweaty gamers. All too often I saw grumpy attendees complaining about long lines and the limited supply of free stuff (as if they had expected anything else). On the flip side, I did meet a bunch of people friendly enough to strike up a conversation while I was walking around the convention center. Among them were a photographer taking really cool long-exposure photos of the crowds, a Nintendo representative who just needed someone to talk to four hours into her morning shift, and Sadworld (oh god, the cringe).


Here is a moment that I feel perfectly encapsulates everything positive about my E3 experience this year. The last thing I did at E3 was watch the ARMS tournament live at Nintendo’s booth. ARMS is Nintendo’s newest IP, a boxing-style arena fighter whose characters have extendable arms. The bracket was set up so that four pro gamers would be pitted in one-on-one battles against four fans, and while I do not know how it came across on Nintendo’s official live stream, the excitement on the show floor was palpable. The tournament stage itself was a sight to behold, with huge screens and dozens of LEDs programmed to light up with every action. Each KO was met with a collective gasp or “oh” from the crowd, followed by applause in appreciation of the showmanship on display. The winner of the tournament was a young fan by the name of “Zerk,” but his victory was cut short by a final challenge, and an ultimate beatdown, courtesy of ARMS producer Kosuke Yabuki. It was all in good fun, and you would be hard-pressed to find anyone there who did not have a good time.

And that is what makes E3 so special. E3 is a celebration of games and gamers, by gamers, for gamers: everyone is there for one thing, maybe not to play video games or even to see video games, but to experience the video game industry at one of its peak moments of the year. Much of what E3 stands for today is symbolic, as the need for an industry-only show faded long ago, but it is still not very often that thousands of people from such a wide variety of backgrounds can come together at an event like this, all with the sole purpose of celebrating what makes our industry so great. And despite the dozens of other major gaming conventions that have popped up over the past two decades, E3 is still the premier place to meet the industry’s leaders and major players.


So perhaps E3 no longer has the unique role it once had in the gaming industry. Maybe other conventions are better catered to fans, or less crowded, or more affordable. But while E3 only lasts a few days each year, the sights, sounds, and games of the seven hours I spent on the show floor will be remembered for a lifetime.