Barricading the Fourth Wall: A Brief Timeline of Immersion in Video Games
by Noah Tomlin

In electronic entertainment, and particularly in video games, immersion is a form of escapism: the goal is for participants to temporarily lose all sense of self and of the daily life around them and become “immersed” in the virtual experience. Over the past fifteen years especially, as technology has improved to allow near-photorealistic presentation, video games (particularly those developed by Western studios) have been attempting to be less game-like in order to heighten that immersion. Still others have experimented with immersing players by including them as a separate character, whether within the story or by acknowledging them as the person holding the controller. Because video games are such a young and growing medium, developers and writers are still experimenting with a form in which they must hand the pacing and outcome of their product over to the player, rather than retaining the full artistic control afforded by other, more established media. This timeline will attempt to chronicle the key events in the three main types of gaming immersion (explained below) from 1985 until the present day. Each of the meaningful events will be labeled in bold.

In gaming, immersion can be separated into three different types of experience. Narrative immersion is a state in which the participant becomes so completely invested in the story that they temporarily forget it is virtual and fictitious. Tactical immersion occurs when the player becomes completely engrossed in his or her player-character's moment-to-moment actions. Thirdly, strategic immersion occurs when the player gives their full and undivided attention to solving or completing a particular puzzle or other cerebral challenge.

One of the first major steps towards narrative and strategic immersion came in 1989, when Jordan Mechner created the first Prince of Persia game. Released for the Apple II personal computer and later ported to virtually every other platform available at the time, it gave birth to the cinematic platformer genre, influencing games such as 1991's Another World, 1992's Flashback and 1998's Heart of Darkness. An early experiment in both forms of immersion, it is remembered as a flawed gem that was well ahead of the curve.

Prince of Persia was unique in a number of ways. The game was presented in a crude approximation of widescreen (barely possible with the Apple II's meager hardware). The heads-up display (HUD) was minimalist: a small health bar in the bottom-left corner, a visual count of the health-restoring potions in the player's inventory, and the time remaining on the clock, which appeared only at the start of each level. Cleverly, however, the HUD itself was placed on the bottom “black bar”, not only making practical use of the widescreen presentation but also ensuring it never obstructed the player's view of each screen (i.e. room).

The game was designed to strengthen the player's personal connection to the Prince, chiefly by making the Prince appear to be a real human being and by keeping the player focused on his journey step by step. The Prince's animations were rotoscoped (i.e. traced over) from video of Mechner's brother David running around in white clothes, making the movement as smooth and lifelike as possible. Rather than following the typical platformer convention of surviving fifty-foot falls without a scratch, the Prince jumped only as high as an average human, took damage from high falls and would perish if the fall was long enough. Instead of fighting scores of enemies at once, the norm was short combat sequences against one (occasionally two) opponents, far more difficult than in any game before. Furthermore, the game is timed: the player, as the Prince, has sixty minutes to reach Ja'far bin Yahya Barmaki and save the Princess. While the game is split into traditional “levels”, the transitions are seamless when playing for the first time, and individual levels could only be accessed by passwords unavailable during the main, core game.

Nowhere is this sense of personal connection to the Prince's struggle more apparent than in the very first level. The Prince begins the game without a sword, while one screen to the right stands an armed enemy, and the player has no idea yet how combat even works. The player is instead forced to move left and explore the ruined dungeon to acquire a sword before facing that enemy. This not only ties the Prince's helplessness to the player's own confusion with the core mechanics, it does so in a way that does not feel like a waste of the player's time: because it all happens in the very first rooms, the player is free to reset the game and start again, now armed with the knowledge of where the sword is, how to acquire it and how to use it, saving precious minutes from the hour-long time limit.

Prince of Persia is a key and distinctive starting point for my timeline because of its attempt to shy away from traditional game-like mechanics, its cinematic presentation and its focus on telling a story centred on making the player feel for the Prince throughout his quest.

Not every game attempted immersion by placing the player directly into the story and bonding them to the player-character. On occasion, more experimental games have toyed with the idea of the players themselves being characters in the story. Nowhere is this more evident than in Shigesato Itoi's 1994 Super Nintendo role-playing game EarthBound, which combines classic and existentialist narrative immersion. On the one hand, the player is allowed to choose the hero's name as well as his main hobby and favourite food, the latter two being absurdist twists on the customization systems typical of Japanese RPGs of the time. About halfway through the game, players are asked to sign a contract in their own name, rather than the name they chose for the main character. All of this culminates in the endgame where, when all hope of defeating the antagonist Giygas seems lost, one of the main character's party members prays to the player him/herself to save them from destruction.

EarthBound toys with the idea of narrative immersion by including the player him/herself as a character, separate from the actions of the player-named hero and his party. It was an early attempt to explore what happens when a game ignores the then-traditional fusion of player-character and player. EarthBound did so for humour, in keeping with the game's absurdist, American-pop-culture-obsessed presentation; the player's ultimate role in the story is to provide a deus ex machina for the hero's party. Yet by stripping away any pretence of immersion, constantly reminding players that they are playing a video game and thus shattering the fourth wall, EarthBound generated the kind of memorable, deeply immersive experience that most realism-obsessed games fail to achieve.

That is not to suggest, however, that a push towards realism and total narrative immersion is a bad thing. When done well, it can succeed beyond the developer's wildest dreams. Most narrative-immersion techniques in modern games owe their existence to 1998's Half-Life, the highly anticipated first-person shooter developed by Washington-based Valve Software. At a time when other first-person shooters like Doom or Quake cast their players as gruff space marines sent as one-man armies to destroy the legions of hell, Half-Life starred a theoretical physicist, Dr. Gordon Freeman. Despite his impressive array of weaponry, the game was designed so that Dr. Freeman always seemed overwhelmed, first by an assortment of typical aliens and then by squads of USMC soldiers driven by some of the best artificial intelligence seen at the time.

In addition, Half-Life eschewed cutscenes (non-interactive story sequences) and discrete "levels" in favour of one continuous, uninterrupted narrative filled with what Valve called “scripted sequences”: interactive set-pieces ranging from an enemy kicking down a door to a complicated, multi-layered conversation with several other characters. Combined, these elements made players feel as if they were on a non-stop roller coaster of fast, furious action. Even during the first 45 minutes of the game, in which the player (as Dr. Freeman) simply goes to work without a weapon in sight, there are many visual hints that something is terribly wrong. The steps Valve took to immerse the player in this world would be copied by almost every developer for the next thirteen years. Half-Life is therefore a key entry on my timeline for the total narrative and tactical immersion it provided the player.

While not a game per se, Ernest Adams' 2004 online article Postmodernism and the Three Types of Immersion1 is the next important milestone on my timeline because it introduced the idea and mechanics of “immersion” to a larger audience: gamers themselves. It gained further popularity when it was republished by the website Gamasutra.com in 2007, but in 2004 the blogging scene was still relatively new, and Adams' concise yet thoughtful writing helped gamers understand the mechanics behind what makes an immersive game shine, which in turn increased their attraction to, and respect for, the world of game design.

The article itself was written in response to reader backlash after Adams had criticized Hideo Kojima's Metal Gear Solid, a cinematic stealth-action game whose story leans heavily on its own existence as a video game, for having characters directly refer to “out-of-game objects” such as buttons, controllers, or the player him/herself. While Adams' argument is a tad shaky at points and is clearly written from a North American market viewpoint (part of the Metal Gear series' appeal in Japan and across Asia is its esoteric mixture of deadly serious melodrama and fourth-wall-breaking comedy), his article still sets out the standard definitions of immersion that I use in this paper. In particular, his insistence that total immersion requires a video game to distance itself from its own identity as a game reflects a very North American trend that began well before Half-Life (although that game went a long way towards making the notion a reality).

From a narrative-immersion perspective, gaming, per Adams, is a bizarre type of escapism: Western gamers want to immerse themselves completely in the player-character's experience while still keeping it separate from their everyday lives. This creates an interesting paradox when games are ballyhooed for realism so thorough that it intrudes on the escapist fun (for example, guns that jam or fall apart after overuse, forcing the player to scrounge for replacements). From a gameplay perspective, it seems, what Western audiences want is not realism but plausibility; the distinction being that developers may take certain liberties with reality in the name of maintaining the fun factor, so long as they do not stretch the player's willing suspension of disbelief too far.

It was partly for this reason that I selected my final narrative-immersion point on the timeline. At Blizzard Entertainment's 2007 BlizzCon (a convention dedicated solely to Blizzard's games), the vice president of creative development, Chris Metzen, announced that starting with 2010's StarCraft 2: Wings of Liberty, Blizzard would no longer refer to the players themselves as separate characters (usually in the role of a commander or general) in its future real-time strategy projects2. Instead, in StarCraft 2 the player-character is Jim Raynor, a human who had previously been a "hero" unit in the player-commander's army. This was a significant departure from the company's previous real-time strategy games, especially for a studio with a reputation for sticking to old-school design traditions.

The player-character-as-commander archetype had been an important narrative-immersion technique, crucial to tying the writing, plot development and presentation of real-time strategy games as a whole (not just Blizzard's) to the gameplay, in which the player oversees a large army commanded via mouse clicks and keyboard strokes in real time. Metzen wanted StarCraft 2 to tell a more character-driven story, and according to his presentation at the BlizzCon panel, several drafts of the game's story written with the player-as-commander in mind had felt awkward. When the game was finally released, however, the focus on Raynor as protagonist became one of the major complaints about its story: players felt separated from Raynor throughout the single-player campaign because he was merely another controllable unit on the ground during gameplay, yet the central character of the storyline. In short, Blizzard's decision to focus on Raynor rather than the players themselves backfired largely because the storyline, a character arc about Raynor's return from the fringe of society, failed to carry through into the gameplay.

As this timeline demonstrates, immersion in video games is a highly subjective topic. How well immersed a gamer becomes depends in large part on cultural taste, personal preference and the techniques used in the game's design itself. Hyper-immersive games like Half-Life can live alongside Metal Gear Solid, where an offscreen support team advises which buttons to press in perfect deadpan. Narrative, tactical and strategic immersion is a game-design theory rather than an established fact, and like any theory, it is about perception as much as reality. Thus the debates will rage on as to what, if anything, is the ideal way to ensure players remain fully engaged in the game's world until the final credits roll. For without this perpetually heightened engagement, the blood, sweat and tears of the developers, along with the profit margins of the investors, will never be realized.