Videogames Play You And You're an NPC

a guest | Jul 18th, 2018
THE GAME PLAYS YOU

BJ Fogg: The Ed Bernays of Silicon Valley

Psychology and AI are used to brainwash users into an addictive stupor. BJ Fogg is the father of “persuasive technology,” a euphemism for computer-facilitated manipulation. Fogg’s functional triad describes the three roles technology can play. First, as a tool: this is the tool you desire (a social network) as well as its deeper functions (marketing research). Second, as media: how you interact with a tool (TV is passive, gaming is active). Third, as a social actor: your relationship with the technology.

What sort of relationship do you have with technology? A very one-sided one in which you are the dupe. Every method of social manipulation (Cialdini and Bernays) can be digitally simulated. You can reciprocate with a website that gives you points in exchange for clicks and likes. You can commit to a brand and stay consistent by using it. There is social proof in other users, likes, and virtual characters. There is authority in the brand and its experts. You can like the brand itself, individual representatives, or the mascot. But what of scarcity?

Loss is a greater motivator than gain. Humans learned throughout evolution that losses can be fatal whereas gains are only potential, so losses feel more real to us. By providing scarcity (or the threat of scarcity) of anything you want, the AIs can motivate you through loss aversion. That’s what keeps you logged on, playing the game, or trying to make new “friends.”

While the notion that businesses manipulate you into buying their products is as old as time, a few things here are different. First, this is backed not only by skilled psychologists and social scientists but also by real-time machine learning. Second, you can’t meaningfully resist a combined psychological assault that uses these tools, very often without your knowledge. Third, these products are harming people. Video games provide an illusion of mastery that keeps many young men (and women) from realizing a future in the real world. Social media perverts the impulse for social acceptance into a lonely hunt for virtual friends, likes, and other pixelated simulacra.

Our greatest minds have brought out the worst in us for profit and control. Do you consent to this control?

https://en.wikipedia.org/wiki/B._J._Fogg
http://archive.fo/92zZ7

https://en.wikipedia.org/wiki/Persuasive_technology
http://archive.fo/AYwBg

https://en.wikipedia.org/wiki/Captology
http://archive.fo/KuXu8

https://www.youtube.com/watch?v=fUHVDdAf3MI
https://hooktube.com/watch?v=fUHVDdAf3MI

Video gaming
https://www.gamasutra.com/view/feature/131494/behavioral_game_design.php
http://archive.fo/KxM5g
https://www.gamasutra.com/view/feature/131494/behavioral_game_design.php?page=2
http://archive.fo/B7fYp

>Some common terms in behavioral psychology as they apply to game design considerations:

>Reinforcer: An outcome or result, generally used to refer to a reward. Examples: an experience point, winning a level, a bigger gun.

>Contingency: A rule or set of rules governing when reinforcers are given. Also referred to as a schedule of reinforcement. Examples: a level every 1,000 experience points, a bonus level that is only available if you kill a certain opponent.

>Response: An action on the part of the player that can fulfill the contingency. This could be killing a monster, visiting an area of the game board, or using a special ability.

>There are essentially two fundamental sorts of contingencies, ratios and intervals. Ratio schedules provide rewards after a certain number of actions have been completed. For example, a player might gain an extra life after killing 20 opponents. This would be called a "fixed ratio" schedule, because the same number of kills is required every time. Other types of ratios will be discussed later.

>One of the most common contingencies found in games, fixed ratio schedules typically produce a very distinct pattern in the participant. First there is a long pause, then a steady burst of activity as fast as possible until a reward is given. This makes sense when one considers that the very first action never brings a reward, so there is little incentive to make that first kill. Once participants decide to go for the reward, they act as fast as they can to bring the reward quickly.
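
The fixed-ratio contingency quoted above reduces to a simple counter. The sketch below is not from the article; the class and names are invented for illustration:

```python
class FixedRatioSchedule:
    """Grants a reward after every `ratio` responses (e.g. kills)."""

    def __init__(self, ratio):
        self.ratio = ratio
        self.responses = 0

    def respond(self):
        """Record one response; return True when a reward is earned."""
        self.responses += 1
        if self.responses >= self.ratio:
            self.responses = 0  # reset the counter after the payout
            return True
        return False

# An extra life after every 20 kills, as in the excerpt's example:
schedule = FixedRatioSchedule(ratio=20)
rewards = sum(schedule.respond() for _ in range(100))
print(rewards)  # 100 kills at a 20-kill ratio -> 5 rewards
```

The post-reward pause the excerpt describes falls out of this structure: right after a payout the counter is back at zero, so the next reward is at its maximum distance.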

>On the other side of the coin there are interval schedules. Instead of providing a reward after a certain number of actions, interval schedules provide a reward after a certain amount of time has passed. In a "fixed interval" schedule, the first response after a set period of time produces a reward. For example, the game might introduce a power-up into the playing field 30 minutes after the player collected the last one.

>Participants usually respond to fixed interval contingencies by pausing for a while after a reward and then gradually responding faster and faster until another reward is given. In our power-up example, the player would concentrate on other parts of the game and return later to see if the new power-up had appeared. If it hadn't, the player would wander off again. Gradually the checks would become more frequent as the proper time approached, until at about the right time the player is sitting there waiting for it.

>There are also "variable interval" schedules, where the period of time involved changes after each reward. A counterpart to the variable ratio schedules, these also produce a steady, continuous level of activity, although at a slower pace. As in the variable ratio schedule, there is always a reason to be active. The power-up mentioned in the earlier example could reappear immediately after being collected or an hour later. The motivation is evenly spread out over time, so there are no low points where the players' attention might wander. The activity is lower than in a variable ratio schedule because the appearance is not dependent on activity. If the player looks for the power-up 1,000 times during the interval, it will appear no faster. Experiments have shown that we are very good at determining which consequences are the results of our own actions and which are not.
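
The fixed and variable interval schedules in the excerpts above can be sketched the same way, with time simulated as a counter rather than a real clock. All names here are invented for illustration:

```python
import random

class IntervalSchedule:
    """Rewards the first response after an interval of time has passed."""

    def __init__(self, base_interval, variable=False):
        self.base_interval = base_interval
        self.variable = variable
        self.current_interval = base_interval
        self.elapsed = 0

    def tick(self, units=1):
        """Advance simulated time; responding during the wait does nothing."""
        self.elapsed += units

    def respond(self):
        """Return True only for the first response after the interval."""
        if self.elapsed < self.current_interval:
            return False
        self.elapsed = 0
        if self.variable:
            # Redraw the wait so the player can never predict the next one:
            # anywhere from almost immediately to twice the base interval.
            self.current_interval = random.randint(1, 2 * self.base_interval)
        return True

# Fixed interval: a power-up reappears 30 minutes after the last pickup.
powerup = IntervalSchedule(base_interval=30)
powerup.tick(10)
print(powerup.respond())  # False: checking early never helps
powerup.tick(25)
print(powerup.respond())  # True: first check after 30 minutes pays off
```

Note that `respond()` never advances time, which captures the excerpt's point that looking for the power-up 1,000 times during the interval makes it appear no faster.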

>There are a few special cases in the study of contingencies that deserve special mention. First, there are "chain schedules," situations where there are multiple stages to the contingency. For example, players may have to kill 10 orcs before they can enter the dragon's cave, but the dragon may appear there at random points in time. These schedules are most commonly found in multi-stage puzzles and RPG quests, and people usually respond to them in a very specific way: they treat access to the next stage of the schedule as a reward in itself. In the example just mentioned, most players would treat the first part as a fixed ratio schedule, the reward being access to the subsequent variable interval schedule.

>Secondly, there is the question of what happens when you stop providing a reward, which is referred to as "extinction." Say the player is happily slaying the dragon every time it appears, but after a certain number of kills it no longer appears. What will the player do? The answer is that behavior after the end of a contingency is shaped by what the contingency was. In a ratio schedule, the player will continue to work at a high rate for a long period of time before gradually trailing off. In a fixed interval schedule, their activity will continue to peak at about the time they expect to be rewarded for a few intervals before ceasing.

>The moral here is that reducing the level of reinforcement is a very punishing thing for your players and can act as an impetus for them to quit the game. It needs to be done carefully and gradually, or there may be an undesirable backlash. This applies even to temporary reductions, such as when killing orcs stops producing points but the player has not yet discovered that trolls can be killed instead. Sudden loss of reward is very aversive and should be avoided when possible.

>How to make players play hard. Translated into the language we've been using, how do we make players maintain a high, consistent rate of activity? Looking at our four basic schedules, the answer is a variable ratio schedule, one where each response has a chance of producing a reward. Activity level is a function of how soon the participant expects a reward to occur. The more certain they are that something good or interesting will happen soon, the harder they'll play. When the player knows the reward is a long way off, such as when the player has just leveled and needs thousands of points before they can do it again, motivation is low and so is player activity.
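
The variable-ratio schedule the excerpt recommends, where each response has an independent chance of paying off, is the simplest of the four to sketch. The probability and names here are invented for illustration:

```python
import random

def variable_ratio_reward(p=0.05, rng=random.random):
    """Each response independently pays off with probability p, so the
    expected number of responses per reward is 1/p, yet a reward is
    always possibly just one response away. That constant possibility
    is what sustains a high, steady rate of activity."""
    return rng() < p

# Over many responses, roughly one in twenty pays off:
random.seed(0)
hits = sum(variable_ratio_reward(p=0.05) for _ in range(10_000))
print(hits)  # close to 10_000 * 0.05 = 500
```

Unlike the fixed-ratio counter, there is no post-reward pause built into this schedule: the odds of the next payout are identical whether the last reward arrived one response ago or a hundred.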

>How to make players play forever. The short answer is to make sure that there is always, always a reason for the player to be playing. The variable schedules I discussed produce a constant probability of reward, and thus the player always has a reason to do the next thing. What a game designer also wants from players is a lot of "behavioral momentum," a tendency to keep doing what they're doing even during the parts where there isn't an immediate reward. One schedule that produces a lot of momentum is the avoidance schedule, where the players work to prevent bad things from happening. Even when there's nothing going on, the player can achieve something positive by postponing a negative consequence.

>How to make players quit. In other words, under what circumstances do players stop playing, and how can you avoid them? I've discussed two main conditions under which players will stop playing. The first is pausing, where their motivation to do the next thing is low. Motivation is relative: the desire to play your game is always being measured against other activities. While they may have a high overall motivation to play your game, during play they're comparing their motivation to do the very next thing in the game to all the other next things they could be doing. If they've just gone up a level and know that they have an hour of play before anything interesting happens, their motivation will be low relative to all the other activities they could be doing.

>One way around this problem is to have multiple activities possible at any given time. This means that even if killing monsters becomes unrewarding, there are other activities within the game that can take up the slack. If monsters are unprofitable, exploration may be better. The player could take some time to improve their equipment or to practice a new tactic. Note that this is the same phenomenon that led to quitting before, a drop in motivation in the main activity raising the motivation of lesser activities. In this case, the lesser activities are also part of the game, redirecting their attention within the game and maintaining a high level of play.

>The other situation that can lead to quitting is the sharp drop in rate of reward which I discussed in the chimpanzee example. Just like motivation, reward is relative. The value of the current reward is compared to the value of the previous rewards. If the current reward is 10 times the last one, it will have a big impact on the participant. If the current reward is weaker than experience has led them to believe, the player will experience frustration and anger. Violation of expectations is perceived as an aggressive act, an unfair decision by the game's creators. While the game can get more difficult over time, it's best to avoid sharp changes in the rate of reward. This is particularly applicable to puzzle games, where the player may have to spend hours on the same problem before moving on to the next. If the current problem is sharply more difficult than previous puzzles, the player may simply walk away.

Nudging and behavioral psychology
https://www.1843magazine.com/features/the-scientists-who-make-apps-addictive
http://archive.fo/ZFmqn

>Fogg called for a new field, sitting at the intersection of computer science and psychology, and proposed a name for it: “captology” (Computers as Persuasive Technologies). Captology later became behaviour design, which is now embedded into the invisible operating system of our everyday lives. The emails that induce you to buy right away, the apps and games that rivet your attention, the online forms that nudge you towards one decision over another: all are designed to hack the human brain and capitalise on its instincts, quirks and flaws. The techniques they use are often crude and blatantly manipulative, but they are getting steadily more refined, and, as they do so, less noticeable.

>When you get to the end of an episode of “House of Cards” on Netflix, the next episode plays automatically unless you tell it to stop. Your motivation is high, because the last episode has left you eager to know what will happen and you are mentally immersed in the world of the show. The level of difficulty is reduced to zero. Actually, less than zero: it is harder to stop than to carry on. Working on the same principle, the British government now “nudges” people into enrolling into workplace pension schemes, by making it the default option rather than presenting it as a choice.

>Such upfront deliveries of dopamine bond users to products. Consider the way Instagram lets you try 12 different filters on your picture, says Fogg. Sure, there’s a functional benefit: the user has control over their images. But the real transaction is emotional: before you even post anything, you get to feel like an artist. Hence another of Fogg’s principles: “Make people feel successful” or, to rephrase it, “Give them superpowers!”

>Eyal has added his own twists to Fogg’s model of behavioural change. “BJ thinks of triggers as external factors,” Eyal told me. “My argument is that the triggers are internal.” An app succeeds, he says, when it meets the user’s most basic emotional needs even before she has become consciously aware of them. “When you’re feeling uncertain, before you ask why you’re uncertain, you Google. When you’re lonely, before you’re even conscious of feeling it, you go to Facebook. Before you know you’re bored, you’re on YouTube. Nothing tells you to do these things. The users trigger themselves.”

>Eyal’s emphasis on unthinking choices raises a question about behaviour design. If our behaviours are being designed for us, to whom are the designers responsible? That’s what Tristan Harris, another former student of Fogg’s, wants everyone to think about. “BJ founded the field of behaviour design,” he told me. “But he doesn’t have an answer to the ethics of it. That’s what I’m looking for.”

>In “Addiction by Design”, her remarkable study of machine gambling in Las Vegas, Natasha Dow Schüll, an anthropologist, quotes an anonymous contributor to a website for recovering addicts. “Slot machines are just Skinner boxes for people! Why they keep you transfixed is not really a big mystery. The machine is designed to do just that.” The gambling industry is a pioneer of behaviour design. Slot machines, in particular, are built to exploit the compelling power of variable rewards. The gambler pulls the lever without knowing what she will get or whether she will win anything at all, and that makes her want to pull it again.

The costs of computer-facilitated manipulation
https://medium.com/@richardnfreed/the-tech-industrys-psychological-war-on-kids-c452870464ce
http://archive.fo/TTLuT

Nestled in an unremarkable building on the Stanford University campus in Palo Alto, California, is the Stanford Persuasive Technology Lab, founded in 1998. The lab’s creator, Dr. B.J. Fogg, is a psychologist and the father of persuasive technology, a discipline in which digital machines and apps — including smartphones, social media, and video games — are configured to alter human thoughts and behaviors. As the lab’s website boldly proclaims: “Machines designed to change humans.”

If you haven’t heard of persuasive technology, that’s no accident — tech corporations would prefer it to remain in the shadows, as most of us don’t want to be controlled and have a special aversion to kids being manipulated for profit. Persuasive technology (also called persuasive design) works by deliberately creating digital environments that users feel fulfill their basic human drives — to be social or obtain goals — better than real-world alternatives. Kids spend countless hours in social media and video game environments in pursuit of likes, “friends,” game points, and levels — because it’s stimulating, they believe that this makes them happy and successful, and they find it easier than doing the difficult but developmentally important activities of childhood.

According to B.J. Fogg, the “Fogg Behavior Model” is a well-tested method to change behavior and, in its simplified form, involves three primary factors: motivation, ability, and triggers. Describing how his formula is effective at getting people to use a social network, the psychologist says in an academic paper that a key motivator is users’ desire for “social acceptance,” although he says an even more powerful motivator is the desire “to avoid being socially rejected.” Regarding ability, Fogg suggests that digital products should be made so that users don’t have to “think hard.” Hence, social networks are designed for ease of use. Finally, Fogg says that potential users need to be triggered to use a site. This is accomplished by a myriad of digital tricks, including the sending of incessant notifications urging users to view friends’ pictures, telling them they are missing out while not on the social network, or suggesting that they check — yet again — to see if anyone liked their post or photo.
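
The three-factor model described above (a behavior happens when motivation, ability, and a trigger coincide) can be paraphrased as a tiny predicate. The scales, threshold, and names below are invented for illustration, not taken from Fogg's papers:

```python
def behavior_occurs(motivation, ability, trigger_present, threshold=1.0):
    """Loose paraphrase of the Fogg-style model: a behavior occurs when a
    trigger arrives while motivation * ability exceeds some activation
    threshold. High ability (an easy, frictionless product) compensates
    for modest motivation, and vice versa."""
    return trigger_present and (motivation * ability) > threshold

# A notification (trigger) fires at a moment of boredom (motivation)
# on an app designed so the user doesn't have to "think hard" (ability):
print(behavior_occurs(motivation=0.8, ability=2.0, trigger_present=True))   # True
print(behavior_occurs(motivation=0.8, ability=2.0, trigger_present=False))  # False
```

Seen this way, the design levers the article lists map directly onto the parameters: ease of use raises `ability`, fear of social rejection raises `motivation`, and incessant notifications supply `trigger_present`.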

While social media and video game companies have been surprisingly successful at hiding their use of persuasive design from the public, one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.” Why would the social network do this? The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”

This is a major effect of persuasive design today: building video games and social media products so compelling that they pull users away from the real world to spend their lives in for-profit domains. But to engage in a pursuit at the expense of important real-world activities is a core element of addiction. And there is increasing evidence that persuasive design has now become so potent that it is capable of contributing to video game and internet addictions — diagnoses that are officially recognized in China, South Korea, and Japan, and which are under consideration in the U.S.

While girls are pulled onto smartphones and social media, boys are more likely to be seduced into the world of video gaming, often at the expense of a focus on school. High amounts of gaming are linked to lower grades, so with boys gaming more than girls, it’s no surprise to see this generation of boys struggling to make it to college: a full 57% of college admissions are granted to young women compared with only 43% to young men. And, as boys transition to manhood, they can’t shake their gaming habits. Economists working with the National Bureau of Economic Research recently demonstrated how many young U.S. men are choosing to play video games rather than join the workforce.

In truth, the harmful potential of using persuasive design has long been recognized. Fogg himself says in a 1999 journal article, “Persuasive computers can also be used for destructive purposes; the dark side of changing attitudes and behaviors leads toward manipulation and coercion.” And in a 1998 academic paper, Fogg describes what should happen if things go wrong, saying that if persuasive technologies are “deemed harmful or questionable in some regard, a researcher should then either take social action or advocate that others do so.”

More recently, Fogg has actually acknowledged the ill effects of persuasive design. Interviewed by Ian Leslie in 2016 for The Economist’s 1843 Magazine, Fogg says, “I look at some of my former students and I wonder if they’re really trying to make the world better, or just make money.” And in 2017 when Fogg was interviewed by 032c Magazine, he acknowledged, “You look around the restaurants and pretty much everyone has their phone on the table and they’re just being constantly drawn away from the live face-to-face interaction — I do think that’s a bad thing.” Nonetheless, Fogg hasn’t taken meaningful action to help those hurt by the field he fathered. Nor have those in positions of power, with the recent exception of a few tech execs coming forward, done anything to limit the manipulative and coercive use of digital machines against children and teens.