On his last day at GDC 2011, Loyd Case attended some of the lectures and discussions hosted by respected names in the gaming industry. We have his take on five of those talks, covering biofeedback, gaming history, the future of strategy games, and art.
Early attempts at biofeedback in games have met with mixed results. Perhaps the best-known effort comes from NeuroSky, whose headset sensor hasn't gained traction in mainstream gaming.
At least one major game developer is diving deeper into the topic. Mike Ambinder works as a research psychologist for Valve, looking into the potential for biofeedback in games. We’re not talking about games where you relax and concentrate on moving a little ball around the screen, either.
Why would you want biofeedback in a game like Left 4 Dead 2? Games engage our emotions on a moment-by-moment basis, and using those emotions as an input gives games the potential for more dynamic and immersive environments. Done right, biofeedback can also help calibrate the game: automated difficulty adjustment has historically been very crude, and real physiological data could substantially improve it.
The psychology of emotions is often modeled along two dimensions: arousal (the magnitude or intensity of the emotion) and valence (its direction, positive or negative). High arousal paired with positive valence is usually associated with positive emotions, like being happy; a person in that quadrant would be energetic, jubilant, and engaged. Down and to the left (low arousal, negative valence) means a bored or passive player.
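The two-axis model above can be sketched in a few lines of code. This is purely illustrative (the quadrant labels and the normalized ranges are my assumptions, not anything from Ambinder's talk): each reading is a valence and arousal value in [-1.0, 1.0], and the quadrant gives a coarse emotional label.

```python
def classify_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) reading onto a quadrant label.

    Valence runs negative-to-positive; arousal runs low-to-high.
    Quadrant labels are illustrative, not a standard taxonomy.
    """
    if arousal >= 0:
        return "excited/jubilant" if valence >= 0 else "angry/frightened"
    return "calm/content" if valence >= 0 else "bored/passive"

# High arousal, positive valence: the energetic, engaged player.
print(classify_emotion(0.8, 0.9))    # excited/jubilant
# "Down and to the left": low arousal, negative valence.
print(classify_emotion(-0.5, -0.6))  # bored/passive
```

A real system would of course have to infer valence and arousal from noisy physiological signals first, which is the hard part Ambinder's talk focused on.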
Physiological signals useful for biofeedback include heart rate, skin conductance level (SCL), facial expressions, eye movements, and EEG (electroencephalography, the recording of electrical activity along the scalp produced by the firing of neurons within the brain). Ambinder dug into the pros and cons of a number of these methods, but the key point is that collecting the data isn't easy: it's generally expensive and, depending on the technique, subject to bias. After all, if you know your emotional state is being tracked, that knowledge alone can alter it.
The coolest part of the whole talk, though, was the demos. Ambinder demonstrated eye tracking in a Portal 2 level. This technique might be more useful to level designers than to players themselves: by tracking where a player looked as she moved through the level, designers could see which areas and objects the player considered important.
In another demo, more directly applicable to gamers, Ambinder described an experiment in which the AI Director in Left 4 Dead 2 was modified to respond to signals from an SCL sensor. The AI Director already tries to respond to player arousal, but only indirectly, through controller inputs and idle time. Now actual SCL data would affect when mobs showed up and even the type of boss zombie the players might encounter.
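To make the idea concrete, here is a toy sketch of SCL-driven pacing, loosely in the spirit of the experiment described above. Everything here is hypothetical: the function name, the thresholds, and the baseline-deviation heuristic are my assumptions, not Valve's actual implementation.

```python
def next_spawn_intensity(scl_readings: list[float], baseline: float) -> float:
    """Return a spawn-intensity multiplier from recent skin-conductance
    readings: back off when the player is highly aroused, and ramp up
    when the signal sits near baseline (the player is coasting).
    """
    avg = sum(scl_readings) / len(scl_readings)
    arousal = (avg - baseline) / baseline  # relative deviation from baseline
    if arousal > 0.5:   # player already stressed: ease off
        return 0.5
    if arousal < 0.1:   # player coasting: time to send a mob
        return 1.5
    return 1.0          # comfortable tension: hold steady

# Readings well above an 8.0 baseline suggest a stressed player.
print(next_spawn_intensity([12.0, 13.0], baseline=8.0))  # 0.5
# Readings near baseline suggest boredom, so intensity ramps up.
print(next_spawn_intensity([8.0, 8.2], baseline=8.0))    # 1.5
```

The existing AI Director makes a similar call using indirect proxies like controller input; the point of the experiment was that a direct physiological signal could drive the same decision more accurately.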
The graph shows the progress through one level; you can see sharp rises in the SCL data as mobs or key bosses appear.
Whether biofeedback will ever go mainstream is open to debate, but Valve's research suggests that building biofeedback into certain types of games could pay off in a better gaming experience.