
GDC 2011: USER RESEARCH TALKS

04 Mar 2011

Yesterday was a pretty packed day, hence the late writeup. The first two sessions of the day were both related to user experience research. Things kicked off bright and early with Veronica Zammitto from EA.

She opened with a basic overview of the user research field. It's all about finding out what's going on in players' minds while they're playing the game, and there are two broad categories of techniques for doing this. Qualitative techniques include interviews, focus groups, "think-aloud" play sessions (where the player simply says everything he thinks while playing), and surveys. If you've ever conducted or participated in any kind of playtest, you've almost certainly encountered one or more of these. These are subjective techniques, and they answer the question, "Why did the player do that?"

Quantitative techniques, on the other hand, gather objective, continuous data which answers the question, "What did the player do?" These techniques operate at a much deeper level than simply observing recorded playtest footage: they include biometrics, eye tracking, telemetry, and so on. These quantitative techniques were Veronica's focus for the session, and she discussed them in the context of two EA case studies: user research on NBA Live 10 and on NHL 11.

Eye tracking data for NBA Live 10 indicated that a certain type of player was frequently looking at the teams' coaches standing on the sidelines. This was curious behavior, since the coaches have no function: they're there simply because you expect to see them. Veronica hypothesized that these players were looking to authority figures for feedback and/or suggestions about gameplay, the roles a coach would traditionally fulfill in the real-world sport. While this hypothesis was apparently not acted upon in NBA Live 10, her suggestion was to respond to the data by giving the coaches a gameplay role in line with what players apparently expected.

Telemetry - the recording and visualizing of game events - indicated that most shots and passes were taking place in the bottom half of the court. Why would players do this? Veronica hypothesized that the angle of the camera was the driver for this behavior: players didn't want their view of their active player and/or the ball to be obstructed by other players or objects in the foreground. The telemetry data made clear the effect a camera angle can have on gameplay.

It also exposed another issue: the telemetry for shots taken by the AI showed two highly concentrated spots on the court, meaning the AI was taking shots from precisely the same two positions a disproportionate amount of the time. This seemed to indicate an AI bug. (It's unclear whether it was addressed for the final product.)

On NHL 11, Veronica used eye-tracking to determine which UI elements were receiving the player's attention and which were ignored. She found that a staggering 75% of UI displays were never even looked at. She also found that when players did focus on something, they tended to do so for less than one second, making clear just how little time game developers have to communicate vital information to players.

(FYI I took photos of a ton of slides, because this session was awesome non-stop data porn, but as I'm about to leave the conference I'm not in a great wifi situation, so I'll have to post those later.)

The second talk was by Mike Ambinder of Valve, who extended the user research theme into more experimental applications. His theory is that incorporating biofeedback directly into gameplay will create more immersive, dynamic, and calibrated user experiences. Some possible applications of these technologies include:

  • Analyzing your own biofeedback data after a game
  • Adaptive experiences and dynamic difficulty
  • Identifying optimal patterns of engagement and arousal
  • Using emotional state as gameplay input (e.g. a lie-detector test)
  • Matchmaking based on physical/emotional profile
  • In competitive games, revealing players' biometrics to spectators in real time
  • Multiplayer mechanics (saving a panicking teammate, etc.)
  • Playtesting (of course)

Mike has conducted several experiments at Valve, incorporating biofeedback into gameplay in these ways. The first one he showed us modified Left 4 Dead 2's AI director. Normally, the director adjusts intensity based on a single "estimated arousal" value: this number is raised by traumatic events (such as spotting an enemy or getting shot) and decays slowly over time. For the experiment, Mike used a technique called Galvanic Skin Response (GSR), which measures the electrical conductivity of the skin to indicate emotional arousal (conductivity increases when players become stressed, for example). He fed the GSR data into the AI director in place of the "estimated arousal" value.
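
To make the substitution concrete, here's a minimal sketch (in Python) of how an event-driven arousal estimate with slow decay might work, and how a biofeedback signal could be swapped in for it. All names, constants, and the normalized GSR input are my assumptions, not Valve's actual code:

    # Illustrative "estimated arousal" model: traumatic events raise the
    # value, and it decays slowly over time. Constants are guesses.
    DECAY_PER_SECOND = 0.05  # assumed fraction-per-second decay rate

    class ArousalEstimator:
        def __init__(self):
            self.arousal = 0.0

        def on_event(self, intensity):
            # Spotting an enemy, getting shot, etc. bumps the estimate.
            self.arousal = min(1.0, self.arousal + intensity)

        def update(self, dt):
            # Decay slowly back toward calm between events.
            self.arousal *= (1.0 - DECAY_PER_SECOND) ** dt
            return self.arousal

    def director_input(estimator, dt, gsr_reading=None):
        # The experiment, conceptually: when a normalized GSR reading is
        # available, feed it to the AI director in place of the estimate.
        return gsr_reading if gsr_reading is not None else estimator.update(dt)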

He showed a video of a play session with the GSR data overlaid. We watched as the player's stress level, as indicated by the GSR monitor, started low and spiked when he was ambushed. Toward the end of the video he was attacked by a tank and the GSR levels went off the charts, much to the audience's amusement. Mike considered the experiment a success: players reported more enjoyment and challenge with the GSR-driven AI director than with the original one.

In Alien Swarm, Mike tried a different experiment. He set a goal of killing 100 enemies in four minutes, but tied rising GSR levels to a faster timer. Thus, the optimal strategy is to stay as calm as possible, which keeps the timer slow and gives you more time to achieve the goal.

Unfortunately this experiment created a frustrating feedback loop. When players became stressed, the timer sped up. Seeing the timer going faster made players more stressed, and the whole system pretty much blew up. Players did recognize that this was a qualitatively different experience than the "regular" game, and appreciated that, but there's still a long way to go before this experiment yields viable gameplay results.
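
Here's a minimal sketch of the timer mechanic as described, which also makes the loop easy to see: higher stress drains the timer faster, and a faster timer raises stress. The base rate, scaling constant, and normalized GSR input are my assumptions, not numbers from the talk:

    BASE_RATE = 1.0       # timer seconds consumed per real second when calm
    MAX_EXTRA_RATE = 2.0  # additional speed-up at maximum stress (assumed)

    def tick_timer(time_remaining, dt, gsr_normalized):
        # Scale the countdown by stress in [0, 1]: staying calm keeps the
        # timer slow and leaves more time to kill the 100 enemies.
        stress = max(0.0, min(1.0, gsr_normalized))
        rate = BASE_RATE + MAX_EXTRA_RATE * stress
        return max(0.0, time_remaining - dt * rate)

    # e.g. a calm player (gsr=0.1) loses 1.2 timer seconds per real second,
    # while a panicking player (gsr=0.9) loses 2.8.
    remaining = tick_timer(240.0, 1.0, 0.1)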

For Portal 2, Mike hooked up eye tracking to the movement of the crosshair, allowing players to aim the portal gun simply by looking at a surface, while still controlling their movement with WASD. Watching the video of this was a little weird, as the crosshair was jumping all over the screen in unpredictable ways, but users reported that this felt very natural (and of course the crosshair movement would make perfect sense if your own gaze was controlling it). This was a successful experiment, but as Mike noted, consumer-grade eye tracking setups are still a long way off.
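
In its simplest form, gaze-driven aiming is just a remapping of the tracker's raw gaze sample to the crosshair each frame. The sketch below is illustrative only (there's no real eye tracker API here, and the talk didn't describe the implementation), assuming the tracker reports gaze in screen pixels:

    def update_crosshair(gaze_x, gaze_y, screen_w, screen_h):
        # Clamp the raw gaze sample to the screen and use it directly as
        # the crosshair position. With no smoothing, the crosshair jumps
        # with every saccade, which is why the spectator video looked
        # erratic even though it felt natural to the player.
        x = max(0, min(screen_w - 1, int(gaze_x)))
        y = max(0, min(screen_h - 1, int(gaze_y)))
        return (x, y)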

Finally, Mike tried exposing GSR levels to players' opponents in a multiplayer game. Either he didn't mention which game, or I just missed it... but in any case, players unanimously found it extremely satisfying to watch their opponents' GSR levels spike under stress. This was a very successful experiment, although it didn't change gameplay significantly.

My primary takeaway from these talks is that user research is an awesome field which can greatly aid game designers in crafting more compelling and broadly-appealing experiences. That said, user research does get very close to "metrics", which these days are often associated with the shadier aspects of social games development. I think there's an interesting comparison to be made there, and I'll explore it after GDC when I write up this year's social games panels and my thoughts thereon.

Posted In:

game-design gdc video-games