Jessica Hammer is an assistant professor at Carnegie Mellon University, with a joint appointment between the Human-Computer Interaction Institute and the Entertainment Technology Center. She is a graduate of Harvard, with a B.A. in computer science, and earned a master of professional studies degree in interactive telecommunications at New York University and a Ph.D. in cognitive studies in education at Columbia University.
A member of SIGCHI, the American Educational Research Association, the International Academy of Digital Arts and Sciences, the International Game Developers Association and Women in Technology International, Hammer joined the CMU faculty in 2014. She spoke to Link Editor Jason Togyer.
What were your earliest gaming experiences?
I started designing games when I was 8 years old, planning birthday parties for everyone in the class. So I went to the library and got a book called “101 Party Games for Kids.” This book was my secret sauce. I would take the games in the book, and then modify them for each party.
Later on, I started babysitting and ran a youth group, and I designed games for those, but I didn't know it was a career path. I just thought it was something you did when you were a babysitter.
How did you get into computer science?
My dad was a professor of computer science at MIT, so the two things I was convinced I was never going to do were become a professor and study computer science. Instead, I was going to be a poet. My father, who was very tolerant, said, “Well, you should do whatever you want to do in life, but you should take one computer science class. Computer science is a powerful tool that will help you to do your work better.”
I almost failed the course, and that made me mad. I said, “No! Computer science is not going to beat me!” I went back and took the next course in the sequence, and at some point, I fell in love. I realized that solving problems and making things was really satisfying.
From there, did you go into game design?
No—I went to work for a non-profit that was teaching STEM skills to girls. In the office next door, there was a guy named Scot Osterweil who was developing the Zoombinis games. I kept going over to peek at what they were doing until finally Scot said, “Would you like to just come and work for me one day a week?” All of a sudden, I was doing meaningful things with computer science and code that would bring people so much pleasure, and in the case of Zoombinis, help them learn mathematical skills. It was bringing back all of these feelings I had from childhood, and I felt like I’d hit the jackpot.
After that, I started work as a game designer, but it wasn’t quite the same. Even though I was doing work I loved, I wasn’t making progress on the questions I wanted to answer, such as—how can you use games to change people’s lives?
What brought you to CMU?
The Human-Computer Interaction Institute was looking for someone who did game design, and when I saw what kind of work they were doing I said, “Yes, me!”
I actually have a joint appointment between the HCII and the Entertainment Technology Center. At the HCII, I run a research lab where I have Ph.D. students, while at the ETC, I work primarily with the master’s students who are doing deep project work on designing games and other interactive experiences. It’s a really rich opportunity to combine research into learning with research into the game design process, and it allows me to work on some projects that would be much harder to do in a conventional academic research lab.
What sort of research questions can you probe with a game?
You can make people do ridiculous things in games—jump up and down, croak like a frog, dance in front of a screen—because people participating in games have to take the context of the games seriously. The ability to do that is an amazing opportunity for research, because you can basically put people into whatever context you can imagine and then see what they do.
Right now, I’m working with an ETC team to develop a game that will change people’s beliefs about what happens after a natural disaster. In popular fiction, and even in news media coverage, portrayals of what happens after a major disaster depict life as Hobbesian—nasty, brutish and short. But that’s basically a lie. Communities are very resilient after a disaster, and the majority of people are incredibly resourceful and capable. Natural disasters often bring out the best in people.
The problem is that these narratives are not just popular in fiction—they’re narratives that policy makers use to make decisions. We think that we can make a game that can be played by young researchers and aides who put together policy proposals that are read by people who are more powerful. We are hoping we can make a game that has some power in the popular consciousness to change people’s minds, and change disaster policy.
Do people really change their beliefs based on an experience they have in a game?
The experiences you have in a game don’t disappear when you stop playing. Those experiences are vicarious, in the same way that reading a novel is, but they still matter. You still had the emotional reaction that you had. You can take advantage of this to design really interesting, powerful game experiences, so yes—people do.
Some politicians and public figures have been caught playing ultra-violent video games. Should we worry that their gaming activities are a sign of something they want to do in the off-line world?
The life someone has inside a game doesn’t necessarily reflect what a person literally wants to do. Perhaps it’s something that they need and aren’t getting—but that doesn’t have to be a literal need. When you tell me about a politician playing a super-violent game, I would say maybe what they need isn’t violence. Maybe what they need is simplicity. If you have to compromise and negotiate all day in your daily life, maybe when you get home and play a game, you just want to solve some problems by blowing things up.
Why is playing computer games still held in low esteem by some people?
I think there are certain game activities that are more acceptable than others. The idea of a kid spending 20 to 30 hours a week playing sports is perfectly acceptable, even though it’s a game. Digital games, though? It requires a certain amount of persuasion for people to see gaming as not just a ridiculous waste of time.
But think about a 14-year-old kid who’s leading a guild of 200 players in “World of Warcraft,” and organizing events where they’re overcoming major challenges and everyone has to be coordinated. They’re basically managing a small business.
What do you tell parents who are worried about the amount of time their kids are spending playing games?
I give them three pieces of advice. First, encourage your kids to play games socially. Playing games with other people is a great way to build relationships, to create opportunities to collaborate, to try on different identities and different roles. If you’re going to limit their game time, limit their solo game time and encourage their social game time. Second, worry less about game content and more about their behavior. So they want to play “Call of Duty” and blow things up? That’s less of a big deal than if they’re bullying other people in the game.
Finally, encourage them to look for games that allow players to do modding or level-building—creative ways to contribute to the game. And that includes non-digital games. Computers are one technology used to play games, but so are dice and pens and paper. Get them playing games that are hackable, because at the end of the day, that will have them thinking analytically about what makes one design better than another. If you’re writing rules for games, that’s not much different from writing code—the only difference is that the processor you’re writing for is the human brain.
Jason Togyer | 412-268-8721 | email@example.com