Societal Computing Thesis Proposal

  • Remote Access - Zoom
  • Virtual Presentation - ET
  • Ph.D. Student
  • Ph.D. Program in Societal Computing
  • Institute for Software Research, Carnegie Mellon University

Exploring the adoption of privacy protective behaviors

Adoption of protective tools and technologies for digital privacy and security continues to be limited, despite reports indicating widespread privacy concerns in the population. This apparent gap between concerns and behaviors, known as the “privacy paradox,” has drawn the attention of many scholars. At the same time, anecdotal evidence suggests that some portion of the population may display a different—in fact, inverse—type of behavior: expressing that privacy is not important to them, while engaging in privacy protective behaviors. I call this phenomenon the “reverse privacy paradox.” In this thesis, I first present two case studies on the adoption (or non-adoption) of privacy protective behaviors, and then I explore the reverse privacy paradox.

In the first part of this thesis, I explore two case studies of protective behavior adoption: one captured through observed behavior (two-factor authentication adoption at Carnegie Mellon University) and one through self-reported intentions (adoption of a personalized privacy assistant for the Internet of Things). These studies allowed me to observe the influence of many variables from behavior engagement and technology adoption models, such as how easy a system is to use, the benefits to be derived from using it, and social influence. Previous work on technology adoption has gone back and forth on the importance of users' self-efficacy, that is, their confidence in their ability to engage with the technology at hand. I found that efficacy does play a role, but not self-efficacy: participants' perception of whether engaging in a particular behavior would actually make a difference affects their decision to adopt that behavior in the first place. Furthermore, the qualitative nature of the second study allowed me to notice that some people would say that privacy was not important to them, or that they did not care about it, and yet still engage in privacy protective behaviors.

In the second part of this thesis, I focus on understanding the engagement in protective behaviors of this group of people, in order to investigate the possibility of a reverse privacy paradox. To do so, I will develop mechanisms to collect people's perspectives on privacy, as well as the protective behaviors they engage in. To identify situations where the reverse privacy paradox may be observed, I will first ask participants about their privacy attitudes, preferences, and concerns, and then ask which protective behaviors they engage in. For participants who exhibit the reverse privacy paradox, I will further investigate why this discrepancy occurs. I expect to present results that contribute to our understanding of the reverse privacy paradox phenomenon.

Thesis Committee:
Lorrie Faith Cranor (Co-Chair)
Alessandro Acquisti (Co-Chair)
Laura Dabbish
Heng Xu (American University)

Zoom Participation: See announcement.
