
User-centered privacy: Tools for involving users in privacy by design

By Ms. Oshrat Ayalon
Location: Bloomfield 527
Academic Program: BS
Tuesday 11 December 2018, 11:30 - 12:30

System designs that do not meet users’ privacy and ethical expectations can startle users and lead them to abandon the system altogether. Privacy by design (PbD) calls for building privacy into systems from the ground up and is central to major privacy legislation, such as the recent European General Data Protection Regulation (GDPR). However, PbD has also been criticized for focusing too heavily on compliance with privacy regulation rather than on meeting users’ privacy expectations. My dissertation aims to help information systems developers build systems that are better aligned with users’ privacy expectations.

I present the results of two sets of studies to develop assistive tools to be used by developers. In the first set of studies (n = 952), we explored the effect of information framing on users’ perceptions of a system’s appropriateness. I compared participants’ appropriateness judgements about a system’s privacy design when user information was framed at different levels, ranging from raw data to advanced personas (hypothetical archetypes of actual users). We concluded that framing information using personas can help in designing systems that users perceive as more appropriate.

In the second set of studies (n = 959), we focused on testing whether a particular system meets users’ expectations. These expectations span both institutional privacy, covering data handling between the user and the system, and social privacy, covering the mechanisms for managing relationships between end-users. We developed and evaluated the Users’ Perceived Systems’ Privacy (UPSP) scale, which addresses both institutional and social privacy. To demonstrate how the scale can be used, we developed A/P(rivacy) Testing, a platform that allows designers to compare several privacy design alternatives by eliciting end-users’ privacy perceptions of a tested system. Our findings show how A/B testing methodology can be applied for privacy purposes, and that our scale is sensitive enough to differentiate between designs that are perceived as legitimate and designs that may violate users’ expectations.
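To make the A/B-testing idea concrete, here is a minimal sketch of how such a between-subjects comparison might be analyzed. It assumes each participant saw exactly one of two privacy designs and rated it on a set of Likert items in the spirit of the UPSP scale; the item ratings, scale length, and statistical test shown here are illustrative assumptions, not the published instrument or the platform’s actual analysis.

```python
# Minimal sketch of an A/B ("A/P") comparison of two privacy designs.
# Assumes a between-subjects setup: each participant rated ONE design on
# several 7-point Likert items. Ratings below are hypothetical examples.
from statistics import mean
from scipy import stats

def upsp_style_score(item_ratings):
    """Average one participant's Likert ratings into a single score."""
    return mean(item_ratings)

# Per-participant item ratings (1 = strongly disagree, 7 = strongly agree).
design_a = [upsp_style_score(r) for r in [[6, 5, 6], [7, 6, 6], [5, 5, 6], [6, 6, 7]]]
design_b = [upsp_style_score(r) for r in [[3, 2, 4], [4, 3, 3], [2, 3, 3], [3, 4, 2]]]

# Welch's t-test: do the two designs differ in perceived privacy?
t, p = stats.ttest_ind(design_a, design_b, equal_var=False)
print(f"Design A mean = {mean(design_a):.2f}, Design B mean = {mean(design_b):.2f}")
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```

A between-subjects design with a sensitive scale is what lets a platform like this flag the alternative whose perceived privacy is significantly lower as the one likely to violate users’ expectations.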