Why Should HCI Researchers Care About Privacy?
Human-computer interaction (HCI) plays a crucial role in assisting design teams in addressing the challenges of safeguarding privacy and personal information. By leveraging HCI, teams can gain insight into the diverse perceptions of privacy among individuals. Westin identifies four key states of privacy: solitude, intimacy, anonymity, and reserve. Additionally, practical expressions of privacy, as noted by Murphy, include the right to be free from physical invasion of one's home or personal space, "the right to make certain personal and intimate decisions free from government interference," the right to control the commercial use of one's name and image, and the right to manage one's personal information. These expressions highlight the complex and often conflicting views on privacy. While many scholars advocate for privacy as a fundamental right, Moor challenges this notion by arguing that privacy is not a core value like life, security, or freedom, but rather a means to safeguard personal security.
The concept of tradeoff is central to discussions about privacy. As early as 1890, Warren and Brandeis argued that privacy should be balanced with the public interest, a stance supported by numerous court rulings since. In system design, tradeoffs often arise between security or business requirements and end-user privacy needs, necessitating careful resolution. HCI practitioners, with their comprehensive understanding of user-technology interactions, are uniquely equipped to navigate and address these competing interests effectively.
Privacy is intricately linked to various social concerns, including control, authority, appropriateness, and appearance. For instance, while parents may view location-tracking phones as a means to ensure safety, children may see the same technology as an infringement on their independence and identity. This dynamic is illustrated by Goffman's insights into behavior within small social groups. A practical example is the act of closing an office door, which not only safeguards privacy but also underscores an individual's authority compared to colleagues without private offices. The thoughtful application of HCI tools can therefore significantly enhance the accuracy and quality of the assumptions and requirements that inform system design.
Privacy often presents a challenge to rationalize, as studies reveal a discrepancy between individuals' stated privacy preferences and their actual behaviors. Many people struggle to accurately assess low-probability yet high-impact risks, particularly when the harmful event occurs far removed from its initial cause. For instance, a poorly considered blog post or a spontaneous photo shared online can lead to unforeseen embarrassment years later. Additionally, privacy is complicated by the many exceptions that arise from specific situations and historical contexts. This need for adaptability is evident in data protection laws and in social science research, which characterize privacy as a dynamic process of defining interpersonal boundaries rather than a fixed state [23]. The use of modern "behavioral" inquiry techniques in HCI can help explicate these behaviors and exceptions.
Evaluating the impact of technology on privacy poses significant challenges, as there are limited methods to identify which privacy features are essential for consumer adoption. Furthermore, it remains unclear how to assess the actual level of privacy a system provides, or its overall return on investment. Privacy, akin to "usability" and "security," is a comprehensive attribute of interactive systems, encompassing both the technology and its users. A single inadequately designed component that compromises personal information can undermine the integrity of the entire system.
In short, HCI is essential for design teams addressing privacy challenges, offering valuable tools to explore perceptions of privacy threats and the sharing of personal information. This paper primarily focuses on previous research that illuminates these privacy concerns and assesses how effectively systems support or hinder preferred privacy practices.
Despite significant advancements in our understanding of privacy in HCI over the past three decades, substantial research challenges persist. In conclusion, this article highlights five key "grand challenges" that need to be addressed in the realm of HCI and privacy:
– Developing standard privacy-enhancing interaction techniques.
– Developing analysis techniques and survey tools.
– Documenting the effectiveness of design tools, and creating a “privacy toolbox.”
– Furthering organizational support for managing personal data.
– Developing a theory of technological acceptance, specifically related to privacy.
These are significant challenges; by concentrating research on them, the field can achieve timely and impactful results that benefit all users of information technology.
Sources Used and Limitations of this Survey
This survey paper focuses on the research literature within Human-Computer Interaction (HCI), Computer-Supported Cooperative Work (CSCW), and various other fields of Computer Science. Readers should recognize, however, that extensive literature on privacy also exists in Management Information Systems (MIS), advertising and marketing, human factors, and the legal community.
The MIS community has concentrated mainly on corporate organizations, highlighting how privacy perceptions and preferences significantly influence technology adoption among customers and shape employee relationships. Meanwhile, the advertising and marketing literature has explored privacy concerns related to privacy policies and their implications for consumer behavior, as evidenced by research such as that conducted by Sheehan.
The legal community has examined how specific technologies impact established legal balances, including court rulings and constitutional norms. While this article does not delve into the legal literature, given its limited direct applicability to IT design, such scholarly work can nonetheless inform HCI. Researchers might find valuable insights in analyses related to data protection, the interplay between legislation and technology, identity, data mining, and employee privacy.
Strahilevitz, for example, presents a methodology to assist courts in determining an individual's reasonable expectation of privacy, drawing on the social networking literature. Similarly, Murphy examines the implications of the default privacy rule regarding the disclosure or protection of personal information.
Privacy research is also deeply connected to security research. We will not cover HCI work in the security domain here; for a more comprehensive treatment, we recommend the books Security and Usability and Multilateral Security in Communications.
We also only tangentially mention IT management. Management is becoming increasingly important in connection to privacy, especially after the enactment of data protection legislation. Despite their significance, privacy management issues are often overlooked in academia, while industry experts regard this knowledge as a strategic and confidential asset, leading to a scarcity of published research. Although governments sometimes release reports on privacy management, a considerable amount of unpublished knowledge remains, particularly in the realms of CSCW and e-commerce.
This survey paper emphasizes the experiences of end-users of personal applications in telecommunications and e-commerce, while only partially addressing workplace applications. A key factor in acceptance models, such as Venkatesh et al.'s extension of the Technology Acceptance Model, is perceived control over information. Additionally, Kraut et al. highlight that in CSCW settings, factors such as usefulness, critical mass, and social influence significantly impact the adoption of new technologies.
This section explores foundational concepts in the privacy discourse and presents two key perspectives that shape research and design efforts related to privacy. The first perspective contrasts principled views of privacy with views based on common interests; the second highlights the distinction between informational self-determination and personal privacy. Additionally, we offer a historical overview of 30 years of privacy-focused HCI research, examining how privacy expectations have evolved alongside technological advancements.
Often-Cited Legal Foundations
This section outlines essential legal resources frequently referenced by privacy researchers. We believe that HCI researchers focusing on privacy should familiarize themselves with these texts, as they provide insight into how privacy issues are addressed from social and legal perspectives, while also highlighting gaps in existing legislation.
In their influential 1890 Harvard Law Review article "The Right to Privacy," Samuel Warren and Louis Brandeis established a foundational concept in US legal tradition by asserting that individuals possess a distinct right to "be let alone." They argued for the protection of personal life details from unwarranted public disclosure, an argument that aligns with the contemporary idea of informational self-determination. Notably, Warren and Brandeis did not reference the US Constitution in their discussion, highlighting the evolving nature of privacy rights.
The Fourth Amendment of the US Constitution safeguards individuals' property and dwellings against unreasonable searches and seizures, a protection that has been extended to electronic property and communications. This amendment is frequently referenced by privacy advocates, particularly concerning surveillance technologies and regulations on cryptographic tools. Additionally, the Fourth Amendment serves as a foundational element for various privacy laws in the United States, including the Electronic Communications Privacy Act, or ECPA. Constitutional guarantees of privacy also exist in other legal texts, for example the EU Convention on Human Rights [67, §8].
In the United States, case law also significantly informs HCI practitioners, particularly regarding the implications of new technologies for individual privacy. Key cases include Olmstead v. United States (1928), which upheld the constitutionality of telephone wiretapping, and Katz v. United States (1967), which reversed Olmstead, emphasizing the evolving legal landscape surrounding privacy rights. Kyllo v. United States further addresses these critical issues, highlighting the ongoing legal discourse on technology and privacy.
1 Warren and Brandeis asserted that the right to privacy is distinct, as it pertains to personal writings and information that cannot be classified as intellectual property or as assets that yield future profits.
2 “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, […].”
3 The Electronic Communications Privacy Act (ECPA) establishes regulations for recording telecommunications and personal communications at the federal level in the United States, including government wiretapping. It generally prohibits recordings unless at least one party is aware, and it mandates different types of warrants for law enforcement to conduct wiretapping or record telecommunications data. Notable cases include Kyllo v. United States (2001), which addressed the use of advanced sensing technologies by police, and Bartnicki v. Vopper (2001), which focused on the interception of over-the-air cell phone transmissions.
HCI professionals in the privacy field should also stay informed about rulings and reports from regulatory entities such as the FTC, the FCC, and the European Data Protection Authorities. Notably, the EU Article 29 Working Party has provided guidance on critical issues, including the implications of video surveillance, the application of biometric technologies, and the importance of clear and simplified privacy policies.
HCI researchers frequently reference legal frameworks such as the European Data Protection Directive of 1995 and HIPAA, the US Health Insurance Portability and Accountability Act of 1996, both influenced by the Fair Information Practices. These data protection laws establish intricate data management requirements and end-user rights. It is essential for HCI practitioners to recognize that privacy protection varies across jurisdictions and encompasses more than just the constitutional rights and laws mentioned here.
Philosophical Perspectives on Privacy
Discussions surrounding privacy are often shaped by individual viewpoints, as the values and priorities of designers play a crucial role in shaping solutions. This section explores various perspectives on privacy without promoting a specific stance. Readers are also encouraged to consult the ethical principles of professional organizations such as the ACM and IFIP. Understanding diverse viewpoints on privacy is essential, as it equips designers with a framework for choosing the most suitable approach for addressing particular challenges.
2.2.1 Principled Views and Common Interests
The principled view regards privacy as a fundamental human right, a perspective upheld by constitutional texts such as the Fourth Amendment of the US Constitution and by documents such as the European Convention on Human Rights. It contrasts sharply with the communitarian view.
The communitarian perspective prioritizes the collective good, advocating a utilitarian approach to privacy that may limit individual rights for the benefit of society as a whole. This contrast highlights the ongoing debate over privacy in the context of ubiquitous computing technologies, as explored in the research of Terrel, Jacobs, and Abowd.
The ongoing tension between principled and utilitarian perspectives is evident in discussions surrounding various technologies, such as mandatory HIV testing and video surveillance. Etzioni highlights the pros and cons of these approaches, particularly in the realm of information and communication technologies. This conflict is prominently illustrated in the debates between civil liberties organizations, such as the Electronic Frontier Foundation, and government entities regarding the use of strong encryption technologies versus surveillance systems.
Diverse perspectives within the privacy research community likewise produce varying approaches to privacy-enhancing technologies (PETs). Some PETs are developed primarily from a principled standpoint rather than for commercial success; certain researchers emphasize that their very existence plays a crucial role in influencing policy discussions, regardless of actual adoption or market viability. This principle-driven approach is a key reason why organizations such as the Electronic Frontier Foundation advocate for these initiatives.
2.2.2 Data Protection and Personal Privacy
The second perspective distinguishes data protection from personal privacy. Data protection, or informational self-determination, involves the management of personally identifiable information by governments and commercial entities. This concept focuses on safeguarding such data through regulations that dictate how, when, and for what purposes information can be collected, used, and disclosed. Its modern interpretation is rooted in the work of Alan Westin and others [307], and it came about because of concerns over how databases could be used to collect and search personal information [288].
Westin's contributions led to the development of the Fair Information Practices (FIPS), essential guidelines for managing personal information. Key principles of the FIPS include purpose specification, participation, and accountability. These guidelines have significantly shaped privacy research, influencing standards such as P3P, website privacy policies, and data management strategies. Recently, the FIPS have been adapted to address emerging technologies such as RFID systems and ubiquitous computing.
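To make the purpose-specification principle concrete, the following minimal sketch (our own illustration, not drawn from the FIPS texts or any cited system; the data items and purpose names are hypothetical) shows how a system might record the purposes declared at collection time and reject later uses that fall outside them:

```python
# Hypothetical purpose-specification check in the spirit of the FIPS:
# each data item carries the purposes declared when it was collected,
# and any subsequent use must match one of those declared purposes.

DECLARED_PURPOSES = {
    "email": {"order_confirmation", "account_recovery"},
    "shipping_address": {"order_fulfillment"},
}

def use_allowed(data_item: str, purpose: str) -> bool:
    """Return True only if `purpose` was declared for `data_item` at collection."""
    return purpose in DECLARED_PURPOSES.get(data_item, set())

# A use consistent with a declared purpose is permitted...
assert use_allowed("email", "account_recovery")
# ...while repurposing the same data (e.g., for marketing) is rejected,
# as is any use of data that was never collected under a declared purpose.
assert not use_allowed("email", "marketing")
assert not use_allowed("phone", "order_confirmation")
```

Real data protection regimes add participation (subject access) and accountability (audit) on top of such checks; this fragment only illustrates the purpose-binding idea.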
Personal privacy, by contrast, involves how individuals control their privacy in relation to other people, rather than large organizations. According to Palen and Dourish, privacy is an ongoing process of defining boundaries in which disclosure and identity are fluidly negotiated; the use of window blinds and doors exemplifies this boundary-setting. Similar insights have been noted by other researchers, such as Darrah et al., who found that individuals create strategies to limit their accessibility while also wanting to connect with others. Westin likewise emphasized that individuals constantly navigate a balance between their need for privacy and their desire for communication and disclosure.
Altman's research draws inspiration from Goffman's exploration of social interactions within small groups. Goffman highlights that individuals present varied personas depending on their audience and context; for instance, a doctor may adopt a formal demeanor in a hospital setting but be more relaxed and candid with friends and family. This variability in roles poses challenges for designing interactive systems, as these nuanced behaviors are difficult to accurately capture or model algorithmically.
Personal privacy serves as an effective framework for understanding individuals' use of information technology, especially in scenarios where the information needing protection is ambiguous, such as managing interruptions or subtle interpersonal communications. The decision to share personal information is highly contextual, influenced by the social and historical backgrounds of those involved; for instance, individuals face dilemmas about whether to disclose their location while using mobile devices or "friend finder" applications. Current studies indicate that such nuanced situations are difficult to encapsulate within the rigid privacy policies commonly found in data protection regulations.
Data protection emphasizes the dynamic between individual citizens and large organizations, where the power of knowledge is rooted in its volume. Conversely, personal privacy pertains to interpersonal relationships and close social circles, where the primary concern revolves around intimacy.
Understanding which perspective applies is crucial for effective design. Modeling privacy according to data protection guidelines leads to improved access control and usage policies for personal information, an approach essential for many IT applications, including healthcare and e-commerce. Key design tools that reflect the data protection perspective include website privacy policies, consent checkboxes, certification programs such as TRUSTe, and regulations that enhance consumer trust in organizations.
Applications that manage access to personal spaces and communication, such as chat and social networking sites, are often poorly served by data protection-style design. Overly detailed policies for instant messaging, for instance, would complicate the user experience; instead, instant messaging clients offer real-time control over availability, allowing users to manage their presence easily. In contexts that affect personal privacy, what is needed is ongoing and dynamic negotiation, enabling individuals to project their desired persona based on social context and expectations of behavior.
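The contrast between static policy and real-time negotiation can be sketched in code. The following is a hypothetical model (our own, not the API of any real IM client): a single global status that can be flipped instantly, with optional per-contact overrides so the persona projected varies with social context:

```python
# Hypothetical sketch of real-time availability control in an IM client:
# instead of a detailed static privacy policy, the user flips one status
# and may override it per contact to project different personas.

class Presence:
    def __init__(self, default_status="available"):
        self.default_status = default_status
        self.per_contact = {}  # contact name -> status override

    def set_status(self, status, contact=None):
        """Change the global status, or override it for one contact."""
        if contact is None:
            self.default_status = status      # coarse, instantaneous control
        else:
            self.per_contact[contact] = status  # context-specific persona

    def status_shown_to(self, contact):
        """The status a given contact sees (override wins over the default)."""
        return self.per_contact.get(contact, self.default_status)

p = Presence()
p.set_status("busy")                        # one switch hides the user from everyone
p.set_status("available", contact="alice")  # a close friend sees a different face
print(p.status_shown_to("alice"))  # available
print(p.status_shown_to("boss"))   # busy
```

The design point is that the user's "policy" is renegotiated moment to moment through a lightweight control, rather than specified exhaustively in advance.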
Reconciling the various perspectives on privacy may not be necessary, as each viewpoint has contributed valuable tools: analytical instruments, design guidelines, legislation, and social expectations. Many applications, particularly social networking sites, simultaneously embody these diverse approaches. Such sites must implement data protection measures to safeguard user information, adopt a personal privacy perspective to allow individuals to curate their online personas, and rely on data protection thinking again to prevent unauthorized data mining and crawling of their platforms.
A Historical Perspective on Privacy
Privacy is an evolving concept influenced by technological advancements, by our understanding of their social applications, and by shifting societal expectations. Over the past thirty years, these changes have significantly impacted privacy research within HCI. This section explores the evolving expectations of privacy and their implications for HCI practice.
2.3.1 Changes in Expectations of Privacy
The fundamental frameworks of social relations, such as power dynamics and self-presentation, have largely remained consistent despite technological advancements. However, significant changes in the perceptions and expectations surrounding privacy have emerged. This evolution is evident in the widespread acceptance of telecommunication technologies, electronic payment systems, and surveillance mechanisms, even in the face of initial concerns about privacy.
Privacy expectations have transformed in two key ways. First, social practices and expectations evolve in tandem with technological advancements, complicating the identification of direct causal relationships. Second, privacy expectations develop along multiple dimensions, meaning that the same technology can produce contrasting effects on different aspects of privacy.
Social practice and technology co-evolve. For example, the introduction of digital cameras, or of location technology in cell phones, happened alongside the gradual introduction of legislation [2,
As a technology evolves, social etiquette governing its use is typically established, with legislation frequently lagging behind technical advancements. In certain instances, however, specific laws have been introduced prior to the full development of a technology, as seen with digital signature legislation in some European nations; this premature legislation may have hindered adoption by adversely impacting usability.
The relationship between social practices and technology is thus complex, characterized by co-evolution rather than a straightforward cause-and-effect dynamic. Observers suggest that social constructs and technological advancements influence each other, forming "socio-technological hybrids": undividable structures encompassing technology as well as culture.
Latour emphasizes the importance of studying these hybrids, encompassing norms, social practices, and perceptions, as cohesive entities. This perspective is evident in HCI research, particularly among advocates of participatory design and social computing. Iachello et al. argue that in the realm of privacy, the adoption patterns of applications should be intentionally designed for, to enhance user acceptance and ensure successful integration.
Technologies impacting privacy, such as Geographic Information Systems (GIS), are often developed with minimal public discussion. GIS utilizes census, credit, and consumer data to classify geographic units, significantly influencing perceptions of community and individuality. However, as highlighted by Curry and Philips, the introduction of GIS occurred quietly over several decades through a mix of government initiatives, advancements in information technology, and private-sector involvement, without generating substantial public debate.
Understanding the complexities of technological change is challenging, as a single change can have contradictory impacts on social practices. An artifact like the cell phone can enhance privacy in some respects while diminishing it in others. Cell phones foster social connections by allowing friends and acquaintances to communicate more frequently and spontaneously; however, they also create barriers among people who are physically present together, leading to "bubbles" of private space in crowded public areas, such as train compartments.
Privacy-sensitive IT design therefore involves balancing the conflicting impacts of new technologies. For instance, interruption management systems utilizing sensing networks, like those developed by Nagel et al., enhance personal and environmental privacy by minimizing unwanted phone calls; however, these systems may compromise information privacy by collecting extra data through activity sensors.
We highlight this issue of how expectations of privacy change over time as an ongoing research challenge in Section 4.5.
The discussion surrounding human-computer interaction and privacy in information technology has evolved significantly over the past four decades. This dialogue gained momentum in the late 1960s, particularly with the introduction of the National Data Center proposal in the United States, and reached a pivotal point with the release of the 1973 report "Records, Computers and the Rights of Citizens."
In the early 1970s, the growing collection of personal data led several industrialized nations to implement laws governing the collection, use, and disclosure of personal information, following the introduction of the Fair Information Practices.
The FIPS embody a top-down, systems-oriented approach characteristic of the IT landscape of their era, in which systems were limited in number, meticulously designed for specific functions, centrally governed, and mandatory to use. Privacy terminology mirrored this framework: data subjects were safeguarded through centralized data protection mechanisms overseen by a data controller or owner, and trust was primarily placed in government entities and in the accountability of data custodians. Similarly, HCI in the 1970s was marked by structured process modeling for non-discretionary applications, utilizing methods like GOMS to enhance the performance, usability, and overall effectiveness of computer-related tasks.
Advancements in personal computing shifted the landscape of HCI, emphasizing ease of use and enjoyable experiences while allowing users greater discretion in choosing applications and services. Concurrently, the expansion of personal data collection, enabled by enhanced storage and processing capabilities, highlighted the importance of trust in IT service delivery. This evolution in user choice and data management is mirrored in 1980s data protection legislation, which transitioned from the original use-limitation principles to the broader concept of informational self-determination.
The 1990s marked the rise of the Internet, leading to innovative applications and communication methods. In response, regulators and industry began crafting more adaptable and thorough legislation to address the significant surge in the sharing and use of personal information. This shift prompted a wave of privacy research that recognized the evolving landscape of data utilization.
The rise of information technology has transformed communication and increased the fluidity of personal data utilized by individuals, businesses, and governments. The advancement of privacy-enhancing technologies such as machine-readable privacy policies, the emergence of concepts like Multilateral Security [247], and the development of technology supporting anonymous transactions (e.g., mail encryption tools, mix networks, anonymizing web services) are all manifestations of the growing complexity of the IT landscape.
Understanding Users’ Privacy Preferences
Understanding individuals' privacy preferences is complex, as these preferences are influenced by social context and often difficult to express. For instance, while the need for plausible deniability in social interactions is significant, survey participants may not fully recognize or admit to this aspect of their behavior. As a result, privacy concerns are not easily generalized and should be examined within specific contexts. It is important to avoid extrapolating privacy preferences from one area, such as loyalty programs or online shopping, to another, like personal relationships with family and colleagues.
Despite the challenges, various techniques have been developed to collect data on user preferences and attitudes, utilizing both quantitative methods like surveys for mass-market insights and qualitative approaches for understanding personal privacy dynamics. An overview of these techniques, including their scope, advantages, and limitations, is presented in Table 1. We will explore how these methods have been applied across different domains and discuss their specific benefits and drawbacks, particularly concerning privacy issues. Additionally, we emphasize the ongoing need for enhancements in these data-gathering techniques.
3.1.1 Data Protection and Privacy Preferences
During the 1970s and 1980s, the evolution of data collection practices prompted governments to implement data protection laws. Concurrently, various studies were undertaken to assess public sentiment toward these practices, often commissioned by governments, major IT firms, or research institutions. In the United States, the Pew Research Center, a nonprofit organization, notably conducted a series of surveys that explored the attitudes and trends influencing American public opinion.
A prominent series of surveys by Privacy & American Business, founded by Alan Westin, categorizes individuals based on their privacy preferences regarding commercial entities. The three identified groups are: fundamentalists, who are highly concerned about privacy and believe that personal information is inadequately protected by organizations; the unconcerned, who feel that their data is managed securely and that adequate safeguards exist; and pragmatists, the largest group, who recognize the risks to their personal information but still trust that sufficient protections are in place.
Over the past decade, the distribution of individuals across these three categories has shown notable temporal trends, generally ranging from 15% to 25% each for fundamentalists and the unconcerned, while pragmatists account for 40% to 60%. These findings align with data from the Eurobarometer survey in the EU and are further supported by scenario-based research conducted by Ackerman et al. and by a controlled experiment.
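The segmentation above is typically derived from respondents' agreement with a small set of survey statements. The following sketch is our illustration of the commonly described scoring rule (the actual survey items and scoring details vary across Westin's studies): privacy-concerned answers on all items mark a fundamentalist, on none an unconcerned respondent, and anything in between a pragmatist.

```python
# Illustrative Westin-style segmentation. `concerned_answers` holds one boolean
# per survey item, True meaning the respondent gave the privacy-concerned answer.
# NOTE: the three-item scheme here is a stand-in; real studies differ in items
# and scoring.

def westin_segment(concerned_answers):
    """Classify a respondent: concerned on all items -> fundamentalist,
    on none -> unconcerned, otherwise -> pragmatist (typically the
    largest group)."""
    n = sum(concerned_answers)
    if n == len(concerned_answers):
        return "fundamentalist"
    if n == 0:
        return "unconcerned"
    return "pragmatist"

assert westin_segment([True, True, True]) == "fundamentalist"
assert westin_segment([False, False, False]) == "unconcerned"
assert westin_segment([True, False, True]) == "pragmatist"
```

Because the middle category absorbs every mixed answer pattern, the rule itself helps explain why pragmatists consistently form the largest group in the reported distributions.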
Table 1 Summary of techniques for understanding users’ privacy preferences, with example studies.
Westin segmentation | Data protection | Principled | 1,000-10,000 | Historic sequence of studies
Smith et al. (data protection in organizations) | Data protection | Neutral