2018, Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
…
This position paper observes how different technical and normative conceptions of privacy have evolved in parallel and describes the practical challenges that these divergent approaches pose. Notably, past technologies relied on intuitive, heuristic understandings of privacy that have since been shown not to satisfy expectations for privacy protection. With computations ubiquitously integrated in almost every aspect of our lives, it is increasingly important to ensure that privacy technologies provide protection that is in line with relevant social norms and normative expectations. Similarly, it is also important to examine social norms and normative expectations with respect to the evolving scientific study of privacy. To this end, we argue for a rigorous analysis of the mapping from normative to technical concepts of privacy and vice versa. We review the landscape of normative and technical definitions of privacy and discuss specific examples of gaps between definitions that are r...
Journal of Social Issues, 1977
Meanings of privacy in everyday speech, in behavioral and social science, and in American law are compared. A variety of independent meanings emerge within each domain, and these distinctions are repeated across domains. A common‐core definition is proposed that appears to be consistent with these meanings. One behavioral theory that attempts to bring conceptual order to the various meanings of privacy is reviewed, and the review is extended to a general commentary on the current status of behavioral theories of privacy. Future tasks and directions for establishing a more complete understanding of privacy are indicated, including the explication of theoretical systems and the creation of linkages across disciplines and concepts.
2021 Third IEEE International Conference on Trust, Privacy and Security in Intelligent Systems and Applications (TPS-ISA)
This paper deals with the enduring topic of the relationship between privacy and technology. We give extensive motivation for why the privacy debate remains alive for private citizens and institutions, and we investigate the concept of privacy. This paper proposes our vision of the privacy ecosystem, introducing privacy dimensions, the related users' expectations, privacy violations, and the changing factors. We provide a critical assessment of the Privacy by Design paradigm, its strategies, tactics, patterns, and Privacy-Enhancing Technologies, highlighting the current open issues. We believe that promising approaches to tackle the privacy challenges move in two directions: (i) identification of effective privacy metrics; and (ii) adoption of formal tools to design privacy-compliant applications.
SSRN Electronic Journal, 2018
International Data Privacy Law, 2011
Ethics and Information Technology, 2019
This paper takes as its starting point a recent development in privacy debates: the emphasis on social and institutional environments in the definition and the defence of privacy. Recognizing the merits of this approach, I supplement it in two respects. First, an analysis of the relation between privacy and autonomy teaches that in the digital age individual autonomy is threatened more than ever. The striking contrast between offline vocabulary, where autonomy and individual decision making prevail, and online practices is a challenge that cannot be met by a social approach alone. Secondly, I elucidate the background of the social approach. Its importance is not exclusively related to the digital age. In public life we regularly face privacy moments, when in a small, distinct social domain a few people are jointly involved in common experiences. In the digital age the contextual integrity model of Helen Nissenbaum has become very influential. However, this model has some problems. Nissenbaum refers to a variety of sources and uses several terms to explain the normativity in her model. The notion of 'context' is not specific and faces the reproach of conservatism. We elaborate on the most promising suggestion: a development of the notion of 'goods' as it can be found in the works of Michael Walzer and Alasdair MacIntyre. Developing criteria for defining a normative framework requires making explicit the substantive goods that are at stake in a context, and taking them as the starting point for decisions about the flow of information. Doing so delivers stronger and more specific orientations that are indispensable in discussions about digital privacy.
Internet Policy Review, 2019
This contribution provides a short introduction to the conceptual and socio-technical development of privacy. It identifies central issues that inform and structure current debates, as well as transformations of privacy spurred by digital technology. In particular, it highlights central ambivalences of privacy between protection and de-politicization, and the relation between individual and social perspectives. A second section connects these issues to influential texts and discussions on digital privacy. In particular, we demonstrate that privacy in digital societies must be conceived in a novel way, since contemporary socio-technical conditions unsettle central assumptions of established theories: forms of perception, social structure, or individual rights. A third and final section summarises theoretical innovations triggered by this situation, drawing on research from computer science to the social sciences, law, and philosophy, and highlighting the need to take groups, social relations, and broader socio-cultural contexts into account.
In my presentation, I assume three hypotheses, validate them, and come to a conclusion that leads to an alternative definition of privacy. (a) Apprehension is an unpleasant emotion that is specifically linked to our public appearances. (b) Privacy is a system in which apprehension can be undone. (c) Confidence or trust (used here as synonyms), when it comes to personal information, undoes apprehension. If (a), (b) and (c) are true, then: (d) Privacy is made of a network of trustworthy relationships that concern the use of personal information, no more, no less. To proceed, I will need to define the concepts named above, especially by distinguishing confidence from what I call "reasoned confidence". In order to do so, I will oppose two different historical conceptions of privacy, the "classical" and the "alternative", and will show that the latter is the contemporary conception of privacy.
Policymakers around the world constantly search for new tools to address growing concerns about informational privacy (data protection). One solution that has gained support among policymakers in recent years is Privacy by Design (PbD). The idea is simple: think of privacy ex ante, and embed privacy within the design of a new technological system, rather than try to fix it ex post, when it is often too late. However, PbD has yet to gain an active role in engineering practices. Thus far, there are only a few success stories. We argue that a major obstacle for PbD is the discursive and conceptual gap between law and technology. A better diagnosis of the gaps between the legal and technological perceptions of privacy is a crucial step in seeking viable solutions. We juxtapose the two fields by reading each field in terms of the other. We reverse engineer the law, so as to expose its hidden assumptions about technology (the law's technological mindset), and we read canonical technological texts, so as to expose their hidden assumptions about privacy (technology's privacy mindset). Our focus is on one set of informational privacy practices: the large corporation that collects data from individual data subjects. This dual reverse engineering exercise indicates substantial gaps between the legal perception of informational privacy, as reflected in the set of principles commonly known as Fair Information Privacy Principles (FIPPs), and the perceptions of the engineering community. While both information technology and privacy law attempt to regulate the flow of data, they do so in utterly different ways, holding different goals and applying different constraints. The gaps between law and technology point to potential avenues to save PbD.
2020
This paper presents a theoretically grounded study of the notion of privacy to shed light on the different conceptions of privacy in the contemporary era. In addition to mapping out the distinctive images of privacy, the paper also aims to analyse the meanings and significance of privacy in recent times, bearing in mind the changes happening in technology and emerging concerns for security. With a wide range of external factors affecting the transformation of the idea of privacy into a lived reality, privacy has become not only an eluding reality but also an eluding concept today. The paper aims to streamline the debates on privacy for a better appreciation of the idea.
Metaphilosophy, 1997
Journal of the Association for Information Science and Technology, 2014
International Journal of Security and Privacy in Pervasive Computing
Anali Pravnog fakulteta u Beogradu, 2014
Innovation: The European Journal of Social Science Research, 2013
Journal of Business Ethics, 1999
Paper presented at the Stockholm Criminology Symposium, 14th of June 2022, Stockholm (Sweden), 2022
Ethics & Information Technology, 2018
In: Keresztes, Gábor (ed.): Tavaszi Szél 2016 Tanulmánykötet I., Budapest, Doktoranduszok Országos Szövetsége, 2016
SSRN Electronic Journal, 2022
Technology, work and globalization, 2024
Core Concepts and Contemporary Issues in Privacy, 2018
Loyola of Los Angeles Entertainment Law Review, 2021
International Journal of Law in Context, 2006