2015
In this paper we explore the extent to which privacy enhancing technologies (PETs) can be effective in providing privacy to citizens. The rapid development of ubiquitous computing and the 'internet of things' is leading to Big Data and the application of predictive analytics, effectively merging the real world with cyberspace. The power of information technology is increasingly used to provide personalised services to citizens, making huge amounts of sensitive data about individuals available, with potential and actual privacy-eroding effects. To protect the private sphere, deemed essential in a state governed by the rule of law, information and communication technologies (ICTs) should meet the requirements laid down in numerous privacy regulations. Sensitive personal information may be captured by organizations only if the person providing the information consents to its being gathered, and it may be used only for the express purpose for which it was gathered. Any other use of information about persons without their consent is prohibited by law, legal exceptions notwithstanding. If these regulations are properly translated into written code, they become part of the outcomes of an ICT, and that ICT is therefore privacy compliant. We conclude that privacy compliance in this 'technological' sense cannot meet citizens' concerns completely, and should therefore be augmented by a conceptual model that makes privacy impact assessments at the level of citizens' lives possible.
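To make the abstract's claim about 'translating regulations into written code' concrete, the sketch below shows, in Python, how a consent register combined with a purpose-limitation check could be expressed inside an ICT system. Everything here (the ConsentRecord structure, the registry, the function names) is a hypothetical illustration under simple assumptions, not an implementation described in the paper.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical illustration of consent and purpose limitation in code.
# Names and structures are assumptions, not taken from the paper.

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set[str]          # purposes the data subject consented to
    granted_at: datetime = field(default_factory=datetime.utcnow)
    withdrawn: bool = False

class ConsentRegistry:
    """Registry that records consent and answers purpose-limitation queries."""

    def __init__(self) -> None:
        self._records: dict[str, ConsentRecord] = {}

    def grant(self, record: ConsentRecord) -> None:
        self._records[record.subject_id] = record

    def withdraw(self, subject_id: str) -> None:
        if subject_id in self._records:
            self._records[subject_id].withdrawn = True

    def is_permitted(self, subject_id: str, purpose: str) -> bool:
        """Allow processing only if unwithdrawn consent covers this exact purpose."""
        record = self._records.get(subject_id)
        return (
            record is not None
            and not record.withdrawn
            and purpose in record.purposes
        )

def process_personal_data(registry: ConsentRegistry, subject_id: str, purpose: str) -> None:
    """Deny-by-default gate in front of any processing of personal data."""
    if not registry.is_permitted(subject_id, purpose):
        raise PermissionError(f"no valid consent for purpose '{purpose}'")
    # ... actual processing would happen here ...

# Usage
registry = ConsentRegistry()
registry.grant(ConsentRecord("citizen-42", purposes={"billing"}))
process_personal_data(registry, "citizen-42", "billing")       # permitted
# process_personal_data(registry, "citizen-42", "marketing")   # raises PermissionError
```

The design choice worth noting is deny-by-default: processing proceeds only when an unwithdrawn consent record explicitly lists the requested purpose, mirroring the purpose-limitation rule described in the abstract above.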
Information & Communications Technology Law, 2020
In recent years, research within and outside the European Union (EU) has focused on the expanding scope of personal data. The analysis provided has primarily supported the conclusion that, in time, personal data will become so ubiquitous that EU data protection law will become meaningless, unreasonable, or even discredited and ignored. Notwithstanding these criticisms, EU law is promoted as the 'gold standard' for data protection laws, and the law, including its definition of personal data, is being rapidly adopted by many non-EU countries. The objective of this article is to analyse the concept of personal data under EU law and to explore its continued relevance within a data protection framework that is rapidly globalising and in which technology is continuously evolving. The article argues that, far from reflecting a universal notion of data protection, EU law and particularly its definition of personal data reflect a perception of privacy that is peculiarly European. It further argues that recent developments in technology call for a re-examination of the concept of personal data and a more critical approach by countries with nascent data protection regimes. The article proposes the 'objective risk of contextual harm' as a new approach for formulating an alternative definition of personal data. It concludes that this approach better articulates the construction of data protection as a social good and a mechanism for (consumer) protection.
International Journal of Advanced Computer Science and Applications, 2016
This paper aims to investigate how effectively privacy enhancing technologies (PETs) can provide privacy to individuals. The merging of cyberspace with the real world through the 'Internet of Everything' (IoE) has led to rapid progress in research and development on the predictive analysis of big data. Individual privacy has gained considerable momentum in both industry and academia, since privacy-enhancing technologies (PETs) constitute a technical means to protect information. Privacy regulations and the rule of law treat this as an integral requirement for protecting the individual's private sphere when Information and Communication Technology (ICT) infrastructure is laid out. Modern organisations use consent forms to gather individuals' sensitive personal information for a specific purpose. The law prohibits using a person's information for purposes other than those for which consent was initially given. ICT infrastructure should therefore be developed in line with privacy laws and made not only compliant but also intelligent, learning by itself from its environment. Such an extra layer embedded in the system would inform the ICT structure and help the system authenticate and communicate with prospective users. The existing literature on protecting individuals' privacy through privacy-enhancing technologies (PETs) is still embryonic and concludes that individuals' concerns about privacy are not fully addressed in the technological sense. Among other contributions, this research paper devises a conceptual model to improve individuals' privacy.
SSRN Electronic Journal, 2022
Recent decades have produced a wide variety of technical tools for privacy protection. However, the hopes placed in these tools are often inflated. Overestimating the protective effect of privacy-enhancing technologies can be dangerous and may lead to a false sense of security. This chapter stresses that the following aspects need to be borne in mind when assessing the real impact of technical privacy safeguards: (1) data controllers' reluctance to apply self-limiting privacy safeguards, (2) the loophole created by the notice-and-choice approach to privacy protection, (3) technical challenges and limitations, and (4) the limited scope of protection that privacy-enhancing technologies can offer. Moreover, many problems associated with the processing of personal data are not primarily technical in nature but involve fundamental social, ethical, and political questions. There is no doubt that technical tools play a crucial part in protecting people's privacy today. They are necessary but not sufficient. To meaningfully protect people against the dangers resulting from modern data processing, strong regulation is needed as well.
Innovation–The European Journal of Social Science Research, 2010
Privacy is an important fundamental human right. It underpins human dignity and other values such as freedom of association and freedom of speech. However, privacy is being challenged in the networked society. The use of new technologies undermines this right because it facilitates the collection, storage, processing and combination of personal data by security agencies and businesses. This research note presents the background and agenda of the recently commenced research project PRESCIENT, which aims to reconceptualise privacy and to develop means for assessing privacy impacts.
2013
With the recent spread of the Internet, many technology terms have appeared, such as the smartphone, social media and cloud computing. The flexibility of this technology encourages people to communicate and to share information of all kinds: photos, videos, documents and sometimes sensitive information such as bank account details. The number of users of this technology grows every year and now runs into the billions. Most of these users are members of the general public who do not appreciate the ambiguities of the technology, which makes it easier for others to access their information. The abundance of information and users' poor knowledge of privacy have led to the emergence of numerous threats and fears surrounding this term. Educating people about privacy issues and the related risk factors has therefore become essential. This paper reviews the definition of privacy and the level of awareness that should be applied, seeks to understand some related security issues and laws, and compares two different privacy laws.
The paper introduces an approach to privacy enhancing technologies that sees privacy not merely as an individual right, but as a public good. This understanding of privacy has recently gained ground in the debate on appropriate legal protection for privacy in an online environment. The jurisprudential idea that privacy is a public good and prerequisite for a functioning democracy also entails that its protection should not be left exclusively to the individual whose privacy is infringed. This idea finds its correspondence in our approach to privacy protection through obfuscation, where everybody in a group takes a small privacy risk to protect the anonymity of fellow group members. We show how these ideas can be computationally realised in an Investigative Data Acquisition Platform (IDAP). IDAP is an efficient symmetric Private Information Retrieval (PIR) protocol optimised for the specific purpose of facilitating public authorities' enquiries for evidence.
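The obfuscation idea described here, where every group member accepts a small privacy risk so that no individual query can be singled out, can be sketched in a highly simplified form. The Python code below is an illustrative dummy-query scheme under assumed names and parameters; it is not the IDAP symmetric PIR protocol itself, which hides the queried index cryptographically rather than by mixing it with decoys.

```python
import random

# Simplified sketch of privacy protection through group obfuscation
# (illustrative only, not the IDAP protocol): the real target is hidden
# among identifiers of other group members, so the server learns only
# that one of k people is under enquiry, not which one.

def build_obfuscated_query(target_id: str, group_ids: list[str], k: int = 5) -> list[str]:
    """Return the target plus k-1 decoy identifiers drawn from the group."""
    decoys = random.sample([g for g in group_ids if g != target_id], k - 1)
    query = decoys + [target_id]
    random.shuffle(query)            # server cannot tell which entry is real
    return query

def server_answer(database: dict[str, str], query: list[str]) -> dict[str, str]:
    """Server returns records for every queried identifier."""
    return {qid: database[qid] for qid in query if qid in database}

def client_filter(answers: dict[str, str], target_id: str) -> str | None:
    """Client discards the decoy records and keeps only the target's."""
    return answers.get(target_id)

# Usage
db = {f"person-{i}": f"record-{i}" for i in range(100)}
group = list(db.keys())
query = build_obfuscated_query("person-7", group, k=5)
evidence = client_filter(server_answer(db, query), "person-7")
```

The trade-off matches the abstract's framing: the k-1 decoy members' records are also touched by the enquiry (their small risk), in exchange for the server being unable to tell which of the k identifiers is the actual target.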
Privacy Technologies and Policy, 2018
The EU's General Data Protection Regulation is poised to present major challenges in bridging the gap between law and technology. This paper reports on a workshop on the deployment, content and design of the GDPR that brought together academics, practitioners, civil-society actors, and regulators from the EU and the US. Discussions aimed at advancing current knowledge on the use of abstract legal terms in the context of applied technologies, together with best practices following state-of-the-art technologies. Five themes were discussed: state of the art, consent, de-identification, transparency, and development and deployment practices. Four cross-cutting conflicts were identified, and research recommendations were outlined to reconcile them.
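One of the workshop themes, de-identification, lends itself to a brief technical illustration. The sketch below shows keyed pseudonymisation and coarsening of a quasi-identifier in Python; the key handling, field names and generalisation rule are assumptions chosen for illustration, not practices prescribed by the GDPR or agreed at the workshop.

```python
import hmac
import hashlib

# Minimal pseudonymisation sketch (illustrative only): direct identifiers
# are replaced by keyed hashes so records can still be linked internally,
# while re-identification requires access to the secret key.

SECRET_KEY = b"keep-this-out-of-the-dataset"   # hypothetical key management

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def de_identify(record: dict) -> dict:
    """Drop or transform direct identifiers; coarsen quasi-identifiers."""
    return {
        "subject_token": pseudonymise(record["email"]),
        "age_band": f"{(record['age'] // 10) * 10}s",   # e.g. 34 -> '30s'
        "country": record["country"],
    }

# Usage
raw = {"email": "alice@example.org", "age": 34, "country": "NL"}
print(de_identify(raw))   # {'subject_token': '...', 'age_band': '30s', 'country': 'NL'}
```

Note that pseudonymised data of this kind still counts as personal data under the GDPR, since the key holder can re-identify subjects; the sketch illustrates risk reduction, not full anonymisation.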
Snowden's revelations of 2013 have shifted attention to the societal implications of surveillance practices, and in particular to privacy. This editorial reflects on key concepts and research questions raised in the issue. How can privacy be defined? Can it be designed? Considering such developments, the editorial asks whether the public's attitudes to the sharing of data have moved towards 'nothing to hide, nothing to fear' arguments, and whether greater awareness and corporate transparency are possible. Even if corporate surveillance does not operate through overt coercion, it is argued that it nevertheless results in self-regulation and subjugation to neoliberal rationality. Since telecoms and social media companies generally work hand in hand with the state, and the boundaries between legal and practical standpoints overlap to a great extent, how can privacy be safeguarded for citizens, and how can the 'accountability' of data holders, which interviewee Mark Andrejevic suggests is a growing imperative, be achieved? Contributions to this issue suggest that detailed attention to legal frameworks, encryption practices, definitions of the surveilled subject and the history of such scrutiny may hold some of the answers.
Computer Law & Security Review, 2013
2021 Third IEEE International Conference on Trust, Privacy and Security in Intelligent Systems and Applications (TPS-ISA)
Innovation: The European Journal of Social Science Research, 2013
SSRN Electronic Journal, 2000
Anali Pravnog fakulteta u Beogradu, 2014
Social Science Research Network, 2013
Studies in Ethics, Law, and Technology, 2008
Mobility, Data Mining and Privacy, 2008
Social Science Research Network, 2012
Criminal Justice Matters, 2007