2017, Big Data and security policies: Towards a framework for regulating the phases of analytics and use of Big Data
Big Data analytics in national security, law enforcement and the fight against fraud have the potential to reap great benefits for states, citizens and society, but require extra safeguards to protect citizens' fundamental rights. This involves a crucial shift in emphasis from regulating Big Data collection to regulating the phases of analysis and use. In order to benefit from the use of Big Data analytics in the field of security, a framework has to be developed that adds new layers of protection for fundamental rights and safeguards against erroneous and malicious use. Additional regulation is needed at the levels of analysis and use, and the oversight regime is in need of strengthening. At the level of analysis (the algorithmic heart of Big Data processes), a duty of care should be introduced as part of an internal audit and external review procedure. Big Data projects should also be subject to a sunset clause. At the level of use, profiles and (semi-)automated decision-making should be regulated more tightly. Moreover, the responsibility of the data processing party for the accuracy of analysis, and of decisions taken on its basis, should be anchored in legislation. The general and security-specific oversight functions should be strengthened in terms of technological expertise, access and resources. The possibilities for judicial review should be expanded to stimulate the development of case law.
2017
Big Data analytics in national security, law enforcement and the fight against fraud can reap great benefits for states, citizens and society but require extra safeguards to protect citizens’ fundamental rights. This requires new frameworks: a crucial shift is necessary from regulating the phase of the collection of data to regulating the phases of data analysis and use.
Health and Technology, 2017
This article encapsulates selected themes from the Australian Data to Decisions Cooperative Research Centre's Law and Policy program. It is the result of a discussion on the regulation of Big Data, especially focusing on privacy and data protection strategies. It presents four complementary perspectives stemming from governance, law, ethics, and computer science. Big, Linked, and Open Data constitute complex phenomena whose economic and political dimensions require a plurality of instruments to enhance and protect citizens' rights. Some conclusions are offered in the end to foster a more general discussion.
Computer Law & Security Review, Volume 33, Issue 5, October 2017
In January 2017 the Consultative Committee of Convention 108 adopted its Guidelines on the Protection of Individuals with Regard to the Processing of Personal Data in a World of Big Data. These are the first guidelines on data protection provided by an international body which specifically address the issues surrounding big data applications. This article examines the main provisions of these Guidelines and highlights the approach adopted by the Consultative Committee, which contextualises the traditional principles of data protection in the big data scenario and also takes into account the challenges of the big data paradigm. The analysis of the different provisions adopted focuses primarily on the core of the Guidelines, namely the risk assessment procedure. Moreover, the article discusses the novel solutions provided by the Guidelines with regard to the data subject's informed consent, the by-design approach, anonymization, and the role of the human factor in big data-supported decisions. This critical analysis of the Guidelines introduces a broader reflection on the divergent approaches of the Council of Europe and the European Union to regulating data processing: the principle-based model of the Council of Europe differs from the approach adopted by the EU legislator in the detailed Regulation (EU) 2016/679. In the light of this, the provisions of the Guidelines and their attempt to address the major challenges of the new big data paradigm set the stage for concluding remarks about the most suitable regulatory model to deal with the different issues posed by the development of technology.
SSRN Electronic Journal
Personal profiling and predictive behavioural analysis done by Big Data applications pose immense challenges to society and democracy, especially when they violate individuals' constitutionally guaranteed fundamental rights, such as the rights to privacy and data protection, and the right to non-discrimination based on personal attributes. At the same time, Big Data applications also threaten basic ethical principles needed in a democratic society, such as fairness and respect for human autonomy. An analysis of European data protection laws (GDPR, ePrivacy, Digital Content, Copyright and Trade Secrets) shows far-reaching gaps with regard to the protection of privacy and the non-discrimination of individuals. Governments and legislators have a clear requirement to close these gaps as quickly and comprehensively as possible. We propose a three-pillared model for the future regulation of Big Data applications, covering three areas of action. Firstly, a fundamental reorientation of the concept of digital identity towards self-sovereignty over private data: the individual would become the owner of their own personal data and thus be able to decide sovereignly with whom to share which data, for which purposes, and over which time period. Secondly, the empowerment of the individual as a sovereign of their own data must be accompanied by a comprehensive education and training program at all levels of society. Through knowledge and training, individuals must be able to use the opportunity to determine for themselves how their information is used and, at the same time, be able to bear the associated risks. Thirdly, regulators themselves need to use so-called "legal-tech" solutions to carry out automated and software-based testing and monitoring of Big Data applications with regard to their compliance with privacy protection regulations. For this purpose, the legislator faces the challenge of implementing the legal principles formulated in written laws in software code.
According to Moore's Law, "the number of transistors on an affordable CPU would double every two years" (Moore, 1998). In simplified terms, this means that computers are constantly changing: they physically get smaller while their capacity grows. This growth in computing capacity allows for larger storage and faster processing of information; in terms of big data, it is becoming ever faster to collect and analyse mass amounts of information. However, with the ability to sift through such large amounts of information comes a responsibility that must be undertaken by intelligence organisations: to ensure that human and civil rights are not infringed upon without just cause, and to take into account the risks associated with using big data (Kuner, Cate, Millard and Svantesson, 2012). This paper will begin by providing a definition for big data before looking at how intelligence agencies gather data and protect privacy, through examples of legal and social frameworks. From there it will examine how big data impacts human rights, and which rights may be affected by its use, by looking at PRISM and the Section 215 program and applying an ethical framework to NSA practices. It will conclude by looking at the fine line between protecting human rights and protecting national security, and at how various countries are applying laws and acts to find a compromise.
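Moore's doubling claim is simple exponential growth. As a minimal illustrative sketch (the function name and the starting figure of 2,300 transistors, roughly the 1971 Intel 4004, are assumptions for the example, not from the paper):

```python
def transistors_after(initial_count, years, doubling_period=2.0):
    """Projected transistor count after `years`, assuming the count
    doubles every `doubling_period` years (Moore's Law)."""
    return initial_count * 2 ** (years / doubling_period)

# Illustrative: starting from ~2,300 transistors, four years of
# doubling every two years yields a fourfold increase.
print(transistors_after(2300, 4))  # 9200.0
```

The same compounding explains why the cost of collecting and analysing mass amounts of data keeps falling, which is what raises the oversight questions the paper goes on to discuss.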