2019, Social Studies of Science
On May 25, 2018, the European Union’s General Data Protection Regulation (GDPR) came into force. EU citizens are granted more control over personal data while companies and organizations are charged with increased responsibility enshrined in broad principles like transparency and accountability. Given the scope of the regulation, which aims to harmonize data practices across 28 member states with different concerns about data collection, the GDPR has significant consequences for individuals in the EU and globally. While the GDPR is primarily intended to regulate tech companies, it also has important implications for data use in scientific research. Drawing on ethnographic fieldwork with researchers, lawyers and legal scholars in Sweden, I argue that the GDPR’s flexible accountability principle effectively encourages researchers to reflect on their ethical responsibility but can also become a source of anxiety and produce unexpected results. Many researchers I spoke with expressed profound uncertainty about ‘impossible’ legal requirements for research data use. Despite the availability of legal texts and interpretations, I suggest we should take researchers’ concerns about ‘unknowable’ data law seriously. Many researchers’ sense of legal ambiguity led them to rethink their data practices and themselves as ethical subjects through an orientation to what they imagined as the ‘real people behind the data’, variously formulated as a Swedish population desiring data use for social benefit or a transnational public eager for research results. The intentions attributed to people, populations and publics – whom researchers only encountered in the abstract form of data – lent ethical weight to various and sometimes conflicting decisions about data security and sharing. Ultimately, researchers’ anxieties about their inability to discern the desires of the ‘real people’ lent new appeal to solutions, however flawed, that promised to alleviate the ethical burden of personal data.
Big Data, 2018
Ready data availability, cheap storage capacity, and powerful tools for extracting information from data have the potential to significantly enhance the human condition. However, as with all advanced technologies, this comes with the potential for misuse. Ethical oversight and constraints are needed to ensure that an appropriate balance is reached. Ethical issues involving data may be more challenging than those of some other advanced technologies, partly because data and data science are ubiquitous, with the potential to impact all aspects of life, and partly because of their intrinsic complexity. We explore the nature of data, personal data, data ownership, consent and purpose of use, the trustworthiness of data, of algorithms, and of those using the data, and matters of privacy and confidentiality. A checklist of topics that need to be considered is provided.
Big Data & Society, 2016
There are growing discontinuities between the research practices of data science and established tools of research ethics regulation. Some of the core commitments of existing research ethics regulations, such as the distinction between research and practice, cannot be cleanly exported from biomedical research to data science research. Such discontinuities have led some data science practitioners and researchers to move toward rejecting ethics regulations outright. These shifts occur at the same time as a proposal for major revisions to the Common Rule—the primary regulation governing human-subjects research in the USA—is under consideration for the first time in decades. We contextualize these revisions in long-running complaints about regulation of social science research and argue data science should be understood as continuous with social sciences in this regard. The proposed regulations are more flexible and scalable to the methods of non-biomedical research, yet problematically...
The Companion to Peace and Conflict Fieldwork, 2020
In this chapter, I examine what and whom we imagine to be protecting when we use the language of ethics and transparency to describe research. How does the imagination of humans as data inform different approaches to transparency and ethics in research? I draw from my experiences in negotiating about data transparency with a funding agency to reflect on how different temporalities and spaces of research and violence alike inform interpretations of our responsibilities toward research interlocutors. The goal of the chapter is to synthesize recent research to advance an alternate orientation of transparency, away from a conceptualization informed by liability or “checkbox” compliance and toward a practice of reflexive openness.
This is an introduction to the special issue of “Ethics as Methods: Doing Ethics in the Era of Big Data Research.” Building on a variety of theoretical paradigms (i.e., critical theory, [new] materialism, feminist ethics, theory of cultural techniques) and frameworks (i.e., contextual integrity, deflationary perspective, ethics of care), the Special Issue contributes specific cases and fine-grained conceptual distinctions to ongoing discussions about the ethics in data-driven research. In the second decade of the 21st century, a grand narrative is emerging that posits knowledge derived from data analytics as true, because of the objective qualities of data, their means of collection and analysis, and the sheer size of the data set. The by-product of this grand narrative is that the qualitative aspects of behavior and experience that form the data are diminished, and the human is removed from the process of analysis. This situates data science as a process of analysis performed by the tool, which obscures human decisions in the process. The scholars involved in this Special Issue problematize the assumptions and trends in big data research and point out the crisis in accountability that emerges from using such data to make societal interventions. Our collaborators offer a range of answers to the question of how to configure ethics through a methodological framework in the context of the prevalence of big data, neural networks, and automated, algorithmic governance of much of human socia(bi)lity.
2015
As scientific knowledge advances, new data uses continuously emerge in a wide variety of contexts, from combating fraud in the payment card industry, to reducing the time commuters spend on the road, detecting harmful drug interactions, improving marketing mechanisms, personalizing the delivery of education in K–12 schools, encouraging exercise and weight loss, and much more. At corporations, not-for-profits, and academic institutions, researchers are analyzing data and testing theories that often rely on data about individuals. Many of these new uses of personal information are natural extensions of current practices, well within the expectations of individuals and the boundaries of traditional Fair Information Practice Principles. In other cases, data use may exceed expectations, but organizations can provide individuals with additional notice and choice. However, in some cases enhanced notice and choice are not feasible, despite the considerable benefit to consumers if personal inf...
International Journal of Computer Science and Information Security (IJCSIS), Vol. 23, No. 1, January-February, 2025
Abstract - In an age of swift progress in data collection and analysis, ethical considerations are crucial for maintaining the integrity and accountability of research practices. To prevent skewed results and misinformation among stakeholders, researchers must avoid selective reporting or data manipulation, upholding their ethical obligation to report data accurately and transparently. This research report explores data ethics from 2019 to 2024, highlighting critical challenges in balancing technological advancement with ethical principles across healthcare, research, and digital environments. By examining informed consent, privacy, algorithmic bias, and data sovereignty, the study reveals significant gaps in current practices and calls for more comprehensive approaches to data governance. Keywords: Ethics, Digital Data, Eligibility criteria, informed consent
GigaScience, 2018
Being asked to write about the ethics of big data is a bit like being asked to write about the ethics of life. Big data is now integral to so many aspects of our daily lives: communication, social interaction, medicine, access to government services, shopping, and navigation. Given this diversity, there is no one-size-fits-all framework for how to ethically manage your data. With that in mind, I present seven ethical values for responsible data use.
Bulletin of the Association for Information Science and Technology, 2016
The era of big data introduces new considerations into the traditional context of research ethics. Ethical questions may be considered in terms of accuracy, humane treatment, informed participants, and the necessity and applicability of the work, but big data complicates these issues. Since social media participants reflect certain demographic features, data drawn from those sources should not be taken to represent the general population. Big data collection may be more invasive than necessary due to easy access, and consent may be nonexistent. Data that was once anonymous may become identifiable, last indefinitely, and conflict with goals for publication. Ways to respect ethics in big data research include involving participants throughout the process, avoiding the collection of information that should remain private, notifying participants of their inclusion and providing them options to correct or delete personal information, and using public channels to disseminate research.