Papers by Jules Polonetsky
Social Science Research Network, May 27, 2015
As scientific knowledge advances, new data uses continuously emerge in a wide variety of contexts, from combating fraud in the payment card industry, to reducing the time commuters spend on the road, detecting harmful drug interactions, improving marketing mechanisms, personalizing the delivery of education in K-12...

The International Review of Information Ethics, 2014
This article analyzes the opportunities and risks of data-driven education technologies (ed tech). It discusses the deployment of data technologies by education institutions to enhance student performance, evaluate teachers, improve education techniques, customize programs, devise financial assistance plans, and better leverage scarce resources to assess and optimize education results. Critics fear ed tech could introduce new risks of privacy infringements, narrowcasting, and discrimination, fueling the stratification of society by channeling “winners” to a “Harvard track” and “losers” to a “bluer collar” track, and could overly limit the right to fail, struggle, and learn through experimentation. The article argues that together with teachers, parents, and students, schools and vendors must establish a trust framework to facilitate the adoption of data-driven ed tech. Enhanced transparency around institutions’ data use philosophy and ethical guidelines, and novel methods of data “featurizat...

The Cambridge Handbook of Consumer Privacy
In the course of a single day, hundreds of companies collect massive amounts of information from individuals. Sometimes they obtain meaningful consent. Often, they use less than transparent means. By surfing the web, using a cell phone and apps, entering a store that provides Wi-Fi, driving a car, passing cameras on public streets, wearing a fitness device, watching a show on a smart TV or ordering a product from a connected home device, people share a steady stream of information with layers upon layers of hardware devices, software applications, and service providers. Almost every human activity, whether it is attending school or a workplace, seeking healthcare or shopping in a mall, driving on a highway or watching TV in the living room, leaves behind data trails that build up incrementally to create a virtual record of our daily lives. How companies, governments, and experts should use this data is among the most pressing global public policy concerns. Privacy issues, which are at the heart of many of the debates over data collection, analysis, and distribution, range extensively in both theory and practice. In some cases, conversations about privacy policy focus on marketing issues and the minutiae of a website’s privacy notices or an app’s settings.
In other cases, the battle cry for privacy extends to diverse endeavors, such as the following: calls to impose accountability on the NSA’s counterterrorism mission; proposals for designing safe smart toys; plans for enabling individuals to scrub or modify digital records of their pasts; pleas to require database holders to inject noise into researchers’ queries to protect against leaks that disclose an individual’s identity; plans to use cryptocurrencies or to prevent criminals and terrorists from abusing encryption tools; proposals for advancing medical research and improving public health without sacrificing patients’ control over their data; and ideas for how scientists can make their data more publicly available to facilitate replication of studies without, at the same time, inadvertently subjecting entire populations to prejudicial treatment, including discrimination. At a time when fake news influences political elections, new and contentious forms of machine-to-machine communications are emerging, algorithmic decision-making is calling more of the shots in civic, corporate, and private affairs, and ruinous data breaches and ransomware attacks endanger everything from financial stability to patient care in hospitals, “privacy” has become a potent shorthand. Privacy is a boundary, a limiting principle, and a litmus test for identifying and adjudicating the delicate balance between the tremendous benefits and dizzying assortment of risks that insight-filled data offers. Consequently, far from what a first glance at the title of this volume might lead readers to expect, the Cambridge Handbook of Consumer Privacy critically explores core issues that will determine how the future is shaped. To do justice to the magnitude and complexity of these topics, we have asked contributors to address as many parts and perspectives of the consumer privacy debate as possible.
How we, all of us, collectively grapple with these issues will determine the fate of technology and course of humanity.
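One technical measure the abstract above mentions, injecting noise into researchers' queries to protect against leaks that disclose an individual's identity, is the core idea behind differentially private query answering. As a purely illustrative sketch (not drawn from the handbook; the function name `noisy_count` and its parameters are hypothetical), a counting query can be answered with Laplace noise:

```python
import math
import random

def noisy_count(true_count: int, epsilon: float) -> float:
    """Answer a counting query with Laplace noise of scale 1/epsilon."""
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true answer by at most 1, so Laplace(1/epsilon) noise
    # yields epsilon-differential privacy for this single query.
    u = random.random() - 0.5                 # uniform on [-0.5, 0.5)
    scale = 1.0 / epsilon
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))  # inverse-CDF sample
    return true_count + noise
```

Smaller values of `epsilon` mean more noise and stronger privacy; the noisy answers remain accurate on average, which is the trade-off the handbook's contributors debate.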
Privacy Technologies and Policy, 2018
The EU's General Data Protection Regulation is poised to present major challenges in bridging the gap between law and technology. This paper reports on a workshop on the deployment, content, and design of the GDPR that brought together academics, practitioners, civil-society actors, and regulators from the EU and the US. Discussions aimed to advance current knowledge of how abstract legal terms apply in the context of deployed technologies, together with best practices that reflect the state of the art. Five themes were discussed: state of the art, consent, de-identification, transparency, and development and deployment practices. Four traversal conflicts were identified, and research recommendations were outlined to reconcile these conflicts.

As scientific knowledge advances, new data uses continuously emerge in a wide variety of contexts, from combating fraud in the payment card industry, to reducing the time commuters spend on the road, detecting harmful drug interactions, improving marketing mechanisms, personalizing the delivery of education in K–12 schools, encouraging exercise and weight loss, and much more. At corporations, not-for-profits, and academic institutions, researchers are analyzing data and testing theories that often rely on data about individuals. Many of these new uses of personal information are natural extensions of current practices, well within the expectations of individuals and the boundaries of traditional Fair Information Practice Principles. In other cases, data use may exceed expectations, but organizations can provide individuals with additional notice and choice. However, in some cases enhanced notice and choice is not feasible, despite the considerable benefit to consumers if personal inf...

Cyberspace Law eJournal, 2017
The prospect of digital manipulation on major online platforms has reached fever pitch in the last election cycle in the United States. Jonathan Zittrain’s concern about “digital gerrymandering” has found resonance in reports, resoundingly denied by Facebook, that the company edited content to tone down conservative voices. At the start of the election cycle, critics blasted Facebook for allegedly injecting editorial bias into an apparently neutral content generator, its “Trending Topics” feature. Immediately after the election, when the extent of dissemination of “fake news” through social media became known, commentators chastised Facebook for not proactively policing user-generated content to block and remove untrustworthy information. Which one is it then? Should Facebook have deployed policy-directed technologies, or should its content algorithm have remained policy neutral? This article examines the potential for bias and discrimination in automated algorith...

European Courts have recognized a “right to be forgotten” (RTBF) that would allow individuals to stop data search engines or other third parties from providing links to information about them deemed irrelevant, no longer relevant, inadequate, or excessive. There is a lack of consensus between the EU and the U.S. on the legitimacy of this right, which illustrates the cultural transatlantic clash on the issue of the importance of privacy versus other rights, such as freedom of information and freedom of speech. This is problematic given that privacy regulators in Europe have also pressed for a broad view of this right, seeking to extend it globally, requesting that information not only be delisted from European extensions, but from all extensions. Some are concerned that such an extraterritorial effect not only allows someone from a different jurisdiction or country to erase information that they perceive as “irrelevant” or “illegitimate” based on their own set of values; it also argu...

Santa Clara Law Review, 2016
One of the most hotly debated issues in privacy and data security is the notion of identifiability of personal data and its technological corollary, de-identification. De-identification is the process of removing personally identifiable information from data collected, stored and used by organizations. Once viewed as a silver bullet allowing organizations to reap the benefits of data while minimizing privacy and data security risks, de-identification has come under intense scrutiny with academic research papers and popular media reports highlighting its shortcomings. At the same time, organizations around the world necessarily continue to rely on a wide range of technical, administrative and legal measures to reduce the identifiability of personal data to enable critical uses and valuable research while providing protection to individuals’ identity and privacy. The debate around the contours of the term personally identifiable information, which triggers a set of legal and regulator...
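The abstract above defines de-identification as removing personally identifiable information from collected data. As a purely illustrative sketch (not taken from the article; all field names such as `user_id` and `zip` are hypothetical), a minimal pipeline combines three of the technical measures the abstract alludes to: dropping direct identifiers, pseudonymizing the record key with a salted hash, and generalizing a quasi-identifier.

```python
import hashlib

# Hypothetical identifier inventory; real schemas vary by organization.
DIRECT_IDENTIFIERS = {"name", "email", "phone"}

def deidentify(record: dict, salt: str) -> dict:
    """Drop direct identifiers, pseudonymize the key, coarsen a quasi-identifier."""
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue  # remove direct identifiers outright
        if field == "user_id":
            # Replace the stable key with a salted hash (pseudonymization).
            # The salt must stay secret, or the hash can be reversed by
            # enumerating known IDs.
            out[field] = hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]
        elif field == "zip":
            # Generalize a quasi-identifier: keep only the 3-digit prefix.
            out[field] = str(value)[:3] + "xx"
        else:
            out[field] = value
    return out
```

As the scrutiny the abstract describes makes clear, measures like these reduce but do not eliminate re-identification risk: remaining quasi-identifiers can still be linked across datasets.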

Federal Communications Law Journal, 2017
TABLE OF CONTENTS
I. INTRODUCTION
II. THE INTERNET OF THINGS AND PRIVACY
III. THE INTERNET OF THINGS AND INCLUSION
A. For people who are visually impaired
B. For people with mobility-related limitations
C. For people who are hearing impaired
D. For older adults and the elderly
E. For those with health concerns
F. For the infirm
G. For the economically disadvantaged
H. For farmers in rural communities
I. Improving Interoperability and Access
IV. EMERGING INDUSTRY STANDARDS AND NORMS
A. Wearables
B. "Always Ready" Home Devices
V. CONCLUSION

I. INTRODUCTION
In the next decade, a critical issue for policymakers and regulators will be the advancement and growing ubiquity of cyber-physical systems, or the Internet of Things (IoT). Consumer-facing IoT systems are already delivering benefits to consumers and society. IoT can also be a powerful tool for inclusion and equality, enabling accessibility for many who have tra...

IEEE Security & Privacy, 2021
Cybersecurity is an established business sector with numerous global public companies employing hundreds of thousands of professionals. Although the range of services offered continues to change and expand, experienced security executives understand the types of technologies they will need to develop and retain. In contrast, the privacy technology sector is still maturing. While many privacy tools and services are well developed and numerous businesses have gained traction by solving privacy problems, companies’ needs for privacy technology to help manage and utilize data continue to evolve rapidly. Based on a report by the Future of Privacy Forum and the Privacy Tech Alliance, this article reviews developments in the privacy technology sector and charts the services provided as companies mature. The report draws on dozens of hours of interviews with buyers and sellers as well as conversations with leading privacy tech experts and a survey sent to privacy technology companies.
IEEE Security & Privacy, 2018

The ANNALS of the American Academy of Political and Social Science, 2017
Companies and government entities collect substantial amounts of administrative data through the Internet; mobile communications; and a vast infrastructure of devices and sensors embedded in healthcare facilities, retail outlets, public transportation, social networks, workplaces, and homes. They use administrative data to test new products and services, improve existing offerings, conduct research, and foster innovation. However, the lack of a clear legal framework and ethical guidelines for use of administrative data jeopardizes the value of important research. Concerns over legal impediments and ethical restrictions threaten to diminish productive collaboration between researchers and private sector businesses. This article provides strategies for organizations to minimize risks of reidentification and privacy violations for individual data subjects. In addition, it suggests that privacy and ethical concerns would best be managed by supporting the development of administrative da...

Stanford Law Review Online, Sep 3, 2013
This Essay attempts to frame the conversation around de-identification. De-identification is a process used to prevent a person's identity from being connected with information. Organizations de-identify data for a range of reasons. Companies may have promised "anonymity" to individuals before collecting their personal information, data protection laws may restrict the sharing of personal data, and, perhaps most importantly, companies de-identify data to mitigate privacy threats from improper internal access or from an external data breach. Hackers and dishonest employees occasionally uncover and publicly disclose the confidential information of individuals. Such disclosures could prove disastrous, as public dissemination of stigmatizing or embarrassing information, such as a medical condition, could negatively affect an individual's employment, family life, and general reputation. Given these negative consequences, industries and regulators often rely on de-identification to reduce the occurrence and harm of data breaches. Regulators have justifiably concluded that strong de-identification techniques are needed to protect privacy before publicly releasing sensitive information. With publicly released datasets, experts agree that weak technical de-identification creates an unacceptably high risk to privacy. For example, statisticians have re-identified some individuals in publicly released datasets.
SSRN Electronic Journal, 2014
Lecture Notes in Computer Science, 2014
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The arrival of new technologies in schools and classrooms around the nation has been met with a mixture of enthusiasm and anxiety. Education technologies ("ed tech") present tremendous opportunities: they allow schools to tailor programs to individual students; make education more collaborative and engaging through social media, gamification, and interactive content; and facilitate access to education for anyone with an Internet connection in remote parts of the world. At the same time, the combination of enhanced data collection with highly sensitive information about children and teens presents grave privacy risks. Indeed, in a recent report, the White House identified privacy in education as a flashpoint for big data policy concerns. This Article is the most comprehensive study to date of the policy issues and privacy concerns arising from the surge of ed tech innovation. It surveys the burgeoning market of ed tech solutions, which range from free Android and iPhone apps to comprehensive learning management systems and digitized curricula delivered via the Internet. It discusses the deployment of big data analytics by education institutions to enhance student performance, evaluate teachers, improve education techniques, customize programs, and better leverage scarce resources to optimize education results. This Article seeks to untangle ed tech privacy concerns from the broader policy debates surrounding standardization, the Common Core, longitudinal data systems, and the role of business in education. It unpacks the meaning of commercial data uses in schools, distinguishing between behavioral advertising to children and providing comprehensive, optimized education solutions to students...

* Jules Polonetsky is Co-chair and Executive Director of the Future of Privacy Forum.
We use "techno-social chaos" to refer to the tight and often strained interaction between technological developments and social norms.