Privacy Program Management
Tools for Managing Privacy Within Your Organization
Third Edition
Executive Editor and Contributor
Russell Densmore, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Contributors
Susan Bandi, CIPP/E, CIPP/US, CIPM, CIPT, FIP
João Torres Barreiro, CIPP/E, CIPP/US
John Brigagliano
Ron De Jesus, CIPP/A, CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Jonathan Fox, CIPP/US, CIPM
Jon Neiditz, CIPP/E, CIPP/US, CIPM
Chris Pahl, CIPP/C, CIPP/E, CIPP/G, CIPP/US, CIPM, CIPT, FIP
Liisa Thomas
Amanda Witt, CIPP/E, CIPP/US
Edward Yakabovicz, CIPP/G, CIPM, CIPT
An IAPP Publication
©2022 by the International Association of Privacy Professionals (IAPP)
All rights reserved. No part of this publication may be reproduced, stored in a
retrieval system, or transmitted in any form or by any means, mechanical,
photocopying, recording, or otherwise, without the prior written permission of
the publisher, International Association of Privacy Professionals, Pease
International Tradeport, 75 Rochester Ave., Portsmouth, NH 03801, United
States of America.
CIPP, CIPP/US, CIPP/C, CIPP/E, CIPP/G, CIPM, and CIPT are registered
trademarks of the International Association of Privacy Professionals, Inc.
registered in the United States. CIPP, CIPP/E, CIPM, and CIPT are also
registered in the European Union as Community Trademarks (CTM).
Indexer: Hyde Park Publishing Services
ISBN: 978-1-948771-56-6
Contents
About the IAPP
Preface
Acknowledgments
Introduction
CHAPTER 1
Introduction to Privacy Program Management
1.1 Responsibilities of a Privacy Program Manager
1.2 Accountability
1.3 Beyond Law and Compliance
1.4 Why Does an Organization Need a Privacy Program?
1.5 Privacy Across the Organization
1.6 Championing Privacy
1.7 Summary
CHAPTER 2
Privacy Program Framework: Privacy Governance
2.1 Create an Organizational Privacy Vision and Mission Statement
2.2 Define Privacy Program Scope
2.3 Develop a Privacy Strategy
2.4 Develop and Implement a Framework
2.5 Frameworks
2.6 Privacy Technology and Governance, Risk, and Compliance Vendors and Tools
2.7 Structure the Privacy Team
2.8 Establishing the Organizational Model, Responsibilities, and Reporting Structure
2.9 Summary
CHAPTER 3
Privacy Program Framework:
Applicable Privacy Laws and Regulations
3.1 Global Privacy Laws
3.2 Self-Regulation: Industry Standards and Codes of Conduct
3.3 Cross-Border Data Transfers
3.4 Organizational Balance and Support
3.5 Understanding Penalties for Noncompliance with Laws and Regulations
3.6 Understanding the Scope and Authority of Oversight Agencies
3.7 Other Privacy-Related Matters to Consider
3.8 Monitoring Laws and Regulations
3.9 Third-Party External Privacy Resources
3.10 Summary
CHAPTER 4
Privacy Operational Life Cycle: Assess:
Data Assessments
4.1 Data Governance
4.2 Inventories and Records
4.3 Records of Processing Activities Under the EU General Data Protection Regulation
4.4 Assessments and Impact Assessments
4.5 Physical and Environmental Assessments
4.6 Assessing Vendors
4.7 Mergers, Acquisitions, and Divestitures: Privacy Checkpoints
4.8 Summary
CHAPTER 5
Privacy Operational Life Cycle: Protect:
Protecting Personal Information
5.1 Privacy by Design
5.2 Data Protection by Design and Default
5.3 Diagramming Privacy by Design
5.4 Information Security
5.5 Data Privacy and Information Security
5.6 Privacy Policy and Technical Controls
5.7 Summary
CHAPTER 6
Privacy Operational Life Cycle: Protect: Policies
6.1 What Is a Privacy Policy?
6.2 Privacy Policy Components
6.3 Interfacing and Communicating with an Organization
6.4 Communicating the Privacy Policy within the Organization
6.5 Policy Cost Considerations
6.6 Design Effective Employee Policies
6.7 Procurement: Engaging Vendors
6.8 Data Retention and Destruction Policies
6.9 Implementing and Closing the Loop
6.10 Summary
CHAPTER 7
Privacy Operational Life Cycle: Sustain:
Monitoring and Auditing Program Performance
7.1 Metrics
7.2 Monitor
7.3 Audits
7.4 Summary
7.5 Glossary
CHAPTER 8
Privacy Operational Life Cycle: Sustain:
Training and Awareness
8.1 Training and Awareness
8.2 Leveraging Privacy Incidents
8.3 Communication
8.4 Creating Awareness of the Organization’s Privacy Program
8.5 Awareness: Operational Actions
8.6 Identifying Audiences for Training
8.7 Training and Awareness Strategies
8.8 Training and Awareness Methods
8.9 Using Metrics
8.10 Summary
CHAPTER 9
Privacy Operational Life Cycle: Respond:
Data Subject Rights
9.1 Privacy Notices and Policies
9.2 Choice, Consent, and Opt-Outs
9.3 Obtaining Consents from Children
9.4 Data Subject Rights in the United States
9.5 Data Subject Rights in Europe and the United Kingdom
9.6 Responding to Data Subject Requests
9.7 Handling Complaints: Procedural Considerations
9.8 Data Subject Rights Outside the United States and Europe
9.9 Summary
CHAPTER 10
Privacy Operational Life Cycle: Respond:
Data Breach Incident Plans
10.1 Incident Planning
10.2 How Incidents Occur
10.3 Terminology: Security Incident versus Breach
10.4 Getting Prepared
10.5 Roles in Incident Response Planning by Function
10.6 Integrating Incident Response into the Business Continuity Plan
10.7 Incident Handling
10.8 Roles Different Individuals Play During an Incident
10.9 Investigating an Incident
10.10 Reporting Obligations and Execution Timeline
10.11 Recovering from a Breach
10.12 Benefiting from a Breach
10.13 Summary
About the Contributors
Index of Searchable Terms
About the IAPP
The International Association of Privacy Professionals (IAPP) is the largest and
most comprehensive global information privacy community and resource, helping
practitioners develop and advance their careers and organizations manage and
protect their data.
The IAPP is a not-for-profit association founded in 2000 with a mission to
define, support, and improve the privacy profession globally. We are committed to
providing a forum for privacy professionals to share best practices, track trends,
advance privacy management issues, standardize the designations for privacy
professionals, and provide education and guidance on opportunities in the field of
information privacy.
The IAPP is responsible for developing and launching the only globally
recognized credentialing programs in information privacy: the Certified
Information Privacy Professional (CIPP®), the Certified Information Privacy
Manager (CIPM®), and the Certified Information Privacy Technologist (CIPT®).
The CIPP, CIPM, and CIPT are the leading privacy certifications for thousands of
professionals around the world who serve the data protection, information
auditing, information security, legal compliance, and/or risk management needs
of their organizations.
In addition, the IAPP offers a full suite of educational and professional
development services and holds annual conferences that are recognized
internationally as the leading forums for the discussion and debate of issues
related to privacy policy and practice.
Preface
We now live in an interconnected world where data is as valuable as gold. Interconnected products and services are oftentimes engineered to “your” liking. They make tasks easier, such as mobile banking, ordering online, or watching television. They can help people monitor their own health and wellness with devices like a Fitbit or Apple Watch. They allow for streaming of music and other online content. These products and services are all designed to make your life better and improve services. However, there is a trade-off: they require you to allow others to use your data in ways that you choose.
This interaction and sharing of data between individuals and service providers is
growing at an exponential pace. If we as privacy professionals do not stand up for the rights and freedoms of individuals and ensure proper protection of their personal information, then who will?
Over the last decade, we have seen privacy ingrained into everyday operations of
organizations. The proper handling of data by organizations is demanded by
society. Probably one of the greatest changes we have seen in privacy program
management is in training and awareness. Similar to how information security has
been “baked” into an organization’s training strategy, so has privacy. It is not
uncommon to see an organization emphasize that protecting data is the
responsibility of each employee. Now we see organizations adding that protecting
personal information is also the employee’s responsibility. This has been
incorporated into many organizations’ standard operating procedures. This is a
good thing.
The roles of the chief privacy officer, privacy program manager, privacy analyst,
and privacy engineer are to ensure organizations are adhering to the privacy
principles outlined in various privacy laws around the globe. The laws may have
specific requirements; however, most of the regulations are based on the same
principles. The principles may be named differently but in essence are quite
similar. These privacy principles must be adhered to if an organization wishes to
be compliant with the varying regulations. This is where the privacy program
manager comes into play. The privacy program manager leads the effort to ensure
privacy principles are being carried out through information security practices.
This activity will look different for every organization. The privacy program
manager works with other privacy professionals, if available, to establish the
proper policies, procedures, and processes that will protect a data subject’s
personal information.
The success of the privacy compliance program for different organizations relies
heavily on how the organization has established its data governance program.
Some organizations do not have a structured data governance program. The
importance of good data governance is being highlighted as organizations race to
comply with not only privacy regulations, but also sectoral regulations, such as
finance and medical. This is a new area in which the privacy professional may play an increased role.
I would like to humbly thank the International Association of Privacy
Professionals (IAPP) for allowing me this opportunity for a third edition and
everyone who assisted with this textbook, especially the individual authors who
contributed in their areas of expertise. They are all dedicated and supportive
professionals, proving we can all work together as a holistic team to achieve
success. This work would not be possible without all of them. My deepest thanks
to the team.
Russell Densmore, CIPP/E, CIPP/US, CIPM, CIPT, FIP
October 2021
Acknowledgments
This third edition of Privacy Program Management: Tools for Managing Privacy
Within Your Organization would not have been possible without contributions and
support from the IAPP’s global community of privacy and data protection
professionals.
Thank you to our Training Advisory Board. We are ever grateful for your
guidance and generosity in sharing your expertise. Current members include:
Shay Babb, CIPP/C, CIPM
Robin Anise Benns, CIPP/US
Jonathan Cantor, CIPP/G, CIPP/US
Justin Castillo, CIPP/E, CIPP/US, CIPM
Alfredo Della Monica, CIPP/E
Katrina Destrée, CIPP/E
Marta Dunphy-Moriel, CIPP/E
Thays Castaldi Gentil, CIPP/E
Ian Goodwin, CIPP/E, CIPM, CIPT, FIP
Wei Gu, CIPM
Adam Higgins, CIPP/E, CIPM, CIPT, FIP
Kulwinder Johal, CIPP/E
Mazen Kassis, CIPM
Sakshi Katyal
Julie McEwen, CIPP/G, CIPP/US, CIPM, CIPT, FIP
Sarah Morrow, CIPP/US, CIPM, FIP
Theresa Niland
Viviane Nobrega Maldonado, CIPP/E
Cristina Onosé, CIPP/C, CIPM
Chris Pahl, CIPP/C, CIPP/E, CIPP/G, CIPP/US, CIPM, CIPT, FIP
Julia Palmer, CIPP/E, CIPM
Leonard Rivera, CIPP/US
Jennifer Schack, CIPP/E, CIPP/US, CIPM, FIP
Timothy Smit, CIPP/E, CIPP/US, CIPM, FIP
James Snell
Garry Tyler Spence, CIPP/E, CIPP/US, CIPM, FIP
Becky Tarrant, CIPP/E, CIPM
Liisa Thomas
Michael Tibodeau, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Jessica Vaianisi, CIPP/C
Judith van de Vorle, CIPP/E, CIPM
Victoria van Roosmalen, CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Rajesh Kumar Viswanathan, CIPP/A, CIPP/E, CIPP/US, CIPM, FIP
Victoria Watts, CIPP/E, CIPT
Zhaofeng Zhou, CIPP/E
It has been my true pleasure to work with Russell Densmore, CIPP/E,
CIPP/US, CIPM, CIPT, FIP, who serves as executive editor for this book. He led
our contributing team of privacy and data protection pros from around the globe
through all stages of development and has supported our CIPM program from its
inception. Thank you for your guidance, advice, and continued commitment to
this project.
To our stellar contributors—Susan Bandi, CIPP/US, CIPM, CIPT, FIP, João
Torres Barreiro, CIPP/E, CIPP/US, John Brigagliano, Ron De Jesus, CIPP/A,
CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, FIP, Jonathan Fox, CIPP/US, CIPM,
Jon Neiditz, CIPP/E, CIPP/US, CIPM, Chris Pahl, CIPP/C, CIPP/E, CIPP/G,
CIPP/US, CIPM, CIPT, FIP, Liisa Thomas, Amanda Witt, CIPP/E, CIPP/US,
and Edward Yakabovicz, CIPP/G, CIPM, CIPT—we are so grateful you have
shared your expertise and diverse perspectives in the pages of this book.
Many thanks to Jyn Schultze-Melling for permission to include his chapter on
the rights of data subjects from European Data Protection: Law and Practice, Second
Edition as an excerpt in Chapter 9 of this book.
Wei Gu, CIPM, Adam Higgins, CIPP/E, CIPM, CIPT, FIP, Sarah Morrow,
CIPP/US, CIPM, FIP, Julia Palmer, CIPP/E, CIPM, Jennifer Schack, CIPP/E,
CIPP/US, CIPM, FIP, Timothy Smit, CIPP/E, CIPP/US, CIPM, FIP, Becky
Tarrant, CIPP/E, CIPM, and Michael Tibodeau, CIPP/E, CIPP/US, CIPM,
CIPT, FIP, thank you for providing thoughtful, constructive feedback on the draft
manuscript.
Thank you to Hyde Park Publishing Services for creating the book index.
We appreciate the hard work, expertise, and dedication of the many
professionals who contributed to the publication of this book. We hope you will
find it to be both a useful tool for preparing for your CIPM certification and a
practical resource for your professional career.
Marla Berry, CIPT
Training Director
International Association of Privacy Professionals
Introduction
In 2013, when we launched the Certified Information Privacy Manager program,
the idea of operating a privacy program was still novel. Our profession largely
evolved from law and compliance, and privacy was, in many ways, binary: the
privacy professional gave the product or service a thumbs-up or thumbs-down.
Quickly, however, organizations with business models increasingly dependent
on data came to realize that better management and customer trust were needed.
Unless the privacy professional was involved at every step of product
development, organizations faced too much risk.
Further, with the passage of the EU General Data Protection Regulation
(GDPR), the idea of operational privacy, or “privacy by design” (PbD), became
law. In the years since our last edition, the GDPR’s effects have become further
cemented into business operations, while other laws around the world continually
borrow concepts from the GDPR.
Moreover, the privacy world has gone through a panoply of changes. Brazil and
China now have national data protection laws. India is pondering its own law, and
several other nations around the world have passed or will pass their own
legislation. In the United States, California passed not one, but two,
comprehensive privacy laws. Other states followed suit, including Colorado and
Virginia. And more may be on the horizon.
Keeping up with these developments complicates the efforts of the privacy
office. Finding areas of convergence and identifying gaps is a must for risk
management and compliance. Operationally, many of these laws now require
organizations to facilitate data subject access requests, as well as rights to deletion,
correction, and portability. An entire marketplace of privacy technology vendors
equipped with products and services designed to scale the internal privacy
function has grown in response.
Additionally, in the wake of the Court of Justice of the European Union’s (CJEU) decision in “Schrems II,” international data flows have become exponentially more complicated. Companies must conduct transfer impact assessments, deploy new
standard contractual clauses, and rely on alternative transfer mechanisms, such as
binding corporate rules and derogations. Data localization is taxing cloud vendors
and creating its own sources of risk.
Plus, artificial intelligence and machine learning systems, which often require
massive amounts of data collection, are proliferating across industry sectors.
As we’ve consistently observed in our annual IAPP-EY Privacy Governance
Report, organizations with mature privacy operations not only have full teams of
privacy professionals, but they also have them embedded in various business
operations and administrative departments, ranging from human resources to
information technology, marketing, and sales. They provide privacy with
multimillion-dollar budgets. They buy tech bespoke for privacy operations.
In short, privacy program management is a foundational component in modern
business, and the need for sophisticated leaders who understand the complexities
of the global digital marketplace will only increase.
Yet again, Executive Editor Russell Densmore, CIPP/E, CIPP/US, CIPM,
CIPT, FIP, has overseen a variety of valuable contributions in revamping Privacy
Program Management: Tools for Managing Privacy Within Your Organization. There
are more practical examples, more deep dives into the “how” of privacy
management, and more information on the tools privacy professionals are using
to create effective privacy programs.
For data protection officers, privacy program managers, global privacy leaders,
and any number of other titles emerging around the globe, the CIPM is the
perfect tool for privacy professionals working in both the public and private
sectors. This book helps unlock the benefits of CIPM and prepare those hoping to
take the exam and get certified.
I am extremely pleased with the way the CIPM continues to be accepted around
the globe as the standard for how privacy is done on the ground. I hope you—and
your organization—enjoy its benefits.
J. Trevor Hughes, CIPP
President and CEO
International Association of Privacy Professionals
CHAPTER 1
Introduction to Privacy Program Management
Russell Densmore, CIPP/E, CIPP/US, CIPM, CIPT, FIP
What is privacy program management? It is the structured approach of combining
several projects into a framework and life cycle to protect personal information
and the rights of individuals. A properly structured privacy program enables an organization to comply with its legal and regulatory requirements and meet the expectations of clients or customers, while at the same time preventing and mitigating privacy risks.
What is program management? It is the process of managing multiple projects
across an organization to improve performance. Program management is used
widely in the aerospace and defense industries. It allows for oversight and status tracking of projects to ensure the goals of the program are met. It allows for a holistic view of multiple projects and of change management. It also allows valuable metrics to be viewed across the program.
What is a framework? A framework is the skeletal structure needed to support
program management. Each organization’s privacy program framework will be created by analyzing the applicable laws, regulations, and best practices and tailoring them specifically to the goals of that organization.
What is a life cycle? It is the series of stages that something passes through
during its existence. In privacy program management, we refer to the privacy
governance life cycle of assess, protect, sustain, and respond.
The privacy framework and life cycle follow well-known program management
principles and consider privacy laws and regulations from around the globe. They
incorporate common privacy principles and implement concepts such as privacy
by design (PbD) and privacy by default.1
The term “privacy” has varying definitions among multiple nations, states,
regions, and industries of the world. Most people agree privacy is not the same as
secrecy and thus should not be confused with data classification models used by
governments of the world, which may label information as sensitive, secret, top
secret, etc. Privacy is a dynamic object with a discrete set of attributes and actions
that is difficult to observe and measure. Therefore, the use of a privacy framework
and life cycle provides the guidance and structure necessary to ensure a successful
program implementation and ongoing adherence. The world is demanding that organizations be accountable for the data they collect, how they manage that data, and how they use personal information, so that the rights of individuals are protected and respected. A structured privacy program exhibits an organization’s thoughtful
and intentional plan to protect personal information and the rights of individuals.
Since privacy is a subject of global importance, organizations can no longer
ignore the requirements necessary to protect personal information imposed by
laws, regulations, and industry best practices. As governments continue to impose
tighter laws and regulations, consumers continue to demand more protection
from organizations they choose to entrust with their information. Consequently,
organizations must meet these demands through placement of greater controls,
processes, and procedures on information under their custodial control. With so
many spheres of influence and pressure, global privacy teams must now seek to
track, manage, and monitor the dynamic changes that appear to occur
continuously.
As with all business management tasks, a privacy governance life cycle
provides the methods to assess, protect, sustain, and respond to the positive
and negative effects of all influencing factors. This framework and life cycle
thereby provides reusable procedures and processes that outline the courses of
action. Like maps, frameworks provide inquiry topics and direction (e.g., problem
definition, purpose, literature review, methodology, data collection, and analysis)
to ensure quality through repeatable programmatic steps, thereby reducing errors
or gaps in knowledge or experience. For the purpose of this book, this framework
and life cycle is called the “privacy program framework.” Although a dedicated
privacy team or privacy professional (e.g., a data protection officer) owns this
framework, it shares ownership and management aspects with other stakeholders
throughout the organization, including employees, executive leadership,
managers, and external entities, such as partners, vendors, and customers.
“Privacy professional” is a general term used to describe any member of the
privacy team who may be responsible for privacy program framework
development, management, and reporting within an organization.
“Assess” is the first of four phases of the privacy operational life cycle that will
provide the steps, checklists, and processes necessary to assess any gaps in a
privacy program as compared to industry best practices, corporate privacy
policies, applicable privacy laws and regulations, and the privacy framework
developed for your organization. The privacy professional should note that
although the assessing of a privacy program is explained sequentially, in actual
practice, the elements may be performed simultaneously, in separate components,
or tailored to organizational requirements. For example, you may assess a program by measuring and aligning it against organizational standards and guidelines, by aligning privacy management to regulatory and legislative mandates, by comparing it against industry best practices, or through a hybrid of these approaches. There are currently many models and frameworks that allow measurement and alignment of these activities, including privacy maturity models, such as the AICPA/CICA Privacy Maturity Model, the Generally Accepted Privacy Principles (GAPP) framework, and privacy by design (PbD).
“Protect” is the second of four phases of the privacy operational life cycle. It
provides the data life cycle, information security practices, and PbD principles to
protect personal information. Although the subject matter is technical, encompassing information security, information assurance, and cybersecurity practices, this chapter provides a generic, high-level overview for the privacy professional. The protect phase of the
privacy operational life cycle embeds privacy principles and information security
management practices within the organization to address, define, and establish
privacy practices.
For any organization, domestic and global privacy management is further
complemented through each of the operational life cycle phases related to
jurisdiction, compliance, and laws. Understanding and analyzing each of these
phases as they relate to an organization provides the privacy professional a greater
understanding of how to protect personal information.
Privacy spans the entire organization, from HR, legal, and other supporting functions to the businesses and procurement. Therefore, do not forget to
take into account laws and regulations applying to other areas, such as labor or
telecommunications law, as these may well interact with privacy laws.
“Sustain” is the third of four phases of the privacy operational life cycle that
provides privacy management through the monitoring, auditing, and
communication aspects of the management framework. Monitoring across several functions in the organization, including audit, risk, and security practices, makes the identification, mitigation, and reporting of risks arising from variations or gaps in operations part of “business as usual,” helping the organization meet regulatory, industry, and business objectives.2 Monitoring should be continuous and based on the organization’s risk
appetite through defined roles and responsibilities that may include privacy, audit,
risk, and security roles.
“Respond” is the fourth of four phases of the privacy operational life cycle. It
includes the respond principles of information requests, legal compliance,
incident-response planning, and incident handling. The “respond” phase of the
privacy operational life cycle aims to reduce organizational risk and bolster
compliance to regulations. Every corporation needs to be prepared to respond to
its customers, partners, vendors, employees, regulators, shareholders, or other
legal entities. The requests can take a broad range of forms, from simple questions, to requests for data corrections, to more in-depth legal disclosures about individuals. No matter the type of request, you need to be prepared to properly receive, assess, and respond to it.
Businesses are motivated today, more than ever, to ensure they are compliant
with privacy laws and regulations around the globe—in part, because they want to
protect their brand name, reputation, and consumer trust. Large data breaches
frequently make news headlines, and organizations have paid a significant price, whether through regulatory penalties, class-action lawsuits brought by affected individuals, lost revenue, or lost consumer trust. Millions of people have been affected by sloppy data
protection practices of the past. This must change, and organizations must take
seriously how they handle personal information entrusted to them.
It is time for the privacy profession to recognize the value of a holistic data
privacy program and ever-important privacy program manager. This textbook
delves into the requirements for becoming a privacy program manager. The
Certified Information Privacy Manager (CIPM) certification indicates that a
privacy program manager has the proper understanding of concepts, frameworks,
life cycles, and regulations to hold the role of privacy program manager for their
employer.3
1.1 Responsibilities of a Privacy Program
Manager
The role and responsibilities of a privacy program manager may vary widely depending on the type, size, and complexity of the organization and its business objectives, and may be performed by one or more privacy professionals who form part of the central privacy team. This role also may not always carry that job title; for example, a data protection officer or a data privacy analyst could undertake specific responsibilities of a privacy program manager in some organizations. It is
important to remember to align the various parts of a privacy program to business
objectives so as not to be in contention. The privacy program and operations
should align and support the business as a valued partner, not be seen as a
“blocker.” The person who ultimately leads the endeavor is usually referred to as
the privacy program manager.
The goals of a privacy program manager are to:
Define privacy obligations for the organization
Identify and mitigate business, employee, vendor, and customer privacy
risks
Identify existing documentation, policies, and procedures around the
management of personal information
Create, revise, and implement policies and procedures that effect
positive practices and together comprise a privacy program
Raise the data IQ of the organization to drive and embed a privacy-oriented culture
The goals of a privacy program (at a minimum) are to:
Demonstrate an effective and auditable framework to enable
compliance with applicable data protection laws and regulations
Promote trust and confidence in the organization’s handling of data entrusted to it by individuals, including consumers and employees
Highlight that an organization takes its data privacy obligations
seriously
Respond effectively to privacy breaches and data subject requests
Continually monitor, maintain, and improve the maturity of the privacy
program
The specific responsibilities of the privacy program manager include:
Policies, privacy notices, procedures, and governance
Privacy-related awareness and training
Incident response and privacy investigations
Regulator complaints
Data subject requests
Communications
Privacy controls
Privacy issues with existing products and services
Privacy-related monitoring
Privacy impact assessments
Development of privacy staff
Privacy-related data committees
PbD in product development
Privacy-related vendor management
Privacy audits
Privacy metrics
Cross-border data transfers
Preparation for legislative and regulatory change
Privacy-related subscriptions
Privacy-related travel
Redress and consumer outreach
Privacy-specific or -enhancing software
Privacy-related certification seals
Cross-functional collaboration with legal, information technology (IT),
information security (sometimes referred to as IS or infosec),
cybersecurity, and ethics teams, among others
Internal and external reporting
As you can see from the preceding list, which is not exhaustive, the roles and responsibilities of the privacy program manager are broad and far-reaching. This text is not meant to clarify every obligation of the privacy program manager but instead to give a holistic view so you may tailor a specific privacy program for your organization.
1.2 Accountability
What is accountability? Accountable organizations have the proper policies and
procedures to promote best practices in handling personal information and,
generally, can demonstrate they have the capacity to comply with applicable
privacy laws. They promote trust and transparency to provide individuals with
confidence in their abilities to protect their personal information and respect their
data rights.
The concept of accountability is one of the most important concepts introduced
by new data protection laws. It is about not only saying the organization is taking
action, but also being able to prove that it is. In other words, the organization is
accountable for the actions it takes or does not take to protect personal data. The
idea is that, when organizations collect and process information about people,
they must be responsible for it. They need to take ownership and take care of it
throughout the data life cycle.
If an organization has a data protection policy in place, the organization should
comply with that policy and document any deviations and actions taken for any
failures in complying with the policy.
Accountability, as defined by laws, can benefit organizations, although it may
impose obligations to take ownership and demonstrate how the organization is
compliant. In exchange, it can give organizations a degree of flexibility about
exactly how they will comply with their obligations. Privacy program managers, as
well as chief information security officers (CISOs) and data protection officers
(DPOs), may be accountable for the safekeeping and responsible use of personal
information—not just to investors and regulators, but also to everyday consumers
and their fellow employees.
1.3 Beyond Law and Compliance
Numerous laws and requirements affect businesses today, and the topic of privacy
is receiving extra attention from legislators and non-privacy regulators. However,
it is not just about laws and compliance. There are various motivators driving
businesses to be more responsible with an individual’s personal data.
One such motivator is consumer trust. Fines and fees from regulators are usually
clearly defined and finite in value. Consumer trust, however, is broad and unbounded, and its loss can have much more severe repercussions. Loss of consumer
trust can be ruinous to organizations. It is hard to obtain and harder to get back
once lost. Therefore, many organizations are motivated to have a mature privacy
program to ensure they do not lose consumer trust.
Obviously, organizations that are business-to-consumer (B2C) will be more
interested in consumer trust than business-to-business (B2B) companies.
However, all organizations have an interest in keeping trust with their partners,
employees, contractors, and customers. Proper handling of personal data is in
every organization’s best interest.
1.4 Why Does an Organization Need a Privacy
Program?
There are many reasons why an organization should have a privacy program.
Foremost is simple accountability. Treating individuals’ personal information with proper respect shows that the organization is reputable.
The reasons for having a privacy program may include but are not limited to:
Enhancing an organization’s brand and public trust
Meeting regulatory obligations
Encouraging ethical data-processing practices
Enabling global operations, such as mergers and acquisitions (M&A)
Preventing and mitigating the effects of data breaches
Providing a competitive differentiator
Increasing the value and quality of data (business asset)
Reducing the risk of employee and consumer class-action lawsuits
Being a good corporate citizen
Meeting expectations of consumers and business clients
Integrating data ethics into organizational decision-making
Good accountability through a robust privacy program may lead to trust in an organization. Trust, especially consumer trust, may greatly benefit the organization. Being transparent, accountable, and a good data steward of personal information shows an organization can be trusted with the information entrusted to it.
1.5 Privacy Across the Organization
Managing privacy within an organization requires the contribution and
participation of many members of that organization and particularly functions
that process high volumes of data, such as HR (employee data) and customer
services (consumer data) teams.
Privacy should continue to develop and mature over time within an organization, so it is important that functional groups understand how they contribute to and support the overall privacy program, as well as the privacy principles themselves.
Importantly, individual groups must have a fundamental understanding of data
privacy because, in addition to supporting the vision and plan of the privacy
officer and privacy team, these groups may need to support independent
initiatives and projects from other stakeholders.
In some larger organizations, members of the privacy team may sit within other
functional groups and have a dedicated privacy role—for example, marketing
privacy managers may advise and sign off on new marketing initiatives and email
campaigns from a privacy perspective. They may report to both the senior
marketing manager and head of privacy. Buy-in and a sense of ownership from key
functions also assist with better acceptance of privacy and sharing of the
responsibility across the organization rather than in one office. Based on the
individual culture, politics, and protocols of the organization, privacy
professionals will need to determine the best methods, style, and practices to
work within the organization or individual functions. Initially, this effort may be
onerous, but building and maintaining good relationships with other key stakeholders ensures privacy is built into the DNA of business processes and design rather than treated as an afterthought.
Many functions directly support the various activities required by the privacy
program. Among these activities are the adoption of privacy policies and
procedures, development of privacy training and communications, deployment of
privacy- and security-enhancing controls, contract development with and
management of third parties that process the personal information of the
organization, and the assessment of compliance with regulations and established
control mechanisms.
Privacy policies and procedures should be created and enforced at a functional
level, i.e., by the central privacy team. Policies imposing general obligations on
employees may also reside with other functions, such as ethics, legal, and
compliance; therefore, it is important to align with other policy owners and
reference other policies as applicable. Information technology (IT) may be
responsible for policies and procedures related to employee use of technical
infrastructure. Policies that govern privacy requirements for providers of third-
party services that have implications for personal data typically sit with
procurement, while those concerning the use and disclosure of employee health
information typically reside with HR.
Since activities that contribute to the protection of employee, customer, and
other data subjects’ personal information span the entire organization, most
groups within the organization should have some policies to address the
appropriate use and protection of personal information specific to their own
functional areas; all such policies will need to be produced in close consultation
with the privacy office. There needs to be an awareness of the difference between
having appropriate policies in place and using appropriate controls. Examples of
the different functions involved in creating procedures related to privacy include:
The learning and development team manages activities related to
employee training. (Training and awareness—with the intention of
changing bad behaviors and reinforcing good ones—are integral to the
success of the privacy program.) This function enables policies and
procedures to be translated into teachable content and can help
contextualize privacy principles into tangible operations and processes.
In smaller companies, these responsibilities may fall on the privacy
function. Whatever the size of the organization, the privacy team will
always need to approve the privacy training output that has been
produced and closely monitor completion rates.
The communications team assists with publishing periodic intranet
content, email communications, posters, and other collateral that
reinforce good privacy practices in line with the company’s branding,
objectives, and tone of voice. This function can also advise on the best
methods of communication to boost higher engagement. For example,
an animated video might work better for certain employees, rather than
a physical poster or intranet blog post.
The information security team aligns more closely to the privacy team
than any other function in the organization. Every security-enhancing
technology that information security deploys—from encryption to
perimeter security controls and data loss prevention (DLP) tools—
helps the privacy program meet its requirements for implementing
security controls to protect personal information. As an example, EU
data protection law incorporates security provisions into the law as one
of its key principles. The information security team ensures that
appropriate technological controls are employed (e.g., complex
passwords, encryption, role-based access) and determines whether the
various groups within an organization are aware of and comply with the
organizational and technical controls that govern their activities and
behaviors.
The IT team can enhance the effectiveness of the privacy program by
adding processes and controls that support privacy principles. For
example, creating processes to develop and test software and
applications in a manner that does not require the use of production
data may decrease the chances that the data will be compromised. This
may also keep individuals who have no business need to view personal
data from accessing it. Creating systems that support role-based access
also supports the larger purposes of the privacy program by specifically
identifying and limiting who can access the personal information in a
particular system. The IT team should carry the mantle of PbD by
implementing privacy principles into the realm of technology
development, for instance, by limiting the data fields built into a tool or
application to only those actually required to perform a process or
action, or by building in functions that enable the user to easily delete data according to a retention schedule (see the brief sketch after this list).
An internal audit team assesses whether controls are in place to
protect personal information and whether people and processes within
the organization are abiding by these controls. This group can be
considered an ally of the privacy program and, in a sense, a member of
the privacy program, although it traditionally functions independently.
It is good practice to align with the internal audit team, particularly as
the privacy program matures, for help and assistance in developing a
framework to monitor privacy policies, controls, and procedures
already implemented to ensure they are being adhered to and working
as they should. This can also make the internal audit team’s own review of the privacy program a much smoother process.
Procurement plays an important role in ensuring that contracts are in
place with third-party service providers that process personal
information on behalf of the organization and that the appropriate data
privacy contractual language is imposed on these service providers.
Most privacy laws require data controllers or other entities directly
subject to data protection laws to ensure their privacy requirements are
fulfilled. Procurement teams usually support the privacy and/or legal
teams in facilitating or, in some cases, performing due diligence, taking
action based on the results, and making sure contractual language
reduces the organization’s exposure. In smaller organizations, a legal
department may create contract requirements if there is no
procurement.
Human resources (HR) ensures employee information is handled in
accordance with privacy policies and procedures. This function is most
likely to handle sensitive employee information, such as health
information and, in some organizations, information collected for vetting staff.
Ethics and compliance manages whistleblowing and complaints
relating to how an individual’s personal data may have been handled.
Marketing and advertising creates awareness on how to handle
customer personal data for marketing and media purposes.
Business development and strategy helps the organization understand how “good data protection” can drive more business.
Finance ensures Payment Card Industry (PCI), Sarbanes-Oxley (SOX), and other financial requirements are addressed in collaboration with the privacy office.
Legal keeps current on privacy regulations and requirements that affect
your organization.
Risk ensures data protection risks are included in the organization’s
Enterprise Risk Management framework.
Data governance develops a data governance framework that supports
data privacy requirements.
Product research and development performs privacy impact
assessments (PIAs), as well as privacy by design and default (PbDD)
consulting in new product development.
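The role-based access and data-minimization practices described in the IT bullet above lend themselves to a short illustration. The following is a minimal, hypothetical Python sketch; the roles, field names, and retention period are illustrative assumptions, not requirements drawn from this text.

```python
from datetime import date, timedelta

# Hypothetical field-level permissions: each role sees only the fields it needs.
ROLE_FIELDS = {
    "support_agent": {"customer_id", "first_name", "preferred_contact"},
    "billing_analyst": {"customer_id", "billing_address", "last_payment_date"},
}

RETENTION_DAYS = 365  # illustrative period from a (hypothetical) retention schedule


def view_record(record: dict, role: str) -> dict:
    """Return only the fields the caller's role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, set())
    return {name: value for name, value in record.items() if name in allowed}


def is_due_for_deletion(record: dict, today: date) -> bool:
    """Flag a record whose retention period has expired."""
    return record["collected_on"] + timedelta(days=RETENTION_DAYS) < today


customer = {
    "customer_id": "C-1001",
    "first_name": "Alex",
    "preferred_contact": "email",
    "billing_address": "10 Example Street",
    "last_payment_date": date(2021, 6, 1),
    "collected_on": date(2020, 1, 15),
}

print(view_record(customer, "support_agent"))            # billing fields are not exposed
print(is_due_for_deletion(customer, date(2021, 10, 1)))  # True: past the retention period
```

In practice, such controls typically live in identity and access management systems, databases, and data platforms rather than in application code, but the underlying principle is the same: expose and retain only the personal data a given process actually requires.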
1.6 Championing Privacy
Protecting personal data and building a program that drives privacy principles
into the organization cannot be the exclusive job of the privacy officer or privacy
team, any more than playing a symphony is the exclusive responsibility of the
conductor. As with an orchestra, many people, functions, and talents will merge to
execute on a vision.
Many organizations create a privacy committee or council composed of the stakeholders or functional representatives, often referred to as “privacy champions,” who were identified at the start of the privacy program implementation process. These individuals and functions will launch the privacy
program, and their expertise and involvement will continue to be tapped as
remediation needs—some of which may sit within their areas of responsibility—
are identified. They will be instrumental in making strategic decisions and driving
them through their own departments.
Organizations with a global footprint often create a governance structure
consisting of representatives from each geographic region and business function
to ensure that proposed privacy policies, processes, and solutions align with local
laws and to modify them where necessary.
Discuss ways these teams can work together to champion privacy, creating an
even greater awareness of your privacy program. Another benefit of this approach
to building an organization’s awareness program could be that, through the
process of looking at the various awareness programs in place throughout the
organization, you have an opportunity to assess existing programs. Collating
feedback through questionnaires can help to reveal both strengths and weaknesses
in individual programs, which itself is a positive result, contributing to an overall
strengthening of all internal awareness programs.
1.7 Summary
Privacy program managers are responsible for the safekeeping and responsible use
of personal information—not just to investors and regulators, but also to everyday
consumers and their fellow employees. Privacy program managers should be
ready to demonstrate compliance with applicable data privacy laws, reduce risk,
build trust and confidence in the brand, and enhance competitive and
reputational advantages for the organization.
Endnotes
1 Ann Cavoukian, Privacy by Design: The 7 Foundational Principles, accessed November 2018,
https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf.
2 Simon McDougall, “Business As Usual,” interview, November 15, 2012.
3 “CIPM Certification,” IAPP, accessed November 2018, https://iapp.org/certify/cipm/.
CHAPTER 2
Privacy Program Framework: Privacy
Governance
Ron De Jesus, CIPP/A, CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Building a strong privacy program starts with establishing the appropriate
governance of the program. The term “privacy governance” will be used here to
generally refer to the components that guide a privacy function toward
compliance with privacy laws and regulations, as well as enable it to support the
organization’s broader business objectives and goals. These components include:
Creating the organizational privacy vision and mission statement
Defining the scope of the privacy program
Selecting an appropriate privacy framework
Developing the organizational privacy strategy
Structuring the privacy team
2.1 Create an Organizational Privacy Vision and
Mission Statement
The privacy vision and/or mission statement of an organization is critically
important and the key factor that lays the groundwork for the rest of the privacy
program. The privacy vision should align with the organization’s broader purpose
and business objectives, which may be subject to change depending on the
direction of the business. Once refined with feedback from key stakeholders, it
should ultimately be reviewed and approved by executive leadership. It is typically
composed of a few short sentences that succinctly describe the privacy function’s
raison d’être. Alternatively, privacy can be covered in an organization’s overall
mission statement or code of conduct.
A “privacy mission statement” describes the purpose and ideas in just a few
sentences. It should take less than 30 seconds to read.
The following examples illustrate some variations in privacy mission and vision
statements.
2.1.1 Stanford University
Our Mission: Enable Stanford to navigate a dynamic future in privacy through
transparent, ethical and innovative uses of personal data.
The University Privacy Office is a division of the Office of the Chief Risk Officer
(OCRO). We strive to be a valued partner and advisor to the Stanford
community by providing guidance and training on privacy laws, policies and best
practices.
Long Term Goals
Promulgate privacy by design
Build trust in Stanford as a data steward
Manage privacy risk proactively and pragmatically
Advocate for the innovative and ethical use of data
Be a recognized leader in information privacy1
2.1.2 Microsoft
At Microsoft, our mission is to empower every person and every organization on
the planet to achieve more. We are doing this by building an intelligent cloud,
reinventing productivity and business processes and making computing more
personal. In all of this, we will maintain the timeless value of privacy and preserve
the ability for you to control your data.
This starts with making sure you get meaningful choices about how and why data
is collected and used, and ensuring that you have the information you need to
make the choices that are right for you across our products and services.
We are working to earn your trust every day by focusing on six key privacy
principles:
Control: We will put you in control of your privacy with easy-to-use tools
and clear choices.
Transparency: We will be transparent about data collection and use so
you can make informed decisions.
Security: We will protect the data you entrust to us through strong security
and encryption.
Strong legal protections: We will respect your local privacy laws and fight
for legal protection of your privacy as a fundamental human right.
No content-based targeting: We will not use your email, chat, files or
other personal content to target ads to you.
Benefits to you: When we do collect data, we will use it to benefit you and
to make your experiences better.2
2.1.3 Apple
Privacy is a fundamental human right. At Apple, it’s also one of our core values.
Your devices are important to so many parts of your life. What you share from
those experiences, and who you share it with, should be up to you. We design
Apple products to protect your privacy and give you control over your
information. It’s not always easy. But that’s the kind of innovation we believe in.3
2.1.4 Irish Data Protection Commission
Our mission: Protecting data privacy rights by driving compliance through
guidance, supervision and enforcement.
Our vision: The DPC will be a fully fit-for-purpose independent, internationally
respected and trusted supervisor and enforcer of EU data protection law.
We will at all times demonstrate professionalism, competence and expertise in
performing our enhanced role under the GDPR as a lead supervisory authority
for many of the world’s largest technology, internet and social media companies
as well as being the supervisory authority for Irish domestic companies and
organisations, in both the public and private sector.
Where necessary and proportionate, and in accordance with Irish and EU law,
we may apply sanctions, such as administrative fines, against any company or
organisation which has infringed the law. We will exercise our prosecution powers
under the Data Protection Act 2018 and under S.I. No. 336 of 2011 in a fair
and transparent manner.
As required by the new data protection legal framework, we will handle
complaints from individuals who believe their data protection rights have been
infringed and take appropriate action in handling complaints.
The skills profile of our staff, our structure and size, our procedures and our
budget will reflect our expanded domestic and international functions and
responsibilities.
The DPC will continue to progressively build stronger international collaborative
relationships with data protection authorities and consumer protection regulators
across the globe, participating as members of international networks such as the
International Conference of Data Protection and Privacy Commissioners, the
Conference of European Data Protection Authorities and the Global Privacy
Enforcement Network. We will work in cooperation with all EU data protection
supervisory authorities, as a member of the European Data Protection Board, as
required under the new legal framework, and fully participate in the continued
implementation of a harmonised data protection regime throughout the EU.
Our values: We are committed to demonstrating the following values in
performing our functions and fulfilling our responsibilities, and in all of our
interactions with individuals, organisations, statutory and regulatory bodies, and
other stakeholders:4
Fairness
Expertise
Collaboration
Professionalism
Transparency
Independence
Accountability
2.1.5 UK Information Commissioner’s Office (ICO)
Mission: To uphold information rights for the UK public in the digital age.
Vision: To increase the confidence that the UK public have in organisations that
process personal data and those which are responsible for making public
information available.
Strategic Goals
1. To increase the public’s trust and confidence in how data is used and
made available.
2. Improve standards of information rights practice through clear,
inspiring and targeted engagement and influence.
3. Maintain and develop influence within the global information rights
regulatory community.
4. Stay relevant, provide excellent public service and keep abreast of
evolving technology.
5. Enforce the laws we help shape and oversee.5
2.2 Define Privacy Program Scope
After establishing a privacy mission statement and vision, you’ll need to define the
scope of the privacy program. Every organization has its own unique legal and
regulatory compliance obligations, and you’ll need to identify the specific privacy
and data protection laws and regulations that apply to it. A typical approach to
identifying scope includes the following two steps:
1. Identify the personal information collected and processed
2. Identify applicable privacy and data protection laws and regulations
2.2.1 Identify the Personal Information Collected and
Processed
The first step in gaining assurance that you are complying with your regulatory
obligations is to know what personal information your organization collects, uses,
stores, and otherwise processes. There are several ways to ascertain this. Initially,
you can take a less structured approach to identifying where data lives throughout
the organization by setting up information-gathering interviews with the typical
functions that usually collect, use, store, and otherwise process personal
information—human resources (HR), marketing, finance, and IT/information
security. Engaging teams, such as internal audit, data governance, or information
security, to assist you with data discovery efforts is a common approach to bring in
specific expertise, skills, technology, and resources. Overall, taking a lighter touch
can at least help you to determine the general categories and locations of personal
information, which will be key pieces of data for the next step. This manual
approach is relatively time consuming, particularly in larger organizations with
complex IT infrastructures. Therefore, consideration should be given to how this
information will practically be maintained and resourced.
An alternative and more automated approach, although often more costly, is engaging outside help with information-gathering efforts, such as third-party consultancies or data discovery tools (see section 2.6). This approach
benefits from automating the discovery of where personal information is
collected, stored, used, and shared—such engagements are usually more robust
and likely needed for complex data environments. This more structured exercise
of identifying data throughout the data life cycle drives the development of more
accurate and detailed data inventories, maps, and other helpful documentation,
which is generally easier to maintain.
Additionally, since maintaining written documentation about personal
information, including information about how the organization processes the
data, the categories of individuals impacted, and the recipients of data, has
become formalized through Article 30 of the EU General Data Protection
Regulation (GDPR), organizations that are subject to the GDPR should consider
this more thorough, holistic approach to the initial personal information
identification efforts. Chapter 4 covers developing data inventories, conducting
data mapping, and establishing these “records of processing” in further detail.
Some key questions that should be asked to help define the scope of the privacy program are listed below (a sketch of how the answers might be recorded follows the list):
Who collects, uses, and maintains personal information relating to
individuals, customers, and employees? In addition to your own legal
entity, this group includes your service providers, so you need to
understand their roles and obligations, too.
What types of personal information are collected, and what is the
purpose of collection?
Where is the data stored (applications and systems, as well as
countries)?
To whom is the data transferred?
Who has access to the data, both internally and externally (i.e., third
parties)?
When (e.g., during a transaction or hiring process) and how (e.g.,
through an online form) is the data collected?
How long is data retained, and how is it deleted?
What security controls are in place to protect the data?
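To keep the answers to these questions usable over time, many teams capture them in a structured inventory record rather than in free-form notes. The following Python sketch shows one possible shape for such a record; the field names and example values are illustrative assumptions, not a prescribed schema or the format of any particular tool.

# Minimal sketch of a data inventory record capturing answers to the scoping
# questions above. Field names and example values are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataInventoryEntry:
    system: str                       # application or system where the data lives
    data_categories: List[str]        # e.g., contact details, payment data
    data_subjects: List[str]          # e.g., employees, customers
    purposes: List[str]               # why the data is collected and used
    collection_points: List[str]      # when and how the data is collected
    storage_locations: List[str]      # countries or hosting regions
    recipients: List[str]             # internal teams and third parties with access
    retention_period: str             # how long data is kept and how it is deleted
    security_controls: List[str] = field(default_factory=list)

# Example entry gathered from an interview with HR (values are hypothetical).
hr_system = DataInventoryEntry(
    system="HR information system",
    data_categories=["contact details", "salary data"],
    data_subjects=["employees", "job applicants"],
    purposes=["recruitment", "payroll"],
    collection_points=["online application form", "onboarding"],
    storage_locations=["Ireland", "United States"],
    recipients=["payroll provider", "benefits administrator"],
    retention_period="7 years after employment ends",
    security_controls=["encryption at rest", "role-based access"],
)
print(hr_system.system, hr_system.storage_locations)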
2.2.2 Identify In-Scope Privacy and Data Protection
Laws and Regulations
After you have identified key metadata about the personal information your
organization collects (e.g., the specific data elements collected, from whom, where
it is stored), the next step is to identify the organization’s privacy obligations
related to that data. Most global organizations are subject to many data protection
and privacy laws—and some personal information collected and processed may
be subject to more than one regulation. For example, a health care services
company may be subject to domestic regulations governing the handling of
personal health information. The company may also handle financial transactions
and, therefore, be subject to financial reporting regulations, as well. Even further,
organizations that offer services to individuals located in other countries or have
locations overseas will likely be subject to global privacy obligations. Since no two
entities are alike, you will need to determine the true scope for your situation.
If your organization plans to do business within a jurisdiction that has
inadequate or no data protection regulations, institute your organization’s
requirements, policies, and procedures to the highest level achievable instead of
reducing them to the level of the country in which you are doing business. Best
practice is to choose the most restrictive policies, not the least restrictive, as this
also reduces privacy-related risks for your organization.
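To make this scoping exercise repeatable, some teams encode rough screening heuristics that flag regimes for closer legal review based on the metadata gathered in the previous step. The sketch below is a hypothetical illustration only; the trigger conditions are simplified assumptions and do not determine whether a law actually applies to your organization.

# Hypothetical screening heuristics that flag privacy regimes for closer legal
# review. Conditions are deliberately simplified and are not a legal
# determination of applicability.
def flag_regimes(data_categories, data_subject_locations, sectors):
    flags = set()
    if "EU/EEA" in data_subject_locations:
        flags.add("GDPR (review territorial scope)")
    if "California" in data_subject_locations:
        flags.add("CCPA (review thresholds and exemptions)")
    if "health" in data_categories and "healthcare" in sectors:
        flags.add("HIPAA (review covered entity or business associate status)")
    if "payment card" in data_categories:
        flags.add("PCI DSS (contractual industry standard)")
    if "children under 13" in data_categories:
        flags.add("COPPA (review operator status)")
    return sorted(flags)

# Example run for a hypothetical health care services company.
print(flag_regimes({"health", "payment card"}, {"EU/EEA", "California"}, {"healthcare"}))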
2.2.3 Scope Challenges
Determining the scope of your privacy program can be challenging, regardless of
whether the program is domestic or global. Purely domestic privacy programs
may need to monitor only state and/or regional laws, as well as industry-specific
laws, while global programs will need to be cognizant of cultural norms,
differences, and approaches to privacy protection. A key example is the U.S. versus
EU approach. The former takes a limited sectoral approach, with laws that apply to
specific industry sectors or categories of data, like the Health Insurance Portability
and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the
Children’s Online Privacy Protection Act (COPPA), or laws that apply to the
residents of specific states, such as the California Consumer Privacy Act (CCPA)
and Virginia’s Consumer Data Protection Act (CDPA). The latter takes a
comprehensive approach (i.e., the GDPR generally applies to all personal data,
regardless of sector, and applies to all individuals within an EU/European
Economic Area member state).
You’ll find many countries around the world have either enacted some form of
data protection legislation or are in the process of establishing their own laws,
often modeling their laws on those established by the United Kingdom and
European Union.
These laws may apply to your organization, whether it is located and operates in
a particular country itself or just transfers personal information from that country
to its home location. Determining the applicability of these laws to your
organization is a key responsibility of the privacy professional, as is ensuring that
relevant laws, regulations, and other factors are considered from the start of the
privacy program and throughout its life cycle. Table 2-1 describes the different
privacy approaches by jurisdiction, and chapter 3 details, by jurisdiction, the
specific privacy and data protection laws, regulations, and frameworks you should
understand.
Table 2-1: Sample Approaches to Privacy and Data Protection around the Globe
United States (sectoral laws/state-specific laws): Enactment of laws that specifically address a particular industry sector, such as financial transactions, credit records, law enforcement, medical records, and communications; and enactment of laws that apply to the residents of a specific state, such as California (CCPA) and Virginia (CDPA).
EU member states, Canada, United Kingdom (comprehensive laws): Govern the collection, use, and dissemination of personal information in the public and private sectors, with an official oversight enforcement agency that remedies past injustices and provides guidance for individuals and businesses.
Australia (co-regulatory model): A variant of the comprehensive model in which industry develops enforcement standards that are overseen by a privacy agency.
United States, Japan, Singapore (self-regulated model): Companies follow a code of practice developed by groups of companies known as industry bodies. The Online Privacy Alliance (OPA), TrustArc, and WebTrust are examples of this type of model.
Organizations operating in the United States face domestic privacy challenges
that include determining whether the organization constitutes an entity that is
subject to a law or industry standard that regulates data or the collection of data
from certain individuals. “Financial institutions,” as defined by the Gramm-Leach-
Bliley Act, are subject to the GLBA.6 Certain types of organizations and entities
known as “covered entities,” such as health care providers (e.g., hospitals, clinics,
pharmacies) and health plans (e.g., medical plans, organization benefit plans) are
subject to HIPAA.7 Websites collecting information from children under the age
of 13 are required to comply with the Federal Trade Commission’s (FTC’s)
COPPA.8 A merchant of any size that handles cardholder information for debit,
credit, prepaid, e-purse, and ATM and point of sale (POS) cards must follow the
Payment Card Industry Data Security Standard (PCI DSS), which is a global
standard.9
As the name suggests, PCI DSS is an industry security standard, not a law, but it
still imposes certain data protection requirements on organizations, as well as
certain notification obligations in the event of breaches; some U.S. states have
adopted PCI DSS as part of legislated requirements.
Domestic U.S. privacy challenges also extend from federal laws and regulations
to the states; all 50 states now have data breach notification laws.10 Accordingly, if
you process the personal information of any resident of a state that has adopted a
breach notification law, understand that to the extent that nonencrypted data has
been compromised, your compliance obligations may include notifying the
residents of the state, as well as government bodies or state attorneys general
offices. It can also be a difficult and complex exercise to pull together and maintain
a list of breach notification requirements for each location where you have a presence and
resulting obligations. Third-party tool providers offer assistance in this area to
complement your data breach and incident response program, as well as ensure
you are complying with applicable legal and regulatory notification obligations.
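One lightweight way to keep such a list usable is to store each jurisdiction's notification parameters in a structured lookup that the incident response process can query. The sketch below is purely illustrative; the jurisdictions, deadlines, and the simplified encryption safe harbor are placeholders to be replaced with values confirmed by counsel, not statements of any state's actual requirements.

# Hypothetical lookup of breach notification parameters by jurisdiction.
# All values are placeholders; actual thresholds and deadlines must be
# confirmed against current law by qualified counsel.
BREACH_NOTIFICATION_RULES = {
    "State A": {"notify_individuals_within_days": 30, "notify_attorney_general": True},
    "State B": {"notify_individuals_within_days": 45, "notify_attorney_general": False},
}

def notification_obligations(affected_states, data_was_encrypted):
    """Return the rules to review for an incident affecting the given states."""
    if data_was_encrypted:
        # Simplifying assumption: encryption removes the duty to notify.
        # Real statutes attach conditions to any such safe harbor.
        return {}
    return {state: BREACH_NOTIFICATION_RULES[state]
            for state in affected_states
            if state in BREACH_NOTIFICATION_RULES}

print(notification_obligations(["State A", "State B"], data_was_encrypted=False))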
Outside of the United States, many countries put more stringent privacy
requirements on the government than the private sector and impose separate
requirements on certain industry sectors (e.g., telecommunications companies
face stricter record-keeping requirements) or on data collected from employees.
Appropriately scoping your organization’s privacy program is a challenging
exercise. A successful approach requires:
Understanding of the end-to-end personal information data life cycle
Consideration of the global perspective to meet legal, cultural, and
personal expectations
Customization of privacy approaches from both global and local
perspectives
Awareness of privacy challenges, including translations of laws and
regulations and enforcement activities and processes
Monitoring of all legal compliance factors for both local and global
markets
2.3 Develop a Privacy Strategy
Now that the scope of the privacy program has been defined and applicable laws
have been identified, the next consideration is your overall privacy strategy.
Essentially, a privacy strategy is the organization’s approach to communicating
and supporting the privacy program and its vision. Personal information may be
collected and used across an organization, with many individuals responsible for
protecting this information. There is no one-size-fits-all solution or strategy that
can successfully mitigate all privacy risks. There are benefits to
implementing a privacy strategy in today’s environment; among them are
management’s growing awareness of the importance of protecting personal
information and the financial impact of mismanagement. With that said, getting
buy-in at the appropriate levels can still be challenging.
Building a privacy strategy may mean changing the mindset and perspective of
an entire organization. Everyone in an organization has a role to play in protecting
the personal information an organization collects, uses, and discloses (these
responsibilities should also be clearly presented in policies). Management needs
to approve funding to resource and equip the privacy team, fund important
privacy-enhancing resources and technologies, support privacy initiatives, such as
training and awareness, and hold employees accountable for failing to follow
privacy policies and procedures. Sales personnel must secure business contact
data and respect the choices of these individuals. Developers and engineers must
incorporate effective security controls, build safe websites, and create solutions
that require the collection or use of only the data necessary to accomplish the
purpose. All staff must understand and employ the fundamental practices required to
protect personal data, both hard copy and electronic, from secure collection, storage,
and transmission through to secure destruction. The adage “the chain is only as strong as its weakest link” truly reflects
the way an organization must approach its privacy program. There are no
shortcuts, and every individual within an organization contributes to the success
of the privacy program.
Before an organization can embark on this journey, the management team will
need to understand why their involvement and support is so critical. It is
important to know the ultimate destination before beginning and have a road map
for the journey. These factors and more must be contained in the privacy strategy
to ensure success, buy-in, and ownership from the widest possible pool of
stakeholders.
2.3.1 Identify Stakeholders and Internal Partnerships
One of the most challenging aspects of building a privacy program and the
necessary supporting strategy is gaining consensus from members of the
organization’s management, up to the top-level of executive leadership, on privacy
as a business imperative. Building and gaining this consensus in stages is a must.
The first major step in building a coalition of supporters is to conduct informal
one-on-one conversations with executives or senior leaders within the
organization who have accountability for information management and/or
security, risk, compliance, or legal decisions. It is best practice to start with the
most senior leader in your division and then work your way up the chain of
management to provide initial visibility and gain buy-in/advice, where necessary.
For example, a privacy team may ultimately sit under the group’s general counsel,
who may be a natural first candidate to serve as “program sponsor.” Internal
partners, such as HR, legal, security, marketing, risk management, and IT, should
also be included in these conversations, as they too will have ownership of privacy
activities, and their buy-in will be essential. Depending on the organization’s
industry and corporate culture, the executives, managers, and internal partners
will each play a key role in the development and implementation of the privacy
strategy for the privacy program.
From these conversations, you should start to get a sense for which executive
will serve as the program sponsor or “champion” for the privacy program, as well
as whether an executive is even necessary. The program sponsor should be
someone who understands the importance of privacy and will act as an advocate
for you and the program. Effective program sponsors typically have experience
with the organization, the respect of their colleagues, and access to or ownership
of a budget. Final budgetary decision makers are the preferred program sponsors,
but if they are unavailable, it is best to obtain approval from executive
management closest to the organization’s top executive. Frequently, sponsors
function as risk or compliance executives within the organization. Sometimes,
chief information security officers (CISOs), chief compliance officers (CCOs), or
general counsels (GCs) serve as program sponsors.
A privacy champion at the executive level acts as an advocate and sponsor to
further foster privacy as a core organization concept.
Most organizations, regardless of their size, industry, and specific business, use
personal information for roughly the same bundle of activities—for example, staff
recruitment and ongoing employment administration, customer relationship
management and marketing, and order fulfillment. Further, the use of this
personal information is managed by a similar array of executives—regardless of
the organization or its activities. It is common to refer to the individual executives
who lead the relevant activities and own responsibility for them as stakeholders.
Typically, in a larger organization, an executive privacy team will include some or
all of the following individuals: senior security executive (e.g., CISO); senior risk
executive (e.g., chief risk officer [CRO]); senior compliance executive (e.g., chief
compliance officer [CCO]); senior HR executive; senior legal executive (e.g.,
GC); senior information executive (e.g., CIO); senior physical security/business
continuity executive; senior marketing executive (e.g., CMO); and a senior
representative of the business.
Several best practices when developing internal partnerships include:
Becoming aware of how others treat and view personal information
Understanding their use of the data in a business context
Assisting with building privacy requirements into their ongoing
projects to help reduce risk
Offering to help staff meet their objectives while offering solutions to
reduce risk of personal information exposure
Inviting staff to be a part of a privacy advocate group to further privacy
best practices
2.3.2 Conduct a Privacy Workshop for Stakeholders
With the support of the privacy program sponsor, you should plan to conduct
regular workshops for the stakeholders who will support efforts to develop and
launch a privacy program, including through dedicated privacy committees. Don’t
assume all stakeholders have the same level of understanding about the regulatory
environment or complexity of the undertaking—there will invariably be different
levels of privacy knowledge among the group. This is an opportunity to ensure
everyone has the same baseline understanding of the risks and challenges the
organization faces, data privacy obligations that are imposed on it, and increasing
expectations in the marketplace regarding the protection of personal information.
This can also help in embedding a privacy culture into the DNA of the
organization and assist with implementing a privacy-by-design (PbD) framework
to support workstreams, such as identifying projects where the processing of
personal data triggers the requirement to complete a data protection impact
assessment (DPIA).
Conduct a privacy workshop for stakeholders to level the privacy playing field by
defining privacy for the organization, explaining the market expectations,
answering questions, and reducing confusion.
2.3.3 Keep a Record of Ownership
Once the importance of the privacy program has been established, key internal
stakeholders may form a steering committee to ensure clear ownership of assets
and responsibilities. Keep a record of these discussions through structured
agendas, minutes, and actions as a tool for communication and to ensure
stakeholders can refer to what was decided. This can also be used to support
audits carried out against the privacy program to demonstrate compliance. Such
documentation also helps support the accountability requirements of the various
privacy laws to which you may be subject and serves as the privacy program’s due
diligence in terms of which functions and individuals should be held accountable
for privacy compliance.
2.4 Develop and Implement a Framework
Once you’ve determined which laws apply, you must design a manageable
approach to operationalizing the controls needed to handle and protect personal
information. Implementing and managing a program that addresses the various
rights and obligations of each privacy regulation on a one-off basis is a nearly
impossible task. Instead, using an appropriate privacy framework to build an
effective privacy program can:
Help achieve material compliance with the various privacy laws and
regulations in-scope for your organization
Serve as a competitive advantage by reflecting the value the
organization places on the protection of personal information, thereby
engendering trust
Support business commitment and objectives to stakeholders,
customers, partners, and vendors
Note: The timing of the framework development and implementation may vary
among organizations. Many privacy professionals may choose to define their privacy
strategy before selecting a framework, but practically these steps may be performed in
parallel.
2.5 Frameworks
The term “framework” is used broadly for the various processes, templates, tools,
laws, and standards that may guide the privacy professional in privacy program
management.
The different frameworks have varying objectives based on business needs,
commercial grouping, legal/regulatory aspects, and government affiliations. The
privacy questions most frameworks answer primarily include:
Are privacy and the organization’s privacy risks properly defined and
identified in the organization?
Has the privacy program been properly implemented into all key
workstreams, particularly for organizations with a global presence?
Has the organization assigned responsibility and accountability for
managing a privacy program?
Does the organization understand any gaps in privacy management?
Does the organization monitor privacy management?
Are employees properly trained, and does the organization have a
privacy awareness program?
Does the organization follow industry best practices for data
inventories, risk assessments, and privacy impact assessments (PIAs)?
Does the organization have an incident response plan?
Does the organization communicate privacy-related matters and update
that material as needed?
Does the organization use a common language to address and manage
cybersecurity risk based on business and organizational needs?
This section focuses on two categories of frameworks that can be used as a
foundation to build a privacy program: (1) Principles and Standards; and (2)
Laws, Regulations, and Programs.
2.5.1 Principles and Standards
Fair Information Practices provide basic privacy principles central to several
modern frameworks, laws, and regulations.11 Practices and definitions vary across
codifications: rights of individuals (notice, choice and consent, data subject
rights), controls on information (information security, information quality),
information life cycle (collection, use and retention, disclosure, destruction), and
management (management and administration, monitoring, enforcement).
The Organisation for Economic Co-operation and Development (OECD)
Guidelines on the Protection of Privacy and Transborder Flows of Personal
Data are the most widely accepted privacy principles; together with the Council
of Europe’s Convention 108, they are the basis for the EU Data Protection
Directive and GDPR.12
The American Institute of Certified Public Accountants (AICPA) and Canadian
Institute of Chartered Accountants (CICA), which have formed the
AICPA/CICA Privacy Task Force, developed the Generally Accepted Privacy
Principles (GAPP) to guide organizations in developing, implementing and
managing privacy programs in line with significant privacy laws and best
practices.13
The Canadian Standards Association (CSA) Privacy Code became a national
standard in 1996 and formed the basis for the Personal Information Protection
and Electronic Documents Act (PIPEDA).14
The Asia-Pacific Economic Cooperation (APEC) Privacy Framework
enables Asia-Pacific data transfers to benefit consumers, businesses, and
governments.15
The European Telecommunications Standards Institute (ETSI) is an
independent, not-for-profit, standardization organization in the
telecommunications industry and produces globally applicable standards for
information and communications technologies, including fixed, mobile, radio,
converged, broadcast, and internet technologies.16
The National Institute of Standards and Technology (NIST) Privacy
Framework is a voluntary tool intended to be widely usable by organizations of
all sizes and agnostic to any particular technology, sector, law, or jurisdiction. The
framework is adaptable to any organization’s role(s) in the data-processing
ecosystem and helps organizations manage privacy risks by:
Taking privacy into account as they design and deploy systems,
products, and services that affect individuals
Communicating about their privacy practices
Encouraging cross-organizational workforce collaboration17
The framework uses a risk-based, customizable approach to identifying and
managing privacy risk and splits its components into three key parts. The first
component is a “core” set of privacy protection activities that privacy programs
can consider and build upon. The second component, “profiles,” is a selection of
core activities that an organization customizes based on factors such as risk
appetite, desired future state, and available resources. The final component,
“tiers,” asks organizations to consider the level of operational maturity that is
achievable for a given profile, again based on key criteria.
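As a rough illustration of how these three parts fit together, an organization might record which activities it has selected into a target profile, along with current and target tiers, and use the gaps to plan work. The sketch below is only one assumed internal representation; the activity names are invented placeholders rather than the framework's actual Core functions and categories.

# Illustrative representation of a profile built from selected core activities,
# with current and target maturity tiers. Names and tiers are placeholders,
# not the NIST Privacy Framework's actual Core or tier definitions.
from dataclasses import dataclass

@dataclass
class ProfileActivity:
    activity: str       # a selected privacy protection activity
    current_tier: int   # current operational maturity (1 = lowest, 4 = highest assumed)
    target_tier: int    # desired maturity, based on risk appetite and resources

target_profile = [
    ProfileActivity("Maintain a data inventory", current_tier=1, target_tier=3),
    ProfileActivity("Conduct privacy risk assessments", current_tier=2, target_tier=3),
    ProfileActivity("Communicate privacy practices", current_tier=2, target_tier=4),
]

for item in (a for a in target_profile if a.current_tier < a.target_tier):
    print(f"Gap: {item.activity} (tier {item.current_tier} to {item.target_tier})")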
To offer insight into the professional skill set needed to implement the NIST
Privacy Framework, the IAPP’s Westin Research Center mapped the Privacy
Framework’s Core to the CIPM body of knowledge. The mapping documents
how the framework—and, more generally, a risk management framework
designed to bring together security and privacy professionals—aligns with CIPM
certification. The mapping also serves the dual purpose of informing privacy
professionals seeking to understand the skill set needed to implement the NIST
Privacy Framework and the IAPP’s ongoing work to ensure its certifications are
continually refined to meet the needs of the privacy profession across sectors and
disciplines.18
As a privacy risk management framework, NIST’s Privacy Framework aligns
closely with the CIPM body of knowledge. However, it should be noted that as a
framework designed to bring together stakeholders across disciplines, additional
skills are needed to go deeper into certain aspects of the Privacy Framework.19
PbD or data protection by design is an approach to privacy program
development and systems engineering based on a set of seven “foundational
principles.” The approach was initially developed by Dr. Ann Cavoukian and
formalized in a report on privacy-enhancing technologies by a joint team of the
Information and Privacy Commissioner of Ontario (Canada), the Dutch Data
Protection Authority, and the Netherlands Organisation for Applied Scientific
Research in 1995. The PbD framework was published in 2009 and adopted by the
International Conference of Data Protection and Privacy Commissioners in 2010.20
The overall intent of the PbD approach is to ensure every stage of the
development life cycle (i.e., of systems, products, features, and processes) takes
into account privacy requirements and considerations. According to Cavoukian,
the following principles upon which the framework is built affirm the Fair
Information Practices “but go beyond them to seek the highest global standard
possible.”21
1. Proactive not reactive; preventative not remedial
2. Privacy as the default setting
3. Privacy embedded into design
4. Full functionality—positive-sum, not zero-sum
5. End-to-end security—full life cycle protection
6. Visibility and transparency—keep it open
7. Respect for user privacy—keep it user-centric
These foundational principles can serve as a guide to broader privacy program
development and operationalization by ensuring program policies, procedures,
and guidelines are implemented at key software development life cycle (SDLC)
stages or supporting the use of existing company tools and resources to
implement privacy controls and review gates.22
2.5.2 Laws, Regulations, and Programs
Canada’s PIPEDA provides well-developed and current examples of generic
privacy principles implemented through national laws.23
EU data protection legislation includes the GDPR, which offers a framework for
data protection with increased obligations for organizations and far-reaching
effects, plus national laws implemented by member states to supplement the
GDPR.24 The EU-U.S. Privacy Shield was officially adopted in 2016 by the
European Commission and establishes a cross-border data transfer mechanism
between the two regions that replaces the previous Safe Harbor Framework.25 It
should be noted, however, that on July 16, 2020, the Court of Justice of the
European Union (CJEU) issued a decision in the case Data Protection Commission
v. Facebook Ireland, Schrems, which invalidated the European Commission’s
adequacy decision for the EU-U.S. Privacy Shield Framework. The decision is
colloquially referred to as “Schrems II.”26
Binding corporate rules (BCRs) are legally binding internal corporate privacy
rules for transferring personal information within a corporate group. Article 47 of
the GDPR lists requirements of BCRs (e.g., application of GDPR principles).27
Under the GDPR, BCRs must be approved by the competent supervisory
authority. The approval process can often take organizations several years to complete.
HIPAA is a U.S. law passed to create national standards for electronic health care
transactions, among other purposes.28 HIPAA required the U.S. Department of
Health and Human Services to promulgate regulations to protect the privacy and
security of personal health information. The basic rule is that patients must opt in
before their information can be shared with other organizations—although there
are important exceptions, such as for treatment, payment, and health care
operations.
Local data protection authorities (DPAs), such as France’s Commission
Nationale de l’Informatique et des Libertés (CNIL), provide guidance on legal
frameworks.29
2.5.3 Rationalizing Requirements
Once an organization decides on a framework or frameworks, it will be easier to
organize the approach for complying with the plethora of privacy requirements
mandated by the laws and regulations that are applicable to it. One option is to
rationalize requirements, which essentially means implementing a solution that
materially addresses them. This activity is made simpler by several factors. First, at
a high level, most data privacy legislation imposes many of the same types of
obligations on regulated entities, and much of this regulation requires entities to
offer similar types of rights to individuals. Among these shared obligations and
rights, data protection regulations typically include notice, choice, consent,
purpose limitations, limits on retaining data, individual rights to access, correction
and deletion of data, and the obligation to safeguard data—duties that are
generally covered by the privacy frameworks previously identified. Further, there
seems to be a growing consensus among data protection regulators and businesses
on the actions and activities that meet these regulatory obligations.
Note that a rationalized approach to creating a privacy strategy also necessitates
addressing requirements that fall outside of the common obligations (often
termed “outliers”) on a case-by-case basis. Outliers result when countries’ local
laws exceed the requirements of national law or when countries have industry- or
data-specific requirements.
For example, rationalizing the common legal obligation of providing individuals
with a right of access to their personal information means the organization must
also identify the time frames within which data must be provided to individuals,
as well as the specific exemptions that would prevent disclosure of certain
information, per applicable privacy laws. In the European Union, because of the
GDPR, prescribed time frames within which an organization must provide access
to individuals (e.g., employees, consumers) now exist. In countries where no legal
requirements exist and the granting of access may be merely an organization
policy or where there is a generous amount of time extended to provide data, the
organization can adopt a procedure that sets a common period within which data
must be provided. A rationalized approach that seeks to address both sets of
requirements would result in the organization establishing a standard access
process that generally meets the demands of many countries with a local process
that meets specific time frame requirements for individuals in countries where
more stringent requirements may exist.
Another approach organizations employ, when possible, is to look to the strictest
standard when seeking a solution, provided it does not violate any data privacy
laws, exceed budgetary restrictions, or contradict organization goals and
objectives. This approach is used more frequently than most organizations realize.
In the example above, rather than responding to access requests of only EU-based
individuals within a 30-day period, the organization would provide all individuals
globally with access to their data within a prescribed, GDPR-compliant time
frame. Other examples are shredding everything versus shredding only
documents that contain personal or confidential information or rolling out laptop
encryption for the entire employee population as opposed to targeting only
individuals who may perform functions that involve personal information.
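Expressed simply, the strictest-standard approach to access requests means adopting the shortest response deadline across the jurisdictions in scope as the global default. The sketch below assumes a 30-day GDPR-aligned period, as in the example above; the other jurisdictions and deadlines are hypothetical placeholders, not actual legal requirements.

# Pick a single global response deadline for access requests by taking the
# strictest (shortest) deadline across in-scope jurisdictions. Only the 30-day
# GDPR-aligned figure comes from the example in the text; the rest are
# hypothetical placeholders.
ACCESS_REQUEST_DEADLINES_DAYS = {
    "EU/EEA (GDPR-aligned)": 30,
    "Jurisdiction X (hypothetical statute)": 45,
    "Jurisdiction Y (internal policy only)": 60,
}

global_deadline = min(ACCESS_REQUEST_DEADLINES_DAYS.values())
print(f"Global standard: respond to access requests within {global_deadline} days")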
2.6 Privacy Technology and Governance, Risk,
and Compliance Vendors and Tools
Some organizations choose to use privacy technology vendors to help them
achieve compliance within their selected privacy program framework. According
to a recent survey completed by the IAPP and TrustArc, the biggest driver for
privacy tech adoption is the need to demonstrate compliance.30 Privacy tech
vendors offer a range of solutions, including data-mapping tools (which,
according to the same survey, were at the top of a privacy department’s wish list,
with 24 percent of respondents indicating they were planning to purchase such
tools), assessment management modules, supplier due diligence, redaction
capabilities (e.g., as part of data subject access requests), and deidentification and
incident response tools. Note that a product or solution is not, in and of itself,
compliant. When deployed as part of a properly thought-out privacy program,
however, such a solution is a tool that assists the organization in meeting GDPR,
multijurisdictional, and other regulatory compliance requirements.
2.6.1 Categories of Privacy Technology Vendors
Privacy tech vendors generally support privacy program management and
typically work directly with the privacy office. Vendors may manage:
Due diligence and risk assessments
Consent management
Data subject access requests (DSARs)
Data mapping
Incident response
Website scanning/cookie compliance
Enterprise program management services require buy-in from the privacy office,
IT, and C-suite. Services include:
Activity monitoring
Data discovery
Deidentification/pseudonymization
Enterprise communications31
2.6.2 Governance, Risk, and Compliance Tools
Governance, risk, and compliance (GRC) tools are an umbrella category whose scope
touches the privacy office, as well as other departments, including HR, IT,
compliance, and the C-suite. GRC tools aim to synchronize various internal
functions toward “principled performance”—integrating the governance,
management, and assurance of performance, risk, and compliance activities.
While many IT vendors provide capabilities to meet a single compliance
requirement, true GRC vendors provide tools to oversee risk and compliance
across the entire organization, helping to automate GRC initiatives that are mostly
manual or beyond an organization’s current capabilities.
GRC tools are generally used to do the following (a simple control-mapping sketch follows the list):
Create and distribute policies and controls and map them to regulations
and internal compliance requirements
Assess whether the controls are in place and working, and fix them if
they are not
Ease risk assessment and mitigation32
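At their simplest, these capabilities rest on a mapping between internal controls and the external requirements they satisfy, together with a record of whether each control is operating. The sketch below shows one hypothetical way to represent that mapping; the control names and requirement labels are assumptions for illustration, not the schema of any particular GRC product.

# Hypothetical mapping of internal controls to the requirements they address,
# with a simple check for controls that are mapped but not yet operating.
CONTROL_MAP = {
    "Privacy notice published and reviewed annually": {
        "requirements": ["GDPR transparency", "CCPA notice at collection"],
        "operating": True,
    },
    "Access request workflow with defined response deadline": {
        "requirements": ["GDPR right of access", "CCPA right to know"],
        "operating": False,
    },
}

remediation_needed = [name for name, control in CONTROL_MAP.items()
                      if not control["operating"]]
print("Controls requiring remediation:", remediation_needed)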
2.7 Structure the Privacy Team
Structuring the privacy team is generally the final step in formalizing the
organization’s approach to privacy. This section will focus on the many factors that
should be considered to assist with the decisions involved in structuring the team,
including identifying and selecting a suitable governance model for the privacy
office based on the specific needs of the business and establishing an appropriate
organizational model that defines roles, responsibilities, and reporting
relationships.
Note: Again, the timing of this stage may vary among organizations. For example, it
may make sense to consider the appropriate governance model and organizational
model of your privacy office as you define your privacy strategy since company footprint
and resource limitations will play a factor in these decisions.
2.7.1 Governance Models
There are different approaches and strategies for creating privacy office
governance models. This text is not intended to educate thoroughly on the
idiosyncrasies of various governance models but to provide examples of the types
of models that should be examined when structuring your privacy program. It is
important to consider these models, as they will inform decisions by your privacy
team and the policies it will need to establish.
You should consider whether to apply the model only within given geographical
regions or globally, depending on your operations. Many large
organizations find they need to consider global implications when structuring
privacy teams.
The positioning of the privacy team within an organization should reflect the
authority it will receive under the governance model it follows. Positioning the
privacy team under the corporate legal umbrella may be substantially different
from aligning the team under the IT umbrella. Executive leadership support for
the governance model will have a direct impact on the level of success when
implementing privacy strategies.
No matter which model is chosen, consider the following key tasks to ensure the
model selected is most “fit for purpose” for the business:
Involve senior leadership
Involve stakeholders
Develop internal partnerships
Provide flexibility
Leverage communications
Leverage collaboration
Privacy governance models include centralized, local, and hybrid versions but
are not limited to only these options. Whichever model is chosen, its objectives
should ensure information is controlled and distributed to the right decision
makers. Since decision-making must be based on accurate and up-to-date
management data, a well-designed and well-allocated governance model will foster
intelligent and more accurate decisions.
2.7.1.1 Centralized
Centralized governance is a common model that fits well in organizations used to
utilizing single-channel functions (where the direction flows from a single source)
with planning and decision-making completed by one group. A centralized model
will leave one team or person responsible for privacy-related affairs. All other
persons or organizations will flow through this single point. Often this single
point is the chief privacy officer (CPO) or corporate privacy office.
2.7.1.2 Local or Decentralized
Decentralization is the policy of delegating decision-making authority down to
the lower levels in an organization at a distance from and below a central
authority. A decentralized (or “local” or “distributed”) organization has fewer tiers
in the organizational structure, a wider span of control, and a bottom-to-top flow
of decision making and ideas.
In a more decentralized organization, the top executives delegate much of their
decision-making authority to lower tiers of the organizational structure. As a
result, the organization is likely to run on less-rigid policies and wider spans
of control for each officer of the organization. The wider spans of control also
reduce the number of tiers within the organization, giving its structure a flat
appearance. One advantage of this structure, if the correct controls are in place,
will be the bottom-to-top flow of information, allowing decisions about lower-tier
operations to be well informed. For example, if an experienced technician at the
lowest tier of an organization knows how to increase the efficiency of production,
the bottom-to-top flow of information can allow this knowledge to pass up to the
executive officers.
2.7.1.3 Hybrid
A hybrid governance model allows for a combination of centralized and local
governance. This is most typically seen when a large organization assigns a main
individual or department responsibility for privacy-related affairs and for issuing
policies and directives to the rest of the organization. The local entities then fulfill
and support the policies and directives from the central governing body. Members
of the privacy team may also sit locally, for example, with regional compliance
hubs in large multinationals. Each region may have a privacy manager or data
protection officer (DPO) who reports to local management and/or the CPO at
the global level.
2.7.1.4 Advantages and Disadvantages
Centralized management offers many advantages, with streamlined processes and
procedures that allow the organization to create efficiency by using the same
resources throughout the organization. Since decisions are made at the top layer,
individual employees or groups cannot make their own decisions and must seek
approval from a higher level.
With fewer layers of management, decentralized managers create and manage
their own business practices. This may be inefficient because each process may be
reproduced many times instead of once. On the other hand, employees are also
tasked with solving problems with which they are closest and most familiar.
The hybrid approach uses a decentralized decision-making process that tends to
provide less outside influence for employees yet offers the advantage of the
organizational resources of a larger, centralized organization. Typically, the hybrid
model will dictate core values and let the employee decide which practice to use
to obtain its goals. Working groups, individual offices, and other groups are
encouraged to make business decisions that consider revenue, operating costs,
and operations to retain visibility and alignment. Such models allow an
organization to function in a global environment yet maintain common missions,
values, and goals.
Mixing centralized and decentralized management approaches into a hybrid
approach enables the organization to achieve desired results that may span the
globe or locations across town. Employees gain a sense of ownership from their
contributions, which encourages them to perform more efficiently and
effectively, in alignment with top management’s direction.
2.8 Establishing the Organizational Model,
Responsibilities, and Reporting Structure
In establishing the overall organizational privacy model, one must consider the
organizational structure as related to strategy, operations, and management for
responsibilities and reporting. The privacy professional should know how each
major unit functions and understand its privacy needs. The following is a short list
of common roles found in both large and small organizational structures:
Chief privacy officer (CPO)—A corporate leader tasked with
developing a company’s privacy strategy and operationalizing its
privacy program. The role is common in large organizations where there
is a global presence (e.g., Fortune 500). Note that this role may not
always report directly to the executive committee; this depends on the
structure of the organization.
Privacy director/manager—A mid-level position that usually reports
to the CPO or equivalent and assists with the implementation of the
privacy strategy/program. In larger organizations, the role may be
responsible for a specific subset of the company’s business activities
(e.g., ads privacy manager).
Privacy analysts—An entry-level position that usually assists with
more tactical privacy program tasks, such as conducting research to
support key decision-making or assisting customer support with
privacy-specific inquiries.
Business line privacy leaders—A senior management-level position
that is common in large multinational corporations where multiple
business lines, brands, or siloed regions exist (e.g., vice president, EU
privacy).
Privacy/legal counsels—Legal expertise is often required to support a
number of workstreams that make up the privacy program (e.g.,
carrying out third-party due diligence, negotiating DPAs in contracts,
breaches and incidents, regular notification and complaints, etc.).
Depending on the organization, these roles may sit directly in the
privacy team or within the legal function, be relied on through the
instruction of external counsel, or involve a combination of some or all
of the above.
First responders—Roles that support a specific privacy process in
certain scenarios (e.g., incident response team members).
Data protection officer (DPO)—A role often reserved for companies
where it is a requirement to name a DPO (e.g., those subject to the
Article 37 requirements of the GDPR); however, it is increasingly
common to see the DPO role assigned regardless of legal need, and
such roles are often not full time (i.e., existing privacy, legal, or
compliance staff may be assigned the title) and may be outsourced.
Privacy engineers—A relatively new role that is focused on the
technical implementation of privacy requirements into product design
and leading the implementation of PbD principles across the
organization.
Privacy technologists—A term used to reference the many
technology professionals who play a role in protecting privacy in or with
technology. Includes but is not limited to audit, risk, and compliance
managers; data professionals; data architects; data scientists; system
designers and developers; software engineers; and privacy engineers.33
Most of these roles may not necessarily sit within the central privacy
office.
Organizational structures function within a framework by which the
organization communicates, develops goals and objectives, and operates daily.
Companies can use one of several structures or switch from one to another based
on need. Principles within that framework allow the organization to maintain the
structure and develop the processes necessary to do so efficiently. Considerations
include:
Hierarchy of command—The authority of senior management,
leaders, and the executive team to establish the trail of responsibility.
Role definition—Clear definition of the responsibilities to create
individual expectations and performance.
Evaluation of outcomes—Methods for determining strengths and
weaknesses and correcting or amplifying as necessary.
Alteration of organizational structure—Ability to remain dynamic
and change as necessary to meet current objectives, adopt new
technology, or react to competition.
Significance—Complex structure typical for large organizations; flat
structures for smaller organizations.
Types of structures—Product organizational structures, functional
organizational structures, and others.
Customers—Consider the different needs depending on nature of
products and services the organization offers.
Benefits—To the organization, customers, and stakeholders as aligned
to the objectives and goals.
2.8.1 Titles Used for Privacy Leaders
The titles an organization uses to denote its privacy leaders reveal much
information about its approach to privacy, reporting structure, and industry.
According to a recent survey completed by the IAPP and EY, the terms “privacy,”
“chief,” and “officer” are the most popular terms used in privacy management titles
and more often used than terms like “counsel,” “director,” or “global.”34 Further, a
larger percentage of U.S.-headquartered organizations use the terms “privacy,”
“vice president,” and “director” for privacy management roles when compared
with their European counterparts; similar roles in the European Union are more
likely to use the term “data” in such titles (likely a result of the “privacy” versus
“data protection” divide between U.S.- and EU-headquartered firms).
Some companies are asking their CPO to serve in the role of DPO (discussed in
section 2.8.5) with or without adding the title. According to the survey, such
companies are most likely in unregulated industries and those with business-to-
business (B2B) models, suggesting these companies tend to appoint fewer but
perhaps more educated or qualified personnel to privacy leadership roles and then
ask more of them.35
2.8.2 Typical Educational and Professional Backgrounds of
Privacy Leaders
Regardless of the title an organization chooses for its privacy leader, it is
important that the individual possess the requisite skills and qualifications. While
a legal background is a common requirement for most privacy positions, project
management, controls implementation, audit, risk management, and information
security experience have emerged as key qualities of privacy professionals. The
company’s jurisdiction, specific industry, where the privacy leader is placed within
the organization, and to whom the leader reports also influence the desired
background and skills of privacy leaders. For example, a privacy leader reporting
into the general counsel would likely be expected to possess legal qualifications,
while a privacy leader reporting to the CISO may be expected to have a certain
level of security and technical knowledge, in addition to privacy expertise. For
organizations subject to the requirement to appoint a DPO, the law specifies the
reporting structure and skills required. The role of the DPO is covered in more
detail in section 2.8.5.
Because privacy is a relatively new field and requires a breadth of skills beyond
knowledge of privacy laws and regulations, privacy professionals have come from
a diverse range of educational backgrounds. Having a degree or equivalent
qualification in law is not necessarily required for certain privacy roles. More
recently, however, degree programs specializing in privacy law, cybersecurity, and
privacy engineering have become available, including:36
The George Washington University’s Master of Engineering—
Cybersecurity Policy and Compliance37
Carnegie Mellon’s Master of Science in Information Technology—
Privacy Engineering (MSIT-PE)38
Ryerson University’s Certificate in Privacy, Access and Information
Management39
Brown University’s Executive Master in Cybersecurity40
Dublin City University’s Master of Arts in Data Protection and Privacy
Law41
The University of Auckland’s Postgraduate Diploma in Information
Governance42
Nebrija University’s Master’s in Data Protection and Security43
The emergence of these programs has further legitimized privacy as a distinct
profession requiring increased attention, resourcing, executive support, and
credentialing.
2.8.3 Professional Certifications
Like other professional certifications, those offered by the IAPP provide a way for
individuals in the industry to demonstrate they possess a fundamental
understanding of global privacy laws, concepts, best practices, and technologies.44
IAPP certifications, which are accredited by the American National Standards
Institute (ANSI) under the International Organization for Standardization (ISO)
Standard 17024:2012, are increasingly listed as minimum requirements in privacy
job descriptions.
2.8.4 Conferences and Seminars
Conferences and seminars are rich resources for information and expert
presentations on effective ways to build a privacy program and address privacy
governance. Individuals learn from privacy experts about approaches to privacy
management by attending sessions, working groups, and/or panel discussions that
are assembled specifically to address this topic. Other topics include governance
structures. Presentations on managing security incidents, creating a sustainable
training and awareness program, and designing and implementing programs
educate the audience on the subject matter itself while also providing industry
insights into how an organization manages these issues and assigns accountability.
Information is also obtained through informal exchanges of ideas among privacy
professionals and those interested in this industry. Learning from experts and
peers is an incredibly valuable method for acquiring information about privacy
approaches.
2.8.5 The DPO Role
Designation of a DPO is a relatively new requirement that was formally
established under Article 37 of the GDPR and is now also required under
Brazil’s Lei Geral de Proteção de Dados (LGPD), for example, and included in
several U.S. legislative proposals. The concept of designating an individual
responsible for monitoring an organization’s privacy compliance, however, is not
new. For example, Canada’s PIPEDA requires an organization to appoint someone to
be responsible for its compliance with the act’s fair information principles; South
Korea’s Data Protection Act mandates the appointment of a DPO with specific
responsibilities;45 and Germany, under its implementation of EU Data Protection
Directive (95/46/EC), required organizations to appoint a DPO under certain
circumstances (e.g., if the company employed more than nine persons). With the
GDPR, this requirement is formalized and so are key criteria with respect to the
need, reporting position, and qualifications of the DPO.
2.8.5.1 When Is a DPO Required?
Note: This section focuses on the DPO requirements as set out by the GDPR.
Organizations processing personal data and subject to the LGPD are required to have a
DPO, though the regulator has the power to limit the scope of that requirement in the
future.
Article 37 of the GDPR establishes the specific criteria triggering the
requirement for an organization to designate a DPO.46
Subject to some exceptions, designation of a DPO is required in the following cases (a simplified screening sketch follows the list):
By public authorities or bodies (except for courts acting in their judicial
capacity)
Where the organization’s “core” activities consist of processing
operations that require “regular and systematic monitoring of data
subjects on a large scale,” e.g., online behavior tracking
Where the organization’s “core” activities consist of processing “special”
categories of data or data relating to criminal convictions and offenses
on a large scale47
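A simplified way to walk through these triggers is sketched below. This is only a rough screening aid based on the criteria listed above; it ignores the exceptions and the interpretive questions around terms such as “core activities” and “large scale,” and the actual determination should be made with legal advice.

# Rough screening sketch for the Article 37 DPO designation triggers listed
# above. It ignores exceptions and the interpretation of "core activities"
# and "large scale"; it is not a legal determination.
def dpo_likely_required(is_public_authority: bool,
                        core_large_scale_monitoring: bool,
                        core_large_scale_special_categories: bool) -> bool:
    return (is_public_authority
            or core_large_scale_monitoring
            or core_large_scale_special_categories)

# Example: a company whose core business is large-scale online behavior tracking.
print(dpo_likely_required(False, True, False))  # True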
Even if it is determined a DPO is not required, the organization may choose to
voluntarily appoint one or multiple across different jurisdictions where this is a
requirement. Keep in mind that formally appointing a DPO will subject the
organization to the following DPO requirements:
Reporting structure and independence—The position of the DPO is
formally elevated by Article 38, whereby the DPO is required to “report
to the highest management level of the controller or the processor.”
While “highest management level” is not further defined by the GDPR,
its literal interpretation would be at the level of C-level management or
the board of directors.48 In practice, such a reporting line may not be
feasible or practical, depending on several factors, such as the size of the
company, the accessibility of the CEO, and the likelihood that the
reporting line will affect the DPO’s independence. Organizations should consider these
key factors when deciding the DPO’s reporting lines.
Qualifications and responsibilities—Article 37 mandates several
requirements for the DPO’s qualifications and position, including that
the DPO possess “expert knowledge of data protection law and
practices.” Quantifying “expert knowledge” is subjective—a reasonable
interpretation of someone possessing expert knowledge in the field
would be the privacy professional who has spent most of their career
practicing privacy law or operationalizing privacy programs, for
example. The required level of expertise should be proportionate to the type of processing carried
out and take into consideration the level of protection the personal data
requires.49 Such expertise is likely required as a result of Article 39,
which requires the DPO to perform certain activities, including
monitoring the company’s compliance with the GDPR, providing
advice during DPIAs, and cooperating with supervisory authorities.50
Designating a DPO is no trivial task given the role’s specific qualifications,
responsibilities, and organizational visibility. It is important to create a position
that is “fit for purpose,” in other words, one that considers the company’s unique
requirements in light of the criteria expected of DPOs by the GDPR.
2.9 Summary
Establishing the appropriate governance of a privacy program is complex and
challenging. Defining your overall privacy mission and strategy, selecting your
framework, and establishing your program governance and organizational models
require heavy research, information gathering, and executive support and buy-in
from stakeholders at multiple levels. There is no one-size-fits-all approach when it
comes to privacy governance—your company’s unique global footprint,
availability of resources, risk appetite, culture, and business priorities will all affect
how your program is ultimately governed.
However, once adopted and implemented, proper governance ensures an
organization’s approach to privacy adequately demonstrates its compliance with
legal obligations, aligns with broader business objectives and goals, is fully
supported at all levels across the company, and culminates in the protection of
personal information.
Endnotes
1 “Mission Statement,” About the University Privacy Office, Stanford University, accessed November 2021,
https://privacy.stanford.edu/about-upo.
2 “Privacy at Microsoft,” Microsoft, accessed November 2018, https://privacy.microsoft.com/en-US/.
3 “Privacy,” Apple, accessed September 2021, https://www.apple.com/privacy/.
4 “Mission Statement,” An Coimisiún um Chosaint Sonraí | Data Protection Commission, accessed
November 2018, https://www.dataprotection.ie/en/who-we-are/mission-statement.
5 “Our Mission, Vision, Strategic Goals and Values,” ICO, accessed November 2018,
https://ico.org.uk/about-the-ico/our-information/mission-and-vision/.
6 GLBA, 15 U.S.C., Subchapter I, § 6809 (1999).
7 HIPAA of 1996, 45 C.F.R. §§ 160.102, 160.103.
8 COPPA of 1998, 15 U.S.C. 6501–6505.
9 PCI DSS, PCI Security Standards Council, accessed November 2018,
https://www.pcisecuritystandards.org/documents/PCI_DSS_v3-2-1.pdf.
10 “Security Breach Notification Laws,” National Conference of State Legislatures, accessed November 2018,
www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-
laws.aspx.
11 “The Code of Fair Information Practices,” accessed November 2018,
https://epic.org/privacy/consumer/code_fair_info.html.
12 “OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data,” OECD,
accessed November 2018,
http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowso
fpersonaldata.htm.
13 Generally Accepted Privacy Principles: CPA and CA Practitioner Version, IAPP presentation, August 2009,
https://iapp.org/media/presentations/11Summit/DeathofSASHO2.pdf.
14 “Principles Set Out in the National Standard of Canada Entitled Model Code for the Protection of
Personal Information,” CAN/CSA-Q830-96, Government of Canada, Justice Laws website, accessed
November 2018, https://laws-lois.justice.gc.ca/eng/acts/P-8.6/page-11.html#h-26.
15 “APEC Privacy Framework (2015),” APEC, accessed November 2018,
https://www.apec.org/Publications/2017/08/APEC-Privacy-Framework-(2015).
16 ETSI, accessed November 2018, https://www.etsi.org/standards.
17 IAPP-EY Annual Privacy Governance Report 2019, IAPP, https://iapp.org/resources/article/iapp-ey-annual-governance-report-2019/.
18 “IAPP CIPM Crosswalk,” Privacy Framework, NIST, accessed October 2021,
https://www.nist.gov/privacy-framework/iapp-cipm-crosswalk.
19 For instance, lawyers implementing the governance policies, processes, and procedures category will
require greater familiarity with the legal regimes in the jurisdictions in which their organizations operate,
skill sets more closely aligned with the IAPP’s regionally based CIPP bodies of knowledge. Similarly,
privacy engineers assessing options for deidentification techniques under the disassociated processing
category will need more technical knowledge, such as that reflected in the IAPP’s CIPT body of
knowledge. The NIST Framework and the CIPM body of knowledge can serve as the bridge between these
stakeholders.
20 “Resolution on Privacy by Design,” 32nd International Conference of Data Protection and Privacy
Commissioners, Jerusalem, Israel, October 2010, https://edps.europa.eu/sites/edp/files/publication/10-
10-27_jerusalem_resolutionon_privacybydesign_en.pdf.
21 Ann Cavoukian, Privacy by Design: The 7 Foundational Principles, revised January 2011,
https://iapp.org/media/pdf/resource_center/pbd_implement_7found_principles.pdf.
22 For example, Ron De Jesus, “How to Operationalize Privacy by Design,” Privacy Advisor, IAPP, May 27,
2020, https://iapp.org/news/a/how-to-operationalize-privacy-by-design/.
23 Personal Information Protection and Electronic Documents Act, (S.C. 2000, C.5), Justice Laws Website,
Government of Canada, accessed November 2018, https://laws.justice.gc.ca/eng/acts/P-8.6/.
24 “EU-GDPR,” accessed November 2018, http://www.privacy-regulation.eu/en/index.htm.
25 “Privacy Shield Framework,” accessed November 2018, https://www.privacyshield.gov/EU-US-
Framework.
26 Caitlin Fennessy, “The ‘Schrems II’ Decision: EU-US Data Transfers in Question,” Privacy Tracker, IAPP,
July 16, 2020, https://iapp.org/news/a/the-schrems-ii-decision-eu-us-data-transfers-in-question/.
27 GDPR, Article 47, accessed November 2018, http://www.privacy-regulation.eu/en/article-47-binding-
corporate-rules-GDPR.htm.
28 HIPAA of 1996, 45 C.F.R. §§ 160.102, 160.103.
29 CNIL, accessed November 2018, https://www.cnil.fr/en/home.
30 IAPP and TrustArc, How Privacy Tech is Bought and Deployed, IAPP, 2019,
https://iapp.org/resources/article/how-privacy-tech-is-bought-and-deployed-2019/.
31 2021 Privacy Tech Vendor Report, IAPP,
https://iapp.org/media/pdf/resource_center/2021TechVendorReport.pdf.
32 Neil Roiter, “IT GRC tools: Control your environment,” CSO from IDG, accessed November 2018,
https://www.csoonline.com/article/2127514/compliance/it-grc-tools--control-your-environment.html.
33 “Paperwork Reduction Act,” Glossary of Privacy Terms, IAPP,
https://iapp.org/resources/glossary/#paperwork-reduction-act-2.
34 IAPP-EY Annual Privacy Governance Report 2019, IAPP, https://iapp.org/resources/article/iapp-ey-
annual-governance-report-2019/.
35 IAPP-EY Annual Privacy Governance Report 2017, IAPP, accessed November 2018,
https://iapp.org/resources/article/iapp-ey-annual-governance-report-2017/; To see the IAPP-EY Annual
Governance Report 2019, visit, https://iapp.org/resources/article/iapp-ey-annual-governance-report-
2019/.
36 Privacy and Data Protection in Academia: A Global Guide to Curricula, IAPP, June 8, 2021,
https://iapp.org/resources/article/privacy-and-data-protection-academia/.
37 Master of Engineering in the Field of Cybersecurity Policy and Compliance, The George Washington
University, accessed November 2021, http://bulletin.gwu.edu/engineering-applied-science/engineering-
management-systems-engineering/cybersecurity-policy-and-compliance-meng/
38 MSIT—Privacy Engineering, Carnegie Mellon University, accessed November 2018,
http://privacy.cs.cmu.edu/.
39 Privacy, Access and Information Management, Ryerson University, accessed November 2018, https://ce-
online.ryerson.ca/ce/default.aspx?id=3778.
40 Executive Master in Cybersecurity, Brown University, accessed November 2018,
https://professional.brown.edu/cybersecurity/.
41 Master of Arts in Data Protection and Privacy, Dublin City University, accessed September 2021,
https://www.dcu.ie/courses/postgraduate/school-law-and-government/ma-data-protection-and-privacy-
law-and-computing.
42 Postgraduate Diploma in Information Governance, University of Auckland, accessed September 2021,
https://www.online.auckland.ac.nz/postgraduate-programmes/business/postgraduate-diploma-in-
information-governance/.
43 Master’s Degree in Data Protection and Security, Nebrija University, accessed September 2021,
https://www.nebrija.com/en/postgraduate-degree/master/data-protection/.
44 “IAPP Certification,” IAPP, accessed November 2018, https://iapp.org/certify/.
45 Cynthia Rich, “Privacy and Security Law Report,” Bloomberg BNA, accessed November 2018,
https://media2.mofo.com/documents/150518privacylawsinasia.pdf.
46 Thomas J. Shaw, Esq., The DPO Handbook: Data Protection Officers under the GDPR, (Portsmouth, NH:
IAPP, 2018).
47 GDPR, Article 37, accessed November 2018, www.privacy-regulation.eu/en/article-37-designation-of-
the-data-protection-officer-GDPR.htm.
48 GDPR, Article 38, accessed November 2018, www.privacy-regulation.eu/en/article-38-position-of-the-
data-protection-officer-GDPR.htm.
49 GDPR, Article 37.
50 GDPR, Article 39, accessed November 2018, www.privacy-regulation.eu/en/article-39-tasks-of-the-data-
protection-officer-GDPR.htm.
CHAPTER 3
Privacy Program Framework:
Applicable Privacy Laws and Regulations
Susan Bandi, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Compliance with data protection laws and regulations is a major driver for many
organizations’ privacy programs. This chapter describes some of the most commonly encountered data privacy laws, regulations, and statutes around the world.
As there are numerous global privacy laws and regulations, privacy professionals
should seek assistance from their organization’s legal office, outside counsel, or a
third-party research firm to ensure all relevant laws and regulations have been
captured.
Elements of these global laws and regulations overlap in their requirements. It is a sound practice to capture this information in a privacy tool, database, spreadsheet, or other tracking method to highlight the similarities. Common requirements include notice, choice and consent, purpose limitation, individual rights, data retention limits, and data transfers.
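As a simple illustration of such a tracking method, the hypothetical Python sketch below maps a few laws to the requirement areas they cover and computes the overlap; the law names and requirement labels are illustrative placeholders, not a complete or authoritative mapping.

# Hypothetical sketch: tracking which common requirement areas appear in which laws.
# Law names and requirement labels are illustrative placeholders only.
COMMON_REQUIREMENTS = {
    "GDPR": {"notice", "choice_and_consent", "purpose_limitation",
             "individual_rights", "retention_limits", "data_transfers"},
    "CCPA": {"notice", "individual_rights", "data_transfers"},
    "LGPD": {"notice", "choice_and_consent", "purpose_limitation",
             "individual_rights", "data_transfers"},
}

def shared_requirements(laws: dict[str, set[str]]) -> set[str]:
    """Return the requirement areas that every tracked law has in common."""
    sets = list(laws.values())
    shared = sets[0].copy()
    for s in sets[1:]:
        shared &= s
    return shared

if __name__ == "__main__":
    print(sorted(shared_requirements(COMMON_REQUIREMENTS)))

A real privacy tool or GRC platform would, of course, hold far more detail per jurisdiction, but even a simple structure like this makes the shared obligations visible at a glance.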
Understanding the scope of data collected and processed by the organization
will guide the privacy professional in the task of researching and compiling
applicable laws, regulations, and statutes. The privacy professional must
understand applicable national laws and regulations, as well as local laws and
regulations.
3.1 Global Privacy Laws
Almost every country in the world has enacted some sort of data privacy law(s).
Regulators are enforcing rules governing how personal information is collected, how data subjects are informed, and data subjects’ rights to decide how their personal data is used.
Many laws and regulations either have penalties for noncompliance or allow for
private rights of action for noncompliance. Navigating through the many laws and
regulations can be a daunting task.
More recent data privacy laws often focus on the application of new technologies as they relate to protecting personal data. Such new areas of data
protection include:
Artificial intelligence (AI)
Machine learning (ML)
Data security measures and controls on new technologies, such as
quantum computing, and AI/ML
Handling personal data during pandemics (COVID-19)
Most countries with privacy and data protection legislation have omnibus laws
that cover the collection and use of personal data in general, with perhaps
increasing protection and sensitivity required for certain categories of data, such
as health data or data regarding sexual orientation. Table 3-1 provides examples of these laws, which commonly apply based on where the company does business, where it serves customers, and the industry in which it operates. As a rule of thumb, anyone actively trying to
solicit business in a country is subject to its privacy and data protection laws. It is
important, therefore, to recognize this and understand all laws and regulations
that apply to your business operation. As outlined in table 3-1, for example, if a
U.S.-based organization is operating or serving customers in the European Union,
the General Data Protection Regulation (GDPR) will apply. If an EU company
does business in Canada, the Personal Information Protection and Electronic
Documents Act (PIPEDA) and other various provincial privacy laws may apply.
Table 3-1: Examples of Global Privacy and Data Protection Laws
Country/Region | International Legislation | Responsible Authority
Argentina | Personal Data Protection Law No. 25,326 (PDPL)1 | Agency for Access to Public Information (AAPI)
Australia | Privacy Act 19882 | Office of the Australian Information Commissioner (OAIC)
Brazil | Lei Geral de Proteção de Dados (LGPD), Law No. 13.709/20183 | Autoridade Nacional de Proteção de Dados (ANPD)
Canada | Personal Information Protection and Electronic Documents Act (PIPEDA)4 | Office of the Privacy Commissioner of Canada (OPC) (Note: Canadian provinces have their own, often stricter, privacy laws)
China | Cybersecurity Law of the People’s Republic of China5; Data Security Law of the People’s Republic of China6; Personal Information Protection Law of the People’s Republic of China (PIPL)7 | Cyberspace Administration of China (CAC) (lead agency) (Note: Other agencies and sector-specific regulators may also monitor and enforce data protection issues)
European Union | General Data Protection Regulation (GDPR)8 | Member state supervisory authority/lead supervisory authority
Hong Kong | Personal Data (Privacy) Ordinance (Cap. 486)9 | Office of the Privacy Commissioner for Personal Data (PCPD)
India | Information Technology Act, 200010 | Ministry of Electronics and Information Technology (MeitY)
Israel | Protection of Privacy Law, 5741-1981 (PPL)11 | Privacy Protection Authority (PPA)
Japan | Act on the Protection of Personal Information (APPI)12 | Personal Information Protection Commission (PIPC)
New Zealand | Privacy Act 202013 | Office of the Privacy Commissioner (OPC)
Singapore | Personal Data Protection Act 2012 (No. 26 of 2012) (PDPA)14 | Personal Data Protection Commission (PDPC)
South Korea | Personal Information Protection Act (PIPA)15 | Personal Information Protection Commission (PIPC)
Turkey | Law on the Protection of Personal Data No. 6698 dated 7 April 2016 (LPPD)16 | Kişisel Verileri Koruma Kurumu (KVKK)
United Kingdom | Data Protection Act (DPA) 2018 and UK General Data Protection Regulation (GDPR)17 | Information Commissioner’s Office (ICO)
The principles upon which many of these data protection laws are based were
established in two important privacy frameworks: the Organisation for Economic
Co-operation and Development (OECD) Guidelines on the Protection of Privacy
and Transborder Flows of Personal Data and the Asia-Pacific Economic Cooperation (APEC) Privacy Framework.18 These foundational privacy principles include the
concepts of collection limitation, data quality, purpose specification, use
limitation, security safeguards, openness, individual participation, and
accountability.
As a result, there are commonalities among provisions in global privacy and data
protection laws, regulations, and standards, such as requirements for ensuring
individual rights (i.e., access, correction, and deletion) and obligations
(safeguarding data). Other similarities include contractual requirements, audit
protocol, self-regulatory regimes, and marketplace expectations. Privacy managers
must know and understand the common legal elements to avoid duplication of
compliance efforts when a jurisdiction-by-jurisdiction approach is taken.
This next section will highlight requirements from several global privacy and
data protection laws.
3.1.1 EU General Data Protection Regulation
In December 2015, the EU Parliament and Council reached agreement on the GDPR, a regulation first proposed in 2012; it has been enforceable since May 25, 2018.19
The GDPR offers a framework for data protection with increased accountability
for organizations, and its reach is extraterritorial. Given the size and scope of the
EU economy, the GDPR has rapidly become a global standard for data protection
that every working privacy professional must understand on some level. Below are
the general provisions of the GDPR that provide insight into the objectives of the
regulation and applicable scope. (This is not intended to be a thorough review of
all the articles.20)
3.1.1.1 Article 1: Subject-Matter and Objectives
1. This Regulation lays down rules relating to the protection of natural
persons with regard to the processing of personal data and rules relating
to the free movement of personal data.
2. This Regulation protects fundamental rights and freedoms of natural
persons and their right to the protection of personal data.
3. The free movement of personal data within the Union shall be neither
restricted nor prohibited for reasons connected with the protection of
natural persons with regard to the processing of personal data.21
3.1.1.2 Article 2: Material Scope
1. This Regulation applies to the processing of personal data wholly or
partly by automated means and to the processing other than by
automated means of personal data which form part of a filing system or
are intended to form part of a filing system.
2. This Regulation does not apply to the processing of personal data:
a. in the course of an activity which falls outside the scope of
Union law;
b. by the Member States when carrying out activities which
fall within the scope of Chapter 2 of Title V of the TEU;
c. by a natural person in the course of a purely personal or
household activity;
d. by competent authorities for the purposes of the prevention,
investigation, detection or prosecution of criminal offences,
the execution of criminal penalties, including the
safeguarding against and the prevention of threats to public
security.
3. For the processing of personal data by the Union institutions, bodies,
offices and agencies, Regulation (EC) No 45/2001 applies. Regulation
(EC) No 45/2001 and other Union legal acts applicable to such
processing of personal data shall be adapted to the principles and rules
of this Regulation in accordance with Article 98.
4. This Regulation shall be without prejudice to the application of
Directive 2000/31/EC, in particular of the liability rules of
intermediary service providers in Articles 12 to 15 of that Directive.22
3.1.1.3 Article 3: Territorial Scope
1. This Regulation applies to the processing of personal data in the context
of the activities of an establishment of a controller or a processor in the
Union, regardless of whether the processing takes place in the Union or
not.
2. This Regulation applies to the processing of personal data of data
subjects who are in the Union by a controller or processor not
established in the Union, where the processing activities are related to:
a. the offering of goods or services, irrespective of whether a
payment of the data subject is required, to such data
subjects in the Union; or
b. the monitoring of their behavior as far as their behavior
takes place within the Union.
3. This Regulation applies to the processing of personal data by a
controller not established in the Union, but in a place where Member
State law applies by virtue of public international law.23
The GDPR sets out specific requirements for organizations collecting or
processing the personal data of individuals (data subjects) in the European Union,
what rights it grants to individuals, and what consequences exist for
noncompliance. Table 3-2 highlights requirements of the GDPR.24
Table 3-2: GDPR Awareness Guide
What consumers can do:
Withdraw consent for processing
Request a copy of all their data
Request the ability to move their data to a different organization
Request to delete all their data
Object to automated decision-making processes, including profiling
What organizations must do:
Implement privacy by default and privacy by design (PbD)
Maintain appropriate safeguards
Notify data protection authorities (DPAs) and consumers of data breaches within 72 hours from detection (if criteria for notification met)
Get appropriate consent for most personal data collection and provide notification of personal data-processing activities
Get a parent’s consent to collect data for children under 16
Keep records of all processing of personal information
Appoint a data protection officer (DPO), if you regularly process lots of data or, particularly, sensitive data
Take responsibility for the security and processing activities of third-party vendors
Conduct data protection impact assessments (DPIAs) on new or high-risk processing activities
Institute safeguards for cross-border data transfers
Consult with regulators before proceeding with certain processing activities
Be able to demonstrate compliance on demand
Provide appropriate data protection training to personnel having permanent or regular access to personal data
What regulators can do:
Ask for records of processing activities and proof of steps taken to comply with the GDPR
Impose temporary data-processing bans, require data breach notification, or order erasure of personal data
Suspend cross-border data flows
Enforce penalties of up to 20 million euros or four percent annual revenues for noncompliance
3.1.2 California Consumer Privacy Act
The California Consumer Privacy Act (CCPA) was the first comprehensive
privacy law introduced in the United States. In June 2018, the CCPA was signed
into law, creating new privacy rights for Californians and significant new data
protection obligations for businesses. The bill also reinvigorated state and federal
efforts to adopt comprehensive privacy legislation. The CCPA went into effect
January 1, 2020. Enforcement by California’s Office of the Attorney General
began July 1, 2020, and the office also promulgates the regulations. Table 3-3
highlights requirements of the CCPA.
Table 3-3: CCPA Awareness Guide25
What consumers can do:
Request a record of what types of data an organization holds about them, plus information about what is being done with their data in terms of both business use and third-party sharing
Request erasure and/or opt out of sale, with carve-outs for completion of a transaction, research, free speech, and some internal analytical use
What organizations must do:
Have a verification process so consumers can prove they are who they say they are when they submit requests
Respond to data subject access requests free of charge within 45 days
Provide methods for receiving consumer requests—e.g., a toll-free number, web form, or email address
Disclose to whom they sell data and put a “Do Not Sell My Personal Information” link on their websites to make it easy for consumers to object
Disclose what personal information (PI) is collected and how it will be used, and provide an online privacy notice
Not “discriminate against a consumer” based on the exercising of any of the rights granted in the bill, i.e., charge a different price or provide different quality of goods or services, with some exceptions
Understand that sale of children’s data will require express opt-in, either by the child, if between ages 13 and 16, or by the parent, if younger than that
Train certain employees on consumer rights pursuant to the law
What regulators can do:
The attorney general enforces the law. Failure to address an alleged violation within 30 days could lead to a $7,500 fine per intentional violation. For unintentional violations, which is up to courts or the attorney general to define, the fine is $2,500.
While the CCPA is currently in force, in November 2020, California passed a
ballot initiative that provides additional protections for consumers. Unless
preempted by a federal privacy law, the California Privacy Rights Act (CPRA)
comes into force January 1, 2023, with a one-year look-back to January 2022.
3.1.3 Brazil’s Lei Geral de Proteção de Dados
Brazil’s General Data Protection Law, Lei Geral de Proteção de Dados Pessoais
(LGPD), was passed August 18, 2018. Following a year of uncertainty regarding the date of implementation, the law went into effect September 18, 2020, though administrative sanctions could not be issued until August 1, 2021. Although Brazil is no stranger to sectoral privacy laws
and already had more than 40 laws and norms at the federal level, the LGPD is the
country’s first law to provide a comprehensive framework regulating the use and
processing of all personal data.
Greatly influenced by the GDPR, the LGPD will be familiar to those who have
worked with the regulation.26 Comprising 65 articles, the law sets forth the
Brazilian conception of personal data and provides the legal bases authorizing its
use.27 Table 3-4 highlights requirements of the LGPD.
Table 3-4: LGPD Awareness Guide
What data subjects can do:
Confirm the existence of processing
Access their data
Correct incomplete, inaccurate, or out-of-date data
Anonymize, block, or delete unnecessary or excessive data or data processing in violation of the law
Export data to another service or product provider
Delete personal data processed pursuant to consent
Obtain information about entities with which data is shared
Obtain information about denying consent
Review decisions made solely based on automated processing
Oppose non-consent-based processing when in violation of the law
What organizations must do:
Implement privacy-by-design and -default processes
Develop incident response and remediation plans
Maintain appropriate data security
Notify data subjects and regulators of data breaches
Follow special rules for directly processing children’s data
Provide notice of intention to process personal information
Appoint a DPO (for controllers)
Take responsibility for processing activities of third-party vendors
Create personal data protection impact report (RIPD)
Ensure adequacy or appropriate safeguards for data transfers
Keep records in most circumstances and demonstrate compliance
Comply with international data transfer requirements
What regulators can do:
Ask for records of compliance
Apply sanctions, e.g., warnings and corrective measures, publicization of the infraction, suspension or prohibition of processing activities
Enforce penalties up to two percent of its revenue in Brazil up to a total maximum of 50 million reais per infraction
3.1.4 People’s Republic of China Personal Information
Protection Law
On August 20, 2021, the Standing Committee of China’s National People’s
Congress adopted the Personal Information Protection Law (PIPL), which took
effect November 1, 2021. PIPL is China’s first comprehensive information privacy
protection law. With the adoption of PIPL by the People’s Republic of China (PRC), three of the world’s top four economies now have an omnibus privacy law.
While some commercial aspects of PIPL resemble the EU GDPR, with provisions mandating companies exercise data minimization and user consent, it will not prevent the PRC’s central government from accessing data. For companies doing business in China, fines for violations of the law can reach approximately $7.7 million or up to five percent of the previous year’s business revenue.28
3.1.5 Emerging Privacy and Data Protection Laws
The global landscape of privacy and data protection laws continues to evolve. It is prudent for privacy and data protection professionals to remain aware of emerging laws in those jurisdictions in which their organization does business to
understand the proposed scope and requirements, as well as the likelihood of the
passage of the law. They should know when any new legislation is scheduled to
take effect and adjust the organization’s privacy program accordingly.
Once an organization has an overall view of the data protection laws it is subject
to, it is useful to carry out a monitoring exercise on a regular basis to take an
accounting of any new laws and regulations to add to the privacy program scope.
As an example, an organization may grow and expand into a new market or
jurisdiction. It may also acquire a new business, bringing new potential risk or
exposure. Adjustments should be made within the privacy program framework to
reflect these changes.
As referred to earlier, privacy professionals can keep up to date on the emergence and progress of these new laws by signing up for newsletters, subscription monitoring services, and regulator mailing lists that track the progress of legislation.
The privacy profession should continue to see data protection laws enacted
around the globe for at least the next decade. The United States is actively working
toward federal privacy legislation, but as of the date of this writing, it has not come
to fruition.
3.1.6 Sectoral Privacy Laws
Some jurisdictions address privacy and data protection through laws that apply to
market sectors and industries. Industries with privacy-related concerns with
implications for consumers include:
Health care—There are generally special protections for health data
(protected health information [PHI] in the United States, special
categories of data in the European Union). Laws may cover health care
facilities, as well as researchers and anyone doing business with health
care operations.
Financial—Organizations must monitor confidentiality, financial, and
terrorism (anti-money laundering, specifically) laws.
Telecom—Not just the content of communications is important for
protection, but also metadata and location information, to which law
enforcement often wants access.
Online—Watch out for issues presented by online transactions, the
lure of detailed information (to law enforcement, marketers, and
criminals) available on the web for scraping and collection, and the
global nature of online privacy concepts.
Government—The courts are constantly reevaluating definitions of
“public records,” and governments have specific obligations regarding
transparency that often conflict with privacy.
Education—Laws are focused on educational agencies and
institutions.
Video—Originally designed to protect renters and purchasers of goods
from videotape rental stores, laws have now been interpreted to apply to
online-streaming services.
Marketing—This has become one of the most complicated areas of
privacy. Professionals must understand not just law, but also rapidly
evolving technology, self-regulatory schemes, and new data
marketplaces.
Energy—An emerging privacy arena due to the emergence of smart
grid technology and so-called smart houses.
Human resources (HR)/employment—Standard ideas of
confidentiality in this area are running up against technology in the
work environment, where efficiency often means monitoring.
During the COVID-19 pandemic, HR functions also had to focus on the privacy
and security impacts of working from home. Monitoring laws differ depending on
the country.29
The United States serves as an excellent example of a sectoral approach to
privacy. Table 3-5 lists the primary industry-focused privacy-related laws enforced
by the federal government.
Table 3-5: Privacy Laws Enforced by the Federal Government
U.S. Federal Legislation | Enforcement | Focused Concern
Federal Trade Commission Act (Section 5) of 191430 | Federal Trade Commission (FTC) | Unfair or deceptive acts or practices in or affecting commerce
Fair Credit Reporting Act (FCRA) of 197031 | FTC and Consumer Financial Protection Bureau (CFPB) | Protects information collected by consumer reporting agencies, such as credit bureaus, medical information companies, and tenant screening services; accuracy and fairness of credit reporting; credit reporting agencies adopt reasonable procedures, credit information, collection, access to credit reports
Family Educational Rights and Privacy Act (FERPA) of 197432 | Department of Education (ED), Family Policy Compliance Office | Improper disclosure of personally identifiable information (PII) derived from education records; unfair and deceptive trade practices
Federal Privacy Act of 1974, as amended at 5 U.S.C. 552a33 | Department of Justice (DOJ) | Establishes a code of fair information practices that governs the collection, maintenance, use, and dissemination of information about individuals that is maintained in systems of records by federal agencies
Electronic Communications Privacy Act (ECPA) of 198634 | State or law enforcement agency | Federal wiretapping and electronic eavesdropping; unauthorized government access to electronic communication
Video Privacy Protection Act (VPPA) of 198835 | Individual private right of action | Prevents disclosure of personally identifiable rental records of prerecorded video cassette tapes or similar audiovisual material
Telephone Consumer Protection Act (TCPA) of 199136 | FTC, Federal Communications Commission (FCC), and state attorneys general | Limits the use of automatic dialing systems, artificial or prerecorded voice messages, SMS text messages, and fax machines
Driver’s Privacy Protection Act (DPPA) of 199437 | State attorneys general | Privacy and disclosure of personal information gathered by state departments of motor vehicles (DMVs)
Health Insurance Portability and Accountability Act (HIPAA) of 199638 | Health and Human Services (HHS) Office for Civil Rights (OCR) is responsible for enforcing the Privacy and Security Rules | Health insurance, medical records, PHI, special category data (GDPR), medical research
Children’s Online Privacy Protection Act (COPPA) of 199839 | FTC | Regulating personal information collected from minors. The law specifically prohibits online companies from asking for PII from children 12 and under unless there’s verifiable parental consent.
Gramm-Leach-Bliley Act of 1999 (GLBA)40 | FTC, the federal banking agencies, and other federal regulatory authorities, as well as state insurance oversight agencies | Financial institutions offering consumers financial services or offerings must explain sensitive information sharing practices
Controlling the Assault of Non-Solicited Pornography and Marketing Act (CAN-SPAM) of 200341 | FTC | Establishes requirements for those who send unsolicited commercial email
Fair and Accurate Credit Transactions Act (FACTA) of 200342 | FTC, Board of Governors of the Federal Reserve System, Federal Deposit Insurance Corporation (FDIC), National Credit Union Administration, Office of the Comptroller of the Currency, and Office of Thrift Supervision | Enhances consumer protections, particularly in relation to identity theft
National Do Not Call Registry (2003)43 | FTC | Gives consumers a choice about whether to receive telemarketing calls
Health Information Technology for Economic and Clinical Health (HITECH) Act of 200944 | OCR | Provides HHS with the authority to establish programs to improve health care quality, safety, and efficiency through the promotion of health IT, including electronic health records (EHRs) and private and secure electronic health information exchange
3.2 Self-Regulation: Industry Standards and
Codes of Conduct
In addition to sector-specific laws, there are a variety of voluntary and contractual
initiatives that establish codes of conduct for communities of interest. Table 3-6
highlights some of the more notable self-regulatory programs.
Table 3-6: Notable Self-Regulatory Programs
Self-Regulation (Voluntary) | Sector Affected
Payment Card Industry Data Security Standard (PCI DSS)45 | Any organization, regardless of size or number of transactions, that accepts, transmits, or stores any cardholder data branded with one of the five card association/brand logos
DMA Guidelines for Ethical Business Practice46 | Individuals and entities involved in data-driven marketing in all media47
Verisign, TrustArc, McAfee, PayPal trust marks48 | Online vendors’ ecommerce sites49
Children’s Advertising Review Unit (CARU) Advertising Guidelines50 | National advertising primarily directed to children under the age of 12 in any medium
Network Advertising Initiative (NAI) Code of Conduct51 | NAI members’ approach to privacy and data governance in connection with the collection and use of data for interest-based advertising
EU Code of Conduct | Business-to-business (B2B) cloud services where the cloud service provider is acting as a processor under Article 28 of the GDPR52
3.3 Cross-Border Data Transfers
Knowing where an organization’s data is located and how the data is processed and transferred gives the organization transparency and visibility into its data protection culture. Many regulations around the globe
require organizations to put in place specific contracts or other protective
mechanisms to govern data when it involves cross-border data transfers.
While many countries have cross-border transfer regulations, the European
Union has some of the strictest requirements for transferring personal data. As a
general rule, transfers of personal data to countries outside the European
Economic Area (EEA) may take place if these countries are deemed to ensure an
adequate level of data protection. Article 45 of the GDPR provides that a third country’s level of personal data protection is assessed by the European Commission.
For countries not deemed adequate by the European Commission, standard contractual clauses (SCCs) are a valid method of data transfer. Organizations using SCCs should assess data transfers on a case-by-case basis for systems transferring personal data out of the European Union. This is, in effect, what is known as a “data transfer impact assessment” (DTIA or TIA).
To properly perform an assessment of data transfers, several steps need to be taken (a simple tracking sketch follows this list):
Map where your data resides and where it is transferred
Identify the mechanisms used for the transfer
Assess the effectiveness of the transfer mechanisms
Adopt additional safeguards as needed
Ensure additional measures align with business requirements
Monitor for ongoing compliance
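A minimal sketch of how such an inventory and flagging exercise might look is shown below, assuming a hypothetical in-house Python script; the field names, the list of “adequate” destinations, and the flagging rule are simplified illustrations, not legal criteria.

# Hypothetical sketch of a transfer inventory used to flag entries that may need a TIA.
# Field names and the flagging rule are simplified illustrations, not legal tests.
from dataclasses import dataclass

ADEQUATE = {"Japan", "New Zealand", "United Kingdom"}  # illustrative subset only

@dataclass
class Transfer:
    system: str
    destination_country: str
    mechanism: str              # e.g., "adequacy", "SCCs", "BCRs"
    supplementary_measures: bool

def needs_tia(t: Transfer) -> bool:
    """Flag transfers relying on safeguards (e.g., SCCs) to a non-adequate country."""
    return t.destination_country not in ADEQUATE and t.mechanism != "adequacy"

transfers = [
    Transfer("HR system", "United States", "SCCs", supplementary_measures=True),
    Transfer("CRM", "Japan", "adequacy", supplementary_measures=False),
]
for t in transfers:
    print(t.system, "-> TIA required" if needs_tia(t) else "-> no TIA flagged")

Even a basic record like this helps the privacy team see which transfers rely on safeguards rather than an adequacy decision and therefore warrant the case-by-case review described above.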
Regulator guidance on TIAs is still evolving. However, DPAs expect TIAs to address questions such as, but not limited to, the following:
What is the likelihood of government access to the data?
Is the data within the scope of intelligence and law enforcement
activities?
Are proper protective measures in place?
What are the applicable privacy and security standards of the receiving
country?
What are the general human rights ratings of the receiving country?
Table 3-7 outlines all approved options for cross-border data transfers from the
European Union.
Table 3-7: Approved Mechanisms for Transferring Personal Data Outside the European Union
Adequacy decision: Adequacy is based on the concept of essential equivalence: There must be an adequate level of protection of personal data essentially equivalent to the protection of personal data in the European Union for data to freely travel from the European Union to another jurisdiction (Article 45).
No adequacy decision: Appropriate safeguards (if no adequacy decision) are mechanisms the company can adopt to protect personal data and facilitate ongoing and systematic cross-border personal data transfers. Note that the “Schrems II” decision suggested use of these mechanisms requires a case-by-case assessment of the sufficiency of protections in the receiving jurisdiction and “additional safeguards” where the standard contracts cannot ensure essentially equivalent protection. This has resulted in updated SCCs and guidance from the EDPB. Examples include:
Binding corporate rules (BCRs): Allow large multinational companies to adopt a policy suite with binding rules for handling personal data (Articles 46(b), 47)
SCCs:53 SCCs (Article 46(c)); model contracts (Recital 109)
Codes of conduct or certification mechanisms: Approved codes of conduct (Article 40) and certification mechanisms (Article 42) with binding and enforceable commitments (Articles 40–43)
Ad hoc contractual clauses authorized by supervisory authorities: Require supervisory authority authorization; individual tailoring to a company’s needs; differences at the member-state level
Not every jurisdiction is as strict with cross-border data transfer as the European
Union. In general, the transparency principle applies here in large part, along with
what may be called “surprise minimization”: Is the country to which you’re
transferring personal data likely roughly equivalent in terms of privacy
protections? Would a person who has entrusted you with personal data be likely
to object to their data traveling to that country?
An example from the OPC of Canada: “Individuals should expect that their
personal information is protected, regardless of where it is processed.
Organizations transferring personal information to third parties are ultimately
responsible for safeguarding that information. Individuals should expect
transparency on the part of organizations when it comes to transferring to foreign
jurisdictions.”54
Pay particular attention to personal information access by national security
agencies, law enforcement, and foreign courts. As a rule of thumb, adjust the
privacy program to the most stringent legal requirements to which the data
processing is subject. While a particularly sophisticated data governance operation may know with confidence exactly where each piece of personal data was collected, where it travels, and which rules govern those cross-border data transfers, many organizations play it safe and err on the side of caution.
Also, note that definitions of key terms (e.g., controller, processor, sensitive data,
processing, data transfer) may differ from one jurisdiction to another. You must
understand the implications of doing business with countries that have
inadequate or no privacy laws. In many cases, the risks may outweigh the benefits.
3.4 Organizational Balance and Support
Ensuring the privacy program aligns with business initiatives is very important.
Business units must know and understand the goals and objectives of the privacy
program and be part of the solution:
Compliance should be the baseline
PbD, plus strategizing with business colleagues, will further the
organization’s goals and help strike a balance
Compliance creates an opportunity to simultaneously reevaluate and
improve data management practices, such as data inventory and data
access controls
Compliance should be achieved with the least amount of business
disruption, as business disruption is another form of penalty that
should be considered, in addition to potential fines for noncompliance55
3.5 Understanding Penalties for Noncompliance
with Laws and Regulations
Legal and regulatory penalties are typically imposed on an industry to enforce behavior modifications following previous neglect and improper protection of data. Privacy is no different; organizations are held accountable to protect the privacy
of the data with which they have been entrusted. As penalties for violation of
privacy laws and regulations become more pervasive and serious, the privacy
professional must be prepared to address, track, and understand penalties that
could affect the organization. For example, as shown in table 3-8, per the U.S.
Health Information Technology for Economic and Clinical Health Act (HITECH
Act), which amended the Health Insurance Portability and Accountability Act
(HIPAA) Privacy and Security Rules, the maximum penalty for breach of
protected health information per year is now $1.5 million.56 In April 2019, the
OCR reevaluated the HITECH Act text and interpreted the maximum fines
differently. As of April 2019, the maximum fines that can be applied for violations
of an identical provision in a calendar year are different in each penalty tier. The
maximum fine per violation category, per year, is still $1.5 million for a Tier 4
violation.57
Table 3-8: HIPAA Violation Penalties58
Violation | Penalties
Tier 1—Unaware of the violation and, by exercising reasonable due diligence, would not have known about it | $100–$50,000/violation; up to $25,000/year
Tier 2—Reasonable cause: the covered entity knew about, or by exercising due diligence should have known about, the violation | $1,000–$50,000/violation; up to $100,000/year
Tier 3—Willful neglect, with the violation remediated within 30 days of discovery | $10,000–$50,000/violation; up to $250,000/year
Tier 4—Willful neglect, no effort to correct the violation within 30 days of discovery | $50,000/violation; up to $1.5 million/year
In the European Union, the GDPR creates two tiers of maximum fines
depending on the nature of the current violation and whether the controller or
processor committed any previous violations. Fines depend on several factors,
including:
Nature, gravity, and duration of infringement
Nature, scope, and purpose(s) of processing
Number of data subjects concerned
Level of damage and damage mitigation
Intent or negligence
Degree of responsibility (technical and organizational measures)
Previous infringements
Degree of cooperation with a supervisory authority
Categories of personal data
Manner of notification
Compliance with measures ordered by supervisory authority
Adherence to approved codes of conduct/certification mechanisms
Tier 1 fines can be up to 20 million euros or four percent of total turnover,
whichever is higher, for infringements of principles, data subjects’ rights,
international data transfers, obligations of member state law, and noncompliance
with a supervisory authority’s order. These infringements tend to be more
substantive.
Tier 2 fines are for infringements of most other obligations and can be up to
10 million euros or two percent of total turnover, whichever is higher. These
infringements tend to be more administrative.
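To make the “whichever is higher” calculation concrete, the short Python sketch below computes the maximum possible fine for each tier from a hypothetical annual turnover figure; it illustrates only the cap formula described above, not how supervisory authorities actually calibrate fines.

# Illustration of the GDPR maximum-fine caps: the cap is the higher of a fixed
# amount and a percentage of total annual turnover (hypothetical figures below).
def gdpr_fine_cap(turnover_eur: float, tier: int) -> float:
    if tier == 1:   # more substantive infringements
        return max(20_000_000, 0.04 * turnover_eur)
    if tier == 2:   # more administrative infringements
        return max(10_000_000, 0.02 * turnover_eur)
    raise ValueError("tier must be 1 or 2")

# Example: a company with EUR 600 million annual turnover
print(gdpr_fine_cap(600_000_000, tier=1))  # 24,000,000 (4% exceeds the EUR 20M floor)
print(gdpr_fine_cap(600_000_000, tier=2))  # 12,000,000 (2% exceeds the EUR 10M floor)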
Not all GDPR infringements lead to data protection fines. Supervisory
authorities can take a range of other actions, including:
Issuing warnings and reprimands
Imposing a temporary or permanent ban on data processing
Ordering the rectification, restriction, or erasure of data
Suspending data transfers to third countries59
Violations of the CCPA are subject to enforcement by California’s Office of the Attorney General, which can seek civil penalties of $2,500 for each violation or $7,500 for each intentional violation after notice and a 30-day opportunity to cure have been provided. The CCPA also provides a private right of action, which is limited to data breaches. Under the private right of action, statutory damages can range from $100 to $750 per incident per consumer.60
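As a rough illustration of potential exposure under the private right of action, the Python sketch below multiplies that statutory range by a hypothetical number of affected consumers; actual liability depends on the facts of the breach and the court.

# Illustration only: statutory damages under the CCPA private right of action
# range from $100 to $750 per consumer per incident for qualifying data breaches.
def ccpa_breach_exposure(affected_consumers: int) -> tuple[int, int]:
    return 100 * affected_consumers, 750 * affected_consumers

low, high = ccpa_breach_exposure(10_000)   # hypothetical breach size
print(f"Potential statutory damages: ${low:,} to ${high:,}")
# Potential statutory damages: $1,000,000 to $7,500,000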
In a landmark settlement, Facebook was fined $5 billion for mishandling users’
personal information. The FTC found that Facebook’s handling of user data
violated a 2011 privacy settlement with the agency. That earlier settlement, which
came after the company was accused of deceiving people about how it handled
their data, required the company to revamp its privacy practices.61
For many organizations, the level of fines and enforcement activity in a given
jurisdiction will often guide the organization in prioritizing remediation of its data
protection compliance following a gap analysis or discovering an issue. Therefore,
it may be important to link this activity to the business case development at the
outset.
One possible strategy is to use examples of high-profile breaches suffered by other organizations, and the associated fines, to gain management buy-in for the budget to support and mature the privacy program.
3.6 Understanding the Scope and Authority of
Oversight Agencies
Oversight typically relates to the “watchful care, management, or supervision” of
something. Specific to the previous section, oversight agencies can fine or impose
penalties, civil and criminal, based on laws and regulations. Knowledge of these
oversight organizations will help you understand when involvement is warranted
or unwarranted, whom to call or contact, and when those actions are necessary by
law. Table 3-9 lists some of the oversight organizations around the world.
Table 3-9: Oversight Organizations around the World
Country and Regulatory Authority | Enforcement Powers
European Union (Member state supervisory authority/lead supervisory authority):
Investigate and adjudicate individual complaints
Conduct inquiries upon suspicion of or based on a complaint
Provide directions to controllers for remedying a breach and for improving protection of the data subjects
Issue warnings and reprimands
Order to rectify, block, erase, or destroy data when processed in violation of the law
Impose temporary or definite ban on processing
Conduct search and seizure when there are reasonable grounds for doing so
Refer matters to community institutions, European Parliament, Council, and Commission; the Court of Justice of the European Union (CJEU); and other relevant institutions62
Argentina (Agency of Access to Public Information (AAPI)):
Conduct investigation upon suspicion or complaints filed by data subjects, national ombudsman, or consumer association
Give warnings
Impose fines between ARS $1,000 and ARS $100,000
Revoke permission to operate63
Australia (Office of the Australian Information Commissioner (OAIC)):
Investigate upon suspicion or referral
Impose contractual obligations through enforceable undertaking
File injunction against violators
Impose fines up to AUS $420,000 according to the seriousness of a violation64
Brazil (Autoridade Nacional de Proteção de Dados (ANPD)):
Impose fines up to two percent of the revenues of a private legal entity, group, or conglomerate in Brazil, up to a total maximum of R$50 million per infraction
Issue warnings
Publicize the violation
Block the personal data to which the infraction refers until its regularization
Delete the personal data to which the infraction refers
Partially suspend the database operation to which the infringement refers for a maximum period of six months, extendable for the same period, until the processing activity is regularized by the controller
Suspend the personal data-processing activity to which the infringement refers for a maximum period of six months, extendable for the same period
Partially or totally prohibit activities related to data processing65
Canada (Office of the Privacy Commissioner of Canada (OPC)):
Investigate complaints
Conduct audits and enforce court orders under federal laws
Issue reprimands66
China (Cyberspace Administration of China (CAC) and other sector-specific regulators): The PRC currently lacks a centralized enforcement mechanism for data protection. The DPAs typically have the power to:
Issue warnings and orders to comply
Impose fines up to approximately RMB 500,000
Revoke license to operate
Suspend or prohibit violators from engaging in similar businesses
Refer cases for criminal proceedings67
Hong Kong (Office of the Privacy Commissioner for Personal Data (PCPD)):
Issue warnings
Impose fines up to HK $50,000
Refer cases for criminal proceedings68
India (data protection authority of India, not established yet):
Issue warnings, reprimands, and cease-and-desist orders
Provide direction to entities to help achieve compliance with the law
Suspend business activity for violations
Israel (Privacy Protection Authority (PPA)): A breach may result in both civil and criminal sanctions, including:
Administrative fines
One to five years of imprisonment
The right to receive statutory damages under civil proceedings without the need to prove actual damages69
Japan (Personal Information Protection Commission (PPC)):
Issue warnings and orders to comply
Issue orders to suspend the act of violation
Conduct investigation70
New Zealand (Office of the Privacy Commissioner (OPC)):
Conduct audit to determine compliance with the law
Inquire into any matter that appears or may appear to be infringing privacy of an individual
Conduct investigation
Refer the commissioner’s opinion to director of human rights71
Singapore (Personal Data Protection Commission (PDPC)):
Conduct investigation upon suspicion or receipt of complaints
Impose fines for violation of Do Not Call provisions of the Personal Data Protection Act (PDPA) up to SGD 10,000
Provide direction to entities to help achieve compliance with the law
Refer cases for criminal proceedings72
South Korea (Personal Information Protection Commission (PIPC)):
Conduct investigation
Impose administrative sanctions for violations of the law
Refer cases for criminal proceedings73
Turkey (Kişisel Verileri Koruma Kurumu (KVKK)):
Organize inspections ex officio or following a complaint
Impose administrative fines
Refer cases for criminal proceedings74
United Kingdom (Information Commissioner’s Office (ICO)):
Impose fines of up to four percent of annual worldwide turnover, or GBP 17.5 million, whichever is higher
Investigative and corrective powers
Right to claim compensation75
The IAPP Resource Center hosts a “DPAs” page, which is a full directory of DPAs globally.76
3.7 Other Privacy-Related Matters to Consider
The frameworks presented in these chapters cover many of the main privacy laws
and regulations but do not cover every detail for every country, state, or local
government. Instead, frameworks provide the high-level detail necessary to
manage and keep informed on privacy-relevant matters that concern any
organization. Other privacy matters to consider include the geographical location,
global privacy functions and organizations, and international data sharing.
3.8 Monitoring Laws and Regulations
Methods to track these changes include a variety of resources, such as the internet, printed and online journals, automated online services, and third-party vendors.
Each organization should investigate the best methods based on cost,
requirements, and industry to ensure issue customization, easy access,
professional support, reliability rating, and complete, accurate coverage.
Regardless of an organization’s size and complexity, the privacy professional may
wish to consider some form of third-party support because of the number of new
laws, changing regulations, and other complex factors influencing privacy today.
3.9 Third-Party External Privacy Resources
If an organization is small or the privacy office staffing is limited, the privacy
professional and organization could consider third-party solutions to track and
monitor privacy laws relating to the business. These third parties include legal and consultancy services that can assign people to the organization, as well as automated online services that allow research on privacy law, news, and business tools.
Privacy professionals from large and small firms can also take advantage of a
growing number of free resources to help them to keep up to date with
developments in privacy. These include the IAPP’s e-newsletter, the Daily
Dashboard, and its many other publications, including Privacy Tracker, which is
dedicated to following changes in privacy law around the world. Most law firms
with privacy practices also regularly produce updates and often host free-to-attend
seminars or webinars. In addition, several independent organizations share sound
privacy practices based on privacy issues that continue to arise worldwide. Some
examples of privacy advocate groups and advisory groups include:
American Civil Liberties Union (ACLU)—United States
Better Business Bureau (BBB) Online—United States and Canada
Bits of Freedom—Netherlands
Center for Democracy and Technology (CDT)—global
Electronic Frontier Foundation (EFF)—global
Electronic Privacy Information Center (EPIC)—global
Future of Privacy Forum (FPF)—United States
Global Internet Liberty Campaign—global
International Association of Privacy Professionals (IAPP)—global
Internet Free Expression Alliance (IFEA)—global
Online Privacy Alliance (OPA)—global
Online Trust Alliance (OTA)—global
PrivacyExchange—global
Privacy International—global
Privacy Rights Clearinghouse (PRC)—United States
3.10 Summary
As part of privacy program management, the privacy policy framework begins
with an understanding of the organization’s operations and its compliance with
privacy laws and regulations in various jurisdictions. This approach will identify
the tasks involved in developing organizational privacy policies, standards, and/or
guidelines.
With the abundance of data privacy concerns, changing regulatory
environments, social media networking, increased sharing of personal data, and
advancements in the use of technology in everyday life, the privacy professional’s
responsibilities will continue to evolve, and vigilance is required.
As privacy management becomes more complex, the privacy professional needs
flexible and reusable best practices to create solid privacy programs. Programs must adapt to cultural, technological, and legal changes; otherwise, gaps will form between internal privacy management practices and the expectations of society.
Frameworks, in the form of reusable structures, checklists, templates, processes,
and procedures, prompt and remind the privacy professional of the details
necessary to inform all privacy-relevant decisions in the organization. Having this
framework and a blueprint provides clear guidance on protecting data privacy to
align with the expectations, requirements, laws, and public demands for handling
personal information safely and properly.
There is no one-size-fits-all standard for a privacy program. Establishing and
implementing the program with the necessary inputs and protocols is critical to
effective management and compliance. An adaptable and flexible approach will
assist the organization in making strategic business decisions when selecting
models, strategies, and technologies for the protection and privacy of data
handling and usage.
Privacy laws and regulations comprise a complex regulatory environment. They change frequently and are becoming more specific and detailed to meet consumers' shifting expectations regarding their privacy. It is the role of the organization and
that of the privacy professional to meet those expectations. Implementing the
framework is the first cornerstone to protecting privacy in the organization and to
providing the foundation for effective privacy management.
CHAPTER 4
Privacy Operational Life Cycle: Assess:
Data Assessments
João Torres Barreiro, CIPP/E, CIPP/US
A successful privacy program cannot be built and maintained without a
comprehensive view of the data organizations store and process. Data assessments
can help to inventory, manage, and track personal information, as well as
determine the impact organizational systems and processes will have on privacy.
Data assessments are tools that can help organizations identify privacy risks to
individuals in advance and deal with them effectively at the beginning of any
project that involves the processing of personal data. Addressing potential
problems early will help to achieve a more robust compliance regime and
ultimately reduce costs.
In this chapter, we will examine different types and functions of data
assessments, including privacy impact assessments (PIAs), data protection impact
assessments (DPIAs), and data inventories (also known as data-mapping
assessments). Performing periodic privacy risk assessments is a key activity for the
implementation of the Three Lines Model in terms of enterprise risk
management,1 not only because the compliance or privacy function (second line)
can check for potential irregularities in those assessments, but also because the internal audit function (third line) then reviews them post-rollout.
4.1 Data Governance
Data is the lifeblood of many organizations. It is a strategic asset that can improve client value, increase profitability, allow risk to be managed more effectively, and enable the business to run more efficiently; it is also fundamental to the delivery of internal transformation initiatives. Because of this, it is important that not only personal data but data in general is properly governed through the application of a sound data governance strategy.
The Data Management Association (DAMA) International defines data
governance as the planning, oversight, and control over management of data and
the use of data and data-related sources. In other words, a data governance
framework provides your organization with a holistic approach to collecting,
managing, securing, and storing data.
To help understand what a data governance framework should cover, the DAMA
envisions data management as a wheel, with data governance as the hub from
which the following ten data management knowledge areas radiate:
1. Data architecture—The overall structure of data and data-related
resources as an integral part of the enterprise architecture.
2. Data modeling and design—Analysis, design, building, testing, and
maintenance.
3. Data storage and operations—Structured physical data assets
storage deployment and management.
4. Data security—Ensuring privacy, confidentiality, and appropriate
access.
5. Data integration and interoperability—Acquisition, extraction,
transformation, movement, delivery, replication, federation,
virtualization, and operational support.
6. Documents and content—Storing, protecting, indexing, and
enabling access to data found in unstructured sources and making this
data available for integration and interoperability with structured data.
7. Reference and master data—Managing shared data to reduce
redundancy and ensure better data quality through standardized
definition and use of data values.
8. Data warehousing and business intelligence—Managing analytical
data processing and enabling access to decision support data for
reporting and analysis.
9. Metadata—Collecting, categorizing, maintaining, integrating,
controlling, managing, and delivering metadata.
10. Data quality—Defining, monitoring, maintaining data integrity, and
improving data quality.2
When establishing a data governance strategy and procuring adequate tools to execute it, each of the above facets of data collection, management, archiving, and use should be considered.3
With respect to data governance roles, each organization has its own particularities; however, some commonalities can be identified:4
On a strategic level—A data steering committee formed by C-level individuals. It steers and approves the corporate data strategy, data governance, and data policies.
On a managerial level—The data owners. They are the business or corporate function leads responsible for a data domain or data asset and are accountable for delivering data consumer requirements for clients or colleagues.5
On an operational level—The data stewards. They are subject matter experts (SMEs) in a data domain or data asset who are accountable for the day-to-day management of data and who understand and communicate the meaning and use of information.
Not all organizations have a dedicated data governance function or individual responsible for understanding the data infrastructure, as highlighted above. It is therefore important for privacy professionals to build strong partnerships with those in the business who are accountable for data, to support the building of data inventories and to demonstrate compliance with Article 30 of the EU General Data Protection Regulation (GDPR).
4.2 Inventories and Records
How do you know where personal data resides, how it is used by an organization,
and why it is important? The data inventory, also known as a data map, provides
answers to these questions by identifying the data as it moves across various
systems and thus indicating how it is shared and organized and where it is
physically located. That data is then categorized by subject area, which surfaces inconsistent data versions and enables identification and mitigation of data disparities; this, in turn, serves to identify the most and least valuable data and reveal how it is accessed, used, and stored. It is worth noting, however, that a data inventory can be built in many ways and viewed through many lenses. What is important is for the organization to decide on the most appropriate layout and approach and stick to it to avoid problems later. The benefits of the inventory
also apply more generally because it identifies risks that could affect reputation or
legal compliance. If a problem subsequently occurs, current enforcement practices
indicate penalties are likely to be less severe if the company has an established
system of recording and organizing this inventory. A key benefit for the privacy
function is that the information acquired through building a data inventory can be
extremely useful in complying with Article 30 of the GDPR to populate and
maintain the records of processing activities.
Creating and maintaining data inventories is normally the responsibility of the privacy function and/or the information technology (IT) function, always with input from all business functions that require, own, or make decisions about the use of the data.6 Many organizations also now have a data governance team outside of privacy and IT. Frequently, this team will own the data inventories so it can track confidential and other internal-use datasets in which the privacy function may be less interested.
Most often, the budget for this undertaking is shared across these departments.
Considerations identified in table 4-1 can be used as a starting point for building
your inventory. Internal policies and procedures, laws, regulations, and standards
may also be used to compose the questions. Based on these aspects, the data
inventory offers a good starting point for the privacy team to prioritize resources,
efforts, risk assessments, and current policy in response to incidents.
As explained in chapter 2, building a data inventory manually can require substantial resources and time, both to build and to maintain. A manual exercise also raises questions about the reliability of the information and whether all data processing has been included in scope; therefore, it is worth considering whether tooling provided by third-party organizations could deliver more reliable and automated information, depending on the resources and budget available to the privacy function.
A data inventory should include the items in table 4-1.
Table 4-1: Elements of a Data Inventory
The nature of a repository of privacy-related information—What is the context and purpose of the repository?
The owner of the repository—Who is the starting point for further investigation into the repository?
Legal entity of the processing—This helps to identify applicable data controllers and data processors.
The volume of information in this repository—How much data is in the repository? How much personal data and, in particular, sensitive data is in the repository?
The format of the information—Is this a paper or electronic repository? Is it structured or unstructured?
The use of the information—How is the information being used?
Data retention—How long is the data retained? Is it in accordance with a data retention policy and schedules?
Type (or types) of privacy-related information in the repository—What kinds of information are in the repository (e.g., physical or email addresses, government-issued identification numbers, health information, salary information)?
Where the data is stored—In which country or countries is the data stored? In which system(s)/application(s) is the data located?
Where the data is accessed—From which country or countries is the data accessed? Who in the organization has access?
International transfers—Where will the data flow (from country to country)?
With whom the data is shared—Is the data shared with third parties? Are they controllers or processors?
Transfer mechanisms—How is the transfer covered by your organization's data transfer framework?
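To make these elements concrete, the sketch below models a single inventory entry as a Python data class. It is a minimal illustration only; the field names and example values are hypothetical and would be adapted to whatever layout, approach, and tooling the organization has chosen.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DataInventoryEntry:
    """One repository of privacy-related information, mirroring table 4-1."""
    repository_name: str                # nature, context, and purpose of the repository
    owner: str                          # starting point for further investigation
    legal_entity: str                   # helps identify applicable controllers and processors
    record_volume: int                  # how much data, including personal and sensitive data
    data_format: str                    # paper or electronic; structured or unstructured
    uses: List[str]                     # how the information is being used
    retention_period: str               # aligned with the retention policy and schedules
    data_types: List[str]               # e.g., email addresses, government IDs, health, salary
    storage_locations: List[str]        # countries and systems/applications where data is stored
    access_locations: List[str]         # countries and roles from which data is accessed
    international_transfers: List[str]  # country-to-country data flows
    shared_with: List[str]              # third parties, noting controller or processor role
    transfer_mechanisms: List[str]      # e.g., standard contractual clauses

# Hypothetical example entry for a customer relationship management system
crm_entry = DataInventoryEntry(
    repository_name="CRM customer database",
    owner="Head of Sales Operations",
    legal_entity="Example Corp Ltd (controller)",
    record_volume=250_000,
    data_format="electronic, structured",
    uses=["customer support", "marketing campaigns"],
    retention_period="5 years after last contact",
    data_types=["name", "email address", "purchase history"],
    storage_locations=["Ireland (SaaS CRM)"],
    access_locations=["Ireland", "United States"],
    international_transfers=["EU to United States"],
    shared_with=["email marketing processor"],
    transfer_mechanisms=["standard contractual clauses"],
)
```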
Once the data inventory has been completed and documented, the information
can be used when necessary to address both incidents and standard risk
assessments. This process will help set the organizational priorities for privacy
initiatives by providing data locations, data use, data storage, and data access,
which allows the privacy team to justify priorities and understand the scope of
data usage in the organization.
When building your data inventory, select the tool that will enable your
organization to update it most easily. Options may include a governance, risk,
and compliance (GRC) software system and a data-mapping or inventory tool.7
In addition to using a data inventory, the privacy professional may want to
establish an inventory of applicable laws and regulations.8 Conducting a gap
analysis can help. This task requires effort and resources, especially when the
privacy program is first being established. Consider international, local, and
industry-specific standards and laws, then map gaps against them. Most laws have some overlap, so be sure to involve the legal team in the process. Although not
always necessary, some organizations may decide to use a privacy compliance tool.
4.3 Records of Processing Activities Under the EU
General Data Protection Regulation
One of the more onerous new requirements under the GDPR is the obligation
under Article 30 for controllers and processors to maintain detailed records of
their processing activities.9 A prescriptive list of the contents of these records
includes:
The name and contact details of the controller or processor, data
protection officer (DPO), and/or data protection representative where
applicable
Name and contact details of any joint controllers
The purposes for the processing (for controllers)
A description of the categories of personal data and categories of data
subjects (for controllers) or the categories of processing (for
processors)
The categories of recipients (for controllers)
Any international transfers to third countries or international organizations
Where applicable, the safeguards in place for exceptional transfers of
personal data to third countries or international organizations
Where possible, the retention periods for the various categories of
personal data (for controllers)
A general description of the technical and organizational security
measures implemented
It is important to note this detailed record of processing must be disclosed to a
data protection authority (DPA) upon request. The only exemption from the
requirement to maintain a detailed record of processing is when the controller or
processor employs fewer than 250 people, provided the processing it undertakes
is occasional, does not include sensitive personal data, and is not likely to result in a risk to the rights and freedoms of individuals. For example, processing may
result in a risk of this nature when a controller or processor is carrying out
profiling activities.10 To meet the requirement to maintain a detailed record of
processing activities, a business should carry out and maintain a data flow analysis
report identifying the different categories of personal data, purposes for which the
personal data is processed, recipients, and way the data flows around the business
and externally through different systems.11
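As a minimal sketch of how the Article 30 contents listed above might be captured, and of the Article 30(5) exemption described in this section, the Python fragment below models a controller's record entry and a simple exemption check. The field and function names are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ProcessingActivityRecord:
    """Controller-side entry for the Article 30 record of processing activities."""
    controller_name: str
    controller_contact: str
    dpo_contact: Optional[str]
    joint_controllers: List[str]
    purposes: List[str]
    data_subject_categories: List[str]
    personal_data_categories: List[str]
    recipient_categories: List[str]
    third_country_transfers: List[str]
    transfer_safeguards: Optional[str]  # for exceptional transfers, where applicable
    retention_periods: Optional[str]    # where possible
    security_measures: str              # general description of technical and organizational measures

def ropa_exemption_applies(employee_count: int,
                           processing_is_occasional: bool,
                           includes_sensitive_data: bool,
                           likely_risk_to_individuals: bool) -> bool:
    """Rough reading of the Article 30(5) exemption described above: fewer than
    250 employees, only occasional processing, no sensitive personal data, and
    no likely risk to the rights and freedoms of individuals."""
    return (employee_count < 250
            and processing_is_occasional
            and not includes_sensitive_data
            and not likely_risk_to_individuals)

# A small firm whose processing involves profiling still needs the full record:
print(ropa_exemption_applies(40, True, False, likely_risk_to_individuals=True))  # False
```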
To begin to tackle the data and processing inventory project, the DPO should first identify the data controllers and processors. An organization's finance function or company secretary can provide information to help the DPO determine the role of each legal entity. One starting point is to identify and interview all known data owners; where ownership is less clear, DPOs should start with functional leads. An organization should have someone in each business unit who is responsible for each type of data collected. If the organization has a records or data team, it should also be consulted, as data custodians may be separate from data owners. The digital marketing teams will know what information they gather and use to reach new customers. The corporate counsel teams may have information on data types as well, as they need to know how to freeze such data for litigation purposes.12
The IT organization should have database administrators who should have
schemas of the databases of varying kinds. The team responsible for backups and
business continuity should also know what data is retained and what needs to be
restored. The software team should have lists of all software used by the
organization. The process owner of each can be queried for the personal data of which they are aware. The compliance team would have details of personal data,
including some applicable to compliance with non-GDPR privacy statutes (e.g.,
ePrivacy Directive/Regulation). And the administrator who currently answers
data subject access requests (DSARs) would have more information about
personal data sources.13
Information from these sources and others identified as the interview process
progresses can be used to begin to put together a current data inventory. Not all
this data will be personal data. Nonpersonal data (anonymous data), which is
outside the scope of the GDPR, needs to be identified. Going forward, a process should be implemented so that revised or new applications or systems thoroughly document the personal data they process, which will help keep the data inventory from becoming outdated.14
While gathering information about personal data, the team should also be
gaining an understanding of how that data is being processed.15 Additional
information to discover includes the type of security used to protect the data (e.g.,
encryption), the retention periods for the data, who has access to it, to whom it is
disclosed, and the legal basis for processing the data. If done correctly, a data and
processing inventory will begin to emerge that builds a foundation upon which to
address many of the controller’s or processor’s obligations. Many iterations will be
required, but from this data and processing inventory, the DPO will understand in
some depth the GDPR compliance situation of their organization and be able to
begin formulating gap plans to remediate any noncompliance or privacy risks
identified.16
4.4 Assessments and Impact Assessments
Three types of assessments and impact assessments are of concern at this point: privacy assessments, PIAs, and DPIAs. Although these terms are often used in other contexts to refer to the same concept, they can be differentiated as follows.
4.4.1 Privacy Assessment: Measuring Compliance
Privacy assessments measure an organization’s compliance with laws, regulations,
adopted standards, and internal policies and procedures. Their scope may include
education and awareness; monitoring and responding to the regulatory
environment; data, systems, and process assessments; risk assessments; incident
response; contracts; remediation; and program assurance, including audits.
Privacy assessments may be conducted internally, by the audit function, the DPO, or a business function, or externally, by a third party. They can be scheduled at predefined intervals or conducted in response to a security or privacy event or at the request of an enforcement authority. The standards used can be subjective, such as employee interviews, or objective, such as information system logs.
4.4.2 Privacy Impact Assessment
A PIA is an analysis of the privacy risks associated with processing personal
information in relation to a project, product, or service. To be an effective tool, a
PIA also should suggest or provide remedial actions or mitigations necessary to
avoid or reduce/minimize those risks. Requirements regarding PIAs emanate
from industry codes, organizational policy, laws, regulations, or supervisory
authorities.
PIAs can help facilitate privacy by design (PbD), which is the concept of
building privacy directly into technology, systems, and practices at the design
phase. It helps ensure privacy is considered from the outset and not as an
afterthought. PbD will be covered in greater detail in chapter 5.
To be an effective tool, a PIA should be conducted early; that is:
During the ideation stage or scoping of a project, product, or service
that involves the collection of personal information
When there are new or revised industry standards, organizational
policies, or laws and regulations
When the organization creates new privacy risks through changes to
methods by which personal information is handled
Below are some examples of business projects or processing that may trigger the
need for a PIA:
Re-identification of information
Conversion of records from paper-based to electronic format
Significant merging, matching, and manipulation of multiple databases
containing personal information
Application of user-authentication technology to a publicly accessible
system
System management changes involving significant new uses and/or
application of new technologies
Retiring of systems that held or processed personal data
Incorporation of personal information obtained from commercial or
public sources into existing databases
Significant new interagency exchanges or uses of personal information
Alteration of a business process resulting in significant new collection,
use, and disclosure of personal information
Alteration of the character of personal information due to addition of
qualitatively new types
Implementation of projects using third-party service providers
Regardless of the geographical location or requirements based in law, regulation,
or guideline, the PIA is a risk management tool used to identify and reduce the
privacy/data protection risks to individuals and organizations and aimed at
ensuring a more holistic risk management strategy. Details of PIAs, how they are
used, and formats, methodologies, and processes around the assessments will vary
depending on industry, private- or public-sector orientation, geographical location
or regional requirements, and sensitivity or type of data. The privacy professional
should identify the appropriate methodology and approaches based on these
various factors and tailor the model to the specific needs of the organization.
For instance, the transfer impact assessments recommended by the European Data Protection Board following the Court of Justice of the European Union's (CJEU) "Schrems II" decision are, in essence, PIAs, although they are not named as such.17 Similarly, Brazil's Lei Geral de Proteção de Dados (LGPD) uses the expression relatório de impacto (impact report) to refer to a report that is, in essence, a PIA. This report should contain a description of the data-processing activities that could pose risks to civil liberties and fundamental rights, as well as the measures, safeguards, and mechanisms to mitigate those risks. The LGPD does not establish when the impact report is required, but further guidance in this regard is expected from the national DPA.
Finally, in China, the Personal Information Security Specification makes it the responsibility of the personal information controller to establish a personal information security impact assessment system and to conduct assessments regularly (at least once a year). These assessments mainly evaluate whether processing activities obey the basic principles of personal information security and assess the impact of personal information processing on the lawful rights and interests of personal information subjects. Following the assessment, controllers shall formulate an evaluation report on personal information security impact and, building upon the report, take measures to protect personal information subjects, reducing risk to an acceptable level.18 China's Personal Information Protection Law (PIPL) also establishes an obligation for personal information handlers to conduct a risk assessment in advance of some personal information-handling activities.19
One of the biggest challenges for privacy professionals is prioritizing the projects, products, or services that should undergo a PIA. To identify the data-processing activities that represent a higher privacy risk, some organizations first conduct an express PIA, which consists of a short questionnaire that assesses the need for a full, more comprehensive PIA. This approach enables all
stakeholders to dedicate their resources to the areas where the risks and potential
harms for individuals are most significant and mitigate these risks, creating better
outcomes and more effective protection for individuals.20
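The express PIA described above is essentially a short screening questionnaire. A minimal sketch of that triage step follows; the questions and the threshold are hypothetical examples rather than a standard instrument, and each organization would calibrate its own.

```python
# Hypothetical express-PIA screening: a handful of yes/no questions used to
# decide whether a project should proceed to a full, comprehensive PIA.
SCREENING_QUESTIONS = [
    "Does the project collect or use personal information?",
    "Does it involve sensitive data or data about vulnerable individuals?",
    "Does it introduce new technology or significantly change how data is handled?",
    "Will personal information be shared with third parties or transferred abroad?",
    "Could the processing produce legal or similarly significant effects on individuals?",
]

def full_pia_needed(answers: dict) -> bool:
    """Trigger a full PIA when personal data is involved and any other risk
    indicator is answered 'yes'. The threshold is illustrative only."""
    if not answers.get(SCREENING_QUESTIONS[0], False):
        return False  # no personal information involved, so no PIA is required
    return any(answers.get(question, False) for question in SCREENING_QUESTIONS[1:])

answers = {SCREENING_QUESTIONS[0]: True, SCREENING_QUESTIONS[3]: True}
print(full_pia_needed(answers))  # True: escalate to a full PIA
```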
4.4.3 PIAs in the United States
The U.S. government, under the E-Government Act of 2002, required PIAs from
government agencies when developing or procuring IT systems containing
personally identifiable information (PII) of the public or when initiating an
electronic collection of PII.21 This requirement is preceded by a privacy threshold
analysis (PTA) to determine if a PIA is needed. The PTA would seek to determine
from whom data is collected, what types of personal data are collected, how such
data is shared, whether the data has been merged, and whether any
determinations have been made as to the information security aspects of the
system.22 The Privacy Act requirements include the rights to receive timely notice
of location, routine use, storage, retrievability, access controls, retention, and
disposal; rights of access and change to personal information; consent to
disclosure; and maintenance of accurate, relevant, timely, and complete records.23
As such, the PIA will describe in detail the information collected or maintained,
sources of that information, uses and possible disclosures, and potential threats to
the information.
The uses to which the information is put by the system are described next,
including the legal authority for collecting the data, the retention periods and
eventual destruction, and any potential threats based on use of the data. Also
included are any information dissemination and the controls used, the rights listed
above, the information security program used, and compliance with the Privacy
Act.24 Under implementation guidance, the following were reasons for initiating a
PIA:
Collection of new information about individuals, whether compelled or
voluntary
Conversion of records from paper-based to electronic format
Conversion of information from anonymous to identifiable format
System management changes involving significant new uses and/or
application of new technologies
Significant merging, matching, or other manipulation of multiple
databases containing PII
Application of user-authentication technology to a publicly accessible
system
Incorporation into existing databases of PII obtained from commercial
or public sources
Significant new interagency exchanges or uses of PII
Alteration of a business process resulting in significant new collection, use, and/or disclosure of PII
Alteration of the character of PII due to the addition of qualitatively
new types of PII
Implementation of projects using third-party service providers25
Virginia’s Consumer Data Protection Act (CDPA), signed into law March 2,
2021, also requires controllers to conduct data protection assessments that
evaluate the risks associated with processing activities. While the act specifies the types of activities that must be assessed, it does not indicate how often the assessments must occur or how long they must be kept. Virginia's attorney general can not only
request a controller disclose data protection assessments, but is also specifically
tasked with evaluating the assessments for compliance with the responsibilities set
out in the act.26
4.4.4 International Organization for Standardization (ISO)
ISO 29134 is a set of guidelines for the process of running a PIA and the structure
of the resulting report.27 It offers guidance rather than certifiable requirements, unlike, say, a standard for information security. These guidelines define a PIA as a process for identifying
and treating, in consultation with stakeholders, risks to PII in a process, system,
application, or device. They reiterate that PIAs are important not only for
controllers and their processors, but also the suppliers of digitally connected
devices. In addition, they specify the PIA should start at the earliest design phase
and continue until after implementation. The process first involves conducting a
threshold analysis to determine whether a PIA is needed, then preparing for a
PIA, performing a PIA, and following up on the PIA. The performing phase
consists of five steps:
1. Identifying information flows of PII
2. Analyzing the implications of the use case
3. Determining the relevant privacy-safeguarding requirements
4. Assessing privacy risk using steps of risk identification, risk analysis, and
risk evaluation
5. Preparing to treat privacy risk by choosing the privacy risk treatment
option; determining the controls using control sets, such as those
available in ISO/IEC 27002 and ISO/IEC 29151, and creating privacy
risk treatment plans
The follow-up phase consists of:
Preparing and publishing the PIA report
Implementing the privacy risk treatment plan
Reviewing the PIA and reflecting changes to the process
The structure of the PIA report should include sections on the scope of the
assessment, privacy requirements, risk assessment, risk treatment plan, and
conclusions and decisions. The risk assessment section covers risk sources, threats and their likelihood, consequences and their level of impact, risk evaluation, and compliance analysis. There should also be a summary that can be made public.28
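As a simple illustration of the report structure these guidelines describe, the outline below sketches the sections as a nested Python dictionary. The keys mirror the section names given above; the placeholder values are assumptions, and this is not an official ISO template.

```python
# Skeleton of a PIA report following the structure described in ISO 29134.
pia_report_outline = {
    "scope_of_assessment": "Process, system, application, or device assessed",
    "privacy_requirements": "Privacy-safeguarding requirements identified for the use case",
    "risk_assessment": {
        "risk_sources": [],
        "threats_and_likelihood": [],
        "consequences_and_impact": [],
        "risk_evaluation": "",
        "compliance_analysis": "",
    },
    "risk_treatment_plan": "Chosen treatment options and controls (e.g., from ISO/IEC 27002 or 29151)",
    "conclusions_and_decisions": "",
    "public_summary": "Summary suitable for publication",
}
```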
4.4.5 Data Protection Impact Assessments
When an organization collects, stores, or uses personal data, the individuals
whose data is being processed are exposed to risks. These risks range from
personal data being stolen or inadvertently released and used by criminals to
impersonate the individual, to worry being caused to individuals that their data
will be used by the organization for unknown purposes. A DPIA describes a
process designed to identify risks arising out of the processing of personal data
and to minimize these risks as much and as early as possible. DPIAs are important
tools for negating risk and demonstrating compliance with the GDPR.29
Under the GDPR, a DPIA is required for processing that is likely to result in a
high risk to individuals. Failure to carry out a DPIA when the processing is subject
to a DPIA, carrying out a DPIA in an incorrect way, or failing to consult the
competent supervisory authority where required can result in an administrative
fine of up to 10 million euros or, in the case of an undertaking, up to two percent
of the total worldwide annual revenue of the preceding financial year, whichever is
higher.30
DPIAs are required when an organization is subject to the GDPR and, although
the term “PIA” is often used in other contexts to refer to the same concept, a DPIA
under the GDPR has specific triggers and requirements.
4.4.5.1 When Is a DPIA Required?
Where the processing is "likely to result in a high risk to the rights and freedoms of natural persons," the controller shall, prior to processing, carry out a DPIA.31 The nature, scope, context, and purpose of the processing, the type of processing, and the use of new technologies should all be considered. The use of new technologies in particular, whose consequences and risks are less well understood, may increase the likelihood that a DPIA should be conducted.
Although each national supervisory authority shall establish and make public a
list of the kind of processing operations for which a DPIA is and is not required,32
Article 35(3) provides some examples when a processing operation is “likely to
result in high risks” and, therefore, a DPIA is required:
a. Systematic and extensive evaluation of personal aspects relating to
natural persons which is based on automated processing, including profiling,
and on which decisions are based that produce legal effects concerning the
natural person or similarly significantly affect the natural person;
b. Processing on a large scale of special categories of data, or of personal
data relating to criminal convictions and offences; or
c. A systematic monitoring of a publicly accessible area on a large scale.33
The abovementioned list is non-exhaustive, and the Article 29 Working Party
(WP29)34 provides a more concrete set of processing operations that require a
DPIA due to their inherent high risk:35
Evaluation or scoring—Includes profiling and predicting, especially
from aspects concerning the data subject’s performance at work,
economic situation, health, personal preferences or interests, reliability
or behavior, and location or movements.
Automated decision-making with legal or similar significant effect
—Processing that aims at taking decisions on data subjects producing
“legal effects concerning the natural person” or that “similarly
significantly affects the natural person.”
Systematic monitoring—Processing used to observe, monitor, or
control data subjects, including data collected through networks or a
systematic monitoring of a publicly accessible area.
Sensitive data or data of a highly personal nature—This includes
special categories of personal data, as well as personal data relating to
criminal convictions or offenses.
Data processed on a large scale—The WP29 recommends the
following factors, in particular, be considered when determining
whether the processing is carried out on a large scale: (1) the number of
data subjects concerned, either as a specific number or as a proportion
of the relevant population; (2) the volume of data and/or the range of
different data items being processed; (3) the duration or permanence of
the data processing activity; and (4) the geographical extent of the
processing activity.
Matching or combining datasets—For example, originating from two
or more data-processing operations performed for different purposes
and/or by different data controllers in a way that would exceed the
reasonable expectations of the data subject.
Data concerning vulnerable data subjects—The processing of this
type of data is a criterion because of the increased power imbalance
between the data subjects and the data controller, meaning the
individuals may be unable to easily consent to or oppose the processing
of their data or exercise their rights. Vulnerable data subjects may
include children, employees, and more vulnerable segments of the
population requiring special protection (e.g., persons with mental
health concerns, asylum seekers, the elderly).
Innovative use or application of new technological or
organizational solutions—Such as the combined use of fingerprints
and face recognition for improved physical access control.
When the processing in itself prevents data subjects from exercising
a right or using a service or a contract.36
In most cases, a data controller can consider that a processing operation meeting two of these criteria requires a DPIA to be carried out. In general, the WP29 considers that the more criteria are met by the processing, the more likely it is to present a high risk to the rights and freedoms of data subjects and, therefore, to require a DPIA, regardless of the measures the controller envisions adopting. However, in some cases, a data controller can consider that a processing operation meeting only one of these criteria requires a DPIA. In cases where it is not clear whether a DPIA is required, the WP29 recommends that a DPIA be carried out.37 A DPIA is
a useful tool to help controllers build and demonstrate compliance with data
protection law.38
Conversely, a processing operation may still be considered by the controller not
to be “likely to result in a high risk.” In such cases, the controller should justify and
document the reasons for not carrying out a DPIA and include/record the views
of the DPO.
In addition, as part of the accountability principle, every data controller "shall maintain a record of processing activities under its responsibility," including, inter alia, the purposes of processing, a description of the categories of data and recipients of the data, and, "where possible, a general description of the technical and organizational security measures referred to in Article 32(1)" (Article 30(1)), and must assess whether a high risk is likely, even if it ultimately decides not to carry out a DPIA.39
It is, therefore, good practice that upon completion of a DPIA, the processing is
added to the organization’s record of processing activities.
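The WP29 rule of thumb described above lends itself to a simple screening check. The sketch below counts how many of the nine high-risk criteria a processing operation meets and reflects the guidance that a negative decision should be justified and documented; the labels and function are illustrative assumptions and no substitute for the guidelines themselves.

```python
# The nine WP29 criteria for processing "likely to result in a high risk."
WP29_HIGH_RISK_CRITERIA = {
    "evaluation or scoring",
    "automated decision-making with legal or similar significant effect",
    "systematic monitoring",
    "sensitive data or data of a highly personal nature",
    "data processed on a large scale",
    "matching or combining datasets",
    "data concerning vulnerable data subjects",
    "innovative use of new technological or organizational solutions",
    "processing that prevents data subjects from exercising a right or using a service or contract",
}

def dpia_recommended(criteria_met: set) -> bool:
    """Rule of thumb from the WP29 guidelines: meeting two or more criteria
    usually indicates a DPIA is required; even one can suffice, and doubtful
    cases should default to carrying out a DPIA."""
    return len(criteria_met & WP29_HIGH_RISK_CRITERIA) >= 2

criteria = {"systematic monitoring", "data processed on a large scale"}
if dpia_recommended(criteria):
    print("Carry out a DPIA and add the processing to the record of processing activities.")
else:
    print("Document the reasons for not carrying out a DPIA, including the DPO's views.")
```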
4.4.5.2 What Should a DPIA Include?
The GDPR sets out the minimum features of a DPIA:
A description of the processing, including its purpose and the legitimate
interest being pursued
The necessity of the processing, its proportionality, and the risks that it
poses to data subjects
Measures to address the risks identified40
Figure 4-1: Generic Iterative Process for Carrying Out a DPIA41
The UK Information Commissioner’s Office (ICO) and France’s Commission
Nationale de l’Informatique et des Libertés (CNIL) have available on their
websites several guidelines on DPIAs.42 The CNIL also published an open-source
PIA software tool available in French and English.43
4.4.5.3 When Must the Supervisory Authority Be Contacted?
Whenever the data controller cannot find sufficient measures to reduce the risks
to an acceptable level (i.e., the residual risks are still high), consultation with the
supervisory authority will be necessary.44
Scenarios that should lead to contact with a supervisory authority include:
Illegitimate access to data leading to a threat to the life of the data subjects, layoffs, or financial jeopardy
Inability to reduce the number of people accessing the data because of
its sharing, use, or distribution modes, or when a well-known
vulnerability is not patched
Moreover, the controller will have to consult the supervisory authority
whenever member state law requires controllers to consult with, and/or obtain
prior authorization from, the supervisory authority in relation to processing by a
controller for the performance of a task carried out by the controller in the public
interest, including processing in relation to social protection and public health.45
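A short sketch of the escalation logic described in this subsection follows, assuming the outcome of the DPIA can be summarized as a residual-risk flag; the function and parameter names are hypothetical.

```python
def prior_consultation_required(residual_risk_high: bool,
                                member_state_law_requires_consultation: bool) -> bool:
    """Consult the supervisory authority when the DPIA cannot reduce residual
    risk to an acceptable level, or when member state law requires prior
    consultation or authorization (e.g., tasks carried out in the public
    interest, such as social protection and public health)."""
    return residual_risk_high or member_state_law_requires_consultation

# Example: mitigations leave a well-known vulnerability unpatched, so residual risk stays high.
print(prior_consultation_required(residual_risk_high=True,
                                  member_state_law_requires_consultation=False))  # True
```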
4.4.5.4 Components of a DPIA
Different methodologies could be used to implement the basic requirements set out in the GDPR. Annex 1 of the WP29's Guidelines on Data Protection Impact Assessment lists several examples of DPIA and PIA frameworks. To allow these different approaches to exist while enabling controllers to comply with the GDPR, common criteria have been identified in Annex 2 of the same guidelines. They clarify the basic requirements of the regulation but provide enough scope for different forms of implementation. These criteria can be used to show that a particular DPIA methodology meets the standards required by the GDPR. It is up to the data controller to choose a methodology, but the chosen methodology must comply with the criteria in Annex 2 of those guidelines.46
4.4.6 Assessing Artificial Intelligence Systems
Artificial intelligence (AI) systems are software and possibly also hardware
systems designed by humans that, given a complex goal, act in the physical or
digital dimension by perceiving their environment through data acquisition,
interpreting the collected structured or unstructured data, reasoning on the
knowledge, or processing the information, derived from this data and deciding the
best action(s) to take to achieve the given goal.47 In other words, AI is as an
umbrella term for a range of algorithm-based technologies that solve complex
tasks by carrying out functions that previously required human thinking.
Decisions made using AI are either fully automated or have some form of “human
intervention.”48
Data, including personal data, is core to the use of AI systems, and privacy is
commonly identified as one of the main pitfalls of AI because of AI’s particular
clashes with privacy principles, summarized in Table 4-2:
Table 4-2: AI Privacy Challenges
The table pairs each privacy principle49 with the privacy challenges posed by AI.50

Lawfulness, fairness, and transparency
AI technology inherits the bias of its makers, and the algorithmic model’s results
may be unfair if the training data renders a biased picture of reality.
Discrimination may flow from the practical impacts of the correlations or patterns
that the AI system identifies in a large dataset.
The “black box” effect makes it difficult to understand or explain how information
is correlated and weighted in a specific process.51
Providing information about the algorithmic model to data subjects may reveal
commercial secrets and intellectual property rights.

Data minimization and purpose limitation
The development of AI often requires huge amounts of personal data.
It is normally difficult to define the purpose of processing because it is not
possible to predict what the algorithm will learn. As a result, it is challenging to
define which data is necessary, and the purpose may change as the machine learns
and develops.

Integrity and confidentiality (security)
Machine learning systems require large sets of training and testing data to be
copied and imported from their original context of processing, shared, and stored
in a variety of formats and places, including with third parties.
The personal data of the individuals whose data was used to train an AI system
can be inferred by observing the predictions the system returns in response to new
inputs. These are known as model inversion attacks and membership inference
attacks.52
Considering the challenges highlighted above, a PIA, or a DPIA in the case of
the GDPR, should be conducted. Although there are links between ethical and
equality considerations and the fairness principle recognized by many data
protection laws, you will also need to consider other elements separately to ensure
you comply with any additional obligations you may have regarding unlawful
discrimination and the advancement of equality of opportunity between people
who share a protected characteristic and those who do not.
Some of the considerations to be accounted for in an assessment of an AI
system are:
Ensure that the deployment of an AI system that processes personal
data is driven by the proven ability of that system to fulfill a specific and
legitimate purpose, not by the availability of the technology.
Think about any detriment to data subjects that could follow from bias
or inaccuracy in the algorithms and datasets being used.
Grant data subjects the ability to seek effective redress against a
decision made by AI systems and the humans operating them. To do
that, the entity accountable for the decision must be identifiable, and
the decision-making process should be explicable, reproducible,
traceable, and auditable.
Pay attention to vulnerable groups and situations that are characterized
by asymmetries of power or information.
Ensure the datasets used to train the algorithm do not contain socially
constructed biases, inaccuracies, errors, and mistakes.
Be certain AI systems do not represent themselves as humans to the
data subjects; data subjects should be informed they are interacting
with AI systems.
Assess whether the training data contains personal data of individuals
that is identified or identifiable, either directly or by those who may
have access to the algorithmic model.
Reduce the extent to which the training data can be traced back to
specific individuals by using privacy-enhancing techniques, such as
perturbation or adding “noise,” synthetic data, and federated learning (a
minimal perturbation sketch follows this list).53
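As an illustration of the perturbation technique mentioned in the last point, the following sketch adds Laplace-distributed noise to a numeric training column before it leaves its original context. It is a simplified, hypothetical example, not a full differential-privacy implementation; the column values and the noise scale are illustrative and would need to be tuned against the actual utility and risk requirements.

import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical numeric feature, e.g., ages in a training dataset
ages = np.array([34, 27, 45, 52, 39], dtype=float)

def perturb(values: np.ndarray, scale: float) -> np.ndarray:
    """Add Laplace-distributed noise so individual values are harder to trace back."""
    noise = rng.laplace(loc=0.0, scale=scale, size=values.shape)
    return values + noise

noisy_ages = perturb(ages, scale=2.0)
print(noisy_ages)  # individual values are masked; aggregate statistics remain approximately useful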
There has been a proliferation of industry frameworks for the use and
development of AI in compliance with privacy and data ethics. One example
worth mentioning is the report from the Consultative Expert Group on Digital
Ethics in Insurance of the European Insurance and Occupational Pensions
Authority (EIOPA), of which the author was a part. As highlighted in that
report, proportionate to the complexity, scale, potential impact, and intended use
of the organization’s AI developments, there may be value in appointing an AI
officer or similar role to ensure the consistency and coherence of the work
undertaken by the business and technology teams.54 The AI officer may be a new
function or position created within the organization’s structure, or the role may be
incorporated into the responsibilities of existing functions, such as the DPO. The
AI officer would be required to have sufficient expertise to provide oversight and
advice to all functions and could coordinate the certification of activities.
4.4.7 Attestation: A Form of Self-Assessment
Attestation is a tool for ensuring functions outside the privacy team are held
accountable for privacy-related responsibilities. Once you have determined the
privacy responsibilities of each department and relevant privacy principles against
which their personal information practices will be assessed, you can use this
information to craft questions related to each responsibility. The designated
department is required to answer the questions and, potentially, provide evidence.
Attestation questions should be specific and easy to answer, usually with yes or no
responses. Regular self-assessments can form part of an organization’s privacy
management systems and demonstrate a responsible privacy management culture.
In the United States, one example of attestation involves NIST 800-60, a guide
from the U.S. Department of Commerce’s (DOC) National Institute of Standards
and Technology (NIST).55 The guide maps types of information and
information systems to security categories. For example:
Task—Classify data
Owner—IT
Questions—Has the NIST 800-60 classification system been reviewed
to ensure understanding of each category? Has each type of data within
the information system been mapped to a category? Have data types
that cannot be easily categorized been flagged, analyzed, and classified
by the chief information security officer (CISO)?
Evidence—Spreadsheet with data inventory, categories, and
classifications
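An attestation exercise like the one above can be captured in a simple structure so that responses and evidence are collected consistently across departments. The following Python sketch is purely illustrative; the question wording, identifiers, and file names are hypothetical and not part of NIST 800-60.

attestation = {
    "task": "Classify data",
    "owner": "IT",
    "questions": [
        # Yes/no questions keep attestations specific and easy to answer
        {"id": "Q1", "text": "Has the NIST 800-60 classification system been reviewed?", "answer": None},
        {"id": "Q2", "text": "Has each data type in the system been mapped to a security category?", "answer": None},
        {"id": "Q3", "text": "Have hard-to-categorize data types been flagged and classified by the CISO?", "answer": None},
    ],
    "evidence": ["data_inventory_with_categories.xlsx"],
}

def incomplete_items(record):
    """Return the questions that still need a yes/no answer from the owning department."""
    return [q["id"] for q in record["questions"] if q["answer"] not in ("yes", "no")]

print(incomplete_items(attestation))  # lists Q1-Q3 until the owner responds

Running the same structure on a regular schedule, and retaining the answers and evidence, is one way to show the responsible privacy management culture described above.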
The ICO has created several toolkits to help small- to medium-sized
organizations from the private, public, and third sectors assess their compliance
with data protection laws and find out what they need to do to keep personal data
secure.56
4.5 Physical and Environmental Assessments
Information security is the protection of information for the purpose of
preventing loss, unauthorized access, or misuse. Information security requires
ongoing assessments of threats and risks to information and of the procedures and
controls to preserve the information, consistent with three key attributes:
1. Confidentiality—Access to data is limited to authorized parties
2. Integrity—Assurance that the data is authentic and complete
3. Availability—Knowledge that the data is accessible, as needed, by
those who are authorized to use it
Information security is achieved by implementing security controls, which need
to be monitored and reviewed to ensure that organizational security objectives are
met. It is vital to both public- and private-sector organizations.
Security controls are mechanisms put in place to prevent, detect, or correct a
security incident. The three types of security controls are physical controls,
administrative controls, and technical controls.57 For now, we are going to focus
on the assessment of physical and environmental controls. The other information
security practices will be covered in greater detail in chapter 5.
Physical and environmental security refers to methods and controls used to
proactively protect an organization from natural or manmade threats to physical
facilities and buildings, as well as to the physical locations where IT equipment is
located or work is performed (e.g., computer rooms, work locations). Physical and
environmental security protects an organization’s personnel, electronic
equipment, and data/information.
The assessment of the physical controls implemented is key because many
security incidents can be due to the theft or loss of equipment, abandonment of
old computers, or hard-copy records being lost, stolen, or incorrectly disposed of.
When considering physical security, you should look at the following factors:
The quality of doors and locks on physical document storage assets and
the protection of your premises by such means as alarms, security
lighting, or CCTV
How you control access to your premises, for instance, through the
installation of biometric sensors, such as iris scanning or fingerprint
recognition, and how visitors are supervised
How you protect against natural and manmade threats. On this topic,
many organizations only consider an estimated time to repair and an
estimated time between failures. However, another crucial element is
the system backup: any backup data should be stored in at least two
different locations to offer protection in the event of a disaster or
failure.
How you dispose of any paper and electronic waste. The only way to
guarantee that you dispose of data adequately is to assign the
responsibility, oversight, and ongoing supervision to a team that is well
versed in data privacy and maintains an organized end-of-life data
disposal plan and process. It is also imperative that all personal data
residing at a company, whether pertaining to the company or to a third
party, be assigned to a data retention schedule and be thoroughly
obliterated at the end of it. NIST SP 800-88, “Guidelines for Media
Sanitization,” provides instructions to organizations on how to
effectively sanitize hard drives and other electronic media. Originally
meant for government use, these guidelines are now commonly
implemented by private companies and organizations. NIST SP 800-88
recommends removing data in one of three ways: clearing, purging, or
destroying. Clearing uses software or hardware products to overwrite all
user-addressable storage space, replacing written data and potentially
sensitive information with random data. While the information most
likely cannot be retrieved by basic recovery utilities, clearing provides
only an intermediate level of protection. Purging provides more
comprehensive sanitization than clearing, as it protects information
against laboratory attacks that use advanced methods and tools to
recover data; methods of purging include overwriting, block erasing,
and cryptographic erasure. Destroying, like purging, protects data from
being recovered by state-of-the-art laboratory techniques, but after
destruction the media can no longer store data. Physical techniques for
destroying media include disintegrating, incinerating, melting, and
shredding. While destruction is useful for hardware that cannot be
reused, in most cases purging the media is sufficient (a conceptual
sketch of clearing follows this list).58
How you keep IT equipment, particularly mobile devices, secure. In
particular, you should assess if appropriate security measures are being
taken for the protection of personal data stored in automated data files
against accidental or unauthorized destruction or accidental loss, as well
as against unauthorized access, alteration, or dissemination.
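The following sketch illustrates the idea of clearing only: it overwrites a file’s user-addressable contents with random bytes and then deletes it. It is a conceptual example under stated assumptions, not a compliant NIST SP 800-88 implementation; on modern solid-state drives, wear leveling means a filesystem-level overwrite may not reach every physical copy of the data, which is one reason purging or cryptographic erasure is often preferred.

import os

def clear_file(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random data, then delete it (conceptual 'clearing')."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # replace user-addressable data with random bytes
            f.flush()
            os.fsync(f.fileno())        # push the overwrite down to the storage device
    os.remove(path)

# clear_file("/tmp/old_customer_export.csv")  # hypothetical path, shown for illustration only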
4.6 Assessing Vendors
Under many privacy laws, organizations are liable for the data-processing activities
conducted on their behalf by their vendors. Therefore, a procuring organization
may have specific standards and processes for vendor selection. A prospective
vendor should be evaluated against these standards through questionnaires, PIAs,
and other checklists. Standards for selecting vendors may include:
Reputation—A vendor’s reputation with other companies can be a
valuable gauge of the vendor’s appropriate collection and use of
personal data. Requesting and contacting references can help determine
a vendor’s reputation.
Financial condition and insurance—The vendor’s finances should be
reviewed to ensure the vendor has enough resources in case of a
security breach and subsequent litigation. A current and sufficient
insurance policy can also protect the procuring organization in the
event of a breach.
Information security controls—A service provider should have
sufficient security controls in place to ensure the data is not lost or
stolen.
Point of transfer—The point of transfer between the procuring
organization and vendor is a potential security vulnerability.
Mechanisms of secure transfer should be developed and maintained.
Disposal of information—Appropriate destruction of data and/or
information in any format or media is a key component of information
management for both the contracting organization and its vendors. The
Disposal Rule under the Fair and Accurate Credit Transactions Act of
2003 (FACTA) sets forth required disposal protections for financial
institutions. The Disposal Rule requirements provide a good baseline
for disposal of personal information more generally.
Employee training and user awareness—The vendor should have an
established system for training its employees on their responsibilities in
managing personal or sensitive information.
Vendor incident response—Because of the potentially significant
costs associated with a data breach, the vendor should clearly explain in
advance its provisions for responding to any such breach with the
cooperation needed to meet the organization’s business and legal needs.
Audit rights—Organizations should be able to monitor the vendor’s
activities to ensure it is complying with contractual obligations. Audit
needs can sometimes be satisfied through periodic assessments or
reports by independent trusted parties regarding the vendor’s
practices.59
Policies and procedures—Vendors are generally expected to have
documented policies around the handling of personal information in
their organization, as well as procedures for dealing with data subject
rights, data breaches, etc.
DPOs—The vendor should have an individual or DPO (where
required) who is responsible for the organization’s management of
personal data. Applicable supervisory authority registrations should be
provided.
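A simple way to apply the standards above consistently across prospective vendors is to score each one against weighted criteria. The sketch below is a hypothetical scoring aid, not a prescribed methodology; the criteria names, weights, and acceptance threshold are illustrative and would need to reflect your organization’s own risk appetite.

# Weighted vendor-selection criteria drawn from the standards above (weights are illustrative)
CRITERIA_WEIGHTS = {
    "reputation": 0.10,
    "financial_condition_and_insurance": 0.10,
    "information_security_controls": 0.25,
    "secure_point_of_transfer": 0.15,
    "disposal_of_information": 0.10,
    "training_and_awareness": 0.10,
    "incident_response": 0.10,
    "audit_rights": 0.05,
    "policies_and_dpo": 0.05,
}

def vendor_score(ratings: dict) -> float:
    """Combine 0-5 ratings per criterion into a weighted score between 0 and 5."""
    return sum(CRITERIA_WEIGHTS[c] * ratings.get(c, 0) for c in CRITERIA_WEIGHTS)

ratings = {"reputation": 4, "information_security_controls": 3, "incident_response": 5,
           "financial_condition_and_insurance": 4, "secure_point_of_transfer": 4,
           "disposal_of_information": 3, "training_and_awareness": 4,
           "audit_rights": 5, "policies_and_dpo": 4}

score = vendor_score(ratings)
print(round(score, 2), "acceptable" if score >= 3.5 else "needs further due diligence")

Whatever the scoring mechanics, the value lies in forcing the same questions to be asked of every vendor and in documenting the answers.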
Evaluating vendors should involve all relevant internal and external stakeholders,
including internal audit, information security, physical security, and regulators.
Results may point to areas for improvement that can be remediated, or may
identify higher-level risks that limit the vendor’s ability to properly perform
privacy protections. Once risk is determined, the organization’s best practices may
also be leveraged to assist a vendor that is too small or has limited resources, for
example by offering help with security engineering, risk management, awareness
and education training, auditing, and other tasks.
Contract language should call out privacy protections and regulatory
requirements within the statement of work and then be mapped to service-level
agreements so there are no questions about data privacy responsibilities, breach
response, incident response, media press releases on breaches, possible fines, and
other considerations, as if the vendor were part of the organization. The following
list gives a few examples of the kind of information you may want to consider:
Specifics regarding the type of personal information to which the
vendor will have access at remote locations
Vendor plans to protect personal information
Vendor responsibilities in the event of a data breach
Disposal of data upon contract termination
Limitations on the use of data that ensure it will be used only for
specified purposes
Rights of audit and investigation
Liability for data breach
The purpose of the vendor contract is to make certain all vendors are in
compliance with the requirements of your organization’s privacy program. Failure
to carry out the relevant due diligence assessments can have serious consequences
on organizations; therefore, it is important to establish procedures with the
procurement team and other business owners who will be involved in the process.
4.6.1 Assessing Cloud Computing Vendors
“Cloud computing” refers to the provision of information technology services
over the internet. These services may be provided by a company for its own users
in a private cloud or by third-party suppliers. The services can include software,
infrastructure (e.g., servers), hosting, and platforms (e.g., operating systems).
Cloud computing has numerous applications, from personal webmail to corporate
data storage, and can be subdivided into different types of service models:
Infrastructure as a service (IaaS)—Where the supplier simply
provides remote access to and use of physical computing resources, and
the user is responsible for implementing and maintaining both the
operating platform and all applications
Platform as a service (PaaS)—Where the supplier provides access to
and use of operating platforms, as well as the underlying hardware, but
the user remains responsible for implementing and maintaining
applications
Software as a service (SaaS)—Where the supplier provides the
infrastructure, platform, and application60
Assessing cloud computing vendors before procuring them has its own distinct
challenges, not only because of the complexity of the services entailed, but also
because clients of cloud computing services may have little room to maneuver in
negotiating the contractual terms of use of the cloud services, as standardized
offers are often a feature. Furthermore, one of the most effective ways to assess a
vendor is usually to inspect its premises. This is particularly difficult in the case of
the public cloud for various logistical reasons, and it is unlikely that a cloud
provider would be willing to permit each of its prospective customers to enter its
premises to carry out an audit.
The Cloud Industry Forum indicates some of the areas to which you should
dedicate special attention during a selection assessment of a cloud service
provider:61
Certifications and standards
Providers that comply with recognized standards and quality
frameworks demonstrate an adherence to industry best
practices and standards. While standards may not determine
which service provider you choose, they can be very helpful
in shortlisting potential suppliers. For instance, if security is a
priority, look for suppliers accredited with certifications
like ISO 27001 or the UK government’s Cyber Essentials
Scheme.
Technologies
Make sure the provider’s platform and preferred technologies
align with your current environment, workloads, and
management preferences.
If too much re-coding or customization would be required, you
may need to adapt your workloads to make them suitable for
the provider’s platform. Many service providers offer
comprehensive migration services and assist in the assessment
and planning phases. Ensure you have a good understanding of
the support on offer, map this against project tasks, and decide
who will do what. Often service providers have technical staff
who can fill skills gaps in your migration teams.
Service road map
Ask about how the provider plans to continue to innovate
and grow over time, and understand if its road map fits your
needs in the long term. Specifically, SaaS providers should
demonstrate similar deployments to the ones you are
planning. An important factor to consider is a commitment to
support interoperability.
Data management
The location your data resides in and the subsequent local
laws it is subject to may be a key part of the selection process.
If you have specific requirements and obligations, you should
look for providers that give you choice and control regarding
the jurisdiction in which your data is stored, processed, and
managed. Cloud service providers should be transparent
about their data center locations.
If relevant, assess the ability to protect data in transit through
encryption of data moving to or within the cloud. Also,
sensitive volumes should be encrypted at rest to limit exposure
to unapproved administrator access (a minimal client-side
encryption sketch follows this list).
Look to understand the provider’s data loss and breach
notification processes, and ensure they are aligned with your
organization’s risk appetite and legal or regulatory
obligations.
Information security
Ensure user access and activity is auditable via all routes, and
get clarity on security roles and responsibilities as laid out in
the contracts or business policies documentation.
Ask for internal security audit reports, incident reports, and
evidence of remedial actions for any issues raised.
Subcontractors and service dependencies
In general, think twice before considering providers with a
long chain of subcontractors, especially with mission critical
business processes or data governed by data privacy
regulations.
Uncover any service dependencies and partnerships involved
in the provision of the cloud services. For example, SaaS
providers will often build their service on existing IaaS
platforms, so it must be clear how and where the service is
being delivered.
In some cases, there may be a complex network of connected
components and subcontractors that all play a part in
delivering a cloud service. It is vital to ensure the provider
discloses these relationships and can guarantee the primary
SLAs stated across all parts of the service, including those not
directly under its control. You should also look to understand
limitations of liability and service disruption policies related
to these subcomponents.
Data policies and protection
Assess a provider’s security policies and data management
policies particularly relating to data privacy regulations.
Ensure there are enough guarantees around data access, data
location and jurisdiction, confidentiality, and
usage/ownership rights. Scrutinize backup and resilience
provisions. Review data conversion policies to understand how
transferable your data may be if you decide to leave.
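To illustrate the data-management point about protecting sensitive data before it reaches a provider, the following sketch encrypts data client-side with the Python cryptography library’s Fernet recipe (authenticated symmetric encryption) so that only ciphertext is uploaded. It is a minimal example: key management, which is the hard part in practice, is left out, and the data and file names are hypothetical.

from cryptography.fernet import Fernet

key = Fernet.generate_key()          # generate once; store securely outside the cloud environment
fernet = Fernet(key)

plaintext = b"customer_id,email\n1001,[email protected]\n"   # hypothetical sensitive export
ciphertext = fernet.encrypt(plaintext)                        # authenticated symmetric encryption

with open("customer_export.csv.enc", "wb") as f:              # only the ciphertext leaves your environment
    f.write(ciphertext)

# The data can later be recovered only by a holder of the key: fernet.decrypt(ciphertext)

Keeping the key outside the provider’s control is one way to limit exposure to unapproved administrator access at the cloud supplier.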
4.6.2 Assessing Vendors under the GDPR
Article 28 of the GDPR contains specific provisions for the controller-processor
relationship and supply chain. It is concerned with the entirety of the relationship
between the controller and processor and all the data protection principles, not
just security.
The intention of Article 28(1) is to flow down the security principle and security
requirements into the processor’s organization and through the supply chain to
subprocessors.
To do this, Article 28 uses the device of limiting the controller’s use of
processors to those who can provide “sufficient guarantees” about the
implementation of appropriate technical and organizational measures for
compliance with the GDPR and the protection of the rights of data subjects.
This idea of “sufficient guarantees” encompasses much more than the creation of
contracts. Instead, what it is really focused on is obtaining proof of the processor’s
competence. To make any sense and be truly effective, the idea of sufficient
guarantees must encompass assurance mechanisms. When looked at in this way,
there must be appropriate checking and vetting of the processor by the controller,
via a third-party assessment or certification validation, both before a contract is
created and afterward. The nature of the steps that are required for assurance must
necessarily reflect the consensus of professional opinion, of course, for the reasons
discussed earlier. In appropriate circumstances, the processes of assurance must
include processes of audit, which are made clear in Article 28(3)(h). If the
controller is unable to establish proof of the processor’s competence, it must walk
away; otherwise, it will be in automatic breach of Article 28. All this must work in
a commercial context.
The defining feature of the controller-processor relationship is that the processor
can only act on the instructions of the controller and, as can be seen from Article
28(10), if the processor steps outside the boundaries of its instructions, it risks
being defined as a controller, with all the attendant obligations. None of this is
new to European data protection law; it carries over from the previous directive to the GDPR.
What is new to legislation is the duty to assist the controller with achieving
compliance and reduction of risk, which includes assisting the controller with the
handling of the personal data breach notification requirements (Article 28(3)(f)).
Given this more practical application, it is clear there will need to be close
working operations between the controller and the processor to ensure effective
incident detection and response.
Arguably, Article 28 poses many challenges for the established order of things,
particularly where there is an imbalance between the controller and processor in
the market. Where the processor is a technology giant, perhaps there is a risk that
it may use its more powerful position in a way that the GDPR says triggers
controllership under Article 28(10). Thus, for the processor industry, there are
very good incentives to behave flexibly during contract formation and
procurement.62
4.6.3 Assessing Vendors under the California Consumer Privacy
Act
Vendors can be a great option when it comes to almost any aspect of your
business operations, giving you more time to manage the most important
elements while outsourcing those that do not necessarily require internal review.
However, relying on vendors also means that you need to pay attention to new
requirements under the California Consumer Privacy Act (CCPA).
Given the focus on the sale of data under the CCPA, it might make sense to assess
how the CCPA impacts your negotiated agreements with vendors, especially if
you currently sell data to them. Businesses that engage vendors to handle personal
information must execute written contracts with specific criteria with those
vendors if they want to shift liability to the vendor for any violations of the CCPA
caused by the vendor. If the vendor is defined as a “service provider” under the
CCPA, it must have a written contract that limits processing to the business
purpose of the contract. If the vendor is defined as a “person” under the statute,
the contract should, among other requirements, prohibit the vendor from selling,
retaining, using, or disclosing the personal information outside of the direct
business relationship with the business. The contract must also include a
certification from the vendor that it understands the restrictions and will comply
with them.63
Also, it is important to identify the categories of third parties with whom the
business shares personal information. A consumer has the right to request from a
business, and a business shall be required to disclose to the consumer, certain
details on the personal information it collects on the consumer, including the
categories of third parties with whom the business shares personal information.
Here it is important to remain cognizant that “third parties” are defined in Section
1798.140(w) of the CCPA and that not all vendors will be considered third
parties, and third parties are not the same as service providers. Third parties are
characterized as vendors that have separate uses for the data that a business
shares with them. Therefore, it is important to understand which vendors a
company is using that have independent uses for the personal information shared.
This can be done by reviewing the master services agreement or equivalent terms
to see if the vendor reserves the right to use the data for its own purposes. Do not
take at face value representations that the third party is using “aggregate” or
“anonymized” data. Deidentified and aggregated data have very specific
definitions in the CCPA. If your vendor is not meeting these specific standards,
then the vendor’s processing will come within the ambit of the CCPA.
Accordingly, even if a contract states that a vendor is using “aggregate” or
“anonymized” data that a business provides, it is important for the business to
ensure that the vendor’s anonymization or aggregation practices meet the
requirements of the CCPA definitions of aggregate or deidentified data in Section
1798.140(a) and (h) in the statute.64
4.7 Mergers, Acquisitions, and Divestitures:
Privacy Checkpoints
Mergers, acquisitions, and divestitures contain many legal and compliance aspects
with their own sets of concerns related to privacy. Mergers form one organization
from others, while acquisitions involve one organization buying one or many
others; in divestitures, companies sell one division of an organization for reasons
that may include the parent company’s desire to rid itself of divisions not integral
to its core business.
An organization can be exposed to corporate risk by merging with or acquiring
companies that have different regulatory concerns. Merger and acquisition
processes should include a privacy checkpoint that evaluates:
Applicable new compliance requirements
Sector-specific laws (e.g., Health Insurance Portability and
Accountability Act [HIPAA] in the United States)
Standards (e.g., the Payment Card Industry Data Security Standards
[PCI DSS])
Comprehensive laws/regulations
Existing client agreements
New resources, technologies, and processes to identify all actions that
are required to bring them into alignment with privacy and security
policies before they are integrated into the existing system
If a merger, acquisition, or other change in organizational structure means that
you must transfer data to a different or additional controller, you need to:
Ensure you consider the data sharing as part of the due diligence you
carry out
Follow the applicable data-sharing code of practice
Establish what data you are transferring
Identify the purposes for which the data was originally obtained
Establish your lawful basis for sharing the data
Ensure you comply with the data protection principles, especially
lawfulness, fairness, and transparency
Document the data sharing
Seek technical advice before sharing data where different systems are
involved: there is a potential security risk that could result in the loss,
corruption, or degradation of the data
Consider when and how you will inform data subjects about what is
happening. Under the GDPR, you are required to keep individual data
subjects informed about certain changes relating to the processing of
their data, and they may have a right to object. The same considerations
may apply in reverse to the controller receiving the data.65
On a practical level, it can be difficult to manage shared data immediately after a
change of this kind, especially if you are using different databases or trying to
integrate different systems. It is particularly important in this period to consider
the governance and accountability requirements. In particular, you must:
Check that the data records are accurate and up to date
Ensure you document what you do with the data
Adhere to a consistent retention policy for all records
Ensure appropriate security is in place66
With respect to both partial and total divestitures, the organization should
conduct a thorough assessment of the infrastructure of all or any part of the entity
being divested prior to the conclusion of the divestiture. These activities are
performed to confirm no unauthorized information, including personal
information, remains on the organization’s infrastructure as part of the divestiture,
except for any preapproved proprietary data. Where gaps are identified, it is good
practice to use the same structure as the privacy program to manage the areas that
require privacy implementation or improvement.
4.8 Summary
In recent years, with the proliferation of information communication
technologies, particularly AI systems, and the complex data protection problems
they raise, data assessments and risk management have taken an even more
prominent role in various privacy law regimes. Nevertheless, the necessity of
conducting data assessments goes beyond compliance with certain legal
requirements. They are an important risk management tool with clear financial
benefits. Identifying a problem early will generally require a simpler and less
costly solution. Moreover, data assessments allow organizations to reduce the
ongoing costs of a project by minimizing the amount of information being
collected or used and devising more straightforward processes for staff. Finally,
data assessments improve transparency and accountability, making it easier for
data subjects and supervisory authorities to understand how and why the personal
data is being used.
Endnotes
1 For more information about the Three Lines Model, see “The IIA’s Three Lines Model, An Update of the
Three Lines of Defense,” accessed April 2021, https://global.theiia.org/about/about-internal-
auditing/Public%20Documents/Three-Lines-Model-Updated.pdf. More detailed information on how to
deploy this model for privacy governance can be found in “Privacy Governance and Controls: How Do
You Know You Are Doing it Right?”, IAPP Practical Privacy Series 2014, accessed April 2021,
https://iapp.org/media/presentations/14PPS/PPSNY14_Privacy_Governance_Controls_PPT.pdf.
2 Thor Olavsrud, “What is Data Governance? A Best Practices Framework for Managing Data Assets,”
CIO.com, February 2020, https://www.cio.com/article/3521011/what-is-data-governance-a-best-
practices-framework-for-managing-data-assets.html.
3 For a list of technology solutions on the market that can help you with data governance, see Privacy Tech
Vendor Report, IAPP, https://iapp.org/resources/article/privacy-tech-vendor-report/.
4 As an example of roles and responsibilities related to data governance, see “Data Governance and Data
Policies at the European Commission, Secretariat-General,” 2020,
https://ec.europa.eu/info/sites/info/files/summary-data-governance-data-policies_en.pdf .
5 A data consumer is normally understood as the decision maker who needs data to achieve a desired
outcome.
6 Peter Swire and DeBrae Kennedy-Mayo, U.S. Private-Sector Privacy: Law and Practice for Information
Privacy Professionals, Third Edition (Portsmouth, NH: IAPP, 2020).
7 For a list of technology solutions on the market, see Privacy Tech Vendor Report, IAPP,
https://iapp.org/resources/article/privacy-tech-vendor-report/.
8 Data Protection Laws of the World, DLA Piper, accessed April 2021, https://www
.dlapiperdataprotection.com/index.html#handbook/world-map-section; Global Privacy Handbook, 2020
Edition, Baker McKenzie, accessed April 2021,
https://www.bakermckenzie.com/en/insight/publications/2020/04/2020-global-data-privacy-and-
security-handbook.
9 GDPR, Article 30, accessed November 2018, www.privacy-regulation.eu/en/article-30-records-of-
processing-activities-GDPR.htm.
10 The GDPR defines profiling in Article 4(4) as “any form of automated processing of personal data
consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in
particular to analyze or predict aspects concerning that natural person’s performance at work, economic
situation, health, personal preferences, interests, reliability, behavior, location or movements.” For more
information about profiling, see “Guidelines on Automated Individual Decision-making and Profiling for
the Purposes of Regulation 2016/679,” October 2017,
http://ec.europa.eu/newsroom/article29/document.cfm?doc_id=49826.
11 William RM Long, Geraldine Scali, and Francesca Blythe, EU General Data Protection Regulation, Privacy
& Data Security Practice Portfolio Series, Portfolio No. 550, Bloomberg BNA, 2016,
https://www.sidleycareers.com/~/media/publications/gdpr-portfolio-section-iii-with-cover.pdf.
12 Thomas J. Shaw, DPO Handbook: Data Protection Officers under the GDPR (Portsmouth, NH: IAPP, 2018).
13 Shaw, DPO Handbook.
14 Shaw, DPO Handbook.
15 As explained by Peter Swire and DeBrae Kennedy-Mayo, U.S. Private-Sector Privacy: Law and Practice for
Information Privacy Professionals, almost anything that someone may do with personal information might
constitute processing under privacy and data protection laws. The term “processing” refers to the
collection, recording, organization, storage, updating or modification, retrieval, consultation and use of
personal information. It also includes the disclosure by transmission, dissemination or making available in
any other form, linking, alignment or combination, blocking, erasure, or destruction of personal
information.
16 Shaw, DPO Handbook.
17 For more information, see “Recommendations 01/2020 on Measures that Supplement Transfer Tools to
Ensure Compliance with the EU Level of Protection of Personal Data,” European Data Protection Board,
November 10, 2020,
https://edpb.europa.eu/sites/edpb/files/consultation/edpb_recommendations_202001_supplementary
measurestransferstools_en.pdf.
18 China’s Personal Information Security Specification, Article 10(2), accessed April 2021,
https://www.newamerica.org/cybersecurity-initiative/digichina/blog/translation-chinas-personal-
information-security-specification/.
19 China’s draft Personal Information Protection Law, Article 54, accessed April 2021,
https://www.newamerica.org/cybersecurity-initiative/digichina/blog/chinas-draft-personal-information-
protection-law-full-translation/.
20 For more information about this phase-based approach, please see the interview “Thought Leaders in
Privacy: João Torres Barreiro,” Data Guidance, 2018, https://www.dataguidance.com/video/jo%C3%A3o-
torres-barreiro-former-chief-privacy-officer-associate-vice-president-risk-compliance-hcl.
21 Public Law 107-347, December 2002.
22 See, e.g., “Privacy Threshold Analysis,” U.S. Department of Homeland Security (DHS), accessed April
2021, https://www.dhs.gov/xlibrary/assets/privacy/DHS_PTA_Template.pdf.
23 5 U.S.C. § 552a(d)–(e).
24 See, e.g., “Privacy Impact Assessment for ECS,” DHS, January 2013, accessed April 2021,
https://www.dhs.gov/sites/default/files/publications/privacy_pia_28_nppd_ecs_jan2013.pdf.
25 Shaw, DPO Handbook; M-03-22, “OMB Guidance for Implementing the Privacy Provisions of the E-
Government Act of 2002,” September 2003, accessed April 2021, https://www.whitehouse.gov/wp-
content/uploads/2017/11/203-M-03-22-OMB-Guidance-for-Implementing-the-Privacy-Provisions-of-
the-E-Government-Act-of-2002-1.pdf.
26 For more information on the Virginia Data Protection Act, see Sarah Rippy, “Virginia Passes the
Consumer Data Protection Act,” IAPP, March 3, 2021, https://iapp.org/news/a/virginia-passes-the-
consumer-data-protection-act/.
27 “ISO/IEC 29134:2017, Information Technology—Security Techniques—Guidelines for Privacy Impact
Assessment,” International Organization for Standardization, June 2017,
https://www.iso.org/standard/62289.html.
28 Shaw, DPO Handbook.
29 “Data Protection Impact Assessments (DPIA),” Data Protection Commission, accessed April 2020,
https://www.dataprotection.ie/en/organisations/know-your-obligations/data-protection-impact-
assessments#what-is-a-data-protection-impact-assessment.
30 GDPR, Article 35(1) and (3)–(4); GDPR, Article 35(2) and (7)–(9); GDPR, Article 36 (3)(e);
“Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing is
‘Likely to Result in a High Risk’ for the Purposes of Regulation 2016/679,” April 2017,
https://ec.europa.eu/newsroom/document.cfm?doc_id=44137.
31 GDPR, Article 35.
32 GDPR, Article 35(4) and (5).
33 GDPR, Article 35(3).
34 Upon enactment of the GDPR, May 25, 2018, the Article 29 Working Party has been replaced by the
European Data Protection Board. However, the opinions from the Working Party are still valid.
35 The GDPR does not define the concept of risk. However, a definition of risk that has been used in the
privacy community and proposed for the application of the GDPR is as follows: privacy risk equals the
probability that a data processing activity will result in an impact, threat to or loss of
(in varying degrees of severity) a valued outcome (e.g., rights and freedoms). An unacceptable privacy risk,
therefore, would be a threat to, or loss of, a valued outcome that cannot be mitigated through the
implementation of effective controls and/or that is unreasonable in relation to the intended benefits. For
more information about what constitutes risk, see “Risk, High Risk, Risk Assessments and Data Protection
Impact Assessments under the GDPR,” Centre for Information Policy Leadership, Hunton & Williams
LLP, December 2016, accessed April 2021, https://www
.informationpolicycentre.com/uploads/5/7/1/0/57104281/cipl_gdpr_project_risk_white_paper_21_d
ecember_2016.pdf.
36 “Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing is
‘Likely to Result in a High Risk’ for the Purposes of Regulation 2016/679.”
37 “Guidelines on Data Protection Impact Assessment (DPIA)…”
38 “Guidelines on Data Protection Impact Assessment (DPIA)…”
39 “Guidelines on Data Protection Impact Assessment (DPIA)…”
40 GDPR, Article 35(7) and recitals 84 and 90.
41 “Guidelines on Data Protection Impact Assessment (DPIA)…”
42 For more information, access the ICO website, accessed April 2021, at https://ico.org.uk/for-
organisations/guide-to-the-general-data-protection-regulation-gdpr/data-protection-impact-assessments-
dpias/ and the CNIL website, accessed April 2021, at https://www.cnil.fr/en/privacy-impact-assessment-
pia.
43 “The Open Source PIA Software Helps to Carry Out Data Protection Impact Assessment,” CNIL, June 30,
2021, https://www.cnil.fr/en/open-source-pia-software-helps-carry-out-data-protection-impact-
assesment.
44 Pseudonymization and encryption of personal data (as well as data minimization and oversight
mechanisms, among others) are not necessarily appropriate measures for reducing risks to an acceptable
level. They are only examples. Appropriate measures depend on the context and the risks specific to the
processing operations.
45 “Guidelines on Data Protection Impact Assessment (DPIA)...”
46 “Guidelines on Data Protection Impact Assessment (DPIA)...”
47 “Ethics Guidelines for Trustworthy AI,” Independent High-Level Expert Group on Artificial Intelligence
set up by the European Commission, April 2019, https://ai.bsa.org/wp-
content/uploads/2019/09/AIHLEG_EthicsGuidelinesforTrustworthyAI-ENpdf.pdf .
48 For more information, access the ICO website, accessed April 2021, at https://ico.org.uk/for-
organisations/guide-to-data-protection/key-data-protection-themes/explaining-decisions-made-with-
artificial-intelligence/part-1-the-basics-of-explaining-
ai/definitions/#:~:text=Artificial%20Intelligence%20(AI)%20can%20be,that%20previously%20required
%20human%20thinking.
49 For more information on the privacy principles, see Eduardo Ustaran, European Data Protection: Law and
Practice, Second Edition, Chapter 6, (Portsmouth, NH: IAPP, 2019).
50 For more information on the privacy challenges of AI, see Artificial Intelligence and Privacy, January 2018,
Datatilsynet, The Norwegian Data Protection Authority,
https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf.
51 As elucidated on “Ethics Guidelines for Trustworthy AI,” an explanation as why a model has generated a
particular output or decision (and what combination of input factors contributed to that) is not always
possible. These cases are referred to as “black box” algorithms.
52 For more information on this topic, see the ICO “How Should We Assess Security and Data Minimisation
in AI?”, accessed April 2021, https://ico.org.uk/for-organisations/guide-to-data-protection/key-data-
protection-themes/guidance-on-ai-and-data-protection/how-should-we-assess-security-and-data-
minimisation-in-ai/.
53 ICO, “How Should We Assess Security and Data Minimisation in AI?” Also, for a more comprehensive
overview on conducting privacy assessments on AI projects, please see the IAPP web conference “AI, a
Privacy Odyssey: Conducting Privacy Assessments on AI Projects,” November 5,
2020, https://iapp.org/resources/article/web-conference-ai-a-privacy-odyssey-conducting-privacy-
assessments-on-ai-projects/.
54 For more information, please see Artificial Intelligence Governance Principles: Towards Ethical and
Trustworthy Artificial intelligence in the European Insurance Sector, A Report from the EIOPA’s Consultative
Expert Group on Digital Ethics in Insurance, accessed July 2021, https://www.eiopa
.europa.eu/sites/default/files/publications/reports/eiopa-ai-governance-principles-june-2021.pdf.
55 “Volume I: Guide for Mapping Types of Information and Information Systems to Security Categories,”
NIST, August 2008, accessed April 2021,
https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-60v1r1.pdf.
56 For more information, access the ICO website, accessed April 2021, at https://ico.org.uk/for-
organisations/data-protection-advice-for-small-organisations/checklists/data-protection-self-assessment/.
57 Swire and Kennedy-Mayo, U.S. Private-Sector Privacy.
58 For more information, access the NIST guidelines, accessed May 2021, at
https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-88r1.pdf.
59 Swire and Kennedy-Mayo, U.S. Private-Sector Privacy.
60 Ustaran, European Data Protection.
61 “8 Criteria to Ensure You Select the Right Cloud Service Provider,” Cloud Industry Forum, accessed April
2021, https://www.cloudindustryforum.org/content/8-criteria-ensure-you-select-right-cloud-service-
provider. For support documentation in the risk decision-making process of cloud providers, see “Cloud
Risk Decision Framework, Principles & Risk-Based Decision—Making for Cloud-Based Computing
Derived from ISO 31000,” accessed April 2021,
http://download.microsoft.com/documents/australia/enterprise/SMIC1545_PDF_v7_pdf.pdf, and also
“Guidelines on the Use of Cloud Computing Services by the European Institutions and Bodies,” European
Data Protection Supervisor, April 16, 2018, https://edps.europa.eu/sites/default/files/publication/18-03-
16_cloud_computing_guidelines_en.pdf.
62 Ustaran, European Data Protection.
63 Dominique Shelton Leipzig, Implementing the CCPA: A Guide for Global Business, Second Edition,
(Portsmouth, NH: IAPP, 2021), 11.
64 Shelton Leipzig, Implementing the CCPA, 53 – 54.
65 “Data Sharing Code of Practice,” ICO, version 1.0.53, December 2020, https://ico.org.uk/media/for-
organisations/data-sharing-a-code-of-practice-1-0.pdf.
66 “Data Sharing Code of Practice,” ICO.
CHAPTER 5
Privacy Operational Life Cycle: Protect:
Protecting Personal Information
Jonathan Fox, CIPP/US, CIPM
For data privacy, protecting personal information (which is sometimes referred to
as “personal data” or “personally identifiable information” [PII]) starts with
privacy by design (PbD); includes determining which information security
privacy controls are needed; and continues through ensuring those controls are
successfully designed, engineered, deployed, and monitored in whatever it is (e.g.,
product, service, IT system, business process) that is processing personal
information.
5.1 Privacy by Design
Originating in the mid-1990s and developed by Ann Cavoukian, former
information and privacy commissioner of Ontario, the PbD framework dictates
that privacy and data protection are embedded throughout the entire life cycle of
technologies, from the early design stage through deployment, use, and ultimate
disposal or disposition. The foundational concept—that organizations need to
build privacy directly into technology, systems, and practices—has gained
recognition around the globe, including from the U.S. Federal Trade Commission
(FTC) and the European Commission.
PbD consists of seven foundational principles:1
1. Proactive, not reactive; preventative, not remedial. PbD anticipates
and prevents privacy-invasive events before they happen, rather than
waiting for privacy risks to materialize.
2. Privacy as the default. No action is required by individuals to maintain
their privacy; it is built into the system by default. This concept has
been introduced in the EU General Data Protection Regulation
(GDPR).
3. Privacy embedded into design. Privacy is an essential component of
the core functionality being designed and delivered. The FTC has
adopted this principle in its consumer privacy framework, calling for
companies to promote consumer privacy throughout the organization
and at every stage of product development.
4. Full functionality—positive-sum, not zero-sum. PbD seeks to
accommodate all legitimate interests and objectives, rather than making
unnecessary trade-offs.
5. End-to-end security—full life cycle protection. Strong security
measures are essential to privacy, from start to finish of the life cycle of
data.
6. Visibility and transparency. Component parts and operations remain
visible and transparent to users and providers alike. Visibility and
transparency are essential to establishing accountability and trust.
7. Respect for user privacy. Above all, PbD requires keeping the interests
of the individual uppermost by offering such measures as strong privacy
defaults, appropriate notice, and empowering user-friendly options.
When followed, the principles of PbD (see figure 5-1) ensure an
organization establishes a culture of privacy as realized through the privacy
framework, mission statement, training, and awareness. The organization,
having implemented a tactical strategy to reduce privacy-associated risks,
may then be viewed favorably by its peer industry partners and consumers.
Figure 5-1: The Foundational Principles of Privacy by Design (after Cavoukian)
5.1.1 Paradigm of Privacy by Design
The PbD paradigm ensures privacy and security controls are aligned with
an organization’s tolerance for risk, its compliance with regulations, and its
commitment to building a sustainable privacy-minded culture. Notably,
though, the paradigm is not a formal security/privacy engineering process
(i.e., a system development life cycle [SDLC]).
The qualities of the paradigm include:
Being proactive—By default, privacy controls are part of the
system engineering requirements. They are tested for
effectiveness and monitored continuously.
Embedded privacy controls—This involves putting them into
systems and applications, auditing them for regulatory
compliance, and evaluating them when new threats to
information systems are discovered.
Demonstrating respect for users—Privacy and security
controls coexist transparently to a user. They do not diminish the
necessary authorizations to access data. The protection of
organizational information assets is enabled without
unnecessary trade-offs.
Privacy has historically been viewed as an impediment to innovation and
progress, but that’s so yesterday and so ineffective as a business model. Without
user trust, technologies can’t move forward.2
—Ann Cavoukian, PhD, former Ontario information and privacy
commissioner, who has been encouraging organizations since the 1990s to
embrace the concept of PbD.
5.2 Data Protection by Design and Default
Following are Article 25 from Chapter IV of the GDPR and Recital 78, which
articulate what is meant by data protection by design and default from an EU
perspective. While different from PbD in language and principles, they are highly
similar in concept and goal: that data privacy should be built into the design
process and not added on as an afterthought.
Article 25—Data protection by design and by default
1. Taking into account the state of the art, the cost of implementation and
the nature, scope, context and purposes of processing as well as the risks
of varying likelihood and severity for rights and freedoms of natural
persons posed by the processing, the controller shall, both at the time of
the determination of the means for processing and at the time of the
processing itself, implement appropriate technical and organizational
measures, such as pseudonymization, which are designed to implement
data-protection principles, such as data minimization, in an effective
manner and to integrate the necessary safeguards into the processing in
order to meet the requirements of this Regulation and protect the rights
of data subjects.
2. The controller shall implement appropriate technical and
organizational measures for ensuring that, by default, only personal
data which are necessary for each specific purpose of the processing are
processed. That obligation applies to the amount of personal data
collected, the extent of their processing, the period of their storage and
their accessibility. In particular, such measures shall ensure that by
default personal data are not made accessible without the individual’s
intervention to an indefinite number of natural persons.
3. An approved certification mechanism pursuant to Article 42 may be
used as an element to demonstrate compliance with the requirements set
out in paragraphs 1 and 2 of this Article.3
Recital 78—Appropriate technical and organizational measures
The protection of the rights and freedoms of natural persons with regard to the
processing of personal data require that appropriate technical and organisational
measures be taken to ensure that the requirements of this Regulation are met.
In order to be able to demonstrate compliance with this Regulation, the controller
should adopt internal policies and implement measures which meet in particular
the principles of data protection by design and data protection by default.
Such measures could consist, inter alia, of minimising the processing of personal
data, pseudonymising personal data as soon as possible, transparency with
regard to the functions and processing of personal data, enabling the data subject
to monitor the data processing, enabling the controller to create and improve
security features.
When developing, designing, selecting and using applications, services and
products that are based on the processing of personal data or process personal
data to fulfil their task, producers of the products, services and applications
should be encouraged to take into account the right to data protection when
developing and designing such products, services and applications and, with due
regard to the state of the art, to make sure that controllers and processors are able
to fulfil their data protection obligations.
The principles of data protection by design and by default should also be taken
into consideration in the context of public tenders.4
Like PbD, Article 25 sets an expectation that privacy be designed into whatever is being developed and outlines what this means. With this, PbD, as a concept, has moved from something recognized as good practice to a compliance requirement—at least for systems, applications, and products that will process data in the European Union or data collected in the European Union.
Although PbD and the GDPR's notion of data protection by design and by default share similar goals, each defines the concept differently. The GDPR's principles are tethered to the requirements of the regulation:5
Lawfulness, fairness, and transparency of processing in relation to the
data subject
Purpose limitation: collecting and processing for the specified purpose
only
Compatibility test for further processing: link between
purposes, nature of the data, method of collection,
consequences of secondary uses, and safeguards
Data minimization: relevance and limit to what is necessary
Accuracy: complete and up-to-date data
Storage limitation: kept only as long as needed to conduct the specified
processing
Integrity and confidentiality: security
Accountability: responsibility and demonstration of compliance
Privacy by Design and Privacy Engineering
“If Privacy by Design provides the ‘what’ to do, then privacy engineering
provides the ‘how’ to do it.”
—Ann Cavoukian, PhD, Stuart Shapiro, PhD, R. Jason Cronk, Esq6
Privacy engineering is a discipline for which PbD acts as a facilitator: PbD provides valuable design guidelines that privacy engineers should follow. In turn, privacy engineering adds to and extends PbD by providing a methodology and technical tools based on industry guidelines and best practices.7
Figure 5-2: The Intersection of Privacy and Engineering
5.3 Diagramming Privacy by Design
One way to approach PbD is to take advantage of the data inventory and visually lay out, at a high level, data flow diagrams that include administrative and end users, first- and third-party processors, and geographic locations (see figure 5-3), and then add the data flows (see figure 5-4).
Figure 5-3: Data Flow Diagram 1: Actors
Figure 5-4: Data Flow Diagram 2: Data Flow
Next, begin to work through likely, less likely, and edge-case risks (harms,
threats, vulnerabilities), and, with each, identify what privacy and information
security controls are warranted or what must change about the design (see figure
5-5).
Figure 5-5: Data Flow Diagram 3: Risks
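For teams that want to keep these diagrams in a reviewable, version-controlled form, the same actors, flows, and risks can also be captured as simple structured data. The following is a minimal, hypothetical Python sketch; the actor names, locations, data categories, and risk entries are invented for the example rather than drawn from the figures.

# Minimal, hypothetical sketch: a data flow diagram captured as structured data.
# All names, locations, data categories, and risks below are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Actor:
    name: str        # e.g., administrative user, end user, vendor
    role: str        # "data subject", "first party", "third-party processor"
    location: str    # geographic location, relevant for transfer analysis

@dataclass
class Flow:
    source: str
    destination: str
    data_categories: List[str]
    risks: List[str] = field(default_factory=list)     # harms, threats, vulnerabilities
    controls: List[str] = field(default_factory=list)  # controls identified for the risks

actors = [
    Actor("End user", "data subject", "EU"),
    Actor("Web application", "first party", "EU"),
    Actor("Email vendor", "third-party processor", "US"),
    Actor("Analytics vendor", "third-party processor", "US"),
]

flows = [
    Flow("End user", "Web application", ["name", "email address"]),
    Flow("Web application", "Email vendor", ["email address"],
         risks=["cross-border transfer"], controls=["transfer mechanism in contract"]),
    Flow("Web application", "Analytics vendor", ["usage events"],
         risks=["secondary use beyond original purpose"]),
]

# Review each edge: flag flows that list risks but no corresponding controls yet.
for f in flows:
    if f.risks and not f.controls:
        print(f"Review needed: {f.source} -> {f.destination}: {f.risks}")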
5.4 Information Security
Whether you use PbD, the GDPR's Article 25 data protection by design and by default, or privacy engineering as the means to design, develop, deploy, manage, and retire systems that process personal information, there will be a dependency on information security throughout the project life cycle to protect the data being processed.
Therefore, as a privacy professional, it is important to understand the
connections and disconnects information security has with data privacy.
Just as data privacy has a set of objectives that need to be achieved to assert that
privacy is maintained, so does information security. Also, just as data privacy has
terms and concepts with specific definitions, so does information security. In
addition, just as the privacy governance life cycle provides the methods to assess,
protect, sustain, and respond, so does the information security life cycle.
5.4.1 Confidentiality, Integrity, Availability
Information security aims to ensure the confidentiality, integrity, and availability of information (often referred to as CIA) throughout the data life cycle. "Confidentiality" means prevention of unauthorized disclosure of information. "Integrity" ensures information is protected from unauthorized or unintentional alteration, modification, or deletion. "Availability" means information is readily accessible to authorized users. Additionally, information security includes the concepts of "accountability" and "assurance." For information security, accountability means entity ownership is traceable, while assurance means the other four objectives (confidentiality, integrity, availability, and accountability) are met.8
5.4.2 Information Security Practices
Like data privacy, information security involves a continual, ongoing set of practices applied throughout the data life cycle, from the creation or collection of data to its destruction. Information security defines "risk" as the combination of the probability of an event and its consequence (ISO/IEC 73).9
Information security builds upon risk management practices. The risk
management processes provide:
Identification of risk
Selection and implementation of controls and measures to mitigate risk
Tracking and evaluation of risk to validate the first two parts
The reasoning is that risks should be understood and prioritized before they are addressed; once they are, the appropriate controls and measures can be applied to mitigate them, and their effectiveness can be monitored. This keeps the response to risk both effective and efficient.
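As a toy illustration of this probability-and-consequence view of risk and of the identify, mitigate, and track loop described above, the following Python sketch scores and ranks a few hypothetical risks. The 1-to-5 scales and the example entries are assumptions made purely for illustration, not part of any cited standard.

# Toy risk-scoring sketch: likelihood and consequence on assumed 1-5 scales;
# the example risks and controls are invented for illustration.
risks = [
    {"name": "Misconfigured cloud storage", "likelihood": 4, "consequence": 4, "controls": []},
    {"name": "Insider data exfiltration", "likelihood": 2, "consequence": 5, "controls": ["monitoring"]},
    {"name": "Lost unencrypted laptop", "likelihood": 3, "consequence": 3, "controls": ["disk encryption"]},
]

def score(risk):
    # "Combination of the probability of an event and its consequence,"
    # modeled here simply as the product of two ordinal scores.
    return risk["likelihood"] * risk["consequence"]

# Step 1: identify and prioritize before addressing.
for risk in sorted(risks, key=score, reverse=True):
    # Step 2: select and implement controls for the highest-ranked, untreated risks.
    status = "treated" if risk["controls"] else "needs controls"
    # Step 3: tracking and evaluation would revisit these scores once controls are applied.
    print(f"{score(risk):>2}  {risk['name']} ({status})")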
Information security risks in today's environment have evolved and become increasingly nuanced and focused. Here is a list to consider from Security Boulevard:10
1. Cloud-based threats—“This rapid migration to the cloud exposes
businesses to a slew of security challenges and threats. Cloud app
vulnerabilities, incomplete data deletion, misconfigurations in cloud
storage, and diminished visibility and control are some of the
common cloud services issues that increase cybersecurity risks.”
2. Insider threats—“To stay secured against insider threats, companies
must quickly and accurately detect, investigate and respond to issues
that could be indicators of potential insider attacks. Since
conventional antivirus and antimalware tools are not as effective
against insider threats, you need specialized tools to safeguard your
business against them.”
3. Remote worker endpoint security—“Remote workers often
operate without any network perimeter security, thus missing out on a
critical part of layered cybersecurity defense. Cybercriminals have
quickly adapted to the remote working environment by exploiting
cloud-based services, improperly secured VPNs and unpatched
remote computers to hack into off-network systems.”
4. Phishing attacks—“Phishing scams typically employ social
engineering in traditional email and cloud services attacks. Phishing
can result in Business Email Compromise (BEC), Account Takeover
(ATO), credential theft, ransomware and other security breaches.”
5. Deepfakes—“A deepfake is the use of machine learning and artificial
intelligence (AI) to manipulate an existing image or video of a person
to portray some activity that didn’t actually happen.”
6. Internet-of-Things (IoT) devices—“The fact that a majority of the
new [IoT] devices are still in their infancy means that there’s a much
larger attack surface for cybercriminals to target the vulnerabilities
associated with these novel technologies. In addition, it is extremely
difficult to develop cybersecurity strategies to keep up with the rapid
emergence of new IoT devices.”
7. Malvertising—“Malvertising, a portmanteau of malicious
advertising, is the use of online ads to spread malware. Generally this
occurs through the injection of malicious code into ads that are then
displayed on various websites through legitimate online advertising
networks.”
8. Fileless attacks and living off the land—“A subcategory of living off
the land (LotL) attacks, fileless attacks exploit features and tools that
are present in the victim’s environment. Such attacks do not depend
on file-based payloads and usually do not generate new files. That
said, fileless attacks are often not identified by conventional detection
and prevention (antivirus) solutions.”
9. Sophisticated and targeted ransomware attacks—“Ransomware
operators have devised innovative ways to spread rapidly, dodge
endpoint security protocols and launch successful attacks on targeted
companies and individuals. This is a major cause for concern since the
effects of a single ransomware attack can be extremely damaging to
small and midsize businesses, leading to exorbitant costs associated
with downtime and recovery.”
10. Social media-based attacks—“Social media has frequently been the
medium of choice for launching various types of cyberattack. For
example, cybercriminals might launch an attack by announcing a new
product or a webinar mimicking a legitimate business. Once the user
clicks on the registration URL, they would be redirected to a
malicious website and driven to compromise personally identifiable
information or credentials for multifactor authentication.
Inefficient verification and authentication practices further enable
social media attacks to succeed. Abbreviated URLs and malicious
QR codes might be employed to obscure malicious websites and
launch cyberattacks on either the legitimate business account or
through rogue accounts that use the same name.”
5.4.3 Controls
Information security uses controls to manage risk. ISACA defines controls as “the
means of managing risk, including policies, procedures, guidelines, practices or
organizational structures, which can be of an administrative, technical,
management, or legal nature.”11
Controls are divided into the following categories:
Preventive controls are intended to prevent an incident from
occurring. They are used to avoid undesirable events, errors, and other
occurrences that an enterprise has determined could have a negative
impact on a process or end product (e.g., by locking out unauthorized
intruders).12
Detective controls are intended to identify and characterize an
incident in progress. They detect and report when errors, omissions,
and unauthorized uses of entries occur (e.g., by sounding an alarm and
alerting the appropriate person).13
Corrective controls are intended to limit the extent of any damage
caused by the incident. They are designed to correct errors, omissions,
and unauthorized uses and intrusions once they are detected (e.g., by
recovering the organization to normal working status as efficiently as
possible).14
Controls can also be categorized according to their nature, for example:
Physical controls govern physical access to hard copies of data and the
systems that process and store electronic copies (e.g., fences, doors,
locks, and fire extinguishers).15
Administrative or policy controls govern an organization’s business
practices (e.g., incident response processes, management oversight,
security awareness and training, policies regarding how the organization
handles data).16
Technical controls govern software processes and data (e.g., user
authentication [login] and logical access controls, antivirus software,
firewalls).17
Here are some information security controls:18
Information security policies—Document the rule set for an organization to maintain the confidentiality, integrity, availability, and resiliency of its information resources and technology, as well as the consequences of noncompliance with the rules. It is generally good practice for the privacy function to align with and reference information security policies to ensure consistency in messaging, as well as to distinguish the two functions' responsibilities, since there is often confusion between them when it comes to protecting data.
Organization of information security—An entity's organizational structure, staffing, roles, and responsibilities implemented to support its information security needs.
Asset management—The secure management of information resources throughout the data life cycle, from the acquisition of data or technology through to the destruction or disposal of either. This starts with documenting what information an organization processes, what systems and technology are used, and who has access.
Access control—Manages who has access to which information and technology resources and under what circumstances.
Cryptography—The process of encoding and decoding information so
only intended parties have access to it.
Physical and environmental security—Controls that protect tangible assets and locations from harm and intrusion; these can consist of techniques to monitor, detect, and deter.
Operational security—The process of identifying information security risks (threats and vulnerabilities) to specific data or information resources and applying controls to reduce the identified risk to an acceptable level.
Communications security—Ensures communications are not
intercepted or accessed by unintended recipients of the
communications.
Systems acquisition, development, maintenance, and disposal—The management of technology throughout its life cycle, from sourcing (with necessary due diligence of suppliers and the technology) to disposal; this includes version control and patching of software and firmware.
Supplier relationships—The management of information security within supplier agreements and the monitoring of supplier service delivery.
Information security incident management—Triaging and responding to real and suspected information security incidents and ensuring necessary countermeasures are applied as warranted.
Information security aspects of business continuity management—Ensuring business continuity management processes are designed, implemented, and used without compromising the security of the information assets.
Compliance—An organization's adherence to its information security policies, standards, and processes, along with the mechanisms and processes used to validate that adherence.
5.4.4 Information Security Standards and Guidelines
Information security practices use standards and guidelines for consistent
application of management, technical, and operational controls to reduce the risks
to confidentiality, availability, and integrity of information.
The best known and most prominent are those from the International
Organization for Standardization (ISO) and U.S. National Institute of Standards
and Technology (NIST).
These include:
ISO/IEC 27000. Information security management systems—
Overview and vocabulary. ISO/IEC 27000:2018 provides the
overview of information security management systems (ISMS). It also
provides terms and definitions commonly used in the ISMS family of
standards.19
ISO/IEC 27001. Information security management systems—
Requirements. ISO/IEC 27001 is the best-known standard in the
family providing requirements for an ISMS.20
What is an ISMS? An ISMS is a systematic approach to managing
sensitive company information so that it remains secure. It includes
people, processes, and IT systems by applying a risk management
process.
ISO/IEC 27002. Code of practice for information security
management. ISO/IEC 27002:2013 gives guidelines for organizational
information security standards and information security management
practices, including the selection, implementation, and management of
controls taking into consideration the organization’s information
security risk environment(s).21
ISO/IEC 27003. Information security management system
implementation guidance. ISO/IEC 27003:2017 describes the
process of ISMS specification and design from inception to the
production of implementation plans. It describes the process of
obtaining management approval to implement an ISMS, defines a
project to implement an ISMS, and provides guidance on how to plan
the ISMS project, resulting in a final ISMS project implementation
plan.22
ISO/IEC 27004. Information security management—
Measurement. ISO/IEC 27004:2016 provides guidelines intended to
assist organizations in evaluating the information security performance
and effectiveness of an information security management system.23
ISO/IEC 27005. Information security risk management. ISO/IEC
27005:2018 provides guidelines for information security risk
management and is designed to assist the satisfactory implementation
of information security based on a risk management approach.24
ISO/IEC 27006. Requirements for bodies providing audit and certification of information security management systems. ISO/IEC 27006:2015 specifies requirements and provides guidance for bodies providing audit and certification of an ISMS. It is primarily intended to support the accreditation of certification bodies providing ISMS certification.25
ISO/IEC 27009. Security techniques—Sector-specific application
of ISO/IEC 27001—Requirements. This document specifies the
requirements for creating sector-specific standards that extend
ISO/IEC 27001 and complement or amend ISO/IEC 27002 to
support a specific sector (domain, application area, or market).26
ISO/IEC 27010. Information technology, security techniques, information security management for inter-sector and inter-organizational communications. ISO/IEC 27010:2015 provides guidelines, in addition to the guidance given in the ISO/IEC 27000 family of standards, for implementing information security management within information-sharing communities.27
ISO/IEC 27011. Information security management guidelines for
telecommunications organizations based on ISO/IEC 27002. The
scope of recommendation ISO/IEC 27011:2016 is to define guidelines
supporting the implementation of information security controls in
telecommunications organizations.28
ISO/IEC 27013. Guidance on the integrated implementation of
ISO/IEC 27001 and ISO/IEC 20000-1. ISO/IEC 27013:2015
provides guidance on the integrated implementation of ISO/IEC
27001 and ISO/IEC 20000-1.29
ISO/IEC 27031. Guidelines for information and communications
technology readiness for business continuity. ISO/IEC 27031:2011
describes the concepts and principles of information and
communication technology (ICT) readiness for business continuity
and provides a framework of methods and processes to identify and
specify all aspects, such as performance criteria, design, and
implementation, for improving an organization’s ICT readiness to
ensure business continuity.30
ISO/IEC 27033-1. Network security overview and concepts. ISO/IEC 27033-1:2015 provides an overview of network security and related definitions. It defines and describes the concepts associated with, and provides management guidance on, network security.31
ISO/IEC 27034. Security techniques—Application security—Part
1: Overview and concepts. ISO/IEC 27034 provides guidance to
assist organizations in integrating security into the processes used for
managing their applications.32
ISO/IEC 27035. Information security incident management.
ISO/IEC 27035-1:2016 is the foundation of this multipart
international standard. It presents basic concepts and phases of
information security incident management and combines these
concepts with principles in a structured approach to detecting,
reporting, assessing, and responding to incidents and applying lessons
learned.33
ISO/IEC 27701. Security techniques—Extension to 27001 and
27002 for privacy information management—Requirements and
guidelines. This document specifies requirements and provides
guidance for establishing, implementing, maintaining, and continually
improving a Privacy Information Management System (PIMS) in the
form of an extension to ISO/IEC 27001 and ISO/IEC 27002 for
privacy management within the context of the organization.34
ISO/IEC 27550. Security techniques—Privacy engineering for system life cycle processes. ISO/IEC 27550 provides privacy engineering guidelines intended to help organizations integrate recent advances in privacy engineering into system life cycle processes, where "privacy engineering" may be associated with terms such as:
"privacy by design" and "privacy by default," which are inherently relevant to PC 317 WG1 deliberations; or
"data protection by design" and "data protection by default," used in European regulation and privacy by design approaches, respectively.35
ISO 27799. Information security management in health using
ISO/IEC 27002. ISO 27799:2016 gives guidelines for organizational
information security standards and information security management
practices, including the selection, implementation, and management of
controls taking into consideration the organization’s information
security risk environment(s).36
ISO/IEC 29100: “Security techniques—Privacy Framework”
provides a privacy framework that:
specifies a common privacy terminology;
defines the actors and their roles in processing PII;
describes privacy safeguarding considerations; and
provides references to known privacy principles for
information technology.37
ISO/IEC 29134. Security techniques—Guidelines for privacy
impact assessment. ISO/IEC 29134:2017 gives guidelines for:
a process on privacy impact assessments (PIAs); and
a structure and content of a PIA report.38
ISO/IEC 29151—Security techniques—Code of practice for PII
protection. ISO/IEC 29151:2017 establishes control objectives,
controls, and guidelines for implementing controls to meet the
requirements identified by a risk and impact assessment related to the
protection of PII.39
The NIST guidelines include:
“Framework for Improving Critical Infrastructure Cybersecurity,
Version 1.1.” The “Framework for Improving Critical Infrastructure
Cybersecurity” describes a voluntary risk management framework to
manage cybersecurity-related risk.40
NIST Special Publication (SP) 800-30, Revision 1, “Guide for
Conducting Risk Assessments.” NIST SP 800-30 provides guidance
for conducting risk assessments of information systems and
organizations, amplifying the guidance in SP 800-39.41
NIST SP 800-37, Revision 2, “Risk Management Framework for
Information Systems and Organizations: A System Life Cycle
Approach for Security and Privacy.” NIST SP 800-37 provides
guidelines for applying the Risk Management Framework to
information systems and organizations with a process for managing
security and privacy risk that includes information security
categorization; control selection, implementation, and assessment;
system and common control authorizations; and continuous
monitoring.42
NIST SP 800-39, “Managing Information Security Risk:
Organization, Mission, and Information System View.” NIST SP
800-39 provides guidance for an integrated, organization-wide program
for managing information security risk to organizational operations,
organizational assets, individuals, and other organizations resulting
from the operation and use of information systems.43
NIST SP 800-53, Revision 5, “Security and Privacy Controls for
Information Systems and Organizations.” NIST SP 800-53 provides
a catalog of security and privacy controls for information systems and
organizations to protect organizational operations and assets,
individuals, and other organizations from a diverse set of threats and
risks.44
5.5 Data Privacy and Information Security
Privacy addresses the rights of individuals to control how and to what extent
information about them—their personal information—is collected and further
processed. Information security is about assuring the CIA of information assets.
While privacy and information security overlap, they are also different in certain
respects (see figure 5-6).
Figure 5-6: Privacy versus Information Security
Information security safeguards enable the "authorized" in the "authorized access and use" element that is a cornerstone of the operational definition of privacy. This is the first overlap between privacy and information security.
In addition to the fact that both information security and privacy are data
protection regimens, other areas of overlap are:
Integrity (information security) and accuracy (privacy)
Availability (information security) and access (privacy)
Accountability (both)
Confidentiality (when the data is both personal information and
nonpublic)
Information security’s focus on data integrity overlaps with privacy’s accuracy
requirement in that both ensure the data is not altered without authorization.
Information security’s availability requirement supports privacy’s access
requirement because if the data is not available, it cannot be accessed.
Both information security and privacy doctrines require data owners and
custodians to be responsible for protecting the data in accordance with the
respective protection regimen, which is a form of accountability.
And when the information is both nonpublic and personal information,
confidentiality supports privacy because nonpublic data needs to be kept
nonpublic.
5.5.1 The Disconnects
The reason there is not a complete overlap between privacy and information
security is threefold.
First, privacy has a wider set of obligations and responsibilities than information security's confidentiality, integrity, and availability. Information security does not contain requirements like privacy's collection limitation, openness, relevancy, and use limitation. This means there are issues privacy addresses that information security does not.
The second disconnect is confidentiality. Because personal information is not always nonpublic (consider the phone book), the notion of confidentiality does not always apply.
Also, in a resource-constrained world, if the data is not considered confidential, it is not always valued, and the necessary measures to ensure authorized access and use may be overlooked.
Third, and perhaps most important, while information security techniques can
be privacy-enhancing technologies (PETs) (tools that enable privacy) and are
often necessary, these PETs can also become “feral” if applied incorrectly (i.e., in
an invasive manner). Therefore, you can have security without privacy, but you
cannot have privacy without security.45
Privacy and information security have orthogonal information classification
systems:
Data privacy classifies information into three categories: personal information, sensitive personal information, and nonpersonal information
Information security protects information differently, usually along the
lines of degree of confidentiality: public, confidential, highly
confidential, or restricted
Sometimes, but not always, information that is public, confidential, highly confidential, or restricted/top secret will contain personal information; whether that personal information is itself public, confidential, highly confidential, or restricted/top secret depends on the specifics and context.
In addition, confidentiality is a state determined by two parties regarding how access to some kind of information will be managed. Personal information is more organic: the degree to which data identifies an individual determines its status. If it directly identifies an individual, it is personal; if it retains characteristics of a person without directly identifying them, it is deidentified or pseudonymized; if it has no characteristics of an individual, it is anonymized.
5.5.2 Alignment
Data privacy and information security are both data protection regimes and, as noted, while they have different focuses, they do have significant overlaps. Data privacy professionals have a vested interest in ensuring security controls are implemented and operating effectively. The business partners of both functions also have a vested interest in privacy and security being implemented effectively and efficiently so that assets and people are protected but not unnecessarily encumbered from getting work done.
As the data privacy function within organizations has matured, been translated into controls, and embedded into development processes and reviews, both the overlaps and the distinctions between data privacy and information security have become increasingly clear.
Building partnerships between stakeholders in the privacy and information
security functions is, therefore, essential to ensure consistency, visibility, and
alignment on key elements of the privacy program, including on:
Data breach and incident response plans
Training and awareness initiatives
Vendor due diligence
Privacy impact assessments
Certification frameworks
Data mapping/inventory
With this clarity, the opportunity for the two domains to align and support each other, to the benefit of both, has become increasingly achievable.
A survey commissioned by the IAPP and TrustArc found the nexus of this alignment and support was driven by the mutual goal of preventing or mitigating data breaches.46 Thus, both are interested in driving data minimization, having good data maps and inventories, and ensuring the right controls and measures are in place and assessed. As such, privacy review and assessment processes have often been embedded in secure development life cycle processes.
With this nexus identified, it is not surprising that the same survey found some
of the ways for data privacy and information security programs to align are:
Increased involvement of privacy personnel on information security
teams and vice versa
Employment of core privacy functions with the IT team motivated to
get a better handle on their data and the extent of their corporate risk
Increased investment in privacy technology
Increased use of privacy impact assessments and data inventory and
classification
Increased use of data retention policies
These findings on how privacy and information security support and align with each other were further supported by the "Cisco 2021 Data Privacy Benchmark Study," which found privacy has become a top area of responsibility for security professionals.47
Another reason for this increased alignment is the recognition that many PETs
and standards are, in many cases, information security technologies and standards.
The details of use and implementation are crucial, however.
Finally, driving this increased alignment is the limitation of time and money.
Budgets are limited and so, to maximize efficiency and productivity and to lessen
the burden on themselves and the business, data privacy and information security
have found ways to improve how they work together. To realize better alignment,
consider these four principles:
Teaming—Data privacy teams should work closely with information
security and development teams (e.g., product, IT, web) to evaluate
security controls. Be part of the process. Train colleagues in information
security and development, including quality assurance, on data privacy.
Don’t reinvent—Leverage existing reviews (e.g., System and
Organization Controls for Service Organizations [SOC 1 and SOC 2]
audits, internal audits, pen test, ISO certifications), and review
processes for security assurances whenever possible. Layer in—don’t
become a silo or parallel activity.
Stay aware—Make sure security risks relevant to your organization or
enterprise are part of your privacy risk framework. If security risks are
not part of your risk framework, it is much harder to evaluate and
ensure that controls are in place and correctly implemented.
Rank and prioritize—Not all problems can be solved or mitigated at
once, and having an agreed-upon ranking of risk factors is key to
prioritizing and allocating resources and evaluating outcomes.
5.5.3 Access Control
Access to an organization’s information systems should be tied to an employee’s
role. No employees should have greater information access than is necessary to
perform their job functions.
An access control policy should be established, documented, and reviewed
based on business and security requirements for access.
The privacy team should work with information security and IT to ensure effective access controls. Basic security principles for role-based access control (RBAC) include the following (a brief illustrative sketch follows the list):
Segregation of duties—Ensure one person cannot exploit or gain
access to information inappropriately.
Least privilege—Grant access at the lowest possible level required to
perform the function.
Need-to-know access—Restrict access to only information that is
critical to the performance of an authorized, assigned mission.
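The following minimal Python sketch shows how least privilege and need-to-know can be expressed as a deny-by-default, role-to-permission check. The roles, permissions, users, and resources are hypothetical and exist only to illustrate the principles listed above.

# Minimal RBAC sketch: permissions are granted to roles, users hold roles, and
# access checks refuse anything not explicitly granted (deny by default).
# Role names, permissions, users, and resources are hypothetical.
ROLE_PERMISSIONS = {
    "support_agent": {("customer_record", "read")},
    "billing_analyst": {("customer_record", "read"), ("invoice", "read")},
    "privacy_officer": {("customer_record", "read"), ("dsar_queue", "write")},
}

USER_ROLES = {
    "alice": {"support_agent"},
    "bob": {"billing_analyst"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    """Least privilege: deny by default; allow only what the user's roles grant."""
    granted = set()
    for role in USER_ROLES.get(user, set()):
        granted |= ROLE_PERMISSIONS.get(role, set())
    return (resource, action) in granted

assert is_allowed("alice", "customer_record", "read")
assert not is_allowed("alice", "invoice", "read")            # need-to-know: not her role
assert not is_allowed("alice", "customer_record", "write")   # lowest level of access required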
Guidelines for user access management (also known as identity access
management) include the following:
Unique user IDs; in other words, no group or generic user IDs like
“admin”
Credentials for ID (e.g., smart card, password, two-factor
authentication, machine certificate); in other words, control access
Level of access based on business purpose; in other words, ensure type
of access is in line with business need for the access
Formal logical access process for granting and removing; in other
words, systemic controls (and tracking) how access is given and how it
is revoked
Password management; in other words, ensure passwords are changed
at a regular cadence
Review of user access rights (e.g., privileged accounts, job function
changes, employment termination); in other words, regularly audit
access rights for changes in jobs that indicate change in access needs
User responsibility; in other words, training users with access on their
information security responsibilities and obligations
Users required to follow good security practices in selecting and protecting passwords; in other words, ensure passwords are sufficiently strong and are changed at a regular cadence
Clean desk policy for papers and removable storage media; in other
words, apply physical controls to the data in your possession
5.5.4 Data Classification
To properly protect data, it needs to be classified so that the level of risk of unauthorized use, access, or loss is understood and can guide the information security controls used to protect it. The risk could be:
Financial
Operational
Strategic (reputation/business)
Risk to individuals
Regulatory/legal
Most information security classification schemas use the following categories:
Public—No impact
Confidential—Impact isolated within organization and/or moderate
risk to individuals
Highly confidential—Impact to multiple parts of an organization
and/or significant risk to individuals
Restricted—Enterprise-wide impact to an organization and/or severe
risk to individuals
They have associated definitions for each category based on the risk to the
business in the event of unauthorized access or loss of the data.
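One way organizations operationalize such a schema is to map each classification level to minimum handling requirements that systems and reviewers can check consistently. The following Python sketch is a hypothetical example: the category names follow the list above, but the specific handling requirements are invented for illustration and would differ by organization.

# Hypothetical mapping of classification levels to minimum handling requirements.
# The requirements shown are illustrative defaults, not prescriptive guidance.
HANDLING_REQUIREMENTS = {
    "public": {"encryption_at_rest": False, "access_review_days": None},
    "confidential": {"encryption_at_rest": True, "access_review_days": 180},
    "highly confidential": {"encryption_at_rest": True, "access_review_days": 90},
    "restricted": {"encryption_at_rest": True, "access_review_days": 30},
}

def requirements_for(classification: str) -> dict:
    # Default to the strictest handling when a label is missing or unrecognized.
    return HANDLING_REQUIREMENTS.get(classification, HANDLING_REQUIREMENTS["restricted"])

print(requirements_for("confidential"))
print(requirements_for("unlabeled dataset"))  # falls back to "restricted" handling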
Data privacy traditionally classifies personal information based on whether it is
sensitive personal information or not, which is defined by both policy and law.
Another axis with which to classify personal information is identifiability and
linkability. This becomes useful in calibrating risk, especially when contemplating
big data analytics, finding ways to use data, or articulating how and to what degree
personal information has been deidentified. See table 5-1 for an example of a
classification scheme.
Table 5-1: Classification Scheme Example48
Identifiability (linkability): System characteristics
Identified (linked): Unique identifiers across databases; contact information stored with profile information.
Pseudonymous (linkable with reasonable and automatable effort): No unique identifiers across databases; common attributes across databases; contact information stored separately from profile or transaction information.
Pseudonymous (not linkable with reasonable effort): No unique identifiers across databases; no common attributes across databases; random identifiers; contact information stored separately from profile or transaction information; collection of long-term personal characteristics on a low level of granularity; technically enforced deletion of profile details at regular intervals.
Anonymous (unlinkable): No collection of contact information; no collection of long-term personal characteristics; k-anonymity with large value k.
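The final row of table 5-1 mentions k-anonymity with a large value of k, the property that every combination of quasi-identifying attributes in a dataset is shared by at least k records. The following minimal Python sketch checks that property; the column names, sample records, and choice of quasi-identifiers are assumptions made for illustration.

# Minimal k-anonymity check: every combination of quasi-identifiers must appear
# at least k times. The column names and sample rows are hypothetical.
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Return the smallest group size across all quasi-identifier combinations."""
    counts = Counter(tuple(row[col] for col in quasi_identifiers) for row in rows)
    return min(counts.values()) if counts else 0

records = [
    {"age_band": "30-39", "postcode_prefix": "PO1", "diagnosis": "A"},
    {"age_band": "30-39", "postcode_prefix": "PO1", "diagnosis": "B"},
    {"age_band": "40-49", "postcode_prefix": "PO2", "diagnosis": "A"},
]

k = k_anonymity(records, ["age_band", "postcode_prefix"])
print(f"dataset is {k}-anonymous")  # here k == 1, because the 40-49/PO2 record is unique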
5.6 Privacy Policy and Technical Controls
For data protection, controls need to be implemented with privacy in mind.
Required (or suggested) administrative or policy controls for privacy can be
found in four areas. See table 5-2 for examples.
Table 5-2: Policy Control Examples
Type: Laws and regulations. Source: GDPR, right to erasure. Control: Delete data upon request. Implementation: Have a delete button that actually deletes data.
Type: Self-regulatory regime. Source: Payment Card Industry Data Security Standard (PCI DSS). Control: Encrypt cardholder data. Implementation: Use AES 256 in transit.
Type: Industry practices. Source: Generally Accepted Privacy Principles (GAPP). Control: Get explicit consent for sensitive data. Implementation: Require opt-in selection for specified uses.
Type: Corporate ethos/policy. Source: Google's former motto, "Don't be evil." Control: Do not be deceptive in search results. Implementation: Always clearly identify advertising as a "sponsored link."
It is the policy that dictates the controls, which, in turn, establish what
mechanism or process must be implemented to ensure the control is enabled.
Technical controls fall into four main areas:49
Obfuscation—Personal data is made obscure, unclear, or unintelligible (e.g., masking, tokenization, randomization, noise, hashing); a brief sketch of masking and keyed hashing follows this list
Data minimization—The collection of personal information is limited
to that which is directly relevant and necessary to accomplish a
specified purpose (e.g., granulation, data segregation, deletion,
deidentification, aggregation)
Security—Protective privacy measures are used to prevent
unauthorized access (e.g., encryption, access controls for physical and
virtual systems, data loss management, destruction, auditing, testing)
PETs—Technologies that ensure engineered systems provide acceptable levels of privacy (e.g., secure multiparty computation, homomorphic encryption, differential privacy, mix networks, anonymous digital credentials)
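As a concrete illustration of the obfuscation techniques listed above, the following minimal Python sketch shows masking for display and pseudonymization with a keyed hash. The field value and key handling are assumptions made for the example; a real deployment would keep the key in a managed secrets store and choose techniques based on a documented risk assessment.

# Minimal obfuscation sketch: masking a value for display and pseudonymizing it
# with a keyed hash (HMAC). The sample value and key handling are illustrative;
# production systems would manage the key in a secrets manager, not in code.
import hashlib
import hmac

PSEUDONYMIZATION_KEY = b"replace-with-a-managed-secret"

def mask_email(email: str) -> str:
    """Masking: keep enough to be recognizable to the user, hide the rest."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

def pseudonymize(value: str) -> str:
    """Keyed hashing: a stable token usable for joins without exposing the value."""
    return hmac.new(PSEUDONYMIZATION_KEY, value.encode(), hashlib.sha256).hexdigest()

print(mask_email("jane@example.com"))          # j***@example.com
print(pseudonymize("jane@example.com")[:16])   # stable pseudonymous token (truncated)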
Data Destruction
One important way to protect personal information and privacy is to destroy
personal information when it is no longer needed. A policy to reflect these
requirements and appropriate guidance to support this should be created and
implemented.
Two ways of electronically destroying data are overwriting and degaussing.
Three ways of physically destroying data are shredding, melting, and burning.
Regardless of the methodology selected, privacy professionals should work with
their data retention functions so agreed-upon policies, standards, and guidelines
are in place to ensure personal information is destroyed when it is supposed to
be destroyed.50
It is important to be precise and expansive when giving direction on deleting. For instance, a hard delete is not the same as a soft delete. Also, backups, caches, and nonvolatile storage should be considered and included in the directions.
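To make the hard-delete versus soft-delete distinction concrete, the following minimal Python sketch contrasts the two against an in-memory database. The schema and data are hypothetical, and, as noted above, deletion directions must also cover backups, caches, and nonvolatile storage; even a hard delete can leave residues at the storage layer until space is reclaimed.

# Minimal sketch contrasting a soft delete (record flagged, personal data still
# stored) with a hard delete (row actually removed). Schema and data are
# hypothetical; backups, caches, and downstream copies need their own handling.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, deleted_at TEXT)")
conn.execute("INSERT INTO customers (id, email) VALUES (1, 'jane@example.com')")

# Soft delete: the application hides the row, but the personal data remains stored.
conn.execute("UPDATE customers SET deleted_at = datetime('now') WHERE id = 1")
print(conn.execute("SELECT email FROM customers WHERE id = 1").fetchone())  # still present

# Hard delete: the row, and the personal data in it, is removed from the table.
conn.execute("DELETE FROM customers WHERE id = 1")
print(conn.execute("SELECT email FROM customers WHERE id = 1").fetchone())  # None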
5.7 Summary
Protecting personal information is an ongoing effort. Many failures are related to an inability to imagine the worst or to understand evolving technologies. It is important to stay abreast of new technologies, to ensure that system, product, and application updates are reviewed, and to confirm whether new or different privacy controls are needed. As a privacy professional, you will need not only to help design and engineer privacy, but also to design and engineer processes that will enable easy adoption of change.
Endnotes
1 Ann Cavoukian, “Privacy by Design: The 7 Foundational Principles, Information and Privacy
Commissioner,” accessed November 2018, https://iab.org/wp-content/IAB-
uploads/2011/03/fred_carter.pdf.
2 Kashmir Hill, “Why ‘Privacy by Design’ Is the New Corporate Hotness,” Forbes, July 28, 2011, accessed
November 2018, www.forbes.com/sites/kashmirhill/2011/07/28/why-privacy-by-design-is-the-new-
corporate-hotness/.
3 GDPR, Article 25, accessed November 2018, www.privacy-regulation.eu/en/article-25-data-protection-
by-design-and-by-default-GDPR.htm.
4 GDPR, Recital 78, accessed November 2018, www.privacy-regulation.eu/en/r78.htm.
5 GDPR, Article 5, accessed November 2018, www.privacy-regulation.eu/en/article-5-principles-relating-
to-processing-of-personal-data-GDPR.htm.
6 Ann Cavoukian, PhD, Stuart Shapiro, PhD, R. Jason Cronk, "Privacy Engineering: Proactively
Embedding Privacy, by Design,” January 2014, https://www.ipc.on.ca/wp-
content/uploads/resources/pbd-priv-engineering.pdf.
7 Jonathan Fox, Michelle Dennedy, and Tom Finneran, The Privacy Engineer’s Manifesto: Getting from Policy to
Code to QA to Value (Apress Media, 2014), 71.
8 GDPR Article 32 includes the notion of “resiliency” when referencing confidentiality, integrity, and
availability. Resiliency is the ability to quickly recover. An example of this would be a failover server, which
replaces another server as soon as it goes down.
9 ISACA Cybersecurity Fundamentals Glossary, accessed November 2018,
https://www.isaca.org/Knowledge-
Center/Documents/Glossary/Cybersecurity_Fundamentals_glossary.pdf.
10 John Emmitt, “Top 10 Cybersecurity Threats in 2021 and How to Protect Your Business,” December 23,
2020, https://securityboulevard.com/2020/12/top-10-cybersecurity-threats-in-2021-and-how-to-
protect-your-business/.
11 Definition: “Control,” ISACA Cybersecurity Fundamentals Glossary, accessed November 2018,
https://www.isaca.org/Knowledge-
Center/Documents/Glossary/Cybersecurity_Fundamentals_glossary.pdf.
12 Definition: “Preventive Control,” ISACA Glossary, accessed November 2018, https://www.isaca
.org/Pages/Glossary.aspx?tid=1698&char=P.
13 Definition: “Detective Control,” ISACA Glossary, accessed November 2018,
https://www.isaca.org/Pages/Glossary.aspx?tid=1322&char=D.
14 Definition: “Corrective Control,” ISACA Glossary, accessed November 2018, https://www.isaca
.org/Pages/Glossary.aspx?tid=1265&char=C.
15 Travis Breaux, Introduction to Privacy for Technology Professionals (Portsmouth, NH: IAPP, 2020).
16 Breaux, Introduction to Privacy for Technology Professionals.
17 Breaux, Introduction to Privacy for Technology Professionals.
18 ISO/IEC 27002:2013, "Information Technology—Security Techniques—Code of Practice for Information
Security Controls,” ISO, accessed November 2018, https://www.iso.org/standard/54533.html.
19 ISO/IEC 27000:2018, “Information Technology—Security Techniques—Information Management
Systems—Overview and Vocabulary,” ISO, accessed November 2018,
https://www.iso.org/standard/73906.html.
20 ISO/IEC 27001, “Information Security Management,” ISO, accessed November 2018,
https://www.iso.org/isoiec-27001-information-security.html.
21 ISO/IEC 27002:2013, “Information Technology—Security Techniques—Code of Practice for Security
Controls,” ISO, accessed November 2018, https://www.iso.org/standard/54533.html.
22 ISO/IEC 27003:2017, “Information Technology—Security Techniques—Information Security
Management Systems—Guidance,” ISO, accessed November 2018,
https://www.iso.org/standard/63417.html.
23 ISO/IEC 27004:2016, “Information Technology—Security Techniques—Information Management—
Monitoring, Measurement, Analysis and Evaluation,” ISO, accessed November 2018,
https://www.iso.org/standard/64120.html.
24 ISO/IEC 27005:2018, "Information Technology—Security Techniques—Information Security Risk
Management,” ISO, accessed November 2018, https://www.iso.org/standard/75281.html.
25 ISO/IEC 27006:2015, “Information Technology—Security Techniques—Requirements for Bodies
Providing Audit and Certification of Information Security Management Systems,” ISO, accessed
November 2018, https://www.iso.org/standard/62313.html.
26 ISO/IEC 27009, “Information Security, Cybersecurity and Privacy Protection—Sector-specific
Application of ISO/IEC 27001—Requirements,” accessed April 2021,
https://www.iso.org/standard/73907.html.
27 ISO/IEC 27010, “Information Technology—Security Techniques—Information Security Management
for Inter-sector and Inter-organizational Communications,” accessed April 2021,
https://www.iso.org/standard/68427.html.
28 ISO/IEC 27011, “Information Technology—Security Techniques—Code of Practice for Information
Security Controls Based on ISO/IEC 27002 for Telecommunications Organizations,” accessed April 2021,
https://www.iso.org/standard/64143.html.
29 ISO/IEC 27013, “Information Technology—Security Techniques—Guidance on the Integrated
Implementation of ISO/IEC 27001 and ISO/IEC 20000-1,” accessed April 2021, https://www.iso
.org/standard/64138.html.
30 ISO/IEC 27031, “Information Technology—Security Techniques—Guidelines for Information and
Communications Technology Readiness for Business Continuity,” accessed April 2021,
https://www.iso.org/standard/44374.html.
31 ISO/IEC 27033-1, “Information Technology—Security Techniques—Network Security—Part 1:
Overview and Concepts,” accessed April 2021, https://www.iso.org/standard/63461.html.
32 ISO/IEC 27034, “Information Technology—Security Techniques—Application Security—Part 1:
Overview and Concepts,” accessed April 2021, https://www.iso.org/standard/44378.html.
33 ISO/IEC 27035, “Information Technology—Security Techniques—Information Security Incident
Management—Part 1: Principles of Incident Management,” accessed April 2021,
https://www.iso.org/standard/60803.html.
34 ISO/IEC 27701, “Security Techniques—Extension to ISO/IEC 27001 and ISO/IEC 27002 for Privacy
Information Management—Requirements and Guidelines,” accessed April 2021,
https://www.iso.org/standard/71670.html.
35 ISO/IEC 27550, “Information Technology—Security Techniques—Privacy Engineering for System Life
Cycle Processes,” accessed April 2021, https://www.iso.org/standard/72024.html.
36 ISO 27799, “Health Informatics—Information Security Management in Health Using ISO/IEC 27002,”
accessed April 2021, https://www.iso.org/standard/62777.html.
37 ISO/IEC 29100, “Information Technology—Security Techniques—Privacy Framework,” accessed April
2021, https://www.iso.org/standard/45123.html.
38 ISO/IEC 29134, “Information Technology—Security Techniques—Guidelines for Privacy Impact
Assessment,” accessed April 2021, https://www.iso.org/standard/62289.html.
39 ISO/IEC 29151, “Information Technology—Security Techniques—Code of Practice for Personally
Identifiable Information Protection,” accessed April 2021, https://www.iso.org/obp/ui/#iso:std:iso-
iec:29151:ed-1:v1:en.
40 “Framework for Improving Critical Infrastructure Cybersecurity, Version 1.1,” NIST, accessed April 2021,
https://doi.org/10.6028/NIST.CSWP.04162018.
41 NIST SP 800-30, Revision 1, “Guide for Conducting Risk Assessments,” NIST, accessed April 2021,
https://doi.org/10.6028/NIST.SP.800-30r1.
42 NIST SP 800-37, Revision 2, “Risk Management Framework for Information Systems and Organizations:
A System Life Cycle Approach for Security and Privacy,” NIST, accessed April 2021,
https://doi.org/10.6028/NIST.SP.800-37r2.
43 NIST SP 800-39, “Managing Information Security Risk: Organization, Mission, and Information System
View,” NIST, accessed April 2021, https://doi.org/10.6028/NIST.SP.800-39.
44 NIST SP 800-53, Revision 5, “Security and Privacy Controls for Information Systems and Organizations,”
NIST, accessed April 2021, https://doi.org/10.6028/NIST.SP.800-53r5.
45 Fox, Dennedy, and Finneran, The Privacy Engineer’s Manifesto, 71.
46 IAPP and TRUSTe, How IT and InfoSec Value Privacy, March 2016, IAPP,
https://iapp.org/resources/article/how-it-and-infosec-value-privacy/.
47 Cisco 2021 Data Privacy Benchmark Study, Forged by the Pandemic: The Age of Privacy, accessed April
2021, https://www.cisco.com/c/dam/en_us/about/doing_business/trust-center/docs/cisco-privacy-
benchmark-study-2021.pdf.
48 R. Jason Cronk, “Embedding Privacy by Design,” recorded December 2, 2016, IAPP: Portsmouth, NH,
web conference, https://iapp.org/store/webconferences/a0l1a000002m05dAAA/.
49 Cronk, “Embedding Privacy by Design.”
50 Bob Violino, "The In-Depth Guide to Data Destruction," February 6, 2012, CSO from IDG, accessed
November 2018, https://www.csoonline.com/article/2130822/it-audit/the-in-depth-guide-to-data-
destruction.html.
CHAPTER 6
Privacy Operational Life Cycle: Protect: Policies
Edward Yakabovicz, CIPP/G, CIPM, CIPT
Policies provide a deliberate system of principles to guide decisions by dictating a
course of action and providing clear instructions for implementation through
procedures, protocols, or guidance documents. Policies and compliance are more than box-ticking and documentation; they are used daily by busy employees trying to get their jobs done.1 The business must define and verify the usefulness of the policies, then revise them to meet ever-changing regulatory requirements and business needs. For example, the global market disruptions caused by coronavirus disease 2019 (COVID-19) created significant data privacy challenges, including remote work for business operations and health care data privacy disputes. As the regulatory environment changes and adapts to new ways of doing business, so must the practices of each organization, which needs to stay aware of and adapt to the demands of different jurisdictions locally, nationally, and globally.
The objective of this chapter is to review the basic construct of an organizational
policy specific to privacy management. It will define the components of a privacy
policy and practices necessary to make that policy successful by discussing the
importance of communication and the way other organizational policies support
and reinforce the privacy policy.
6.1 What Is a Privacy Policy?
A privacy policy governs the privacy goals and strategic direction of the organization's privacy office. As discussed in chapter 2, it is important that the organization first develop a privacy vision or mission statement that aligns with its overall business strategy and objectives; this is the foundation of privacy program management. These statements help guide the management of the privacy program and the allocation of resources to support the program. They also serve as the foundation for developing effective privacy policies. Depending on the industry and the organization's customers, the privacy policy could also be dictated by laws and regulations or by industry standards.
Policies become difficult to create if there is no definition of how they influence
and can be used by an organization. They should be considered at the highest level
of governance for any organization. In addition, they should be clear and easy to
understand, accessible to all employees, comprehensive yet concise, action-
oriented, and measurable and testable. Policies should align to organizational
standards for format, structure, and intent to meet organizational goals.
The privacy policy is a high-level policy that supports documents such as
standards and guidelines that focus on technology and methodologies for meeting
policy goals through manuals, handbooks, and/or directives. Examples of
documents supported by the privacy policy may include:
Organization standards, such as uniforms, identification badges, and
physical building systems
Guidelines on such topics as the use of antivirus software, firewalls, and
email security
Procedures to define and then describe the detailed steps employees
should follow to accomplish tasks, such as hiring practices and the
creation of new user accounts2
The privacy policy (sometimes referred to by organizations as a “privacy
notice”) also supports a variety of documents communicated internally and
externally that:
Explain to customers how the organization handles their personal
information
Explain to employees how the organization handles their personal
information
Describe steps for employees handling personal information and their
responsibilities
Outline how personal data will be processed
6.2 Privacy Policy Components
Although policy formats will differ from organization to organization, a privacy
policy should include the following components:
Purpose—This component explains why the policy exists, as well as
the goals of the privacy policy and program, which could be used to
meet a privacy standard based on national, regional, or local laws. This
component could also meet other nonbinding standards or frameworks
that answer the needs of the organization.
Scope—Scope defines which resources (e.g., facilities, hardware and
software, information, personnel) the policy protects.
Applicability—Who the policy applies to (e.g., customers, employees,
contractors, third parties).
Roles and responsibilities—This section assigns privacy
responsibilities to roles throughout the organization, typically overseen
by a privacy program office or manager. The responsibilities of leaders,
managers, employees, contractors, vendors, and all users of the data at
the operations, management, and use levels are delineated. Most
importantly, this component serves as the basis for establishing all
employee and data user accountability.
Compliance—Compliance issues are a main topic in privacy policy.
Sometimes they are found in the relevant standard, such as the
applicable data protection law, and are not written into the
organization’s privacy policy document. Potential compliance factors
include the following:
General organization compliance to ensure the privacy
policy assigns roles and responsibilities at the proper level in
the organization to create an oversight group. This group has
responsibility for monitoring compliance with the policy,
conducting enforcement activities, and aligning with the
organization’s priorities.
The ability to apply penalties, sanctions, and disciplinary actions, with authorization to create compliance structures that may include disciplinary actions for specific violations. For example, penalties and sanctions should be outlined in a table to be used as a guide in administering discipline, helping ensure that like disciplinary action is taken for like offenses. These measures ensure fair and equal treatment across the organization through published standards available to all employees. It is good practice to keep a record of any actions taken for noncompliance with policies.
Understanding of the penalties or sanctions for noncompliance
with laws and regulations. Legal and regulatory penalties are typically
imposed within any industry to enforce behavior modification needed
to rectify previous neglect and lack of proper data protection. Privacy is
no different; organizations are now held accountable for protecting the
privacy of the data with which they have been entrusted. As penalties
and sanctions for violation of privacy laws and regulations become
more serious, the privacy professional must be prepared to address,
track, and understand any penalty that could affect the organization.
Written and implied rules have become part of the compliance discussion: written rules are those found in standards and regulations, while implied rules related to those standards or regulations have yet to keep pace with industry and societal needs. The COVID-19 pandemic continues to change the privacy landscape, including health care records related to inoculations; social impacts, such as contact tracing; and the use of telehealth and other newer virtual technologies.
The privacy policy should not be confused with detailed process manuals and
practices that are typically outlined in standards, guidelines, handbooks, and
procedures documents. Remember, the privacy policy is the high-level
governance that aligns with the privacy vision or mission statement of the
organization. However, it is good practice to include references to other applicable
policies and procedures.
6.2.1 Privacy Notice versus Privacy Policy
To avoid confusion between a privacy notice and a privacy policy, the
following distinction is offered:
A privacy policy is an internal document addressed to employees and data
users. This document clearly states how personal information will be handled,
stored, and transmitted to meet organizational needs, as well as any laws or
regulations. It will define all aspects of data privacy for the organization, including
how the privacy notice will be formed, if necessary, and what it will contain.
A privacy notice is an external communication to individuals, customers, or
data subjects to create transparency in how the organization collects, uses, shares,
retains, and discloses their personal information, based on the organization's privacy
policy.
This is discussed in more detail in chapter 9.
6.3 Interfacing and Communicating with an
Organization
Protecting personal information and building a program that drives privacy
principles into the organization cannot be the exclusive job of the privacy officer
or the privacy team, any more than playing a symphony is the exclusive
responsibility of the conductor. As with an orchestra, many people, functions, and
talents will merge to execute the privacy vision or mission of the organization.
Many organizations create a privacy committee or council composed of
stakeholders or representatives of functions. These individuals may launch the
privacy program and manage it throughout the privacy policy life cycle. They can
be instrumental in making strategic decisions that may affect the vision, change
key concepts, or determine when alterations are needed and act as a great
additional resource for the privacy function. Because of their experience and
knowledge, they play a critical role in communicating the privacy policy, which is
almost as important as having a solid privacy policy; without informed
communications, the policy will simply sit on a shelf or hard drive.
Organizations with a global footprint often create a governance structure
composed of representatives from each business function and every geographic
region in which the organization has a presence to ensure that proposed privacy
policies, processes, and solutions align with local laws and are tailored to them as
necessary. This governance structure also provides a communication chain, both
formally and informally, that the privacy professional should continue to use in
performing key data protection activities.
6.4 Communicating the Privacy Policy within
the Organization
The internal communications plan is an important planning tool to help educate
the organization on privacy, provide awareness, and provide updates and
guidance. The plan builds employee awareness of the risks they pose and the
practices they can use to change their behavior. This awareness is important for
every employee, from their first day of employment through their last, to ensure a
full understanding of the organization's data privacy standards. This approach
will help to reduce confusion and improve
transparency. This formal plan keeps everyone in the loop to avoid gaps in internal
messaging. It also helps define the organization’s goals and overall privacy intent.
The privacy program management team should answer the following questions
when developing an effective internal communications plan:
What is the purpose of the communication? For example, does it simply
communicate the existence of a policy or spread knowledge about the
policy, or is it intended to train employees and cause behavior
modification concerning privacy?
Should there be a recurring time slot assigned on the communications
calendar dedicated to particular messaging? For example, Data Privacy
Day in January, the EU General Data Protection Regulation (GDPR)
anniversary in May, security threat awareness around the summer
holidays and Christmas (periods when security breaches are more likely
to occur), or privacy team milestones, objectives, and end-of-year reviews?
Is a communication necessary, or can the messaging be covered in more
dedicated guidance documents or training?
How will the privacy team work with the communications team? What
methods, such as meetings, phone calls, and conference calls, will be
used?
Who is the audience for the communication? Are there different
potential user groups, such as production or administrative staff,
managers, and vendors?
What existing communication methods, such as a company intranet,
can be employed? What assets, such as posters, digital screens,
animated videos, flyers, mouse pads, and other awareness tools, can be
considered as alternative ways of presenting the privacy message to
boost engagement?
Which functional areas most align with the privacy program, and how
should one best communicate with each? For example, production,
administrative, information assurance, cybersecurity (sometimes called
information security or infosec), and human resources (HR) may all
need to be in close coordination.
What is the best way to motivate training and awareness for the
organization? Which metrics are best for tracking effectiveness and
demonstrating the return on investment?
Has the privacy team conducted a privacy workshop or training for
stakeholders to define privacy for the organization, explain the market
expectations, answer questions, and reduce confusion?
Communications should include the formal privacy policy to help ensure
everyone in the organization, including third-party service providers, receives the
same guidance and adheres to the same privacy mission and vision. This will
reduce confusion and improve transparency by offering tools to improve
awareness and educate the organization. Consequently, a proper communication
plan will also ensure employee confidence to use the tools and practices necessary
to protect the organization and customer data privacy.
The program manager should always look for ways to improve how the
privacy function communicates key topics with the business and, to that end, should
obtain feedback. This can provide honest, constructive, and creative ideas to boost
engagement and employee awareness, which, in turn, can help to prevent or better
identify privacy risks.
6.5 Policy Cost Considerations
Several potential costs are associated with developing, implementing, and
maintaining policies. The most significant relate to implementing the policy and
addressing its impacts on the organization, since the policy may limit, reduce,
remove, or change the way data is handled and protected. Historically, privacy has
been governed by information security, but it now requires resources assigned
directly to privacy management. The establishment and upkeep of a privacy
management program come at a nonnegligible cost to the organization and its
people. Limiting any business function has a direct and sometimes measurable
impact on an employee's or a data user's ability to perform certain tasks. The
privacy professional should be cognizant that all changes made to any policy affect
the organization.3
Other costs are incurred through the policy development and management
process. Administrative and management functions are required to draft, develop,
finalize, and then update the policy. Beyond that, the policy must be disseminated
and then communicated through training and awareness activities. Although the
cost of policy management is unavoidable, in most cases because of regulations,
there must be a balance between practical protections that meet applicable
regulations and laws, the organization's privacy vision or mission, and the
organization's need to perform its intended business transactions.4
Proactive reviews on the successful implementation of the privacy policy should
be carried out by the privacy function on a regular basis, in particular to identify
any areas of noncompliance.
6.6 Design Effective Employee Policies
Employees should be encouraged and empowered to make the right decisions for
the right purposes, while leadership must prioritize protection of customer data
over short-term gains and support employees' decisions.5 Effective employee
policies are comprehensive policies and procedures that include instructions on
reporting violations or impermissible uses, while providing transparency and
support through the process without reprisal. Employee policies must also contain
onboarding and exit procedures that ensure full awareness of the organization's
privacy intent while protecting against misappropriation of knowledge and data
upon termination of contracts and employment. This process should specifically
highlight the privacy and confidentiality clauses included in employment contracts,
which continue to apply after the employee leaves the organization, so departing
employees are reminded of their obligations.
Comprehensive privacy policies must align with employee policies in
supporting documents, including additional policies that respond to the needs
and intent of the organization to fix an issue, serve a specific purpose, or meet a
specific goal. Higher-level policies and procedures include items such as security
configurations and responsibilities, while examples of those that address issues
include behavior modification, proper usage of organization property, newer
technology threats, social media use, email use, and internet use. Documents
addressing these issues should be reviewed and updated regularly. Regardless of
the intent, supporting policies may contain the following elements.
Issue/objective statement—To formulate a policy on an issue, the
information owner/steward must first define the issue with any relevant
terms, distinctions, and conditions included. It is often useful to specify
the goal or justification for the policy to facilitate compliance. For
example, an organization might want to develop an issue-specific policy
on the use of “unofficial software,” which might be defined to mean any
software not approved, purchased, screened, managed, or owned by the
organization. The applicable distinctions and conditions might then
need to be included for some software, such as software privately owned
by employees but approved for use at work or owned and used by other
businesses under contract to the organization.
Statements of the organization’s position—Once the issue is stated
and related terms and conditions are detailed, this section is used to
clearly state the organization’s position (i.e., management’s decision) on
the issue. In the previous example, this would mean stating whether the
use of unofficial software as defined is prohibited in all or some cases;
whether there are further guidelines for approval and use; or whether
case-by-case exceptions may be granted, by whom, and on what basis.
Applicability—Issue-specific policies also need to include statements
of applicability. This means clarifying where, how, when, to whom, and
to what a policy applies. For example, it could be that the hypothetical
policy on unofficial software is intended to apply only to the
organization’s own on-site resources and employees and not to
contractors with offices at other locations. Additionally, the policy’s
applicability might need to be clarified as it pertains to employees
traveling among different sites, working from home, or needing to
transport and use hardware at multiple sites.
Roles and responsibilities—The assignment of roles and
responsibilities is also usually included in issue-specific policies. For
example, if the policy permits employees to use privately owned,
unofficial software at work with the appropriate approvals, then the
approval authority granting such permission would need to be stated.
(The policy would stipulate who, by position, has such authority.)
Likewise, the policy would need to clarify who would be responsible for
ensuring that only approved software is used on organizational system
resources and, possibly, for monitoring users regarding unofficial
software.
Compliance—Some types of policies may describe unacceptable
infractions and the consequences or repercussions of such behaviors in
greater detail. Penalties may be explicitly stated and consistent with
organizational personnel policies and practices. When used, they can be
coordinated with appropriate officials, offices, and even employee-
bargaining units. A specific office in the organization may be tasked with
monitoring compliance.
Points of contact and supplementary information—For any issue-
specific policy, indicate the appropriate individuals to contact in the
organization for further information, guidance, and compliance. Since
positions tend to change less often than the individuals occupying
them, specific positions may be preferable as the point of contact. For
example, for some issues, the point of contact might be a line manager;
for others, it might be a facility manager, technical support person,
system administrator, or security program representative. Using the
above example once more, employees would need to know whether the
point of contact for questions and procedural information would be
their immediate superior, a system administrator, or an information
security official.
Many offices in the organization may be responsible for selecting, developing,
updating, and finalizing policies and all supporting documents, including the
privacy office, legal, HR, and information security. This distribution of
responsibility helps ensure a clear and accurate policy that meets the needs of the
organization and any regulatory or external standards.
The following section presents several high-level examples of supporting
documents that affect data protection and the privacy vision or mission of the
organization, including materials on acceptable use, information security,
procurement, and data retention and destruction. These represent only a small
subset of topics that can be considered as privacy-supporting policies.
6.6.1 Acceptable Use Policies
An acceptable use policy (AUP) stipulates rules and constraints for people within
and outside the organization who access the organization’s mobile devices,
computers, network, and internet connection. It outlines acceptable and
unacceptable use of organization resources that are shared across the organization
for a common good or goal. The user agrees to the AUP either in writing or in
electronic form, such as a banner at logon or a pop-up text box that must be
accepted before continuing. An AUP can be thought of as a terms-of-service or
end-user license agreement that sets standards of behavior for the proper use of
the organization's resources.
An example is an AUP for guest wireless access, which applies to use of the
organization's network or internet connections and to which the user agrees in
written or electronic form. Many organizations provide education and awareness
to individuals during employment onboarding or upon their gaining access to
buildings and facilities, where users must sign the AUP before access is granted.
Other times, the user is given network or internet connection details and agrees to
the terms each time they sign in or log on.
Violations typically lead to the same results for any user: loss of use and/or
punitive action by the organization or, if necessary, by law enforcement. Since
most AUP documents include a notice of monitoring, the user, in agreeing to the
AUP terms, also accepts a privacy notice that details monitoring and logging.
Typical users include but are not limited to employees, students, guests,
contractors, and vendors.
The information security function usually plays a major role in developing AUPs
and aligning them with organization policies that consider the following:
Organization, customer, and other user’s privacy—Intent and use
for the organization to conduct business
Legal protections (e.g., copyright)—AUPs protect the company from
legal harm through user acceptance for proper use of systems
Integrity of computer systems (e.g., anti-hacking rules)—Ensures
the user doesn’t use any hacking tools to disrupt or degrade the
organization resources
Ethics—Enforces illegal or unacceptable practices, such as gambling
and other illicit activities
Laws and regulations—Enforce legal practices and inform users of
consequences if broken
Others’ network access—Limits use to only the approved user and
reduces sharing resources with non-approved users
Routing patterns—Keep company confidential information secure
and reduced hacking data
Unsolicited advertising and intrusive communications—Reduce
wasting organization resources for unsolicited use
User responsibilities for damages—Assigns penalties and responsibly
on the user for all prohibited actions that harm the organization
Security and proprietary information—Protection of organization
resources and data
Virus, malware protection, and malicious programs—Reduce harm
to organization by mitigating cybersecurity threats
Safeguards (e.g., scanning, port scanning, monitoring) against
security breaches or disruptions of network communication—
Identify and inform the user of security tools that may run to protect
the organization from cybersecurity threats; also, limits those tools to
only organization-approved applications
The AUP is a living document that will continue to be modified and updated as
the privacy standards and regulations change to keep pace with IT, social media,
and other challenges. Because of ongoing technology, security, and privacy
changes, the AUP should be reviewed by all users on a yearly basis through
documented formal training and awareness sessions.
6.6.2 Information Security Policies: Access and Data
Classification
Internal information security policies focus on protecting the organization from
internal and external threats through use of IT methods and practices. Specific
purposes relate to the industry type, physical location, standards, and regulatory
requirements aligned directly to the organization goals and mission. These
include:
To protect against unauthorized access to data and information systems
To provide stakeholders with information efficiently, while
simultaneously maintaining confidentiality, integrity, and availability
(CIA)
To promote compliance with laws, regulations, standards, and other
organizational policies
To promote data quality and security locally
The information security policies will focus on the purpose, scope, and objectives
of the organization to establish what will be done to protect the data and
information stored on organization systems. These include but are not limited to
the following (a brief configuration sketch follows the list):
Administrative responsibilities—Which users may perform
administrative tasks, such as user management, password changes, user
provisioning, and system changes
Antivirus and malware policies—Enforce antivirus and malware
protections with mandatory applications, configurations, and update
schedule
Email policies—Identify best practices, proper email security, use of
email systems
Firewall rules and use—Protections of organization confidential
settings, administrative controls, access to internet connections, control
of data into and out of the organization
Internet policies—Proper and restricted use of the internet purpose
and intent
Intrusion detection—Mandatory organization tools, configuration,
and reporting; rules for after an intrusion event for remediation events
Loading of software (authorized and unauthorized)—Organization
rules to buy, load, and use software on organization resources; proper
rules for control of software
Monitoring and logging of all technology use to include local and
internet—User notification of monitoring and logging, user
acceptance, and reporting
Wiping of audit logs and other associated logs—Mandatory
configuration and access to logs
Risk assessments—Purpose, intent, schedule, roles, and
responsibilities
User and password policies—Mandatory password length, use,
sharing, time interval, history, and reporting
Use of security tools, such as pen testing and others—Roles and
responsible use, intent, reporting, and management
Wireless management—Purpose, intent, configuration, control,
administrative rights, and management
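To make the elements above more concrete, the following is a minimal sketch, assuming hypothetical parameter names and values, of how a security team might record a few of the password and monitoring requirements in machine-readable form alongside the written policy; it is an illustration only, not a prescribed configuration or a requirement from any standard.

# Minimal sketch (Python): selected information security policy parameters
# captured as configuration. All names and values are illustrative assumptions.
PASSWORD_POLICY = {
    "min_length": 12,          # mandatory password length
    "max_age_days": 90,        # time interval before a forced change
    "history_depth": 10,       # previously used passwords that cannot be reused
    "sharing_allowed": False,  # passwords may never be shared
}

MONITORING_POLICY = {
    "log_all_technology_use": True,   # local and internet use is logged
    "user_notified_of_monitoring": True,
    "audit_log_retention_days": 365,  # wiping of and access to logs is restricted
}

def password_meets_policy(candidate, previous_passwords):
    """Check a candidate password against the length and history rules above."""
    recent = previous_passwords[-PASSWORD_POLICY["history_depth"]:]
    return len(candidate) >= PASSWORD_POLICY["min_length"] and candidate not in recent

Keeping such parameters in one place can make it easier to audit the deployed technical controls against the written policy.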
Information security and privacy should not be confused with each other, even
though they overlap in responsibilities for data protection. In basic terms, they
form a symbiotic relationship to protect the organization's and customers' data:
information security provides data authorizations and access control, while
privacy allows for and controls explicit and informed consent. The associated
policies will reflect the same.
6.7 Procurement: Engaging Vendors
By organizational standards and policy, vendors should be held to the same
privacy standards as the organization's employees. Vendors often have more data
access than employees, or the data may be moved to their operational
environment; thus, additional consideration should be given to the vendor
relationship, contract, and legal obligations to ensure they meet the organization's
privacy policy and any applicable industry standard or regulatory requirement. As
the IT boundaries of the organization disappear with the use of external storage
and processing in the cloud, there must be a strategy and vision for how the
organization will remain in control of data privacy, as well as how that data will
remain secure and protected. Vendor agreements must be reviewed and updated
on a regular basis to ensure social, technical, industry, and regulatory impacts are
considered and addressed, as nothing should ever be implied.
When engaging vendors, an organization may:
Identify vendors and their legal obligations
Evaluate risk, policies, and server locations
Develop a thorough contract
Monitor vendors’ practices and performance
Use a vendor policy
An organization must exercise similar if not stricter due diligence as part of
mergers, acquisitions, and divestitures. These will be further addressed in the
following sections with more details and discussion. Additional information on
these can be found in chapter 4.
6.7.1 Create a Vendor Policy
Vendor policies should guide an organization in working with third parties from
procurement through termination. Policy components may include requirements
for vendors, logistics (e.g., where work should be conducted), and onboarding
and employee training. A vendor policy may require identification and inventories
of all vendors and entry points, such as free survey tools, personal information the
vendor can access, and legal obligations on the organization and vendor. The
vendor policy may stipulate that the procuring organization evaluate its processes
for risk assessment, risk profile, and categories of vendors based on risk. This may
include evaluating the vendor’s internal policies; affiliations and memberships
with other organizations; mandatory and nonmandatory certifications; location
of data servers; and data storage, use, and transport. Another consideration to
evaluate is the purchase of privacy/cyber insurance to cover any unmitigated or
unknown risk.
6.7.2 Develop a Vendor Contract
The development of a vendor contract should include the standard legal and
organization compliance offices, but there needs to be additional consideration
for the privacy and security offices. This is to ensure specific and unique
regulatory compliance issues have been addressed and considered. Typical
boilerplate contracts may be a good starting point for the discussion, but specific
subject matter experts need to be involved from the start to ensure a complete and
accurate agreement. It is also important to work with the organization’s legal,
compliance, and HR departments as necessary on any contract, including the
following concepts:
Standard contract language
Data backup and disaster recovery plans
Generation of reports and metrics
List of authorized users with privileged access
List of organization-authorized users of the data or those who can make
changes
Prohibition against making policy changes that weaken
privacy/security protections
Requirement to inform the organization when any privacy/security
policies change
Right to audit
Unauthorized changes to the data to include migration/deletion upon
termination
Vendor liability
Vendor security incident response procedures
Vendor risk management
Vendor contracts should be reviewed and modified based on the needs of the
organization and the customers. It is the duty of the organization to ensure all
vendor data access is controlled and managed to ensure data privacy is not
violated.
6.7.3 Monitor Vendors (Vendor Risk Management)
Vendor risk management, also called third-party risk management, has been a hot
topic across the privacy and security industries recently. Organizations often
establish vendor management programs (VMPs) to address the many risks and
procedural agreements arising from legal obligations, organization requirements,
and fiscal responsibility to both the vendor and the organization. Responsibility
for legal obligations centers on the applicable standards, laws, and regulations.
Organizational success depends on the ability to be accountable for the services
and data protection provided to customers. Fiscal responsibility relates to
minimizing the fiscal impact on the organization, its customers, and stakeholders.
VMPs offer the mechanism and tools to ensure consistent and accurate delivery
of services while protecting data privacy, thus meeting the organization’s mission
and goals. They protect both the organization and the vendor by describing the
authority over how the data may be used, while offering the methods and practices
that will protect that data from misuse or loss. Once the basic VMP and contracts
are in place, the procuring organization should include the vendor in its
monitoring plan to ensure coordination with its audit and compliance functions
and their reporting. This may include recurring on-site visits, attestations,
periodic reassessments, or reports and tracking methods. Consideration should
also be given to using a software technology platform to automate vendor risk
management, of which there are dozens now available.
6.7.4 Implement Procurement/Information Security Policies:
Cloud Computing Acceptable Use
Cloud computing technologies can be implemented in a wide variety of
architectures, models, technologies, and software design approaches. The privacy
challenges of cloud computing present difficult decisions when choosing to store
data in a cloud. Public, private, and hybrid clouds offer distinct advantages and
disadvantages.
The privacy aspects of any potential cloud choices should be considered before
engaging vendors or external parties. Working from requirements, an organization
should determine the purpose and fit of a cloud solution, obtain advice from
experts, then contact external cloud vendors. Furthermore, understanding the
policies, procedures, and technical controls used by a cloud provider is a
prerequisite to assessing the security and privacy risks involved.
With the increased use of cloud computing and other offsite storage, vendors
that provide cloud computing services may pose distinct privacy challenges,
especially because of compliance requirements and security risks. An organization
should ensure its acceptable use policy for cloud computing requires the privacy
and security of its data, as well as compliance with policies, laws, regulations, and
standards. Risks of processing data using cloud-based applications and tools
should be mitigated. The policy should stipulate approval of all cloud computing
agreements by appropriate leadership, such as the chief information officer (CIO).
Both information security and privacy teams should agree on the policy and
vendor of choice before final decisions are made. This ensures alignment of the
stakeholders and the policy that will be used to protect the organization. The
policy may also outline specific cloud services that may be used, restrictions on
processing sensitive information in the cloud, restrictions on personal use, and
data classification and handling rules.
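As a rough sketch of how such a cloud acceptable use policy might be expressed for automated checks, the example below assumes hypothetical service names, classification labels, and an approval role; it illustrates the idea rather than any specific organization's policy.

# Rough sketch (Python): a cloud acceptable use policy expressed as data.
# Service names, classification labels, and the approver role are assumptions.
CLOUD_AUP = {
    "approved_services": {"approved-iaas-provider", "approved-saas-crm"},
    "agreement_approver": "CIO",  # all cloud agreements require this role's approval
    "personal_use_allowed": False,
    "handling_rules": {
        "public": {"cloud_allowed": True, "encryption_required": False},
        "internal": {"cloud_allowed": True, "encryption_required": True},
        "sensitive_personal_data": {"cloud_allowed": False, "encryption_required": True},
    },
}

def cloud_use_permitted(service, data_classification):
    """Permit cloud use only for approved services and classifications that allow it."""
    rules = CLOUD_AUP["handling_rules"].get(data_classification)
    return service in CLOUD_AUP["approved_services"] and bool(rules) and rules["cloud_allowed"]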
6.7.5 Implement HR Policies
HR and human resources management (HRM) handle diverse employee personal
information and typically will have policies to guide processing. HR policies often
provide rules regarding who may access employee data and under what
circumstances. Typically, these rules will include keeping such data tightly
controlled with special monitoring and reporting to ensure full transparency in
access to the data. Many HR groups will designate special categories for “special
data” or “sensitive data” to ensure the data is fully controlled and used only for
authorized purposes. Employee data includes any data the employee has created in
the process of performing normal business efforts for the organization, including
emails, phone calls, voicemail, internet browsing, and use of systems.
When creating or updating any HR policy that concerns privacy, HR should also
consult with legal and information security to ensure all laws, regulations, and
other possible organization policies are met. Especially regarding updates to laws
and regulations, such as the GDPR, the privacy professional should consult with
all stakeholders prior to creating or updating any HR policy.
As COVID-19 lessons learned have shown, the HR policy must be adaptable
when normal employee data processing and communications are disrupted or
changed during emergency situations. In the case of COVID-19, employee emails,
phone calls, and other normal electronic correspondence were affected by
lockdown measures and social distancing, and the shift to working from home
during the health crisis forced greater use of communication technologies that
many organizations had not considered. New tools, methods, and practices for
communicating can decrease employees' privacy and security awareness while
they attempt to accomplish work tasks. Data breach examples include:
Remote work—Data breaches are surfacing where employees are
attempting to get common workplace tasks done while attempting to
adapt to the paradigm shift to working from home.
Workarounds—Another approach is to avoid collecting data as a
workaround, as in the case of thermometer checks that record only a
pass/fail reading.
Software settings—Office walls that used to provide privacy for
protected discussions are now replaced by virtual technology with
software-defined privacy settings that, when improperly set, potentially
allow virtual meetings to be overheard, recorded, or disrupted.
HR privacy concerns can be addressed through several types of HR policies.
These policies may address the following privacy concerns:
Employee communications, including employee browser histories,
contact lists, phone recordings, and geolocations
Employee hiring and review, including performance evaluations,
background checks, and the handling of resumes
Employee financial information, such as bank account information,
benefits information, and salary
Employee data collection exceptions and emergency situations
A new frontier for health record applicability and storage is also becoming
apparent with respect to COVID-19 inoculations. Drafting policy will help guide
how organizations document, use, and then share employee COVID-19 data as
necessary, in keeping with relevant policy to protect data privacy.
6.7.5.1 Types of HR Policies
HR policies will be defined partly by the internal needs, industry, standards, and
regulations that impact the organization. Types of data collected, use, and storage
will dictate the protections and policies needed. For example, a financial business
will keep different records than a medical business; thus, HR privacy policies will
need to be defined and focused based on the organization's needs. A one-size-fits-
all approach does not apply. Typical HR privacy policies to consider include but are not
limited to the following:
Handling of applicant information
Employee background checks
Access to employee data
Termination of access
Bring your own device (BYOD)
Social media
Employee/workplace monitoring
Employee health programs
HR policies need to be flexible to handle diverse employee personal information
and provide direction for proper processing and rules regarding who may access
employee data and under what circumstances. Tracking and reporting on the use
of employee data through a monitoring plan is critical to ensure coordination with
the audit and compliance functions, so that misuse is reported and the purpose of
use is tracked.
6.8 Data Retention and Destruction Policies
Data retention and destruction policies should support the idea that personal
information should be retained only for as long as necessary to perform its stated
purpose. Data destruction triggers and methods should be documented and
followed consistently by all employees. These should align with laws, regulations,
and standards, which usually provide such time limits for record storage.
Ownership of a data retention/destruction policy may vary and intersect with
privacy, legal, IT, operations, finance, and the business function.
Actions an organization can take to develop a data retention policy include:
Determine what data is currently being retained, as well as how and
where it is stored. This information can be found in an organization's
data inventory and should be documented as a standard, with a policy
then developed to support the retention and destruction life cycle.
Work with legal to determine applicable legal data retention
requirements based on laws, regulations, and standards that apply to the
business. This may also be impacted by global location and industry.
Brainstorm scenarios that would require data retention. This allows for
the organization to understand any gaps and/or opportunities for better
policies.
Estimate business impacts of retaining versus destroying the data. This
allows the organization to understand the business costs when retaining
and then those for any physical destruction.
Work with IT and technology teams to develop and implement a policy
to ensure both privacy and IT needs are met based on the organization
mission and goals.
Data management requires answers to questions, such as why we have the data,
why we are keeping it, and how long we need to keep it. The process begins with
identifying all the data contained in the organization and determining how it is
used. Next, the organization should match the data to the legal obligations around
retention. Data retention and data deletion should be executed with caution.
Keeping the data for as long as the organization has a legitimate business purpose
is a common best practice. To comply with legal requirements and organization
governance standards, the organization should review all associated policies,
standards, guidelines, and handbooks. This includes every relevant country’s
required minimum retention time. Legal requirements could change if the
company is involved in litigation and discovery actions. Thus, the policy and all
supporting standards and technical controls should be flexible.
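As a minimal sketch of how a documented retention schedule might be made operational, the example below assumes illustrative record categories and retention periods; actual periods must come from legal review of the laws, regulations, and standards that apply to the business.

# Minimal sketch (Python): a retention schedule and a destruction-due check.
# Categories and retention periods are illustrative assumptions, not legal guidance.
from datetime import date, timedelta

RETENTION_SCHEDULE_DAYS = {
    "job_applications": 365,          # assumed: one year after the hiring decision
    "customer_support_tickets": 730,  # assumed: two years
    "payroll_records": 2555,          # assumed: roughly seven years
}

def due_for_destruction(category, created, today=None):
    """Return True when a record has exceeded its documented retention period."""
    today = today or date.today()
    return today - created > timedelta(days=RETENTION_SCHEDULE_DAYS[category])

# A support ticket created three years ago is past the assumed two-year limit.
print(due_for_destruction("customer_support_tickets", date(2019, 5, 1), date(2022, 6, 1)))  # True

In practice, a legal hold flag would typically override such a check when litigation or discovery obligations apply.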
6.8.1 Implementing Policies
Privacy-related policies will not be effective if individuals do not care about or
follow them. An organization should seek ways to enable employees to integrate
the policies into their daily tasks. The privacy team can achieve this objective by
aligning policies with existing business procedures, training employees, and
raising awareness.
6.8.2 Aligning with Procedures
Multinational and multisector organizations have additional challenges to ensure
policies are consistent and uniform across all locations while satisfying local laws,
regulations, and industry guidance. Different business functions may have diverse
policy needs. Alignment of procedures provides greater clarity around who holds
ownership and decision rights for work functions. This also empowers employees
to focus on work tasks rather than guessing or fighting over who owns the task.
Finally, alignment provides the roles and responsibilities (who, what, and where)
for business decision-making and who will or should be consulted when making
those decisions. The organization should document and review policies of the
following functions and others to ensure alignment:
HR/HRM policies and procedures are guidelines for managing the
business and defining employee functions within the business structure.
They are established by business leaders and managers to set employee
expectations and to meet industry standards and regulations. This
applies to data privacy and security of the company, as well as to
employee and customer data.
When assessing proposed projects, business development practices
must align to the core business focus and intent to ensure future
business can be performed by the organization without major changes
to technology or people skills.
Project management must understand the HR policy to further
implement and uphold that policy through the project life cycle. This
offers the framework that employees expect to be governed by and
influences all decisions to complete project tasks.
Procurement and contract management alignment offers a sound
decision-making platform when expending the organization’s resources
and assigning legal responsibility to adhere to the rules or policies of the
organization.
Risk management alignment provides a measurement activity to
evaluate the organization’s ability to manage a project or resource
within organization standards and identify hazards when they arise.
Risk management refers to the practice of identifying potential risks,
analyzing them, and taking the precautionary steps necessary to reduce
or mitigate the risk.
Incident management alignment ensures a proactive organization
approach to disaster recovery and business continuity when an
emergency or business incident occurs.
Performance management alignment ensures identification of other
policies that may impact a set of activities and outputs that affect the
organization's goals. This is sometimes reflected in the efficiency and
effectiveness of the delivery of organization products.
Inconsistencies between policies should be explained fully to ensure there are no
gaps or misunderstandings and that no single policy or group of policies
negatively impacts another policy's roles, responsibilities, intent, or results for the
organization.
6.9 Implementing and Closing the Loop
Once policies have been created, approved, and implemented, they must be
communicated to the organization by using the internal communications plan.
Raising awareness for the plan and organization’s policies should be
communicated by properly training employees and any data users to ensure
knowledge transfer and memory retention.
Awareness is a key factor to the organization’s education efforts as being vigilant
and/or watchful. From a privacy perspective, achieving awareness requires
communicating the various components of an organization’s privacy program,
thus creating a vigilant and watchful attitude toward the protection of privacy.
Everyone who handles privacy information must be alert to the constant need to
protect data. Yet, no one is immune to the daily pressures and deadlines that can
distract attention from the big picture. This reality underscores the need for
organizations to put reminders in front of their workforces to keep attention
focused on the proper handling and safeguarding of personal information. These
reminders may take the form of awareness tools, such as Data Privacy Day on
January 28, infographics, tip sheets, comics, posters, postcards, stickers, blogs,
wikis, simulations, email campaigns, announcements on the intranet, web
sessions, drop-in sessions, and lunch-and-learns. Raising workforce awareness on
a consistent basis should be one of the top activities considered by any privacy
management team.
Formal training practices are also part of closing the communication loop.
Training may be delivered through dedicated classroom, instructor-led courses, or
online platforms. Employees and data users may be required to train regularly.
This is where the privacy vision for the organization and policy enforcement is
communicated clearly and consistently. Training and awareness reinforce policies
by educating personnel with constant and consistent reminders about why they
are important, who they affect, and how they are accomplished. Training is
covered in greater detail in chapter 8.
Finally, policies apply to everyone in the organization. One loophole or one
break in the organization’s protection can lead to a hack or data breach to the
entire organization. Leadership, management, the privacy office, and the
information security office do not have a waiver to break any policy they believe
does not address them. Individual choices that breach policy can place the entire
workforce at risk for significant legal consequences and loss of credibility. If a
policy is disconnected from reality, it needs to be corrected—and all risk factors
mitigated—as soon as possible to protect the data owners’ privacy rights and the
organization from crippling loss and impacts. Each employee should feel
empowered to do the right thing but also have the ability to report and change
those policies that may be dated or ineffective. Regardless, policies ensure the
values and goals of the organization are implemented in an orderly and known
manner that is trusted and well understood.
6.10 Summary
A privacy policy should be considered a living document that adapts over time
based on the needs of the organization, evolving business environment, regulatory
updates, changing industry standards, and many other factors. This could be
considered the life cycle of the policy that continues to be reviewed and updated
on a regular basis. Part of this life cycle should be the communication of the policy
through effective training and awareness practices that should also be recurring
and mandatory for every employee, vendor, contractor, or other data user within
the organization.
The privacy policy should contain, at a minimum, the purpose, scope,
responsibilities, and compliance reasons to allow the reader a full understanding
of how privacy will be managed. In some cases, the privacy policy may also
address risks, other organizational responsibilities, data subject rights, data use
rules, and other privacy-related information and practices. The composition of the
policy should align with the needs of the organization in meeting national, state,
and local laws or other standards for data privacy protection.
Beyond the privacy policy are other supporting policies that provide practical
guidance on potential issues or specific intent. These include information security
policies that also protect data but for a different purpose and with potentially
different tools, people, and processes to support common goals between privacy
and security.
It is important to understand that information security is a complex topic that
will span the organization and overlap privacy management. By becoming familiar
with information security practices and stakeholders, the privacy professional will
open channels of communication with those key players throughout the
organization and during any incident response.
Managing privacy within an organization requires the contribution and
participation of many members of that organization. Because privacy should
continue to develop and mature over time within an organization, functional
groups must understand just how they contribute and support the overall privacy
program, as well as the privacy principles themselves. Importantly, individual
groups must have a fundamental understanding of data privacy because, in
addition to supporting the vision and plan of the privacy officer and the privacy
organization, these groups may need to support independent initiatives and
projects from other stakeholders.
The privacy professional should have awareness of other policies and standards
that support privacy or offer other data protections. An example is the data
retention/records management strategies that reinforce the basic concept that
data should only be retained for the length of time the business needs to use the
data. Records management and data retention should meet legal and business
needs for privacy, security, and data archiving.
Creating privacy policies does not mean employees or other internal data users
will know and follow them or understand their purpose and intent. The same is
true for any organization policy, standard, guideline, or handbook. The privacy
policy, like many other business-related policies, has a specific intent to protect
data privacy during and after business use. To meet the privacy intent, users of the
data will need to be educated and reminded on a regular basis of the organization’s
vision and mission. Because data users focus on their primary objectives and jobs
rather than on privacy, education and reminders about what privacy is and the
how and why of privacy management are important for the continued success of
the organization.
Endnotes
1 Adrian Barrett, Greg Albertyn, and Odia Kagan, “What’s Next with the GDPR? How Companies are
Turning Policy into Practice,” recorded October 3, 2019, IAPP: Portsmouth, NH, web conference,
https://iapp.org/store/webconferences/a0l1P00000Dr66bQAB/.
2 NIST Special Publication 800-12 Revision 1, “An Introduction to Information Security,” National Institute
of Standards and Technology, U.S. Department of Commerce, accessed November 2018,
https://doi.org/10.6028/NIST.SP.800-12r1.
3 NIST Special Publication 800-12 Revision 1.
4 NIST Special Publication 800-12 Revision 1.
5 Rafae Bhatti, “Design Considerations for Building Privacy-Protecting Analytics Services,” Privacy Tech,
IAPP, September 18, 2019, https://iapp.org/news/a/design-considerations-for-building-privacy-
protecting-analytics-services/.
CHAPTER 7
Privacy Operational Life Cycle: Sustain:
Monitoring and Auditing Program Performance
Russell Densmore, CIPP/E, CIPP/US, CIPM, CIPT, FIP1
Edward Yakabovicz, CIPP/G, CIPM, CIPT
7.1 Metrics
This section will assist the privacy professional with general best practices for
identifying, defining, selecting, collecting, and analyzing metrics specific to
privacy. With advances in technology and the corresponding legal obligations,
organizations must ensure proper protections are in place and functioning
optimally to track performance and report metrics to leaders, managers, and
regulatory officials when necessary.
Metrics should reflect currency and value to the organization for effective
tracking and benchmarking of the business. Currency and value imply that the
metrics are simple, quantifiable, and easy to use, with a clear correlation to business
performance, operational goals, technical outcomes, and regulatory guidelines. These
business-oriented goals will drive the need for a metric in meeting organization,
department, or group efforts. The right metric will inform the organization by
measuring an outcome or event, such as success, failure, or progress toward
executing certain operational intents and objectives. For privacy professionals,
products, services, and systems that cannot provide value or protect data must
change. Loss of information through breaches, noncompliance with regulatory
requirements, and/or data misuse may result in loss of consumer and investor
confidence; financial and reputational harm; and financial losses through fines or
penalties.
Performance measurement is often used by organizations to inform different
audiences (e.g., leadership, management, employees) about operations.
Measurement systems must be easy to understand, repeatable, and reflective of
relevant indicators to the business and organization. A metric is a unit of
measurement that should be as objective as possible and provide data that helps to
answer specific questions about the business operations. As a basic rule, a metric
must add value by accurately reflecting the state of business objectives and goals.
The same logic applies to privacy programs and operations: an objective can be
broad-based, but a goal should be structured in a way that is measurable in an
easy-to-understand method that is both relevant and useful to the organization.
For example, an objective could be to develop privacy notices, while a goal could
be to provide privacy notices to 100 percent of the customer base within a specific
time frame. Metrics like these have the added benefit of helping to increase the
understanding of necessary protections to meet privacy obligations while
informing leadership of the progress.
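As a simple arithmetic sketch of the goal just described, assuming made-up counts, the following shows how privacy notice coverage could be computed and compared with the 100 percent target.

# Simple sketch (Python): privacy notice coverage as a measurable goal.
# The counts are assumed example values for illustration only.
customers_total = 250_000
customers_given_notice = 231_500

coverage_pct = 100.0 * customers_given_notice / customers_total
print(f"Privacy notice coverage: {coverage_pct:.1f}%")  # 92.6%
print("Goal met this period:", coverage_pct >= 100.0)   # False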
While metrics assist leaders with tracking specific privacy objectives and goals,
there are other ways they help the entire organization understand and implement
effective privacy policies. First, by normalizing privacy concepts, metrics help
conversations about the privacy regime be meaningful to key stakeholders who
may not be familiar with the concept or profession. Second, the use of metrics can
help eliminate terminology and jargon, making it easier for decisions to be made
at the program and operational level with colleagues in different roles and
specializations. Third, metrics consider but are not based on a specific technology
or application (e.g., iPhone versus Android, or Facebook versus Twitter). Fourth,
using metrics advances the maturity of privacy programs and operations, which is
critical now as regulations and expectations are on the rise.
To be beneficial, metrics must be described clearly; otherwise, they may not
represent similar value throughout an organization. Generic privacy metrics
should be developed for different processes (e.g., collection, responses to data
subject inquiries, use, retention, disclosure, incidents, training, review coverage,
risk, and assessments). Once defined, that data should be captured regularly to
enable trending-over-time analysis and historical reporting. Valuable metrics,
when communicated in the right way, can provide powerful results, especially
when reporting to senior management on the privacy program. They can not only
demonstrate compliance and the success of a privacy program but also provide
evidence of the need for additional resources and investment/budget, demonstrate
the return on investment (ROI), show overall program maturity and privacy
resource utilization, and thus feed into overall business resiliency metrics.
That said, more metrics do not equate to more value. Also, purely volume-based
metrics do not provide much value without any indication of how they impact
the privacy program or highlight trends, issues, or gaps. Metric
identification is difficult and must be done in consideration of what is both
sustainable and scalable within cost constraints, storage methods, and available
tools. Making informed decisions on the investment and application of privacy-
enhancing technology (PET) and process improvements (e.g., automated
reviews) is a challenge. Using the right metrics allows for development of key
performance indicators (KPIs) that assist the organization in setting and tracking
multiple objectives and goals.
As an example of privacy metrics, the IAPP-FTI Consulting Privacy Governance
Report 2020 found the following privacy metrics to be the most popular:2
No formal metrics are used (21 percent)
Incident response metrics (58 percent use them)
Privacy impact assessments (PIAs) or data protection impact
assessments (DPIAs) (57 percent use them)
Training and awareness metrics (56 percent use them)
Data subject access request (DSAR) metrics (53 percent use them)
Many other privacy metrics exist throughout the industry, but general guidelines
include starting by identifying which privacy goals are critical to your
organization, why, and to whom. Next, develop the formal intent of the metric
based on those goals, and apply a practical measurement to quantify the output
(e.g., success, failure, goal met, positive, false, nominal, and so on). Metrics can be
positive, negative, or neutral at any point during analysis. For example, does X
metric reflect your organization's need to adhere to new regulatory guidance? Or
does it address a risk around vendor management? Does it count something,
measure a result, or provide historical tracking? Considerations should include all
layers of the organization to encourage the overall success and usefulness of any
metric beyond the needs of the privacy professional, with group consensus on
management, use, intent, and purpose. Finally, metrics and the resulting data will
need to be evaluated and categorized with nondisclosure or proprietary markings,
based on the data collected and presented, so that a data breach is not caused if
they are reported outside the organization.
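One way to put this guidance into practice, sketched below with assumed field names and values rather than any prescribed schema, is to register each metric with its goal, owner, measurement method, and handling marking before data collection begins.

# Hypothetical sketch (Python): registering a metric with its goal, owner,
# measurement, and handling marking. Field names and values are assumptions.
from dataclasses import dataclass

@dataclass
class PrivacyMetric:
    name: str         # what is measured
    goal: str         # the privacy goal it supports, and for whom
    owner: str        # the metric owner/champion, by position
    measurement: str  # how the output is quantified
    marking: str      # handling or proprietary marking applied before any reporting

dsar_timeliness = PrivacyMetric(
    name="DSAR completion time",
    goal="Respond to data subject access requests within the statutory deadline",
    owner="Privacy program manager",
    measurement="Median days from receipt to response, reported monthly",
    marking="Internal - proprietary",
)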
Measuring privacy through metrics is not always an easy task and will take some
effort. Consideration should be given to ensuring that any metrics show value to
the organization; in some organizations, the metrics may justify the existence of
the privacy office. It may be necessary to provide evidence of an impact on the
business and that resources are being properly used and dedicated to the business
objectives. More importantly, the privacy metrics should reflect data privacy
compliance, support data-driven decision-making, and show the overall impact of
the privacy program in helping the business mission and goals succeed.
7.1.1 Intended Audience
There are different audiences for different metrics. Relevant stakeholders are
generally those who will use the data to view, discuss, and make strategic decisions
or some combination of all three. Audiences may be both internal and external,
particularly in light of reporting requirements. For example, one metric may be
useful for a board of directors, another for external auditors, and another for
practical privacy program measurement in creating and maintaining compliance
factors. The way in which metrics are presented may also
need to be adapted to different audiences. For example, a simple graph may be
enough in some cases, whereas other audiences may wish to be provided with
further detailed presentations and reports.
The difference in audience is based on level of interest, influence, and
responsibility for privacy as specified by the business objectives, laws and
regulations, or ownership. Primary audiences generally include legal and privacy
officers, including a data protection officer (DPO) as prescribed under the EU
General Data Protection Regulation (GDPR), senior leadership, chief
information officer (CIO), chief security officer (CSO), program managers,
information system owner, information security officer (ISO), others considered
users, and managers. Secondary audiences include the chief financial officer
(CFO), training organizations, human resources (HR), inspectors general, and
U.S. Health Insurance Portability and Accountability Act (HIPAA) security
officials. Tertiary audiences may include external watchdog groups, sponsors, and
shareholders. In health care, for example, audiences may include a HIPAA privacy
officer, medical interdisciplinary readiness team, senior executive staff, and
covered entity workforce.
7.1.2 Role of a Metric Owner
Since metrics continue to change as the business objectives and goals evolve, an
individual should be assigned to both champion and own the metric. A metric
owner must be able to evangelize the purpose and intent of that metric to the
organization. As a best practice, it is highly recommended that a person with
privacy knowledge, training, and experience perform this role to limit possible
errors in interpretation of privacy-related laws, regulations, and practices. It is not
necessary that the metric owner be responsible for data collection or
measurement.
The metric owner is its champion. They should know what is critical about the
metric and how it fits into the business objective. They should be responsible for
monitoring process performance. In addition, the owner is accountable for
keeping process documentation up to date, minimizing variance, and undertaking
visualizations (e.g., flowcharts, graphs). Finally, the owner is responsible for
performing regular reviews to determine if the metric is still effective and provides
value, as well as ensuring improvements are incorporated and maintained in the
process.
The metric owner and champion should not underestimate the amount of time and effort it will take to first create the privacy metrics and then evangelize their intent and value to the organization. Often, the privacy conversation will center on education and awareness of why privacy is important and on defining basic privacy concepts before the actual privacy metric is discussed. For these reasons, the metric owner should be well versed and knowledgeable in privacy, the business’s privacy objectives, and metric development.
7.1.3 Analysis: Measuring Privacy
Once metrics have been created and the data collected, the data must be analyzed. One practical approach is to use statistical methods to ensure the data is interpreted correctly. This step can take the most time because of the potentially large dataset, so where possible, the privacy professional should consider automated tools or methods to gather, sort, and report. As privacy becomes more of an operational- and technology-led function, dashboarding tools, such as Power BI and Tableau, also provide options to
visualize the data and accompanying metrics. Many other software applications
perform statistical and financial functions; some are open source, in the public
domain, or freeware. There are also several commercial solutions for small to large
organizations. Other tools may already exist on company resources, such as
spreadsheets and databases. Selection and use of any tool should always be based on organizational requirements, budget, and direction. A good place to start is a review of any metrics tools or processes the organization already has in place; where possible, leverage these for privacy management and reporting. Using the organization’s established metric processes and tools makes privacy metrics easier for the organization to understand and track than introducing an unknown and untrusted tool.
7.1.3.1 Trend Analysis
“Trending,” or “trend analysis,” is one of the easiest statistical methods to use for
reporting data. This approach attempts to spot a pattern in the information as
viewed over a period. There are many different statistical trending methods,
including simple data patterns, fitting a trend (e.g., least-squares), trends in
random data (i.e., data as a trend plus noise, or a noisy time series), and the
goodness of fit (e.g., R-squared). Without going into formal statistics (e.g.,
defining mean, standard deviation, variance, linear trend, sample, population,
signal, and noise), the privacy professional can focus on looking for data patterns.
For example, time series analysis shows whether a measure trends upward or downward over time, such as the number of incidents, the number of PIAs completed, or the number of privacy training and awareness sessions held.
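To make this concrete, the following minimal Python sketch fits a least-squares line to a year of monthly incident counts and reports the direction and goodness of fit. The counts are hypothetical placeholders, and a spreadsheet or statistics package could produce the same result:

import numpy as np

# Hypothetical monthly incident counts (Jan-Dec); replace with real data.
counts = np.array([4, 6, 5, 7, 9, 8, 11, 10, 12, 13, 12, 15], dtype=float)
months = np.arange(len(counts))

# Least-squares linear fit: polyfit returns [slope, intercept] for degree 1.
slope, intercept = np.polyfit(months, counts, deg=1)
fitted = slope * months + intercept

# R-squared as a simple goodness-of-fit check.
ss_res = np.sum((counts - fitted) ** 2)
ss_tot = np.sum((counts - counts.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

direction = "upward" if slope > 0 else "downward or flat"
print(f"Trend is {direction}: {slope:+.2f} incidents/month, R-squared = {r_squared:.2f}")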
A second form of analysis, the “cyclical component,” shows data over a period with a focus on regular fluctuations. For example, measuring the number of incident reports in the months after an organization rolls out new privacy training can help explain changes in reported incidents as the time since training increases.
Third is a type of analysis called the “irregular component,” or “noise.” This analysis focuses on what is left over once the other components of the series (trend and cyclical) have been accounted for. It is the most difficult to interpret; an example would be an unexplained absence of incidents over a longer period.
Not every privacy office will have someone who can perform statistics or formal data analysis; in that case, consider using automated methods to create and report privacy metrics. Trend analysis, for example, can take many forms depending on the analysis tools, automated or manual, that are used. A practical privacy analysis tool will offer established approaches, or “canned” metrics and reports, that have proven useful for privacy management. These preconfigured items can reduce the time and effort needed to develop and track privacy metrics, but there may be a software cost that should be evaluated to ensure it is worth the benefit.
7.1.3.2 Return on Investment
ROI is an indicator used to measure the financial gain or loss of a project or program relative to its cost; in simple terms, what was the benefit and what was the loss? Leaders look at ROI to determine whether a particular business unit, such as privacy, adds value to the organization in meeting business goals. Financial gains and losses are defined by the organization’s leadership but can come from any of the stakeholders or data owners. If privacy protects the organization from a financial loss (referred to as “business risk” in this chapter), that risk and the associated savings should be quantified in the form of ROI.
ROI analysis provides a quantitative measurement of the benefits and costs, and the strengths and weaknesses, of the organization’s privacy controls. The data can be fixed or variable and represents a best attempt at an economic risk assessment: determining the probability of a loss and its probable economic consequences, with the goal of maximizing the benefit of investments that generally do not generate revenue but instead prevent loss.
There are two considerations in developing the metric. First, the ROI of a given
function must be related to the reason for implementing that function. Second,
the value of the asset must be defined. In assessing the value of an information asset, the privacy professional should consider how that value changes over time—for
example, the costs of producing (and reproducing) information, repercussions if
the information is not available, and other factors, such as harm to reputation or
loss of confidence.
ROI is difficult to determine at first glance when developing metrics and may not be fully realized until several draft metrics have been developed. This is especially true at the beginning, when there are no existing privacy metrics to improve on or start from. ROI may not even be something the privacy office needs to consider or maintain, but the privacy professional should be prepared, at some point in the metric life cycle, to show the value of the privacy program. This is particularly true during reorganizations, divestitures, and other organizational changes. The best way to develop a privacy ROI is to look at the risk that has been mitigated and to track that risk in financial terms.
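As an illustration only, the sketch below expresses mitigated risk in financial terms using annualized loss expectancy (ALE), a common risk-quantification convention that is not prescribed by this chapter; all dollar figures and probabilities are hypothetical placeholders:

# Illustrative privacy ROI sketch using annualized loss expectancy (ALE).
# ALE and the figures below are assumptions for demonstration only.

def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """Expected annual loss = cost of one event * expected events per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

ale_before = ale(single_loss_expectancy=250_000, annual_rate_of_occurrence=0.40)
ale_after = ale(single_loss_expectancy=250_000, annual_rate_of_occurrence=0.10)
program_cost = 50_000  # hypothetical annual cost of the privacy control

risk_reduction = ale_before - ale_after
roi = (risk_reduction - program_cost) / program_cost

print(f"Expected loss avoided per year: ${risk_reduction:,.0f}")
print(f"Privacy ROI: {roi:.0%}")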
7.1.3.3 Business Resiliency
To the privacy professional, business resiliency is measured through metrics
associated with data privacy, incident response, compliance, system outages, and
other factors as defined by the business case and organizational objectives. If one exists, the organization’s business continuity or disaster recovery office should be consulted to assist in selecting and using data for this metric type, as it is the expert in this data and in the related organizational objectives.
Focusing solely on disasters will lead an organization to be defensive, but using a
proactive approach enables the organization to “respond to an unexpected event
more quickly and more cost effectively. In addition to disaster situations, a strong
business resilience program can help your organization prepare for audits and
demonstrate compliance with regulatory requirements.”3
7.1.3.4 Program Maturity
The Privacy Maturity Model (PMM) is a well-established model that sets out
maturity levels for privacy programs and operations.4 Maturity is a useful metric
because it focuses on a scale as opposed to an endpoint. For example, acceptable
data privacy protections may be in place without being the “most mature.” The
PMM uses five maturity levels described in figure 7-1.
Figure 7-1: Privacy Maturity Levels5
Maturity level one, “ad hoc,” is used to describe a situation in which the
procedures or processes are generally informal, incomplete, and inconsistently
applied. “Repeatable,” or maturity level two, similarly has procedures and
processes, but they are not fully documented and do not cover all relevant aspects.
Third, the “defined” level indicates that procedures and processes are fully
documented, implemented, and cover all relevant aspects. “Managed,” or maturity
level four, indicates that reviews are conducted to assess the effectiveness of the
controls in place. The fifth and final level, “optimized,” represents a level at which
regular review and feedback are used to ensure continual improvement toward
optimization of a given process. Each level builds on the previous one; for
example, to reach maturity level three, all the requirements for levels one and two
must have been met.
The PMM can be customized in many ways, and the authors provide a structure
to identify where to start and what to document. Key startup activities include
identifying a sponsor (e.g., the privacy officer), assigning responsibility for the
project, and considering stakeholders/the oversight committee with non-privacy
representation (e.g., legal, audit, risk management). Once assessment of maturity
has begun, it is important to be transparent about the progress and results to
ensure identified risk and compliance issues are appropriately escalated.
An initial assessment can identify strengths and reveal weaknesses. Once the
baseline assessment has been established, the organization can decide at which
level of maturity it ultimately wants or needs to operate. Ideal maturity levels can
be challenging to pinpoint. Note that “in developing the PMM, it was recognized
that each organization’s personal information privacy practices may be at various
levels, whether due to legislative requirements, corporate policies or the status of
the organization’s privacy initiatives. It was also recognized that, based on an
organization’s approach to risk, not all privacy initiatives would need to reach the
highest level on the maturity model.”6
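One lightweight way to put such an assessment into practice is to record a current and target level for each privacy domain and report the gaps. The sketch below uses the five-level scale described above; the domains, ratings, and targets are invented examples:

# Sketch of tracking maturity ratings per privacy domain against a target.
# The domains, current ratings, and targets are hypothetical examples.
LEVELS = {1: "ad hoc", 2: "repeatable", 3: "defined", 4: "managed", 5: "optimized"}

assessment = {
    "Notice":             {"current": 2, "target": 3},
    "Choice and consent": {"current": 3, "target": 4},
    "Data retention":     {"current": 1, "target": 3},
    "Incident response":  {"current": 4, "target": 4},
}

for domain, rating in assessment.items():
    gap = rating["target"] - rating["current"]
    status = "meets target" if gap <= 0 else f"gap of {gap} level(s)"
    print(f"{domain}: {LEVELS[rating['current']]} "
          f"(target: {LEVELS[rating['target']]}) - {status}")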
7.1.4 Metrics in Action: Reporting to the Board
Several factors have contributed to the elevation of the privacy professional in
organizations around the world, but perhaps none have been as influential as the
GDPR and its “mandatory DPO.” Suddenly, organizations processing large amounts of personal data must not only employ a DPO but also ensure the DPO reports to “the highest management level” of the organization.
For many organizations, that means the DPO must report to the board of
directors itself, and if not to the board, then to the C-suite, which must then
transfer that information up the corporate ladder.
In fact, in the 2018 IAPP-EY Privacy Governance Report, 78 percent of
organizations said they report privacy matters to the board, and the privacy leader
does the reporting themselves 37 percent of the time.7 The most common topic
reported to the board, identified by 83 percent of respondents, is “status of
compliance with the GDPR.”
By comparison, in the 2019 IAPP-EY Privacy Governance Report, the GDPR was the privacy topic of highest interest to boards, followed by data breaches. Among 337 respondents who must comply with the GDPR, 38 percent reported a breach that year (compared to just 16 percent in 2018), and 22 percent reported more than 10.8 Another contrast: 35 percent of EU respondents report directly to the board, while in the United States, only 10 percent of privacy leaders do.
This is where metrics are vital to the privacy program. How can the DPO or
other privacy leader demonstrate the status of compliance?
To answer this question, the IAPP undertook the exercise of creating a template
for a DPO report.9 To do this, it identified all the activities the GDPR mandates
for the DPO and created metrics for demonstrating compliance.
The template identifies several categories of metrics, including but not limited to the following (a minimal tracking sketch appears after the list):
Defending the company’s systems and data
    Number of reported incidents
    Number of actual incidents
    Number of reportable incidents
    Attempted intrusions
    Attempted distributed denial-of-service (DDoS) attacks
    Average investigation and response time
Complying with legal responsibilities and regulations
    Complaints by data subjects
    Data access requests by data subjects
    Rectification requests by data subjects
    Unsubscribe requests by data subjects
    Erasure requests by data subjects
    Stop processing requests by data subjects
    Data downloads by data subjects
Advising the business
    PIAs (average completion time)
    DPIAs (average completion time)
    Staff receiving privacy training (percentage of all staff)
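The sketch below shows one simple way such metrics could be captured per reporting period and compared quarter over quarter. The field names loosely mirror the list above and are illustrative assumptions, not the actual fields of the IAPP template:

from dataclasses import dataclass

# Hypothetical container for a subset of DPO-report metrics per period.
@dataclass
class QuarterlyPrivacyMetrics:
    period: str
    reported_incidents: int = 0
    actual_incidents: int = 0
    reportable_incidents: int = 0
    data_subject_complaints: int = 0
    access_requests: int = 0
    erasure_requests: int = 0
    pias_completed: int = 0
    dpias_completed: int = 0
    staff_trained_pct: float = 0.0

q1 = QuarterlyPrivacyMetrics(period="Q1", reported_incidents=7, actual_incidents=3,
                             reportable_incidents=1, access_requests=12,
                             pias_completed=5, staff_trained_pct=86.0)
q2 = QuarterlyPrivacyMetrics(period="Q2", reported_incidents=9, actual_incidents=4,
                             reportable_incidents=1, access_requests=18,
                             pias_completed=6, staff_trained_pct=91.0)

# Quarter-over-quarter change in access requests, for trend reporting.
delta = q2.access_requests - q1.access_requests
print(f"Access requests changed by {delta:+d} from Q1 to Q2")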
Of course, some of the metrics in the report are basic: number of privacy staff,
total privacy budget, number of products or services utilizing personal data, total
number of data subjects about whom data is held, number of processors, and so
forth. But it is important to be as granular as possible. What is the ratio of employees in compliance to those in legal, or the ratio of either to the total number of employees? How many business processes use consumer data versus
employee data?
While the board may not need this granularity, it is better to have it than not.
The GDPR also mandates operational activities that can be measured. How
many DPIAs have been conducted? How many DSARs have been received? How
many complaints have been received? How many data security incidents have
been discovered or reported? How many of those were elevated to notification of
the data protection authority (DPA) or data subjects?
These activities might be tracked by month or quarter so the organization can
identify and demonstrate trends. A rising number of complaints could indicate a
poorly performing program or a need for more staff and budget. For many of these tasks, the template provides space to document the amount of time and human resources needed to comply, making it easier to advocate for more help.
If DSARs are increasing, perhaps there is a trust issue with the business, and public relations (PR) and marketing need to be more closely involved. While a DPO is independent from the business, it is still important that they advise the business about the resources and activity needed to comply with the GDPR.
The template can be adapted to any business; emphasis can be placed on different data elements as they are more or less important to the business. For some organizations, data portability requests will be quite rare; for others, such requests will need to be closely tracked as a potential threat to the bottom line.
Obviously, a DPO report is just one way to look at a privacy program and its performance, but it may be a useful starting point for thinking about what such a report might look like at your organization, whether or not you have a DPO mandated by the GDPR.
7.2 Monitor
This section refers to ongoing activities organizations undertake to control,
manage, and report risk associated with privacy management practices. Another monitoring obligation is found in Article 39 of the GDPR, under which the DPO’s tasks include monitoring compliance, which covers managing internal data protection activities, training data-processing staff, and conducting internal audits.
General monitoring should be done to ensure the organization is doing what it
says it is doing and what it is supposed to be doing. Monitoring should be
continual, based on the organization’s risk goals, and executed through defined
roles and responsibilities that may include privacy, audit, risk, and security
personnel. Typical outcomes include compliance, increased awareness,
transparency, and credibility.
The privacy professional should identify the business-as-usual rhythms of the
organization to understand how monitoring practices are used and maintained for
privacy management, and to validate that programs are being implemented in a manner consistent with the organization’s privacy policies and standards. Without a formal process for monitoring privacy requirements, the organization cannot be reasonably assured that personal information is handled appropriately and in line with organizational expectations, compliance obligations, and policy requirements. A few
other general benefits of monitoring include ensuring privacy program goals are
achieved, detecting privacy failures early, and obtaining feedback for privacy
program improvement.
7.2.1 Types of Monitoring
There are different types of monitoring for different business purposes.
7.2.1.1 Compliance
Compliance monitoring through audits is focused on the collection, use, and
retention of personal information to ensure necessary policies and controls are in
place for compliance. The degree of monitoring an organization needs depends on
the sensitivity of the information collected, compliance risk factors, and industry
requirements. There are four common approaches: self-monitoring, audit
management, security/system management, and risk management. Compliance
monitoring is essential for detecting and correcting violations, supporting
enforcement actions, and evaluating progress.
7.2.1.2 Regulation
Laws, regulations, and requirements are constantly changing, so there is a need to
monitor the changes and update policies accordingly. Defined roles and
responsibilities will determine who owns which tasks, when updates can or
should occur, and how to communicate them broadly.
Based on the size of the organization or the scope of regulatory activities,
subscription services can be purchased to conduct this type of monitoring
externally.
7.2.1.3 Environment
Internal and external environmental monitoring focuses on vulnerabilities, which may include physical concerns, such as building access or visitor activities. It may also address programmatic concerns, for example, a lack of awareness or training.
Finally, it may address insider threats, such as sabotaging, modifying, or stealing
information for personal gain, as well as cybersecurity threats to information
technology (IT) assets.
7.2.1.4 Training Data
Training performed by topic and employee should be tracked and assessed
biyearly to ensure each and every employee is aware of laws, regulations, and
organization requirements for the handling of data. Partnering with learning and
development or HR teams can help privacy professionals keep track of
completion rates and any factors that may be having an impact on them. These
functions are also experts in how to boost employee engagement and may be able
to advise on the format and methods of the training, as well as the content itself
and how it is presented in a way that employees understand.
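As a simple illustration, the sketch below computes privacy-training completion rates by department from a hypothetical learning-management-system export; the names and records are invented:

from collections import defaultdict

# Hypothetical LMS export: one row per employee per training assignment.
roster = [
    {"employee": "A. Rivera", "department": "Marketing", "completed": True},
    {"employee": "B. Chen",   "department": "Marketing", "completed": False},
    {"employee": "C. Osei",   "department": "HR",        "completed": True},
    {"employee": "D. Kumar",  "department": "HR",        "completed": True},
    {"employee": "E. Novak",  "department": "IT",        "completed": False},
]

totals = defaultdict(lambda: {"done": 0, "all": 0})
for record in roster:
    totals[record["department"]]["all"] += 1
    if record["completed"]:
        totals[record["department"]]["done"] += 1

for department, t in sorted(totals.items()):
    rate = 100 * t["done"] / t["all"]
    print(f"{department}: {rate:.0f}% complete ({t['done']}/{t['all']})")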
7.2.2 Forms of Monitoring
Each organization should consider using different or multiple forms of monitoring, based on its program goals.
7.2.2.1 Tools
Active scanning tools for network and storage can be used to identify risks to
personal information and to monitor for compliance with internal policies and
procedures. For example, a scan may find that files containing personal information are incorrectly stored on a publicly accessible network share, thus proactively identifying a potential privacy breach.
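A greatly simplified sketch of this kind of scan appears below. It looks only for two illustrative patterns, email addresses and strings formatted like U.S. Social Security numbers, in text files under a hypothetical directory; production scanning and DLP tools use far more robust detection, validation, and classification:

import re
from pathlib import Path

# Illustrative patterns only; real tools validate matches and cover many more types.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn-like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_directory(root: str) -> list[tuple[str, str, int]]:
    """Return (file, pattern name, match count) for text files with hits."""
    findings = []
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for name, pattern in PATTERNS.items():
            hits = len(pattern.findall(text))
            if hits:
                findings.append((str(path), name, hits))
    return findings

# "shared_drive_export" is a hypothetical folder of copied network files.
for file, name, hits in scan_directory("shared_drive_export"):
    print(f"{file}: {hits} possible {name} value(s)")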
7.2.2.2 Audits
Audits include internal and external reviews of people, processes, technology,
finances, and many other aspects of business functions. It is good practice to build
strong partnerships with internal audit functions to gain feedback on the
framework or approach being adopted to carry out such audits. This topic will be
further reviewed in section 7.3.
7.2.2.3 Breaches
As discussed in chapter 10, breach management practices are more important
than ever before, driven by the laws and regulations of countries, states, or
provinces.10 Tracking, particularly over time, the type of breach, severity, and time
to remediation is an important type of monitoring to determine if both training
activities and program processes are sufficient.
7.2.2.4 Complaints
Complaint-monitoring processes track, report, document, and provide
resolutions of customer, consumer, patient, employee, supplier, and other
complaints. Tracking details about the type and origin (location) of complaints
can provide early indicators of the potential for regulatory activity. (This topic is
discussed further in chapters 2, 9, and 10.)
7.2.2.5 Data Retention
Records management and data retention should meet legal and business needs for
privacy, security, and data archiving. In monitoring, look for potential areas of risk in retention schedules or practices, such as excessive collection, inadequate controls (access and use), or undue disclosure practices.11
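As one way to picture such monitoring, the sketch below flags records held beyond a hypothetical retention schedule; the categories, retention periods, and records are invented:

from datetime import date, timedelta

# Hypothetical retention schedule, expressed in days per record category.
retention_schedule_days = {"HR file": 7 * 365, "Marketing lead": 2 * 365}

records = [
    {"id": 101, "category": "HR file",        "created": date(2012, 5, 1)},
    {"id": 102, "category": "Marketing lead", "created": date(2021, 3, 15)},
    {"id": 103, "category": "Marketing lead", "created": date(2018, 8, 2)},
]

today = date.today()
for record in records:
    limit = timedelta(days=retention_schedule_days[record["category"]])
    if today - record["created"] > limit:
        print(f"Record {record['id']} ({record['category']}) exceeds its "
              f"retention period; review for deletion or archiving")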
7.2.2.6 Controls
Relying on an established set of privacy controls at the operational and program
level, this type of monitoring is about assessing the design and efficacy of a given
control set. Some governance, risk, and compliance (GRC) tools may provide
automated means to undertake some or all of these checks, right through to
tracking remediation activity.
7.2.2.7 Human Resources
HR is responsible for ensuring privacy protections are in place for employee
personal information across HR processes. Multinational organizations are
required to meet local regulations and the privacy expectations of their employees
in all countries in which they operate. While the employment contract provides
overall employee accountability for certain work-related activities, some
workplace monitoring will require additional privacy considerations. For
monitoring purposes, investigations related to compliance with security and
privacy practices are of particular interest. (This topic is covered further in
chapters 2 and 10.)
7.2.2.8 Suppliers/Third Parties
Outsourcing of operations to suppliers (e.g., subcontractors, third parties) and the
use of technology providers (e.g., cloud services) are guided by agreements, which
should contain procedures for monitoring these protections. Supplier monitoring should include appropriate privacy and security requirements, as well as provider performance, to ensure compliance with contract specifications, laws, and policies.
In addition, you should monitor the security of mobile devices to confirm that
personal information contained in those devices is adequately protected. (This
topic is covered further in chapter 4.)
7.3 Audits
Audits are an ongoing process of evaluating the effectiveness of controls
throughout the organization’s operations, systems, and processes. While typically
associated with accounting or finance, audits are now commonplace in IT and
privacy management. Generally concerned with process and technical
improvements, audits can be used to identify the risks posed by vulnerabilities
and weaknesses and provide opportunities to strengthen the organization. They
can also be used to ensure compliance with regulations and laws.
Elements of the audit process with respect to privacy may happen simultaneously or as separate components, depending on organizational requirements.
7.3.1 Definition
The purpose of a privacy audit is to determine the degree to which technology,
processes, and people comply with privacy policies and practices. Privacy audits
help measure efficacy of privacy procedures, demonstrate compliance, increase
the level of general privacy awareness, reveal gaps, and provide a basis for
remediation planning.12 Audits differ from assessments in that they are evidence-
based. For more on assessments, see chapter 4.
7.3.2 Rationale
Audits may be conducted regularly, ad hoc, or on demand, depending on
the purpose. Privacy audits provide evidence regarding whether privacy
operations are doing what they were designed to do and whether privacy controls
are correctly managed.
However, there are other reasons to perform audits. Audits are conducted when
change occurs—whether that is policy degradation, system updates or
maintenance, or some kind of security (or other) event. They are also triggered by
user errors or accidents, security or privacy breaches, or requests from regulators,
leadership, or media. Other factors may include new categories of customers or
acquisitions of new lines of business, changing priorities, new suppliers, new
countries of operations, or risks identified through other business processes.
7.3.3 Phases
The auditor must have full authority to perform their duties; otherwise, the tasks and actions may be challenged, delaying the work. Stakeholders and corresponding
roles and responsibilities should be defined before the audit begins. Stakeholders
may include executive leadership, those who have related functional duties
(security officer), and/or regulators.
Scoping the audit is critical to determine the types of personnel (e.g., employees,
contractors, third parties) who are permitted to handle personal information.
Once scoping is complete, the five-phase audit approach begins, as illustrated in
figure 7-2.13
Figure 7-2: Audit Life Cycle
Phase one, “planning,” generally involves conducting a risk assessment, setting a schedule, selecting the right auditor, completing a pre-audit questionnaire, hosting an introductory meeting to prepare for the audit, and compiling a checklist. Once planning is complete, phase two, “preparation,” requires confirming the schedule, preparing any additional checklists and sampling criteria, and finalizing the audit plan. The actual “audit” takes place in phase three, which includes meeting with stakeholders and business process owners and executing the functional goals of the audit.
“Reporting” is phase four, which records and reports on noncompliance
(providing records, categorizing instances as major or minor), drafts a formal
audit report, and hosts a close-out meeting. Copies of the audit report should be
distributed as appropriate. Specific report formats will vary, based on organization
requirements, auditing methodologies, privacy framework, and other factors. The
auditor and auditee roles and responsibilities, the level and depth of reporting, and
the content of the recommendations should be defined appropriately based on
these factors. Ideally, the report will provide a formal record of what was audited and when, insight into areas that comply (or do not), details to support the findings, and suggested corrective actions and work estimates, among other topics.14
Instances of noncompliance should be sufficiently documented to include facts,
evidence, best practices, and standards that help assess the situation. Findings
should be communicated to all stakeholders, along with risks, remediation plans,
and associated costs. Corrective actions are based on best practices, lessons
learned, or industry standards, which provide a way to measure the effectiveness
of the actions and inform their prioritization. Work estimates provide the
potential cost impact of implementing corrective actions. Completing these steps
ensures that reports offer a comprehensive summary of each issue, their potential
impacts, recommendations for corrective actions, and an understanding of the
potential cost and effectiveness of those actions.
The fifth and final phase, “follow-up,” involves confirming the scope of remediation activities, scheduling those activities, and addressing any requirements around methodology. Once these are completed, the audit can be closed out entirely. This is a generic life cycle that can be adjusted to fit the organization and project needs; phases can be added or removed accordingly. For example, a pre-audit phase could sample some data beforehand to better prepare for the audit phase and surface any unique needs. Another example is adding a pre-report phase before reporting to dry-run the final report and gather input from other subject matter experts, or even stakeholders, to better align the report and ensure there are no inaccuracies or misrepresentations, especially if the report is going directly to the board.
7.3.4 Types
There are three types of audits: first-party (internal), second-party (supplier), and
third-party (independent). The frequency and type will vary based on resources,
organizational culture, risk tolerance, and demand.
First-party audits are generally used to support self-certifications; scope is based
on resources and the current state of compliance. The self-certification process can
provide the relevant facts, data, documentation, and standards necessary to reflect
consistent, standardized, and valid privacy management that aligns to a particular
privacy standard, guideline, or policy. Like other audits, first-party audits will
consider the organization’s risk management culture, identify privacy risk factors,
and evaluate control design and implementation.
Second-party audits reference the notion of supplier audits (covered in chapter
4). When a data collector outsources activities related to personal information
management, accountability for compliance is retained. Contract language should
include specific privacy protections and regulatory requirements and be mapped
to service-level agreements as if the supplier were part of the organization. The
entity outsourcing must have the right to audit the supplier for evidence of these
protections. To summarize, the purpose of the contract is to surface the
requirements; the audit function provides evidence of that compliance.
Third-party audits are conducted by independent outside sources, typically
under consent decree or regulatory request. They may align to various regional or
industry frameworks, such as the U.S. National Institute of Standards and
Technology (NIST) or International Organization for Standardization (ISO).
There are some advantages to these kinds of audits—they identify weaknesses of
internal controls, make first-party audits more credible, and provide a level of
expert recommendations. There are also a few disadvantages, mostly related to
bringing in external parties: cost, scheduling, and the time it takes to get up to
speed. Ultimately, when an independent authority attests to the efficacy of privacy
controls, there is increased confidence that the organization’s practices are an
accurate reflection of its claims.
7.3.5 Review
The activities described in this section are not useful without time to analyze
results. The audit process should have a trigger to signal the privacy officer to step
back and evaluate the program or, ideally, specific pieces of it.
7.4 Summary
This chapter began with a discussion of metrics, outlining how they can provide a baseline for evaluating projects and gauging their contribution over time as privacy technologies, processes, and programs evolve. Metrics help privacy professionals communicate the value they add to the organization. Several examples of metrics have been provided throughout the chapter, along with IAPP resources that can assist in developing metrics and the associated reporting mechanisms.
Metrics are only one part of the overall privacy program management puzzle. Active, ongoing monitoring and auditing are also needed to identify gaps in privacy program functions and to provide a mechanism for program and policy optimization. Auditing (whether first-, second-, or third-party) can assess how well the program and controls work together, determine whether there are any legal or regulatory compliance issues, and report on them. Finally, communicating about metrics, monitoring, and audit activities helps create greater awareness of the privacy program (internally and externally) and gives the organization the flexibility to respond to environmental, regulatory, and technological impacts on privacy program performance and daily operations.
7.5 Glossary
Performance measurement—The process of formulating or selecting metrics to
evaluate implementation, efficiency, or effectiveness; the gathering of data and
production of quantifiable output that describes performance
Metrics—Tools that facilitate decision-making and accountability through
collection, analysis, and reporting of data; must be measurable, meaningful, clearly
defined (with boundaries), and able to indicate progress and answer a specific
question to be valuable and practical
Metrics life cycle—The processes and methods to sustain a metric to match the
ever-changing needs of an organization
Metric audience—Primary, secondary, and tertiary stakeholders who obtain
value from a metric
Metrics owner—Process owner, champion, advocate, and evangelist
responsible for management of the metric throughout the metric life cycle
Endnotes
1 Tracy Kosa wrote this chapter for the Second Edition. Third Edition updates made by Russell Densmore
and Edward Yakabovicz.
2 IAPP and FTI Consulting, IAPP-FTI Consulting Privacy Governance Report 2020,
https://iapp.org/resources/article/iapp-fti-consulting-privacy-governance-report-2020/.
3 IBM, “Business Resilience: The Best Defense is a Good Offense,” p. 3, January 2009, accessed November
2018, https://www-935.ibm.com/services/uk/en/it-
services/Business_resilience_the_best_defence_is_a_good_offence.pdf.
4 AICPA/CICA, Privacy Maturity Model, March 2011, accessed November 2018,
https://iapp.org/media/pdf/resource_center/aicpa_cica_privacy_maturity_model_final-2011.pdf.
5 AICPA/CICA, Privacy Maturity Model.
6 AICPA/CICA, Privacy Maturity Model.
7 IAPP and EY, IAPP-EY Annual Privacy Governance Report 2018, https://iapp.org/resources/article/iapp-
ey-annual-governance-report-2018/.
8 IAPP and EY, IAPP-EY Annual Privacy Governance Report 2019, https://iapp.org/resources/article/iapp-
ey-annual-governance-report-2019/.
9 Access the DPO Report Template at https://iapp.org/resources/article/dpo-report-template/.
10 Theodore J. Kobus III, “Data Breach Response: A Year in Review,” Data Privacy Monitor, BakerHostetler,
December 27, 2011, accessed November 2018, https://www.dataprivacymonitor
.com/breach-notification/data-breach-response-a-year-in-review/.
11 Ulrich Hahn, Ken Askelson and Robert Stiles, “Global Technology Audit Guide: Managing and Auditing
Privacy Risks,” Institute of Internal Auditors, p. 4, June 2006, accessed November 2018,
https://www.interniaudit.cz/download/ippf/GTAG/gtag_5_managing_and_auditing_privacy_risks.pdf.
12 Bruce J. Bakis, “Mitre: How to Conduct a Privacy Audit,” Presentation for the 2007 New York State Cyber
Security Conference, June 6, 2007, accessed November 2018,
https://www.mitre.org/sites/default/files/pdf/HowToConductPrivacyAudit.pdf.
13 UK Information Commissioner’s Office, A Guide to ICO Audits, September 2018,
https://ico.org.uk/media/for-organisations/documents/2787/guide-to-data-protection-audits.pdf.
14 UK Information Commissioner’s Office, A Guide to ICO Audits.
CHAPTER 8
Privacy Operational Life Cycle: Sustain:
Training and Awareness
Chris Pahl, CIPP/C, CIPP/E, CIPP/G, CIPP/US, CIPM, CIPT, FIP
Employees have many issues to consider as they perform their daily duties. While
privacy offices believe appropriate collection and use of personal information is
the most important priority for employees, to the employee, the focus is on task
completion. This disconnect between expected and actual behavior can frustrate
the privacy office, but closing the gap requires ongoing and innovative efforts to
keep privacy integrated with everyday responsibilities. Management needs to
approve funding to support privacy initiatives, such as training and awareness, and
hold employees accountable for failing to follow privacy policies and procedures.
Building a privacy strategy may mean changing the mindset and perspective of an
entire organization through training and awareness. Effectively protecting
personal information within an organization means all members of the
organization must do their share.
Ponemon’s 2020 Cost of a Data Breach Report estimates the average global cost of
a data breach is $3.86 million, a 1.5 percent reduction from 2019’s
estimate of $3.92 million.1 The study identified that health care companies
continue to have the highest costs associated with a data breach at an average of
$7.1 million, up from $6.54 million in 2019. The energy sector follows at $6.39
million, and the financial sector rounds out the top three at $5.9 million, up from
$5.1 million in 2019. The Ponemon study points out industries with more robust
regulatory requirements had higher data breach costs.
The study also found that 80 percent of breached companies suffered a
loss of customer personally identifiable information (PII), resulting in a business
cost of $150 per compromised record. This number increased to $175 per
compromised record when data is breached via a malicious attack. Other breaches
of information include intellectual property at 32 percent, anonymized customer
data at 24 percent, and employee PII at 21 percent.
Having an incident response team can mitigate costs. Such a team can perform incident response testing, red team testing, threat intelligence sharing, and data loss prevention (DLP), all of which have been shown to mitigate costs. Conversely, inefficient controls, a lack of qualified cybersecurity personnel, and complex security systems are cost contributors. The Ponemon study points out that, while there is a cost to standing up an incident response team, having trained personnel and a plan can save an organization money. It is estimated that having both in place can save an organization $2 million in 2021; companies without both face an average cost of $5.29 million, versus $3.29 million with both. Deployment of security automation technologies may also help: companies without security automation have a higher average cost, $3.58 million, than those with automation deployed.
Verizon’s 2020 Data Breach Investigations Report cites 32,002 security incidents,
of which 3,950 were confirmed breaches. Approximately 70 percent of these confirmed breaches were caused by outsiders, who were motivated by money in 86 percent of cases, compared with four percent for advanced threats.2 While the number of events continues to make headlines, the tactics are not new. Credential theft, social attacks (such as phishing and business email compromise), and administrative errors account for 67 percent of the events. Attackers continue to use these vectors because they are low cost and easy to execute. Companies must focus on education and training to help employees spot these issues and use training as an additional mechanism for employees who have made mistakes, to ensure the same mistake does not happen again. This is also an important remedial factor to highlight to
supervisory authorities where notifications have been made or as part of
complaints reported to them.
Ransomware continues to grow in popularity, with an uptick seen in critical
infrastructure industries, such as hospitals, but also in other areas, such as
education. Ransomware accounts for 27 percent of malware incidents, but only 18
percent of companies blocked at least one piece of ransomware.
Some of the breach activity reflects companies’ move to remote work environments and the rapid shift to storing data in the cloud. Attacks on web applications were estimated at 43 percent of breaches, double the 2019 figure. The most common method (80 percent) of attacking web applications is through stolen or brute-forced credentials; exploited vulnerabilities accounted for less than 20 percent.
Not surprisingly, PII continues to be the target. In 2020, it accounted for 58 percent of breaches, double the 2019 figure. Names, email addresses, phone numbers, physical addresses, and other types of information that can be used to perpetrate additional phishing or hacks are the leading types of data stolen. This
information is also commonly found in many databases and generally not as
protected as more sensitive data, such as passwords, government identification, or
credit card information.
The number of internal-error-related breaches also climbed in 2020, to 881 from
424 in 2019. This may be due to the increased regulatory reporting requirements
rather than an uptick in actual events.
While the statistics are shocking, it is not all doom and gloom, as companies have become more effective at deploying technology to block common malware. Trojan-like malware peaked at 50 percent in 2016 but has since dropped rapidly to only 6.5 percent. In 45 percent of cases, malware consists of droppers,
backdoors, or keyloggers, much of which is being blocked. Companies are also
effective at patching their systems, with less than five percent of breaches being
caused by exploiting a vulnerability.
Hackers do not always access company systems for money; sometimes, it is to make a political statement, as with the SolarWinds breach disclosed in 2020. As technology becomes more intertwined, a
breach in one part of the technology ecosystem can impact an unrelated industry,
such as critical infrastructure. In February 2021, hackers broke into a computer system that treats water for 15,000 residents near Tampa, Florida, and attempted to add dangerous levels of a chemical additive to the water supply. The hackers gained access through a remote-support program, TeamViewer, but were fortunately caught by an employee who saw a window pop up on his screen and the mouse clicking in various parts of the system to manipulate code. The hackers increased the amount of sodium hydroxide (lye) in the water, which the treatment facility was able to quickly reverse without causing harm. While the plant stopped the attack, the fact that TeamViewer has been installed on 2.5 billion devices worldwide to enable remote technical support represents a threat vector from which no company is safe. Companies must train their employees to identify and immediately report suspicious activities to avoid consequences as severe as sickening thousands of people.3
Failures to protect personal information can be expensive and can also lead to public embarrassment, as was the case for CAM4, an adult website. In May
2020, SafetyDetectives identified that CAM4 left a database of sensitive
information accessible without a password. The amount of data available exceeded
seven terabytes, or 10.88 billion records, starting with production data from
March 16, 2020. Full names, email addresses, sexual orientation, and chat
transcripts were exposed. SafetyDetectives reported 11 million records, including
emails, which allowed their team to identify affected users by country.4 More recent hacks have focused primarily on the supply chain, most notably those involving SolarWinds and Kaseya.
8.1 Training and Awareness
Training and awareness are two different but related concepts. According to U.S.
National Institute of Standards and Technology’s (NIST) Special Publication
(SP) 800-16, awareness is not training.5
Training strives to produce relevant and needed skills and competencies.
Awareness focuses on a specific topic, such as security. Awareness is intended to
allow individuals to recognize specific concerns, such as technology, and respond
accordingly.6
NIST SP 800-50 states, “The most significant difference between training and
awareness is that training seeks to teach skills, which allow a person to perform a
specific function, while awareness seeks to focus an individual’s attention on an
issue or set of issues.”7
Together, training and awareness reinforce the organization’s privacy policy and practices through an ongoing communication cycle that keeps key messages and topics at the forefront of the organization’s business processes. Training allows for
communication and social acceptance of the privacy policy and supporting
processes. It is critical to the successful delivery of the privacy message and sets
the stage for reception and acceptance throughout the organization.
NIST SP 800-50, “Building an Information Technology Security Awareness and
Training Program,” provides a framework companies may use to build out an
effective program. This publication lists four steps in the program’s life cycle,
including:
Awareness and training program design (Section 3)
Awareness and training material development (Section 4)
Program implementation (Section 5)
Post-implementation (Section 6)
NIST SP 800-50 works with NIST SP 800-16, “Information Technology
Security Training Requirements: A Role- and Performance-Based Model.” NIST
SP 800-50 is more strategic, while NIST SP 800-16 is tactical.
The training strategy and budget typically determine the best or approved
methods for education within the organization. The privacy professional should
first understand these areas to ensure they align with and meet corporate
standards before offering any solutions.
Tip: Have a regular coffee and catch-up on one privacy topic via a 15-minute
video conference or a face-to-face meeting with stakeholders. These short but
powerful meetings build an important network. This can also be achieved
through two-minute animated videos, where budgets permit.
Training is a key control with most privacy regulations. Even if a training
requirement is not specifically identified, most regulations assume employees will
receive some form of regular training and awareness regarding the regulation and
associated controls. Besides, how can an organization claim their employees are
complying with a privacy regulation without conducting proper training,
awareness, and quality assurance checks?
However, training must go beyond checking a box. It must address applicable
laws and policies, identify potential violations, address privacy complaints and
misconduct, and include proper reporting procedures and consequences for
violating privacy laws and policies. Where appropriate, the training delivered
internally must extend to business partners and vendors. Companies should
require trainees to acknowledge they have received training and agree to abide by
company policies and applicable law, which should also be overseen by the
privacy team.8 This step can easily be accomplished by concluding a course with a
simple signoff, requiring the employees to check multiple boxes regarding three to
five principles they must follow at all times, which must be tailored to traditional,
hybrid, and at-home work modes. There may need to be a slight variation in the training and messaging offered to those working in the office versus at home, as there is a higher risk of data exposure in work-from-home offices.
If people are not aware of what they are processing, they are also unaware of the
consequences and liabilities that may result from mishandling data. It is, therefore,
important for the privacy team, in conjunction with the learning and development
or HR teams, to refresh the training content on a regular basis to capture key
issues, topics, and trends in noncompliance (e.g., identified through data breaches
or regulator complaints) so the business can relate more to the content. The
privacy office cannot assume understanding without ensuring there are sufficient
learning opportunities. The words “training” and “awareness” are used
interchangeably, but they serve different functions. Training communicates the
organization’s privacy message, policies, and processes, including those for data
usage and retention, access control, and incident reporting. Training must be
engaging—for example, using gamification or creating friendly competitive
contests—to motivate individuals to protect information. In some cases, training
should be personal, teaching employees how to implement appropriate privacy
controls at their homes, which will make them more aware in the office. Its impact
must be measured using attendance and other metrics. Metrics give leaders a
powerful picture of what is occurring in the company.
An organization’s privacy awareness program reinforces the privacy message
through reminders, continued advertisement, and mechanisms, such as quizzes,
posters, flyers, and lobby video screens. Reinforcement of this message ensures
greater privacy awareness, which can effectively reduce the risk of privacy data
breaches or accidental disclosures. If implemented effectively, training and
awareness programs can communicate beyond what is written in privacy policies
and procedures to shape expected behaviors and best practices. Where possible,
integration with other training and awareness programs—particularly, other
learning considered “mandatory” by the business—reinforces the messaging.
Privacy training should be considered as “mandatory,” given the significance and
importance of complying with privacy laws. The privacy office should work with
HR teams to ensure that all new employees are required to carry out privacy
training during the first few months of employment, in addition to regular, often
annual, refreshers. As the privacy office will only get a few chances to train and
communicate during the year, making a simple mistake could become costly. Be
sure to understand the differences between education and awareness, and to avoid common pitfalls such as:
Equating education with awareness
Using only one communication channel
Lacking effectiveness measurements
Eliminating either education or awareness due to budget concerns
Awareness-raising is one of the key aspects of the privacy framework and should
be prioritized for all organizations. It can come in different forms, none of which
require huge budgets. If people are not aware of what they are processing, they are also unaware of the consequences and liabilities that can result from mishandling it.
8.2 Leveraging Privacy Incidents
Most privacy compliance programs have mechanisms in place for gathering
information regarding privacy incidents. While these incidents result in
investigative work by the privacy office in conjunction with the information
security team, they also usually provide training opportunities. Where possible,
the privacy office should provide targeted training to the affected department.
When privacy incidents occur, it is important to consider the following:
Where possible, leverage lessons learned from events that make the
headlines. Employees are curious about past company incidents, and these are effective tools to demonstrate that the company is not immune from mistakes. This can also include incidents affecting a different organization that operates in the same industry and, therefore, carries a similar risk profile. However, it is important to ensure the event has been appropriately “sanitized” so that the impacted business unit or individual cannot be identified by others. Use the events as learning opportunities, including discussions of how the incidents described suggest ways to improve the company’s processes.
Doing business means mistakes will happen. Use mistakes as learning
opportunities to improve processes rather than as cause for
complaint. Build a culture of trust in which reporting a mistake will not result in termination but instead creates a path to process improvement. This starts with the privacy office building strong relationships with business units and making known that, while reporting privacy incidents is required, resolving the matter focuses on the facts, not the individual. Mistakes are best handled when they are approached constructively and not through threats of employment actions. This approach will also encourage employees to come forward and own up to their mistakes or report incidents that may not ordinarily be picked up by existing processes or tools preventing disclosures, such as DLP tools.
Use stories. It is human nature to want to hear other people’s stories.
Share a privacy incident with others, or ask a victim of identity theft to
speak about their experience. The best example may come from
someone within the privacy office as employees may think they are
immune from the impacts of a privacy breach or falling for a phishing
event that led to compromising their data.
Hold “lunch and learn” sessions. Lunch and learn is a perfect way to
educate employees during their lunch hour. Allow them to bring their
lunch and listen to an expert speaker on a topic of personal interest,
such as how to protect families from identity theft. Assess if the training
qualifies for professional training credit, which will increase the
likelihood of more people attending. In certain matters, training may be
prequalified through the legal office’s training committee, allowing
those with a legal degree to obtain credits. The legal office’s training
committee may also help communicate the training opportunities.
These sessions could be held on one of the dedicated privacy and
cybersecurity days sponsored by the cybersecurity industry. For
example, ask a law enforcement expert to speak during lunch on
worldwide Data Privacy Day, January 28, about data breaches or
identity theft, and make free resources, such as information available
through the U.S. Federal Trade Commission (FTC) or Stay Safe
Online, available to attendees. At the end of the lunch, connect personal
privacy with the responsibilities each employee has to protect the
organization’s data.
Make it fun. Admit it: compliance training is not fun, and those around
you have no idea why you are passionate about your job. However, take
that passion and share it through games, stickers, competitions, and
giveaways. Depending on the company culture, consider scavenger
hunts, puzzles, or games of skill to solve privacy challenges. Where appropriate, create challenges and prizes with other business units charged with data protection to show the larger piece of the puzzle. The IAPP has posters, coasters, and stickers available to
download and print on its website.9 As many privacy budgets are small,
take advantage of free training and awareness resources. In some cases,
the company branding could also be added to personalize messages.
Develop slogans that can be used in presentations to capture the
essence of the message. For example, the word “security” is frequently
used. However, privacy professionals know the human element is the
concern. Consider playing off the word like this: “There can be all the
security in the world, but at the end SECU-R-ITY.” The letters S, E, C, and Y fade away, and employees are told U-R-IT. While employees may not retain most of their training, slogans are easily retained. Exploit this by adding the slogan to prizes and communications to reinforce the message that U-R-IT when it comes to the effectiveness of your training and awareness program.
8.3 Communication
Communication is one of the most effective tools an organization has for
strengthening and sustaining the operational life cycle of its privacy program.
Privacy information is dynamic and constantly changing, so for privacy policies
and procedures to remain effective, organizations must continually communicate
expectations and policy requirements to their representatives, including
contractors and vendors, through training and awareness campaigns.
Improvements to the privacy program will also depend on the organization
providing ongoing communication, guidance, and awareness to its representatives
regarding proper handling and safeguarding of all privacy data. All available means
should be used to take the message to everyone who handles personal information
on behalf of the organization. A good question to ask regularly is: How effectively
are we communicating the expectations of our privacy program to the workforce
—everyone who is using the data? Measure understanding through metrics or
other objective means. This requires use of multiple metrics to assess an overall
trend, which will demonstrate to the privacy office where additional or refined
training is required.
Each organization needs a communications strategy to create awareness of its
privacy program and a specific, targeted training program for all employees. A goal
of this communications strategy is to educate and develop privacy program
advocates for each affected business unit within the organization. One of the best
ways to accomplish this goal is by employing a variety of methods to
communicate the message.
The privacy office is responsible for updating employees’ knowledge when
changes occur. However, employees cannot be expected to be trained on every
aspect of a privacy regulation—just on the guiding principles of compliance and
expected behavioral outcomes. Additionally, training to the details of a regulation
will require more frequent retraining when changes are made. Taking a big-picture
approach for protecting personal data is easier to manage than addressing the
details of what constitutes PII.
Creating a strategic activity plan for the year is a good way to provide regular
updates. Some groups specifically build into their plans a calendar of workforce
communications to ensure ongoing reinforcement throughout the year. For
example, the plan might specify that “every quarter we will produce a targeted
email campaign that will instruct employees on how to do X, Y, and Z. We will
conduct knowledge tests (contests) to assess learning.”
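For teams that want to make such a plan concrete, the calendar can be tracked as simple structured data. The following sketch is purely illustrative; the quarters, channels, topics, and field names are assumptions rather than anything prescribed in this chapter.

```python
from dataclasses import dataclass

@dataclass
class PlannedActivity:
    """One entry in a hypothetical annual privacy communications calendar."""
    quarter: str           # e.g., "Q1"
    channel: str           # e.g., "email campaign", "lunch and learn"
    topic: str             # what employees will be instructed on
    knowledge_check: bool  # whether a quiz or contest will assess learning

# Illustrative annual plan of the kind of commitments described above.
annual_plan = [
    PlannedActivity("Q1", "email campaign", "reporting a suspected privacy incident", True),
    PlannedActivity("Q2", "lunch and learn", "handling customer data requests", False),
    PlannedActivity("Q3", "email campaign", "secure storage of sensitive information", True),
    PlannedActivity("Q4", "intranet article", "year-end privacy refresher", True),
]

# Which quarters include a knowledge test (contest) to assess learning?
quarters_with_tests = [a.quarter for a in annual_plan if a.knowledge_check]
print(quarters_with_tests)  # ['Q1', 'Q3', 'Q4']
```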
8.4 Creating Awareness of the Organization’s
Privacy Program
As discussed in chapter 6, “awareness” means to be vigilant or watchful. From a
privacy perspective, achieving awareness requires communicating the various
components of an organization’s privacy program, thus creating a vigilant or
watchful attitude toward the protection of personal data. The need for the privacy office to constantly put reminders in front of its workforce requires innovative thinking to identify different reminder techniques. Trying new approaches should not be impeded by the fear of failure. Sometimes, the best-planned reminders fail,
but the failures spark new, more effective ideas.
8.4.1 Internally
How does an organization build an awareness program internally? A good place to
start is through interdepartmental cooperation working toward the shared goal of
privacy protection. For example, the marketing department could work with
information security and tie in its campaign with the awareness program. You may
also look at including your organization’s ethics and integrity department, as well
as human resources (HR), in planning effective ways for departments to share
their awareness programs and experiences. Discuss how different groups can work
together to reinforce the privacy message with the workforce, creating an even
greater awareness of your privacy program.
You could also take an interdepartmental approach to assessing the various
privacy awareness programs throughout the organization. This can reveal both
strengths and weaknesses in individual programs, contributing to an overall
strengthening of all internal awareness programs. Proactive engagement with the public relations (PR) or communication team can help drive your communications strategy and plan; these teams can also offer expert guidance and advice on tried-and-tested methods that work well for your organization.
Conferences and seminars can be rich sources of information and expert
suggestions on effective ways to build a privacy program and address privacy
governance. An individual may learn about various approaches from privacy
professionals by attending presentations or panel discussions on these topics. And
often, a person learns about governance structures and approaches to privacy
through presentations on other topics.
Managing security incidents, creating a sustainable training and awareness
program, and designing and implementing programs or presentations on privacy
challenges can educate the workforce on privacy topics and provide insights into
how an organization manages these issues and assigns accountability. Here is
where your cybersecurity and corporate security teams may be able to show the
life cycle of a potential privacy incident, starting with a security event. These departments are often active in getting their message heard and will take most opportunities to inject their messaging.
Information can also be obtained through informal exchanges of ideas. Most
privacy professionals are engaged in some phase of launching a privacy program.
The challenge is that technologies are always changing, new laws are always being
adopted, and processes can always be improved. Consider using the IAPP’s Privacy List, joining a local KnowledgeNet chapter or professional association message board, or, where feasible and lawful, gaining insights or assistance through your industry consortium.
8.4.2 Externally
Creating external awareness of a privacy program requires different resources and
methods than building internal awareness. External awareness is more directed
toward building confidence through brand marketing and is dependent on the
industry. For example, in the advertising technology business, there is a greater need to promote privacy to earn consumers’ trust that their information will be used consistently with the stated business purpose. Other industries, where the business model is not solely reliant on profile building or data sharing, may not make privacy a centerpiece of their consumer communication; such organizations may instead offer general assurances, such as, “We respect your personal information, and we take steps to make sure that your information is
secure with us.” Increasing external awareness will also require obtaining partner
agreements and, in certain cases, providing training or obtaining attestations of
compliance. The challenge is to meet the reasonable expectations of consumers
and regulators and provide proof of compliance if challenged; otherwise, state agencies or the FTC may seek civil penalties against your company for misleading its consumers.
External awareness is aimed at building consumer confidence in a brand by
creating recognition of a corporation’s commitment to security or fulfilling a legal
requirement. Organizations must have integrity when it comes to handling
personal information if customers are to remain loyal.
An example of creating external awareness is found in the growing cloud
computing industry. Many corporations are now exclusively, or at least heavily,
involved in providing infrastructure, platform, and software services for
individuals and businesses. The marketing of cloud services is built on the
consumers’ perception of the ability of the host organization to protect their
personal information.
Much of this information is personal information that other organizations are
transferring to an external site for storage. The most successful cloud-hosting
organizations are those that inspire confidence in their ability to provide security
for the personal data consumers entrust to them.
8.5 Awareness: Operational Actions
The privacy team, along with all relevant departments, can take the following
operational actions to ensure ongoing awareness. Many times, this can be
accomplished through creating alliances with other business units that are also in
charge of protecting the company, such as cybersecurity or physical security.
Methods to assist with developing and executing an effective training and
awareness plan include:
Develop and use internal and external communication plans to ingrain operational accountability. This means that, when teaming up with other functions within the same organization, there is a designated individual responsible for executing training opportunities. Many times, this responsibility can be rotated.
Communicate information about the organization’s privacy program and other similar programs so there is a holistic view of data protection. Consider leveraging information governance to assist with where and how to store sensitive information.
Ensure policy flexibility for incorporating changes to compliance
requirements from laws, regulations, and standards. Remember, these
are living documents and require constant maintenance and
communication. Keeping these documents current is important for
effectiveness.
Identify, catalog, and maintain all documentation updates as privacy requirements change. This will demonstrate your program’s activities to company leadership and regulators. Periodically, this may mean engaging other parts of the company to ensure they understand what changes they need to make to their program controls. As stated in the first bullet point, accountability is key.
No privacy program can be an island. Privacy programs are most effective when
they are integrated into the larger data protection program. The more effective a
privacy program’s integration is, the more valuable it becomes as it plays a part in
the company’s data protection strategy, and similar business units can assist with
training, awareness, and risk mitigation.
8.6 Identifying Audiences for Training
Staff, managers, contractors, and other third parties may need privacy training.
The key is to identify who has access to personal information and provide targeted
training to those people. Targeted training implies there may be a variety of
training programs, depending on the department, the type of information that is
being handled, how that information is processed, and who is responsible for the
content and completion. While it is common for companies to have baseline training, specific examples and scenarios should be generated depending on the trainee’s role. This can be done in a couple of ways. First, when building the course, ask employees to select their role; the selection can be as broad as office, warehouse, field, or contractor, or can be made by business function. Another option is to split the training into more specific audiences, such as office-based staff versus shop staff, so that the different risks and incidents each group encounters can be tailored into the training. The more detailed the selection, the more information and cost may be associated with delivering and maintaining the training. However, the reward may outweigh the cost, especially if the company’s risk has changed due to new systems, new regulations, or data breaches. Practical
consideration should also be given to employees whose first language might not
necessarily be English. For example, in Belgium, it is a legal requirement for
certain information to be translated into French and Dutch.
The same concept should be applied when conducting in-person or live web
training. The most effective trainings are those where the privacy office spends
time with the business unit’s leader to learn about risk areas and modifies the
training or presentation accordingly. Create a training needs analysis to identify
the training requirements so that training is specific for the responsibilities of each
role.
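One lightweight way to capture a training needs analysis is a role-to-module mapping that layers targeted content on top of baseline training. The sketch below is a hypothetical illustration; the role names and module titles are assumptions, not a required taxonomy.

```python
# Hypothetical training needs analysis: every role gets the baseline course,
# plus modules targeted at the risks of that role.
BASELINE = ["Privacy fundamentals"]

ROLE_MODULES = {
    "office": ["Handling customer data requests", "Email and phishing awareness"],
    "warehouse": ["Physical records handling"],
    "field": ["Mobile device and laptop safeguards"],
    "contractor": ["Third-party data handling obligations"],
}

def training_plan(role: str) -> list[str]:
    """Return the courses assigned to a role (baseline plus targeted modules)."""
    return BASELINE + ROLE_MODULES.get(role, [])

print(training_plan("field"))
# ['Privacy fundamentals', 'Mobile device and laptop safeguards']
```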
Tip: The most effective privacy program managers are those who spend less time
at their desk telling others what they should be doing and more time identifying
and talking with key stakeholders to build trust, understand business processes,
and be aware of key challenges. A business owner who trusts the privacy office
will be more willing to ask for assistance and become a program champion.
8.7 Training and Awareness Strategies
Training and awareness must have the intention of changing bad behaviors and
reinforcing good ones that are integral to the success of the privacy program.
Many organizations have a learning and development group managing activities
related to employee training. This function enables policies and procedures to be
translated into teachable content and can help contextualize privacy principles
into tangible operations and processes. In smaller companies, these
responsibilities may fall on the privacy function. Whatever the size of the
organization, the privacy team will always need to approve the training materials
that have been produced.
There is a constant battle for training and awareness time, which requires the
privacy office to choose the correct training or awareness modality and
information. Generally, this is done through collaboration with the company’s
training organization, legal, and other similar business units. Training and awareness cannot be a one-and-done activity; ongoing reminders are required, such as:
Assessing the organization’s education and awareness initiatives,
identifying gaps in its current program, and implementing a plan to fill
in areas of opportunity
Sustaining communication via awareness and targeted employee,
management, and contractor training, through a combination of
activities (formal training, speaking at department or company events,
creating material for the company’s intranet)
Partnering with HR or training functions or an organizational change
management expert to assess the best method to convey a message. At
times, privacy offices are too involved in the daily activities to assess
how to clearly articulate their message. Training and communication
experts can assist with a complex message and distill it in a way that
resonates with the layperson and assures terminology is consistent with
other programs.
Using badges and slogans is an excellent way to promote a program.
Employees enjoy receiving items that can be displayed in a physical
office or used as a virtual background. These are generally inexpensive,
highly creative, and generate conversation.
Repeating training over a predetermined period (e.g., quarterly) instead
of delivering training once. Companies have found that shorter and
repeated training, building on a prior message, can reinforce ideas.
Using microlearning or blended learning, such as three- to five-minute
videos or having learners complete one portion of training online and
use that material for in-class exercises
Inserting privacy messaging into other department trainings
Going to different company offices and holding informal meetings
regarding how to learn to protect company or personal information.
Employees are curious regarding best practices for how to protect their
and their family’s information, and those messages can be used to
promote personal information protection in the workplace.
Tracking participation and comprehension, which has become an
emerging issue with regulators. Regulators expect companies to track
completion and, in some cases, show that the learner has gained knowledge.
Regardless, this information can be used to help the company assess its
training effectiveness and retool messaging as needed.
The communications group can assist by publishing periodic intranet content,
communicating via email, and providing posters and other collateral that reinforce
good privacy practices. However, most organizations now encourage content
owners to manage specific spaces and content on the company intranet. These
good practices must go far beyond the traditional in-person or web-based training
as employees are required to take more regulatory and skills-based training. A one-and-done approach, while it may check a regulatory box, does not further the company’s privacy program or provide value. Therefore, the best approach is multipronged, requiring the privacy office to personally engage stakeholders while leveraging similar programs with a data protection mission.
8.8 Training and Awareness Methods
Companies must think of innovative ways to communicate training and awareness
opportunities to their employees. Leveraging international Data Privacy Day activities is an excellent starting point, as many ideas have already been created, and several nonprofit organizations allow their activities to be used for free. Methods may differ based on the company’s culture and budget, and some are low-cost. It is not uncommon for companies to use several different ways of delivering messaging. These messages can be tailored to office and remote workers, such as by doing the following:
Formal education utilizing a classroom environment to deliver official
training that may be recorded and documented. Training activities for
each modality will need to be modified to assure all trainees are
appropriately engaged while receiving similar information to facilitate
discussions. Training may also be delivered just-in-time; for instance, an
organization might provide a brief training session when an individual
has been authorized to perform a new task. Instructors can use out-of-
the-box modules.
Different training methods include instructor/classroom,
online, hybrid, and gamification
Awareness includes email, posters, blogs, internal social
media, games, and expos
E-learning includes online, live learning, in-person training, or
podcasts. E-learning can be self-paced or live with the active support of
an instructor. Simulations can be used.
Lunch (brown bag) and department team meetings offer
opportunities to provide training pamphlets or other material for
individuals to pick up at booths or to receive at staff meetings. The most
effective way to get employees to give up their personal time is to
engage them on how to protect their personal information at home.
Employees often follow news of large data or security breaches, and holding a discussion on those events and how they could happen within their own company will generate interest. Employees are
naturally curious.
Newsletters, emails, and posters have been used for many years.
Delivery methods include email, websites, print materials, and physical
displays among other options. Stickers are also a unique way to deliver
messages.
Handouts containing frequently asked questions and tip sheets are
helpful for answering common questions or dispelling myths. These
handouts should be in physical and electronic form for ease of reference
and be adapted to different working modalities.
Slogans and comics can be used to summarize important aspects of
the program and promoted as giveaway items.
Internal social media page dedicated to sharing privacy news articles or current events and briefly explaining what the reader can do to protect
themselves. More than likely, at least one reader or their family has been
impacted. This is also a non-threatening way for employees to learn
about privacy without the company recording training or test
completion.
Webpages can be used to communicate data, knowledge bases, and
frequently asked questions.
Hallway or lobby monitors to broadcast simple but effective
messages or upcoming training. For those working remotely, the same
message may appear on the company or business unit’s intranet
homepage. The advantage of placing information on the intranet is that the same graphic used for monitors can be placed online, and users can click on the image to get more information.
Use contests to engage employees in playing a game while learning
privacy topics. Companies can either build an online game or use prebuilt offerings from vendors, such as knowledge challenges, puzzle
completion, or privacy escape rooms. Prizes can be as simple as
bragging rights, choice of a sticker, or a privacy cover for their computer
or phone webcam.
Communication should be consistent at all levels of the organization and among
all stakeholders to ensure they understand the framework and how it affects and
improves the organization; however, creativity is key to constant engagement.
Remember, employees are working in two different worlds—the traditional office
and at home.
8.9 Using Metrics
Privacy programs are generally not seen as revenue-generating; however, they can
reduce risk and provide for innovative and ethical data usage. For privacy
programs to show how they support the company’s mission and prove to
regulators they are actively addressing compliance risks, they must keep records
regarding training and awareness programs, including any remediation taken after
an event. Metrics must go beyond the numbers enrolled in a training or
communication event and tell a story regarding process improvement. They
should be linked to other program metrics, such as reduction in the number of
privacy events. It may take time to measure the impact of training and awareness
activities.
While there are many metrics to consider when telling the program’s “story,” it is
important to work with company leaders to determine what data points are most
important. Additionally, an effective set of metrics can assist with regulatory
inquiries on program effectiveness, as well as be used to spot program strengths
and weaknesses. Some examples of how to measure program effectiveness
include:
Number of training or awareness opportunities by topic
Number of individuals who enrolled or received awareness
communication
Training method (e.g., live, online, poster, road shows)
Percent of training completed
Results of quizzes or knowledge tests
Changes to the number of privacy incident reports or requests for
consultation or additional training
Table 8-1: Metric Template Example: Awareness and Training Measure10
Measure ID: Security training measure 1 (or a unique identifier to be filled out by the organization)
Goal:
Strategic goal: ensure a high-quality workforce supported by modern and secure infrastructure and operational capabilities
Privacy goal: ensure that the organization’s personnel are adequately trained to carry out their assigned information-security-related duties and responsibilities
Measure: Percentage of information system security personnel who have received security training (see NIST SP 800-53 “Controls: AT-3: Security Training” for definitions)
Measure type: Implementation
Formula: (Number of personnel who completed security training within the past year / total number of information system security personnel) × 100
Target: High percentage defined by the organization
Implementation evidence:
Training records maintained
Percentage of those with significant privacy responsibilities who received the required training
Frequency:
Collection frequency: organization-defined (e.g., quarterly)
Reporting frequency: organization-defined (e.g., annually, monthly, weekly)
Responsible parties:
Information owner: organization-defined (e.g., training manager)
Information collector: organization-defined (e.g., information system security officer [ISSO], training manager, privacy officer)
Information customer: chief information officer (CIO), ISSO, chief information security officer (CISO)
Data source: Training and awareness-tracking records
Reporting format: Pie chart illustrating the percentage of personnel who received training versus those who have not received training; if performance is below target, pie chart illustrating causes of performance falling short of targets
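The formula in Table 8-1 is simple to automate once training completion dates are recorded. The sketch below shows one possible calculation; the record format and sample data are assumptions, and only the formula itself comes from the table.

```python
from datetime import date, timedelta

# Hypothetical training records: person -> date security training was last completed.
records = {
    "ana": date(2022, 3, 1),
    "ben": date(2020, 7, 15),   # stale: completed more than a year ago
    "chloe": date(2022, 1, 20),
    "dev": None,                # never trained
}

def training_completion_rate(records, as_of=None):
    """Table 8-1 formula: personnel trained within the past year / total personnel * 100."""
    as_of = as_of or date.today()
    cutoff = as_of - timedelta(days=365)
    completed = sum(1 for d in records.values() if d is not None and d >= cutoff)
    return completed / len(records) * 100

rate = training_completion_rate(records, as_of=date(2022, 6, 1))
print(f"{rate:.0f}% trained in the past year")  # 50% trained in the past year
```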
Training programs dealing with privacy policies should be based on clear
policies and standards and have ongoing mechanisms and processes to educate
and guide employees in implementation. Everyone who handles personal
information needs to be trained in privacy policies and how to deploy them
within their area to ensure compliance with all policy requirements. This applies
to employees, management, contractors, and other entities with which the
organization might share personal information.
8.10 Summary
As companies continue to closely monitor training seat time, the privacy
compliance program should seek out innovative ways to ensure its message
continues to be heard. This means the program must build alliances with other
similar organizations, such as cybersecurity and physical security, to ensure a
consistent message is carried through all applicable training. Where possible, the
topic of privacy should become a core topic within the company, ensuring its
importance is emphasized in the code of conduct. Awareness is an ongoing
journey, during which the privacy program can leverage company technology to
build a privacy coalition and facilitate friendly competitions but, more
importantly, make protecting information personal through practical application.
An effective training and awareness program makes a complex topic
comprehensible and enables people to integrate key aspects of it effortlessly into
their daily routines.
Endnotes
1 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2020, September 2020,
https://www.capita.com/sites/g/files/nginej291/files/2020-08/Ponemon-Global-Cost-of-Data-Breach-
Study-2020.pdf.
2 Verizon, 2020 Data Breach Investigations Report, May 2020,
https://enterprise.verizon.com/resources/reports/2020-data-breach-investigations-report.pdf.
3 “Hackers Try to Contaminate Florida Town’s Water Supply Through Computer Breach,” Reuters, February
8, 2021, https://www.reuters.com/article/us-usa-cyber-florida-idUSKBN2A82FV.
4 “Adult Streaming Site Leaks Info on Millions of Users,” TechRadar, May 10, 2020, https://www
.techradar.com/in/news/this-popular-adult-streaming-site-accidentally-outed-millions-of-users.
5 NIST SP 800-50, “Building an Information Technology Security Awareness and Training Program,”
accessed March 2021, https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-50
.pdf.
6 NIST SP 800-50.
7 NIST SP 800-50.
8 “Developing a Privacy Compliance Program,” Thomson Reuters Practical Law, accessed November 2018,
https://content.next.westlaw.com/5-617-5067?transitionType=Default&contextData=(sc
.Default)&__lrTS=20180705135935578&firstPage=true&bhcp=1.
9 “Prudence the Privacy Pro,” IAPP, accessed November 2018,
https://iapp.org/resources/topics/prudence-the-privacy-pro-3/#posters-and-coasters.
10 NIST SP 800-55, Revision 1, “Performance Measurement Guide for Information Security,” 32–33,
accessed November 2018, http://csrc.nist.gov/publications/nistpubs/800-55-Rev1/SP800-55-rev1
.pdf.
CHAPTER 9
Privacy Operational Life Cycle: Respond:
Data Subject Rights
Amanda Witt, CIPP/E, CIPP/US;
Jon Neiditz, CIPP/E, CIPP/US, CIPM;
John Brigagliano
Across the globe, privacy and data protection laws are being enacted with a
primary goal of strengthening both the rights of data subjects and their control
over the processing of their personal information. Data subjects are identified or
identifiable individuals (natural persons) whose personal information is being
processed by an organization, such as a patient at a medical facility, employee of a
company, candidate applying for a job, customer of a retail store, or visitor to a
website. Data subject rights vary across jurisdictions and include the right to know how personal information will be used and the right to opt out of certain processing activities. Data subject requests (DSRs) may come directly from the data subject or their agent; in addition, under some privacy laws and based on required contractual arrangements, organizations may need to assist business customers with fulfilling DSRs.
Preparing to respond and efficiently responding to DSRs is a critical component
of any privacy program. Some laws require a response without undue delay or
within a certain set period. Organizations also run the risk that a poorly handled
response to a DSR could result in a data subject making a complaint to a regulator
or supervisory authority or going public with a complaint against the
organization. Thus, a timely and effective response to DSRs is of paramount
importance to excellent customer service and brand protection. The practices
outlined in this chapter can help your organization communicate effectively with
data subjects and develop an effective subject request process.
9.1 Privacy Notices and Policies
Transparency is critically important under most privacy laws, so it is crucial to
provide data subjects with information about privacy practices when or before
personal information is collected. A privacy notice is an external statement
directed to data subjects, which might include current or potential customers,
users, or employees. Provided when an organization collects information from a
data subject, it is a tool used to describe the organization’s privacy practices. On
the other hand, privacy policies are typically internal documents directed to
employees or contractors that describe how the organization will process their
personal information. Both privacy notices and policies describe how personal
information will be collected, used, shared, and stored.
A privacy notice can help an organization comply with applicable laws, but it
does not provide blanket protection from privacy-related litigation. A privacy
notice should be considered a promise the organization makes to data subjects. If
the organization breaks those promises or fails to adequately describe its personal
data processing, the organization may face regulatory action or litigation.
Section 5(a) of the Federal Trade Commission Act prohibits unfair and
deceptive trade practices and allows the U.S. Federal Trade Commission (FTC) to
investigate and bring enforcement actions against companies engaging in unfair
and deceptive trade practices.1 Similarly, most states have consumer protection
laws that provide state attorneys general with the authority to address unfair and
deceptive business practices.2 In the United States, the FTC and attorneys general
have routinely used this power to investigate whether companies act contrary to
the statements made in their privacy notices. Citing breaches of company privacy
notices, the FTC has initiated enforcement actions alleging deceptive acts and
practices against companies, such as Google and its subsidiaries, YouTube,
Facebook, and Snapchat.3 In connection with the Snapchat settlement in 2014,
Edith Ramirez, then chairwoman of the FTC, noted the following: “If a company
markets privacy and security as key selling points in pitching its service to
consumers, it is critical that it keep those promises. Any company that makes
misrepresentations to consumers about its privacy and security practices risks
FTC action.”4 In 2019, the FTC fined Facebook $5 billion for, among other
misrepresentations and the violation of a 2012 FTC order, a misrepresentation in
Facebook’s privacy notice as to whether facial recognition technology was applied
to a user’s account on an opt-in or opt-out basis.
9.1.1 Privacy Notice: Common Elements
Your organization’s privacy notice may provide—depending upon the laws to
which your organization is subject—the following information:
Who your organization is and your organization’s contact information
(usually for the privacy office or appointed data protection officer(s)
[DPO])
What information is collected, directly or indirectly
How your organization will use the information
With whom your organization will share the information, including
whether your organization sells personal information
An overview of applicable data subject rights, the process for exercising
those rights, and that those rights may be exercised through an agent
How information is protected and processed securely
Under what circumstances your organization acts as a processor for
other organizations
How the behavior of website users is monitored
9.1.2 Privacy Notice: Design Challenges and Solutions
Privacy notices should be living documents, maintained in a life cycle that
includes designing and developing, testing, releasing, and reviewing and updating
where necessary. It is good practice to align your internal processes when
considering updates to these notices, for example, identifying new processing
activities as part of a data inventory exercise (data mapping), developing the
records of processing activities (Article 30 of the EU General Data Protection
Regulation [GDPR]), or carrying out data protection impact assessments
(DPIAs). When designing a privacy notice, consider the intended audience and
how the users will view it (e.g., on a computer screen, mobile device, home
appliance). It is important to not only consider the primary users (e.g., the car
owner, homeowner), but also any other individuals whose information could be
collected (e.g., car passengers, other home residents, visitors).5 The goal of the
privacy notice should be to “help the recipient make informed privacy decisions”6
and facilitate the exercise of rights.
Several design strategies can help keep privacy notices accessible to your
customers or external stakeholders. There are several approaches to providing
privacy notices. A layered approach to privacy notices has been endorsed both by
the FTC and the European Union’s Article 29 Working Party (WP29), which is
now the European Data Protection Board (EDPB).7 A layered approach provides
a high-level summary of the various sections of the privacy notice and allows the
users to read more about that section by clicking a link to that section or scrolling
below. The EDPB states in guidance that “layered and granular information can be
an appropriate way to deal with the two-fold obligation of being precise and
complete on the one hand and understandable on the other hand.”8 To provide
sufficient transparency explaining an organization’s use of cookies and other
online-tracking tools and comply with similar legal requirements, the organization
may devote a section of the privacy notice to cookies. As a more comprehensive
alternative for ensuring transparency, however, organizations subject to EU
and/or UK privacy laws often display a cookie notice as a separate document from
the organization’s privacy notice.9
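A layered notice is essentially a set of short summaries, each pointing to the corresponding full section. The sketch below is a hypothetical model of that structure; the section titles, summaries, and links are illustrative assumptions, not a prescribed format.

```python
# Hypothetical layered privacy notice: a short summary per section, each pointing
# to the full text so the reader drills down only where they want detail.
layered_notice = [
    {"section": "Who we are", "summary": "Contact details for our privacy office and DPO.",
     "link": "/privacy#who-we-are"},
    {"section": "What we collect", "summary": "Account details, usage data, and device identifiers.",
     "link": "/privacy#what-we-collect"},
    {"section": "Your rights", "summary": "How to access, correct, delete, or opt out, directly or via an agent.",
     "link": "/privacy#your-rights"},
    {"section": "Cookies", "summary": "How we use cookies and similar tracking tools.",
     "link": "/cookie-notice"},  # often presented as a separate document, as noted above
]

# Render the top layer shown to the user; each entry expands via its link.
for entry in layered_notice:
    print(f"{entry['section']}: {entry['summary']} (more: {entry['link']})")
```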
The California Consumer Privacy Act (CCPA) also requires a layered approach
to delivering notices. The law requires, as applicable, a notice at the point of collecting information, a privacy notice, a notice of the right to opt out of sales of personal information, and a notice of financial incentives offered for the sale of personal information.10 For online information collection, however, a business
may satisfy the notice at collection by providing a link to the appropriate section
of a business’s privacy policy.11
Another way to provide data subjects with notices about the information you
plan to collect from them is with the just-in-time notice. It is a type of layered
approach provided immediately before the data is collected—for instance, when a
mobile application asks to track your location. More information is typically
available by clicking a link or hovering over the notice. The WP29 recommends
using this type of notice when “providing information at various points
throughout the process of data collection; it helps to spread the provision of
information into easily digestible chunks and reduces the reliance on a single
privacy statement/notice containing information that is difficult to understand
out of context.”12 The CCPA affirmatively requires a just-in-time notice if a
business “collects personal information from a consumer’s mobile device for a
purpose that the consumer would not reasonably expect.”13
Icons or symbols can also be used to communicate privacy practices to data
subjects. Proposed changes to CCPA regulations from the California attorney
general, for example, would allow a business to display a button for a consumer to
opt out of the business’s sale of personal information.14 As other examples, the
Network Advertising Initiative (NAI) and Digital Advertising Alliance (DAA) use
the AdChoices, WebChoices, and AppChoices icons on websites and mobile
screens to notify visitors that their information may be used for interest-based
advertising and to link them to opt-out choices.15 The use of icons or symbols is
another type of layered approach to presenting privacy notices. Hyperlinks
connected to the icon or hovering over the icon should lead the data subject to
additional information on the meaning of the icon. Alternatively, QR codes can
link mobile devices to privacy notices. These approaches can be useful in
scenarios in which there is limited space to provide a privacy notice, such as in
Internet-of-Things (IoT) devices or mobile screens. User testing is recommended
so you confirm the symbol is meaningful and adequately conveys the required
information.
Lastly, another tool for providing transparent privacy notices and user control is
a privacy dashboard, which offers a summary of privacy-related information and
metrics and is easy to access and navigate. As described by the WP29, a privacy
dashboard, which is built into the existing platform, can be a tool for providing
more personalized privacy notices and choices to data subjects in that “it will
ensure that access and use of it will be intuitive and may help to encourage users to
engage with this information, in the same way that they would with other aspects
of the service.”16
9.1.3 Privacy Notice: Communication Considerations and
Re-Evaluation of the Fair Information Practice Principles
Privacy notices inform individuals of an organization’s privacy practices but do
not solicit or imply consent. If an individual has a choice about the use or
disclosure of their information, consent is that person’s way of giving permission
for the use or disclosure. In the United States, certain consents may be implied,
but collecting a data subject’s express consent for the collection of personal
information is required to establish consent as a basis of processing under the
GDPR. If relying on consent, it is important to keep a legally admissible record that establishes what the individual consented to, the date this occurred (if possible to evidence), and that the individual agreed to the notice.
While consent may be required by law in many instances, it is not always
required and may not be the only reliable basis for processing personal
information. For example, under the GDPR, in addition to consent, lawful bases
for processing personal data include contract, legal obligation, vital interests,
public interest, and legitimate interests.
Increasingly, privacy scholars and practitioners question the effectiveness of privacy notices and the illusion of control they create. Lawmakers in the 1990s
determined that using “notice and choice” was the best way to protect consumer
privacy.17 Presumably consumers could maintain control of their information and
their privacy would be respected if they received notice of a company’s data
practices and decided to continue using that company’s website or services.18
Over time, the length and complexity of privacy notices have increased, and few
consumers actually read them. Given that thousands of companies collect
personal information, expecting a consumer to devote enough time to read the
privacy policy of every such company is unrealistic.19 In a well-known study at
Carnegie Mellon, researchers Lorrie Faith Cranor, CIPT, and Aleecia McDonald
determined that if every internet user were to spend eight hours per day reading
every privacy policy they encountered while using the internet, it would take 76
days to complete the task.20 The privacy notice is generally a way for organizations
to limit their liability by disclosing their practices and providing themselves with
maximum flexibility for use of consumer data. Knowing these limitations and the
GDPR requirements for increased transparency, organizations should explore
ways to provide more meaningful, digestible notices and develop new ways to
protect consumer privacy instead of relying on notice and illusory control.
According to one leading privacy scholar, the notice-and-choice model produces
“incentives for companies to both hide the risks in their data practices through
manipulative design, vague abstractions and complex words as they shift risk by
engineering a system meant to expedite the transfer of rights and relinquishment
of protections.”21 Organizations should consider providing more innovative and
user-friendly tools to consumers, such as the privacy dashboards and other tools
discussed above.
9.2 Choice, Consent, and Opt-Outs
9.2.1 Choice and Consent
If consent is required by law or regulation, there must be a method for obtaining
and recording it. Under the GDPR, electronic consent requires an affirmative act
from the individual establishing a freely given, specific, informed, and
unambiguous indication of the individual’s agreement to the processing.22 A pre-
ticked box is not sufficient to validate consent; according to the WP29, a clear
action, such as swiping a bar on a screen, waving in front of a smart camera, or
turning a smartphone around, may be sufficient. In the context of “cookie
consents” for online tracking, the UK Information Commissioner’s Office (ICO)
directs that any consent mechanism that emphasizes a choice to agree to tracking
“over ‘reject’ or ‘block’” is not compliant.23 U.S. state privacy laws increasingly also prohibit so-called “dark patterns” as a legitimate form of consent, where “dark
patterns” are generally defined as any interface designed to substantially subvert
an end-user’s autonomy.24
Individuals who do not have a choice about the processing of their personal
information should not be led to believe they do. Individuals who do have a
choice must be given the ability to exercise that choice. Further, organizations
should not process personal information in a manner incompatible with the
consent. For example, an application on a social media site that collects personal
information based on consent for one purpose, such as a personality quiz, may not
also use that personal information for a different purpose, such as targeted
advertising. Individuals who provide consent must be able to revoke that decision.
For example, an individual may decide they no longer want the application to
continue processing their personal information, so there should be a mechanism
for them to withdraw consent, and such mechanism should be as easy to
administer as the means through which the consent was originally provided.
In addition to a record of consent, organizations should keep documentation of
the privacy notice provided at the time of consent. Consents should be regularly
reviewed to determine if a refresh is necessary (if, for example, the organization
has made changes to the processing operations or if laws, regulations, or standards
have changed).
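To support the kind of legally admissible record described above, many teams log each consent event with the notice version, timestamp, mechanism, and any later withdrawal. The following sketch is a minimal, hypothetical structure; the field names are assumptions and do not reflect the specific requirements of any law.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One consent event for one data subject and one processing purpose."""
    subject_id: str
    purpose: str            # e.g., "email marketing" (consent is given per activity)
    notice_version: str     # which privacy notice was shown at the time
    given_at: datetime
    mechanism: str          # e.g., "unticked checkbox actively selected"
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Record withdrawal; withdrawing should be as easy as giving consent."""
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

record = ConsentRecord(
    subject_id="user-123",
    purpose="email marketing",
    notice_version="2022-04",
    given_at=datetime.now(timezone.utc),
    mechanism="unticked checkbox actively selected",
)
record.withdraw()
print(record.active)  # False
```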
9.2.2 Opt-In versus Opt-Out
There are two central concepts of choice: data subjects can either give their
consent to processing by opting in, or withhold or revoke such consent by opting
out.
When individuals opt in, they make an active, affirmative indication of choice—
for instance, an individual might check a box to signal their desire to share their
information with third parties. Ideally, this choice should be clear and easy to
execute.
When individuals opt out, they may do so by performing an affirmative action, such as clicking an “unsubscribe” link. It is important to carry out “spot
checks” to ensure this link is operational. Alternatively, opting out could be
inferred by their lack of action. For example, unless an individual checks or
unchecks a box, their information will be shared with third parties.
If an organization will perform different types of processing, the GDPR requires that an individual consent to each activity separately. For example, an
individual might be asked to check “yes” or “no” beside various methods for direct
marketing—for instance, email or phone.
9.3 Obtaining Consents from Children
9.3.1 Compliance
The U.S. Children’s Online Privacy Protection Act (COPPA) and the GDPR set
out specific rules regarding providing privacy notices to children and obtaining
parental consent for processing their personal information.25 Children’s
information may be considered sensitive information, which warrants heightened
protections. The ICO released the code for Age Appropriate Design in 2020 that
implements GDPR principles “in the context of children using digital services.”26
That code establishes rules and flexible standards, e.g., not sharing geolocation
data by default and differentiating an offering based on child-users’ ages. COPPA
and its implementing rule are designed to provide parents with control over the
personal information collected online from children under 13 years of age.27 Prior
to collecting personal information online, including via mobile applications, from
a child under 13, organizations must obtain verifiable parental consent from a
parent or legal guardian of the child.28 The law provides parents with the right of
access, modification, and deletion of their child’s personal information.29 COPPA
also provides parents with an opportunity to prevent and limit further collection
and use of their child’s personal information.30
Prior to selling a child’s personal information, the CCPA requires organizations to obtain parental consent for children under 13 years of age and the affirmative consent of children between 13 and 16 years of age.31
9.3.2 Language and Delivery
Generally, privacy notices directed toward children should be presented in ways
children can understand. For example, the Office of the Privacy Commissioner of
Canada (OPC) states, “For minors able to provide meaningful consent, consent
can only be considered meaningful if organizations have reasonably taken into
account their level of maturity in developing their consent processes and adapted
them accordingly.”32 The WP29 suggests “vocabulary, tone and style of the
language used is appropriate to and resonates with children [...]”33 Furthermore, it
recommends reviewing the “UN Convention on the Rights of the Child in Child
Friendly Language,” which provides useful examples of child-centric language.34
Images or videos may also be an effective method of relaying key privacy
information to children.
9.3.3 Age
Laws and regulations may establish an age threshold for consent. In practice, a
website may ask users to enter their age before accessing content, or a web
application for children may require consent via a parent’s or legal guardian’s email
account before collecting and processing the personal information of a child
under 13 years old in the United States. The age threshold may vary depending on
jurisdiction. For example, the GDPR sets 16 as the age threshold but allows
individual countries to set the age threshold between 13 and 16 years old. Because
each country can set its own age threshold, there is considerable variation by
country with some countries, such as Belgium, Denmark, and Sweden, setting the
age at 13 while others set the age at older than 13 (e.g., the age of consent in
France is 15, while Italy and Spain set the age at 14).35 The UK Age Appropriate
Design Code, also known as the Children’s Code, applies to age ranges of data
subjects up to 17 years old.36 Although Australian law does not specifically
regulate children’s data, the Office of the Australian Information Commissioner
(OAIC) guides that as a general rule, only persons 15 years or older carry
sufficient maturity to provide effective consent as may be required under
Australian privacy law.37
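Because the threshold varies by jurisdiction, a sign-up flow may consult a simple lookup of consent ages. The sketch below is a simplified, hypothetical example built only from the figures cited in this section (a GDPR default of 16; Belgium, Denmark, and Sweden at 13; France at 15; Italy and Spain at 14; and COPPA’s under-13 rule in the United States); a real implementation would need full country coverage and legal review.

```python
# Digital consent ages mentioned in this section; the GDPR allows 13-16, default 16.
CONSENT_AGE = {
    "BE": 13, "DK": 13, "SE": 13,  # Belgium, Denmark, Sweden
    "FR": 15,                      # France
    "IT": 14, "ES": 14,            # Italy, Spain
}
GDPR_DEFAULT_AGE = 16
COPPA_AGE = 13  # United States: verifiable parental consent below this age

def needs_parental_consent(age: int, country: str) -> bool:
    """Return True if parental consent is needed before processing, per the
    simplified thresholds above (US uses COPPA; EU countries use their GDPR age)."""
    threshold = COPPA_AGE if country == "US" else CONSENT_AGE.get(country, GDPR_DEFAULT_AGE)
    return age < threshold

print(needs_parental_consent(14, "FR"))  # True: France sets the age at 15
print(needs_parental_consent(14, "ES"))  # False: Spain sets the age at 14
```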
9.3.4 Purpose of Processing
The purpose of processing may trigger certain rules. For example, organizations
may be required to refrain from tracking children for online behavioral advertising
purposes or selling their personal information. Conversely, COPPA exempts from
certain rules use and disclosure of personal information if the purpose for such
processing is limited to supporting the operator’s internal operations, such as
website maintenance and security.38
9.4 Data Subject Rights in the United States
Many federal and state laws provide data subjects with rights of control over their
personal information. Although these rights are not encompassed in one
comprehensive federal law, federal laws across different sectors, as well as
individual state laws, grant data subjects rights over how their personal
information is processed.
9.4.1 Federal Laws
9.4.1.1 Fair Credit Reporting Act
The Fair Credit Reporting Act (FCRA) grants several important rights to
consumers with respect to how their data is used.39 Under the FCRA, consumers can obtain access to all the information a consumer reporting agency has on file
about them.40 Such information is usually compiled into a credit report and must
be provided to consumers free of charge once a year.41 The FCRA also allows
consumers to correct or delete any incorrect information that may be contained in
their files by notifying the credit-reporting agency.42 If inaccurate information is
discovered in the consumer file, the credit-reporting agency must examine the
disputed information, usually within 30 days of notification.43 Consumers also
have the right to request removal of outdated negative information, such as civil
suits, judgments, and liens from their consumer reports seven years after the
statute of limitations has expired, while bankruptcies may be removed from credit
reports after 10 years.44 Consumers have several notification rights under the
FCRA. For example, consumers have the right to be notified of adverse actions
taken against them based on information contained in their credit reports. If a
financial institution submits or plans to submit negative information to a credit
reporting agency, the financial institution must provide a notice to the consumer
prior to furnishing such negative information.45
The FCRA places obligations upon employers to obtain an applicant or
employee’s written consent prior to conducting a background check.46
Additionally, the FCRA requires employers to inform the applicant or employee
that the information obtained in the background check may be used to make the
decision about their employment.47 This information must be provided in a
standalone written notice separate from an employment application.48
9.4.1.2 Health Insurance Portability and Accountability Act of 1996
The Health Insurance Portability and Accountability Act (HIPAA), along with its
implementing regulations, including the Privacy Rule, regulates the use and
disclosure of protected health information (PHI) and provides individuals with
rights relating to their PHI.49 With certain exceptions, any use or disclosure by or
on behalf of a HIPAA-covered entity other than for “treatment, payment or health
care operations” must be done pursuant to a revocable authorization.50 Under
HIPAA, individuals have the right to obtain a copy of their medical records.51
Usually, a copy of this information must be provided within 30 days.52 An
individual has the right to change any incorrect information and add any
information that may be missing or incomplete. In most cases, these changes must
be implemented within 60 days.53 Similarly, an individual has the right to know
how their information has been shared with others and request restrictions on
information that an individual may not want to be shared.54
9.4.1.3 Do Not Call Registry
The FTC created the National Do Not Call (DNC) Registry as a part of revisions
to its Telemarketing Sales Rule (TSR).55 The Federal Communications
Commission (FCC) also enforces the DNC Registry and adopted its own do-not-
call rule, effectively plugging gaps in the FTC’s jurisdiction and applying the
registry to all telemarketing of goods or services. Consumers may also file a
complaint with the FCC if they believe they have received an illegal call or text.56
Consumers can sign up without charge for the DNC Registry to stop unwanted
commercial solicitation calls.57 The FTC enforces the DNC Registry by taking
legal action against companies and telemarketers who violate the applicable law
and rules.58
9.4.1.4 Controlling the Assault of Non-Solicited Pornography and
Marketing Act of 2003
The FTC allows individuals to forward unwanted or deceptive messages to the
FTC to report and, in effect, reduce the number of spam emails.59 The Controlling
the Assault of Non-Solicited Pornography and Marketing Act (CAN-SPAM)
regulates how organizations subject to the law may send commercial messages and prohibits certain deceptive practices in those messages.60 An individual may also file a complaint with the FTC regarding a
company’s unsolicited emails and/or its refusal to honor a request to unsubscribe
from the mailing list.61
9.4.1.5 Privacy Act of 1974
The Privacy Act of 1974 provides individuals with a right of access to their own
records from each federal agency that maintains a system of records upon receipt
of a written request from an individual.62 The law also permits an individual to
request an amendment of their records and challenge the accuracy of information
that an agency has on file. Information collected for one purpose may not be used
for a different purpose.63 Lastly, individuals may bring civil actions against
agencies for violations of the act.64
9.4.1.6 Freedom of Information Act
Under the Freedom of Information Act (FOIA), federal agencies are required to
disclose any federal agency records or information upon request by the public,
unless the request falls under one of the nine exemptions and three exclusions that
protect national security interests, personal privacy, and law enforcement
interests, for example.65 The nine exemptions to information that may be
requested under FOIA are:
1. Information that is classified to protect national security
2. Information related solely to the internal personnel rules and practices
of an agency
3. Information that is prohibited from disclosure by another federal law
4. Trade secrets or commercial or financial information that is confidential
or privileged
5. Privileged communications within or between agencies
6. Information that, if disclosed, would invade another individual’s
personal privacy
7. Information compiled for various law enforcement purposes
8. Information that concerns the supervision of financial institutions
9. Geological information on wells
Congress has provided special protection for information requests for three
categories of law enforcement and national security records, and these records are
not subject to the requirements of the FOIA. The first exclusion protects the
existence of an ongoing criminal law enforcement investigation when the subject
of the investigation is unaware it is pending and disclosure could reasonably be
expected to interfere with enforcement proceedings. The second exclusion is
limited to criminal law enforcement agencies and protects the existence of
informant records when the informant’s status has not been officially confirmed.
The third exclusion is limited to the Federal Bureau of Investigation (FBI) and
protects the existence of foreign intelligence or counterintelligence, or
international terrorism records when the existence of such records is classified.66
9.4.2 State Laws
Increasingly, states have begun enacting laws that grant data subjects rights over
their information. California has served as the national trendsetter in its
enactment of various laws aimed at providing individuals with rights over how
their information is processed.
9.4.2.1 California Online Privacy Protection Act and Delaware Online
Privacy Protection Act
California was the first state in the nation to require commercial website or online
service operators to conspicuously post a privacy notice on their websites or
online services.67 The impact of the California Online Privacy Protection Act
(CalOPPA) is wide reaching as the law applies to any website or online service
operator in the United States and possibly the world whose website collects
personally identifiable information (PII) from California consumers.68 The law
requires the disclosure of specific information in the privacy notice, such as
categories of PII collected, description of a process by which a website operator
notifies consumers of material changes to the privacy notice, and disclosure of
how an operator honors Do Not Track requests, among others.69
The Delaware Online Privacy and Protection Act (DOPPA) is materially like
CalOPPA; however, there are a few notable differences.70 CalOPPA applies to
“consumers,” while DOPPA applies to “users” and, therefore, has a broader
application. Under CalOPPA, a consumer is defined as “any individual who seeks
or acquires, by purchase or lease, any goods, services, money, or credit for
personal, family, or household purposes.”71 DOPPA defines a user as an
“individual that uses an internet website, online or cloud computing service,
online application or mobile application [...].”72 DOPPA covers a broader range of
entities that could be handling PII, including websites, cloud computing services,
and online and mobile applications, while CalOPPA is limited to commercial
websites and applications.73 Both CalOPPA and DOPPA require operators to disclose in their privacy notices how they respond to Do Not Track requests
regarding the collection of consumers’ and users’ PII.74
The state of Nevada has a similar law, although the disclosures required under
that Nevada law are less comprehensive than those required under CalOPPA and
DOPPA.75 Generally speaking, therefore, compliance with CalOPPA and DOPPA satisfies the disclosure requirements under the Nevada law. One unique requirement, however, is that operators must disclose whether third parties may track
“an individual consumer’s online activities over time and across different Internet
websites or online services” on the operator’s website to satisfy the Nevada law.
9.4.2.2 California Shine the Light Law
Separate but related to CalOPPA is California’s Shine the Light law, which gives
California residents the right to request and be notified about how businesses use
and share their personal information with other businesses for direct marketing
purposes.76 The law also gives consumers a private right of action in the event a
business fails to respond to a consumer’s request.
9.4.2.3 California Online Eraser Law
California’s Online Eraser law, which is designed to protect individuals under the
age of 18, requires operators of websites, online services, and online and mobile
applications to permit minors who are registered users of those services to request
the removal of content the minor posted on the operator’s website or
application.77 However, there are exceptions to the law that limit its effectiveness
for data subjects in practice. A service operator is not required to comply with a
request for removal and deletion if the content about the minor was posted by a
third party rather than by the minor who is the registered user. This exception may
make the law less effective in combating online bullying, for example. Further, if a
minor does not follow the instructions provided on how to request removal of
content, or if the minor received compensation or other consideration for the
content, the service operator is not legally obligated to comply with the request.78
9.4.2.4 California Consumer Privacy Act of 2018
On June 28, 2018, the governor of California signed into law a landmark privacy
bill, the California Consumer Privacy Act of 2018 (CCPA). Lawmakers revised
the CCPA multiple times, and the California attorney general issued extensive
regulations between June 2018 and the law’s January 1, 2020, effective date. The
CCPA impacts businesses around the world, including, according to IAPP
estimates, more than 500,000 U.S. companies, most of which are small- to
medium-sized enterprises.79
The CCPA, as initially adopted, extends existing privacy rights of California
residents in the California constitution, including:
The ability to request a record of:
What types of personal information an organization holds
about the requestor, its sources, and the specific personal
information that has been collected
Information about the use of the individual’s personal
information in terms of both business use and third-party
sharing
A right to erasure—that is, deletion of the personal information (with
exceptions for completion of a transaction, research, free speech, and
some internal analytical use), as well as disclosure of this right to
consumers, although this right is subject to substantial exceptions.
In relation to businesses that sell personal information, the option for
consumers to opt out of having their data sold to third parties. A
business that sells personal information must post a link titled “Do Not
Sell My Personal Information” on the business’s homepage.
9.4.2.5 The California Privacy Rights Act
On November 3, 2020, California voters passed Proposition 24, the California
Privacy Rights Act (CPRA), which functions as an amendment to the CCPA and
becomes effective January 1, 2023.80
The CPRA creates several new data subject rights, in addition to those provided
in the CCPA:
The CPRA, unlike the CCPA, defines “sensitive personal information”
and allows consumers to limit the purposes for which businesses may
use sensitive personal information. Businesses must facilitate that right
by providing a link on the business’s homepage labeled “Limit the Use
of My Sensitive Personal Information.”
The CPRA creates a newly defined consumer right to opt out of sharing
consumers’ personal information, but the CPRA’s definition of
“sharing” is quite limited and includes only disclosures for cross-context
behavioral advertising. The CCPA’s “Do Not Sell My Personal Information” link
will, therefore, be replaced with a link labeled “Do Not Sell or Share My Personal
Information.”
Consumers may ask businesses to correct inaccurate personal information.
9.4.2.6 Virginia’s Consumer Data Protection Act and other recently
enacted state privacy laws
Virginia Democratic Governor Ralph Northam signed Virginia’s Consumer Data
Protection Act (CDPA) into law March 2, 2021. The law takes effect January 1,
2023, matching the CPRA’s effective date.81
The consumer privacy rights in Virginia’s CDPA include to:
Confirm whether a controller processes the consumer’s personal data
Correct inaccurate personal data
Delete personal data provided by or obtained about the consumer
Access personal data in a portable format
Opt out of profiling, targeted advertising, and personal data sales
Virginia’s CDPA also regulates sensitive personal data. Under the CDPA,
controllers must obtain a consumer’s consent before processing sensitive personal
data, defined to include, among other categories, geolocation information,
biometric information, sexual orientation, religious beliefs, and children’s personal
data.
On July 8, 2021, Colorado Democratic Governor Jared Polis signed the
Colorado Privacy Act (CPA) into law. The CPA will take effect July 1, 2023, six
months after Virginia’s CDPA. The CPA has similar data subject rights to the
CCPA and CDPA, which include the right to access, correct, delete, and opt out
of the sale and certain uses of personal data.82 Like the CDPA, under the CPA, the
consumer has the right to opt out of the processing of personal data for targeted
advertising and the sale of personal data. Consistent with the CCPA and CDPA,
controllers have 45 days to respond to consumer requests.83
As of October 1, 2021, Nevadans have the right to opt out of the sale of their
personal information to data brokers or other third parties.84 To receive such opt-
out-of-sale requests, operators subject to that Nevada law must make available an
email address, toll-free telephone number, or web form through which consumers
may make requests.85 Finally, operators must respond to such requests within 60
days of receipt.
9.4.2.7 Biometric Privacy Laws
Illinois, Washington, and Texas have enacted biometric privacy laws that govern
how biometric identifiers or unique identifying characteristics may be used.86
Illinois’ Biometric Information Privacy Act (BIPA) provides the most robust
consumer rights of the biometric laws adopted thus far. BIPA requires that a
private entity notify an individual in writing of its intent to collect biometric
information, inform the individual of the purpose and length of term for which
biometric information is being collected and used, and receive a written release
authorizing the use.87 A private entity must also obtain consent for further
disclosure of biometric identifiers.88 BIPA may carry substantial penalties for
companies that violate the law’s terms, as BIPA grants individuals a private right of
action for violations of the act.89 Statutory damages against an entity that
negligently violates a provision of BIPA are the greater of $1,000 in liquidated
damages or actual damages; for an intentional or reckless violation of BIPA, they
are the greater of $5,000 in liquidated damages or actual damages.90 Even
technical violations of the law may give rise to that liability.91 Successful plaintiffs
may also recover reasonable attorneys’ fees and costs.92
Sections 9.5 through 9.5.11 on data subject rights in Europe are excerpted from
Chapter 9 of European Data Protection: Law and Practice: Data Subjects’
Rights, by Jyn Schultze-Melling.93
9.5 Data Subject Rights in Europe and the United
Kingdom
European data protection law has always provided individuals with a range of
rights enforceable against organizations processing their data.
Compared to the Data Protection Directive (“Directive”), the GDPR is
considerably more complex and far-reaching in this respect as it includes an
extensive set of rights. This is, in part, because bolstering individuals’ rights was
one of the main ambitions of the European Commission (“Commission”) in
proposing the new data protection framework. Data subjects’ rights, set forth in
Articles 12 to 23 of the GDPR, may not only limit an organization’s ability to
lawfully process personal data, but they can also have a significant impact upon an
organization’s core business processes and even its business model.
These rights encompass the following:
Articles 12–14—Right of transparent communication and
information.
Article 15—Right of access. The data subject is entitled to have
information concerning their personal data that is undergoing
processing, as well as a copy of such data.
Article 16—Right to rectification. The data subject has the right to
have inaccuracies related to their personal data corrected.
Article 17—Right to erasure (or the “right to be forgotten”). The
data subject has the right to require the data controller to delete their
personal data if the continued processing of those data is not justified.
Recitals 42 and 65 and Article 7(3)—Right to withdraw consent
(when processing is based on consent). For consent to be valid, the
data subject must be able to withdraw it, and they may do so at any time.
Article 18—Right to restrict processing. The data subject has, in
some situations, the right to limit the processing of their personal data
to some purposes. This means the data controller must refrain from
using such personal data during the period for which the right applies.
This right may be used, for example, when the data subject contests the
lawfulness of the processing or the accuracy of the data, and the data
controller is in the process of verifying the accuracy of the data.
Article 19—Obligation to notify recipients. A controller has an
obligation to communicate any rectification or erasure of personal data
or restriction of processing carried out in accordance with Articles 16,
17(1), and 18 to each recipient to whom the personal data have been
disclosed, unless this proves impossible or involves disproportionate
effort. The controller shall inform the data subject about the recipients
of such personal data if the data subject requests such information.
Article 20—Right to data portability. Under certain conditions, the
data subject may require their data to be provided to themselves or to
another company in a commonly used machine-readable format.
Article 21—Right to object. The data subject has the right to object to
certain processing, upon which the processing shall cease. The data
subject also has the right, on grounds relating to their situation, to
object to processing if the processing is based on legitimate interest or
public interest. When personal data is processed for the purposes of
direct marketing, the data subject always has the right to object to such
processing, including profiling. The data subject must be informed of
this right clearly and separately from other information.
Article 22—Right to not be subject to automated decision-making
(including profiling). The data subject has the right not to be subject
to a decision based solely on automated processing, including profiling,
which produces legal effects concerning them or similarly significantly
affects them.
9.5.1 The Modalities—to Whom, How, and When
Article 12(2) of the GDPR requires organizations to facilitate the exercise of data
subject rights. Whereas the Directive did not explicitly require organizations to
confirm data subjects’ identities, the GDPR now requires the controller to use all
reasonable efforts to verify the identity of data subjects. Consequently, where the
controller has reasonable doubts as to a data subject’s identity, the controller may
request the provision of additional information necessary to confirm it. The
controller, however, is not obliged to collect additional personal data merely to be
able to link the data it already holds to a specific data subject.
Under the GDPR, companies must protect the confidentiality, integrity, and
availability (CIA) of personal data. On the other hand, the method of confirming
identity should be reasonable, not unduly burdensome, and companies should
avoid collecting excessive amounts of personal data for the authentication itself.94
Another operational aspect refers to the time frame to honor data subjects’
requests. Preliminarily, the controller should acknowledge receiving the request
and confirm or clarify what is requested. Article 12(3) sets out the relevant time
windows for responding: one month, starting with receipt of the request, is the
normal time frame, which can be extended by two further months where
necessary, taking into account the complexity and number of the requests. This decision should
naturally sit with an organization’s privacy professionals and, specifically, the
DPO, as applicable. However, as per the ICO guidance, the time frame to respond
starts upon receiving ID or other information to establish the identity of the
individual or that a third party has the authority to act on their behalf, although an
organization must promptly request such verification documents.95 During the
first month, however, the organization has to decide whether it can act on the
data subject’s request at all. If the organization decides not to proceed, it must
inform the data subject of this decision and of their right to lodge a complaint
with regulators. In situations in which there are multiple controllers or processors
who have some or all of the relevant information, it is important to have processes
in place to ensure the requested information can be provided within the
applicable deadlines under the GDPR.
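As a purely illustrative sketch (not drawn from the text above), the following Python snippet shows one way an organization might track the Article 12(3) response window in a ticketing tool, with the clock starting once identity is verified, consistent with the ICO guidance discussed above. The class, the field names, and the use of the third-party python-dateutil library are assumptions for illustration only.

```python
from dataclasses import dataclass
from datetime import date

from dateutil.relativedelta import relativedelta  # third-party library, assumed available


@dataclass
class DataSubjectRequest:
    """Minimal record for tracking the GDPR Article 12(3) response window (illustrative only)."""
    request_id: str
    received: date                          # date the request arrived
    identity_verified: date | None = None   # per ICO guidance, the clock starts once identity is confirmed
    extended: bool = False                  # True if the two-month extension for complex requests is invoked

    def due_date(self) -> date:
        """One month from verification (or receipt), plus two further months if extended."""
        start = self.identity_verified or self.received
        return start + relativedelta(months=1 + (2 if self.extended else 0))


# Example: a complex request received 3 January with identity confirmed 10 January
req = DataSubjectRequest("DSR-001", date(2023, 1, 3), identity_verified=date(2023, 1, 10))
req.extended = True    # the data subject must still be told of the extension within the first month
print(req.due_date())  # 2023-04-10
```

A record like this can feed dashboards or escalation alerts so that requests approaching their deadlines are surfaced to the privacy team in time.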
In terms of form, the GDPR aims to establish and rely on technology-based
processes—electronically received requests should be answered electronically,
unless the data subject requests something else, e.g., in paper form. While this
seems like a straightforward requirement, companies should not underestimate
the potential security implications of honoring rights through technological
means alone. As email encryption is still not a widespread means of
providing secure communication for sensitive information, companies will be
challenged to adopt ways to deliver this information electronically in a safe and
accountable way.
9.5.2 The General Necessity of Transparent Communication
As mentioned earlier in this chapter, transparency is fundamental to any data
protection system, as individuals’ right to privacy cannot be assured if they are not
properly informed about the data controllers’ activities. In essence, the rights
established by the GDPR require that data subjects have all the information they
need to understand the nature of the processing and exercise their further
statutory rights. Consequently, Article 12(1) requires any information
communicated by the organization be provided in a “concise, transparent,
intelligible and easily accessible form, using clear and plain language.”
9.5.3 Right to Information (About Personal Data Collection
and Processing)
Under Article 13 of the GDPR, data subjects have the right to be provided with
certain pieces of information that describe their relationship with the controller.
This includes the controller’s identity and contact details, reasons or purposes for
processing their personal data, legal basis for doing so, recipients of that data,
especially if those reside in third countries, and other relevant information
necessary to ensure the fair and transparent processing of the data. Additionally,
the controller must identify the source of data if collected or obtained from a third
party to effectively enable the data subject to pursue their rights.96
9.5.4 Right of Access
The GDPR’s right of access set out in Article 15 is, in some ways, a counterpart to
the more passive right of information in Articles 13 and 14. Any data subject who
exercises their right to know must be told about the personal data the organization
holds about them and, more specifically, why and how it does so. In comparison
to the Directive, the GDPR expands considerably the mandatory categories of
information that a company must provide.
The GDPR prescribes that the data subject has the right to obtain from the
controller confirmation as to whether personal data that concerns them is being
processed. Where that is the case, in addition to providing access to the personal
data, the data subject is entitled to receive the following information:
The purposes of the processing
The categories of personal data concerned
The recipients or categories of recipients to whom the personal data
have been or will be disclosed, in particular, recipients in third countries
or international organizations
Where possible, the envisaged period for which the personal data will
be stored or, if not possible, the criteria used to determine that period
The existence of the right to request from the controller rectification or
erasure of personal data or restriction of processing of personal data
concerning the data subject or to object to such processing
The right to lodge a complaint with a supervisory authority
Where the personal data is not collected from the data subject, any
available information as to their source
The existence of automated decision-making, including profiling,
referred to in Article 22(1) and (4) and, at least in those cases,
meaningful information about the logic involved, as well as the
significance and the envisaged consequences of such processing for the
data subject
In practice, these types of requests are likely to pose a substantial administrative
burden on organizations so they should consider upfront what types of processes
need to be in place to assist with this task. These processes should include
pragmatic solutions to the following issues:
The processes set up must ensure the organization responds to a DSR
without undue delay and within one month of its receipt. This will
regularly require deployment of a ticketing system or a similar
technological solution. As the extension of the time limit by two more
months is only possible if the request is complex, or if the organization
has received several requests from the same individual, those
circumstances should be considered up front and adequate internal
time limits set.
Also, in case of any reasonable doubts about the identity of the
individual making the request, the processes must be temporarily
paused while the organization approaches the requesting party asking
for more information. However, it is important the organization only
request information that is necessary to confirm who the requesting
party is. The key to this is proportionality.
The organization’s processes must further take into consideration that
there are particularities when the access request concerns a child. The
processes should emphasize the use of especially clear and plain language
when disclosing information to a child and, before responding to a DSR
for information held about a child, the organization must assess whether
the child is mature enough to understand their rights. If the organization
is confident the child can understand their rights, it should respond to
the child directly. There may be situations, however, in which it would
seem reasonable to allow a parent to exercise the child’s rights on their
behalf, or in which it is evident that doing so is in the child’s best
interests.
Generally, when disclosing any information, organizations must
consider if an access request by one individual includes information
about others, in which case their rights must be adequately protected,
too. In some cases, this might require data be redacted before it is
disclosed or, if that proves impossible or apparently unreasonable, the
other individuals’ consents should be required before the access request
is finally granted. Other exemptions can apply to the request. For
example, the ICO also identifies concerns regarding criminal
investigations and legal privilege as potential exceptions to access
requests.97
Regarding access requests by proxies, note that the GDPR does not
prevent an individual from making a DSR via a third party. This third
party might be an attorney, accountant, or anybody else acting on behalf
of a client if the individual feels more comfortable having someone else
act for them. In these cases, it is of vital importance that the
organization’s processes ensure that any personal data is disclosed only if,
and as soon as, it has been sufficiently verified that the third party
making the request is, in fact, entitled to act on behalf of the individual.
While it is the third party’s responsibility to provide evidence of this
entitlement, the organization should clearly document, as a part of the
process, the nature of the request as a proxy request and retain proof of
the entitlement, whether it is a written authority to make the request or
a more general power of attorney. Obviously, as this constitutes a new
data processing, the individual acting on behalf of the data subject
should be adequately informed about this. Failing to carry out the
necessary checks on third parties allegedly acting on behalf of data
subjects can create a data breach. Equally, there should be a balance to
ensure the organization is not asking for more information than needed
to verify the identity of the third party, as this could come across as
obstructing the request and not practicing data minimization.
Finally, if the organization considers a DSR to be manifestly unfounded
or excessive, it can either request a “reasonable fee” to deal with the
request or refuse to deal with the request at all. In either case, it needs to
justify its decision and thoroughly document the facts that lead to the
respective assessment. The ICO has published guidance around this.
9.5.5 Right to Rectification
The scope of this right under the GDPR is largely unchanged from the Directive.
In a nutshell, data subjects have the right to rectification of inaccurate personal
data, and controllers must ensure that inaccurate or incomplete data is erased,
amended, or rectified. This right can generate a considerable amount of effort
operationally.
In WP 251, the WP29 states the right to rectification might apply where, “for
example, an individual is placed into a category that says something about their
ability to perform a task, and that profile is based on incorrect information.”98
Rectifying wrong entries or completing incomplete entries in databases is typically
not an isolated issue for most organizations. Because data is often interconnected
and held in duplicate across systems, any change to one piece of data may have
wider, unforeseen consequences.
Further, the GDPR does not provide a definition of the term “accuracy.”
However, some national privacy laws, such as the UK’s Data Protection
Act 2018, include language that defines personal data as inaccurate if it is incorrect
or misleading as to any matter of fact. It must be noted, though, that opinions are,
by their very nature, subjective. Consequently, it can be difficult to conclude that
the record in question is, in fact, inaccurate. As a best practice, it seems reasonable
to ensure the record shows clearly that the information’s nature is that of an
opinion. Where appropriate from the organization’s point of view, it seems further
reasonable to honor the individual’s explicit request even if the inaccuracy of the
record cannot be completely proven.
The organization’s processes must take into consideration that, under the
GDPR, any individual can make a request for rectification either verbally or in
writing, and further, that the organization must react within one calendar month
of receipt of the request. Only in limited circumstances can a request for
rectification be refused. This right is closely linked to the controller’s obligations
under the accuracy principle of the GDPR (see Article 5(1)(d)). As a matter of
professional courtesy, the organization should, if possible, restrict the processing
of the data in question while it is verifying its accuracy.
If the decision is taken to reject the request, the individual must be informed
without undue delay about the organization’s reasons not to act as desired, their
right to make a complaint to the data protection authority (DPA) in charge and
their ability to seek to enforce this right through a judicial remedy. This
communication should be documented to allow the organization to defend itself
against claims of noncompliance. Obviously, as this requires some further
processing of personal data, the individual should be adequately informed in that
regard, too.
If, however, the decision is taken to follow the individual’s request, and if the
organization had previously disclosed any relevant personal data to third parties, it
has to contact and inform them of the rectification. In case this proves impossible
or involves disproportionate effort, this should be well documented to ensure the
ability to defend itself at any later point in time.
9.5.6 Right to Erasure (“Right to be Forgotten”)
The so-called “right to be forgotten” (RTBF) is probably one of the most actively
scrutinized aspects of the original proposal by the Commission.99
Article 17(1) establishes that data subjects can request at any time, in writing or
verbally, to have their personal data erased if:
The data is no longer needed for its original purpose, and no new lawful
purpose exists
The lawful basis for the processing is the data subject’s consent, the data
subject withdraws that consent, and no other lawful ground exists
The data subject exercises the right to object, and the controller has no
overriding grounds for continuing the processing
The data has been processed unlawfully
Erasure is necessary for compliance with EU law or the national law of
the relevant member state
In addition, Article 17(2) of the GDPR requires that, where the controller has
made any personal data public (e.g., in a telephone directory or in a social
network) and the data subject exercises the right to erasure, the controller must
take reasonable steps, including applying technological solutions but taking costs
into account, to inform third parties that are processing this published personal
data as controllers that the data subject has exercised this right. Given how
prominent the right to be forgotten was during the legislative process, it seems
reasonable to assume that regulators will emphasize the importance of honoring
this right in full.
Further, it must be noted that there is a strong emphasis on the right to have
children’s personal data erased on request, which clearly reflects the goal of
enhanced protection of children’s information, especially in online environments,
under the GDPR. Consequently, organizations should give weight to any request
for erasure if the processing of the data is based upon consent given by a child,
even if the individual requesting this is no longer a child, because previously they
may not have been fully aware of the risks involved in the processing at the time of
consent. This would have to be adequately considered by the organization’s
processes.
Exemptions to the right of erasure are listed in Article 17(3), which allows
organizations to decline data subjects’ requests to the extent that processing is
necessary for:
Exercising the right of freedom of expression and information
Compliance with a legal obligation that requires processing under
European Union or member state law to which the controller is subject,
or the performance of a task carried out in the public interest, such as
public health, archiving, scientific or historical research, or statistical
purposes
The establishment of, exercise of, or defense against legal claims
To ensure alignment to other processes and exercises, it is good practice for
organizations to review and implement a process that considers any policies and
procedures around data retention and destruction. Organizations that have
implemented strong procedures around this may find that actively deleting data is
no longer required upon these requests as data will have already been deleted in
line with prescribed time frames. However, acknowledgement of and
communication with data subjects are still required even if the applicable personal
data has already been deleted.
The GDPR also entitles data subjects to request information about the identities
of recipients to whom the personal data has been disclosed. Consequently, Article
19 requires that where a controller has disclosed personal data to third parties, and
the data subject has subsequently exercised their right of rectification, erasure, or
blocking, the controller must notify those third parties of the data subject’s
exercise of those rights. The controller is exempt from this obligation only if it is
impossible to comply with it or where compliance would require disproportionate
effort, which must be proven by the controller. As Recital 66 mentions, this
extension of the right of erasure is meant to strengthen the RTBF specifically in
the online environment, where personal data is notoriously difficult to control
once it has been shared and distributed—so online service providers will
probably find dealing with this obligation especially difficult.
In terms of operational impact, this means that organizations, in addition to
implementing systems and procedures for giving effect to the new rights that the
GDPR grants to data subjects, are also required to implement systems and
procedures for reliably notifying affected third parties about the exercise of those
rights. For organizations that disclose personal data to many third parties, these
provisions may be particularly burdensome.
Finally, one of the most relevant issues with the right to erasure is the question of
whether this includes backup systems. Generally, if an organization receives a
valid erasure request, and if no exemption applies, it will have to take adequate
steps as described above to ensure erasure from all its systems—not just from
productive live systems, but also from backup archives. How the organization
honors this kind of request regarding its archives depends on the circumstances.
Although data might not be immediately deletable from backup tapes or other
long-term data storage systems, the main goal must be to find a way to ensure the
backup data cannot easily or accidentally be restored and, thus, further used after
the deletion request was received. The respective backup dataset must, therefore,
be marked, and the organization’s processes must be set up in a way to ensure that
such marked backups cannot be restored without a thorough check for
compliance with this data subject right.
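The marking of backup datasets described above could be implemented in many ways; the following Python sketch is a hypothetical illustration, assuming a simple in-house catalog of backup sets rather than any particular backup product.

```python
from dataclasses import dataclass, field


@dataclass
class BackupSet:
    """Catalog entry for a backup archive that may contain erased data (illustrative only)."""
    backup_id: str
    erasure_holds: set[str] = field(default_factory=set)  # subject IDs whose data must not be restored

    def mark_erasure(self, subject_id: str) -> None:
        """Record that an erasure request covers data held in this backup."""
        self.erasure_holds.add(subject_id)

    def restore(self, compliance_approved: bool = False) -> str:
        """Refuse restoration of a marked backup unless a compliance check has approved it."""
        if self.erasure_holds and not compliance_approved:
            raise PermissionError(
                f"Backup {self.backup_id} carries erasure holds for "
                f"{len(self.erasure_holds)} data subject(s); compliance review required before restore."
            )
        return f"Restoring backup {self.backup_id}"


backup = BackupSet("tape-2022-07")
backup.mark_erasure("subject-42")                # erasure request received after the backup was written
print(backup.restore(compliance_approved=True))  # restore proceeds only after the compliance check
```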
9.5.7 Right to Restriction of Processing
Article 18 of the GDPR establishes a right similar to the prior right under the
Directive that allowed the “blocking” of data. Data subjects have the right to
restrict the processing of their personal data if:
The accuracy of the data is contested (and only for as long as it takes to
verify that accuracy)
The processing is unlawful, and the data subject requests restriction, as
opposed to exercising the right to erasure
The controller no longer needs the data for their original purpose, but
the data is still required by the data subject to establish, exercise, or
defend legal rights
Verification of overriding grounds is pending in the context of an
erasure request
According to WP 251, the right to restrict processing as set forth under Article
18 will apply to any stage of processing.100 Therefore, from an operational point of
view, organizations face a broad range of circumstances under which data subjects
can require that the processing of their personal data be restricted under the GDPR.
How this will be done technologically remains to be seen. Recital 67 at least
provides some guidance by suggesting that compliance with this right could be
achieved by “temporarily moving the selected data to another processing system,
making the selected personal data unavailable to users, or temporarily removing it
from a website.” The goal must be to ensure the data in question is not
processed in any way except for storage, unless the individual has provided their
consent to a particular form of processing or unless it is for the establishment,
exercise, or defense of legal claims; for the protection of the rights of another
person; or for reasons of important public interest. If any of these reasons are
considered relevant, the organization should ensure thorough documentation of
its motivation and the underlying facts to enable itself to defend against any later
charge of noncompliance.
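One simple way to operationalize the storage-only rule sketched in Recital 67 is a guard that blocks any processing of restricted records except for the grounds Article 18(2) permits. The Python snippet below is a minimal, assumed illustration; the purpose labels are hypothetical and merely paraphrase the permitted grounds.

```python
from enum import Enum, auto


class ProcessingStatus(Enum):
    ACTIVE = auto()
    RESTRICTED = auto()  # Article 18: storage only, unless a permitted ground applies


# Grounds on which restricted data may still be processed (paraphrasing Article 18(2); labels are hypothetical)
PERMITTED_WHILE_RESTRICTED = {"storage", "consent", "legal_claims", "protect_others", "important_public_interest"}


def may_process(status: ProcessingStatus, purpose: str) -> bool:
    """Return True if processing for the given purpose may proceed despite the record's status."""
    if status is ProcessingStatus.RESTRICTED and purpose not in PERMITTED_WHILE_RESTRICTED:
        return False
    return True


print(may_process(ProcessingStatus.RESTRICTED, "marketing"))     # False: blocked while the restriction is in force
print(may_process(ProcessingStatus.RESTRICTED, "legal_claims"))  # True: permitted ground under Article 18(2)
```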
In many cases, the restriction of processing is only temporary. Once the
organization has come to a final decision, for example, regarding the accuracy of
the data or whether its legitimate grounds override those of the data subject
requesting the restriction, it can lift the restriction, but it must adequately inform
the individual about this before overriding the request. This information will
likely include the reasons for the organization’s refusal to act upon the data
subject’s rights under Articles 16 or 21, as the right to restriction is linked mainly
to these two provisions. In other words, if an organization is notifying an
individual about the fact that it is lifting the restriction, it will likely have to
explain, at the same time, why it is satisfied in terms of the data’s accuracy or
demonstrate that, in fact, legitimate grounds override the data subject’s rights
under the GDPR.
9.5.8 Right to Data Portability
“Data portability” is an entirely new term in European data protection law.101
Article 20 of the GDPR states data subjects have the right to receive their own
personal data, which they have provided to a controller, in a structured,
commonly used, and machine-readable format. They also have the right to
transmit the data to another controller without hindrance from the controller.
Technically, the controller must either hand the data over to the data subject in a
usable fashion, or, at their request (Article 20(2)), transfer the data directly to the
recipient of the data subject’s choice where technically feasible.
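By way of illustration only, a controller might satisfy the “structured, commonly used and machine-readable” requirement by exporting the data the subject provided as JSON, one of the formats mentioned later in this section. The record fields in this Python sketch are hypothetical.

```python
import json
from datetime import date


def export_portable_data(subject_record: dict) -> str:
    """Serialize the data the subject provided into a structured, machine-readable format (JSON)."""
    return json.dumps(subject_record, default=str, indent=2)


# Hypothetical record limited to data the subject themselves provided to the controller
record = {
    "subject_id": "user-123",
    "email": "jane@example.com",
    "account_created": date(2021, 5, 4),
    "playlists": [{"name": "Favourites", "tracks": ["Track A", "Track B"]}],
}
print(export_portable_data(record))
```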
WP 242 states this new right aims to “empower data subjects regarding their
own personal data, as it facilitates their ability to move, copy or transmit personal
data easily from one IT environment to another (whether to their own systems,
the systems of trusted third parties, or those of new data controllers).”102 It
provides the example of banks providing additional services, under the data
subject’s control, using personal data initially collected as part of an energy supply
service.103 In fact, it seems that the financial and insurance markets already see
their share of requests, based on Article 20, to provide individualized services that
require access to an individual’s financial transactions or specific insurance case
data. For some organizations, this new right to transfer personal data between
controllers creates a significant additional burden, requiring substantial
investment in new systems and processes. On top of that, many issues with this
particular provision still need further guidance by the regulatory community to
allow companies to set up their processes. It remains to be seen what is meant by a
“structured, commonly used and machine-readable” format for modern
information services or how the threshold of “hindrance” and “technical
feasibility” is determined in the context of direct controller-to-controller transfers.
In this context, WP 242 refers to the case in which a data subject is interested in
“retrieving his current playlist (or a history of listened tracks) from a music
streaming service, to find out how many times he listened to specific tracks, or to
check which music he wants to purchase or listen to on another platform.” As
another example, a user might “want to retrieve his contact list from his webmail
application, for example, to build a wedding list, or get information about
purchases using different loyalty cards, or to assess his or her carbon footprint.”104
While these examples show the underlying intention to boost new kinds of data-
enabled services, the WP29’s opinion falls well short in terms of describing its
understanding of the data formats required to enable them. It can be reasonably
assumed, though, that the term “commonly used” excludes any proprietary data
formats and, hence, effectively forces industry players to come up with
common data exchange formats to be employed in cases where current file
formats, like Portable Document Format (PDF), Comma-Separated Values
(CSV), Microsoft Excel (XLS), Extensible Markup Language (XML), JavaScript
Object Notation (JSON), or Rich Text Format (RTF), might not be sufficient,
and even guidance from the Open Knowledge Foundation does not provide
resolution.105
Consequently, Recital 68 explicitly states this provision is intended to encourage
data controllers to develop interoperable formats that enable data portability. WP
242 goes even further by explicitly stating that “in addition to providing consumer
empowerment by preventing ‘lock-in,’ the right to data portability is expected to
foster opportunities for innovation and sharing of personal data between data
controllers in a safe and secure manner, under the data subject’s control.” Clearly,
in the many use cases in which such data portability requests are prone to misuse,
those risks have apparently been considered less important than companies’
ability to attract customers from competitors more easily than they could in
the past, when users may have been reluctant to set up yet another new account.
A real issue in this regard is, for example, the question of how to deal with data to
be ported at the request of one individual that includes personal data about
others. In this case, regulators often argue this is not a real problem, as it was the
requestor themself who provided this data to the organization in the first place
and was, therefore, also responsible for ensuring that any necessary consents by
third parties applied. However, at the end of the day, an organization that provides
data based on a portability request to any other controller effectively puts itself
into legal jeopardy and can easily be found liable for wrongly disclosing personal
data even though it did not have any real chance to verify compliance prior to the
disclosure.
Therefore, organizations should always consider whether there will be an
adverse effect on the rights and freedoms of third parties, in particular, when they
are transmitting personal data directly to another controller. While there may be
legitimate reasons to deny the transmission, for example, if it would adversely
affect the rights and freedoms of others, it must be considered that it is the
organization’s responsibility to explain why these concerns are, in fact, legitimate
and not just a “hindrance” to the transmission. As usual, proper documentation is
key in this regard.
9.5.9 Right to Object
In accordance with Article 21(1), whenever a controller justifies the data
processing based on its legitimate interests, data subjects can object to such
processing. Because the GDPR does not specify any form requirement for a valid
objection, and as objections can be made either verbally or in writing, spotting
them reliably in the organization’s processes turns out to be a practical issue for
many companies. Further, an objection can be made to any part of an
organization; it does not have to be directed toward a specific person or contact
point and need not include the exact words “objection to processing” or a
reference to Article 21 of the GDPR. Consequently, because any employee could
receive a valid verbal objection while the company retains the legal responsibility to
handle it appropriately, organizations need to consider carefully which of their
staff members have regular interaction with individuals and, therefore, may need
specific training to identify an objection. Additionally, it is generally considered a
best practice to draft a specific policy for adequately documenting relevant details
of the objections received, particularly those made by telephone or in person.
Upon receipt of a valid objection, the controller is no longer allowed to process the
data subject’s personal data unless it can demonstrate compelling, legitimate
grounds for the processing. These grounds must be sufficiently compelling to
override the interests, rights, and freedoms of the data subject, such as to establish,
exercise, or defend against legal claims. The GDPR does not explain what would
be considered compelling legitimate grounds. However, WP 217 states that any
legitimate interest must be lawful (i.e., in accordance with applicable EU and
national law), sufficiently clearly articulated to allow the balancing test to be
carried out against the interests and fundamental rights of the data subject (i.e.,
sufficiently specific), and representative of a real and present interest (i.e., not be
speculative).106 This at least provides some guidance on how to assess the issue in
terms of the necessary balancing efforts. In this regard, WP 217 further states that
“if the interest pursued by the controller is not compelling, the interests and rights
of the data subject are more likely to override the legitimate—but less significant
—interests of the controller,” and “at the same time, this does not mean that less
compelling interests of the controller cannot sometimes override the interests and
rights of the data subjects: this typically happens when the impact of the
processing on the data subjects is also less significant.”107 Proper documentation is
key in terms of being able to defend the organization’s interests in case this
particular issue should ever be challenged by a regulator, as, unlike under Directive
95/46/EC, under the GDPR the burden of proof to show compelling legitimate
grounds lies with the controller rather than the data subject.108 Further, in terms of
transparency, WP 251 clearly states that the controller has to “bring details of the
right to object under Article 21(1) and (2) explicitly to the data subject’s
attention, and present it clearly and separately from other information (Article
21(4)).”109
Under Article 14 of the Directive, data subjects already had the right to object to
the processing of personal data for the purpose of direct marketing. Under the
GDPR, this now explicitly includes profiling. In addition, the data subject must be
explicitly, clearly, and separately notified of the right to object—at the latest, at the
time of the first communication.
Under Article 21(6), where personal data is processed for scientific or historical
research purposes or statistical purposes, the right to object exists only insofar as
the processing is not necessary for the performance of a task carried out for
reasons of public interest.
9.5.10 Right to Not be Subject to Automated Decision-Making
Article 22 bears considerable potential for misunderstandings just in its title. The
term “right” does not mean the provision applies only when actively invoked by a
data subject. To the contrary, Article 22(1) establishes a general prohibition for
decision-making based solely on automated processing, and it applies irrespective
of the data subject’s actions.110 The right not to be evaluated based on automated
processing is closely connected with the aforementioned right to object. It is
important to take into consideration, though, that Article 22 has a narrow
application. The right not to be subject to automated decision-making applies
only if such a decision is based solely on automated processing and produces legal
effects concerning the data subject or similarly significantly affects them. Because
of their ambiguity, however, these terms will need further explanation and
probably additional guidance from the regulators. There is no common
understanding of what “solely automated process” means, and there are no
commonly accepted rules as to what kind of decisions have significant effects on
individuals. However, WP 251 expressly states that a controller cannot avoid
the Article 22 provisions by fabricating human involvement: “For example, if
someone routinely applies automatically generated profiles to individuals without
any actual influence on the result, this would still be a decision based solely on
automated processing.”111
Yet, if a decision-making process falls within these parameters, the underlying
processing of personal data is allowed if it is authorized by law, necessary for the
preparation and execution of a contract, or done with the data subject’s explicit
consent, provided the controller has put sufficient safeguards in place. Recital 71
explicitly states that such processing should be “subject to suitable safeguards,
which should include specific information to the data subject and the right to
obtain human intervention, to express his or her point of view, to obtain an
explanation of the decision reached after such assessment and to challenge the
decision.”112
9.5.11 Restrictions of Data Subjects’ Rights
Despite the GDPR’s prescriptive nature, controllers must also be prepared to
comply with additional EU and member state law that could further impose
obligations and provide rights, in addition to those provided for in Articles 12 to
22. Individual EU countries have the right to provide additional guidelines on the
principles of Article 5, insofar as its provisions correspond to the rights and
obligations provided for in Articles 12 to 22. In particular, while they must still
respect data subjects’ fundamental rights and freedoms, member states may enact
additional restrictions on data subject rights under the GDPR that are deemed
necessary to safeguard interests of national security, defense, or public security.
To comply with the GDPR’s requirements relating to data subject rights,
controllers should revise their privacy notices and internal privacy policies. If the
controller does not already have such policies or procedures, the controller will
also need to prepare policies and procedures for data subject access requests (DSARs) and privacy by
design (PbD). These policies will need to be implemented at every business level,
and training will need to be provided to the employees in such business units,
from entry-level employees through leadership of an organization.
9.6 Responding to Data Subject Requests
9.6.1 Responding to Withdrawals of Consent and
Data Access Requests
For consent to be freely given, it must also be freely revocable. Therefore, it is
important to have a process and policy in place for enabling data subjects to
withdraw their consent.
9.6.1.1 Responding to Withdrawals of Consent
Choice and control should be offered to individuals even after the opt-in stage. If
an organization relies on consent to process personal data, it may want or be
required to state in the privacy notice that the individual can withdraw consent.
An organization’s procedures around withdrawal of consent may address:
When and how consent may be withdrawn
Rules for communicating with individuals
Methods for withdrawing consent
Documentation of requests and actions taken
The process for withdrawing consent should be publicized—via privacy notices,
consent requests, and so on—to inform individuals about the steps they should
take.
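As a minimal sketch, assuming an organization keeps its own consent log (the class and field names here are hypothetical), documenting when and how consent was withdrawn might look like the following.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ConsentRecord:
    """Documents when consent was given and, if applicable, withdrawn (illustrative only)."""
    subject_id: str
    purpose: str
    given_at: datetime
    withdrawn_at: datetime | None = None
    withdrawal_channel: str | None = None  # e.g., "web form", "email", "phone"

    def withdraw(self, channel: str) -> None:
        """Record the withdrawal and the channel used, so the request and the action taken are documented."""
        self.withdrawn_at = datetime.now()
        self.withdrawal_channel = channel

    @property
    def is_active(self) -> bool:
        return self.withdrawn_at is None


consent = ConsentRecord("user-123", "marketing emails", datetime(2022, 3, 1, 9, 30))
consent.withdraw(channel="web form")  # processing based on this consent must now stop
print(consent.is_active)              # False
```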
9.6.1.2 Responding to Data Subject Access and Rectification Requests
Under certain circumstances, laws and regulations may require an organization to
provide individuals, upon request, with access to their personal information and
information about the processing performed on it and allow them to correct their
information. The information must be provided:
Completely
In a timely manner
Without charge to the individual
In the same form that the request was made
There may be limits to this right, such as protections for the rights and freedoms
of others.
A privacy team should work with the legal team to establish policies and
procedures that align with legal requirements.
It is important for an organization to have a documented process and follow it.
The process may be the first thing a regulator asks about in the event of an issue,
and the regulator might even request evidence of such process. The regulator will
also likely want to know if the employees charged with implementing such
policies understand them and have received training on them.
9.7 Handling Complaints: Procedural
Considerations
Complaints about how the organization manages DSRs may come from both
internal sources, such as employees, and from external sources, such as customers,
consumers, competitors, patients, the public, regulators, and vendors. Individuals
handling complaints or requests for an organization must be trained to identify
these requests because they may be submitted in a variety of ways, such as by
email, phone, or social media.113 If an individual from a jurisdiction with a
comprehensive consumer privacy law, e.g., the European Economic Area (EEA),
the United Kingdom, Brazil, California, or Virginia, makes any request relating to
their personal data, it is safe to assume that such a request may be subject to a
law such as the GDPR.114 Therefore, all employees who may come across such
requests should be trained on how to recognize them and instructed on how to
quickly send them to the person or team within the organization who has the
responsibility of handling them. The CCPA affirmatively requires such training for
all individuals responsible for handling consumer requests and inquiries.115
Internal procedures should define and enable mechanisms for:
Differentiating between sources and types of complaints
Designating proper recipients
Implementing a centralized intake process
Tracking the process
Reporting and documenting resolutions
Redressing
Departments and roles designated for receiving complaints should be easy to
reach, whether through dedicated phone numbers, email addresses, or physical
addresses.
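The intake mechanisms listed above could be captured in a simple, centralized log; the following Python sketch is a hypothetical illustration, with the categories and field names assumed rather than prescribed by any law discussed in this chapter.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum, auto


class Source(Enum):
    EMPLOYEE = auto()
    CUSTOMER = auto()
    REGULATOR = auto()
    VENDOR = auto()
    OTHER = auto()


class RequestType(Enum):
    COMPLAINT = auto()
    ACCESS = auto()
    DELETION = auto()
    CORRECTION = auto()
    OPT_OUT = auto()


@dataclass
class IntakeRecord:
    """Single entry in a centralized complaint and request log (illustrative only)."""
    reference: str
    source: Source
    request_type: RequestType
    received_at: datetime = field(default_factory=datetime.now)
    assigned_to: str | None = None
    resolution: str | None = None

    def resolve(self, outcome: str) -> None:
        """Document the resolution so it can be reported on and evidenced later."""
        self.resolution = f"{datetime.now().isoformat()}: {outcome}"


log: list[IntakeRecord] = []
entry = IntakeRecord("REQ-0042", Source.CUSTOMER, RequestType.ACCESS, assigned_to="privacy team")
log.append(entry)
entry.resolve("Identity verified; information provided electronically within one month.")
```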
Complaints from data subjects should go through some centralized process or
team, such as customer services. There needs to be a central point of control that
deals with data subject complaints. Because you have a limited time to respond
and may need cooperation from other parties (e.g., other controllers, processors),
it is critical to have an efficient and consistent process. If you have a centralized
process, you have trained people who know the organization and can quickly
determine how to resolve most complaints. The individuals who will respond to
these complaints need to understand how to properly authenticate the individuals
making the complaints or requests. They also need to be trained to identify fraud
and phishing. It has become somewhat common for terminated or disciplined
employees to use DSARs in litigation or threatened litigation, so it is important to
make sure the organization’s subject access request policies contemplate all likely
scenarios. Individuals handling these requests for an organization must be trained
on the types of requests the organization may receive, how to respond to such
requests, and when such requests can be declined or limited in accordance with
applicable law. Such individuals must also be able to consult with the legal team
when any uncertainty arises in responding to such requests. Other than regular
training, it can be useful to create guidance documents and posters for teams
dealing with these requests so they have access to quick and easily digestible
materials on the subject.
Organizations have also been bombarded with automated access and deletion
requests generated by third-party websites that send bulk access or deletion
requests on behalf of data subjects. When organizations respond to these emails
to request verification of the individual’s identity, many of these verification
requests receive no response from the data subject. Handling such requests can
become very time-consuming for organizations, thus increasing the demand for
more automated responses.
To better operationalize a response to a large volume of access requests and
complaints, some organizations are exploring the use of technology to increase
their efficiency in responding to such requests and complaints. This technology
includes complaint forms and software that can be used to automate responses.116
When designing processes to respond to data subject requests and complaints, it
is also important to consider special situations, including responses to visually
impaired individuals, handling of multiple languages across multiple jurisdictions,
and tracking of metrics related to such requests both for resolution and for
efficiency.
Companies that handle a large volume of requests have realized the value in
having a strong records retention program that facilitates finding the requested
information, ensures that years of information have not been unnecessarily
retained, and provides a defensible rationale for deleting data on a regular basis,
subject to applicable legal holds. DSARs can be time consuming and labor
intensive, especially when a large volume of documents must be provided.117
When providing such documents, which may include the personal data of third-
party data subjects, it is critical to redact or remove such data if the third parties
have not consented to disclosure, or, as permitted in the UK, it is otherwise
reasonable to comply with the request without the third party’s consent.118 In
many ways, responding to DSARs may remind American organizations of
complex e-discovery production exercises. Given the timing requirements in
responding to such DSARs, a sound process and solid record retention program
become even more important.
9.8 Data Subject Rights Outside the United
States and Europe
Clearly there is a trend toward strengthening privacy- and cybersecurity-related
laws globally. Many countries in Latin America and Asia, for example, have either
adopted or are in the process of adopting GDPR-like laws. Canada has also
strengthened its privacy laws with recent changes to its Personal
Information Protection and Electronic Documents Act (PIPEDA) relating to
breach notification.119 Canada is also exploring additional changes to its anti-spam
legislation (CASL).120 In Canada, PIPEDA generally provides data subjects with a
general right to access their personal information (defined as “information about
an identifiable individual”) held by businesses subject to the law.121 In November
2020, Canada published draft legislation to update and modernize Canada’s
federal private-sector privacy legislation, and such updates were expected by the end
of 2021. The new law, entitled the Consumer Privacy Protection Act, would
strengthen Canada’s privacy laws and bring it more in line with the protections of
the GDPR, including by imposing administrative fines and allowing a privacy
right of action.122
9.8.1 Latin America
Latin American privacy laws are composed both of constitutional protections and,
where adopted in certain countries, comprehensive privacy legislation. As of
December 2020, at least 13 Latin American countries had comprehensive privacy
or data protection laws.123 In the laws that have been adopted thus far, there are
some common elements, such as requiring notice and giving access and
correction rights to data subjects.124 In Mexico, these rights are referred to as
“ARCO” rights and specifically grant Mexican citizens the right of access,
rectification, cancellation, and opposition to processing of their personal data.125
On July 10, 2018, Brazil’s Federal Senate approved Lei Geral de Proteção de
Dados (LGPD).126 The bill was largely inspired by the GDPR. Although the law
became effective in September 2020, full enforcement began August 2021.
Specifically, pursuant to the LGPD, data subjects have the right to access, rectify,
cancel, or exclude their personal data. Further, data subjects may also oppose the
processing of their personal data. The LGPD also sets forth a right to data
portability, pursuant to which an individual may request a copy of his or her data
in a transferrable format.
9.8.2 East Asia
In China, Article 40 of the Constitution of the People’s Republic of China (PRC)
and several sets of laws and regulations expressly protect privacy. For example, the
PRC Criminal Law prohibits selling or providing Chinese citizens’ personal
information in breach of relevant state requirements.127 The PRC General
Provisions of the Civil Law prohibits the unlawful collection, use, processing,
transfer, sale, provision, or disclosure of another person’s information.128 Also, the
National People’s Congress Standing Committee Decision on Strengthening
Network Information Protection (“NPCSC Decision”) stipulates that no
organization or individual can steal or obtain citizens’ personal digital information
by other unlawful means, or sell or provide citizens’ personal digital information
to others unlawfully.129
The Chinese National Information Security Standardization Technical
Committee, or TC260, established the Information Security Technology —Personal
Information Security Specification (“PI Security Specification”) in May 2018, with
an update in October 2020. Under the PI Security Specification, the handling of
“personal information” and “personal sensitive information” must follow the seven
principles of (1) consistent rights and responsibilities; (2) clear purpose; (3)
choice and consent; (4) minimal and necessary uses; (5) openness and
transparency; (6) security assurance; and (7) data subject participation.130
South Korea’s privacy regime is one of the strictest in the world. It requires the personal information processor to provide detailed notices to data subjects and to obtain explicit consent in a manner similar to the GDPR, in that matters requiring consent must be segregated from those that do not.131 Korea’s Personal Information Protection Act, which took effect September 30, 2011, prohibits denying goods and services to a data subject on the basis that the individual refused consent to certain processing (e.g., receiving marketing messages).132 Furthermore, the Korean law gives data subjects rights of access, correction, deletion, and
destruction of their personal information.
Japan’s Act on the Protection of Personal Information (APPI) largely aligns with
the GDPR in terms of the rights afforded to data subjects.133 For example, under
the APPI, data subjects may request to access, correct, add, or delete the personal
data that an organization retains about the data subject. Organizations, moreover,
must promptly provide notice of the purposes for which the personal data will be
processed and may not process personal data for other purposes without first
notifying data subjects.134
Malaysian law contains a comprehensive data protection regime, the Personal Data Protection Act of 2010 (PDPA), that confers certain rights on data subjects.135 Under the PDPA, data subjects may request to access or correct personal data, and organizations must complete such requests within 21 days of receiving them.136 Although the act does not directly establish deletion rights, data subjects may request that an organization delete incorrect personal data.
Organizations should provide a privacy notice to data subjects before collecting or
requesting personal data from the data subject.137 That privacy notice must
include information regarding “the data subject’s right to request access to, and to
request correction of, personal data, and how to contact the data user with any
inquiries or complaints in respect of personal data.”138
Singapore law nearly mirrors Malaysian law with respect to data subject rights, with its Personal Data Protection Act 2012 (PDPA) also recognizing access, correction, and limited deletion rights.139 Organizations should, as a matter of course, respond to requests within 30 days of receipt.140 Organizations must also provide at least a limited privacy notice to data subjects upon collecting personal data, stating the purposes for which the organization collects the information and any other purposes for which the information may be used.141
Although promulgated in 2019, Thailand’s Personal Data Protection Act (PDPA) had its effective date delayed until June 1, 2021.142 The law requires companies to provide data subjects with a privacy notice within 30 days of collecting personal information from a data subject.143 Although it does not provide for a right of correction, the PDPA permits data subjects to access and delete personal data.144 If an organization cannot respond to a request within 30 days, it must notify the data subject of the expected response date. A data subject may, as an alternative to deletion, request that a company anonymize the personal data. Data subjects may also request that a data controller restrict the controller’s use of personal information once the purpose for which the information was initially collected has been satisfied.145 Finally, the PDPA does not specify channels through which data subjects must make requests.
9.8.3 Australia and New Zealand
Australian privacy law establishes a consumer right to access the personal
information that organizations hold about the consumer.146 Consumers may also
correct any information that is inaccurate, out of date, incomplete, irrelevant,
and/or misleading.147 Organizations may charge a fee for responding to DSRs, but the Australian privacy regulator emphasizes that an organization may not use the charge to discourage data subjects from making requests and should disclose the amount of the fee up front.148 To effectuate those data subject rights, organizations
must develop procedures for fielding and responding to such requests within 30
days from receiving the request from data subjects or their agents.149 An
organization’s privacy notice, in particular, must list how the data subject may
access and correct personal information.150
The Privacy Act 2020 overhauled data protection laws in New Zealand.151 Just
as in Australia, data subjects in New Zealand may access and correct the
information that organizations hold about that person, and businesses must
effectuate such requests within 20 business days of receiving such requests.
However, New Zealand law offers many exceptions to an individual’s access right,
including if the disclosure would reveal information about a third party or if the
request is frivolous or trivial (e.g., multiple requests for the same information or a
very small item of information contained within a large amount of information).152
If, in response to a request for correction, an organization disagrees with the data
subject that the information requires correction, then the data subject may file a
“statement of correction” with the organization, explaining the data subject’s
position.153
Given the requirements of various global privacy laws, it is critical for global
organizations to have robust DSAR policies in place and the ability to respond in a
timely manner to these requests.
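To make these differing timelines concrete, the sketch below (in Python, purely for illustration) shows how a privacy team might track DSAR response deadlines by jurisdiction, using the response windows discussed in this section. The jurisdiction list, window values, and function names are illustrative placeholders rather than legal guidance; actual deadlines, extensions, and holiday rules must be confirmed with counsel.

from datetime import date, timedelta

# Illustrative response windows drawn from the jurisdictions discussed above.
# These are placeholders for planning purposes only.
RESPONSE_WINDOWS = {
    "Malaysia": (21, "calendar"),
    "Singapore": (30, "calendar"),
    "Thailand": (30, "calendar"),
    "Australia": (30, "calendar"),
    "New Zealand": (20, "business"),
}

def add_business_days(start: date, days: int) -> date:
    """Advance a date by a number of weekdays (Monday-Friday), skipping weekends."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0-4 correspond to Monday through Friday
            days -= 1
    return current

def dsar_due_date(jurisdiction: str, received: date) -> date:
    """Return the latest date by which a DSAR response is expected."""
    days, unit = RESPONSE_WINDOWS[jurisdiction]
    if unit == "business":
        return add_business_days(received, days)
    return received + timedelta(days=days)

# Example: a request received September 1, 2021
for jurisdiction in RESPONSE_WINDOWS:
    print(jurisdiction, dsar_due_date(jurisdiction, date(2021, 9, 1)))

A tracker of this kind is only a starting point; the operative deadlines live in the statutes and regulator guidance cited in this chapter.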
9.9 Summary
The trend in global privacy is to endow data subjects with greater control over their personal data and to require increased transparency in how organizations communicate to data subjects about the ways they process that data. It will be critical for your organization’s brand to handle
DSARs in a way that engenders trust in your organization and does not instead
lead the dissatisfied data subject to complain to the regulator or on social media.
Organizations that are responsive and have sound processes will come to view
each interaction with a data subject as a trust-building exercise and an
opportunity to improve how their organization is viewed internally and externally.
Endnotes
1 15 USC § 45(a)(1); Michael Scully and Cobun Keegan, “IAPP Guide to FTC Privacy Enforcement,” IAPP,
accessed November 2018, https://iapp.org/media/pdf/resource_center/Scully-FTC-Remedies2017.pdf.
2 See, for example, Mass. R. Civ. P. 55(b)(2) and Commonwealth v. Kuvayev, Suffolk Sup. Ct. 05-1856-H.
3 “Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children’s Privacy Law,”
FTC, press release, September 4, 2019, https://www.ftc.gov/news-events/press-
releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations; “Snapchat Settles FTC
Charges That Promises of Disappearing Messages Were False,” FTC, press release, May 8, 2014,
https://www.ftc.gov/news-events/press-releases/2014/05/snapchat-settles-ftc-charges-promises-
disappearing-messages-were; “FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on
Facebook,” FTC, press release, July 24, 2019, https://www.ftc.gov/news-events/press-
releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions; and “Myspace Settles
FTC Charges That It Misled Millions of Users About Sharing Personal Information with Advertisers,”
FTC, press release, May 8, 2012, https://www.ftc.gov/news-events/press-releases/2012/05/myspace-
settles-ftc-charges-it-misled-millions-users-about.
4 “Snapchat Settles FTC Charges That Promises of Disappearing Messages Were False,” FTC.
5 Florian Schaub, Rebecca Balebako, Adam L. Durity, and Lorrie Faith Cranor, “A Design Space for Effective
Privacy Notices,” 2015 Symposium on Usable Privacy and Security,
https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf.
6 Schaub, Balebako, Durity, and Cranor, “A Design Space for Effective Privacy Notices.”
7 FTC staff report, “Mobile Privacy Disclosures: Building Trust Through Transparency,” February 2013,
(noting that the “Commission staff supports this type of innovation as a way to provide a starting point for
improved disclosures”), accessed November 2018,
https://www.ftc.gov/sites/default/files/documents/reports/mobile-privacy-disclosures-building-trust-
through-transparency-federal-trade-commission-staff-report/130201mobileprivacyreport.pdf; Müge
Fazlioglu, “What’s New in WP29’s Final Guidelines on Transparency?” Privacy Advisor, IAPP, April 18,
2018, https://iapp.org/news/a/whats-new-in-wp29s-final-guidelines-on-transparency/.
8 EDPB, “Guidelines 05/2020 on Consent under Regulation 2016/679,” accessed September 2021,
https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_202005_consent_en.pdf
(referencing Article 29 Working Party “Guidance on Transparency under Regulation 2016/679,” Section
30, p. 17, http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227).
9 “How Do We Comply with the Cookie Rules?,” ICO, accessed September 2021, https://ico.org.uk/for-
organisations/guide-to-pecr/guidance-on-the-use-of-cookies-and-similar-technologies/how-do-we-
comply-with-the-cookie-rules/.
10 CCPA Final Regulation § 999.304, 2020, https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/oal-sub-
final-text-of-regs.pdf.
11 CCPA Final Regulation § 999.305(c), 2020, https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/oal-
sub-final-text-of-regs.pdf.
12 Article 29 Working Party, “Guidance on Transparency Under Regulation 2016/679,” Section 32, p. 18,
accessed November 2018, http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227.
13 CCPA Final Regulation § 999.305(a)(4), 2020,
https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/oal-sub-final-text-of-regs.pdf.
14 Proposed modification to the CCPA regulations, § 999.306(f)(2), 2020,
https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-prop-mods-text-of-regs-4th.pdf.
15 The NAI’s opt-out is offered at https://optout.networkadvertising.org/?c=1, and the DAA’s at
https://youradchoices.com/control. The AdChoices, WebChoices and AppChoices icons are all presented
at https://youradchoices.com/ (all accessed March 2021).
16 Article 29 Working Party, “Guidance on Transparency Under Regulation 2016/679,” Section 32,
p. 18.
17 Woodrow Hartzog, “User Agreements are Betraying You,” Medium, June 5, 2018, https://medium
.com/s/trustissues/user-agreements-are-betraying-you-19db7135441f.
18 Hartzog, “User Agreements are Betraying You.”
19 Dan Solove, “The Myth of the Privacy Paradox,” 89 George Washington Law Review 1, revised January 29,
2021, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3536265 (citing Bruce Schneier, “It’s Not
Just Facebook. Thousands of Companies Are Spying on You,” CNN, March 26, 2018,
https://www.cnn.com/2018/03/26/opinions/data-company-spyingopinion-schneier/index.html).
20 Alexis Madrigal, “Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days,”
Atlantic, March 1, 2012, accessed November 2018,
https://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-
in-a-year-would-take-76-work-days/253851/.
21 Prepared Testimony and Statement for the Record of Woodrow Hartzog, hearing on “Policy Principles for
a Federal Data Privacy Framework in the United States,” before the Committee on Commerce, Science and
Transportation, U.S. Senate, February 27, 2019, https://www.commerce
.senate.gov/services/files/8B9ADFCC-89E6-4DF3-9471-5FD287051B53.
22 GDPR, Recital 32.
23 “How Do We Comply with the Cookie Rules?,” ICO.
24 Colorado Revised Statutes § 6-1-1303.
25 15 U.S.C.A. § 6501 et. Seq; GDPR, Article 8, accessed November 2018, www.privacy-
regulation.eu/en/article-8-conditions-applicable-to-child’s-consent-in-relation-to-information-society-
services-GDPR.htm.
26 “Introduction to the Age Appropriate Design Code,” ICO, 2020, https://ico.org.uk/for-
organisations/guide-to-data-protection/key-data-protection-themes/age-appropriate-design-a-code-of-
practice-for-online-services/.
27 16 C.F.R. § 312.1 et. seq.
28 16 C.F.R. § 312.4(a).
29 16 C.F.R. § 312.6.
30 16 C.F.R. § 312.4(a)(iii).
31 Lothar Determann, “Analysis: The California Consumer Privacy Act of 2018,” Privacy Tracker, IAPP, July
2, 2018, https://iapp.org/news/a/analysis-the-california-consumer-privacy-act-of-2018/.
32 “Guidelines for Obtaining Meaningful Consent,” Office of the Privacy Commissioner of Canada, May
2018, https://www.priv.gc.ca/en/privacy-topics/collecting-personal-
information/consent/gl_omc_201805/.
33 Article 29 Working Party, “Guidance on Transparency Under Regulation 2016/679,” Section 14, p.10,
accessed November 2018, http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=622227.
34 Article 29 Working Party, “Guidance on Transparency Under Regulation 2016/679,” Section 14, p.10.
35 “Children Online,” Bird & Bird, accessed March 2021, https://www.twobirds.com/en/in-focus/general-
data-protection-regulation/gdpr-tracker/children.
36 “Introduction to the Age Appropriate Design Code,” ICO, p. 3.
37 “Children and Young People,” Office of the Australian Information Commissioner, accessed September
2021, https://www.oaic.gov.au/privacy/your-privacy-rights/children-and-young-people/.
38 “Complying With COPPA: Frequently Asked Questions,” FTC, accessed September 2021,
https://www.ftc.gov/tips-advice/business-center/guidance/complying-coppa-frequently-asked-
questions-0.
39 15 U.S.C.A. § 1681.
40 15 U.S.C.A. § 1681(g).
41 15 U.S.C.A. § 1681(a)(1)(A).
42 15 U.S.C.A. § 1681(i).
43 15 U.S.C.A. § 1681(i)(a)(1)(A).
44 15 U.S.C.A. § 1681(c)(a).
45 15 U.S.C.A. § 1681s-2(a)(7)(A)(i).
46 “Background Checks: What Employers Need to Know,” U.S. Equal Employment Opportunity
Commission, accessed November 2018,
https://www.eeoc.gov/eeoc/publications/background_checks_employers.cfm.
47 “Background Checks: What Employers Need to Know,” U.S. EEOC.
48 “Background Checks: What Employers Need to Know,” U.S. EEOC.
49 Pub. L. 104-191; 45 C.F.R. § 160.103 et.seq; 45 C.F.R. § 162.100 et. seq; 45 C.F.R. § 164.102 et. seq.
50 45 C.F.R. § 164.512.; 45 C.F.R. § 164.506.
51 45 C.F.R. § 164.524.
52 45 C.F.R. § 164.524(b)(2)(i).
53 45 C.F.R. § 164.526.
54 45 C.F.R. § 164.528; 45 C.F.R. § 164.522.
55 “National Do Not Call Registry,” FTC, accessed November 2018, https://www.donotcall.gov/;
“Telemarketing Sales Rule,” FTC, accessed November 2018,
https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/telemarketing-sales-
rule.
56 “Stop Unwanted Robocalls and Texts,” Consumer Guides, FCC, accessed November 2018,
https://www.fcc.gov/consumers/guides/stop-unwanted-robocalls-and-texts.
57 “National Do Not Call Registry,” FTC.
58 “Enforcement of the Do Not Call Registry,” FTC, accessed November 2018, https://www.ftc.gov/news-
events/media-resources/do-not-call-registry/enforcement. In 2017, for example, a federal court ordered
Dish Network to pay penalties totaling $280 million and injunctive relief for Dish’s failure to comply with
the FTC’s Telemarketing Sales Rule, the Telephone Consumer Protection Act, and state law.
59 “CAN-SPAM Act: A Compliance Guide for Business,” FTC, accessed November 2018, https://www
.ftc.gov/tips-advice/business-center/guidance/can-spam-act-compliance-guide-business.
60 15 U.S.C.A. § 7701 et. seq.; “CAN-SPAM Act: A Compliance Guide for Business,” FTC.
61 “Spam,” Consumer Information, FTC, accessed February 2019,
https://www.consumer.ftc.gov/articles/0038-spam.
62 5 U.S.C.A. § 552a; 5 U.S.C.A. § 552a(d)(1).
63 5 U.S.C.A. § 552a(d)(3).
64 5 U.S.C.A. § 552a(g)(1).
65 5 U.S.C.A. § 552.
66 “Frequently Asked Questions,” FOIA.gov, accessed November 2018, https://www.foia.gov/faq.html.
67 Cal. Bus. & Prof. Code § 22575(a).
68 Cal. Bus. & Prof. Code § 22575 et. seq; “California Online Privacy Protection Act (CalOPPA),” Consumer
Federation of California Education Foundation, accessed November 2018,
https://consumercal.org/about-cfc/cfc-education-foundation/california-online-privacy-protection-act-
caloppa-3/.
69 Cal. Bus. & Prof. Code § 22575(b).
70 Del. Code Ann. tit. 6, § 1201C et seq.
71 Cal. Bus. & Prof. Code § 22577(d).
72 Del. Code Ann. tit. 6, § 1201C(17).
73 Del. Code Ann. tit. 6, § 1201C(14); Cal. Bus. & Prof. Code § 22575(a).
74 Del. Code Ann. tit. 6, § 1201C(b)(5); Cal. Bus. & Prof. Code § 22575(b)(5).
75 Nevada Revised Code § NRS 603A.340.
76 Cal. Civ. Code § 1798.83 et. seq.
77 Cal. Bus. & Prof. Code § 22580 et seq.; Cal. Bus. & Prof. Code § 22581.
78 Cal. Bus. & Prof. Code § 22580 et seq.; Cal. Bus. & Prof. Code § 22581.
79 Rita Heimes, “New California Privacy Law to Affect More Than Half a Million U.S. Companies,” Privacy
Advisor, IAPP, July 2, 2018, https://iapp.org/news/a/new-california-privacy-law-to-affect-more-than-half-
a-million-us-companies/.
80 Full text of the CPRA is available at:
https://iapp.org/media/pdf/resource_center/ca_privacy_rights_act_2020_ballot_initiative.pdf.
81 Sarah Rippy, “Virginia Passes the Consumer Data Protection Act,” Privacy Tracker, IAPP, March 3, 2021,
https://iapp.org/news/a/virginia-passes-the-consumer-data-protection-act/.
82 Colorado Revised Statutes § 6-1-1305; 6-1-1306.
83 Colorado Revised Statutes § 6-1-1306.
84 Nevada Revised Statutes § 603A.333.
85 Nevada Revised Statutes § 603A.325.
86 740 Ill. Comp. Stat. Ann. 14/1 et. seq.; Wash. Rev. Code Ann. § 19.375.010 et seq.; Tex. Bus. & Com.
Code Ann. § 503.001.
87 740 Ill. Comp. Stat. Ann. 14/15.
88 740 Ill. Comp. Stat. Ann. 14/15.
89 740 Ill. Comp. Stat. Ann. 14/20.
90 740 Ill. Comp. Stat. Ann. 14/20.
91 Rosenbach v. Six Flags Entertainment Corp., 2019 IL 123186 (Ill. Jan. 25, 2019).
92 740 Ill. Comp. Stat. Ann. 14/20.
93 Jyn Shultze-Melling, “Data Subjects’ Rights,” in European Data Protection Law and Practice (Portsmouth,
NH: IAPP, 2018), 159.
94 Piotr Foitzik, “How to Verify Identity of Data Subjects for DSARs under the GDPR,” Privacy Advisor,
IAPP, June 26, 2018, https://iapp.org/news/a/how-to-verify-identity-of-data-subjects-for-dsars-under-
the-gdpr/.
95 “Right of Access,” ICO, accessed September 2021, https://ico.org.uk/for-organisations/guide-to-data-
protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-of-access/#ID.
96 GDPR: Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on
the protection of natural persons with regard to the processing of personal data and on the free movement
of such data, and repealing Directive 95/46/EC (“General Data Protection Regulation”), Art. 14, OJ 2016
L 119/1.
97 “Right of Access,” ICO, accessed September 2021, https://ico.org.uk/for-organisations/guide-to-data-
protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/right-of-access/#ID.
98 Article 29 Data Protection Working Party, WP 251 rev. 01, “Guidelines on Automated Individual
Decision-making and Profiling for the Purposes of Regulation 2016/679,” p. 17, adopted October 3, 2017,
revised and adopted February 6, 2018, https://ec.europa.eu/newsroom/article29/items/612053/en.
99 “Factsheet on the ‘Right to be Forgotten’ Ruling,” (C-131/12), European Commission, accessed February
2019, https://www.inforights.im/media/1186/cl_eu_commission_factsheet_right_to_be-forgotten.pdf.
100 WP 251 rev. 01, p. 18.
101 Article 29 Data Protection Working Party, WP 242, “Guidelines on the Right to Data Portability,”
December 13, 2016, http://ec.europa.eu/newsroom/document.cfm?doc_id=43822.
102 Article 29 Data Protection Working Party, “Guidelines on the Right to Data Portability,” p. 4.
103 WP 242 rev. 01, p. 4, footnote 2.
104 WP 242 rev. 01, p. 5.
105 See Open Data Handbook, accessed June 2019, http://opendatahandbook.org.
106 Article 29 Data Protection Working Party, WP 217, “Opinion 06/2014 on the Notion of Legitimate
Interests of the Data Controller under Article 7 of Directive 95/46/EC,” adopted April 9, 2014, p. 24 – 26,
https://ec.europa.eu/justice/article-29/documentation/opinion-
recommendation/files/2014/wp217_en.pdf.
107 WP 217, p. 26
108 WP 251 rev. 01, p. 19.
109 WP 251 rev. 01, p. 18.
110 WP 251 rev. 01, p. 18.
111 WP 251 rev. 01, p. 18.
112 “Recital 71, Profiling,” Intersoft Consulting, accessed October 2021, https://gdpr-info.eu/recitals/no-
71/.
113 Sophie Lalor-Harbord and Ian Gatt, “Dealing with Subject Access Requests under the GDPR,” Stewarts,
September 11, 2018, https://www.lexology.com/library/detail.aspx?g=c2ea32d0-c695-4d60-93dc-
8702aa5d8b6f.
114 Lalor-Harbord and Gatt, “Dealing with Subject Access Requests under the GDPR.”
115 Cal. Civ. Code § 1798.130(a)(6).
116 Ryan Chiavetta, “DSAR Tool Seeks to Help Large Companies Locate User Data,” Privacy Tech, IAPP,
March 5, 2018, https://iapp.org/news/a/dsar-tool-seeks-to-help-large-companies-located-their-data/.
117 Lalor-Harbord and Gatt, “Dealing with Subject Access Requests under the GDPR.”
118 Lalor-Harbord and Gatt, “Dealing with Subject Access Requests under the GDPR.”
119 “The Personal Information Protection and Electronic Documents Act (PIPEDA),” Office of the Privacy
Commissioner of Canada, accessed November 2018, https://www.priv.gc.ca/en/privacy-topics/privacy-
laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/.
120 S.C. 2010, c. 23, Government of Canada, Justice Laws website, accessed November 2018, https://laws-
lois.justice.gc.ca/eng/acts/E-1.6/index.html.
121 “Accessing Your Personal Information,” Office of the Privacy Commissioner of Canada, accessed August
2018, https://www.priv.gc.ca/en/privacy-topics/access-to-personal-information/accessing-your-
personal-information/.
122 “Federal Privacy Reform in Canada: The Consumer Privacy Protection Act,” Privacy Tracker, IAPP,
November 18, 2020, https://iapp.org/news/a/federal-privacy-reform-in-canada-the-consumer-privacy-
protection-act/.
123 “Spotlight: Personal Data Protection Regulations in Latin America,” BNA Americas, December 21, 2020,
https://www.bnamericas.com/en/features/spotlight-personal-data-protection-regulations-in-latin-
america.
124 “Privacy in Latin America and the Caribbean,” Bloomberg BNA, 2015,
https://www.bna.com/uploadedFiles/BNA_V2/Legal/Pages/Custom_Trials/PVRC/Privacy_Laws_L
atin_America.pdf.
125 “Personal Data Held by Government Agencies Now Heavily Protected in Mexico,” Jones Day, May 2017,
http://www.jonesday.com/personal-data-held-by-government-agencies-now-heavily-protected-in-
mexico-05-15-2017/#.
126 Bill 53/2018 of the Brazilian Congress on the protection of personal data, amending Law No. 12,965,
dated April 23, 2014 (LGPD).
127 Article 253-1 of the PRC Criminal Law.
128 Article 111 of the PRC Civil Law.
129 Article 1 of the NPCSC Decision.
130 National Standard of the People’s Republic of China, “Information Security Technology—Personal
Information (PI) Security Specification,” Article 4, March 6, 2020,
https://www.tc260.org.cn/upload/2020-09-18/1600432872689070371.pdf.
131 Alex Wall, “GDPR Matchup: South Korea’s Personal Information Protection Act,” Privacy Tracker, IAPP,
January 8, 2018, https://iapp.org/news/a/gdpr-matchup-south-koreas-personal-information-protection-
act/.
132 Wall, “GDPR matchup: South Korea’s Personal Information Protection Act.”
133 Kensaku Takase, “GDPR Matchup: Japan’s Act on the Protection of Personal Information,” Privacy
Tracker, IAPP, August 29, 2017, https://iapp.org/news/a/gdpr-matchup-japans-act-on-the-protection-of-
personal-information/.
134 “Data Protected—Japan,” Linklaters, March 2020, https://www.linklaters.com/en-us/insights/data-
protected/data-protected---japan.
135 “Laws Of Malaysia, Act 709, Personal Data Protection Act 2010,” [hereinafter PDPA], accessed
September 2021,
https://www.kkmm.gov.my/pdf/Personal%20Data%20Protection%20Act%202010.pdf.
136 PDPA § 12.
137 PDPA § 7.
138 PDPA § 7.
139 “Personal Data Protection Act 2012,” Singapore Statutes Online, accessed September 2021,
https://sso.agc.gov.sg/Act/PDPA2012#pr21-.
140 Singapore Personal Data Protection Act 2012 § 21.
141 Singapore Personal Data Protection Act 2012 § 12.
142 “Delayed Implementation of Thailand’s Personal Data Protection Act,” Privacy and Information Security
Law Blog, Hunton Andrews Kurth, May 29, 2020,
https://www.huntonprivacyblog.com/2020/05/29/delayed-implementation-of-thailands-personal-data-
protection-act/.
143 Thailand Royal Gazette, “Personal Data Protection Act (2019),” § 25, May 27, 2019, [hereinafter
Thailand PDPA],
https://www.dataguidance.com/sites/default/files/entranslation_of_the_personal_data_protection_ac
t_0.pdf.
144 Thailand PDPA § 31–33.
145 Thailand PDPA § 34.
146 “Access Your Personal Information,” Office of the Australian Information Commissioner, accessed
September 2021, https://www.oaic.gov.au/privacy/your-privacy-rights/your-personal-
information/access-your-personal-information/.
147 “Correct Your Personal Information,” Office of the Australian Information Commissioner, accessed
September 2021, https://www.oaic.gov.au/privacy/your-privacy-rights/your-personal-
information/correct-your-personal-information/.
148 “Access Your Personal Information,” Office of the Australian Information Commissioner.
149 “Access Your Personal Information,” Office of the Australian Information Commissioner.
150 “What is a Privacy Policy?,” Office of the Australian Information Commissioner, accessed September
2021, https://www.oaic.gov.au/privacy/your-privacy-rights/what-is-a-privacy-policy/.
151 “New Zealand’s Privacy Law Has Changed—A Global Comparison,” Lexology, December 8, 2020,
https://www.lexology.com/library/detail.aspx?g=fbedb163-07f1-48e7-ab88-3cdc29d0dc3b.
152 “Principle 6—Access to Personal Information,” Privacy Commissioner of New Zealand, accessed
September 2021, https://www.privacy.org.nz/privacy-act-2020/privacy-principles/6/.
153 “Principle 7—Correction of Personal Information,” Privacy Commissioner of New Zealand, accessed
September 2021, https://www.privacy.org.nz/privacy-act-2020/privacy-principles/7/.
CHAPTER 10
Privacy Operational Life Cycle: Respond:
Data Breach Incident Plans
Liisa Thomas
For organizations, having a clear plan to respond to a possible data breach is often
a core and critical issue. A wide range of laws applies when a company responds to a data breach that impacts individuals’ personal information. In
the United States, there are laws in every state, as well as industry-specific federal
laws, and many of these laws give harmed individuals standing, with California’s
Consumer Privacy Act (CCPA) going so far as to provide statutory damages. In
the European Union, the General Data Protection Regulation (GDPR) addresses
how a company responds to a breach, and other countries have laws, as well. After
addressing notification requirements, corporations often find themselves exposed
to post-notice scrutiny. This can take the form of regulatory inquiries or lawsuits,
including lawsuits from class-action lawyers.
How, then, can organizations best navigate these choppy waters? Every organization needs to be prepared to respond to a potential data breach. No matter the type of incident, the organization must be prepared to properly receive, assess, and respond to it.
Breach notification laws in the United States are numerous, and lawsuits often
arise post-notification.
10.1 Incident Planning
10.1.1 What’s at Risk
Throughout the globe, there are jurisdictions with laws that require companies to
provide notification to affected individuals and/or government authorities in the
event of a data breach. These laws exist in the United States, as well as in the
European Union and elsewhere. When notification must be given depends on the
jurisdiction and the law in question. How notification must be given and the
contents of the notification similarly may vary. Failure to give notice properly can
give rise to liability and exposure.
Even a company that notifies properly faces risks. Those risks may take the form of public relations (PR) scrutiny and bad press, follow-on lawsuits, or regulatory action accusing the company of having failed to take proper actions to protect information.
10.1.2 Legal Exposure and Liability
Companies that suffer a data breach may face litigation exposure, reputational
liability, and potential regulatory scrutiny. In the European Union, fines can be significant (up to 10 million euros or two percent of global annual turnover, whichever is higher), and in the United States, post-breach class-action lawsuits have resulted in payment of
significant settlement amounts. Reputational liability is difficult to anticipate.
Similarly, it is not always clear when a fact pattern will result in a lawsuit or
regulatory investigation. Should a company face such scrutiny, factors that will be
considered include:
A purported obligation to prevent unauthorized access to or use of the
data
If the company satisfied an applicable industry standard of care
Whether there were damages or injury, and if the organization’s conduct
or lack thereof was the proximate cause of the damages
Given the possible exposure, companies often explore insurance coverage, as
discussed in greater detail in section 10.4.4.
10.1.3 Costs When Addressing an Incident
Another significant exposure to an organization is the underlying cost of a data
incident. According to the Ponemon Institute’s 2020 Cost of a Data Breach Report,
the average cost of an incident is $3.86 million, and the cost per individual record
lost or compromised is $146.1 Translating statistics to monetary values can help
senior executives see the value of planning for a data incident or breach.
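As a simple illustration of translating those statistics into monetary terms, the back-of-the-envelope calculation below applies a per-record cost in line with the figure cited above to a hypothetical incident; the record count and overhead figure are assumptions for illustration, not data from the report.

# Back-of-the-envelope cost estimate; record count and overhead are hypothetical.
cost_per_record_usd = 146          # approximate per-record cost cited from the 2020 report
records_compromised = 50_000       # hypothetical incident size
fixed_response_overhead = 250_000  # hypothetical forensics and notification overhead

estimated_cost = records_compromised * cost_per_record_usd + fixed_response_overhead
print(f"Estimated incident cost: ${estimated_cost:,}")  # prints: Estimated incident cost: $7,550,000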
Other risks include not just the cost of the incident itself, but also potential loss
of revenue. If a company does face litigation, including class-action litigation, or
regulatory scrutiny, it may find itself subject to fines or paying significant
settlement amounts. Harder to quantify but no less risky is the possible loss of
existing and potential business. There might also be an impact on business
relationships and third-party contracts, which can be especially problematic
during mergers.
In addition to costs to the company, there are arguably also potential costs to the
affected individuals, who might suffer identity theft or personal reputational harm.
They might also have financial damage from misuse of financial account
information.
10.2 How Incidents Occur
The same Ponemon Institute study revealed that malicious actors or criminal
attacks are the most common cause of data breaches.2 The root causes of breaches
cited in the study include malicious or criminal attack (52 percent), human error
(23 percent), and systems glitches (25 percent).3 Nearly 20 percent of companies
that suffered a breach due to a malicious actor were infiltrated through stolen or
compromised credentials.4 Ransomware attacks by malicious actors are
particularly dangerous, costing more than the average malicious breach or average
data breach.5
Employee error or negligence is reported to be one of the biggest causes of
security incidents. Even malicious and criminal attacks often take the form of
phishing attacks, which rely on unsuspecting employees. Ongoing training and
awareness-raising around information security policies and mature privacy programs are, therefore, an essential part of reducing the risk of a privacy breach.6
However, some situations arise from unforeseen circumstances. Thus, having
teams that work well together to manage unpredictable and uncontrollable risk is
another important part of being prepared for data incidents.
In summary, data incidents can occur in many ways, including through hacking
or malware, device loss or theft, and unintended disclosure of information. These
situations are more than just a technical or information technology (IT) issue;
everyone in an organization can and should play a role in following responsible
data practices.
10.3 Terminology: Security Incident versus
Breach
When faced with a potential security incident, there is often a temptation to call
the situation a data breach. However, that term is a legal one giving rise to specific
legal obligations and is defined in different ways under various laws around the
globe. Until a lawyer has made a determination that a fact pattern meets the legal
definition, organizations should refer to a security incident as just that, an incident
or a potential incident. Using the term before such a determination has been made
can create risks and exposure for the organization. It may find itself having to
defend why it believes it has no legal obligations when it used the term “data
breach.”
An incident is a situation in which the confidentiality, integrity, or availability of
personal information may potentially be compromised. For a data breach to exist,
typically there must be some sort of unauthorized access or acquisition of
personal information, although the definition of “breach” varies. If a breach exists,
impacted individuals and, in many cases, regulatory authorities must be notified.
In sum, all breaches are incidents, but not all incidents are breaches. Only the
privacy office or legal office can declare a breach, based on certain triggers.
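The distinction can also be expressed operationally. The brief sketch below (illustrative Python, not a prescribed record format) logs every event as an incident and permits only the privacy or legal office to reclassify it as a breach; the field names and the authorized-declarer list are assumptions made for illustration.

from dataclasses import dataclass
from typing import Optional

# Only these functions may declare a breach (illustrative assumption).
AUTHORIZED_DECLARERS = {"privacy office", "legal office"}

@dataclass
class SecurityIncident:
    description: str
    reported_by: str
    is_breach: bool = False
    declared_by: Optional[str] = None

    def declare_breach(self, declaring_function: str) -> None:
        """Reclassify the incident as a breach; only privacy or legal may do so."""
        if declaring_function.lower() not in AUTHORIZED_DECLARERS:
            raise PermissionError("Only the privacy office or legal office can declare a breach.")
        self.is_breach = True
        self.declared_by = declaring_function

# Every event starts life as an incident; reclassification is a deliberate step.
incident = SecurityIncident("Lost laptop containing customer records", reported_by="IT")
incident.declare_breach("Legal Office")  # permitted; "IT" would raise PermissionError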
10.4 Getting Prepared
Organizations generally recognize they may be subject to a data incident. The
common phrase is “not if, but when.” With this in mind, what measures can a
company take to prepare for an incident? Preparedness does not prevent an
incident. Instead, while prevention focuses on tasks and technologies that stop a
breach from occurring, preparedness focuses on measures a company can take to
respond optimally—in other words, to answer the question, “What will the
company do when prevention fails?”
Preparedness falls into five categories: training, getting an incident response plan
in place, understanding key stakeholders, getting insurance coverage where
appropriate, and managing vendors that might be a part of an incident.
Organizations that have thought through these five steps will be better prepared
when faced with an actual incident, the mechanics of which are discussed in
section 10.7.
10.4.1 Training
Organizations typically face the following questions when they’re making the case
for training or planning its execution.
Why train? The answer to this is straightforward. Training exposes gaps in
applications, procedures, and pre-incident plans. It can also cultivate greater
overall security for customers, partners, and employees. As a result, training has
the potential to reduce financial liability and regulatory exposure while lowering
breach-related costs, including legal counsel and consumer notification. If
appropriate training has been put in place, it can help a company get through an
incident with brand reputation and integrity preserved.
Which function within an organization should fund training? Leaders often
disagree, and what is appropriate will vary by company. Considerations to take
into account include where most of the data is housed, how other similar projects
have been funded, what is driving the compliance efforts, and what functions
would be most negatively affected by an incident. Many companies find it helpful
to consider a shared-cost arrangement, for example, among IT, finance, human
resources (HR), and legal. Quantify the benefits of training by calculating return
on investment (ROI) and savings.
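As a simple illustration of that ROI framing, the short calculation below uses hypothetical training-cost and avoided-cost figures; real inputs would come from finance and the privacy office.

# Simple ROI illustration; all figures are hypothetical placeholders.
training_cost = 40_000       # annual spend on incident-readiness training
expected_savings = 150_000   # breach-related costs expected to be avoided

roi = (expected_savings - training_cost) / training_cost
print(f"Training ROI: {roi:.0%}")  # prints: Training ROI: 275%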
Who should receive training? The entire organization will likely need some
form of training. Many may only need to learn how to report a potential incident;
others may require more in-depth training. For example, the incident response
team will need thorough training. The IT and security teams will similarly need in-
depth training. The executive team will need education about how to speak
publicly about incidents. While there will need to be—and should be—different
levels and programs for different employee groups, all employees should have a
basic understanding of security procedures and how to report a suspected
incident.
What form should training take? Training will take various forms, and content
should be customized to the audience. It might be a short video or a structured
readiness-testing simulation. A training exercise could also simulate an actual
incident, for instance, circulating a fake phishing email. Regardless of form, record
results and update the plan accordingly.
10.4.2 Creating an Incident Response Plan
A key step in incident preparation is the formal creation of an incident response
plan. To create the plan, the drafting team will need to gather a vast amount of
information and then use the information they have gathered to develop processes
and procedures. There are different schools of thought about the drafting team,
but given the importance of legal requirements and privilege, a good practice is to
have the drafting team led by the legal department and include help from the
privacy office, the information security team and IT, communications, HR,
compliance, and senior management. The exact stakeholders will vary by organization, and some of these same stakeholders may also sit on the incident response team. Often, organizations will engage outside support to help draft the plan; even so, the stakeholders listed here will have valuable input to give.
When putting together the plan, the team should consider some key factors. These include what types of personal information your organization
collects, in what format you collect that information, and the method of
collection. You will, of course, also want to consider the applicable laws, which is
one of the many reasons working with legal is so important. Third-party
relationships are also critical. What vendors are most likely to have a breach that
would affect you? Another key factor is internal administration: what works for
another company may not work for you. Have there been prior incidents? If so,
there may be learnings you can take away from them as you put together your
plan.
The plan should touch on some key areas, including how to protect privilege, the roles and responsibilities of team members, how to escalate possible issues and report suspicious activities, severity rankings (i.e., what triggers escalation and what type of escalation), and interactions with external parties (e.g., regulators, vendors, investigators, impacted individuals, insurance providers). A good plan
will also consider integration with business continuity plans (BCPs) and provide a
mechanism for engaging in learnings post-incident.
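To illustrate how those elements might be captured in a working document, the sketch below represents a skeletal plan as a simple data structure with roles, severity rankings, escalation paths, and external parties. The structure, severity tiers, and assignments are hypothetical examples, not a recommended template; a real plan would be maintained by legal and the privacy office.

# A skeletal, illustrative representation of the plan elements listed above.
incident_response_plan = {
    "roles": {
        "incident lead": "Legal",
        "technical lead": "Information security",
        "notification owner": "Privacy office",
        "communications": "PR/communications",
    },
    "severity_levels": {
        "low": {"example": "Single misdirected internal email",
                "escalate_to": "Privacy office"},
        "medium": {"example": "Lost unencrypted laptop",
                   "escalate_to": "Incident response team"},
        "high": {"example": "Confirmed exfiltration of customer data",
                 "escalate_to": "Executive team and outside counsel"},
    },
    "external_parties": ["regulators", "vendors", "forensic investigators",
                         "insurance providers"],
    "post_incident": "Document lessons learned and update the plan",
}

def escalation_path(severity: str) -> str:
    """Look up who should be engaged for a given severity ranking."""
    return incident_response_plan["severity_levels"][severity]["escalate_to"]

print(escalation_path("high"))  # prints: Executive team and outside counsel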
The purpose of a good plan is to map out for people in an organization what to
do. The lawyers within or outside of your organization will also want to ensure the
plan appropriately includes any regulatory requirements. This is not the place to
list every law that applies but to help the team understand what they may be facing
if an incident occurs.
There are several schools of thought about listing specific team members and
contact information. Many plans include these, but keep in mind the information
changes. Do you want to be updating your plan constantly depending on new
contact details or new job roles? That may make sense for many organizations. For
others, having the key starting point may be enough. As always, your plan will be
customized to the realities of your organization.
10.4.3 Know Your Roster of Stakeholders
Effective incident response requires systematic, well-conceived planning before a
breach occurs. The success of an incident-response plan ultimately depends on
how efficiently stakeholders and constituent teams execute assigned tasks as a
crisis unfolds.
The potential size and scope of breach-related consequences cannot be overstated. At issue are current and future revenue streams, brand equity, and
marketplace reputation. Other risks resulting from bad publicity include
opportunity costs, such as high churn and diminished rates of new customer
acquisitions.
These high stakes can demand the inclusion and expertise of stakeholders from a
wide range of job functions and disciplines in the planning process. Who should
ultimately be involved in a response depends on the nature of the incident, the
culture of the organization, and the needs for confidentiality and containment.
However, to prepare for a potential incident, it is helpful to get input from stakeholders in the functions that commonly hold personal or sensitive information within an organization, such as:
IT or information security
HR/marketing
Customer relationship management (CRM) systems of customer care
and sales departments
Audit and compliance
Shareholder management
In addition to needing support from stakeholders in these functions during
incident response planning, input from other senior leaders in formulating a plan
can help minimize a breach’s financial and operational impact. Their involvement
will ultimately result in a stronger, more richly multidisciplinary plan that enables
breached companies to effectively restore security, preserve the evidence, and
protect the brand. Examples of departments that should be involved, depending
on the type of organization, its culture, and the personnel in these positions, may
include:
Business development (BD)
Communications and PR
Union leadership
Finance
President, chief executive officer (CEO)
Board of directors
10.4.4 Insurance Coverage
Insurance may be a viable source of funding to offset breach-response and
recovery costs. While traditional policies may provide a certain level of protection,
they do not normally cover expenses resulting from a data compromise. To reduce
exposure, risk managers must work closely with insurance carriers and finance
stakeholders to select the right type and level of coverage prior to an incident. A
relatively new type of coverage, called cyber-liability insurance, may cover many
breach-related expenses, including:
Forensic investigations
Outside counsel fees
Crisis management services
PR experts
Breach notification
Call center costs
Credit/identity monitoring
Fraud resolution and restoration services
Any preparedness process should take insurance coverage into account, and the
policy should be reviewed closely. For example, there may be specific timing
obligations that need to be addressed in the incident plan. When looking at
coverage, keep in mind that you will be asked to fill out questionnaires that speak
to your level of preparedness. Before completing them, coordinate with the legal
department, as you will be making disclosures about internal operations to
external third parties. It is also good practice to align with finance teams to ensure
there is a dedicated budget for dealing with the consequences of such breaches.
10.4.5 Management of Vendors that May Be the Source of an Incident
Often, vendors are the ones that suffer a data breach. But because of the way data
breach notification laws are drafted, the obligation to notify may fall on your
company, not the vendor. For this reason, it is important to have a good
understanding of what information your vendors have, how they use it, and what
they will do if they suffer a breach. This exercise goes beyond merely reviewing
contracts and updating language. For vendors with key information, it may be
necessary to do on-the-ground diligence to understand their preparedness level
and ensure their coordination in the event of an incident.
10.5 Roles in Incident Response Planning by
Function
This section covers the core elements of incident response planning, incident
detection, incident handling, and consumer notification. The focus is on the U.S.
approach to responding to data breaches, since the United States has some of the world’s strictest and most financially consequential breach notification requirements. The section begins by identifying the roles and responsibilities that the previously identified stakeholders may play during a breach.
Different stakeholder teams have different responsibilities in planning for a
possible breach. Table 10-1 details sample departmental responsibilities for the
planning period. In reviewing these responsibilities, remember your organization
is unique; just because a plan worked in one corporate structure does not
guarantee that it will work in yours. Nevertheless, table 10-1 provides guidelines
for common planning expectations by function.
Table 10-1: Sample Departmental Responsibilities
Function: Planning Role
Information security: Provides guidance regarding how the organization addresses detection, isolation, removal, and preservation of affected systems
Legal: Ensures the response program is designed to protect privilege and to limit legal liability, and advises on whether an incident meets the requirement to notify supervisory authorities
Compliance/privacy: Provides advice and direction on how the organization can fulfill obligations during an incident and may also manage data maps and other repositories that help the company keep track of impacted individuals’ contact information
Human resources: Provides an employee perspective
Marketing: Advises about CRM
Business development: Represents knowledge in handling and keeping the account
Public relations: Plans strategic and tactical communication to inform and influence
Union leadership: Represents union interests
Finance: Calculates and manages the bottom-line impact of containment and correction
President/CEO: Demonstrates value of preventing breaches through actions
Customer care: Offers insight on customer/caller behavior
10.5.1 Information Security and/or Information Technology
Knowledge of enterprise-wide configurations, networking and protocols, and
security measures gives information security a broad enough perspective on the
organization’s electronic assets to help it identify vulnerabilities before criminals
exploit them. As part of the incident response planning process, the information
security group will provide guidance regarding the detection, isolation, removal,
and preservation of affected systems.
10.5.2 Legal
When developing an incident response plan, companies should always seek the
advice of competent counsel experienced in the field of data breach response. If it
is uncertain whether your legal department possesses the requisite knowledge, an
assessment, overseen by the senior legal stakeholder, should be undertaken.
Legal stakeholders are central to incident response planning because they, more
than any other executives, understand the legal precedents and requirements for
handling data and reporting a breach. Their guidance—which is provided on a
legally privileged basis (something other functions cannot do)—helps companies
limit the liability and economic consequences of a breach, including avoidance of
litigation and fines. In addition, most data breach legislation requires intensive
legal knowledge to implement a proper procedure. During incident response
planning, organization attorneys may negotiate any requirements the organization
wishes to impose upon its business partners. Conversely, the organization may
also use attorneys to help determine what it is willing to do in the event data
belonging to a client is compromised.
Finally, as previously noted, legal involvement in planning for an incident is
critical given the need to protect privilege during an incident investigation
process, as well as the level of legal exposure and risk that can arise depending on
how a company handles an incident. After notification, companies may often find
themselves subject to regulatory scrutiny or class-action lawsuits. The
involvement of a lawyer who understands these risks is a key part of successfully
handling an incident.
10.5.3 Compliance and Privacy
Privacy and compliance stakeholders, to the extent that an organization has
individuals filling this role, can provide invaluable resources to an organization
during an incident. They are often most familiar with the location of personal
information within the organization and the stakeholders involved in collecting
and using information. The privacy office (which, in some cases, may sit within
the overall compliance function) may already have a strong data map and be able
to more quickly assess what personal data might have been impacted in an
incident and identify contact information for impacted individuals. In preparing
for potential incidents, this function can think about how—in coordination with
the IT or information security function—the organization can deploy the
mechanics of notification, in particular, identifying contact information, mapping
what information resides in what location, and the like.
10.5.4 Human Resources
Given the extensive amount of personal information that typical HR departments
have on hand, it is highly advisable to include HR team members when discussing
incident response planning. HR staff may also be included because of their unique
perspective regarding employees or for notification of current or past employees.
During incident response planning, the HR stakeholders will normally address
topics such as employee data handling, security awareness training, and/or
incident recognition and response.
10.5.5 Marketing
The typical marketing department has spent years, even decades, gathering,
slicing, dicing, and warehousing vast amounts of customer data, much of which is
personal information, individually or in the aggregate (e.g., name, address,
birthdate, Social Security number, driver’s license number). Through
segmentation and analysis of such data, they gain the necessary insight to be both
the voice of the brand to external audiences and the voice of the customer to
engineering, research and development, and other internal teams.
However, being stewards of such a rich data storehouse also increases
marketing’s vulnerability to hacking and unintentional breaches. This exposure,
combined with the team’s access to campaign and CRM databases, more than
qualifies marketing decision makers for a role in incident response planning.
10.5.6 Business Development
The BD stakeholder, often aided by a dedicated account support team, monitors
and manages vital business relationships. Companies with a certain level of value
or prestige receive regular, personalized attention aimed at building trust,
nurturing loyalty, and sustaining the bond over time.
Stakeholders in this position gain firsthand knowledge of handling and keeping key accounts, developing an understanding of customers’ corporate cultures, organizational strengths and weaknesses, decision-makers’ personalities, and management styles. These insights can prove invaluable in
incident response planning, which is why BD stakeholders should have a seat at
the table when the planning process begins.
10.5.7 Communications and Public Relations
PR and communications stakeholders are usually senior, media-savvy
professionals who are highly adept at media relations and crisis management.
They serve as stewards of public image and reputation, overseeing the
development of strategic and tactical programs aimed at informing and
influencing audiences.
10.5.8 Union Leadership
Though their numbers have declined since the 1980s, unionized workers still
comprise a sizable percentage of the American workforce. According to the
Bureau of Labor Statistics, the number of wage and salary workers belonging to
unions stood at 14.3 million, or 10.8 percent, in 2020.7 The AFL-CIO, the United
States’ most prominent and best-known labor federation, comprises more than
12.5 million members across 56 affiliated unions.8
Data belonging to union workers, like that of all employees, is stored on an
organization’s servers and is vulnerable to breach by accidental or unauthorized
access. If their employer reports a data breach, union members will naturally look
to stewards or other union leaders for information and guidance.
These individuals represent union interests and are authorized to act and speak
on members’ behalf—both to internal groups and to the media at large. For these
reasons, any organization whose workers are unionized should consider including
a senior union stakeholder in data breach planning and response.
10.5.9 Finance
In their response-planning capacity, the main role of finance stakeholders is to
calculate and manage the bottom-line impact of breach containment and
correction. Once the potential costs of responding to a breach are computed, it is
up to finance to allocate the necessary resources to fund resolution and recovery.
The chief financial officer (CFO) should also champion more cost-effective
measures that might help mitigate the risk of having a breach in the first place. To
further aid in containing costs, finance executives or procurement leaders can help
negotiate agreements with new or returning data-breach-resolution providers.
10.5.10 President, Chief Executive Officer
Executives lead; employees and stakeholders follow. As with other central
business functions, the president/CEO’s attitude and behavior set the tone for the
entire organization. This is especially true of policies and practices surrounding
data security. Through the actions leaders take or fail to take, and the training
they fund or decline to fund, employees can easily discern the value those leaders
truly place on preventing breaches. Once data is compromised and the shortcomings of an organization’s
security practices become public, it is the top executive who will ultimately bear
the blame.
10.5.11 Board of Directors
Boards of directors are becoming more aware of and concerned about companies’
level of preparedness for an incident. Many board members have received training
about data incidents, and all are concerned about fulfilling their fiduciary
obligations to their organizations. While boards are not tasked with running the
day-to-day operations of a company, many members will want to make sure their
business is ready in the event of an incident.
Boards often have a wealth of knowledge about handling a data incident,
sometimes from their direct experiences with other incidents.
10.5.12 Customer Care
The head of the customer care operation must contend with issues, such as high
employee turnover and access to large amounts of potentially sensitive CRM data.
These factors make customer care teams susceptible to various forms of attacks by
intruders looking to access personal information.
Social engineering is an increasingly prevalent threat that can surface in a call
center, as criminals call repeatedly to probe and test how security procedures are
applied and how often they are enforced. According to Wombat Security, a
subsidiary of Proofpoint, 74 percent of U.S. organizations experienced a phishing
attack in 2020.9 Respondents to Wombat’s survey reported various impacts
stemming from the phishing attacks, including malware infections, compromised
accounts, and a loss of data.10
Aside from deploying the necessary technology as a first line of defense,
training employees to recognize phishing attacks can help reduce the number of
successful attacks. Companies are increasingly training their
employees and, according to the survey, 80 percent say security awareness training
has led to a quantified reduction in phishing susceptibility.11
When trained to recognize unusual employee or caller behaviors or to notice
trends in certain types of calls, customer care teams can help deter criminal
activity and prevent potential breaches.
10.6 Integrating Incident Response into the
Business Continuity Plan
To help operations run smoothly in a time of crisis, many companies depend on a
BCP. The plan is typically drafted and maintained by key stakeholders and spells
out departmental responsibilities and actions teams must take before, during, and
after an event. Situations covered in a BCP often include fires, natural disasters
(e.g., tornadoes, hurricanes, floods), and terrorist attacks.
To ensure proper execution of the BCP, all planning and response teams should
know which stakeholder is responsible for overseeing the plan and who, within a
specific job function, will lead them during an event. Knowledge of the plan and
preparation for executing it can mean the difference between a successful
response and a failed one, especially during the first 24 hours.
In terms of overall organizational impact, a serious or protracted data breach can
rival big disasters. Like a fire, tornado, or terrorist attack, a breach can strike
unexpectedly at any time and leave in its wake damages of immeasurable cost and
consequence. As with other calamitous events, cleaning up a digital disaster can
take weeks, months, or longer; in the aftermath, victims’ lives may be changed
forever.
In a 2020 study surveying business continuity professionals, only 26 percent
described their organization’s business continuity program as having multiple
employees dedicated to it full-time.12 An additional 33 percent of respondents said
that their organization assigns people to business continuity, but it is not their
primary responsibility, while another 27 percent reported their organization’s
BCP is managed by a team of one.13 Considering a breach’s potential repercussions
and the benefits that can result from informed and thoughtful preparation, it is
imperative that companies dedicate more resources to business continuity and
integrate breach response planning into their broader BCP.
10.6.1 Tabletop Exercises
Once breach preparedness is integrated into the BCP or if the company decides to
have a standalone incident response plan, incident response training will likely be
required. This training may take many forms, including workshops, seminars, and
online videos, but often includes tabletop exercises, a strategic mainstay of
corporate trainers and business continuity planners.
A tabletop exercise is a structured readiness-testing activity that simulates an
emergency situation, such as a data breach, in an informal, stress-free setting. It
can both prepare people and identify gaps that may need to be remediated.
Participants—usually key stakeholders, decision makers, and their alternates—
gather around a table to discuss roles, responsibilities, and procedures in the
context of an emergency scenario.
The focus is on training and familiarization with established policies and plans.
Most exercises last between two and four hours and should be conducted at least
semiannually—more often if resources and personnel are available.
10.6.2 Updating the Plan
Soon after concluding the exercise, results should be summarized, recorded, and
distributed to all participants. Perhaps most importantly, fresh or actionable
insights gained from the exercise should be added to the BCP.
It is imperative to keep the incident response plan or BCP current. There is little
strategic, practical, or economic value to a plan that is painstakingly developed but
seldom tested or improved. Those responsible should always ensure the plan
includes the most up-to-date timeline, action steps, policies and procedures, and
current emergency contact information (vital but often overlooked) for all plan
participants. All those involved should be notified of any changes or updates to
the plan.
10.6.3 Budgeting for Training and Response
Breach preparedness training, especially in a large organization, represents a
significant investment. Creating an environment that ingrains data security into
the corporate culture and prepares teams to respond effectively requires an
organization-wide commitment backed by the resources to see it through.
In most cases, the long-term financial and operational benefits of teaching
employees to prevent, detect, report, and resolve data breaches far outweigh the
costs. The strategic upside of investing in breach preparedness includes:
Exposure of critical gaps in applications, procedures, and plans in a
pre-incident phase
Greater overall security for customers, partners, and employees
Reduced financial liability and regulatory exposure
Lower breach-related costs, including legal counsel and consumer
notification
Preservation of brand reputation and integrity in the marketplace
Though organization leaders often agree about the value of breach awareness
and training, there is rarely consensus about who should foot the bill. Many
businesses utilize a shared-cost arrangement that equitably splits training costs
among participating stakeholder groups, such as IT, finance, and HR.
Negotiations between them can include everything from funding levels and
oversight to allocation of unused funds.
However costs are divided, companies should ensure adequate funding is
available to support business continuity and breach preparedness training. To
facilitate the negotiation, parties should focus on quantifying benefits, ROI, and
savings rather than the bottom-line expense to any individual group.
10.6.4 Breach Response Best Practices
Allocating funds for breach response is just as important as training, perhaps even
more so. Typical costs incurred in responding to a breach include threat isolation,
forensic investigation, engagement of legal counsel, PR communications and
media outreach, and reporting and notification, including printing, postage, and
call center support.
Without a breach response budget in place, companies may be forced to
redistribute funds from other critical projects or initiatives. Having to openly
debate the merits and value of one department’s initiatives over another’s may lead
to tension between groups and ultimately delay or detract from optimal breach
response.
10.7 Incident Handling
The process of responding to a breach involves tasks that are not necessarily linear.
Companies facing a potential incident will deal with incident detection, ensure
stakeholders collaborate and know their roles, investigate, ask their legal teams to
conduct a legal analysis, address reporting obligations, and recover from the
situation. While these steps are all part of a well-run response, many of them must
happen in parallel. It can be helpful to think about breach response tasks in broad
categories: secure operations, notify appropriate parties, and fix vulnerabilities.
While these groupings help keep you organized, they are not necessarily meant
to be used as a checklist, as many steps will happen concurrently. For example, a
company’s CEO needs to know about a breach as soon as possible, even if the
breach has not yet been contained. In this case, notifying appropriate parties
would happen simultaneously with containment efforts.
10.7.1 Incident Detection
Unfortunately, there’s not one definitive way to detect a breach. Customer calls or
news reports may alert an organization to trouble before internal sources even
recognize a problem. Consider, for your organization, how you will determine
whether to classify an event as an incident or a breach. Remember also that
privacy is a business function—not a technical function—and relies on other
organizations and departments to execute breach detection and response.
Generally, a privacy incident may be described as any potential or actual
compromise of personal information in a form that facilitates intentional or
unintentional access by unauthorized third parties.
10.7.2 Employee Training
From their first day at an organization, new employees should be taught and
encouraged to assume a privacy-first mindset and reminded of this on a regular
basis. When they observe that leaders and fellow associates are genuinely
committed to data security and privacy protection, new hires are more likely to
respect and comply with established reporting and data-handling policies.
Initial security indoctrination and training should also teach employees to
recognize vulnerabilities and to capture and report basic information when
encountering a potential or actual breach. Employees must understand when and
how to report suspicious incidents to their supervisors, who, in turn, should know
how to properly escalate the incident to internal authorities, such as the privacy
office.
10.7.3 Reporting Worksheets
To emphasize employees’ personal responsibilities when encountering a breach,
policies and procedures should be a regular component of security training and
refreshers. The following worksheet provides a foundation for developing your
own incident-reporting or privacy-training worksheets. The items listed here are
some of the key issues you will need to think about during an incident. Not all of
these issues and questions can be asked or answered at the outset.
Keep in mind, therefore, what makes sense to have as part of an initial intake and
what will be determined at a later date. Also keep in mind how these materials are
stored and distributed, as well as who can access them. Does the incident involve a
bad actor who has possibly accessed your email system? If so, then reporting
should not be occurring through that potentially compromised system!
Keep in mind issues of privilege, as well. All breach planning and preparedness
resources should be reviewed and approved by internal or external legal counsel,
who will think about how best to protect the potentially discoverable
documentation, including tracking and other workflow tools that are created
during an incident. Part of the analysis by the legal team will be to think about the
issues of privilege and determine what content should be documented and how it
should be documented. In many circumstances, materials will need to be prepared
at the direction of counsel.
Sample Worksheet—Prepared at the Direction of Counsel
Facts as they are known:
Name and contact information of person discovering the incident
Date and time the incident was discovered or brought to your attention
Incident date, time, and location
Type of data suspected to be involved
Internal organization or employee data
Client or customer data
Third-party partner or vendor data
Employee’s description of what occurred:
Brief description of how the incident or breach was discovered
Does the incident involve paper records, electronic information, or both?
What type of records or media do you believe were involved?
Paper: letter, office correspondence, corporate document, fax, or
copies thereof?
Electronic: data file or record; email; device, such as laptop,
desktop, or pad-style computer; hard drives in other electronic
equipment (e.g., copy machines)
Media: external hard drive, flash/thumb drive, USB key
Do you know if the device or information was password protected?
Do you know if the device or information was encrypted?
Do you believe information about individuals was exposed, and if so, what
type of information (financial account numbers, passwords, identification
numbers, etc.)?
Can you estimate how many records were involved?
To the best of your knowledge, has the incident been contained? (That is,
has the data leak or loss stopped, or is there still potential for additional
data to be lost?)
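Some organizations also capture this intake electronically so that reports are consistent and easy to route to the response team. Below is a minimal, hypothetical sketch in Python of what such an intake record might look like. The field names simply mirror the worksheet above; they are illustrative rather than a prescribed schema, and any such tool should itself be reviewed by counsel and hosted outside potentially compromised systems.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class IncidentIntakeReport:
        """Initial intake record; fields mirror the sample worksheet above."""
        reported_by: str                     # name and contact of person discovering the incident
        discovered_at: datetime              # date/time the incident was discovered or reported
        incident_location: str               # incident date, time, and location details
        data_types: List[str] = field(default_factory=list)      # e.g., "employee", "customer", "vendor"
        description: str = ""                # brief description of how the incident was discovered
        media_involved: List[str] = field(default_factory=list)  # e.g., "paper", "laptop", "USB key"
        password_protected: Optional[bool] = None   # unknown at intake is acceptable
        encrypted: Optional[bool] = None
        estimated_records: Optional[int] = None
        contained: Optional[bool] = None     # has the data leak or loss stopped?

    # Example intake entry prepared at the direction of counsel (hypothetical values):
    report = IncidentIntakeReport(
        reported_by="J. Smith, Service Desk",
        discovered_at=datetime(2022, 3, 1, 9, 30),
        incident_location="Regional office, shared drive",
        data_types=["employee"],
        description="Misdirected email containing an HR spreadsheet",
        media_involved=["electronic"],
        encrypted=False,
    )

Fields left unset reflect the caveat above: some answers will not be knowable at the time of the initial report and can be completed later in the investigation.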
10.7.4 Collaboration Among Stakeholders
Within any organization, data is viewed and handled by any number of individuals
and groups and is often stored in several disparate locations, even across multiple
states or continents. The potential for compromising sensitive data exists
throughout every business of every size in every industry.
Regardless of organization size, however, all employees have a vested interest in
being vigilant about safeguarding data. The cost of recovering from a single breach
could potentially cripple an organization or business unit and render it unable to
operate or fully employ its workforce.
For example, whenever IT conducts security training, instructors may keep logs
to record who has attended and who has not. IT may then share this information
with HR to help ensure every employee receives the instruction required by
organization policies.
Another example of cooperation between departments is how IT and HR might
work together following detection of a virus or other cybersecurity threat.
Typically, IT would detect the intrusion and prepare very specific containment
instructions for all employees. IT could issue these instructions on its own
or work with HR or communications to ensure distribution to the entire
employee base via all available channels.
10.7.5 Physical Security
In many organizations, the level of technical integration between IT and facilities
is so deep and extensive that regular contact through established lines of
communication is essential to maintaining security.
As technology advances, the lines of responsibility can begin to blur. Computers
and systems managed by IT, for example, directly control doors,
electromechanical locks, remote cameras, and other access-limiting security
measures maintained by facilities staff. This close association demonstrates the
need for ongoing collaboration if the safety and integrity of physical and digital
assets are to be maintained.
10.7.6 Human Resources
Hiring, transfers, promotions, or other changes in employment status may require
revisions to an individual’s data access privileges. When such changes are needed,
HR, IT, and facilities should follow established policies for monitoring and
managing data access.
Other occasions requiring group coordination are employee layoffs or
terminations. These unfortunate but not uncommon events can affect thousands
of individuals or just a handful. In either case, the resulting threat to data security
can take many forms, for which HR and other teams must prepare.
Disgruntled or resentful employees, for example, may try to exact revenge for
the dismissal by stealing or destroying sensitive information. Others may attempt
to obtain organization secrets or intellectual property to sell to or gain favor with
key competitors. Before employees are terminated, HR must inform IT and
facilities so that physical and electronic access for those departing may be turned
off immediately after or, in some cases, even simultaneously with the
announcement. Phones, equipment, and other employee devices must also be
wiped of login and password credentials.
Every organization must ensure it has a procedure for retrieving portable storage
devices or media from departing employees.
10.7.7 Third Parties
Sensitive data is seldom handled or processed in a single location. In today’s global
economy, huge volumes of personal information for which companies are directly
responsible reside in systems and facilities managed by outside vendors, partners,
and contractors. These groups should always be accounted for in incident
detection and planning.
For example, companies should make it standard practice to include a contract
clause requiring third parties to notify them within a certain time frame when
servers, websites, or other business-critical systems are taken offline. It goes
without saying that companies should
always require third parties to promptly communicate any breach of data so that
contingencies can be made to mitigate resulting threats.
Conversely, it is vital for companies that work with third parties to remember
that such communication flows both ways. If the organization’s network is hit with
a virus or comes under a cyberattack, or if there are changes to call center
procedures or employee data-handling policies, the organization has an obligation
to notify its partners immediately.
10.7.8 Tools of Prevention
To those on the front lines, prevention and detection bear many similarities to
defending an occupied fortress. They must protect sensitive information against
treachery and attacks that could come at any time. Regardless of how they
originate, if the fortress is to remain secure, threats must be detected and
eliminated before it is too late.
Today, there are numerous weapons in a security team’s arsenal of prevention.
Some techniques are familiar but still quite effective, while others are emerging
and showing tremendous promise. The successful privacy professional will be
mindful of the need to understand various prevention techniques and their
intended applications and to be purposeful about keeping up with new ones as
security technology advances.
Once breach investigators conclude an actual compromise of sensitive
information has occurred, the prenotification process is triggered. Steps taken may
vary depending on several factors, but the purpose is to confirm the event does
indeed constitute a reportable breach.
10.8 Roles Different Individuals Play During an
Incident
Immediately following the decision to notify affected parties, tactical portions of
the incident response plan begin to unfold. Companies dealing with an incident
may find themselves balancing two possibly conflicting issues: containment and
legal exposures. Companies want to contain and remediate the problem. At the
same time, should the situation be viewed as a data breach, impacted individuals
and, potentially, government agencies must be notified. These notices often result
in lawsuits or regulatory scrutiny.
An incident response process will need to balance these objectives. In a
successful response, legal directs the plan to address legal exposures and
privilege concerns, working hand in hand with an IT leader who is
focused on containment and remediation. Other key stakeholders will also need
direct involvement.
Depending on your organization’s structure, your incident program will need a
clearly delineated leader to rally the troops and keep the process on track. When
thinking about incident response leadership, it is important to keep legal risks in
mind. While an organization may have many lawyers on staff, not all are engaged
in the practice of law. The chief privacy officer (CPO) or chief compliance officer
(CCO), for example, may have legal degrees but not be functioning as lawyers
(i.e., providing legal advice to the organization). General counsels may, at times,
also be concerned about whether their roles will be questioned during litigation.
For this reason, many turn to outside counsel, who are often best positioned to
advise on breach-related matters. The distinction between holding a law degree
and serving in the role of a lawyer is important, because to protect
attorney-client privilege, the investigation will need to be done at the direction of
counsel. Thus, in many organizations, the leader is, of necessity, legal counsel.
Organizations often have many individuals with extensive knowledge about data
breach matters. In addition to legal counsel, who are concerned with privilege, the
CPO or CCO wants to ensure a breach is handled correctly from a compliance
standpoint, and the chief information security officer (CISO) will be focused on
the nuts and bolts of investigation and containment. The CISO’s role may include
recommending outside forensic experts to help ascertain the incident’s cause, size,
and scope. The CISO, in connection with the rest of the IT department, may also
oversee evidence preservation, taking affected systems offline and correcting
vulnerabilities that facilitated the incident.
Team leadership, keeping containment and privilege in mind, will include
contacting and activating appropriate response team members or their alternates.
Meetings during the investigation may be necessary, and the team leadership
should think about how best to schedule these to gather and analyze status reports
and provide guidance as needed. Convening with individual stakeholders to
discuss lawsuits, media inquiries, regulatory concerns, and other pressing
developments is another of the team leader’s tasks.
During the breach, team leaders will also:
Keep individual response team members on track to meet their
performance objectives and timelines
Track budget adherence for all response activities
Contact outside incident response resources to confirm engagement
and monitor performance
Prepare a final analysis of the response effort and lead the post-event
evaluation process
The team leader may also choose to provide senior executives with an overview
of the event and the team’s expected course of action. The breach response team
leader must manage expectations around communications so executives know
they are always as informed as possible and do not need to continually check in
during the response process, which would hinder the team leader’s work.
Below is a list of tips to help manage expectations and communicate with
executives:
Manage executive leaders’ expectations by establishing the frequency of
updates/communications, particularly where you anticipate that
regulatory notifications might be required or press releases may occur
Determine what is appropriate for the situation, and communicate
when/if the frequency needs to change
Hold a kickoff meeting to present the team with the known facts and
circumstances
Provide senior executives with an overview of the event and the team’s
expected course of action
Engage remediation providers to reduce consumers’ risk of fraud or
identity theft
Convene with individual stakeholders to discuss lawsuits, media
inquiries, regulatory concerns, and other pressing developments
There is sometimes confusion about who—among legal, CPO/CCO, and CISO
—should be directing and leading an incident response. The best incident
response teams are those in which the three work together, ensuring maintenance
of privilege, containment, and swift investigations.
10.8.1 Legal
In addition to ensuring the protection of privilege during the investigation, legal
will be focused on determining whether there is a duty to notify under breach
notification laws and, if so, what form that notice should take. The entities to
notify vary by breach.
Legal stakeholders may also recommend forensically sound evidence collection
and preservation practices and engage or prepare statements for state attorneys
general, the U.S. Federal Trade Commission (FTC), and other regulators.
Stakeholders’ knowledge of laws and legal precedents helps teams more effectively
direct and manage the numerous related elements of incident investigation and
response.
Drafting and reviewing contracts is another vital area in which legal stakeholders
should be involved. If the data belongs to a client, legal can interpret contractual
notification requirements and reporting and remediation obligations. Should the
organization become the target of post-breach litigation, the legal stakeholder may
also guide or prepare the defense.
10.8.2 Information Security
While some data incident matters do involve paper records, given the cyber
nature of most incidents, it is almost certain that the information security group
will be engaged to address data compromises. The CISO, chief technology officer
(CTO), or their designated person on the incident team will focus the group’s
expertise on facilitating and supporting forensic investigations, including evidence
preservation. Information security will also likely be tasked with overseeing the
deletion of embedded malware and hacker tools and correcting vulnerabilities
that may have precipitated the breach. Larger companies may establish a
computer emergency response team (CERT) to promptly address security issues.
While internal IT resources may have the experience and equipment to
investigate incidents, it is often more advantageous to bring in outside experts to
identify the cause and scope of the breach and the type and location of
compromised data.
To support other groups with their breach response efforts, the technology team
may also:
Provide a secure transmission method for data files intended for the
print vendor (see section 10.8.3.6.1) or incident call center
Identify the location of potentially compromised data (e.g., test,
development, and production environments)
Determine the number of records potentially affected and the types of
personal information they contain
Clean up mailing lists to help facilitate the printing process
Sort through data to identify populations requiring special handling
(e.g., minors, expatriates, deceased)
Monitor systems for additional attacks
Fix the gaps in the IT systems, if applicable
10.8.3 Other Response Team Members
There are typically two levels to a response team. First are the leaders who will
make the key decisions about how an incident is handled. Second are the
individuals who will be providing input and support to the core team. Those in
the second group will vary depending on the type of incident. A balance should
be struck between ensuring that the appropriate stakeholders are included but
that communications are controlled to avoid legal exposure. Legal counsel can be
very helpful in this regard.
Core team members will be the legal lead, as well as the IT or security head
investigating and handling containment. Additional support may be needed from
other areas as described in table 10-2.
Table 10-2: Incident Response Support Roles by Function
Function: Potential Role
Human resources: Serves as information conduit to employees
Finance: Secures resources to fund resolution
Marketing: Establishes and maintains a positive and consistent message
Public relations: Assumes positions on the front line
Customer care: Handles breach-related call traffic
Business development: Notifies key accounts
Union leadership: Communicates and coordinates with the union
President/CEO: Promptly allocates funds and personnel and publicly comments on the breach
The following describes the typical roles these functions fulfill in most
organizations. Every company is unique, however, and care should be taken to
ensure you are following the protocols of your own organization when thinking
about the roles of team members.
10.8.3.1 Human Resources
Whether breaches affect employees’ data or not, the chief human resources officer
(CHRO) or vice president of HR must guide the HR team’s response activities.
Concerns over the organization’s solvency or stock value can make it necessary to
inform employees of the incident, particularly where details of the event are
publicly known.
Moreover, employees might be contacted regarding the incident by affected
persons, the media, or other parties and need to be directed on how to respond.
If employee data is compromised, the CHRO’s role will become more
prominent, including directing the HR team in identifying the cause and scope
and overseeing communications between management and staff. If the breach is
attributed to an employee, the HR group will take one or more of the following
actions: provide training, make procedural changes, administer the appropriate
corrective action, or terminate the individual. If criminal behavior is discovered,
the legal department and/or law enforcement officials may become involved.
During and after a breach, the HR team may be called upon to perform a variety
of other corrective or educational tasks, such as:
Facilitating employee interviews with internal and external investigators
Identifying individuals who need training
Holding daily meetings to summarize breach updates and create
appropriate communications for employees
Escalating concerns to the appropriate department heads
In the aftermath of a breach, the HR stakeholder may serve as the organization’s
informational conduit, working closely with PR or corporate communications to
inform and update employees about the incident. During the breach, employees
may become concerned about the effects an event might have on their
employment, organization, stock, or strategic business relationships. Therefore,
HR might work with internal or external resources to address and allay such
concerns.
If an incident affects employee records, the HR team might also help
investigators determine the location, type, and amount of compromised data. If
the breach is traced to an organization employee, HR would be expected to
collaborate with the individual’s manager to document the individual’s actions
and determine the appropriate consequences.
10.8.3.2 Finance
The CFO or chief financial and operating officer (CFOO) will be responsible for
guiding the organization’s post-breach financial decisions. Since breaches tend to
be unplanned, unbudgeted events, the CFO should work closely with senior
management to allocate and acquire the funds necessary to fully recover from the
event.
The CFO may help negotiate with outside providers to obtain favorable pricing
and terms of service. The finance team may also collaborate with the legal group to
create cost/benefit models that identify the most practical or economical
approaches.
Tasks commonly undertaken by the finance team during a breach include:
Setting aside and managing appropriate reserves to pay for rapidly
mounting expenses
Working with vendors to extend payment terms and secure potential
discounts
Promptly paying invoices for breach-related activities
Meeting daily with the response team leader to track incident expenses
Requesting ongoing reports from breach providers to manage and track
call center, printing, and credit-monitoring costs
During a data breach, finance stakeholders apply their knowledge of the
organization’s financial commitments, obligations, and cash position to
recommend budget parameters for responding to the event.
In companies where incident response is an unbudgeted expense, the finance
team is often tasked with being both proactive and creative in securing the
resources necessary to fund resolution and notification. This sum can range from
several thousand to several million dollars.
Before or after a breach, finance executives may work with insurance carriers to
negotiate insurance policy updates, including improvements to the commercial
general liability (CGL) policy and the addition of cyber insurance coverage.
Cyber insurance is a relatively new form of protection that fills gaps typically not
covered by the CGL plan. Organizations seeking first-party cyber insurance
coverage have a surprisingly diverse range of choices, including protection against
losses stemming from data destruction and theft, extortion and hacking, and
revenue lost from network intrusion or interruption.
Notification expenses, such as printing, mailing, credit monitoring, and call
center support, may be included in a policy, along with third-party cyber liability
coverage for vendors and partners. The CFO or other finance stakeholder can
offer invaluable assistance in assessing the necessity and cost of updating
insurance coverage.
10.8.3.3 Marketing and Public Relations
The chief marketing officer (CMO) is the person best qualified to help mitigate
brand and reputational damage that can follow a data breach. By collaborating
with the PR team or crisis management firm, the CMO can oversee content
development for press releases, blog and website updates, and victim notification
letters. Monitoring and responding to media coverage and arranging spokesperson
interviews will also fall to members of the CMO’s team.
Since the marketing department may already have the expertise and
infrastructure in place to support large-scale mailings, the CMO could divert
resources necessary to facilitate the notification process. In support of the effort,
the team may also:
Suggest direct marketing best practices to maximize notification letter
open rates
Perform address/database hygiene to improve breach notification
delivery and response rates
Analyze media coverage and report relevant developments to the
response team
Draft scripts for the incident response call center
Develop customer retention and win-back campaigns to minimize
churn and encourage loyalty
Marketers are expert communicators, especially skilled at researching and
crafting highly targeted, consumer-driven messaging. Marketing can work with
management and PR teams to establish and maintain a positive, consistent
message during both the crisis and post-breach notification.
Direct mail expertise may also prove beneficial in supporting the data breach
response. Depending on organization size, marketing may control the physical
infrastructure to help launch and manage a high-volume email or letter
notification outreach. Gaining internal agreement on the post-breach allocation of
marketing resources is an essential element of breach response planning.
When a data breach occurs and response teams are thrust into the fray,
regardless of severity, PR and communications stakeholders quickly assume
positions on the front lines, preparing to respond to potential media
inquiries and coordinating internal and external status updates.
Among their chief roles is to oversee the preparation and dissemination of
breach-related press releases, interviews, videos, and social media content. As the
crisis develops, they also work to ensure message accuracy and consistency and to
minimize leaks of false or inaccurate information.
During and after a breach, PR and communications teams closely monitor
online and offline coverage, analyzing what is being said and to what degree
negative publicity is shaping public opinion. Resulting analysis and
recommendations are shared among key stakeholders and used to adapt or refine
PR messaging.
10.8.3.4 Customer Care
In the aftermath of a breach, customer service can recommend ways of using
internal sources to serve the needs of breach victims and identify an appropriate
outsourced partner. This stakeholder is also likely to work with others to
coordinate the release of breach-related communications with call center
readiness and activities.
Given the customer service training and experience of most call center teams,
using existing staffing and assets to address breach-related inquiries may be a
viable time- and cost-saving option for some companies. Customer care can be
provided scripts to work from to respond to inquiries about the incident. If an
outsourced provider is retained to answer incoming calls, the customer service
executive can play a crucial role in determining acceptable service levels, reporting
duties, and necessary service representative training.
As part of their normal duties, customer care representatives are trained to
remain calm when confronted and defuse potentially volatile encounters before
they escalate. Such training, along with experience working and delivering
scripted messages in a pressure-filled environment, can enable deployment of
these team members to effectively handle breach-related call traffic.
Using internal resources in this manner, however, could potentially degrade
service quality for other incoming service calls, so the prospect of leveraging
existing resources to minimize breach response expenditures may only be
attractive for certain organizations.
In companies where using in-house employees to answer breach-related calls is
not an option, the executive of customer service should consider hiring
experienced outsourcers to handle call overflow or perhaps manage the entire
initiative.
10.8.3.5 Business Development
In the hands of a skilled sales or BD executive, high-value relationships can
flourish for many years. Because of their unique association with customers and
the bond of trust built carefully over time, BD decision makers are often asked to
notify key accounts when their data has been breached. Receiving unfavorable
news from a trusted friend and partner may lessen the impact and mitigate any
potential backlash, such as a loss of confidence or flight to a competitor.
After obtaining the facts from IT, legal, PR, or other internal teams, the BD
stakeholder should contact the account and carefully explain what happened.
Accuracy and transparency are essential. The stakeholder should stick to the
known facts and under no circumstances speculate about or downplay any aspect
of the breach.
Whenever possible, updates or special instructions regarding the breach should
be promptly delivered by the stakeholder in charge of the account. This will
provide reassurances that someone with executive authority is proactively
engaged in safeguarding the account’s interests and security.
10.8.3.6 Outside Resources
In addition to the support of internal functional leaders, a successful response may
depend heavily on the aid of outside specialists retained to manage notification,
call center, and breach remediation activities. It is a best practice to negotiate
agreements with experienced breach response providers prior to having to
respond to an incident.
Professional forensic firms prepare themselves to deploy at a moment’s notice.
Once on the scene, investigators work closely with the organization’s IT group to
isolate compromised systems, contain the damage, preserve electronic evidence,
establish a chain of custody, and document any actions taken.
Depending on the type of evidence uncovered, the affected organization may
need to confer with outside counsel regarding its legal obligations. Breach
definition and applicable reporting requirements usually depend on a variety of
state and federal laws and international regulations, as well as the compromised
organization’s industry. Health care, for example, is subject to a different set of
regulations than non-health care businesses. With so many variables influencing
the notify/don’t notify decision, advice from an experienced breach or privacy
attorney can prove invaluable in meeting legal obligations and mitigating
unnecessary costs.
As the forensic and legal analysis concludes, the decision whether to notify
affected parties must be made. If notification is indicated, the incident response
plan must be activated and “go-live” preparations quickly initiated. While the
organization’s focus shifts to executing the incident response plan, it is also
important to continue addressing the cause of the breach.
Whether through training employees, replacing equipment, installing new
software, adding staff, creating a new oversight position, or replacing the
responsible vendor, some action must be taken and quickly. The situation that led
to the breach should not be allowed to continue unchecked, or the entire costly
exercise may be repeated unnecessarily.
10.8.3.6.1 Print Vendors
A reputable print provider, for example, can be invaluable in leveraging its
equipment and assets to produce, stuff, mail, and track large volumes of letters.
The print vendor may also guide the breach response team leader and appropriate
support staff through the notification effort’s many technical details. Important
but less obvious support activities, such as gathering logos, sample signatures,
letter copy, and address files, must also be completed as production and delivery
deadlines approach.
10.8.3.6.2 Call Center
Once notification letters are delivered, recipients will begin calling and emailing
the organization to inquire about the event and its impact on their lives. In
situations when projected call volume is large enough for call center outsourcing,
it is crucial that the team leader fully understand the vendor’s overall capabilities.
As soon as possible, agreements should be reached and the timeline set for
training and assignment of agents, call-routing programming, message recording,
preparation of service level agreements (SLAs), and call center reporting.
10.8.3.6.3 Remediation Providers
Depending on the nature of the information compromised, breached
organizations may choose to engage remediation providers to reduce consumers’
risk of fraud or identity theft. This may include a third-party service to monitor
credit activity. The service should be offered free to the consumer and include, at
minimum, daily monitoring and alerts of activity from all three national credit
bureaus, identity theft insurance, and fraud resolution services. In some cases,
supplemental services, such as internet scanning for compromised information,
may also be deployed to help protect consumers.
10.8.3.7 Union Leadership Role During an Incident
In preparation for a breach, union stakeholders should identify appropriate
contacts within the organization and become familiar with its overall breach
response plan. Specifically, they should know the roles, responsibilities, and
sequence of events to be taken by other nonunion stakeholders and response
teams.
After a breach occurs, the primary roles for the union stakeholder are
communication and coordination. Working with IT, HR, or PR executives, the
union steward may oversee the use of electronic communication channels, such as
social media or the union’s intranet or website, to provide members with timely
updates and instructions. If member directories and databases are supplied ahead
of time, marketing and call center teams can notify or update members directly
through mail, email, or phone calls.
10.8.3.8 President, Chief Executive Officer
One of the first and arguably most critical steps taken by the top executive is to
promptly allocate the funds and manpower needed to resolve the breach. Having
resources readily available helps teams quickly contain and manage the threat and
lessen its overall impact.
In the period immediately after a breach, PR or communications teams will
handle most of the media interaction. At some point, however, top executives
could be called upon to publicly comment on the breach’s cause or status. As with
any organization attempting to manage a crisis, accuracy, authenticity, and
transparency are essential. Regular status updates from IT and legal, as well as
coaching support from PR/communications, can prepare the president/CEO for
scrutiny from a potentially hostile media.
When addressing the public, executives would do well to follow messaging
recommendations set forth by the communications team. This helps ensure
message consistency and reduces the risks of speaking in error or going off topic.
The CEO, supported by the privacy team, might also be well advised to get in
contact with the responsible data protection authorities (DPAs) or regulators to
discuss the incident and assure them that the breach is being handled from the top
down. With personal information exposed, people’s lives and even livelihoods are
at risk. Therefore, language and tone used to address the public should always be
chosen with great care. The sensitivity with which an organization responds to a
breach and executives’ actions during the event will affect how quickly the
organization’s brand trust and customer relations are restored afterward.
10.8.3.9 Board of Directors
The board is responsible for ensuring a company is well run and focuses on
ensuring the company properly handles risk exposure. During an incident,
company personnel and management will find themselves in frequent
communication with the board. The board will ask questions about how an
incident is being handled and whether the company is properly mitigating its
risks.
10.9 Investigating an Incident
Breach investigation is a subset of breach response and occurs once breach
investigators have concluded that sensitive information has been compromised.
Professional forensic investigators can capture forensic images of affected systems,
collect and analyze evidence, and outline remediation steps. During an
investigation, on the containment side, the focus is on isolating compromised
systems, containing the damage, and documenting any actions taken. On the legal
side, the focus is on determining whether the event constitutes a “breach” as
defined by the relevant laws, preserving electronic evidence, and establishing a
chain of custody.
10.9.1 Containment
During the investigation phase of an incident, containment will be top of mind for
the IT/information security team. The need to prevent further loss by taking
appropriate steps is critical. These include securing physical areas and blocking
bad actors’ access to impacted data. The approach to these issues, however, needs
to be balanced with the legal steps discussed in the next section.
Another part of containment is fixing the vulnerabilities that allowed the bad
actor to access the systems in the first place. After ensuring any breach is
contained, begin analyzing vulnerabilities and addressing third parties that might
have been involved. Where appropriate, it may be necessary to share learnings, but
this should be done in conjunction with the legal steps discussed in the next
section. Factors to consider include:
Service providers—Were they involved? Is there a need to change
access privileges? What steps do they need to take to prevent future
breaches? How can you verify they have taken these steps?
Network segmentation—Ensure your segmentation plan was effective
in containing the breach.
10.9.2 The Importance of Legal Privilege
When investigating an incident, a company will want to make sure its
investigation and related communications and work product are protected by
attorney-client privilege. Attorney-client privilege protects any communication
between a lawyer and their client made for the purpose of giving or obtaining legal
advice. The attorney work product doctrine protects documents or analyses made
by a lawyer or at the direction of a lawyer in anticipation of litigation. A company
should involve its attorneys as soon as it discovers a breach has occurred and
ensure the attorneys are directing the investigation for the purpose of legal advice
or in anticipation of litigation. CC’ing an attorney on an email is not enough to
create privilege. It is better to have the process directed by outside counsel than by
inside counsel, because courts have in some instances ruled there was no privilege
where inside counsel appeared to be acting in a business rather than a legal
capacity. A proper investigation may generate communications and documents
containing facts and opinions that reflect badly on the company, or sensitive
materials such as trade secrets. An investigation directed by counsel will maintain
privilege so the company has the freedom to perform a thorough and effective
investigation into the incident without fearing the communications and
documents created during that process will be used against it in later litigation.
10.9.3 Notification and Cooperation with Insurer
After a cyber incident, a company should notify all its insurance providers because
there may be coverage under more than just a standalone cyber policy. After
notification, the company should receive a coverage letter from the insurer
outlining the scope of coverage. Depending upon the policy, the insurer may
require the company to use the insurer’s preferred service providers during the
investigation. The costs of breach response, including notification, credit
monitoring, PR, and data recovery, can add up quickly. Accordingly, it is
important for companies to seek ongoing updates on the status of their coverage
levels.
10.9.4 Credit Card Incidents and Card Schemes
Companies that have contracted with credit card companies to accept card
payments must notify those companies in the event of a breach. The contract
should be consulted because it is likely to contain specific requirements not only
about notification, but also about post-breach procedures and cooperation with
the card company.
10.9.5 Third-Party Forensics
It may be necessary to engage outside forensics vendors in a complex breach. To
ensure the investigation is privileged, those vendors should be engaged by the
attorneys, preferably by outside counsel, rather than the company. Furthermore,
to the extent the company is insured, it should check with the insurer in case the
insurer requires the company to use particular vendors. The engagement letter
with the vendors should specify their work is undertaken at the direction of
counsel as “work for hire” for the purpose of providing legal advice, and all
documents and communications with the vendor should be labeled confidential
and proprietary.
10.9.6 Involve Key Stakeholders During Investigations
Think carefully about how it is most appropriate to involve key stakeholders.
What is the culture of your company? What are the legal and practical risks in
your situation of involving large groups in potentially sensitive matters? In some
cases, daily meetings with your core response team will be needed. There may also
need to be some reporting out to other stakeholders, especially company
leadership.
10.10 Reporting Obligations and Execution
Timeline
Not all breaches require notification. Notification requirements to regulators
and affected individuals vary. If data was encrypted, or if an unauthorized
individual accidentally accessed but did not misuse the data, the potential harm
and risk may be minimal and applicable laws may not require notification. Under
some laws, however, notification may be required even without harm to an
individual. Coordinating with legal counsel to understand notification obligations
is critical.
Breach-reporting obligations for legal compliance vary by jurisdiction but tend
to adhere to certain principles, including harm prevention, collection limitation,
accountability, and monitoring and enforcement. The legal team will determine,
based on the facts, whether the situation constitutes a breach as defined by
relevant laws such that notification is necessary.
If notification is needed, it must occur in a timely manner. The specific time
frame is often dictated under the relevant statutes. To best accomplish
notification, a timeline to guide the execution and administration of breach
resolution activities is critically helpful. Close coordination among internal and
external stakeholders will also help ensure all plan elements are executed in the
proper sequence.
No strategy is bulletproof, and no timeline perfect. But the crucial execution
phase of the incident response plan is particularly susceptible to setbacks if
realistic, properly sequenced timelines are not observed.
Because of organizations’ vastly differing cultural, political, and regulatory
considerations, it is usually not practical to prescribe a rigid, one-size-fits-all
breach event timeline. There is value, however, in including some or all of the
following communication tactics when formulating a breach response.
10.10.1 Notification Requirements and Guidelines
Escalation refers to the internal process whereby employees alert supervisors
about a security-related incident, who, in turn, report the details to a predefined
list of experts—typically the privacy office—which will then engage IT,
information security, facilities, or HR. Notification is the process of informing
affected individuals that their personal data has been breached.
During the management of a privacy incident, it is imperative that all internal
communications are locked down so inaccurate or incomplete details regarding
the incident are not sent around the organization. The incident response team
should be responsible for all internal communications regarding the incident;
these communications should only be forwarded to staff on a need-to-know basis.
Many statutes prescribe specific time frames for providing notification to
impacted individuals, relevant regulators, or both. The legal requirements change
regularly. For planning purposes, however, it is enough to know that when
investigating an incident, time is of the essence. Timing is even more critical once
the incident has been confirmed to be a breach. Organization privacy
professionals and those charged with incident response planning and notification
should be intimately familiar with the prevailing notification requirements and
guidelines and should work with qualified legal counsel to assist in making the
legal determination about the need to give notice.
Incident response teams should always confirm requirements with legal counsel
experienced in data privacy litigation prior to initiating or forgoing any
notification campaign.
Because of the potential consequences to the organization and those whose data
has been exposed, organizations must quickly initiate the notification process.
This includes verifying addresses; writing, printing, and mailing notification
letters; setting up a call center; and arranging support services, such as identity
theft protection for affected individuals.
In the United States, some states mandate that notification letters contain
specific verbiage or content, such as toll-free numbers and addresses for the three
major credit bureaus, the FTC, and a state’s attorney general. Multiple state laws
may apply to one breach, and notification may be delayed if law enforcement
believes it would interfere with an ongoing investigation.
The notification deadline weighs heavily on top of the public scrutiny and already
stressful ordeal of a data breach. Mishandling notifications can lead to severe
consequences, including fines and other unbudgeted expenses.
10.10.2 Internal Announcements
Attempting to keep employees from learning of a data loss is neither prudent nor
possible. On the contrary, transparency is typically paramount to maintaining
integrity and credibility. When a breach occurs, in most situations, all employees
should receive properly worded communications about the event, along with
specific guidelines and prohibitions about externally disseminating information.
Employees should be told who the designated press contact is. When it comes to
external breach inquiries, employees should always defer to those authorized to
speak about the incident and not provide information themselves.
Internal breach announcements should be timed to avoid conflict with other
organizational initiatives and avoid negative legal exposure. To minimize the
chance of leaks, align messaging, and demonstrate transparency, these
announcements should also be delivered at the same time as external statements.
A breach may affect an organization’s real or perceived financial viability, so the
HR team should prepare to address a range of employee concerns. How these
concerns should be addressed must be considered in light of a company’s legal
risks and exposures. If an event has occurred but does not affect employees
directly, the following activities may help supplement the internal announcement
process:
Creation, approval, and posting of employee-only FAQs
Response training for HR personnel and call center staff
Creation, approval, and distribution of explanatory letter, email, or
intranet communications
10.10.3 External Announcements
The creation and release of external communications should be closely
coordinated with the call center, in consultation, of course, with legal counsel. In
addition to notification letters and press releases, other external strategies and
tactics may be deployed to announce and manage breach communications.
Among the most important of these are engaging a professional crisis management
or communications firm, if no such capability is available internally, and
designating a senior, media-trained executive as the organization’s spokesperson.
Talking points should be developed as quickly as possible, so the spokesperson
may confidently address the media. For consistency, foundational message points
can be used to create content for press releases, intranets, the organization’s
website, and FAQ sheets for call centers. A dedicated toll-free number should be
secured and routed to the correct call center to properly handle incoming calls.
While there is no single correct way to communicate about a breach, messaging
should always be consistent. Potential consequences of inconsistent messaging
include public misunderstandings and assumptions, legal liability issues, loss of
trust and consumer confidence, and evidence of poor planning. Organizations
should also consider call center FAQ review and training, and staffing-level
assessment to ensure adequate coverage.
10.10.4 Regulator Notifications
Legal counsel should provide guidance on which state, federal, or international
regulatory agencies require notification in the event of a data breach. In many
instances in the United States, it is appropriate to contact the state attorney
general and, in some cases, the FTC. In the health care industry, the Department
of Health and Human Services (HHS) may need to be notified, as well.
Notification to these agencies is determined on a case-by-case basis, depending on
the size and scope of the data breach; work with legal counsel experienced in data
breaches to provide such notices.
10.10.5 Letter Drops
Letters and emails are the most common forms of breach notification. Once an
organization decides to notify, the need to meet specific deadlines under applicable
laws can be difficult to reconcile with the constraints of complex print production
and delivery processes.
Unlike printing documents from an office computer, industrial-scale printing
requires a great deal of preparation and quality control. Verifying the completeness
of the mailing file, the consistency of its format, and the age of the mailing list data
can add days to the production timeline (a minimal verification sketch appears
after the list below).
Moreover, changing content during production or delivering assets (e.g., logos,
signatures, copy) after specified deadlines can unnecessarily delay notification and
burn precious days in an already accelerated schedule.
Here are some time-proven methods for ensuring a more efficient process:
If appropriate, establish a secure data transfer channel
Create letter copy, and put it into Microsoft Word or another preferred
format
Obtain any necessary content approvals from the compliance and/or
legal team
Send usable data files to the print shop, including a properly formatted
logo and electronic signature
Supply a return address for undeliverable mail
Review final letter layout for a legible, aesthetically pleasing appearance
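To make the file-verification step concrete, the following sketch, written in Python with hypothetical column names and an illustrative 90-day freshness threshold (neither is prescribed by any statute or by this text), flags common completeness problems before a mailing file is handed to the print shop.

# Minimal mailing-file check before hand-off to the print shop.
# Column names and the 90-day freshness threshold are illustrative assumptions.
import csv
from datetime import datetime, timedelta

REQUIRED_COLUMNS = {"first_name", "last_name", "address1", "city", "state", "postal_code"}
MAX_LIST_AGE_DAYS = 90  # hypothetical freshness threshold for address data

def check_mailing_file(path: str, list_extract_date: datetime) -> list[str]:
    """Return a list of human-readable problems found in the mailing file."""
    problems = []
    if datetime.now() - list_extract_date > timedelta(days=MAX_LIST_AGE_DAYS):
        problems.append("Mailing list extract is older than the freshness threshold.")
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing_cols = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing_cols:
            problems.append(f"Missing required columns: {sorted(missing_cols)}")
            return problems
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            empty = [c for c in REQUIRED_COLUMNS if not (row.get(c) or "").strip()]
            if empty:
                problems.append(f"Row {i}: empty fields {empty}")
    return problems

Any problems returned by such a check should be resolved before files are sent for production, since corrections made after printing has started burn precious days in an already accelerated schedule.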
When planning letter drops, remember a data breach may also involve criminal
activity and, therefore, law enforcement personnel. If officials determine the
notification will impede their investigation or threaten national security, delays
can be expected.
10.10.6 Call Center Launches
Call centers that are already in place normally have the infrastructure, policies, and
procedures needed to switch seamlessly from providing general customer service
to answering breach-related calls. For the switch to be successful, every call center
component must be properly prepared. Adequately staffing the incident response
team is one particularly critical consideration.
To increase headcount, temp agencies or outsourcers may be retained. The next
steps are drafting phone scripts, sometimes in multiple languages, conducting call-
handling training, and recording a message for the call tree. A dedicated toll-free
number should be assigned and a call escalation process identified. Other
preparations may include:
Creating, approving, and uploading email templates
Training the quality assurance team on the details of the initiative
Pulling and analyzing reports
Monitoring call levels to determine staffing needs
10.10.7 Remediation Offers
Besides trying to protect incident victims’ identities, companies tend to offer
remediation services to soften the blow of a breach. If a remediation offer is made,
the organization should facilitate the dialog between the parties involved, which
typically include the credit-monitoring provider, letter-mailing house, and call
center. Some vendors can provide all three services, though not in every case.
As a best practice, the notification letter would feature a full description of the
remediation product, enrollment instructions, and a customer service phone
number or email address. An activation code, by which the recipient may redeem
the remediation product, should also be included. To ensure close collaboration
among the three entities (to the extent the company is not working with a single
provider that delivers all three services), the following are suggested steps the
organization can take, or should ensure its vendors are taking:
Remediation Organization
Create one activation code per affected person for inclusion in
notification letters (in some cases, the credit-monitoring provider will
take a different approach to assist in streamlining the process)
Provide a full product description to the letter-mailing house and the
call center vendor, along with a toll-free number and an enrollment
website URL
Launch and test the website for enrollments
Ramp up and train internal call center staff to enable phone enrollments
and answer product questions
Approve the final letter copy as it pertains to the accuracy of the offer
details
Letter-Mailing House
Obtain product description and activation codes from the remediation
firm
Merge product copy and activation codes into notification letters
Print and mail letters according to agreed-upon standards and timelines
Call Center
Receive product description and, as appropriate, train staff on basic
product questions
Determine and institute call transfer procedures among the vendor call
center, remediation firm, and affected organization
10.10.8 Progress Reporting
There is some debate about the level and type of progress reporting that is needed
for an incident. Keep in mind every situation is different. That said, making sure
the incident team is well informed and moving toward a unified goal is critical.
For complex or large-scale data breaches in which the legal team determines
notification is required, there will be a significant number of letters mailed, calls
received, and credit-monitoring enrollments. Keeping track of this information
and being prepared to report up or down is important, and having a strong
reporting structure plays a pivotal role in distilling the chaotic flow of reports into
a clearer, more manageable stream.
You will need to give different types of reports to different stakeholders based on
their need to know. Regardless of audience, progress reporting during the breach
recovery period should focus on the question, “What data do they need, and
when do they need it?”
During the breach notification period, the incident team may be called upon to
provide metrics about how the event is being received by affected individuals, the
press, regulators, and the public generally. Requests may come from company
leadership, the board, impacted departments, or even regulators who are closely
watching the company. Typically, stakeholders will want weekly updates, although
in some circumstances, daily reports are requested.
The type of reporting and frequency should always be customized to each
individual event. When putting together a reporting plan, keep in mind who is
asking, what they need to know, and legal issues of privilege and risk. The answers
to these questions will inform the format of the reporting content and structure.
During the active notification period, mail drops should be reviewed at least
daily to ensure alignment with approved delivery deadlines. Additionally, mailing
and call center activities should be closely coordinated to ensure response staffing
levels are optimal. In situations in which victims receive credit activity monitoring
or other remediation, it may be beneficial to track enrollments and customer
escalations daily for at least the first few weeks.
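As a minimal sketch of what such daily tracking might look like, assuming illustrative metric names rather than any prescribed reporting format, the daily figures can be captured as simple records and rolled up into the weekly summaries stakeholders typically request.

# Illustrative daily tracker for breach-notification metrics.
# Metric names are assumptions; adapt them to the event and the audience.
from dataclasses import dataclass
from datetime import date

@dataclass
class DailyBreachMetrics:
    day: date
    letters_mailed: int
    calls_received: int
    monitoring_enrollments: int
    customer_escalations: int

def weekly_rollup(days: list[DailyBreachMetrics]) -> dict[str, int]:
    """Aggregate daily figures into the weekly summary stakeholders often request."""
    return {
        "letters_mailed": sum(d.letters_mailed for d in days),
        "calls_received": sum(d.calls_received for d in days),
        "monitoring_enrollments": sum(d.monitoring_enrollments for d in days),
        "customer_escalations": sum(d.customer_escalations for d in days),
    }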
In the first days or weeks, depending on the severity of the incident, senior
management may request briefings on the developments daily. Similarly, the PR
group will often track daily breach-related news coverage to confirm the
organization’s event narrative is being interpreted as intended. To mitigate public
backlash, clarifying responses to inaccuracies or negative press should be prepared
and practiced in advance.
Investors and other external stakeholders will naturally want to keep abreast of
all breach-related developments. If the organization is publicly traded, a good
practice is to update senior management at least weekly for the first few months
after breach notification.
Regular reviews should be scheduled to update functional leaders, senior
managers, and other key stakeholders about the status and impact of the incident
response effort. A breach’s effects on employee productivity and morale should
not be underestimated, so keeping workers informed about how the incident is
being handled is always a top priority.
10.11 Recovering from a Breach
10.11.1 Response Evaluation and Modifications
Incident response can be tested with a variety of scenarios. But even a well-written
plan can falter when the theory behind it collides with realities on the ground. As
teaching tools, real-life breaches are far superior to hypothetical scenarios, so
lessons learned from all incidents must afterward be captured, recorded, and
incorporated into the plan.
Once the initial chaos of a breach has subsided, the affected organization should
carefully evaluate its incident response plan. Even the most well-thought-out
responses can benefit from the lessons learned after a live event.
Among the most beneficial questions to answer about the response are:
Which parts of the process clearly worked as intended?
Which worked only after some modification?
Which did not work at all?
What did the team do exceptionally well? What didn’t go well?
Were any unforeseen complications encountered? How could they have
been avoided?
How well was the team prepared for the unexpected?
How realistic were the plan’s response timelines?
What was the difference between actual and budgeted costs?
Was the team sufficiently staffed?
Were all relevant parties part of the team?
What could be learned, and what could be improved upon, for the next
potential breach?
10.11.2 Calculating and Quantifying the Costs
While many breach-related costs can be identified and tallied using actual
invoices, others are less apparent. Lost business opportunities and damage to
brand equity are examples of costs that may affect the bottom line for years
following a breach. Table 10-3 includes typical categories of breach-related
expenses in cases when costs can be traced to specific activities.
Table 10-3: Breach-Related Expenses
Legal Costs
Punitive costs: Fines, lawsuits, and other penalties stemming from negligence in preventing or improperly responding to the breach
Internal Costs
Outside counsel: Legal review of the organization’s contractual and regulatory obligations after a breach; may include defense costs if litigation results
Crisis management/PR: Experts to help the organization craft and deliver cohesive, properly timed, and customer-friendly communications about the incident
Forensic investigators: Specialists to confirm, contain, and eliminate the cause of the breach and determine the size, scale, and type of records affected
Call center support: Staffing, training, and support of the customer care team responsible for handling calls and emails related to the incident and its aftermath
Equipment replacement and security enhancements: Equipment changes, system upgrades, and physical security improvements to mitigate the current breach and prevent future incidents
Insurance: Retention (deductible) payments and fee increases associated with the breach
Card replacement: The cost of issuing new cards (in incidents when credit card numbers have been compromised)
Employee training: Educational activities intended to improve upon previous programs that facilitated the breach
Remediation Costs
Victim notification: Creation and delivery of letters, emails, web pages, and other methods/channels to notify affected individuals about the incident
Remediation offers: Provision of services such as credit monitoring, fraud resolution, and identity theft insurance to breach victims
Victim damages: Costs related to correcting damages incurred by breach victims
Intangible Costs
Customer retention: Marketing campaigns designed to prevent customer attrition and win back lost business following an incident
Lost revenue and stock value: Reductions in stock price, lost customers, and other revenue decreases directly related to the loss
Opportunity costs: Lost productivity and revenues, as employees suspend regularly assigned tasks to assist with breach response
According to the Ponemon Institute’s 2019 study, the probability of a data
breach in a 24-month period is 29.6 percent.14 The numbers shown in table 10-4
can be helpful when a privacy manager is attempting to conduct a cost-benefit
analysis or get buy-in or budget from organizational leadership for breach
preparedness measures. Several factors can affect the per-capita cost of a data
breach—both positively and negatively. Knowing this can help organizations
prioritize their spending to mitigate potential costs of a breach.
Table 10-4: Average Cost Saved per Record in the Event of a Breach15
Incident response team +$13.66
Extensive use of encryption +$13.59
Employee training +$10.31
Business continuity management (BCM) involvement +$10.56
Participation in threat-sharing +$9.27
Artificial intelligence platform +$8.97
Use of security analytics +$7.68
Extensive use of data loss protection (DLP) +$6.91
Board-level involvement +$7.68
CISO appointed +$6.85
Data classification schema +$5.11
Insurance protection +$6.05
CPO appointed +$2.08
Provision of ID protection +$0.56
Consultants engaged -$4.17
Rush to notify -$5.61
Extensive use of Internet-of-Things (IoT) devices -$5.95
Lost or stolen devices -$6.75
Extensive use of mobile platforms -$9.33
Extensive cloud migration -$11.39
Compliance failures -$13.47
Third-party involvement -$14.04
Note: Positive figures indicate money saved; e.g., having an employee training program
saves, on average, $10.31 per record in the event of a data breach, while negative figures
indicate factors that increase the average cost per record.16
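To illustrate how the per-record figures in table 10-4 and the 29.6 percent breach probability cited above might feed a simple cost-benefit estimate, the following sketch uses a hypothetical number of records at risk and an arbitrary selection of mitigating factors; it is an illustration of the arithmetic, not a calculation from the report.

# Illustrative cost-benefit sketch for breach-preparedness spending.
# The record count is hypothetical; per-record savings come from table 10-4.
# Summing factor averages is a simplification for illustration only.

RECORDS_AT_RISK = 500_000                # hypothetical number of records held
BREACH_PROBABILITY_24_MONTHS = 0.296     # Ponemon Institute 2019 figure cited above

# Selected mitigating factors and their average per-record savings (table 10-4).
savings_per_record = {
    "Incident response team": 13.66,
    "Extensive use of encryption": 13.59,
    "Employee training": 10.31,
}

total_saving_per_record = sum(savings_per_record.values())

# Expected (probability-weighted) savings over the 24-month window.
expected_savings = RECORDS_AT_RISK * total_saving_per_record * BREACH_PROBABILITY_24_MONTHS

print(f"Per-record savings if a breach occurs: ${total_saving_per_record:,.2f}")
print(f"Expected savings over 24 months: ${expected_savings:,.2f}")

The resulting expected figure can then be weighed against what each measure would cost to implement when seeking budget, keeping in mind that the per-record averages are industry-wide and may not reflect any single organization.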
10.12 Benefiting from a Breach
While no organization would choose to experience a data breach, failures breed
opportunity for organizational change and growth. How can you ensure you walk
away from a breach better prepared for the future? Be sure to conduct a breach or
incident response review or a post-incident assessment. At minimum, review
these items:
Staffing and resourcing
Containment, including timing and processes
The C-suite commitment, including signoff on new measures and
allocation of resources
Clarity of roles of the response team and others
The notification process for individuals, regulatory bodies, and others
Your organization’s objectives for breach management will likely change after an
incident. Take this time to renew your funding, focus, and commitment, and
regularly review and audit your plan to make sure it remains current.
10.13 Summary
A proper breach response plan provides guidance for meeting legal compliance,
planning for incident response, and handling privacy incidents. An organization
needs to be prepared to respond to its internal and external stakeholders,
including regulators. The privacy professional and related team members need to
be prepared to respond appropriately to each incoming request to reduce
organizational risk and bolster compliance with regulations.
Endnotes
1 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2020, pp. 6-8, July 2020,
https://www.capita.com/sites/g/files/nginej291/files/2020-08/Ponemon-Global-Cost-of-Data-Breach-
Study-2020.pdf.
2 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2020, p. 9.
3 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2020, p. 9.
4 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2020, p. 9.
5 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2020, p. 40.
6 IAPP and FairWarning, Benefits, Attributes and Habits of Mature Privacy and Data Protection Programs, p. 4,
October 2020,
https://iapp.org/media/pdf/resource_center/benefits_attributes_habits_of_mature_privacy_data_prot
ection_programs.pdf.
7 U.S. Department of Labor, Bureau of Labor Statistics, “Union Members—2020,” news release, January 22,
2021, https://www.bls.gov/news.release/pdf/union2.pdf.
8 AFL-CIO, accessed February 2021, https://aflcio.org/about-us.
9 Proofpoint, 2021 State of the Phish: An In-Depth Look at User Awareness, Vulnerability and Resilience, p. 4,
accessed February 2021, https://www.intelligentcio.com/wp-content/uploads/sites/20/2021/03/2021-
State-of-the-Phish-WP.pdf.
10 Proofpoint, 2021 State of the Phish, p. 4.
11 Proofpoint, 2021 State of the Phish, p. 29.
12 SAI Global, Addressing the COVID-19 Gap: How Business Continuity Professionals Can Propel Business
Forward, p. 5, March 2020, https://www.saiglobal.com/hub/whitepapers/addressing-the-covid-19-gap.
13 SAI Global, Addressing the COVID-19 Gap, p. 5.
14 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2019, p. 10, July 2019,
https://www.poweredbystl.com/wp-content/uploads/2020/04/2019_Cost_of_a_Data_Breach
_Report.pdf.
15 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2019, p. 38.
16 IBM Security and the Ponemon Institute, Cost of a Data Breach Report 2019, p. 38.
About the Contributors
Executive Editor and Contributor
Russell Densmore, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Russell Densmore is the global data protection leader for Raytheon Technologies.
With more than 30 years of experience, he brings a multidisciplinary
understanding to data protection, data governance, data compliance, digital
forensics, and enterprise risk management. He has been recognized by the U.S.
attorney general and Federal Bureau of Investigation for support against
cybercriminals.
Densmore is renowned for information security, cyber forensic investigations,
privacy program management, and physical security. He is a proven cybersecurity
professional with a record of establishing and managing multiple cross-functional
data protection teams.
Densmore co-chairs the National Defense Industrial Association (NDIA)
cybersecurity, privacy subcommittee with longtime colleague and contributing
author Edward Yakabovicz. He is actively involved with the Privacy Engineering
Section of the IAPP and, as a privacy pioneer, often speaks at IAPP and other
privacy events to promote the profession. He chairs the OneTrust Privacy
Connect chapter for Los Angeles and mentors others on how to obtain the most
benefit from privacy program management platforms.
Densmore holds a master’s of engineering degree in cybersecurity policy and
compliance from The George Washington University and a bachelor’s of science
degree in computer information systems/networking from Regis University.
Contributors
Susan Bandi, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Susan Bandi currently serves as a compliance professional at Oracle. With more
than 25 years of information technology experience, she has served in multiple
leadership and executive roles responsible for application development,
infrastructure, and information security. For the past 17 years, her focus has been
on IT security, privacy, business continuity/disaster recovery, and data
governance. She has served as global chief privacy officer for Monsanto/Bayer and
was the assistant vice president and chief information security officer
(CISO)/chief privacy officer (CPO) for Enterprise Holdings, Inc.
She is experienced in providing thought leadership and implementing effective,
comprehensive global solutions in the areas of enterprise risk management, data
governance, data privacy, IT security, and business continuity. She also serves as
an adjunct professor in the Cybersecurity Master’s Program at Washington
University in St. Louis.
She is an active member of the IAPP, Executive Women in Privacy, Chief Privacy
Council Board, Future of Privacy Forum (FPF), ISACA, CISO Coalition, and FBI
Citizen Academy.
João Torres Barreiro, CIPP/E, CIPP/US
João Torres Barreiro is a privacy leader with extensive experience designing and
implementing privacy programs at multinationals operating in the
pharmaceutical, IT, and financial sectors.
He is currently the chief privacy officer (CPO) of BeiGene, a global commercial-
stage biopharmaceutical company, focused on developing and commercializing
innovative molecularly targeted and immuno-oncology drugs for the treatment of
cancer. He also serves as a member of the Research Advisory Board of the IAPP.
Before joining BeiGene, he was the CPO of Willis Towers Watson and
previously HCL Technologies. He also practiced as an attorney in law firms and as
a legal counsel at Celgene, IBM, the European Medicines Agency, and the
Portuguese Ministry of Health.
In 2020, he was listed among the “Global Top 100 Data Visionaries: Leaders who are
vividly innovating with analytics without compromising on trust and privacy,”
largely because of his work as a consultative expert member on digital
ethics/artificial intelligence (AI) at the European Insurance and Occupational
Pensions Authority (EIOPA), where he helped to develop a framework for a
sustainable use of AI by the insurance industry in compliance with data ethics and
privacy principles.
John Brigagliano
John Brigagliano focuses his practice on data privacy and technology licensing
with a particular emphasis on guiding clients through California Consumer
Privacy Act (CCPA)/California Privacy Rights Act (CPRA) and EU General Data
Protection Regulation (GDPR) compliance issues. With respect to California
privacy, for example, Brigagliano currently co-leads CCPA and CPRA compliance
for a marketing automation platform and regularly advises a cloud-based security
and interactive home services provider on CCPA compliance matters. He also
regularly advises U.S. retailers on CCPA-related digital advertising issues.
Prior to launching his legal career, Brigagliano was a special education teacher at
Seaford Senior High School in Seaford, Delaware, where he was placed as part of
Teach for America and, along with teaching students with disabilities, he coached
varsity golf. He earned an undergraduate degree from Wake Forest University and
graduated from Vanderbilt Law School.
Ron De Jesus, CIPP/A, CIPP/C, CIPP/E, CIPP/US, CIPM, CIPT, FIP
Ron De Jesus is the head of global privacy at Grindr, the world’s largest social
networking application for the LGBTQ+ community, and founder and CEO of
De Jesus Consulting, a boutique privacy consulting firm specializing in privacy
program and privacy strategy development, controls implementation, and privacy
assessments and reviews.
Previously, De Jesus led the privacy function at Tinder, where he was
responsible for developing and operationalizing the company’s EU General Data
Protection Regulation (GDPR) strategy. De Jesus later served as privacy program
manager for all North American brands owned and operated by Match Group,
Inc., including Tinder, Plenty of Fish, OKCupid, Match.com, and Hinge.
Prior to Tinder, De Jesus served as the global privacy director for Tapestry, Inc.,
based in New York, where he developed its global privacy program and managed
privacy compliance efforts for all its brands, including Coach, Stuart Weitzman,
and Kate Spade.
In 2013, De Jesus helped establish PwC’s Data Protection & Privacy Practice in
New York, where he led privacy engagements globally. Prior to PwC, he consulted
with Deloitte, where he designed functional privacy controls and managed
company registrations with EU authorities. In his early career, De Jesus consulted
for Anzen, Inc., a boutique data privacy firm based in Toronto, Ontario, where he
led numerous privacy impact assessments (PIAs) for large health IT system
implementations across Canada.
De Jesus has also served as privacy director for American Express’s Global
Network Services (GNS), where he developed the business unit’s privacy policy,
developed its privacy-by-design (PbD) program, led its strategy to comply with
the EU ePrivacy Directive, and served on the Amex Privacy Board.
De Jesus sits on the IAPP Diversity in Privacy Advisory Board and was a former
member of the IAPP Publications Board and CIPT Exam Development Board.
He previously co-chaired the Los Angeles IAPP KnowledgeNet and New York
IAPP KnowledgeNet chapters and is a regular contributor to the Privacy Advisor.
De Jesus is also an IAPP Training Partner and Faculty Member and delivers both
IAPP-approved and IAPP-sponsored trainings.
Jonathan Fox, CIPP/US, CIPM
Jonathan Fox, director of privacy by design, is a member of Cisco’s chief privacy
office and coauthor of The Privacy Engineer’s Manifesto: Getting from Policy to Code
to QA to Value.
With more than 20 years of privacy experience, Fox’s principal areas of focus
have been product development, government relations, mergers and acquisitions
(M&A), and training. In addition to being a CIPP/US and CIPM, he was a
Certified Information Security Manager (CISM).
Prior to Cisco, Fox was senior privacy engineer at Intel. His previous roles
include director of data privacy at McAfee, director of privacy at eBay, deputy
chief privacy officer at Sun Microsystems, and editor-in-chief at Sun.com.
Fox frequently speaks at industry events and is a member of the IEEE P7002
Personal Data Privacy Working Group and chair of the U.S. Technical Advisory
Group for ISO/PC 317 Consumer protection: privacy by design for consumer
goods and services.
Jon Neiditz, CIPP/E, CIPP/US, CIPM
Jon Neiditz co-leads the Cybersecurity, Privacy and Data Governance Practice at
Kilpatrick Townsend. One of the first lawyers to focus broadly on data governance
and knowledge asset protection, he remains the only person recognized by Best
Lawyers in America both for Information Management Law and for Privacy and
Data Security Law. Most recently, he has been recognized for Technology Law, as
well.
For decades, Neiditz has helped clients anticipate, obviate, and manage
information privacy and security risks; appropriately monetize information;
comply with privacy, data protection, and cybersecurity laws around the world in
pragmatic ways; and contain and prevent harm from incidents while maximizing
resilience and minimizing regulatory issues.
Neiditz has always collaborated with clients and peers on pragmatic innovation;
for example, in the 1990s, he helped to define what accountable health
care and health care reform might look like; in the 2000s, he helped to invent
multidisciplinary incident response and the role of the “breach coach,” as well
as define proportionate search in e-discovery; and in the 2010s, he helped
pioneer governance of “big data” and protection of “crown jewels.”
Neiditz has been selected as a “Cybersecurity Trailblazer” by the National Law
Journal, as a Ponemon Fellow, and by Who’s Who Legal for Data Law. Neiditz’s JD is
from Yale Law School, and his bachelor’s of arts is from Dartmouth College.
Chris Pahl, CIPP/C, CIPP/E, CIPP/G, CIPP/US, CIPM, CIPT, FIP
Chris Pahl is the manager of cybersecurity governance for a West Coast utility
company, overseeing cyber standards, policies, and technical controls and
requirements, as well as supply chain risk management. He is responsible for
managing the strategic plan to ensure cybersecurity governance functions align
with different company stakeholders’ priorities.
During the prior 12 years at the same company, as a privacy professional, Pahl
helped develop overarching enterprise privacy programs while providing ongoing
advisory services to business units, including customer service, information
technology (IT), human resources (HR), sales, marketing, legal, and
procurement, determining compliance with ethical and regulatory requirements
pertaining to the collection, protection, use, and transfer of personally identifiable
information (PII). He was responsible for privacy-related activities on matters
such as privacy impact assessments (PIAs), regulatory audits, and company due
diligence encompassing 14 million customers and 50,000 employees and retirees.
Pahl chaired the multidisciplinary Privacy Incident Response Teams
investigating potential privacy incidents and managing remediation actions. He
has built and operationalized privacy compliance programs, completing multiple
privacy assessments in the areas of enterprise data transfers and customer and
employee support systems. Pahl worked on engagements supporting system
inventories and audits, data encryption, and implementation of data loss
prevention (DLP) applications in live operating environments. He excels in
developing ground-up privacy programs for large companies.
Pahl holds a doctorate degree in Strategic Leadership, certifications in privacy
and project management, and Six Sigma green and black belts. He actively writes
for industry publications.
Liisa Thomas
Liisa Thomas is a partner in Sheppard Mullin’s Chicago and London offices and
lead of its privacy and cybersecurity team, providing thoughtful legal analysis
combined with real-world practical advice. She also serves as an adjunct professor
at Northwestern Law School teaching privacy and data security courses, where
she is the recipient of the Edward Avery Harriman Law School Lectureship award.
Thomas is the author of the definitive treatise on data breach, Thomas on Data
Breach: A Practical Guide to Handling Worldwide Data Breach Notification,
described as “a no-nonsense roadmap for in-house and external practitioners
alike.” She is also the author of the new treatise on data privacy, Thomas on Big
Data: A Practical Guide to Global Privacy Laws, described as a “key text” and
“perfect for the busy practitioner.”
As an industry leader in the privacy and data security space, she has been
recognized by Leading Lawyers Network, Chambers, and the Legal 500 for her
depth of privacy knowledge. Thomas was named to Cybersecurity Docket’s
“Incident Response 30,” recognized as 2017 Data Protection Lawyer of the Year–
USA by Global 100, 2017 “U.S. Data Protection Lawyer of the Year” by Finance
Monthly, and a “Leading Woman Lawyer” by Crain’s in 2018.
Thomas received her JD from the University of Chicago and is admitted to the
bar in Illinois and the District of Columbia.
Amanda Witt, CIPP/E, CIPP/US
Amanda Witt is a partner at Kilpatrick Townsend & Stockton LLP and co-leader
of the firm’s Technology, Privacy & Cybersecurity team.
Witt advises clients on U.S., EU, and global privacy; cybersecurity; technology
transactions; e-commerce; outsourcing; licensing and procurement; intellectual
property protection; strategic alliances; software and mobile application
development, licensing and global manufacturing; and distribution agreements
relating to internet-connected devices.
She is a frequent presenter on topics related to U.S., EU, and global privacy, as
well as technology-related topics, such as artificial intelligence (AI), and has
published articles on cybersecurity, privacy, cloud computing, electronic
signatures, security laws, outsourcing, and media.
Witt earned her LLM in international intellectual property, magna cum laude,
from Catholic University at Leuven, Belgium, and her JD, cum laude, from Emory
University School of Law. She earned a bachelor’s of arts, magna cum laude, from
the University of Florida, where she was inducted into Phi Beta Kappa.
Edward Yakabovicz, CIPP/G, CIPM, CIPT
Edward Yakabovicz is a Northrop Grumman fellow with specialization in
cybersecurity, information security management, engineering, and privacy
management. With more than 32 years of experience, Yakabovicz is an
experienced speaker who is published by the SANS Institute, International
Council on Systems Engineering (INCOSE), National Defense Industrial
Association (NDIA), Information Systems Security Association (ISSA), and
International Association of Privacy Professionals (IAPP).
Yakabovicz currently chairs the NDIA Privacy Subcommittee and has held
board positions with several colleges and universities and with the Information
Systems Security Association and the IAPP.
He coauthored the first and second editions of the textbook Privacy Program
Management: Tools for Managing Privacy Within Your Organization and has
contributed to many cybersecurity and privacy publications, both in print and online. In addition
to his Certified Information Systems Security Professional (CISSP) accreditation,
Yakabovicz holds numerous certifications across security and privacy industries
and has received numerous awards for leadership, excellence, and innovation.
Index of Searchable Terms
A
AAPI (Agency of Access to Public Information, Argentina)
Acceptable use policies (AUP)
for cloud computing
for employee information protection
Access
acceptable use policies and
in data subject rights
to employee information
withdrawals of
Access, rectification, cancellation, and opposition rights (ARCO rights [Mexico])
Access control
Accountability
Acquisitions, divestitures, and mergers, data assessments in
Active scanning tools for monitoring
Activity monitoring
Act on the Protection of Personal Information (APPI, Japan)
AdChoices
Administrative controls
Advertising, unsolicited
AFL-CIO
Age Appropriate Design code (UK Information Commissioner’s Office)
Agency of Access to Public Information (AAPI, Argentina)
AICPA/CICA Privacy Maturity Model
AI (artificial intelligence) systems
American Institute of Certified Public Accountants (AICPA)
American National Standards Institute (ANSI)
ANPD (Autoridade Nacional de Proteção de Dados, Brazil)
APEC (Asia-Pacific Economic Cooperation) Privacy Framework
APPI (Act on the Protection of Personal Information, Japan)
Apple Corp.
ARCO rights (access, rectification, cancellation, and opposition [Mexico])
Argentina Agency of Access to Public Information (AAPI)
Article 29 Working Party (WP29)
Artificial intelligence (AI) systems
Asia-Pacific Economic Cooperation (APEC) Privacy Framework
Assess. See also Data assessments
Asset management
Attestation
Audience
Auditing. See also Sustain phase: monitoring and auditing performance
Audit log wiping
AUP (acceptable use policies). See Acceptable use policies (AUP)
Australia, data subject rights in
Australia Office of the Australian Information Commissioner (OAIC)
Automated decision-making
Autoridade Nacional de Proteção de Dados (ANPD, Brazil)
Awareness. See Sustain phase: training and awareness
B
Bandi, Susan
Barreiro, João Torres
BCRs (binding corporate rules)
Benchmarking. See also Sustain phase: monitoring and auditing performance
Benefiting from data breaches
Best practices
data breaches
in information security
for internal partnership development
Binding corporate rules (BCRs)
Biometric Information Privacy Act (BIPA, Illinois)
Board of directors role
in data breaches
in incident planning
Brazil Autoridade Nacional de Proteção de Dados (ANPD)
Brazil’s Lei Geral de Proteção de Dados (LGPD)
Breaches. See Data breaches
Brigagliano, John
Brown University’s Executive Master in Cybersecurity
B2B (business-to-business) organizations
B2C (business-to-consumer) organizations
Business continuity management
Business continuity plan, incident response in
Business development team
in data breaches
in incident planning
privacy procedures and
Business line privacy leaders
Business resiliency, metrics for
Business-to-business (B2B) organizations
Business-to-consumer (B2C) organizations
C
CAC (Cyberspace Administration of China)
California Consumer Privacy Act (CCPA)
approach of
awareness guide for
penalties for noncompliance with
privacy notices delivery requirements
of
privacy right extended by
vendor assessment under
California Online Eraser law
California Online Privacy Protection Act (CalOPPA)
California Privacy Rights Act (CPRA) of 2020 (Proposition 24)
California Shine the Light law
Call center launches to report data breaches
CAM4 website
Canada Office of the Privacy Commissioner of Canada (OPC)
Canadian anti-spam legislation (CASL)
Canadian Institute of Chartered Accountants (CICA)
Canadian Standards Association (CSA) Privacy Code
CAN-SPAM (Controlling the Assault of Non-Solicited Pornography and Marketing) Act of 2003
Carnegie Mellon’s Master of Science in Information Technology—Privacy Engineering (MSIT-PE)
Carnegie Mellon University
CARU (Children’s Advertising Review Unit) Advertising Guidelines
CASL (Canadian anti-spam legislation)
Cavoukian, Ann
CCPA (California Consumer Privacy Act). See California Consumer Privacy Act (CCPA)
CDPA (Consumer Data Protection Act, Virginia)
Centralized governance
CEO role
in data breaches
in incident planning
Certified Information Privacy Manager (CIPM) certification
Change management
Chief information security officers (CISO)
Chief privacy officer (CPO)
Children, consents of
Children’s Advertising Review Unit (CARU) Advertising Guidelines
Children’s Online Privacy Protection Act (COPPA)
China Cyberspace Administration of China (CAC)
China-People’s Republic of China Personal Information Protection Law (PIPL)
Chinese National Information Security Standardization Technical Committee (TC260)
Choice and consent, in data subject rights
CIA (confidentiality, integrity, and availability) of personal data
CICA (Canadian Institute of Chartered Accountants)
CIPM (Certified Information Privacy Manager) certification
Cisco Privacy Maturity Benchmarking Study (2021)
CISO (chief information security officers)
CJEU (Court of Justice of the European Union)
Class-action lawsuits
Cloud-based threats
Cloud computing
acceptable use of
assessing vendors of
breach activity increases
Cloud Industry Forum
CNIL (Commission Nationale de l’Informatique et des Libertés, France)
Colorado Privacy Act
Commission Nationale de l’Informatique et des Libertés (CNIL, France)
Communication
of information protection policies
in privacy notices and policies
of privacy procedures
for training and awareness
Communications security
Complaint handling
Complaint-monitoring processes
Compliance
audits for monitoring
demonstrating
governance, risk, and compliance (GRC) tools
in incident planning
in information security
measurement of
penalties for noncompliance
in privacy policies
Comprehensive approach
Conference of European Data Protection Authorities
Conferences and seminars
Confidential category, in data classification
Confidentiality, integrity, and availability (CIA) of personal data
Consent
of children
in data subject rights
management of
withdrawals of
Consultative Expert Group on Digital Ethics in insurance
Consumer Privacy Protection Act of 2021 (Canada)
Consumers, trust of
Containment, in data breaches
Contract language for privacy protection
Contracts for vendor engagement
Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act of 2003
Controls
in information security
monitoring
technical controls for privacy
Cookie compliance
“Cookie consents”
COPPA (Children’s Online Privacy Protection Act)
Co-regulatory model
Corrective controls
Cost considerations
calculating
in data breaches
in information protection
Cost of a Data Breach Report 2020 (Ponemon Institute)
Court of Justice of the European Union (CJEU)
COVID-19 pandemic
health records privacy in
personal data handling and
privacy policies changed by
working from home during
CPO (chief privacy officer)
CPRA (California Privacy Rights Act) of 2020 (Proposition 24)
Cranor, Lorrie Faith
Credential theft
Credit card incidents
Cronk, R. Jason
Cross-border data transfers
Cryptography
CSA (Canadian Standards Association) Privacy Code
Currency metrics
Customer care role
in data breaches
in incident planning
Cyber insurance coverage
Cyberspace Administration of China (CAC)
Cyclical component analysis
D
DAA (Digital Advertising Alliance)
Daily Dashboard (IAPP)
DAMA (Data Management Association) International
“Dark patterns,” prohibitions against
Data access
acceptable use policies and
in data subject rights
to employee information
withdrawals of
Data assessments
artificial intelligence system assessments
attestation in
compliance measurement
data governance and
data protection impact assessments
GDPR requirements for
inventories and records
in mergers, acquisitions, and divestitures
overview
physical and environmental assessments
privacy impact assessment: ISO
privacy impact assessment: overview
privacy impact assessment: U.S.
vendor assessments: overview
vendor assessments under CCPA
vendor assessments under GDPR
Data breaches
benefiting from
board of directors role in
business development role in
CEO role in
of company privacy notices
customer care role in
finance role in
functional roles in planning for
human resources role in
impact of
incident handling
incident planning
incident response in business continuity plan
individual roles: overview
information security role in
investigating
legal role in
marketing and public relations role in
monitoring
new tools, methods, and practices leading to
occurrences of
outside resources role in
overview
of personally identifiable information (PII)
Ponemon Institute study of
preparing for
recovering from
reporting obligations
terminology related to
union leadership role in
Data Breach Investigations Report 2020 (Verizon)
Data classification
Data destruction
Data discovery
Data governance
Data inventory
Data loss prevention (DLP) tools
Data management
Data Management Association (DAMA) International
Data mapping
Data minimization, technical controls for
Data portability, in data subject rights
Data privacy dashboards
Data protection authorities (DPAs)
Data Protection by Design and by Default
Data Protection Commission v. Facebook Ireland, Schrems
Data protection impact assessments (DPIA)
conditions requiring
contents of
methodology of
overview
privacy workshops on
supervisory authorities and
Data protection officers (DPOs)
accountability of
need for
overview
qualifications and responsibilities of
records of processing activities and
reporting structure of
at vendors
Data retention and destruction
Data subject access requests (DSARs)
Data subject rights
in Australia and New Zealand
children’s consents
choice and consent
complaint handling
East Asian
European and UK: automated decision-making
European and UK: data portability
European and UK: erasure
European and UK: overview
European and UK: personal data protection
European and UK: processing restrictions
European and UK: rectification
European and UK: restrictions of rights
European and UK: right to access
European and UK: right to information
European and UK: right to object
European and UK: transparency
Latin American
opt-in versus opt-out
overview
privacy notices and policies
U.S. federal laws on
U.S. state laws on
withdrawals of consent and data access
“Data transfer impact assessment” (DTIA or TIA)
Data transfers, cross-border
Decentralized governance
Deepfakes
Deidentification
De Jesus, Ron
Delaware Online Privacy Protection Act (DOPPA)
Densmore, Russell
Destruction of data
Detective controls
Digital Advertising Alliance (DAA)
Disposal Rule, in Fair and Accurate Credit Transactions Act (FACTA) of 2003
Divestitures, acquisitions, and mergers, data assessments in
DLP (data loss prevention) tools
DMA Guidelines for Ethical Business Practice
DNC (National Do Not Call) Registry
DOC (U.S. Department of Commerce)
DOPPA (Delaware Online Privacy Protection Act)
DPAs (data protection authorities)
DPIA (data protection impact assessments). See Data protection impact assessments (DPIA)
DPOs (data protection officers). See Data protection officers (DPOs)
Driver’s Privacy Protection Act (DPPA) of 1994
DSARs (data subject access requests)
DTIA (data transfer impact assessment)
Dublin City University’s Master of Arts in Data Protection and Privacy Law
Due diligence
Dutch Data Protection Authority
E
East Asia, data subject rights in
ECPA (Electronic Communications Privacy Act) of 1986
EDPB (European Data Protection Board)
Education data, privacy protections for
E-Government Act of 2002
EIOPA (European Insurance and Occupational Pensions Authority)
Electronic Communications Privacy Act (ECPA) of 1986
Email policies
Emerging laws and regulations
Employee information protection
acceptable use policies
access and data classification
components of
overview
Employment data, privacy protections for
Energy data, privacy protections for
Enterprise communications
Environment
assessments of
monitoring for vulnerabilities in
security of
Erasure, in data subject rights
Ethics
ETSI (European Telecommunications Standards Institute)
EU Code of Conduct
EU Data Protection Directive
EU General Data Protection Regulation (GDPR). See General Data Protection Regulation (GDPR)
European Data Protection Board (EDPB)
European Data Protection Law and Practice: Data Subjects’ Rights (Schultze-Melling)
Europe and UK data subject rights
automated decision-making
data portability
erasure
overview
personal data protection
processing restrictions
rectification
restrictions of rights
right to access
right to information
right to object
transparency
European Insurance and Occupational Pensions Authority (EIOPA)
European Telecommunications Standards Institute (ETSI)
European Union’s Article 29 Working Party (WP29)
EU-U.S. Privacy Shield
External announcements of data breaches
F
Facebook.com
Fair and Accurate Credit Transactions Act (FACTA) of 2003
Fair Credit Reporting Act (FCRA) of 1970
Fair information practices
Family Educational Rights and Privacy Act (FERPA) of 1974
FCRA (Fair Credit Reporting Act) of 1970
Federal Bureau of Investigation (FBI)
Federal Privacy Act of 1974
Federal Trade Commission (FTC)
on advertising to children
data breaches and
Data Privacy Day resources from
Do Not Call Registry of
enforcement actions of
on privacy by design
privacy notice requirements of
Federal Trade Commission Act of 1914
Fileless attacks
Finance team
in data breaches
in incident planning
privacy procedures and
Financial data, privacy protections for
Firewall rules
First-party (internal) audits
First responders
FOIA (Freedom of Information Act)
Forensics, third-party
Fox, Jonathan
Frameworks
definition of
laws, regulations, and programs in
principles and standards in
questions answered by
rationalizing requirements by
France’s Commission Nationale de l’Informatique et des Libertés (CNIL)
Freedom of Information Act (FOIA)
FTC (Federal Trade Commission). See Federal Trade Commission (FTC)
G
Gap analysis
GAPP (Generally Accepted Privacy Principles)
General Data Protection Regulation (GDPR)
access rights under
Article 30 of
automated decision-making, right to not be subject to, in
Awareness Guide of
children, privacy notices to
in cross-border data transfer
data assessment requirements of
data portability rights under
data protection by design and by default
data subject rights under
DPIA under
DPO role established by
electronic consent as affirmative act under
erasure rights under
material scope
monitoring privacy performance under
OECD Guidelines as basis for
overview
penalties for noncompliance with
privacy by design in
Records of processing activities in Article 30 of
rectification rights under
reporting privacy performance under
restriction of processing rights under
restrictions of data subjects’ rights under
right to object under
subject matter and objectives
territorial scope
vendor assessments under
Generally Accepted Privacy Principles (GAPP)
General organization compliance
George Washington University’s Master of Engineering—Cybersecurity Policy and Compliance
GLBA (Gramm-Leach-Bliley Act)
Global privacy and data protection laws
Global Privacy Enforcement Network
Global privacy teams
Governance, data assessments and. See also Privacy governance
Governance, risk, and compliance (GRC) tools
Government data, privacy protections for
Gramm-Leach-Bliley Act (GLBA)
H
Health data, privacy protections for
Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009
Health Insurance Portability and Accountability Act (HIPAA) of 1996
Highly confidential category, in data classification
Hong Kong Office of the Privacy Commissioner for Personal Data (PCPD)
Human resources
in data breaches
data privacy protections for
in incident handling
in incident planning
monitoring processes of
privacy procedures and
vendors policies of
Hybrid governance
I
IaaS (infrastructure as a service)
IAPP-EY Privacy Governance Report 2018
IAPP-EY Privacy Governance Report 2019
IAPP-FTI Consulting Privacy Governance Report 2020
IAPP’s Westin Research Center
ICO (Information Commissioner’s Office, UK)
Implementing policies for information protection
Incident management
Incident response
budgeting for
plan for
teams for
tools for
Incidents of data breaches
business continuity plan integration
detection of
handling
planning for
See also Data breaches
Industry standards
Information and Privacy Commissioner of Ontario (Canada)
Information Commissioner’s Office (ICO, UK)
Information security
best practices in
confidentiality, integrity, and availability in
controls in
in data breaches
data privacy and
in incident planning
privacy procedures and
standards and guidelines in
See also Policies for information protection; Protecting personal information
Information Security Technology—Personal Information Security Specification (PI Security Specification,
Chinese National Information Security Standardization Technical Committee)
Information Systems Audit and Control Association (ISACA)
Information technology (IT) team, privacy procedures and
Infrastructure as a service (IaaS)
Insider threats
Insurance coverage for data breaches
Intangible costs of data breaches
Integrity of computer systems
Internal announcements of data breaches
Internal audit team, privacy procedures and
Internal costs of data breaches
Internal-error-related breaches
Internal partnerships in privacy strategy
International Assembly of Privacy Commissioners and Data Protection Authorities
International Conference of Data Protection and Privacy Commissioners
International Organization for Standardization (ISO)
Internet of Things (IoT)
Internet policies
Intrusion detection
Inventories and records, in data assessments
Investigating data breaches
IoT (Internet of Things)
Irish Data Protection Commission
Irregular component analysis
ISACA (Information Systems Audit and Control Association)
ISO (International Organization for Standardization)
Israel Privacy Protection Authority (PPA)
J
Japan Personal Information Protection Commission (PPC)
K
Kaseya hacks
Kişisel Verileri Koruma Kurumu (KVKK, Turkey)
L
Language of privacy notices
Latin America, data subject rights in
Laws and regulations
acceptable use policies and
Brazil’s Lei Geral de Proteção de Dados (LGPD)
California Consumer Privacy Act (CCPA)
cross-border data transfers
emerging
EU General Data Protection Regulation (GDPR)
monitoring
on new technologies
organizational balance and support
oversight agency authority
overview
penalties for noncompliance
People’s Republic of China Personal Information Protection Law
in privacy governance
privacy program management beyond
sectoral
self-regulation by industry standards
third-party external privacy resources
U.S. federal laws on data subject rights
U.S. state laws on data subject rights
See also Data subject rights
Learning and development team, privacy procedures and
Least privilege concept, for access control
Legal costs of data breaches
Legal privilege
Legal protections
Legal team
in data breach response
in incident planning
in litigation, liabilities, and regulatory scrutiny
privacy procedures and
Lessons learned, leveraging
Letter drops to report data breaches
LGPD (Brazil’s Lei Geral de Proteção de Dados)
Life cycle, privacy. See also Data assessments
Limited sectoral approach
Litigation exposure
Living off the land (LotL) attacks
Local data protection authorities (DPAs)
Local governance
M
Machine learning (ML)
Malicious threats
Malvertising
Malware protection
Marketing and public relations team
in data breaches
in incident planning
privacy procedures and
Marketing data, privacy protections for
McAfee trust marks
McDonald, Aleecia
Mergers, acquisitions, and divestitures, data assessments in
Metrics
audience impact on
metric owner’s role
overview
for privacy measurement
reporting findings based on
for training and awareness
view of
Microsoft Corp.
Mission statement for privacy governance
ML (machine learning)
Models
organizational
privacy team
Monitoring
laws and regulations
performance
technology use
vendors
See also Sustain phase: monitoring and auditing performance
MSIT-PE (Carnegie Mellon’s Master of Science in Information Technology—Privacy Engineering)
N
NAI (Network Advertising Initiative)
National Do Not Call (DNC) Registry
National Institute of Standards and Technology (NIST)
NIST SP 800-60 classification system
NIST SP 800-88 Guidelines for Media Sanitization
Privacy Framework of
standards and guidelines of
third-party audits aligned with
training guidelines of
National People’s Congress Standing Committee Decision on Strengthening Network Information Protection (NPCSC Decision, China)
Nebrija University’s Master’s in Data Protection and Security
Need-to-know access, for access control
Neiditz, Jon
Netherlands Organisation for Applied Scientific Research
Network access
Network Advertising Initiative (NAI)
Network Advertising Initiative (NAI) Code of Conduct
New Zealand, data subject rights in
New Zealand Office of the Privacy Commissioner (OPC)
NIST (National Institute of Standards and Technology). See National Institute of Standards and Technology (NIST)
Northam, Ralph
NPCSC Decision (National People’s Congress Standing Committee Decision on Strengthening Network Information Protection, China)
O
Obfuscation, technical controls for
OECD (Organisation for Economic Co-operation and Development) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
Office of the Australian Information Commissioner (OAIC)
Office of the Privacy Commissioner (OPC, New Zealand)
Office of the Privacy Commissioner for Personal Data (PCPD, Hong Kong)
Office of the Privacy Commissioner of Canada (OPC)
Online data, privacy protections for
Online Privacy Alliance (OPA)
Online tracking via “cookie consents”
OPC (Office of the Privacy Commissioner of Canada)
OPC (Office of the Privacy Commissioner, New Zealand)
Open Knowledge Foundation
Operational actions for awareness
Operational security
Opt-in versus opt-out, in data subject rights
Organisation for Economic Co-operation and Development (OECD) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
Organizational balance and support
Organization-wide privacy program management
Outside resources
in data breaches
in incident handling
monitoring
Oversight agencies
P
PaaS (Platform as a service)
Pahl, Chris
Password policies
Payment Card Industry Data Security Standard (PCI DSS)
PayPal trust marks
PbD (privacy by design). See Privacy by design (PbD)
PCPD (Office of the Privacy Commissioner for Personal Data, Hong Kong)
PDPA (Personal Data Protection Act, Malaysia) of 2010
PDPA (Personal Data Protection Act, Thailand) of 2021
PDPC (Personal Data Protection Commission, Singapore)
Penalties for noncompliance with laws and regulations
People’s Republic of China Personal Information Protection Law (PIPL)
Performance. See Sustain phase: monitoring and auditing performance
Personal data protection, in data subject rights
Personal Data Protection Act of 2010 (PDPA, Malaysia)
Personal Data Protection Act of 2021 (PDPA, Thailand)
Personal Data Protection Commission (PDPC, Singapore)
Personal Information Protection Act (South Korea)
Personal Information Protection and Electronic Documents Act (PIPEDA)
Personal Information Protection Commission (PIPC, South Korea)
Personal Information Protection Commission (PPC, Japan)
Personally identifiable information (PII)
breach losses of
collection and processing of
in privacy impact assessment
protection of
See also Protecting personal information
PETs (privacy-enhancing technologies)
PHI (protected health information)
Phishing attacks
Physical assessments
Physical controls
Physical security
PII (personally identifiable information). See Personally identifiable information (PII)
PIPC (Personal Information Protection Commission, South Korea)
PIPEDA (Personal Information Protection and Electronic Documents Act)
PIPL (People’s Republic of China Personal Information Protection Law)
Platform as a service (PaaS)
PMM (Privacy Maturity Model)
Policies for information protection
components of
cost considerations
data retention and destruction
of employees
implementing
interfacing and communicating with organization about
overview
vendor engagement
See also Information security; Protecting personal information
Policy controls
Polis, Jared
Ponemon Institute
PowerBI data privacy dashboards
PPA (Privacy Protection Authority, Israel)
PRC General Provisions of the Civil Law (China)
Preventative controls
Principles and standards, in privacy governance
Privacy Act of 1974
Privacy Act of 2020 (New Zealand)
Privacy analysts
Privacy by design (PbD)
diagramming
foundational principles of
overview
privacy impact assessment to facilitate
protecting personal information by
in research and development
Privacy dashboard
Privacy director/manager
Privacy engineering
Privacy-enhancing technologies (PETs)
Privacy governance
framework development
model, responsibilities, and reporting
overview
scope of
strategy development
team structure
technology and tools for
vision and mission statement
Privacy impact assessments (PIAs)
ISO on
overview
product research and development performance of
in United States
Privacy incidents, leveraging
Privacy leaders
conferences and seminars attended by
data protection officer (DPO) role
education and backgrounds of
professional certifications of
titles of
Privacy/legal counsels
Privacy Maturity Model (PMM)
Privacy measurement
Privacy notices
communication considerations
design challenges and solutions
elements of
overview
privacy policy versus
Privacy program management, introduction to
accountability in
championing
law and compliance versus
manager responsibilities
need for
organization-wide
overview
terminology
Privacy Protection Authority (PPA, Israel)
Privacy technologists
Privacy threshold analysis (PTA)
Privacy Tracker
Procurement team, privacy procedures and
Product research and development team, privacy procedures and
Professional certifications
Program management
Progress reporting, on data breaches
Proofpoint, Inc.
Proposition 24 (California Privacy Rights Act) of 2020
Proprietary information
Protected health information (PHI)
Protecting personal information
confidentiality, integrity, and availability in information security
controls in information security
data privacy and information security
by design and default
diagramming privacy by design
practices in information security
privacy by design
in privacy program life cycle
standards and guidelines in information security
technical controls for privacy
See also Information security; Policies for information protection
Pseudonymization
PTA (privacy threshold analysis)
Public category, in data classification
Q
QR codes
R
Ramirez, Edith
Ransomware attacks
Records of Processing Activities (GDPR)
Recovering from data breaches
Rectification, in data subject rights
Regulations. See Laws and regulations
Regulators, reporting data breaches to
Regulatory scrutiny
Remediation
costs of
offers of
Remote work
Remote worker endpoint security
Reporting data breaches
call center launches
external announcements
internal announcements
letter drops
overview
progress reporting
to regulators
remediation offers
requirements and guidelines
worksheets for
Reputational liability
Resources, third-party external
Respond, in privacy program life cycle
Respond phase: data subject rights. See Data subject rights
Restricted category, in data classification
Restrictions of data subject rights
Retention of data
Return on investment (ROI) analysis
“Right to be forgotten” (RTBF)
Right to information, in data subject rights
Right to object, in data subject rights
Risk assessments
Risk management team, privacy procedures and
ROI (return on investment) analysis
Routing patterns
RTBF (“right to be forgotten”)
Ryerson University’s Certificate in Privacy, Access and Information Management
S
SaaS (Software as a service)
Safeguards against security breaches
Safe Harbor Framework
SafetyDetectives
SCCs (standard contractual clauses)
Schultze-Melling, Jyn
Scope of privacy program
challenges in
laws and regulations
personal information collected and processed
Scoping audits
Second-party (supplier) audits
Sectoral regulations
Security, technical controls for
Security automation technologies
Security breaches. See also Data breaches
Security tools
Segregation of duties, for access control
Self-assessment, attestation as
Self-regulated model
Self-regulation by industry standards
Shapiro, Stuart
Singapore Personal Data Protection Commission (PDPC)
Snapchat.com
Social attacks
Social media-based attacks
Software as a service (SaaS)
Software-defined privacy settings
Software loading
SolarWinds breach
South Korea Personal Information Protection Commission (PIPC)
Stakeholders
collaboration among
in data breaches
in investigations
in privacy strategy
progress reporting to
Standard contractual clauses (SCCs)
Standards
industry
in information security
for vendor selection
Stanford University
Stay Safe Online
Strategy, privacy governance
Supplier monitoring. See also Vendors
Sustain, in privacy program life cycle
Sustain phase: monitoring and auditing performance
auditing
metrics for: audience and
metrics for: overview
metrics for: owner’s role
metrics for: privacy measurement
metrics for: reporting
monitoring forms
monitoring types
Sustain phase: training and awareness
audiences for
communication
creating awareness
difference between
leveraging privacy incidents
methods for
metrics for
operational actions
overview
strategies for
Systems acquisition, development, maintenance, and disposal
T
Tableau software, data privacy dashboards in
Tabletop exercises, in incident training
TC260 (Chinese National Information Security Standardization Technical Committee)
TCPA (Telephone Consumer Protection Act) of 1991
Team structure, privacy governance
TeamViewer software
Technical controls
Technology and tools
laws and regulations on new
for privacy governance
Telecom data, privacy protections for
Telephone Consumer Protection Act (TCPA) of 1991
Third parties
as external privacy resources
forensics by
in incident handling
in independent audits
Thomas, Liisa
Three Lines Model
TIA (data transfer impact assessment)
Training
budgeting for
for data breach preparedness
of employees on data breaches
monitoring
See also Sustain phase: training and awareness
Transparency, in data subject rights
Trend analysis
TrustArc trust marks
Turkey Kişisel Verileri Koruma Kurumu (KVKK)
U
UK data subject rights. See Europe and UK data subject rights
UK Information Commissioner’s Office (ICO)
UN Convention on the Rights of the Child in Child Friendly Language
Union leadership role
in data breaches
in incident planning
University of Auckland’s Postgraduate Diploma in Information Governance
U.S. Department of Commerce (DOC)
U.S. Department of Health and Human Services
U.S. Federal Trade Commission (FTC). See Federal Trade Commission (FTC)
V
Value metrics
Vendor management program (VMP)
Vendors
CCPA assessments of
for cloud computing
contract for engaging
contract language for privacy protection
as data breach incident sources
GDPR assessments of
human resources policies and
monitoring
policies for engaging
print
selection standards for
Verisign trust marks
Verizon, Inc.
Video data, privacy protections for
Video Privacy Protection Act (VPPA) of 1988
Virginia’s Consumer Data Protection Act (CDPA)
Virtual technology
Virus protection
Vision, privacy governance
VMP (vendor management program)
VPPA (Video Privacy Protection Act) of 1988
W
Website scanning
WebTrust
Wireless management
Withdrawals of consent and data access
Witt, Amanda
Wombat Security
Workarounds
WP29 (Article 29 Working Party)
Y
Yakabovicz, Edward
YouTube.com