


Proceedings on Privacy Enhancing Technologies ; 2016 (4):237–254

Christoph Bösch*, Benjamin Erb, Frank Kargl, Henning Kopp, and Stefan Pfattheicher

Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns

Abstract: Privacy strategies and privacy patterns are fundamental concepts of the privacy-by-design engineering approach. While they support a privacy-aware development process for IT systems, the concepts used by malicious, privacy-threatening parties are generally less understood and known. We argue that understanding the "dark side", namely how personal data is abused, is of equal importance. In this paper, we introduce the concept of privacy dark strategies and privacy dark patterns and present a framework that collects, documents, and analyzes such malicious concepts. In addition, we investigate from a psychological perspective why privacy dark strategies are effective. The resulting framework allows for a better understanding of these dark concepts, fosters awareness, and supports the development of countermeasures. We aim to contribute to an easier detection and successive removal of such approaches from the Internet to the benefit of its users.

Keywords: Privacy, Patterns

DOI 10.1515/popets-2016-0038
Received 2016-02-29; revised 2016-06-02; accepted 2016-06-02.

*Corresponding Author: Christoph Bösch: Institute of Distributed Systems, Ulm University, E-mail: [Link]@[Link]
Benjamin Erb: Institute of Distributed Systems, Ulm University, E-mail: [Link]@[Link]
Frank Kargl: Institute of Distributed Systems, Ulm University, E-mail: [Link]@[Link]
Henning Kopp: Institute of Distributed Systems, Ulm University, E-mail: [Link]@[Link]
Stefan Pfattheicher: Department of Social Psychology, Ulm University, E-mail: [Link]@[Link]

1 Motivation and Introduction

Over the last years, privacy research has primarily focused on (i) a better conceptual understanding of privacy, (ii) approaches for improving and enhancing privacy protection, as well as (iii) technical mechanisms for implementing these approaches. However, online service providers have become more and more sophisticated in deceiving users to hand over their personal information. Up until now, privacy research has not studied this development.

An example for this development is the Tripadvisor mobile app (depicted in Figure 1), which is a review platform for travel-related content. At first glance, the starting page asks the user to log in with a personal Google+, Facebook, or email account. Taking a closer look, one notices that a third option is given that offers the creation of a Tripadvisor account. Furthermore, a "Skip" button is hidden in the upper right corner, which skips the login process entirely. When signing in with Facebook, Tripadvisor wants to gain access to the friend list, photos, likes, and other information (cf. Figure 1b). This is unnecessary for the main features of the service.

Skipping the login process shows the user some features which are available only after signing in (cf. Figure 1c). In addition, the "Later" button, which finally leads to the app, is located on the left side. Placed on the right side is a "Sign in" button which leads back to the starting page. This influencing towards logging in via Facebook/Google+ or creating a personal account gives Tripadvisor access to personal information. Figure 1 illustrates general reusable strategies for deceiving users to share more of their personal information.

In this paper, we deliberately change sides and explore the techniques used by the "bad guys" to collect privacy-sensitive data more efficiently. Similar to the collection of well-established privacy solutions (so-called privacy patterns [14, 24]) as part of the privacy-by-design strategy, we identify and collect malicious patterns that intentionally weaken or exploit the privacy of users, often by making them disclose personal data or consent against their real interest.

This approach may initially seem suspicious, as it could provide guidance for malign stakeholders such as data-driven companies or criminals. However, we believe that this shift in perspective is helpful and necessary for privacy research, as it introduces several benefits: (i) A detailed analysis and documentation of privacy dark patterns allows for a better understanding of the underlying concepts and mechanisms threatening the users' privacy.


(a) The starting page. (b) Log-in with Facebook. (c) Skipped sign-in process.

Fig. 1. Screenshots of the Tripadvisor mobile app. (a) shows the starting page. Note the small "Skip" button in the upper right corner. (b) shows the requested personal information when logging in with Facebook. Some of the information is unnecessary for providing the service. (c) shows what happens after skipping the sign-in process.

(ii) A collection of privacy dark patterns fosters awareness and makes it easier to identify such malicious patterns in the wild. (iii) Furthermore, the documentation of a privacy dark pattern can be used as a starting point for the development of countermeasures, which disarm the pattern and re-establish privacy. The discussion is similar to IT security, where the investigation and publication of vulnerabilities proved to be key for actually enhancing the security level in real systems.

1.1 Introduction to Patterns

In many disciplines recurring problems have been addressed over and over again, yielding similar and recurring solutions. The idea of a pattern is to capture an instance of a problem and a corresponding solution, abstract it from a specific use case, and shape it in a more generic way, so that it can be applied and reused in various matching scenarios.

Patterns originate from the realm of architecture, where Alexander et al. [5] released a seminal book on architectural design patterns in 1977. In this book, the authors compiled a list of archetypal designs for buildings and cities which were presented as reusable solutions for other architects. Interestingly, Alexander et al. already came up with patterns for privacy. For instance, their Intimacy Gradient pattern postulates a placement of chambers in such a way that a greater distance from the building's entrance allows for increased intimacy.

In 1987, the idea of patterns was readopted by Beck and Cunningham [10] and introduced into the realm of computer science and software development. The Portland Pattern Repository of Beck and Cunningham collected patterns for programmers using object-oriented programming languages.1 The idea of using patterns in software design gained wide acceptance in 1994, when the so-called Gang of Four released their well-known book on design patterns for reusable object-oriented software [19]. Since then, the usage of patterns has spread to various branches of computer science and software engineering, including distributed architectures [18, 25], user interface design [46], IT security [41], and privacy [14, 22, 40, 42].

The success of patterns in software engineering has entailed novel classes of patterns with different semantics, namely anti patterns [14] and dark patterns [11]. Traditional design patterns capture a reasonable and established solution. In contrast, an anti pattern documents a solution approach that should be avoided, because it has been proven to represent a bad practice.

1 Historical side note: the online version of the pattern repository, WikiWikiWeb ([Link]), became the first-ever wiki on the World Wide Web.


Hence, anti patterns raise awareness of sub-par solutions and advocate against their usage.

Anti patterns often target solutions that may seem obvious to the system developer at first glance, but include a number of less obvious negative implications and consequences. Even established design patterns sometimes become obsolete and are downgraded to anti patterns due to new considerations. For instance, the Gang of Four suggested a pattern for restricting the instantiation of a class to a single instance, the so-called Singleton pattern. Only after concurrent programming and multi-core architectures became more widespread did the shortcomings of this pattern become apparent. Today, the Singleton pattern is widely considered an anti pattern.

The term dark pattern was first used by Brignull, who collected malicious user interface patterns [11] to raise awareness. A UI dark pattern tricks users into performing unintended and unwanted actions, based on a misleading interface design. More generally speaking, a dark pattern describes an established solution for exploiting and deceiving users in a generic form.

In summary, anti patterns collect the Don'ts for good intentions and dark patterns collect potential Dos for malicious intents. In this paper, we present a first broad discussion on dark patterns in the field of privacy.

1.2 Methodology

To suggest a framework for the collection of privacy dark patterns and to compile a list of such patterns, we consider the problem from three different angles as part of a holistic approach.

First, we survey existing literature on privacy strategies and privacy patterns. We then reverse privacy strategies, adapting and extending some of these ideas so that they become malicious patterns. Beyond this, we have identified new types of patterns. Second, we include a psychological point of view on malevolent privacy concepts. This perspective takes into account human information processing, social cognition and motivation, as well as exploitable basic human needs. On this basis we are able to deduce additional approaches on how to reduce the power of privacy dark strategies. Third, we identify and analyze real-world examples of malicious privacy mechanisms as found on websites and in mobile applications.

Next, we integrate these findings on privacy dark patterns into a unified framework, which introduces a general terminology for privacy dark patterns and establishes a template for documenting privacy dark patterns. Our framework suggests a list of malicious privacy strategies and psychological aspects for categorizing privacy dark patterns. Based on the pattern template of our framework, we discuss common privacy dark patterns that we extracted from real-world occurrences.

1.3 Contribution

Our contribution can be summarized as follows:
1. We introduce the concept of privacy dark strategies and privacy dark patterns.
2. We present a framework for privacy dark patterns that takes into account traditional privacy patterns, empirical evidence of malign patterns, underlying malicious strategies, and their psychological background. The resulting framework provides a template for documenting and collecting arbitrary privacy dark patterns.
3. We provide an initial set of exemplary dark patterns that we encountered in the wild.
4. We launched the website [Link] as an online collection for privacy dark patterns. Being a collaborative resource, we invite the community to submit more patterns and help to raise awareness.

2 On Privacy Strategies and Privacy Patterns

In this section, we introduce privacy patterns and corresponding privacy strategies, based on their historical development.

Until the mid-1990s, privacy was rarely considered a relevant feature of IT systems. Even if it was, the integration of privacy-preserving mechanisms was often conducted a posteriori, as an additional requirement later added to the system. The notion of "privacy as an afterthought" contradicted the cross-sectional property of privacy as part of an IT system and often yielded extensive or insufficient changes to the system.

To overcome these deficits, a joint team of the Information and Privacy Commissioner of Ontario, Canada; the Dutch Data Protection Authority; and the Netherlands Organisation for Applied Scientific Research advocated a more integral approach that included privacy considerations into the overall development cycle [27]. In 1995, they introduced the so-called Privacy by Design approach,2 which is postulated by the following seven foundational principles:


1. Proactive not reactive
2. Privacy as the default setting
3. Privacy embedded into design
4. Full functionality
5. End-to-end security
6. Visibility and transparency
7. Respect for user privacy

These principles have been a major milestone for the design of privacy-preserving systems, as they provide general guidance. For that reason, these concepts are part of several privacy legislations today.

One frequent criticism regarding Privacy by Design and its seven principles is that they are too unspecific to be directly applied to a development process. The principles neither provide concrete advice, nor do they address the varying needs of specific domains, such as the Internet of Things, user interface design, or car-to-car communication. A system designer is thus still required to have a thorough understanding of privacy and the forces involved to design a privacy-friendly system. Clearly, more guidance and a more methodological approach are required to establish a privacy engineering process, as worked on, for example, by the PRIPARE project [38].

One element of this privacy engineering approach are so-called Privacy Patterns. The main idea of privacy patterns is to remedy the main drawback of the Privacy by Design principles, i.e., that they are not actionable [14, 21, 48]. Privacy patterns are defined as reusable solutions for commonly occurring problems in the realm of privacy. Essentially, they are patterns for achieving or improving privacy. Privacy patterns provide guidance for engineering and development, and target the needs of specific domains, such as backend implementations or user interface design. By providing a well-structured description of a problem and its solution using a standardized template, patterns can easily be looked up and applied. Since these patterns include references to specific use cases and possibly implementations, engineers will directly find the resources needed to implement them in their own context.

One well-known example of a privacy pattern that can be implemented in multiple domains is the stripping of metadata that is not necessary for the functionality of the service. This procedure increases privacy, since metadata often includes personally identifiable information. Further, this solution is reusable, since it is not bound to a specific instance of a problem. Thus, stripping of metadata constitutes a privacy pattern that can be applied, e.g., to a website managing digital photographs.
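As a brief illustration of this pattern (our sketch, not part of the original paper), the following Python snippet shows how a photo-hosting service could re-save an uploaded image without its EXIF block, which may contain GPS coordinates or camera identifiers. It assumes the Pillow imaging library; the file paths are hypothetical.

from PIL import Image

def strip_metadata(input_path: str, output_path: str) -> None:
    """Re-save an image without EXIF data (GPS position, camera serial number, etc.)."""
    with Image.open(input_path) as img:
        # Copy only the raw pixel data; EXIF and other info tags are not carried over.
        # (Sufficient for typical RGB/RGBA photographs.)
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(output_path)

# Hypothetical use in an upload handler:
# strip_metadata("uploads/raw/photo.jpg", "uploads/public/photo.jpg")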
A single privacy pattern addresses a problem with a limited scope. Multiple related and complementing patterns can then be compiled into a pattern catalog. Similar to the well-known design pattern catalogs from software engineering, a privacy pattern catalog collects a number of relevant problems and suitable solutions that can be applied during a privacy-aware development phase.

There are multiple collections of privacy patterns from academic research [14, 22, 40, 42] as well as online repositories3 that are more accessible for practitioners.

In a typical system development process, privacy patterns are applied during the stages of design and implementation. However, in many scenarios, privacy aspects represent fundamental system requirements that have to be considered from the very beginning. The question is whether more general architectural building blocks exist that can be applied at an even earlier stage, i.e., during requirement analysis and architectural design. Note that this notion is a natural continuation of the Privacy by Design philosophy: to include privacy considerations into the entire development process.

These general architectural building blocks are known as Privacy Design Strategies. According to Hoepman [24], a privacy design strategy is on a more general level than a privacy pattern and "describes a fundamental approach to achieve a certain design goal. It has certain properties that allow it to be distinguished from other (fundamental) approaches that achieve the same goal."

Later in the development process, a privacy design strategy can be refined with privacy patterns implementing one or more strategies. Thus, privacy design strategies provide a classification of privacy patterns. When system designers search for a privacy pattern in a collection, they are only interested in the ones implementing their chosen privacy strategy.

Hoepman [24] defines the following eight privacy design strategies.

Minimize: Data minimization is a strategy which insists that the amount of personal information that is processed should be minimal. Data that is not needed for the original purpose should not be collected.

2 [Link]
3 [Link]; [Link]


Hide: Hide takes place after data collection. Whereas Minimize forbids the collection of needless information, Hide suggests that any personal data that is processed should be hidden from plain view.

Separate: The approach of the privacy strategy Separate is to process any personal information in a distributed fashion if possible. Thus, interrelationships between personal data vanish, in contrast to centralized processing.

Aggregate: When implementing Aggregate, personal information is processed at a high level of aggregation. This level should only be so high as to remain useful, however. Details that are not needed for the functionality of the service vanish. This process could include statistical aggregation such that the details of identities are blurred.

Inform: The privacy strategy Inform states that data subjects should be adequately informed whenever personal information is processed.

Control: A common requirement of software systems is that data subjects should be in control of the processing of their personal information. Whenever this is ensured, we are dealing with the privacy strategy Control. Hoepman states that he is not aware of any patterns implementing this strategy.

Enforce: Enforce states that a privacy policy that is compatible with legal requirements should be in place and should be enforced.

Demonstrate: The privacy strategy Demonstrate demands that data controllers are able to demonstrate compliance with their privacy policy and any applicable legal requirements. A good example of a pattern implementing this strategy is the use of audits.

In the following sections, privacy design strategies serve as the starting point for our analysis of malicious dark strategies that harm privacy. For defining and documenting malicious patterns, we adapt the idea of privacy patterns and transform it into privacy dark patterns.

3 The Dark Side

The triad of general privacy strategies for high-level privacy requirements, privacy patterns for privacy-aware design processes, and privacy-enhancing technologies for system implementations is commonly acknowledged when building privacy-friendly IT systems.

However, there are other parties that have different agendas when building IT systems. Instead of privacy-friendly solutions, they aim for systems that purposefully and intentionally exploit their users' privacy, for instance motivated by criminal reasons or financially exploitable business strategies.

For the development of our framework, we reverse the evolution of privacy strategies and patterns: First, we define dark strategies as the high-level goals that these parties follow in order to exploit privacy. Next, we derive suitable dark patterns that implement these strategies. We then complement our framework by adding a psychological perspective on how the strategies generally achieve their deceptive and manipulative goals. Note that we do not include a counterpart to privacy-enhancing technologies as part of our framework.

As already clarified in the introduction, the resulting framework is neither intended nor structured as a construction kit for malicious parties. Instead, the framework can be used by privacy researchers and practitioners for detecting, recognizing, analyzing, and documenting malicious strategies and patterns.

When used top-down, the framework supports a privacy analysis of IT systems by raising awareness of malicious strategies and by uncovering corresponding mechanisms. Bottom-up, the framework helps to identify malicious patterns, reveals underlying strategies, and provides pointers for the development of concrete countermeasures.

3.1 Privacy Dark Strategies

We now develop a categorization of privacy dark patterns, analogously to Hoepman's privacy design strategies [24]. As privacy design strategies can be used to categorize privacy patterns by their fundamental approach, the same holds for privacy dark strategies. Hoepman identified eight privacy strategies, namely Minimize, Hide, Separate, Aggregate, Inform, Control, Enforce, and Demonstrate. Based on these strategies, we identify the following privacy dark strategies: Maximize, Publish, Centralize, Preserve, Obscure, Deny, Violate, and Fake, as shown in Table 1. These are used for our categorization of privacy dark patterns in Section 5.

The privacy design strategy Minimize, for example, demands the amount of processed data to be restricted to the minimal amount possible.


Table 1. Privacy Strategies vs. Dark Strategies.

Hoepman        Dark Strategies
Minimize       Maximize
Hide           Publish
Separate       Centralize
Aggregate      Preserve
Inform         Obscure
Control        Deny
Enforce        Violate
Demonstrate    Fake

The corresponding dark strategy Maximize would collect, store, and process as much data as possible, leading to a loss of privacy. The system designer does not act out of pure maliciousness but to gain an advantage over a system with the same functionality but with stronger privacy protection. Specifically, the advantage comes from receiving additional personal data which can, e.g., be sold or used for personalized advertisements.

In the following we detail the eight privacy dark strategies we have developed.

Maximize. The goal of the dark strategy Maximize is to collect an inappropriate amount of data. More precisely, Maximize means that...

    The amount of personal data that is collected, stored, or processed is significantly higher than what is actually needed for the task.

Examples would be extensive sign-up forms with fields that are not needed for the functionality of the service. Often those unneeded fields are mandatory, maximizing the collection of personal data. Another example of a Maximize strategy are bad default privacy settings or the necessity to set up an account for the usage of a service, especially if the account is not needed for the functionality of the service.

Publish. The dark strategy Publish can be characterized by the requirement that...

    Personal data (not intended to be public) is not hidden from plain view.

This means that often no mechanism is in place to hide personal data from unauthorized access, such as encryption or access control. The personal data lies in the open for everyone to see. Social networks often employ this dark strategy to encourage the sharing of personal data and thus the use of their platform. This strategy satisfies a person's need to belong, as will be explained in Section 4.

Centralize. Centralize is the dark strategy associated with the privacy strategy Separate, which mandates that personal data should be processed in a distributed way. Centralize, in contrast, enforces that...

    Personal data is collected, stored, or processed at a central entity.

This strategy preserves the links between the different users and thus allows for a more complete picture of their habits and their usage of the service.

Advertising networks employ this strategy heavily by sharing pseudonymous user IDs, a practice known as cookie syncing [1]. Another common occurrence of this privacy dark strategy is the practice of Flash cookies, which are cookies that are stored centrally by the Flash plug-in on the file system and are thus not restricted to a specific web browser.

Preserve. The dark strategy Preserve requires that...

    Interrelationships between different data items should not be affected by processing.

They should rather be preserved in their original state for analysis instead of being stored in a processed form, e.g., an aggregation. It is not necessary to know the type of analysis in advance. A prominent example is telecommunications data retention, because traffic analysis can recover the relationships between persons.

Obscure. In the dark strategy Obscure...

    It is hard or even impossible for data subjects to learn how their personal data is collected, stored, and processed.

Users should be unable to inform themselves about what happens to their disclosed data. This can be achieved in the form of a privacy policy with many technical terms, which are difficult to understand for the average user. User interfaces could be designed to mislead the user, leading to decisions contradicting the user's original intent. The EFF called this particular mechanism "privacy zuckering" [28].


Deny. Patterns making use of the dark strategy Deny make a data subject lose control of their personal data. The term Deny is due to a denial of control.

    Data subjects are denied control over their data.

With this dark strategy, a service provider can prevent users from taking actions that oppose that service provider's interest. An example is to not provide the functionality for deleting an account. Another example is the nonexistence of options to control the sharing of information. Until recently this was the case in WhatsApp, where the online status was automatically shared with everyone who subscribed to that phone number, which has a big impact on the privacy of users [12].

Violate. The strategy Violate occurs if...

    A privacy policy presented to the user is intentionally violated.

A privacy policy is in place and shown to the user, but intentionally not kept. The users are unaware of the violation; thus, this does not impact the trust put into that service if such violations are not revealed. It is hard to find concrete examples and patterns implementing this strategy, since using this strategy is against the law and not publicly admitted by companies.

Fake. The privacy dark strategy Fake means that...

    An entity collecting, storing, or processing personal data claims to implement strong privacy protection but in fact only pretends to.

An example of this strategy are self-designed padlock icons or privacy seals, which make the user feel secure but do not have any meaning. Another example are wrong and unsubstantiated claims, such as an unrealistic claim on the key size of ciphers or marketing terms like "military grade encryption".

Synthesis

Our eight Privacy Dark Strategies can be summarized as follows:
– Maximize: The amount of personal data that is collected, stored, or processed is significantly higher than what is actually needed for the task.
– Publish: Personal data is published.
– Centralize: Personal data is collected, stored, or processed at a central entity.
– Preserve: Interrelationships between different data items should not be affected by processing.
– Obscure: It is hard or even impossible for data subjects to learn how their personal data is collected, stored, and processed.
– Deny: Data subjects are denied control over their data.
– Violate: A privacy policy presented to the user is intentionally violated.
– Fake: An entity collecting, storing, or processing personal data claims to implement strong privacy protection but in fact only pretends to.

3.2 Privacy Dark Patterns

After our exploration of privacy dark strategies we will now define the concept of a privacy dark pattern. As mentioned in Section 2, a pattern describes a generic, reusable building block to solve a recurring problem and hence to document best practices. Patterns can be collected in special catalogs and allow for easy replication. They fulfill the role of a common language that allows system developers and privacy engineers to communicate more efficiently.

We argue that common building blocks that are used by service providers to deceive and mislead their users exist. Some service providers use recurring patterns to increase the collection of personal data from their users. Sometimes these building blocks are used unintentionally, simply constituting usage of privacy anti patterns, but without any malicious intent. However, we claim that there are building blocks which are used on purpose, thereby yielding an advantage to the service provider. We call these building blocks privacy dark patterns.

Analogously to privacy patterns, privacy dark patterns can be collected in special repositories to facilitate easy access and retrievability for users and to develop countermeasures. Patterns are usually documented in a formalized template to enable system developers to easily reference and use them. Common fields in such a template include the name of the pattern, the problem the pattern is solving, and references to related patterns.

However, current templates for design and privacy patterns are not suitable for documenting privacy dark patterns for the following reasons:
1. Privacy patterns and privacy dark patterns have a different intent regarding their documentation. Each privacy pattern solves a specific problem, which is often mentioned as a separate field in the template. Privacy patterns are documented to be copied and used. The purpose of documenting privacy dark patterns, on the other hand, is to create and enhance awareness about common anti-privacy techniques, since they do not solve an engineering problem. Thus, a problem-centric description is out of place.


2. The target group of privacy patterns are system designers, whereas privacy dark patterns can target non-technical end-users to educate them about the strategies that are used to deceive them.

Thus, we need a different template to document privacy dark patterns.

Our Privacy Dark Pattern Template

We have developed a new template, specifically targeted towards privacy dark patterns, which we explain in detail in the following.

Name/Aliases: This field describes the name under which the privacy dark pattern is known. The name should be concise and capture the essence of the pattern.
Summary: A short summary to describe the pattern is necessary to provide an overview of the pattern and for quick reference.
Context: Context describes the scenario in which the pattern appears, e.g., online social networks or online shops.
Effect: This section explains the effects and consequences of the pattern. This should be described with sufficient granularity such that it is not too general.
Description: In this part of the template, the privacy dark pattern is described in detail. Technical language can be used if not avoidable, but it should be remembered that the main target group of the pattern are the end-users of the system in which the privacy dark pattern is applied.
Countermeasures: The countermeasures describe behaviors and tools a user can implement to negate the effects of the privacy dark pattern. These are strategies to help the "victims" of the pattern regain or maintain their privacy. This includes procedures to avoid the effects of the pattern, as well as add-ons to existing programs, e.g., web browsers, which prevent the end-user from being deceived by the pattern.
Examples/Known Uses: In this section, implementations using the dark pattern are described. Service providers applying the privacy dark pattern belong into this field. Screenshots of usage of the dark pattern can be provided where appropriate.
Related Patterns: If related privacy dark patterns exist, they are referenced here.
Psychological Aspects: This field describes the psychological mechanisms that make the pattern effectively influence the users in their behavior.
Strategies: In this part of the documentation of a privacy dark pattern, the used dark strategy is provided. These are the dark strategies explained in Section 3.1.

This template can be used to systematically document different privacy dark patterns in a repository. We make use of this template later in Section 5.
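To illustrate how entries based on this template might be stored in a machine-readable repository, the following Python sketch (our illustration, not part of the paper) models the template fields as a data class; the example entry is a hypothetical paraphrase of one of the patterns discussed in Section 5.

from dataclasses import dataclass
from typing import List

@dataclass
class PrivacyDarkPattern:
    """One repository entry, following the template fields described above."""
    name: str                    # Name/Aliases
    summary: str                 # Summary
    context: str                 # Context
    effect: str                  # Effect
    description: str             # Description
    countermeasures: List[str]   # Countermeasures
    examples: List[str]          # Examples/Known Uses
    related_patterns: List[str]  # Related Patterns
    psychological_aspects: str   # Psychological Aspects
    strategies: List[str]        # Dark strategies from Section 3.1

# Hypothetical entry, paraphrasing a pattern from Section 5:
bad_defaults = PrivacyDarkPattern(
    name="Bad Defaults",
    summary="Default settings make users share more data than they intend to.",
    context="Websites, applications, and social networks with user accounts.",
    effect="Users unknowingly share more personal information than intended.",
    description="Default options ease or encourage the sharing of personal information.",
    countermeasures=["Educate users to review and adjust default settings."],
    examples=["Facebook default privacy settings from 2010."],
    related_patterns=["Privacy Zuckering"],
    psychological_aspects="Unnoticed defaults inhibit deliberative (System 2) processing.",
    strategies=["Obscure"],
)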
4 Psychological Aspects

In the following, we address the question of why privacy dark patterns actually work. One can reasonably assume that there is, at least to some degree, awareness among a majority of users that privacy dark strategies exist and that some service providers have strong incentives to violate the privacy of their users. It is similarly likely that users notice, at least sometimes, when they are being targeted by privacy dark strategies. Nevertheless, privacy dark strategies still work, as indicated by their frequent occurrence. This somewhat paradoxical situation can be explained by adopting a psychological perspective on privacy dark strategies.

Essentially, privacy dark strategies often work well because they take advantage of the psychological constitution of human beings. In this regard, we focus on the ways in which humans think and reason, i.e., humans' cognitive information processing.

There is widespread agreement in the field of psychological research that two different cognitive systems underlie thinking and reasoning processes [29, 43, 44]. For instance, when creating a new account on a website, users are often asked to agree to a list of general terms and conditions. Most likely, they will not read the page filled with these terms and conditions, but will agree to them quickly, intuitively, and automatically. This is an example of a System 1 thinking process; it takes place automatically, unconsciously, and with little effort [29, 43, 44].


Instead of agreeing to general terms and conditions quickly and automatically, one can take the time and make the effort to carefully read the information provided. Afterwards, one deliberatively weighs the pros and cons and decides whether to agree to the conditions or not. This is an example of a System 2 thinking process; it takes place in a controlled, conscious, and effortful way. Behavior based on System 2 thinking is driven by a deliberative, effortful decision-making process, resulting in the relatively slow execution of behavior [29, 43, 44].

General terms and conditions are often not read, and agreement is typically made automatically and quickly, i.e., System 1 operates. There is thus an opportunity to fill general terms and conditions with dark ingredients. These in turn are not consciously noticed when users are in System 1 mode, as illustrated in the example. In general, we postulate that privacy dark strategies work well when individuals process information using System 1 thinking. When (dark) information is processed quickly, without much effort, and automatically, it seems likely that privacy dark strategies can unleash their full impact. In other words, in System 1 mode, subjects are likely to be less conscious of privacy dark patterns being at work and unable to deliberatively act against them. On the other hand, recognizing privacy dark strategies and taking action against them requires System 2 processing.

Past research in fact shows the importance of cognitive information processing for privacy issues (e.g., [8, 32, 34]). Knijnenburg and colleagues [31], for instance, document that people automatically provide personal information on website forms when an autocompletion feature fills out forms by default with previously stored values. Reducing the impact of this automatic (System 1 based) default completion by giving users control over which forms are filled out reduces the amount of personal information provided.

A number of conditions determine whether humans rely on System 1 thinking processes while System 2 thinking processes are inhibited. There are two central aspects to consider [16, 39]. Humans engage in System 1 processing whenever they (a) have little motivation to think and reason in an effortful way or (b) have no opportunity to do so because they lack the required knowledge, ability, or time. Users, for instance, often have no motivation to read general terms and conditions. In instances where they are motivated, they often do not have the opportunity to use System 2 thinking because the language used in general terms and conditions is often too complicated and subjects are unable to interpret this information [35].

4.1 Prompting System 1 Thinking

As argued above, privacy dark strategies are typically accompanied by System 1 thinking processes, while System 2 thinking processes are often not possible, as shown in the following analysis. Regarding the dark strategy Maximize, the amount of data that is processed is significantly higher than the data that is really needed for the task. Subjects need high motivation to resist excessive data collection. Additionally, although some users might have high motivation, they need specific knowledge and abilities to offer any resistance. However, some service providers use mandatory form fields for user registration, which renders the knowledge to circumvent the dark strategy useless if one wants to utilize the service. Thus, users often stay in System 1 mode and allow Maximize to operate.

When personal data is not hidden from plain view (Publish), users need to be motivated and able to change settings. Users might lack the necessary motivation and ability to do so, thus remaining in System 1 processing when it comes to, for instance, privacy settings.

Working against the dark strategies of centralizing personal data (Centralize) and of providers that interrelate data items (Preserve) requires particularly high motivation as well as extensive knowledge of and the ability to understand these strategies. It is reasonable to assume that the typical user often does not have the knowledge and ability to precisely understand the dark strategies of Centralize and Preserve and to work against them (e.g., taking action against data preservation). Thus, users often cannot engage in the deliberative processing that might lead to behavior that challenges these two dark strategies.

The dark strategy Obscure reflects the idea that it is difficult for users to learn about what happens to their personal data. This strategy implies that users must be highly motivated and able to acquire information about how their personal data is used and stored. Again, this requirement inhibits users from engaging in deliberative processing.

Analogously, when users' control of data is denied (Deny), they must be highly motivated and able to work against this strategy. Deny makes it even more difficult for users to notice the violation of privacy policies and legal requirements (Violate).


Here, high motivation and ability are needed to enable users to notice and to work against this dark strategy.

When certificates or other information is faked (Fake), users need to be motivated to search for this information. Additionally, they need the ability to judge whether information has been faked or not. If motivation and ability are not present, subjects will process (fake) information using System 1 thinking and will likely not notice the privacy dark strategy.

To sum up, it is evident that privacy dark strategies work because users often do not have the motivation or opportunity to resist them. As such, System 2 thinking processes are often absent, while System 1 thinking does accompany the use of privacy dark strategies. Building on these considerations, one can deduce suggestions on how to reduce the power of privacy dark strategies. Specifically, we argue for attempts to strengthen System 2 thinking processes by increasing motivation (e.g., through emphasizing the negative impact of privacy dark strategies) and opportunities for resistance (e.g., by increasing knowledge about privacy dark strategies as advocated by this paper, or by implementing tools that reduce the automatic provision of private information [31]).

Fig. 2. Dialog of the Facebook mobile website when deactivating the account. The page shows profile pictures of contacts the user has recently interacted with and states that they will miss the user when deactivating the account. Facebook targets the user's need to belong and provokes a reconsideration.

4.2 Humans' Fundamental Need to Belong

Beyond the idea that human information processing is involved in the functioning of privacy dark strategies, humans' fundamental needs also contribute to the effectiveness of some privacy dark strategies. Humans possess basic needs, e.g., safety and security needs, concerns about physical well-being, the need for self-esteem, and the need to belong to significant others [23]. We identified the need to belong as particularly important for why some privacy dark strategies work well. The argument that is put forward states that individuals' need to belong forces people to disregard privacy issues.

The need to belong reflects humans' desire to be an accepted member of a group. Psychological experiments (e.g., Williams et al. [49]) show that social exclusion of a subject, even by unknown other subjects in a simple ball game played on the Internet, reduced subjects' well-being, their belief in a meaningful existence, and their self-esteem. People's need to belong manifests as a concern for being liked and admired by others, as is evident in social networks [20]. The need to belong motivates people to accumulate social capital [9], i.e., to establish relationships with other people (e.g., in social networks) that serve as personal resources for individuals' well-being and functioning [7, 15, 26].

Although important for human beings [9], the need to belong might counteract privacy concerns. For example, when personal data is not hidden from plain view (Publish), it can create a possibility of being liked and admired by others, which can fulfill one's need to belong (cf. Nadkarni and Hofmann [37]). This may lead to a reduced level of privacy at the same time. Furthermore, it is hard for subjects to learn about what happens to the personal data they share (Obscure) based on their need to belong.

Service providers might further Maximize the amount of data based on subjects' need to belong to gain information about their users, specifically about their social capital [15]. This information is then used to again target subjects' need to belong, for instance when a user wants to unsubscribe. Facebook, for example, writes "Your [number] friends will no longer be able to keep in touch with you." and "[Name] will miss you" (status January 16, 2016).


(status January 16, 2016). As shown in Figure 2, users’ and it provides a skip option. Still, the form design la-
motivation to unsubscribe is challenged by activating tently manipulates the user by encouraging the creation
their need to belong and the presentation of users’ social of a user account.
capital they would lose once they unsubscribe [7, 26]. A stronger form of manipulation is achieved by ap-
In sum, people provide and share private informa- plying traditional persuasion techniques [13]. For in-
tion based on their need to belong. Therefore, the need stance, the so-called “door in the face” technique takes
to belong may run counter to high privacy standards. advantage of the principle of reciprocity. In this tech-
nique, the refusal of a large initial request increases the
likelihood of agreement to a second, smaller request.
4.3 Specific Mechanisms This technique has already been studied in the context
of private information disclosure [4] and privacy user
In summary, privacy dark strategies often work well be- settings [30]. Applied to privacy dark strategies, a ser-
cause they take advantage of human beings’ psycholog- vice provider might intentionally ask users for dispro-
ical constitution. We argue that System 1 thinking and portionate amounts of personal data. By providing an
the need to belong are so fundamental for malicious pri- option to skip the first form and then only asking for
vacy mechanisms to work, that both aspects represent a very limited set of personal data in the second form
the basis of psychological considerations in our frame- (e.g., mail address only), users may be more willing to
work. Furthermore, we believe that both aspects are comply and to provide that information after all.
helpful for contributors when briefly assessing potential Closely related to the two cognitive systems, heuris-
privacy dark patterns and their psychological mechan- tics and biases provide decision guidance in case of un-
ics. certainty [47]. Although there are a lot of heuristics
The discussion whether a pattern is to be regarded and cognitive biases related to decision making [29] that
as a dark pattern can then easily integrate a perspec- could be exploited by dark privacy patterns, we will only
tive of the users. This psychological perspective com- introduce an exemplary bias that we later use in one of
plements the assessments of actual impacts of the pat- our example patterns: Hyperbolic discounting [33] is a
tern and suspected motives of the service providers. bias causing humans to inconsistently valuate rewards
This is important in order to differentiate actual privacy over time. Also known as present bias, this bias tricks
dark patterns with malicious intent from other forms of humans into favoring a present reward over a similar re-
poorly implemented or unintended features regarding ward at a later point in time. In terms of privacy, many
privacy. users tend to focus on the instant gratification of an
Apart from the thinking and reasoning processes immediate reward, when they are forced to provide per-
and the need to belong mentioned before, arbitrary pat- sonal data to use a service. At the same time, the users
terns may exploit more specific psychological mecha- discount the ramifications of privacy disclosures in the
nisms which build upon these fundamental aspects. In future [2].
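The paper itself gives no formal model, but the standard one-parameter hyperbolic discount function from the behavioral-economics literature (our addition, used here purely as an illustration) makes the bias concrete. The perceived present value V of a reward of size A that arrives after a delay D is

    V(A, D) = A / (1 + k * D)

where k > 0 is an individual discounting parameter. Because the denominator grows with the delay, a small but immediate benefit, such as instant access to a service, can subjectively outweigh a much larger privacy cost that only materializes later.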
Cognitive dissonance [17] is a state of discomfort caused by contradictory beliefs and actions. According to the theory of cognitive dissonance, the experience of inconsistency triggers a reduction of dissonance and a potential modification of the conflicting cognition. In terms of privacy dark patterns, this process can be exploited by inconspicuously providing justification arguments for sugarcoating user decisions that have negatively affected their privacy. For instance, after asking users for inappropriate amounts of personal data, a service provider would later remind the users of the high data protection standards they comply with. When a user hesitantly provides personal data although they are generally very cautious regarding personal information, a state of discomfort may emerge soon after. Such hints may then influence the dissonance resolution of the user.


5 Dark Patterns in the Wild

This section introduces patterns of frequent malicious privacy behavior. For this purpose, we surveyed popular web sites and mobile applications and gathered reports of recent privacy incidents. Next, we analyzed the underlying concepts and mechanisms regarding malicious effects on privacy, assessed their impacts, and estimated the intentionality of the service providers. Based on our framework, we then extracted a number of common dark privacy patterns and described them using our pattern template. The resulting list is not exhaustive, but illustrates the idea of privacy dark patterns based on exemplary sightings in the wild.

Of course we cannot clearly determine whether the service providers mentioned as examples in the following patterns actually had a malicious intent, and we are not claiming they did. It is still reasonable to believe that many of the companies offering free services and apps have strong motivations to gather as much data from their customers as possible and design their web services and mobile applications on purpose following such privacy dark patterns. In any case, the examples are helpful to understand the mechanics of the privacy dark pattern in question.

Please note that the following patterns are shortened and use a condensed structure. The extended versions of the patterns based on our full template structure are available at our online portal [Link].

5.1 Privacy Zuckering

The term Privacy Zuckering was first introduced by Tim Jones in an EFF article [28] for "deliberately confusing jargon and user-interfaces", and was later used on [Link] for a UI dark pattern. For our catalog, we generalize the idea and present it as a universal privacy dark pattern.

Name/Aliases: Privacy Zuckering
Context: The access and usage of personal data is often governed by user-specific, modifiable privacy settings. By doing this, users can choose privacy settings that reflect their own privacy requirements.
Description: A service provider allows users to change their privacy settings. However, the settings are unnecessarily complex, overly fine-grained, or incomprehensible to the user. As a result, the user either gives up or makes unintended changes to their privacy settings.
Effect: While the service provider will claim that users have full control over their privacy settings, the presentation, terminology, and user experience will highly discourage users from making changes. When combined with the Bad Defaults pattern, these patterns facilitate the enforcement of privacy settings suggested by the service provider. Privacy Zuckering could lead to unintentional changes of privacy settings when the complexity of the settings does not align with the user's perception, and hence prevents originally intended preference adjustments.
Countermeasures: When service providers apply Privacy Zuckering, users require the help of third parties that clarify the settings and guide them through the intended preferences.
Examples/Known Uses: In the past, Facebook has been accused of applying Privacy Zuckering to their users' privacy settings pages, which is what coined the term in the first place [11]. For instance, in August 2010, an updated privacy settings page of Facebook allowed for highly customized settings, but required users to change dozens of settings on multiple pages to maximize personal privacy.
Related Patterns: When Bad Defaults are in place, Privacy Zuckering prevents changes and increases the number of retained default settings.
Psychological Aspects: Overly complex settings and inappropriate terminology require System 2 thinking. When a user is motivated to change their settings, but is overwhelmed at the same time, and hence lacks the opportunity to do so purposefully, the user may either switch back to System 1 thinking and make vague changes, or the user may refrain from making changes at all.
Strategies: Obscure

5.2 Bad Defaults

Name/Aliases: Bad Defaults
Context: This dark pattern is used mainly on websites, by applications, or in social networks. For Bad Defaults to have an effect it is often necessary that the system has some form of user accounts.
Description: When creating an account at a service provider, the default options are sometimes chosen badly in the sense that they ease or encourage the sharing of personal information. Most users will be too busy to look through all the options and configure their account properly. Thus, they often unknowingly share more personal information than they intend to.

Fig. 3. Facebook default settings from 2010. The graph shows which information can by default be accessed by You, your Friends, Friends of Friends (FoF), all Facebook users (FBU), and the whole Internet (Inet). For the source and more details we refer to [Link]

Effect: This pattern causes the user to share more information with the system or other users than the user intends to. This includes, but is not limited to, which sites the user visits, parts of the user profile, and the user's online status.

Countermeasures: Users need to be educated to develop more awareness of bad default settings so that they become self-motivated to configure their accounts properly. However, this is hard to achieve.

Examples/Known Uses: Facebook Default Privacy Settings (cf. Figure 3).

Related Patterns: Privacy Zuckering demotivates users from changing the defaults.

Psychological Aspects: When users are not aware of the defaults that are in effect, a deliberative processing of this information is inhibited.

Strategies: Obscure
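To make the mechanism more concrete, the following sketch contrasts bad defaults with a privacy-friendly baseline. The setting names and audience values are hypothetical and do not reproduce any particular platform's configuration; the sketch only illustrates how the provider's choice of initial values, combined with users changing very little, determines the resulting exposure.

# Bad Defaults: initial values favor wide disclosure (hypothetical setting names).
BAD_DEFAULTS = {
    "birthday": "everyone",
    "photos": "friends_of_friends",
    "contact_email": "everyone",
    "online_status": "everyone",
}

# A privacy-friendly baseline starts from the narrowest audience and lets
# users opt in to wider sharing instead of opting out of it.
PRIVACY_FRIENDLY_DEFAULTS = {key: "only_me" for key in BAD_DEFAULTS}

def effective_settings(defaults, user_changes):
    """Return the settings that end up in force: the defaults plus the few
    items a (typically busy) user actually bothers to change."""
    settings = dict(defaults)
    settings.update(user_changes)
    return settings

# A user who adjusts only one item still exposes three items widely under
# bad defaults, but nothing extra under the privacy-friendly baseline.
print(effective_settings(BAD_DEFAULTS, {"birthday": "friends"}))
print(effective_settings(PRIVACY_FRIENDLY_DEFAULTS, {"birthday": "friends"}))

The point of the sketch is that the retained defaults, not the user's explicit choices, determine most of the resulting exposure.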
5.3 Forced Registration

Name/Aliases: Forced Registration

Context: This pattern can be applied in nearly every service which provides some functionality to users. When the functionality technically requires an account, e.g., in online social networks, this pattern degenerates. In this case, we are no longer speaking of a privacy dark pattern, since without an account the service cannot be provided in the intended way.

Description: A user wants to use some functionality of a service which is only accessible after registration. Sometimes this is necessary to use the service in a meaningful way or to prevent misbehavior. But very often it is unnecessary and serves the interest of the service provider by giving it access to (unneeded) personal data. The personal information collected regularly includes an e-mail address, since this is required for creating the account, but it is often augmented by birthdates, home addresses, etc.

Effect: The effect of this pattern is that the user is forced to register an account at the service provider, thereby allowing the service provider to track user behavior on its platform. Additionally, the registration process often requires an e-mail address and other personally identifiable information. Since the user does not want to have an account in the first place, the user is unlikely to configure the settings properly, thereby possibly revealing even more personal information not intended for disclosure.

Countermeasures: One countermeasure is to create a new account and fill it with random data. Often, one can use an anonymous one-time e-mail address4 during registration to receive the activation link for the account. Another countermeasure is provided by the service BugMeNot5, which enables users to bypass the forced registration by letting many users share their account details, creating a large anonymity set. A user can try accounts published at BugMeNot for using the service. BugMeNot also allows users to create new accounts and share them with other users of BugMeNot, and it is even available as a browser extension for some web browsers.

Examples/Known Uses: As of Feb. 2016, the popular question-and-answer website [Link] requires external visitors to sign up and log in when opening a question page. While the page is rendered initially, it is then blocked by a pop-up modal dialog that forces visitors to register, even for one-time, read-only access.

Related Patterns: When a user is required to register, an Immortal Account will prevent the later cancellation of the account. Forced accounts can come with Bad Defaults.

Psychological Aspects: As the user's original goal is blocked by the necessary registration, account creation often happens as part of an automatic behavior for achieving that goal. This gives the user instant gratification, and critical and deliberative thoughts are inhibited.

Strategies: Maximize

4 e.g., [Link]
5 [Link]

5.4 Hidden Legalese Stipulations

Name/Aliases: Hidden Legalese Stipulations

Context: This pattern can be used by all systems which incorporate a document describing the terms and conditions of using the service.

Description: Terms and conditions are mandatory by law. Nevertheless, most users do not read them, since they are often long and written in complicated legal jargon. This legal jargon is necessary to provide succinctness and clarity, but it is not user-friendly. The inability of the user to grasp the legal jargon puts them in a vulnerable state, since the policy is legally binding. If this vulnerability is exploited, the policy turns into an instance of a privacy dark pattern. Service providers can hide stipulations in the policies which target the privacy of the user. Often the user will not notice this, either not reading the terms and conditions or being unable to understand their implications. Some service providers state that they will change their policies without further notice, preventing the user even further from learning what happens to their data.

Effect: Usage of this pattern allows the service provider to hide its malicious deeds from the user without necessarily violating legal regulations.

Countermeasures: There are various proposals for easier communication of legal conditions.

One solution is to make the legal conditions fully machine-readable. This was the approach that P3P, the Platform for Privacy Preferences Project, followed. P3P is a standard by the W3C6 for a machine-readable rendering of privacy policies. The basic idea is that an XML file specifying the privacy policy can be retrieved from any participating web page. This policy can automatically be checked against the preferences of the user by the browser. The Privacy Bird7, for example, was a tool which could show the P3P description as an icon, namely a bird. The color of the bird, i.e., red or green, signified whether the policy of the site matched the user's preferences. The drawback of this approach is that the service provider needs to provide the machine-readable P3P description. A malicious service provider who wants to trick users with hidden legal stipulations will of course not provide such a description. Since this countermeasure depends on the collaboration of the service provider, it is not effective.

Another approach is the one followed by the Terms of Service; Didn't Read (TOSDR8) webpage. This is a community-driven repository of ratings of privacy policies. TOSDR is available as a browser add-on and shows the rating of the terms of service of the current web page as a small icon. When clicking on the icon, one can see the positive and negative points of the terms of service in easily understandable language.
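The automatic check that such machine-readable approaches rely on can be sketched as follows. This is a deliberately simplified illustration and not the actual P3P vocabulary or the Privacy Bird implementation: the policy fields, purposes, and preference names are hypothetical, and real P3P policies are richer XML documents.

# Hypothetical, simplified machine-readable policy, as a browser might fetch it.
site_policy = {
    "data_collected": {"email", "birthdate", "browsing_history"},
    "purposes": {"service_provision", "advertising"},
}

# Hypothetical user preferences, configured once in the browser.
user_preferences = {
    "allowed_data": {"email"},
    "allowed_purposes": {"service_provision"},
}

def policy_matches(policy, prefs):
    """Accept the policy only if the site collects no more data, for no
    more purposes, than the user is willing to tolerate."""
    return (policy["data_collected"] <= prefs["allowed_data"]
            and policy["purposes"] <= prefs["allowed_purposes"])

# A Privacy-Bird-like indicator: green if the policy matches, red otherwise.
print("green" if policy_matches(site_policy, user_preferences) else "red")

As discussed above, the weak point is not this comparison but the assumption that a provider hiding stipulations would publish an honest machine-readable policy in the first place.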
Examples/Known Uses: In 2000, the then-popular instant messenger service ICQ introduced a "Terms Of Service — Acceptable Use Policy"9 which granted the service operators the copyright on all information posted by their users. Hidden in this legalese, the operators granted themselves further rights of use, "including, but not limited to, publishing the material or distributing it".

The British firm GameStation owns the souls of 7,500 online shoppers, thanks to an "immortal soul clause"10 in its terms and conditions. This April Fool's gag reveals the effectiveness of this pattern and shows that companies can hide almost anything in their online terms and conditions. Please note that McDonald and Cranor [36] calculated that reading the privacy policies one encounters in a year would take 76 work days.

Related Patterns: n/a

Psychological Aspects: Even if the user is motivated to read terms and conditions, the missing opportunity to comprehend all details makes System 1-based processing more probable.

Strategies: Obscure

6 [Link]
7 [Link]
8 [Link]
9 [Link]//[Link]/legal/[Link]
10 [Link] [Link]

5.5 Immortal Accounts

Name/Aliases: Immortal Accounts

Context: Many services require user accounts, either because they are necessary for service fulfilment, or because user accounts represent a benefit for the service.

Description: The service provider requires new users to sign up for accounts to use the service. Once users decide to stop using the service, they might want to delete their accounts and associated data. However, the service provider prevents the user from doing so by either unnecessarily complicating the account deletion experience, or by not providing any account deletion option

at all. Additionally, the service provider might trick the user in the deletion process by pretending to delete the entire account, while still retaining (some) account data.

Effect: When the user interface makes the account deletion options hard to access, the barrier to delete the account is increased. If the users are required to call the customer support, the process is even more cumbersome. Both of these deliberately inconvenient user experiences may cause the user to reconsider the actual deletion decision. A deletion process where the service provider claims to remove the account, but instead just flags the user records as deleted while still keeping the data, gives the user a false feeling of deletion.
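The "flag instead of delete" variant is easy to express in code. The following sketch is hypothetical (the table and column names are invented) and merely contrasts a soft delete, which silently keeps all personal data, with an actual removal.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, deleted INTEGER DEFAULT 0)")
conn.execute("INSERT INTO users (email) VALUES ('alice@example.org')")

def soft_delete(user_id):
    # Immortal Accounts: the record is merely flagged; all personal data remains.
    conn.execute("UPDATE users SET deleted = 1 WHERE id = ?", (user_id,))

def hard_delete(user_id):
    # Honest deletion: the personal data is actually removed.
    conn.execute("DELETE FROM users WHERE id = ?", (user_id,))

soft_delete(1)
# The user believes the account is gone, yet the e-mail address is still stored.
print(conn.execute("SELECT email, deleted FROM users").fetchall())

From the user's perspective both variants look identical, which is exactly what makes this behavior hard to detect from the outside.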
Countermeasures: Online resources such as justdelete.me11 or accountkiller.com12 curate lists of service providers and their policies towards account removal. They provide step-by-step tutorials for users on how to delete an account at those providers. If the service to be used is known for a non-delete policy but requires a user account, the usage of a throwaway account with incorrect data should be considered.

Examples/Known Uses: As of February 2016, the community-curated data set of [Link] lists 474 services. 75 of these services do not provide the possibility to delete the account at all, and 100 services require contacting customer support. Of the remaining 299 services listed, another 31 services have a non-trivial deletion process that requires additional steps.

Related Patterns: The creation of accounts can be required due to Forced Registration.

Psychological Aspects: When the service provider renders the user experience for account deletion deliberately painful, users might struggle in the process. If the user wants to delete the account but fails to do so, cognitive dissonance may emerge. As a result, the user could then reduce the inconsistent mental state by reconsidering their original intent and deciding not to delete the account.

Strategies: Deny, Obscure

5.6 Address Book Leeching

Name/Aliases: Address Book Leeching

Context: A service provider offers users the option to upload or import their address books to connect with known contacts on that service.

Description: When the user imports the list, the service executes a lookup against its own database. It then provides suggestions for connections to the user. However, the service provider stores the list of all contacts as internal data records for further processing, including purposes that have not been initially declared.

Effect: Using an import feature may lead to exposing unwanted information, specifically the contents of personal address books, to third parties. A potential usage of such information is the dispatch of invitations or other advertisements, at worst even in the name of the original uploader without consent. Service providers may misuse such data for profiling and tracking individuals that do not yet possess a user account.

Countermeasures: If it is unknown or unclear how a service provider is handling and processing imported contact lists, such a feature should be avoided. Many mobile and desktop operating systems allow users to deny applications access to address book data. Users should routinely deny such access unless it is definitely required or in their interest to share those data.

Examples/Known Uses: In 2008, the social book cataloging website [Link] attracted negative attention for unsolicited invite emails based on its address book import feature. The experiences of customers and the reactions of the service provider are still available on a customer support page13. Based on a misleading upload form design, users thought they would only provide contacts for matching against goodreads' user base. Instead, goodreads sent invite emails to persons whose mail addresses were not yet registered at goodreads, thereby referring to the user who provided the address.

Related Patterns: This pattern is a potential source of information for Shadow User Profiles.

Psychological Aspects: Trading personal information for instant connections to friends or known contacts is motivated by the need to belong.

Strategies: Maximize, Preserve
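The difference between a contact matching that respects the uploader and the leeching variant described above can be sketched as follows. The function and variable names are hypothetical; the point is only what gets retained after the lookup.

registered_users = {"bob@example.org", "carol@example.org"}
leeched_store = []  # stands in for the provider's internal data records

def match_contacts_minimal(uploaded_contacts):
    """Privacy-respecting variant: compare, suggest, and forget the rest."""
    return registered_users & set(uploaded_contacts)

def match_contacts_leeching(uploaded_contacts):
    """Dark-pattern variant: the complete upload is retained for
    undeclared purposes such as invitations or shadow profiles."""
    leeched_store.append(list(uploaded_contacts))
    return registered_users & set(uploaded_contacts)

upload = ["bob@example.org", "dave@example.org"]  # dave is not a member
print(match_contacts_minimal(upload))   # {'bob@example.org'}
print(match_contacts_leeching(upload))  # the same suggestion ...
print(leeched_store)                    # ... but dave's address is now stored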
11 [Link]
12 [Link]
13 [Link]goodreads_trick_me_into_spamming_my_entire_address_book

5.7 Shadow User Profiles

Name/Aliases: Shadow User Profiles

Context: A service provider tracks personal information about individuals.

Description: While registered users have deliberately

opted in for a user account and an associated profile, the service provider may collect information and keep records about individuals that do not use the service. For instance, in a social network, the social graph can be supplemented with persons that are not members of the network, but are known to the network based on data from members (e.g., imported address books, content metadata, or mentions). Such non-members enrich the graph and improve the quality of algorithms such as contact suggestions.

Effect: The service provider stores and processes information on individuals without their knowledge or consent. The affected individuals are not aware of personal data records they have accidentally created or that have been provided by third parties.

Countermeasures: While it is possible to minimize one's own data trail, the accidental release of personal data through third parties cannot always be prevented.

Examples/Known Uses: The basic mechanism of shadow user profiles fuels the entire online advertisement industry. Although not verifiable, social networks may store information about non-users. This notion is based on the experiences of newly registered users of social networks who received accurate friendship suggestions without ever having interacted with these persons on the social network before.

Related Patterns: Address Book Leeching is a potential source of information for this pattern.

Psychological Aspects: Given the fact that this pattern operates without any knowledge of the affected users, it is not targeting any psychological aspects.

Strategies: Maximize, Preserve, Centralize
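The graph-supplementation step described above can be sketched as follows. Names and data sources are hypothetical; the sketch merely shows how a non-member acquires a shadow node, so that friend suggestions are already accurate the moment that person signs up.

from collections import defaultdict

# Adjacency sets of the social graph; nodes are created lazily, so a
# non-member gets a shadow node as soon as a member references them.
graph = defaultdict(set)

def add_link(member, contact):
    graph[member].add(contact)
    graph[contact].add(member)

# Two members import address books that both mention "eve",
# who has never created an account herself.
for contact in ["bob", "eve"]:
    add_link("alice", contact)
for contact in ["alice", "eve"]:
    add_link("bob", contact)

def suggestions(new_member):
    # The edges of the pre-existing shadow node become "people you may know".
    return graph[new_member]

print(suggestions("eve"))  # {'alice', 'bob'}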
6 Conclusions

In this paper, we introduce the concepts of privacy dark strategies and privacy dark patterns. Both are based on the idea that actors intentionally manipulate people to provide their personal data for collection, storage, and processing against their original intent and interest.

Documenting such strategies and patterns is a vital first step towards a better recognition of such activities, e.g., on the Internet or in mobile apps. Our eight privacy dark strategies Maximize, Publish, Centralize, Preserve, Obscure, Deny, Violate, and Fake provide a coarse categorization for the subsequent patterns. Privacy dark patterns are documented using a uniform template. Beyond a mere description of the pattern, the template contains countermeasures and a psychological viewpoint that explains why the pattern is effective.

We extensively discussed psychological aspects in Section 4. Understanding the psychological mechanisms triggered by privacy dark patterns is of crucial importance, as it will allow affected users to take appropriate countermeasures.

Based on our privacy dark pattern framework and the extensive discussion of the related concepts, we briefly presented seven such patterns including some concrete examples. These patterns and more are available in an extended form via an online privacy dark pattern portal [Link]. We have set up this portal for the community to study and discuss existing patterns and contribute new ones.

Acknowledgements

The authors would like to thank the anonymous reviewers for their valuable comments and suggestions to improve the quality of the paper. They are also grateful to Yllka Thaqi and Florian Oberlies for insightful remarks and fruitful discussions.

References

[1] G. Acar, C. Eubank, S. Englehardt, M. Juarez, A. Narayanan, and C. Diaz, "The Web never forgets: Persistent tracking mechanisms in the wild," in Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security. ACM, 2014, pp. 674–689.
[2] A. Acquisti, "Privacy in electronic commerce and the economics of immediate gratification," in Proceedings of the 5th ACM conference on Electronic commerce. ACM, 2004, pp. 21–29.
[3] A. Acquisti, "Nudging privacy: The behavioral economics of personal information," IEEE Security & Privacy, vol. 7, no. 6, pp. 82–85, 2009.
[4] A. Acquisti, L. K. John, and G. Loewenstein, "The impact of relative standards on the propensity to disclose," Journal of Marketing Research, vol. 49, no. 2, pp. 160–174, 2012.
[5] C. Alexander, S. Ishikawa, and M. Silverstein, A Pattern Language: Towns, Buildings, Construction (Center for Environmental Structure Series). Oxford University Press, 1977.
[6] H. Almuhimedi, F. Schaub, N. Sadeh, I. Adjerid, A. Acquisti, J. Gluck, L. F. Cranor, and Y. Agarwal, "Your location has been shared 5,398 times!: A field study on mobile app privacy nudging," in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ser. CHI '15. New York, NY, USA: ACM, 2015, pp. 787–796.

[7] Y. Amichai-Hamburger and E. Ben-Artzi, "Loneliness and internet use," Computers in Human Behavior, vol. 19, no. 1, pp. 71–80, 2003.
[8] C. M. Angst and R. Agarwal, "Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion," MIS Quarterly, vol. 33, no. 2, pp. 339–370, 2009.
[9] R. F. Baumeister and M. R. Leary, "The need to belong: desire for interpersonal attachments as a fundamental human motivation," Psychological Bulletin, vol. 117, no. 3, pp. 497–529, 1995.
[10] K. Beck and W. Cunningham, "Using pattern languages for object oriented programs," in Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA), 1987.
[11] H. Brignull, "Dark Patterns: fighting user deception worldwide," [Link] accessed: 2016-01-24.
[12] A. Buchenscheit, B. Könings, A. Neubert, F. Schaub, M. Schneider, and F. Kargl, "Privacy implications of presence sharing in mobile messaging applications," in Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia. ACM, 2014, pp. 20–21.
[13] R. Cialdini, Influence: The Psychology of Persuasion. New York: Morrow, 1993.
[14] N. Doty and M. Gupta, "Privacy Design Patterns and Anti-Patterns," in Trustbusters Workshop at the Symposium on Usable Privacy and Security, 2013.
[15] N. B. Ellison, C. Steinfield, and C. Lampe, "The benefits of Facebook "friends": social capital and college students' use of online social network sites," Journal of Computer-Mediated Communication, vol. 12, no. 4, pp. 1143–1168, 2007.
[16] R. H. Fazio, "Multiple processes by which attitudes guide behavior: The MODE model as an integrative framework," Advances in Experimental Social Psychology, vol. 23, pp. 75–109, 1990.
[17] L. Festinger, A Theory of Cognitive Dissonance. Stanford University Press, 1962, vol. 2.
[18] M. Fowler, Patterns of Enterprise Application Architecture. Boston: Addison-Wesley Professional, 2003.
[19] E. Gamma, R. Helm, R. Johnson, and J. Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software. Pearson Education, 1994.
[20] H. Gangadharbatla, "Facebook me: Collective self-esteem, need to belong, and internet self-efficacy as predictors of the iGeneration's attitudes toward social networking sites," Journal of Interactive Advertising, vol. 8, no. 2, pp. 5–15, 2008.
[21] S. Gürses, C. Troncoso, and C. Diaz, "Engineering privacy by design," Computers, Privacy & Data Protection, vol. 14, 2011.
[22] M. Hafiz, "A collection of privacy design patterns," in Proceedings of the 2006 conference on Pattern Languages of Programs. ACM, 2006, p. 7.
[23] E. T. Higgins, Beyond Pleasure and Pain: How Motivation Works. Oxford University Press, 2011.
[24] J.-H. Hoepman, "Privacy Design Strategies," CoRR, vol. abs/1210.6621, 2012.
[25] G. Hohpe and B. Woolf, Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions, 1st ed. Boston: Addison-Wesley Professional, 2004.
[26] D. J. Hughes, M. Rowe, M. Batey, and A. Lee, "A tale of two sites: Twitter vs. Facebook and the personality predictors of social media usage," Computers in Human Behavior, vol. 28, no. 2, pp. 561–569, 2012.
[27] P. Hustinx, "Privacy by design: delivering the promises," Identity in the Information Society, vol. 3, no. 2, pp. 253–255, 2010.
[28] T. Jones, "Facebook's "evil interfaces"," [Link]org/de/deeplinks/2010/04/facebooks-evil-interfaces, accessed: 2016-02-25.
[29] D. Kahneman, Thinking, Fast and Slow. Macmillan, 2011.
[30] B. P. Knijnenburg and A. Kobsa, "Increasing sharing tendency without reducing satisfaction: Finding the best privacy-settings user interface for social networks," in Proceedings of the International Conference on Information Systems - Building a Better World through Information Systems, ICIS 2014, Auckland, New Zealand, December 14-17, 2014, 2014.
[31] B. P. Knijnenburg, A. Kobsa, and H. Jin, "Counteracting the negative effect of form auto-completion on the privacy calculus," in Thirty Fourth International Conference on Information Systems, Milan, 2013.
[32] A. Kobsa, H. Cho, and B. P. Knijnenburg, "The effect of personalization provider characteristics on privacy attitudes and behaviors: An elaboration likelihood model approach," Journal of the Association for Information Science and Technology, 2016, in press.
[33] D. Laibson, "Golden eggs and hyperbolic discounting," The Quarterly Journal of Economics, vol. 112, no. 2, pp. 443–478, 1997.
[34] P. B. Lowry, G. Moody, A. Vance, M. Jensen, J. Jenkins, and T. Wells, "Using an elaboration likelihood approach to better understand the persuasiveness of website privacy assurance cues for online consumers," Journal of the American Society for Information Science and Technology, vol. 63, no. 4, pp. 755–776, 2012.
[35] E. Luger, S. Moran, and T. Rodden, "Consent for all: revealing the hidden complexity of terms and conditions," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2013, pp. 2687–2696.
[36] A. M. McDonald and L. F. Cranor, "The cost of reading privacy policies," ISJLP, vol. 4, p. 543, 2008.
[37] A. Nadkarni and S. G. Hofmann, "Why do people use Facebook?" Personality and Individual Differences, vol. 52, no. 3, pp. 243–249, 2012.
[38] N. Notario, A. Crespo, Y.-S. Martín, J. M. Del Alamo, D. Le Métayer, T. Antignac, A. Kung, I. Kroener, and D. Wright, "PRIPARE: Integrating Privacy Best Practices into a Privacy Engineering Methodology," in Security and Privacy Workshops (SPW), 2015 IEEE. IEEE, 2015, pp. 151–158.
[39] R. E. Petty and J. T. Cacioppo, The Elaboration Likelihood Model of Persuasion. Springer, 1986.
[40] S. Romanosky, A. Acquisti, J. Hong, L. F. Cranor, and B. Friedman, "Privacy patterns for online interactions," in Proceedings of the 2006 conference on Pattern Languages of Programs. ACM, 2006, p. 12.
[41] M. Schumacher, "Security patterns and security standards," in EuroPLoP, 2002, pp. 289–300.

[42] T. Schümmer, "The public privacy – patterns for filtering personal information in collaborative systems," in CHI2004: Proceedings of the Conference on Human Factors in Computing Systems, 2004.
[43] K. E. Stanovich and R. F. West, "Advancing the rationality debate," Behavioral and Brain Sciences, vol. 23, no. 05, pp. 701–717, 2000.
[44] F. Strack and R. Deutsch, "Reflective and impulsive determinants of social behavior," Personality and Social Psychology Review, vol. 8, no. 3, pp. 220–247, 2004.
[45] R. Thaler, Nudge: Improving Decisions about Health, Wealth, and Happiness. New York: Penguin Books, 2009.
[46] J. Tidwell, Designing Interfaces. Sebastopol: O'Reilly Media, 2010.
[47] A. Tversky and D. Kahneman, "Judgment under uncertainty: Heuristics and biases," Science, vol. 185, no. 4157, pp. 1124–1131, 1974.
[48] J. van Rest, D. Boonstra, M. Everts, M. van Rijn, and R. van Paassen, Designing Privacy-by-Design. Springer, 2014, pp. 55–72.
[49] K. D. Williams, C. K. Cheung, and W. Choi, "Cyberostracism: effects of being ignored over the internet," Journal of Personality and Social Psychology, vol. 79, no. 5, pp. 748–762, 2000.
