Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns
Christoph Bösch*, Benjamin Erb, Frank Kargl, Henning Kopp, and Stefan Pfattheicher
Fig. 1. Screenshots of the Tripadvisor mobile app. (a) The starting page; note the small "Skip" button in the upper right corner. (b) The personal information requested when logging in with Facebook; some of it is unnecessary for providing the service. (c) The result of skipping the sign-in process.
privacy. (ii) A collection of privacy dark patterns fosters awareness and makes it easier to identify such malicious patterns in the wild. (iii) Furthermore, the documentation of a privacy dark pattern can be used as a starting point for the development of countermeasures, which disarm the pattern and re-establish privacy. The discussion is similar to IT security, where the investigation and publication of vulnerabilities proved to be key for actually enhancing the security level of real systems.

1.1 Introduction to Patterns

In many disciplines, recurring problems have been addressed over and over again, yielding similar and recurring solutions. The idea of a pattern is to capture an instance of a problem and a corresponding solution, abstract it from the specific use case, and shape it in a more generic way, so that it can be applied and reused in various matching scenarios.

Patterns originate from the realm of architecture, where Alexander et al. [5] released a seminal book on architectural design patterns in 1977. In this book, the authors compiled a list of archetypal designs for buildings and cities, presented as reusable solutions for other architects. Interestingly, Alexander et al. already came up with patterns for privacy. For instance, their Intimacy Gradient pattern postulates a placement of chambers in such a way that a further distance from the building's entrance allows for increased intimacy.

In 1987, the idea of patterns was readopted by Beck and Cunningham [10] and introduced into the realm of computer science and software development. The Portland Pattern Repository of Beck and Cunningham collected patterns for programmers using object-oriented programming languages.¹ The idea of using patterns in software design gained wide acceptance in 1994, when the so-called Gang of Four released their well-known book on design patterns for reusable object-oriented software [19]. Since then, the usage of patterns has spread to various branches of computer science and software engineering, including distributed architectures [18, 25], user interface design [46], IT security [41], and privacy [14, 22, 40, 42].

The success of patterns in software engineering has entailed novel classes of patterns with different semantics, namely anti patterns [14] and dark patterns [11]. Traditional design patterns capture a reasonable and established solution. In contrast, an anti pattern documents a solution approach that should be avoided, because it has been proven to represent a bad practice.

¹ Historical side note: the online version of the pattern repository, WikiWikiWeb ([Link]), became the first-ever wiki on the World Wide Web.
Hence, anti patterns raise awareness of sub-par solutions and advocate against their usage.

Anti patterns often target solutions that may seem obvious to the system developer at first glance, but include a number of less obvious negative implications and consequences. Even established design patterns sometimes become obsolete and are downgraded to anti patterns due to new considerations. For instance, the Gang of Four suggested a pattern for restricting the instantiation of a class to a single instance, the so-called Singleton pattern. Only after concurrent programming and multi-core architectures became more widespread did the shortcomings of this pattern become apparent. Today, the Singleton pattern is widely considered an anti pattern.

The term dark pattern was first used by Brignull, who collected malicious user interface patterns [11] to raise awareness. A UI dark pattern tricks users into performing unintended and unwanted actions through misleading interface design. More generally speaking, a dark pattern describes, in a generic form, an established solution for exploiting and deceiving users.

In summary, anti patterns collect the Don'ts for good intentions, and dark patterns collect potential Dos for malicious intents. In this paper, we present a first broad discussion of dark patterns in the field of privacy.

1.2 Methodology

To suggest a framework for the collection of privacy dark patterns and to compile a list of such patterns, we consider the problem from three different angles as part of a holistic approach.

First, we survey existing literature on privacy strategies and privacy patterns. We then reverse privacy strategies, adapting and extending some of these ideas so that they become malicious patterns. Beyond this, we have identified new types of patterns. Second, we include a psychological point of view on malevolent privacy concepts. This perspective takes into account human information processing, social cognition and motivation, as well as exploitable basic human needs. On this basis, we are able to deduce additional approaches for reducing the power of privacy dark strategies. Third, we identify and analyze real-world examples of malicious privacy mechanisms as found on websites and in mobile applications.

Next, we integrate these findings on privacy dark patterns into a unified framework, which introduces a general terminology for privacy dark patterns and establishes a template for documenting privacy dark patterns. Our framework suggests a list of malicious privacy strategies and psychological aspects for categorizing privacy dark patterns. Based on the pattern template of our framework, we discuss common privacy dark patterns that we extracted from real-world occurrences.

1.3 Contribution

Our contribution can be summarized as follows:
1. We introduce the concept of privacy dark strategies and privacy dark patterns.
2. We present a framework for privacy dark patterns that takes into account traditional privacy patterns, empirical evidence of malign patterns, underlying malicious strategies, and their psychological background. The resulting framework provides a template for documenting and collecting arbitrary privacy dark patterns.
3. We provide an initial set of exemplary dark patterns that we encountered in the wild.
4. We launched the website [Link] as an online collection for privacy dark patterns. Being a collaborative resource, we invite the community to submit more patterns and help to raise awareness.

2 On Privacy Strategies and Privacy Patterns

In this section, we introduce privacy patterns and corresponding privacy strategies, based on their historical development.

Until the mid-1990s, privacy was rarely considered a relevant feature of IT systems. Even if it was, the integration of privacy-preserving mechanisms was often conducted a posteriori, as an additional requirement later added to the system. The notion of "privacy as an afterthought" contradicted the cross-sectional property of privacy as part of an IT system and often yielded extensive or insufficient changes to the system.

To overcome these deficits, a joint team of the Information and Privacy Commissioner of Ontario, Canada; the Dutch Data Protection Authority; and the Netherlands Organisation for Applied Scientific Research advocated a more integral approach that included privacy considerations into the overall development cycle [27]. In 1995, they introduced the so-called Privacy by Design
approach,² which is postulated by the following seven foundational principles:
1. Proactive not reactive
2. Privacy as the default setting
3. Privacy embedded into design
4. Full functionality
5. End-to-end security
6. Visibility and transparency
7. Respect for user privacy

These principles have been a major milestone for the design of privacy-preserving systems, as they provide general guidance. For that reason, these concepts are part of several privacy legislations today.

One frequent criticism regarding Privacy by Design and its seven principles is that they are too unspecific to be directly applied to a development process. The principles neither provide concrete advice, nor do they address the varying needs of specific domains, such as the Internet of Things, user interface design, or Car-2-Car communication. A system designer is thus still required to have a thorough understanding of privacy and the forces involved to design a privacy-friendly system. Clearly, more guidance and a more methodological approach are required to establish a privacy engineering process, as pursued, for example, by the PRIPARE project [38].

One element of this privacy engineering approach is the concept of so-called Privacy Patterns. The main idea of privacy patterns is to remedy the drawback of the Privacy by Design principles that they are not actionable [14, 21, 48]. Privacy patterns are defined as reusable solutions for commonly occurring problems in the realm of privacy. Essentially, they are patterns for achieving or improving privacy. Privacy patterns provide guidance for engineering and development, and target the needs of specific domains, such as backend implementations or user interface design. By providing a well-structured description of a problem and its solution using a standardized template, patterns can easily be looked up and applied. Since these patterns include references to specific use cases and possibly implementations, engineers will directly find the resources needed to implement them in their own context.

One well-known example of a privacy pattern that can be implemented in multiple domains is the stripping of metadata that is not necessary for the functionality of the service. This procedure increases privacy, since metadata often includes personally identifiable information. Further, this solution is reusable, since it is not bound to a specific instance of a problem. Thus, stripping of metadata constitutes a privacy pattern that can be applied, e.g., to a website managing digital photographs.

A single privacy pattern addresses a problem with a limited scope. Multiple related and complementing patterns can then be compiled into a pattern catalog. Similar to the well-known design pattern catalogs from software engineering, a privacy pattern catalog collects a number of relevant problems and suitable solutions that can be applied during a privacy-aware development phase.

There are multiple collections of privacy patterns from academic research [14, 22, 40, 42] as well as online repositories³ that are more accessible for practitioners.

In a typical system development process, privacy patterns are applied during the stages of design and implementation. However, in many scenarios, privacy aspects represent fundamental system requirements that have to be considered from the very beginning. The question is whether more general architectural building blocks exist that can be applied at an even earlier stage, i.e., during requirement analysis and architectural design. Note that this notion is a natural continuation of the Privacy by Design philosophy—to include privacy considerations into the entire development process.

These general architectural building blocks are known as Privacy Design Strategies. According to Hoepman [24], a privacy design strategy is on a more general level than a privacy pattern and "describes a fundamental approach to achieve a certain design goal. It has certain properties that allow it to be distinguished from other (fundamental) approaches that achieve the same goal."

Later in the development process, a privacy design strategy can be refined with privacy patterns implementing one or more strategies. Thus, privacy design strategies provide a classification of privacy patterns. When system designers search for a privacy pattern in a collection, they are only interested in the ones implementing their chosen privacy strategy.

Hoepman [24] defines the following eight privacy design strategies.

Minimize: Data minimization is a strategy which insists that the amount of personal information that is processed should be minimal. Data that is not needed for the original purpose should not be collected.
Hide: Hide takes place after data collection. Whereas Minimize forbids the collection of needless information, Hide suggests that any personal data that is processed should be hidden from plain view.
Separate: The approach of the privacy strategy Separate is to process any personal information in a distributed fashion if possible. Thus, interrelationships between personal data vanish, in contrast to centralized processing.
Aggregate: When implementing Aggregate, personal information is processed at a high level of aggregation. This level should only be so high as to remain useful, however. Details that are not needed for the functionality of the service vanish. This process could include statistical aggregation such that the details of identities are blurred.
Inform: The privacy strategy Inform states that data subjects should be adequately informed whenever personal information is processed.
Control: A common requirement of software systems is that data subjects should be in control of the processing of their personal information. Whenever this is ensured, we are dealing with the privacy strategy Control. Hoepman states that he is not aware of any patterns implementing this strategy.
Enforce: Enforce states that a privacy policy that is compatible with legal requirements should be in place and should be enforced.
Demonstrate: The privacy strategy Demonstrate demands that data controllers are able to demonstrate compliance with their privacy policy and any applicable legal requirements. A good example of a pattern implementing this strategy is the use of audits.

In the following sections, privacy design strategies serve as the starting point for our analysis of malicious dark strategies that harm privacy. For defining and documenting malicious patterns, we adapt the idea of privacy patterns and transform it into privacy dark patterns.

3 The Dark Side

The triad of general privacy strategies for high-level privacy requirements, privacy patterns for privacy-aware design processes, and privacy-enhancing technologies for system implementations is commonly acknowledged when building privacy-friendly IT systems.

However, there are other parties that have different agendas when building IT systems. Instead of privacy-friendly solutions, they aim for systems that purposefully and intentionally exploit their users' privacy, for instance motivated by criminal reasons or financially exploitable business strategies.

For the development of our framework, we reverse the evolution of privacy strategies and patterns: First, we define dark strategies as the high-level goals that these parties follow in order to exploit privacy. Next, we derive suitable dark patterns that implement these strategies. We then complement our framework by adding a psychological perspective on how the strategies generally achieve their deceptive and manipulative goals. Note that we do not include a counterpart to privacy-enhancing technologies as part of our framework.

As already clarified in the introduction, the resulting framework is neither intended nor structured as a construction kit for malicious parties. Instead, the framework can be used by privacy researchers and practitioners for detecting, recognizing, analyzing, and documenting malicious strategies and patterns.

When used top-down, the framework supports a privacy analysis of IT systems by raising awareness for malicious strategies and by uncovering corresponding mechanisms. Bottom-up, the framework helps to identify malicious patterns, reveals underlying strategies, and provides pointers for the development of concrete countermeasures.

3.1 Privacy Dark Strategies

We now develop a categorization of privacy dark patterns, analogously to Hoepman's privacy design strategies [24]. As privacy design strategies can be used to categorize privacy patterns by their fundamental approach, the same holds for privacy dark strategies. Hoepman identified eight privacy strategies, namely Minimize, Hide, Separate, Aggregate, Inform, Control, Enforce, and Demonstrate. Based on these strategies, we identify the following privacy dark strategies: Maximize, Publish, Centralize, Preserve, Obscure, Deny, Violate, and Fake, as shown in Table 1. These are used for our categorization of privacy dark patterns in Section 5.
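The metadata-stripping pattern described in Section 2, which implements Hoepman's Minimize strategy, can be sketched in a few lines. The following is our own minimal illustration, not code from the paper; all field names and the whitelist are invented for a hypothetical photo-upload service. Only the fields the service actually needs survive; EXIF-style details such as GPS coordinates are dropped.

```python
# Sketch of the metadata-stripping privacy pattern (Minimize strategy):
# keep only a whitelist of fields that the service actually needs.
# All field names here are invented for illustration.

ALLOWED_FIELDS = {"image_bytes", "title", "upload_time"}

def strip_metadata(upload: dict) -> dict:
    """Return a copy of the upload record containing only whitelisted fields."""
    return {k: v for k, v in upload.items() if k in ALLOWED_FIELDS}

upload = {
    "image_bytes": b"...",
    "title": "Beach sunset",
    "upload_time": "2016-01-16T12:00:00Z",
    # EXIF-style metadata that is unnecessary for providing the service:
    "gps_coordinates": (48.3984, 9.9916),
    "camera_serial": "XZ-1-12345",
    "owner_name": "Alice",
}

clean = strip_metadata(upload)
print(sorted(clean))  # only the whitelisted fields remain
```

The same whitelist-over-blacklist design choice generalizes: enumerating what may be kept fails safe when new metadata fields appear, whereas enumerating what must be removed fails open.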
The privacy design strategy Minimize, for example, demands the amount of processed data to be restricted to the minimal amount possible. The corresponding dark strategy Maximize would collect, store, and process as much data as possible, leading to a loss of privacy. The system designer does not act out of pure maliciousness but to gain an advantage over a system with the same functionality but with stronger privacy protection, specifically by receiving additional personal data which can, e.g., be sold or used for personalized advertisements.

In the following, we detail the eight privacy dark strategies we have developed.

Table 1. Privacy Strategies vs. Dark Strategies.

    Hoepman        Dark Strategies
    Minimize       Maximize
    Hide           Publish
    Separate       Centralize
    Aggregate      Preserve
    Inform         Obscure
    Control        Deny
    Enforce        Violate
    Demonstrate    Fake

… this dark strategy to encourage the sharing of personal data and thus the use of their platform. This strategy satisfies a person's need to belong, as will be explained in Section 4.

Centralize. Centralize is the dark strategy associated with the privacy strategy Separate, which mandates that personal data should be processed in a distributed way. Centralize, in contrast, enforces that…

    Personal data is collected, stored, or processed at a central entity.

This strategy preserves the links between the different users and thus allows for a more complete picture of their habits and their usage of the service.

Advertising networks employ this strategy heavily by sharing pseudonymous user IDs, a practice known as cookie syncing [1]. Another common occurrence of this privacy dark strategy is the practice of Flash cookies, which are cookies stored centrally by the Flash plug-in on the file system and are thus not restricted to a specific web browser.
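The privacy impact of shared identifiers can be made concrete with a small sketch of our own (the trackers, IDs, and visited sites are invented, and this is not an implementation of any real ad-network protocol): once two trackers hold a synced user ID, their independently collected records can be joined into a single profile, whereas distinct per-service pseudonyms would keep the records unlinkable.

```python
# Illustration of why Centralize / cookie syncing erodes privacy:
# a shared identifier lets independently collected records be joined.
# All data here is invented for the example.

tracker_a = [{"uid": "u42", "visited": "news-site"}]
tracker_b = [{"uid": "u42", "visited": "health-forum"}]

def join_profiles(*logs):
    """Merge records from several trackers by their shared user ID."""
    profile = {}
    for log in logs:
        for record in log:
            profile.setdefault(record["uid"], []).append(record["visited"])
    return profile

print(join_profiles(tracker_a, tracker_b))
# With a synced ID, "u42" links news reading to health interests.
# With separate pseudonyms (e.g., "a17" vs. "b93"), no such join is possible.
```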
Deny. Patterns making use of the dark strategy Deny make a data subject lose control of their personal data. The term Deny is due to a denial of control.

    Data subjects are denied control over their data.

With this dark strategy, a service provider can prevent users from taking actions that oppose that service provider's interest. An example is not providing any functionality for deleting an account. Another example is the nonexistence of options to control the sharing of information. Until recently, this was the case in WhatsApp, where the online status was automatically shared with everyone who subscribed to that phone number, which has a big impact on the privacy of users [12].

Violate. The strategy Violate occurs if…

    A privacy policy presented to the user is intentionally violated.

A privacy policy is in place and shown to the user, but intentionally not kept. The users are unaware of the violation; thus, this does not impact the trust put into that service as long as such violations are not revealed. It is hard to find concrete examples and patterns implementing this strategy, since using this strategy is against the law and not publicly admitted by companies.

Fake. The privacy dark strategy Fake means that…

    An entity collecting, storing, or processing personal data claims to implement strong privacy protection but in fact only pretends to.

Examples of this strategy are self-designed padlock icons or privacy seals, which make the user feel secure but do not have any meaning. Other examples are wrong and unsubstantiated claims, such as an unrealistic claim about the key size of ciphers, or marketing terms like "military grade encryption".

Synthesis

Our eight Privacy Dark Strategies can be summarized as follows:
– Maximize: The amount of personal data that is collected, stored, or processed is significantly higher than what is actually needed for the task.
– Publish: Personal data is published.
– Centralize: Personal data is collected, stored, or processed at a central entity.
– Preserve: Interrelationships between different data items should not be affected by processing.
– Obscure: It is hard or even impossible for data subjects to learn how their personal data is collected, stored, and processed.
– Deny: Data subjects are denied control over their data.
– Violate: A privacy policy presented to the user is intentionally violated.
– Fake: An entity collecting, storing, or processing personal data claims to implement strong privacy protection but in fact only pretends to.

3.2 Privacy Dark Patterns

After our exploration of privacy dark strategies, we now define the concept of a privacy dark pattern. As mentioned in Section 2, a pattern describes a generic, reusable building block to solve a recurring problem and hence to document best practices. Patterns can be collected in special catalogs and allow for easy replication. They fulfill the role of a common language that allows system developers and privacy engineers to communicate more efficiently.

We argue that there exist common building blocks that service providers use to deceive and mislead their users. Some service providers use recurring patterns to increase the collection of personal data from their users. Sometimes these building blocks are used unintentionally, simply constituting usage of privacy anti patterns, without any malicious intent. However, we claim that there are building blocks which are used on purpose, thereby yielding an advantage to the service provider. We call these building blocks privacy dark patterns.

Analogously to privacy patterns, privacy dark patterns can be collected in special repositories to facilitate easy access and retrievability for users and to develop countermeasures. Patterns are usually documented in a formalized template to enable system developers to easily reference and use them. Common fields in such a template include the name of the pattern, the problem the pattern solves, and references to related patterns.

However, current templates for design and privacy patterns are not suitable for documenting privacy dark patterns, for the following reasons:
1. Privacy patterns and privacy dark patterns have a different intent regarding their documentation. Each privacy pattern solves a specific problem, which is often mentioned as a separate field in the template. Privacy patterns are documented to be copied and used. The purpose of documenting privacy dark patterns, on the other hand, is to create and enhance awareness of common anti-privacy techniques, since they do not solve an engineering problem. Thus, a problem-centric description is out of place.
2. The target group of privacy patterns are system designers, whereas privacy dark patterns can target non-technical end-users to educate them about the strategies that are used to deceive them.

Thus, we need a different template to document privacy dark patterns.

Our Privacy Dark Pattern Template

Examples/Known Uses: In this section, implementations using the dark pattern are described. Service providers applying the privacy dark pattern belong in this field. Screenshots of usage of the dark pattern can be provided where appropriate.
Related Patterns: If related privacy dark patterns exist, they are referenced here.
Psychological Aspects: This field describes the psychological mechanisms that make the pattern effectively influence users' behavior.
Strategies: In this part of the documentation of a privacy dark pattern, the used dark strategy is provided. These are the dark strategies explained in Section 3.1.

This template can be used to systematically document different privacy dark patterns in a repository. We make use of this template later in Section 5.
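The template fields map naturally onto a structured record, which is one way a machine-readable repository could store entries. The sketch below is our illustration, not part of the paper: the class and the example entry are hypothetical, and the field names simply mirror the template fields described above.

```python
from dataclasses import dataclass, field

# Hedged sketch: the privacy dark pattern documentation template as a
# record. Field names follow the template in the text; the example entry
# below is invented for illustration.

@dataclass
class PrivacyDarkPattern:
    name: str
    examples_known_uses: list = field(default_factory=list)
    related_patterns: list = field(default_factory=list)
    psychological_aspects: str = ""
    strategies: list = field(default_factory=list)  # dark strategies, Sec. 3.1

entry = PrivacyDarkPattern(
    name="Forced Registration",  # hypothetical example entry
    examples_known_uses=["mobile app sign-up walls"],
    psychological_aspects="exploits System 1 processing",
    strategies=["Maximize"],
)
print(entry.name, entry.strategies)
```

Storing entries as structured records rather than free text would let a repository filter patterns by dark strategy, which is exactly the classification role the strategies play.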
Instead of agreeing to general terms and conditions quickly and automatically, one can take the time and make the effort to carefully read the information provided. Afterwards, one deliberatively weighs the pros and cons and decides whether to agree to the conditions or not. This is an example of a System 2 thinking process; it takes place in a controlled, conscious, and effortful way. Behavior based on System 2 thinking is driven by a deliberative, effortful decision-making process, resulting in the relatively slow execution of behavior [29, 43, 44].

General terms and conditions are often not read, and agreement is typically given automatically and quickly, i.e., System 1 operates. There is thus an opportunity to fill general terms and conditions with dark ingredients. These, in turn, are not consciously noticed when users are in System 1 mode, as illustrated in the example. In general, we postulate that privacy dark strategies work well when individuals process information using System 1 thinking. When (dark) information is processed quickly, without much effort, and automatically, it seems likely that privacy dark strategies can unleash their full impact. In other words, in System 1 mode, subjects are likely to be less conscious of privacy dark patterns being at work and unable to deliberatively act against them. Recognizing privacy dark strategies and taking action against them, on the other hand, requires System 2 processing.

Past research in fact shows the importance of cognitive information processing for privacy issues (e.g., [8, 32, 34]). Knijnenburg and colleagues [31], for instance, document that people automatically provide personal information on website forms when an autocompletion feature fills out forms by default with previously stored values. Reducing the impact of this automatic (System 1 based) default completion by giving users control over which forms are filled out reduces the amount of personal information provided.

A number of conditions determine whether humans rely on System 1 thinking while System 2 thinking processes are inhibited. There are two central aspects to consider [16, 39]. Humans engage in System 1 processing whenever they (a) have little motivation to think and reason in an effortful way or (b) have no opportunity to do so because they lack the required knowledge, ability, or time. Users, for instance, often have no motivation to read general terms and conditions. In instances where they are motivated, they often do not have the opportunity to use System 2 thinking, because the language used in general terms and conditions is often too complicated and subjects are unable to interpret this information [35].

4.1 Prompting System 1 Thinking

As argued above, privacy dark strategies are typically accompanied by System 1 thinking processes, while System 2 thinking processes are often not possible, as shown in the following analysis. Regarding the dark strategy Maximize, the amount of data that is processed is significantly higher than the data that is really needed for the task. Subjects need high motivation to resist excessive data collection. Additionally, although some users might have high motivation, they need specific knowledge and abilities to offer any resistance. However, some service providers use mandatory form fields for user registration, which renders the knowledge to circumvent the dark strategy useless if one wants to use the service. Thus, users often stay in System 1 mode and allow Maximize to operate.

When personal data is not hidden from plain view (Publish), users need to be motivated and able to change settings. Users might lack the necessary motivation and ability to do so, thus remaining in System 1 processing when it comes to, for instance, privacy settings.

Working against the dark strategies of centralizing personal data (Centralize) and of providers that interrelate data items (Preserve) requires particularly high motivation as well as extensive knowledge of, and the ability to understand, these strategies. It is reasonable to assume that the typical user often does not have the knowledge and ability to precisely understand the dark strategies of Centralize and Preserve and to work against them (e.g., taking action against data preservation). Thus, users often cannot engage in the deliberative processing that might lead to behavior that challenges these two dark strategies.

The dark strategy Obscure reflects the idea that it is difficult for users to learn about what happens to their personal data. This strategy implies that users must be highly motivated and able to acquire information about how their personal data is used and stored. Again, this requirement inhibits users from engaging in deliberative processing.

Analogously, when users' control of their data is denied (Deny), they must be highly motivated and able to work against this strategy. Deny makes it even more difficult for users to notice the violation of privacy policies and legal requirements (Violate). Here, high motiva-
(status January 16, 2016). As shown in Figure 2, users' motivation to unsubscribe is challenged by activating their need to belong and the presentation of the social capital they would lose once they unsubscribe [7, 26].

In sum, people provide and share private information based on their need to belong. Therefore, the need to belong may run counter to high privacy standards.

4.3 Specific Mechanisms

In summary, privacy dark strategies often work well because they take advantage of human beings' psychological constitution. We argue that System 1 thinking and the need to belong are so fundamental for malicious privacy mechanisms to work that both aspects represent the basis of psychological considerations in our framework. Furthermore, we believe that both aspects are helpful for contributors when briefly assessing potential privacy dark patterns and their psychological mechanics.

The discussion whether a pattern is to be regarded as a dark pattern can then easily integrate the users' perspective. This psychological perspective complements the assessments of the actual impacts of the pattern and the suspected motives of the service providers. This is important in order to differentiate actual privacy dark patterns with malicious intent from other forms of poorly implemented or unintended features regarding privacy.

Apart from the thinking and reasoning processes and the need to belong mentioned before, arbitrary patterns may exploit more specific psychological mechanisms which build upon these fundamental aspects. In the following, we introduce some of these mechanisms and indicate their usage for privacy dark patterns.

First, we focus on nudging, a concept for influencing decision making based on positive reinforcement and non-forced compliance [45]. Nudging has already been applied to decision making in the domain of privacy protection [3]. For instance, regular nudges that provide a

and it provides a skip option. Still, the form design latently manipulates the user by encouraging the creation of a user account.

A stronger form of manipulation is achieved by applying traditional persuasion techniques [13]. For instance, the so-called "door in the face" technique takes advantage of the principle of reciprocity. In this technique, the refusal of a large initial request increases the likelihood of agreement to a second, smaller request. This technique has already been studied in the context of private information disclosure [4] and privacy user settings [30]. Applied to privacy dark strategies, a service provider might intentionally ask users for disproportionate amounts of personal data. By providing an option to skip the first form and then only asking for a very limited set of personal data in the second form (e.g., mail address only), users may be more willing to comply and to provide that information after all.

Closely related to the two cognitive systems, heuristics and biases provide decision guidance in case of uncertainty [47]. Although there are many heuristics and cognitive biases related to decision making [29] that could be exploited by privacy dark patterns, we will only introduce an exemplary bias that we later use in one of our example patterns: Hyperbolic discounting [33] is a bias causing humans to value rewards inconsistently over time. Also known as present bias, it tricks humans into favoring a present reward over a similar reward at a later point in time. In terms of privacy, many users tend to focus on the instant gratification of an immediate reward when they are forced to provide personal data to use a service. At the same time, the users discount the ramifications of privacy disclosures in the future [2].

Cognitive dissonance [17] is a state of discomfort caused by contradictory beliefs and actions. According to the theory of cognitive dissonance, the experience of inconsistency triggers a reduction of dissonance and a potential modification of the conflicting cognition. In terms of privacy dark patterns, this process can be exploited by inconspicuously providing justification argu-
user with information about data collection of smart- ments for sugarcoating user decisions that have nega-
phone applications have shown to increase awareness tively affected their privacy. For instance, after asking
and motivate users to reassess the applications’ permis- users for inappropriate amounts of personal data, a ser-
sions [6]. When the good intents of privacy nudging are vice provider would later remind the users of the high
replaced with malicious intents, the concept turns into data protection standards they comply with. When a
a latent manipulation technique for non-forced compli- user hesitantly provides personal data although they are
ance with weakened privacy. The dark counterpart pro- generally very cautious regarding personal information,
vides choice architectures facilitating decisions that are a state of discomfort may emerge soon after. Such hints
negative to the user’s privacy. For instance, the starting may then influence the dissonance resolution of the user.
screen in Figure 1 does not force an account creation
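The present bias induced by hyperbolic discounting can be made concrete with the common one-parameter discount function V = A / (1 + kD), where A is the reward, D the delay, and k the discount rate. The following sketch uses an illustrative k (not taken from the paper) to show the characteristic preference reversal:

```python
def hyperbolic_value(amount, delay_days, k=1.0):
    """Perceived present value under hyperbolic discounting: A / (1 + k*D)."""
    return amount / (1.0 + k * delay_days)

# Choice 1: a reward now vs. a slightly larger reward tomorrow.
now_small      = hyperbolic_value(100, 0)   # 100.0
tomorrow_large = hyperbolic_value(110, 1)   # 55.0  -> the immediate reward wins

# Choice 2: the same pair of rewards, shifted 30 days into the future.
later_small = hyperbolic_value(100, 30)     # ~3.23
later_large = hyperbolic_value(110, 31)     # ~3.44 -> now the larger reward wins

assert now_small > tomorrow_large and later_large > later_small
```

A service exploiting this bias only needs to make the gratification immediate and the privacy cost distant: the data disclosure happens now, its ramifications are heavily discounted.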
5.4 Hidden Legalese Stipulations

Name/Aliases: Hidden Legalese Stipulations
Context: This pattern can be used by all systems which incorporate a document describing the terms and conditions of using the service.
Description: Terms and conditions are mandatory by law. Nevertheless, most users do not read them, since they are often long and written in complicated legal jargon. This legal jargon is necessary to provide succinctness and clarity, but it is not user-friendly.
The inability of the user to grasp the legal jargon puts them in a vulnerable state, since the policy is legally binding. If this vulnerability is exploited, the policy turns into an instance of a privacy dark pattern. Service providers can hide stipulations in the policies which target the privacy of the user. Often the user will not notice this, either not reading the terms and conditions or being unable to understand their implications. Some service providers state that they will change their policies without further notice, preventing the user even further from learning what happens to their data.
Effect: Usage of this pattern enables the service provider to hide its malicious deeds from the user without necessarily violating legal regulations.
Countermeasures: There are various proposals for easier communication of legal conditions.
One solution is to make the legal conditions fully machine-readable. This was the approach followed by P3P, the Platform for Privacy Preferences Project. P3P is a W3C standard6 for a machine-readable rendering of privacy policies. The basic idea is that an XML file specifying the privacy policy can be retrieved from any participating web page. This policy can then automatically be checked against the preferences of the user by the browser.
The Privacy Bird7, for example, was a tool which could show the P3P description as an icon, namely a bird. The color of the bird, i.e., red or green, signified whether the policy of the site matched the user's preferences.
The drawback of this approach is that the service provider needs to provide the machine-readable P3P description. A malicious service provider who wants to trick users with hidden legal stipulations will of course not provide such a description. Since this countermeasure depends on the collaboration of the service provider, it is not effective.
Another approach is the one followed by the Terms of Service; Didn't Read (TOSDR8) webpage. This is a community-driven repository of ratings of privacy policies. TOSDR is available as a browser add-on and shows the rating of the terms of service of the current web page as a small icon. Clicking on the icon reveals the positive and negative points of the terms of service in easily understandable language.
Examples/Known Uses: In 2000, the then-popular instant messenger service ICQ introduced a "Terms Of Service — Acceptable Use Policy"9 which granted the service operators the copyright on all information posted by their users. Hidden in this legalese, the operators granted themselves further rights of use, "including, but not limited to, publishing the material or distributing it".
The British firm GameStation owns the souls of 7,500 online shoppers, thanks to an "immortal soul clause"10 in its terms and conditions. This April Fool's gag demonstrates the effectiveness of this pattern and shows that companies can hide practically anything in their online terms and conditions. Please note that McDonald and Cranor [36] calculated that reading all the privacy policies one encounters in a year would take 76 work days.
Related Patterns: n/a
Psychological Aspects: Even if the user is motivated to read the terms and conditions, the missing opportunity to comprehend all details makes System 1-based processing more probable.
Strategies: Obscure

6 [Link]
7 [Link]
8 [Link]
9 [Link]//[Link]/legal/[Link]
10 [Link] [Link]

5.5 Immortal Accounts

Name/Aliases: Immortal Accounts
Context: Many services require user accounts, either because they are necessary for service fulfilment, or because user accounts represent a benefit for the service.
Description: The service provider requires new users to sign up for accounts to use the service. Once users decide to stop using the service, they might want to delete their accounts and associated data. However, the service provider prevents the user from doing so, either by unnecessarily complicating the account deletion experience or by not providing any account deletion option
at all. Additionally, the service provider might trick the user in the deletion process by pretending to delete the entire account while still retaining (some) account data.
Effect: When the user interface makes the account deletion options hard to access, the barrier to deleting the account is raised. If users are required to call customer support, the process is even more cumbersome. Both of these deliberately inconvenient user experiences may cause the user to reconsider the actual deletion decision. A deletion process in which the service provider claims to remove the account, but instead just flags the user records as deleted while still keeping the data, gives the user a false feeling of deletion.
Countermeasures: Online resources such as justdelete.me11 or accountkiller.com12 curate lists of service providers and their policies towards account removal. They provide step-by-step tutorials for users on how to delete an account at those providers. If the service to be used is known for a non-delete policy but requires a user account, the usage of a throwaway account with incorrect data should be considered.
Examples/Known Uses: As of February 2016, the community-curated data set of [Link] lists 474 services. 75 of these services do not provide the possibility to delete the account at all, and 100 services require contacting customer support. Of the remaining 299 services listed, another 31 have a non-trivial deletion process that requires additional steps.
Related Patterns: The creation of accounts can be required due to Forced Registration.
Psychological Aspects: When the service provider renders the user experience for account deletion deliberately painful, users might struggle in the process. If the user wants to delete the account but fails to do so, cognitive dissonance may emerge. As a result, the user could then reduce the inconsistent mental state by reconsidering their original intent and deciding not to delete the account.
Strategies: Deny, Obscure

5.6 Address Book Leeching

Description: When the user imports a contact list, the service executes a lookup against its own database. It then provides connection suggestions to the user. However, the service provider stores the full list of contacts as internal data records for further processing, including purposes that were not initially declared.
Effect: Using an import feature may lead to exposing unwanted information, specifically the contents of personal address books, to third parties. A potential usage of such information is the dispatch of invitations or other advertisements, at worst even in the name of the original uploader and without consent. Service providers may also misuse such data for profiling and tracking individuals who do not yet possess a user account.
Countermeasures: If it is unknown or unclear how a service provider handles and processes imported contact lists, such a feature should be avoided. Many mobile and desktop operating systems allow users to deny applications access to address book data. Users should routinely deny such access unless it is definitely required or in their interest to share those data.
Examples/Known Uses: In 2008, the social book cataloging website [Link] attracted negative attention for unsolicited invite emails based on its address book import feature. The experiences of customers and the reactions of the service provider are still available on a customer support page13. Based on a misleading upload form design, users thought they would only provide contacts for matching against goodreads' user base. Instead, goodreads sent invite emails to persons whose mail addresses were not yet registered at goodreads, referring to the user who had provided the address.
Related Patterns: This pattern is a potential source of information for Shadow User Profiles.
Psychological Aspects: Trading personal information for instant connections to friends or known contacts is motivated by the need to belong.
Strategies: Maximize, Preserve

11 [Link]
12 [Link]
13 [Link]goodreads_trick_me_into_spamming_my_entire_address_book
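The lookup-and-retain behavior of Address Book Leeching can be sketched in a few lines (a hypothetical in-memory service; all names are illustrative):

```python
# Sketch of Address Book Leeching: the user-visible result is just a list of
# matches, but the service silently retains every uploaded contact.
registered_users = {"carol@example.org", "dave@example.org"}
retained_contacts = []  # grows with every import, including non-members

def import_address_book(uploader, contacts):
    # Undeclared side effect: keep the full upload for later processing
    # (profiling, invite mails "from" the uploader, shadow profiles, ...).
    retained_contacts.extend((uploader, contact) for contact in contacts)
    # Declared purpose: suggest connections among existing members.
    return [contact for contact in contacts if contact in registered_users]

matches = import_address_book("alice", ["carol@example.org", "eve@example.org"])
print(matches)                  # only the match is shown to the user
print(len(retained_contacts))   # but both uploaded contacts were stored
```

Note that the second address belongs to a non-member, yet the provider now holds it linked to the uploader, which is exactly the raw material for the Shadow User Profiles pattern discussed next.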
opted in for a user account and an associated profile, the service provider may collect information and keep records about individuals that do not use the service. For instance, in a social network, the social graph can be supplemented with persons that are not members of the network, but are known to the network based on data from members (e.g., imported address books, content metadata, or mentions). Such non-members enrich the graph and improve the quality of algorithms such as contact suggestions.
Effect: The service provider stores and processes information on individuals without their knowledge or consent. The affected individuals are not aware of personal data records they have accidentally created or that have been provided by third parties.
Countermeasures: While it is possible to minimize one's own data trail, the accidental release of personal data through third parties cannot always be prevented.
Examples/Known Uses: The basic mechanism of shadow user profiles fuels the entire online advertisement industry. Although not verifiable, social networks may store information about non-users. This notion is based on the experiences of newly registered users of social networks who received accurate friendship suggestions without ever having interacted with these persons on the social network before.
Related Patterns: Address Book Leeching is a potential source of information for this pattern.
Psychological Aspects: Given the fact that this pattern operates without any knowledge of the affected users, it does not target any psychological aspects.
Strategies: Maximize, Preserve, Centralize

6 Conclusions

In this paper, we introduce the concepts of privacy dark strategies and privacy dark patterns. Both are based on the idea that actors intentionally manipulate people into providing their personal data for collection, storage, and processing against their original intent and interest.

Documenting such strategies and patterns is a vital first step towards a better recognition of such activities, e.g., on the Internet or in mobile apps. Our eight privacy dark strategies Maximize, Publish, Centralize, Preserve, Obscure, Deny, Violate, and Fake provide a coarse categorization for the subsequent patterns. Privacy dark patterns are documented using a uniform template. Beyond a mere description of the pattern, the template contains countermeasures and a psychological viewpoint that explains why the pattern is effective.

We extensively discussed psychological aspects in Section 4. Understanding the psychological mechanisms triggered by privacy dark patterns is of crucial importance, as it allows affected users to take appropriate countermeasures.

Based on our privacy dark pattern framework and the extensive discussion of the related concepts, we briefly presented seven such patterns, including some concrete examples. These patterns and more are available in extended form via an online privacy dark pattern portal, [Link]. We have set up this portal for the community to study and discuss existing patterns and to contribute new ones.

Acknowledgements

The authors would like to thank the anonymous reviewers for their valuable comments and suggestions to improve the quality of the paper. They are also grateful to Yllka Thaqi and Florian Oberlies for insightful remarks and fruitful discussions.

References

[1] G. Acar, C. Eubank, S. Englehardt, M. Juarez, A. Narayanan, and C. Diaz, "The Web never forgets: Persistent tracking mechanisms in the wild," in Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security. ACM, 2014, pp. 674–689.
[2] A. Acquisti, "Privacy in electronic commerce and the economics of immediate gratification," in Proceedings of the 5th ACM Conference on Electronic Commerce. ACM, 2004, pp. 21–29.
[3] ——, "Nudging privacy: The behavioral economics of personal information," IEEE Security & Privacy, vol. 7, no. 6, pp. 82–85, 2009.
[4] A. Acquisti, L. K. John, and G. Loewenstein, "The impact of relative standards on the propensity to disclose," Journal of Marketing Research, vol. 49, no. 2, pp. 160–174, 2012.
[5] C. Alexander, S. Ishikawa, and M. Silverstein, A Pattern Language: Towns, Buildings, Construction (Center for Environmental Structure Series). Oxford University Press, 1977.
[6] H. Almuhimedi, F. Schaub, N. Sadeh, I. Adjerid, A. Acquisti, J. Gluck, L. F. Cranor, and Y. Agarwal, "Your location has been shared 5,398 times!: A field study on mobile app privacy nudging," in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ser. CHI '15. New York, NY, USA: ACM, 2015, pp. 787–796.
[7] Y. Amichai-Hamburger and E. Ben-Artzi, "Loneliness and internet use," Computers in Human Behavior, vol. 19, no. 1, pp. 71–80, 2003.
[8] C. M. Angst and R. Agarwal, "Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion," MIS Quarterly, vol. 33, no. 2, pp. 339–370, 2009.
[9] R. F. Baumeister and M. R. Leary, "The need to belong: Desire for interpersonal attachments as a fundamental human motivation," Psychological Bulletin, vol. 117, no. 3, pp. 497–529, 1995.
[10] K. Beck and W. Cunningham, "Using pattern languages for object oriented programs," in Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA), 1987.
[11] H. Brignull, "Dark Patterns: fighting user deception worldwide," [Link], accessed: 2016-01-24.
[12] A. Buchenscheit, B. Könings, A. Neubert, F. Schaub, M. Schneider, and F. Kargl, "Privacy implications of presence sharing in mobile messaging applications," in Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia. ACM, 2014, pp. 20–21.
[13] R. Cialdini, Influence: The Psychology of Persuasion. New York: Morrow, 1993.
[14] N. Doty and M. Gupta, "Privacy Design Patterns and Anti-Patterns," in Trustbusters Workshop at the Symposium on Usable Privacy and Security, 2013.
[15] N. B. Ellison, C. Steinfield, and C. Lampe, "The benefits of Facebook "friends:" Social capital and college students' use of online social network sites," Journal of Computer-Mediated Communication, vol. 12, no. 4, pp. 1143–1168, 2007.
[16] R. H. Fazio, "Multiple processes by which attitudes guide behavior: The MODE model as an integrative framework," Advances in Experimental Social Psychology, vol. 23, pp. 75–109, 1990.
[17] L. Festinger, A Theory of Cognitive Dissonance. Stanford University Press, 1962, vol. 2.
[18] M. Fowler, Patterns of Enterprise Application Architecture. Boston: Addison-Wesley Professional, 2003.
[19] E. Gamma, R. Helm, R. Johnson, and J. Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software. Pearson Education, 1994.
[20] H. Gangadharbatla, "Facebook me: Collective self-esteem, need to belong, and internet self-efficacy as predictors of the iGeneration's attitudes toward social networking sites," Journal of Interactive Advertising, vol. 8, no. 2, pp. 5–15, 2008.
[21] S. Gürses, C. Troncoso, and C. Diaz, "Engineering privacy by design," Computers, Privacy & Data Protection, vol. 14, 2011.
[22] M. Hafiz, "A collection of privacy design patterns," in Proceedings of the 2006 Conference on Pattern Languages of Programs. ACM, 2006, p. 7.
[23] E. T. Higgins, Beyond Pleasure and Pain: How Motivation Works. Oxford University Press, 2011.
[24] J.-H. Hoepman, "Privacy Design Strategies," CoRR, vol. abs/1210.6621, 2012.
[25] G. Hohpe and B. Woolf, Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions, 1st ed. Boston: Addison-Wesley Professional, 2004.
[26] D. J. Hughes, M. Rowe, M. Batey, and A. Lee, "A tale of two sites: Twitter vs. Facebook and the personality predictors of social media usage," Computers in Human Behavior, vol. 28, no. 2, pp. 561–569, 2012.
[27] P. Hustinx, "Privacy by design: delivering the promises," Identity in the Information Society, vol. 3, no. 2, pp. 253–255, 2010.
[28] T. Jones, "Facebook's 'evil interfaces'," [Link]org/de/deeplinks/2010/04/facebooks-evil-interfaces, accessed: 2016-02-25.
[29] D. Kahneman, Thinking, Fast and Slow. Macmillan, 2011.
[30] B. P. Knijnenburg and A. Kobsa, "Increasing sharing tendency without reducing satisfaction: Finding the best privacy-settings user interface for social networks," in Proceedings of the International Conference on Information Systems - Building a Better World through Information Systems, ICIS 2014, Auckland, New Zealand, December 14-17, 2014, 2014.
[31] B. P. Knijnenburg, A. Kobsa, and H. Jin, "Counteracting the negative effect of form auto-completion on the privacy calculus," in Thirty Fourth International Conference on Information Systems, Milan, 2013.
[32] A. Kobsa, H. Cho, and B. P. Knijnenburg, "The effect of personalization provider characteristics on privacy attitudes and behaviors: An elaboration likelihood model approach," Journal of the Association for Information Science and Technology, 2016, in press.
[33] D. Laibson, "Golden eggs and hyperbolic discounting," The Quarterly Journal of Economics, vol. 112, no. 2, pp. 443–478, 1997.
[34] P. B. Lowry, G. Moody, A. Vance, M. Jensen, J. Jenkins, and T. Wells, "Using an elaboration likelihood approach to better understand the persuasiveness of website privacy assurance cues for online consumers," Journal of the American Society for Information Science and Technology, vol. 63, no. 4, pp. 755–776, 2012.
[35] E. Luger, S. Moran, and T. Rodden, "Consent for all: Revealing the hidden complexity of terms and conditions," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2013, pp. 2687–2696.
[36] A. M. McDonald and L. F. Cranor, "The cost of reading privacy policies," ISJLP, vol. 4, p. 543, 2008.
[37] A. Nadkarni and S. G. Hofmann, "Why do people use Facebook?" Personality and Individual Differences, vol. 52, no. 3, pp. 243–249, 2012.
[38] N. Notario, A. Crespo, Y.-S. Martín, J. M. Del Alamo, D. Le Métayer, T. Antignac, A. Kung, I. Kroener, and D. Wright, "PRIPARE: Integrating Privacy Best Practices into a Privacy Engineering Methodology," in Security and Privacy Workshops (SPW), 2015 IEEE. IEEE, 2015, pp. 151–158.
[39] R. E. Petty and J. T. Cacioppo, The Elaboration Likelihood Model of Persuasion. Springer, 1986.
[40] S. Romanosky, A. Acquisti, J. Hong, L. F. Cranor, and B. Friedman, "Privacy patterns for online interactions," in Proceedings of the 2006 Conference on Pattern Languages of Programs. ACM, 2006, p. 12.
[41] M. Schumacher, "Security patterns and security standards," in EuroPLoP, 2002, pp. 289–300.