2013 Third Workshop on Socio-Technical Aspects in Security and Trust, 2013
As the amount of users' information collected and exchanged on the Internet grows, so, consequently, do users' concerns that their privacy might be violated. Some studies have shown that a large number of users avoid engaging in online services due to privacy concerns. It has been suggested that increased transparency of privacy-related mechanisms may promote users' trust. This paper reviews the relationship between users' privacy concerns and transparency-enhancing and privacy-enhancing mechanisms on the one hand, and users' trust on the other, based on the existing literature. Our literature review demonstrates that previous studies have produced inconsistent results, implying that this relationship should be reexamined in future work. The impact of higher transparency on users' trust has been insufficiently studied. Current research seems to suggest that a better understanding of privacy issues increases the importance of privacy for trust. The use of privacy-enhancing mechanisms by a service provider also seems to promote trust, but this may only hold when these mechanisms are understood by the user. A need for tools that would provide users with this kind of knowledge has also been repeatedly recognized. Additionally, this paper provides an overview and description of the currently available transparency-enhancing tools. To the best of our knowledge, no such overview has been available to date. We demonstrate that the majority of tools promote awareness. Most of them attempt to provide a better understanding of privacy policies, or provide insight into third-party tracking behavior. Two tools have been identified that provide some insight into the user data that has been collected. No tool providing specific information on, or access to, processing logic has been identified.
Abstract The use of technology helps organizations to get closer to customers and to build an ongoing relationship with them. As Internet usage has grown rapidly in recent years, especially in online shopping and payment, a rising number of people have become concerned about online privacy. Users are increasingly concerned about their privacy when they go online, as studies have repeatedly shown. Thus, the way consumer data profiles are compiled may play an important role in assuring that consumer satisfaction and trust are fully taken into account. Customers' perceived risk in the electronic marketplace will be lower if they obtain a sufficient level of satisfaction and trust. Also, in a society with a high level of privacy awareness, the accessibility of privacy policies while navigating a website's pages and the comprehensibility of the policies' contents are considered by organizations seeking to improve their service quality in terms of online privacy. The purpose of this paper is to provide a better understanding of online privacy measurements and to evaluate the usability of online privacy policies. We also examine four aspects of policies and how they can be improved in order to increase user satisfaction and trust. Keywords: Internet, electronic commerce, online privacy system, online trust, customer satisfaction.
International Journal of Information Security, 2007
This study applies Mayer et al.'s [24] trust model to an Internet context. A model is presented linking privacy policy, through trustworthiness, to online trust, and then to customers' loyalty and their willingness to provide truthful information. The model is tested using a sample of 269 responses. The findings suggest that consumers' trust in a company is closely linked with their perception of the company's respect for customer privacy. Trust in turn is linked to increased customer loyalty, which can be manifested through increased purchases, openness to trying new products, and willingness to participate in programs that use additional personal information.
Improving associations between users and web-based systems can make computing tasks easier. User-to-system associations include the amount of time spent using the system, the quality and quantity of data involved, and the user's trust in the system. The strength of these associations is an integral part of a user's online experience and can be improved with a better understanding of the interactions involved. One class of user-to-system interaction, namely understanding and controlling the release of personal information, can heavily influence a user's trust associations with a system. However, most users do not read privacy policy statements, and those who do often do not understand them completely. These statements tend to be lengthy, non-intuitive, and not written in layman's terms. System owners can modify the way privacy policy statements are developed and presented, and the control users have over the use of their personal information, to improve user-to-system trust and to make users' experience with the systems more effective. We analyze several factors that affect user-to-system trust with respect to privacy management and provide an overview of techniques used in building privacy statements and privacy controls. We review existing design models and factors in the context of privacy management and propose methods to improve transparency and control for users. We argue that implementing these models can strengthen user-to-system trust associations and can result in more effective system use.
2018
Today's environment of data-driven business models relies heavily on collecting as much personal data as possible. This is one of the main reasons why privacy-enhancing technologies (PETs) are important for protecting internet users' privacy. Still, PETs remain a niche product used by relatively few users on the internet. We undertake a first step towards understanding the use behavior of such technologies. For that purpose, we conducted an online survey with 141 users of the anonymity service "JonDonym". We use the technology acceptance model as a theoretical starting point and extend it with the constructs perceived anonymity and trust in the service. Our model explains almost half of the variance of the behavioral intention to use JonDonym and of the actual use behavior. In addition, the results indicate that both added variables are highly relevant factors in the path model.
People on the Web are generating and disclosing ever-increasing amounts of data, often without full awareness of who is recording what about them, and who is aggregating and linking pieces of data with context information, for a variety of purposes. Awareness can help users stay informed about what silently happens during their navigation, while learning from disclosures of personal information may help them distinguish potentially harmful activities from the regular activities performed online daily. Our main objective is to study whether a highly customized tool can help users learn the value of privacy from their own behavior and make informed decisions to reduce their degree of exposure. To this aim, we present an evaluation study that analyzes general perceptions, attitudes, and beliefs about online privacy, and explores the resulting behaviors for two different groups of participants from an academic environment.
Computers in Human Behavior, 2016
The majority of Internet users do not read privacy policies because of their lengthy, verbose format, although these policies remain users' main source of information about how their data are collected and used. Consequently, as shown by many studies, users do not trust online services with respect to the use of their private data. Furthermore, they find it unfair that online services use their data to generate revenue without their knowledge or without their benefiting from it. In this paper, our main assumption is that giving users control over their private data, and caring about their interests, would restore users' trust. Based on an empirical model, we conducted an experimental comparative study of user trust by offering two groups of participants the possibility to adhere to a service with a privacy policy presented in one of two different formats: the first a conventional privacy policy, and the second designed according to the privacy policy model studied in this paper. We collected, through a survey, 717 responses from participants. The results show that allowing personalization and management in privacy policies affects user trust and makes online services appear more trustworthy to their users.
Electronic Markets, 2007
International Journal of E-Business Research, 2005
Increasingly, the Internet is used as a common tool for communication, information gathering, and online transactions. Information privacy is threatened as users are expected to reveal personal information without knowing the consequences of sharing it. To that end, research groups, both from academia and industry, have embarked on the development of privacy-enhancing technologies. One such technology is the Platform for Privacy Preferences (P3P). Developed by the World Wide Web Consortium (W3C), P3P has a number of prominent stakeholders such as IBM, Microsoft, and AT&T. Yet, there is little published information on what P3P is and to what extent it is being adopted by e-business organizations. This study is exploratory in nature and aims at addressing these questions; in particular, we look at P3P both as a new technology and as a standard. We use our empirical data on the top 500 interactive companies to assess its adoption.
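Under P3P, a site publishes its privacy policy in a machine-readable XML format, with a policy reference file at a well-known location that user agents can fetch. As a minimal sketch of how a study like this might automate an adoption check across many sites, the snippet below tests whether a domain serves such a reference file; the well-known path /w3c/p3p.xml comes from the P3P specification, while the function name and the surveyed domains are illustrative assumptions.

```python
import urllib.request
import xml.etree.ElementTree as ET

def has_p3p_policy(domain: str) -> bool:
    """Check whether a site publishes a P3P policy reference file
    at the well-known location defined by the W3C P3P specification."""
    url = f"https://{domain}/w3c/p3p.xml"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            # A valid reference file is XML whose root is META (or, in
            # some deployments, POLICY-REFERENCES directly).
            root = ET.fromstring(resp.read())
            return root.tag.endswith(("META", "POLICY-REFERENCES"))
    except Exception:
        # Network errors, 404s, or malformed XML all count as "no policy".
        return False

# Example: survey a list of sites for P3P adoption (domains are placeholders).
for site in ["example.com", "example.org"]:
    print(site, has_p3p_policy(site))
```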
2021
Users reveal privacy sensitive information when they post on social media, which can have negative consequences. To help these users make informed decisions, different tools have been developed that detect privacy sensitive information (PSI) and provide alerts. However, how users would perceive this type of tool has not yet been evaluated. In this position paper, we take the first steps to address this gap by exploring user intention, perceived usefulness, and attitude towards a PSI detection tool. We designed an experiment, showed participants examples of the PSI detection tool alerts, and quantitatively and qualitatively evaluated their responses. The results showed that participants perceived the PSI detection tool as useful and showed positive interest and a low level of concern about it, although they had a neutral level of intention to use the tool. The participants’ open-ended responses revealed that they considered the PSI detection tool useful, but mostly for other people an...
2014 47th Hawaii International Conference on System Sciences, 2014
Trust in the data privacy and security of internet service providers is getting increasing attention, especially in the context of cloud services. Insufficient communication of implemented data privacy and data security measures may lead to a lack of trust. Visualizations are known to have various beneficial effects when used to communicate information. This paper investigates whether providing visualizations as a means of communicating data privacy and security measures has a positive effect on trust. A laboratory experiment was conducted to measure the effect of visualizations on trust in the provider, trust in the measures of the provider, and on information security and privacy concerns. The results indicate that there is a small positive effect on trust in the provider and in its measures, and no significant effect on information security and privacy concerns. The findings provide first insights into the effect of visualizations on trust and lead to several further research questions.
Journal of Retailing, 2006
We explore the impact of privacy disclosures on online shoppers' trust in an e-tailer through a two-phase study. In the first study, we use a between-subjects factorial design to test whether the presence of an online privacy policy influences consumer trust and find that consumers are likely to respond more favorably to a shopping site with a clearly stated privacy message than to one without it, especially when privacy risk is high. In our second experiment, we examine the effects of different forms of privacy disclosures. The results suggest that online shoppers find a short, straightforward privacy statement more comprehensible than a lengthy, legalistic one. However, how a privacy policy is presented (in terms of wording) does not affect a shopper's trust in the store to any significant degree.
Abstract—With the world now becoming a global village, most businesses have over the years migrated online, a practice commonly known as e-commerce. Though this new technology brings a wider market reach and faster marketing for most companies, it has also raised the issue of trust between business owners and their customers. Customers want to be sure that their information will be kept private and that transactions will be conducted accurately and in a timely fashion. The intention of this paper is to discuss the main issues in consumer online privacy, how best to tackle them, and possible technologies to address them, and also to focus on an aspect that has not been researched as much, i.e., the user’s online privacy perception. Index Terms—E-commerce, consumer online privacy, trust, privacy perception.
Government Information Quarterly, 2019
The increasing use of the Internet for service delivery has paralleled an increase in e-service users' privacy concerns, as technology offers ample opportunities for organizations to store, process, and exploit personal data. This may reduce individuals' perceived ability to control their personal information and increase their perceived privacy risk. A systematic understanding of individuals' privacy concerns is important, as negative user perceptions are a challenge to service providers' reputation and may hamper service delivery processes because they influence users' trust and willingness to disclose personal information. This study develops and validates a model that examines the effect of organizational privacy assurances on individual privacy concerns, privacy control and risk perceptions, trust beliefs, and non-self-disclosure behavior. Drawing on a survey of 547 users of different types of e-services (e-government, e-commerce, and social networking) in Rwanda, and working within the framework of exploratory analysis, this study uses partial least squares structural equation modeling to validate the overall model and the proposed hypotheses. The findings show that perceptions of privacy risks and privacy control are antecedents of e-service users' privacy concerns, trust, and non-self-disclosure behavior. They further show that the perceived effectiveness of privacy policy and the perceived effectiveness of self-regulation influence both perceptions of privacy risks and control and their consequences: users' privacy concerns, trust, and non-self-disclosure behavior. The hypotheses are supported differently across the three types of e-services, which means that privacy is specific to context and situation. The study shows that the effect of privacy assurances on trust is different in e-government services than in other services, which suggests that trust in e-government may be more complex and different in nature than in other contexts. The findings serve to enhance a theoretical understanding of organizational privacy assurances and individual privacy concerns, trust, and self-disclosure behavior. They also have implications for e-service providers and users as well as for regulatory bodies and e-service designers.
IFIP – The International Federation for Information Processing
Every time a user uses the Internet, a wealth of personal information is revealed, either voluntarily or involuntarily. This often causes privacy breaches, especially if the information is misused. Ideally, a user would like to make a reasoned decision about whom to release her information to and what to release. For this purpose, we propose using the level of trust that a user has in the recipient with regard to not misusing her private data. To measure this trust level, we adapt the vector model of trust proposed earlier. We formalize a notion of privacy context and show how a privacy context ontology can be used to determine trust values for previously unencountered situations.
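The vector model referred to here represents trust as several components rather than a single number. A minimal sketch, assuming three illustrative dimensions (direct experience, knowledge of the recipient's practices, and third-party recommendation) and context-dependent weights that a privacy context ontology could supply, might collapse the vector into a scalar and compare it against a release threshold; the dimension names, weights, and threshold below are assumptions, not the authors' exact formalization.

```python
from dataclasses import dataclass

@dataclass
class TrustVector:
    # Each component is a score in [-1, 1]: negative means distrust,
    # zero means no information, positive means trust. The three
    # dimensions below are illustrative; the cited vector model may differ.
    experience: float      # direct past interactions with the recipient
    knowledge: float       # what is known about the recipient's practices
    recommendation: float  # third-party opinions about the recipient

def trust_value(v: TrustVector, weights=(0.5, 0.3, 0.2)) -> float:
    """Collapse a trust vector into a scalar in [-1, 1] via a weighted
    sum. The weights are context-dependent: in the paper's setting, a
    privacy context ontology could supply them for previously
    unencountered situations."""
    components = (v.experience, v.knowledge, v.recommendation)
    return sum(w * c for w, c in zip(weights, components))

# Example: deciding whether to release data to a recipient.
recipient = TrustVector(experience=0.6, knowledge=0.4, recommendation=0.8)
if trust_value(recipient) > 0.5:  # illustrative release threshold
    print("release the requested personal data")
else:
    print("withhold the data")
```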
2017
Trends show that privacy concerns are rising, but end users are not armed with enough mechanisms to protect themselves. Privacy enhancing technologies (PETs) or, more specifically, tools (PET-tools) a ...
Journal of Consumer Affairs, 2005
Online privacy is an issue of increasing national importance, and voluntary privacy seals provided by third-party organizations such as TRUSTe and BBBOnline have been proposed as a means of assuring consumer privacy. Few studies have examined privacy seal effects. This study presents the results of an online experiment that evaluated consumer response to privacy seals in a naturalistic exposure setting. Findings suggest that privacy seals enhance trust in the Web site and expectations that the site would inform the user of its information practices. While concern for privacy-threatening information practices had no influence, privacy self-efficacy (confidence in one's ability to protect one's privacy) moderated seal effects. Implications for the continued role of privacy seals are discussed.
Proceedings of the 1st International Workshop on Data Economy
Why is it hard for online users to trust service providers when it comes to their personal data? While users might give away their data when using these services, this does not mean that they necessarily trust these companies. Building trust in online services is particularly relevant as digital economy policy strategies, such as the EU Data Strategy, place a considerable amount of faith in the benefits of a data-driven society. To achieve this goal, transparency should be considered a necessary feature on which trust can be built. According to scholarly literature, the more information provided to data subjects, the less power asymmetry, caused by a lack of knowledge, will exist between them and data controllers. In this respect, transparency around data processing has been, and still is, conveyed through privacy notices. But these are far from being helpful tools for navigating complex data-intensive environments. Technical developments, such as Solid personal datastores, provide fertile ground for the negotiation of privacy terms between the involved parties. But to do so, it is necessary to have clear and transparent processing conditions. However, while certain specifications have been developed to accommodate the representation of privacy terms, there is still a lack of developed solutions to address this problem. With this in mind, we propose the usage of the Privacy Paradigm ODRL Profile (PPOP), which extends ODRL and DPV to specify data processing requirements for personal datastores envisaged as key core elements of the data economy. To demonstrate the usage of PPOP, a set of policy examples is provided, as well as a prototype implementation of a generator of machine- and human-readable PPOP policies.
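ODRL policies are commonly serialized as JSON-LD, and DPV supplies vocabulary for purposes and processing operations. The sketch below constructs one such policy in Python for a hypothetical personal-datastore resource; the odrl and dpv namespaces are the published ones, but the resource URLs and the shape of the permission are illustrative assumptions rather than the actual PPOP profile terms.

```python
import json

# A minimal ODRL-style permission expressed as JSON-LD, restricting
# processing of a personal-datastore resource to a stated purpose.
# The URLs are placeholders; odrl and dpv are the published namespaces.
policy = {
    "@context": {
        "odrl": "http://www.w3.org/ns/odrl/2/",
        "dpv": "https://w3id.org/dpv#",
    },
    "@type": "odrl:Agreement",
    "odrl:permission": [{
        "odrl:target": "https://alice.example/pod/health-data",
        "odrl:action": {"@id": "dpv:Use"},
        "odrl:assigner": "https://alice.example/profile#me",
        "odrl:assignee": "https://service.example/profile#app",
        "odrl:constraint": [{
            "odrl:leftOperand": {"@id": "odrl:purpose"},
            "odrl:operator": {"@id": "odrl:eq"},
            "odrl:rightOperand": {"@id": "dpv:ServiceProvision"},
        }],
    }],
}

# Machine-readable output; a human-readable rendering could be generated
# from the same structure, in the spirit of the paper's prototype generator.
print(json.dumps(policy, indent=2))
```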
Workshop on Privacy, …, 2006
Decision Support Systems, 2012
Despite increased concern about the privacy threat posed by new technology and the Internet, there is relatively little evidence that people's privacy concerns translate to privacy-enhancing behaviors while online. In Study 1, measures of privacy concern are collected, followed 6 weeks later by a request for intrusive personal information alongside measures of trust in the requestor and perceived privacy related to the specific request (n = 759). Participants' dispositional privacy concerns, as well as their level of trust in the requestor and perceived privacy during the interaction, predicted whether they acceded to the request for personal information, although the impact of perceived privacy was mediated by trust. In Study 2, privacy and trust were experimentally manipulated and disclosure measured (n = 180). The results indicated that privacy and trust at a situational level interact such that high trust compensates for low privacy, and vice versa. Implications for un...