2015, Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia
We explore mobile privacy through a survey and through usability evaluations of three privacy-preserving mobile applications. Our survey examines users' knowledge of privacy risks, as well as their attitudes and motivations to protect their privacy on mobile devices. We found that users have incomplete mental models of the privacy risks associated with such devices. Although participants believe they are primarily responsible for protecting their own privacy, there is a clear gap between their perceived privacy risks and the defenses they employ: only 6% of participants use privacy-preserving applications on their mobile devices, yet 83% are concerned about privacy. Our usability studies show that mobile privacy-preserving tools fail to fulfill fundamental usability goals such as learnability and intuitiveness, which are potential reasons for their low adoption rates. Through a better understanding of users' perceptions of and attitudes toward privacy risks, we aim to inform the design of privacy-preserving mobile applications. We look at these tools through users' eyes and provide recommendations to improve their usability and increase user acceptance. CCS Concepts: • Security and privacy → Usability in security and privacy; • Human-centered computing → HCI design and evaluation methods.
People increasingly use mobile applications (apps) for multiple online activities, yet users often do not have access to an app's privacy information when deciding whether to download it. We designed and tested an interface that provides users with information on apps' privacy practices prior to download. Ninety-four participants were asked to rate four app prototypes with varying privacy information. The experiment was a 2 (privacy display: text or visual) x 2 (privacy practices: enhancing or invasive) x 2 (app type: social media or health) factorial design. We found that users rate apps' trustworthiness, and their own comfort entering information, differently based on apps' privacy practices, and that app type influences these perceptions. Privacy information did not affect whether users liked an app or would use or recommend it. 72% of participants found the privacy tab helpful, and we discuss design implications for incorporating privacy information into mobile interfaces prior to download.
International Journal of Advanced Computer Science and Applications, 2021
With the increased use of social networking platforms, especially for sharing sensitive personal information, it has become important for those platforms to provide adequate levels of security and privacy. This research aimed to evaluate the usability of privacy features in the WhatsApp, Twitter, and Snapchat applications. The evaluation was conducted using the structured analysis of privacy (STRAP) framework: seven expert evaluators performed heuristic evaluations, applying the 11 STRAP heuristics to the privacy policy statements and settings of the three applications. This study provides useful information for designers and developers of social media applications, as well as for users of the apps. In particular, the results indicate that Snapchat had the highest severity rating, followed by Twitter and WhatsApp. The most notable severity rating across all the apps concerned the ability to revoke consent, for which every app had a very high number of usability problems.
2009
Abstract: Privacy is a thorny issue that affects all information systems designed for interpersonal interaction and awareness. Theoretical insights regarding privacy and user experience in a variety of systems have produced numerous design principles and guidelines for building systems sensitive to privacy issues. In order to truly improve support for privacy management, the usability of systems that implement these principles is critical. Yet, usability evaluation of privacy designs is a relatively unexplored area.
2014 47th Hawaii International Conference on System Sciences, 2014
Evidence collected from smartphone users shows a growing desire for the personalization offered by mobile services. However, the need to accurately identify users' contexts has important implications for users' privacy, and it increases the amount of trust users are asked to place in service providers. In this paper, we introduce a model that describes the role of personalization and control in users' assessment of the costs and benefits associated with disclosing private information. We present an instantiation of this model: a context-aware application for smartphones based on the Android operating system in which users' private information is protected. Focus group interviews were conducted to examine users' privacy concerns before and after using our application. The results confirm the utility of our artifact and support our theoretical model, which extends previous literature on privacy calculus and user acceptance of context-aware technology.
Personal and Ubiquitous …, 2004
To participate in meaningful privacy practice in the context of technical systems, people require opportunities to understand the extent of the systems' alignment with relevant practice and to conduct discernible social action through intuitive or sensible engagement with the system. It is a significant challenge to design for such understanding and action through the feedback and control mechanisms of today's devices. To help designers meet this challenge, we describe five pitfalls to beware of when designing interactive systems, on or off the desktop, with personal privacy implications. These pitfalls are: obscuring potential information flow, obscuring actual information flow, emphasizing configuration over action, lacking coarse-grained control, and inhibiting existing practice. They are based on a review of the literature, on analyses of existing privacy-affecting systems, and on our own experiences designing a prototypical user interface for managing privacy in ubiquitous computing. We illustrate how some existing research and commercial systems, our prototype included, fall into these pitfalls, and how some avoid them. We suggest that privacy-affecting systems that heed these pitfalls can help users appropriate and engage them in alignment with relevant privacy practice.
Proceedings on Privacy Enhancing Technologies
Smartphone manufacturer-provided default features (e.g., default location services, iCloud, Google Assistant, ad tracking) enhance the usability and extend the functionality of these devices. Prior studies have highlighted smartphone vulnerabilities and how users' data can be harvested without their knowledge. However, little is known about manufacturer-provided default features in this regard: how usable they are to configure during use, and how users perceive them with respect to privacy. To bridge this gap, we conducted a task-based study with 27 Android and iOS smartphone users to learn about their perceptions, concerns, and practices, and to understand the usability of these features with respect to privacy. We explored users' awareness of these features, why and when they change these features' settings, the challenges they face while configuring them, and the mitigation strategies they adopt. Our findings reveal…
International Journal of Human-Computer Studies
The use of mobile applications continues to experience exponential growth. Using mobile apps typically requires the disclosure of location data, which often accompanies requests for various other forms of private information. Existing research on information privacy has implied that consumers are willing to accept privacy risks for relatively negligible benefits, and the offerings of mobile apps based on location-based services (LBS) appear to be no different. However, until now, researchers have struggled to replicate realistic privacy risks within experimental methodologies designed to manipulate independent variables. Moreover, minimal research has successfully captured actual information disclosure over mobile devices based on realistic risk perceptions. The purpose of this study is to propose and test a more realistic experimental methodology designed to replicate real perceptions of privacy risk and capture the effects of actual information disclosure decisions. As with prior research, this study employs a theoretical lens based on privacy calculus. However, we draw more detailed and valid conclusions due to our use of improved methodological rigor. We report the results of a controlled experiment involving consumers (n=1025) in a range of ages, levels of education, and employment experience. Based on our methodology, we find that only a weak, albeit significant, relationship exists between information disclosure intentions and actual disclosure. In addition, this relationship is heavily moderated by the consumer practice of disclosing false data. We conclude by discussing the contributions of our methodology and the possibilities for extending it for additional mobile privacy research.
Proceedings of the conference on Human factors in computing systems - CHI '03, 2003
P2P file sharing systems such as Gnutella, Freenet, and KaZaA, while primarily intended for sharing multimedia files, frequently allow other types of information to be shared. This raises serious concerns about the extent to which users may unknowingly be sharing private or personal information.
Mobile Networks and Applications
Our smartphones are full of applications and data that organize, facilitate, and describe our lives. We install applications for the most varied reasons: to stay informed, to have fun, and for work. Unfortunately, we often install them without reading the terms and conditions of use, and as a result our privacy is increasingly at risk. Considering this scenario, in this paper we analyze users' perception of privacy while using smartphone applications. In particular, we formulate two hypotheses: 1) the perception of privacy is influenced by knowledge of the data used by the installed applications; 2) applications access much more data than they need. The study is based on two questionnaires (within-subject experiments with 200 volunteers) and on the lists of installed apps (30 volunteers). Results show widespread abuse of data related to location, personal contacts, the camera, Wi-Fi network lists, running-app lists, and vibration. An in-depth analysis shows that some features are more relevant to certain groups of users (e.g., adults are mainly worried about contacts and Wi-Fi connection lists; iOS users are sensitive to smartphone vibration; female participants are worried about possible misuse of the smartphone camera).