Proceedings on Privacy Enhancing Technologies
Standardized privacy labels that succinctly summarize the data practices people are most commonly concerned about promise to provide users with more effective privacy notices than full-length privacy policies. With their introduction by Apple in iOS 14 and Google's recent adoption in its Play Store, mobile app privacy labels are for the first time available to users at scale. We report the first in-depth interview study with 24 lay iPhone users investigating their experiences, understanding, and perceptions of Apple's privacy labels. We uncovered misunderstandings of, and dissatisfaction with, the iOS privacy labels that hinder their effectiveness, including a confusing structure, unfamiliar terms, and disconnection from permission settings and controls. We identify areas where app privacy labels might be improved and propose suggestions to address these shortcomings and make the labels more understandable, usable, and useful.
Proceedings of the 9th IEEE/ACM International Conference on Mobile Software Engineering and Systems
Privacy labels provide an easy and recognizable overview of the data collection practices adopted by mobile app developers. On the Apple App Store, privacy labels are displayed on each mobile app's page and summarize what data the app collects, how it is used, and for what purposes it is needed. Since the release of iOS 14.3, developers have been required to provide privacy labels for their applications. We conducted a large-scale empirical study, collecting and analyzing the privacy labels of 17,312 apps published on the App Store, to understand and characterize how sensitive data is collected and shared. The results of our analysis highlight important criticalities in the collection and sharing of personal data for tracking purposes: on average, free applications collect more sensitive data, the majority of data is collected in a non-anonymized form, and a wide range of sensitive information is collected for tracking purposes. The analysis also provides evidence to support the decision-making of users, platform maintainers, and regulators. Furthermore, we repeated the data collection and analysis after seven months, following Apple's introduction of additional run-time tracking controls. Comparing the two datasets, we observed that the newly introduced measures resulted in a statistically significant decrease in the number of apps that collect data for tracking purposes. At the same time, we observed a growth in overall data collection.
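The before/after comparison described above amounts to testing whether the share of apps that collect data for tracking changed significantly between two snapshots. A minimal sketch of such a test, using a standard two-proportion z-test with made-up counts (the study's actual counts and method are not given here):

```python
from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions.

    x1/n1: apps collecting data for tracking in the first snapshot;
    x2/n2: the same count in the second snapshot (all counts hypothetical).
    Returns (z statistic, two-sided p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * P(Z > |z|)
    return z, p_value

# Illustrative counts only: tracking prevalence dropping from ~40% to ~36%
z, p = two_proportion_z_test(6925, 17312, 6232, 17312)
print(z > 0, p < 0.05)
```

With samples this large, even a few percentage points of change yields a very small p-value, which is consistent with the kind of statistically significant decrease the abstract reports.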
People increasingly use mobile applications (apps) for multiple online activities. Users often do not have access to mobile apps' privacy information when deciding whether to download an app. We designed and tested an interface that provides users with information on apps' privacy practices prior to download. 94 participants were asked to rate 4 app prototypes with varying privacy information. The experiment was a 2 (privacy display: text or visual) x 2 (privacy practices: enhancing or invasive) x 2 (app type: social media or health) factorial design. We found that users rate apps' trustworthiness and their own comfort entering information differently based on apps' privacy practices, and that app type influences these perceptions. Privacy information did not affect whether users liked an app or would use or recommend it. 72% of participants found the privacy tab helpful, and we discuss design implications for incorporating privacy information into mobile interfaces prior to download.
Proceedings 2023 Symposium on Usable Security
Proceedings 2021 Network and Distributed System Security Symposium
Various privacy laws require mobile apps to have privacy policies. Questionnaire-based policy generators are intended to help developers with the task of policy creation. However, generated policies depend on the generators' designs as well as developers' abilities to correctly answer privacy questions about their apps. In this study we show that policies generated with popular policy generators are often not reflective of apps' privacy practices. We believe that policy generation can be improved by supplementing the questionnaire-based approach with code analysis. We design and implement PrivacyFlash Pro, a privacy policy generator for iOS apps that leverages static analysis. PrivacyFlash Pro identifies code signatures, composed of Plist permission strings, framework imports, class instantiations, authorization methods, and other evidence, that are mapped to privacy practices expressed in privacy policies. Resources from package managers are used to identify libraries. We tested PrivacyFlash Pro in a usability study with 40 iOS app developers and received promising results, both in reliably identifying apps' privacy practices and in usability. We measured an F1 score of 0.95 for identifying permission uses. 24 of 40 developers rated PrivacyFlash Pro with at least 9 points on a scale of 0 to 10, for a Net Promoter Score of 42.5. The mean System Usability Score of 83.4 is close to excellent. We provide PrivacyFlash Pro as an open source project to the iOS developer community. In principle, our approach is platform-agnostic and adaptable to the Android and web platforms as well. To increase privacy transparency and reduce compliance issues, we make the case for privacy policies as software development artifacts. Privacy policy creation should become a native extension of the software development process and adhere to the mental model of software developers.
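The core idea of mapping code signatures to privacy practices can be sketched very simply: scan source text for patterns such as Plist permission strings and framework imports, and report the practices whose signatures match. The signature table below is hypothetical and far smaller than anything PrivacyFlash Pro actually uses:

```python
import re

# Hypothetical signature table: privacy practice -> evidence patterns
# (Plist permission strings, framework imports, API calls). Illustrative only.
SIGNATURES = {
    "location": [r"NSLocationWhenInUseUsageDescription", r"import\s+CoreLocation"],
    "contacts": [r"NSContactsUsageDescription", r"import\s+Contacts"],
    "camera":   [r"NSCameraUsageDescription", r"AVCaptureDevice"],
}

def detect_practices(source_text: str) -> set[str]:
    """Return the privacy practices whose signatures match the given source."""
    found = set()
    for practice, patterns in SIGNATURES.items():
        if any(re.search(p, source_text) for p in patterns):
            found.add(practice)
    return found

app_source = """
import CoreLocation
let session = AVCaptureDevice.default(for: .video)
"""
print(sorted(detect_practices(app_source)))  # ['camera', 'location']
```

A real implementation would additionally resolve package-manager manifests to attribute evidence to third-party libraries rather than first-party code, as the abstract notes.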
arXiv (Cornell University), 2023
Google has mandated that developers use Data Safety Sections (DSS) to increase transparency in data collection and sharing practices. In this paper, we present a comprehensive analysis of Google's Data Safety Section (DSS) using both quantitative and qualitative methods. We conduct the first large-scale measurement study of DSS using apps from the Android Play Store (n=1.1M). We find that there are internal inconsistencies within the reported practices. We also find trends of both over- and under-reporting practices in the DSSs. Next, we conduct a longitudinal study of DSS to explore how the reported practices evolve over time, and find that developers are still adjusting their practices. To contextualize these findings, we conduct a developer study, uncovering the process that app developers undergo when working with DSS. We highlight the challenges faced and strategies employed by developers for DSS submission, and the factors contributing to changes in the DSS. Our research contributes valuable insights into the complexities of implementing and maintaining privacy labels, underlining the need for better resources, tools, and guidelines to aid developers. This understanding is crucial as the accuracy and reliability of privacy labels directly impact their effectiveness.
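One kind of internal inconsistency such a measurement study can flag mechanically is a self-contradictory label, e.g. a section that claims no data is collected while also listing collected or shared data types. A minimal sketch over a simplified, hypothetical DSS schema (the real form has many more fields):

```python
# Hypothetical flattened DSS record: boolean claims plus lists of data types.
def find_inconsistencies(dss: dict) -> list[str]:
    """Return human-readable descriptions of internal contradictions."""
    issues = []
    if dss.get("no_data_collected") and dss.get("collected_types"):
        issues.append("claims no collection but lists collected data types")
    if dss.get("no_data_shared") and dss.get("shared_types"):
        issues.append("claims no sharing but lists shared data types")
    # Sharing a data type the app never declares as collected is another red flag.
    leaked = set(dss.get("shared_types", [])) - set(dss.get("collected_types", []))
    if leaked:
        issues.append(f"shares types never declared as collected: {sorted(leaked)}")
    return issues

dss = {"no_data_collected": True, "collected_types": [],
       "no_data_shared": False, "shared_types": ["location"]}
print(find_inconsistencies(dss))
```

Run over a large crawl, rules like these surface the over- and under-reporting trends the abstract describes without any manual review.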
2013
Users are increasingly expected to manage a wide range of security and privacy settings. An important example of this trend is the variety of permissions users might be called upon to review when they download mobile apps. Experiments have shown that most users struggle with reviewing these permissions. Earlier research efforts in this area have primarily focused on protecting users' privacy and security through the development of analysis tools and extensions intended to further increase the level of control provided to users, with little regard for human-factor considerations.
2018
Privacy policies have emerged as the main mechanism to inform users of how their information is managed by online service providers, and they remain the dominant approach for this purpose. The literature notes that users find privacy policies difficult to understand because they are usually written in technical or legal language, even though most users are unfamiliar with such language. These difficulties lead most users to skip reading privacy policies and blindly accept them. To address this challenge, this paper presents AppWare, a multiplatform tool that intends to improve the visualization of privacy policies for mobile applications. AppWare formulates a visualized report of an application's permission set that is easily understandable by a common user. AppWare aims to bridge the gap between hard-to-read privacy policies and Android's obscure permission set with a new privacy policy visualization model. To validate AppWare we conducted a questionnaire-based survey evaluating AppWare in terms of installability, usability, and viability of purpose. The results demonstrate that users rated AppWare above average in all categories.
Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia, 2015
We explore mobile privacy through a survey and through usability evaluation of three privacy-preserving mobile applications. Our survey explores users' knowledge of privacy risks, as well as their attitudes and motivations to protect their privacy on mobile devices. We found that users have incomplete mental models of the privacy risks associated with such devices. And, although participants believe they are primarily responsible for protecting their own privacy, there is a clear gap between their perceived privacy risks and the defenses they employ. For example, only 6% of participants use privacy-preserving applications on their mobile devices, but 83% are concerned about privacy. Our usability studies show that mobile privacy-preserving tools fail to fulfill fundamental usability goals such as learnability and intuitiveness, potential reasons for their low adoption rates. Through a better understanding of users' perception of and attitude towards privacy risks, we aim to inform the design of privacy-preserving mobile applications. We look at these tools through users' eyes, and provide recommendations to improve their usability and increase user acceptance.
CCS Concepts: • Security and privacy → Usability in security and privacy; • Human-centered computing → HCI design and evaluation methods.
Our personal information, habits, likes and dislikes can all be deduced from our mobile devices. Safeguarding mobile privacy is therefore of great concern. Transparency and individual control are bedrock principles of privacy, but making informed choices about which mobile apps to use has been shown to be difficult. In order to understand the dynamics of information collection in mobile apps, and to demonstrate the value of transparent access to the details of mobile applications' information access permissions, we gathered information about 528,433 apps on Google Play and analyzed the permissions requested by each app. We develop a quantitative measure of the risk posed by apps by devising a 'sensitivity score' to represent the number of occurrences of permissions that read personal information about users where network communication is possible. We found that 54% of apps do not access any personal data. The remaining 46% collect between 1 and 20 sensitive permissions and have th…
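The 'sensitivity score' idea can be sketched concisely: count an app's personal-data permissions, but only when the app also holds a networking permission through which that data could leave the device. The permission sets below are illustrative stand-ins, not the study's actual lists:

```python
# Illustrative permission sets (hypothetical, not the paper's actual lists).
SENSITIVE = {
    "READ_CONTACTS", "ACCESS_FINE_LOCATION", "READ_SMS",
    "READ_CALL_LOG", "READ_CALENDAR", "RECORD_AUDIO",
}
NETWORK = {"INTERNET", "ACCESS_NETWORK_STATE"}

def sensitivity_score(permissions: set[str]) -> int:
    """Count sensitive permissions, but only if networking is also possible."""
    if not permissions & NETWORK:
        return 0  # no channel to transmit personal data off the device
    return len(permissions & SENSITIVE)

app = {"INTERNET", "READ_CONTACTS", "ACCESS_FINE_LOCATION", "CAMERA"}
print(sensitivity_score(app))  # 2
```

Gating the count on network access captures the paper's intuition that a permission which reads personal data is most risky when the app can also communicate that data onward.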