2020
Consumers are largely unaware of how the data they generate through smart devices is used, or whether that use is GDPR-compliant, since such information is typically hidden behind vague privacy policy documents, which are often lengthy, difficult to read (containing legal terms and definitions), and frequently changing. This paper describes the activities of the CAP-A project, whose aim is to apply crowdsourcing techniques to evaluate the privacy friendliness of apps and to allow users to better understand the content of Privacy Policy documents and, consequently, the privacy implications of using any given mobile app. To achieve this, we developed a set of tools that assist users in expressing their own privacy concerns and expectations and in assessing mobile apps' privacy properties through collective intelligence.
2020
The utilisation of personal data by mobile apps is often hidden behind vague Privacy Policy documents, which are typically lengthy, difficult to read (containing legal terms and definitions), and frequently changing. This paper discusses a suite of tools developed in the context of the CAP-A project, aiming to harness the collective power of users to improve their privacy awareness and to promote privacy-friendly behaviour by mobile apps. Through crowdsourcing techniques, users can evaluate the privacy friendliness of apps, annotate and understand Privacy Policy documents, and help other users become aware of privacy-related aspects of mobile apps and their implications, while developers and policy makers can identify trends and the general stance of the public on privacy-related matters. The tools are available for public use at https://cap-a.eu/tools/.
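The abstract does not detail how the crowd's input is turned into a score, so the following Python sketch only illustrates one plausible aggregation of crowd ratings into a per-app privacy-friendliness figure; the rating scale, aspect names, and minimum-vote rule are our assumptions, not CAP-A's actual design.

```python
"""Toy sketch of crowd-based privacy-friendliness scoring.

NOT the actual CAP-A implementation; the rating scale, aspect
names, and aggregation rule are illustrative assumptions.
"""
from collections import defaultdict
from statistics import mean

# Hypothetical crowd ratings: (app_id, privacy_aspect, rating 1..5,
# where 5 = most privacy-friendly).
ratings = [
    ("com.example.maps", "data_collection", 4),
    ("com.example.maps", "third_party_sharing", 3),
    ("com.example.game", "data_collection", 2),
    ("com.example.game", "third_party_sharing", 1),
    ("com.example.game", "data_collection", 3),
]

def privacy_friendliness(ratings, min_votes=2):
    """Average crowd ratings per app, flagging apps with too few votes."""
    by_app = defaultdict(list)
    for app, _aspect, score in ratings:
        by_app[app].append(score)
    return {
        app: (mean(scores), len(scores) >= min_votes)
        for app, scores in by_app.items()
    }

for app, (score, reliable) in privacy_friendliness(ratings).items():
    tag = "" if reliable else " (few votes)"
    print(f"{app}: {score:.1f}/5{tag}")
```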
2021
The CAP-A project offers socio-technical tools to promote collective awareness and informed consent, whereby data collection and use by digital products are driven by the expectations and needs of consumers. Theme-driven events aimed at rating the privacy friendliness of apps in specific categories and at annotating their Privacy Policy documents have helped us generate informative statistics about the behaviour and mindset of citizens and the privacy-consciousness of mobile apps.
2012
Smartphone security research has produced many useful tools to analyze the privacy-related behaviors of mobile apps. However, these automated tools cannot assess people's perceptions of whether a given action is legitimate, or how that action makes them feel with respect to privacy. For example, automated tools might detect that a blackjack game and a map app both use one's location information, but people would likely view the map's use of that data as more legitimate than the game's.
Proceedings on Privacy Enhancing Technologies, 2019
The app economy is largely reliant on data collection as its primary revenue model. To comply with legal requirements, app developers are often obligated to notify users of their privacy practices in privacy policies. However, prior research has suggested that many developers are not accurately disclosing their apps’ privacy practices. Evaluating discrepancies between apps’ code and privacy policies enables the identification of potential compliance issues. In this study, we introduce the Mobile App Privacy System (MAPS) for conducting an extensive privacy census of Android apps. We designed a pipeline for retrieving and analyzing large app populations based on code analysis and machine learning techniques. In its first application, we conduct a privacy evaluation for a set of 1,035,853 Android apps from the Google Play Store. We find broad evidence of potential non-compliance. Many apps do not have a privacy policy to begin with. Policies that do exist are often silent on the pract...
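To make the pipeline idea concrete, here is a deliberately crude Python stand-in for the policy-analysis stage; MAPS uses trained machine-learning classifiers over policy text, whereas this keyword lookup and the practice names are illustrative assumptions only.

```python
"""Crude sketch of the policy-analysis half of a MAPS-style pipeline.

The real MAPS system uses trained classifiers; this keyword lookup
is only a stand-in to show the shape of the output.
"""
PRACTICE_KEYWORDS = {
    "location_collection": ["location", "gps", "geolocation"],
    "contact_collection": ["contacts", "address book"],
    "third_party_sharing": ["third party", "third-party", "partners"],
}

def detect_practices(policy_text: str) -> set[str]:
    """Return the privacy practices a policy appears to disclose."""
    text = policy_text.lower()
    return {
        practice
        for practice, words in PRACTICE_KEYWORDS.items()
        if any(w in text for w in words)
    }

policy = "We collect your GPS location and share data with partners."
print(detect_practices(policy))
# {'location_collection', 'third_party_sharing'}
```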
Proceedings 2017 Network and Distributed System Security Symposium, 2017
Mobile apps have to satisfy various privacy requirements. Notably, app publishers are often obligated to provide a privacy policy and notify users of their apps' privacy practices. But how can a user tell whether an app behaves as its policy promises? In this study we introduce a scalable system to help analyze and predict Android apps' compliance with privacy requirements. We discuss how we customized our system in a collaboration with the California Office of the Attorney General. Beyond its use by regulators and activists, our system is also meant to assist app publishers and app store owners in their internal assessments of privacy requirement compliance. Our analysis of 17,991 free Android apps shows the viability of combining machine learning-based privacy policy analysis with static code analysis of apps. Results suggest that 71% of apps that lack a privacy policy should have one. Also, for 9,050 apps that have a policy, we find many instances of potential inconsistencies between what the app policy seems to state and what the code of the app appears to do. In particular, as many as 41% of these apps could be collecting location information and 17% could be sharing such information with third parties without disclosing so in their policies. Overall, each app exhibits a mean of 1.83 potential privacy requirement inconsistencies.
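The core consistency check described here can be pictured as a set difference between practices observed in the code and practices disclosed in the policy. The sketch below assumes both input sets come from upstream analyses (static code analysis and policy text classification); the practice names are made up for illustration.

```python
"""Sketch of a potential-inconsistency check between an app's code
and its privacy policy, in the spirit of the system described above.

Both input sets are assumed to come from upstream analyses; the
practice names are hypothetical.
"""
def potential_inconsistencies(code_practices: set[str],
                              policy_practices: set[str]) -> set[str]:
    """Practices the code appears to perform but the policy omits."""
    return code_practices - policy_practices

code = {"location_collection", "location_sharing_third_party"}
policy = {"location_collection"}

print(potential_inconsistencies(code, policy))
# {'location_sharing_third_party'}  -> candidate for manual review
```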
Proceedings of the 9th IEEE/ACM International Conference on Mobile Software Engineering and Systems
Privacy labels provide an easy and recognizable overview of the data collection practices adopted by mobile app developers. Specifically, on the Apple App Store, privacy labels are displayed on each mobile app's page and summarize what data is collected by the app, how it is used, and for what purposes it is needed. Starting from the release of iOS version 14.3, developers are required to provide privacy labels for their applications. We conducted a large-scale empirical study, collecting and analyzing the privacy labels of 17,312 apps published on the App Store, to understand and characterize how sensitive data is collected and shared. The results of our analysis highlight important critical issues in the collection and sharing of personal data for tracking purposes. In particular, on average free applications collect more sensitive data, the majority of data is collected in an unanonymized form, and a wide range of sensitive information is collected for tracking purposes. The analysis also provides evidence to support the decision-making of users, platform maintainers, and regulators. Furthermore, we repeated the data collection and analysis after seven months, following the introduction of additional run-time tracking controls by Apple. Comparing the two datasets, we observed that the newly introduced measures resulted in a statistically significant decrease in the number of apps that collect data for tracking purposes. At the same time, we observed a growth in overall data collection.
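As a rough illustration of the kind of aggregation such a study performs, the sketch below tallies tracking-related collection for free versus paid apps. The label layout used here is a simplified assumption, not Apple's actual App Store payload; it only shows the shape of the computation.

```python
"""Sketch of tallying privacy-label data as in the study above.

The record layout is a simplified assumption, not Apple's actual
App Store payload.
"""
labels = [
    {"app": "FreeGame", "free": True,
     "collected": [{"type": "Location", "purpose": "Tracking"}]},
    {"app": "PaidNotes", "free": False,
     "collected": [{"type": "Usage Data", "purpose": "Analytics"}]},
]

def tracking_rate(labels, free: bool) -> float:
    """Share of (free or paid) apps collecting any data for tracking."""
    apps = [l for l in labels if l["free"] == free]
    tracking = [
        l for l in apps
        if any(c["purpose"] == "Tracking" for c in l["collected"])
    ]
    return len(tracking) / len(apps) if apps else 0.0

print(f"free apps tracking: {tracking_rate(labels, True):.0%}")
print(f"paid apps tracking: {tracking_rate(labels, False):.0%}")
```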
Proceedings 2021 Network and Distributed System Security Symposium
Various privacy laws require mobile apps to have privacy policies. Questionnaire-based policy generators are intended to help developers with the task of policy creation. However, generated policies depend on the generators' designs as well as developers' abilities to correctly answer privacy questions about their apps. In this study we show that policies generated with popular policy generators are often not reflective of apps' privacy practices. We believe that policy generation can be improved by supplementing the questionnaire-based approach with code analysis. We design and implement PrivacyFlash Pro, a privacy policy generator for iOS apps that leverages static analysis. PrivacyFlash Pro identifies code signatures (composed of Plist permission strings, framework imports, class instantiations, authorization methods, and other evidence) that are mapped to privacy practices expressed in privacy policies. Resources from package managers are used to identify libraries. We tested PrivacyFlash Pro in a usability study with 40 iOS app developers and received promising results, both in terms of reliably identifying apps' privacy practices and in terms of usability. We measured an F-1 score of 0.95 for identifying permission uses. 24 of 40 developers rated PrivacyFlash Pro with at least 9 points on a scale of 0 to 10, for a Net Promoter Score of 42.5. The mean System Usability Score of 83.4 is close to excellent. We provide PrivacyFlash Pro as an open source project to the iOS developer community. In principle, our approach is platform-agnostic and adaptable to the Android and web platforms as well. To increase privacy transparency and reduce compliance issues, we make the case for privacy policies as software development artifacts. Privacy policy creation should become a native extension of the software development process and adhere to the mental model of software developers.
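The code-signature idea can be approximated in a few lines: scan a project tree for Plist permission strings and framework imports and map them to privacy practices. The Plist keys and Swift imports below are genuine iOS artifacts, but the practice mapping and the scanning logic are a toy approximation, not PrivacyFlash Pro itself.

```python
"""Minimal sketch of the code-signature idea behind PrivacyFlash Pro.

A toy approximation covering only a few signatures; the practice
names on the right are ours.
"""
import plistlib
from pathlib import Path

PLIST_SIGNATURES = {
    "NSLocationWhenInUseUsageDescription": "location access",
    "NSCameraUsageDescription": "camera access",
}
IMPORT_SIGNATURES = {
    "import CoreLocation": "location access",
    "import Contacts": "contacts access",
}

def scan_project(root: str) -> set[str]:
    """Collect privacy practices evidenced by Plist keys and imports."""
    practices: set[str] = set()
    root_path = Path(root)
    if not root_path.exists():
        return practices
    for plist in root_path.rglob("Info.plist"):
        with plist.open("rb") as f:
            keys = plistlib.load(f).keys()
        practices |= {p for k, p in PLIST_SIGNATURES.items() if k in keys}
    for swift in root_path.rglob("*.swift"):
        src = swift.read_text(errors="ignore")
        practices |= {p for i, p in IMPORT_SIGNATURES.items() if i in src}
    return practices

print(scan_project("./MyApp"))  # e.g. {'location access'}
```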
2013
Users are increasingly expected to manage a wide range of security and privacy settings. An important example of this trend is the variety of permissions users might be called upon to review when they download mobile apps. Experiments have shown that most users struggle with reviewing these permissions. Earlier research efforts in this area have primarily focused on protecting users' privacy and security through the development of analysis tools and extensions intended to further increase the level of control provided to users, with little regard for human factors considerations.
Personal and Ubiquitous Computing, 2009
A number of mobile applications have emerged that allow users to locate one another. However, people have expressed concerns about the privacy implications associated with this class of software, suggesting that broad adoption may only happen to the extent that these concerns are adequately addressed. In this article, we report on our work on PEOPLEFINDER, an application that enables cell phone and laptop users to selectively share their locations with others (e.g. friends, family, and colleagues). The objective of our work has been to better understand people's attitudes and behaviors towards privacy as they interact with such an application, and to explore technologies that empower users to more effectively and efficiently specify their privacy preferences (or "policies"). These technologies include user interfaces for specifying rules and auditing disclosures, as well as machine learning techniques to refine user policies based on their feedback. We present evaluations of these technologies in the context of one laboratory study and three field studies.
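A minimal sketch of rule-based location disclosure in the spirit of PEOPLEFINDER follows; the rule format (requester group plus an allowed time window) is our own simplification of the user-specified policies the paper describes.

```python
"""Toy rule engine in the spirit of PEOPLEFINDER's location-sharing
policies. The rule format is a simplification, not the paper's design.
"""
from dataclasses import dataclass
from datetime import time

@dataclass
class Rule:
    group: str   # who may ask, e.g. "colleagues"
    start: time  # disclosure window start
    end: time    # disclosure window end

rules = [
    Rule("colleagues", time(9, 0), time(17, 0)),
    Rule("family", time(0, 0), time(23, 59)),
]

def may_disclose(requester_group: str, now: time, rules) -> bool:
    """True if some rule lets this group see the location right now."""
    return any(
        r.group == requester_group and r.start <= now <= r.end
        for r in rules
    )

print(may_disclose("colleagues", time(10, 30), rules))  # True
print(may_disclose("colleagues", time(22, 0), rules))   # False
```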