2011, 2011 International Conference on Collaboration Technologies and Systems (CTS)
As sensors become ever more prevalent, more and more information will be collected about each of us. A long-term research question is how best to support beneficial uses while preserving individual privacy. Presence systems are an emerging class of applications that support collaboration. These systems leverage pervasive sensors to estimate end-user location, activities, and available communication channels. Because such presence data are sensitive, sharing models must reflect the privacy and sharing preferences of the users to achieve widespread adoption. To reflect users' collaborative relationships and sharing desires, we introduce CollaPSE security, in which an individual has full access to her own data, a third party processes the data without learning anything about the data values, and users higher up in the hierarchy learn only statistical information about the employees under them. We describe simple schemes that efficiently realize CollaPSE security for time series data. We implemented these protocols using readily available cryptographic functions, and integrated them with FXPAL's myUnity presence system.
As sensors become ever more prevalent, more and more information will be collected about each of us. A long-term research question is how best to support beneficial analysis of such data while preserving individual privacy. Awareness systems represent an emerging class of applications supporting both business and social functions that leverage pervasive sensors to detect and report end-user physical state, activities, and available communication channels. To buy into the system, however, users must be able to control how information about them is shared. We introduce "need to know" security in which an individual has full access to her own data, a third party processes the data without learning anything about the data values, and other users, such as analysts, learn only the desired statistics. Our novel privacy mechanism for time series data gives users a high level of control over their individual data while allowing storage of data and computation of summary statistics to take place on untrusted machines. The mechanism supports computation of simple statistics across multiple users whose data have been encrypted under distinct keys. We designed key structures and extensions to provide a family of efficient noninteractive "need to know" protocols for time series data. We implemented the mechanism and integrated it with MyUnity, a prototype awareness system.
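The core property described above (an untrusted party computes summary statistics over data encrypted under distinct keys without learning individual values) can be illustrated with a minimal pairwise-masking sketch in Python. The key setup, user count, and modulus are illustrative assumptions, not the paper's actual CollaPSE construction:

```python
import hmac, hashlib, itertools, secrets

MOD = 2**64  # sums of readings must stay below this modulus

def prf(key: bytes, t: int) -> int:
    """Pseudorandom mask for time step t derived from a shared pairwise key."""
    mac = hmac.new(key, t.to_bytes(8, "big"), hashlib.sha256).digest()
    return int.from_bytes(mac[:8], "big")

# Hypothetical setup: every pair of users shares a random key.
n = 4
pair_keys = {(i, j): secrets.token_bytes(16)
             for i, j in itertools.combinations(range(n), 2)}

def encrypt(i: int, reading: int, t: int) -> int:
    """User i masks a reading; masks cancel pairwise, so only the sum is recoverable."""
    c = reading
    for j in range(n):
        if j == i:
            continue
        k = pair_keys[(min(i, j), max(i, j))]
        # The lower-indexed user adds the mask, the higher-indexed one subtracts it.
        c = (c + prf(k, t)) % MOD if i < j else (c - prf(k, t)) % MOD
    return c

readings = [3, 7, 0, 5]  # e.g., minutes present in this interval
t = 42
ciphertexts = [encrypt(i, r, t) for i, r in enumerate(readings)]
total = sum(ciphertexts) % MOD  # untrusted aggregator: all masks cancel
print(total)  # 15
```

Each ciphertext individually looks random to the aggregator, yet the sum of all ciphertexts equals the sum of the readings, matching the "learn only statistics" access level.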
2000
Abstract. Providing information about other users and their activities is a central function of many collaborative applications. The data that provide this "presence awareness" are usually automatically generated and highly dynamic. For example, services such as AOL Instant Messenger allow users to observe one another's status and to initiate and participate in chat sessions. As such services become more powerful, privacy and security issues regarding access to sensitive user data become critical.
Lecture Notes in Computer Science, 2006
One distinctive feature of pervasive computing environments is the common need to gather and process context information about real persons. Unfortunately, this unavoidably affects those persons' privacy. Each time someone uses a cellular phone, a credit card, or surfs the web, he leaves a trace that is stored and processed. In a pervasive sensing environment, however, the amount of information collected is much larger than today, and it might also be used to reconstruct personal information with great accuracy. The question we address in this paper is how to control the dissemination and flow of personal data across organizational and personal boundaries, i.e., to potential addressees of privacy-relevant information. This paper presents the User-Centric Privacy Framework (UCPF). It aims at protecting a user's privacy based on the enforcement of privacy preferences, which are expressed as a set of constraints over some set of context information. To achieve the goal of cross-boundary control, we introduce two novel abstractions, namely Transformations and Foreign Constraints, in order to extend a user's possibilities for describing privacy protection criteria beyond the expressiveness usually found today. Transformations are understood as any process that the user may define over a specific piece of context. This is a main building block for obfuscating, or even plainly lying about, the context in question. Foreign Constraints are an important complementary extension because they allow for modeling conditions defined on external users who are not the tracked individual but may influence disclosure of personal data to third parties. We are confident that these two easy-to-use abstractions, together with the general privacy framework presented in this paper, constitute a strong contribution to the protection of personal privacy in pervasive computing environments.
… and Communications Security, 2010
Location Based Services (LBSs) introduce several privacy issues, the most relevant ones being: (i) how to anonymize a user; (ii) how to specify the level of anonymity; and (iii) how to guarantee to a given user the same level of desired anonymity for all of his requests. Anonymizing the user within k potential users is a common solution to (i). A recent work [28] highlighted how specifying a practical value of k can be a difficult choice for the user, and hence introduced a feeling-based model: a user defines the desired level of anonymity by specifying a given area (e.g., a shopping mall). The proposal sets the level of anonymity (ii) as the popularity of the area, where popularity is measured via the entropy of the footprints of the visitors in that area. To keep the privacy level constant (iii), the proposal always conceals the user's requests within an area of the same popularity, independently of the user's current position.
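The entropy-based popularity measure can be sketched in a few lines of Python. The footprint lists and user IDs below are hypothetical toy data; the actual proposal computes entropy over historical visitor footprints:

```python
import math
from collections import Counter

def popularity(footprints):
    """Shannon entropy (bits) of the footprint distribution: higher entropy
    means visits are spread over more distinct users, i.e. a more popular area."""
    counts = Counter(footprints)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

mall = ["u1", "u2", "u3", "u4", "u1", "u2", "u3", "u4"]    # many distinct visitors
office = ["u1", "u1", "u1", "u1", "u1", "u1", "u1", "u2"]  # dominated by one user
print(popularity(mall))    # 2.0 bits
print(popularity(office))  # ~0.54 bits
```

An area dominated by a single user gives low entropy, so hiding there offers little anonymity; the cloaking region is instead chosen to match the popularity of the user-specified reference area.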
Journal of Universal Computer Science, 2010
Protection of personal data on the Internet is already a challenge today. Users have to actively look up the privacy policies of websites and decide whether they can live with the terms of use. Once discovered, they are forced to make a "take it or leave it" decision. In future living and working environments, where sensors and context-aware services are pervasive, this becomes an even greater challenge and annoyance. The environment is much more personalized, and users cannot simply "leave". They require measures to prevent, avoid, and detect misuse of sensitive data, as well as the ability to negotiate the purpose of use of data. We present a novel model of privacy protection, complementing the notion of enterprise privacy with the incorporation of personal privacy towards a holistic privacy management system. Our approach allows non-expert users not only to negotiate the desired level of privacy in a rather automated and simple way, but also to track and monitor the whole life-cycle of data.
2010
The exchange of user-related sensitive data within a Pervasive Computing Environment (PCE) raises security and privacy concerns. On one hand, service providers require user authentication and authorization prior to the provision of a service, while at the same time users require anonymity, i.e., untraceability and unlinkability for their transactions. In this paper we discuss privacy and security requirements for access control in PCEs and show why a recently proposed efficient scheme [1] fails to satisfy these requirements. Furthermore, we discuss a generic approach for achieving a desired level of privacy against malicious insiders, while balancing with competing demands for access control and accountability.
2015
Abstract. In the context of ambient networks, this article describes a cryptographic protocol called the Common History Extraction (CHE) protocol, which implements a trust management framework. All nodes are assumed to share the same cryptographic algorithms and protocols. An entity called the imprinting station provides them with two pairs of public/private keys derived from their identities. Two unacquainted nodes wanting to initiate an interaction must first build a seed of trust. The trust between two nodes is based on mutual proof of previously met common nodes.
2000
This paper discusses the main issues regarding the reliability and efficacy of standard cryptographic techniques applied to pervasive computing. First, we describe a set of scenarios where social interactions are supported by small-sized pervasive devices. These are expected to manage and exchange personal and sensitive information with context-specific requirements. We present a general methodology that helps
2011
In pervasive environments, presence-based application development via Presence Management Systems (PMSs) is a key factor in optimising the management of communication channels, driving productivity increases. Solutions for presence management should satisfy interoperability requirements while providing context-centric presence analysis and privacy management. To push PMSs towards flexible, open, and context-aware presence management, we propose adaptations of two extensions to the standard XML-based XMPP for message exchange in online communication systems. The contribution allows for more complex specification and management of nested groups and privacy lists, where semantic technologies are used to map all messages into RDF vocabularies and pave the way for broader semantic integration of heterogeneous and distributed presence information sources into the standard PMS framework.
With the growth of presence based services, it is important to securely manage and distribute sensitive presence information such as user location. We survey techniques that are used for security and privacy of presence information. In particular, we describe the SIMPLE based presence specific authentication, integrity and confidentiality. We also discuss the IETF's common policy for geo-privacy, presence authorization for presence information privacy and distribution of different levels of presence information to different watchers. Additionally, we describe an open problem of getting the aggregated presence from the trusted server without the server knowing the presence information, and propose a solution. Finally, we discuss denial of service attacks on the presence system and ways to mitigate them.
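The open problem mentioned above (obtaining aggregated presence from a server without the server learning individual presence values) can be illustrated with additive secret sharing across two non-colluding servers. This is a minimal sketch of the general idea under that two-server assumption, not the specific solution the survey proposes:

```python
import secrets

MOD = 2**32

def share(value: int):
    """Split a presence value (e.g. 1 = available) into two additive shares,
    each of which is individually uniformly random."""
    r = secrets.randbelow(MOD)
    return r, (value - r) % MOD

users = [1, 0, 1, 1]  # availability bits of four presentities
shares = [share(v) for v in users]

# Each server sums the shares it holds; neither learns any single user's bit.
server_a = sum(s[0] for s in shares) % MOD
server_b = sum(s[1] for s in shares) % MOD

aggregate = (server_a + server_b) % MOD  # shares recombine to the true count
print(aggregate)  # 3 users currently available
```

A watcher authorized to see only aggregate presence combines the two partial sums, while each server alone holds values indistinguishable from random.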
2011 IEEE Fifth International Conference on Semantic Computing, 2011
Recent years have seen a confluence of two major trends: the rise of mobile devices such as smartphones as the primary access point to networked information, and the rise of social media platforms that connect people. Their convergence supports the emergence of a new class of context-aware geosocial networking applications. While existing systems focus mostly on location, our work centers on models for representing and reasoning about a more inclusive and higher-level notion of context, including the user's location and surroundings, the presence of other people and devices, and the inferred activities in which they are engaged. A key element of our work is the use of collaborative information sharing, where devices share and integrate knowledge about their context. This introduces the need for privacy and security mechanisms. We present a framework to provide users with appropriate levels of privacy to protect the personal information their mobile devices are collecting, including the inferences that can be drawn from that information. We use Semantic Web technologies to specify high-level, declarative policies that describe user information sharing preferences. We have built a prototype system that aggregates information from a variety of sensors on the phone, online sources, and sources internal to the campus intranet, and infers the dynamic user context. We show how our policy framework can be effectively used to devise better privacy control mechanisms to control information flow between users in such dynamic mobile systems.
Business applications often use data structures such as Presence Patterns to present the numbers of customers in service-oriented businesses, including education, retailing, and media. Presence Patterns contain open data derived from the internal data of organizations. In this paper, we investigate different ways of defining Presence Patterns and the possible privacy consequences of each definition. The first contribution of the paper is a definition of a family of Presence Patterns. The second contribution is a method for the privacy analysis of Presence Patterns.
Lecture Notes in Computer Science, 2010
This paper describes the design and implementation of a dynamic privacy management system aimed at enabling tangible privacy control and feedback in a pervasive sensor network. Our work began with the development of a potentially invasive sensor network (with high-resolution video, audio, and motion tracking capabilities) featuring different interactive applications that created incentive for accepting this network as an extension of people's daily social space. A user study was then conducted to evaluate several privacy management approaches: an active badge system for both online and on-site control, on/off power switches for physically disabling the hardware, and touch-screen input control. Results indicated that an active badge for on-site privacy control was the most preferred of the provided options. We present a set of results that yield insight into the privacy/benefit tradeoff of various sensing capabilities in pervasive sensor networks and into how privacy settings and user behavior relate in these environments.
2007
The main message of this position paper is that ubiquitous computing technology need not necessarily be an obstacle to privacy protection: if legal and social issues are given proper consideration, technology can also be used to allow individuals to exercise their rights. We believe that the key issue is to devise techniques able to support ambitious privacy protection policies while allowing for the flexibility required in the ambient intelligence context. We illustrate our position using three technical requirements: (1) formal specification of privacy policies, (2) trust management and (3) auditability, which show both the challenges posed by ubiquitous computing and the opportunities to strengthen privacy. For each of these requirements, we present the legal and social motivations, suggest technical challenges and provide hints on possible solutions based on our on-going work.
Companion Publication of the 2019 Conference on Computer Supported Cooperative Work and Social Computing, 2019
This one-day workshop aims to explore ubiquitous privacy research and design in the context of mobile and IoT by facilitating discourse among scholars from the networked privacy and design communities. The complexity in modern socio-technical systems points to the potential of utilizing various design techniques (e.g., speculative design, design fiction, and research through design practices) in surfacing the potential consequences of novel technologies, particularly those that traditional user studies may not reveal. The results will shed light on future privacy designs for mobile and IoT technologies from both empirical and design perspectives.
2016
A system in ubiquitous computing consists of a large number of heterogeneous users and devices that communicate with each other. Users in this dynamic field communicate with lightweight and autonomous devices, which accentuates security problems and makes them more complex. Existing mechanisms and solutions are inadequate to address new challenges, mainly problems of authentication and privacy protection. In this paper, a new security architecture called the Tree-Based Distributed Privacy Protection System is proposed. It supports the protection of users' private data and addresses the shortcomings of systems like GAIA, OpenID, and User-directed Privacy Protection (UPP). Furthermore, it takes into account the domain dissociation property in order to achieve decentralized data protection. Keywords: Ubiquitous Computing; Security; Private Data Protection; Privacy; Integrity.
2007 Fourth Annual International Conference on Mobile and Ubiquitous Systems: Networking & Services (MobiQuitous), 2007
In recent years, ubiquitous computing applications have spanned areas such as telemedicine, banking, and transportation that require user privacy protection. The realization of context-aware ubiquitous computing exacerbates existing privacy concerns. Ubiquitous computing applications demand new privacy-enhancing technologies for information and communication environments in which users are equipped with flexible, portable applications to support the capture, communication, recall, organization, and reuse of diverse information. In this paper, we develop a novel scheme for infusing privacy into context-aware ubiquitous computing. We present Precision, a system for privacy-enhanced context-aware information fusion in ubiquitous computing environments. In our scheme, privacy is defined as a set of parameters encapsulated in composite data entities called privons, through which we aim to infuse privacy into Precision. We evaluate our proposed scheme through real interactions in an implementation of privons.
Computer Science and Information Systems, 2014
In this paper, we propose a user-centric software architecture for managing Ubiquitous Health Monitoring Data (UHMD) generated by wearable sensors in a Ubiquitous Health Monitoring System (UHMS), and examine how these data can be used within privacy-preserving distributed statistical analysis. Our approach has two main goals: first, to enhance the privacy of patients; second, to decongest the Health Monitoring Center (HMC) from the enormous amount of biomedical data generated by the users' wearable sensors. In our solution, personal software agents receive and manage the personal medical data of their owners. Moreover, the personal agents can support privacy-preserving distributed statistical analysis of the health data. To this end, we present a cryptographic protocol based on secure multi-party computation that accepts as input current or archived values from users' wearable sensors. We describe a prototype implementation that performs a statistical analysis on a community of independent personal agents. Finally, experiments with up to several hundred agents confirm the viability and effectiveness of our approach.
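The kind of secure multi-party sum such an agent protocol builds on can be sketched with the classic ring protocol, in which an initiating agent blinds the running total with a random offset before it circulates. The sensor values and modulus here are illustrative, not the paper's actual protocol:

```python
import secrets

MOD = 2**64  # values and their sum must stay below this modulus

def ring_sum(values):
    """Ring protocol sketch: the initiator blinds a token with a random offset,
    each agent adds its private value as the token passes, and the initiator
    removes the offset at the end. No agent sees another's raw value."""
    offset = secrets.randbelow(MOD)
    token = offset
    for v in values:              # token handed from agent to agent
        token = (token + v) % MOD
    return (token - offset) % MOD  # only the initiator knows the offset

heart_rates = [72, 88, 65, 90]   # private readings held by four agents
total = ring_sum(heart_rates)
print(total / len(heart_rates))  # mean = 78.75 without any single reading revealed
```

Each intermediate token is uniformly random from any single agent's view, so the community mean can be published while individual biomedical readings stay with their personal agents.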