2017, Journal of Network and Computer Applications
Reputation systems are useful to assess the trustworthiness of potential transaction partners, but they are also a potential threat to privacy, since rating profiles reveal users' preferences. Anonymous reputation systems resolve this issue, but make it difficult to assess the trustworthiness of a rating. We introduce a privacy-preserving reputation system that enables anonymous ratings while ensuring that only authorized users can issue ratings. In addition, ratings can be endorsed by other users. A user who has received a pre-defined number of endorsements can prove this fact, and be rewarded, e.g., by receiving a "Premium member" status. The system is based on advanced cryptographic primitives such as Chaum-Pedersen blind signatures, verifiable secret sharing and oblivious transfer.
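The blind-signature step that lets an authority authorize a rating without seeing it can be pictured with textbook RSA blinding in the style of Chaum. This is a sketch only: the abstract's Chaum-Pedersen construction is discrete-log based, and the RSA parameters below are toy-sized and insecure.

```python
import hashlib
import random

# Toy RSA parameters (far too small for real use; illustration only).
p, q = 1000003, 1000033
n = p * q
e = 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)  # modular inverse (Python 3.8+)

def h(msg: bytes) -> int:
    """Hash a rating message to an integer mod n."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def blind(m: int):
    """User blinds the message with a random factor r."""
    while True:
        r = random.randrange(2, n)
        try:
            r_inv = pow(r, -1, n)
        except ValueError:
            continue  # r not invertible mod n; retry
        return (m * pow(r, e, n)) % n, r_inv

def sign(blinded: int) -> int:
    """Issuer signs without seeing the underlying rating."""
    return pow(blinded, d, n)

def unblind(s_blind: int, r_inv: int) -> int:
    """User removes the blinding factor, obtaining a valid signature."""
    return (s_blind * r_inv) % n

m = h(b"rating: 5 stars for seller 42")
blinded, r_inv = blind(m)
s = unblind(sign(blinded), r_inv)
assert pow(s, e, n) == m  # signature verifies on the original message
```

The issuer only ever sees the blinded value, so the later rating cannot be linked back to the signing session.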
Reputation systems are emerging as a promising method for enabling electronic transactions with unknown entities to be conducted with some level of confidence. Consequently, the topic of reputation systems attracts increasing attention from players in the e-business industry. However, privacy issues have so far not been given a fair treatment in the literature on reputation systems. We identify two different concerns for privacy in reputation systems: one for the feedback provider and the other for the feedback target. Possible solutions for each of these concerns are proposed, based on electronic cash technology and designated verifier proofs, and described within a new architecture for managing reputation certificates.
Lecture Notes in Computer Science, 2015
We consider reputation systems where users are allowed to rate products that they purchased previously. To obtain trustworthy reputations, they are allowed to rate these products only once. As long as they do so, the users stay anonymous. Everybody is able to detect users deviating from the rate-products-only-once policy and the anonymity of such dishonest users can be revoked by a system manager. In this paper we present formal models for such reputation systems and their security. Based on group signatures we design an efficient reputation system that meets all our requirements.
Lecture Notes in Computer Science, 2018
We consider reputation systems in the Universal Composability framework, where users can anonymously rate each other's products that they purchased previously. To obtain trustworthy, reliable, and honest ratings, users are allowed to rate products only once. Everybody is able to detect users that rate products multiple times. In this paper we present an ideal functionality for such reputation systems and give an efficient realization that is usable in practical applications.
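One way to picture the detect-multiple-ratings property is a deterministic tag per (user secret, product): identical tags expose a repeated rating, while tags for different products stay unlinkable. The HMAC-based sketch below is only an illustration with hypothetical names; the constructions in this line of work achieve the same property with group signatures and zero-knowledge proofs rather than a shared registry.

```python
import hashlib
import hmac

def rating_tag(user_secret: bytes, product_id: str) -> str:
    """Deterministic per-(user, product) tag; stand-in for the
    cryptographic tag produced by a group-signature scheme."""
    return hmac.new(user_secret, product_id.encode(), hashlib.sha256).hexdigest()

seen = {}  # public registry of tags already used

def submit_rating(user_secret: bytes, product_id: str, stars: int) -> str:
    tag = rating_tag(user_secret, product_id)
    if tag in seen:
        return "rejected: duplicate rating detected"
    seen[tag] = stars
    return "accepted"

alice = b"alice-secret-key"
assert submit_rating(alice, "widget-17", 5) == "accepted"
assert submit_rating(alice, "widget-17", 1).startswith("rejected")
assert submit_rating(alice, "gadget-03", 4) == "accepted"  # new product, new tag
```

Because the tag depends on the product, a second rating of the same product is publicly detectable, yet ratings of different products cannot be linked to one user.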
2018
We present CLARC (Cryptographic Library for Anonymous Reputation and Credentials), an anonymous credentials system (ACS) combined with an anonymous reputation system. Using CLARC, users can receive attribute-based credentials from issuers. They can efficiently prove that their credentials satisfy complex (access) policies in a privacy-preserving way. This implements anonymous access control with complex policies. Furthermore, CLARC is the first ACS that is combined with an anonymous reputation system where users can anonymously rate services. A user who gets access to a service via a credential also anonymously receives a review token to rate the service. If a user creates more than a single rating, this can be detected by anyone, preventing users from spamming ratings to sway public opinion. To evaluate the feasibility of our construction, we present an open-source prototype implementation.
IFIP Advances in Information and Communication Technology, 2010
Trust and reputation systems in distributed environments attract widespread interest as online communities become an inherent part of the daily routine of Internet users. Several models for trust and reputation have been suggested recently, among them the Knots model. The Knots model provides a member of a community with a method to compute the reputation of other community members. Reputation in this model is subjective and tailored to the taste and choices of the computing member and those members that have similar views, i.e. the computing member's Trust-Set. A discussion on privately computing trust in the Knots model appears in [16]. The present paper extends and improves [16] by presenting three efficient and private protocols to compute trust in trust-based reputation systems that use any trust-set-based model. The protocols in the paper are rigorously proved to be private against a semi-honest adversary, given standard assumptions on the existence of a homomorphic, semantically secure, public-key encryption system. The protocols are analyzed and compared in terms of their privacy characteristics and communication complexity.
2006
In ubiquitous networks, the multiple devices carried by a user may unintentionally expose information about her habits or preferences. This information leakage can compromise the users' right to privacy. A common approach to increase privacy is to hide the user's real identity under a pseudonym. Unfortunately, pseudonyms may interfere with the reputation systems that are often used to assess the reliability of the information provided by the participants in the network.
We present three different privacy-preserving protocols for computing reputation. They vary in strength in terms of preserving privacy; however, a common thread in all three protocols is that they are fully decentralized and efficient. Our protocols, which are resilient against semi-honest adversaries and non-disruptive malicious adversaries, have linear and log-linear communication complexity, respectively. We evaluate the proposed protocols on data from the real web of trust of Advogato.org.
2010
The secure sum protocol is a well-known protocol for computing the sum of private inputs from distributed entities such that the inputs remain private. In this paper we present protocols, inspired by the secure sum protocol, for computing reputation in a privacy-preserving manner. We provide a protocol that is secure under the semi-honest adversarial model as well as one that is secure under the stronger non-disruptive malicious model.
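The secure sum idea can be sketched with additive secret sharing, a close relative of the ring-based masking the abstract refers to: each party splits its private feedback into random shares, so only the total is ever reconstructed. A minimal illustration under toy parameters, not the paper's protocol:

```python
import random

M = 2**32  # modulus; arithmetic mod M keeps individual shares uniformly random

def split(value: int, n: int) -> list[int]:
    """Split a private feedback value into n additive shares mod M."""
    shares = [random.randrange(M) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % M)
    return shares

feedback = [5, 3, 4, 5]          # private inputs of 4 parties
n = len(feedback)

# Each party i deals share j of its input to party j.
dealt = [split(v, n) for v in feedback]

# Party j publishes only the sum of the shares it received.
partial = [sum(dealt[i][j] for i in range(n)) % M for j in range(n)]

# Anyone can add the published partial sums to recover the total feedback.
total = sum(partial) % M
assert total == sum(feedback)  # the sum is correct, no single input revealed
```

No party ever sees another party's input, only uniformly random shares; the sum emerges only from the combined partial results.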
2011
Reputation systems offer web service consumers an effective solution for selecting web service providers who meet their Quality of Service (QoS) expectations. A reputation system computes the reputation of a provider as an aggregate of the feedback submitted by consumers. Truthful feedback is clearly a prerequisite for accurate reputation scores. However, it has been observed that users of a reputation system often hesitate in providing truthful feedback, mainly due to the fear of reprisal from target entities. We present a privacy preserving reputation protocol that enables web service consumers to provide feedback about web service providers in a private and thus uninhibited manner.
Workshop of the 1st International Conference on Security and Privacy for Emerging Areas in Communication Networks, 2005., 2005
In online communities, users typically do not meet in person and thus have to estimate the trustworthiness of other parties by other means. To assist these estimations, various reputation systems have been developed. But collecting the required reputation information, which is essentially information about a user's past, also creates privacy concerns. In this paper, we examine how distributing reputation management over P2P networks deals with the privacy concerns of processing reputation information. We analyze distributed reputation management from three angles: how the requirements of fair use practices should be reflected in the system design, what classes of information are leaked, and how to manage the risks related to the social and technical issues.
2013 IEEE International Conference on Communications (ICC), 2013
Reputation systems make it possible to estimate the trustworthiness of entities based on their past behavior. Electronic commerce, peer-to-peer routing and collaborative environments, to cite just a few, benefit greatly from reputation systems. To guarantee an accurate estimation, reputation systems typically rely on a central authority, on the identification and authentication of all participants, or both. In this paper, we go a step further by presenting a distributed reputation mechanism that is robust against malicious behavior and preserves the privacy of its clients. Guaranteed error bounds on the estimation are provided.
Lecture Notes in Computer Science, 2004
Previous studies have suggested that reputation ratings may be provided strategically, for reasons of reciprocation and retaliation, and therefore may not properly reflect the trustworthiness of rated parties. It thus appears that supporting privacy of feedback providers could improve the quality of their ratings. We argue that supporting perfect privacy in decentralized reputation systems is impossible, but as an alternative present three probabilistic schemes that support partial privacy. On the basis of these schemes, we offer three protocols that allow ratings to be privately provided with high probability in decentralized additive reputation systems.
Journal of Internet Services and Applications, 2011
Trust plays a key role in enhancing user experience at service providers. Reputation management systems are used to quantify trust, based on reputation metrics. Anonymity is an important requirement in these systems, since most individuals expect that they will not be profiled by participating in the feedback process. Anonymous Reputation Management (ARM) systems allow individuals to submit their feedback anonymously. However, this solves only part of the problem: anonymous ratings by one individual can still be linked to each other, which enables the system to easily build a profile of that individual, and data mining techniques can then use the profile to re-identify the individual. We call this the linkability problem. This paper presents an anonymous reputation management system that avoids the linkability problem by empowering individuals to interact with and rate service providers securely and anonymously.
2010
A reputation protocol computes the reputation of an entity by aggregating the feedback provided by other entities in the system. Reputation makes entities accountable for their behavior. Honest feedback is clearly a pre-requisite for accurate reputation scores. However, it has been observed that entities often hesitate in providing honest feedback, mainly due to the fear of retaliation. We present a privacy preserving reputation protocol which enables entities to provide feedback in a private and thus uninhibited manner.
Computers & Security, 2012
Reputation systems make the users of a distributed application accountable for their behavior. The reputation of a user is computed as an aggregate of the feedback provided by other users in the system. Truthful feedback is clearly a prerequisite for computing a reputation score that accurately represents the behavior of a user. However, it has been observed that users often hesitate in providing truthful feedback, mainly due to the fear of retaliation. We present a decentralized privacy preserving reputation protocol that enables users to provide feedback in a private and thus uninhibited manner. The protocol has linear message complexity, which is an improvement over comparable decentralized reputation protocols. Moreover, the protocol allows users to quantify and maximize the probability that their privacy will be preserved.
Lecture Notes in Computer Science, 2009
Trust and reputation systems in distributed environments attract widespread interest as online communities become an inherent part of the daily routine of Internet users. Trust-based models enable safer operation within communities in which information exchange and peer-to-peer interaction are central. Several models for trust-based reputation have been suggested recently, among them the Knots model. In these models, the subjective reputation of a member is computed using information provided by a set of members trusted by the latter. The present paper discusses the computation of reputation in such models while preserving members' private information. Three different schemes for the private computation of reputation are presented, and their advantages and disadvantages in terms of privacy and communication overhead are analyzed.
2019
We introduce updatable anonymous credential systems (UACS) and use them to construct a new privacy-preserving incentive system. In a UACS, a user holding a credential certifying some attributes can interact with the corresponding issuer to update his attributes. During this, the issuer knows which update function is run, but does not learn the user's previous attributes; hence the update process preserves the anonymity of the user. One example of a class of update functions is additive updates of integer attributes, where the issuer increments an unknown integer attribute value v by some known value k. This kind of update is motivated by an application of UACS to incentive systems: users in an incentive system can anonymously accumulate points, e.g. in a shop at checkout, and spend them later, e.g. for a discount. In this paper, we (1) formally define UACS and their security, (2) give a generic construction for UACS supporting arbitrary update functions, and (3) construct a new incentive system using UACS that is efficient while offering offline double-spending protection and partial spending.
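The additive update of a hidden integer attribute can be pictured with a Pedersen-style commitment, whose homomorphism lets the issuer add a known k to an unknown v. The group parameters below are toy-sized and hypothetical; a real UACS performs the update inside a credential scheme with zero-knowledge proofs of well-formedness.

```python
# Toy parameters: p = 2q + 1 is a (tiny) safe prime; g and h generate the
# order-q subgroup of squares mod p. Illustration only, not secure sizes.
p, q = 1019, 509
g, h = 4, 9

def commit(v: int, r: int) -> int:
    """Pedersen commitment to v with blinding factor r: hides v,
    and is additively homomorphic in v."""
    return (pow(g, v, p) * pow(h, r, p)) % p

v, r = 42, 123            # hidden point balance and blinding factor
C = commit(v, r)          # user's credential hides v from the issuer

k = 8                     # points earned at checkout (known to the issuer)
C_updated = (C * pow(g, k, p)) % p   # issuer adds k without learning v

# The user can later open the updated commitment to the new balance v + k.
assert C_updated == commit(v + k, r)
```

The issuer only ever manipulates the commitment, so it learns the increment k (the update function being run) but never the balance itself, matching the abstract's description of additive updates.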
Natural Language Processing, 2021
This paper investigates situations in which Internet access is unavailable in specific areas while users there need instant advice from others nearby. Hence, a peer-to-peer network is established by connecting all neighbouring mobile devices so that they can exchange questions and recommendations. However, not all received recommendations are reliable, as users may be unknown to each other. Therefore, the trustworthiness of advice is evaluated based on the advisor's reputation score, which is stored locally in the user's mobile device. It is not guaranteed that the reputation score is trustworthy if its owner uses it with wrong intentions. A further privacy problem concerns the questioning user honestly auditing the reputation score of the advising user. Therefore, this work proposes a security model, named Crystal, for securely managing distributed reputation scores and for preserving user privacy. Crystal ensures that the ...
2004
Reputation systems have the potential to improve the quality of online markets by identifying fraudulent users, so that transactions with these users can be prevented. The behaviour of participants involved in e-commerce can be recorded and this information made available to potential transaction partners, who can use it to choose a suitable counterpart. Unfortunately, current reputation systems suffer from various vulnerabilities; solutions for many of these problems are discussed. One of the major threats is unfair feedback: a large number of negative or positive feedbacks could be submitted about a particular user with the aim of either downgrading or upgrading that user's reputation, so that the produced reputation does not reflect the user's true trustworthiness. To overcome this threat, a variation of a Bayesian reputation system is proposed, based on the subjective logic framework of Josang et al. [65]. The impact of unfair feedback is countered through systematic approaches proposed in the scheme. Lack of anonymity for participants leads to reluctance to provide negative feedback. A novel solution for the anonymity of feedback providers is proposed, allowing participants to provide negative feedback when appropriate without fear of retaliation. The solution is based on several cryptographic primitives: e-cash, designated verifier proofs and proofs of knowledge. In some settings it is desirable for the reputation owner to control the distribution of its own reputation and to disclose it at its discretion to intended parties. To realize this, a solution based on a certificate mechanism is proposed, which allows the reputation owner to keep the certificate and distribute its reputation while being unable to alter that information without detection. The proposed solutions cater for two modes of reputation systems: centralised and decentralised.
The provision of an off-line reputation system is discussed by proposing a new solution using certificates, achieved through the delegation concept and a variant of digital signature schemes known as proxy signatures. The thesis presents a security architecture for reputation systems consisting of different elements, including privacy, verifiability and availability, to safeguard reputation systems from malicious activities. The architecture also introduces a Bayesian approach to counter security threats to reputation systems. The proposed security architecture thus combines two prominent approaches, namely Bayesian and cryptographic, to provide security for reputation systems, and can be used as a basic framework for further development in identifying and incorporating the elements required for a total security solution for reputation systems.
Proceedings of the 2005 ACM symposium on Applied computing, 2005
In this paper we present a novel approach to enable untraceable communication between pseudonyms. Our work provides strong sender and recipient anonymity by eliminating the need to know each other's address. We use a variation of Chaum mixes to achieve unlinkability between sender and recipient, and introduce a concept called extended destination routing (EDR), which relies on routing headers constructed in multiple layers of encryption and published in a distributed hash table (DHT). In order to communicate, a sender requests from the DHT the recipient's routing header, which is extended and used for routing the message via a mix cascade to this recipient. This work was performed in the context of the UniTEC reputation system and describes the functionality of its anonymous communication layer, which is completely independent of the other UniTEC layers. Although trust and reputation systems in general are typical application areas for our contribution, the presented concepts are suitable for various other application areas as well. We have implemented a prototype of UniTEC and present first results from an ongoing evaluation in our network emulation testbed.
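The layered routing header can be sketched as repeated symmetric encryption, with each mix peeling exactly one layer. XOR with a hash-derived keystream stands in here for the real public-key encryption, and the hop keys and labels are hypothetical, not UniTEC's actual format.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from a hop key (toy cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Apply/remove one encryption layer (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The recipient builds the header inside-out: the innermost layer names the
# final destination; each outer layer is encrypted for the next mix en route.
hops = [b"mix-key-1", b"mix-key-2", b"mix-key-3"]
header = b"deliver-to:recipient"
for key in reversed(hops):
    header = xor_layer(header, key)   # wrap one layer per mix

# The wrapped header can be published (e.g. in a DHT) without revealing the
# recipient. Each mix on the cascade peels one layer and forwards the rest.
for key in hops:
    header = xor_layer(header, key)

assert header == b"deliver-to:recipient"
```

No single mix sees both the sender and the final destination, which is the unlinkability property the abstract attributes to the Chaum mix variation.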