2018
We present CLARC (Cryptographic Library for Anonymous Reputation and Credentials), an anonymous credentials system (ACS) combined with an anonymous reputation system. Using CLARC, users can receive attribute-based credentials from issuers. They can efficiently prove that their credentials satisfy complex (access) policies in a privacy-preserving way, implementing anonymous access control with complex policies. Furthermore, CLARC is the first ACS combined with an anonymous reputation system in which users can anonymously rate services. A user who gets access to a service via a credential also anonymously receives a review token to rate the service. If a user creates more than a single rating, this can be detected by anyone, preventing users from spamming ratings to sway public opinion. To evaluate the feasibility of our construction, we present an open-source prototype implementation.
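To make the double-rating detection concrete, here is a minimal sketch (not CLARC's actual construction, whose review tokens come with zero-knowledge proofs): assume each rating carries a tag derived deterministically from the user's secret and the rated service, so two ratings of the same service by the same user produce the same tag and are publicly linkable. All names and values below are illustrative.

```python
import hashlib

def rating_tag(user_secret: bytes, service_id: str) -> str:
    """Deterministic per-(user, service) tag: rating the same service twice
    with the same secret yields the same tag, so duplicates are visible."""
    return hashlib.sha256(user_secret + service_id.encode()).hexdigest()

def duplicate_tags(ratings):
    """ratings: iterable of (tag, score) pairs; returns tags seen more than once."""
    seen, dupes = set(), set()
    for tag, _score in ratings:
        if tag in seen:
            dupes.add(tag)
        seen.add(tag)
    return dupes

# Hypothetical example: one user rates service "svc-42" twice.
secret = b"user-secret"
ratings = [
    (rating_tag(secret, "svc-42"), 5),
    (rating_tag(b"other-user-secret", "svc-42"), 3),
    (rating_tag(secret, "svc-42"), 1),   # second rating by the same user
]
print(duplicate_tags(ratings))           # the repeated tag is detectable by anyone
```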
Journal of Network and Computer Applications, 2017
Reputation systems are useful for assessing the trustworthiness of potential transaction partners, but they are also a potential threat to privacy, since rating profiles reveal users' preferences. Anonymous reputation systems resolve this issue, but make it difficult to assess the trustworthiness of a rating. We introduce a privacy-preserving reputation system that enables anonymous ratings while making sure that only authorized users can issue ratings. In addition, ratings can be endorsed by other users. A user who has received a pre-defined number of endorsements can prove this fact and be rewarded, e.g., by receiving a "Premium member" status. The system is based on advanced cryptographic primitives such as Chaum-Pedersen blind signatures, verifiable secret sharing, and oblivious transfer.
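As an illustration of how blind signatures let an authority authorize raters without seeing their rating tokens, the sketch below uses textbook RSA blind signatures as a stand-in for the Chaum-Pedersen scheme named in the abstract; the key is toy-sized and all names are purely illustrative.

```python
import hashlib
from math import gcd

# Textbook-sized RSA key (p=61, q=53): purely illustrative, never use in practice.
# The paper uses Chaum-Pedersen blind signatures; RSA is only a stand-in here.
p, q = 61, 53
n = p * q                      # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# The user blinds a rating token before sending it to the authority.
token = b"rating-token-for-service-17"
r = 7                          # blinding factor, coprime to n
assert gcd(r, n) == 1
blinded = (h(token) * pow(r, e, n)) % n

# The authority signs the blinded value without learning the token.
blind_sig = pow(blinded, d, n)

# The user unblinds and obtains an ordinary signature on h(token).
sig = (blind_sig * pow(r, -1, n)) % n
assert pow(sig, e, n) == h(token)      # anyone can verify the rating token
```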
Lecture Notes in Computer Science, 2015
We consider reputation systems where users are allowed to rate products that they purchased previously. To obtain trustworthy reputations, they are allowed to rate these products only once. As long as they do so, the users stay anonymous. Everybody is able to detect users deviating from the rate-products-only-once policy and the anonymity of such dishonest users can be revoked by a system manager. In this paper we present formal models for such reputation systems and their security. Based on group signatures we design an efficient reputation system that meets all our requirements.
Journal of Internet Services and Applications, 2011
Trust plays a key role in enhancing user experience at service providers. Reputation management systems are used to quantify trust based on some reputation metrics. Anonymity is an important requirement in these systems, since most individuals expect not to be profiled for participating in the feedback process. Anonymous Reputation Management (ARM) systems allow individuals to submit their feedback anonymously. However, this solves only part of the problem: anonymous ratings by one individual can still be linked to each other, which lets the system easily build a profile of that individual, and data mining techniques can use the profile to re-identify that individual. We call this the linkability problem. This paper presents an anonymous reputation management system that avoids the linkability problem by empowering individuals to interact with and rate service providers securely and anonymously.
Lecture Notes in Computer Science, 2018
We consider reputation systems in the Universal Composability framework, where users can anonymously rate each other's products, which they purchased previously. To obtain trustworthy, reliable, and honest ratings, users are allowed to rate products only once; everybody is able to detect users who rate products multiple times. In this paper we present an ideal functionality for such reputation systems and give an efficient realization that is usable in practical applications.
Reputation systems are emerging as a promising method for enabling electronic transactions with unknown entities to be conducted with some level of confidence. Consequently the topic of reputation systems attracts increasing attention from players in the e-business industry. However, it seems that up until now privacy issues have not been given a fair treatment in the literature on reputation systems. We identify two different concerns for privacy in reputation systems: one is for the feedback provider and the other is for the feedback target. Possible solutions for each of these concerns are proposed based on electronic cash technology and designated verifier proofs. These are described within a new architecture for managing reputation certificates.
2011
Reputation systems offer web service consumers an effective solution for selecting web service providers who meet their Quality of Service (QoS) expectations. A reputation system computes the reputation of a provider as an aggregate of the feedback submitted by consumers. Truthful feedback is clearly a prerequisite for accurate reputation scores. However, it has been observed that users of a reputation system often hesitate in providing truthful feedback, mainly due to the fear of reprisal from target entities. We present a privacy preserving reputation protocol that enables web service consumers to provide feedback about web service providers in a private and thus uninhibited manner.
2002
The expression of one's opinion through endorsement is one of the simplest methods of democratic participation. The result of an endorsement can be used to evaluate whether a certain subject deserves greater attention. In some cases, the endorsers desire privacy protection. However, conventional paper-based endorsement systems provide neither convenience nor adequate privacy protection for the endorsers. In addition, current electronic anonymous voting schemes are unsuitable for anonymous endorsement. This motivates us to develop an anonymous endorsement system that can be realized on computer networks. The proposed system satisfies completeness, soundness, privacy, unreusability, eligibility, and verifiability. In practice, the proposed system can be integrated with the conventional paper-based endorsement system.
2019
We introduce updatable anonymous credential systems (UACS) and use them to construct a new privacy-preserving incentive system. In a UACS, a user holding a credential certifying some attributes can interact with the corresponding issuer to update his attributes. During this, the issuer knows which update function is run, but does not learn the user's previous attributes. Hence the update process preserves the anonymity of the user. One example of a class of update functions is additive updates of integer attributes, where the issuer increments an unknown integer attribute value v by some known value k. This kind of update is motivated by an application of UACS to incentive systems. Users in an incentive system can anonymously accumulate points, e.g. in a shop at checkout, and spend them later, e.g. for a discount. In this paper, we (1) formally define UACS and their security, (2) give a generic construction for UACS supporting arbitrary update functions, and (3) construct a new incentive system using UACS that is efficient while offering offline double-spending protection and partial spending.
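The additive update described above can be illustrated with the additive homomorphism of a Pedersen-style commitment: the issuer multiplies the user's commitment by g^k and thereby adds k to the hidden attribute without learning it. This is only a sketch of the homomorphic idea with toy parameters, not the paper's full UACS protocol.

```python
import secrets

# Toy prime-order group: the subgroup of quadratic residues in Z_p^* with
# p = 2q + 1 a safe prime (parameters far too small for real use).
p = 2039          # safe prime, so q = 1019 is prime
q = 1019
g = 4             # generator of the order-q subgroup
h = 9             # second generator; its discrete log w.r.t. g is assumed unknown (toy choice)

def commit(value: int, randomness: int) -> int:
    """Pedersen-style commitment C = g^value * h^randomness mod p."""
    return (pow(g, value % q, p) * pow(h, randomness % q, p)) % p

# User commits to a hidden point balance v.
v, r = 42, secrets.randbelow(q)
C = commit(v, r)

# Issuer adds k points without learning v: multiply the commitment by g^k.
k = 5
C_updated = (C * pow(g, k, p)) % p

# The user, who knows v and r, can open the updated commitment to v + k.
assert C_updated == commit(v + k, r)
print("commitment now opens to", v + k)
```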
IFIP Advances in Information and Communication Technology, 2010
Trust and reputation systems in distributed environments attract widespread interest as online communities become an inherent part of the daily routine of Internet users. Several models for trust and reputation have been suggested recently, among them the Knots model. The Knots model provides a member of a community with a method to compute the reputation of other community members. Reputation in this model is subjective and tailored to the taste and choices of the computing member and of members with similar views, i.e. the computing member's Trust-Set. A discussion on privately computing trust in the Knots model appears in [16]. The present paper extends and improves [16] by presenting three efficient and private protocols to compute trust in reputation systems that use any trust-set-based model. The protocols are rigorously proved to be private against a semi-honest adversary under standard assumptions on the existence of a homomorphic, semantically secure public-key encryption scheme. The protocols are analyzed and compared in terms of their privacy characteristics and communication complexity.
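As a rough illustration of how a homomorphic, semantically secure encryption scheme enables private trust aggregation, the sketch below sums ratings under Paillier encryption with toy parameters; the paper's actual protocols and their semi-honest security analysis are more involved.

```python
import math, secrets

# Toy Paillier keypair (parameters far too small for real use).
p, q = 293, 433                     # small primes for illustration
n, n2 = p * q, (p * q) ** 2
g = n + 1                           # standard choice of generator
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # L(g^lam mod n^2)^(-1) mod n

def encrypt(m: int) -> int:
    while True:
        r = secrets.randbelow(n)
        if r > 0 and math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Each rater encrypts a private trust value; the aggregator multiplies the
# ciphertexts, which corresponds to adding the plaintexts.
ratings = [3, 5, 4, 2]
ciphertexts = [encrypt(v) for v in ratings]
aggregate = 1
for c in ciphertexts:
    aggregate = (aggregate * c) % n2
print(decrypt(aggregate), "== sum of ratings", sum(ratings))
```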
2004
Reputation systems have the potential to improve the quality of on-line markets by identifying fraudulent users so that further dealings with these users can be avoided. The behaviour of participants in e-commerce can be recorded and made available to potential transaction partners, who can then decide whether a counterpart is suitable. Unfortunately, current reputation systems suffer from various vulnerabilities; solutions for many of these problems are discussed. One of the major threats is unfair feedback: a large number of negative or positive feedbacks may be submitted about a particular user with the aim of downgrading or upgrading that user's reputation, so that the resulting score no longer reflects the user's true trustworthiness. To overcome this threat, a variation of a Bayesian reputation system is proposed, based on the subjective logic framework of Josang et al. [65]. The impact of unfair feedback is countered through systematic approaches built into the scheme.

Lack of anonymity for participants leads to reluctance to provide negative feedback. A novel solution for anonymity of feedback providers is proposed to allow participants to provide negative feedback when appropriate, without fear of retaliation. The solution is based on several cryptographic primitives: e-cash, designated verifier proofs, and proofs of knowledge. In some settings it is desirable for the reputation owner to control the distribution of its own reputation and to disclose it at its discretion to the intended parties. To realise this, a certificate-based solution is proposed that allows the reputation owner to keep the certificate and distribute its reputation while being unable to alter that information without detection. The proposed solutions cater for both centralised and decentralised reputation systems. The provision of an off-line reputation system is also discussed, using certificates, the delegation concept, and a variant of digital signature schemes known as proxy signatures.

Finally, the thesis presents a security architecture for reputation systems consisting of elements, including privacy, verifiability, and availability, that safeguard reputation systems from malicious activities. The architecture combines two prominent approaches, Bayesian and cryptographic, to provide security for reputation systems, and can serve as a basic framework for further development towards a complete security solution.
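For intuition about the Bayesian (subjective logic) approach the thesis builds on, the following sketch computes the standard beta reputation score, where r positive and s negative feedbacks yield an expected reputation of (r+1)/(r+s+2); the capping step shown is only an illustrative mitigation of unfair feedback, not the thesis's actual countermeasure.

```python
def beta_reputation(positive: int, negative: int) -> float:
    """Expected value of the Beta(positive+1, negative+1) posterior:
    the standard Bayesian reputation score from the beta reputation system."""
    return (positive + 1) / (positive + negative + 2)

# A seller with 48 positive and 2 negative feedbacks.
print(beta_reputation(48, 2))            # ~0.94

# Unfair feedback skews the counts; one simple (illustrative) mitigation is to
# cap how much weight any single rater contributes to the totals.
def capped_counts(feedback_by_rater, cap=3):
    pos = sum(min(f.count(1), cap) for f in feedback_by_rater.values())
    neg = sum(min(f.count(0), cap) for f in feedback_by_rater.values())
    return pos, neg

raters = {"alice": [1, 1, 1], "mallory": [0] * 20}   # ballot-stuffing attempt
print(beta_reputation(*capped_counts(raters)))       # 0.5 instead of a crushed score
```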
2011
Anonymous authentication can give users the license to misbehave since there is no fear of retribution. As a deterrent, or means to revocation, various schemes for accountable anonymity feature some kind of (possibly distributed) trusted third party (TTP) with the power to identify or link misbehaving users.
Computer Security – ESORICS 2016, 2016
This paper presents an anonymous certification (AC) scheme built over an attribute-based signature (ABS). After identifying the properties and core building blocks of anonymous certification schemes, we identify the limitations of ABS in fulfilling AC properties, and we propose a new system model along with a concrete mathematical construction based on standard assumptions and the random oracle model. Our solution has several advantages. First, it provides a data-minimization cryptographic scheme, permitting the user to reveal only the required information to any service provider. Second, it ensures unlinkability between different authentication sessions while preserving the anonymity of the user. Third, the derivation of certified attributes by the issuing authority relies on a non-interactive protocol with an attractive communication overhead.
Electronics
Course evaluations have become a common practice in most academic environments. To enhance participation, evaluations should be private and ensure a fair result. Related privacy-preserving methods and technologies (e.g., anonymous credentials, Privacy Attribute-Based Credentials, and domain signatures) fail to address, at least in an obvious way, the minimal security and practicality requirements. In this paper, we propose, evaluate, and implement an efficient, anonymous evaluation protocol for academic environments. The protocol borrows ideas from well-known and efficient cryptographic approaches for anonymously submitting ballots in Internet elections, for issuing one-time credentials, and for anonymously broadcasting information. The proposed protocol extends these approaches in order to provably satisfy properties such as eligibility, privacy, fairness, and verifiability of the evaluation system. Compared to the state of the art, our approach is less complex and more effective.
2008
This paper presents an efficient anonymous credential system that includes two variants. One is a system that lacks a credential revocation protocol but provides perfect anonymity and unlinkability and computational unforgeability under the strong Diffie-Hellman assumption; it is more efficient than existing credential systems with no revocation. The other is a system that provides revocation as well as computational anonymity, unlinkability, and unforgeability under the strong Diffie-Hellman and decision linear Diffie-Hellman assumptions. This system provides two types of revocation simultaneously: one blacklists a user who acted wrongly so that he can no longer use his credential, and the other identifies a user who acted wrongly from his usage of the credential. Both systems are provably secure under the above-mentioned assumptions in the standard model.
2006
In ubiquitous networks, the multiple devices carried by a user may unintentionally expose information about her habits or preferences. This information leakage can compromise the users' right to privacy. A common approach to increasing privacy is to hide the user's real identity under a pseudonym. Unfortunately, pseudonyms may interfere with the reputation systems that are often used to assert the reliability of the information provided by participants in the network.
Proceedings 2014 Network and Distributed System Security Symposium, 2014
Anonymous credentials provide a powerful tool for making assertions about identity while maintaining privacy. However, a limitation of today's anonymous credential systems is the need for a trusted credential issuer, which is both a single point of failure and a target for compromise. Furthermore, the need for such a trusted issuer can make it challenging to deploy credential systems in practice, particularly in the ad hoc network setting (e.g., anonymous peer-to-peer networks) where no single party can be trusted with this responsibility.
2010
A reputation protocol computes the reputation of an entity by aggregating the feedback provided by other entities in the system. Reputation makes entities accountable for their behavior. Honest feedback is clearly a pre-requisite for accurate reputation scores. However, it has been observed that entities often hesitate in providing honest feedback, mainly due to the fear of retaliation. We present a privacy preserving reputation protocol which enables entities to provide feedback in a private and thus uninhibited manner.
Natural Language Processing, 2021
This paper investigates situations in which Internet access is unavailable in specific areas while users there need instant advice from others nearby. A peer-to-peer network is therefore established by connecting all neighbouring mobile devices so that they can exchange questions and recommendations. However, not all received recommendations are reliable, as users may be unknown to each other. The trustworthiness of advice is therefore evaluated based on the advisor's reputation score, which is stored locally in the user's mobile device. It is not guaranteed that the reputation score is trustworthy if its owner manipulates it for the wrong purposes. A further privacy problem is how the questioning user can honestly audit the reputation score of the advising user. Therefore, this work proposes a security model, named Crystal, for securely managing distributed reputation scores and for preserving user privacy. Crystal ensures that the ...
2013 IEEE International Conference on Communications (ICC), 2013
Reputation systems make it possible to estimate the trustworthiness of entities based on their past behavior. Electronic commerce, peer-to-peer routing, and collaborative environments, to cite just a few, benefit greatly from reputation systems. To guarantee an accurate estimation, reputation systems typically rely on a central authority, on the identification and authentication of all participants, or both. In this paper, we go a step further by presenting a distributed reputation mechanism that is robust against malicious behavior and preserves the privacy of its clients. Guaranteed error bounds on the estimation are provided.
A credential system is a system in which users can obtain credentials from organizations and demonstrate possession of these credentials. Such a system is anonymous when transactions carried out by the same user cannot be linked. An anonymous credential system is of significant practical relevance because it is the best means of providing privacy for users. In this paper we propose a practical anonymous credential system that is based on the strong RSA assumption and the decisional Diffie-Hellman assumption modulo a safe prime product and is considerably superior to existing ones: (1) We give the first practical solution that allows a user to unlinkably demonstrate possession of a credential as many times as necessary without involving the issuing organization. (2) To prevent misuse of anonymity, our scheme is the first to offer optional anonymity revocation for particular transactions. (3) Our scheme offers separability: all organizations can choose their cryptographic keys independently of each other. Moreover, we suggest more effective means of preventing users from sharing their credentials, by introducing all-or-nothing sharing: a user who allows a friend to use one of her credentials once, gives him the ability to use all of her credentials, i.e., taking over her identity. This is implemented by a new primitive, called circular encryption, which is of independent interest, and can be realized from any semantically secure cryptosystem in the random oracle model.
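The all-or-nothing sharing idea can be sketched with a hash-based cipher standing in for the random-oracle construction: every credential key encrypts the user's master identity key, so lending any single credential key exposes the master key. The key names and the cipher below are illustrative simplifications, not the paper's exact circular encryption scheme.

```python
import hashlib, secrets

def ro_encrypt(key: bytes, msg: bytes) -> bytes:
    """One-time encryption with a hash-derived pad, H(key || nonce) XOR msg.
    SHA-256 plays the role of the random oracle in this simplified sketch."""
    nonce = secrets.token_bytes(16)
    pad = hashlib.sha256(key + nonce).digest()[: len(msg)]
    return nonce + bytes(a ^ b for a, b in zip(pad, msg))

def ro_decrypt(key: bytes, ct: bytes) -> bytes:
    nonce, body = ct[:16], ct[16:]
    pad = hashlib.sha256(key + nonce).digest()[: len(body)]
    return bytes(a ^ b for a, b in zip(pad, body))

# All-or-nothing sharing: every credential key encrypts the master identity
# key, so handing a friend any single credential key reveals all of them.
master_key = secrets.token_bytes(32)
credential_keys = {name: secrets.token_bytes(32) for name in ("library", "gym")}
escrow = {name: ro_encrypt(k, master_key) for name, k in credential_keys.items()}

leaked = credential_keys["gym"]                 # user lends one credential key...
recovered_master = ro_decrypt(leaked, escrow["gym"])
assert recovered_master == master_key           # ...which exposes the master key
```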