
Reputation system

Reputation systems are programs that allow users to rate each other in online communities in order to build trust through reputation. Common uses of these systems can be found on e-commerce websites such as eBay, Amazon.com, and Etsy, as well as in online advice communities such as Stack Exchange. These reputation systems represent a significant trend in "decision support for Internet mediated service provisions". With the popularity of online communities for shopping, advice, and the exchange of other important information, reputation systems are becoming vitally important to the online experience. The idea behind reputation systems is that even if consumers cannot physically try a product or service, or see the person providing information, they can still be confident in the outcome of the exchange through the trust built by reputation systems.


Collaborative filtering, used most commonly in recommender systems, is related to reputation systems in that both collect ratings from members of a community. The core difference lies in how the two use this feedback: in collaborative filtering, the goal is to find similarities between users in order to recommend products to customers, whereas the role of reputation systems is to gather a collective opinion in order to build trust between users of an online community.
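
To make the contrast concrete, here is a minimal, illustrative sketch (not from any particular system; all names and data are invented). The reputation function pools the community's feedback into one collective score, while the collaborative-filtering function ranks other users by taste overlap as a basis for recommendations:

    from statistics import mean

    # Toy rating data: {user: {item: score}}; purely illustrative.
    ratings = {
        "alice": {"seller_1": 5, "seller_2": 2},
        "bob":   {"seller_1": 4, "seller_2": 1},
        "carol": {"seller_1": 5},
    }

    def reputation(item):
        """Reputation system: aggregate everyone's feedback into one collective score."""
        return mean(r[item] for r in ratings.values() if item in r)

    def similar_users(user):
        """Collaborative filtering: rank other users by taste overlap,
        the basis for personalized recommendations."""
        mine = ratings[user]
        def overlap(other):
            theirs = ratings[other]
            return sum(1 for item in mine
                       if item in theirs and abs(mine[item] - theirs[item]) <= 1)
        return sorted((u for u in ratings if u != user), key=overlap, reverse=True)

    print(reputation("seller_1"))   # one shared score for everyone: ~4.67
    print(similar_users("alice"))   # per-user neighbours: ['bob', 'carol']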

Online

Howard Rheingold states that online reputation systems are 'computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait'. Rheingold suggests that these systems arose from Internet users' need to gain trust in the individuals they transact with online. The innate human trait he notes is that a social function such as gossip 'keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important'. Internet sites such as eBay and Amazon, he argues, seek to serve this trait, and are 'built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site'.

Reputation banks

The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services. Users can build up reputation and trust in individual systems but do not have the ability to carry it into other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine is Yours (2010) that 'it is only a matter of time before there is some form of network that aggregates reputation capital across multiple forms of Collaborative Consumption'. These systems, often referred to as reputation banks, try to give users a platform from which to manage their reputation capital across multiple systems.

Maintaining effective reputation systems

The main function of reputation systems is to build a sense of trust among users of online communities. Much as in brick-and-mortar stores, trust and reputation can be built through customer feedback. Paul Resnick of the Association for Computing Machinery describes three properties that are necessary for reputation systems to operate effectively:

  1. Entities must be long-lived, so that every interaction creates an expectation of future interactions
  2. Feedback about prior interactions must be captured and distributed
  3. Feedback must be used to guide trust decisions
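
A minimal sketch of how these properties might map onto code (illustrative only, not Resnick's own formulation): entity identifiers persist across interactions, feedback is captured in a per-entity log, and trust is derived from that log.

    from collections import defaultdict

    # Persistent per-entity feedback history (properties 1 and 2).
    feedback_log = defaultdict(list)

    def record_feedback(entity_id, score):
        """Capture feedback about a prior interaction (property 2)."""
        feedback_log[entity_id].append(score)

    def trust(entity_id):
        """Use accumulated feedback to guide trust decisions (property 3)."""
        history = feedback_log[entity_id]
        return sum(history) / len(history) if history else None  # no history, no basis for trust

    record_feedback("seller_42", 1.0)   # positive interaction
    record_feedback("seller_42", 0.0)   # negative interaction
    print(trust("seller_42"))           # 0.5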

These three properties are critically important in building trust, and all revolve around one key element: user feedback. User feedback in reputation systems, whether in the form of comments, ratings, or recommendations, is a valuable piece of information. Without user feedback, reputation systems cannot sustain the environment of trust they are meant to create. Eliciting user feedback presents three related problems.[1] The first is users' willingness to provide feedback when doing so is not required. If an online community has a large stream of interactions but gathers no feedback, the environment of trust and reputation cannot form. The second is eliciting negative feedback from users. Many factors contribute to users' reluctance to give negative feedback, the most prominent being fear of retaliation: when feedback is not anonymous, many users fear reprisal if they rate someone negatively. The final problem is eliciting honest feedback. Although there is no concrete method for ensuring the truthfulness of feedback, if a community of honest feedback is established, new users will be more likely to give honest feedback as well.

Other pitfalls to effective reputation systems described by A. Josang et al. include identity changes and discrimination. These ideas again tie back to regulating user actions in order to gain accurate and consistent user feedback. When analyzing different types of reputation systems, it is important to look at these specific features in order to determine the effectiveness of each system.

Notable examples of practical applications

  • Search: web (see PageRank; a minimal sketch appears after this list)
  • eCommerce: eBay, Epinions, Bizrate, Trustpilot
  • Social news: Reddit, Digg, Imgur
  • Programming communities: Advogato, freelance marketplaces, Stack Overflow
  • Wikis: Increase contribution quantity and quality (Dencheva, Prause & Prinz 2011)
  • Internet Security: TrustedSource
  • Question-and-Answer sites: Quora, Yahoo! Answers, Gutefrage.net, Stack Exchange
  • Email: anti-spam techniques, reputation lookup (RapLeaf)
  • Personal reputation: CouchSurfing (for travelers)
  • Non-governmental organizations (NGOs): GreatNonProfits.org, GlobalGiving
  • Professional reputation of translators and translation outsourcers: BlueBoard at ProZ.com
  • All-purpose reputation systems: Yelp, Inc.
  • Academia: general bibliometric measures, e.g. the h-index of a researcher
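
As promised above, here is a minimal power-iteration sketch of PageRank over a toy link graph (the graph and iteration count are illustrative; real web-scale implementations differ considerably):

    # Toy web graph: page -> pages it links to.
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    nodes = list(links)
    d = 0.85                                   # standard damping factor
    rank = {n: 1 / len(nodes) for n in nodes}  # start from a uniform distribution

    for _ in range(50):                        # iterate until ranks stabilize
        new = {n: (1 - d) / len(nodes) for n in nodes}
        for src, outs in links.items():
            for dst in outs:
                new[dst] += d * rank[src] / len(outs)
        rank = new

    print(rank)  # pages linked to by well-ranked pages earn higher rank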

Reputation as a resource

High reputation capital often confers benefits upon the holder. For example, a wide range of studies have found a positive correlation between seller rating and asking price on eBay, indicating that high reputation can help users obtain more money for their items. High product reviews on online marketplaces can also help drive higher sales volumes.

Abstract reputation can be used as a kind of resource, to be traded away for short-term gains or built up by investing effort. For example, a company with a good reputation may sell lower-quality products for higher profit until its reputation falls, or it may sell higher-quality products to increase its reputation. Some reputation systems go further, making it explicitly possible to spend reputation within the system to derive a benefit. For example, on the Stack Overflow community, reputation points can be spent on question "bounties" to incentivize other users to answer the question.
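
A sketch of what an explicit spending mechanism might look like, loosely in the spirit of the bounty example (the class, the 50-point floor, and the numbers are invented here, not Stack Overflow's actual rules):

    class ReputationAccount:
        """Illustrative ledger in which reputation is both earned and spent."""

        def __init__(self, points=0):
            self.points = points

        def earn(self, amount):
            self.points += amount

        def spend_on_bounty(self, amount, minimum=50):
            """Deduct reputation to fund a bounty; refuse if too small or unaffordable."""
            if amount < minimum or self.points < amount:
                raise ValueError("insufficient reputation for this bounty")
            self.points -= amount
            return amount  # the pool offered to incentivize answerers

    acct = ReputationAccount(points=300)
    bounty = acct.spend_on_bounty(100)
    print(acct.points, bounty)  # 200 100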

Even without an explicit spending mechanism in place, reputation systems often make it easier for users to spend their reputation without harming it excessively. For example, a driver with a high ride-acceptance score (a metric often used for driver reputation) on a ride-sharing service may opt to be more selective about their clientele, decreasing their acceptance score but improving their driving experience. With the explicit feedback provided by the service, drivers can carefully manage their selectivity to avoid being penalized too heavily.
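
The bookkeeping behind such selectivity might look like the following sketch (the 80% floor is an invented threshold, not any real service's policy):

    def acceptance_score(accepted, offered):
        """Fraction of offered rides the driver accepted."""
        return accepted / offered if offered else 1.0

    def can_decline(accepted, offered, floor=0.80):
        """Would declining the next ride keep the score above the floor?"""
        return acceptance_score(accepted, offered + 1) >= floor

    accepted, offered = 90, 100
    print(acceptance_score(accepted, offered))  # 0.9
    print(can_decline(accepted, offered))       # True: 90/101 ~ 0.89 >= 0.80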

Attacks and defense

Reputation systems are in general vulnerable to attacks, and many types of attacks are possible. Because a reputation system must produce accurate assessments under conditions that include unpredictable user populations and potentially adversarial environments, attack and defense mechanisms play an important role in its design.

Attacks on reputation systems are classified by identifying which system components and design choices they target, while defense mechanisms are drawn from existing reputation systems.

Attacker model

The capability of the attacker is determined by several characteristics, e.g., the location of the attacker relative to the system (insider attacker vs. outsider attacker). Insiders are entities that have legitimate access to the system and can participate according to the system specifications, while an outsider is any unauthorized entity in the system, who may or may not be identifiable.

Because outsider attacks closely resemble attacks in other computer-system environments, insider attacks receive more attention in work on reputation systems. Two assumptions are common: attackers are motivated by either selfish or malicious intent, and attackers can work either alone or in coalitions.
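
This taxonomy can be summarized as data; the following sketch is illustrative, with the class and field names invented here:

    from dataclasses import dataclass
    from enum import Enum

    class Location(Enum):
        INSIDER = "legitimate access, acts within system specifications"
        OUTSIDER = "unauthorized, may or may not be identifiable"

    class Intent(Enum):
        SELFISH = "inflate own standing"
        MALICIOUS = "harm others or the system"

    @dataclass
    class Attacker:
        location: Location
        intent: Intent
        coalition: bool  # acting alone (False) or in coordination (True)

    # E.g., a Sybil operator: an insider, selfishly motivated, using many identities.
    sybil_operator = Attacker(Location.INSIDER, Intent.SELFISH, coalition=True)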

Attack classification

The attacks against reputation systems are classified based on the goals of the reputation systems targeted by the attacks.

  • Self-promoting Attack. The attacker falsely increases their own reputation. A typical example is the so-called Sybil attack, in which an attacker subverts the reputation system by creating a large number of pseudonymous entities and using them to gain a disproportionately large influence. A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically (a numerical sketch follows this list).
  • Whitewashing Attack. The attacker exploits some vulnerability of the system to refresh their reputation. This attack usually targets the formulation the reputation system uses to calculate the reputation result. A whitewashing attack can be combined with other types of attacks to make each one more effective.
  • Slandering Attack. The attacker reports false data to lower the reputation of the victim nodes. It can be carried out by a single attacker or by a coalition of attackers.
  • Orchestrated Attack. The attacker combines several of the above strategies in a coordinated campaign. One famous example of an orchestrated attack is known as an oscillation attack.
  • Denial of Service Attack. The attacker prevents the calculation and dissemination of reputation values by using denial-of-service methods.
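
As promised in the self-promoting entry above, here is a numerical sketch of why cheap identities matter. Five genuine raters are outvoted by twenty Sybil pseudonyms under plain averaging, but not when input is weighted by a (here invented) trust value standing in for a chain of trust to known-good entities:

    genuine = [(1.0, 2)] * 5    # (trust weight, rating): honest raters give a 2
    sybils  = [(0.0, 5)] * 20   # untrusted pseudonyms all rate a 5
    votes = genuine + sybils

    unweighted = sum(r for _, r in votes) / len(votes)
    weighted = sum(w * r for w, r in votes) / sum(w for w, _ in votes)

    print(unweighted)  # 4.4 -- Sybils dominate when every entity counts equally
    print(weighted)    # 2.0 -- trust-weighted input resists cheap identities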

Defense strategies

Here are some strategies for preventing the above attacks; a sketch of two of them follows the list.

  • Preventing Multiple Identities
  • Mitigating Generation of False Rumors
  • Mitigating Spreading of False Rumors
  • Preventing Short-Term Abuse of the System
  • Mitigating Denial of Service Attacks
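
A sketch of two of these strategies in code (all thresholds are invented for illustration): an entry cost raises the price of creating multiple identities, and a minimum account age blunts short-term abuse such as whitewashing through re-registration.

    import time

    ENTRY_FEE = 5.0                  # cost per account deters mass identity creation
    MIN_ACCOUNT_AGE = 7 * 24 * 3600  # feedback only counts after one week

    def admit_account(paid_fee):
        """Preventing multiple identities: make each new identity costly."""
        return paid_fee >= ENTRY_FEE

    def feedback_counts(account_created_at, now=None):
        """Preventing short-term abuse: ignore ratings from brand-new accounts."""
        now = time.time() if now is None else now
        return now - account_created_at >= MIN_ACCOUNT_AGE

    print(admit_account(paid_fee=5.0))                      # True
    print(feedback_counts(account_created_at=time.time()))  # False: too new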