sellers incorporate a forum where users can freely rate and review the products.
As with judgments of capability in Sect. 3.2, more information adds more value.
In fact, researchers have investigated whether it is beneficial to plant fake product
reviews and remove bad user reviews to inflate the site's reputation artificially.
The theoretical results are interesting: sellers optimize their benefits when all
reputation systems contain only true feedback [18]. However, since some sites
will inevitably manipulate their online forums, a site that does not follow suit will
suffer. On the other hand, signaling can also help in judging the trustworthiness of e-commerce
sites: sites that have online forums are perceived to have more transparency and are
considered more trustworthy compared to sites that do not support such forums.
A trustworthy reputation system should be robust, meaning that it should
be hard to manipulate a person's reputation through an attack by a small number
of individuals. Josang and Golbeck discuss the types of attacks possible on reputation
systems [39]. Among these are attacks in which a person builds a good reputation
and then uses it to impact someone else's reputation (playbook), a group of individuals
gives bad marks to a single seller to damage her reputation (collusion), or a bad seller
leaves the system and re-enters under a new identity (re-entry). A good reputation
system must reduce the impact of these types of attacks as much as possible.
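One design lever for such robustness is the choice of aggregation function. The following minimal sketch (not from the text, and simpler than any deployed system) illustrates why: a median-based score resists a small colluding group of raters far better than a plain average.

```python
def mean_score(ratings):
    """Plain average: every rating, honest or not, shifts the score."""
    return sum(ratings) / len(ratings)

def median_score(ratings):
    """Median: a minority of extreme ratings barely moves the score."""
    s = sorted(ratings)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

honest = [4, 5, 4, 5, 5, 4, 5, 4]   # genuine buyers of one seller
colluders = [1, 1, 1]               # small group attacking that seller

all_ratings = honest + colluders
print(mean_score(honest))           # 4.5
print(mean_score(all_ratings))      # dragged down noticeably
print(median_score(all_ratings))    # 4 -- barely moves
```

Real systems combine such robust statistics with rate limits and identity costs (to blunt re-entry), but the aggregation choice alone already changes how expensive a collusion attack is.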
In addition to economic transactions between peers mediated by a third-party
system, reputation methods have been used in purely peer-to-peer systems [1, 76, 78]
that facilitate interaction between peers distributed over a network (e.g., BitTorrent).
In such systems, there is no central authority that contains all the relevant trust
information about the participants. The computing platform provides the capability
to communicate with peers, perform tasks (e.g., file sharing), and transmit infor-
mation about the success of operations and possible trust evaluations. The users
compute trust for their peers based on the information available to them. As a result,
the computed trust may differ from peer to peer. This subjective value provides a
different measurement than reputation, which is a global value. In both cases, trust
values are computed to protect the peers from harm by making sure that attacks
cannot result in unwarranted trust or reputation values.
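The subjectivity of such local trust values can be made concrete with a small sketch. The success-ratio estimator below is an illustrative assumption, not a method from the text: each peer computes trust only from its own interaction history, so two peers can legitimately hold different trust values for the same node.

```python
class Peer:
    """A peer that tracks its own interaction outcomes with other peers."""

    def __init__(self, name):
        self.name = name
        self.history = {}  # other peer's name -> (successes, total)

    def record(self, other, success):
        """Log one interaction (e.g., a file-sharing transfer) with `other`."""
        s, t = self.history.get(other, (0, 0))
        self.history[other] = (s + (1 if success else 0), t + 1)

    def trust(self, other):
        """Subjective trust: fraction of successful interactions observed locally."""
        s, t = self.history.get(other, (0, 0))
        return s / t if t else 0.5  # no data: fall back to a neutral prior

alice, bob = Peer("alice"), Peer("bob")
# Alice's transfers with "carol" mostly succeed; Bob's mostly fail.
for ok in (True, True, True, False):
    alice.record("carol", ok)
for ok in (False, False, True):
    bob.record("carol", ok)

print(alice.trust("carol"))  # 0.75 -- one subjective, local view
print(bob.trust("carol"))    # a lower value: a different view of the same peer
```

A global reputation system would instead aggregate all such observations at a central authority into one value per peer; the point of the sketch is precisely that no such authority exists here.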
Overall, reputation is an institutional mechanism. A trustor visiting a site will
rely on its reputation mechanism to decide to which degree a trustee is trustworthy
(unless she has previous experience with the trustee). Many sites have become
successful due to the reliability of their specific reputation mechanisms, such as
Amazon and eBay. Reputation mechanisms have also been employed in sites for
sharing information, e.g., Slashdot, in which a user with good reputation is expected
to post high quality and correct information.
Ranking Web Pages
One of the earliest uses of trust, even though it was not explicitly called such,
was in the ranking of web pages in response to a query. Before the now-famous
link analysis algorithms like PageRank [9] became commonplace, the information