services should have an opportunity to observe their function and to develop trust
beliefs with respect to such systems [ 48 ]. As a system becomes trusted, people
may monitor it less frequently [ 56 ], hence reducing the cognitive load necessary
to continuously evaluate its functionality. As a result, much as when evaluating
the capabilities of other individuals, positive evidence may start to carry greater weight.
Trust in such systems may continue to grow even if occasional faults occur.
However, if the overall reliability of the system declines, trust will also decline
over time. Consequently, if sporadic faults can have a major impact on the system's
functionality, the design must take into account that these faults may not be
correctly perceived by its users.
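The dynamics described above can be illustrated with a simple hypothetical model (not taken from the literature cited here): trust is nudged toward each observed outcome, and once trust is high, isolated negative evidence is discounted, modeling users who monitor a trusted system less closely. The function name and rates below are illustrative assumptions.

```python
def update_trust(trust, outcome, base_rate=0.1):
    """Move trust toward the observed outcome (1.0 = success, 0.0 = fault).

    When trust is already high, a single fault is partially discounted,
    reflecting reduced monitoring of a trusted system; sustained faults
    still pull trust down over time.
    """
    rate = base_rate
    if outcome < trust:            # negative evidence
        rate *= (1.0 - trust)      # discounted more as trust grows
    return trust + rate * (outcome - trust)

trust = 0.5
for outcome in [1, 1, 1, 1, 0, 1, 1]:   # mostly reliable, one sporadic fault
    trust = update_trust(trust, outcome)
peak = trust                             # trust grew despite the fault

for outcome in [0, 0, 0, 0, 0]:          # sustained reliability decline
    trust = update_trust(trust, outcome)
# trust now falls from its peak, though more slowly than it rose
```

The asymmetry in the update rate captures the observation that sporadic faults may barely register with a trusting user, while a genuine decline in reliability eventually erodes trust.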
There is ongoing discussion regarding whether machines will ever be perceived
as having positive intentions towards their users [ 34 ]. People often attribute social
features to the systems they use, by considering them to be friendly and competent
[ 58 ]. However, brain activation patterns suggest that person-machine relationships
are perceived differently from person-person relationships. Hence, it is not clear to
what degree an analogy with person-to-person trust relationships can be drawn
when designing trustable systems.
Furthermore, systems incorporate the intentions of those who built them. The
trust for different entities plays a role in judging the benevolence of their products.
For example, suppose Alice learns that her phone is recording all her activities and
sending them to the device manufacturer. The system still has integrity, since this
behavior turns out to be part of its design. However, Alice now knows that the
system may expose very sensitive information about her to third parties, and she
doubts how benevolent the design is. Overall, there is a component of benevolence
in system design that cannot be disregarded.
Reputation is one of the most widely studied trust concepts. Reputation in social
networks corresponds to what is generally said or believed about a person's
character or standing. A person's reputation is public, determined by the common
opinion of others in her social group. It indicates an expectation of adherence to
the common norms of behavior in the given social group. One's actions, opinions
of others, and recommendations in the group all contribute to one's reputation. In
many open systems like e-commerce sites, people interact with others that they do
not know beforehand. When one does not have firsthand knowledge about a person,
his reputation can serve as a basis for trust. This works only if one's reputation
is damaged when he misbehaves. Reputation is relatively straightforward in face-to-face
networks, but becomes more challenging in online systems, where people
can create new identities easily. If a person cheats in one transaction and ruins the
reputation of an identity, he can easily create another one. Reputation management
systems aim to improve this situation by providing useful information about trustees
that can be used to form beliefs about their trustworthiness.
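A minimal sketch can make the cheap-identity problem concrete. The class and names below are hypothetical, not a real system: ratings from past transactions are aggregated per identity, and an identity with no history falls back to a neutral prior, which is exactly what lets a cheater shed a damaged reputation by re-registering.

```python
from collections import defaultdict

class ReputationSystem:
    """Toy reputation manager: average of past ratings per identity."""

    def __init__(self):
        self.ratings = defaultdict(list)   # identity -> ratings in [0, 1]

    def rate(self, identity, score):
        self.ratings[identity].append(score)

    def reputation(self, identity, prior=0.5):
        """Average rating; unknown identities get a neutral prior."""
        scores = self.ratings[identity]
        return sum(scores) / len(scores) if scores else prior

rs = ReputationSystem()
for _ in range(3):
    rs.rate("mallory", 1.0)   # builds a good reputation honestly
for _ in range(4):
    rs.rate("mallory", 0.0)   # then repeatedly cheats

damaged = rs.reputation("mallory")    # 3/7, below the neutral prior
fresh = rs.reputation("mallory2")     # unknown identity: prior of 0.5
```

Since the fresh pseudonym's neutral prior exceeds the damaged score, abandoning the old identity is strictly advantageous, which is why online reputation systems need identities to be costly to create or to carry history across them.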