craigconnects: Connecting the World for the Common Good

Trust and reputation systems: redistributing power and influence

People use social networking tools to figure out whom they can trust and rely on for decision making. By the end of this decade, power and influence will shift largely from people with money and nominal power to people with the best reputations and trust networks. That is, peer networks will confer legitimacy on people emerging from the grassroots.


This shift is already happening, gradually creating a new power and influence equilibrium with new checks and balances. It will seem dramatic when its tipping point occurs, even though we're living through it now.

Everyone gets a chance to participate in large or small ways, giving a voice to what we once called "the silent majority."

(Okay, I started with the bottom line. The following is a relatively brief summary of how I got there, deserving much longer treatment from really smart people.)

When we need help with decision making, we get recommendations from people we trust, that trust built on some combination of personal experience and reputation. That's the way humans work, nothing new about that. We talk about reputation being one's greatest asset.

Reputation is contextual; that is, you might trust someone when it comes to dry cleaners but not politics. However, I'm going to simplify things by avoiding that issue for now. (Yes, that might be short-sighted on my part.) I'll also defer a prerequisite, the need for persistent and verifiable identity: the need to prove you are who you say you are.

In real life, personal networks are pretty small, maybe in the hundreds. Mass media plays a role in shaping reputation for a small number of people, including celebrities and politicians. A very small number of people have influence in this environment.

Internet culture and technology changes this dramatically:

  • people tend to work with each other
  • people are normally trustworthy
  • despite their large media footprint, there aren't many bad guys out there
  • message spreads quickly
  • message is persistent, it's there forever
  • connectivity is increasingly pervasive
  • people are finding that reputation and recommendation systems can be used to drive a lot of profit.

Okay, so we want to be able to see who we might be able to trust, maybe by seeing some explicit measurement, or maybe something implicit, like seeing their history, and who trusts them.

We already see various forms of reputation and recommendation systems evolving, often mixing in pre-selected experts or professionals. Amazon and Consumer Reports Online do a good job of this. (Disclaimer: I'm on the board of Consumers, since their record for integrity is close to perfect.)

Wikipedia does a very good job of this, mostly by having lots of people keep an eye on articles, particularly the more controversial ones. There are ongoing issues, being addressed in good conscience as people develop new methods to improve information quality and reliability.

We also see reputation and influence created by persistent works, reflected in social networking sites including Facebook, LinkedIn, and Google Social. Such systems show history and context, which play into trust, and display connections to other people. Those connections are normally not trust relationships but "weak ties," which also play into trustworthiness.


Cory Doctorow postulates a kind of trustworthiness currency called "whuffie". You trust someone, maybe want to reward them for something, you give them points. It turns out there's an experimental repository of whuffie, thewhuffiebank.org. While this sounds facetious, it's a very simple solution to the complex problem of tracking trust.
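As a thought experiment, the point-giving scheme described above could be tracked with a ledger as simple as the following sketch. The class and method names here are my own illustration, not anything actually used by thewhuffiebank.org:

```python
from collections import defaultdict

class WhuffieLedger:
    """Toy repository of whuffie-style trust points (names are hypothetical)."""

    def __init__(self):
        self.grants = []                 # history of (giver, receiver, points)
        self.balance = defaultdict(int)  # accumulated points per person

    def give(self, giver, receiver, points):
        # Record who rewarded whom, and by how much.
        self.grants.append((giver, receiver, points))
        self.balance[receiver] += points

    def reputation(self, person):
        # A person's standing is just the total points granted to them.
        return self.balance[person]

ledger = WhuffieLedger()
ledger.give("alice", "bob", 5)
ledger.give("carol", "bob", 3)
# bob has now accumulated 8 points
```

Keeping the full grant history, rather than only balances, is what makes such a repository auditable: anyone can later ask who conferred the points.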

The most prominent experiment in directly measuring trust is Unvarnished, very recently launched in beta form. You rate how much you trust specific individuals, and they might rate you. Unvarnished is pretty controversial and is already attracting a lot of legal speculation. If they can address the problems related to the trustworthiness of the information they receive, they might become very successful.

The last raises an issue all such systems share: they might be very easy to game. Any such system is vulnerable to disinformation attacks, wherein sufficiently smart people figure out how to fake good or bad ratings. A number of very successful groups are really good at such disinformation in conventional media; they're often called "front groups," "influence peddlers," or "astroturfers." One good watchdog over such groups is the Center for Media and Democracy.

One metric of trust is transitive; that is, it factors in the trustworthiness of the people who trust someone. If person A trusts you, and person B trusts A, then that might affect how one measures your trustworthiness. However, that gets really complicated when the web of trust involves seven billion people, or even a few thousand. It's a research problem.
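To make the research problem concrete, here's a minimal sketch of how transitive trust might be propagated over a web of trust, in the spirit of PageRank-style algorithms. The damping factor and iteration count are illustrative assumptions, not anything proposed in this post:

```python
# Minimal sketch of transitive trust propagation over a web of trust.
# Each person's score blends a small base level with the scores of the
# people who trust them, so trust from well-trusted people counts more.

def propagate_trust(trusted_by, damping=0.85, iterations=50):
    """trusted_by maps each person to the list of people who trust them."""
    people = set(trusted_by)
    for trusters in trusted_by.values():
        people.update(trusters)
    # Count how many people each person trusts (their out-degree),
    # so each person's influence is split among those they trust.
    out_degree = {p: 0 for p in people}
    for person, trusters in trusted_by.items():
        for t in trusters:
            out_degree[t] += 1
    score = {p: 1.0 / len(people) for p in people}
    for _ in range(iterations):
        new_score = {}
        for p in people:
            inherited = sum(score[t] / out_degree[t]
                            for t in trusted_by.get(p, []))
            new_score[p] = (1 - damping) / len(people) + damping * inherited
        score = new_score
    return score

# B is trusted by A; C is trusted by both A and B, so C should rank highest.
scores = propagate_trust({"B": ["A"], "C": ["A", "B"]})
```

Even this toy version hints at the scaling problem the post mentions: real webs of trust are huge, sparse, and constantly changing, and an attacker who controls many fake identities can distort exactly this kind of computation.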

How do we trust the custodians of trustworthiness? We need to have some confidence that they're not fiddling the ratings, that they're reasonably secure. After all, trust and reputation are really valuable assets.

I think the solution lies in a network of trust and reputation systems. We're seeing the evolution of a number of different ways of measuring trust, which reflects a human reality; different people think of trust in different ways.

Also, the repositories of trust information are the banks in which we store this big asset. As with any bank, holding a lot of this kind of currency confers a lot of power on them. Having some competition provides some checks and balances.

We need to be able to move around the currency of trust, whatever that turns out to be, like we move money from one bank to another. That suggests the need for interchange standards, and ethical standards that require the release of that information when requested. Perhaps there's a need for new law in this area.
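To illustrate what such an interchange standard might look like, here's a purely hypothetical portable trust record. Every field name below is an assumption of mine, sketched only to show how the "currency of trust" could move from one repository to another:

```python
import json

# Hypothetical portable trust record; this is not an existing standard,
# just a sketch of the kind of data an interchange format might carry.
record = {
    "subject": "alice@example.org",      # whose reputation this describes
    "issuer": "repository-a.example",    # the trust bank exporting it
    "score": 0.82,                       # normalized trust score
    "context": "consumer-reviews",       # trust is contextual, so say which
    "issued_at": "2010-04-12T00:00:00Z", # when the score was computed
}

exported = json.dumps(record)   # serialized on the way out of repository A
imported = json.loads(exported) # parsed on arrival at repository B
```

The interesting design questions are exactly the ones the post raises: how repository B verifies the issuer's signature, and whether law or ethics obliges repository A to release the record on request.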

Restating the bottom line: we are already seeing a shift in power and influence, a big wave whose significance we'll see by the end of this decade. Right now, it's like the moment before a tsunami, when the water is drawn away from the shore; it's time to get ahead of that curve.


40 Responses to Trust and reputation systems: redistributing power and influence

Charles H. Green says:

Craig,
Great post, thanks. I think you're doing as good a job as anyone out here thinking through the networking of trust systems.
But I am skeptical of the network-scalable solutions to three issues–all of which you note–and another issue you don't mention.
1. The inherently personal nature of trust: outside of a trust characteristic like dependability, trust relies on things like understanding of motives–and motives don't reveal themselves well.
2. The game-ability of trust ratings systems. Rapleaf started out that way and was immediately flooded by the self-serving "I'll link to you if you link to me" patterns that one can also see in twitter; it's a hard problem to crack without resorting to Wikipedia-like overseers.
3. The transitivity of trust. Trust, like love and good red wine, doesn't travel (transit) well. If A trusts B and B trusts C it very much does NOT follow that A trusts C.
4. The issue you don't mention is the requirement for trust to have an object to be meaningful. It isn't enough to say you trust someone, you must answer "trust them to do what?"
It is one thing to trust Amazon to pick my books for me; quite another to have Amazon suggest a date, or a financial advisor.
Look at the silliness that is TweetLevel, which purports to quantify and rate the trustworthiness of various people. You'll find that (on a given day) the New York Times outranks CNN, but that Perez Hilton tops them both! What possible sense does that make?
The issues you're raising are important–verification of identity, security. But that's a subset of trust issues.
When you move to these other aspects of trust, the analogy of a trust bank no longer works. We don't transfer intimacy or a sense of security that someone cares about us in the same way. It isn't transitive, and we don't store it up. Instead, it gets established in the interaction itself by things like listening and empathizing.
Similar to what George Burns once said: "The most important thing in life is sincerity; if you can fake that, you've got it made." Ditto with much of trust; the systems you're talking about don't come anywhere near faking sincerity, and without that, a big chunk of trust is not at play.

Scott says:

Excellent analogy of the tsunami, in part because it's right when things seem tranquil that you are in big trouble.
And I love that quote from your first commenter from George Burns.

Craig Newmark says:

Charles, thanks! I appreciate all this; it all deserves much longer treatment. Some of it I deliberately neglected, since this piece is already longer than I want. I will be listening to people, figuring out what makes sense, then will do some combination of writing a little and mostly deferring to others smarter than me.
Craig

Daveangulo says:

You hit the nail on the head with this statement: "That is, peer networks will confer legitimacy on people emerging from the grassroots."
The solutions you propose are easily gamed because they are so divorced from reality. IRL you don't get together with your friends and vote on who is the expert on televisions, cars, or computers.
These experts, trust agents, influencers emerge because they contribute useful information to conversations around those topics and over time people have had a good experience following their advice.
So, if you want to find those people in the online world, you can look at an individual's contributions, the amount of buzz they've generated, and their reach in the context of a specific topic. By restricting the context you can find whom the peer group has conferred legitimacy upon.
That's the theory behind http://spotinfluence.com, or we could be wrong. We're just excited that these conversations are occurring.

Ric says:

This is a very astute observation, Craig, and I think a solid foundation to build from. Accountability is so key that it is the sole reason why successful networks like LinkedIn and FB work. You have to put the real you out there to receive any sort of benefit from the system. And as such, it makes you somewhat accountable. This underscores the need for authentication. If you use a system to verify that you are you in the community space and make yourself known by your actual ID, then your clout as a trust member is highest of all. Only when the anonymous or unverified are relegated to the lowest ranks of trust standing will a 2.0 system like this have a chance to succeed.
As with trust in the real world, I better know you and you know me before that interaction can develop.

Mark Essel says:

Trust is built up slowly over time. After many months of following, reading, and conversing with people I meet online I can't help but build up faith in an individual. There are a dozen or so people I have interacted with online that I trust as much as close friends simply by reading and sharing messages. It makes perfect sense that there should be some way for me to exhibit that trust (I do so with shared links from my blog now).


Guy Martin says:

Craig,
How does 'work product' fit into this thinking? For example, in software dev communities (let's pick the Linux kernel as a fine example), your reputation and trust is built mainly on your contributions, and how they fit into the overall direction of the project.
While there are 'social' aspects to the trust meter, the majority of the trust factor is in how good your contributions are.
I understand that might not always fit in the models you are describing, but it would seem that in cases where something tangible is delivered, the 'squishy' trust factor can be augmented by the (slightly) more objective 'deliverable'.
Regardless, this whole problem space is an interesting intersection of technology and psychology. :)

Gaurav Bhalla says:

Thanks Craig, for a provocative piece. No doubt about the value of trust and authenticity. Still plenty to be worked out on how this currency will evolve, disseminate, and be adopted. And whether it will be more just and equitable than the one it intends to replace.
Cheers!

Ross Dawson says:

Hi Craig, great piece and I very much look forward to hearing how your thinking develops on the mechanics of distributed trust systems.
While I would love distributed systems to develop rapidly, I think realistically it's going to take a long time, after the rise of individual reputation systems. I've written some of my thoughts on this here: http://bit.ly/cadNou

R_macdonald says:

Loved your thoughts on the import of elaborating reliable and hugely scalable systems for deducing trust and reliability in this pre-tsunami inundation phase of web-wide integration of real-time communities.
You, and others with parallel interests, might appreciate the work of one of the more sophisticated & experienced strategists in the web-reputation space, F. Randall Farmer.
Tim O'Reilly's group just published Randy's "Building Web Reputation Systems."
http://oreilly.com/catalog/9780596159801/
I found the book most insightful. Randy's background includes 30+ years of managing innovative online communities and games.
You may find his blog of interest as well:
http://buildingreputation.com/
At the risk of sounding like his publicist, I'm just a fan of his work. You might also be interested that Randy and his co-author are presenting "Designing Reputation Systems" at this year's Web2.0 Expo on May 4.

Lee Semel says:

Trust can't be purely mathematical and algorithmic — trust is both personal and specific. You don't simply trust a person, you trust them FOR something. You trust person A is good with money, person B delivers on time, and person C is fun to be with. You can't boil these down into a universal trust or "reputation score" because they are all really different things. Any trust network is going to have to take this into account, and there will probably end up being separate ones for different domains of life. They will work differently — right now, I could build up trust with a programmer by looking at their work on Github, with an author by reading the books and articles, and with a company by reports from other customers or by doing business with them over time. Successful trust systems could evolve in each of these domains by making this process of building up trust less time consuming.

Pownum says:

Craig,
Excellent piece. Unvarnished looks interesting. We're launching something called pownum on April 20th that will do the same thing for organisations.
Until now it was the companies with the deepest pockets that could shout loudest.
pownum (short for 'power in numbers') will enable people (i.e. the 'silent majority') to come together to rate organisations and encourage them to change. This will all be in the public eye and will bring about much-needed transparency.
Thanks again and there's more info here if you're interested: http://pownum-blog.blogspot.com/
Marty.

Mike Ricard says:

Reputation and trust building may be hard to qualify in an open system like the web, but it is possible within a business network like an Enterprise 2.0 community. I have seen some people rapidly gain reps from the quality of the contributions they have made using social tools.
Building trust can work well within a limited network – the real trick will be to do it out on the web.

Jacques Werth says:

It is entirely possible for two people to meet and develop a deep emotional linkage characterized by mutual trust and respect. It can happen within twenty minutes.
To learn how it is done read "High Probability Selling." To learn why it works read "Power vs. Force" by Dr. David Hawkins.

John Hamer says:

Great post, Craig. Here's another idea that's gaining support: "The TAO of Journalism — Transparent, Accountable, Open." See http://www.taoofjournalism.org, our beta site. We propose that anyone practicing journalism of any kind — mainstream, independent, blog, hyperlocal, etc. — voluntarily take the "TAO Pledge" to be Transparent, Accountable and Open. Those three principles are crucial to earning Trust. Pledgers will post the "TAO Seal" on their sites as a signal to readers, viewers and listeners that they'll be TAO. It's no panacea, but it could help. We already have several signed up, including The Banyan Project and Spot.Us. Many people online are already totally TAO, so taking the pledge and displaying the seal is a no-brainer. Want to sign up? Let us know and we'll put you on the list. Just TAO it!

CoCreatr says:

Trust and reputation -YES. … Systems – MAYBE. These are important, as in a visual by Gistics, "Fifth Era of Trust Networks" http://bit.ly/9imB5C
As collaborators said above, we build trust, it takes time to see consistency, it is personal, object-oriented and context sensitive.
What all the social media tools basically do is reduce time and space needed for the building of trust. Which helps us recognize patterns faster. Does this mean we the people are the basic trust engine?
Considering systems, we start over: we try to build trust, find ways to justify the trust – now in the system. While we may feel personally ready to place a certain amount of trust in others, we now face the challenge to teach this to a machine.
Great visionaries and experimenters lead us, as usual. Thanks, Craig. Thanks Venessa for tweeting http://twitter.com/VenessaMiemis/status/11794552311

michael webster says:

1. The thesis appears false on its face: people with money and nominal power never had significant influence as compared to those who we trusted.
2. Many people may have significant influence without being trusted at all; call this the Nixon phenomenon.
3. We need a single example of the thesis instead of a general discussion of reputation.

Masanori Fujimura says:

I was very impressed by your insights and critical eye toward trust and reputation systems in peer networks.
I translated the blog post into Japanese today. I would like to share this Japanese edition with my folks on Twitter and with other Japanese readers who are interested in social media.
Please allow me to put the Japanese version of your blog on my site:
http://bugsworks.blogspot.com/2010/04/craig-newmark.html
Please also let me know if you don't want it; I will remove the page at once.

Craig Newmark says:

Masanori, thanks! I've checked the Google translation of your page, and it looks good to me; very much appreciated! (I only know a few words of Japanese, most of which are names of fish.)
However, looks like Teppingupointo is tipping point.
Thanks very much, appreciated!
Craig

Joan Boyd says:

Craig,
Here’s my belated response. You’ve raised challenging concepts about trust, power and influence. Let me play devil’s advocate.
PEOPLE TEND TO WORK WITH EACH OTHER
True, but the contingencies are very numerous: the stakes or self-interest, balance of power, personalities, gender . . .
PEOPLE ARE NORMALLY TRUSTWORTHY
True, but the exceptions are endless.
For example, Roosevelt was revered by my family and the majority of people; he did great things for the country. But now we understand that he was preparing our country for WWII long before the Pearl Harbor attack which wasn’t such a surprise as claimed. FDR ran for a fourth term and won even though he was too ill to hold office. And all those around him knew that. No one challenged him.
Robert McNamara was known as a whiz kid, and we all thought the DoD was in good hands. But he made bad decisions. He was the wrong man at the wrong time for the wrong job, i.e., wrong context.
Political correctness and group think interfere with judgment and trustworthiness.
What about the Polish airliner crash in Russia? One would presume that the pilot of the plane was trustworthy, given his extraordinary responsibility. Did someone else command him to land? Was this an issue of balance of power? Personality? Time frame? Did others decline to speak up?
DESPITE THEIR LARGE MEDIA FOOTPRINT, THERE AREN'T MANY BAD PEOPLE.
In the enormous military-corporate-congressional-academic complex, are there any bad guys? In the Wall Street scandal? How did the rich/poor divide become so large? Are politicians who make decisions/vote for the purpose of campaign contributions and votes bad? How about people who cheat on taxes? Students who cheat? Bullies in school? Employees who steal from the office?
Re the military-industrial complex, there are two changes being discussed with some seriousness:
1) There is intent to decrease the DoD’s annual budget and increase the State Department’s budget which will expand the Agency for International Development. This will help reduce the DoD’s influence, hopefully.
2) Also, there is intent to increase the number and percentage of women of all ranks in the military. Women are less likely to go to war.
CONNECTIVITY IS INCREASINGLY PERVASIVE
This has got to be good, but I am waiting to see more solid evidence of benefits.
PEOPLE ARE FINDING THAT REPUTATION AND RECOMMENDATION SYSTEMS CAN BE USED TO DRIVE A LOT OF PROFIT
Does Goldman Sachs stand as an example? Just kidding!

AlexPitson says:

Trust is a funny thing. One of the most powerful ways to breach trust is to behave inconsistently with promises or verbalisations. People trust people who do what they say they will do, or who sell products which do what they claim to do.

