On Trust (part I)
Trust is a big topic. There are a number of definitions out there related to different domains; everything from the famous Ronald Reagan “trust, but verify” dictum applied to international arms treaties, to “trusted computing,” which is not about whether something is trustworthy, but whether it behaves predictably from a software/content provider’s point of view (and not necessarily the user’s!).
In general, most people tend, of course, to assume that “trust” means what the dictionary says: someone or something on which you can depend.
But let’s take a look at what the first seven definitions in the dictionary actually say:
- Firm reliance on the integrity, ability, or character of a person or thing.
- Custody; care.
- Something committed into the care of another; charge.
- The condition and resulting obligation of having confidence placed in one: violated a public trust.
- One in which confidence is placed.
- Reliance on something in the future; hope.
- Reliance on the intention and ability of a purchaser to pay in the future; credit.
Common threads emerge in these definitions: a looking-forward into the future, a sense that something of value is being held by someone else, and that somehow integrity and character are related to these two factors. In short, that integrity can be measured as whether future interactions with someone will damage something you value.
This cuts to the heart of what was being discussed in the post and resultant thread on reputation systems, which sometimes go by another name in social software circles: “trust metrics.” Perhaps the best-known advocate of truly robust trust metrics is another Raph, whose work I first became aware of years ago when researching reputation systems.
Using Neil Gaiman’s metric of “who is ranked higher on Google,” I’m Raph #2, and he’s Raph #3. There’s a Raph #1 as well, of course. This may seem like a total tangent, but it isn’t, because Google’s PageRank system is (roughly) based on reputation votes in the form of links to a site. The more inbound links, the higher-ranked you will be. Effectively, Neil Gaiman is Neil #1 and is considered to be the prime authority on Neilness, beating out Neil Young. Here Google is leveraging the social reputation of a vast network in order to build a trust metric. The thing of value is your time, and the expectation of future interaction is whether or not the page you get when you hit “I’m Feeling Lucky” is actually what you want.
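The link-voting idea can be sketched in a few lines. This is a toy illustration of my own, not Google’s actual implementation — the page names are invented, and real PageRank handles dangling pages and scale very differently — but it shows how inbound links act as reputation votes:

```python
# Toy sketch of the "links as reputation votes" idea behind PageRank.
# Each page splits its current score among the pages it links to;
# iterating this converges toward a stable ranking.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline; the rest flows along links.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical link graph: two fan sites "vote" for one Neil, one for the other.
web = {
    "fan_site_1": ["neil_gaiman"],
    "fan_site_2": ["neil_gaiman"],
    "fan_site_3": ["neil_young"],
    "neil_gaiman": [],
    "neil_young": [],
}
ranks = pagerank(web)
```

With this graph, the Neil with more inbound links ends up ranked higher — the network’s aggregated “votes” become the trust metric.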
A while back I attended a conference where the theme was centered on trust. The proceedings of that conference are sealed, but an official report was produced. I ended up going off on a massive mental tangent during that conference, exploring my own mental constructs regarding the words “trust” and “authority,” and thinking about it in terms of what it might mean for virtual spaces. Bear with me, because this gets a little bit abstract, and even a bit idealistic, as we go.

It’s quite straightforward to connect this to discussions of Dunbar’s Number, aka “the monkeysphere.” So let’s take that as another truism: most people only have repeated significant interactions with a limited number of other people.
We can then turn around and think about the notion of “community standards.” These days this tends to mean a sort of terms of service document imposed from on high, but that’s actually exactly the opposite to what I mean: instead, think of it as “the standards the community imposes on itself.” These can be summed up by another word: culture.
The words culture, cultivate, and of course cult all come from the same Latin root, and this is no accident. We think of cults as brainwashing, but that is also what cultures do. They cultivate the group (get it to grow and grow strong) by inculcating (from the Latin root meaning “to force” and “to trample”) certain practices and worldviews. Deviation from these practices is punished, often by removing the support of the group from the deviant individual.
What we have here is the core of the term “peer pressure,” and also the core of the notion of “community policing.” In a group of appropriate size, as long as there is a culture established, you will see fellow members of the society enforcing the implicit rules of that society. People who deviate will be maligned, ostracized, and eventually (should there be a form of authority with this power) ejected.
This is only possible because of the promise of future interaction. This is perhaps best illustrated by recent events here on this blog. A culture has been established (and in fact, I’ve worked over years to establish it, via the manner of my participation on many different forums). Certain types of attacks and commentary aren’t welcome within that culture for a variety of reasons. When people showed up contravening that unspoken rule, other blog regulars started openly putting down the offenders. Notably, the authority (meaning me) didn’t. Some offenders promised to reform purely so they could maintain their access to the group resource: namely, the blog. Others persisted in their behavior. Eventually, the outcry was enough that the authority (again me) stepped in, enforced the unspoken rules, and made them explicit. In effect, communitarian policing led to the creation of a “law.” The folks who joined but were disruptive were either normed into the group or were ejected via ostracization.
The “law” is interesting, because at that point, we move out of the realm of communitarian enforcement and into the realm of “authority-based” trust. At that point, there need not be any expectation of repeated interaction. You have a level of trust that the guy in front of you in the checkout line is not going to turn around and stab you in the eye with his Snickers bar because of mediated trust — you are relying on the authority above the both of you to keep that from happening.
Obligatory game-related point to make: This is in fact why in my community relations policies manual at SOE I recommend not having a “general game discussion” forum — by creating a community whose natural shared interest is so large, you rapidly grow beyond Dunbar’s Number, and thus rapidly create an environment where you must appeal to authority in order to get a civil environment. In other words, an overly large forum or game will have greater policing problems, because smaller groups are more effective at policing themselves via peer pressure. This is why the SWG forums were partitioned into so many smaller groups.
I also believe this is why the level-based segmenting of player populations into cohorts and “cozy worlds” is such a powerful dynamic, and why levelless systems have to find a compensating mechanic. Dunbar’s number is about our support system, not just about our knowledge sphere. Being outnumbered by the big bad world is uncomfortable. A big anonymous world is not as emotionally satisfying as a tight-knit community of reasonable size.
And all of that just covers the very first sentence in the notes I set out to transcribe. We have seven pages of notes to go! At this rate, I’ll finish in 2008.
Still to come in part 2 (and beyond, if need be):
- “The relay problem” with authority appeals
- “Trust is not transitive, but mistrust always is.”
- Can we architect communitarian models that resist homogeneity?
- “The more distant the authority, the higher the burden of proof is.”
- Is there such a thing as bottom-up authority?
- What’s the difference between trust and faith?
- Yes, of course we will talk about LambdaMOO!

You, sir, would make a good Brit, where a natural appreciation for irony is a prerequisite!
Your first paragraph (where it begins “Let’s take the following as a truism”) can be summarized as “Familiarity Breeds Trust”. That doesn’t include authority issues, and probably a host of other things, but those three words put together a lot of it. =)
Interesting, to say the least.
As my son pointed out while I was reading this and discussing it with him, the online limits to the size of a community may be considerably lower than Dunbar’s number due to each individual’s need to maintain relationships outside the online medium. Since the online community only represents a fraction of the individual’s “monkeysphere,” and it’s relatively easy (relative to communities based on physical proximity, anyway) to change online communities, I wonder if online communities have more churn? How does that affect trust?
I am not sure that familiarity breeds trust. Familiarity is perhaps an indicator of repeated past interactions, which then results in a decent predictor of future interactions. But we also know the stories of “burning bridges” — people who are familiar and yet act in ways that break our trust in them because they have chosen to sever past ties.
Aufero, good questions. I am pretty sure online communities do have more churn, but it would be very interesting to see real data on that.
Ok, I think you stopped a little short, however, as to why large anonymous worlds aren’t as well received as tight-knit communities. It gets into the realm of expectations of reciprocal support.
I help you, you help me. That I can rely on, to a point. If you fail to hold up your end, the trust is broken and I never help you again; it’s easy to cut you off because there is only us.
However, as I help more people (X number), it gets much harder for me to keep track of transactions. You also end up in conflicts: I helped Y and Z, but they have conflicting goals, so now I have to choose. The interconnectedness gets harder and harder to maintain as the numbers grow larger, hence why folks maintain smaller communities. In the end it’s all about the ‘Me’, however, and what the ‘Me’ expects from those around that owe ‘Me’ (owed as in actions done or actions that may be done in the future).
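The bookkeeping burden described above grows quadratically, which is a quick back-of-the-envelope way to see why small groups can self-police and large ones can’t (this is my own illustration, not from the original comment):

```python
# Number of distinct pairwise relationships in a group of n people:
# each person can have an obligation with each of the other n-1,
# and each pair is counted once.

def pairwise_links(n):
    return n * (n - 1) // 2

# The tracking burden explodes well before Dunbar's 150:
#   pairwise_links(5)   ->    10
#   pairwise_links(60)  ->  1770
#   pairwise_links(150) -> 11175
```

Ten mutual obligations are easy to remember; eleven thousand are not, so past a certain size the group has to fall back on abstractions like reputation, law, or authority.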
That’s a very nutshell view of the various social theories on the matter, just from what I remember from my college days. A full discourse on it really exceeds the scope of a blog, honestly.
Churn
What would contribute to higher churn in an online community is ease of entrance and exit. It’s harder to leave a community when you can’t go more than ten steps without bumping into one of them. But if all you have to do is log off…
Familiarity
And I was a bit off-base… maybe it’s more accurate to say that familiarity breeds expectations, and some of those expectations amount to trustworthiness. Behavior patterns?
“Familiarity breeds trust” doesn’t sum it up at all. For instance, if you move from Los Angeles to a town of 40 people in Montana, you *will* treat your neighbors nicely on your very first interaction with them, though you’ve never seen them before. Because you know you’ll see them again. It has nothing to do with familiarity.
Hm… yes, you’re right. I think I made a subconscious connection and expressed it completely wrong. Expectation of familiarity, perhaps? The term is important, I’m sure of it; it’s just HOW that’s befuddling me now.
You may have noticed that even when forum managers (in games or otherwise) try to keep a “general forum” from forming, it tends to spontaneously erupt in some other space. You may have also noticed that attempts to engineer a fit between a designed space and a tight community also tend to be frustrated over time by the creation of information flows and communication channels that interlink much larger and less trust-centric social networks on a global scale.
You’re missing something here, perhaps because you’re focusing too exclusively on trust. Let’s put it this way: there is something that pushes against tight-knit communities of reasonable size and the kinds of futureward forms of trust they engender.
The old gemeinschaft-gesellschaft distinction may hold promise in understanding what pushes against attempts to restrict communities to some allegedly natural or intrinsic “appropriate size”. Small face-to-face communities all around the world have given way to large anonymous ones in the last two centuries. There are a lot of reasons for that, many of them having nothing to do with the will or volition of individual human beings or communities of human beings. But some of the reasons do lie in what people want, what they desire.
In part, they lie with a certain form or conception of freedom; that the very things which produce “trust” as you define it also produce intimate tyrannies, restrictions, forms of domination. Many people seek global anonymity in order to be free of constraint–and to choose their communities rather than have those communities chosen for them.
And there is a “culture” that is vital and powerful at that larger anonymous level of sociality, even a form of “trust”. But the trust which can exist at that scale is a completely different thing. For one, it depends vitally on the free flow of information throughout the entirety of the social space, not on fear or anticipation of what particular individuals might think of you as an individual tomorrow. So every time a community manager tries to confine people who know they’re in a larger whole to what they judge to be some appropriate sub-community of interest, people seek to tear down the barriers to the flow of information across the global scale of community.
From a practical point of view, though, Tim, it doesn’t matter if the larger forum erupts elsewhere. In fact, more power to it if it does. As long as you as a service operator have smaller communities that you can still effectively communicate with, and that you can police at a reasonable cost, you’re still gaining the benefits of running forums.
I agree that there are many social forces that push against staying at the smaller size. That’s a separate issue from the issue of whether a given particular service provides forums of a given size.
This is exactly why RP-MUSHes work.
Though, unless someone is exploiting code, cheating, or harassing other players, ejection is largely voluntary. When the very fabric of your existence in the game is pegged, to some degree, on your participation in the culture, the game will cease to be of interest rather quickly if you can’t adhere to its norms. After all, you can’t roleplay solo.
As you might imagine, the Dunbar limit is extremely evident in RP MUSHes. Once the player base exceeds a certain size, it will by necessity fracture into various social strata and cliques. I’d say that the stable state for an RP-MUSH is well below Dunbar’s 150 — or even below the UO 60.
However, RP MUSHes also need a mixing-bowl type area (or “main stage,” as I call it), to foster interplay (and create tension) between the factions, and provide opportunities for dramatic confrontations. The Crossroads tavern kidnapping in A Game of Thrones is precisely the sort of thing a main stage is for.
I think this sort of interplay happens naturally in most MMOs, due to trade, resource competition, and other factors. During these interactions at the interfaces between factions, we are standard-bearers. Our outside interactions affect the reputation of our entire faction. Thus, it’s possible to not just ostracize someone from your group, but to pressure other groups to ostracize someone as well. (You will sometimes see on the WoW forums, for example, people pressuring guild leaders to boot known ninja-looters.)
So, guilds/factions/cliques can act as atomic players in terms of future-interaction, on a whole different scale from individual players. And this could expand to higher tiers, as well. (We are at war with Nation X, which holds Guild Y, which holds Player Z. It doesn’t matter if Z is the nicest guy on earth. We’ve never met Z, and he’s just an X to us.) When there are too many people to know, we organize our trust through abstraction.
What I’m suggesting is that the larger forum will erupt within the space of the smaller communities that a manager is trying to gerrymander into existence–that if you fail to provide one on the grounds that you don’t want that larger forum, you’re going to get it anyway, and then have to invest serious effort constantly slapping it down. You cannot manage the desire for a global information channel out of any community; if you don’t want that channel to exist, you might as well have no forums at all.
It’s been my experience that it really isn’t that hard to create and manage this situation, Tim… Remember the goals from the service provider’s point of view:
Providing lots of forums labelled with smaller group identities (say, by class and by server) gives you all of the above. Adding larger forums that are not intended to meet those above and serve merely as places to blow off steam is a minor addition; they can even be largely unpoliced.
In addition, there are surprising labels you can come up with; a forum specific to game development can, if the culture is established early of it being very thoughtful and wordy, keep its size small and limited only to thoughtful participants.
Again, it’s not that you don’t want that channel to exist — it’s that it is of limited utility for those trying to read it for governance purposes. (This leaves aside the issue of its utility in terms of datamining — a service like Intelliseek can glean much from such a channel).
I guess I disagree with the latter claim you make–that the larger global channel is of limited utility for governance purposes. I think that’s about poor heuristics on the part of the readers plus a desire to make governance about the details and the mechanics and not the general will or public sphere of a virtual world. I know that developers or community managers feel like they have limited time, but as someone who watches those kinds of forums on the side, I don’t feel I have much trouble finding “high signal” messages.
It’s too easy to just complain about the noise in a general forum, or dismiss it as filled with negativity, or some such. I’d tie this a bit to my running critique of the avoidance of sovereignty as a concept by developers, a desire to make governance a question of managing specific mechanics, playing factions off against one another while veiling interior processes of decision-making, and so on.
If there’s a trust issue between player communities and developers, in fact, I’d say in part it’s because developers tend to want to decompose general, sweeping issues that involve the geist of their communities into game-mechanical questions that treat each class or race or faction as a community of transfer-seekers who can be met as an isolated constituency. This gives the general citizenry of a virtual world a sense of being evaded, of shouting into the void, of knowing about overall problems that developers refuse to speak to.
It’s not unlike the problem technocratic bureaucrats have in responding to general social crises, or moments of public malaise. They want to meet those moments in manageable chunks, in terms of particular constituencies seeking service. But sometimes geist is geist. You won’t find and meet the tipping point moment in class or server forums; you’ll find it only in a general, noisy channel. When it comes time to speak to the citizens as a whole, that function gets ceded to public relations-speech, or there is an engagement that seems not to meet the general will head-on or in a way that corresponds to the social reality of the virtual world.
I hear what you’re saying, Tim. And I think many aspects of your critique are dead on. But the fact remains that high signal messages are often not impactful messages (usually the opposite, actually — they tend to get lost). That smaller constituencies easily get lost in large forums. That changes often ARE made for individual constituencies, or to manage specific mechanics.
I’ve been an advocate of direct interaction with “the citizens as a whole” for a long time; luncheons, the UO essays, House of Commons chats, and my methods of interaction on forums all point towards that. I will say, however, that the geist actually does manifest in the smaller forums — if it’s truly the geist, it will be apparent in the sum of the parts as well as in a general forum.