If you’ve read Living in Information, you’ll know I think highly of Wikipedia. It’s a very valuable artifact: a convenient agglomeration of the world’s key knowledge. But it’s more than that: Wikipedia is also where that artifact is created — a place, a culture, a system that produces ongoing value for everyone, for free.

Wikipedia co-founder Jimmy Wales is one of the people most responsible for establishing and stewarding that place. In a lengthy interview (audio, transcript) with Tim Ferriss, Wales discusses Wikipedia’s history, the values that make it work, and how it’s different from other social networks.

The entire conversation is worth your attention, but I was particularly taken by the discussion of how to regulate bad behavior. What should users be allowed/forbidden to do? What guardrails should the system have to prevent bad things from happening?

Wales used a striking analogy:

I have this analogy, I call it the Steak Knives Analogy, which is, imagine you’re designing a restaurant and you think, “Okay, in my restaurant, I’m going to serve steak because I like steak and I’m going to give everybody steak knives. And one thing we know about people with knives is they might stab each other. So therefore, I’m going to build a cage around every table so that no one can stab each other.” And yeah, when you hear this, you laugh because you’re like, “Oh, that’s hilarious.”

Yeah, but then what do we do about it? Actually that is true, people could stab each other. And what we do is we recognize that it’s really rare, for one thing, that we actually don’t want to live in a society where we live in cages because some people might be crazy. We have various institutions of society to deal with the problem. So for example, if somebody does start stabbing people in a restaurant, usually some brave soul, hopefully, young and strong, will tackle the person and stop the violence immediately. And we would call that a hero. You think that’s a great thing to do in society is to do what’s necessary to prevent damage and a tragedy. And then we’ve got the ambulance that comes to, hopefully, fix things. And we’ve got the police to take the bad person to jail.

And none of those things are perfect. People sometimes will get stabbed and it is a tragedy and there is no recovering from it. And that’s terrible. But we still say, “You know what? We don’t want to live in cages. We wouldn’t think, ‘Let’s redesign everything so that we’re in a cage.’” And so I kind of feel this way about, when I think about things like the way social networks are designed. I often think, “Why can’t I go and edit somebody else’s comment?” Well, the reason is because they’re thinking, “Well, bad people will do it, and so now no one can do it.” But actually, maybe, if you have the right kind of systems and processes and transparency in place, it could be a great thing. Maybe not. But I just think, too often, if you design for the worst people, then you’re failing design for good people.

I like this principle: if you design for the worst people, then you’re failing to design for good people. It errs on the side of liberty, trusting that most people will do the right thing. Wikipedia has a good track record of self-correcting bad behavior, so it’s a compelling example of this approach in practice.

That said, online interactions are different from those in the “real world.” For one thing, online identities are relatively fluid — especially in systems that allow anonymous and pseudonymous accounts. As a result, bad behavior carries fewer consequences. (Imprisonment is a more serious punishment than having a pseudonymous account suspended.)

For another, bad behavior online can have an outsized impact. The laws of physics constrain the effect of bad actions in the real world. (E.g., there are only so many people a crazed patron can stab during a hypothetical attack.) Online, words and actions have much greater leverage. Nefarious actors pumping misinformation into the semantic environment can affect hundreds or thousands of lives, as evidenced by the pandemic.

As Wales’s analogy suggests, there are several approaches to dealing with bad behavior. One is to install guardrails to keep it from happening. (E.g., putting cages in the restaurant.) This can prevent bad behavior, but it can also constrain good actors unnecessarily.

A second approach is to disincentivize bad behavior through the threat of punishment. (E.g., the possibility of going to prison might hold back people with destructive impulses.) Here, too, I suspect the effectiveness of this approach depends on the severity of the punishment.

A third approach is through social norms and rules that empower other actors to keep the environment safe. (E.g., make it possible for “some brave soul” to “tackle the person and stop the violence.”) A possible downside here is the emergence of vigilante cultures looking to enforce their particular norms.

We’re seeing a mix of the three approaches in how information environments such as Twitter and Facebook deal with bad behavior online. As with so many other things, all three have tradeoffs.

As we move more interactions online, the effect of regulatory mechanisms on individual freedom is becoming more apparent. We must have these conversations. This interview gave me lots to think about on the subject — it’s well worth your time.

The Tim Ferriss Show Transcripts: Jimmy Wales, Founder of Wikipedia, on Wikipedia’s Real Genesis Story, Best Business Decisions, Understanding Financial Markets, Developing a Questioning Mind, and the Value of Optimism (#528)