
About a decade ago I migrated into community work [1] from a non-community background. This is the guide I wish I had read back then.

[1] When I say community work, I am talking about stuff like Wikipedia: large distributed groups of people doing something together, usually online, often unpaid. Usually international, often nerdy, often (but not always) FLOSS or FLOSS-adjacent.

You are going to be doing a lot of writing. Do it well. Phone calls and Hangouts don’t scale. Face-to-face is expensive for distributed groups and therefore rare. Real-time tools like IRC and Slack disadvantage people in minority timezones. And so, in an online community, your main way to communicate is likely going to be email. Which means you need to be good at it. Take the time to write carefully, fully and precisely. And since text is going to do so much of your communicative heavy lifting, consider being a little more explicit about emotional signal-sending than you might be otherwise. (“I am happy…”, “it is sad…”, “I am grateful.”)

In all your communications, be conscious of your group’s diversity. The group you’re speaking with is likely more diverse than what you’re used to. There may be children and teenagers as well as adults. For many or most, English won’t be their first language. They probably live in multiple countries, have had a broad diversity of experiences, and hold a wide array of beliefs. This creates a bunch of potential pitfalls. Your jokes may fall flat or offend people. Cultural references (sports, movies, history) may be meaningless. Even for those of us who aren’t American, it’s easy to come across as U.S.-centric. Metaphors, allusions and convoluted sentence structures may not be worth the time they’d take readers to untangle, and they make translations much more difficult. High diversity argues for a style that’s literal, straightforward, and well-structured.

Be cautious about creating an insider culture. This is a tough one, because inside jokes and shared history and assumptions foster a sense of belonging. But every in-group requires an out-group, and having a lot of shared lore is unavoidably exclusionary: it makes it harder for new people to join. A strong culture will also inevitably move you towards demographic/attitudinal narrowing, rather than the reverse.

Publish early, publish often. If you are building a plan or a proposal, don’t wait until it’s flawless and polished to publish: release it while it’s still raw and half-baked. For folks from a non-community background this will feel dangerous, like you’re leaving yourself vulnerable to criticism. But in a community context it builds trust and empathy, and will be understood as an invitation to collaborate. Do tons of signposting. Explain what you’re trying to do, and why. Sketch out how you imagine it may work.

Be aware that volunteer time is different from paid time. Staff need to begin their public work as soon as they possibly can (sooner!), and to build in lots of elapsed time for community discussion. Community members have other priorities: school, jobs, families. You can’t expect them to make your work their top priority, so you need to give them the biggest-possible window in which to contribute.

Write (and publish) a greater volume of stuff than you think you should. This feels counter-intuitive for people who’ve been execs in traditional contexts, because in an ordinary executive context the scarcest resource is time, and so we get used to providing summaries, bullet points, upshots, and key takeaways. Succinct=good. In a community context though, comprehensive beats succinct. This is only logical: if you’re writing for a wide diversity of stakeholders, they’re going to want to know a wide variety of stuff. Manually asking and answering questions is slow and laborious and splinters the information so you can’t get it all in one place: it’s faster and better, as much as possible, to anticipate questions and answer them in your original communication.

Assume good faith. This is so much easier said than done ;/ But for real, assume good faith. When someone asks a question and you think they are trolling, it’s entirely possible they are not. (Maybe they are 15 years old, or their English is imperfect, or they have an impairment of some kind.) Even if they are trolling: there will always be onlookers who don’t know it, and who, whatever the provocation, will recoil if you are curt or unkind. Trolling also gives you an opportunity to equip onlookers with reasonable arguments that they can go on to use themselves.

Bias towards transparency. Way, way, way more than you think you should. I remember being taught change management back in the early 2000s. Our instructor beat into us that wherever there is a communications vacuum, it will be filled by gossip and fear. That is a million percent true, and even more so in online communities. Gossip and fear grow out of power imbalances and information asymmetry which are to some degree unavoidable in distributed and voluntary groups, and you need to compensate for that. Publishing everything also scales well, because it equips everybody, not just you and your inner circle, to help explain what’s going on.

Note that if you’re the boss it’s insufficient to ask your staff to be transparent, because as long as there is any risk of penalty for over-publishing, they will do the opposite. You need to make it clear that nobody will ever be punished or shamed for being transparent, then you need somebody to publish something they shouldn’t have, and then you need to not punish them. Only then will people begin to take you seriously about transparency.

When you change your mind, say it publicly, and explain why. This is another one that’s tough for execs from a non-community context, where we got trained to express more confidence than we felt. But for real: in a community context, changing your mind and explaining why will not erode your credibility; it will earn you more.

Pay attention to people you disagree with. In an ordinary work environment executives get insulated and protected from honest disagreement. This is bad for them and for their company. Community work is different: there is no shortage of people who will disagree with you, loudly and repeatedly, in public. It’s natural to avoid or ignore those people, but if you do, you’re wasting an opportunity. Consider instead that they may, occasionally, be right.

Wikipedia editors in Washington DC at our annual conference (July 2012). CC-BY-SA.

(This post is a very lightly-modified version of a piece that appeared in the L.A. Times this past weekend. I wrote it because at Newfoo I was describing Wikipedians to the Times op ed editor — she found it interesting, and asked me to write it up for her. It’s also in honour of Wikipedia’s 12th anniversary, which is tomorrow.)

Wikipedia is the encyclopedia anyone can edit (yes, even you!), but most people don’t think much about who does the work. With half a billion people around the world relying on Wikipedia for information, we should.

More than 1.5 million people in practically every country have contributed to Wikipedia’s 23 million articles. More than 12,000 new entries are created every day — eight in the last minute. The authors are poets and professors, baristas and busboys, young and old, rich and poor.

It’s crazy. An encyclopedia is one of humankind’s grandest displays of collaborative effort, with contributors from pretty much every ethnicity, nationality, socioeconomic background, political ideology, religion, sexual orientation and gender. The youngest Wikipedian I’ve met was seven, a boy in Tel Aviv who makes small edits to articles about animals and children’s books. The oldest I’ve met was 73, a retired engineer who writes about the history of Philadelphia, where he’s lived for half a century. My most recent cab driver in San Francisco, a middle-aged guy who I think was Eastern European, told me he edits, although I don’t know on what topics. I don’t know of a comparable effort, a more diverse collection of people coming together, in peace, for a single goal.

But beneath that surface diversity is a community built on shared values. The core Wikipedia editing community — those who are very, very active — is about 12,000 people. I’ve met thousands of them personally, and they do share common characteristics.

The first and most defining is that Wikipedians, almost without exception, are ridiculously smart, as you might expect of people who, for fun, write an encyclopedia in their spare time. I have a theory that back in school, Wikipedians were the smartest kids in the class, kids who didn’t care what was trendy or cool but spent their time reading, or with the debate team, or chess club, or in the computer lab. There’s a recurring motif inside Wikipedia of preteen editors who’ve spent their lives so far having their opinions and ideas discounted because of their age, but who have nonetheless worked their way into positions of real authority on Wikipedia. They love Wikipedia fiercely because it’s a meritocracy: the only place in their lives where their age doesn’t matter.

Wikipedians are geeky. They have to be to want to learn the wiki syntax required to edit, and that means most editors are the type of people who find learning technology fun. (It’s also because Wikipedia has its roots in the free software movement, which is a very geeky subculture.) The rise of the dot-com millionaire and the importance of services such as Google, Facebook and Wikipedia have made geekiness more socially acceptable. But geeks are still fundamentally outsiders, tending to be socially awkward, deeply interested in obscure topics, introverted and yet sometimes verbose, blunt, not graceful and less sensorily oriented than other people.

Nine of 10 Wikipedians are male. We don’t know exactly why. My theory is that Wikipedia editing is a minority taste, and some of the constellation of characteristics that combine to create a Wikipedian — geeky, tech-centric, intellectually confident, thick-skinned and argumentative, with the willingness and ability to indulge in a solitary hobby — tend to skew male.

Although individual Wikipedians come from a broad range of socioeconomic backgrounds, we tend to live in affluent parts of the world and to be relatively privileged. Most of us have reliable Internet connectivity and access to decent libraries and bookstores; we own laptops and desktops; we’re the product of decent educational systems, and we’ve got the luxury of free time.

Wikipedians skew young and are often students, concentrated at the postsecondary level. That makes sense too: Students spend their time reading, thinking, sourcing, evaluating and summarizing what they know, essentially the same skills it takes to write an encyclopedia.

Like librarians and probably all reference professionals, Wikipedians are detail-obsessed pedants. We argue endlessly about stuff like whether Japan’s Tsushima Island is a single island or a trio of islands. Whether the main character in “Grand Theft Auto IV” is Serbian, Slovak, Bosnian, Croatian or Russian. Whether Baltimore has “a couple of” snowstorms a year or “several,” whether the bacon in an Irish breakfast is fried or boiled, whether the shrapnel wound John Kerry suffered in 1968 is better described as minor or left unmodified. None of this makes us fun at parties, but it does make us good at encyclopedia writing.

As befits an encyclopedia that anyone can edit, Wikipedians tend to be iconoclastic, questioning and curious. Wikipedia is a place where debate is a form of play and people are searching in good faith for the most correct answer. We’re credentials-agnostic: We want you to prove what you’re asserting; we take nothing on faith (and the article on “Faith” has ample footnotes). We’re products of the Enlightenment and the children of Spinoza, Locke and Voltaire. We oppose superstition, irrationalism and intolerance; we believe in science and reason and progress.

The most contentious topics on Wikipedia are the same as those in the rest of the world, like the Israeli-Palestinian conflict, global warming, intelligent design, the war on terrorism and people such as Adolf Hitler, Ayn Rand and Dick Cheney. We believe it’s not our job to edit Wikipedia so that it reflects our personal opinions; instead, we aim to be fair to all sides. Entries need to be neutrally stated, well-documented and verifiable. Editors are asked to avoid stating opinions, or even seriously contested assertions, as facts; instead, we attribute them to their source. We aim for non-judgmental language: We avoid value-laden words like “legendary” and “racist” and “terrorist.” If we don’t know for sure what’s true, we say so, and we describe what various sides are claiming.

Does this mean Wikipedia’s perfect? Of course not. Our weakest articles are those on obscure topics, where subtle bias and small mistakes can sometimes persist for months or even years. But Wikipedians are fierce guardians of quality, and they tend to challenge and remove bias and inaccuracy as soon as they see it. The article on Barack Obama is a great example of this. Because it’s widely read and frequently edited, over the years it’s become comprehensive, objective and beautifully well sourced. The more eyes on an article, the better it is. That’s the fundamental premise of Wikipedia, and it explains why Wikipedia works.

And it does work. On Dec. 17, 2001, an editor named Ed Poor started an article called “Arab-Israeli conflict” with this single sentence: “The Arab-Israeli conflict is a long-running, seemingly intractable dispute in the Middle East mostly hinging on the status of Israel and its relations with Arab peoples and nations.” Today that article is 10,000 words long, with two maps and six other images and 138 footnotes. It’s been edited more than 5,000 times by 1,800 people in dozens of countries, including Israel, Lebanon, Egypt, Denmark, Germany, Australia, Canada, Britain, the United States and Russia.

Since it was founded 12 years ago this week, Wikipedia has become an indispensable part of the world’s information infrastructure. It’s a kind of public utility: You turn on the faucet and water comes out; you do an Internet search and Wikipedia answers your question. People don’t think much about who creates it, but you should. We do it for you, with love.

For the past sixteen months, the Wikimedia Foundation has been having uncomfortable conversations about how we handle controversial imagery in our projects — including, a few weeks ago, the staging of a referendum on an image hiding feature requested by our Board. The purpose of this post is not to talk specifically about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues. The purpose of this post is to step back and assess where we’re at, and to call for a change in tone and emphasis in our discussions.

Please note also that due to the nature of the topic, you may find yourself offended by this post, and/or the materials linked from it.

In March 2010, editors on the German Wikipedia ran a poll asking their colleagues whether they would support a rule restricting the types of material that could appear on the German home page. Thirteen voted in favour of restrictions, and 233 voted against. A few weeks later, the German Wikipedia featured the article about the vulva on its home page, which included a close-up photograph of an open vagina. Twenty-three minutes after the article went up, a reader in Berlin wrote “you can’t be serious?!,” and called for the image to be taken down. This initiated an on-wiki discussion that eventually reached 73,000 words – the length of a shortish novel. It included a straw poll in which 29 people voted to remove the image and 30 voted to keep it. The image was kept, and the article remained on the front page for its full 24 hours.

A few months later, in June, the Wikimedia Foundation Board of Trustees began to discuss how the Wikimedia community was handling controversial imagery. Why? Because some people seemed to be using Commons to stockpile commercial porn; because the German community had put a close-up photo of a vagina on its homepage; and because upskirt photos and controversial editorial cartoons were being categorized in ways that seemed designed to be provocative, while the people who complained about them were being shot down.

The Wikimedia Foundation was concerned that a kind of market failure might be happening — that the Wikimedia community, which is generally so successful at achieving good decision quality through a consensus process, was for some reason failing to handle the issue of controversial material well. It set out to explore what was going on, and whether we needed to handle controversial imagery differently.

That triggered community members’ fears of censorship and editorial interference. And so we find ourselves today, sixteen months later, locked in angry debate. At a meeting in Nuremberg a few weeks ago, German Wikipedian User:Carbidfischer furiously denounced our Board Chair Ting Chen. The other day –as far as I know for the first time ever– somebody called someone else an asshole on one of our mailing lists. User:Niabot created this parody image. It’s unpleasant and unconstructive, and if you’re familiar with transactional analysis, or with the work done by the Arbinger Institute, you’ll recognize the bad patterns here.

The purpose of this post is to figure out why we aren’t handling this problem well, and how we can get back on track.

So: backing up.

Is there a problem with how the Wikimedia projects handle potentially-objectionable material? I say yes. The problems that led the Board to want to address this issue still exist: they have not been solved.

So what’s the solution? I have read pages upon pages of community discussion about the issue, and I sympathize and agree with much of what’s been said. Wikipedia is not, and should never be, censored. It should not be editorially interfered with.

But refusing censorship doesn’t mean we have no standards. Editors make editorial judgments every day, when we assess notability of topics, reliability of sources, and so forth. The German Wikipedia particularly is known to have extremely rigorous standards.

So why do we refrain from the expression of editorial judgment on this one issue?

I think there are two major reasons.

First, we have a fairly narrow range of views represented in our discussions.

We know that our core community represents just a sliver of society: mainly well-educated young men in wealthy countries, clustered in Europe and North America. It shouldn’t surprise us, therefore, when we skew liberal/libertarian/permissive, especially on issues related to sexuality and religion. Our demographic and attitudinal narrowness is a shame, because at the heart of the projects is the belief that many eyes make all bugs shallow, and yet we’re not practicing what we preach. Instead, we’ve become an echo chamber: we hear only voices like our own, expressing points of view we already agree with. People who believe other things fall silent, or abandon the conversation, or are reduced to impotent rage. Or, even likelier, they never made it to the table in the first place.

Second, we are confusing editorial judgment with censorship.

Censorship is imposed from outside. Editorial judgment is something we do every day in the projects. Applying editorial judgment to potentially-objectionable material is something that honourable journalists and educators do every day: it is not the same as censorship, nor does it constitute self-censorship.

In newsrooms, editors don’t vote on whether they personally are offended by material they know their readers will find objectionable, and they don’t make decisions based on whether the angry letters outnumber the supportive ones. They exercise empathy, and at their best they are taking a kind of ‘balance of harm’ approach — aiming to maximize benefit and minimize cost. The job is to provide useful information to as many people as possible, and they know that if people flee in disgust, they won’t benefit from anything the newsroom is offering. That doesn’t mean newsrooms publish only material that’s comfortable for their readers: it means they aim to exercise good judgment, and discomfit readers only when –on balance– discomfort is warranted.

How does that apply to us? It’s true that when people go to the article about the penis, they probably expect to see an image of a penis, just like they do when they look it up in a reference book in their public library. It’s also true that they probably wouldn’t benefit much from a gallery of cellphone camera shots of penises, and that’s why we don’t have those galleries on our articles. In lots of areas, we are currently doing a good job.

But not always.

When an editor asks if the image cleavage_(breasts).jpg really belongs in the article about clothing necklines, she shouldn’t get shouted down about prudishness: we should try to find better images that don’t overly sexualize a non-sexual topic. When an editor writes “you can’t be serious?!” after vagina,anus,perineum_(detail).jpg is posted on the front page, the response shouldn’t be WP:NOTCENSORED: we should have a discussion about who visits the homepage, and we should try to understand, and be sensitive to, their expectations and circumstances and needs. When we get thousands of angry e-mails about our decision to republish the Jyllands-Posten Muhammad cartoons, we should acknowledge the offence the cartoons cause, and explain why, on balance, we think they warrant publication anyway. None of that is censorship. It’s just good judgment. It demonstrates transparency, a willingness to be accountable, and a desire to help and serve our readers — and it would earn us trust.

I believe that in our discussions to date, we’ve gotten ourselves derailed by the censorship issue. I know that some people believe that the Wikimedia Foundation is intending to coercively intervene into the projects, in effect overruling the judgment of the editorial community. I don’t see it that way, I regret that others do, and I dislike the ‘authoritarian parent / rebellious adolescent’ dynamic we seem to be having trouble resisting.

Wikipedia is not censored. It should never be censored. That doesn’t relieve us of the obligation to be thoughtful and responsible.

So: what needs to happen?

We need to have a discussion about how to responsibly handle objectionable imagery. That discussion doesn’t need to happen with the Wikimedia Foundation (or at least, not solely with the Wikimedia Foundation). The projects should be talking internally about how to avoid unnecessarily surprising and offending readers, without compromising any of our core values.

Those community members who are acting like provocateurs and agitators need to stop. Demonizing and stereotyping people we disagree with pushes everyone into extremist positions and makes a good outcome much less likely. We need to look for common ground and talk calmly and thoughtfully with each other, staying rooted in our shared purpose. Some editors have been doing that throughout our discussions: I am seriously grateful to those people, and I wish others would follow their example.

“Wikipedia is not censored” is true. And, we need to stop using it as a conversation killer. It’s the beginning of the conversation, not the end of it.

We need to set aside anxieties about who’s in charge, and quit fighting with each other. We need to be aware of who’s not at the table. We need to bring in new voices and new perspectives that are currently lacking, and really listen to them. Those community members who’ve been afraid to talk need to speak up, and those who’ve been driven away need to come back.

The purpose of this post is to call for that responsible engagement.

Like I said at the top of this post, my purpose in writing this is not to talk about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues.

In my downtime while travelling, I read about two years’ worth of Less Wrong, a rationalist community blog that Kat Walsh introduced me to. It’s a great read, especially for people who fall into what Less Wrong co-founder Eliezer Yudkowsky hilariously and aptly labels “the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd” – and there are a couple of posts I think are particularly worth calling to the attention of experienced, committed Wikimedia community members.

Here are four posts I think every Wikimedian should read.

1. How to Save the World lays out a rationalist approach to making the world a better place. My favourite –and the most applicable to us– “identify a cause with lots of leverage.” In the words of the author:

It’s noble to try and save the world, but it’s ineffective and unrealistic to try and do it all on your own. So let’s start out by joining forces with an established organization who’s already working on what you care about. Seriously, unless you’re already ridiculously rich + brilliant or ludicrously influential, going solo or further fragmenting the philanthropic world by creating US-Charity#1,238,202 is almost certainly a mistake. Now that we’re all working together here, let’s keep in mind that only a few charitable organizations are truly great investments — and the vast majority just aren’t. So maximize your leverage by investing your time and money into supporting the best non-profits with the largest expected pay-offs.

2. Defecting By Accident: A Flaw Common to Analytical People lays out the author’s view that highly analytical people tend to frequently “defect by accident” – basically, they hurt their ability to advance their own agenda by alienating others with unnecessary pedantry, sarcasm, and disagreeableness. The author offers eight tips for behavioural changes to make accidental defectors more effective, and recommends three books to increase influence and persuasive ability — including Robert Cialdini’s excellent Influence: The Psychology of Persuasion [1].

3. Why Our Kind Can’t Cooperate. A post that argues that yes, a group which can’t tolerate disagreement isn’t rational. But also that a group that tolerates only disagreement is equally irrational.

Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus. We signal our superior intelligence and our membership in the nonconformist community by inventing clever objections to others’ arguments. Perhaps that is why the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd stays marginalized, losing battles with less nonconformist factions in larger society. No, we’re not losing because we’re so superior, we’re losing because our exclusively individualist traditions sabotage our ability to cooperate.

4. Your Price For Joining. This picks up where Poul-Henning Kamp’s Why Should I Care What Color the Bikeshed Is? leaves off, arguing that “people in the atheist/libertarian/technophile/sf-fan/etcetera cluster often set their joining prices way way way too high.” In the words of the author:

I observe that people underestimate the costs of what they ask for, or perhaps just act on instinct, and set their prices way way way too high. If the nonconformist crowd ever wants to get anything done together, we need to move in the direction of joining groups and staying there at least a little more easily. Even in the face of annoyances and imperfections! Even in the face of unresponsiveness to our own better ideas!

These are themes I think about / write about, a lot: collaboration, dissent, how groups can work together productively. I worry sometimes that Wikimedians think I’m hyper-critical and don’t see the strengths of our (argumentative, lively, sometimes ungenerous) culture. So to be super-clear: no! I very much value our culture, scrappiness and all. That doesn’t mean I don’t see its limitations though, and I do think we should always be aiming to improve and make ourselves more effective. That’s what these essays are about, and that’s why I’m recommending them.

[1] I e-mailed Robert Cialdini once looking for advice about a particular problem I was having working well with some Wikimedia community members. Surprisingly to me, he called me within just a few minutes, and we talked for more than an hour while I walked through an airport. I wouldn’t say he was able to fully solve my problem, but it was a helpful conversation and I was amazed by his generosity.