Archives for posts with tag: collaboration

A while back, while researching someone in a work context, I was startled to come across a bunch of NSFW self-portraits she’d posted online under her real name. She was mid-career in compliance-related roles at big, traditional companies, and the photos raised questions for me about her judgement and, honestly, her competence. Didn’t she realise the images were public? Hadn’t she ever thought about what could happen when somebody –a colleague, a boss– randomly googled her? Was she making a considered decision, or just being clueless?

I was surprised because nowadays, that lack of caution is so rare. That’s partly because people have gotten a little more sophisticated about privacy controls, but mostly I think we’ve just given up. We can’t be confident our stuff is private today or will stay private tomorrow — if we didn’t know that already, we know it now from The Fappening and the Guardian’s uncovering that Whisper tracks its users.

And so I think that most people, most of the time, have decided to just assume everything we do online is public, and to conduct ourselves accordingly. It’s a rational decision that’s resulted in a tone and style we all recognize: we’re cheerful about work, supportive of friends, proud of family; we’ve got unobjectionable hobbies and we like stuff like vacations and pie. Promotions and babies and parties yes, layoffs and illnesses and setbacks not so much.

Secret, the app that was super-hot last winter, was briefly an exception. People talked on Secret about bad sex, imposter syndrome, depression and ADD, their ageing parents, embarrassments at work. You may remember the engineer who posted that he felt like a loser because he, seemingly alone in Silicon Valley, was barely scraping by financially. It was vulnerable and raw and awesome.

But I ended up uninstalling it pretty fast, after one too many humble-brags showed up in my feed. (The final straw was a guy boasting about how he’d bought a new iPad for a kid at the airport, after watching her mom get mad at her for dropping and breaking theirs. Blah.) I couldn’t bear seeing people diligently polishing up their self-presentation as confident and fun and generous and successful, on a service whose whole point was to enable risk-free vulnerability.

Reverse-engineering user behaviour on Secret, I got the sense that people were hedging their bets. Secret users seemed to be operating (maybe without even thinking much about it) on the assumption that one day, due to a data breach or change in privacy policy or sale of the company, their activity on Secret might be available, linked to them, to their friends or insurance provider or boss or mom or bank. They didn’t trust their activity was permanently private, and so they acted as though it wasn’t.

That feeling of always being potentially in a spotlight leads us to relentlessly curate how we self-present online. And that is bad for us.

It’s bad for individuals because we run the risk of comparing our own insides to other people’s outsides, which makes us feel crappy and sets us up to make decisions based on flawed assumptions. Brené Brown: “If you trade your authenticity for safety, you may experience the following: anxiety, depression, eating disorders, addiction, rage, blame, resentment, and inexplicable grief.” Erving Goffman: “To the degree that the individual maintains a show before others that he himself does not believe, he can come to experience a special kind of alienation from self and a special kind of wariness of others.”

It’s bad for society because it makes people feel alienated and disconnected from each other, and also because it has the effect of encouraging normativity. If we all self-monitor to hide our rough edges, our unpopular opinions, our anxieties and ugly truths, we’re participating in the narrowing of what’s socially acceptable. We make it less okay to be weird, flawed, different, wrong. Which sucks for young people, who deserve to get to freely make the stupid mistakes of youth. It sucks for people who’ve been abused or poor or sick, and who shouldn’t have to hide or minimize those experiences. And it sucks for anybody with an opinion or characteristic or interest that is in any way unconventional. (Yes that is all of us.)

Anonymity was one of the great things about the early internet, and although we benefit enormously from the ability today to quickly find and research and understand each other, as individuals we also need private spaces. We need, when we want to, for our own reasons, to get to be predictably, safely, unbreakably anonymous/pseudonymous, online. That’s why I use Tor and other FLOSS services that support anonymity, and it’s why I avoid the closed-source, commercially-motivated ones. I trust Tor, like a lot of people do, because it has a track record of successful privacy protection, and because it’s radically transparent in the same way, and presumably for the same reasons, that Wikipedia is.

I’ve got nothing to hide (and oh how I hate that I feel like I need to type out that sentence), but I value my privacy, and I want to support anonymity being understood as normal rather than perverse or suspect. So I’m increasingly using tools like Tor, ChatSecure, TextSecure, RiseUp, and DuckDuckGo. I’ve been talking about this with friends for a while and some have been asking me how to get started with Tor, and especially how to use it to access the deep web. I’m working on a post about that — with luck I’ll get it done & published within the next few weeks.

Wikipedia editors in Washington DC at our annual conference (July 2012). CC-BY-SA

(This post is a very lightly-modified version of a piece that appeared in the L.A. Times this past weekend. I wrote it because at Newfoo I was describing Wikipedians to the Times op-ed editor — she found it interesting, and asked me to write it up for her. It’s also in honour of Wikipedia’s 12th anniversary, which is tomorrow.)

Wikipedia is the encyclopedia anyone can edit (yes, even you!), but most people don’t think much about who does the work. With half a billion people around the world relying on Wikipedia for information, we should.

More than 1.5 million people in practically every country have contributed to Wikipedia’s 23 million articles. More than 12,000 new entries are created every day — eight in the last minute. The authors are poets and professors, baristas and busboys, young and old, rich and poor.

It’s crazy. An encyclopedia is one of humankind’s grandest displays of collaborative effort, with contributors from pretty much every ethnicity, nationality, socioeconomic background, political ideology, religion, sexual orientation and gender. The youngest Wikipedian I’ve met was seven, a boy in Tel Aviv who makes small edits to articles about animals and children’s books. The oldest I’ve met was 73, a retired engineer who writes about the history of Philadelphia, where he’s lived for half a century. My most recent cab driver in San Francisco, a middle-aged guy who I think was Eastern European, told me he edits, although I don’t know on what topics. I don’t know of a comparable effort, a more diverse collection of people coming together, in peace, for a single goal.

But beneath that surface diversity is a community built on shared values. The core Wikipedia editing community — those who are very, very active — is about 12,000 people. I’ve met thousands of them personally, and they do share common characteristics.

The first and most defining is that Wikipedians, almost without exception, are ridiculously smart, as you might expect of people who, for fun, write an encyclopedia in their spare time. I have a theory that back in school, Wikipedians were the smartest kids in the class, kids who didn’t care what was trendy or cool but spent their time reading, or with the debate team, or chess club, or in the computer lab. There’s a recurring motif inside Wikipedia of preteen editors who’ve spent their lives so far having their opinions and ideas discounted because of their age, but who have nonetheless worked their way into positions of real authority on Wikipedia. They love Wikipedia fiercely because it’s a meritocracy: the only place in their lives where their age doesn’t matter.

Wikipedians are geeky. They have to be to want to learn the wiki syntax required to edit, and that means most editors are the type of people who find learning technology fun. (It’s also because Wikipedia has its roots in the free software movement, which is a very geeky subculture.) The rise of the dot-com millionaire and the importance of services such as Google, Facebook and Wikipedia have made geekiness more socially acceptable. But geeks are still fundamentally outsiders, tending to be socially awkward, deeply interested in obscure topics, introverted and yet sometimes verbose, blunt, not graceful and less sensorily oriented than other people.

Nine of 10 Wikipedians are male. We don’t know exactly why. My theory is that Wikipedia editing is a minority taste, and some of the constellation of characteristics that combine to create a Wikipedian — geeky, tech-centric, intellectually confident, thick-skinned and argumentative, with the willingness and ability to indulge in a solitary hobby — tend to skew male.

Although individual Wikipedians come from a broad range of socioeconomic backgrounds, we tend to live in affluent parts of the world and to be relatively privileged. Most of us have reliable Internet connectivity and access to decent libraries and bookstores; we own laptops and desktops; we’re the product of decent educational systems, and we’ve got the luxury of free time.

Wikipedians skew young and are often students, concentrated at the postsecondary level. That makes sense too: Students spend their days reading, thinking, sourcing, evaluating and summarizing what they know, essentially the same skills it takes to write an encyclopedia.

Like librarians and probably all reference professionals, Wikipedians are detail-obsessed pedants. We argue endlessly about stuff like whether Japan’s Tsushima Island is a single island or a trio of islands. Whether the main character in “Grand Theft Auto IV” is Serbian, Slovak, Bosnian, Croatian or Russian. Whether Baltimore has “a couple of” snowstorms a year or “several,” whether the bacon in an Irish breakfast is fried or boiled, whether the shrapnel wound John Kerry suffered in 1968 is better described as minor or left unmodified. None of this makes us fun at parties, but it does make us good at encyclopedia writing.

As befits an encyclopedia that anyone can edit, Wikipedians tend to be iconoclastic, questioning and curious. Wikipedia is a place where debate is a form of play and people are searching in good faith for the most correct answer. We’re credentials-agnostic: We want you to prove what you’re asserting; we take nothing on faith (and the article on “Faith” has ample footnotes). We’re products of the Enlightenment and the children of Spinoza, Locke and Voltaire. We oppose superstition, irrationalism and intolerance; we believe in science and reason and progress.

The most contentious topics on Wikipedia are the same as those in the rest of the world, like the Israeli-Palestinian conflict, global warming, intelligent design, the war on terrorism and people such as Adolf Hitler, Ayn Rand and Dick Cheney. We believe it’s not our job to edit Wikipedia so that it reflects our personal opinions; instead, we aim to be fair to all sides. Entries need to be neutrally stated, well-documented and verifiable. Editors are asked to avoid stating opinions, or even seriously contested assertions, as facts; instead, we attribute them to their source. We aim for non-judgmental language: We avoid value-laden words like “legendary” and “racist” and “terrorist.” If we don’t know for sure what’s true, we say so, and we describe what various sides are claiming.

Does this mean Wikipedia’s perfect? Of course not. Our weakest articles are those on obscure topics, where subtle bias and small mistakes can sometimes persist for months or even years. But Wikipedians are fierce guardians of quality, and they tend to challenge and remove bias and inaccuracy as soon as they see it. The article on Barack Obama is a great example of this. Because it’s widely read and frequently edited, over the years it’s become comprehensive, objective and beautifully well sourced. The more eyes on an article, the better it is. That’s the fundamental premise of Wikipedia, and it explains why Wikipedia works.

And it does work. On Dec. 17, 2001, an editor named Ed Poor started an article called “Arab-Israeli conflict” with this single sentence: “The Arab-Israeli conflict is a long-running, seemingly intractable dispute in the Middle East mostly hinging on the status of Israel and its relations with Arab peoples and nations.” Today that article is 10,000 words long, with two maps and six other images and 138 footnotes. It’s been edited more than 5,000 times by 1,800 people in dozens of countries, including Israel, Lebanon, Egypt, Denmark, Germany, Australia, Canada, Britain, the United States and Russia.

Since it was founded 12 years ago this week, Wikipedia has become an indispensable part of the world’s information infrastructure. It’s a kind of public utility: You turn on the faucet and water comes out; you do an Internet search and Wikipedia answers your question. People don’t think much about who creates it, but you should. We do it for you, with love.

From the collections of the Musée de la chasse et de la nature. Wikimedia Commons, CC-BY-SA

Lots of Wikipedians are savants, geniuses, boffins. I am not, and I’m a pretty good Wikipedia contributor anyway — and you could be too. The purpose of this post is to show you how.

I usually start writing an article because I stumble across something interesting somewhere and want to find out more about it. If Wikipedia doesn’t already have an article, I’ll start one. That’s how I started the Wikipedia articles on the emo killings in Iraq, American chick-lit novelist Laura Zigman, the type of prostitution known as survival sex, the Palestinian journalist Asma al-Ghul, and the healthcare industry practice of balance billing.

Here’s how to do it.

1.  Find a topic that interests you and which has either a bad Wikipedia article or none at all. This is not hard, particularly if you fall outside the typical Wikipedian demographic (male, youngish, well-educated, and living in North America or Europe). There are lots of weak or missing articles on Wikipedia — here are a few: Handbag. The 17th century English Shoplifting Act. French curator Claude d’Anthenaise. American sociologist Rose Weitz. The hair treatment called marcelling. Sonic.net CEO Dane Jasper. The Marathi “bangle protection” ceremony Doha Jeevan. Mourning jewellery. The article on the Musée de la Chasse et de la Nature used to be pretty weak, until I fell in love with the museum on a trip to Paris, and then fixed it up.

2.  Google it. Wikipedia doesn’t care how smart you are, or how knowledgeable — it wants you to provide a reputable source for every statement you make. So if you say The Musée de la Chasse et de la Nature is housed in the Hôtel de Guénégaud, Wikipedia wants to know how you know that. I found that fact in Let’s Go Paris, the student-traveller guidebook published by Harvard, which I found by searching for the museum’s name in Google Books. In this case, I already knew where the museum was located, but I still needed to support it with a published reference.

Normally, when I’m researching a Wikipedia article, I get my best results from Google Books (preview results, not snippet results) or Google Scholar. There are guidelines on Wikipedia about what sources are okay and what aren’t, but you don’t need to obsess over this: mostly, if you let common sense be your guide you’ll do fine. And if you mess up, a Wikipedian will likely fix your mistake.

3. Assemble your facts into a decent article. Most people do this in a text editor, and then dump it into the Wikipedia edit window once they’re nearly done. You get an edit window by typing this into the address bar of your browser: http://en.wikipedia.org/w/index.php?title=*******&action=edit. Replace the asterisks with your title, in mixed case.

As you’re writing, you can look at other, similar articles on Wikipedia to see how they’re structured, but you’re free to do it however you like — there are no strict rules, and if you do it badly somebody will usually help make it better. Normally articles will contain some or all of the following sections: Overview, Background or History, the meat of the article (which will have one or more section headings appropriate to the subject matter), References, Further Reading, and External Links. But an article can be considered complete even if all it contains is a paragraph or two of text, supported by a References section.
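To make that concrete, here’s a rough sketch of what the wikitext for a minimal article might look like — the title, section names and sentences are just placeholders, so substitute your own:

```wikitext
'''Example Topic''' is a sentence or two introducing the subject,
supported by a published source.<ref>Your source goes here.</ref>

== History ==
A paragraph or two of background on the subject, again with sources.

== References ==
{{reflist}}
```

The {{reflist}} template at the bottom automatically displays every citation you’ve added in the body with ref tags.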

When you’re ready, paste your text into the edit window.

4. Add citations. This used to be really fiddly and irritating (and yes, I know, wiki syntax is not at all user-friendly, and yes we are working on it), but recently some lovely person made it easier.

Put your cursor right after the sentence you want to cite, then click cite. That’ll bring up a new set of options. Click templates then select which one you want – if you’re unsure, choosing “web” is always safe. Fill out the little form that pops up and click insert. That’ll paste the appropriate wiki syntax into your article text. (Here is something I just figured out a few months ago: If you are adding a citation to a book, copy-paste the ISBN into that field first, then click the magnifying glass to its right. The rest of the form will auto-populate, yay!)
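For instance, filling out the “web” template produces inline syntax along these lines — the URL, title and date below are made-up placeholders, not a real source:

```wikitext
The museum is housed in the Hôtel de Guénégaud.<ref>{{cite web
 |url=http://example.com/source-page
 |title=Title of the source page
 |publisher=Name of the publisher
 |access-date=14 January 2013}}</ref>
```

You don’t need to memorize the template parameters; the little form fills them in for you.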

5. Make some final tweaks. Bold the first instance of your article title, like this: The Musée de la Chasse et de la Nature is a private museum of hunting and nature located in the IIIe arrondissement at 62, rue des Archives, Paris, France. Add double-square brackets around words you want to link to other pre-existing articles on Wikipedia – usually proper nouns are good candidates for this. Like this: In the Salon of the Dogs, a collection of gold dog collars throughout the ages is displayed alongside 17th-century portraits of [[Louis XIV]]’s pets and a small white version of the Scottie dog sculpture [[Puppy]] by contemporary American ceramic artist [[Jeff Koons]].

Once you’re happy, preview your article by clicking Show Preview at the bottom of the edit window, then fix anything that looks broken.

6. Then hit Save Page. And you’re done!

Here’s some further reading…

For the past sixteen months, the Wikimedia Foundation has been having uncomfortable conversations about how we handle controversial imagery in our projects — including, a few weeks ago, the staging of a referendum on an image hiding feature requested by our Board. The purpose of this post is not to talk specifically about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues. The purpose of this post is to step back and assess where we’re at, and to call for a change in tone and emphasis in our discussions.

Please note also that due to the nature of the topic, you may find yourself offended by this post, and/or the materials linked from it.

In March 2010, editors on the German Wikipedia ran a poll asking their colleagues whether they would support a rule restricting the types of material that could appear on the German home page. Thirteen voted in favour of restrictions, and 233 voted against. A few weeks later, the German Wikipedia featured the article about the vulva on its home page, which included a close-up photograph of an open vagina. Twenty-three minutes after the article went up, a reader in Berlin wrote “you can’t be serious?!,” and called for the image to be taken down. This initiated an on-wiki discussion that eventually reached 73,000 words – the length of a shortish novel. It included a straw poll in which 29 people voted to remove the image and 30 voted to keep it. The image was kept, and the article remained on the front page for its full 24 hours.

A few months later, in June, the Wikimedia Foundation Board of Trustees began to discuss how the Wikimedia community was handling controversial imagery. Why? Because some people seemed to be using Commons to stockpile commercial porn; because the German community had put a close-up photo of a vagina on its homepage; and because upskirt photos and controversial editorial cartoons were being categorized in ways that seemed designed to be provocative, and the people who complained about them were being shot down.

The Wikimedia Foundation was concerned that a kind of market failure might be happening — that the Wikimedia community, which is generally so successful at achieving good decision quality through a consensus process, was for some reason failing to handle the issue of controversial material well. It set out to explore what was going on, and whether we needed to handle controversial imagery differently.

That triggered community members’ fears of censorship and editorial interference. And so we find ourselves today, sixteen months later, locked in angry debate. At a meeting in Nuremberg a few weeks ago, German Wikipedian User:Carbidfischer furiously denounced our Board Chair Ting Chen. The other day –as far as I know for the first time ever– somebody called someone else an asshole on one of our mailing lists. User:Niabot created this parody image. It’s unpleasant and unconstructive, and if you’re familiar with transactional analysis, or with the work done by the Arbinger Institute, you’ll recognize the bad patterns here.

The purpose of this post is to figure out why we aren’t handling this problem well, and how we can get back on track.

So: backing up.

Is there a problem with how the Wikimedia projects handle potentially-objectionable material? I say yes. The problems that led the Board to want to address this issue still exist: they have not been solved.

So what’s the solution? I have read pages upon pages of community discussion about the issue, and I sympathize and agree with much of what’s been said. Wikipedia is not, and should never be, censored. It should not be editorially interfered with.

But refusing censorship doesn’t mean we have no standards. Editors make editorial judgments every day, when we assess notability of topics, reliability of sources, and so forth. The German Wikipedia particularly is known to have extremely rigorous standards.

So why do we refrain from the expression of editorial judgment on this one issue?

I think there are two major reasons.

First, we have a fairly narrow range of views represented in our discussions.

We know that our core community represents just a sliver of society: mainly well-educated young men in wealthy countries, clustered in Europe and North America. It shouldn’t surprise us, therefore, when we skew liberal/libertarian/permissive, especially on issues related to sexuality and religion. Our demographic and attitudinal narrowness is a shame because at the heart of the projects is the belief that many eyes make all bugs shallow, and yet we’re not practicing what we preach. Instead, we’ve become an echo chamber: we hear only voices like our own, expressing points of view we already agree with. People who believe other things fall silent or abandon the conversation or are reduced to impotent rage. Or, likelier still, they never made it to the table in the first place.

Second, we are confusing editorial judgment with censorship.

Censorship is imposed from outside. Editorial judgment is something we do every day in the projects. Applying editorial judgment to potentially-objectionable material is something that honourable journalists and educators do every day: it is not the same as censorship, nor does it constitute self-censorship.

In newsrooms, editors don’t vote on whether they personally are offended by material they know their readers will find objectionable, and they don’t make decisions based on whether the angry letters outnumber the supportive ones. They exercise empathy, and at their best they are taking a kind of ‘balance of harm’ approach — aiming to maximize benefit and minimize cost. The job is to provide useful information to as many people as possible, and they know that if people flee in disgust, they won’t benefit from anything the newsroom is offering. That doesn’t mean newsrooms publish only material that’s comfortable for their readers: it means they aim to exercise good judgment, and discomfit readers only when –on balance– discomfort is warranted.

How does that apply to us? It’s true that when people go to the article about the penis, they probably expect to see an image of a penis, just like they do when they look it up in a reference book in their public library. It’s also true that they probably wouldn’t benefit much from a gallery of cellphone camera shots of penises, and that’s why we don’t have those galleries on our articles. In lots of areas, we are currently doing a good job.

But not always.

When an editor asks if the image cleavage_(breasts).jpg really belongs in the article about clothing necklines, she shouldn’t get shouted down about prudishness: we should try to find better images that don’t overly sexualize a non-sexual topic. When an editor writes “you can’t be serious?!” after vagina,anus,perineum_(detail).jpg is posted on the front page, the response shouldn’t be WP:NOTCENSORED: we should have a discussion about who visits the homepage, and we should try to understand, and be sensitive to, their expectations and circumstances and needs. When we get thousands of angry e-mails about our decision to republish the Jyllands-Posten Muhammad cartoons, we should acknowledge the offence the cartoons cause, and explain why, on balance, we think they warrant publication anyway. None of that is censorship. It’s just good judgment. It demonstrates transparency, a willingness to be accountable, and a desire to help and serve our readers — and it would earn us trust.

I believe that in our discussions to date, we’ve gotten ourselves derailed by the censorship issue. I know that some people believe that the Wikimedia Foundation is intending to coercively intervene into the projects, in effect overruling the judgment of the editorial community. I don’t see it that way, I regret that others do, and I dislike the ‘authoritarian parent / rebellious adolescent’ dynamic we seem to be having trouble resisting.

Wikipedia is not censored. It should never be censored. That doesn’t relieve us of the obligation to be thoughtful and responsible.

So: what needs to happen?

We need to have a discussion about how to responsibly handle objectionable imagery. That discussion doesn’t need to happen with the Wikimedia Foundation (or at least, not solely with the Wikimedia Foundation). The projects should be talking internally about how to avoid unnecessarily surprising and offending readers, without compromising any of our core values.

Those community members who are acting like provocateurs and agitators need to stop. Demonizing and stereotyping people we disagree with pushes everyone into extremist positions and makes a good outcome much less likely. We need to look for common ground and talk calmly and thoughtfully with each other, staying rooted in our shared purpose. Some editors have been doing that throughout our discussions: I am seriously grateful to those people, and I wish others would follow their example.

“Wikipedia is not censored” is true. And, we need to stop using it as a conversation killer. It’s the beginning of the conversation, not the end of it.

We need to set aside anxieties about who’s in charge, and quit fighting with each other. We need to be aware of who’s not at the table. We need to bring in new voices and new perspectives that are currently lacking, and really listen to them. Those community members who’ve been afraid to talk need to speak up, and those who’ve been driven away need to come back.

The purpose of this post is to call for that responsible engagement.

Like I said at the top of this post, my purpose in writing this is not to talk about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues.

In my downtime while travelling, I read about two years’ worth of Less Wrong, a rationalist community blog that Kat Walsh introduced me to. It’s a great read, especially for people who fall into what Less Wrong co-founder Eliezer Yudkowsky hilariously and aptly labels “the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd” – and there are a couple of posts I think are particularly worth calling to the attention of experienced, committed Wikimedia community members.

Here are four posts I think every Wikimedian should read.

1. How to Save the World lays out a rationalist approach to making the world a better place. My favourite –and the most applicable to us– is “identify a cause with lots of leverage.” In the words of the author:

It’s noble to try and save the world, but it’s ineffective and unrealistic to try and do it all on your own. So let’s start out by joining forces with an established organization who’s already working on what you care about. Seriously, unless you’re already ridiculously rich + brilliant or ludicrously influential, going solo or further fragmenting the philanthropic world by creating US-Charity#1,238,202 is almost certainly a mistake. Now that we’re all working together here, let’s keep in mind that only a few charitable organizations are truly great investments — and the vast majority just aren’t. So maximize your leverage by investing your time and money into supporting the best non-profits with the largest expected pay-offs.

2. Defecting By Accident: A Flaw Common to Analytical People lays out the author’s view that highly analytical people tend to frequently “defect by accident” – basically, they hurt their ability to advance their own agenda by alienating others with unnecessary pedantry, sarcasm, and disagreeableness. The author offers eight tips for behavioural changes to make accidental defectors more effective, and recommends three books to increase influence and persuasive ability — including Robert Cialdini’s excellent Influence: The Psychology of Persuasion [1].

3. Why Our Kind Can’t Cooperate. A post that argues that yes, a group which can’t tolerate disagreement isn’t rational. But also that a group that tolerates only disagreement is equally irrational.

Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus. We signal our superior intelligence and our membership in the nonconformist community by inventing clever objections to others’ arguments. Perhaps that is why the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd stays marginalized, losing battles with less nonconformist factions in larger society. No, we’re not losing because we’re so superior, we’re losing because our exclusively individualist traditions sabotage our ability to cooperate.

4. Your Price For Joining. This picks up where Poul-Henning Kamp’s Why Should I Care What Color the Bikeshed Is? leaves off, arguing that “people in the atheist/libertarian/technophile/sf-fan/etcetera cluster often set their joining prices way way way too high.” In the words of the author:

I observe that people underestimate the costs of what they ask for, or perhaps just act on instinct, and set their prices way way way too high. If the nonconformist crowd ever wants to get anything done together, we need to move in the direction of joining groups and staying there at least a little more easily. Even in the face of annoyances and imperfections! Even in the face of unresponsiveness to our own better ideas!

These are themes I think and write about a lot: collaboration, dissent, how groups can work together productively. I worry sometimes that Wikimedians think I’m hyper-critical and don’t see the strengths of our (argumentative, lively, sometimes ungenerous) culture. So to be super-clear: no! I very much value our culture, scrappiness and all. That doesn’t mean I don’t see its limitations, though, and I do think we should always be aiming to improve and make ourselves more effective. That’s what these essays are about, and that’s why I’m recommending them.

[1] I e-mailed Robert Cialdini once looking for advice about a particular problem I was having working well with some Wikimedia community members. Surprisingly to me, he called me within just a few minutes, and we talked for more than an hour while I walked through an airport. I wouldn’t say he was able to fully solve my problem, but it was a helpful conversation and I was amazed by his generosity.

The Wikimedia Foundation Board of Trustees met in San Francisco a few weeks ago, and had a long and serious discussion about controversial content in the Wikimedia projects. (Why? Because we’re the only major site that doesn’t treat controversial material –e.g., sexually-explicit imagery, violent imagery, culturally offensive imagery– differently from everything else. The Board wanted –in effect– to probe into whether that was helping or hurting our effectiveness at fulfilling our mission.)

Out of that agenda item, we found ourselves talking about what it looks like when change is handled well at Wikimedia, what good leadership looks like in our context, and what patterns we can see in work that’s been done to date.

I found that fascinating, so I’ve done some further thinking since the meeting. The purpose of this post is to document some good patterns of leadership and change-making that I’ve observed at Wikimedia.

A couple of quick caveats: For this post, I’ve picked three little case studies of successful change at Wikimedia. I’m defining successful change here as ‘change that stuck’ – not as ‘change that led to a desirable outcome.’ (I think all three outcomes were good, but that’s beside the point here. What I’m aiming to do is extract patterns of effective process.) Please note also that I picked these examples quickly, without set criteria – my goal was just to pick a few examples I’m familiar with, and could therefore easily analyze. It’s the patterns that matter, not so much the examples.

That said: here are three case studies of successful change at Wikimedia.

  • The Board’s statement on biographies of living people. Policies regarding biographies had been a topic of concern among experienced Wikipedians for years, mainly because there is real potential for people to be harmed when the Wikipedia article about them is biased, vandalized or inaccurate, and because our experience shows us that articles about non-famous people are particularly vulnerable to skew or error, because they aren’t read and edited by enough people. And that potential for harm –particularly to the non-famous– grows along with Wikipedia’s popularity. In April 2009, the Board of Trustees held a discussion about BLPs, and then issued a statement which essentially reflected best practices that had been developed by the Wikipedia community, and recommended their consistent adoption. The Board statement was taken seriously: it’s been translated into 18 languages, discussed internally throughout the editing community, and has been cited and used as policies and practices evolve.

  • The strategy project of 2009-10. Almost 10 years after Wikipedia was founded, the Board and I felt it was time to step back and assess: what were we doing well, and where did we want to focus our efforts going forward? So in spring 2009, the Wikimedia Board of Trustees asked me to launch a collaborative, transparent, participatory strategy development project, designed to create a five-year plan for the Wikimedia movement. Over the next year, more than 1,000 people participated in the project, in more than 50 languages. The resultant plan is housed on the strategy wiki here, and a summary version will be published this winter. You can never really tell the quality of a strategy until it’s implemented (and sometimes not even then), but the project itself has accomplished what it set out to do.

  • The license migration of May 2009. When I joined Wikimedia this process was already underway, so I only observed first-hand the last half of it. But it was lovely to watch. Essentially: some very smart and experienced people in leadership positions at Wikimedia decided it made sense to switch from the GFDL to CC-BY-SA. But, they didn’t themselves have the moral or legal right to make the switch – it needed to be made by the writers of the Wikimedia projects, who had originally released their work under the GFDL. So, the people who wanted the switch launched a long campaign to 1) negotiate a license migration process that Richard Stallman (creator of the GFDL and a hero of the free software movement) would be able to support, and 2) explain to the Wikimedia community why they thought the license migration made sense. Then, the Wikimedia board endorsed the migration, and held a referendum. It passed with very little opposition, and the switch was made.

Here are nine patterns I think we can extract from those examples:

  1. The person/people leading the change didn’t wait for it to happen naturally – they stepped up and took responsibility for making it happen. The strategy project grew out of a conversation between then-board Chair Michael Snow and me, because we felt that Wikimedia needed a coherent plan. The BLP statement was started by me and the Board, because we believed that as Wikipedia grew more popular, consistent policy in this area was becoming essential. The license migration was started by Jimmy Wales, Erik Moeller and others because they wanted it to be much easier for people to reuse Wikimedia content. In all these instances, someone identified a change they thought should be made, and designed and executed a process aimed at creating that change.
  2. A single person didn’t make the change themselves. A group of people worked together to make it happen. More than a thousand people worked on the strategy project. Probably hundreds have contributed (over several years) to tightening up BLP policies and practices. I’m guessing dozens of people contributed to the license migration. The lesson here is that in our context, lasting change can’t be produced by a single person.
  3. Early in the process, somebody put serious energy towards achieving a global/meta understanding of the issue, from many different perspectives. It might be worth pointing out that this is not something we normally do: in order to do amazing work, Random Editor X doesn’t have any need to understand the global whole; he or she can work quietly, excellently, pretty much alone. But in order to make change that involves multiple constituencies, the person doing it needs to understand the perspectives of everyone implicated by that change.
  4. The process was carefully designed to ask the right people the right questions at the right time. The license migration was an exemplar here: The people designing the process quite rightly understood that there was no point in asking editors’ opinions about something many of them probably didn’t understand. On the other hand, the change couldn’t be made without the approval of editors. So, an education campaign was designed that gave editors access to information about the proposed migration from multiple sources and perspectives, prior to the vote.
  5. A person or a group of people dedicated lots of hours towards figuring out what should happen, and making it happen. In each case here, lots of people did lots of real work: researching, synthesizing, analyzing, facilitating, imagining, anticipating, planning, communicating.
  6. The work was done mostly in public and was made as visible as possible, in an attempt to bolster trust and understanding among non-participants. This is fundamental. We knew for example that the strategy project couldn’t succeed if it happened behind closed doors. Again and again throughout the process, Eugene Eric Kim resisted people’s attempts to move the work to private spaces, because he knew it was critical for acceptance that the work be observable.
  7. Some discussion happened in private, inside a small group of people who trust each other and can work easily together. That’s uncomfortable to say, because transparency and openness are core values for us and anything that contradicts them feels wrong. But it’s true: people need safe spaces to kick around notions and test their own assumptions. I know for example that at the beginning of the Board’s BLP conversations, I had all kinds of ideas about ‘the problem of BLPs’ that turned out to be flat-out wrong. I needed to feel free to air my bad ideas, and get them poked at and refuted by people I could trust, before I could start to make any progress thinking about the issue. Similarly, the Board exchanged more than 300 e-mails about controversial content inside its private mailing list, before it felt comfortable enough to frame the issue up in a resolution that would be published. That private kicking around needs to happen so that people can test and accelerate and evolve their own thinking.
  8. People put their own credibility on the line, endorsing the change and trying to persuade others to believe in it. In a decentralized movement, there’s a strong gravitational pull towards the status quo, and whenever anyone tries to make change, they’re in effect saying to hundreds or thousands of people “Hey! Look over here! Something needs to happen, and I know what it is.” That’s a risky thing to do, because they might be perceived in a bunch of negative ways – as naive or overreacting, as wrong or stupid or presumptuous, or even as insincere – pretending to want to help, but really motivated by inappropriate personal self-interest. Putting yourself on the line for something you believe in, in the face of suspicion or apathy, is brave. And it’s critical.
  9. Most people involved –either as participants or observers– wanted more than anything else to advance the Wikimedia mission, and they trusted that the others involved wanted the same thing. This is critical too. I have sometimes despaired at the strength of our default to the status quo: it is very, very hard to get things done in our context. But I am always reassured by the intelligence of Wikimedia community members, and by their dedication to our shared mission. I believe that if everyone’s aligned in wanting to achieve the mission, that’s our essential foundation for making good decisions.

Like I said earlier — these are just examples I’ve seen or been involved in personally. I’d be very interested to hear other examples of successful change at Wikimedia, plus observations & thinking about patterns we can extract from them.

Below are some raw notes I took this weekend at the Quaker workshop I posted about yesterday. This is super-rough; if anything doesn’t make sense just say so in the comments, and I will try to reconstruct what it meant :-)

How meeting attendees are expected to behave:

  • The tone of the meeting is supposed to be unhurried, calm and thoughtful;
  • People are expected to come with an open mind;
  • People are expected to pay attention and listen carefully;
  • People are expected to try to avoid clever debate or heated argument: to try to speak with love rather than judgment;
  • Quakers wait to be called upon by the meeting clerk;
  • Expressing agreement with other people is fine. Quakers will nod and say “this Friend speaks my mind.”
  • People are expected to be open to learning and changing their minds;
  • They’re expected to be honest, and call out other people who are behaving badly;
  • Quakers used to vote but don’t any more: a decision isn’t taken until the weight of the meeting is behind it;
  • People are expected to be open to exploring disagreement. Avoiding conflict means avoiding the opportunity to learn;
  • People should speak appropriately – if they are talking too much, they should restrain themselves. They shouldn’t interrupt other people. Once they express a view, they should refrain from repeating themselves, or bringing up the issue again once the meeting has moved on;
  • If Quakers are feeling shy or reticent or silenced, they should say that in the meeting, in order to get the impediment –whatever it is– resolved;
  • In Quaker meetings, some work is done offline and presented to the meeting by committees. In the meeting, Quakers are expected to trust the diligence and the care of the committees, rather than aiming to second-guess or redo the committee’s work;
  • When people at meetings behave badly (compulsively wordsmithing other people’s work, compulsively standing in the way of consensus, and so forth), the others at the meeting help everyone by encouraging that person to let go;
  • Quakers are expected to not gossip about each other. Useful conversation is accurate and caring, promotes greater understanding, and does not break confidences. It has the result of increasing trust rather than diminishing it.
  • Quakers are expected to express appreciation to the presiding clerk for their work. They’re also expected to try to help the presiding clerk do better.

How meeting clerks [1] are expected to behave:

  • Clerks are responsible for establishing the appropriate tone for the meeting – setting the stage at the beginning, and controlling the tone throughout;
  • The clerk’s expected to know more about every topic than practically everyone else. Most of that is research and thinking and other prep that happens outside of the meeting;
  • The clerk’s expected to help the group trust the work of the committees. This may involve working with committee chairs offline prior to the meeting, to ensure the material’s in good shape and ready to present. The clerk is also expected to help the committees draft appropriate rough minutes in advance of the meeting, so the meeting has a starting point for its deliberations. The clerk should aim to establish a tone in which the committee’s work can be appropriately received;
  • Remind people how to behave in the meeting. Remind people of the higher purpose;
  • Pace the meeting – providing for silence where necessary;
  • Be conscious of their own effect on other attendees (e.g., hugging one person and not others can foster suspicion of cabals);
  • Park their own strong opinions;
  • Be humble and patient and loving, but not wimpy;
  • Park their own desire to be popular;
  • Have a sense of humour, and leaven the seriousness sometimes;
  • Surface and aim to resolve conflict, rather than letting it fester under the meeting’s surface;
  • Build bridges among different constituencies in the meeting;
  • Pay attention to the tone of the meeting, especially to people’s complaints or problems that are keeping them from letting the meeting move forward. A lot of this is emotional work – understanding when people are feeling unheard, and helping them fix that.
  • Right after the meeting, check in with the recording clerk to see if they accurately captured the sense of the meeting;
  • Right after the meeting, connect with any participants who found the meeting particularly difficult, to help them resolve whatever the conflict is;
  • Look around the group and see which other people may be able to lead [2], and encourage and help develop them.

[1] Quakers call the people who facilitate their meetings ‘clerks.’ The job of the clerk is to listen, understand and document meeting decisions. When clerks think they’re understanding the ‘sense of the meeting,’ they will draft a minute and read it back to the meeting for acceptance or refinement. At the Wikimedia Foundation, all our leadership roles contain elements of clerking.

[2] Quakers say “we don’t find Quaker leaders; we grow them.”

How recording clerks [3] are expected to behave:

  • Does not facilitate;
  • Listens really carefully during the meeting and creates the minutes that discern the truth of the meeting (“what does the meeting think”). The minutes are intended to capture all decisions, including who will do things, and by when;
  • The recording clerk will draft the minutes during the meeting itself. Often the meeting pauses to have a draft minute read back to the group – this will surface dissent and misunderstandings, and allow for them to be resolved and reflected in a revised minute;
  • Good minutes are thought to be brief but complete. They typically aim to show how a decision was arrived at, but try not to revive dissent. Points raised in discussion typically aren’t attributed to individuals, because ultimately consensus is achieved and disagreement resolved, so there’s no benefit to retaining a record of who said what. Good minutes aim to reflect the fact that everyone in the meeting is seeking unity;
  • Minutes need to include sufficient rationale so that people don’t need to have the discussion all over again;
  • Responsibility for the accuracy of the minutes resides with the clerk – both the recording clerk and the clerk sign the minutes;
  • Minutes should be published as soon as possible after the meeting, so that people’s memories are fresh, and they are reminded of what they’ve committed to do.

[3] Recording clerks are the people who actually capture the minutes of the meeting for the clerk. Once the minutes are captured, both the recording clerk and the clerk sign them. Essentially: the recording clerk keeps the records, which frees the clerk up to do active facilitation.

How committees are expected to behave:

  • The job of the committee is to actually do the work well, offline, out of the meeting. The meeting doesn’t have sufficient time to go through issues with an appropriate level of detail and rigour: that is what the committees are for;
  • Committees should bring to the meeting clear draft minutes (resolutions). The clerk should look at the minutes in advance and tweak them if he or she thinks it will help;
  • Committee membership should not be determined by who is free, or most interested – people should not ‘volunteer’ to be on committees. Instead, committee membership should be determined carefully, by weighing what skills and abilities are needed, and who has them;
  • Committee meetings should be open: unless confidentiality is required, anyone should be able to attend them;
  • Committee work has the incidental benefit of creating time and space for committee members to develop personal relationships with each other, that strengthen the entire community;
  • Committees are expected to do the hard work and resolve difficult issues. They are not expected to throw up their hands and bring back unresolved issues to the meeting;
  • The meeting is asked to trust the committee, and the clerk is asked to help the meeting trust the committee. The committee needs to live up to its part of that bargain, by meeting its deadlines and doing the work it’s committed that it will do.

Here are some examples of Quaker meeting minutes. I have no reason to think these minutes are particularly exemplary – they’re just examples I could find quickly, on the internet. Here are minutes from a 2005 Davis (California) meeting, minutes from a 2010 Eastbourne (UK) meeting, and minutes from a 2010 national UK meeting. (All PDF.)

About a week ago, I started running a little survey asking Wikimedians how we should approach target-setting for the next five years.

I did it because next month Wikimedia will finalize the targets that’ll guide our work for the next five years, and I wanted to gather some quick feedback on the thinking that’s been done on that, to date.  The survey’s close to wrapping up now, and the results thus far are terrific: there appears to be good consensus on what we want to measure, as well as on our general approach.

More detail below!  But first, some general background.

In July 2009, the Wikimedia Foundation kicked off a massive strategy development project, which is starting to wrap up now. [1] The one major set of decisions that remains to be finalized is how we will measure progress towards our goals.

The draft goals, measures of success and targets that have been developed via the strategy project are here. They were created over the past several months by Wikimedia community members, Bridgespan staff, and Wikimedia Foundation staff (thank you all) – and in my opinion, they’re pretty good.  They focus on what’s important, and they do a reasonably good job of figuring out how to measure things that don’t always lend themselves to easy measurement.

Before finalizing the targets and taking them to the Wikimedia Board of Trustees for approval, I wanted to gather some additional input, so I hacked together a quick, imperfect little survey.   (You can read it –and fill it out if you want– here.) The purpose of this post is just to share the results — I will probably write more about the targets themselves later.

First some methodology: I made the survey in Google Docs, and sent identical versions to i) the Wikimedia Board, ii) the Wikimedia staff, and iii) the “foundation-l” mailing list (a public list on which anyone can talk about the Wikimedia Foundation and Wikimedia projects), the Wikimedia Foundation Advisory Board list, and the “internal-l” mailing list (a private list intended for Wikimedia chapters representatives and Wikimedia Foundation board and staff).  Then –for the purposes of this post– I aggregated together all three sets of results, which total about 120 individual responses thus far.

If I’d been more serious I’d have used LimeSurvey, which is a better survey tool than Google Docs — but this is really just meant to be a structured solicitation of input, rather than a proper quantitative study.  For one thing, the “community” results reflect only a tiny fraction of active editors — those who read English, who are on Wikimedia’s mailing lists or are connected with people who are, and who self-selected to answer the survey.  So, please resist the temptation to over-interpret whatever numbers I’ve given here.

In general, I was happy to find that the survey surfaced lots of consensus.  A comfortable majority agrees with all of the following:

  • Wikimedia’s goals should be “ambitious but possible.” (Other less-popular options were: “definitely attainable, but not necessarily easily,” “audacious and probably not attainable, but inspiring,” and “fairly easily attainable.”)
  • We agree that the purpose of setting goals is “to create a shared understanding and alignment about what we’re trying to do, publicly and with everyone.” (Other options: “to create an audacious target that everyone can get excited about and rally behind,” and “to create accountability.”)
  • In setting goals, we believe “perfection is the enemy of the good: I would rather see us using imperfect measures than no measures at all.” (About 15% of respondents felt otherwise, believing that “imperfect measures are a waste of time and energy.”)
  • The Wikimedia Foundation’s goals should be dependent on efforts by both the Wikimedia Foundation and the Wikimedia community, not by the Foundation alone. (18% of respondents felt otherwise, that the targets should be “entirely within the control of the Wikimedia Foundation to influence.”)
  • If we exceed our goals, practically everyone will be “thrilled.” (About five percent of respondents felt otherwise, saying that they would be “disappointed: that would tell me our goals weren’t sufficiently challenging.”)
  • If we fail to meet our goals, about three quarters of respondents will feel “fine, because goals are meant to aspire/align: if we do good work but don’t meet them, that’s okay.” Interestingly, this is one of the few areas of the survey where there was a real division between the staff of the Wikimedia Foundation and other respondents. Only 17% of staff agreed they’d be okay with missing our targets. I think this is probably good, because it suggests that the staff feel a high sense of personal responsibility for their work.
  • Almost everyone agrees that “goal-setting for the Wikimedia Foundation is difficult. We should set goals now, but many measures and targets will be provisional, and we’ll definitely need to REFINE them over the next five years, possibly radically.” (Runner-up response: “we can set good goals, measures and targets now, and we should NOT need to change them much during the next five years.” And a very small number felt that we should refrain from setting targets for “things we’re still uncertain about,” and instead restrict ourselves to areas that are “straightforward.”)
  • The global unique visitors target is felt by most to be “attainable if the staff and community work together to achieve it.” (About 20% of respondents felt the target might “even happen without any particular intervention.”)

I wanted to get a sense of what measures people felt were most important. They’re below, in descending order of importance. (The number is the percentage of total respondents who characterized the measure as either “critical” or “important.” Other options were “somewhat important,” “not important,” and “don’t know/not sure.”)

It’s probably worth noting that consensus among community members, the board and the staff was very high.  For more than half the measures, the percentage of respondents rating the measure as “important” or “critical” varied by less than 10% among the different groups, and for the remainder, it varied by less than 20%.

Measure – Avg (% rating it “critical” or “important”)
Retention of active editors – 84
Number of active editors – 83
Site performance in different geographies – 80
Demographics of active editors – 80
Uptime of all key services – 78
Financial stability – 74
Global unique visitors – 66
Secure off-site copies – 65
Number of articles/objects/resources – 65
Regular snapshots/archives – 60
Thriving research community – 54
Offline reach – 53
Reader-submitted quality assessments – 41
Expert article assessments – 40
Community-originated gadgets/tools/extensions – 22

The survey’s still accepting input — if you’re interested you’ve got until roughly 7PM UTC, Wednesday August 18, to fill it out.

————————————————————————————–
[1]

I launched the Wikimedia strategy project at the request of the Wikimedia Foundation Board of Trustees, and it was led by Eugene Eric Kim of Blue Oxen Associates, a consulting firm with a special focus on enabling collaborative process. Eugene worked with Philippe Beaudette, a longtime Wikipedian and online facilitator for the project, and The Bridgespan Group, a non-profit strategy consulting firm that provided data and analysis for us. The premise of the project was that the Wikimedia movement had achieved amazing things (the number five most-used site in the world! 375 million visitors monthly!), and it was now time to reflect on where we were making good progress towards fulfilling the mission, and where we weren’t. With the goal of course-correcting where we weren’t doing well.

To come up with a good plan, we wanted to stay true to our core and central premise: that open, mass collaboration is the most effective method for achieving high-quality decisionmaking. So we designed the process to be transparent, participatory and collaborative. During the course of the project, more than a thousand volunteers worked together in 50+ languages – in teams and as individuals, mostly in public on the strategy wiki, but supplemented by IRC meetings, Skype calls, e-mail exchanges, and face-to-face conversations (e.g., meetings were held in Berlin, Paris, Buenos Aires, San Francisco, Boston and Gdansk).

The project’s now entering its final phase, and you can see the near-final results here on the strategy wiki.  What remains to be done is the finalization of the measures of success, which will happen over the next six or so weeks. At that point, there will be some final wordsmithing, and the result will be brought to the Wikimedia Board of Trustees for approval.

I will probably write about the strategy project at a later date, because it is super-interesting. (Meanwhile, if you’re interested, you can read a little about it here in a story that Noam Cohen wrote from Wikimania 2010 in Gdansk.)

I never thought much about the Quakers [1] until I read Joseph Reagle‘s excellent new book Good Faith Collaboration: The Culture of Wikipedia (forthcoming from MIT Press in September), in which Joseph references the Quaker consensus decisionmaking processes – and specifically, how Quakers resolve dissent.

Joseph cites the sociological study Beyond Majority Rule: Voteless Decisions in the Society of Friends – an exploration of Quaker decisionmaking by Jesuit priest Michael J. Sheeran, who had spent two years observing and interviewing Quakers for his Princeton PhD thesis, which afterwards was published by the Quakers and is now considered a definitive guide on the subject.

Consensus decisionmaking (CDM) is a really interesting topic for Wikimedians because we make most of our decisions by consensus, and we struggle every day with CDM’s inherent limitations. It’s slow and sometimes tedious, it’s messy and vulnerable to disruption, and –most problematically– it’s got a strong built-in bias towards the status quo. CDM creates weird perverse incentives – for example, it gives a lot of power to people who say no, which can make saying no attractive for people who want to be powerful. And it can act to empower people with strong views, regardless of their legitimacy or correctness.

Beyond Majority Rule was so fascinating that it’s sent me on a bit of a Quaker reading binge, and in the past month or so I’ve read about a dozen books and pamphlets on Quaker practices.  I’ve been interested to see what values and practices the Quakers and Wikimedians share, and whether there are things the Quakers do, that we might usefully adopt.

For the most part, Quaker practices likely aren’t particularly adaptable for mass collaboration, because they don’t scale easily.  They seem best-suited to smallish groups that are able to meet frequently, face-to-face.

But some Quaker practices, I think, are relevant to Wikimedia, and we are either already using versions of them, or should consider it. The Quaker “clerk” role, I think, is very similar to our leadership roles such as board or committee chair. The Quaker decisionmaking process has strong similarities to how our board of trustees makes its decisions, and I think Quaker methods of reconciling dissent might be particularly useful for us.  (Quakers have better-codified levels of dissent and paths to resolution than we do — I think we could adopt some of this.) And the Quaker schools’ delineation of roles-and-responsibilities among board, staff and community members, could I think also be a good model for us.

I plan to write more about the Quakers in coming weeks. For now though, here’s a list of what I’ve been reading:

[1] Quakers have their roots in 17th century England. There are about 360,000 Quakers today, mainly in Africa, the Asia Pacific Region, the UK and North America. Most consider themselves Christians, although a few identify as agnostic, atheistic, or as members of non-Christian faith traditions such as Judaism or Islam. Quakers are probably best known for their belief that the word of God is still emergent rather than fully known, their silent and “unprogrammed” religious services which have no leaders, hymns or incantations, their centuries-old tradition of pacifism and social activism, and their consensus decision-making process.

Read more about the Quakers at Wikipedia.