For the past sixteen months, the Wikimedia Foundation has been having uncomfortable conversations about how we handle controversial imagery in our projects — including, a few weeks ago, the staging of a referendum on an image hiding feature requested by our Board. The purpose of this post is not to talk specifically about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues. The purpose of this post is to step back and assess where we’re at, and to call for a change in tone and emphasis in our discussions.
Please note also that due to the nature of the topic, you may find yourself offended by this post, and/or the materials linked from it.
In March 2010, editors on the German Wikipedia ran a poll asking their colleagues whether they would support a rule restricting the types of material that could appear on the German home page. Thirteen voted in favour of restrictions, and 233 voted against. A few weeks later, the German Wikipedia featured the article about the vulva on its home page, which included a close-up photograph of an open vagina. Twenty-three minutes after the article went up, a reader in Berlin wrote “you can’t be serious?!,” and called for the image to be taken down. This initiated an on-wiki discussion that eventually reached 73,000 words – the length of a shortish novel. It included a straw poll in which 29 people voted to remove the image and 30 voted to keep it. The image was kept, and the article remained on the front page for its full 24 hours.
A few months later, in June, the Wikimedia Foundation Board of Trustees began to discuss how the Wikimedia community was handling controversial imagery. Why? Because some people seemed to be using Commons to stockpile commercial porn; because the German community had put a close-up photo of a vagina on its homepage; and because upskirt photos and controversial editorial cartoons were being categorized in ways that seemed designed to be provocative, and the people who complained about them were being shot down.
The Wikimedia Foundation was concerned that a kind of market failure might be happening — that the Wikimedia community, which is generally so successful at achieving good decision quality through a consensus process, was for some reason failing to handle the issue of controversial material well. It set out to explore what was going on, and whether we needed to handle controversial imagery differently.
That triggered community members’ fears of censorship and editorial interference. And so we find ourselves today, sixteen months later, locked in angry debate. At a meeting in Nuremberg a few weeks ago, German Wikipedian User:Carbidfischer furiously denounced our Board Chair Ting Chen. The other day –as far as I know for the first time ever– somebody called someone else an asshole on one of our mailing lists. User:Niabot created this parody image. It’s unpleasant and unconstructive, and if you’re familiar with transactional analysis, or with the work done by the Arbinger Institute, you’ll recognize the bad patterns here.
The purpose of this post is to figure out why we aren’t handling this problem well, and how we can get back on track.
So: backing up.
Is there a problem with how the Wikimedia projects handle potentially-objectionable material? I say yes. The problems that led the Board to want to address this issue still exist: they have not been solved.
So what’s the solution? I have read pages upon pages of community discussion about the issue, and I sympathize and agree with much of what’s been said. Wikipedia is not, and should never be, censored. It should not be editorially interfered with.
But refusing censorship doesn’t mean we have no standards. Editors make editorial judgments every day, when we assess notability of topics, reliability of sources, and so forth. The German Wikipedia particularly is known to have extremely rigorous standards.
So why do we refrain from the expression of editorial judgment on this one issue?
I think there are two major reasons.
First, we have a fairly narrow range of views represented in our discussions.
We know that our core community represents just a sliver of society: mainly well-educated young men in wealthy countries, clustered in Europe and North America. It shouldn’t surprise us, therefore, when we skew liberal/libertarian/permissive, especially on issues related to sexuality and religion. Our demographic and attitudinal narrowness is a shame, because at the heart of the projects is the belief that many eyes make all bugs shallow; yet we’re not practicing what we preach. Instead, we’ve become an echo chamber: we hear only voices like our own, expressing points of view we already agree with. People who believe other things fall silent, abandon the conversation, or are reduced to impotent rage. Or, even likelier, they never made it to the table in the first place.
Second, we are confusing editorial judgment with censorship.
Censorship is imposed from outside. Editorial judgment is something we do every day in the projects. Applying editorial judgment to potentially-objectionable material is something that honourable journalists and educators do every day: it is not the same as censorship, nor does it constitute self-censorship.
In newsrooms, editors don’t vote on whether they personally are offended by material they know their readers will find objectionable, and they don’t make decisions based on whether the angry letters outnumber the supportive ones. They exercise empathy, and at their best they are taking a kind of ‘balance of harm’ approach — aiming to maximize benefit and minimize cost. The job is to provide useful information to as many people as possible, and they know that if people flee in disgust, they won’t benefit from anything the newsroom is offering. That doesn’t mean newsrooms publish only material that’s comfortable for their readers: it means they aim to exercise good judgment, and discomfit readers only when –on balance– discomfort is warranted.
How does that apply to us? It’s true that when people go to the article about the penis, they probably expect to see an image of a penis, just like they do when they look it up in a reference book in their public library. It’s also true that they probably wouldn’t benefit much from a gallery of cellphone camera shots of penises, and that’s why we don’t have those galleries on our articles. In lots of areas, we are currently doing a good job.
But not always.
When an editor asks if the image cleavage_(breasts).jpg really belongs in the article about clothing necklines, she shouldn’t get shouted down about prudishness: we should try to find better images that don’t overly sexualize a non-sexual topic. When an editor writes “you can’t be serious?!” after vagina,anus,perineum_(detail).jpg is posted on the front page, the response shouldn’t be WP:NOTCENSORED: we should have a discussion about who visits the homepage, and we should try to understand, and be sensitive to, their expectations and circumstances and needs. When we get thousands of angry e-mails about our decision to republish the Jyllands-Posten Muhammad cartoons, we should acknowledge the offence the cartoons cause, and explain why, on balance, we think they warrant publication anyway. None of that is censorship. It’s just good judgment. It demonstrates transparency, a willingness to be accountable, and a desire to help and serve our readers — and it would earn us trust.
I believe that in our discussions to date, we’ve gotten ourselves derailed by the censorship issue. I know that some people believe that the Wikimedia Foundation intends to coercively intervene in the projects, in effect overruling the judgment of the editorial community. I don’t see it that way, I regret that others do, and I dislike the ‘authoritarian parent / rebellious adolescent’ dynamic we seem to be having trouble resisting.
Wikipedia is not censored. It should never be censored. That doesn’t relieve us of the obligation to be thoughtful and responsible.
So: what needs to happen?
We need to have a discussion about how to responsibly handle objectionable imagery. That discussion doesn’t need to happen with the Wikimedia Foundation (or at least, not solely with the Wikimedia Foundation). The projects should be talking internally about how to avoid unnecessarily surprising and offending readers, without compromising any of our core values.
Those community members who are acting like provocateurs and agitators need to stop. Demonizing and stereotyping people we disagree with pushes everyone into extremist positions and makes a good outcome much less likely. We need to look for common ground and talk calmly and thoughtfully with each other, staying rooted in our shared purpose. Some editors have been doing that throughout our discussions: I am seriously grateful to those people, and I wish others would follow their example.
“Wikipedia is not censored” is true. And, we need to stop using it as a conversation killer. It’s the beginning of the conversation, not the end of it.
We need to set aside anxieties about who’s in charge, and quit fighting with each other. We need to be aware of who’s not at the table. We need to bring in new voices and new perspectives that are currently lacking, and really listen to them. Those community members who’ve been afraid to talk need to speak up, and those who’ve been driven away need to come back.
The purpose of this post is to call for that responsible engagement.
Like I said at the top of this post, my purpose in writing this is not to talk about the referendum results or the image hiding feature: for that, I’ll be talking in more official venues.
I strongly recommend you perform a representative sample survey on the question of whether external block lists (as used in AdBlock Plus and similar tools) would obtain more community approval than the category idea floated in the mock-up.
Someone asked what the advantages of external block lists are:
1. Lists of blocked images would be able to cover a much wider variety of content (e.g. Mohammed for strict Muslims, spiders for arachnophobes, arbitrary combinations of any such categories) than just a few categories;
2. Block lists would be able to be tuned to match community standards (e.g. different degrees of nudity for different cultures);
3. Block lists could be published by third parties as well as by the Foundation or its volunteers. It would be up to the user which list(s) to use, based on which list publishers they decide to trust. This puts the projects in less of a position of being accused of censorship than if editors or the Foundation were responsible for tagging images with a limited number of categories in a central authority database;
4. Ad blockers which use collaborative block lists have proven to be much more popular than those which don’t;
5. Categories in a fixed set could conceivably become out of date as social standards change but block lists would be able to evolve with the standards of those who trust them over time;
6. Decisions about whether to place a given image in a fixed set of categories maintained by a central authority, whether made by the Foundation or by volunteers, would necessarily involve lengthy, acrimonious debate about borderline cases, draining volunteer time and morale. That could be avoided if multiple third-party authorities were making their own decisions about what to block.
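To make the mechanism concrete, here is a minimal sketch in Python of how a client-side tool might apply a subscribed third-party block list to image filenames. The one-pattern-per-line text format and the `load_blocklist`/`is_blocked` helpers are illustrative assumptions, not an actual AdBlock Plus format or any real Wikimedia feature.

```python
import fnmatch

def load_blocklist(lines):
    """Parse a simple hypothetical block-list format: one filename
    pattern per line; blank lines and '#' comments are ignored."""
    patterns = []
    for line in lines:
        line = line.strip()
        if line and not line.startswith("#"):
            patterns.append(line)
    return patterns

def is_blocked(filename, patterns):
    """Return True if the image filename matches any pattern from
    the lists the user has chosen to subscribe to."""
    return any(fnmatch.fnmatch(filename, p) for p in patterns)

# A reader subscribes only to lists whose publishers they trust;
# multiple lists could simply be concatenated.
arachnophobia_list = load_blocklist([
    "# spiders",
    "*spider*",
    "Grammostola_*",
])

print(is_blocked("Grammostola_aureostriata.jpg", arachnophobia_list))
print(is_blocked("Stereoscope_diagram.png", arachnophobia_list))
```

Because the matching happens entirely on the reader’s side, neither the editors nor the Foundation has to rule on borderline cases: a list publisher who over-blocks simply loses subscribers.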
Also I would like to acknowledge Tom Morris, User:Tom_Morris on Enwiki, Commons, etc., for suggesting this idea or at least bringing it to my attention.
If you want to comment, please do it here! I’m getting lots of personal e-mail in response to this post, but we will get the most benefit if we all talk together, rather than people just talking privately with me.
Thanks for a really thoughtful post, Sue. This is a really difficult emotional issue and I think you’ve captured the situation well.
Seconded!
Such a tricky issue. One of the hardest things to deal with when working with online communities, as opposed to in-person communities, is that online, someone’s always got the passwords and is paying the hosting fees. I don’t mean this as some sort of “hey, the people who pay the bills make the rules,” but it’s the elephant in the room: we can’t pretend this whole thing sprang up fully formed from nowhere. There have always been guidelines and standards about what Wikipedia is and is not, and they’re always contentious. The largest issue lately, I think, is that people are very, very connected to Wikipedia — the Wikipedia they see when they edit or contribute or read or whatever — and it can be hard to get outside of that. So you don’t think you’re going to see a vagina on the front page, until suddenly you do, and then you have to think “huh, what do I think about that?”
There is also the problem that if you require consensus and/or voting, each of those systems has upsides and downsides and could potentially get gamed if passions are high. And I’m not sure a straw poll with sixty people is the same thing as a mandate to do any particular thing long term, though it was certainly the right way to go forward in that case.
In smaller communities you can often, as in a public library situation, appeal to some sort of loose “community guidelines” and then at least you get something that seems to work for the bulk of the people in a community. Wikipedia being international, that sort of thing doesn’t work so well since there is such a wide range of what’s beneficial, acceptable, or barely tolerated, that it can even be difficult to find common talking points. And, like the public library model, we decry censorship, but we definitely pick and choose what goes on the shelf. We don’t filter our internet but we often have a “this is a shared space, be cool” guideline of some sort for people who want to look at explicit content on the internet. And for some people even a “be cool” guideline is too much and for other people it’s nowhere near enough, and we muddle forward.
I work in an online community as a moderator and this is an issue that comes up. There are few places on the internet that are truly total free-speech bastions. Things get removed because of copyright issues, because of government interference, or just because of general conflicts. I think once Wikipedia started removing some things in past edits from view (and probably before that, but that was when I started really paying attention to this issue) it basically started down the road of only some things being okay to be on the encyclopedia that anyone can edit. And since it’s really the only thing of its size and depth and breadth, there is definitely a feeling that things that are somehow disallowed from Wikipedia are in some way being censored instead of just not belonging due to scope or what have you. I’d be interested to know how the “Please don’t use Wikimedia Commons for your porn stash” conversation is going.
However, we need to be able to talk about these things, and I’m with you that if we just shout down people who disagree with us, we’re not having a real conversation. The “who is not at the table” issue is crucial. Wikipedia has always been run by the people who were the most devoted to it, but the mission of the project is larger than just who is contributing, and there needs to be a way to bring those other voices in and have them be legitimate as a different perspective. Thanks for writing this.
Personally, I’m super happy that you brought up the distinction between “Editorial Judgement” and “Censorship,” since I think that is the crux of the disconnect between the various perspectives.
WP:NOTCENSORED does not – and should not – mean “Just because we can do this – show a gaping vagina on the front page – we should do so.” That is a juvenile’s understanding of the “rules,” and almost crosses into “bad faith / attempt to shock” territory.
Sure, if I go to the article [[Penis]], I should expect there to be a photo of a penis. I’m not entirely certain I should expect to see it “above the fold”, however. The article [[Pregnancy]] is a good example: why choose the naked woman as the featured article over the clothed one (which is *below* the fold)?
If the only answer is WP:NOTCENSORED, I don’t find that a compelling argument (note that the German Wikipedia has the images reversed).
I am by no means a prudish person, nor am I a zealot, or unreasonable. However, I think that “pissing people off just because we can” is a really *poor* reason to piss someone off, and frankly it points to emotional immaturity. It is possible to recognize that images of Muhammed exist without placing them in the encyclopedia in such a way that angers over a billion Muslims – and if our goal is to “Bring the Sum of Human Knowledge to Everyone for Free,” erecting a barrier to entry that cuts out many people seems antithetical to that spirit.
@“Just because we can do this – show a gaping vagina on the front page”
Telling, or seeing, just half of the story is not serious.
Did you read “featured […] article about the vulva”?
Where do you think a picture of a vulva belongs, if not in a featured article about the… vulva?
I agree with Susan. Editing is not censorship. Removing a bad picture of a vagina, and replacing it with a better picture, or even arguing about which is the better picture, is in no way censorship.
Neither is obeying the principle of minimum surprise. A friend (Isabel Draves) pointed out to me several years ago that no, not everyone wanted to see pictures of my infected spider bite in my Flickr stream. Not on a good day and not on a bad day either. Nor did anybody ask to see pictures of the road rash I earned two weeks ago. Good taste is not censorship either. BTW: http://en.wikipedia.org/wiki/Road_rash and http://en.wikipedia.org/wiki/Spider_bite but if you visit either of those, you wouldn’t be surprised to find pictures of road rash or spider bites.
The guiding principle should be: is someone going to want to see this picture while they’re eating? If not, then don’t put it up where somebody can see it without expecting to see it. Do I want to see somebody’s anus while I’m eating? Frankly, no. Does anybody want to see my road rash while they’re eating? Frankly, no. The list of “somebody’s gonna toss their cookies if they see this at the dinner table” images isn’t very long.
I wish I could “like” this with the furious heat of a thousand fiery suns.
I continue to be moved by the story that we heard of a father whose son was researching stereoscopes for a school project. What did he find on the page about stereoscopes? A stereoscopic image of a nude woman.
We failed that father and that child. We must do better.
I assume the child is still in recovery?
As I assume you are?
(Evidently the recovery process is incomplete in your case).
And that, MZ, is the problem in a nutshell. The total lack of empathy for the father and child IS the problem.
MZ, I assume you disagree with the idea this is a topic for serious discussion?
Not all parts of it need to be taken equally seriously. Who pissed in your beer, anyway? I thought it was funny.
Well yes, once he was taken away from his abusive father.
Great post, Sue. I’ve been thinking about the discussion on Foundation-l (and a lot of similar forums that I subscribe to) and I really appreciated when you stood up and said: unacceptable when the “asshole” comment came up. Because I feel like we need to STAND UP more to bad behavior. If we don’t, then a culture of acceptance creeps in; a culture where it becomes acceptable for people to be mean and vicious and aggressive to one another. I feel like we don’t stand up enough — mostly because we’re afraid that the viciousness might turn on us. But we need to do it more. We need to communally enforce better standards. That’s the only way to change culture. Everyone has to do their little bit. So, thank you. And I promise to stand up more.
This post appears mostly to be the tone argument:
http://geekfeminism.wikia.com/wiki/Tone_argument
– rather than address those opposed to the WMF (the body perceived to be abusing its power), you frame their arguments as badly formed and therefore to be ignored.
This is somewhat problematic.
It is totally within the WMF’s purview to propose solutions which the community should evaluate. Does anyone in this discussion think the WMF ought *not* to participate in the community?
The WMF is a corporate entity. Corporations can only participate through the intermediary of individuals speaking for themselves. A corporation has no eyes, and is sexless; that makes it incapable of recognising obscenities until it is so notified.
I’m not sure a lot of people actually have any problem with exercising judgment about where images are used, though of course there is a spectrum of views on what’s appropriate in any given situation, and some communities may reach conclusions that seem abhorrent to others. In any event, the push for an image filter seems like a distraction from focusing on how (or whether) we can make the editorial judgment process better.
Also: http://search.gmane.org/?query=asshole&group=gmane.science.linguistics.wikipedia.technical
If you want to find rage, you should always check tech lists first. :-)
I also find it odd that this post completely fails to mention the massive de:wp poll voting 85% against the imposition of the filtering system.
I note also that you fail to acknowledge this is an imposition.
This is a strange framing of the issue.
Thank you. As someone who has been called a “prude,” “idiot,” “bitch,” “bore,” “conservative,” and more: thank you. I can only hope others take heed, and that those who oppose take this with a logical approach. Thank you, Sue. I look forward to working with the community to develop ideas, policies, and concepts that will benefit not only the Wikimedia community, but those who utilize our websites as resources. I know that my ideas are not in the minority, even though oftentimes I feel like they are…
I was expecting your post to discuss the case for the image filter but instead it focuses on editorial judgement. Nothing to disagree with there, but I would have steered around “empathy” and opted for “respect”.
The word Respect has a particularly useful meaning in the sense of respect for cultural heritage when discussing how editorial policy might choose to limit the nature of derived images of people (UNESCO has some useful guidelines to consider). Further, it does not rely on empathy, which can be hard for our contributors to achieve when considering all points of view; respect is a realistic goal for editors, even those with a medical condition that makes empathy an abstract concept.
As an example, consider where I mass-uploaded images of blind children from the 1910s being shown museum artefacts by touch. “Uncensored” might mean that someone could be allowed to upload a version with the children’s heads replaced by cartoons; “empathy” would struggle, as these unidentified children are long dead and so editorial policy is less restrictive; “respect” would mean that we want derived images to preserve the context of the original.
The image filter is a violation of the mission of the Wikimedia Foundation.
“The mission of the Wikimedia Foundation is to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally.”
If the editors of a project put an image in an article, it is because the image is relevant/educational/illustrative, and therefore it is content that the Foundation must “disseminate effectively and globally.” Letting some users block Wikipedia content is NOT a good way to do that.
It’s a waste of time and resources to support the POV that certain content should be censored.
It opens the door to censorship and gives arguments to the enemies of knowledge and freedom.
“freely share in the sum of all knowledge” remember?
Did you read the blog post you commented on?
I bet not. It’s a knee-jerk reaction. Part of the problem, as Sue points out, is automatic reactions like that. She’s getting attacked for pointing that out. (that too was a predictable knee-jerk reaction)
I’ve given up on WP and moved on to other things.
Sue was writing about editorial decisions versus censorship.
With respect to Wikipedia, the editorial choice is made by selecting images to be used or not used in the article the editor is writing.
But when the publishing company then blocks this image, that’s censorship.
The selection of images is, like the wording of the text, very open to the community and will be reviewed often, as the article is read by many people.
The tagging of images is something which is done in the background, hidden from direct view.
I do not quite share the concern about the tone of conversation around this issue. Some blowback was pretty much built in, certes, but I don’t honestly think it derailed the thoughtful discussion an inch. I am especially gratified, and even proud of our community, that people who were ardently in support of the filters had the integrity and intellectual honesty to admit that the referendum as constituted had very little legitimacy as a measure of how the community really feels about the issue.
«Demonizing and stereotyping people we disagree with pushes everyone into extremist positions» etc.: oh, thank you for reminding us, but you’re doing the very same thing here: «we refrain from the expression of editorial judgment on this one issue», «we’ve become an echo chamber: we hear only voices like our own, expressing points of view we already agree with». Do you really think that assuming that the community is failing and saying that whoever doesn’t agree with you doesn’t want to see and solve problems, doesn’t have empathy, doesn’t apply editorial judgement, imposes his point of view on the others etc… that all this is not «demonizing and stereotyping people we disagree with»?!
I clearly see the problems of the echo chamber and the rudeness in discussions. But your solution is no solution.
The proposed filter won’t work. Wikimedia has a very bad record of creating solutions for people who don’t want to learn the usage of MediaWiki in detail. If you give people hundreds of options to filter, they won’t use them at all, or they will use them wrong. It’s the same problem you describe above: young, well-educated men in Europe could use the proposed filter to any effect. The occasional Wikipedia reader won’t profit at all. The people who are offended by pictures of Mohammed in other language versions of Wikipedia won’t appreciate it.
Second problem: the WMF does not impose editorial judgement; it refrains from setting any standards on the projects. The filter could cause the opposite effect. Since everyone can theoretically filter controversial content, there is no need to use your judgement anymore. “Let’s put 20 pictures of breasts in any article — the offended people will not see them.” If pictures of vaginas on the main page are not acceptable, you will have to find a way to convince people to refrain from putting them there. The filter won’t do it.
Replacing an image with a placeholder is not censorship if clicking on the image shows it anyway. And if you are free to turn off placeholders entirely in your user settings. In fact, I would probably make use of a “workplace-safe” filter, especially if it can be activated for my work machine only.
It was mentioned somewhere that third-party filter software developers might use our tagging to block such tagged images completely. And that will probably happen. However, such filter software probably “overblocks” at the moment anyway, hiding more images than necessary. The situation would then be similar to the flagged revisions on German wikipedia: Hiding certain things by default (tagged images, IP edits), counter-intuitively, allows more freedom (everyone can edit the page, in case of FR).
I don’t think third-party filters will use Wikimedia Commons categories. They will use the same methods as for any other sites. Most filters will block entire sites without any regard for different kinds of content. Some use more accurate methods like pattern recognition. But they won’t integrate a filter specifically for Wikimedia Commons.
First: my English isn’t very good. Sorry for that. I’m a female editor and long-time volunteer on the German Wikipedia. I voted against the image filter and I had no problem with the vulva on the front page. I think it’s a mistake to think that only young angry white men hold this position. As Jan Eissfeldt wrote in the last Signpost (http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-09-26/Opinion_essay), filters are not very popular in Germany at all, and the concept of sexuality might differ from that in the U.S.
Because I was curious, I joined the last office hour on IRC last week. I heard the same things I find now in this post. I have to say that I’m very distressed about the claim that I have a lack of empathy or a lack of imagination. Might analyzing the personal character of people who don’t live in the same cultural area as you do show a lack of empathy as well?
Hope to see Sue in November. Regards, Kellerkind
+1. Talking down different opinions in the “lack of empathy” way is not good style. And it shows no empathy for the fact that maybe in cultural areas other than the US or the Arabic world there are different positions that need respect as well…
I think that Sue was criticizing EVERYBODY’s tone … maybe even her own.
I agree with you that filters won’t work. The problem is that some images are simply unacceptable in some contexts. Since you can’t predict the context in which an image on the front page will be viewed, these images should not appear on the front page of a wiki. Has *nothing* to do with sex and everything to do with good taste.
The vulva picture should *not* have been on the front page. Neither should pictures of shit, blood, amputated limbs, car crashes, Mohammed, etc.
Dear Mr. Nelson, Catfisheye told you on the comment page of Achim Raschka’s blog, and I will tell you again: “The only reluctance of showing the vulva on the main page I feel (but ex post) that it aroused so many misogynist comments, calling the picture of a vulva nasty or stuff, and a paternalistic discourse of a vulva being sexual therefore must be hidden. Sorry, but half of the mankind has a vulva as a normal part of its body. It is thoughts and culture that sexualise a body part, and yeah, obviously these are mainly male thoughts and a male culture, again…”
http://achimraschka.blogspot.com/2011/09/story-about-vulva-picture-open-letter.html
Thank you for your attention.
You say, “The other day –as far as I know for the first time ever– somebody called someone else an asshole on one of our mailing lists.”
You’re just new… Not to say such behavior ever was appropriate or got us anywhere.
Your emphasis on editorial judgment is productive; we are publishers and need to be conscious of the impact of our product on others.
Great post, Sue, thanks for that! Two things I want to add / comment on:
First a little story: Only a few weeks after I started my current job as the Executive Director of Wikimedia Germany, the following happened. The article about a small and very new German nonprofit called “MOGIS” was deleted in the German Wikipedia, because there were no sources and they did not meet the notability requirements for non-profits. MOGIS described itself as an organization of “victims of abuse against internet censorship”.
This deletion started a huge outcry in the German online community, a lot of bloggers got involved, press attention, and so on (and all that at the beginning of the first fundraiser under my watch – poor me!). The discussion lasted for weeks, inside and outside of Wikipedia.
What is interesting about this is that, in my view, that was the first time a long-overlooked group voiced its opinion: the readers. People who do not contribute to Wikipedia in any way, but who rely on it on a daily basis. They said to us: We do not care about your notability criteria; if we want information, we want to find it in Wikipedia.
I think that this is an important lesson for all of us: We need to get a better understanding of our readers, their wishes, their expectations. I don’t say that we have to please the readers in every way, but we should take their point of view more into account when we think about our rules.
To use a different example than the vulva picture: In the German article on arachnophobia, there was for years a picture called “Grammostola_aureostriata_L9_female_OnAHand_(2).JPG” (you can find it here: http://de.wikipedia.org/wiki/Datei:Grammostola_aureostriata_L9_female_OnAHand_%282%29.JPG). It shows a large, hairy spider sitting on a human hand. For someone who suffers from arachnophobia, that picture might make it impossible to read the article and thus learn something about his illness. And if that happens, then we have failed that poor arachnophobe, because he cannot “freely share in the sum of all knowledge.” And a lot of articles on arachnophobia in different language Wikipedias have even more gruesome pictures of spiders, used as “illustrations”. So this is not a German problem, but one of poor editorial judgment, because it leaves out the perspective of a reader looking for information.
The second aspect is more a comment: You wrote that the image filter is not censorship, because “Censorship is imposed from outside”. I agree with this, but I think large parts of the German community feel that the filter *is* imposed from the outside – that is, by the Board of Trustees of the Wikimedia Foundation. Now that is sad, and it might not be fair, but it is the way it is: For a lot of people within the projects, the WMF and its Board (or Wikimedia Deutschland, for that matter) is something they do not know a lot about and do not really care about. The thinking goes: “You do your job (running the website, developing software, doing great outreach stuff), while we do ours (writing the encyclopedia).” And from that point of view, the Board crossed a line by deciding an editorial question.
The lesson here is that all of us who are working for Wikimedia (foundation, chapters, paid staff or volunteers) need to communicate much better with the community about what is happening on a meta level, at the board, the chapters or the foundation.
And one last comment on the filter (and yes, I know this is not the right place, but anyway): The filter tries to solve a social problem by technical means – not a good idea.
The arachnophobia case shows that editorial judgement can work. If you talk to editors, they will agree that a hairy spider is not the best idea for this article.
An arachnophobia filter won’t work. The reader has to use the filter before he or she knows about a particular picture in the article. And this is just unrealistic, since 99.x percent of Wikipedia readers do not even bother to register.
Perhaps you could give editors the possibility to set an NSFW template on images. It’s just a few lines of JavaScript and could be used in the very few cases where educational content can offend.
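For what it's worth, a template-driven gadget along those lines really could be only a few lines of JavaScript. The sketch below assumes a hypothetical `nsfw-image` CSS class that such a template would attach to flagged images; the class name and the blur-until-click behaviour are illustrative assumptions, not an existing MediaWiki feature.

```javascript
// Hypothetical sketch of the editor-set NSFW marker suggested above.
// Assumes a template adds the class "nsfw-image" to flagged images;
// that class name is invented here purely for illustration.

// Pure helper: decide whether an image carries the editor-set flag.
function shouldHide(classNames) {
  return classNames.indexOf('nsfw-image') !== -1;
}

// Browser-only wiring: blur flagged images until the reader clicks them.
if (typeof document !== 'undefined') {
  document.querySelectorAll('img').forEach(function (img) {
    if (shouldHide(Array.from(img.classList))) {
      img.style.filter = 'blur(24px)';   // obscure content, keep page layout
      img.style.cursor = 'pointer';
      img.title = 'Click to show image';
      img.addEventListener('click', function () {
        img.style.filter = 'none';       // reveal only on an explicit click
      }, { once: true });
    }
  });
}
```

Because the flag lives in the page markup, the decision stays a per-article editorial call rather than a global category scheme.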
Almost 100% certain that an article on arachnophobia should not include a picture of a spider. Probably not a drawing either. Such images belong in an article on spiders.
Same thing for snakes. “Why does it always have to be snakes?” And whatever is called the syndrome where people faint at the sight of blood. And neither should an article on epilepsy have an animated flashing GIF.
The need for good editing knows no bounds. We shouldn’t mistake tasteless editing for a systemic problem in Wikipedia.
Readers contribute by reading. I appreciate their contribution.
I have to agree with Sue in many points. But there are some issues that weren’t addressed, because your statement was more general.
First, we had a long discussion about the vulva on the main page. You forgot to mention that it was put up as a test (we wanted the discussion) to see how our readership would handle it. It was not put up to say “Hey, we can do that”; it was put up to start a more general discussion. In the end the authors came to the conclusion that there is no need for such a restriction. If an image is chosen based on how valuable it is for illustrating a topic, then this is the way to go. Choosing an inferior one because it might be less offensive was rejected as an option.
You have spoken about the filter a little, but what is missing is that it does not solve the problem. The filter is far removed from editorial judgment, the discussion is not tied to the article, and it sets global standards. Given your post, I have to agree that we should not put in images based on the sentence “We can do that”, but excluding images based on the sentence “It could offend some” is at the same level. What is needed is a discussion within the article about whether the picture (offensive or not) is a valuable addition to it.
The problem wasn’t the article. The image is completely appropriate (although I suggest that that woman has an ugly vulva — apologies, but it’s my truth) for an article on vulvas. The problem is that it hit the front page. *That* was inappropriate, and bad editing. If you disagree, then we should post the pictures of my bloody road rash “for discussion”. Or if you think it’s about sexual acceptance and tolerance, then we should post pictures of a bloody tampon or menstrual pad.
But I agree that the filter idea is just dumb. Nobody will bother to use it until *after* they have seen an image that offends them. That is, after the harm is done.
And that’s the kind of sexism the discussion on DE exposed. Lots of people argued that they would like to have that image replaced with a more “beautiful” one.
Should we allow only beautiful and sexy people to be naked at a nudist beach, because other visitors could feel disgusted? Should we categorize images into beautiful and ugly genitals?
I think this is the mindset of people who see genitals in a mainly pornographic context, but I assume that men as well as women don’t want to be reduced to their role as a potential sex partner or porn star.
The world doesn’t consist only of clean, shaved vaginas. And those who can’t accept that shouldn’t use an encyclopedia.
You’re reading too much into my comment. This is not an article about sexism, about sex, porno, or anything else. It’s about the female vulva. As such, it’s not a good photo. Where are the well-defined outer labia? The inner labia? The clitoris? The urethra? Obviously she has all of these, but they’re not apposite for a photograph of a vulva.
It’s absurd to say that a male calling a vulva ugly is sexist. What would a male calling a penis ugly be? *Also* sexist? What if I don’t like the model’s hands? Is *that* sexist?
You’re over-reaching.
Good luck, Sue, and the rest of the board.
You’re going to need it. The community is too broken to solve this important problem.
Nahhhh. It’s just a problem of poor editing. We’ve faced that problem in the past, we’re facing it now, and we’ll have to face it in the future. WP:NOTCENSORED is never a justification for bad editing.
Crucial, though, is identifying the correct problem. If you identify the problem as American intolerance of sexual imagery, then you will never solve that problem. If you identify it as an edit beyond the bounds of good taste regardless of culture, then it’s solvable.
You write: “Wikipedia is not censored” is true. And, we need to stop using it as a conversation killer. It’s the beginning of the conversation, not the end of it.
My comment:
No, no, no, most definitely no. There are some things that are NOT discussable, and this is one of them. You cannot be half pregnant, somewhat alive or a bit uncensored.
Andreas, you are trying to solve the wrong problem. Wikipedia is not a bit censored, in that there should be an article on anything encyclopedic even if some people don’t like it. Wikipedia is subject to tasteful editing, in that the front page should not have pictures from certain articles likely to offend some (dare I say *any*?) reader. The problem is that the front page has zero context, and you cannot predict the reader’s context. Many pictures would be inappropriate for the front page. Anything with blood or shit. Anything where people have a bad reaction to visual stimuli: blinking for epileptics; spiders, snakes, rats.
You would likely have wide support to suppress blinking. Would you object to a Jurassic coprolite that cast new light on the eating habits of dinosaurs? If you’re going to ban pictures of spiders, snakes and rats so as to accommodate their respective phobics you might as well ban pictures of Ayrabs to pander to the xenophobes. By following that path you end up only appealing to the lowest common denominator. Maybe it’s even better to have no photos lest you offend someone with an obscure phobia that you never heard of. Judgement is not the exclusive responsibility of the writer. The reader who engages in wilful misinterpretation needs to accept his share.
I guess the year should be 2011, not 2010.
Sue, you can’t be serious. The image of the vagina on the main page naturally led to some dismissive comments, but the big majority of readers didn’t complain. Here in Germany we have the BILD tabloid, whose front page carries a pin-up that in Britain would only appear on page 3, and no one complains. In every biology schoolbook such images are shown to pupils (even in primary school), and on Wikipedia the image was used in the same educational manner. So what is the big deal? There is none. The whole idea of being culturally neutral doesn’t work. To make it simple: France and Germany do not have the same standards as the USA or Iran. So the communities have to decide how they do the editorial work to please the majority of their readers. On the German Wikipedia this is much different from the Arabic Wikipedia or the Japanese one. We should praise cultural differences instead of coming together on the smallest shared point… You could implement a filter on Wikipedias whose communities ask for it, to accommodate the expectations of their readers, not on Wikipedias that have no such problem at all. And when there is porn hosted on Commons without any educational use, then it has to be deleted, not filtered. It is the same ridiculous approach as trying, in Germany, to block child pornography instead of fighting for its deletion. Sue, make up your mind and respect the communities and the individual Wikipedias. This has to be the editorial choice of every single Wikipedia, so respect the editorial choice made by the German Wikipedia.
Julius, the big deal is that each of your examples has a context. A biology textbook has a biological context. BILD is well-known to have lots of flesh on its cover. The Wikipedia front page (for any culture) has zero context. Has nothing to do with culture. Every culture has its uncomfortable images. We respect readers by only presenting them with uncomfortable images in context.
No context? The Wikipedia front page has an /educational/ context. Isn’t that enough?
“Censorship is imposed from outside. Editorial judgment is something we do every day in the projects”
I feel that is the most important part of this post.
You’re right. As the U. S. Wikimedia Foundation is not part of the de.wikipedia community, its U.S.-centric approach is considered very much “from outside” here, hence we regard it as censorship.
Technically incorrect, as the de.wikipedia.org nameserver is run by the WMF. I believe that the de.wikipedia server is colocated with all the other Wikipedias. And the German Wikipedia includes links to Commons.
Word. We, as editors, decide. The readers from the outside should not be able to.
One of the issues in this debate is that people seem constitutionally incapable of understanding and sympathizing with the positions of others. Sue describes those opposed to the image filter as “rebellious adolescents” while Brandon calls them juvenile and immature. In both cases an assumption is made that the point of certain practices (and the sense of liberty that permits them) is to shock and dismay people. You’re dismissing a huge array of nuanced and intelligent arguments as if they didn’t matter. How can you be surprised when that upsets people?
Further, the post and some other comments fail to make the appropriate distinction between a discussion about editorial judgment and the recent cause of “angry debate.” The image filter proposal is what has caused such a backlash; I can’t support the image filter, especially the way it’s been described, despite the fact that I have been one of the more vocal proponents of “editorial judgment” when it comes to images that invoke personality rights and privacy concerns. One can agree that upskirt photos, porn stashes, and huge galleries of graphic photos on Commons are wrong and handled poorly… and still strongly disagree with the image filter as discussed. By thoughtlessly lumping them all together, you are again dismissing and refusing to acknowledge the validity of opposing opinions.
Finally, it’s tough to avoid the “authority / subordinate” dynamic when you speak in those terms. “Editors” and editorial judgment in news organizations are an expression of authority and, often, local social and economic concerns. In Wikimedia projects, editorial judgment is exercised by individuals and small groups coming to an agreement. The Wikimedia Foundation seems to be urging us to view the image filter as an element of editorial judgment, but it is fundamentally different in many ways – not least because it is apparently to be imposed upon us from the top down. This immediately and inevitably provokes a conflict over “who is in charge” and – in context – such a conflict is not merely needless anxiety but crucially important for the future of the movement.
You have misrepresented my comment.
I stated, basically, that if the only reason to do something is “because we can”, then that reasoning is juvenile and immature. That is not the same as saying “those opposed to the filter are juvenile and immature.”
Even mature adults can have immature reasoning.
To be honest, I don’t think that the opponents are unreasonable; in fact, I think there are many good points brought forth by both sides. I just happen to think one argument is stronger than the other.
But to say that I called filter opponents names is disingenuous and tries to uncover a conflict which doesn’t exist.
The trackback does not seem to work, so I will post it here again. Interesting post, but I disagree on several counts: Sorry, Sue Gardner, but the image filter question exactly asks who is in charge.
Maybe we could end this discussion by restricting Wikipedia to registered users only. By accepting the ToU, people would agree to reading surprising content and seeing astonishing illustrations.
This would satisfy the US American wish for filtering and the European wish to keep Wikipedia unfragmented. (Sorry for my bad English.)
Outstanding post, Sue.
Love to you,
Andreas
It seems to me that you are using a picture on a Wikipedia you can’t read as an excuse to impose an image filter you want. As Pavel Richter put it: You are trying to solve a social problem by technical means of a filter – not a good idea.
Furthermore, you describe yourself as an editor of Wikipedia. Just take a look at Wikipedia itself, where mere authors are titled editors. You and the Foundation are part of the enabling fraction, not of the content providers. Your organization used exactly this argument to fend off liability for Wikipedia content in Germany, by the way.
Thus, by restricting the pictures shown to users, you are performing an act of censorship. The clever bit on your part is that the offending pictures are downloaded to the user’s computer anyhow, just not displayed. Thus the liability of possession lies with the user; he does not have to see or know about the image on his computer for that.
Looking at the current decline in active participation in Wikipedia, your primary focus should be on keeping the content-generation part of the project up and running. The content in all the Wikipedias I regularly frequent is far from finished and in constant need of (volunteer) editor attention. It’s these volunteers who provide the information you sell (via the fundraisers) and who pay for the Foundation and its staff. Alienating part of that community, even if it is only the one from the second-largest Wikipedia, should definitely not be on your to-do list, though you seem to be doing a mighty good job at it!
This post appears mostly to be the tone argument:
http://geekfeminism.wikia.com/wiki/Tone_argument
– rather than address those opposed to the WMF (the body perceived to be abusing its power), Sue frames their arguments as badly formed and therefore to be ignored.
This is somewhat problematic.
I don’t think that this one will be solved by the direct action of imposing some rule. The polarisation on this makes no sense. As in a marriage, polarisation obscures what started the fight in the first place. We have a lot of younger people who feel keenly concerned with perceived rights without yet having learned sensitivity to the way the exercise of those rights affects the rights of others.
We also have a vocal segment of the readership that can be too quick to take offence, a segment too willing to interpret the slightest disagreeable comment as a grave attack. More in line with reality is that comments that are perfectly acceptable in one community can be misinterpreted as seriously offensive in another. When this becomes problematic the blame usually needs to be shared, and there is a need to understand that an agreement with one individual will not forestall the need to negotiate this with another, no matter how boring it may be to repeat the experience with each individual. In a multi-person environment it gets more complicated because neither extreme is acceptable. Insisting on an endless array of offensive pictures is just as damaging as a standard based on rigid morality.
We have allowed a high degree of mistrust to build across our communities. It did not start with offensive pictures. It is rooted as much in the endless debates about notability and original research, or the persistent wars between inclusionists and deletionists. The feeling that someone’s work might be deleted long after it was finished, because someone is now seeking to impose new standards, leaves people skittish and mistrustful. Imposing new perfections on the past is disrespectful. The meme “So fix it!” asks the complainer to improve on what is there; it is not a licence to destroy. It is against this background that the offensive-images debate is now being waged.
That the German community has taken a stand so diametrically opposed to the view of the Foundation is an ominous development. That has left the impression (rightly or wrongly) that an American view of morality is being imposed on communities that are mostly based in other countries. There are certainly legal implications to having servers and an incorporation in the United States, but those must be kept to an absolute minimum. Failing this, it is inevitable that the activities of the Foundation will be examined with the same magnifying glass as US foreign policy generally. The recent Foundation tendency to centralise policy and finance can only exacerbate this.
Largely, the Foundation needs to step back from its drive to impose this policy until more opportune times. The proposed policy needs to be viewed in a larger context that also needs to be addressed. When that has been done there may be no need for what is being now proposed.
Dear Sue,
I was active in the discussions on the Foundation mailing list. I’m also a perfect example of the white male Liberal/Libertarian stereotype that concerns you.
Yet I’m poles apart from you on this, both in my analysis of my fellow editors concerns and in the response I’ve had to my postings on the Foundation mailing list.
Firstly my analysis of the opposition is very different to yours. I understand your comments about the opposition, but I see other concerns as well.
Any proposal which depends on the category system will lead to complaints that x or y image has been on Commons for so many days without being categorised as depicting a penis. That’s why people worry that this would increase the burden on a dwindling volunteer force, many of whom didn’t want this in the first place.
A category based image filter system relies on all current and future images being appropriately categorised with regards to the categories used by the image filter. Including eleven million already on commons and millions more to come. In a paid operation you would hire lots of extra staff to do this. In a volunteer operation all one can say is that the management implemented a filter without recruiting the staff to do the tagging to make it work. Flickr resolves this by putting the onus on the uploader of an image to categorise it, and thereby gets into conflict with uploaders who don’t want to do that, or have a different perception of controversial content. People worry that an image filter based on categories would place a categorisation burden on either the uploader or the whole volunteer community, or both, and that we have over eleven million images that would need to be reviewed against the new filter categories.
There are also people who argue that our category system is not designed for this purpose – categories are there to say what is depicted by an image not how potentially offensive it might be. The two do not always coincide.
There are people, including myself, who are comfortable with the idea of a personal filter which gives people greater choice as to which images they see, but who don’t want to empower anyone to censor what others can see. Whether at the level of a Wikimedia project, a nation, an ISP or indeed a shared IP address.
People care about the projects so much that even if they oppose a decision, if the consensus goes against them they will put time in to making things work. But many people don’t see the Foundation proposal for an image filter as either workable or a consensus based decision. Failing either is a problem, failing both more so.
Secondly I don’t see consensus as a barrier to be overcome by appealing to a different audience, I see consensus as a process whereby people listen to each others concerns and seek to accommodate them.
So how come my posts to the Foundation mailing list proposing an image filter got a more polite response than the official Foundation proposal? Perhaps it was the move away from categories, the shift in the onus of responsibility and the labour from the uploader or the community to the actual users of the filters. Perhaps it was because I’ve been careful to describe the recent questionnaire as a survey or a consultation, but not as a referendum. Perhaps it was because my support for a filter was motivated by a desire to break the barriers that limit our readership in the Islamic world, more than concern to appease critics in a region where we have pretty high readership already. Or perhaps it was because I made it clear that I was listening to feedback and trying to adapt to it. So if you seek a “change in tone and emphasis” may I recommend an extra emphasis on achieving consensus by acknowledging and trying to understand and resolve objections?
TTFN
WereSpielChequers
PS Now that DE wiki’s poll has come out 86% against an image filter, any further discussion of an image filter needs to respond to that poll. This can be done in several ways: seek to resolve and reverse that opposition and get the 60% shift in opinion needed to achieve consensus support on DE wiki; bypass it by only implementing the filter in projects that have consensus for implementation; overrule it by seeking a project-wide consensus that all projects must have this, even if the local consensus is against; or overrule it by discarding consensus and making a WMF decision. I think we should have an image filter, but I’d rather not have one than have one implemented against the will of the community. Would you and the board be willing to say the same?
Thanks for this post, Sue. I think you make a lot of good points. It’s really unfortunate that the “referendum” was so poorly implemented that it has detracted from your very good goals…
I’d like to add to one of your points. You mention that image of cleavage and how it probably shouldn’t be in the article about necklines. There certainly should be an image of some cleavage in the article, since it is a very important part of the subject matter. The problem with that particular image is that it is of rather poor quality. In particular, there is clutter in the foreground and the lighting is poor (and cutting off the subject’s head is a questionable composition choice). There is a subconscious tendency to associate poor quality images like that with porn, since a lot of porn is of poor quality in similar ways. A better quality image of the same subject wouldn’t carry the same associations and probably wouldn’t elicit the same complaints (in fact, there are some other very similar, but more professional looking, pictures next to this one in the article that you haven’t commented on).
You imply yourself that a photo of a penis taken with a camera phone is in some way more objectionable than a professional-looking photo of a penis. Perhaps we could make a lot of progress on this issue by improving the quality of our photographs, rather than removing them (or making them hide-able).
Yes, my own reaction was one of disbelief on reading this:
> When an editor asks if the image cleavage_(breasts).jpg really belongs in the article about clothing necklines, she shouldn’t get shouted down about prudishness: we should try to find better images that don’t overly sexualize a non-sexual topic.
A non-*what*?
Poor censorship; with friends like these…
As a response, from the guy who wrote the vulva article in the German Wikipedia, on my blog: “A story about a vulva picture – an open letter to Sue Gardner” – http://achimraschka.blogspot.com/2011/09/story-about-vulva-picture-open-letter.html
Mmh, the link to my blog does not appear, so here is my full answer from my blog (http://achimraschka.blogspot.com/2011/09/story-about-vulva-picture-open-letter.html):
Dear Sue Gardner,
please treat this as a response to your post “On editorial judgment, and empathy” and as an addition to Dirk’s “Sorry, Sue Gardner, but the image filter question exactly asks, who is in charge” and Jan’s Signpost piece “The global mission, the image filter and the ‘German question’”
Who am I?
First let me introduce myself: I am Achim Raschka, and I am one of the main authors – if, in quantity, not the main one – of the German article Vulva. In this case I am one of those you address with the phrase “who are acting like provocateurs and agitators” that “need to be stopped”. People like me you describe as “young well educated men in Europe”. O.k. – maybe you call a man (me) of around 40 years young, and maybe you also think that a full-time employed and married father of four little children – a fifth on the way – spends hours of his rare time on an article about a sexual subject just to agitate? What I really think is that it doesn’t matter to you who I am, or who all those other authors in the German community you address in your post are. So let me add some more information about myself, simply as a chance to learn who this vulva man is: I have been an author in the German Wikipedia since 2003 and have done about 70,000 edits (what is your score?); about 100 articles mainly written by me have been selected as good or featured articles in these years. In January this year I was invited to San Francisco by your team to discuss article quality and ways to adapt the ambassador programme for Germany; a bit later I was elected to the board of the German chapter.
About the Vulva
In your post you cited the “close-up photograph of an open vagina” (sorry to correct you: it was an open vulva, not a vagina, which is an internal organ – an article mainly written by me) as one of the main reasons to discuss the “handling of controversial imagery.”
So, as the author of this article, I was never asked by you about my (and our) reasons for showing this picture on the main page; nor did the WMF ever ask anyone in the German authors’ community why they chose not to withdraw the picture after the discussion. But although you never asked, I would like to tell you the vulva story:
The vulva is one of the most viewed articles in the German Wikipedia, and it was one of the most visited topics well before I decided to work on the article. Before I started this work it looked like this – an article on a major topic in a terrible condition. I – a biologist – worked mainly together with two gynecologists, a veterinarian and anatomist, all well known for their quality articles in the German Wikipedia, as well as a midwife I know from the birth of one of my children as an external advisor, to push this article up from a volume of about 13 kB to the current article of about 70 kB. Every single sentence, every chapter and even every picture was discussed within the editors’ team and on the discussion page in terms of its quality for the article and the subject described – every single picture was chosen for its value to the article, as it should be in Wikipedia!
After this work the article was elected a featured article in the German Wikipedia, and I nominated it for presentation on the main page – together with the picture you call “controversial” (Why? It shows a natural, hairy vulva, like my wife has and even you have!). The only reason for this nomination was to show that even in topics like this we are able to produce featured work, and to present the product of a highly focused team of experts from biology and medicine – no more and no less. That’s all; that’s the vulva story.
What effect a filter will have
To go a bit further, let me outline what the effect of a filter on content like this will be: With the opportunity to filter content – text or pictures – chosen by people other than the authors, you hand over the discussion on content to users who never discussed any part of the content of the articles in question. Why should a team of experts discuss any content of an article if it doesn’t matter what they decide? Why should we choose the best and most valuable content for Wikipedia if, anywhere on Commons and even on the reader’s side, our decision doesn’t matter? Why should we write articles and work on them if even the WMF and its board don’t appreciate this volunteer work?
Maybe it doesn’t matter to you if all those guys like me withdraw their work from the German Wikipedia, or even think and talk about a fork – maybe there will be a next generation of editors for this project, and maybe we cannot withdraw because of our addiction to this wonderful project – but even if that becomes reality, I predict there will be a really big gap in trust between the volunteers in the German Wikipedia on the one hand and the Wikimedia Foundation, and even the English project, on the other, if you decide to overrule and act against the wishes of nearly the whole German Wikipedia community. And to cite Pavel Richter for a closing word: “The filter tries to solve a social problem by technical means – not a good idea.”
Kind regards,
Achim Raschka
The vulva article itself is fine. So is the photo (well, I argue that you could find a woman with a prettier vulva, but whatever). The problem is that it appeared out of context on the front page. The front page HAS no context. Consequently, there are a whole slew of images which should never EVER appear on the front page. Pictures of blood, shit, amputated limbs, aborted babies, oozing wounds, car crashes, spiders, snakes, rats, bestiality, etc. In context, these photos are fine, because people will expect them from the context.
The mistake was to take the photo out of context. Nobody seems to realize that. They try to make it about sex, or about American vs German sensibility, or WP:NOTCENSORED. Since those aren’t the problem, talking about them that way will never EVER come to a satisfactory conclusion, just as you have written and just as Sue writes.
Sorry, I don’t understand this blog post. The German article shows more than one close-up, with and without hair removal. The pictures are almost the same in every language version; only the amount of text differs. As schoolboys, our first move in any encyclopedia was to look at the pictures of specific articles. That has always been so, and I hope it always will be. Maybe there are articles where filters make sense. The lemma vulva or vagina definitely doesn’t need them.
What I still don’t understand is why the WMF should personally bother about this issue. There are nowadays a good many parental-control programs that could do the job. It might be a good idea to develop cooperation with these programs – by indicating whether an image is controversial and why. But I don’t think we have any interest in committing ourselves to doing our own control. This kind of touchy thing ought to stay outside the general structure of our projects.
Because people complain, and they complain to the WMF, not to the individual editor who did the edit (which is what they *should* do, but “should” is a normative word).
Hmm… The WMF is not responsible for it. It mainly has a hosting and promoting role. What editors do must be solved by editors.
What the WMF could do is say: « we have a lot of complaints about controversial images », and advise the Wikimedian communities on the one hand and parental-control programs on the other to do something about it. It should restrict itself to a moral position and not act on the encyclopedic content itself.
After reading the blog of Sue Gardner twice (I just couldn’t believe it the first time), I’m still quite upset and more or less speechless about such a point of view from someone who is with the WMF. The WMF should as soon as possible change itself to a *purely* author-led organization, because without the authors, the whole movement would not exist. But without people like Gardner it will definitely continue to prosper. Sorry if I’m sounding harsh, but I feel even worse than that.
It’s not the filter as such that upsets me, but the fact that the WMF supports and develops it. The task of the WMF is to support the distribution of free knowledge to everyone, not to give everyone a tool to hide (parts of) that knowledge. If someone wants to hide certain parts of that knowledge, they can of course do so – it’s a free world, after all – but please without any help from the WMF.
That does NOT mean that we don’t need more awareness of editorial judgment regarding content, as in the arachnophobia example by Pavel Richter (thanks for bringing this up). This is actually something the WMF could teach the authors (I mean, how to write content so that it optimally reaches the groups of people who might be interested in it).
Th.
p.s. Thanks to Achim Raschka for that quick and necessary response in your blog. It was very comforting to read.
We proudly enforce a *higher* standard than the law requires on copyright, in service of our mission. I believe we owe it to our readers and editors to consider what *higher* standards would best support our collaborative and editorial mission.
The idea that mere entertainment of that notion constitutes editorial interference is simply wrong. There is certainly an editorial line the Foundation should not cross, but I have not seen it approaching that line in its handling of this issue.
Unbelievable. Please, Sue – retire. I don’t see any other way to get back on track.
This is not only about the image filter but also about the relationship between the Foundation and the community (de-community != de-chapter). In my opinion, this relationship is fundamentally different between enwiki and dewiki.
While there has always been interaction between the English Wikipedia and the Foundation (Jimbo deciding disputes and later creating the ArbCom, coordination between the ArbCom and the Foundation, …), the German Wikipedia has basically been self-sufficient for as long as it has existed. While “voting is evil” on enwiki, voting is everything on dewiki, because in the absence of authorities there is no other way to decide things that don’t fall into categories such as “right” or “wrong”. There is also a strong spirit of independence in the community.
If one takes into account that dewiki has been left to itself for many years, with only a few exceptions, it should be more understandable that there are strong reactions. It is basically the first time that the Foundation has intervened in a way concerning the content (or its display), trying to solve a problem that almost no one had regarded as such.
I don’t think it’s a good idea to call people who disagree “provocateurs and agitators” — I tend to stand on the “censorship is evil” side of the bandwagon as well, and it’s easy to take it a bit personally when we’re all tarred with such a brush!
Some general notes on my perspectives —
Traditionally I’ve been very anti-censorship. History is littered with the shattered bodies of people who were ostracized, beaten or killed for making statements or taking personal actions that offended someone. Should I worry more about the comfortable suburban father who is offended by his child seeing a naked woman, or about the women who are maimed or murdered every year by family members for ‘dishonoring the family’ by wearing skimpy clothes or having premarital sex? Should I worry about the people offended by a mild religious parody they could easily have just ignored, or the people receiving death threats over drawing them?
In the grand scheme of things I tend to feel like that father’s making much ado about nothing, and maybe I’m even right. But is that actually the point?
The idea behind the image filter proposal isn’t to make offensive images *go away* — it’s to provide a speed bump to the subset of people who would react inappropriately due to their taking offense.
I like to quote an exchange from the movie Pulp Fiction, something to the effect of:
“Promise me you won’t get offended, but …”
“No. I don’t know what you’re going to say so I don’t know if it would offend me; if I get offended anyway I’d be breaking my promise by no fault of my own.”
In an ideal world, nobody’d be offended by any of this stuff; we’d all address everything rationally. I personally don’t find human genitals or drawings of ancient prophets offensive at all, while I find gruesome pictures of dead bodies to be distasteful and unpleasant and would usually rather not see them myself. But like an embarrassing personal question, sometimes any or all of them are relevant to a topic and *should be there*.
It’s not really the subject matter that’s at issue with so-called “controversial images” — it’s the *intense reaction* that a fairly large set of people have to some of them, something far more visceral and instantaneous than you’ll get from reading any text.
Generally authors tailor what/how much they show to both the subject matter and the expected audience. Sometimes you know you’re going to shock a few people (and sometimes shock is an appropriate feeling to convey!) but it requires knowing who that audience is intended to be.
With Wikipedia, we have a *huge* worldwide audience made up of people with lots and LOTS of different beliefs and reactions… but only one version of each article per language.
It’s not really censorship we’re looking for, but a tool to help target content to the audiences who will appreciate it.
Are there slippery slopes? Hell yeah. But there are some legitimate benefits that I think shouldn’t be dismissed out of hand with the cry of “freedom of speech!”
It’s entirely possible that the image filter proposal needs lots of fixes or wild changes — I’m not going to argue on that as I’m not familiar with its intimate details and have neither been championing nor fighting it. But I do know that software is malleable — we don’t have to be fixed on anything — and I’d rather have an open, configurable system that lets us keep knowledge and information available for anyone who wants to see it than one that ends up removing information and images entirely.
— brion
Thanks to the folks who’ve commented here. I commented on Achim’s blog, and I will probably write more here later. But I wanted to say super-fast, for now, because I think it’s not clear: I am deliberately not talking in this post about the image filter, or about the referendum results, because I am not ready to talk about them yet. That’s because the results are still being analysed and discussed, and there’s no point in me talking about it until I have something to actually say. (I did send this report to the Board a few days ago: it says pretty much everything I have to say about the referendum, for the moment.)
What I talk about in this post is completely independent of the filter, and it’s worth discussing (IMO) on its own merits. If people really want to talk about the filter that’s fine, but I worry that anger about it will make it impossible to have a thoughtful conversation about editorial judgment. And I think editorial judgment’s worth discussing, independent of the other issue.
A question for the people here: if you set aside the filter question, what’s your view on the question of editorial judgment? When readers complain they’ve been surprised or offended by something they saw on Wikipedia, how do you think we should respond?
Sue Gardner wrote:
> I am deliberately not talking in this post about the image filter,
> or about the referendum results, because I am not ready to
> talk about them yet. That’s because the results are still being
> analysed and discussed, ..
Why isn’t the ‘referendum’ data being released so the community can also analyse and discuss it?
Sue: Surely there are some problems with editorial judgement, but in my experience they are few. As a reader, I have never seen a picture that would have shocked me as a teenager. There are scary pictures, there are disgusting images, but they are all in articles about scary and disgusting subjects.
The discussion about this did not take off after de.wp showed a vulva picture on the main page. The discussion started after the porno wars between Larry and Jimmy. That is the ignition point, and this is what shaped the debate.
If you want to strengthen editorial judgement, that’s fair game. If there is a discussion about not putting pictures of hairy spiders at the top of the article about arachnophobia, the result will be predictable.
But what you have actually done is weaken editorial judgement. If people see different versions of the article, you can’t really impose rules for the content. Why should Wikipedians refrain from putting controversial pictures in every single article when there is a filter solution? As mentioned above, the filter won’t work for the vast majority of readers, but it will be part of the new ruleset. Editors would have to work under the assumption that it works.
The core problem is: the WMF wants to change parts of editorial judgement without getting tangled up in the decision process, probably for legal reasons. If the WMF decides what picture is shown, the WMF can be sued or held responsible in many ways. Is this a factor in this discussion?
But imposing a new rule that puts massive amounts of work on a shrinking community of authors, who tend not to agree with the Board’s decision, puts Wikipedia as a working environment more at risk than it is today. And the ill-advised referendum did much to offend Wikipedians who were willing to discuss.
If you want to improve the tone of debates you have to work with the community. ‘Wiki loves monuments’ was an unpredicted success. ‘Wiki hates controversial content’ is an unmitigated disaster.
>But what you have actually done is weaken editorial judgement. If people see different versions of the article, you can’t really impose rules for the content.
This is indeed an issue, and could even destroy the wiki system (it shouldn’t because it doesn’t include text), besides being against NPOV. Cf. http://meatballwiki.org/wiki/ViewPoint
I’m trying not to reply to specific comments, mostly because threaded comments display badly on this blog. (I could probably fix it to unthread, and eventually I will try.) But just a few quick things…
* I found Thomas Dalton’s comment really useful. That’s the kind of conversation I would like to see us having — e.g., what can we do to fix the supply side problem? Niabot’s post was helpful too, I thought.
* To Brion: I knew when I wrote “provocateurs and agitators,” I would regret it :-) But I really did mean provocateurs and agitators: it wasn’t code for “people who disagree with me.” I consider myself also to be part of the censorship-is-evil brigade: if I am making those people feel I oppose them, that’s not good. I don’t want, and did not want, to alienate or offend people like Achim Raschka.
* Similarly, to Nate. I don’t think I actually described anyone as “rebellious adolescents.” What I said was that we have gotten ourselves locked into an (unhealthy) ‘authoritarian parent / rebellious adolescent’ dynamic. Believe me, if you are uncomfortable in the rebellious adolescent role, I am even more uncomfortable as an authoritarian parent. That’s part of what drove me to write this post: I have felt myself getting backed into a corner towards playing that role, and I am wanting to resist being cast that way. (I’m not saying who put me there: it’s entirely possible I put myself there. I am just saying I don’t think it’s helpful, and I’d like us to all get out of that box.) Anyway, I blame my choice of words on my journalistic background — I’m trained to write colourfully, and I don’t mean any offence.
* To John: I don’t know why the referendum data isn’t publicly released, and for all I know it will be. I can imagine good reasons why some might not be released, for example to protect people’s privacy. In my report on meta, I included many of the write-in comments, despite the fact that I don’t think we explicitly asked permission to do that. I felt uncomfortable doing it, but I did it anyway because i) I picked only comments in which people didn’t give out information that would definitively identify them, and ii) I thought it was important for people to hear people they disagree with expressing themselves thoughtfully and intelligently. I found that reading the comments really helped me to understand other people’s perspectives, and I hope other people will find that too.
I am quite confused at the moment … At what point exactly did you forget that the Foundation is there to serve the communities and that you get their money because of the communities’ work, and not the other way round? You aren’t in any way supposed to be the “authoritarian parent” of the communities, your role is, if any, the role of a “humble servant” of the communities. The communities were there before you, and they don’t exist to follow your orders.
Well, it is an interesting shift in the attention of the Foundation, from the editors to the readers – although presumably the first are a part of the last, and the last could potentially become part of the first. This shift would not be such a problem if we had substantial data about the readers. At the moment we can only assume that there is a tacit majority waiting for the filter to come, and once the filter is installed we can claim to have done something successful, because “the reader” will remain tacit. Caring about the problems of the editors, who have such a difficult thing in mind as unbiased, quality content, is much harder, and therefore much harder to claim credit for, because the editors are used to expressing their opinion – even if that means they have to write in a foreign language.
What has bothered me since this image filter discussion started is that, according to her en:wp entry, Sue Gardner received a degree in journalism, so I would have expected some kind of research – as for a good wp entry – as the starting point of this discussion, sine ira et studio, as it should be. If you look up phrases like “media in educational context” or “cultural differences in new media” in scientific databases, you see that the number of publications is legion, but for this problem nobody actually bothered to read them. As in any debate in a wp context, you will need scientific argumentation to convince your “opponents”, not accusations. Imagining pressing a like button “with the furious heat of a thousand fiery suns” surely will not help.
And I have doubts that visiting the general meeting of the de-chapter – which by no means coincides with the German-writing community – without being able to talk with those few attendees in German will help. It seems more like “I tried to talk with them, but they are not reasonable because they did not understand me”. Surprise, surprise.
“Trained to write colourfully”, there’s an indictment of the journalism profession! … but I’m happy about the “u”.
I haven’t yet adopted the hard “g” spelling for judgement, but I do agree that editorial judgement is worth discussing. Unfortunately, such dispassionate consideration as may be needed is more difficult when a specific application is forefront in everyone’s mind.
Purely on the issue of editorial judgement: lamentably, it is hugely skewed in the direction of midwestern US values, as should be expected. Continental Europeans, much less Northern Europeans, have far more liberal views on what is properly accorded to amour propre than midwesterners. We celebrate the diversity of cultures, but lament their clashes, particularly when you are on the losing side.
Dear Sue, you missed one very important point when referring to the decision to put a vulva on de-WP’s main page: The image and the decision on the article of the day did not make news in Germany. No one complained, no one deemed it controversial.
Germany and German culture are different from yours in regard to nudity. If en-WP does not want a vulva on their main page: fine, don’t put it there. But please accept that other cultures have other values.
Perhaps a majority of Germans were offended, but their offense did not rise to the level of complaining? Perhaps the Germans who were offended were not the type of people to complain? Perhaps Germans, as a culture, are more accepting of things they don’t like and complain less? Perhaps one person complained, and his complaints were dismissed?
Absence of evidence is not evidence of absence.
But more to the point, there will be images which will offend the average German. They should not appear on the front page out of context.
Again someone claiming to speak for the silent majority. This is an invalid, straw man argument, please refrain from making it in a serious debate. And this is a serious debate.
Actually, I’m claiming to speak for the one person that this image totally hacked off, and has forsworn Wikipedia forever. Was posting a tasteless image worth the loss of that person? Are you suggesting that nobody was offended (you can’t prove that, of course)? I’m suggesting that somebody was.
We should be considerate and careful, don’t get me wrong. And I don’t know this particular case well enough to say for certain that the image should or shouldn’t have been in there, although it does seem to me that the people here from German Wikipedia do have a good point.
But there is absolutely no way that we can please everyone. Someone will always get offended by something. There’s no point in trying to please everyone, it just can’t be done.
That does of course not mean that we should go to the other extreme and do our best to offend people. But I don’t think anyone has meant to do that, either.
See post below.
“Tasteless”? Is that really the term you wanted to use?
The picture wasn’t presented out of context; it was part of the “today’s featured article” box, surrounded by the first paragraph of the article. (“Out of context” is a much better description for the way pictures are presented on Commons, which is where our real problems lie – but a filter won’t help with that.)
Thousands of people are offended by our content every day. Most of the time it’s not because of an image, but because of an assertion in the article text. An encyclopedia which puts itself in the tradition of the Enlightenment has to present things as they are. That doesn’t mean to deliberately shock people, but it’s the opposite of relativism and trying to please everyone.
I am from the German wikipedia, user since 2005, admin since 2010, and strongly object to your patronizing attitude, Sue. Why don’t you rely on your own arguments? Why do you feel that presenting yourself as a speaker for a “tacit majority” will strengthen your point? You seem to indicate that, by virtue of your “empathy”, you know what the “tacit majority” would say if they were not tacit …. This kind of reasoning is definitely not helpful. “Be empathic, will you!” is a contradiction in itself.
If anyone objects to a picture in a Wikimedia project, this should not be brushed off rudely, okay. But it is exactly the wrong attitude to soften everything to the point that no one has to be “astonished.” We have to find ways to quarrel, to act in conflicts, in dissent, _without_ patronizing the other. What you are doing is, alas, exactly the opposite.
As a member of the OTRS team who was active at the time we had the vulva on the front page: the complaint emails were very low in number (far lower than I expected when I saw the vulva image on the front page at midnight). There were no news reports on TV or radio (AFAIK); the public response was very small.
So even if you overrule the German community and force the filter on us: you will not help the readers (or only a very small number of them), because the readers are not and were not shocked, and do not need a filter, and will not use it (until you change the opt-in to an opt-out, which – we both know this – is only a matter of time).
Overall, this is largely an Anglo-German debate. In order to broaden the scope of the discussion a bit, I tried here to address the issue from a francophone viewpoint: « An idiomatic French expression renders well what I felt when I discovered the Image Filter Referendum: comme un cheveu sur la soupe. It means literally like a hair in the soup, the nearest English equivalent being out of the blue. So far, I have hardly seen many complaints about the use of so-called controversial material… »
More here: http://wikitrekk.blogspot.com/2011/09/out-of-blue.html
Been thinking about this some more. I don’t believe that filtering images based on tags is necessary. It largely wouldn’t solve the stated problem, it serves to split up the community by what they can and can’t see, and for the people who don’t get to choose the tags on their preference, it *is* censorship.
Instead, we editors need to recognize that there are two kinds of images: images which are offensive to some people, and images which are not. In the first class: portraits of Mohamed; in the second class: portraits of Winston Churchill. I think that the key to keeping most people happy is to ensure that images of the first class only appear in articles where readers will expect them. Thus, people who don’t want to see pictures of gangrenous limbs can stay away from the Gangrene article, while people with any kind of interest in that article can visit it without fear that the editor self-censored.
I think we editors also need to recognize that this issue is not about Germans or German tolerance; not about Americans or American prudery; not about vulvas (or penises); not about censorship; not about any specific article or edit or editor.
We also need to recognize that it is not possible to eliminate all complaints. We do need, however, to be able to *respond* to all complaints. Thus, when somebody complains about the picture of someone’s chest being ripped open for open heart surgery on the page about Cardiac Surgery, the recipient of the complaint can very reasonably say “Well, what did you expect to see besides a picture of Cardiac Surgery?” If a Muslim complains about seeing pictures of Mohammad on the article about Mohammad, they should be told that the encyclopedia is for everyone, not just Muslims (or anyone else with a religious objection, e.g. iconoclasts to images of Christ).
I hope that every editor can see how this system could be implemented with nothing more than a recommendation. I hesitate even to suggest any kind of tagging, in case somebody decided that they want to create a censored Wikipedia with no offensive images. The question an editor should ask themselves is “If someone of a gentle sensibility has read the title of this article, will they be surprised by the presence of this image?” If the answer is “yes”, then the editor should exercise their better judgment.
Thus, the images of my infected spider bite (which I injudiciously posted to Flickr and was promptly scolded for doing so) would surprise someone who was looking at the article on Spiders or Knees (even though it was right behind my knee), but they wouldn’t surprise someone reading the article on Skin Infections. I want to make it clear here that I’ve made the mistake I’m hoping other people can avoid.
Can we lay this issue to rest now that I have resolved it using my nearly infinite wisdom? Don’t make me use The Force on you! … cuz you’ll never know if I do!
Oh, and before any Germans want to get on their high horse about how Germans are not offended by any image, no matter how horrible, here is an image that will offend a majority of Germans:
https://picasaweb.google.com/116934584659759034135/Random#5658101423994580898
I’ve got a good answer to that which I’ve seen printed on a t-shirt saying “Teach me democracy” http://www.bestsyndication.com/2005/Dan-WILSON/Current-Affairs/09/images/092605-lynndie-england-abu-ghraib.jpg
:-)
Thanks, Sue, for opening the discussion and trying to get it back on the real topic: how do we cope with problematic content (and failures to reasonably deal with it) on Commons and other Wikimedia projects? I think it is very bad that the way this issue has been handled has led to the discussion being all about censorship and not the mess that Commons is.
I don’t envy your position: The board made a decision that you have to carry out. And now, some people even think that you made the decision (and confuse the CEO with the board members).
A couple of comments. First of all, the process around the image filter lacked, in my opinion, an important element that is needed to make big changes in the Wikimedia projects work: the part where the editors make it their own. You wrote an interesting piece on that (https://suegardner.org/2010/11/09/making-change-at-wikimedia-nine-patterns-that-work/), but when you compare fundamental changes that worked (the license change, flagged revisions, the strategy project) to this one, those had a point where the editors could make the thing/the process their own, like a vote or a discussion process. There is no such process for the image filter, and that’s problematic.
The other thing is that on the German Wikipedia, controversial images have never been a real problem. The vulva example might be an example of bad judgement, but it is certainly not an example of editors being careless. Before putting the article and the image on the main page for 24 hours, there was a discussion that centered largely on the controversial image. Editors were aware, and then made the editorial decision. There was a backlash, but it was actually not so large that you could say the editors underestimated the controversiality of the image. Also, the vulva image on the main page was a singular event.
Now, what has surprised me all along is why an image filter should be enabled on a project like the German Wikipedia, which has no real problem with controversial content. Apparently the explanation is that the Board and you think there is a problem with controversial content on the German Wikipedia. And honestly, I can only attribute that to a skewed cultural perception. If the Board wants a filtering option on the Aceh Wikipedia, well, I can at least understand that, although I don’t endorse it. But there’s no need for it on the German Wikipedia.
One final comment: besides the handling of this issue, I believe there are reasons why censorship is dominating the debate. One is the fact that radical Islam wants censorship, and the discussion about the Muhammad caricatures is about censorship and self-censorship. The other is that the right in the US is also about censorship and self-censorship, in particular regarding nudity. Here is a slippery slope, and here is a core question about whether there is any alignment between our values and those of radical Islam or the far right in the US. I say our values are about fighting intolerance and ignorance. I know that the Board thinks so too. But apparently there’s a gap in views on how to achieve that outside the western world.
You are full of bullshit. The German Wikipedia handled this issue very well. There is nothing wrong with showcasing an article about the vagina on the front page. There is nothing wrong with displaying, prominently, an image of a vagina in an article about the vagina.
And we don’t accept bullshit about objectionable content. We don’t accept bullshit about vaginas or penises or anuses or any other body part. We don’t accept bullshit about Muhammad, either. We are in the fucking 21st century, and we are building an encyclopedia.
And in fact I find it funny (and sad) that you state that we should publish the Muhammad cartoons anyway, but at the same time you, without ever saying this explicitly, impose vague barriers on displaying body parts. You also accuse entire communities when they reach consensus on displaying it, if you think they should do otherwise. This shows me that you are much more worried about sensitivities that westerners also share. And that you lack respect for community consensus. Your position of power worries me.
The point I actually agree with is that we shouldn’t overly sexualize a non-sexual topic. But this is a no-brainer, and quite tangential. Putting a vagina in an article about the vagina isn’t overly sexualizing; featuring an article about vaginas on the front page has nothing to do with that either.
@Russ Nelson: Do you know any Germans in person? ;-) Your picture might not be the funniest picture ever created but I doubt that anyone might find it offensive. Some Germans might even take it as a compliment.
About voices in a vacuum: I find it incredibly ridiculous and frustrating that I can’t let like-minded people know about issues being voted on because of canvassing rules. This is purely biasing the community.
Very alienating image, Russ. Gross! Gross! Gross! I’m so insulted… :-)
As a German, I am quite surprised at how the German contributors here depict their home country. The German public would not complain about depicting an open vulva or an erect penis on a magazine cover? We have lots and lots and lots of full frontals on TV and in magazines?
What? This is not Germany. What we have are bare female breasts on every third cover of “Der Spiegel” and on every second cover of “Stern”, because those magazines sell better when showing pretty nude female bodies to illustrate the concept of tax reform or healthy nutrition ;-)
You paint the alleged german openness to nudity as you need it for this discussion. I saw much more nude bodies in american tv series like True Blood than in _any_ german series ever. Or have I missed something in the german telly?
Oh, and I saw Ashton Kutcher on Ellen DeGeneres. They didn’t actually show his most private parts, but the poor studio audience must have seen it…
1. An encyclopedia is not a magazine cover. 2. Yes, of course this was a test and a bit provocative. In fact the public reaction was very low (much lower than I ever thought it would be). Not much attention in the press (3 articles, I remember: 2 short neutral annotations and 1 longer article making jokes about prudish administrators and Jimbo Wales). No big waves in the OTRS. So what?
Public attention to the “article of the day” showing the vulva picture was near zero. Public attention to the news that *Wikipedia* (*not* the Wikimedia Foundation) wants to implement an image filter, by contrast, is high.
First, I have been a female editor since 2007. The realization of gender mainstreaming in Wikimedia projects has always been an important concern for me… however, I had and have no problem with the vulva photo on the front page. In my opinion, the Foundation should not interfere with editors’ matters and settings, but should support the Wikimedia projects. I also see the danger that the picture filter could be the beginning of a process which will spread to other content. For me, every community has the right to set its own rules, without creating a dependency on another community. The further discussion should be an open process.
I don’t understand how some of the abusive comments above are being allowed to stand here. A reasoned discussion needs to follow certain standards to be acceptable. Or is this the kind of controversy you aimed to generate?
I am delighted that we all now share the opinion that this is a matter of editorial judgement. The image filter, of course, is no tool for exercising editorial judgement. How, then, can we improve – indeed, do we need to improve – our editorial judgement?
Firstly, we can avoid making inappropriate cultural judgements. Being a strait-laced old Anglo-Saxon, for example, I would characterise the German front-page “experiment” as unwise – however, we need firstly to give some credence to the repeated claims that in the Germanosphere this is not considered as controversial as it would be in the Anglosphere, secondly to remember that it was in the nature of an experiment, and thirdly to note that it did not generate the type of backlash that it would have done on en:
And this is really a good example of context, one of the key themes that recurs repeatedly in our discussions. Images and text are appropriate in one context, and inappropriate in another. Currently this is a human decision, and it is exercised by the community. Any reader who finds something inappropriate can edit it, bring it up on the community portal or email someone. That doesn’t mean it will be changed, in general it will depend on community consensus.
So the basic “gap” here, apart from human error, is the perception that “the community” fails to reflect “the readership”. While it is certainly true that there are demographic skews in the community, it has yet to be established that these are distinct from those of the readership. (The Wikipedia Readership Survey 2011 should help with this.) And even if a difference is found, it is still a jump to assume that, for example, the image of a naked man in the article on statues is there because we have a “young North Atlantic male” editorship.
Therefore I conclude that fundamental research is needed to establish the presence of a real problem, and to identify in detail the nature and scale of that problem if it exists. Only with these basic facts at our fingertips can we hope to move forward together.
I’m afraid that the chickens have finally come home to roost. The WMF is starting to pay the price for its hands-off policy concerning the Wikimedian communities, and now has to deal with the same unsavory characters that have been wreaking havoc in the projects. The human qualities of that crowd leave much to be desired and will drag the WMF from the gutter into the sewer. A return of sorts to its roots. In the words of Dag Hammarskjöld: “You cannot play with the animal in you without becoming wholly animal, play with falsehood without forfeiting your right to truth, play with cruelty without losing your sensitivity of mind.” The WMF has been hearing this from many people. They told you so. What I find saddest is that it didn’t and doesn’t have to be so. I wish you the best of luck, but I would not bet on it.
I don’t know whether or not it will affect the orientation of the debate, but I have launched a francophone poll on the image filter (see the voting page: http://fr.wikipedia.org/wiki/Wikipédia:Sondage/Installation_d%27un_Filtre_d%27image and my own synthesis in English: http://wikitrekk.blogspot.com/2011/10/launching-french-poll-on-image-filter.html). Nevertheless, this poll is merely a survey (a sondage). If negative, its result will not prevent the implementation of the image filter (in contrast, it seems to me, to the German Meinungsbilder). It only aims to gauge the francophone communities’ opinion on such an important matter.
Terrific post. I was astonished at the hostile reaction to a filter which would allow a reader to choose what they wanted to see. Clearly, there would have been problems to work out, but the fact that it has been practically shot down is ridiculous. The example of the image at German WP is such a perfect example of the difference between NOTCENSORED and common sense. Thanks for taking on the issue. I hope that attitudes will soften and this will get worked out.
Sue, we /already/ practice editorial judgement. Note the extensive discussion on de.wikipedia that preceded the selection of the Vulva article for the front page. That’s editorial judgement. The picture on en.wikipedia on the Autofellatio article has been discussed and we made a conscious editorial decision to keep it. That’s editorial judgement.
The apparent problem, then, is not a lack of editorial judgement, but rather that /you disagree/ with the editorial judgements being reached. That’s something /you/ need to deal with, not us.
One relatively famous editor at the German Wikipedia is, I think, spot on:
http://de.wikipedia.org/w/index.php?title=Wikipedia_Diskussion%3AKurier&action=historysubmit&diff=94224297&oldid=94221377
(I don’t know if you know the comment, but I can translate it if needed.)
Sorry, my own opinion if I may:
Wikipedia is largely run by media-incompetent trolls (the editors). These people actually believe that putting an obscene picture on the main page is going to change the world. The irony, though, is that you conceptually believe the same thing.
Re Anon wikiuser: I’m assuming you are talking about the vulva picture on the main page of the German-language Wikipedia. If that had gone on the main page of the English-language Wikipedia, then I could understand your description of it as obscene – but as several DE editors have explained, the photo is not obscene in their culture, and the lack of complaints from the German-speaking public indicates that they are competent and correct. If we had put that picture on the front page of the English-language Wikipedia, then I believe there would have been many complaints, and people would have reasonably argued that those who put it there were incompetent or trolling. But it didn’t go on EN; it went on DE, a different culture with different values.
We aren’t the only global organisation to operate different rules in different parts of the world – http://en.wikipedia.org/wiki/Fox_News_Channel and http://en.wikipedia.org/wiki/The_Sun_(United_Kingdom) are sister companies, but only one is synonymous with photographs of topless women.
Bravo, eloquently stated, Sue. The unfortunate reality is that our editors do in fact tend to be adolescents or young adults, so to expect consistently grown-up and mature behavior of such people is unrealistic. I am not sure just how involved you are in regular editing and interacting with the community, but in my many years of editing, NOTCENSORED, along with other policies, has been abused to include in Wikipedia questionable and controversial content that most mainstream media institutions would consider unprintable. The environment of Wikipedia does not lend itself to the easy creation or enforcement of ethics in journalism or writing. To date, there is no ethical standard to which content or editors are held. And when consensus exists to pursue unethical ends, it occurs. There is no mechanism to stop it. Hence some users use Wikipedia to foment slander, present lies as truths, display perversion in prominent areas to expose it to wide audiences, and generally subvert the primary goal of Wikipedia, which is to present a quality encyclopedia to the masses, free of charge.
In my opinion this is just a symptom of a larger problem. A large number of editors, who have gradually grown into the majority, are no longer working towards the ultimate goals of the project, nor do they care to ever achieve them. And through their manipulation of policies and procedures, they have chased off most mature editors and the few subject and field experts we once had.
It’s time for the board to step in and take a more active role. Let’s set a goal and achieve it. Let’s make Wikipedia a respected and credible resource. Let’s shed our reputation for poor quality, falsehoods, misrepresentations, and general unreliability. And let’s put behind us the things that generate those perceptions – which include making pornographic materials available to minors without any warning or notification.
I have read through Sue’s blog post and many of the comments. I would like to preface my comments with the fact that I work with nude bodies from all walks of life every day. I am not a prude. I also work with varying sensibilities every day. By default, I tend to go more conservative because I don’t want people to be uncomfortable–I also understand that I have no place in imposing my own comfort level on others. I have also contributed to wikipedia under a different handle.
I am going to propose an idea that may be a little controversial in this community with regard to a filter. A filter that the user can implement is NOT in any way censorship. In fact, it gives the user more freedom – the freedom of CHOICE. Sensibilities vary widely and are very personal. I firmly believe that an organization or a person imposing their own sensibilities on others is extremely oppressive, and judgmental in deciding what people should and should not be comfortable with.
Here is an example: Let’s say the government is ruled by a certain religion, one that you do not believe in. But by rule of law, that is the only religion you may practice. Is that freedom? NO, it is not. As many of us know, the freedom to CHOOSE is true freedom. Forcing one’s sensibilities on others is oppressive.
A filtering tool would simply give the user a CHOICE to view, or not view, the material in question.
I truly believe that WMF and Sue Gardner want to make Wikipedia democratic and accessible to EVERYONE.
In order to do that – and to gain more diversity in the WP editing community – I strongly believe that the Wikipedia community needs to be sensitive and accommodating to more modest sensibilities. If you don’t want to look at a giant vagina or a picture of Mohammed, you should not be FORCED to – that is oppressive – but if you want to, it will still be there for you to look at, based on your filter settings.
CHOICE is the ultimate freedom. Oppression is a lack of choices.
The problem with the proposed type of filter is that you don’t know what is filtered.
Every article, with the images it includes or excludes, is already filtered; we call that editorial judgement. Everybody can see who has written or deleted text, chosen an image, or rearranged the layout. We shorten the trivia, we divide articles into subarticles, we delete parts without sources.
And the reader has the chance to interact, perhaps by uploading a better image. And the reader can have a look at the older versions.
But the idea the board is (was?) offering us is not open editorial judgement, but a second level of influence (which I still call censorship), changing the article after editorial judgement has been exercised and perhaps several options have been discussed.
It’s like a journalist choosing an image and the archivist refusing to fetch it from the lockers.
Nobody will know why this image is missing while the other one is still there. And why should an image be hidden in all contexts? If you find an image that doesn’t fit in a certain article, replace it with a better one, perhaps hide it behind a “curtain”, or remove it from the article. But don’t do anything outside of the editorial office.
As a reader, I don’t want to rely on back-office decisions about subjects that might make me uncomfortable.
[…] My trust in the German Wikipedia was unsettled recently when I stumbled over the ongoing discussions in the community on censorship, where antifeminist tendencies seem to emerge and spread. I tend to forget the people who are […]