Tonight I went to see historian Timothy Garton Ash talk with his friend Tobias Wolff at Stanford. The occasion was the publication of Timothy’s newest book, a collection of essays and reportage loosely built around the idea that “facts are subversive.” Timothy’s premise seems to be –roughly, loosely– that people in power are often trying to construct narratives in support of a particular economic, political or cultural agenda, and that facts –even very small ones– can sometimes trip that up.
One thing they talked about was honesty in memoirs — for example, Mary McCarthy’s 1957 autobiography Memories of a Catholic Girlhood, in which McCarthy disarmingly confesses that “the temptation to invent has been very strong,” and “there are cases when I am not sure myself whether I am making something up.” And George Orwell’s Homage to Catalonia, in which Orwell wrote:
“I have tried to write objectively about the Barcelona fighting, though, obviously, no one can be completely objective on a question of this kind. One is practically obliged to take sides, and it must be clear enough which side I am on. Again, I must inevitably have made mistakes of fact, not only here but in other parts of this narrative. It is very difficult to write accurately about the Spanish war, because of the lack of non-propagandist documents. I warn everyone against my bias, and I warn everyone against my mistakes. Still, I have done my best to be honest.” (1)
This brought into focus for me something I’ve long half-recognized — both in my own experiences of reading Wikipedia, and the stories people tell me about how they use it themselves. Article after article after article on Wikipedia is studded with warnings to the reader. “This article needs references that appear in reliable third-party sources.” “This article needs attention from an expert on the subject.” “This article may be too technical for most readers to understand.” On this page, you can see 24 common warning notices — and there are many, many more.
And I think that’s one of the reasons people trust Wikipedia, and why some feel such fondness for it. Wikipedia contains mistakes and vandalism: it is sometimes wrong. But people know they can trust it not to be aiming to manipulate them — to sell them something, either a product or a position. Wikipedia is just aiming to tell people the truth, and it’s refreshingly honest about its own limitations.
Tobias Wolff said tonight that sometimes such disclaimers are used manipulatively, as corroborating detail to add verisimilitude to text that might otherwise be unpersuasive. I think that’s true. But in the case of Wikipedia, which is written by multitudes, disclaimers are added to pages by honest editors who are trying to help. They may not themselves be able to fix an article, but at the very least, they want readers to know what they’re getting into. I like that.
(1) I looked that up on Google Books when I got home. Yay, Google Books!