A couple of articles have come my way this week that highlight one of the challenges of open information sharing: people sometimes lie. The articles I've read are about a false bit of "history" created on Wikipedia, defaming a gentleman who used to work for Bobby Kennedy.
If you know your source is subject to subversion, you can look for third-party verification - but people want to trust things like Wikipedia because it is very good, and it makes research much easier. Encouragingly, my 12-year-old's school is teaching its students to be cautious about using Wikipedia as a reference source. In fact, the first I heard about this false-history incident was from the school, via email. This, from the school librarian:
Wikipedia, an online "encyclopedia," is being used heavily by students. They need to be aware that it is not always accurate. Here is an example: http://www.usatoday.com/news/opinion/editorials/2005-11-29-wikipedia-edit_x.htm
My solution: use it if you must but verify the information in at least two other sources that have established reputations for providing reliable information. That might mean looking in a book!
My daily CNet News.com alert contained a perspective piece on "Wikipedia and the nature of truth," which offers further commentary on the article above.
Lying is nothing new, of course. But access to lies gets easier with the internet.
When people lie in a credible venue, how long can the venue remain credible? It depends on whether you take action to rectify the lie, and whether you implement controls to reduce the chance that future lies will be tolerated.
One solution is to use some kind of verification process to assure you of the credibility of the source. Professional research organizations and commercial encyclopedias ostensibly have fact checkers to vet their content.
I'm not sure whether Wikipedia's structure allows for a fact-checking process, or whether it simply relies on peer review and "let us know if you see something wrong" vigilance. That's fine, but it seems there should at minimum be some sort of authentication for content providers, so we could avoid the "we don't really know who made this false claim" situation outlined in the USA Today piece. I think that's reasonable for a resource like Wikipedia, which is emerging as an authoritative reference source.
What do you think?