Wikis are a great idea, but they are clearly vulnerable to bad actors. If there is a large community supporting the wiki, it can have social antibodies against 'bad' content. But wiki architecture is also open to mechanized attacks, and those can be overwhelming. What to do? One lightweight but potentially effective answer comes out of today's Slashdot interview with Wikipedia founder Jimmy Wales. [link fixed]
Q: I like the concept of a wiki, but I'm a bit concerned about the current implementation.
Right now, we are seeing several instances where crawlers are disrupting wikis, spammers are embedding wiki links to their sites to boost their Google rankings, and advertisers are placing ads in wikis until someone goes through and nukes them.
Do you have any thoughts as to how wikis can be modified to prevent things like this in the future?
Sure, I think it's pretty simple to solve problems like that. One of the first tricks I would try is to parse the wiki text that someone inputs to see if it contains an external link. If so, and only in those cases, require an answer to a captcha.
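The first trick Wales describes could be sketched roughly like this. This is an illustrative sketch, not MediaWiki's actual implementation: the regex and the `requires_captcha` function are hypothetical, and a real wiki would hook this check into its edit-submission pipeline.

```python
import re

# Crude pattern for external links in wiki text (illustrative only;
# real wiki markup parsing is more involved).
EXTERNAL_LINK = re.compile(r"https?://[^\s\]]+")

def requires_captcha(old_text: str, new_text: str) -> bool:
    """Trigger a captcha only when an edit adds a NEW external link.

    Edits that merely preserve existing links, or contain no links at
    all, pass through without friction.
    """
    old_links = set(EXTERNAL_LINK.findall(old_text))
    new_links = set(EXTERNAL_LINK.findall(new_text))
    return bool(new_links - old_links)
```

The point of comparing old and new text, rather than just scanning the submission, is that the captcha fires only on the spam-relevant action (introducing a link), so ordinary edits stay frictionless.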
Second step, keep editing wide open for everyone, but restrict the ability to post external links to people who are trusted by that community. Make it really easy for trusted users to extend the zone of trust, because you want to encourage participation.
Basically what I think works in a wiki is to trust people to do the right thing, and trust them as much as you can possibly stand it, until it hurts your head and makes you scared for what they're going to break. Because that is what works.
People are not fundamentally bad. It only takes the smallest of correctives to take care of that tiny minority that wants to disrupt the community.