As a sysadmin I worry about those kinds of things. It sounds like a great idea in a utopian society, but it only takes one person to mess up the whole thing.

Oddly enough, though, I'm not worried about someone deleting all the posts. A simple forward-delta scheme in the DB could handle that easily. I mean, create a base file from the original post, and every time someone edits the post, create a differential file that is Huffman- or run-length-encoded, and then only keep the most recent revision in the DB's main file. This way you can "rebuild" the post as of any particular date by "replaying" the differentials. Then, every so often, when everything looks OK, fold the older differentials into the base file, keeping only some number of edits back from the present, to minimize wasted overhead.

What I worry about isn't content deletion, trolling, spamming, etc. I worry about overflow. What happens if you have an innovative person trying to overwhelm the DB server?
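To make that concrete, here's a rough Python sketch of the idea. zlib stands in for the Huffman/RLE coding, and the delta format (copy/insert opcodes) is just something I made up for illustration, not any particular DB's format:

```python
import difflib
import json
import zlib


class PostHistory:
    """Base revision plus a chain of forward deltas, one per edit."""

    def __init__(self, original_text):
        self.base = original_text   # base file built from the original post
        self.deltas = []            # one compressed differential per edit

    def edit(self, new_text):
        """Record an edit as a compressed delta against the current head."""
        old = self.rebuild()        # current head = base + all deltas
        ops = []
        sm = difflib.SequenceMatcher(a=old, b=new_text)
        for tag, i1, i2, j1, j2 in sm.get_opcodes():
            if tag == "equal":
                ops.append(["copy", i1, i2])            # reuse old span
            else:
                ops.append(["insert", new_text[j1:j2]])  # new/replaced text
        self.deltas.append(zlib.compress(json.dumps(ops).encode()))

    def rebuild(self, up_to=None):
        """Replay deltas onto the base; stop early to recover the post
        as it stood after edit number `up_to`."""
        text = self.base
        for blob in self.deltas[:up_to]:
            ops = json.loads(zlib.decompress(blob))
            out = []
            for op in ops:
                if op[0] == "copy":
                    out.append(text[op[1]:op[2]])
                else:
                    out.append(op[1])
            text = "".join(out)
        return text

    def rebase(self, keep_last=10):
        """Fold old deltas into the base, keeping only the most recent
        `keep_last` edits replayable -- the periodic cleanup step."""
        if len(self.deltas) > keep_last:
            cut = len(self.deltas) - keep_last
            self.base = self.rebuild(up_to=cut)
            self.deltas = self.deltas[cut:]


# Quick demonstration of the "replay up to a point" idea:
h = PostHistory("Hello wiki")
h.edit("Hello wiki world")
h.edit("Goodbye wiki world")
print(h.rebuild(up_to=1))   # "Hello wiki world" -- the post after edit #1
print(h.rebuild())          # "Goodbye wiki world" -- the current revision
```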
In my comment program I control this with a maximum post frequency and a maximum post size, but would that be practical in a wiki? I'd say yes on post frequency, but I'm unsure about maximum post size. Maybe if you allowed them to post 10,000 characters once every 30 seconds... Naw, because of the "repost the whole page" thing, they would really have to keep the whole page under that limit...
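The throttle is roughly this, sketched in Python (the names and numbers here are invented for illustration, not my comment program's actual code):

```python
import time

MAX_POST_CHARS = 10_000     # the 10,000-character cap floated above
MIN_POST_INTERVAL = 30.0    # at most one post per user every 30 seconds

_last_post = {}             # username -> time of their last accepted post


def accept_post(username, text, now=None):
    """Return True if the post passes both caps, False otherwise."""
    now = time.time() if now is None else now
    if len(text) > MAX_POST_CHARS:
        return False        # over the size cap
    last = _last_post.get(username)
    if last is not None and now - last < MIN_POST_INTERVAL:
        return False        # posting too fast
    _last_post[username] = now
    return True
```

On a wiki, though, every edit resubmits the whole page, so `text` would be the entire page and the size cap effectively becomes a page-size cap, which is exactly the problem above.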
Well, anyway, I like the ideal case, but I feel that a wiki is just not practical in the modern day. The forum effect, however, is quite desirable, because at the very least you make individuals responsible for their own content by associating it with a username.
That still doesn't mean I mind posting on other people's wikis...
Mood: Inquisitive