A trust metric enabled Wikipedia

Mar 09, 2006 16:36

I love the idea of a trust metric enabled Wikipedia. There are plenty of questions to chew on.

How would a trust metric enabled Wikipedia work? Obviously every user would have to list trust information about other users, but what do you do with that information?

Here's the easy way: Jimbo is the root of trust from which the trust metric runs. If your trust is above a certain threshold according to that metric, you can edit, otherwise you can't. If you don't like it, set up your own wiki.
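A minimal sketch of "the easy way" in Python, under loud assumptions: trust propagates breadth-first from the root with a fixed per-hop decay, and edit rights are a fixed threshold on the result. The decay factor, the threshold, and the function names are all my inventions; a real metric like TrustFlow is attack-resistant in ways this toy is not.

```python
from collections import deque

def trust_scores(certs, root, decay=0.5):
    """Toy trust propagation: breadth-first from a single root.

    `certs` maps each user to the set of users they vouch for.
    Each hop away from the root halves the score -- an assumed
    decay rule, far cruder than a real flow-based metric.
    """
    scores = {root: 1.0}
    queue = deque([root])
    while queue:
        user = queue.popleft()
        for peer in certs.get(user, ()):
            if peer not in scores:
                scores[peer] = scores[user] * decay
                queue.append(peer)
    return scores

def can_edit(certs, root, user, threshold=0.25):
    """The easy way: you may edit iff your trust clears a fixed threshold."""
    return trust_scores(certs, root).get(user, 0.0) >= threshold
```

With `certs = {"jimbo": {"alice"}, "alice": {"bob"}, "bob": {"carol"}}` and Jimbo as root, Alice scores 0.5 and Bob 0.25, so both can edit; Carol, at 0.125, cannot.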

Can we do better than that? Supposing we move away from the model in which a single version of the page is the "current" version, and old versions are there only as a historical record? That can be anything from a small step to a giant leap.

As a small step, we could allow untrusted users to edit the page, but show normal Wikipedia visitors only the version most recently edited by a trusted user. Each trusted user is expected to review any untrusted edits they may implicitly be including when they edit a page, which is largely what users do now anyway. If they don't like the edits, they can revert to the last trusted version, and normal users will never see the rejected edits. This resembles the article validation proposals currently going forward on Wikipedia, but backed by a real trust metric.
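The "small step" fits in a few lines. This is only a sketch of the idea, with a made-up history representation: untrusted edits stay in the record, but the page ordinary readers see is the last trusted checkpoint.

```python
def visible_revision(history, trusted):
    """Return the most recent revision whose author is trusted.

    `history` is a list of (author, text) pairs, oldest first --
    an assumed representation, not Wikipedia's actual schema.
    Untrusted edits remain in the history; ordinary visitors
    simply never see past the last trusted checkpoint.
    """
    for author, text in reversed(history):
        if author in trusted:
            return text
    return None  # no trusted revision exists yet
```

If Alice is trusted, a page with history `[("alice", "v1"), ("anon", "v2"), ("anon", "v3")]` still displays "v1"; the moment Alice edits (implicitly endorsing or discarding the anonymous changes), her new version becomes the visible one.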

Supposing we allow the article to fork? Wikipedia already "sort-of" allows forks, in that you can choose any version of an article as the basis for your next edit. But your new edit becomes the current version, and no metadata records which edit you started from, so you can't do anything useful with the forks. Supposing we explicitly record the tree of versions? That would allow the sophisticated tools provided by any modern version control system to synthesize a new version of the article from whatever subset of the edits the user thought appropriate, making it much easier for trusted users to winnow the wheat from the chaff when choosing an edit to make current.
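The missing metadata is just a parent pointer. A sketch, with invented names, of the difference between today's flat history and an explicit version tree:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Revision:
    author: str
    text: str
    parent: Optional["Revision"] = None  # None only for the first version

def fork_from(base, author, text):
    """Record which revision an edit started from, so the full tree
    of versions is preserved rather than a single linear 'current'."""
    return Revision(author, text, parent=base)

def ancestry(rev):
    """Walk back to the root: the raw material a version-control-style
    merge tool would need to combine a chosen subset of edits."""
    chain = []
    while rev is not None:
        chain.append(rev)
        rev = rev.parent
    return chain
```

Two users editing the same base revision now produce two explicit branches of the tree instead of silently overwriting "current", and a merge tool can see exactly where they diverged.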

Now we approach the "giant leap". When winnowing the wheat from the chaff, I don't have to consider only the binary trusted/untrusted decision of the trust metric; I can use the gradations of trust it provides as a guide to how much good faith to assume in a given edit. Coming closer to the leap, I don't have to use Jimbo as the root of trust for these decisions - the point of a trust metric like Trustflow is that I can afford to do the trust calculations for myself.
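One way graded trust might guide "how much good faith to assume": order the review queue so that large edits from low-trust authors surface first, while a highly trusted author's edit can wait or be waved through. The priority formula and field names here are pure assumptions, nothing Wikipedia implements.

```python
def review_queue(pending, trust_of):
    """Sort pending edits for review: least-trusted authors making
    the biggest changes come first. `pending` is a list of
    (author, edit_size) pairs; `trust_of` maps author to a graded
    trust score in [0, 1]. Both are hypothetical structures."""
    def priority(edit):
        author, size = edit
        return size * (1.0 - trust_of.get(author, 0.0))
    return sorted(pending, key=priority, reverse=True)
```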

Now for the "giant leap" itself. Under these circumstances, is there any need for all users to agree on a single "current version"? The "default view" of Wikipedia might have Jimbo as the root of trust, but I might like to choose someone else - myself, for example. What effect would this have on NPOV disputes? Would each side of a contentious debate end up with their own persistent fork, or would a consensus version emerge? Would the good done by decentralization outweigh the bad done by automatic self-reinforcing reading bias?

Given this outline, when I go to view an article on a topic, how do I choose which version to view? What tradeoffs between newness and trust will I make? Will there be problems of information moved from one article to another "slipping through the cracks" if I trust the old version of one but the new version of another?
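One conceivable newness-versus-trust tradeoff, offered purely as an assumption: discount each candidate version by how stale it is (exponential decay with a chosen half-life) and weight by the graded trust of its last author, then pick the highest score.

```python
import math

def version_score(age_hours, author_trust, half_life=48.0):
    """Hypothetical scoring rule: freshness decays exponentially
    (half-life in hours is an arbitrary parameter), weighted by
    the graded trust score of the version's last author."""
    freshness = math.exp(-age_hours * math.log(2) / half_life)
    return author_trust * freshness

def pick_version(candidates):
    """candidates: list of (revision_id, age_hours, author_trust).
    Returns the id of the highest-scoring version."""
    return max(candidates, key=lambda c: version_score(c[1], c[2]))[0]
```

With a 48-hour half-life, a two-day-old version by a fully trusted author (score 0.5) still beats a brand-new version by an author trusted at 0.25, but loses to a brand-new version by an author trusted at 0.8.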

Should we be trying to use domain-specific information? One user may be trusted when editing on cryptography but not on animal welfare, say. Can we synthesize domains from the information available about what links to what and who is editing it? Can we capture in our trust information that people who know about crypto trust X but not people who know about animal welfare, and use that to fine-tune our trust decisions according to the subject area of the article we're assessing? Can we do so without anyone having to explicitly identify domains?
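If per-domain certifications did exist (a big if - nobody keeps such records today), the mechanics could be as simple as running the trust calculation separately within each domain's certification graph:

```python
def domain_trusted(certs_by_domain, root, domain, max_depth=3):
    """Users reachable within `max_depth` certification hops from
    `root`, computed separately per domain. `certs_by_domain[domain]`
    maps each user to the users they vouch for *on that subject* --
    a hypothetical record, not anything Wikipedia collects."""
    certs = certs_by_domain.get(domain, {})
    trusted, frontier = {root}, {root}
    for _ in range(max_depth):
        frontier = {p for u in frontier for p in certs.get(u, ())} - trusted
        trusted |= frontier
    return trusted
```

The interesting open question in the paragraph above is the harder one: inferring those per-domain graphs automatically from link structure and editing patterns, rather than asking anyone to declare them.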

Finally (for now), can we do all this in a distributed fashion, so that we are all hosting our own intricately interlinked and interrelated versions of Wikipedia, each drawing edits from each other but reflecting our own unique spin on the world?