Editorial Oversight and Control
Revision as of 13:30, 23 October 2015 by Natbrown (Text replacement - "[about|" to "[About WikiTranslate|")
This page summarizes the various processes and structures by which WikiTranslate articles and their editing are editorially controlled, and the processes which are built into that model to ensure quality of article content.
Rather than relying on a single form of control, WikiTranslate uses multiple overlapping approaches, which together provide more robust coverage and resilience.
Overview of editorial structure
Anyone who visits the site, with the exception of blocked users, can edit it once they create an account. There are mechanisms that help community members watch for bad edits, and a small number of administrators with special powers to enforce good behavior, including the withdrawal or restriction of editing privileges and other sanctions when needed.
WikiTranslate is a wiki: as on Wikipedia, anyone can contribute, and everyone is encouraged to. Since Wikipedia is a pioneer in communal knowledge building of this kind, WikiTranslate draws on the Wikipedia model for the best methods of adding information.
WikiTranslate's editorial control process
- Core community level controls
- The degree of oversight made possible by a large number of bona fide editors.
- The wiki system itself, which in practice appears to select strongly for the robust collaborative knowledge of many people (even on contentious topics), rather than the unrepresentative viewpoint or negative impact of a few.
- Editorial panels and processes
- Enforced policies which give all editors a solid basis for taking matters into their own hands when addressing both deliberate and innocent bad edits.
- A consensus-based ethos, discussed in WikiTranslate Facebook groups, which benefits the decision-making process.
- Software-facilitated controls
- Systems built into the editing software that make it easy for a large number of editors to watch for vandalism, monitor recent changes, and follow activity on articles through personalised watchlists, in real time.
- Design decisions in the software that make it possible to identify and revert any number of bad edits at the click of a button, whereas committing vandalism takes longer.
- Ability to set fine-grained software blocks on problematic editors, and partially or fully protect targeted articles.
- Standardized alerts, known as tags, which can be added to any fact or article, and which allow individual facts (or entire sections and articles) to be highlighted as questionable or brought immediately to others' attention.
- Controls under development
- The control known as flagged revisions is being rolled out as of 2007. It differentiates the version shown to most readers from the draft "cutting edge" version being edited, and shows readers a draft only once it has been checked for reasonableness. Once fully operational, this system is expected to provide a powerful way to prevent most vandalism and poor-quality edits from being seen by readers.
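The one-click revert mentioned above works because a wiki keeps every revision of a page, so undoing vandalism never requires rewriting anything. The following minimal Python sketch (illustrative only; not MediaWiki's actual code or data model) shows why reverting is a single cheap action while vandalism takes effort to compose:

```python
# Illustrative sketch: a wiki page stores its full revision history, so a
# revert simply saves a new revision copied from an older, good one.

class Page:
    def __init__(self, text):
        self.revisions = [text]          # full history, oldest first

    def edit(self, new_text):
        self.revisions.append(new_text)  # every edit adds a new revision

    def current(self):
        return self.revisions[-1]        # readers see the newest revision

    def revert_to(self, index):
        # One-click revert: re-save an old revision as the newest one.
        # Nothing is deleted; the vandalism stays visible in the history.
        self.revisions.append(self.revisions[index])

page = Page("Accurate article text.")
page.edit("VANDALISM!!!")                # takes effort to write
page.revert_to(0)                        # undone with a single action
print(page.current())                    # -> Accurate article text.
```

Because the bad revision remains in the history, editors can also inspect it later to decide whether the author needs a warning or a block.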
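The flagged-revisions idea can be sketched in a few lines. This is a hypothetical model (the class and method names are invented for illustration, not taken from the real FlaggedRevs extension): readers are served the newest *reviewed* revision, while editors always work on the latest draft, so unreviewed vandalism is never shown to readers.

```python
# Illustrative sketch of flagged revisions (hypothetical names): readers
# see the latest reviewed revision; editors see the cutting-edge draft.

class FlaggedPage:
    def __init__(self, text):
        self.revisions = [text]
        self.reviewed = {0}              # indices checked by a reviewer

    def edit(self, new_text):
        self.revisions.append(new_text)  # new drafts start unreviewed

    def review(self, index):
        self.reviewed.add(index)         # reviewer marks a draft as sound

    def reader_view(self):
        return self.revisions[max(self.reviewed)]  # newest reviewed revision

    def editor_view(self):
        return self.revisions[-1]        # editors always see the draft

page = FlaggedPage("Stable text.")
page.edit("Sneaky vandalism")
assert page.reader_view() == "Stable text."        # vandalism never shown
page.edit("Genuine improvement")
page.review(2)                                     # checked and approved
assert page.reader_view() == "Genuine improvement"
```

The key design choice is that reviewing gates only what readers see; editing itself stays open to everyone, preserving the wiki model described earlier.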