The Vandalism Studies project is a part of the Counter-Vandalism Unit dedicated to conducting research on unconstructive edits on Wikipedia. The project covers all vandalism on Wikipedia. If you'd like to get involved, please add your name to the Members list below!
CHCSPrefect (talk·contribs) I made this account to stop vandalism coming from my school's IP address; a project like this is where I belong!
Rushbugled13 (talk·contribs) I wish to help maintain the reliability of Wikipedia as a resource, and vandalism is a large problem with respect to reliability.
These are some preliminary questions that may stimulate future studies. Not all of them may be answerable, so think of this more as a brainstorming section.
Analysis of vandalism
Who is responsible for vandalism? What do vandals want? What are the demographics of the vandal population?
What proportion of vandals are on dynamic IP addresses, and hence very hard to block?
Are IP edits ever responsible for improving a featured article while it is on the Main Page? (See also the essay IPs are human too.)
What motivates people to vandalize articles? How can we minimize the satisfaction they get from doing it? (See: The motivation of a vandal)
Do vandals just choose another article to edit instead if an article is semi-protected? How can we test this?
Why do certain articles attract more vandalism than others?
What types of vandalism are there? What messages are vandals trying to get across? Why do they not fully realise that their actions are futile?
What sort of financial gains can be made from using Wikipedia to advertise - are spammers just wasting their time, or can it actually be profitable? Are our anti-spam measures adequate?
What is the overall contribution from schools and universities? Are they worth having? Do universities contribute less vandalism than schools, or are all ages equally immature?
How does the rate of vandalism vary throughout the day? (A rough measurement sketch appears after this list.)
Would there still be problems with vandalism if unregistered editing were blocked? How could we test this hypothesis? Certain categories could be experimentally altered to block unregistered editors, but then vandals could simply choose an article that wasn't protected. We would have to block all IP editing, which would certainly be controversial, even just to gather a small sample of data. The block would also have to allow newly registered users to edit; otherwise there wouldn't be time to create an account and then wait the four days required for autoconfirmation. Perhaps we could use a comparative method by running the experiment on another wiki instead?
Quantitatively, how are levels of vandalism affected (both in terms of percentage of edits and number of edits) when external attention is drawn to an article (e.g. by Slashdot or The Colbert Report)? Do levels of vandalism return to normal (e.g. in elephant) in all cases? How quickly?
How much of vandalism is self-reverted?
How do the levels of reverted edits compare between articles of different quality (e.g. GA vs. start class)?
How often are good faith edits labeled as vandalism, either a) mistakenly and through misinterpretation of policy or b) maliciously?
Are editors any more likely to continue or desist vandalizing if warned by a bot instead of a person?
How often are vandals warned on their talk page after committing an offense?
What are the costs and benefits, and hence overall utility, of warning users? How do users respond to warnings?
Who is responsible for reverting vandalism?
What effects does semi-protection have on the level of vandalism of protected articles?
What strategies can we employ to catch vandalism quickly?
How can we catch most of it at recent changes?
How can we establish a situation where almost every article has someone responsible for maintaining it? Is this even a good idea? (See: Ownership of articles)
How good are editors at reverting vandalism? That is, is it reverted properly, or is it often dealt with poorly, e.g. by removing a whole paragraph whose meaning the vandal has simply altered?
What happens to vandalism levels when edits do not show up in the current version of the article? A trial of something like stable versions, where the vandal cannot vandalize the version of the article that readers actually see, or something functionally similar, is needed. Perhaps a small subset (e.g. all articles in a certain category) could be tested.
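As a very rough illustration of how the hourly-rate question above might be approached, the sketch below uses the standard MediaWiki API to count recent IP edits that were later tagged as reverted, grouped by hour of day (UTC). Reverted-edit counts are only a crude proxy for vandalism, and the function name and use of the mw-reverted change tag here are illustrative assumptions, not an agreed project methodology.

```python
# Sketch: bucket recently reverted IP edits by UTC hour, as a crude proxy
# for how vandalism varies through the day. Uses only standard MediaWiki
# API parameters; nothing here is specific to this project.
from collections import Counter
import requests

API = "https://en.wikipedia.org/w/api.php"

def reverted_ip_edits_by_hour(limit=500):
    """Return a Counter mapping UTC hour -> number of reverted IP edits."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcshow": "anon",          # unregistered (IP) edits only
        "rctype": "edit",
        "rcprop": "timestamp|tags",
        "rclimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    hours = Counter()
    for change in data["query"]["recentchanges"]:
        # "mw-reverted" is applied to edits that were later undone/rolled back.
        if "mw-reverted" in change.get("tags", []):
            hours[int(change["timestamp"][11:13])] += 1  # "YYYY-MM-DDThh:..."
    return hours

if __name__ == "__main__":
    for hour, count in sorted(reverted_ip_edits_by_hour().items()):
        print(f"{hour:02d}:00 UTC  {count} reverted IP edits")
```

A single batch of recent changes is a very small sample; a real study would page through results over several weeks (using the API's continuation parameters) before drawing any conclusions about daily patterns.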