Is English Wikipedia’s ‘rise and decline’ typical?

This graph shows the number of people contributing to Wikipedia over time:

The Rise and Decline of Wikipedia The number of active Wikipedia contributors exploded, suddenly stalled, and then began gradually declining. (Figure taken from Halfaker et al. 2013)

The figure comes from “The Rise and Decline of an Open Collaboration System,” a well-known 2013 paper that argued that Wikipedia’s transition from rapid growth to slow decline in 2007 was driven by an increase in quality control systems. Although many people have treated the paper’s finding as representative of broader patterns in online communities, Wikipedia is a very unusual community in many respects. Do other online communities follow Wikipedia’s pattern of rise and decline? Does increased use of quality control systems coincide with community decline elsewhere?

In a paper that my student Nathan TeBlunthuis is presenting Thursday morning at the Association for Computing Machinery (ACM) Conference on Human Factors in Computing Systems (CHI), a group of us have replicated and extended the 2013 paper’s analysis in 769 other large wikis. We find that the dynamics observed in Wikipedia are a strikingly good description of the average Wikia wiki. These dynamics appear to recur in community after community.

The original “Rise and Decline” paper (we’ll abbreviate it “RAD”) was written by Aaron Halfaker, R. Stuart Geiger, Jonathan T. Morgan, and John Riedl. They analyzed data from English Wikipedia and found that Wikipedia’s transition from rise to decline was accompanied by increasing rates of newcomer rejection as well as the growth of bots and algorithmic quality control tools. They also showed that newcomers whose contributions were rejected were less likely to continue editing and that community policies and norms became more difficult to change over time, especially for newer editors.

Our paper, just published in the CHI 2018 proceedings, replicates most of RAD’s analysis on a dataset of 769 of the largest wikis from Wikia that were active between 2002 and 2010. We find that RAD’s findings generalize to this large and diverse sample of communities.

We can walk you through some of the key findings. First, the growth trajectory of the average wiki in our sample is similar to that of English Wikipedia. As shown in the figure below, an initial period of growth stabilizes and then gives way to decline several years later.

Rise and Decline on Wikia The average Wikia wiki also experiences a period of growth followed by stabilization and decline (from TeBlunthuis, Shaw, and Hill 2018).
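The averaging behind a figure like this can be sketched in a few lines. This is a toy illustration of the idea, not the paper’s actual pipeline: each wiki’s monthly active-editor counts are aligned by the wiki’s age (months since founding) rather than by calendar date, and then averaged across wikis at each age.

```python
from collections import defaultdict

def average_trajectory(wikis):
    """Average monthly active-editor counts across wikis,
    aligned by each wiki's age in months (not calendar time).

    `wikis` maps a wiki name to a list of monthly active-editor
    counts, where index 0 is the wiki's founding month.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for series in wikis.values():
        for age, editors in enumerate(series):
            totals[age] += editors
            counts[age] += 1
    # Average only over the wikis that were old enough to
    # contribute an observation at each age.
    return [totals[a] / counts[a] for a in sorted(totals)]

# Toy data: three hypothetical wikis that rise, stabilize, then decline.
wikis = {
    "a": [5, 20, 40, 35, 30],
    "b": [2, 15, 30, 28],
    "c": [8, 25, 50, 45, 40, 33],
}
print(average_trajectory(wikis))  # -> [5.0, 20.0, 40.0, 36.0, 35.0, 33.0]
```

Note that at later ages the average is taken over fewer, longer-lived wikis; any real analysis (including the paper’s) has to handle that censoring more carefully than this sketch does.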

We also found that newcomers on Wikia wikis were reverted more and continued editing less. As on Wikipedia, the two processes were related. Similar to RAD, we also found that newer editors were more likely to have their contributions to the “project namespace” (where policy pages are located) undone as wikis got older. Indeed, the specific estimates from our statistical models are very similar to RAD’s for most of these findings!

There were some parts of the RAD analysis that we couldn’t reproduce in our context. For example, there are not enough bots or algorithmic editing tools in Wikia to support statistical claims about their effects on newcomers.

At the same time, we were able to do some things that the RAD authors could not. Most importantly, our findings discount some Wikipedia-specific explanations for its rise and decline. For example, English Wikipedia’s decline coincided with the rise of Facebook, smartphones, and other social media platforms. In theory, any of these factors could have caused the decline. Because the wikis in our sample experienced rises and declines at similar points in their life-cycle but at different points in time, the rise and decline findings we report seem unlikely to be caused by underlying temporal trends.
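The logic of this robustness argument can be illustrated with a toy sketch (ours, not the paper’s statistical model): if two communities founded years apart follow the same life cycle, their trajectories coincide when indexed by age but not when indexed by calendar time, so a one-time external shock cannot explain both declines.

```python
def by_age(series):
    """Index a wiki's monthly editor counts by age (month 0 = founding)."""
    return {age: n for age, n in enumerate(series["editors"])}

def by_calendar(series):
    """Index the same counts by calendar month."""
    return {series["founded"] + age: n
            for age, n in enumerate(series["editors"])}

# Two hypothetical wikis with identical life cycles
# but founding dates two years apart.
early = {"founded": 0, "editors": [10, 40, 25]}
late = {"founded": 24, "editors": [10, 40, 25]}

# Aligned by age, the trajectories coincide ...
print(by_age(early) == by_age(late))            # -> True
# ... but aligned by calendar time they do not, so a shared
# external shock in one particular year cannot account for both.
print(by_calendar(early) == by_calendar(late))  # -> False
```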

The big communities we study seem to have consistent “life cycles” where stabilization and/or decay follows an initial period of growth. The fact that the same kinds of patterns happen on English Wikipedia and other online groups implies a more general set of social dynamics at work that we do not think existing research (including ours) explains in a satisfying way. What drives the rise and decline of communities more generally? Our findings make it clear that this is a big, important question that deserves more attention.

We hope you’ll read the paper and get in touch by commenting on this post or emailing Nate if you’d like to learn or talk more. The paper is available online and has been published under an open access license. If you really want to get into the weeds of the analysis, we will soon publish all the data and code necessary to reproduce our work in a repository on the Harvard Dataverse.

Nate TeBlunthuis will be presenting the project this week at CHI in Montréal on Thursday April 26 at 9am in room 517D. For those of you not familiar with CHI, it is the top venue for Human-Computer Interaction. All CHI submissions go through double-blind peer review and the papers that make it into the proceedings are considered published (like journal articles in most other scientific fields). Please feel free to cite our paper and send it around to your friends!


This blog post, and the open access paper that it describes, is a collaborative project with Aaron Shaw that was led by Nate TeBlunthuis. A version of this blog post was originally posted on the Community Data Science Collective blog. Financial support came from the US National Science Foundation (grants IIS-1617129, IIS-1617468, and GRFP-2016220885), Northwestern University, the Center for Advanced Study in the Behavioral Sciences at Stanford University, and the University of Washington. This project was completed using the Hyak high performance computing cluster at the University of Washington.

4 Replies to “Is English Wikipedia’s ‘rise and decline’ typical?”

  1. I’ve only skimmed the paper so far, but I’m very happy to see this published at last! A long awaited contribution.

    I hope in the future we’ll have a more complete and reliable dataset of XML dumps from the MediaWiki wikis (and maybe other wiki engines as well) than Wikia and WikiTeam have so far managed to provide.

    Speaking of data, the DOI of the dataset does not seem to work as of now: https://doi.org/10.7910/DVN/SG3LP1 .

    1. Thanks for the pointers about the DOI for the dataset. We’re working on fixing some issues with the dataset and the documentation. I’ll blog again as soon as it’s done. Thanks, Federico!

  2. I was wondering if there is a way to see how useful those edits were. The Fedora Project had to cut down the openness of its wiki because we had an extraordinary amount of spam being added to everything: comments, page edits, new pages, etc. From 2012 to 2016 this became a larger and larger share of the work being done. In mid-2016 it got so bad, with thousands of pages being added daily by people paid to do this, that we ended up closing off the wiki to a great extent, but this also cut down the amount of useful edits (which were getting hard to find among the spam). I know that other wikis were seeing this as a problem at the same time, as we were all having to ratchet down how people could edit. I don’t know if that is already taken care of in the stats your students have been doing, so I wanted to bring it up in case it would need to be looked at in future studies.

  3. What I came up with a while back is this notion of a fad-wave: “But the reason MySpace so frequently dodged the “fad” label was because it was more like a fad-wave. A fad-wave is built on the cascading excitement that is renewed every time more of your friends catch onto the fad. So when you initially join MySpace, you have like an initial six-month excitement cycle. But by the time month three rolls around, a group of your friends join, and they start their own six-month cycles, further extending your cycle by an extra month perhaps. Then by month three of their cycles (your month six), a group of their friends join, which extends your friends’ cycles a month (and your cycle by maybe another month). Until you find yourself hanging around for a year-and-a-half until everybody you know has finally gotten the MySpace bug out of their system.”
