The Signpost

Dispatches

Content reviewers crucial to setting standards



Content review processes such as Featured article candidates (FAC), Featured list candidates (FLC), Good article nominations (GAN) and Peer review (PR) are at the core of establishing and maintaining high standards for Wikipedia articles, and provide valuable feedback on how to achieve these standards. Reviewers in these processes tend to gain significant respect in the community for their work. Despite the prestige of the job, such reviewers are in short supply, and 2009 saw a reduction in reviewer participation by most measures.

Featured articles

Featured articles represent Wikipedia's best work, and achieve this status after a review open to the whole Wikipedia community. Editors can support the article's promotion if they believe it meets all the criteria, or oppose it by providing examples of instances where it does not. The featured article director or his delegates will promote an article if consensus in favour of promotion has been reached among the reviewers after a reasonable time.

In 2009, 522 articles were promoted to Featured article (FA) status, while 157 articles had featured status removed via the Featured article review (FAR) process. The net increase, 365 featured articles, is 37% down on the 2008 figure of 576.[1] This trend was evident throughout 2009: the rate of promotion slowed because it took longer to gather enough reviews of a given featured article candidate (FAC) to determine whether or not there was consensus to promote. The decline in reviewer activity has been noted several times throughout the past year on the talk page associated with the FAC process, and is backed up by an analysis of the figures.

Summary:
  • Annual increase in FAs down 37%
  • FAC reviews down 26%
  • FAC reviewers down 36%
  • FAC "nominators only" up 157%
  • FAR participants down 32%

In 2009 there were 991 FACs (522 successful, 469 unsuccessful), which attracted a total of 9,409 reviews. 1,434 editors were involved with the FAC process, of whom 224 were nominators only, 302 were both nominators and reviewers, and 908 were reviewers only. A successful FAC had, on average, reviews from 12 different people, while an unsuccessful FAC had reviews from 9. In 78% of all FACs, one of these reviewers was Ealdgyth, who reviewed the sources used for reliability.[2]

By contrast, in 2008 there were 1,328 FACs (719 successful, 609 unsuccessful), which attracted a total of 12,743 reviews. 1,987 editors were involved with the FAC process, of whom 87 were nominators only, 258 were both nominators and reviewers, and 1,642 were reviewers only. A successful FAC had, on average, reviews from 11 different people, while an unsuccessful FAC had reviews from 9. Once again Ealdgyth provided sterling service, commenting on the reliability of sources for 66% of all 2008 FACs.[2]

Thus compared to 2008, there were 28% fewer people participating in the FAC process in 2009, which led to 26% fewer reviews. However, there were in fact 36% fewer people providing reviews (1,210, down from 1,900), while the number of editors nominating an article but not reviewing others rose from 87 to 224, an increase of 157%.
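As an arithmetic check, the 36% figure follows directly from the counts above, taking as reviewers everyone who reviewed, whether or not they also nominated:

\[
\underbrace{(258 + 1642)}_{2008\ \text{reviewers}} = 1900, \qquad
\underbrace{(302 + 908)}_{2009\ \text{reviewers}} = 1210, \qquad
\frac{1900 - 1210}{1900} \approx 36\%.
\]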

Articles can also lose featured status through the Featured article review process. Editors who believe an article no longer meets the featured article criteria can list it at FAR. Ideally one or more editors will take on the task of bringing it up to standard. The FAR process showed a similar decline in participation in 2009: there were 219 FARs (157 demoted, 62 kept), and 767 editors participated in reviews. In 2008 there were 263 FARs (143 demoted, 120 kept), and 1,129 editors participated. The number of editors participating thus dropped by 32% in 2009.[3]

Featured lists

Summary:
  • Annual increase in FLs down 38%
  • FLC participants down 23%
  • FLRC participants up 31%

Similar processes to FAC and FAR exist for primarily list-based content: featured list candidates (FLC) and featured list removal candidates (FLRC). In 2009, 500 lists were promoted to Featured list (FL) status, while 83 lists had featured status removed via the FLRC process. The net increase, 417 featured lists, is 38% down on the 2008 figure of 669.[4] In 2009 there were 574 reviewers and nominators at FLC, down 23% from 743 in 2008.[5]

FLRC bucked the trend, with 235 people involved in 114 reviews, compared with 179 people in 72 reviews in 2008.[5] The increase in the number of lists having their featured status reviewed is possibly a consequence of the rapid growth of the featured list process during 2008.

Good articles

Summary:
  • Annual increase in GAs down 11%
  • GA participants down 25%

Good articles (GA) must meet a less stringent set of criteria than featured articles. The review process also differs: promotion to GA requires a review from only one editor, who must not have been a significant contributor to the article. The number of Good articles increased by 2,151 over 2009, down 11% on the net increase of 2,416 in 2008. There are currently 8,104 Good articles, 1.8 times the number of featured articles and lists.[6] The total number of nominators and reviewers in this process is also down compared to 2008: 1,351 against 1,809, a drop of 25%.[7]

A-Class review

Summary:
  • WP:MILHIST A-Class reviews up 42%
  • Number of WP:MILHIST ACR participants steady

On the Wikipedia 1.0 assessment scale there is a level between FA-Class and GA-Class: A-Class. An A-Class rating may be awarded by a WikiProject whose scope covers the article, with the process determined by each WikiProject. This contrasts with the centralised (i.e. not WikiProject-based) processes for Featured articles and the like. A small number of WikiProjects have active formal A-Class review systems.[8] Of these half dozen A-Class review departments, that of the Military History WikiProject is the largest, processing 220 A-Class reviews in 2009, a 42% increase on the 155 reviews processed in 2008. However, the number of participants in the process (nominators plus reviewers) has remained steady: 144 in 2009, compared to 140 in 2008.[9]

Peer review

Summary:
  • PR reviewers down 37%
  • PR "nominators only" down 10%
  • Three editors provided 43% of 2009 reviews

Peer review (PR) differs from the previously discussed processes in that it does not result in the awarding of a particular status to the article; instead it is a means for editors to solicit suggestions for improving an article. Peer review is often recommended as a way of attracting the attention of previously uninvolved editors to spot problems which might not be apparent to those closer to the article. Once again this requires reviewers.

In 2009 a peer review was requested for 1,478 articles, resulting in 2,062 reviews. Of these, 891, or 43%, were carried out by just three editors: Ruhrfisch (343), Finetooth (332) and Brianboulton (216).[10] They were assisted by a further 730 reviewers making one or more review comments, while another 503 editors nominated articles for PR but did not review others.[11] Once again, these numbers are down on the year before. In 2008, 2,090 articles had a peer review. For technical reasons the number of reviewers could only be determined for the period February to December;[12] in this period 1,028 editors reviewed PRs and a further 499 nominated articles for PR without commenting on others. In the corresponding period of 2009 the numbers are 645 (37% lower) and 449 (10% lower) respectively.[11]

How can I help?

Start reviewing articles! This previous Signpost article gives suggestions on how to go about it. Perhaps start off at Peer review, where "you can literally leave one sentence and help improve an article."[13] To find out more about reviewing Good articles, see Wikipedia:Reviewing good articles. You can even ask for a mentor. At places like FAC or FLC you could start off by checking the criteria (What is a featured article?, What is a featured list?), then reading other people's reviews to see what sort of things to look for. If you don't feel confident enough to support or oppose initially, you can leave a comment instead.

Notes

  1. ^ Source: Wikipedia:Featured article statistics.
  2. ^ a b These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual FAC pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form; a minimal sketch of this counting approach appears after these notes. The nominators' usernames were obtained by parsing the HTML of the monthly archive pages (e.g. Wikipedia:Featured article candidates/Featured log/January 2009 or Wikipedia:Featured article candidates/Archived nominations/January 2009), and recording the usernames listed after the string "Nominator(s)".
  3. ^ These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual FAR pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. This method probably overestimates the number of users involved, as it counts links to users who, as significant contributors to the article, were notified of the FAR.
  4. ^ Source: Template:Featured list log.
  5. ^ a b These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual FLC or FLRC pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. The number of reviewers cannot be separated from the number of nominators, as was done in the FA case, because the nominators were not listed in a standardised form until February 2009.
  6. ^ Source: Wikipedia:Good articles.
  7. ^ Source: Revision history statistics of Wikipedia:Good article nominations.
  8. ^ Of the 1,606 WikiProjects or task forces which have created categories to hold A-Class articles (Source: Category:A-Class articles), only 320 appear to use A-Class, i.e. currently have any A-Class articles (Source: Wikipedia Release Version Tools). Only 28 have pages in Category:WikiProject A-Class Review, indicating a formal review mechanism. Looking at these pages individually shows that only the Aviation, Ships, Military history, U.S. Roads, and possibly the Tropical cyclones WikiProjects had active A-Class review departments in 2009.
  9. ^ These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual ACR pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form.
  10. ^ Source: Wikipedia talk:Peer review.
  11. ^ a b These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual PR pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. The nominators' usernames were obtained by finding the creator of each individual peer review page (e.g. Wikipedia:Peer review/Gilbert Foliot/archive1) using API queries like this one.
  12. ^ The category January 2008 Peer Reviews does not exist.
  13. ^ User:Ruhrfisch at Wikipedia:Wikipedia Signpost/2008-09-15/Dispatches.
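
For readers who want to experiment, the counting approach described in notes 2, 3, 5, 9 and 11 can be approximated with a short script against the MediaWiki API. The sketch below is a minimal illustration, not the tooling actually used to compile these statistics; the function names are invented for the example, but the API parameters (prop=links with plnamespace=2|3, and prop=revisions with rvdir=newer) are standard.

# Minimal sketch of the counting approach in notes 2, 3, 5, 9 and 11.
# Not the script used for this dispatch; an illustration only.
import requests

API = "https://en.wikipedia.org/w/api.php"

def participants(title):
    """Distinct usernames linked (via signatures) from a review page."""
    users = set()
    params = {
        "action": "query", "format": "json", "titles": title,
        "prop": "links",
        "plnamespace": "2|3",   # namespace 2 = User, 3 = User talk
        "pllimit": "max",
    }
    while True:
        data = requests.get(API, params=params).json()
        for page in data["query"]["pages"].values():
            for link in page.get("links", []):
                # "User talk:Example/Archive 1" -> "Example"
                users.add(link["title"].split(":", 1)[1].split("/")[0])
        if "continue" not in data:
            # As note 3 observes, this overcounts slightly: links that
            # merely notify an article's main contributors are included.
            return users
        params.update(data["continue"])  # follow API continuation

def creator(title):
    """Username of the editor who created the page (note 11's method)."""
    params = {
        "action": "query", "format": "json", "titles": title,
        "prop": "revisions",
        "rvdir": "newer", "rvlimit": "1",  # earliest revision only
        "rvprop": "user",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page["revisions"][0]["user"]

# Example, using the peer review page cited in note 11:
#   creator("Wikipedia:Peer review/Gilbert Foliot/archive1")
#   participants("Wikipedia:Peer review/Gilbert Foliot/archive1")

In principle, taking the union of such sets over a year's review subpages yields participant totals of the kind quoted above, subject to the slight overcounting that note 3 describes.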