The Signpost

Technology report

What is: localisation?; the proposed "personal image filter" explained; and more in brief

What is: localisation?


This week's Technology Report sees the first in an occasional editorial series entitled What is?. The series aims to demystify areas of the Wikimedia and MediaWiki technology world for the casual editor. Today's article is on "localisation", the process by which the MediaWiki interface is translated into other languages (more than 300 of them).

Localisation is something MediaWiki has done very well for the past five years. In 188 different languages (or language variants), at least 490 of the 500 most-used interface messages (sidebar items and "Revision as of", for example) have been translated from the English defaults. That list includes big names (French, German, Spanish) but also a myriad of smaller language communities as diverse as Lazuri (spoken by approximately 32,000 people on the Black Sea coast) and Tachelhit, a Berber language spoken by 3 to 8 million Moroccans (full list).

Translation, in the vast majority of cases, cannot be handled by MediaWiki developers alone. Instead, the effort is crowdsourced to a large community of translators at translatewiki.net, an external site with nearly 5,000 registered users (source). The site was built for translating all things MediaWiki, but now also handles a number of other open-source projects. When new interface messages are added, they are quickly passed on to translatewiki.net, and the finished translations are then passed back. Any project that runs the LocalisationUpdate extension (as all Wikimedia projects do) makes the latest translations of its interface messages available to users, in hundreds of languages, within a few days of their completion.
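
To give a sense of what translators actually work with, here is a rough sketch (not copied from the real files) of how MediaWiki stores interface messages at this point in its history: one PHP file per language under languages/messages/, each defining the same keys with translated values. Translators at translatewiki.net edit the non-English values, and LocalisationUpdate pushes the updated files out to the wikis; the message values shown here are illustrative.

<?php
// languages/messages/MessagesEn.php: the English defaults
$messages = array(
	'search'       => 'Search',
	'revisionasof' => 'Revision as of $1', // $1 becomes the revision timestamp
);

// languages/messages/MessagesFr.php: a translated counterpart
$messages = array(
	'search'       => 'Rechercher',
	'revisionasof' => 'Version du $1',
);

Because every language file shares the same keys, any message that has not yet been translated simply falls back (ultimately to English), which is why even partially translated languages still render a usable interface.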

More than 100 open issues (source) remain around support for right-to-left languages, languages with complex grammar, and languages written in non-Roman scripts, but the situation is slowly improving. For more information about MediaWiki localisation, see MediaWiki.org.
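
One concrete example of the "complex grammar" problem is pluralisation, which works very differently from language to language. MediaWiki lets message text embed the {{PLURAL:}} parser function and applies the plural rules of the message's own language when it is rendered; in the hedged sketch below, the message key is invented purely for illustration.

<?php
// Illustrative only: 'example-watchers' is an invented key, shown as it
// might appear among the English defaults.
$messages = array(
	'example-watchers' => '{{PLURAL:$1|One user is|$1 users are}} watching this page.',
);
// Languages with richer plural systems (the one/few/many distinction of
// many Slavic languages, for instance) supply more forms in the same
// syntax; a related {{GRAMMAR:}} function handles case inflection where
// a language needs it, for example when declining the site name.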

"Personal image filter" to offer the ability to hide sexual or violent media

At the upcoming meeting of the Wikimedia Board of Trustees on March 25/26, a design draft will be presented for the "Personal image filter", a system that would allow readers to hide controversial media, such as images of a sexual or violent nature, from their own view. This would be the first major change to come out of the long-running debates about sexual and other potentially offensive images, which culminated in May last year in controversial deletions by Jimbo Wales and other admins on Commons, at a time when media reports, especially by Fox News, were targeting Wikimedia for providing such content. Subsequently, the Foundation commissioned outside consultants Robert Harris and Dory Carr-Harris to conduct the "2010 Wikimedia Study of Controversial Content", which was presented at the Board's last physical meeting in October. The study's recommendations were not immediately adopted; instead, the Board formed a workgroup. (See the summary in the Signpost's year in review: "Controversial images".)

Mock-up showing filter preferences for an anonymous user
Mock-up showing three different filter categories for an image when hovered over
Mock-up showing a filtered (shrouded) image

The study had recommended that "a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images ... into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command ('under 12 button' or 'NSFW' button)", but that "no image [should] be permanently denied to any user by this regime, merely its appearance delayed".

In response to an inquiry from the Board as to whether such a feature was feasible and how it might look, the draft design for the Personal Image Filter was developed by the Foundation's tech staff, in particular designer Brandon Harris (User:Jorm (WMF), no relation), and has already been presented to the workgroup, which in turn will present it to the Board this week. The design introduces a global "Content Filter" category on Commons, containing all images that can potentially be hidden according to a user's preferences, with a set of subcategories corresponding to those preferences. As a kind of localisation of these, "individual wikis will be required to maintain a 'Category Equivalence Mapping'", to which they can add (but not remove) their own subcategories. The total number of subcategories is intended to be small, though, with "somewhere between 5-10" global subcategories; together with local ones, "the interface can comfortably support around 10-12 filters before becoming unwieldy". Like the original recommendations from the study, the proposal appears to leave it to the communities to define the set of filterable subcategories, but it sketches a possibility:

Users (both anonymous and registered) can select which categories they want to filter, either via an annotation next to filterable images that lists the filter categories the image belongs to, or from a general display setting (accessible via a registered user's preferences, or, for anonymous users, via a new link next to "Log in/Create account").
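
To make the proposed mechanics more concrete, here is a deliberately simplified sketch rather than code from the actual design: a wiki's "Category Equivalence Mapping" represented as a plain array, plus a check of a reader's chosen filters against the filter categories an image carries. All category, option and function names are invented, and since the proposal does not say how anonymous readers' choices would be stored, a cookie is assumed here.

<?php
// Invented names throughout; a sketch of the idea, not the real design.
// A handful of global filter categories (hosted on Commons) map to the
// local categories this wiki treats as equivalent. Wikis may add entries
// of their own but not remove the global ones.
$categoryEquivalenceMapping = array(
	'Filter:Sexually explicit' => array( 'Category:Explicit images' ),
	'Filter:Graphic violence'  => array( 'Category:Graphic violence' ),
	'Filter:Medical imagery'   => array( 'Category:Medical images' ), // local addition
);

// Would this image be shrouded for this reader? $user is a MediaWiki
// User object; registered users keep their choices in their preferences,
// while anonymous readers are assumed to set a cookie via the proposed
// link next to "Log in / create account".
function shouldShroudImage( array $imageFilterCategories, $user ) {
	if ( $user->isLoggedIn() ) {
		$chosen = (array)$user->getOption( 'imagefilter-categories', array() );
	} else {
		$cookie = isset( $_COOKIE['imageFilter'] ) ? $_COOKIE['imageFilter'] : '';
		$chosen = $cookie === '' ? array() : explode( '|', $cookie );
	}
	// Shroud the image if it carries any filter category the reader selected.
	return (bool)array_intersect( $imageFilterCategories, $chosen );
}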

Both the recommendations of the Controversial Content study and the workgroup's chair, Phoebe Ayers, emphasise the opt-in (i.e. voluntary) nature of the filtering. From a technical perspective, the changes needed to arrive at an opt-out (i.e. mandatory at first) version are obviously rather trivial, and indeed until very recently the proposal included an additional option for "Default Content Filtering", which could have been activated on a per-wiki basis if consensus on that project demanded it. The option was removed by Jorm, who explained that it had originally been included "because I could see this being used by non-WMF sites" but that it was "more of a suggestion for implementation, rather than a requirement, and appears controversial".

In fact, at least on the English Wikipedia, the standard skins have long provided CSS and JavaScript code that allows parts of a page to be hidden for all readers. However, the use of the corresponding templates has generally been restricted to talk pages ({{collapse}}) and to tables and navigational components ({{hidden}}), with objections to their use for more encyclopedic content. Still, their use for controversial images has been advocated by some, including Jimmy Wales, who argued in favour of using the "Hidden" template for Muhammad caricatures: "Wiki is not paper, we should make use of such interactive devices far more often throughout the entire encyclopedia, for a variety of different reasons." Wales, who has been a member of the Board's Controversial Content workgroup since a reshuffle in winter (the others being Ayers, Matt Halprin and Bishakha Datta), recently responded to two related proposals on his talk page ([1], [2]), supporting "reasonable default settings" for the display of controversial images based on "NPOV tagging" such as "Image of Muhammad", rather than on subjective assessments such as "Other controversial content".

The Controversial Content study had recommended that the feature work "using the current Commons category system", in the form of an option that users could select to partially or fully hide "all images in Commons Categories defined as sexual ... or violent". For registered users, it recommended even more fine-grained options to restrict viewing "on a category by category or image by image basis", even outside the sexual or violent categories, similar to Wales' "NPOV tagging". But this was rejected as impractical for the Personal image filter proposal; Brandon Harris explained the reasoning behind that decision.

In brief

Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for many weeks. Users interested in the "tarball" release of MW1.17 should follow bug #26676.