The Signpost

File:Asteroid falling to Earth.jpg (credit: State Farm, CC 2.0)
In the media

The end of the world

"The end of the world" is a heck of a headline. The Signpost will not be able to cover that story, of course: after the event, there would be no reporters to write up the story, and no readers left to read it. But it would literally be the ultimate story for any journalist. In this issue, we instead cover several stories on things that might some day bring about the penultimate issue of The Signpost.
The Guardian covers how Wikipedia covers how the world might end. If you need a musical warmup for this, listen to In the Year 2525. The more serious stories we cover might fit under the heading The end of Wikipedia as we know it, such as another Indian government that wants to dictate changes to a Wikipedia article, ANI, and Elon Musk's antics. These are difficult times, but I have faith that Wikipedians are up to the challenge. – S

It's the end of the world as Wiki knows it

Imagine every molecule in your body exploding at the speed of light – not an entirely fictional scenario for the ultimate fate of the universe

The Guardian's Internet Wormhole column examines some Wikipedia pages about the fate of everything, including the disappearance of the Y chromosome in the Timeline of the far future. At least we have a Boltzmann brain to look forward to.

The Guardian article starts with 719 BC and continues to 2026, and then just keeps on going. From the 24th century onwards, apparently, "things start to get really trippy: a 'negative equinoctial paradox' in 2353, every person in Japan having the same surname by 2531, and 'the 639-year-long performance of John Cage's organ work As Slow as Possible' concluding in 2640".

Then, it moves on to Timeline of the far future and, for the "truly adventurous", the Ultimate fate of the universe, where readers can handily choose their favorite apocalyptic scenario from among the Big Freeze, Big Crunch, Big Bounce, Big Rip and Big Slurp. Just take your time and enjoy it. – S

The banner used in the 404 Media article (credit: Bluerasberry)

As reported by 404 Media – the article has also been discussed in audio format on YouTube (at 31:15 – 50:00) – the Wikimedia Foundation "is building new tools that it hopes will help Wikipedia editors stay anonymous in part to avoid harassment and legal threats." Despite the "new" claim, most of the specific examples described in the article have either existed or been in the works for a long time already.

Italian online newspaper Il Post also covered the news (in Italian), explaining how:

Wikipedia [...] is notoriously written and edited by a community of volunteer users; anyone, with a bit of training, can contribute to it. Editors are normally anonymous users who draw little attention: in recent years, however, physical and legal threats have been on the rise, especially against those who edit potentially controversial topics, such as Wikipedia articles related to science and politics.
— Il Post (translated and re-adapted)

Two recent examples: three editors currently involved in the Asian News International vs. Wikimedia Foundation court case – see previous Signpost coverage – and an editor on the French Wikipedia who has reportedly faced repeated threats from a journalist over his edits to the article on the newspaper Le Point – see this issue's News and Notes.

Still, the most worrying signs so far have come from the United States: the Signpost previously wrote about the Heritage Foundation's supposed plan to "identify and target Wikipedia editors abusing their position", and published an Op-ed by user GorillaWarfare on efforts by figures like Elon Musk and outlets like Pirate Wires to discredit Wikipedia and its editors, as well as pressure from right-wing activists like Libs of TikTok; this last matter has also been discussed by Lila Shroff in The Atlantic (free subscription required). On February 5, the editorial board of the New York Post directly exhorted "Big Tech" to "block Wikipedia until it stops censoring and pushing disinformation", a call based at least partially on its objections to Wikipedia's sourcing policies and presented alongside a slew of bias complaints. Some of those complaints are colorable and some risible, and all certainly deserve at least a response, but direct calls for suppression are nonetheless significant and dramatic.

That being said, as noted by 404 Media, the WMF has already acknowledged the general trends at play here in the "External Trends" section of its 2024-2025 annual plan, which states:

Human rights threats are growing. Physical and legal threats against volunteers and staff who fight disinformation continue to grow. Accusations of bias and inaction by those whose preferred narratives do not prevail on Wikipedia may be encouraged and amplified by purveyors of disinformation.

[...]

Law is weaponized in important jurisdictions. Bad-faith lawsuits, by people who don’t like the verified information appearing on Wikipedia pages, are succeeding in some European countries. Some incumbent leaders are abusing their powers to silence and intimidate political opponents.

Concerns from Wikimedia executives only appear to have intensified since then, as shown by several recent public statements compiled by 404 Media. During an online meeting with the Board of Trustees on January 30, Wikipedia co-founder Jimmy Wales said that he considered rising threats to Wikipedia from Musk and other figures "something we need to grapple with", while the WMF CEO, Maryana Iskander, told the Trustees, "We're all just trying to understand what is happening not only in the United States, [but across the world], so the best we can do is monitor, check-in on staff, and try to understand what's needed". Iskander added that the Foundation was going to "do a risk assessment for community conferences for Wikimania", in order to ensure the safety of people gathering at in-person events.

Two Wikimedia lawyers, Phil Bradley-Schmieg and Jacob Rogers, shared more details on the WMF's plans to enhance user protection, particularly for unregistered users. During the aforementioned meeting, Bradley-Schmieg mentioned the Foundation's ongoing work on the "Temporary Accounts program" – begun in 2019, previously under the name "IP Masking" – which would give logged-out users a temporary username that hides their IP address, leaving the address accessible "only to people who are really engaged in anti-vandalism". See also prior Signpost coverage: "News from WMF" (2024-11-06), "Mandatory IP masking" (2020-11-01).

In a separate meeting with Community Resilience and Sustainability, also held on January 30, Rogers suggested the possibility of extending the use of sockpuppet accounts to a wider range of non-English Wikimedia projects, while also noting that the WMF had been working to limit the amount of data it retains on any given user – for instance, IP addresses associated with edits are deleted or anonymized after 90 days. According to the Foundation's most recent transparency report, in the first six months of 2024 it received 26 formal requests for information on users, six of which came from the United States, the highest number of any jurisdiction. It provided information in just two cases, one from the US and one from Sweden.

As summarized by 404 Media, Rogers also said that WMF has "created a legal defense program that will in some cases fund the defense of Wikipedia editors who are attacked through the legal system, as long as that editor or staffer was contributing to a Wikimedia project in good faith" (presumably a reference to the "Wikimedia Foundation Legal Fees Assistance Program," launched in 2012). The Foundation has recently fought cases in India (the aforementioned ANI vs. WMF) and Germany.

During one of the meetings, upon being asked if the Foundation would consider moving its headquarters out of the US – since it’s currently based in San Francisco – Rogers said:

[Such a decision] would probably not do very much, because the projects would remain accessible in the United States, and many things would still be subject to US law even if the foundation moved its headquarters to a different jurisdiction.

[...]

I think a move would be extremely expensive and cost something in the tens to hundreds of millions of dollars. [...] I see that as one of the most significant, expensive, and extreme possible options. You would only do that if it was like, the only solution to a major problem where doing that would make sense.

Neither the Wikimedia Foundation nor the Heritage Foundation responded to 404 Media's request for comment. – O, S, B, H, J

Editor under pressure removes edits about Hindu nationalist historical figure

Sambhaji has become the latest subject of strife between Wikipedia editors and legal authorities in India

An editor appears to have partially submitted to demands from the Cyber Crime Investigation Cell of the Maharashtra Police to remove allegedly derogatory remarks about the Hindu king Sambhaji from Wikipedia. Editor Ratnahastin stated, "I have been sued, legal issues refers to the troubles I'm facing. It is not a threat," after removing edits they had previously made and promising not to revert edits made by others (including those who reverted Ratnahastin's self-reverts). They also said that they had previously contacted Trust and Safety for assistance.

Sambhaji was the king of the Maratha Empire who led the war against the Muslim Mughal Empire in the 1680s. He is revered by many Indians, in much the same way that citizens of many other countries revere their own patriotic or national heroes.

Soon after a new biopic about Sambhaji, titled Chhaava, was released worldwide on February 14, the Indian press – e.g. The Hindu, Hindustan Times, and India Today – began reporting on complaints about the English Wikipedia article about Sambhaji. According to the sources, Chief Minister of Maharashtra Devendra Fadnavis instructed the local police to have the "derogatory remarks" removed from Wikipedia.

The film, like most biopics, is not meant to be a neutral piece of non-fiction; it was adapted from the historical novel Chhava by Shivaji Sawant. As reported by a Hindustan Times story, the director of the film had his own discussions with politicians about a dance scene, which can now be seen only in the movie trailer. The folk dance, known as lezim, is athletic and energetic; in the trailer, it's also emotional, perhaps excessively so, but not pornographic or otherwise immoral. The HT report about the controversy is vague about the reason for the removal of the dance scene, and the somewhat more extravagant scenes in the HT's own video about the scene's removal have since also been removed from the web. Much of this information arrived bit by bit, and was being discussed and digested at WP:ANI as early as February 18.

On February 21, several Indian sources reported that four or five Indian editors had been "booked" or had "a case registered" against them in court.

Ratnahastin began removing information that same day, mentioning legal problems in their edit summaries. At his user talk page, Wikipedia co-founder Jimmy Wales was questioned about the matter. He responded:

[W]hen legal threats against individual users are involved, it is wise for the WMF to be very circumspect about what statements they issue and what actions they are taking. User privacy matters a great deal, and user safety (both against such threats but also the potential social media witch hunt that can easily emerge) is paramount. It's generally a mistake to assume that because the wider community can't be brought into confidential discussions and actions of the legal team, those discussions and actions aren't taking place.
— User:Jimbo Wales 13:14, 23 February 2025 (UTC)

– S, B

In brief

Guess we've finally found the healthiest alternative to doomscrolling...
  • Meet WikiTok: informative, wholesome, and better than chewing gum: On February 5, New York-based app developer Isaac Gemal launched a new web app, WikiTok, which allows for viewing Wikipedia pages as if they were a TikTok feed: the news has been covered by Ars Technica, as well as The Washington Post, in "What if TikTok and Wikipedia had a baby?" (pay-walled). The Ars Technica article provides particularly interesting insight into how WikiTok works, and even reached out to Gemal himself, who broke down how AI coding tools such as Claude and Cursor "helped [him] ship really, really fast and just capitalize on the initial viral tweet asking for Wikipedia with scrolling." What Gemal seemingly does not want to capitalize on, though, is hyper-personalized and addictive content: he has posted the whole code on GitHub, so that anyone can contribute to WikiTok and improve it further, and said, "We're already ruled by ruthless, opaque algorithms in our everyday life; why can't we just have one little corner in the world without them?"
Replica of Wichita Falls' eponymous falls, which were destroyed in a flood, one of several calamities there.
  • Larry Sanger's conversion: Wikipedia co-founder Larry Sanger has officially described himself as a Christian, as stated in a recent post on his blog and in multiple religious publications. Formerly an agnostic, Sanger had already documented his religious shift in recent years.
  • Clicks and crore — whatever you call it, that's a lot: The Wikipedia article Kumbh Mela — documenting the largest human gathering in the world — recorded 22 lakh pageviews on the English Wikipedia in January 2025 (2,202,934 by our count). In India alone, there were 3.06 crore impressions via Google (30.6 million) and 10.5 lakh clicks (over one million) in January, according to The Times of India, for a 3.4% click-through rate (that's high).



Do you want to contribute to "In the media" by writing a story or even just an "in brief" item? Edit next week's edition in the Newsroom or leave a tip on the suggestions page.



Discuss this story

  • It's funny how people with an axe to grind will say that Wikipedia is biased against them, and yet still rely on it for information. This can be seen most easily on YouTube, where Wikipedia articles are used as sources with a screenshot (and often no attribution) all the time. Of course there are biases on Wikipedia, but generally far more nuanced than the detractors would understand. All the best: Rich Farmbrough 18:14, 27 February 2025 (UTC).

















