The Signpost


Community view

Oh! The humanity

This article includes the opinions of six Wikipedians about Elon Musk's new encyclopedia, Grokipedia. These opinions have been lightly edited for length and grammar, and links to articles have been added.

On October 27, 2025, Elon Musk, the world's richest person, introduced his encyclopedia, named Grokipedia, which promptly crashed. The next day, it was re-introduced with about 850,000 articles, many of which were taken directly from Wikipedia. Other articles look very similar to Wikipedia articles, presumably because the AI bot that wrote them was trained on data from Wikipedia. The mainstream media reacted as if Grokipedia had crashed and burned – see the In the media section of this issue for a summary of the many articles. Of course, this was only version 0.1 of Grokipedia, so it may be too early to condemn it to the ash heap of history. It's not one of the worst catastrophes in the world. Not yet, anyway.

User Rhododendrites has published an academically-oriented paper about the risks of Wikipedia in Tech Policy Press, which we have re-published in the Opinion section.

For this column, however, The Signpost asked six Wikipedians about their respective views of Grokipedia.

A scientist speaks

Wade in 2017. Photo by Dave Guttridge, CC BY-SA 4.0

Jess Wade (GR) is a physicist at Imperial College London who has created over 1,200 Wikipedia articles about women scientists over the years.

One of the many wonders of Wikipedia is that it is created by people, for people. Wikipedia pages are concise, carefully cited, and balanced – Grokipedia pages are repetitive, sloppy, and reflect Musk's own political biases. Wikipedia pages are the result of groups of anonymous nerds who value intellectual integrity and precision; if something is presented as a fact, it is likely to have been verified through a bunch of independent sources. The same cannot be said for Grokipedia, which is as accurate as all other Large Language Models: statistical machines optimised on what the internet says is probable, rather than what is actually true.

While much of the content on Grokipedia is lifted from Wikipedia, the Grokipedia pages are longer, sloppier, and partisan. This can be seen by comparing the biographies of Meredith Whittaker (GR), Timnit Gebru (GR) and Joy Buolamwini (GR), academics who champion the ethical and transparent development of technology, on both platforms. For each researcher, Grokipedia adds thousands of words on "Controversies and criticisms," hiding its own biases in sentences that start "proponents of …" and "critics have also," presenting one-sided opinions (from interviews and social media posts) as facts. Grokipedia is reality through Musk's lens, manipulated narratives presented in a format that people have come to trust.

Where's the opera?

Pruitt in 2022. Photo by Fuzheado, CC BY-SA 4.0

Steven Pruitt (GR), known as Ser Amantio di Nicolao on Wikipedia, has made over six million edits and loves Italian opera. He might have ignored Grokipedia if The Signpost hadn't asked for his opinion.

We've seen similar post-Wiki sites crop up before, and none of them have had any particular staying power. There's no reason to think that Grokipedia will be any different.

I looked up a handful of articles related to early nineteenth-century opera, mostly Italian, including several articles I created – Fanny Eckerlin, Le nozze in villa, Caroline Unger – as well as Isabella Colbran (which I did not). None have been transferred over; indeed, most of the articles I searched for were missing from Grokipedia.

I also looked up the article about myself. It is quite a bit more prolix than the one on the English Wikipedia, and not so well-organized. The wording is not identical: AI has been used to rewrite, or rework, large swaths of it, introducing a handful of minor errors. It is also quite a bit longer, due to the addition of semi-extraneous, often critical, information. There is an entire section, with several subsections, titled "Criticisms and debates", which goes into some detail about criticisms of my work, though really nothing concrete. It's nothing I haven't heard before, but it seems to be stitched together from blogs, forums, and comments.

Grokipedia is half-baked at best, and I don't see much of a future for it in its current state. I doubt it will have much staying power without a severe overhaul.

Can the machine keep up with us?


Vysotsky is a newly retired Dutch academic librarian, who wrote the Serendipity column for The Signpost for about two years. He has contributed over 12,000 photos to Wikimedia Commons and has been editing Wikipedia since 2007. Here he comments on the article about the 2025 Dutch general election (GR):

This is Grokipedia's method of operation, according to Grok: "How it works: Articles are automatically generated based on training data, with Grok integration for real-time updates and fact-checking. Users can submit suggestions, but there's no crowdsourcing like Wikipedia." Well, I fact-checked the "real-time updates". I know it’s only Grokipedia v.0.1, but updating the outcome of Dutch national elections shouldn't be too hard for artificial intelligence. It’s data, after all.

But no: the Dutch general election was held on 29 October. The next day, 30 October, the overall outcome was clear: the Liberal Democrats (D66) won 26 seats, the nationalist anti-migration PVV 26 seats, the conservative-liberal VVD 22 seats, the Social Democrats+Greens (GL-PvdA) 20, the Christian Democrats (CDA) 18 seats, and ten other parties won the remaining seats.

There was only one point left unclear: which party had obtained the most votes, the Liberal Democrats or the Nationalists? That became clear on 1 November: the Liberal Democrats got the most votes – and could take the lead in trying to form a new government. The English Wikipedia reported the results on 30 October (with 99.7% of the votes counted). As of 4 November, Grokipedia still gives a two-week-old prediction ("Projected Seats (Latest Aggregate, Oct 2025)"): PVV 40, GL-PvdA 24, CDA 24, D66 17, VVD 15. At the top of the page: "Fact-checked by Grok last month". So much for the speed of fact-checking by Grok. The leader of D66 and projected Prime Minister, Rob Jetten, isn't mentioned in the article at all; his party, D66, appears only once, in the table of predictions.

Most importantly, this reveals why we need humans, who are curious and dedicated to topics close to their hearts, and who want to report on them as soon as official results are made available. Bots are neither curious nor dedicated.

What is in a baby bottle?


Mary Mark Ockerbloom has edited Wikipedia for almost 20 years, works as a paid Wikipedian in Residence for educational, scientific, and cultural organizations, and organizes the Philadelphia WikiSalon for new editors and others who wish to develop their editing skills.

Imagine you're a mother-to-be wondering whether to use baby bottles to feed expressed milk or formula. Would you rather read a Wikipedia article curated by humans, or an AI-generated one from Grokipedia? Baby bottle (GR) is one of many articles I've substantially rewritten as a Wikipedian in Residence. While working on it, I asked myself, "What information would someone be trying to find when they read this article?"

Answering that question requires a deep understanding of our concerns as humans. LLMs are unlikely to do this well. They rely on the sources they are given, whether good, bad, or indifferent, and they lack the underlying world knowledge that humans use to assess and prioritize information.

When I rewrote "Baby bottle", I added over 10,000 words and 154 references. I focused on things parents might want to know, like design considerations, materials, safety and use of baby bottles.

Grokipedia's first paragraph describes a baby bottle as having three typical components. Wikipedia lists a fourth: the protective cap used to keep bottles clean and prevent spills.

Grokipedia's second paragraph is one rambling sentence that begins with "prehistoric ceramic bottles", jumps to high levels of mortality in the 19th century, and concludes that safety has improved since then. This disjointed treatment of past events reflects LLMs' lack of real-world understanding of time. If you ask an LLM what is happening "today", it looks at millions of statements where the word "today" was used. Its answer may reflect what was said last week or ten years ago. It doesn't understand that "today" has meaning based on when it is asked.

Grokipedia's sentences are long, disjointed, and bombastic. Some of it reads like advertising. I'm thankful that Wikipedia editors have worked steadfastly to remove promotionally-toned additions to the "Baby bottle" page. I know which page I'd rather read, if I were a new mom.

Grokipedia did something better than Wikipedia


User Oltrepier mainly edits the Italian-language Wikipedia and is a Signpost reporter, too. He did find something that Grokipedia has done better than Wikipedia: the enWiki article on the Detention of Johan Floderus (GR), which Oltrepier himself created in 2023, is getting out of date.

The Grokipedia article, including the sources, is definitely more up-to-date than the one on Wikipedia, but it's also verbose and drags on and on. The language used by Grok can be clunky, with strange word choices – for example, "documented cases exceeding 66 victims". Right from the start, the article focuses less on the actual key events involving the EU diplomat Johan Floderus and more on the hostage diplomacy used by Iran in recent years, to the point where it reads more like a political speech than an encyclopedia entry. It definitely doesn't help to feature phrases such as, "This persistence highlights the judiciary's subordination to political imperatives, where legal facades mask bargaining tactics amid the regime's prioritization of ideological control over impartial justice".

AI can sometimes help

Wills (right) with Tanya Tucker in 2019. Photo by Mike Klem, CC BY-SA 4.0

Betty Wills, known as Atsme on Wikipedia since 2011, also founded Justapedia, a Wikipedia fork which resembles Wikipedia more than Grokipedia. Justapedia welcomes both conservative and liberal editors, according to Wills. With a little help from Grok 4 beta, she summarizes the difference between Wikipedia and Grokipedia as follows:

Grokipedia is not human, can't relate to the human condition, and can't initiate doubt, as it's an LLM. Garbage in, garbage out:

  • It amplifies the biases in its training data (e.g., overrepresenting Western perspectives). It needs human oversight and prompts like "Analyze your response for bias."
  • It will hallucinate plausible but false information (e.g., inventing non-existent historical events). It puts language fluency over accuracy, with no built-in fact-checking.
  • It cites Quora, The Daily Mail (UK), Britannica, and Biography.com. Like Grok, ChatGPT, and the other AI bots, it will also cite Reddit and Wikipedia.
  • It can't judge notability or importance and doesn't have the level of originality/creativity needed to create the kinds of new articles that will make it competitive with other encyclopedias.
  • Even if it neutralizes what it considers "bias", it's not trustworthy. It doesn't have Wikipedia's New Page Patrol, or even editors helping to keep fake articles out.

The Steele dossier article (GR) shows the differences in perspectives between Grokipedia and Wikipedia. Grokipedia tries to remove perceived biases to achieve a "neutral" POV, rather than covering all notable POVs.

Consuming AI-generated information is like quelling a growling stomach by downing a Whopper and fries from a drive-thru, rather than savoring a well-prepared, five-course meal in an upscale restaurant.

Conclusion


When the Grok chatbot, Betty Wills, and five other Wikipedians send out much the same message, it's hard to ignore. Grokipedia is extremely flawed, perhaps fatally so, because it's controlled by one biased person with extreme views and because it lacks human understanding and the human touch. Using AI to write an encyclopedia means that the "writers" do not think for themselves, cannot recognize a notable topic or a reliable source, hallucinate "facts", do not question their own writing, and cannot eliminate bias or find a neutral point of view.

What's Grokipedia missing? In a word, humanity.

