The spread of misinformation on the internet can be likened to a wildfire. It begins with a spark. Then one dry patch catches fire, then another, and the blaze keeps spreading until everyone is talking about it.
As a researcher, you come across one of these viral claims, and because it is featured in so many places, some of them very reputable, your instinct tells you it must be true. But, just to be sure, you decide to dig a bit deeper, and doubt starts to creep in. Finally, you conclude there is no way it is true and wonder why so many people believe it, or at least why so many tout it as if it were a no-brainer.
What I want to highlight in this article is how a false claim with modest beginnings takes over the information ecosystem and evolves into a widely referenced beast. Oftentimes, the root cause is not even mischief. It could just be carelessness. Somebody somewhere did not do their due diligence while writing something. Another person comes across their work and runs with it. Then another. And on and on it goes. You get the point. Classic domino effect.
The first time I noticed this trend was in 2019, while working on this fact-check of images tied to the Amazon rainforest fires. At first, I’d described the rainforest as the planet’s lungs, providing 20 per cent of its oxygen supply. If you followed the news at all that year, you would have seen this statement everywhere. But I later understood that the claim, though trumpeted by some of the most respected media organisations, including CNN, was in fact incorrect.
While writing the fact-check, I didn’t find articles debunking the claim. Now I see that a bunch of them have sprung up. This one by National Geographic says the origin of the 20 per cent myth is unclear. But what I found at the time, after poring through Google Scholar results, was that it likely started with an academic paper, which was cited by another paper, and eventually made its way into journalism and the mainstream.
I am unwilling to go down that rabbit hole of tracing what led to what again. I do, however, have another example that should do a better job of clearing the fog.
Last December, I edited an article about the trend of begging in Nigeria as a survival mechanism in the face of worsening economic conditions. I thought it would be nice to include statistics on the population of beggars in Nigeria relative to other parts of the world. But the closest thing I found was this Wikipedia article that ranks countries by the number of homeless people they have. The article claimed that Nigeria has 24.4 million homeless people, the highest figure in the world. The only problem is that there is no single reliable source behind the figures; they are merely aggregated from all sorts of different places.
When I saw the article last year, it had up to 70 references, spanning different data years for the various countries. But now, for some reason, all of that is gone (and the data years are all oddly stated as 2024). If you check the December 2023 version of the page, you will see that it cites a 2007 UNHCR report as the source for the Nigerian homeless population figure. Not accurate. That report does not even mention the word ‘Nigeria’.
Right now, the article attaches the note “internally displaced, per IDMC” to the claim, but this is also misleading. According to the Internal Displacement Monitoring Centre, there are only about 3.6 million internally displaced persons in Nigeria.
This is one incredibly flawed source. Unfortunately, you will find this statistic in all sorts of places. If you google “how many homeless people in Nigeria,” the search engine will respond in bold letters: 24 million.
When I conducted this search in December, Google referenced a blog post by Mustard Insights for this confident assertion (the website is no longer accessible). Now, it references an article by Punch Newspaper.
The figure is also mentioned here by Punch, here by Vanguard, here by The Nation, here by CGTN Africa, in this academic paper, and in tons of other places. The academic paper cites Development Aid as its source, which in turn cites this other paper, which has no discernible source at all.
However, most of the other articles cite World Population Review as the source of the figure.
Guess what? I checked World Population Review (again, we have to use the Wayback Machine because the article has undergone significant changes since December), and their own source appears to be that same Wikipedia article. They do not explicitly state this, but virtually all the figures are the same (with the obvious exception of Pakistan), and no source is mentioned.
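If you want to retrace this kind of before-and-after comparison yourself, the Internet Archive exposes a simple availability endpoint that returns the archived snapshot closest to a date you specify. Below is a minimal sketch, assuming Python with the requests library; the helper name and the example URL are only illustrative.

```python
# Minimal sketch: ask the Internet Archive for the snapshot of a page
# closest to a given date, so an old version can be compared with what
# is live today. Helper name and example URL are illustrative only.
import requests

def closest_snapshot(url: str, timestamp: str = "20231215") -> str | None:
    """Return the Wayback Machine URL of the capture closest to `timestamp`
    (YYYYMMDD), or None if the page has never been archived."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url, "timestamp": timestamp},
        timeout=10,
    )
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

if __name__ == "__main__":
    print(closest_snapshot("https://worldpopulationreview.com"))
```

None of this replaces reading the archived page yourself; it only saves you the trouble of clicking through the Wayback Machine calendar by hand.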
On the about us page, they say “data sources and methodology should be listed on each page, but if you require more information,” you should contact Shane Fulmer, the founder. Shane is a web developer, and the same page suggests that the platform relies heavily on simply scraping the internet for data and visualising it.
It states, “Most demographic data is hidden in spreadsheets, behind complex APIs, or inside cumbersome tools. World Population Review’s goal is to make this data more accessible through graphs, charts, analysis and visualisations. We also strive to present the most recent information available, and develop our own projections based on recent growth.”
In short, a hallucination that found its way onto Wikipedia was picked up by a data-scraping website probably run by one person and then further amplified by popular news platforms in Nigeria. Ladies and gentlemen, this is how misinformation evolves. Before you know it, we are making policies and enacting laws based on this shaky foundation.
There are countless other instances, too.
In 2019, I fact-checked the claim that staring at women’s breasts prolongs a person’s lifespan. The News Agency of Nigeria (NAN) circulated the report, citing a study supposedly published in the New England Journal of Medicine by a German scientist, Karen Weatherby. No such study exists. In fact, the claim dates back to 1997 and first appeared in an American tabloid famous for its fictional articles. Various reputable local press outfits republished NAN’s article. Some of them realised their mistake and either pulled down or corrected their reports. Others haven’t bothered to make any changes.
I suspect that the “10 million almajiri children in Nigeria” estimate has gone through that snowballing effect as well. I do not know the chronology exactly, nor have I spent enough time gathering all the available evidence, but the earliest mention of the figure I’ve found from a cursory search is this 2019 press report quoting Abdullahi Sule, the governor of Nasarawa State. Because he is a public figure, I guess people just assumed he knew what he was talking about. The claim might have appeared on the Almajiri Child Rights Initiative website afterwards; the non-profit also does not say how the number came about. In 2020, an article published by UNICEF shared the statistic, with the author citing their source as “some estimates”.
And this is exactly what happens. Somebody hallucinates, and then a reputable organisation like CNN, NAN, or UNICEF comes along and lends weight to the hallucination, and then it takes on a life of its own.
Some of the examples of this trend I’ve seen are the product of deliberate propaganda and influence operations.
This article in the Washington Post describes one case from 1983. A small pro-Soviet newspaper in India published an article claiming that AIDS was the result of experiments sponsored by the United States, citing a “well-known American scientist and anthropologist” who was, conveniently, anonymous. Two years later, the article was cited as a source by a Soviet newspaper. From there, it made its way to the front page of a British tabloid, was picked up by an international news agency, and spread to dozens of countries.
That tactic of deliberately planting a false claim in a local newspaper to spread international propaganda is still used today.
To wrap this up, my recommendation is that we must be more deliberate about verifying information before sharing it — as journalists, academics, civil society actors, and so on. We must understand that the mere fact that a claim is viral does not mean it is true. We must also be aware of the immense power we wield to influence the spread of (mis)information.
I know it can be difficult, especially when more credible data is not available, as in the case of Nigeria’s homelessness problem or the population of almajiri children. It sucks when the best data you have is a figure you cannot even validate.
Editors have a huge role to play as well, especially in newsrooms that do not have dedicated fact-checkers. Encourage writers to rely only on primary sources of information. Very important! Also, insist that reporters state or hyperlink the sources of their data and verify each one if you can.
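A small script can take care of the mechanical half of that last instruction: pulling every hyperlink out of a draft and flagging the ones that do not resolve, so the editor’s time goes into judging whether each surviving source actually says what the story claims. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 libraries; the draft filename is a placeholder.

```python
# Minimal sketch: list every hyperlink in an HTML draft and flag links that
# do not resolve. A human still has to confirm that each working source
# actually supports the claim it is attached to.
import requests
from bs4 import BeautifulSoup

def check_links(html: str) -> None:
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all("a", href=True):
        url = tag["href"]
        if not url.startswith("http"):
            continue  # skip mailto:, page anchors, relative links
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = "unreachable"
        print(f"{status}\t{url}\t(link text: {tag.get_text(strip=True)!r})")

if __name__ == "__main__":
    with open("draft.html", encoding="utf-8") as f:  # placeholder filename
        check_links(f.read())
```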
I have spotted many errors when editing stories with this source-checking approach. For example, a writer states that a city has the fourth-worst air quality on the continent, but when you open the hyperlinked report, you realise the city is ranked fourth in a list of only four cities; the sample is that small. Or someone states that a number of people have been victims of kidnapping in a region, then you check the source and see that the figure only pertains to school abductions. Or someone references an organisation, but you check the source and realise that it is merely an opinion article, not the authoritative position of the organisation itself.
If you do not catch these errors before they are published, you risk igniting that first spark in a misinformation wildfire.