Math has publication fraud, too
Sep. 24th, 2025 03:34 pm
Scholarly publishing in mathematics is unlike many other fields, marked by fewer papers, fewer coauthors per paper and fewer citations. But that doesn’t mean the field is immune to fraud and cheating.
A pair of papers posted to the arXiv addresses the issue of fraudulent publishing in math, particularly metrics gaming, and offers a list of recommendations to help detect and deal with that problem and other fraudulent activities. (The former was also published in the October AMS Notices; the latter will appear in the November issue.) “Fraudulent publishing undermines trust in science and scientific results and therefore fuels antiscience movements,” mathematician Ilka Agricola, lead author of both papers, told Retraction Watch.
A professor of mathematics at Marburg University in Germany, Agricola was president of the German Mathematical Society in 2021-2022 and is chair of the Committee on Publishing of the International Mathematical Union. The new articles are the products of a working group of the IMU and the International Council of Industrial and Applied Mathematics.
Agricola spoke with us about the reports and about fraud in mathematics. Questions and responses have been edited for brevity and clarity.
Retraction Watch: Few people talk about fraudulent publishing in math. Why is that?
Agricola: For a long time, mathematicians thought that as long as they keep away from predatory journals or paper mills, the problem does not affect them. This turned out to be wrong.
Retraction Watch: If you look at the number of papers that tripped Clear Skies’ Papermill Alarm in 2022 (we included a histogram in this article we wrote for The Conversation), math is pretty far down the list. Are there a lot of fake papers in math?
Agricola: It is probably fair to say that the problem is not as severe as in other fields like cancer research, but the community is smaller and the number of fake papers is growing at alarming speed. Predatory and low-quality mega-journals are trying hard to lure respected scientists into their parallel universe of fake science, hoping to give themselves an air of respectability. So one of our goals is to raise awareness of the issue in the mathematical community!
Retraction Watch: As you note in the new papers, Clarivate announced in 2023 it had excluded the entire field of math from its list of “Highly Cited Researchers,” or HCRs. What’s going on?
Agricola: The publication culture in math differs a bit from, say, experimental and life sciences. On average, mathematicians publish fewer papers with fewer authors than scientists in other fields. So, with the same absolute number of papers and citations, one can become a “highly-cited researcher” in math, but not in other fields. Thus, gaming the system is easier.
The list of HCRs for mathematics became so skewed that Clarivate couldn't pretend anymore that it had any value. That said, Clarivate announced it would look into new measuring tools, but it has not come up with any alternatives in the meantime, nor has it contacted any representatives of the international mathematical community.
Retraction Watch: You note in your paper that the institution with the highest number of highly cited researchers in 2019 was China Medical University in Taiwan, which does not have a program in mathematics. How is that possible?
Agricola: Good question! Admission to the “Hall of Fame” of HCRs has a decisive advantage for the university of a scholar: It impacts their rank in the “Academic Ranking of World Universities” [ARWU, published since 2003], also nicknamed the “Shanghai ranking.” So, the institution benefits even if it does not have a math program. Of course, we agree that a topic not taught at a place ought not to be included, but that’s how the system works. Clarivate is treating its HCRs in a very kind way: They ask them about their affiliation. So, at the moment of making the count, researchers just give China Medical University (or any other institution) as their affiliation, despite not being there. The researchers typically get a contract as a “visiting professor” to hide that, in the end, they are being dishonest about their affiliation, and of course, being an HCR may give them more benefits, prestige, and grant access at their home institution as well. I was certainly surprised to learn that there is a lot of cheating with affiliations going on! Actually, many institutions do not have clear rules for primary and secondary affiliations.
Retraction Watch: If a clinical trial is fake, that’s a problem that can obviously affect life-and-death decisions. Do bogus papers and other types of publishing fraud in math have real-world consequences, too?
Agricola: Mathematics has many famous open conjectures. Predatory journals give people the opportunity to publish "proofs" of these without credible peer review. The status of these results can become unclear, and further research based on them is then a waste of effort and resources. On a larger scale, many junk papers claim to deal with applications, which can create the false impression that these results solve concrete problems when they actually don't.
Retraction Watch: You and your coauthors are mathematicians, and yet you argue against focusing on numbers like journal impact factors and publication and citation counts. Is that what’s driving all of this bad behavior?
Agricola: “When a measure becomes a target, it ceases to be a good measure.” This quote is from the British economist Charles Goodhart, and it also applies to bibliometric measures. Of course, gaming these metrics has always existed, but some of us liked to believe they were roughly OK, with some error bar due to cheating. Now we realize the error bar is larger than the quantity one wants to measure. Perhaps one advantage of mathematicians is that they are not easily impressed by numbers, and we have the means to understand and analyze them: This is our job. And so the conclusion is very clear: The correlation between bibliometrics and research quality is so low that we should not use bibliometrics. And I urge all colleagues to say so openly!
Retraction Watch: So how do we judge research quality if we shouldn’t use publication metrics?
Agricola: Read the actual publications instead of relying on bibliometrics! Plus, in mathematics, we are lucky to have two extremely well curated databases for math papers and journals, zbMath Open and MathReviews. If a journal is not included there, it’s either very interdisciplinary or one should get suspicious.
Retraction Watch: Is it possible for individual researchers to jump off the bibliometrics bandwagon without jeopardizing their careers?
Agricola: We need to fight for a change in culture, that’s for sure, and the path will be rough and hard. We should warn young researchers that involvement in predatory publishing can put their scientific integrity at risk just as much as other misconduct. Remember the people who had to resign because of data falsification?
But the situation is not hopeless. Some effective changes would be easy to implement:
- Define good publishing practices, encourage people to follow them, educate the young generation about predatory publishing.
- Discourage the use of bibliometrics in hiring and promotion committees and in graduation requirements.
- Evaluate faculty by their best papers and ‘activeness’ without pressure to publish too frequently.
- Do not publish in journals that levy article processing charges; most serious math journals don’t.
- Check the quality of journals before joining their editorial board.
- Choose the journals or special issues in which you publish and the journals for which you write reviews wisely.