Sleuths Uncover Web of Research Fraud in 400+ Papers From China

Most of the papers, though not all, were published in just six journals and after peer-review.

Elisabeth Bik, the microbiologist and research integrity consultant noted for unearthing evidence of research misconduct, tweeted on February 21 that she and some others had uncovered over 400 scientific papers “that all share a very similar title layout, graph layout, and (most importantly) the same Western blot layout” – indicating an organised web of potential fraud. She also expressed concern that “there might be hundreds of papers more”, and that she and her collaborators may just have spotted the obviously fraudulent ones.

Western blotting is a technique that biologists use to detect specific proteins in a tissue or cell sample. Because each blot is produced from real biological material, no two western blot images should look identical; close similarities suggest an image may have been manipulated, inadvertently or otherwise.

Guided by this and similar giveaways, Bik and three pseudonymous collaborators who go by the Twitter handles @SmutClyde, @mortenoxe and @TigerBB8 report – as Bik wrote in a blog post – that “the Western blot bands in all 400+ papers are all very regularly spaced and have a smooth appearance in the shape of a dumbbell or tadpole, without any of the usual smudges or stains. All bands are placed on similar looking backgrounds, suggesting they were copy-pasted from other sources or computer generated.”

Bik also notes that most of the papers, though not all, were published in only six journals: Artificial Cells, Nanomedicine, and Biotechnology; Journal of Cellular Biochemistry; Biomedicine & Pharmacotherapy; Experimental and Molecular Pathology; Journal of Cellular Physiology; and Cellular Physiology and Biochemistry – all maintained by reputed publishers and, importantly, all peer-reviewed.

As a result, the discovery of the problem papers has prompted concerns about the ability of peer-review to check research misconduct in the scientific community.

Indeed, when Bik writes, “Finding these fabricated images should not rely solely on the work of unpaid volunteers,” she evidently means herself and her collaborators – but her words also apply to peer-reviewers, who are unpaid for their work and often lack both the resources and the inclination to investigate each paper in close detail. Peer-review, then, is often not the insurmountable defence some proclaim it to be, nor are peer-reviewed journals as free of bogus science as they claim to be.

As Madhusudhan Raman, a postdoctoral scholar at the Tata Institute of Fundamental Research, Mumbai, wrote in The Wire, “… any attempt to radically change the nature of peer-review must necessarily be accompanied by a change in the way the referees are compensated for their time and effort, especially within academia.”

A PubPeer user who goes by Indigofera Tanganyikensis first identified the problem in two papers, both published by Chinese researchers. On February 17 this year, a little under a week before Bik published her blog post, two researchers – Jennifer A. Byrne and Jana Christopher – published an article discussing similar research misconduct based on 17 papers they had discovered.

According to Bik, “As it turns out, Byrne and Christopher’s publication describes the exact same set of papers that our small team of image forensics detectives had been working on in the past month.”

These sleuths, as @SmutClyde wrote on Leonid Schneider’s blog, believe they have stumbled upon at least one paper mill. To quote (selectively) from Bik’s post,

A paper mill is a shady company that produces scientific papers on demand. They sell these papers to [people] who do not have any time in their educational program to actually do research. Authorships on ready-to-submit or already-accepted papers are sold to medical students for hefty amounts. … Whether or not the experiments described in these papers has actually been performed is not clear. Some of these paper mills might have laboratories producing actual images or results, but such images might be sold to multiple authors to represent different experiments. Thus, the data included in these papers is often falsified or fabricated.

The mills seem to have been hired by Chinese clinicians affiliated to various medical colleges and hospitals in China (234 of the 400+ papers have been authored by people affiliated to institutions in Shandong province). The papers were all published between 2016 and 2020. @SmutClyde wrote that after they publicised their findings, including the dataset of papers they had identified as potentially fraudulent (and which they continue to update), an author of one of the papers wrote in:

Being as low as grains of dust of the world, countless junior doctors, including those younger [than] me, look down upon the act of faking papers. But the system in China is just like that, you can’t really fight against it. Without papers, you don’t get promotion; without a promotion, you can hardly feed your family. I also want to have some time to do scientific research, but it’s impossible. During the day, I have outpatient surgeries; after work, I have to take care of my kids. I have only a little bit time to myself after 10 pm, but this is far from being enough because scientific research demands big trunks of time. The current environment in China is like that.

Considering how peer-review at all of these journals has failed, what the detectives have found effectively represents a large volume of unscientific data entering the scientific literature, funnelled predominantly by Chinese researchers who probably hired a paper mill to help meet the publishing requirements set by their respective institutions. Bik wrote that “it is of great concern to see that this specific paper mill has successfully ‘infected’ particular journals” and that “it is very alarming to see that journal editors do not appear to have noticed the similarities between dozens of papers published in their journals.”

This said, the note from the unnamed Chinese author indicates the source of the problem is hardly new or even confined to China.

For example, until Prakash Javadekar, then the Union human resource development minister, announced in mid-2017 that college teachers would no longer be required to undertake research to qualify for promotions, people who had never been trained for research, and who worked in environments ill-equipped to support it, were forced to conduct research and publish papers.

“Javadekar is to be loudly applauded and congratulated for taking this measure,” Pushkar wrote for The Wire at the time. “The research requirement in the [Academic Performance Indicators] for college teachers was a travesty. All that it achieved was a proliferation of fake journals for college teachers to publish in.”

Indeed, India has come to be known as the fake-journals capital of the world, partly as a result of requiring people who cannot undertake research to undertake research, and partly because research productivity has become one of the core measures for determining whether a country is a “scientific superpower”.

For another example, the journal Nature reported that “Pakistan’s research output increased the most among all countries in the world – by 21%” in 2018, a feat that it dubbed a “phenomenal success”. However, as Anjum Altaf, former provost of Karachi’s Habib University and a famous teacher, subsequently told The Wire, “The volume of third-rate publications in Pakistan has increased greatly simply because [Pakistan’s Higher Education Commission] introduced a tenure-track system and required publications for promotion.”

Mental Health in Academia: What About Faculty Members?

The hypercompetitive academic culture has ways of always keeping you on your toes.

The life of a professor is a constant balancing act where we try to juggle personal and professional responsibilities under the pervasive stress of managing expectations in an often hypercompetitive culture. There is always a fear that we may drop the ball, a sense that if that were to happen, we would be alone and the only one to blame. The system assumes that we should be old enough, experienced enough, and tough enough to withstand all the pressure that comes with the job. Being a faculty member in a university can be one of the most fulfilling career paths, but it has also become one of the most stressful jobs.

The storms of academia

As young scientists taking on a faculty position, we quickly transition from being a team member to a team leader; from never worrying about securing funding to being overwhelmed with grant deadlines; from managing a single project to planning and guiding the work and careers of several students and post-docs; from worrying about ourselves to being absorbed in worrying about everything except our wellness. The great majority of us have never developed a course or taught classes on our own, yet we are all expected to assume these responsibilities.

Many universities give good support when it comes to teaching, yet most offer very little training or help in project and team management, leadership, mentoring and conflict resolution, let alone mental health awareness and intervention. We are expected to learn everything on the job. In other words, we learn by making mistakes that we – and to some extent our students and staff – directly or indirectly end up paying for. Driven by our passion for science, we keep trying and do get better at it, but very rarely pause to assess: “At what cost?”

As the tenure clock starts ticking, stress and anxiety often begin to increase; the stakes become higher, and many begin to struggle with the ambiguity of the tenure criteria and the lack of feedback. The pressure mounts to publish papers in ‘high impact journals’, to secure prestigious grants, go on lecture tours, and fill in all the blanks in our CV. Frustration, disappointment, self-doubt or burnout are all too common throughout this journey.

Also read: What Is it About Working in STEM Labs That Increases Anxiety, Depression Risk?

Even after tenure, the pressure often does not go away. Instead, we simply transition from one type of stress to another: from being anxious about publishing and securing tenure to being worried about funding, deadlines, increased administrative duties, the pressure to secure more prestigious grants and awards, and concerns for our reputation. The hypercompetitive academic culture has ways of always keeping you on your toes. For senior faculty, the definition of success is a constantly moving target that is shaped and reshaped by the achievements of our peers and the expectations of our superiors. Once we achieve tenure, success becomes much harder to define, especially because the more senior we get, the less feedback we receive. What remains constant, however, is the lack of acknowledgement that faculty may be struggling, and the absence of formal or informal support from our institutions.

We constantly preach that failures provide unique learning opportunities and are the stepping stones for success. But when it comes to our own careers, many of us may feel that failure is never an option or something that we are willing to accept, admit or share. Academia is also not the place where we are likely to get second chances. Our peers may view our failure less as a potential learning experience and more as a sign that we are not fit for academia. In a culture of perfectionism and nearly constant peer pressure, the lines between disappointment and failure become very blurry.

Most of us quickly learn that we must project an image of always being in control and unshaken by all the storms of academia. We feel the need to ‘fake it’ until (hopefully) we make it. In reality we, like our students, frequently experience stress, fear and insecurity as well as anxiety, depression and burnout. As faculty, many believe that admitting we are stressed or going through a mental health crisis would be a mistake; that if we do, no one will see us the same way, and that it may compromise our relationships with our students, our colleagues and our superiors. In the absence of a collegial and supportive culture, and with many professors spending most of their time in their offices, surrounded only by computers, a faculty position can be emotionally, mentally and physically draining. It should not be this way, and no one should suffer alone.

Too high a price to pay

Often, we do not realise until it is too late that poor work-life balance and pretending that we are on top of everything come at a great cost to our health, wellbeing and families. Pressure, stress and anxiety frequently translate into sleep deprivation, exhaustion, irritability and isolation, all of which negatively affect our quality of life and our interactions with students and colleagues. Chronic stress is also a major risk factor for developing many psychiatric and cardiovascular disorders: I have come to learn this first-hand after suffering two heart attacks during the past three years.

We have to equip new postdocs and faculty with the necessary resources and training to manage their new responsibilities, navigate the pre-tenure period and handle their mental health challenges. I am happy that my university – the Ecole Polytechnique Fédérale de Lausanne – has started to recognise and prioritise more professional training and support for faculty, but more work is still needed. As a community, we need to pause, reflect and work together to systematically assess why faculty, but also students and staff in our universities, experience so much stress and so many mental health problems. This, and normalising the conversation about mental health, are crucial steps towards tackling the mental health crisis on our campuses.

While researching the topic, writing and reflecting on my experiences, I realised that there are many things that I could have done differently; I wish I’d had the courage to admit that I was not a superhuman, to seek help to handle my mental health and the daily struggles to achieve work-life balance. From how I managed my time, job expectations and my wellbeing to how I failed at times to fulfil my responsibilities towards my family, I learnt there is no such thing as suppressing your feelings or hiding your struggles. If you do not deal with them, they linger in your head, mess up your sleep pattern, impact your health, and affect the lives of people around you. I regret not taking the time and initiative to seek the assistance of experts and to follow structured training programs that could have helped me with my new responsibilities and to manage my mental health and support my students and colleagues.

Also read: In India, IITs’ Decision to Use Tenure to Improve Research Could Backfire

But it is never too late: this is exactly what I am doing now. I have come to terms with the fact that balancing my life with my work means saying “no” more, traveling less, prioritising my health and family, and giving up on trying to please everyone or do everything that can be done. To make more time for myself, my family, my research and my students and group members, I have decided to start by reducing the size of my team, by giving up certain grants and by further consolidating our research programs. I am now more comfortable opening up and discussing my own difficulties with team members and colleagues. I am also more sensitive to the struggles of those around me, and having them share their mental health challenges has been equally therapeutic. At the community level, I plan to organise a series of lectures and activities in 2020 to help normalise the conversation about mental health on campus. I would like to advocate for joint community-based initiatives that create an environment where people struggling with stress and mental health issues feel supported and do not fear being excluded.

I hope this article will help others in academia feel comfortable speaking out about their struggles and mental health challenges, and sharing their thoughts, feelings and experiences. As faculty, we cannot take care of our students if we do not learn how to take care of ourselves.

Hilal A. Lashuel is an associate professor and director of the Laboratory of Molecular and Chemical Biology of Neurodegeneration in the Brain Mind Institute at the Ecole Polytechnique Fédérale de Lausanne.

This article was originally published by eLife and has been republished here under a Creative Commons Attribution license.

Let’s Discuss the ‘Reward for Publication’ Idea Instead of Trashing It so Fast

Like any policy, there are bound to be shortcomings in this one as well. However, it might be more worthwhile to consider ways of fixing them instead of throwing the baby out with the bathwater.

K. VijayRaghavan, the principal scientific advisor, recently announced that the government was considering monetary benefits to incentivise research scholars to have their studies published and ideas patented. This is a welcome step.

Such an incentive could increase the number of papers published in good journals as well as contribute to the overall development of a country’s scientific community.

Other scientists have argued that such incentives could also promote unethical research practices. However, it is important to understand that one can’t simply choose to publish more papers in (legitimate, peer-reviewed) journals. We still have to get past our supervisors and peer-reviewers, and neither group has anything to do with this incentive.

Also read: With Scholars Set to Restart Protests, a Peek Inside the Research Funds Gridlock

Additionally, research misconduct is already prevalent in India because of the ‘publish or perish’ mentality, among other reasons. Therefore, it would be better to set up ways to scrutinise and eliminate malpractice instead of abolishing incentives that could improve the situation.

Another point of contention is the difference in amounts being considered for papers in Indian and international journals: Rs 20,000 for the former and Rs 50,000 for the latter. It may not be proper to determine the value of a journal based on its impact factor. However, the difference in grant amounts only reflects the relative importance of international journals among scientists and scientific institutions.

Policies like this are typically devised to improve research output in a field. It is the individual who bears the moral responsibility to follow ethical practices.

It’s important to keep the broader context in which research is undertaken in mind. It is true that India as a whole lags far behind other ‘powers’ in its league, especially in its capacity for productive research and innovation. Part of the reason for this is that researchers, and their needs, have often been neglected despite the transformative potential of their work.

A lackadaisical administration, poor facilities and flawed career advancement schemes have prevented, and continue to prevent, school-goers from considering careers in research.

Also read: To Reinvent Peer Review, We Must Reinvent How We Pay Peer-Reviewers Back

The education system also leaves much to be desired. In many countries, postdoctoral fellows from around the world contribute greatly to research and development. However, research in India relies mostly on PhD scholars. And many of these scholars are not well-trained at the masters’, or even at the undergraduate, level.

Additionally, the quality of mentorship and infrastructure is such that only those students inherently motivated to do good science do good science. The others don’t, and a PhD by itself doesn’t offer any incentives to do so.

Some universities require a student to publish at least one paper in order to receive her PhD. However, the average length of a PhD programme is five years, more than enough time in which to fulfil this requirement.

In these conditions – and not in the conditions where labs are well-staffed, well-equipped and well-paid – monetary incentives based on quality of publication will push students to do more and better work. This will simply be the result of marrying research with the advantages of a historically more lucrative field, like engineering.

Also read: Impact Factors Fail in Evaluating Scientists. Why Does the UGC Still Use Them?

Such a scheme could also encourage scholars to be more competitive, inculcate sustainable habits to produce more, and aspire to elevate the quality of the labs to which they belong. A rewards scheme for patents is also welcome for the same reasons.

Like any policy, there are bound to be shortcomings in this one as well. However, it might be more worthwhile to consider ways of fixing them instead of throwing the baby out with the bathwater.

It isn’t clear at the moment if the Ministry of Human Resource Development and/or the Department of Science and Technology plan to extend these incentives to pre-PhD students as well. They should.

Sudhakar Srivastava received his PhD from the CSIR-National Botanical Research Institute in 2009 and worked for four years as a postdoc at Ben-Gurion University, Beersheba. He is currently a postdoc at the Beijing Forestry University.

Scientists in the Lurch After Imprecise MHRD Notice About ‘Paid Journals’

A “completely confusing statement” in a gazette notification has scientists wondering which of their papers will and won’t be considered towards their promotions in the future.

Which paper will take me up, which paper won’t? Credit: joi/Flickr, CC BY 2.0

New Delhi: A notification from the ministry of human resource development to the National Institutes of Technology (NITs) in July this year has stated that papers published in journals that levy an article processing charge will not earn career advancement credits for their authors.

The notification has been directed towards quelling the menace of predatory journals, which will publish anything in exchange for an often substantial fee. However, the MHRD move has also sparked consternation among some scientists, who believe that there are legitimate journals that also charge an article processing charge, or APC, and that punishing scientists who publish in such journals would not be fair.

Rise of predatory journals

Scientists routinely publish the papers that they write in scientific journals. The journals provide two services in return. They subject the papers that they receive to a peer review, where the manuscripts are vetted by a group of experts on the same topic for their novelty and validity (among other things). And once a paper has cleared peer review, the journal publishes it to create a public record of it as well as to publicise it. Conventionally, such journals have covered the costs of peer review and printing (and reap great profits) by charging readers an access fee.

“Journals are published by commercial entities, academies, societies and government bodies. Most non-open-access journals are published by commercial entities, with a few corporations dominating,” said Subbiah Arunachalam, a scientist and activist. “Their journals are priced high and several of them make big profits. Not all the journals they publish are of high quality though.”

One of the major modern scientific publishing paradigms, broadly collected under the label ‘open access’ (OA), flipped this convention. OA journals make their money by charging scientists an APC and then make the published paper available to access freely.

The OA movement has been perceived as a form of social justice because the financial burden of publishing is moved towards educational institutions and universities, which have higher spending power, and away from individual consumers, who often can’t afford the access fee by themselves. The movement also gives scientists in countries with lower spending power the ability to easily access papers published by scientists in richer countries.


Also read: Why open access has to look up for academic publishing to look up


However, some journals – many of them published from India – call themselves ‘OA journals’, charge an APC from scientists but don’t bother with the quality, originality or validity of what they’re publishing. Collectively called predatory journals, their rise has been fuelled by the ‘publish or perish’ environment in modern academia, where scholars can’t rise to the top unless they publish great papers frequently. In India, the problem has been exacerbated by a set of guidelines issued by the University Grants Commission last year. And when the institutions they belong to simply want to be able to claim their faculty members publish a lot of papers, the result is that predatory journals can charge a high APC, still expect scholars to want to publish mediocre research with them, and get away with it.

Simple statement, many consequences

The MHRD gazette notification, issued on July 21, 2017, lists amendments to the first statutes of the NITs. One amendment includes updates to the credit point system used by institutional administrators to determine if a faculty member qualifies for a promotion. For example, an assistant professor with 20 credit points can qualify to become an associate professor if she accrues 50 credit points. There are many ways to earn credits. One is to publish papers: according to the notification, each paper published that is indexed in the Science Citation Index or the Scopus database earns four points.
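
As a minimal illustration of the arithmetic described above – a sketch only, in which the four-points-per-indexed-paper figure comes from the notification while the function, its name and the example values are hypothetical:

```r
# Illustrative sketch of the per-paper credit arithmetic quoted above.
# Only the "4 points per SCI/Scopus-indexed paper" figure comes from the
# notification; the function and the example numbers are hypothetical.
points_from_indexed_papers <- function(n_papers) 4 * n_papers

points_from_indexed_papers(5)   # 20 credit points
points_from_indexed_papers(13)  # 52 -- past the 50-point figure in the example
```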

However, there is a catch: “Paid journals not allowed” (note 2, s. no. 4, p. 15).

This is a reference to the ‘pay to publish’ modus operandi of predatory journals, but it happens to include some legitimate OA journals as well as non-OA journals that charge a fee to make a paper openly accessible. Subhash C. Lakhotia, a cytogeneticist and senior scientist at the Banaras Hindu University, Varanasi, pointed out that the latter is a practice often carried out by ‘conventional’ journals like Nature and some titles published by Elsevier. As a result, “papers in all these journals become discounted – which is absolutely ridiculous.”

R. Subrahmanyam, the additional secretary for technical education at the MHRD, explained the notification to The Wire: “Non-consideration of publications in ‘paid journals’ for career advancement is a standard practice in IITs and other premium institutions, not only NITs.” He added that he wasn’t “aware of any ‘standard’ journals which take money for publication of a high-quality article”.


Also read: At least 82 in UGC’s list of ‘preferred’ journals can be classified as ‘predatory’


However, if the MHRD’s intention was to preclude authors publishing in predatory journals alone, some have argued that so terse a rider is careless. Lakhotia called it a “completely confusing statement”. But Subrahmanyam defended it: “The fear is that once you open this window, there is NO stopping of predatory journals. But [we] can get this contention examined by some experts” (emphasis in the original).

Peush Sahni, the editor of the National Medical Journal of India, thinks the sweeping nature of the ‘paid journals’ line does not bode well. In an editorial he coauthored in 2016, he had pointed out a similar issue with a notification of the Medical Council of India the year before. It had stated that if scientists at India’s medical colleges and institutions published papers in ‘e-journals’, those papers would not be considered when the scientists were up for promotions. However, this is unfair because excluding ‘e-journals’ would exclude “many high-quality journals that are published only in the electronic format”.

K. VijayRaghavan, secretary of the Union government’s Department of Biotechnology (DBT), acknowledged Sahni’s contention: “There are many excellent journals that are not predatory and charge, but these usually also waive when you declare an inability to pay.” Lakhotia disagreed, however, saying that many OA journals that used to waive their APCs earlier now refuse to and ask for a minimum fee.

Money put to better use

On the other hand, some scholars and activists have said that this is a welcome move. Arunachalam, a member of the team that drafted the OA policy adopted by the Department of Science & Technology (DST) and DBT, told The Wire that he would rather scientists did not pay APCs to have their papers published – even in legitimate OA journals – “especially if the money for paying comes from the taxpayers”.

According to a paper in Current Science last year that Arunachalam coauthored, Indian scientists between 2011 and 2014 published 14,293 papers in OA journals that charged a fixed APC. The average APC per paper was $1,173 (Rs 75,900). A study published in 2014 had estimated that the global average per paper was $964 (Rs 62,400). Without APCs in the picture, Arunachalam and his coauthors estimated that Indian institutions could save about Rs 16 crore every year.

“The money saved could be used for other research-related activities – experiments, fellowships to students, facilitating student participation in conferences, hosting overseas collaborators, etc,” he told The Wire.

The DST-DBT OA policy requires full copies of all research funded either in part or in full by public money to be deposited in institutional repositories, or in a centralised national repository, that is free to access. “What I am suggesting and what the DST-DBT policy demands is for authors to deposit the post print – the final draft after peer-review and acceptance by a journal” in the repositories, Arunachalam said, and for the NITs and IITs to consider those copies for the credits system instead of rejecting “paid journals” en masse.

So the question remains as to how anyone expects legitimate OA journals that don’t charge an APC to recoup money spent on arranging for peer review, etc. As Joseph Esposito, a management consultant and an expert on the topic, has written, “Whatever the benefits of OA, reduced costs are not among them”.

Some Indian journals that are both OA and don’t charge an APC are published by the Current Science Association, the Indian Academy of Sciences, the Indian Council of Agricultural Research and the All India Institute of Medical Sciences, among others.

However, there is a marked preference among scientists around the world – and within India – for publishing in journals with a high impact factor (IF). The IF is a measure of how important a journal is; it is calculated as the average number of citations received in a given year by the papers the journal published in the preceding two years.
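
For reference, the most widely used form of the metric – the two-year journal impact factor published in the Journal Citation Reports – is conventionally defined, for a year Y, as:

$$\mathrm{IF}_Y = \frac{\text{citations received in year } Y \text{ by items the journal published in years } Y-1 \text{ and } Y-2}{\text{number of citable items the journal published in years } Y-1 \text{ and } Y-2}$$

An IF of 10 for year Y therefore says only that the journal’s papers from the two preceding years were cited ten times each on average during Y; it says nothing about how often any individual paper was cited, which is part of why judging scientists by the IF of the journals they publish in is problematic.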

Of late there has been some awareness of the harmful effects of measuring a scientist’s ‘worth’ based solely on the number of papers they’ve published in high-IF journals, but such awareness is not widely shared. For example, getting one’s paper published in the UK-based  Nature is still considered by many to be more ‘prestigious’ than when published in, say, India’s own Current Science – if only because Nature has a larger and more influential readership. Over time, this perception leads to a self-fulfilling prophecy.

Quantitative metrics to measure quality

Predatory journals have made this notion harder to defeat by misusing the ‘OA’ label, but they aren’t the only problem. Some publishers levy an APC and also continue to charge for subscriptions, a practice known as double-dipping. “This model is unethical in the first place and unaffordable to developing and emerging countries,” Arunachalam said. “When there are better and cost-effective ways to make one’s results known to other researchers, I don’t see any reason why we should choose the APC route.”


Also read: Research fraud of one kind or another will continue to take place in India


However, are scientists entirely free to choose to publish their papers in such journals without any impact on their careers? A faithful adherence to the DST-DBT OA policy would say the answer is ‘yes’. But some scientists this correspondent spoke to say the answer is really ‘no’. “It’s a common problem in India, both officially and unofficially,” according to Lakhotia.

“As the situation with regard to evaluation of scientists stands now, in India, I do not feel that scientists can afford to not take journal reputations into cognisance while choosing where to publish,” said Amitabh Joshi, a professor of evolutionary and organismal biology at the Jawaharlal Nehru Centre for Advanced Scientific Research, Bengaluru. “Many of us do not want to spend the time and energy involved in reading and assessing the worth of a piece of work. So noting that it is published in a ‘big’ journal and assuming that ‘therefore it must be good science’ becomes a shortcut to proper scientific assessment. We cover this up by claiming that this is more ‘objective’. It is however also a fact that the freedom to exercise ‘subjective’ judgement can be misused for nepotistic or other unethical purposes. Ultimately quality can’t be judged via quantitative metrics.”

(Joshi clarified that his opinions are his personal views as a practicing scientist and are not necessarily shared by institutions with which he is affiliated in various capacities.)

Ultimately, the way out seems to be to have the MHRD clarify the notification such that it exempts journals that are known to be legitimate. One common way to identify them has been to check if they are listed in certain journal databases, like Scopus or Web of Science. But even should this happen, it will be a stopgap measure.

The real issue is that there appears to be discontentment simmering among OA activists who want public money to be spent on research-related activities instead of being given to journals in the form of an APC. On the other hand, scientists feel that they should not be penalised for what often is, at the end of the day, good science.

According to Lakhotia, the gazette notification to the NITs is a result of “bureaucrats making decisions without consulting scientists and the science academies”. He said that the MHRD has to stop going by the numbers – a request of Joshi’s also – and instead take qualitative measures to judge scientific research: “The MHRD should guide scientists on what is good science. Indian science is often of high quality but when the name of the journal matters and the IF matters, etc. – that is what is killing Indian science.”

At Least 82 in UGC’s List of ‘Preferred’ Journals Can Be Classified As ‘Predatory’

If the UGC was truly serious about a transparent vetting process, its first step should be to make the list of journals available in a more accessible format.

These predatory journals, and those who publish in them, tend to be from Asia and Africa. Credit: alexwatkins/Flickr, CC BY 2.0

The phrase ‘publish or perish’ isn’t meant to fill you with a warm glow. It’s meant to describe a cut-throat academic world in which professors have two options. Option A is that they write research papers, rack up citations, attend conferences, serve on the editorial board of journals and generally tack on more and more pages to their resume. Though it’s hard to believe, Option B is even less fun. Option B is the drying up of research funds, a lack of promotions or (more likely) an inability to find a job in the first place. This system was meant to incentivise good research but, in a world where good research takes time and teaching workloads aren’t getting lighter, it has instead heralded the rise of the for-profit academic publishing industry, exemplified by the predatory journals.

Predatory journals look like genuine scholarly publications but, behind the façade of respectability, there tends to be a litany of false claims and a bill for the joy of seeing your paper published. They are, in a way, unintended beneficiaries of the open access (OA) revolution. The shift to OA is predicated on a change in revenue model: instead of charging readers for access, the journal makes all its papers openly available and sustains itself by collecting an article processing charge (APC) from authors at the time of submission. This charge is meant to cover the costs of peer review, maintenance of the website, etc. Predatory journals charge an APC without doing any of the work that merits the cost.

These predatory journals, and those who publish in them, tend to be from Asia and Africa. One study found the average publishing fee to be USD 178 (Rs 12,000). The same study found that, in 2014, an estimated 420,000 articles were published by around 8,000 journals. In September 2016, the US Federal Trade Commission sued the Hyderabad-based publisher OMICS Group for, as Ivan Oransky put it, “bilking researchers out of potentially millions of dollars”. While the majority of these papers are from private colleges, premier national institutions aren’t exempt. A recent Current Science analysis of 3,300 papers found 11% of them were from institutes affiliated to the Indian Council of Agricultural Research, the CSIR labs, the National Institutes of Technology, IITs, etc.

In an effort to combat this pernicious trend, the UGC announced that it would convene a committee to prepare a list of recommended, genuine journals. Only papers published in these journals would count towards an academic’s score in the Academic Performance Indicators (API) system, which in turn forms the basis of the Career Advancement Scheme (CAS) and the direct recruitment of teachers and other academic staff as per the University Grants Commission (Minimum Qualifications for Appointment of Teachers and other Academic Staff in Universities and Colleges and Measures for the Maintenance of Standards in Higher Education) Regulations, 2010. Prior to the amendment in July 2016, universities decided for themselves which journals would count and which wouldn’t.

Last week, the UGC published its list of recommended journals. They numbered a staggering 38,653 across all disciplines. A quick analysis of the list showed at least 35 journals that might be classified as predatory. (The full list is available at the end of the article.)

The 35 include the Journal of Computers, which describes its aim as “the integration of computers and opens to the world”. The Journal of Natural Products is an Indian journal run by one Sudanshu Tiwari, and it coincidentally shares its name with a journal published by the American Chemical Society. Another on the list, the World Applied Sciences Journal, accepts articles “in biological sciences, biodiversity, biotechnology, clinical sciences, animal and veterinary sciences, agricultural sciences, chemistry, environmental sciences, physics, mathematics and statistics, geology, engineering, computer science, social sciences and information technology”, and asks submitters to recommend reviewers in their field.

To its credit, the UGC has declared that the list is dynamic and subject to change as new evidence is made available. But if it were truly serious about a transparent vetting process, its first step should be to make the list of journals available in a more accessible format rather than as a 200-page PDF that is not searchable. It is unclear right now how university administrators could even cross-check a journal with the list. Without a simpler mechanism, the process looks too cumbersome to implement immediately.

Ideally, the entire system of output metrics should be massively overhauled. As K. VijayRaghavan, the secretary of the Department of Biotechnology, told the magazine Science, “The fundamental problem is an ecosystem that values where you publish and how many papers you publish rather than what you publish. That needs to be changed.”

List of journals in UGC list that also appear in Jeffrey Beall’s list* of predatory journals 

  1. Actual Problems Of Economics
  2. Aging
  3. Australasian Medical Journal
  4. Cellular And Molecular Biology
  5. Der Pharma Chemica
  6. European Journal Of Science And Theology
  7. European Journal Of Social Sciences
  8. Genetics And Molecular Research
  9. Global Media Journal
  10. Interdisciplinary Toxicology
  11. International Archives Of Medicine
  12. International Journal Of Environment
  13. International Journal Of Health Research
  14. International Journal Of Network Security
  15. International Journal Of Nursing
  16. International Journal Of Pharmacognosy
  17. International Journal Of Pharmacy
  18. International Journal Of Pharmacy And Technology
  19. Journal Of Clinical And Analytical Medicine
  20. Journal Of Computers
  21. Journal Of Electrical Engineering
  22. Journal Of Environmental Biology
  23. Journal Of Environmental Hydrology
  24. Journal Of Internet Banking And Commerce
  25. Journal Of Language And Literature
  26. Journal Of Natural Products
  27. Journal Of Pharmacy Research
  28. Journal Of Psychology And Theology
  29. Journal Of Software
  30. Oncoscience
  31. Romanian Biotechnological Letters
  32. Scholarly Research Exchange
  33. Shiraz E Medical Journal §
  34. Sport Science
  35. World Applied Sciences Journal

* Author’s note on data and the methods of analysis

This analysis comes with a few caveats. The UGC list was published in the form of five PDFs of scanned documents that had been run through an optical character recognition (OCR) software. This meant that the extraction of data from the PDF was far from perfect. In fact, I would estimate my analysis covered less than 50% of the journals listed and so my finding could be conservative. I then looked up the UGC’s list of journals in Jeffrey Beall’s list of predatory journals to see if there was a match.

This was performed using an R script; you can find the code for that here. Jeffrey Beall is a librarian at the University of Colorado, Denver. He maintained the list from 2008 till earlier this month, when his site mysteriously shut down. Retraction Watch, which broke the news, says the list might be maintained by Cabell’s, a publishing company, in the future. The analysis relies entirely on the archived version of his list. I have taken no steps to double-check his work.
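
For readers who want to try a similar check themselves, here is a minimal sketch of the matching step in R. It is not the script linked above: the file names and the ‘title’ column are assumptions, and it performs only exact matching after basic normalisation, whereas OCR errors in the scanned UGC PDFs would in practice call for fuzzier comparison and manual verification.

```r
# Minimal sketch of the matching step, not the original analysis script.
# Assumes two CSV files, each with a 'title' column: one holding journal titles
# extracted from the UGC list, the other titles from the archived Beall's list.
# File and column names are placeholders.
ugc   <- read.csv("ugc_journals.csv", stringsAsFactors = FALSE)
beall <- read.csv("beall_journals.csv", stringsAsFactors = FALSE)

# Normalise titles -- lower-case, strip punctuation, collapse whitespace --
# so that trivial formatting or OCR differences don't block a match.
normalise <- function(x) {
  x <- tolower(x)
  x <- gsub("[[:punct:]]", "", x)
  gsub("[[:space:]]+", " ", trimws(x))
}

overlap <- intersect(normalise(ugc$title), normalise(beall$title))
length(overlap)  # how many UGC-listed journals also appear on Beall's list
overlap          # the overlapping titles themselves
```

Because exact matching misses journals whose names differ even slightly between the two lists, any count produced this way is a lower bound, consistent with the caveat above that the finding could be conservative.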


§ Since the publication of this article, the editorial office of Shiraz E Medical Journal has written to The Wire to insist their journal is legitimate:

“Please note that our journal has ISSN (1735-1391) and was accredited by the Iranian Commission of Medical Sciences Publications at the Ministry of Health and Medical Education and also was indexed in many famous databases including Scopus. Furthermore, the journal is a member of the Committee on Publication Ethics (COPE).”

I, as the EIC, was prime-minister of Health and Medical Education in Iran (from 2008-2012) and I, personally, am so sensitive about publication ethics and any suspicious malpractice in the journal. In the below link, you can find the ethics rules and regulations:

http://emedicalj.com/?page=public_pages&name=publication_ethics

http://emedicalj.com/?page=public_pages&name=instruction_for_authors

About “Jeffrey Beall’s List”: As you already were aware, this list is removed by Jeffrey and we believe that this list was not a documented or evidence-based list. We also opened a claim against Beall in a court and we hope be ending this event, you also reconsider any conclusion raised from that illegal list. Hope, you understood our hard situation and help us to talk better and work better in the field of journalism. We, deeply hate all fake journals and predatory journals.

Prof. K.B Lankarani
Editor in Chief


Update: On April 14, 2017, the UGC published a text version of its list. An independent analysis by one of our readers, Saket Choudhary, a PhD student at the University of Southern California, compared the new version with Beall’s list and found matches for 82 journals. The full list is available to view here.

Thomas Manuel is the winner of The Hindu Playwright Award 2016.