Sleuths Uncover Web of Research Fraud in 400+ Papers From China

Most of the papers, though not all, were published in just six journals, all after peer review.

Elisabeth Bik, the microbiologist and research integrity consultant noted for unearthing evidence of research misconduct, tweeted on February 21 that she and some others had uncovered over 400 scientific papers “that all share a very similar title layout, graph layout, and (most importantly) the same Western blot layout” – indicating an organised web of potential fraud. She also expressed concern that “there might be hundreds of papers more”, and that she and her collaborators may just have spotted the obviously fraudulent ones.

Western blotting is a technique that molecular biologists employ to identify the proteins present in a tissue sample. Because it is an analytical technique involving real-world materials, no two images of western blots are supposed to look alike; similarities suggest an image may have been manipulated, inadvertently or otherwise.

Guided by this and similar giveaways, Bik, @SmutClyde, @mortenoxe and @TigerBB8 (all Twitter handles of unidentified persons) reported – as Bik wrote in a blog post – that “the Western blot bands in all 400+ papers are all very regularly spaced and have a smooth appearance in the shape of a dumbbell or tadpole, without any of the usual smudges or stains. All bands are placed on similar looking backgrounds, suggesting they were copy-pasted from other sources or computer generated.”

Bik also notes that most of the papers, though not all, were published in only six journals: Artificial Cells, Nanomedicine and Biotechnology; Journal of Cellular Biochemistry; Biomedicine & Pharmacotherapy; Experimental and Molecular Pathology; Journal of Cellular Physiology; and Cellular Physiology and Biochemistry – all maintained by reputed publishers and, importantly, all of them peer-reviewed.

The discovery of the problem papers has thus prompted concerns about the ability of peer review to check research misconduct in the scientific community.

Indeed, when Bik writes, “Finding these fabricated images should not rely solely on the work of unpaid volunteers,” she evidently means herself and her collaborators – but her words also apply to peer-reviewers, who are unpaid for their work and often lack both the resources and the inclination to investigate each paper in close detail. As a result, peer review is often not the insurmountable defence some proclaim it to be, nor are peer-reviewed journals as free of bogus science as they claim to be.

As Madhusudhan Raman, a postdoctoral scholar at the Tata Institute of Fundamental Research, Mumbai, wrote in The Wire, “… any attempt to radically change the nature of peer-review must necessarily be accompanied by a change in the way the referees are compensated for their time and effort, especially within academia.”

A PubPeer user who goes by Indigofera Tanganyikensis first identified the problem in two papers (this and this), both published by Chinese researchers. On February 17 this year, a little under a week before Bik published her blog post, two researchers – Jennifer A. Byrne and Jana Christopher – published an article discussing similar research misconduct based on 17 papers they had discovered.

According to Bik, “As it turns out, Byrne and Christopher’s publication describes the exact same set of papers that our small team of image forensics detectives had been working on in the past month.”

These sleuths, as @SmutClyde wrote on Leonid Schneider’s blog, believe they have stumbled upon at least one paper mill. To quote (selectively) from Bik’s post,

A paper mill is a shady company that produces scientific papers on demand. They sell these papers to [people] who do not have any time in their educational program to actually do research. Authorships on ready-to-submit or already-accepted papers are sold to medical students for hefty amounts. … Whether or not the experiments described in these papers has actually been performed is not clear. Some of these paper mills might have laboratories producing actual images or results, but such images might be sold to multiple authors to represent different experiments. Thus, the data included in these papers is often falsified or fabricated.

The mills seem to have been hired by clinicians affiliated to various medical colleges and hospitals in China (234 of the 400+ papers were authored by people affiliated to institutions in Shandong province). The papers were all published between 2016 and 2020. @SmutClyde wrote that after they publicised their findings, including the dataset of papers they had identified as potentially fraudulent (and which they continue to update), an author of one of the papers wrote in:

Being as low as grains of dust of the world, countless junior doctors, including those younger [than] me, look down upon the act of faking papers. But the system in China is just like that, you can’t really fight against it. Without papers, you don’t get promotion; without a promotion, you can hardly feed your family. I also want to have some time to do scientific research, but it’s impossible. During the day, I have outpatient surgeries; after work, I have to take care of my kids. I have only a little bit time to myself after 10 pm, but this is far from being enough because scientific research demands big trunks of time. The current environment in China is like that.

Considering how peer review at all of these journals has failed, what the detectives have found effectively represents a large volume of unscientific data entering the scientific literature, funnelled predominantly by Chinese researchers who probably hired a paper mill to help meet the publishing requirements set by their respective institutions. Bik wrote that “it is of great concern to see that this specific paper mill has successfully ‘infected’ particular journals” and that “it is very alarming to see that journal editors do not appear to have noticed the similarities between dozens of papers published in their journals.”

This said, the note from the unnamed Chinese author indicates that the source of the problem is neither new nor confined to China.

For example, until Prakash Javadekar, then the Union human resource development minister, announced in mid-2017 that college teachers would no longer be required to undertake research to qualify for promotions, people who had not trained for research, and who were embedded in environments not properly equipped to support it, were forced to conduct research and publish papers.

“Javadekar is to be loudly applauded and congratulated for taking this measure,” Pushkar wrote for The Wire at the time. “The research requirement in the [Academic Performance Indicators] for college teachers was a travesty. All that it achieved was a proliferation of fake journals for college teachers to publish in.”

Indeed, India has come to be known as the fake-journals capital of the world, partly as a result of requiring people who cannot undertake research to undertake it, and partly because research productivity has become one of the core measures of determining whether a country is a “scientific superpower”.

For another example, the journal Nature reported that “Pakistan’s research output increased the most among all countries in the world – by 21%” in 2018, a feat that it dubbed a “phenomenal success”. However, as Anjum Altaf, former provost of Karachi’s Habib University and a famous teacher, subsequently told The Wire, “The volume of third-rate publications in Pakistan has increased greatly simply because [Pakistan’s Higher Education Commission] introduced a tenure-track system and required publications for promotion.”

Research Fraud Allegations Loom Over Star Chinese Scientist

Just last week, Professor Xuetao Cao addressed a crowd of thousands at the Great Hall of the People in Beijing on the topic ‘Research Integrity’.

New Delhi: The former head of the Chinese Academy of Medical Sciences and current president of Nankai University in Tianjin, Professor Xuetao Cao, is under fire over claims that he falsified data in several research papers.

The alleged data duplication was first identified by microbiologist and data integrity expert Elisabeth Bik, who flagged manipulated images on PubPeer – a platform for post-publication review of scientific research – from papers by a “big-name professor who is a Chinese Academician and president of a top tier Chinese university”.

Academician and immunologist Cao Xuetao is a fellow of the German Academy of Sciences, the French Academy of Medicine, the US National Academy of Medicine and the American Academy of Arts and Sciences, and the recipient of a lifetime achievement award for mentoring in science from the journal Nature.


The allegations are bound to be doubly discomforting for the Chinese scientific community as he has held the distinction of being the youngest medical professor in China, the youngest member of the Chinese Academy of Engineering and the youngest general in the Chinese armed forces (at different times). Last week, he addressed a crowd of thousands at the Great Hall of the People in Beijing on the topic ‘research integrity’.

In a Twitter post, Bik also said she was “not accusing anyone of misconduct” and that “these duplications might just be honest errors”. In 2014, she had reported a fraudulent paper from Cao’s lab to the journal that published it; after a small correction was issued in 2015, Bik decided to reexamine more of Cao’s published work.

However, after allegations of data manipulation went viral on social media in China, Cao told China Newsweek he would investigate the claims. According to a post on science journalist Leonid Schneider’s blog, Cao also responded to Bik on PubPeer saying he appreciated her “commitment to protecting the accuracy of scientific records and the integrity of research pursuits” and that he would “work with the relevant journal editorial office(s) immediately if our investigation indicates any risk to the highest degree of accuracy of the published records”.

CSIR Says It’s Probing Scientific Misconduct Allegations, Drafting New Guidelines

Papers published by multiple CSIR labs were allegedly using manipulated or duplicated images.

New Delhi: Director-general of the Council of Scientific and Industrial Research (CSIR), Shekhar C. Mande, renewed the conversation around research misconduct at laboratories associated with the organisation with a tweet on Monday.

Mande tweeted a statement from CSIR which made three points (paraphrased from the statement):

  1. The specific allegations made in the media are being probed, and the committee is in the final stages of the investigation. It will then submit its report to Mande.
  2. A separate committee was set up to draft new guidelines on research ethics and integrity. This committee has submitted its report to Mande, and that report will be sent to academics for wider consultation. Once finalised, the guidelines will be implemented.
  3. While a conflict of interest policy already exists, a new practice has been put in place: anyone who attends a CSIR committee meeting must sign a conflict of interest statement and adhere to it.

CSIR’s website says that it has a network of 38 national laboratories, 39 outreach centres, three innovation complexes and five units.

The allegations

This recent burst of introspection seems to have been brought on by allegations of image manipulation at a number of CSIR laboratories. As The Wire earlier reported, at least 130 papers published by the Indian Institute of Toxicology Research were listed on PubPeer, a website where users flag suspected image manipulation and duplication. According to The Scientist, 31 papers from the CSIR-Indian Institute of Chemical Biology and 35 from the CSIR-Central Drug Research Institute were also listed for the same reason.

While the authors of the allegedly fraudulent papers published by IITR varied, one common name on 49 of the articles was Yogeshwer Shukla, IITR’s chief scientist of food, drug and chemical toxicology. Shukla is a senior scientist who has claimed to have cured cancer in mice using nano-encapsulated elements from plants with Ayurvedic relevance. Several of these claims, however, are now under question due to alleged image manipulation.


At CSIR-IICB, The Hindu reported, distinguished fellow Chitra Mandal had the highest number of problematic publications listed on PubPeer, at 28. One of these papers has been retracted. Mandal served as acting director of the institute in 2014-15, and has received the Sir J.C. Bose Fellowship from the Department of Science and Technology.

At CSIR-CDRI, the cases of image manipulation flagged were all relatively recent – between 2010 and 2018. Chief scientist Naibedya Chattopadhyay had the highest number of papers with manipulated images, at five, according to The Hindu. Another of his papers had been retracted and four others corrected for faulty images.

These allegations were reported in June 2019. In a notice to the heads of all labs and institutes that fall under CSIR, the joint secretary (administration) said at the time that since allegations of plagiarism and image manipulation had emerged, a committee was being set up to list ethical guidelines that must be adhered to. That notice, however, did not list the specific allegations.

Research fraud in India

Even before Mande’s tweet on Monday, individual institutes and labs under CSIR have been uploading their research guidelines on their websites over the last few months, perhaps both to serve as a reminder for scientists and to let people know that there are some standards in place.

CSIR’s troubles haven’t come out of nowhere. Research fraud of various kinds – from plagiarism to predatory publishing – is a well-known affliction plaguing Indian academia, and while it has been pointed out several times, an adequate solution is yet to be found. The problem of image manipulation has the added disadvantage of being difficult to spot, unlike text plagiarism, which can be detected easily.


Given the problems with research fraud, the retraction rate for academic papers coming out of India is also high. According to Nature India, “A total 980 papers have been retracted from India so far, and of them, 33% were retracted due to plagiarism, while 13% were retracted due to image duplication and manipulation.” The ‘publish or perish’ mentality and the lack of adequate funds to compete with global scientists has been blamed for this problem in the past.

In several cases, institutions have taken action against those found to have indulged in misconduct. In July 2016, for instance, a CSIR lab in Chandigarh – the Institute of Microbial Technology – dismissed a senior scientist for fabricating data in three published papers. All three of Swaranjit Singh Cameotra’s problematic articles were published in and later retracted from the journal PLOS ONE. The institute’s investigation and action, though, concluded two years after the retraction.

It remains to be seen whether CSIR will take the recent allegations as seriously, and how stringent the new guidelines it is developing will be.

What Scientists Make of the Govt’s New Draft Ethics Policy for Academics

For some time now, Indian academia has been plagued with crises, whether they be related to data fraud, plagiarism, workplace safety, caste-based discrimination or sexual harassment.

Last week, the office of the Principal Scientific Adviser (PSA) to the Government of India publicised a new draft National Policy on Academic Ethics, prepared with feedback from two of India’s three science academies (Indian National Science Academy and Indian Academy of Sciences). A brief published online states, “The document lays down the foundational principles for upholding integrity and ethical practices in an academic environment and also streamlines the course of action to ensure delivery of justice in case of malpractices.” It adds that the draft policy will be implemented for an unspecified period, after which the office will solicit feedback and ready the final version.

The Wire invited comments from a few scientists on the document, and these are published below. The first comment is from K. VijayRaghavan, the PSA and the chief architect of the policy.


K. VijayRaghavan, Principal Scientific Adviser to the Government of India

Science in India has many achievements, from successes in applying its outcomes to benefit our society and economy to quality basic science and the building of institutions that support excellent researchers and teachers. However, there are also many points of concern we must address as we grow. Principal among them is the median quality of science. The value of current science is best judged by the future, but in the meantime, we must ensure that the bulk of it addresses what we feel are substantive questions and not obviously pedestrian ones, and that it is conducted following accepted ethical practices. Such practices can no longer be transmitted by precept or diktat; we must formally articulate our expectations and provide training.

In science, as in all walks of life, people slip up. There will also be deliberate misconduct ranging from the minor to the serious. There will be situations where institutional and laboratory environments encourage and ensure ethical conduct. There will be others where a blind eye is turned to bad practice, and yet others where misconduct is substantial. Our goals are to maximise the good and minimise the bad. At the heart of the solution is culture – a culture of following the rules even when no one is looking. This requires training from an early age, not only in good conduct in science but in good conduct, period. Above this training level is that of example. Teachers have a particular role here. The third level is that of policing and enforcement. Responses must be correct, exemplary and proportionate, with due protection to whistleblowers and against gratuitous accusations.

The Academic Ethics Policy tries to lay out these general principles in the form of a ‘living’ document that describes the best practices in a general way. Specific articulations in agriculture, medical research, health research, basic sciences, etc. will now be needed. These will have to be implemented at the institutional level, with recourse to higher levels if there is evidence of failure. Institutional ethics and misconduct committees should be set up and be functional. These are important tasks to which we must all set ourselves.


Prajval Shastri, astrophysicist at the Indian Institute of Astrophysics, Bengaluru

The policy is explicit that sexual misconduct, gender-based harassment and bullying in the workplace, as well as discriminatory behaviour based on any of caste, class, religion, ethnicity, gender and sexual orientation, are all unethical – which is a big step forward if adopted. Furthermore, there is acknowledgement that gender bias is systemic, and it is implicit that all members of organisations at all levels of the hierarchy should undergo gender-sensitisation training. This is a very necessary and welcome step.

While public engagement by academics has been so far regarded as optional in Indian academia, it is very welcome that the policy sees the lack of public engagement by publicly funded academics as abdication of responsibility and therefore unethical.


Gautam Menon, biophysicist at Ashoka University, Sonepat

The draft academic ethics policy addresses, correctly, the need to identify and address plagiarism, data fraud, predatory publishing and undeserved authorship of papers. I like the fact that it specifically addresses bias against under-represented sections, identifies the importance of dealing with workplace bullying and is specific about addressing conflicts of interests. It also makes clear the need for swift and fair resolutions to complaints, the need for privacy and confidentiality of proceedings, as well as the need to guard against malicious allegations and to protect reputations where unnecessarily clouded.

The document, however, contains no discussion of ethics associated with technology research, e.g. the development of tools for the use of personal data to track individuals without their consent. A broad statement on ethical issues associated with experiments on humans and animals might not have been amiss but is absent. I would have preferred stronger wording than “When potential conflicts are liable to occur, the official must make this known to the concerned colleagues”, and would have argued for mandatory recusal in such cases unless absolutely unavoidable. But overall, the document is an important mission statement towards consolidating good academic ethics practices across science and education at all levels, and I gladly support it.


D. Indumathi, theoretical physicist at the Institute of Mathematical Sciences, Chennai

The draft national policy on academic ethics is long overdue and as such welcome. Having said that, it appears to have a very broad scope. Problems with plagiarism, predatory journals and ethics in data collection and analysis, as well as in theory-related papers, have long been acknowledged, but no systematic approach to them has been seen so far.

The main issue, to my mind, is in the implementation. While the draft policy mandates a standing committee to deal with the issue, the composition of such committees and their independence will be crucial to any meaningful outcome. If the problem is in high places, then a ‘local’ committee may have its hands tied, although the policy document does recommend safety norms and procedures for such ‘whistleblowers’.

Unless this is clearly in place (admittedly not an easy thing to do), the policy may fall short of its very ambitious but welcome goals.


Madhusudhan Raman, theoretical physicist at the Tata Institute of Fundamental Research, Mumbai

A national policy on academic ethics is, of course, very welcome, and the ethical guidelines laid down are clear and well-intentioned. For some time now, Indian academia has been plagued with crises, whether they be related to data fraud, plagiarism, workplace safety, caste-based discrimination or sexual harassment. Before I discuss my response to it, I should like to underscore that a commendable role has been played by science journalists in the highlighting of these crises, and I hope they continue to hold academia’s feet to the fire.

The manner in which power is distributed and leveraged in academia, however, is the single largest roadblock to a more just and inclusive academy. Caste-based discrimination, bullying and sexual harassment often affect the youngest members of our community, and the relations of power are so arranged as to hamstring any attempts at bringing perpetrators to justice. A truly bold and progressive code of ethics would emphasise the primacy of the interests of the oppressed classes within academia, and secure for them a seat at the table where all determinations of innocence and guilt are made. Until this happens, it is difficult to imagine these guidelines being in any way restorative or effective in resolving the crises that plague our community. This was understood over a hundred years ago – isn’t it time we caught up?


Shruti Muralidhar, neuroscientist at the Massachusetts Institute of Technology, Cambridge

This is probably one of the few comprehensive ethics documents that has been officially put forward by a central body governing academic research in India. It is a first step forward, but needs a lot of support and implementation. Choosing a few of the points to focus on:

1. Bias and discrimination: It’s heartening to see a ‘to-do’ list along with a ‘do not do’ list – setting it apart from most ethics policies and documents.

2. ‘Full and equal participation of women’: Great sentiment, but it needs hard numbers and practically attainable standards and goals. If those numbers can’t be met at the time (for example, in committee member representation), then that fact should be recorded and made publicly accessible – for example, online, as summarised case logs subject to provisions of the RTI Act.

3. Public interaction and outreach: It is unfair to declare science communication an academic’s ‘duty’ without providing training and incentives for it. Science communication is not an easy task – to learn or to perform. Academics who go out of their way to learn and do good science communication must be rewarded. Unless there are science-communication training modules and grant incentives in place, this point is currently just lip service.

Overall, this policy document needs to be bolder and talk about enforcement and audits/time periods of implementation for all the suggested measures. Rather than being a stand-alone set of rules, it should be mirrored in the form of an institutional ethical policy framework at each academic institution in India, along with defined time frames and actions for dealing with each of the infractions.


Jayant Murthy, astrophysicist at the Indian Institute of Astrophysics, Bengaluru

In the context of the high profile cases that have come to light recently, I believe it is welcome that issues of ethics in academics are being highlighted at the national level. However, those who have to be reminded of academic ethics are those for whom a ‘National Policy’ will make no difference. It has never been acceptable to plagiarise, discriminate or bully and simply stating this again is not useful. Instead, what is needed is a professional review mechanism where those who violate the code are punished. This has not happened: there is a perception, based on actual cases, that academic integrity is not valued and, in fact, holds one back from one’s well-deserved place in the academic hierarchy. This could very well be solved by UGC or at the institutional level without a National Policy. Appoint the right people and the outcomes will come.

On another, perhaps minor, note: the document says that scientists are encouraged to voice their professional opinions openly and without fear. The line between our professional competence and our scientific competence is not an easy line to draw when our scientific temper is under threat every day and, unfortunately, the PSA may not speak for the government as a whole in this.

Should Publishing Journal Articles Be Mandatory for PhD Students?

While being published is a prerequisite for being sufficiently competitive for faculty positions, making it compulsory is fuelling an explosion of bogus research.

A University Grants Commission (UGC)-appointed committee has recommended that it should not be mandatory for PhD students to publish journal articles (in addition to writing their dissertation) in order to earn their doctorates, Nature reported. UGC regulations currently require that:

Ph.D. scholars must publish at least one (1) research paper in refereed journal and make two paper presentations in conferences/seminars before the submission of the dissertation/thesis for adjudication, and produce evidence for the same in the form of presentation certificates and/or reprints.

The larger issue here is the lack of rigour and the indifferent quality of PhD programmes at Indian universities, which lend themselves to poor-quality research. The sorry state of research is widely known and acknowledged. Less than two weeks ago, in a public notice issued on May 21, 2019, the UGC invited proposals from interested parties to conduct a study on ‘The Quality of PhD Theses in Indian Universities’ in order to review and assess the quality of all PhD dissertations submitted at India’s universities – whether central, state, state-private or deemed-to-be universities – over the last 10 years. Clearly, the UGC wants to get a better idea about how bad things are.

PhD programmes at most Indian universities are, despite what some would consider to be stringent UGC regulations, poorly run. The fault lies at several levels, not just with the rules, and not only with the UGC either. University officials, faculty and students have all contributed to diminishing the meaning of research.

Bogus research, whether by PhD students or faculty, is commonplace and used for obtaining PhDs or publishing journal articles (often in fake journals). Offenders are rarely caught and almost never penalised. It is, however, true that the UGC itself has unwittingly magnified some of these problems.

While research fraud has always existed, the problem has become bigger over the last decade. In 2010, in its enthusiasm to improve the research performance of universities, the UGC introduced the Academic Performance Indicators (APIs) in which publishing journal articles was made mandatory for faculty across all kinds of institutions, including teaching-focused colleges.

This led to an explosion in the quantity of bogus research. The UGC is still firefighting the ill-effects of that decision, for example by preparing regulations against plagiarism and by trying to prepare a list of legitimate – as opposed to fake or predatory – journals. Incidentally, India leads the world when it comes to publishing in fake journals.


As with APIs (which applied to faculty), for PhD students, the requirement of publishing a journal article in addition to the dissertation was no doubt introduced to promote research. The requirement also made the PhD programme seem more rigorous. However, by most accounts, what it has done is to promote bogus research. Many PhD students took the path of publishing in fake journals. For others, publishing a journal article simply became an additional burden and delayed obtaining a PhD.

There are reasonable arguments both for and against the requirement for publishing a journal article before completion of the PhD dissertation. In theory, publishing one article – not necessarily in a journal which ranks among the top 10 or 20 in the field – that draws from one’s (ongoing) PhD dissertation does not seem too demanding.

One might add that it should even be expected from students at the best universities in the country. On the other hand, writing the dissertation is the final frontier for PhD students; all else – including coursework, exams and publishing journal articles – is secondary.

It is not only in India that there are disagreements on whether or not PhD students should publish journal articles before completing their PhD dissertation. In North America and elsewhere too, the issue is sometimes a matter of debate, although the broader context is somewhat different. In most cases, publishing journal articles is not mandatory; instead, PhD students are strongly advised by their PhD supervisors and mentors to publish an article or two in academic journals (or obtain an acceptance from them) by the time they defend their PhD dissertation.

However, this is to make their applications sufficiently competitive for faculty positions. In an increasingly tight job market with a shrinking of tenure-track positions over the years, PhD students must push themselves to publish in order to improve their chances of securing a tenure-track faculty position. Furthermore, the culture of publishing is deeply ingrained at research and teaching-cum-research universities and incoming PhD students absorb the informal rules fairly quickly.


Much like Western universities, the best Indian universities – especially some of the IITs, IIMs and several private universities – expect entry-level faculty, i.e. assistant professors, to either have published journal articles (or have articles accepted for publication) or, in the case of those from the humanities and social sciences, have obtained a book contract. These institutions are actively seeking and hiring those with PhDs from abroad precisely because they have completed their PhDs at institutions where a culture of publishing is widespread and have usually published journal articles during the course of their PhD programmes.

Therefore, whether or not the UGC gets rid of the journal article requirement, PhDs from Indian universities who enter the job market without publications are at a disadvantage. That being said, at the average Indian university – indeed at most Indian universities – faculty appointments are often rigged, so such things do not matter at all.

Pushkar is the director at the International Centre Goa. The views expressed here are personal.

We May Soon Find Out How Good Our PhDs Are

A review of recent PhDs will be a useful first step in knowing just how bad things are.

There are occasions, however rare, when the much-maligned University Grants Commission (UGC) deserves our applause. One such occasion is now.

In a public notice issued on May 21, 2019, the UGC has invited proposals from interested parties to conduct a study on ‘The Quality of PhD Theses in Indian Universities’. The objective of this initiative is to review and assess the quality of all PhD dissertations submitted at India’s universities – whether central, state, state-private or deemed-to-be universities – over the last 10 years.

The specific quality-related aspects to be examined are left open: the notice states that these are to be proposed by the interested parties themselves. Presumably, plagiarism will top the list, because it is commonplace in Indian academia and at the same time easier to detect, thanks to the widespread use and effectiveness of anti-plagiarism software. Other problems, such as manufactured or faked research – whether undertaken in laboratories, libraries or in the field – will surely take longer to assess than the six-month period the UGC has announced for the completion of the study.

But why is this review necessary in the first place?

It is true that the Indian government spends precious little on research. However, it is also true that much of what passes for research among a very large number of researchers and teachers at our universities is bogus. Plagiarism and other kinds of research fraud are commonplace; Indians lead the world in publishing in fake journals. We know that things are bad because even though the quantity of research has improved somewhat since the introduction of the Academic Performance Indicators (APIs), the quality of our research output remains low. And poor research performance is one of the reasons why India’s universities lag in world university rankings.

In this context, the PhD thesis or dissertation is typically the first major research product that a prospective faculty member produces. The quality of a PhD dissertation indicates two basic things. The first, and most obvious, is whether the student or researcher is capable of producing good-quality research.

Second (and this is particularly important in India): the review will reveal which universities and university departments support genuine research and/or are capable of doing so. In some, or even many, cases, the poor quality of PhDs may have more to do with the limitations of a department or university than with an individual’s capability. Institutions can be obstacles to good research. It is a sad fact that a very large number of departments and universities in India have neither the physical infrastructure – such as libraries and laboratories – nor the intellectual and academic capabilities – such as suitably qualified mentors for PhD students – to support research.

This may not be true across all disciplines or departments at a given institution, but it certainly applies to many. In addition, there are departments and universities that are not entirely lacking in these respects but have given up on genuine research, because the rewards of bogus research – whether from plagiarism or publishing in fake journals – are easy and immediate: faculty promotions, and high research scores for departments and universities in national rankings. The incentives for bogus research are all the stronger because penalties for plagiarism and other sins are nearly unheard of.

Despite institutional limitations, however, there are still some good quality PhD dissertations that are written and submitted. A review will reveal just how large their numbers are.

Overall, a review of recent PhDs will be a useful first step in knowing just how bad things are or whether they are better than we think in terms of research at our universities.

It is possible that some of those who obtained their PhDs in the last 10 years may have started to worry about being found out because their dissertations are plagiarised or contain manufactured research. There is nothing to fear from this review, however. It appears from the public notice that the UGC’s intentions are simply to find out what has been going on over the past decade.

Further, India’s universities are extremely tolerant towards research fraud. It is rare for plagiarists and other offenders to be punished even when they are caught. Still, it must be admitted that progress – though slow – is being made. Last year, the UGC approved the UGC (Promotion of Academic Integrity and Prevention of Plagiarism in Higher Educational Institutions) Regulations, 2018 to address the problem of plagiarism by students, researchers and faculty.

The UGC is to be congratulated for taking the decision to assess the PhDs submitted over the last 10 years, even if it is likely that the review will throw up seriously embarrassing results. According to the All India Survey of Higher Education, in 2010 there were 77,798 students enrolled in PhD programmes. Their numbers had more than doubled by 2017, to 161,412.

While this represents less than 0.5% of all students in higher education, it will be useful for the government to know whether its strategies for promoting and monitoring research have proved beneficial or not. An increase in the number of students in PhD programmes is not good enough; what matters even more is the quality of their research.

Pushkar is director, The International Centre Goa. The views expressed here are personal.

UGC’s Anti-Plagiarism Rules Don’t Make Room for Realities of Indian Academia

It is difficult to imagine that people in positions of power and with strong political connections will be caught and penalised for plagiarism.

The University Grants Commission (UGC) recently approved the UGC (Promotion of Academic Integrity and Prevention of Plagiarism in Higher Educational Institutions) Regulations, 2018. These regulations are dedicated to addressing plagiarism by students, researchers and faculty at India’s universities and colleges.

Plagiarism is, along with publishing in fake journals and fabrication and falsification of research, among the major offences committed by academics worldwide. Other lesser known, though common, offences include the practice of adding an author’s name to a paper when she has not contributed to the research, not acknowledging conflicts of interest and general sloppiness in conducting research.

In many countries and certainly at the better universities across the world, there are regulations and strong mechanisms in place to detect and punish offenders. Still, academic fraud of one kind or another takes place everywhere.

Until recently, the UGC as well as the universities have been rather casual about research fraud. As a result, existing regulations and investigative and punitive mechanisms have not deterred fraudulent activities. Offenders usually get away with crimes small and big, and this has encouraged others to follow the same path. Slowly, however, the government is taking steps to address research fraud. The latest set of UGC regulations pertaining to plagiarism is an example of the government’s intent to control it. But it is important to understand that anti-plagiarism measures by themselves attack only one pillar of research fraud and must be combined with attacks on at least two others: fake journals and fabrication/falsification of research. It is also necessary to admit that the immediate impact of these anti-plagiarism measures will be minimal.

API and promoting research fraud

Though research performance of India’s universities and other academic institutions has improved, it is overall still dismal. There are many reasons for this research deficit, of which inadequate funding is an important one but not the only factor. Indian academic institutions underperform in research because most universities have traditionally emphasised teaching over research. Indeed, research has been close to the bottom in terms of institutional priorities. However, with the growing popularity of world university rankings in which India’s universities perform poorly because of low research output, the government started to take notice and take measures to address the deficit.

One of the first attempts at addressing the research deficit was the introduction of the academic performance indicator (API) in 2010. The API required all faculty members at central universities and central-government funded colleges to do research and publish – in addition to teaching and administrative duties – to benefit from the Career Advancement Scheme (CAS). Unfortunately, most state universities and colleges also adopted the API. With the widespread application of API across all kinds of academic institutions, including undergraduate institutions that are entirely teaching-focused, faculty members were left with no choice but to publish or stagnate in their positions. These included people who lacked any basic training for research and those who were already overburdened with teaching, administrative and other responsibilities. Further, most teachers work at colleges with woeful infrastructure and where the overall academic environment is inimical to substantive research.

The result of the nearly-compulsory implementation of API was that many faculty members took recourse to plagiarism, publishing in fake journals or both. While for some, plagiarising and/or publishing in fake journals was simply a short-cut to career advancement, for most it was a necessity. In both cases, they fed each other: research-deficient faculty members plagiarised and published to catch-up or get ahead of those who were carrying out genuine research, and at some point the latter realised that they would be left behind if they did not do the same things. Many started to plagiarise, publish and flourish. This gave birth to what is now a flourishing global industry of fake journals headquartered in India.

To a great extent, the current and ongoing wave of research fraud in the form of plagiarism and publishing may have started with the API, which itself was a by-product of the growing popularity of the world university rankings. The API was created to boost India’s research output and improve the rankings of its universities; instead, it gave a tremendous boost to fraudulent research.

The API is soon to be revised but the improved version falls short of recognising the proper structure and complexities of India’s higher education sector and will continue to be abused.

Fake journals menace

The Indian Express recently carried a series exposing the fake journals industry in India. This has reportedly led to an immediate response from the government, with the higher education secretary R. Subramanyam issuing an order: “If any substandard/predatory journals are found to be in the list recommended by the vice-chancellors, that would be held personally against the vice-chancellor concerned.”

It is a pity that neither this government nor previous ones paid much attention to reports extending over more than a decade on various kinds of academic malpractices that benefitted dishonest academics and punished honest, hard-working ones. This has direct implications for the new anti-plagiarism measures that the government has put in place. Over time, a large number of academics have risen up the ranks by getting away with academic fraud and the task of restoring academic integrity is now to be placed in their hands!

To its credit, the government has in the last couple of years tried to deal with the menace of fake journals. In mid-2016, the University Grants Commission (UGC) took up the difficult task of preparing a list of legitimate journals; faculty members would have to publish only in these journals to benefit from the API. The task has so far been done rather badly. In early 2017, the UGC released a messy first list of legitimate journals that included the names of several fake journals and excluded many legitimate journals. In May 2018, it removed the names of 4,305 titles from its list, noting that these were “of poor quality,” provided “incorrect/insufficient information” about themselves or made “false claims.” In the process of excluding fake journals, however, it also removed several legitimate journals from the list. ‘The list’ very much remains a work in progress and will be for a while.

The new UGC rules

The new UGC regulations on plagiarism represent a sincere attempt to restore some credibility to Indian academia. The text of the regulations is clearly written and there seems to be little that is ambiguous or wrong with it. One can of course debate some specific aspects of the regulations but overall, it is an excellent document. However, the true test of any set of rules and regulations is whether they will be effective. In this case, it seems that anti-plagiarism measures will at best only be partially successful and that too with the passage of a considerable period of time.

In the pre-API era, only a select few ambitious academics indulged in research fraud, because the others did not have to publish research articles for regular career advancement. For the most part, what mattered was teaching experience, measured in number of years. After the API was introduced, publishing was no longer a matter of choice, as outlined above.

The Indian higher education system also experienced massive deterioration from the 1980s onwards, certainly in terms of the kinds of people it attracted, including faculty members. Academia became, for the most part, a leftover profession, which one joined after failing at everything else. At the risk of generalisation, one can say that India’s higher education sector is dominated by the mediocre in terms of its faculty. A simple research project involving re-examination of PhD dissertations submitted at some Indian universities, including the best ones, will almost certainly show that many existing faculty members carried out substandard research and engaged in plagiarism and fabrication. Many of them are now heads of departments, principals, vice-chancellors and academic bureaucrats in positions of power.

The clauses regarding detection, reporting and handling of plagiarism in the UGC regulations suggest one can’t be too optimistic about their effectiveness. The regulations call for the creation of a Departmental Academic Integrity Panel (DAIP), consisting of the head of the department as chairman and two other members: a senior academic from outside the department, nominated by the head of the institution; and a person well versed with anti-plagiarism tools, nominated by the head of the department. Plagiarism cases are to be reported to the DAIP, which will also have the power “to assess the level of plagiarism and recommend penalty (or penalties) accordingly.”

The UGC regulations also call for the creation of an Institutional Academic Integrity Panel (IAIP), consisting of the pro-VC/dean/senior academician of the institution as chairman and three other members, all of them nominated by the vice-chancellor/principal/director of the institution: a senior academic from the home institution; a member from outside the home institution; and a person well versed with anti-plagiarism tools.

According to the UGC regulations, the manner of dealing with cases of plagiarism will be as follows:

If any member of the academic community suspects with appropriate proof that a case of plagiarism has happened in any document, he or she shall report it to the DAIP. Upon receipt of such a complaint or allegation, the DAIP shall investigate the matter and submit its recommendations to the Institutional Academic Integrity Panel (IAIP) of the HEI.

What could go wrong?

As stated earlier, the regulations are quite well-prepared and written – but there are immediate problems.

Assuming that the head of the department is ‘clean’, can we expect her to pursue charges of plagiarism against a colleague? Department heads are appointed by rotation and the current head may not take any action for fear of being harassed when someone else takes over as head. Second, if the head is someone who has herself engaged in shady practices, she is even less likely to take any action since others may target her as well. The same set of issues will come into play at the institutional level, with the IAIP.

The fact is that the success of the anti-plagiarism regulations is contingent on how they are applied by the people who run India’s universities, from vice-chancellors down to faculty members. There are all kinds of structural obstacles to making them work effectively. They will succeed only if, over time, Indian universities – especially research and teaching-cum-research institutions – open themselves up to hiring faculty on the basis of merit and with proper scrutiny. With the exception of a few institutions, this is not happening yet. For example, it has been reported that even at a premier institution such as Jawaharlal Nehru University (JNU), several newly appointed faculty members have a record of plagiarism in their work.

Recently, Parliament was informed that over the past three years, there have been three cases of plagiarism against vice-chancellors and others in which appropriate action has been taken: Chandra Krishnamurthy, vice chancellor of Pondicherry University (2015); Anil Kumar Upadhyay, reader at Mahatma Gandhi Kashi Vidyapeeth, Varanasi (2017); and Vinay Kumar Pathak, vice chancellor of Dr A.P.J. Abdul Kalam Technical University, Lucknow (2018). These numbers are ridiculously low. Of course, there is no way to know the actual number of plagiarists and others who engage in research fraud, but it would not be wrong to assume that many of them will be responsible for giving teeth to the UGC’s plagiarism regulations.

In an interview, Jeffrey Beall, who ran a hugely-influential website which identified fake journals and publishers until he was forced to shut down, said, “There is no easy solution. I learned that the publishers now have much political power, and they will do anything possible, including collusion with universities, to attack their critics.”

The same is true for plagiarism and plagiarists. There is no easy solution. Many plagiarists are vice-chancellors, principals, deans and occupy those positions because of their proximity to politicians. They are considered respected members of the academic community. It is difficult to imagine that people in positions of power and with strong political connections will be caught and penalised for plagiarism.

There is, however, one step that the government can take to limit plagiarism and research fraud: tweak the API to make research optional for college teachers.

Pushkar is director of The International Centre Goa (ICG), Dona Paula. He tweets at @PushHigherEd. The views expressed here are personal.

The Hindi version of this article, which was submitted for publication on August 8, appeared in Rajasthan Patrika on September 4 and may be accessed here.

Joint Director of Zoological Survey Plagiarised Book, Published False Data

His and his colleagues’ actions imperil coral research within the country, especially in the time of climate change, and leave scientists around the world with data that doesn’t make sense.

Bengaluru: Research on coral reefs published by scientists at the Zoological Survey of India (ZSI) has been found to have plagiarised content, some of it wrong and misleading.

This was flagged in an email to a mailing list for coral researchers by Douglas Fenner, who has been working on corals around the world for decades.

When Fenner went through a paper on corals in the Andaman and Nicobar Islands by a research group from ZSI, he found a long list of 274 coral species. ‘Status of Scleractinian Diversity at Nancowry Group of Islands Andaman and Nicobar Islands’ is authored by Tamal Mondal, C. Raghunathan and K. Venkataraman, and was published in the Middle-East Journal of Scientific Research in 2013. The list, however, included two species, Orbicella annularis (formerly Montastraea annularis) and Siderastrea radians, that are from the Caribbean and not known to be present in the Indo-Pacific region including the Andaman and Nicobar Islands.

Orbicella annularis. Credit: Louiswray/Wikimedia Commons, CC BY-SA 3.0

“If the authors knew how startling and important a find those were, they would have made a big deal out of it, worth a big paper in a major journal,” Fenner wrote in his email. An earlier paper (‘An Observation on the Coral Bleaching in Andaman Islands’, published in the International Journal of Environmental Sciences in 2011) by two of the same three authors – Mondal and Raghunathan – listed 293 species of corals, of which nine, Fenner noted, were Caribbean.

He then looked up a book, ‘New Records of Scleractinian Corals in Andaman and Nicobar Islands’ by Mondal and Raghunathan, along with three other authors – Ramakrishna, R. Raghuraman and C. Sivaperuman – at the ZSI. He found more of the same, along with outright plagiarism.

This plagiarism manifested in multiple ways. The introductory section in the ZSI book is identical to text from various sources, chiefly ‘Reefs at Risk: A Programme of Action‘, published by the International Union for the Conservation of Nature. This source is not cited at all, even in the references section. Moreover, a Google search using any sentence picked at random from the ZSI book turns up matches from various other sources that have not been cited. Even a typo in the book, “1O mm” instead of “10 mm” (the uppercase letter O instead of the digit 0) has been carried over from the original source.

The norm in academic work is to use quotation marks and to cite the source from which one is quoting. Neither has been done in the ZSI book. What’s more, many of the citations that do appear were simply carried over from the original source.

Fenner didn’t stop there. He compared the descriptions of many of the corals in the ZSI book to that found in ‘Corals of the World’ by J.E.N. Veron, published in 2000. “Lo and behold, the wording was exactly the same, every single word,” wrote Fenner in the mailing list email.

Veron’s book (an online version is here) is considered the authoritative source on corals. Fenner himself worked with Veron – a legendary figure among coral researchers, said to have discovered 20% of all coral species – for six years in the Caribbean. Fenner is now based in American Samoa, a US territory in the South Pacific, as a consultant for the National Oceanic and Atmospheric Administration.

Fenner found that other bits of Veron’s book had found their way into the ZSI book too. It listed, for each coral species, “key characters” that were used to identify that species – and this text too was identical to that in Veron, unattributed. “This is plagiarism,” wrote Fenner. “It is passing off the text as though it was original and written by the authors.”

Porites porites. Credit: Wikimedia Commons

The book, like the papers, claimed to find corals in the Andaman and Nicobar Islands that are actually found only in the Caribbean (in addition to at least three species that don’t exist). To compound the misleading nature of the information, these Caribbean species have in fact been misidentified. The photos accompanying these labels, Fenner noted, are actually those of Indo-Pacific species.

For instance, one of the photographs in the book is labelled as Porites porites, a Caribbean species that is a hexacoral, with six tentacles. But the photo actually shows an octocoral, with eight tentacles. “From the texture of the colony surface, it appears to be Heliopora coerulea,” an Indo-Pacific species, said Fenner. “This is a very basic mistake and demonstrates clearly that they don’t know what they are doing. That makes the data totally unreliable, and any other scientist that uses it will have many mistakes. So the scientific community has to know that their data and papers are unreliable.”

As it happens, one of the questionable papers (Int. J. Env. Sci. Vol. I (1st Issue), pp. 37-51, 2011) by this research group has been cited twice by other authors. The paper lists two species twice, each time with different data. “That suggests that they may have been making the data up,” said Fenner. Similarly, in a third paper published by Mondal and Raghunathan with Venkataraman (‘Diversity of Scleractinian Corals in Middle and North Andaman Archipelago’, published in the World Journal of Zoology in 2011), one species is listed twice with different data.

The problem, he pointed out, is that these published works do not contain any information that could be used to check if the claimed species identification is accurate.

Heliopora coerulea. Credit: Wikimedia Commons

The lead author of the book, Ramakrishna, is a former director of the ZSI. He expressed surprise at the finding of plagiarised content in the book when contacted by email. He said that he had retired in July 2010 and that the book was published only later, in September 2010. K. Venkataraman, a coauthor of one of the two papers that reported Caribbean corals in the Andaman and Nicobar Islands, succeeded Ramakrishna as director of the ZSI. C. Raghunathan, who is an author on all three papers as well as the book, is presently the joint director of the ZSI. Email queries to Mondal, Venkataraman and Raghunathan were unanswered at the time of writing – as were emails to Kailash Chandra, the present director of the ZSI.

Ironically, Chandra was himself accused of plagiarism in 2014 along with three of his colleagues, in a paper on hawk moths. The then director Venkataraman had promised stringent action if Chandra was found guilty. Chandra took over as director of the ZSI in 2015, according to his CV.

§

The ZSI, founded in 1916, has been and continues to be an important institution. It “holds the type specimens for many species in the country,” said Rohan Arthur, a marine scientist at the Nature Conservation Foundation, where he directs the reef research program. “It’s found across the country, and is still among the principal institutions of taxonomic research in the country. However, institutions of this calibre really need to have adequate checks of research quality and ethics if they do not want to squander their formidable academic reputation. Many of our legacy research institutions in the country need a serious, ground-up rethink of how they’re doing their research if they want to remain relevant.”

One conspicuous problem with many of the publications from this research group at ZSI, as can be seen from their CVs, is that they are in-house publications. “In-house publications have their place and they are a valuable source of information, where scientists can publish their work rapidly,” said Arthur. “But unless their quality is rigorously monitored, they can rapidly lose their credibility.”

Fenner suggests that the ZSI should have outside experts, including internationally recognised coral taxonomists, as reviewers for its in-house publications in this area. And that they should collaborate more with researchers abroad and publish in international journals. “That is a way to get outside help in improving quality,” he said. “ZSI needs to rely less on quantity and more on quality, as judged by international judges, not only Indian.”

However, foreign collaboration for taxonomic research in India is hampered by regulations such as the Biological Diversity Act, 2002, which make it difficult to exchange specimens. One of the intentions behind the Act was to stop biopiracy: people abroad patenting biological material of Indian origin for commercial profit. However, as a 2008 article in Current Science, titled ‘Death sentence on taxonomy in India’, put it,

The Biological Diversity Act, 2002 seriously curtails the scientific freedom of individual taxonomists by putting draconian regulations on the free exchange of specimens for taxonomic research and threatens to strangulate biodiversity research in India with legal as well as bureaucratic control.

The Act requires that exchange of specimens for research be done through the government, with all the bureaucracy and inappropriate handling of delicate and valuable specimens that that entails.

A related side effect is that specimens housed in the ZSI’s section of the Indian Museum in Kolkata have become unavailable to researchers abroad. As the article said, “It is generally accepted among the scientific community that the types [specimens] are the property of science and should be made available to bona fide researchers throughout the world.” Failure to do this on the part of government-run museums in India would, it went on to add, “totally isolate Indian biodiversity researchers and is akin to a self-imposed siege on scientists in the country.” This situation appears to have contributed to taxonomic research in India growing stale.

Researchers in India outside government-run institutions find it difficult to collect new specimens too. “Taxonomic studies of coral down to the species level requires some fairly detailed skeletal analysis,” said Arthur, who limits himself to identifying the genus rather than the species. (Genus is the level above species in biological classification.) “Without being able to collect physical specimens, it’s very difficult to identify coral species. Most institutions in the country apart from the ZSI find it very difficult to even get samples because corals are a protected group.”

§

In April 2018, India adopted a new tiered system for penalising plagiarism, based on evaluating what percentage of the work in question is plagiarised. These regulations were issued by the University Grants Commission and are binding on universities. It’s unclear if they apply to a research organisation such as the ZSI, which comes under the Ministry of Environment, Forest and Climate Change.
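The tiering described above maps a similarity percentage onto escalating penalty levels. As a minimal sketch of that logic – using the similarity bands commonly reported for students under the UGC's 2018 regulations (up to 10% treated as minor; then 10–40%, 40–60% and above 60%), which should be checked against the regulations themselves:

```python
def plagiarism_level(similarity_percent: float) -> int:
    """Map a similarity score (0-100) to a UGC-style penalty level.

    Band boundaries follow the levels reported for student work under the
    2018 regulations (an assumption here, not text from this article):
      Level 0 (<=10%): minor similarities, no penalty
      Level 1 (10-40%): revised manuscript to be resubmitted
      Level 2 (40-60%): debarred from resubmission for a period
      Level 3 (>60%): severest penalty (e.g. registration cancelled)
    """
    if not 0 <= similarity_percent <= 100:
        raise ValueError("similarity must be a percentage between 0 and 100")
    if similarity_percent <= 10:
        return 0
    elif similarity_percent <= 40:
        return 1
    elif similarity_percent <= 60:
        return 2
    return 3
```

The design point is simply that the penalty is a step function of a single measured quantity – the similarity score an anti-plagiarism tool reports – which is what makes the scheme mechanical to apply but also sensitive to how that score is computed.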

Fenner had contacted the ZSI researchers with his findings. “I had hoped that the people in ZSI who did this would admit publicly what they did. I urged them to do it, but they have not done it,” he said. “ZSI needs to change its culture of tolerance of these things.”

Arthur, who has worked on corals around India and Kenya, cautioned against generalising from this one instance of plagiarism, given that the email Fenner sent to the mailing list was titled “Plagiarism in coral reef science in India”. “I find it a little objectionable – and more than a little parochial – to label this ‘Plagiarism in coral reef science in India’, as he did,” said Arthur. “While raising the issue is critical, it does a huge disservice to tar the entire research community with a single brush.”

“There is a problem of quality, we have a problem with our attitude towards publishing. It’s a much larger problem that has to do with certain institutional cultures – that certainly needs fixing,” Arthur said. “But this should not take away from the large number of scientists who are doing some excellent research in marine and coral reef science in India.”

Fenner agreed. “It is clear that many scientists in India are honest and do not engage in these practices, and the problem is by no means restricted to India,” he said. “It is just shocking to see such a blatant example of it in documents published by professionals, instead of just students, and the authors getting away with it and benefiting from it.”

India, in fact, has a long heritage of pioneering work in coral taxonomy. The first International Coral Reef Symposium was held at the Central Marine Fisheries Research Institute in Mandapam Camp, Tamil Nadu, in 1969. One of the forces behind this was the late C.S. Gopinadha Pillai.

“Dr Pillai is world-renowned for having been one of the most important pioneers of coral reef taxonomy, globally,” said Arthur. “The Latin name of every species known to man will be followed by the name of the taxonomist who first described it formally. If you examine the list of corals known from tropical reefs today, Dr Gopinadha Pillai’s name appears frequently. His work has been quite an inspiration for many people globally when it comes to coral taxonomy. But since his passing away, there have been very few people to carry on that mantle. As a result, what we know of coral taxonomy from reef regions in India is still relatively poor. It’s still not completely well-worked out.”

Plagiarised and unreliable work from institutions like the ZSI makes this worse, not least because, quite apart from the academic implications, there are real-world consequences too.

“We depend on taxonomists to give us accurate descriptions of the species we find. And if they did not, many of the biogeographic trends we rely on don’t make much sense,” said Arthur.

Estimating the population of a species of coral, to begin with, requires reliably identifying species. If this isn’t done, any population-based estimates become unreliable. “So, if you find either increases or decreases in numbers of species in particular regions,” said Arthur, “it is very difficult to evaluate if this is a true trend that needs exploration for scientific or for conservation reasons, or if it’s merely an artefact of shoddy science.”

And we could do without shoddy science at a time when corals are facing climate-change-induced stress. “We know that the responses of coral to climate change are highly dependent on the species. So you can get very wrong information if you’re identifying species wrong,” said Arthur. “So on one level, there are troubles academically but there could be very real problems in terms of management as well.”

Nithyanand Rao is a freelance science writer.

The Stanford Prison Experiment, or How to Make It Big Using Research Fraud

There are now multiple sources to prove that the infamous experiment was a sham.

New Delhi: The Stanford prison experiment is one of the most popular psychological studies of all time. It’s often quoted in textbooks and research papers, as well as in popular culture. Despite being almost 50 years old, the experiment still captures the public imagination. And now we know that imagination – and not rigorous scientific study – went into creating it, too.

In 1971, Philip Zimbardo, a young psychology professor at Stanford University, built a fake jail in a basement on campus. Nine ‘prisoners’ and nine ‘guards’ were chosen after a call for applications in the newspaper. All the participants were male and college-aged, and were paid for their time. ‘Senior staff’ at the prison included Zimbardo and some of his students.

Within six days, the experiment was shut down, apparently because guards were taking on cruel, inhuman attitudes and prisoners were traumatised and breaking down. The results were used – and still are – to argue that people have inherent sadism within them, which comes to the fore when they have power over others or when put in certain situations.

But what if the entire experiment was a sham?

There are now multiple sources to prove that it was.

A report by Ben Blum, published on Medium, uses interviews to show that the famous experiment was doctored to produce certain results. One of the most famous moments from the study was when one of the prisoners, Douglas Korpi, kicked at the door while screaming, “I mean, Jesus Christ, I’m burning up inside! Don’t you know? I want to get out! This is all fucked up inside! I can’t stand another night! I just can’t take it anymore!”

Korpi has now told Blum that he was faking it. He was definitely afraid – but not of the guards. He thought he would have time in ‘prison’ to study for the GRE, but the guards weren’t giving him his books. So he was faking a breakdown so that they would let him out. It was all quite fun, he said, until he wasn’t allowed to leave. “I was entirely shocked,” he told Blum. “I mean, it was one thing to pick me up in a cop car and put me in a smock. But they’re really escalating the game by saying that I can’t leave. They’re stepping to a new level. I was just like, ‘Oh my God.’ That was my feeling.”

Other prisoners also said that they asked to be let out but weren’t. Zimbardo, however, denied that people were kept inside against their will. He said the consent forms included a safe phrase – “I quit the experiment” – if you really wanted to leave. But the consent forms are available online, Blum writes, and do not include anything of that kind.

This is important because, according to Blum,

“Zimbardo’s standard narrative of the Stanford prison experiment offers the prisoners’ emotional responses as proof of how powerfully affected they were by the guards’ mistreatment. The shock of real imprisonment provides a simpler and far less groundbreaking explanation. It may also have had legal implications, should prisoners have thought to pursue them. Korpi told me that the greatest regret of his life was failing to sue Zimbardo.”

Korpi says he admitted to Zimbardo that his breakdown was fake, but Zimbardo insisted on portraying it as real for years to come. Zimbardo was big on media coverage of his experiment, sending regular press releases to a local channel while it was on and arranging media appearances for Korpi and others after. “If he wanted to say I had a mental breakdown, it seemed a minor note,” Korpi told Blum. “I didn’t really object. I thought it was an exaggeration that served Phil’s purposes.” Even after Korpi said he no longer wanted to appear on shows, Zimbardo “hounded” him to continue.

That’s not all. The so-called extreme behaviour adopted by the guards also has another explanation. This behaviour did not come out of the blue, Blum reported, but was taught to the guards in the orientation session.

“We cannot physically abuse or torture them,” Zimbardo told them, in recordings first released a decade and a half after the experiment. “We can create boredom. We can create a sense of frustration. We can create fear in them, to some degree… We have total power in the situation. They have none.”

There’s another thing about the Stanford prison experiment that isn’t talked about much, and that’s the role played by David Jaffe, the undergraduate student who acted as the ‘warden’. Jaffe and his friends reportedly conducted a similar simulation experiment on their own a few months before Zimbardo’s, and that’s what inspired the professor.

“Dr. Zimbardo suggested that the most difficult problem would be to get the guards to behave like guards,” Jaffe wrote in a post-experiment evaluation. “I was asked to suggest tactics based on my previous experience as master sadist. … I was given the responsibility of trying to elicit ‘tough-guard’ behavior.”

The guards’ behaviour, then, was not as spontaneous as Zimbardo wanted people to believe.

One of the cruellest guards during the experiment was Dave Eshelman, who speaks in a southern American accent in the audio recordings released by Zimbardo. Eshelman told Blum that both his accent and his attitude were very much an act and not a result of the situation he was in; he was trying to help the researchers succeed. “I took it as a kind of an improv exercise,” Eshelman said. “I believed that I was doing what the researchers wanted me to do, and I thought I’d do it better than anybody else by creating this despicable guard persona. I’d never been to the South, but I used a southern accent, which I got from Cool Hand Luke.”

This isn’t the first time the validity of the Stanford prison experiment has been questioned, though the amount of detail and the many levels on which Zimbardo’s claims have been rebutted are new. Even just after the study was published, psychologists raised questions about the researchers’ analysis of the results. The well-known psychologist Erich Fromm wrote,

“The authors believe it proves that the situation alone can within a few days transform normal people into abject, submissive individuals or into ruthless sadists. It seems to me that the experiment proves, if anything, rather the contrary.”

An attempt to replicate the experiment also failed to provide the same results. According to one of the authors of that study, Steve Reicher, Zimbardo tried to stop their paper from being published. The British Journal of Social Psychology decided to publish the study anyway, accompanied by a note from Zimbardo that said, “I believe this alleged ‘social psychology field study’ is fraudulent and does not merit acceptance by the social psychological community in Britain, the United States or anywhere except in media psychology.”

But in spite of the doubts raised in the past, the experiment continues to be taught to psychology students in the guise of a valid scientific exercise. According to Vox, this is because there is a lag between popular consciousness around certain results and what teachers and textbooks say. And part of that can only be fixed when old studies are followed up on – or even replicated – to make sure that their results still stand, or to find out why they don’t. Science reporter Brian Resnick writes,

“If it’s true that textbooks and teachers are still neglecting to cover replication issues, then I’d argue they are actually underselling the science. To teach the “replication crisis” is to teach students that science strives to be self-correcting. It would instill in them the value that science ought to be reproducible.”

Blum has a different analysis for why people still quote and teach the Stanford prison experiment – it makes us feel better about our actions.

“As troubling as it might seem to accept Zimbardo’s fallen vision of human nature, it is also profoundly liberating. It means we’re off the hook. Our actions are determined by circumstance. Our fallibility is situational. Just as the Gospel promised to absolve us of our sins if we would only believe, the SPE offered a form of redemption tailor-made for a scientific era, and we embraced it.”

Research Fraud of One Kind or Another Will Continue to Take Place in India

Initiatives to check research fraud usually fall short because they are not eventually implemented or because techniques of fraud have advanced, rendering older measures as well as countermeasures irrelevant.

Various issues of the journal 'Science' issued between 2000 and 2011. Credit: yeaki/Flickr, CC BY 2.0

This is the second part of a two-part essay on the UGC’s attempts to improve the research output of Indian institutions. The first part is here.

A new UGC committee is consulting experts and organisations from across disciplines – including the Indian Council of Medical Research (ICMR), the Council for Scientific and Industrial Research (CSIR) and others – to prepare an exhaustive list of legitimate journals across disciplines. The draft list is expected to be prepared in about six weeks. The selected journals will be classified in different categories based on their impact and relevance. The rationale behind this initiative is to prod Indian academics to publish in legitimate rather than in fake or substandard journals. If successful, this may help improve the quality of research and publications at India’s universities.

One challenge with this initiative is that the UGC’s master-list of legitimate journals will need to be sufficiently comprehensive and inclusive across disciplines. It will also need to be updated at regular intervals to include new journals. It is also very important, especially for the humanities and social sciences, that the list is prepared and categorised in a way that favours journals more relevant to India.

The other important question is whether the UGC’s master-list of legitimate journals will curb the larger problem of research fraud. Academics constantly find new techniques to cheat and the UGC initiative will therefore likely have only a limited impact.

The journals master-list

Preparing a comprehensive list of legitimate journals requires the labour of very competent people trained in library sciences. Academics need to be consulted, of course, but I do not think they have the proper training or the interest to do this kind of job properly. One has to wonder if we have enough professionally-trained librarians at our universities and other academic institutions. It is hard to know. This is the first problem with preparing an exhaustive list of legitimate journals.

A related issue is how frequently the master-list will be updated. These days, new journals, including legitimate ones, are introduced quite frequently (though certainly not as often as fake journals) and it will be necessary to update the master list at least on an annual basis. If that is not done, academics who publish in new journals may lose out until that time when new journals are included in the master list.

Second, the task of preparing a master-list of journals is best handled by organisations involved in the business of ranking journals. There are no such Indian organisations. Therefore, one must depend on the products of Thomson Reuters – which includes the prestigious Science Citation Index (SCI) – and Elsevier’s Scopus database. They are both highly-rated and used by academics and higher-education institutions worldwide.

However, for the reasons discussed below, the ranking and/or impact factor of journals by Thomson Reuters and Elsevier is not quite suitable for faculty in the humanities and social sciences who work on India (and this is also true to a lesser degree for other areas of study, including the sciences). For those who may not know, the impact factor was created by Eugene Garfield, a librarian, in the early 1950s and measures how often journal articles are cited.
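
For those unfamiliar with the metric, the standard two-year impact factor has a simple arithmetic definition: the citations a journal receives in a given year to the items it published in the two preceding years, divided by the number of citable items it published in those two years. A minimal sketch, with illustrative numbers rather than any real journal's figures:

```python
def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Two-year journal impact factor: citations in year Y to
    articles from years Y-1 and Y-2, divided by the number of
    citable items the journal published in Y-1 and Y-2."""
    if citable_items_prev_two_years <= 0:
        raise ValueError("journal must have published citable items")
    return citations_to_prev_two_years / citable_items_prev_two_years

# A journal whose 2014-15 articles (120 of them) attracted
# 300 citations in 2016 has a 2016 impact factor of 2.5.
```

Because the metric is a per-article citation average, journals serving a small, region-specific readership will score low even when they are the venues their field actually reads – which is exactly the problem described below for India-focused journals.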

Apropos research in humanities and the social sciences

The research carried out by Indian academics, as also by academics in other non-Western countries, tends to be country-specific and at best region-specific. The reasons for this are manifold but lack of resources and interest are both important factors. For example, in India, most historians research and write on Indian/South Asian history. Now, at the global level, the major humanities and social science journals – those with the highest impact factor – have a strong pro-Western bias.

This is clearly reflected in their content. An overwhelming majority of articles are focused on Western countries. For example, if one looks at the top-rated philosophy, politics or history journals, there is very little content on Indian philosophy, politics or history. In effect, Indian academics are nearly shut out from publishing in top-ranked journals because most of them write on India or South Asia. The top-ranked journals have no interest in such content unless it is on issues that affect Western countries in a very direct way (for example, Islamic terrorism).

India-specialists the world over publish in specific journals which they know academics from around the world working on India consult on a regular basis. Most of these journals tend to be ranked relatively low in terms of their impact factor. Two good examples, both from India, are Economic and Political Weekly and Contributions to Indian Sociology. Many other journals which are both popular and relevant for Indian scholars, even though published in Europe or North America, also have relatively low impact factor.

Consider the Scopus list of top-ranked humanities and social science journals, for instance: There are very few grade-A and even grade-B journals in anthropology, philosophy, sociology, history, politics, English, and other subject areas which publish India-content except occasionally. Indeed, many of these top-ranked journals do not carry India content in the form of an article or two devoted to Indian politics or history or something else even once a year. At the same time, those journals which include substantial India content – say, 20% of all articles in a particular journal over a period of two years – are nearly all region-specific, and rarely figure in grade-B and typically find a place in grade-C or lower. Interestingly, several region-specific journals which focus on Europe and Western countries are ranked grade-A and grade-B.

What this means is that India’s social scientists rarely publish or get published in Elsevier’s grade-A journals. And this is not just about India-educated or India-based social scientists; over the last two decades, leading India-experts based at the top 50-100 universities have only on occasion published in grade-A journals (the one exception is economics).

In sum, the UGC’s master list of top journals in the humanities and social sciences, in terms of relevance and impact factor, will need to be amended if it uses or ‘borrows’ from the Elsevier or Thomson Reuters lists. The issue is not one of competence and/or research abilities of India’s social scientists but more about the very limited opportunities they have for publishing in top-ranked journals.

There is a different kind of bias against faculty in the sciences, engineering and other areas. Unlike the humanities and social sciences, the sciences and engineering are region-neutral, so academics in these areas do not face the same challenges that social scientists do. However, it is widely believed that where journal submissions reveal the institution or country of origin of an academic, there is discrimination on the basis of language and, to a lesser extent, content. Of course, there are studies which show that “linguistic bias” is a “myth” and find that the issue is more about the poor quality of research.

Based on my discussions with some very capable and widely-published colleagues, the truth may be closer to something in between. There are cases where poor quality research papers are submitted and rightly rejected; in other instances, however, rejections are handed out for the flimsiest reasons.

Overall, it is extremely important that the UGC’s master list of legitimate journals is suitably adapted for India. However, knowing how things work in the higher education sector, there are good reasons to worry that the master list of legit journals may not be up to the mark. It will likely remain work in progress for a very long time. It would certainly be too optimistic to hope that the master list would be updated at regular intervals.

What will the UGC’s master-list achieve?

The rationale behind preparing a master list of legit academic journals is to ensure that India’s academics do not publish in fake or substandard journals and benefit from doing so, whether to clinch entry-level positions or promotions. The question is whether this master list will achieve its objective of reining in such practices. It is necessary to recognise here that publishing in fake journals is not the only kind of research fraud taking place; the UGC master list will therefore not be adequate to address other kinds of academic malpractice.

Even before the master list of legit journals is prepared, however, the UGC itself seems to be undermining the rationale behind preparing such a list. First, the UGC may agree to the proposal that universities can recommend additional journals for inclusion in the master list (see Part I). By itself, this is not a bad thing. Despite good intentions, the committee involved with preparing the master list may exclude good-enough journals that are relevant to Indian scholars. At the same time, there are no guarantees that universities will not recommend substandard journals, or that the committee will not accept them.

Second, there are reports that the modified API will provide greater flexibility under the “research” category so that academics do not need to publish journal articles to score high points. As I argued in Part I, the revised API scheme, if implemented, will allow faculty to accumulate points even without journal publications. It is hard to understand the logic behind doing the tedious work of preparing a list of legitimate journals if the importance of publishing in those journals is substantially diminished.

The arrival of SCIgen

In addition to the above, there are two larger and significant issues pertaining to research fraud that the UGC master list will not be able to address. Research fraud has evolved and taken new forms in the 21st century. Plagiarism, fabricated research and other such practices have always existed in academia, in India as well as elsewhere. Now, however, new technologies have permitted a diversification of fraudulent research practices, especially over the past two decades or so.

One of the new forms of fraudulent research and publishing is the use of software to generate academic papers. In 2005, Jeremy Stribling, Dan Aguayo and Max Krohn at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) developed a computer science paper generator that could stitch together nonsense papers complete with impressive-looking graphs. The papers produced by their SCIgen software seemed genuine enough to be accepted at big conferences and by established journals. SCIgen has been used by scores of academics to publish journal articles and conference proceedings with reputed publishers such as Springer (Germany) and the Institute of Electrical and Electronics Engineers (IEEE, US). It took some chasing by Cyril Labbé, a French computer scientist at Joseph Fourier University, to bring down 120 SCIgen papers.
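
SCIgen works by recursively expanding a hand-written context-free grammar of computer-science phrases until only words remain. A toy version of the idea – with a tiny made-up grammar, not SCIgen's actual rules – can be sketched in a few lines:

```python
import random

# A tiny, made-up grammar in the spirit of SCIgen's approach:
# UPPERCASE symbols are non-terminals that expand recursively;
# everything else is emitted verbatim.
GRAMMAR = {
    "SENTENCE": [["We", "VERB", "that", "NOUN", "is", "ADJ", "."]],
    "VERB": [["demonstrate"], ["argue"], ["confirm"]],
    "NOUN": [["the lookaside buffer"], ["public-key cryptography"],
             ["the producer-consumer problem"]],
    "ADJ": [["impossible"], ["maximally efficient"], ["NP-complete"]],
}

def expand(symbol: str, rng: random.Random) -> str:
    """Recursively expand a grammar symbol into nonsense prose."""
    if symbol not in GRAMMAR:
        return symbol  # terminal: a literal word or phrase
    production = rng.choice(GRAMMAR[symbol])
    return " ".join(expand(s, rng) for s in production)

print(expand("SENTENCE", random.Random(0)))
```

Each run produces a grammatical but meaningless sentence of the form “We confirm that the lookaside buffer is NP-complete .” – which is also why such papers are mechanically detectable: the generator's fixed grammar leaves statistical fingerprints that tools like Labbé's can screen for.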

Incidentally, Labbé maintains a website where one can screen for SCIgen-created papers. The inventors of SCIgen have also developed SCIpher, which can detect SCIgen papers. These days, Springer uses SciDetect to fight fake papers.

It is not exactly known how widely SCIgen has been used. If India-based legitimate computer science journals have not been using anti-SCIgen programmes, it is very likely that many of our computer scientists have published SCIgen-generated papers in them.

Computer-generated papers are not limited to computer science and related disciplines. Les Perelman, a former director of undergraduate writing at MIT, and his students at Harvard and MIT developed the Basic Automatic B.S. Essay Language Generator, or Babel. Perelman was at war with essay-grading automatons, which are increasingly used to grade humanities and social science papers. The Babel Generator is primarily designed to fool the machines, but such programmes can perhaps create authentic-enough papers that dupe humans as well.

Fake peer reviews

The second new and popular method involves cheating through the peer-review process. When academics submit research articles for publication in journals, the articles are reviewed by a few experts with competence in that specific area of study. Though not without faults, this method has been in use for decades and has helped ensure that articles selected for publication are of good quality. But academics are under increasing pressure to publish: world university rankings use research output as a key measure, and universities in turn pressure their faculty to publish more, as do competitive research grants and promotions. Some academics have therefore started turning to new business firms devoted to profiting from ‘helping’ them publish in prestigious journals.

In August 2015, Springer retracted 64 published articles for false peer reviews. These articles had appeared in journals on neurobiology, cancer research, biochemistry and other scientific topics. It announced that editorial checks had “spotted fake email addresses, and subsequent internal investigations uncovered fabricated peer review reports.” Nearly all the articles were authored by China-based academics. In March of the same year, BioMed Central retracted 43 articles for false peer reviews. Again, most authors were China-based.

Part of the problem stems from the fact that many scientists, especially non-native English speakers, seek outside help, usually from third-party firms, to publish their papers. In many cases, these firms offer services that go beyond language polishing and may include the creation of false identities for peer reviews. Most publishers are now aware of the systematic and organised attempts to manipulate the peer review processes. As a result, the whole peer-review system is under greater scrutiny for a broad range of flaws and irregularities, ranging from lackadaisical reviewing to cronyism to outright fraud.

Research fraud will remain a problem

We have to accept that research fraud of one kind or another will continue to take place. Further, it is not just an Indian or a Chinese problem; Western academics also indulge in such practices. What is perhaps more important is that the concerned stakeholders, in particular governments, universities and publishers, take steps to minimise such practices. Journals have already started to address the problem seriously, since publishing fraudulent articles gives them a bad name. Reputed universities too tend to take action against erring academics, because research fraud hurts their image.

In many countries around the world, however, including India, even the most obvious and egregious kinds of research fraud are often ignored and/or go unpenalised even when the culprits are exposed, especially when those in office are involved. It does not help that in these countries, governments tend to maintain tight control over the higher education sector and dictate policies to academic institutions. For example, initiatives to check research fraud, such as preparing a master list of legit journals, come from the government. These initiatives usually fall short either because they are not eventually implemented or because, as in this case, techniques of research fraud have advanced further, making older measures dated and irrelevant.

Pushkar is an assistant professor at the Department of Humanities and Social Sciences, BITS Pilani-Goa.