A prestigious cancer institute corrects dozens of articles and retracts others after a blogger flagged errors

The Dana-Farber Cancer Institute has requested the retraction of six studies and corrections to another 31 articles after a scathing critique drew attention to alleged errors that one blogger and biologist said range from sloppiness to “really serious concerns.”

The allegations – against top scientists at the prestigious Boston institute, which is a teaching affiliate of Harvard Medical School – put the institute at the center of a fierce debate over research misconduct, how to monitor scientific integrity and whether the organizational structure of academic science encourages shortcuts or deception.

The criticism also highlights how artificial intelligence is playing an increasing role in detecting sloppy or questionable science.

The allegations, which involve image duplications and manipulations in biomedical research, are similar to concerns raised last year against former Stanford University President Marc Tessier-Lavigne, who resigned following an investigation.

Sholto David, a biologist and blogger, drew attention to Dana-Farber by highlighting problems in a slew of studies by its top researchers.

In early January, David detailed duplications and potentially misleading image edits in dozens of articles produced primarily by Dana-Farber researchers. In a blog post, David wrote that research by the institute’s top scientists appears to be “hopelessly corrupt with flaws obvious from a cursory reading.”

Following the publication of David’s blog, Dr. Barrett Rollins, the institute’s research integrity officer, said in an emailed statement Wednesday that Dana-Farber scientists had requested that six manuscripts be withdrawn, that 31 manuscripts were in the process of being corrected and that one manuscript remained under review.

Rollins added that some of the articles highlighted by David had already come up in “rolling reviews” previously conducted by the institute.

“The presence of image discrepancies in an article is not evidence of an author’s intent to deceive,” Rollins said. “That conclusion can only be drawn after a careful, fact-based investigation that is an integral part of our response. In our experience, mistakes are often unintentional and do not rise to the level of misconduct.”

Ellen Berlin, communications director at Dana-Farber, wrote in an email that the allegations all involved pure or basic science, as opposed to studies leading to cancer drug approval.

“Cancer treatment is not affected in any way in the review of Dana-Farber’s research articles,” Berlin wrote.

David is one of many scientific sleuths who comb journal articles for errors or fabrications. He compared his hobby to playing a spot-the-difference game or completing a crossword puzzle.

“It’s a puzzle,” David said in an interview, adding that he enjoys looking at figures that show the results of common biological experiments, such as those involving cells, mice and Western blots, a laboratory method that identifies proteins.

“Of course I think it’s important that the science is good,” he said.

Scientific errors in published work have been a concern in the scientific community in recent years. Retraction Watch, a website that tracks retracted papers, has more than 46,000 papers in its database, with records of withdrawn work dating back to the 1970s. A 2016 Nature article states that more than a million biomedical articles are published every year.

The website PubPeer, which allows outside researchers to post critiques of research that has been peer-reviewed and published in journals, is a popular forum for scientists to flag problems. David said he has written more than 1,000 anonymous critiques on the website.

David said a trail of questionable science led him to Dana-Farber. Earlier, he had examined the work of a surgeon at Columbia University and found deficiencies in studies by the surgeon’s collaborators, a trail that eventually drew his attention to Dana-Farber’s leadership team.

David said he reviewed the leadership page of Dana-Farber’s website and checked the work of the top scientists and leaders.

He discovered a host of image errors. Many could be explained by sloppy copying and pasting or simple mix-ups, but others, in which images had been stretched or rotated, were more difficult to explain. Some of the errors had previously been identified by other users on PubPeer. David combined those earlier concerns with his own findings in a blog post focusing on the institute. The Harvard Crimson, a student newspaper, was the first to publish a news story about the allegations.

David said images of mice in one paper looked as if they had been digitally altered in ways that seemed intentional and could distort the paper’s takeaways.

“I don’t understand how that could happen as an accident,” David said.

Most of the errors are “less serious” and could have been accidents, he said. Still, David argued, such an abundance of errors points to a failure of the research and review process if no one catches them before publication.

“If you discover a duplication, that’s a symptom of a problem,” David said.

Elisabeth Bik, a scientist who investigates image manipulation and research misconduct, said David’s work was credible.

“The allegations he is making are exactly the same ones I would make. They are just right,” Bik said.

Bik, who has been doing this kind of detective work for about a decade, said she is often frustrated by the lack of response from academic institutions when she identifies errors. She said she was happy to see that Dana-Farber had responded and had already taken proactive steps to correct the scientific record.

“I am very pleasantly surprised that the institute is taking action. I hope they will continue to do this with the publishers,” said Bik. “I have reported many of these cases where nothing happened.”

Image manipulation has come under close scrutiny in scientific communities, especially after Stanford University’s Tessier-Lavigne resigned from his position as the university’s president following criticism of his earlier work in neuroscience.

Tessier-Lavigne said he himself was cleared of fraud or falsifying data, but an investigation found that members of his laboratory had improperly manipulated research data or engaged in “flawed scientific practices,” according to a report by a panel of outside researchers who evaluated the case.

The report said Tessier-Lavigne’s laboratory culture rewarded young scientists whose work produced favorable results and marginalized those whose work did not, a dynamic that could have led young scientists to manipulate results to curry favor.

Outside researchers said this kind of culture is not uncommon at top institutions, where ambitious professors can run sprawling labs with dozens of graduate students eager to please their superiors and who know that publishing a splashy paper could quickly advance their careers.

Some scientists have become increasingly concerned that limited career opportunities for young scientists and a problematic system for publishing scientific work have encouraged corner-cutting.

“There are a lot of incentives to produce mountains of research and publish it in these high-impact journals to make a name for themselves,” said Dr. Ferric Fang, a microbiologist and professor at the University of Washington. “We encourage this kind of behavior.”

Problems with images published in research are widespread.

In a 2016 paper published in a journal of the American Society for Microbiology, Bik and Fang evaluated images from more than 20,600 articles in 40 biomedical journals from 1995 to 2014. They found that about 3.8% of the articles contained “problematic figures” and that at least half of those showed features “suggestive of deliberate manipulation.”

New tools are helping both institutions and investigators root out errors and possible misconduct. David used a program called ImageTwin to identify some of the questionable figures in papers by Dana-Farber researchers.

The AI-powered software can scan a study, analyze its images and compare them to one another, as well as to about 50 million scientific images in its database, in about 15 seconds, said Patrick Starke, co-founder of ImageTwin.

The software has been commercially available since 2021. Starke, based in Vienna, said a few hundred academic organizations use the tool to identify problems before publication.

“It’s great if it gets picked up and retracted, and it’s even better if it doesn’t get published,” said Starke, who envisions the program becoming as common in academia as the plagiarism-detection tools that analyze text.

But Starke said it will be a challenge to stay ahead of those who cut corners or cheat. Studies have already shown that AI programs can generate realistic-looking figures for common experiments such as Western blots, he said. His company is developing tools to look for AI-generated patterns in research images.

“If AI can make photos of faces look realistic, this is probably already happening in the scientific literature,” said Bik. “That’s the next level of deception. I’m not sure we’re even ready for that.”

This article was originally published on NBCNews.com