
Increasing the Reliability of Preclinical Research


Presentation Transcript


  1. Increasing the Reliability of Preclinical Research
     4:45 PM - 5:00 PM Does Our System Impair Honest Reporting of Data?: Lee M. Ellis, MD
     5:00 PM - 5:15 PM The Experience of Industry in Trying to Validate Data from Academic Labs: C. Glenn Begley, PhD, MBBS
     5:15 PM - 5:30 PM The Role of Journals in Confirming the Validity of Published Data: Veronique Kiermer, PhD
     5:30 PM - 5:45 PM As a Phase I Investigator, How Do I Identify the Most Reliable Drugs for Study?: Lillian L. Siu, MD
     5:45 PM - 6:00 PM Panel Discussion / Q&A (e-Q&A)

  2. Does Our System Impair Honest Reporting of Data? Lee M. Ellis, M.D. Department of Surgical Oncology UT MD Anderson Cancer Center Houston, Texas, USA Vice Chair, SWOG, Translational Medicine

  3. Does Our System Impair Honest Reporting of Data? • The issue at hand • Why does this occur? • What can we do to fix this?

  4. The Spectrum of Reporting Preclinical and Clinical Data: Honest → Sloppy → Selective Reporting → Data Fabrication

  5. Recent Publications/Stories Have Prompted Discussions on Data Reproducibility and Ethics in Research (Glenn Begley to discuss data)

  6. Retractions Have Gone Up, and Most Retractions Are Due to Fraud (PNAS, 2012)

  7. As R01 Funding Falls, Retractions Go Up http://retractionwatch.files.wordpress.com/2013/03/funding_retractions.jpg

  8. Retractionwatch.com

  9. Does Our System Impair Honest Reporting of Data? • The issue at hand • Why does this occur? • What can we do to fix this?

  10. Causes of “Massaging” of Data

  11. An IRB-Approved Survey Conducted at The MD Anderson Cancer Center • 240 responses in 6 hrs • 311 responses after 3 days • IRB-approved protocol • PI: Len Zwelling, MD • Co-PI: Lee Ellis

  12. Have You Ever Tried to Reproduce a Finding from a Published Paper and Not Been Able to Do So? Mobley et al., PLoS ONE 2013

  13. (Faculty Only) As a Mentor, Are You More Interested in a Candidate for a Position in Your Lab with 4 Cancer Research Papers vs. 1 Cancer Cell Paper? N = 150. Mobley et al., PLoS ONE 2013

  14. Driving Forces for Irreproducible Data (>90 respondents, trainees only) • Were you ever pressured to publish findings about which you had doubts? 22% • Have you noted pressure from a mentor to prove that his/her hypothesis was correct, even though the data you generated may not support the hypothesis? 31% • Are you aware of mentors who require a high-impact publication before a trainee can leave the lab? 49% Mobley et al., PLoS ONE 2013

  15. How Do You Get to Publish in a High-Impact Journal? • Smart • Lucky • Great collaborators • Patience (more on that later) • And the ability to tell a “perfect story”

  16. The Need to Tell the “Perfect Story” • High-impact journals, and their reviewers, accept only the “perfect story” • Unfortunately, biology is not linear and never presents a perfect story • The need to tell the perfect story creates the temptation to “massage” data, especially for revisions • If you get through the first round of reviews, the trainee is then handed a list of experiments to do • There is a temptation to provide exactly the data that is needed, since “we are oh so close” • We like to provide “perfect data” on revisions to shorten a potentially prolonged review process

  17. Selected Comments From the Survey • “Crumbling of integrity and values; bean counters judging science by journal names; institutional failure in dealing with alleged fraud.” • “Everything here in the US is screwed up. There is nothing to do other than move out. … Those who publish more get respect, while others who are honest and cast doubt on their own results (or third-party results) are condemned. There is no way out. It is either join the 'bright team' or be labeled as incompetent.” • “… my previous mentor and also our current neighbor lab’s PI push too hard to produce the best data all the time. … Sometimes it makes trainees consider manipulating data just to escape the stress. Many international trainees (postdocs) also have visa issues; when the PI starts pushing on the visa issue, trainees feel a lot of stress, and eventually it makes them do whatever the PI wants.” • “From my experience, no one will help you if you stand up for what is right. … The system is unfortunately broken. …” • “Pressure comes … from the job market and funding dynamics. The impact-factor insanity is destroying science. A small group of powerful editors and friends control everything.”

  18. Selective Reporting of Laboratory Studies • Journals prioritize “positive” results • If a drug works in 2 cell lines and does not work in the other 8, we only see the results for the 2 cell lines • Students, post-docs, and faculty need publications for advancement • “Publish or perish” • In many labs, 2 trainees work on the same project, competing with each other … guess who wins? • Therefore, we tend to report only the “positive” data and ignore the negative data

  19. Does Our System Impair Honest Reporting of Data? • The issue at hand • Why does this occur? • What can we do to fix this?

  20. Thanks to Ray Petryshyn, PhD. Slides presented by H. Varmus at the AACR and Board of Scientific Advisors meetings. http://deainfo.nci.nih.gov/advisory/bsa/bsa0313/index.htm

  21. A Sampling of ORI Actions, 2013

  22. Most Common ORI Actions • Retract paper(s) • Have research supervised for 3 yrs • No service on committees for 2-3 yrs • Most can still receive NIH funding • For those found guilty of fraud, we must have a punishment that fits the crime • What is the deterrent for such behavior? • Indeed, the entire system needs an overhaul, but let’s start by making outright fraud something that is deterred by tough punishment and that prohibits the offender from ever having the chance to do this again • This is, of course, even more important for clinical fraud

  23. The penalties for committing scientific fraud in research that may impact the lives of patients, trainees, and faculty, and that is supported by taxpayer money or philanthropy, are LESS than those for a student who cheats on a test at UVA! “Each student is charged with the responsibility to refrain from dishonorable conduct. Accompanying this individual commitment to abide by the Honor System is an even more demanding commitment, a responsibility to ask those who violate our standard of honor to leave the University. Accepting these responsibilities is vital to the successful maintenance of our student-run Honor System.”

  24. Academic Pressures and Fraud: Solutions to Consider • Present negative data • Editors must require that all data be presented, including negative data • Authors must sign off • Link articles to reports of validation or alternative findings • Protect whistleblowers • Continue to nurture physician-scientists so they are able to interpret questionable preclinical data • Credit/reward great teachers and mentors … it is not just about papers and grants • Increase penalties for fraud (Begley & Ellis, Nature 2012)

  25. Thank you for your attention!
