
Principles for Quality Research and Quality Evidence



Presentation Transcript


  1. Portraits of a Scholar, from the 16th Century... ...to today Artist: Domenico Feti Artist: Ferdinand Bol Artist: Rembrandt Harmenszoon van Rijn Artist: JKoshi’s photostream, flickr Principles for Quality Research and Quality Evidence Ted Kreifels, Ph.D.

  2. Overview • Importance of good research • Traits of quality research • Standards and methods used to assess quality research and quality evidence • Bad research practices • Common causes of bias in data • Methodological “potholes” • How to trust information • Errors in research

  3. Introduction • Every businessman, scientist, engineer, technician, clinician, and manufacturer investigates, develops, or reveals useful knowledge (research) • We each play important roles: • Scientists, engineers, and analysts (create information) • Librarians (manage information) • Decision-makers (apply information) • Jurists (judge information) • Journalists (disseminate information) • Other examples?

  4. Our Motivation We (typically) have a sincere desire and an interest in determining what is TRUE based on the information and evidence we have available

  5. Motivation (continued) • Good research empowers us to reach our own conclusions • Bad (distorted) research • Starts with a conclusion • Presents only facts, usually taken out of context, that support the author’s initial conclusion • Bad research should not be confused with propaganda • Propaganda is information that is intended to persuade and is sometimes misrepresented as objective research • Bad research should not be confused with “bull****” • Bull**** is a deliberate, manipulative misrepresentation that steers one away from the truth Bad research causes real harm and deserves strong censure

  6. Research versus Evidence Quality Research and Quality Evidence are related, but separate topics Quality Research pertains to the scientific process Quality Evidence is the sum collection of research data, and pertains to the judgment regarding the strength and confidence one has in the findings emanating from the scientific process

  7. Research produces Evidence • Quality research is a precursor to quality evidence • The following factors influence the type and quality of evidence produced • Design • Questions • Methods • Coherence and consistency of findings

  8. Quality Matters! If scientific research lacks credibility, it is difficult to make confident, concrete assertions or predictions Confidence comes from the robustness of the research and of the analysis done to synthesize results

  9. Traits of Quality Research

  10. Traits of Quality Research (continued) • Objectivity: Have I introduced any bias in the manner I collect or think about my data? • Internal Validity: Can changes in the outcome be attributed to alternative explanations that were not explored in the study? • External Validity: Do findings apply to participants/specimens whose place, times, and circumstances differ from those of the study participants/specimens? • Construct Validity: Does the research adequately measure key concepts? • Reliability: Have we collected the data in a consistent manner? • Honest and Thorough Reporting: “…the truth, the whole truth, and nothing but the truth…?”

  11. Standards Used to Assess Quality of Scientifically-Based Research Pose a significant, important, well-defined question that can be investigated empirically and that contributes to the knowledge base Offer a description of the context and existing information about an issue Apply methods that best address the question of interest Test questions that are linked to relevant theory and that consider various perspectives

  12. Standards Used to Assess Quality of Scientifically-Based Research (continued) Ensure an independent, balanced, and objective approach to the research, with clear inferential reasoning supported by complete coverage of the relevant literature Use appropriate and reliable conceptualization and measurement of variables Provide sufficient description of the samples and any comparison groups Ensure the study design, methods, and procedures are transparent and provide the information necessary to reproduce or replicate the study

  13. Standards Used to Assess Quality of Scientifically-Based Research (continued) Present evidence, with data and analysis in a format that others can reproduce or replicate Use adequate references, including original sources, alternative perspectives, and criticism Adhere to quality standards for reporting (i.e., clear, cogent, complete) Submit research to a peer-review process

  14. Standards Used to Assess Quality of Scientifically-Based Research (continued) The more one aligns with these standards, the higher the quality Following only a few of these principles is insufficient to assert quality Evaluate alternative explanations for any findings; discuss critical assumptions, contrary findings, and alternative interpretations Assess the possible impact of systematic bias Use caution when drawing conclusions and implications

  15. Publishing • Publishing is an important benchmark, but the quality of research should not be judged solely by whether (or not) it is published in leading journals • Using bibliometric analysis (citation by other authors) as a measure of quality is also faulty • Research that is published in journals or cited by others is NOT necessarily accurate, reliable, valid, free of bias, or non-fraudulent • Bibliometric analysis is primarily a measure of quantity and can be artificially influenced by journals with high acceptance rates

  16. Assessing Quality Research • In industry, one of the most respected means of assessing quality is establishing consensus among subject matter experts and conducting systematic review • The same is true in academia • Strategies for reaching consensus in academia include position statements, conferences, the peer review process, and systematic review • What other differences (or similarities) exist between industry and academia?

  17. Assessing Quality Research (continued) • Another form of reaching consensus is by using standardized reporting techniques • Report essential information regarding samples, statistics, randomization, and analysis • Publish detailed technical standards in relevant professional societies • What other techniques help us reach consensus?

  18. Assessing Quality Research (continued) • Sandia National Laboratories exhibits traits of basic research, advanced development, industry, and manufacturing • We use a “layered defense” or layered strategy for defect prevention • Bottom-Up meets Top-Down in the middle (via Reviews, Gates, etc.) • Triple-A Teamwork: Assurance, Acceptance, Assessment • We do our best work when we work together to establish consensus during each step to achieve quality

  19. Bad Research Practices • Defining issues in ideological terms • i.e., using exaggerated or extreme perspectives to characterize a debate • Ignoring/suppressing alternative perspectives or contrary evidence • Insulting/ridiculing others with differing views • Totally unacceptable…reflects poorly on oneself and one’s organization

  20. Bad Research Practices (continued) Designing research questions to reach particular conclusions Using faulty logic to reach conclusions Using biased data and analysis methods Ignoring limitations of analysis and exaggerating implications of results

  21. Bad Research Practices (continued) Using unqualified researchers not familiar with specialized issues Not presenting details of key data and analysis for review by others Citing special interest groups or popular media, rather than peer-reviewed professional and academic organizations

  22. Bad Research Practices (continued) And, the MOST COMMON mistake: assuming association (events that occur together)… proves causation (one event causes another), as the sketch below illustrates. Have I missed anything?
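A minimal sketch in Python (hypothetical numbers, not from the slides) of why this is a trap: two variables driven by a common confounder can correlate strongly even though neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical confounder (e.g., summer temperature) driving both variables.
confounder = rng.normal(size=1000)

# Neither variable causes the other; both merely respond to the confounder.
ice_cream_sales = 2.0 * confounder + rng.normal(scale=0.5, size=1000)
drownings = 1.5 * confounder + rng.normal(scale=0.5, size=1000)

r = np.corrcoef(ice_cream_sales, drownings)[0, 1]
print(f"correlation = {r:.2f}")  # strong association, zero causation
```

Dropping the confounder from the analysis leaves a strong association that invites exactly the causal misreading described above.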

  23. Example of a Methodological Pothole: Reference Units • OBSERVATIONS • Traffic fatality trends over four decades • When measured per capita, they show little decline • When measured per vehicle-mile, fatality rates declined significantly Conclusion A: As measured per capita, various safety efforts have FAILED Conclusion B: Conditions require more people to drive further, yet vehicle handling and safety have improved, so people feel safer while increasing risk (driving faster, leaving less distance between cars, etc.)—various safety strategies (e.g., better roads, vehicles, laws) have PASSED No single right or wrong reference unit—different reference units reflect different perspectives and may affect analytical results (see the sketch below)
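A minimal sketch of how the choice of reference unit reshapes the same underlying counts; the figures below are invented for illustration, not real traffic statistics.

```python
# Invented decade figures for illustration only -- not real traffic data.
decades       = ["1970s", "1980s", "1990s", "2000s"]
fatalities    = [52_000, 51_000, 47_000, 43_000]   # deaths per year
population    = [205e6, 228e6, 250e6, 282e6]       # people
vehicle_miles = [1.1e12, 1.5e12, 2.1e12, 2.9e12]   # miles driven per year

for d, f, p, vm in zip(decades, fatalities, population, vehicle_miles):
    per_capita = f / p * 1e5   # deaths per 100,000 people
    per_mile   = f / vm * 1e8  # deaths per 100 million vehicle-miles
    print(f"{d}: {per_capita:4.1f} per 100k people, {per_mile:4.2f} per 100M miles")
```

With these inputs the per-capita rate falls far more slowly than the per-mile rate: same counts, different story.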

  24. EXERCISE: Same Reference Units, Different Perspectives • OBSERVATIONS • Alcohol-impaired driving fatalities have decreased over 5 years throughout the United States • Alcohol-impaired driving fatalities have sharply increased in Kansas • Both are measured per vehicle-mile What are your CONCLUSIONS? What further QUESTIONS would you ask? Different researchers reflect different perspectives, knowledge, and experience

  25. 8 (of 60) Methodological Potholes * Sixty Methodological Potholes, David Huron, Ohio State University, 2000

  26. CARS: How to Trust Information—Especially from Media and the Internet * Evaluating Internet Research Sources, Robert Harris, November 2010

  27. CARS Checklist • Credible • Trustworthy source • Quality evidence • Quality control • Known, respected authority • Credentials • Organizational support • Accurate • Current • Factual • Detailed • Exact • Comprehensive • Whole truth • Reasonable • Fair and balanced • Objective • Reasoned and thoughtful • No conflict of interest • No fallacies or slanted tone • Seeks the truth • Supported • Listed sources • Contact information • Corroboration available • Claims supported w/evidence • Documentation supplied • Triangulated sources

  28. EXERCISE: Flour Power Research and Evidence Challenge: Are a liquid cup and a dry cup the same measure? I used the internet to research this question and draw a conclusion What percentage of internet sources answered: Yes/No?

  29. The “Ounce”: Background Information • Unit of MASS (or weight) • Abbreviated oz, from the Latin “uncia” • Original Roman measure = 1/12 pound • Troy ounce (still used for precious metals) = apothecary ounce = 1/12 lb • Several definitions and standards for an “ounce”: Maria Theresa, Spanish, metric • The United States uses the avoirdupois ounce = 1/16 pound • Unit of VOLUME • Abbreviated fl oz, fl. oz., or oz. fl. • Other: fabric weight • Expresses the areal density of a textile fabric in North America, Asia, and the UK • The weight of a given amount of fabric (a square yard, or a yard of a given width)
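A quick arithmetic check in Python tying this back to the Flour Power exercise; the conversion factors are standard definitions, and the flour weight is an approximate baking convention.

```python
# Three different "ounces" -- two of mass, one of volume.
AVOIRDUPOIS_OZ_G = 28.349523125   # US customary ounce of MASS, 1/16 lb
TROY_OZ_G        = 31.1034768     # troy ounce (precious metals), 1/12 troy lb
US_FLUID_OZ_ML   = 29.5735295625  # US fluid ounce, a unit of VOLUME

print(f"1 troy oz = {TROY_OZ_G / AVOIRDUPOIS_OZ_G:.3f} avoirdupois oz")
print(f"1 US cup  = {8 * US_FLUID_OZ_ML:.1f} mL (8 fl oz of volume)")

# A liquid cup and a dry cup measure the same VOLUME (~237 mL), but a cup
# of water weighs ~237 g while a cup of flour weighs only ~120-130 g --
# which is why internet answers to the exercise disagree.
```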

  30. On Propaganda: Collected from several sources including dictionaries, Wikipedia, and *Garth Jowett and Victoria O'Donnell, Propaganda and Persuasion, 4th ed., Sage Publications, p. 7 • Propaganda is information, ideas (or even rumors) and a form of communication intended to persuade and influence • Propaganda often presents facts selectively to encourage a particular synthesis and an emotional, rather than rational, response • “Propaganda is a deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist.” * • Etymologically, the word “propaganda” is neutral • Positive, benign, innocuous examples: public health recommendations, buying war bonds, reporting crimes to the police, getting out the vote • Negative example: Nazi propaganda (used to justify the Holocaust), etc. Be wary! Propaganda is sometimes misrepresented as objective research!

  31. On Bull**** (a real book) by philosopher Harry G. Frankfurt (Princeton University Press, 2005) • Bull**** is a manipulative misrepresentation • Bull**** is WORSE THAN A LIE (more dangerous) because it denies the value of truth • In contrast, lying is concerned with the truth in a perverse fashion: “A liar wants to lead us away from the truth.” • Truth tellers (researchers) and liars play opposite sides of the Game • Bull****ters take pride in ignoring the rules of the Game altogether • People sometimes try to justify their bull**** by citing relativism, a philosophy that suggests that objective truth does not exist: “There are no facts, only interpretations” (Nietzsche) • Any issue can and should be viewed from multiple perspectives…but anyone who denies the value of truth and objective analysis is really bull****ting!

  32. Special Acknowledgement The following section regarding errors in research and the workshop case studies were taken from On Being a Scientist: Responsible Conduct in Research, 2nd Edition, produced by: - The National Academy of Sciences (NAS) - National Academy of Engineering (NAE) - Institute of Medicine (IOM) Printed by the National Academy Press, Washington, D.C., 1995

  33. Errors in Research: 1st Category The “Honest Error” Usually caught internally through informal and formal peer review processes Dealt with internally through evaluations and appointments

  34. Errors in Research: 2nd Category • Ethical transgressions • Gross negligence • Misallocation of credit • Cover-ups of misconduct • Reprisals against whistleblowers • Malicious allegations • Violations of due process • Sexual and other forms of harassment • Misuse of funds • Tampering with experiments, instrumentation, results • Violations of government research regulations • May be caught internally or externally any number of ways • Dealt with by administrative, legal, and professional penalties

  35. Misconduct: 3rd Category and Most Grave Error in Research • Deception • Making up data (fabrication) • Changing or misreporting data or results (falsification) • Using the ideas or words of another person without giving appropriate credit (plagiarism) • Deception strikes “at the heart” of values in good research • Deception may cause extreme consequences • Undermines progress, personal and institutional credibility • Loss of time in related research • Squanders public funds • Threatens future funding and support • Threatens public safety • Deception is dealt with using severe, career-ending penalties

  36. The Selection of Data Deborah, a third-year graduate student, and Kathleen, a postdoc, have made a series of measurements on a new experimental semiconductor material using an expensive neutron source at a national laboratory. When they get back to their own lab and examine the data, they get the following data points; a newly proposed theory predicts results indicated by the curve. [Figure: plot of Response versus Beam Intensity, showing the measured points and the predicted curve] During the measurements at the national lab, Deborah and Kathleen observed that there were power fluctuations they could not control or predict. Furthermore, they discussed their work with another group doing similar experiments, and they knew that the other group had gotten results confirming the theoretical prediction and was writing a manuscript describing their results. In writing up their own results for publication, Kathleen suggests dropping the two anomalous data points near the abscissa (the solid squares) from the published graph and from a statistical analysis. She proposes that the existence of the data points be mentioned in the paper as possibly due to power fluctuations and being outside the expected standard deviation calculated from the remaining data points. “These two runs,” she argues to Deborah, “were obviously wrong.” How should the data from the two suspected runs be handled? Should the data be included in tests of statistical significance, and why? What other sources of information, in addition to their faculty advisor, can Deborah and Kathleen use to help decide?

  37. The Selection of Data: Prologue Deborah and Kathleen’s principal obligation, in writing up their results for publication, is to describe what they have done and give the basis for their actions. They must therefore examine how they can meet this obligation within the context of the experiment they have done. Questions that need to be answered include: If the authors state in the paper that data have been rejected because of problems with the power supply, should the data points still be included in the published chart? Should statistical analyses be done that both include and exclude the questionable data? If conventions within their discipline allow for the use of statistical devices to eliminate outlying data points, how explicit do Deborah and Kathleen need to be in the published paper about the procedures they have followed?
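A minimal sketch in Python of the dual analysis the prologue points toward; the numbers are invented stand-ins, since the transcript does not reproduce the actual data points.

```python
import numpy as np

# Invented stand-in data: six runs following a roughly linear trend, plus
# two anomalous runs near the abscissa (the "solid squares" in the case).
beam_intensity = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 2.5, 4.5])
response       = np.array([2.1, 4.0, 6.2, 7.9, 10.1, 12.0, 0.3, 0.5])
suspect        = np.array([False] * 6 + [True] * 2)

def fit_report(x, y, label):
    slope, intercept = np.polyfit(x, y, 1)        # least-squares line
    resid_sd = np.std(y - (slope * x + intercept))
    print(f"{label}: slope = {slope:.2f}, residual sd = {resid_sd:.2f}")

# Report BOTH analyses rather than silently dropping the questionable runs.
fit_report(beam_intensity, response, "all points       ")
fit_report(beam_intensity[~suspect], response[~suspect], "suspects excluded")
```

Publishing both results, with the anomalous points still shown on the chart and the power-supply problem disclosed, lets readers judge the exclusion for themselves.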

  38. A Conflict of Interest John, a third-year graduate student, is participating in a department-wide seminar where students, postdocs, and faculty members discuss work in progress. An assistant professor prefaces her comments by saying that the work she is about to discuss is sponsored by both a federal grant and a biotechnology firm for which she consults. In the course of the talk, John realizes that he has been working on a technique that could make a major contribution to the work being discussed. But his faculty advisor consults for a different, and competing, biotechnology firm. How should John participate in this seminar? What, if anything, should he say to his advisor—and when? What implications does this case raise for the traditional openness and sharing of data, materials, and findings that have characterized modern science?

  39. A Conflict of Interest: Prologue Science thrives in an atmosphere of open communication. When communication is limited, progress is limited for everyone. John therefore needs to weigh the advantages of keeping quiet—if, in fact, there are any—against the damage that accrues to science if he keeps his suggestions to himself. He might also ask himself how keeping quiet might affect his own life in science. Questions: Does John want to appear to his advisor and his peers as someone who is less than forthcoming with his ideas? Will he enjoy science as much if he purposefully limits communication with others?

  40. Summary Why is good research important? What are the traits of quality research? Can you provide a few examples of standards and methods used to assess quality research and quality evidence? What are examples of bad research? What are a few common causes of bias in data and methodological errors? How does one trust information from the internet? What are the three categories of errors in research?

  41. Bibliography • Evaluating Research Quality: Guidelines for Scholarship • Todd Litman, Victoria Transport Policy Institute, November 28, 2010 • What are the Standards for Quality Research? • Editor’s Focus, Technical Brief Number 9, National Center for Dissemination of Disability Research (NCDDR), 2005 • Sixty Methodological Potholes • David Huron, Ohio State University, 2000 • Evaluating Internet Research Sources • Robert Harris, Virtual Salt, November 22, 2010 • On Being a Scientist: Responsible Conduct in Research, 2nd Ed. • Bruce Alberts, President, The National Academy of Sciences (NAS), 2005 • Kenneth Shine, President, National Academy of Engineering (NAE), 2005 • Robert White, President, Institute of Medicine (IOM), 2005
