Presentation on theme: "Principles for Quality Research and Quality Evidence"— Presentation transcript:
2 Principles for Quality Research and Quality Evidence
Portraits of a Scholar, from the 16th century to today
Artists: Domenico Feti; Ferdinand Bol; Rembrandt Harmenszoon van Rijn; JKoshi's photostream, Flickr
Ted Kreifels, Ph.D.
3 Overview
- Importance of good research
- Traits of quality research
- Standards and methods used to assess quality research and quality evidence
- Bad research practices
- Common causes of bias in data
- Methodological "potholes"
- How to trust information
- Errors in research
4 Introduction
Every businessman, scientist, engineer, technician, clinician, and manufacturer investigates, develops, or reveals useful knowledge (research). We each play important roles:
- Scientists, engineers, and analysts (create information)
- Librarians (manage information)
- Decision-makers (apply information)
- Jurists (judge information)
- Journalists (disseminate information)
- Other examples?
5 Our Motivation
We (typically) have a sincere desire and an interest in determining what is TRUE, based on the information and evidence we have available.
6 Motivation (continued)
Good research empowers us to reach our own conclusions.
Bad (distorted) research:
- Starts with a conclusion
- Presents only facts, usually taken out of context, that support the author's initial conclusion
Bad research should not be confused with propaganda. Propaganda is information that is intended to persuade and is sometimes misrepresented as objective research.
Bad research should not be confused with "bull****". Bull**** is a deliberate, manipulative misrepresentation that steers one away from the truth.
Bad research causes real harm and deserves strong censure.
7 Research versus Evidence
Quality research and quality evidence are related, but separate topics:
- Quality research pertains to the scientific process
- Quality evidence is the sum collection of research data, and pertains to the judgment regarding the strength and confidence one has in the findings emanating from the scientific process
8 Research Produces Evidence
Quality research is a precursor to quality evidence. The following factors influence the type and quality of the evidence produced:
- Design
- Questions
- Methods
- Coherence and consistency of findings
9 Quality Matters!
If scientific research lacks credibility, it is difficult to make confident, concrete assertions or predictions. Confidence comes from the robustness of the research and of the analysis done to synthesize results.
10 Traits of Quality Research
Quality research exhibits these traits:
- Credibility: objectivity, internal validity
- Transferability: external validity
- Dependability: construct validity, reliability
- Confirmability: objectivity, honest and thorough reporting
11 Traits of Quality Research (continued)
- Objectivity: Have I introduced any bias in the manner I collect or think about my data?
- Internal validity: Can changes in the outcome be attributed to alternative explanations that were not explored in the study?
- External validity: Do findings apply to participants/specimens whose places, times, and circumstances differ from those of other study participants/specimens?
- Construct validity: Does the research adequately measure key concepts?
- Reliability: Have we collected the data in a consistent manner?
- Honest and thorough reporting: "…the truth, the whole truth, and nothing but the truth…?"
12 Standards Used to Assess Quality of Scientifically-Based Research
- Pose a significant, important, well-defined question that can be investigated empirically and that contributes to the knowledge base
- Offer a description of the context and existing information about an issue
- Apply the methods that best address the question of interest
- Test questions that are linked to relevant theory and consider various perspectives
13 Standards Used to Assess Quality of Scientifically-Based Research (continued)
- Ensure an independent, balanced, and objective approach to the research, with clear inferential reasoning supported by complete coverage of the relevant literature
- Use appropriate and reliable conceptualization and measurement of variables
- Provide sufficient description of the samples and any comparison groups
- Ensure the study design, methods, and procedures are transparent and provide the necessary information to reproduce or replicate the study
14 Standards Used to Assess Quality of Scientifically-Based Research (continued)
- Present evidence, with data and analysis, in a format that others can reproduce or replicate
- Use adequate references, including original sources, alternative perspectives, and criticism
- Adhere to quality standards for reporting (i.e., clear, cogent, complete)
- Submit research to a peer-review process
15 Standards Used to Assess Quality of Scientifically-Based Research (continued)
- Evaluate alternative explanations for any findings; discuss critical assumptions, contrary findings, and alternative interpretations
- Assess the possible impact of systematic bias
- Use caution in reaching conclusions and implications
The more one aligns to these standards, the higher the quality. Following only a few of these principles is insufficient to assert quality.
16 Publishing
Publishing is an important benchmark, but the quality of research should not be judged solely by whether (or not) it is published in leading journals. Using bibliometric analysis (citation by other authors) as a measure of quality is also faulty:
- Not all research that is published in journals or cited by others is accurate, reliable, valid, free of bias, and non-fraudulent
- Bibliometric analysis is primarily a measure of quantity and can be artificially influenced by journals with high acceptance rates
17 Assessing Quality Research
In industry, one of the most respected means of assessing quality is to establish consensus among subject matter experts and systematic review. The same is true in academia. Strategies for reaching consensus in academia include position statements, conferences, the peer-review process, and systematic review.
What other differences (or similarities) exist between industry and academia?
18 Assessing Quality Research (continued)
Another form of reaching consensus is using standardized reporting techniques:
- Report essential information regarding samples, statistics, randomization, and analysis
- Publish detailed technical standards in relevant professional societies
What other techniques help us reach consensus?
19 Assessing Quality Research (continued)
Sandia National Laboratories exhibits traits of basic research, advanced development, industrial, and manufacturing organizations. We use a "layered defense", or layered strategy, for defect prevention:
- Bottom-up meets top-down in the middle (via reviews, gates, etc.)
- Triple-A teamwork: Assurance, Acceptance, Assessment
We do our best work when we work together to establish consensus during each step to achieve quality.
20 Bad Research Practices
- Defining issues in ideological terms, e.g., using exaggerated or extreme perspectives to characterize a debate
- Ignoring or suppressing alternative perspectives or contrary evidence
- Insulting or ridiculing others with differing views
Totally unacceptable … this reflects poorly on oneself and one's organization.
21 Bad Research Practices (continued)
- Designing research questions to reach particular conclusions
- Using faulty logic to reach conclusions
- Using biased data and analysis methods
- Ignoring limitations of analysis and exaggerating implications of results
22 Bad Research Practices (continued)
- Using unqualified researchers not familiar with specialized issues
- Not presenting details of key data and analysis for review by others
- Citing special interest groups or popular media, rather than peer-reviewed professional and academic organizations
23 Bad Research Practices (continued)
And, the MOST COMMON mistake: assuming association (events that occur together) proves causation (one event causes another).
Have I missed anything?
24 Example of a Methodological Pothole: Reference Units
OBSERVATIONS: Traffic fatality trends over four decades. When measured per capita, they show little decline. When measured per vehicle-mile, fatality rates declined significantly.
- Conclusion A: As measured per capita, various safety efforts have FAILED.
- Conclusion B: Conditions require more people to drive further, yet vehicle handling and safety have improved, so people feel safer while increasing risk (driving faster, leaving less distance between cars, etc.); various safety strategies (e.g., better roads, vehicles, laws) have PASSED.
There is no single right or wrong reference unit; different reference units reflect different perspectives and may affect analytical results.
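The reference-unit effect is just arithmetic with different denominators. This sketch uses made-up round figures (not real NHTSA statistics) to show how the same death counts look far better per vehicle-mile than per capita:

```python
# Illustrative made-up figures (not real data): annual traffic deaths,
# population, and vehicle-miles traveled (VMT) at two points in time.
deaths_1980, deaths_2020 = 45_000, 42_000
pop_1980,    pop_2020    = 220_000_000, 330_000_000
vmt_1980,    vmt_2020    = 1.5e12, 3.2e12            # vehicle-miles per year

per_capita_1980 = deaths_1980 / pop_1980 * 100_000   # deaths per 100k people
per_capita_2020 = deaths_2020 / pop_2020 * 100_000
per_vmt_1980 = deaths_1980 / vmt_1980 * 100_000_000  # deaths per 100M miles
per_vmt_2020 = deaths_2020 / vmt_2020 * 100_000_000

print(f"per capita:     {per_capita_1980:.1f} -> {per_capita_2020:.1f}")
print(f"per 100M miles: {per_vmt_1980:.2f} -> {per_vmt_2020:.2f}")
```

Because vehicle-miles grew much faster than population, the per-mile rate falls far more steeply than the per-capita rate: the choice of denominator, not the data, drives the difference between Conclusion A and Conclusion B.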
25 EXERCISE: Same Reference Units, Different Perspectives
OBSERVATIONS: Alcohol-impaired driving fatalities have decreased over 5 years throughout the United States. Alcohol-impaired driving fatalities have sharply increased in Kansas. Both are measured per vehicle-mile.
- What are your CONCLUSIONS?
- What further QUESTIONS would you ask?
Different quality researchers reflect different perspectives, knowledge, and experience.
26 Eight (of 60) Methodological Potholes*
- Range restriction effect. Problem: failure to vary independent variables over a sufficient range, so effects look small. Remedy: decide what range of a variable or what effect size is of interest; run a pilot study.
- Ceiling effect. Problem: a task so easy that the experimental manipulation shows little or no effect. Remedy: make the task more difficult; run a pilot study.
- Floor effect. Problem: a task so difficult that the experimental manipulation shows little or no effect. Remedy: make the task easier; run a pilot study.
- Sampling bias. Problem: any confound that causes the sample to be unrepresentative of the pertinent population. Remedy: use random sampling; if sub-groups are identifiable, use a stratified random sample; avoid "convenience" or haphazard sampling.
- History effect. Problem: any change between a pretest measure and a posttest measure not attributable to the experimental factors. Remedy: isolate subjects from external information; use post-experiment debriefing to identify possible confounds.
- Reactivity problem. Problem: the act of measuring something changes the measurement itself. Remedy: use clandestine measurement methods.
- Order effect. Problem: in a repeated-measures design, the effect that the order of introducing treatments has on the dependent variable. Remedy: randomize or counterbalance treatment order; use a between-subjects design.
- Hypocrisy. Problem: holding others to a higher methodological standard than oneself. Remedy: hold yourself to higher standards than others; apply self-criticism; follow your own advice.
* Sixty Methodological Potholes, David Huron, Ohio State University, 2000
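The stratified-random-sample remedy for sampling bias can be sketched in a few lines. This is a minimal illustration, assuming a population of records that each carry a group label; the 70/30 split and field names are invented for the example:

```python
# Minimal sketch of stratified random sampling, a remedy for sampling bias.
# The population below is synthetic: 70% in group "A", 30% in group "B".
import random

random.seed(42)
population = [{"id": i, "group": "A" if i % 10 < 7 else "B"} for i in range(1000)]

def stratified_sample(pop, key, n):
    """Draw ~n records, allocating to each stratum in proportion to its size."""
    strata = {}
    for rec in pop:
        strata.setdefault(rec[key], []).append(rec)
    sample = []
    for members in strata.values():
        k = round(n * len(members) / len(pop))   # proportional allocation
        sample.extend(random.sample(members, k))
    return sample

s = stratified_sample(population, "group", 100)
share_A = sum(r["group"] == "A" for r in s) / len(s)
print(f"sample size = {len(s)}, share of group A = {share_A:.2f}")
```

Unlike a convenience sample, the stratified sample reproduces the population's group proportions by construction, so identifiable sub-groups cannot be accidentally over- or under-represented.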
27 CARS: How to Trust Information—Especially from Media and the Internet*
A checklist for information quality:
- Credible: trustworthy source, author's credentials, evidence of quality control, known or respected authority, organizational support. Goal: a known, respected authority, a source of trusted evidence.
- Accurate: up to date, factual, detailed, exact, comprehensive; audience and purpose reflect intentions of completeness. Goal: a source that is correct today (not yesterday), a source that gives the whole truth.
- Reasonable: fair, balanced, objective, reasoned, no conflict of interest, absence of fallacies or slanted tone. Goal: a source that engages the subject thoughtfully and reasonably, concerned with the truth.
- Supported: listed sources, contact information, available corroboration, claims supported, documentation supplied. Goal: compelling evidence for the claims made, a source you can triangulate (i.e., find at least two other sources that support it).
* Evaluating Internet Research Sources, Robert Harris, November 2010
28 CARS Checklist: Credible, Accurate, Reasonable, Supported
- Credible: trustworthy source, quality evidence, quality control, known and respected authority, credentials, organizational support
- Accurate: current, factual, detailed, exact, comprehensive, the whole truth
- Reasonable: fair and balanced, objective, reasoned and thoughtful, no conflict of interest, no fallacies or slanted tone, seeks the truth
- Supported: listed sources, contact information, corroboration available, claims supported with evidence, documentation supplied, triangulated sources
29 EXERCISE: Flour Power, Research and Evidence
Challenge: Are a liquid cup and a dry cup the same measure?
I used the internet to research this question and draw a conclusion.
What percentage of internet sources answered Yes? No?
30 The "Ounce": Background Information
Unit of MASS (or weight):
- Abbreviated oz, from Latin "uncia"; the original Roman measure = 1/12 pound
- Troy ounce (still used for precious metals) = apothecary ounce = 1/12 lb
- Several definitions and standards exist for an "ounce": Maria Theresa, Spanish, metric
- The United States uses the avoirdupois ounce = 1/16 pound
- Mass variants (approximate equivalents in grams): avoirdupois 28.35; troy 31.10; apothecary 31.10; Maria Theresa 28.07; Spanish 28.75; Dutch metric 100; Chinese metric 50
Unit of VOLUME:
- Abbreviated fl oz, fl. oz., or oz. fl.
- Volume variants (equivalents in ml): US 30; Imperial 28
Other: fabric weight
- Expresses the areal density of a textile fabric in North America, Asia, and the UK
- The weight of a given amount of fabric: a square yard, or a yard of a given width
31 On Propaganda
Collected from several sources, including dictionaries, Wikipedia, and * Garth Jowett and Victoria O'Donnell, Propaganda and Persuasion, 4th ed., Sage Publications, p. 7
- Propaganda is information, ideas (or even rumors), and a form of communication intended to persuade and influence
- Propaganda often presents facts selectively to encourage a particular synthesis and an emotional, rather than rational, response
- "Propaganda is a deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist." *
- Originally and etymologically, the word "propaganda" is neutral
- Positive, benign, innocuous examples: public health recommendations, buying war bonds, reporting crimes to the police, getting out the vote
- Negative example: Nazi propaganda (used to justify the Holocaust), etc.
Be wary! Propaganda is sometimes misrepresented as objective research!
32 On Bull**** (a real book by philosopher Harry G. Frankfurt, Princeton University Press, 2005)
- Bull**** is a manipulative misrepresentation
- Bull**** is WORSE THAN A LIE (more dangerous) because it denies the value of truth
- In contrast, lying is concerned with the truth, in a perverse fashion: "A liar wants to lead us away from the truth."
- Truth tellers (researchers) and liars play opposite sides of the Game; bull****ters take pride in ignoring the rules of the Game altogether
- People sometimes try to justify their bull**** by citing relativism, a philosophy suggesting that objective truth does not exist: "There are no facts, only interpretations" (Nietzsche)
- Any issue can and should be viewed from multiple perspectives… but anyone who denies the value of truth and objective analysis is really bull****ting!
33 Special Acknowledgement
The following section regarding errors in research, and the workshop case studies, were taken from On Being a Scientist: Responsible Conduct in Research, 2nd Edition, produced by:
- The National Academy of Sciences (NAS)
- The National Academy of Engineering (NAE)
- The Institute of Medicine (IOM)
Printed by the National Academy Press, Washington, D.C., 1995
34 Errors in Research: 1st Category
The "honest error":
- Usually caught internally through informal and formal peer-review processes
- Dealt with internally through evaluations and appointments
35 Errors in Research: 2nd Category
Ethical transgressions:
- Gross negligence
- Misallocation of credit
- Cover-ups of misconduct
- Reprisals against whistle-blowers
- Malicious allegations
- Violations of due process
- Sexual and other forms of harassment
- Misuse of funds
- Tampering with experiments, instrumentation, or results
- Violations of government research regulations
These may be caught internally or externally in any number of ways, and are dealt with by administrative, legal, and professional penalties.
36 Misconduct: 3rd Category and Most Grave Error in Research
Deception:
- Making up data (fabrication)
- Changing or misreporting data or results (falsification)
- Using the ideas or words of another person without giving appropriate credit (plagiarism)
Deception strikes "at the heart" of the values of good research and may cause extreme consequences:
- Undermines progress and personal and institutional credibility
- Wastes time in related research
- Squanders public funds
- Threatens future funding and support
- Threatens public safety
Deception is dealt with using severe, career-ending penalties.
37 The Selection of Data
Deborah, a third-year graduate student, and Kathleen, a postdoc, have made a series of measurements on a new experimental semiconductor material using an expensive neutron source at a national laboratory. When they get back to their own lab and examine the data, they get the following data points. A newly proposed theory predicts results indicated by the curve. [Graph: response versus beam intensity, showing the measured data points and the theoretical curve.]
During the measurements at the national lab, Deborah and Kathleen observed power fluctuations they could not control or predict. Furthermore, they discussed their work with another group doing similar experiments, and they knew that the other group had gotten results confirming the theoretical prediction and was writing a manuscript describing their results.
In writing up their own results for publication, Kathleen suggests dropping the two anomalous data points near the abscissa (the solid squares) from the published graph and from the statistical analysis. She proposes that the existence of the data points be mentioned in the paper as possibly due to power fluctuations and as being outside the expected standard deviation calculated from the remaining data points. "These two runs," she argues to Deborah, "were obviously wrong."
- How should the data from the two suspected runs be handled?
- Should the data be included in tests of statistical significance, and why?
- What other sources of information, in addition to their faculty advisor, can Deborah and Kathleen use to help decide?
38 The Selection of Data: Prologue
Deborah and Kathleen's principal obligation, in writing up their results for publication, is to describe what they have done and give the basis for their actions. They must therefore examine how they can meet this obligation within the context of the experiment they have done.
Questions that need to be answered include:
- If the authors state in the paper that data have been rejected because of problems with the power supply, should the data points still be included in the published chart?
- Should statistical analyses be done that both include and exclude the questionable data?
- If conventions within their discipline allow for the use of statistical devices to eliminate outlying data points, how explicit do Deborah and Kathleen need to be in the published paper about the procedures they have followed?
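The "analyze both with and without the questionable data" option above can be made concrete. This sketch uses invented run values (not the case-study measurements) to show transparent reporting: summarize the data both ways and quantify how far each suspect point lies from the trimmed estimate, rather than silently dropping points:

```python
# Illustrative sketch with made-up run values: report statistics both with
# and without suspect points instead of silently discarding them.
import statistics

runs = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 4.1, 3.9]   # last two are suspect
suspect = runs[-2:]

def summarize(data):
    return statistics.mean(data), statistics.stdev(data)

mean_all,  sd_all  = summarize(runs)
mean_trim, sd_trim = summarize(runs[:-2])

print(f"all runs included:  mean={mean_all:.2f}, sd={sd_all:.2f}")
print(f"suspect runs excluded: mean={mean_trim:.2f}, sd={sd_trim:.2f}")
# Show how far each suspect point lies from the trimmed estimate:
for x in suspect:
    print(f"point {x}: {(x - mean_trim) / sd_trim:+.1f} trimmed standard deviations")
```

Publishing both summaries, plus the distance of each excluded point from the trimmed mean, lets readers judge the exclusion for themselves; this is one defensible practice, not the only one a discipline's conventions might allow.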
39 A Conflict of Interest
John, a third-year graduate student, is participating in a department-wide seminar where students, postdocs, and faculty members discuss work in progress. An assistant professor prefaces her comments by saying that the work she is about to discuss is sponsored by both a federal grant and a biotechnology firm for which she consults.
In the course of the talk, John realizes that he has been working on a technique that could make a major contribution to the work being discussed. But his faculty advisor consults for a different, and competing, biotechnology firm.
- How should John participate in this seminar?
- What, if anything, should he say to his advisor—and when?
- What implications does this case raise for the traditional openness and sharing of data, materials, and findings that have characterized modern science?
40 A Conflict of Interest: Prologue
Science thrives in an atmosphere of open communication. When communication is limited, progress is limited for everyone. John therefore needs to weigh the advantages of keeping quiet—if, in fact, there are any—against the damage that accrues to science if he keeps his suggestions to himself. He might also ask himself how keeping quiet might affect his own life in science.
Questions:
- Does John want to appear to his advisor and his peers as someone who is less than forthcoming with his ideas?
- Will he enjoy science as much if he purposefully limits communication with others?
41 Summary
- Why is good research important?
- What are the traits of quality research?
- Can you provide a few examples of standards and methods used to assess quality research and quality evidence?
- What are examples of bad research?
- What are a few common causes of bias in data and methodological errors?
- How does one trust information from the internet?
- What are the three categories of errors in research?
42 Bibliography
- Evaluating Research Quality: Guidelines for Scholarship. Todd Litman, Victoria Transport Policy Institute, November 28, 2010.
- What Are the Standards for Quality Research? Editor's Focus, Technical Brief Number 9, National Center for the Dissemination of Disability Research (NCDDR), 2005.
- Sixty Methodological Potholes. David Huron, Ohio State University, 2000.
- Evaluating Internet Research Sources. Robert Harris, Virtual Salt, November 22, 2010.
- On Being a Scientist: Responsible Conduct in Research, 2nd Ed. Bruce Alberts (President, NAS), Robert White (President, NAE), and Kenneth Shine (President, IOM); National Academy Press, 1995.