JS Mrunalini Lecturer RAKMHSU Data Collection Considerations: Validity, Reliability, Generalizability, and Ethics.



Reliability and Validity Before using a measuring instrument it is important to be assured that it has acceptable levels of reliability and validity.

Validity How we know that the data we collect (test scores, for example) accurately gauge what we are trying to measure. The degree to which scientific observations actually measure or record what they purport to measure.

Types of Validity Criterion Content Construct

Criterion Validity Using another measuring instrument as a criterion to check the validity of the new one.

Content Validity The content of the instrument needs to be relevant to the concept of what is being measured. For instance, an instrument designed to rate depression that asked questions not relevant to depression would have problems with content validity.

Construct Validity Describes the extent to which an instrument measures a theoretical construct. This is the most difficult validity to establish. Basically this approach ties the instrument in with a theoretical perspective.

Validity Linked to numerically based research. Convinces the researcher and the research audience that the results of the research are accurate and can withstand scrutiny from other researchers. Qualitative research – ▫Trustworthiness (Kincheloe) ▫Understanding (Wolcott)

Validity Guba Trustworthiness is established through credibility, transferability, dependability, and confirmability Credibility ▫Researcher’s ability to take into account all of the complexities that present themselves in a study and to deal with patterns that are not easily explained

Validity Transferability ▫Qualitative researchers’ beliefs that everything they study is context bound and that the goal of their work is not to develop “truth” statements that can be generalized to larger groups. ▫Depends on whether the consumer of the research can identify with the setting

Validity Dependability ▫The stability of the data  Triangulation of data: two or more methods  Audit trail: examine the interpretive accounts that are grounded in the language of the people studied and in their own words

Validity Theoretical validity ▫The ability of the research report to explain the phenomenon that has been studied and described Generalizability ▫Within the community that has been studied (internal) ▫To settings that were not studied by the researcher (external)

Validity Evaluative validity ▫Whether the researcher was objective enough to report the data in an unbiased way

Validity In Action Research Action researchers need a system for judging the quality of their inquiries that is specifically tailored to their classroom-based research projects Democratic validity requires that multiple perspectives of all participants have been accurately represented

Validity In Action Research Outcome validity requires that the action emerging from the study leads to the successful resolution of the problem ▫Your study is valid if you learn something that can be applied to subsequent research Process validity requires that a study has been conducted in a dependable and competent manner

Validity In Action Research Catalytic validity requires that the participants are moved to take action on the basis of their heightened understanding of the subject of the study Dialogic validity involves having a critical conversation with others about your research findings and practices

Establishing Validity Talk little, listen a lot Record observations accurately Begin writing early Let readers see for themselves Report fully Be candid Seek feedback Write accurately

Reliability The consistency with which our data measure what we are attempting to measure over time. Getting the same results over time. As you think about the results of your inquiry, consider whether your data would be consistently collected if the same technique were used over time.

Reliability The extent to which a measure reveals actual differences in what is being measured. It is possible for a measuring instrument to show variance that has nothing to do with what it is actually supposed to be measuring. You could construct an IQ test that produces different scores but the difference in the scores is actually caused by the way the test is constructed rather than differences in the IQ of persons taking the test.

Sources of Error Sources of error in an instrument can be caused by: ▫unclear or vague definitions ▫retrospective information, i.e., information that is gathered by recall or recollection ▫variations in collection conditions ▫the structure of the instrument

Testing Reliability There are four basic methods for testing the reliability of an instrument. ▫Test-retest ▫Alternate form ▫Split half ▫Observer reliability

Test-Retest Repeated administration of the instrument to the same people on separate occasions. This is done to test the reliability of the instrument - not during the actual research project that the instrument is going to be used in.

Alternate Form Variations of the form, alternates, are used on the same individuals and then compared.

Split Half Items in the instrument are divided into comparable segments in such a way that the different segments should have comparable scores. This type of test for reliability is used to screen the internal reliability, or consistency, of the instrument.

Observer Reliability Compares the administration of an instrument performed by different administrators who are trained to use the same protocol.
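One widely used statistic for comparing two administrators' ratings is Cohen's kappa, which corrects raw agreement for agreement expected by chance. The function and ratings below are a minimal illustration, not part of the lecture.

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' category labels."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Proportion of cases where the raters assigned the same label
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's label frequencies
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Hypothetical yes/no judgments by two trained observers on 8 cases
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(round(cohens_kappa(a, b), 2))  # → 0.5
```

Here the raters agree on 6 of 8 cases (75%), but kappa is only 0.5 because half of that agreement would be expected by chance alone.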

Correlation Coefficient

A statistic that measures the extent to which two sets of measurements are similar. Used for testing the reliability of a survey, instrument, or test. For reliability purposes, correlation coefficients range from 0.0 to 1.0.

Interpreting the Correlation Coefficient The correlation coefficient can range from 0.0 to 1.0. A coefficient of 1.0 is a perfect correlation and is rarely, if ever, achieved. Usually a coefficient of .80 or better suggests that the instrument is reasonably reliable. A coefficient of 0.0, at the other end of the scale, indicates that the instrument is not at all reliable.

Generalizability Refers to the applicability of findings to settings and contexts different from the one in which they were obtained. Can the findings be applied to a wider group of people? The goal of action research is to understand what is happening in your school or your classroom and to determine what might improve things in that context.

Personal Bias If we conduct our research in a systematic, disciplined manner, we will go a long way toward minimizing personal bias in our findings. The challenge is to remain open and objective, to look, and to reflect on what we see.

Ethics How each of us treats the individuals with whom we interact. There is little distance between teacher researchers and their subjects: their students. Qualitatively oriented action research is open-ended. Informed consent is required.