Survey Design Class 02

 Measurement validity means an indicator is a true measure of a construct.  It is the degree of fit between a construct and the indicators of it.  It refers to how well the conceptual and operational definitions mesh with each other.  The better the fit, the greater the measurement validity.

 When a researcher says that an indicator is valid, it is valid for a particular purpose and definition; it may be less valid, or invalid, for other purposes and definitions.

 Example: questions about feelings toward school  Valid: for measuring morale among teachers  Invalid: for measuring morale among police officers

 (1) Face Validity: a judgment by the scientific community that the indicator really measures the construct.

 (2) Content Validity: a special type of face validity. It asks: is the full content of a definition represented in the measure?

 Achieving content validity involves three steps: (1) specify the content in a construct’s definition, (2) sample from all areas of the definition, and (3) develop an indicator that taps all of the various parts of the definition.
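As a rough, hypothetical sketch of steps (2) and (3) (not part of the original slides), coverage of a definition's content areas by questionnaire items can be checked mechanically. The construct, areas, and items below are invented purely for illustration.

```python
# Toy coverage check for content validity (illustrative only).
# A construct's definition is broken into content areas, and each area
# should be tapped by at least one questionnaire item.
definition_areas = {"political beliefs", "family roles", "workplace equality", "sexuality"}

items_by_area = {
    "Q1": "political beliefs",
    "Q2": "family roles",
    "Q3": "workplace equality",
    # no item yet taps "sexuality" -> a gap in content validity
}

covered = set(items_by_area.values())
missing = definition_areas - covered
print("areas not represented by any item:", missing)
```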

 (3) Criterion Validity: the validity of an indicator is verified by comparing it with another measure of the same construct in which a researcher has confidence.

 (a) Concurrent: an indicator must be associated with a preexisting indicator that is judged to be valid.  Example: create a new test to measure intelligence; to have concurrent validity, its scores should be strongly associated with scores on established intelligence tests.
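A minimal sketch (not from the original slides) of how concurrent validity might be checked: correlate scores on a hypothetical new test with scores on an established measure of the same construct. All names and numbers below are invented for illustration.

```python
# Concurrent validity sketch: new test vs. an established measure,
# taken by the same (hypothetical) respondents.
from scipy.stats import pearsonr

new_test = [98, 110, 87, 123, 105, 94, 131, 102]   # invented new-test scores
iq_test  = [101, 112, 85, 120, 108, 90, 128, 99]   # invented established IQ scores

r, p = pearsonr(new_test, iq_test)
print(f"correlation r = {r:.2f}, p = {p:.3f}")
# A strong positive correlation supports concurrent validity of the new test.
```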

 (b) Predictive: the indicator predicts future events that are logically related to the construct. Example: the Scholastic Aptitude Test (SAT) is meant to measure a student’s ability to do well in college; if the SAT has high predictive validity, students who get high SAT scores will subsequently do well in college.
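A similar hedged sketch for predictive validity: correlate SAT scores with a later outcome such as college GPA. All numbers are invented for illustration.

```python
# Predictive validity sketch: does the earlier indicator (SAT)
# correlate with the later, logically related outcome (college GPA)?
from scipy.stats import pearsonr

sat_scores  = [1450, 1200, 1010, 1340, 1100, 1520, 980, 1280]  # invented
college_gpa = [3.8,  3.1,  2.6,  3.5,  2.9,  3.9,  2.4,  3.3]  # invented

r, p = pearsonr(sat_scores, college_gpa)
print(f"SAT vs later GPA: r = {r:.2f}")
# A high positive r would indicate the SAT predicts the future outcome.
```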

 (4) Construct Validity: for measures with multiple indicators.  It has two kinds: (a) Convergent: multiple measures of the same construct operate in similar ways.

 (b) Discriminant (also called divergent validity): the indicators of one construct are negatively associated with the indicators of an opposing construct.
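A minimal sketch (not from the original slides) illustrating both kinds at once with invented data: indicators of the same construct should correlate highly with one another (convergent), and negatively or weakly with an indicator of an opposing construct (discriminant).

```python
# Convergent vs. discriminant validity sketch with invented survey responses.
import pandas as pd

df = pd.DataFrame({
    "satisfaction_q1": [4, 5, 2, 3, 5, 1, 4, 2],  # three indicators of the
    "satisfaction_q2": [4, 4, 2, 3, 5, 2, 5, 1],  # same construct
    "satisfaction_q3": [5, 4, 1, 3, 4, 1, 4, 2],
    "intent_to_quit":  [1, 2, 5, 3, 1, 5, 2, 4],  # indicator of an opposing construct
})

print(df.corr().round(2))
# Convergent: high positive correlations among the satisfaction items.
# Discriminant: negative correlations between those items and intent_to_quit.
```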

 Beyond measurement validity, other types of validity apply to the research design as a whole: (1) Internal Validity: there are no errors internal to the design of the research project. High internal validity means there are few such errors.

 (2) External Validity: the ability to generalize findings from a specific setting and small group to a broad range of settings and people. High external validity means the results can be generalized to many situations and many groups of people.

 (3) Statistical Validity: the correct statistical procedure is chosen and its assumptions are fully met.
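A minimal sketch (not from the original slides) of what checking statistical validity can look like in practice: before relying on an independent-samples t-test, verify its normality and equal-variance assumptions. The data and the 0.05 cutoff below are invented for illustration.

```python
# Statistical validity sketch: check a test's assumptions before using it.
from scipy.stats import shapiro, levene, ttest_ind

group_a = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 3.2, 2.7]  # invented scores
group_b = [3.6, 3.9, 3.4, 3.8, 3.7, 4.0, 3.5, 3.9]  # invented scores

print("group A looks normal:", shapiro(group_a).pvalue > 0.05)   # Shapiro-Wilk
print("group B looks normal:", shapiro(group_b).pvalue > 0.05)
print("variances look equal:", levene(group_a, group_b).pvalue > 0.05)  # Levene

# Only if the assumptions hold is the t-test an appropriate (statistically valid) choice.
print(ttest_ind(group_a, group_b))
```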