
Session 4 Reliability and Validity

Validity: What does the instrument measure, and how well does it measure what it is supposed to measure? Is there enough evidence to support using this instrument in the way I want to, with the client I want to use it with? Reliability is a prerequisite for validity.

Schedule: Check-in; Homework discussion; Validity; Example

Homework: A community agency is trying to decide which of two depression inventories to use with its clientele. The D1 inventory manual reports, for the norm group (n = 4,523), test-retest reliability coefficients of .95 over two days and .45 over six weeks. What could account for such a difference?
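The gap in the homework can be made concrete with a small sketch. Test-retest reliability is the Pearson correlation between two administrations of the same instrument; the scores below are fabricated to show how a fluctuating state (such as mood) depresses the six-week coefficient even when the short-interval coefficient is high.

```python
# Minimal sketch: test-retest reliability as a Pearson correlation.
# All scores are fabricated for illustration only.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Six clients take a depression inventory, then retake it two days later:
# scores barely move, so the coefficient is high.
time1 = [22, 35, 14, 41, 28, 19]
two_days = [23, 34, 15, 40, 29, 18]
# The same clients six weeks later: depressive states have shifted,
# the rank ordering is disturbed, and the coefficient drops.
six_weeks = [30, 25, 20, 33, 35, 14]

r_two_days = pearson_r(time1, two_days)
r_six_weeks = pearson_r(time1, six_weeks)
print(round(r_two_days, 2), round(r_six_weeks, 2))
```

With data like these, the two-day coefficient stays near 1 while the six-week coefficient falls well below it, which suggests one plausible answer to the homework: the inventory measures a state that genuinely changes over six weeks, not just measurement error.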

Validity: Three categories, though not mutually exclusive. Content – How well do the items represent the intended behavior domain? How was this determined? Criterion – How well do the scores relate to a specific outcome? Concurrent (right now) or predictive (in the future). Construct – How well do the scores measure a theoretical phenomenon? (intelligence, depression, values)

Content Validity: Do the items, questions, or tasks represent the intended behavior domain? Procedure: identify the behavior domain; write test specifications; have experts analyze the degree to which the content reflects the domain.

Criterion Validity: Is the instrument systematically related to an outcome criterion? Concurrent validity – you want to make an immediate prediction, such as a diagnosis (the client is suicidal). Predictive validity – there is a time lag between when the information is gathered and when the criterion is measured (high GRE scores and students' GPAs at graduation from an advanced degree program).

Establishing criterion validity – the correlational method: Select a group to use in the validation study. Administer the instrument. For concurrent validity, collect the criterion data now; for predictive validity, wait until the appropriate time to gather the criterion data. Correlate performance on the instrument with the criterion; the resulting correlation is called a validity coefficient.
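The steps above reduce to a single correlation. A minimal Python sketch, using the slide's GRE/GPA pairing as the predictive example; all scores are fabricated, not real validation data.

```python
# Sketch of the correlational method for criterion validity.
# Instrument and criterion scores are fabricated for illustration.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Steps 1-2: administer the instrument to the validation group
# (e.g., GRE scores at program entry).
instrument = [310, 325, 300, 340, 315, 330, 295, 320]
# Steps 3-4: gather criterion data now (concurrent) or later (predictive)
# (e.g., GPA at graduation from the program).
criterion = [3.1, 3.6, 2.9, 3.9, 3.3, 3.7, 2.8, 3.4]

# Step 5: the correlation between the two is the validity coefficient.
validity_coefficient = pearson_r(instrument, criterion)
print(round(validity_coefficient, 3))
```

In real validation studies the coefficient is rarely this high; the fabricated scores here track each other closely just to make the computation easy to follow.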

Construct Validity: How well do the scores measure a theoretical phenomenon? (intelligence, depression, values) Cannot be verified through a single study. Convergent evidence – scores are related to what they should be related to. Discriminant evidence – scores are not related to what they should be unrelated to. Other sources: factor analysis; experimental interventions; developmental changes.
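Convergent and discriminant evidence can be sketched the same way: a hypothetical new depression scale should correlate strongly with an established depression measure (convergent) and weakly with something unrelated, such as a vocabulary test (discriminant). All scores below are fabricated for illustration.

```python
# Sketch: convergent vs. discriminant evidence for construct validity.
# All scores are fabricated; the scales named are hypothetical.
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

new_depression = [10, 22, 15, 30, 8, 25]   # hypothetical new scale
established    = [12, 20, 16, 28, 9, 24]   # established depression measure
vocabulary     = [55, 40, 60, 52, 48, 45]  # theoretically unrelated measure

convergent = pearson_r(new_depression, established)    # should be high
discriminant = pearson_r(new_depression, vocabulary)   # should be near zero
print(round(convergent, 2), round(discriminant, 2))
```

A high convergent correlation alone is not enough; the pattern (related where the theory says related, unrelated where it says unrelated) is what accumulates into construct evidence across studies.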

Summary of Validity: An instrument itself is not validated; its uses are. Content – the instrument measures the behavior domain. Criterion – how well the instrument predicts a defined criterion (concurrent and predictive). Construct – an accumulation of evidence, including content and criterion validity.

ASVAB example