Presentation on theme: "Session 4: Reliability and Validity" — Presentation transcript:

1 Session 4 Reliability and Validity

2 Validity
What does the instrument measure, and how well does it measure what it is supposed to measure? Is there enough evidence to support using this instrument in the way I want to, with the clients I want to? Reliability is a prerequisite for validity.

3 Schedule
- Check-in
- Homework discussion
- Validity
- Example

4 Homework
A community agency is trying to decide which of two depression inventories to use with its clientele. The D1 inventory manual reports norm-group (n = 4,523) test-retest reliability of .95 over two days and .45 over six weeks. What could account for such a difference?
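For context on those numbers: test-retest reliability is the Pearson correlation between scores from two administrations of the same instrument. A minimal sketch, using invented scores rather than the D1 norm data (which are not reproduced here):

```python
# Test-retest reliability as a Pearson correlation between two administrations.
# All scores below are invented for illustration; they are not the D1 data.
from statistics import correlation  # Python 3.10+

day_1 = [22, 15, 30, 8, 19, 25, 12, 27]   # hypothetical depression scores, first administration
day_3 = [21, 16, 29, 10, 18, 26, 11, 25]  # same clients two days later

print(f"test-retest reliability r = {correlation(day_1, day_3):.2f}")  # high: rank order is stable
```

One plausible account of the drop from .95 to .45 is that depression symptoms genuinely change over six weeks, so the six-week coefficient reflects real change in the construct as much as instability in the instrument.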

5 Validity
Three categories, but not mutually exclusive:
- Content – How well do the items represent the intended behavior domain? How was this determined?
- Criterion – How well do the scores relate to a specific outcome? Concurrent (right now) and predictive (in the future).
- Construct – How well do the scores measure a theoretical phenomenon (intelligence, depression, values)?

6 Content Validity
Do the items, questions, or tasks represent the intended behavior domain?
Procedures:
- Identify the behavior domain
- Make test specifications
- Have experts analyze the degree to which the content reflects the domain (one way to quantify this is sketched below)
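The slide does not name a specific procedure for the expert-analysis step; one common way to quantify such judgments is Lawshe's content validity ratio (CVR). A hypothetical sketch:

```python
# Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2), where n_e is the
# number of experts who rate an item "essential" and N is the panel size.
# Offered only as an illustration; the slides do not prescribe this method.

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Ranges from -1 (no expert rates the item essential) to +1 (all do)."""
    half = n_experts / 2
    return (n_essential - half) / half

# Hypothetical example: 9 of 10 experts rate an item as essential to the domain.
print(content_validity_ratio(9, 10))  # 0.8
```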

7 Criterion Validity
Is the instrument systematically related to an outcome criterion?
- Concurrent validity – you want to make an immediate prediction, such as a diagnosis ("the client is suicidal")
- Predictive validity – there is a time lag between when the information is gathered and when the criterion is measured (high GREs and students' GPAs at graduation from an advanced degree program)

8 Establishing Criterion Validity – Correlational Method
- Select a group to use in the validation study
- Administer the instrument
- For concurrent validity, collect the criterion data at the same time
- For predictive validity, wait until the appropriate time to gather the criterion data
- Correlate performance on the instrument with the criterion; the resulting statistic is called a validity coefficient (see the sketch after this list)
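A minimal sketch of that last step, with invented data standing in for both the instrument scores and the criterion:

```python
# Validity coefficient: correlate instrument scores with the criterion measure,
# gathered either at the same time (concurrent) or later (predictive).
# All numbers are invented for illustration.
from statistics import correlation  # Python 3.10+

instrument_scores = [48, 61, 55, 70, 42, 66, 58, 50]          # hypothetical test scores
criterion         = [2.9, 3.6, 3.1, 3.8, 2.5, 3.7, 3.4, 3.0]  # e.g., GPA at graduation (predictive)

print(f"validity coefficient r = {correlation(instrument_scores, criterion):.2f}")
```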

9 Construct Validity
How well do the scores measure a theoretical phenomenon (intelligence, depression, values)?
- Cannot be verified through a single study
- Convergent evidence – the instrument is related to measures it should be related to
- Discriminant evidence – the instrument is not related to measures it should not be related to (see the correlation sketch after this list)
- Factor analysis
- Experimental interventions, developmental changes
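A hypothetical sketch of convergent and discriminant evidence, assuming a new depression scale is compared with an established depression measure (same construct) and with reading speed (an unrelated variable); all scores are invented:

```python
# Convergent evidence: strong correlation with a measure of the same construct.
# Discriminant evidence: weak correlation with a measure of an unrelated construct.
# All data are invented for illustration.
from statistics import correlation  # Python 3.10+

new_depression_scale   = [14, 22, 9, 30, 18, 25, 11, 27]
established_depression = [15, 20, 10, 28, 19, 26, 12, 25]          # same construct
reading_speed_wpm      = [227, 223, 206, 204, 229, 221, 202, 208]  # unrelated construct

print("convergent   r =", round(correlation(new_depression_scale, established_depression), 2))  # should be high
print("discriminant r =", round(correlation(new_depression_scale, reading_speed_wpm), 2))       # should be near zero
```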

10 Summary of Validity
- The instrument itself is not validated; its uses are
- Content – the instrument measures a behavior domain
- Criterion – how well the instrument predicts a defined criterion (concurrent and predictive)
- Construct – an accumulation of evidence, including content and criterion validity

11 ASVAB example

