# Reliability: Consistent, Dependable, Replicable, Stable


## Two administrations

- Test/retest: the same test given twice to the same group
- Parallel/alternate forms: different versions of the test given to the same group

## Two-administration procedure

- Administer the tests (in two sessions)
- Convert to z-scores (if necessary)
- Correlate the two sets of scores (Pearson or Spearman)
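The correlation step above can be sketched in Python. The scores below are hypothetical, and Pearson's r is computed directly from its definition:

```python
# Test-retest sketch: correlate scores from two administrations of the
# same test to the same group. All scores below are hypothetical.
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

session1 = [12, 15, 11, 18, 14, 16]  # first administration
session2 = [13, 14, 12, 17, 15, 16]  # retest of the same group
print(round(pearson(session1, session2), 3))  # test-retest reliability estimate
```

Since Pearson's r is invariant under linear transformations, converting to z-scores first would not change the result; the conversion matters mainly for comparing or combining scores across forms.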

## Two-administration issues

- What problems can arise (e.g., practice, memory, or fatigue effects)?
- How long should the interval between administrations be?
- What type of variable is being measured (a stable trait or a changing state)?

## One-administration procedure

- Administer the test to one group
- Divide the questions for scoring (split half): first/second halves or odd/even halves?
- Correlate the scores from the two halves
- Apply the Spearman-Brown formula to estimate the effect of the change in test length
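The steps above can be sketched as follows. The item data are hypothetical dichotomous scores (1 = correct, 0 = incorrect), one row per person, using an odd/even split:

```python
# Split-half sketch: one administration, odd/even item split, then the
# Spearman-Brown correction to estimate full-length reliability.
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) *
                      sum((b - my) ** 2 for b in y))

def split_half_reliability(item_scores):
    """item_scores: one list of 0/1 item scores per person."""
    odd = [sum(person[0::2]) for person in item_scores]   # items 1, 3, 5, ...
    even = [sum(person[1::2]) for person in item_scores]  # items 2, 4, 6, ...
    r_half = pearson(odd, even)
    # Spearman-Brown: reliability of a test twice as long as each half
    return 2 * r_half / (1 + r_half)

scores = [
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
]
print(round(split_half_reliability(scores), 3))
```

The correction is needed because each half is only half as long as the full test, and shorter tests are less reliable; the half-test correlation therefore underestimates the full test's reliability.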

## Uses of one administration

- Estimating the internal consistency of items
- What types of tests is this appropriate for?

## Inter-item consistency

Statistical estimation:

- Kuder-Richardson Formula 20 (KR-20): dichotomous questions
- Cronbach's alpha (alpha coefficient): all question types
- (factor analysis)
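Cronbach's alpha can be computed directly from its definition, alpha = k/(k-1) * (1 - sum of item variances / total-score variance). A minimal sketch with hypothetical dichotomous data, for which alpha reduces to KR-20:

```python
# Cronbach's alpha sketch. For dichotomous (0/1) items this equals KR-20.
# The item data are hypothetical, one list of item scores per person.
def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(item_scores):
    """item_scores: one list of item scores per person (equal lengths)."""
    k = len(item_scores[0])  # number of items
    item_vars = [variance([person[i] for person in item_scores])
                 for i in range(k)]
    total_var = variance([sum(person) for person in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

scores = [
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
]
print(round(cronbach_alpha(scores), 3))
```

Unlike split-half reliability, alpha does not depend on how the items are divided: it is equivalent to the average of all possible split-half estimates.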

## Reliability coefficient

- Ranges from 0.00 to 1.00
- Higher is better
- The score is relative, not absolute

## Inter-rater reliability

- Consensus between raters
- Percentage of agreement
- Kappa statistic (two or many raters)
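Both agreement measures above can be sketched for the two-rater case. The ratings below are hypothetical pass/fail judgments of the same eight testees; Cohen's kappa corrects raw agreement for agreement expected by chance:

```python
# Inter-rater sketch: percent agreement and Cohen's kappa (two raters).
# Ratings are hypothetical pass/fail judgments of the same testees.
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of cases on which the two raters agree."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[cat] * c2[cat] for cat in c1) / (n * n)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

rater1 = ["pass", "pass", "fail", "pass", "fail", "pass", "fail", "pass"]
rater2 = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass"]
print(round(percent_agreement(rater1, rater2), 3))  # raw agreement
print(round(cohens_kappa(rater1, rater2), 3))       # chance-corrected
```

Kappa is lower than raw agreement because two raters who mostly say "pass" will agree often by chance alone; for more than two raters, Fleiss' kappa generalizes this idea.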

## Threats to reliability

- Construction
- Administration (tester, testee, environment)
- Scoring
- Interpretation

## Project homework

Which approaches were used to determine the reliability of your measure? Why were those approaches selected? Could other approaches have been used? Why or why not?