
1 Woodcock-Johnson Cognitive Ability Test Brenda Stewart Ed 6331 Spring 2004

2 Authors
Acronym: WJ III COG (Woodcock-Johnson III, 2004)
Dr. Richard Woodcock, Dr. Kevin S. McGrew, Dr. Nancy Mather (Plake, 2003)
F. A. Schrank (computer software) (Woodcock-Johnson III, 2004)

3 Publication & Price
Original: 1977; Updated: 2001 (Plake, 2003)
Publisher: Riverside Publishing Company (Woodcock-Johnson III, 2004)
Current price (Woodcock-Johnson III, 2004):
– Achievement Battery: $444.00
– Complete basic examiner's kit: $631.00
– With case: $714.00

4 Purpose and Specifics
Measures intellectual ability and specific cognitive abilities, including auditory and phonemic awareness (Woodcock-Johnson III, 2004)
Major areas tested (Plake, 2003):
– General intellectual ability
– Specific cognitive abilities
– Scholastic aptitude
– Oral language
Age range: 2 to 90+ (Flanagan, 2001)
Examiner qualifications: specific training required (Plake, 2003)
Test type: cognitive, individually administered (Plake, 2003)

5 Validity
Content evidence – aligned with core curricular areas and domains specified in federal legislation (McGrew, 1991)
Substantive evidence – both broad and narrow abilities are measured (Schrank, 2001)
Internal structure evidence – aligned with the stratified model of intellectual abilities defined by CHC (Cattell-Horn-Carroll) theory (Plake, 2003)

6 Validity
External structure – correlates well with other tests measuring similar constructs (Schrank, 2001)
Predictive validity – reliabilities are sufficiently high, almost all in the .90s (Schrank, 2001)
Concurrent validity – good concurrent validity overall (Schrank, 2001)

7 Evidence
Reliability evidence (over time) – test-retest (one day) median scores range from .81 to .85 (Plake, 2003)
Reliability evidence (over assessors) – interrater reliability for writing is reported to be in the high .90s (Plake, 2003)
Reliability evidence (over content) – split-half or Rasch analysis was used (McGrew, 1991)
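As a brief illustration of how reliability coefficients of this size translate into score precision (a standard psychometric relationship, not a formula quoted from the WJ III manuals), the standard error of measurement is

SEM = SD \sqrt{1 - r_{xx}}

so, for example, a reliability of r_{xx} = .90 on a scale with a standard deviation of 15 gives SEM = 15 \sqrt{0.10} \approx 4.7 standard-score points.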

8 Evidence
Generalization evidence – focuses on both the individual item level and the level of aggregated items; items are checked for fairness based on bias and sensitivity reviews (McGrew, 1991)
Consequential evidence – provides an index of the precision with which a subject's position in a group is indicated (McGrew, 1991)
Practicality evidence – designed for convenient administration without cumbersome test materials (McGrew, 1991)

9 Scores
Scores obtained – reliabilities are sufficiently high to support inferences about individual test takers; grade, age, percentile, and discrepancy scores are available (Plake, 2003)
Composite scores – derived from the 10 standard battery test scores (Plake, 2003)
Scale scores – a formula yields scale scores and two sets of discrepancy information: ability/achievement discrepancies and intra-ability discrepancies (McGrew, 1991)

10 Scores
Mean, standard deviation, standard error – norm-based; compare a subject's ability with that of others at the same age or grade level (Schrank, 2001)
Composite scores – determine the significance of a subject's score when it differs from others at the same age or grade level (Schrank, 2001)
Scale scores – differentially g-weighted scores make up the General Intellectual Ability (GIA) score; different tests carry different weights (Sattler, 2001)
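In conventional terms (this restates the usual norm-referenced scoring convention rather than wording from the cited sources), the norm-based comparison above is a standard-score transformation:

z = \frac{X - M}{SD}, \qquad SS = 100 + 15z

where X is the subject's obtained score and M and SD are the mean and standard deviation of the age or grade norm group; percentile ranks and the discrepancy scores mentioned above are derived from the same distribution.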

11 Time
Timed performance (Plake, 2003)
Testing time – approximately 5 minutes per test; 55-65 minutes for the standard battery (Plake, 2003)

12 Standardization Sample Information
Sample size – 8,818 individuals, from preschool age to adults (Plake, 2003)
Demographic characteristics – matched on geographic region, community size, gender, race, and type of school; adults also matched on education level, occupation level, and employment status (Plake, 2003)

13 Standardization Sample Information
Special populations – students with disabilities (attending regular classes part-time) and English language learners (Plake, 2003)
Alternate forms available – Forms A and B can be used interchangeably (Plake, 2003)
Additional notes – the battery measures what it measures well, though it may not cover some abilities efficiently (Plake, 2003)

14 Bibliography
Flanagan, D. (2001). Comparative Features of the WJ III Tests of Cognitive Abilities (Woodcock-Johnson III Assessment Service Bulletin No. 1). Itasca, IL: Riverside Publishing.
McGrew, K. S., Werder, J. K., & Woodcock, R. W. (1991). WJ-R Technical Manual. Itasca, IL: Riverside Publishing.
Plake, B., Impara, J., & Spies, R. (2003). The Fifteenth Mental Measurements Yearbook. Lincoln, NE: University of Nebraska Press.
Sattler, J. M. (2001). Assessment of Children: Cognitive Applications (4th ed.). San Diego, CA: Jerome M. Sattler, Publisher, Inc.
Schrank, F. A., McGrew, K. S., & Woodcock, R. W. (2001). Technical Abstract (Woodcock-Johnson III Assessment Service Bulletin No. 2). Itasca, IL: Riverside Publishing.
Woodcock-Johnson III (WJ III). Retrieved February 3, 2004, from http://www.riverpub.com/products/clonical/wj3/pricing.html

