Milton J. Dehn, Ed.D., NCSP Schoolhouse Educational Services June 2014


How to Identify Neuropsychological Processing Deficits in Children with SLD
Milton J. Dehn, Ed.D., NCSP, Schoolhouse Educational Services, June 2014

Notice of Copyright 2014 This PowerPoint presentation and accompanying materials are copyrighted by Milton J. Dehn and Schoolhouse Educational Services, LLC. They are not to be reprinted, copied, or electronically disseminated without written permission. To obtain permission, email milt@psychprocesses.com.

Workshop Information Sources
Essentials of Processing Assessment, 2nd Ed.
Children’s Psychological Processes Scale (CPPS)
Psychological Processing Analyzer (PPA)
Numerous references and articles on processing assessment and interventions
www.psychprocesses.com
Presenter contact: milt@psychprocesses.com

What are Neuropsychological Processes?
Brain processes, operations, functions
Include “cognitive” processes
Engaged when information is perceived, transformed, manipulated, stored, retrieved, or expressed
Engaged whenever we think, reason, or problem-solve
Both basic and higher-level processes
For SLD assessment, focus on broad processes related to academics

Neuropsychological Processes Are Not…
Not IQ, though they contribute to IQ
Not “abilities,” but the more specific brain processes that underlie abilities
More like aptitudes than abilities; aptitudes are more specific, abilities more general
Not skills; skills and knowledge are the products of processes
Dehn: do not include sensory, motor, or social-emotional functioning when conducting an SLD assessment

Processes for SLD Assessment
Attention
Auditory Processing
Executive Functions
Fine Motor
Fluid Reasoning
Long-Term Recall
Oral Language
Phonological Processing
Processing Speed
Visual-Spatial Processing
Working Memory (WM)

Processes and Academic Learning
Psychological processes are like “aptitudes”
Relations with academics established through research (Flanagan et al. & McGrew; Swanson, Geary, and others)
The influence of processes varies by age
For SLD, look for the academic area and the related psychological processes to both be low
See Table

Processes and Scores Allowed in Dehn’s Assessment Model
The list of 11 processes
Rating scales
Composite scores preferred over subtest scores
Achievement-like scores (e.g., verbal, crystallized intelligence, vocabulary) are excluded
Some subtests are re-classified

Task Analysis/Classification of Subtests
Consider the definition of the process
Consider factor-analytic information
What is the primary process being measured by the subtest? (not just input or output)
Which primary process allows the examinee to successfully complete the task?
What is the task typically used to measure?
There is no such thing as a “pure” subtest measure

Selective & Cross-Battery Testing
Start with the batteries you have
Try to limit the number of supplemental batteries
Avoid redundancies
Tests should be normed at about the same time
Only selected subtests are administered
Two subtests or composites are ideal
May include rating scales
Use cross-battery analysis procedures

Cross-Battery, Selective Testing
Test all processes important for academics, with the most attention to an in-depth assessment of hypothesized weaknesses
Pick composites first
See selective testing table Link
See comprehensive list link from Essentials of Processing Assessment, 2nd Edition

Dehn’s PSW Requirements
Intra-individual weaknesses are statistically significant
At least one process is a deficit (see definition)
The deficit is related to the deficient academic skill
Subtest scores must be unitary for a deficit
There is at least one strength (a process in the average range)
Consistency between the low process score(s) and the related low academic skill score

Guidelines for Weaknesses & Deficits
Scores below 90 are normative weaknesses
Intra-individual strengths & weaknesses use a 12-point discrepancy
Assumes composites/subtests have high reliability
Deficit = both a normative and an intra-individual weakness
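The decision rules on this slide can be sketched in code. The function name and the example scores are hypothetical; the cutoffs (a standard score below 90 for a normative weakness, a 12-point gap from the examinee's own processing mean for an intra-individual weakness) are the ones given above.

```python
def classify_process(score, processing_mean):
    """Classify one process score under Dehn's guidelines (sketch).

    Below 90 = normative weakness; 12+ points below the examinee's own
    processing mean = intra-individual weakness; both together = deficit.
    """
    normative_weakness = score < 90
    intra_weakness = (processing_mean - score) >= 12
    if normative_weakness and intra_weakness:
        return "deficit"
    if normative_weakness or intra_weakness:
        return "weakness"
    return "average or above"

# Hypothetical examinee: working memory 82 against a processing mean of 98
print(classify_process(82, 98))  # 82 < 90 and 98 - 82 = 16 >= 12 -> "deficit"
print(classify_process(95, 98))  # neither criterion met -> "average or above"
```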

Dehn’s Definition of Deficit
Three reasons for the deficit emphasis:
Both weaknesses occurring together is statistically rare
A deficit indicates an underlying neurological impairment
Students with both kinds of weaknesses really need special education

When to Use IQ Instead of the Cross-Battery Mean
Okay to use IQ as the predictor because it has high correlations with most processes
IQ is technically more appropriate because it has a known reliability and SEM
Use when only weak processes were tested
Use when only a few processes were tested
Use when a legal challenge is anticipated

Processing Analysis Worksheet
Use composite scores from the test manual when possible
Convert all scores to standard scores
Compute clinical scores by averaging
Compute the processing mean, or use IQ
Calculate discrepancies
Determine weaknesses and deficits (both kinds of weaknesses = a deficit)
Do pairwise comparisons (opposites and those closely related)
Completed Example
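A minimal sketch of the worksheet arithmetic. The score conversions use the standard psychometric formulas (scaled scores: mean 10, SD 3; T-scores: mean 50, SD 10); the process names and scores are hypothetical examples, not data from the presentation.

```python
def to_standard(score, mean, sd):
    """Convert any normed score to the standard-score metric (mean 100, SD 15)."""
    return 100 + 15 * (score - mean) / sd

# Hypothetical cross-battery results, converted to standard scores
scores = {
    "Working Memory": to_standard(6, 10, 3),      # scaled score 6 -> 80.0
    "Processing Speed": to_standard(45, 50, 10),  # T-score 45 -> 92.5
    "Fluid Reasoning": 104,                       # composite, already standard
    "Long-Term Recall": 110,
}

# Processing mean, then the discrepancy of each process from it
processing_mean = sum(scores.values()) / len(scores)
for name, ss in scores.items():
    print(f"{name}: SS = {ss:.1f}, discrepancy = {ss - processing_mean:+.1f}")
```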

Pairwise Comparisons
For intervention planning, not diagnosis
Pay most attention to opposites and those that are closely related
A greater discrepancy is required for significance
Significant when confidence intervals do not overlap
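The confidence-interval rule above can be sketched as follows. The intervals are built with the textbook formula CI = score ± z × SEM, where SEM = SD × sqrt(1 − reliability); the reliability values and the example scores are assumptions for illustration.

```python
import math

def conf_interval(score, reliability, sd=15.0, z=1.96):
    """95% confidence interval around an obtained standard score."""
    sem = sd * math.sqrt(1.0 - reliability)
    return (score - z * sem, score + z * sem)

def pairwise_significant(score_a, rel_a, score_b, rel_b):
    """Per the slide: a pairwise difference is significant when the
    two confidence intervals do not overlap."""
    lo_a, hi_a = conf_interval(score_a, rel_a)
    lo_b, hi_b = conf_interval(score_b, rel_b)
    return hi_a < lo_b or hi_b < lo_a

# Hypothetical pair: auditory processing 82 vs. visual-spatial 108,
# both composites with reliability .90 (SEM ~4.7, half-width ~9.3)
print(pairwise_significant(82, 0.90, 108, 0.90))  # intervals are disjoint -> True
```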

Non-Unitary Scores
Flagged when the standard score difference is greater than 22 points (or 15 points)
Indicates something different is being measured, or something is different about the subtest task
Investigate further with more testing if the difference cannot be explained
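The unitary check above reduces to a one-line spread test; the function name and the example subtest values are hypothetical, and the 22-point criterion is the one on the slide (with 15 as the stricter option it mentions).

```python
def is_unitary(subtest_scores, max_spread=22):
    """A composite is treated as unitary when its subtests' standard
    scores span no more than the criterion (22 points here; 15 is the
    alternative criterion the slide mentions)."""
    return max(subtest_scores) - min(subtest_scores) <= max_spread

print(is_unitary([92, 108]))  # spread 16 -> True (composite is interpretable)
print(is_unitary([80, 112]))  # spread 32 -> False (investigate further)
```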

Using Dehn’s Automated Analysis Worksheet to Determine PSW
Automated worksheet from Essentials of Processing Assessment, 2nd Edition

Psychological Processing Analyzer 2.0
Available at www.psychprocesses.com
Identifies statistically significant strengths, weaknesses, deficits, and assets
Can enter composite and/or subtest scores
Covers the 11 psychological processes
Takes scores (almost 400 to choose from) from more than 40 different scales: cognitive, achievement, rating, and processing

Psychological Processing Analyzer
Some subtests are re-classified based on the primary demands of the task
Options: use the mean of the process scores or IQ as the predicted score
Differences greater than the critical values are intra-individual weaknesses

PPA Equations
Converts all scores to a mean of 100 and SD of 15
.01 or .05 level of significance
Difference formulas based on the reliability coefficients of composites/subtests
Corrects for regression toward the mean
Predicted score based on the mean of the other 10 processes
Non-unitary scores are flagged
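The predicted-score step can be sketched like this. The specific correction shown (pulling the prediction toward the population mean of 100 in proportion to a correlation r) is the standard regression-toward-the-mean adjustment; both the formula's use inside the PPA and the value of r here are assumptions, not documented internals of the software.

```python
def predicted_score(other_scores, r=0.60):
    """Predicted score for one process from the mean of the other 10,
    regressed toward the population mean of 100.

    r is the assumed correlation between a single process and the mean
    of the others; the PPA's actual coefficient is not public.
    """
    mean_others = sum(other_scores) / len(other_scores)
    return 100 + r * (mean_others - 100)

# Hypothetical: the other 10 processes average 110, so the prediction
# lands partway back toward 100 rather than sitting at 110.
print(predicted_score([110] * 10))  # 100 + 0.6 * 10 = 106.0
```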

PPA Demo and Report
See demo
See sample report
The report has a table, graph, and narrative
Pairwise comparisons are also provided
Identifies the academic areas associated with the identified deficits

Using a Rating Scale to Assess Processes
Processing deficits are manifested through behaviors
Behavior ratings by teachers can be used to measure processing abilities
Examples: the BRIEF and other executive function scales; also, the new CPPS
Use the CPPS for processes not directly tested

Children’s Psychological Processes Scale (CPPS) Overview
Standardized teacher rating scale
Ages 5-0-0 to 12-11-30
121 items across 11 subscales
Entirely online, web-based
Online administration time of 15 minutes
Online scoring and report
Author: Milton Dehn; published by Schoolhouse Educational Services, 2012
Measurement consultant: Kevin McGrew

Main Purpose of the CPPS
To identify psychological (cognitive) processing weaknesses in children referred for a learning disability evaluation
An additional source of data for diagnostic purposes
Can be used in a Pattern of Strengths and Weaknesses (PSW) analysis
Covers processes not directly tested

The CPPS Identifies Children with SLD
LD subjects had significantly higher means on all subscales (about a 1.5 SD difference) Link
The CPPS has high classification accuracy with regard to LD
37 LD subjects were compared with matched controls
Using a CPPS GPA cutoff of 60 yielded 92% classification accuracy across the 74 subjects

CPPS Standardization
1,121 students rated by 278 teachers
128 communities in 30 U.S. states
All data collected online
Demographics match the U.S. Census well
Norms: 4 age groups (5-6; 7-8; 9-10; 11-12)
Included children with disabilities

CPPS Processes
Attention
Auditory Processing
Executive Functions
Fine Motor
Fluid Reasoning
Long-Term Recall
Oral Language
Phonological Processing
Processing Speed
Visual-Spatial Processing
Working Memory (WM)

CPPS General Processing Ability (GPA)
Based on the average of all process scores
Emerges from factor analysis; similar to the concept of general intelligence
Processes function in an inter-related fashion
Most processes contribute to any given behavior or task
On the CPPS, defined as “the underlying efficiency of processing”

Additional CPPS Factors
Second factor, Self-Regulatory Processes (SRP): Attention, Executive Functions, and Working Memory
Third factor, Visual-Motor Processes: Fine Motor and Visual-Spatial
Clusters: Memory, Language
WM loads higher on GPA than on SRP; WM is both a cognitive and an executive process because it includes STM and WM

How the Web-Based CPPS Works
A psychologist side and a teacher side
The psychologist manages and holds the student records
Teachers can only access blank rating forms
Once a teacher has completed the ratings, the completed form goes to the psychologist’s side and the teacher can no longer access it

Completing the Teacher Rating Form
Takes approximately 15 minutes
Responses: Never, Sometimes, Often, Almost Always
Raters must respond to all items
Incomplete ratings are saved and can be completed later
Free paper copies can be printed; the psychologist then enters the ratings online

CPPS Items
What the teacher rater sees Link
Items are regrouped by subscale after rating
Items are in developmental (ability) order from lowest to highest
Example of scoring in developmental sequence Link

CPPS Report
Brief narrative, graph, and a table of scores
Change-sensitive W-scores
T-scores, percentiles, and confidence intervals
Intra-individual strengths and weaknesses discrepancy table
T-score to standard score converter
Example

CPPS Discrepancy Analysis
Use the discrepancy table to determine the pattern of strengths and weaknesses
Predicted score is based on the mean of the other 10 subscales
Regression toward the mean is included
Discrepancy options of +/- 1.00 to 2.00 SD of the SEE
Strength and weakness labeling is the opposite of the discrepancy sign, e.g., a “-” value = a strength
Link

T-Score Conversion Table (Optional)
Purpose: to see how consistent CPPS scores are with achievement and cognitive scores
Formula: T-score x 1.5 + 25, then reverse the distance from the mean
Example: T-score of 60 x 1.5 = 90; 90 + 25 = 115; then reverse: 100 - 15 = 85
Example
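The slide's two-step conversion, as code. The first step is the standard T-score-to-standard-score conversion (same percentile, mean 100, SD 15); the second reverses the distance from the mean because a high CPPS T-score indicates more problems, i.e., lower ability. The function name is ours.

```python
def cpps_t_to_standard(t_score):
    """Convert a CPPS T-score to an ability-style standard score (sketch).

    Step 1: standard score = T x 1.5 + 25 (mean 100, SD 15).
    Step 2: reverse the distance from the mean, since on the CPPS a
    higher T-score means more problems (i.e., lower ability).
    """
    ss = t_score * 1.5 + 25
    return 100 - (ss - 100)

# The slide's worked example: T = 60 -> 115 -> 85
print(cpps_t_to_standard(60))  # 85.0
```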

Diagnosing LD with the CPPS
Look for a pattern of strengths and weaknesses (discrepancy table)
Weaknesses should also be normative weaknesses (T-scores above 60)
Look at the table to identify “deficits”
Weaknesses should link to evidence-based achievement relations
Same criteria as the PSW model

CPPS Reliability
Internal consistency subscale reliabilities range from .88 to .98
.99 for the Total Score
Inter-rater reliability ranges from .21 to .90, with a median coefficient of .765

Correlations with Achievement
High correlations with WJ III Achievement Test scores Link
The broader the achievement score, the higher the correlation

Correlations with WJ III COG
Fewer significant correlations than with achievement Link
All CPPS processes have significant correlations with Cognitive Fluency (the ability to quickly and fluently perform cognitive tasks)
Most CPPS scales expected to link with WJ III COG tests do, except Attention, Processing Speed, and WM (though WM does relate to STM)
Also, discriminant evidence

CPPS Correlations with the BRIEF
CPPS Attention, Executive Functions, and Working Memory (the SRP factor) have the highest correlations with all BRIEF scales
CPPS Attention and EF correlations are mostly >.70, indicating they measure the same domains as the BRIEF Link
Other CPPS scales correlate with the BRIEF metacognitive scales but not the behavioral scales

Questions