1 How to Identify Neuropsychological Processing Deficits in Children with SLD
Milton J. Dehn, Ed.D., NCSP, Schoolhouse Educational Services, June 2014

2 Notice of Copyright 2014
This PowerPoint presentation and accompanying materials are copyrighted by Milton J. Dehn and Schoolhouse Educational Services, LLC. They are not to be reprinted, copied, or electronically disseminated without written permission. To obtain permission, contact Schoolhouse Educational Services.

3 Workshop Information Sources
Essentials of Processing Assessment, 2nd Ed.
Children’s Psychological Processes Scale (CPPS)
Psychological Processing Analyzer (PPA)
Numerous references and articles on processing assessment and interventions
Presenter Contact:

4

5 What are Neuropsychological Processes?
Brain processes, operations, functions
Include “cognitive” processes
When information is perceived, transformed, manipulated, stored, retrieved, expressed
Whenever we think, reason, problem-solve
Both basic and higher-level processes
For SLD assessment, focus on broad processes related to academics

6 Neuropsychological Processes are Not
Not IQ, but they contribute to IQ
Not “abilities”, but the more specific brain processes that underlie abilities
More like aptitudes than abilities; aptitudes are more specific, abilities are more general
Not skills; skills and knowledge are the products of processes
Dehn: do not include sensory, motor, or social-emotional processes when conducting an SLD assessment

7 Processes for SLD Assessment
Attention
Auditory Processing
Executive Functions
Fine Motor
Fluid Reasoning
Long-Term Recall
Oral Language
Phonological Processing
Processing Speed
Visual-Spatial Processing
Working Memory (WM)

8 Processes and Academic Learning
Psychological processes are like “aptitudes”
Relations established through research: Flanagan et al., McGrew, Swanson, Geary, and others
The influence of processes varies by age
For SLD, look for the academic area and the related psychological processes to both be low
See Table

9 Processes and Scores Allowed in Dehn’s Assessment Model
The list of 11 processes
Rating scales
Composite scores preferred over subtest scores
Achievement-like scores such as verbal, crystallized intelligence, and vocabulary are excluded
Some subtests are re-classified

10 Task Analysis/Classification of Subtests
Consider the definition of the process
Consider factor-analytic information
What is the primary process being measured by the subtest? (not just input or output)
Which primary process allows the examinee to successfully complete the task?
What the task is typically used to measure
There is no such thing as a “pure” subtest measure

11 Selective & Cross-Battery Testing
Start with the batteries you have
Try to limit the number of supplemental batteries
Avoid redundancies
Tests should be normed at about the same time
Only selected subtests are administered
Two subtests or composites are ideal
May include rating scales
Use cross-battery analysis procedures

12 Cross-Battery, Selective Testing
Test all processes important for academics, with the most attention to an in-depth assessment of hypothesized weaknesses
Pick composites first
See the selective testing table
See the comprehensive list from Essentials of Processing Assessment, 2nd Edition

13 Dehn’s PSW Requirements
Intra-individual weaknesses are statistically significant
At least one process is a deficit (see definition)
The deficit is related to the deficient academic skill
Subtest scores must be unitary for a deficit
There is at least one strength (a process that is in the average range)
Consistency between the low process score(s) and the related low academic skill score

14 Guidelines for Weaknesses & Deficits
Scores below 90 are normative weaknesses
Intra-individual strengths and weaknesses use a 12-point discrepancy
Assumes composites/subtests have high reliability
Deficit = both a normative and an intra-individual weakness

15 Dehn’s Definition of Deficit
Three reasons for the deficit emphasis:
Both weaknesses occurring together is statistically rare
A deficit indicates an underlying neurological impairment
Students with both kinds of weaknesses really need special education

16 When to Use IQ Instead of Cross-Battery Mean
Okay to use IQ as the predictor because it has high correlations with most processes
Technically more appropriate because it has a known reliability and SEM
Use when only weak processes were tested
Use when only a few processes were tested
Use when a legal challenge is anticipated

17 Processing Analysis Worksheet
Use composite scores from the test manual when possible
Convert all scores to standard scores
Compute clinical scores by averaging
Compute the processing mean, or use IQ
Calculate discrepancies
Determine weaknesses and deficits
Both kinds of weaknesses = a deficit
Do pairwise comparisons: opposites and those closely related
Completed Example
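What follows is not from the workshop materials; it is a minimal Python sketch, under stated assumptions, of how the worksheet arithmetic above could be automated once every score has been converted to a standard score (mean 100, SD 15). It applies the slide 14 guidelines (below 90 = normative weakness; a 12-point discrepancy from the processing mean or IQ = intra-individual weakness; both together = deficit). The function names, default thresholds, and example scores are illustrative, not Dehn's software.

```python
# Illustrative only: a hand-rolled version of the processing analysis worksheet logic.
# Assumes every score has already been converted to a standard score (mean 100, SD 15).

def clinical_score(subtest_scores):
    """Average two or more subtest standard scores into a clinical composite."""
    return sum(subtest_scores) / len(subtest_scores)

def analyze_processes(process_scores, predictor=None,
                      normative_cut=90, discrepancy_cut=12):
    """Flag normative weaknesses, intra-individual weaknesses, and deficits.

    process_scores : dict of process name -> standard score
    predictor      : optional IQ to use instead of the processing mean
    """
    mean = predictor if predictor is not None else (
        sum(process_scores.values()) / len(process_scores))
    results = {}
    for name, score in process_scores.items():
        normative = score < normative_cut
        intra = (mean - score) >= discrepancy_cut
        results[name] = {
            "score": score,
            "discrepancy": round(score - mean, 1),
            "normative_weakness": normative,
            "intra_individual_weakness": intra,
            "deficit": normative and intra,  # both kinds of weakness
        }
    return results

example = {
    "Working Memory": clinical_score([78, 82]),   # composite of two subtests = 80
    "Phonological Processing": 84,
    "Processing Speed": 96,
    "Fluid Reasoning": 105,
    "Visual-Spatial Processing": 102,
}
for process, result in analyze_processes(example).items():
    print(process, result)
```

In a full analysis the same comparison would be run across all 11 processes, or against IQ when that is used as the predictor.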

18 Pairwise Comparisons
For intervention planning, not diagnosis
Pay most attention to: opposites and those that are closely related
A greater discrepancy is required for significance
Significant when confidence intervals do not overlap
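As an illustration (not drawn from the workshop handouts) of the overlap rule on this slide, each score's confidence interval can be built from its standard error of measurement; the reliabilities and z value below are assumed inputs.

```python
import math

def confidence_interval(score, reliability, sd=15.0, z=1.96):
    """95% confidence interval using SEM = SD * sqrt(1 - reliability)."""
    sem = sd * math.sqrt(1 - reliability)
    return score - z * sem, score + z * sem

def pairwise_significant(score_a, rel_a, score_b, rel_b):
    """Treat the pairwise difference as significant when the two
    confidence intervals do not overlap."""
    lo_a, hi_a = confidence_interval(score_a, rel_a)
    lo_b, hi_b = confidence_interval(score_b, rel_b)
    return hi_a < lo_b or hi_b < lo_a

# Example: comparing two "opposite" processes, e.g. auditory vs. visual-spatial
print(pairwise_significant(82, 0.92, 103, 0.90))  # True: intervals do not overlap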

19 Non-Unitary Scores
When the standard score difference is greater than 22 points (or 15 points)
Something different is being measured, or something is different about the subtest task
Investigate further with more testing if the difference cannot be explained
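A small sketch (again, not from the workshop) of the unitary check this slide describes; the 22-point default and the stricter 15-point option mirror the slide.

```python
def is_unitary(subtest_scores, max_spread=22):
    """A composite is treated as non-unitary when its highest and lowest
    subtest standard scores differ by more than the cutoff
    (22 points, or 15 under the stricter option)."""
    return max(subtest_scores) - min(subtest_scores) <= max_spread

print(is_unitary([92, 78]))                 # True: 14-point spread
print(is_unitary([104, 79]))                # False: 25-point spread
print(is_unitary([92, 78], max_spread=15))  # True
```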

20 Using Dehn’s Automated Analysis Worksheet to Determine PSW
Automated worksheet from Essentials of Processing Assessment, 2nd Edition

21 Psychological Processing Analyzer 2.0
Available at
Identifies statistically significant strengths, weaknesses, deficits, and assets
Can enter composite and/or subtest scores
11 psychological processes
Takes scores (almost 400 to choose from) from more than 40 different scales: cognitive, achievement, rating, and processing

22 Psychological Processing Analyzer
Some subtests are re-classified based on the primary demands of the task
Options: use the mean of the process scores or IQ as the predicted score
Differences greater than the critical values are intra-individual weaknesses

23 PPA Equations
Converts all scores to a mean of 100 and SD of 15
.01 or .05 level of significance
Difference formulas based on the reliability coefficients of composites/subtests
Regression toward the mean
Predicted score based on the mean of the other 10 processes
Non-unitary scores are flagged
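The PPA's internal formulas are not reproduced in the slides, so the sketch below shows standard psychometric versions of the pieces the slide names: a predicted score regressed toward the mean, and a reliability-based critical difference. The correlation, reliability values, and example scores are assumptions for illustration.

```python
import math

def predicted_score(predictor_score, r, population_mean=100.0):
    """Regress the predictor (e.g., the mean of the other 10 processes, or IQ)
    toward the population mean using its correlation with the target process."""
    return population_mean + r * (predictor_score - population_mean)

def critical_difference(rel_a, rel_b, sd=15.0, z=2.58):
    """Reliability-based critical value for a difference score:
    SE(diff) = SD * sqrt(2 - r_a - r_b); z = 2.58 for .01, 1.96 for .05."""
    return z * sd * math.sqrt(2 - rel_a - rel_b)

predicted = predicted_score(predictor_score=102.0, r=0.60)   # 101.2
critical = critical_difference(rel_a=0.90, rel_b=0.88)       # about 18.2
obtained = 80.0
print(abs(obtained - predicted) > critical)  # True: an intra-individual weakness
```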

24 PPA Demo and Report
See demo
See sample report
Report has a table, graph, and narrative
Pairwise comparisons are also provided
Identifies academic areas associated with the identified deficits

25 Using a Rating Scale to Assess Processes
Processing deficits are manifested through behaviors
Behavior ratings by teachers can be used to measure processing abilities
Examples: BRIEF and other executive function scales; also, the new CPPS
Use the CPPS for processes not directly tested

26 Children’s Psychological Processes Scale (CPPS) Overview
Standardized teacher rating scale
Ages 5 to 12; 121 items across 11 subscales
Entirely online, web based
Online administration time of 15 minutes
Online scoring and report
Author: Milton Dehn; published by Schoolhouse Educational Services, 2012
Measurement Consultant: Kevin McGrew

27 Main Purpose of the CPPS
To identify psychological (cognitive) processing weaknesses in children referred for a learning disability evaluation
An additional source of data for diagnostic purposes
Can be used for a Pattern of Strengths and Weaknesses (PSW) analysis
Covers processes not directly tested

28 The CPPS Identifies Children with SLD
LD subjects had significantly higher means on all subscales; about a 1.5 SD difference
The CPPS has high classification accuracy with regard to LD
37 LD subjects were compared with matched controls
Using a CPPS GPA cutoff of 60 yielded 92% classification accuracy across the 74 subjects

29 CPPS Standardization
1,121 students rated by 278 teachers
128 communities in 30 states in the U.S.
All data collected online
Demographics match the U.S. Census well
Norms: 4 age groups (5-6; 7-8; 9-10; 11-12)
Included children with disabilities

30 CPPS Processes
Attention
Auditory Processing
Executive Functions
Fine Motor
Fluid Reasoning
Long-Term Recall
Oral Language
Phonological Processing
Processing Speed
Visual-Spatial Processing
Working Memory (WM)

31 CPPS General Processing Ability (GPA)
Based on the average of all process scores
Emerges from factor analysis; similar to the concept of general intelligence
Processes function in an inter-related fashion
Most processes contribute to any given behavior or task
On the CPPS, defined as “the underlying efficiency of processing”

32 Additional CPPS Factors
Second factor is Attention, EF, and WM: Self-Regulatory Processes (SRP)
Third factor is Fine Motor and Visual-Spatial: Visual-Motor Processes
Clusters: Memory, Language
WM loads higher on GPA than on SRP: WM is both a cognitive and an executive process because it includes STM and WM

33 How the Web-Based CPPS Works
A psychologist side and a teacher side
The psychologist manages and has access to student records
Teachers can only access blank rating forms
Once the teacher has completed the ratings, the completed form goes to the psychologist's side and the teacher can no longer access it

34 Completing Teacher Rating Form
Takes approximately 15 minutes
Responses: Never, Sometimes, Often, Almost Always
Must respond to all items
Incomplete ratings will save and can be completed later
Free paper copies can be printed; the psychologist then fills in the ratings online

35 CPPS Items
What the teacher rater sees
Regrouped by subscale after rating
Items are in developmental (ability) order, from lowest to highest
Example of scoring in developmental sequence

36 CPPS Report
Brief narrative, graph, and a table of scores
Change-sensitive W-scores
T-scores, percentiles, and confidence intervals
Intra-individual strengths and weaknesses discrepancy table
T-score to standard score converter
Example

37 CPPS Discrepancy Analysis
Use the discrepancy table to determine the pattern of strengths and weaknesses
Predicted score based on the mean of the other 10 subscales
Regression toward the mean is included
Discrepancy criterion options of up to +/- 2.00 SD of the SEE
Strength and weakness labeling is opposite the sign of the discrepancy; e.g., a “-” value = a strength
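To make the discrepancy logic on this slide concrete, here is a sketch (not the CPPS scoring engine) of a regression-based discrepancy test: predict a subscale T-score from the mean of the other ten, regress it toward the mean, and compare the residual to a multiple of the standard error of estimate (SEE). The correlation, SD, and criterion values are assumptions; the labels are reversed because high CPPS T-scores indicate problems.

```python
import math

def cpps_discrepancy(observed_t, other_mean_t, r=0.70,
                     mean=50.0, sd=10.0, see_criterion=1.5):
    """Regression-based discrepancy for one CPPS subscale (illustrative values).

    observed_t    : T-score of the subscale being evaluated
    other_mean_t  : mean T-score of the other ten subscales (the predictor)
    r             : assumed predictor-subscale correlation
    see_criterion : how many SEEs the discrepancy must exceed (e.g., 1.5 to 2.00)
    """
    predicted = mean + r * (other_mean_t - mean)   # regression toward the mean
    see = sd * math.sqrt(1 - r ** 2)               # standard error of estimate
    discrepancy = observed_t - predicted
    significant = abs(discrepancy) > see_criterion * see
    # Higher CPPS T-scores mean more problems, so the labels are reversed:
    label = "weakness" if discrepancy > 0 else "strength"
    return round(discrepancy, 1), significant, label

print(cpps_discrepancy(observed_t=68, other_mean_t=54))  # (15.2, True, 'weakness')
```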

38 T-Score Conversion Table
Optional
Purpose: to see how consistent CPPS scores are with achievement and cognitive scores
Multiply the T-score's distance from the mean by 1.5, then reverse the direction of the distance
Example: a T-score of 60 is 10 points above the T-score mean of 50; 10 x 1.5 = 15; without reversal this would be 100 + 15 = 115; because high CPPS scores indicate weakness, subtract 15 from 100 instead = 85
Example
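The arithmetic on this slide is easy to script; the sketch below (not part of the CPPS materials) scales the T-score's distance from the T-score mean of 50 by 1.5, converting the SD of 10 to the standard score SD of 15, and then reverses the direction because high CPPS T-scores correspond to weaker processing.

```python
def cpps_t_to_standard(t_score, t_mean=50.0, ss_mean=100.0, ratio=1.5):
    """Convert a CPPS T-score (mean 50, SD 10) to a reversed standard score
    (mean 100, SD 15): high T-scores map to low standard scores."""
    distance = (t_score - t_mean) * ratio   # e.g., (60 - 50) * 1.5 = 15
    return ss_mean - distance               # reverse: 100 - 15 = 85

print(cpps_t_to_standard(60))  # 85.0
print(cpps_t_to_standard(40))  # 115.0
```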

39 Diagnosing LD with the CPPS
Look for a pattern of strengths and weaknesses (discrepancy table)
Weaknesses should also be normative weaknesses (T-scores above 60)
Look at the Table to identify “deficits”
Weaknesses should link to evidence-based achievement relations
Same criteria as the PSW model

40 CPPS Reliability
Internal consistency subscale reliability ranges from .88 to .98; .99 on the Total Score
Inter-rater reliability: range of .21 to .90; median coefficient of .765

41 Correlations with Achievement
High correlations with WJ III Achievement Test scores
The broader the achievement score, the higher the correlations

42 Correlations with WJ III COG
Fewer correlations than with achievement
All CPPS processes have significant correlations with Cognitive Fluency (the ability to quickly and fluently perform cognitive tasks)
Most CPPS scales correlate with the WJ III COG tests they were expected to link with; the exceptions are Attention, Processing Speed, and WM (although WM does relate to Short-Term Memory)
There is also discriminant evidence

43 CPPS Correlations with BRIEF
CPPS Attention, Executive Functions, and Working Memory (the SRP factor) have the highest correlations with all BRIEF scales
CPPS Attention and EF correlations are mostly above .70, indicating they measure the same domains as the BRIEF
Other CPPS scales correlate with the BRIEF metacognitive scales but not the behavioral scales

44 Questions

