Why I don’t understand the assessment results (and what I can do about it)

Presentation on theme: "Why I don’t understand the assessment results (and what I can do about it)"— Presentation transcript:

1 Why I don’t understand the assessment results (and what I can do about it)

2 Hi…I’m the Assessment Director

3 Reasons for Assessment
- Accreditors demand it
- Institutional leaders ought to care
- Faculty members probably do care
- Students should know what we think of them

4 The Theory

5 What Could Go Wrong???

6 Take Action: Now what?

7 Take Action: Now What?

8 Assessment: The Process
1. Design
2. Implementation
3. Reporting
4. Follow-up

9 Design

10 Design: Validity and Reliability
"The emphasis [on validity] is not on the instrument itself; rather, it is on the interpretation of the scores yielded by a test." -- College BASE Technical Manual
"The CLA measures were designed by nationally recognized experts in psychometrics and assessment, and field tested in order to ensure the highest levels of validity and reliability." -- Collegiate Learning Assessment advertisement

11 Design: Epistemology
"We say that a sentence is factually significant to any given person, if and only if, [she or] he knows how to verify the proposition which it purports to express—that is, if [she or] he knows what observations would lead [her or him], under certain conditions, to accept the proposition as being true, or reject it as being false." -- A. J. Ayer, Language, Truth, and Logic
"[T]he meaning of a word is its usage in the language." -- L. Wittgenstein

12 Design: Complexity and Validity
Collegiate learning is complex. Assessment should be:
- Authentic
- Contextual
- Rigorous
- Reliable
- Understandable

13 Design: Measurement, Smeasurement
Measurement requires:
- Units
- The ability to aggregate
What we actually do is estimation.

14 Analysis

15 Analysis: Using the Data
Pieper, Fulcher, and Erwin:
1. Differences
2. Relationships
3. Change
4. Competency
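The four uses of data above can each be reduced to a one-line computation. The sketch below illustrates them with invented pre-test and post-test scores and a hypothetical competency benchmark of 75; none of these numbers come from the presentation.

```python
# Hypothetical scores illustrating the four uses of assessment data
# (differences, relationships, change, competency). All data invented.
from statistics import mean

pre = [61, 64, 70, 72, 75, 78, 80, 83]    # pre-test scores (invented)
post = [68, 70, 74, 79, 80, 84, 88, 90]   # post-test scores (invented)
cutoff = 75                               # hypothetical competency benchmark

# 1. Differences: gap between two groups (here, first vs. second half)
diff = mean(post[4:]) - mean(post[:4])

# 2. Relationships: Pearson correlation between pre and post scores
def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x)
    vary = sum((b - my) ** 2 for b in y)
    return cov / (varx * vary) ** 0.5

r = pearson(pre, post)

# 3. Change: average pre-to-post gain per student
gain = mean(b - a for a, b in zip(pre, post))

# 4. Competency: proportion of students at or above the benchmark
competent = sum(s >= cutoff for s in post) / len(post)

print(f"difference: {diff}, r: {r:.3f}, gain: {gain}, competent: {competent:.0%}")
```

Each of the four numbers answers a different question, which is why the framework separates them: a large average gain (change) can coexist with a low proportion of competent students (competency).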

16 Analysis: Tools
- MS Access
- Spreadsheet software
- Pivot tables
- Statistics packages: Excel, SPSS, SAS
- Logistic regression
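A pivot table, whatever the tool, is just counts grouped on two dimensions. A minimal sketch in plain Python, with invented records standing in for data that would normally come out of MS Access or a spreadsheet export:

```python
# A pivot table reduced to its essentials: counts keyed by (row, column).
# The (major, outcome) records are invented for illustration.
from collections import Counter

records = [
    ("Biology", "pass"), ("Biology", "pass"), ("Biology", "fail"),
    ("History", "pass"), ("History", "fail"), ("History", "fail"),
]

pivot = Counter(records)  # maps (row label, column label) -> count

rows = sorted({r for r, _ in records})
cols = sorted({c for _, c in records})
print("major     " + "  ".join(cols))
for r in rows:
    counts = "  ".join(str(pivot[(r, c)]).rjust(len(c)) for c in cols)
    print(r.ljust(10) + counts)
```

Spreadsheet pivot tables and statistics packages add aggregation functions and margins on top of this, but the underlying operation is the same grouping-and-counting.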

17 Analysis: Averages

18 Analysis: Proportions
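The contrast between the two preceding slides can be made concrete: the same scores summarized as an average (one abstract number, useful strategically) and as a proportion meeting a benchmark (directly actionable tactically). The scores and the cutoff of 70 below are invented for illustration.

```python
# Same hypothetical scores summarized two ways: average vs. proportion
# at or above a benchmark. All numbers invented.
from statistics import mean

scores = [55, 62, 68, 71, 74, 77, 81, 85, 90, 95]  # invented scores
benchmark = 70                                      # hypothetical cutoff

average = mean(scores)
proportion = sum(s >= benchmark for s in scores) / len(scores)

print(f"average score: {average:.1f}")
print(f"share at or above {benchmark}: {proportion:.0%}")
```

"Average score 75.8" invites debate about the scale; "30% of students fall below the benchmark" points at a group of students someone can actually go help.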

19 Analysis: Example

20 Analysis: Example

21 Analysis: Example





26 CIRP Self-Assessed Mathematical Ability

                         Frequency  Percent  Valid Percent  Cumulative Percent
  Valid   Lowest 10%            10      5.2            5.2                 5.2
          Below average         46     23.7           23.8                29.0
          Average               79     40.7           40.9                69.9
          Above average         48     24.7           24.9                94.8
          Highest 10%           10      5.2            5.2               100.0
          Total                193     99.5          100.0
  Missing System                 1       .5
  Total                        194    100.0
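As a check on how SPSS-style frequency tables like this one are built, the Percent, Valid Percent, and Cumulative Percent columns can be recomputed from the frequencies alone. The frequencies below are taken from the table; note that the one missing response is excluded from the valid base of 193.

```python
# Recompute the CIRP table's percentage columns from its frequencies.
# "Percent" uses all 194 cases; "Valid Percent" excludes the 1 missing case.
frequencies = {
    "Lowest 10%": 10,
    "Below average": 46,
    "Average": 79,
    "Above average": 48,
    "Highest 10%": 10,
}
n_total = 194                        # includes 1 missing response
n_valid = sum(frequencies.values())  # 193 valid responses

cumulative = 0.0
for label, f in frequencies.items():
    percent = 100 * f / n_total      # base: all respondents
    valid = 100 * f / n_valid        # base: valid respondents only
    cumulative += valid
    print(f"{label:14s} {f:3d} {percent:5.1f} {valid:5.1f} {cumulative:6.1f}")
```

Rounding each column independently reproduces the table exactly, including the small Percent/Valid Percent discrepancies (e.g. 23.7 vs. 23.8 for "Below average").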

27 Analysis: Example









36 Conclusions
- At the strategic level, averages are fine
- At the tactical level, proportions are easier to act on
- Abstraction is the enemy
- Don't sacrifice validity for reliability without a good reason
- For complex skills, subjectivity is your friend

37 Last Requests?
