Assessment Instruments and Rubrics Workshop Series


1 Assessment Instruments and Rubrics Workshop Series
Part 5: Data Reporting Continued and Action Plans
April 27, 2016
Drs. Summer DeProw and Topeka Small, and guest Rosemary Freer, Director of Testing Services

2 Workshop Agenda
Follow-up from Part 4's workshop: questions, comments, experiences to share?
Utilizing the Testing Center for data collection: what Testing Services can do for you
Examples of data collection and analysis from the Testing Center
Closing the loop: what does it really mean? Examples
When is good, good enough?
Final thoughts about the workshop series (if time allows)
Assessing the Assessment Office (survey)

3 Follow-up from Part 4's Workshop
Questions, comments, experiences to share?
Rubrics?
Data reporting for rubrics and exams given in Blackboard?

4 Utilizing the Testing Center: What can the Testing Center do for you?
Score your objective exams using Scantron
Generate a detailed analysis
Conduct an item analysis
Provide a score distribution report
Provide testing psychometric measures:
Point biserial
Distractors
Reliability coefficient (KR20)

5 Testing Psychometric Measures
Point biserial: a measure of item reliability. It correlates student scores on one particular question with scores on the test as a whole. Values range from -1 to +1; the closer the value is to +1, the more reliable the question.
Reliability coefficient (KR20): the Kuder-Richardson Formula 20 measures test reliability and is an overall measure of internal consistency. A higher value indicates a stronger relationship between items on the test. Values generally fall between 0 and +1; .50 is considered an acceptable level.
Distractors: distractors (incorrect choices) should be plausible options. If some distractors are never chosen, they are not reasonable options and should be reviewed. ( ments.ashx)
Other sources for good test construction: ( pages/writing-good-multiple-choice-test-questions/)

6 Utilizing the Testing Center: Examples of data collection and analysis
A Detailed Analysis:

7 Utilizing the Testing Center: Examples of data collection and analysis
Item Analysis
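As a rough sketch of what an item analysis tabulates, the snippet below computes per-item difficulty (proportion correct) and how often each option, including distractors, was chosen. It assumes letter-coded answer sheets; the function and field names are illustrative, not the Testing Center's actual report format:

```python
from collections import Counter

def item_analysis(answers, key):
    """answers: list of per-student answer strings (one letter per item).
    key:     answer-key string, e.g. "ABC".
    Returns one dict per item: item number, difficulty (proportion
    correct), and a count of how often each option was chosen."""
    n = len(answers)
    report = []
    for j, correct in enumerate(key):
        chosen = Counter(a[j] for a in answers)
        report.append({
            "item": j + 1,
            "difficulty": chosen[correct] / n,  # proportion answering correctly
            "choices": dict(chosen),            # option -> count (distractor check)
        })
    return report
```

An option that never appears in `choices` is a distractor nobody picked, which the previous slide flags as a candidate for review.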

8 Utilizing the Testing Center: Examples of data collection and analysis
Score distribution
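A score distribution report of the kind shown can be sketched as simple binning of percentage scores into ranges (90s, 80s, and so on). This is a minimal illustration, not the Testing Center's actual output format:

```python
from collections import Counter

def score_distribution(scores, bin_width=10):
    """Bucket percentage scores into ranges for a frequency report.
    Returns {bin_floor: count}, highest range first; a score of 100
    falls into its own 100 bin."""
    bins = Counter((s // bin_width) * bin_width for s in scores)
    return dict(sorted(bins.items(), reverse=True))
```

For example, scores of 95, 92, 85, 70, and 68 yield two scores in the 90s, and one each in the 80s, 70s, and 60s.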

9 Closing the Loop?! What does it really mean?
Synonymous with follow-up.
Simply put: use the data you collected to develop an action plan that addresses any issues cited.
Make the changes you determined need to be made, based on the analysis of the data collected.
Report out on how the "treatment" impacted student learning. Did what you tried increase student learning? Did things stay the same?
Now, start all over again! Develop outcomes, provide learning opportunities, assess student achievement of the outcomes, analyze the results/data, develop an action plan, follow through with the plan, assess student achievement after the "treatment" has been applied, analyze the results/data, and report out on how the "treatment" impacted student learning. It's a never-ending loop!

10 Closing the Loop?! Examples
Possible changes could include:
Curriculum changes
Resource allocation changes
Academic-process changes

11 Closing the Loop?! Examples Cont.
Document, Document, Document what you have done to close the loop!

12 Closing the Loop?! When is good, good enough?
Open Discussion

13 Stay Tuned
Approaching deadlines:
Assessment reports due June 1, 2016, for all programs, for at least one outcome
Specialized accredited programs may submit a status report of student-learning assessment work since the last reaffirmation
Summer workshops:
Outcomes for general education digital books
Collecting data using EAC Visual Data across multiple course shells in Blackboard
Fall 2016 workshops:
Indirect assessment measures
Best practices in survey and exam construction

