
1 Developing Tests for Departmental Assessment
Deborah Moore, Assessment Specialist
Institutional Research, Planning, & Effectiveness
University of Kentucky

2 Session Overview
- Steps in Test Development and Test Administration
- Process for Evaluating Tests and Items
- Common Difficulties
- Strategies for Analysis and Reports

3 Improve Student Learning

4 Exercise 1 (10 minutes)
- How do you decide you are going to give a test?
- What do you do to develop a new test?
- How do you decide what types of item formats to use?
- How do you develop items?
- How long does it take to put the test together?

5 Using a Test for Classroom Assessment
- Generally, instructors develop their own classroom tests, making all decisions about when and how to construct, administer, score, and report results of tests.
- Construction is often done without formality or documentation.
- The most frequent use of tests by instructors is to assign grades related to individual student learning.

6 A Test Is a Good Choice When…
- the student must demonstrate acquisition of knowledge or ability to process and use knowledge
- the student's knowledge about a wide range of content is to be evaluated (e.g., survey and capstone courses)
- multiple observations of the content-related knowledge are needed (e.g., math and foreign languages)
- more resources are available for constructing the test than for scoring and reporting
- a large group is being assessed

7 Using a Test for Departmental Assessment
- Planning, implementing, and using results become a group effort – a shared set of decisions and responsibilities. Consensus is emphasized.
- Additional planning time, communication, and record keeping will be needed.
- When used for program assessment, test performance is generally used along with other information to describe group achievement and is independent of grading.

8 Advantages of Selected-Response Item Formats
- Better content coverage
- Higher reliability
- Greater efficiency
- Objectivity
- Mechanical scoring (see the scoring sketch below)
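To make the objectivity and mechanical-scoring points concrete, here is a minimal scoring sketch; the answer key and response records are hypothetical placeholders standing in for whatever format a department's scanning or survey tool actually produces.

```python
# Minimal sketch of mechanical scoring for selected-response items.
# ANSWER_KEY and the student responses below are hypothetical.

ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C"}  # item number -> keyed option

def score_response(responses: dict) -> int:
    """Return the number-correct score for one student."""
    return sum(1 for item, key in ANSWER_KEY.items() if responses.get(item) == key)

students = {
    "s001": {1: "B", 2: "D", 3: "C", 4: "C"},
    "s002": {1: "A", 2: "D", 3: "A", 4: "C"},
}
scores = {sid: score_response(resp) for sid, resp in students.items()}
print(scores)  # every scorer applying the same key gets the same result
```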

9 Test Development Process
- Define purpose/use
- Outline curriculum & consult goals/objectives
- Create test plan (aka blueprint or specifications)
- Create a pool of items
- Critique and revise items
- Pilot/field test (e.g., item analysis, reliability/validity studies; see the item-analysis sketch below)
- Set guidelines for test administration, scoring procedures, interpretation of scores (e.g., develop norm tables and standard setting procedures)
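As an illustration of the pilot/field-test step, the sketch below computes the basic item-analysis statistics named on the slide for a small, made-up 0/1 scored response matrix: item difficulty, corrected item-total discrimination, and internal consistency (Cronbach's alpha, equivalent to KR-20 for dichotomous items). The data and array layout are assumptions, not results from the presentation.

```python
# A minimal item-analysis sketch for pilot data, assuming a 0/1 scored
# response matrix (rows = examinees, columns = items). Data are illustrative.
import numpy as np

scored = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
])

totals = scored.sum(axis=1)

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = scored.mean(axis=0)

# Item discrimination: correlation between each item and the total score
# with that item removed (corrected item-total correlation).
discrimination = np.array([
    np.corrcoef(scored[:, j], totals - scored[:, j])[0, 1]
    for j in range(scored.shape[1])
])

# Internal consistency: Cronbach's alpha (KR-20 for 0/1 items).
k = scored.shape[1]
alpha = (k / (k - 1)) * (1 - scored.var(axis=0, ddof=1).sum() / totals.var(ddof=1))

print("difficulty:", difficulty)
print("discrimination:", discrimination)
print("alpha:", round(alpha, 2))
```

Items with very low difficulty or near-zero (or negative) discrimination are the ones to flag for revision before the operational administration.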

10 Exercise 2 (10 minutes)
- What role does the exam play in the assessment plan for this department?
- What type of item formats should be used on the comprehensive exam?

11 Objectives
- Respond in an informed way to the form, structure and aesthetic qualities of artistic and literary works.
- Explore the interrelationships among historical events and intellectual, artistic, literary and philosophical or religious movements and works.
- Identify and analyze similarities, differences and interrelationships among the fine arts.
- Articulate central philosophical and religious questions and the varying responses to them within different cultures.
- Apply appropriate vocabulary and concepts for the description and analysis of artistic, literary, historical and philosophical or religious works.
- Explain how artistic and literary works from past and present civilizations are individual expressions of cultural, historical and intellectual forces.

12 Test Blueprint (Content X Objective)

13 Test Plan (Items Within Cells)
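One way to keep the blueprint and the items-within-cells plan explicit is to store them as a simple data structure that the group can review and revise together. The sketch below is a hypothetical example only: the content areas, objective labels, item counts, and total test length are placeholders a department would replace with its own plan.

```python
# A minimal sketch of a test blueprint (content area x objective) with
# hypothetical content areas, objectives, and per-cell item counts.
CONTENT_AREAS = ["Literature", "Fine Arts", "Philosophy/Religion", "History"]
OBJECTIVES = ["Recall", "Analysis", "Evaluation"]

# Number of items planned for each (content area, objective) cell.
blueprint = {
    ("Literature", "Recall"): 4, ("Literature", "Analysis"): 3, ("Literature", "Evaluation"): 2,
    ("Fine Arts", "Recall"): 4, ("Fine Arts", "Analysis"): 3, ("Fine Arts", "Evaluation"): 2,
    ("Philosophy/Religion", "Recall"): 3, ("Philosophy/Religion", "Analysis"): 3, ("Philosophy/Religion", "Evaluation"): 2,
    ("History", "Recall"): 3, ("History", "Analysis"): 3, ("History", "Evaluation"): 2,
}

TEST_LENGTH = 34  # planned number of items on the form (illustrative)

def check_blueprint(plan: dict, expected_total: int) -> None:
    """Verify the cell allocations add up to the planned test length, then print rows."""
    total = sum(plan.values())
    assert total == expected_total, f"blueprint allocates {total} items, expected {expected_total}"
    for area in CONTENT_AREAS:
        print(area, {obj: plan[(area, obj)] for obj in OBJECTIVES})

check_blueprint(blueprint, TEST_LENGTH)
```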

14 Administrative Approaches
- Common exams
- Course-embedded testing
- Assessment Center/Assessment Days

15 Sources for Items
- Research
- Curriculum
- Real life
- Professional development materials & textbooks
- Own final exam files
- Comprehensive exams from other universities
- Retired test/item banks

16 Common Difficulties with Test Quality
- Push for higher order thinking often weak
- Insufficient consensus on test plan/blueprint
- Discomfort with various item-writing approaches (writer’s block)
- Insufficient editing
- Little or no piloting of items

17 Critiquing Items
- Match with test plan
- Item value
- Item quality

18 Typical Item Review Questions
- Has each item received a quality appraisal?
- Has each item’s content been verified?
- Has each item been classified?
- Has the key been identified?
- Have you edited the items?
- Have items been checked for bias/insensitivity?
- Have you field tested the items?

19 Planning a Departmental Test
- Develop test
- Plan administration
- Analyze test data
- Report and use results
- Agree on leadership/support roles

20 Exercise 3 (30 minutes)
- Review the headings of the checklist form
- List additional headings needed
- List activities that need to be completed under the various headings
- Add a number to the activities to represent the order of completion

21 Analyses
- Examine quality of your assessment tools and procedures
- Describe the test results (see the sketch below)
  - By test plan features
  - By meaningful subgroups
- Combine with other datasets
- Execute and summarize planned comparisons linked to research design
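A minimal sketch of this analysis step, assuming item-level results have already been scored and exported to a table; the column names, subscores, subgroup variable, and the merged GPA file are illustrative assumptions, not part of the presentation.

```python
# Describe results by test-plan feature and by subgroup, then combine
# with another (hypothetical) departmental dataset.
import pandas as pd

scores = pd.DataFrame({
    "student_id": ["s01", "s02", "s03", "s04"],
    "total":      [28, 31, 22, 26],
    "analysis_subscore": [9, 11, 6, 8],   # items mapped to the "Analysis" objective
    "recall_subscore":   [12, 13, 10, 11],
    "track":      ["BA", "BS", "BA", "BS"],
})

# Describe results by test-plan feature (objective-level subscores).
print(scores[["total", "analysis_subscore", "recall_subscore"]].describe())

# Describe results by meaningful subgroup.
print(scores.groupby("track")["total"].agg(["count", "mean", "std"]))

# Combine with other departmental data (hypothetical GPA file).
gpa = pd.DataFrame({"student_id": ["s01", "s02", "s03", "s04"],
                    "gpa": [3.1, 3.6, 2.8, 3.3]})
merged = scores.merge(gpa, on="student_id")
print(merged[["total", "gpa"]].corr())
```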

22 Reports
- Determine stakeholders/constituents
- Identify specific interest
- Create a timeline and share
- Include a vetting procedure

23 Testing in the Future
- Computer based testing
  - Adaptive (CAT) (selection loop sketched below)
  - Self-Adaptive (SAT)
- Computer assisted item generation
- Automated essay scoring
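For the adaptive (CAT) bullet, the sketch below shows the core selection loop in its simplest form under a Rasch (1PL) model: administer the unused item whose difficulty is closest to the current ability estimate (where item information is highest), then update the estimate. The item bank, the crude step-size update, and the simulated responses are all illustrative; operational CAT systems use maximum-likelihood or Bayesian scoring plus content balancing and exposure controls.

```python
# Minimal sketch of adaptive item selection under a Rasch (1PL) model.
# Item difficulties (in logits) and the ability update rule are illustrative.

item_bank = {"q1": -1.0, "q2": -0.5, "q3": 0.0, "q4": 0.5, "q5": 1.0}

def next_item(theta: float, used: set) -> str:
    """Pick the unused item whose difficulty is closest to the ability estimate."""
    candidates = {name: b for name, b in item_bank.items() if name not in used}
    return min(candidates, key=lambda name: abs(candidates[name] - theta))

def update_theta(theta: float, correct: bool, step: float = 0.5) -> float:
    # Crude fixed-step update standing in for a proper ML or Bayesian estimate.
    return theta + step if correct else theta - step

theta, used = 0.0, set()
for simulated_response in [True, True, False]:
    item = next_item(theta, used)
    used.add(item)
    theta = update_theta(theta, simulated_response)
    print(item, round(theta, 2))
```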

24 Thank you for your attention.
Deborah Moore, Assessment Specialist
Office of Institutional Research, Planning, & Effectiveness
101B Alumni Gym
dlmoor2@email.uky.edu
859/257-7086
http://www.uky.edu/LexCampus/

