
1 Program Assessment: A Process From Start to Finish RJ Ohgren – Office of Judicial Affairs Mandalyn Swanson, M.S. – Center for Assessment and Research Studies James Madison University

2 Session Outcomes
By the end of this session, attendees will be able to:
- Explain how assessment design informs program design
- Describe the "Learning Assessment Cycle"
- Express the difference between a goal, a learning objective, and a program objective
- Identify effective frameworks to design learning outcomes
- Define fidelity assessment and recognize its role in the Learning Assessment Cycle

3 By The Numbers: Where We Were vs. Where We Wanted to Be

4 Why Assess?
It's simple:
- The assessment cycle keeps us accountable and intentional
- We want to determine if the benefits we anticipated occur
- Are changes in student performance due to our program?
If we don't assess:
- Programming could be ineffective, and we won't know
- Our effective program could be terminated, and we have no proof it's working

5 Typical Assessment
Well, we've got to do a program. Let's put some activities together. Let's ask them questions afterwards about what we hope they get out of it. Um…let's ask if they liked the program too. And let's track attendance. Survey says…well, they didn't really learn what we'd hoped. But they liked it? And a good number of people came? Success!

6 Proper Assessment
- What do we want students to know, think, or do as a result of this program?
- Let's define goals and objectives that get at what we want students to know, think, or do.
- What specific, measurable things could show that we're making progress toward these goals and objectives?
- What activities can we incorporate to get at those goals and objectives?
- We have a program!

7 How are these approaches different?

8 Learning Assessment Cycle
Establish Program Objectives → Create & Map Programming to Objectives → Select and Design Instrument → Implementation Fidelity → Collect Objective Information → Analyze & Maintain Information → Use Information (and the cycle repeats)

9 Program Goals vs. Learning Goals

10 Goals, Objectives, & Items
[Diagram relating goals, objectives, and items]

11 Goals vs. Objectives
- Goals can be seen as the broad, general expectations for the program
- Objectives can be seen as the means by which those goals are met
- Items measure our progress toward those objectives and goals

12 Goals vs. Objectives
Goal: a general expectation of a student (or program) outcome; can be broad and vague.
  Example: Students will understand and/or recognize JMU alcohol and drug policies.
Objective: a statement of what students should be able to do or how they should change developmentally as a result of the program; more specific and measurable.
  Example: Upon completion of the BTN program, 80% of students will be able to identify 2 JMU policies relating to alcohol.

13 Putting It All Together
[Diagram: Goal → Objective → Assessment]

14 By The Numbers Program Goal (1 of 3)
Goal: To provide a positive classroom experience for students sanctioned to By the Numbers.
- Objective: 80% of students will report that the class met or exceeded their expectations of the class.
  Item: Class Evaluation #15 – Overall, I feel like this class…
- Objective: 80% of students will agree (or better) with the statement "the facilitators presented the material in a non-judgmental way."
  Item: Class Evaluation #5.5 – The facilitators presented the material in a non-judgmental way.
- Objective: 60% of students will report an engaging classroom experience.
  Item: Class Evaluation #5.1 – The facilitators encouraged participation.
  Item: Class Evaluation #5.4 – The facilitators encouraged discussion between participants.
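
One way to check whether criterion-referenced objectives like those above are met is to compute the proportion of respondents at or above the target response. Below is a minimal sketch in Python, assuming a hypothetical item and an invented set of responses rather than actual BTN class-evaluation data:

```python
# Hypothetical check of an objective's success criterion against item responses.
# The response scale and data are invented for illustration; the real BTN
# evaluation items and scales may differ.

AGREE_OR_BETTER = {"agree", "strongly agree"}

def criterion_met(responses, threshold=0.80):
    """Return (met, proportion) for the share of responses at 'agree' or better."""
    hits = sum(1 for r in responses if r.lower() in AGREE_OR_BETTER)
    proportion = hits / len(responses) if responses else 0.0
    return proportion >= threshold, proportion

# Example: item 5.5, "The facilitators presented the material in a non-judgmental way."
item_5_5 = ["strongly agree", "agree", "neutral", "agree", "agree",
            "strongly agree", "disagree", "agree", "agree", "strongly agree"]

met, prop = criterion_met(item_5_5, threshold=0.80)
print(f"Item 5.5: {prop:.0%} agree or better -> objective {'met' if met else 'not met'}")
```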

15 By The Numbers Learning Goal (1 of 5)
Goal: To ensure student understanding and/or recognition of JMU alcohol and drug policies.
- Objective: After completing BTN, 80% of students will be able to identify 2 JMU policies relating to alcohol.
- Objective: …identify the circumstances for parental notification.
- Objective: …identify the parties able to apply for amnesty in a given situation.
- Objective: …identify the geographic locations in which JMU will address an alcohol/drug violation.
- Objective: …articulate the three-strike policy.

16 By The Numbers Learning Goal (2 of 5)
Goal: To ensure student understanding and/or recognition of concepts surrounding alcohol.
- Objective: After completing BTN, 60% of students will be able to provide the definition of a standard drink for beer, wine, and liquor.
- Objective: …identify the definition of BAC.
- Objective: …describe the relationship between tolerance and BAC.
- Objective: …identify at least 2 factors that influence BAC.
- Objective: …identify the definition of the point of diminishing returns.
- Objective: …identify how the body processes alcohol and its effects on the body.

17 By The Numbers Learning Goal
Goal: To ensure student understanding and/or recognition of concepts surrounding alcohol consumption.
- Objective: After completing BTN, 80% of students will be able to correctly identify the definition of the point of diminishing returns.
  Items: Assessment Questions #12, #29
  Activities: Tolerance Activity, Point of Diminishing Returns discussion
- Objective: After completing BTN, 80% of students will be able to identify how the body processes alcohol and its effects on the body.
  Items: Assessment Questions #8, #9, #10
  Activity: Alcohol in the Body Activity

18 Developing Learning Outcomes
- Should be Student-Focused – worded to express what the student will learn, know, or do (Knowledge, Attitude, or Behavior)
- Should be Reasonable – should reflect what is possible to accomplish with the program
- Should be Measurable – "know" and "understand" are not measurable; the action one can take from knowing or understanding is
- Should have Success Defined – what is going to be considered passing?

19 Bloom's Taxonomy (levels run from less complex to more complex)
Level – Description
1. Knowledge – Recognize facts, terms, and principles
2. Comprehension – Explain or summarize in one's own words
3. Application – Relate previously learned material to new situations
4. Analysis – Understand the organizational structure of material; draw comparisons and relationships between elements
5. Synthesis – Combine elements to form a new, original entity
6. Evaluation – Make judgments about the extent to which material satisfies criteria

20 Bloom's Taxonomy
Bloom's Level – Verbs
1. Knowledge – match, recognize, select, compute, define, label, name, describe
2. Comprehension – restate, elaborate, identify, explain, paraphrase, summarize
3. Application – give examples, apply, solve problems using, predict, demonstrate
4. Analysis – outline, draw a diagram, illustrate, discriminate, subdivide
5. Synthesis – compare, contrast, organize, generate, design, formulate
6. Evaluation – support, interpret, criticize, judge, critique, appraise

21 The ABCD Method
A = Audience: What population are you assessing?
B = Behavior: What is expected of the participant?
C = Conditions: Under what circumstances is the behavior to be performed?
D = Degree: How well must the behavior be performed? To what level?
From "How to Write Clear Objectives"

22 The ABCD Method: Example
Objective: After completing BTN, 80% of students will be able to describe the relationship between tolerance and BAC.
Audience: By the Numbers participants
Behavior: Describe the relationship between tolerance and BAC
Condition: After taking the class
Degree: 80%

23 Common Mistakes
- Vague behavior. Example: Have a thorough understanding of the university honor code.
- Gibberish. Example: Have a deep awareness and thorough humanizing grasp on…
- Not student-focused. Example: Train students on how and where to find information.

24 Program Implementation: Deliver the program you say you will deliver. How?

25 Pre-Test (Low Item Score) → Program → Post-Test (High Item Score)

26 Pre-Test (Low Item Score) → Program → Post-Test (Low Item Score)

27 Fidelity Assessment
Are you doing what you say you're doing?
- Helps ensure your program is implemented as you intended
- Links learning outcomes to programming
- Helps answer "why" we aren't observing the outcomes we think we should be observing
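
A minimal sketch of the pre/post comparison that motivates a fidelity review: compute each item's gain and flag items that barely moved, which points back to the program component mapped to that item. The item labels, scores, and gain threshold here are hypothetical, not BTN data:

```python
# Hypothetical pre/post comparison: flag items whose mean score barely moved,
# i.e., the "low pre, low post" pattern that prompts a fidelity review.

pre_scores = {"item_8": 0.42, "item_9": 0.45, "item_12": 0.38}   # proportion correct (invented)
post_scores = {"item_8": 0.88, "item_9": 0.47, "item_12": 0.83}

MIN_GAIN = 0.15  # arbitrary threshold for what counts as a meaningful gain

for item, pre in pre_scores.items():
    gain = post_scores[item] - pre
    status = "ok" if gain >= MIN_GAIN else "check fidelity of the component mapped to this item"
    print(f"{item}: pre={pre:.2f} post={post_scores[item]:.2f} gain={gain:+.2f} -> {status}")
```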

28 Fidelity Components
- Program Differentiation: How are the many components of your program different from one another?
- Adherence: Was your program delivered as intended?
- Quality: How well were the components administered?
- Exposure: How long did each component last? How many students attended?
- Responsiveness: Were participants engaged during the program?

29 Fidelity Checklist – Generic
Columns: Student Learning Outcome | Program Component | Duration | Features | Adherence to Features | Quality
Example row: Objective X | Component(s) aligned with Objective X | Length of component | List of specific features | (Y/N) recorded for each feature | Quality rating for each feature
What is rated? The live/videotaped program
Who does the rating? Independent auditors, facilitators, participants
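
One way a single row of the generic checklist above might be recorded for an observed session, sketched as a small data structure. The field names and example values are illustrative assumptions, not the actual BTN observation form:

```python
# Illustrative record for one program component, mirroring the generic checklist
# columns. Field names and values are invented; adapt them to the real form.

from dataclasses import dataclass, field

@dataclass
class FidelityRecord:
    learning_outcome: str                  # e.g., "Objective X"
    component: str                         # programming mapped to that objective
    duration_minutes: int                  # exposure: how long the component ran
    features_present: dict = field(default_factory=dict)  # feature -> True/False (adherence)
    quality_ratings: dict = field(default_factory=dict)   # feature -> 1-5 rating (quality)
    rated_by: str = "independent auditor"  # could also be a facilitator or participant

record = FidelityRecord(
    learning_outcome="Describe the relationship between tolerance and BAC",
    component="Tolerance Activity",
    duration_minutes=20,
    features_present={"worked example shown": True, "discussion prompt used": False},
    quality_ratings={"worked example shown": 4},
)
print(record)
```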

30 Please walk away with this: You must assess.

