Presentation transcript: Approaches to Assessment Workshop for College of the Redwoods, Fred Trapp, August 18, 2008

1 Approaches to Assessment Workshop for College of the Redwoods Fred Trapp August 18, 2008

2 Roles of Assessment. “We assess to assist, assess to advance, assess to adjust”: Assist: provide formative feedback to guide student performance. Advance: summative assessment of student readiness for what’s next. Adjust: continuous improvement of curriculum and pedagogy. (Ruth Stiehl, The Assessment Primer: Creating a Flow of Learning Evidence, 2007)

3 Formulating Questions for Assessment. Curriculum is designed backwards; the students’ journey moves forward. What do students need to DO “out there” that we’re responsible for “in here”? (Stiehl) Consider their subsequent roles in life (work, future study, etc.). How do students demonstrate the intended learning now? What kinds of evidence must we collect, and how do we collect it?

4 Assessment Questions & Strategies: Factors to Consider. Meeting Standards: Does the program meet or exceed certain standards? Criterion reference, commonly state or national standards. Comparing to Others: How does the student or program compare to others? Norm reference, against other students, programs, or institutions.

5 Assessment Questions & Strategies: Factors to Consider. Measuring Goal Attainment: Does the student or program do a good job at what it sets out to accomplish? Internal reference to goals and educational objectives compared to actual performance. Formative and student-centered. Professional judgment about evidence is common.

6 Assessment Questions & Strategies: Factors to Consider. Developing Talent and Improving Programs: Has the student or program improved? How can the student’s program and learning experience be improved even further? Formative and developmental. A variety of assessment tools and sources of evidence.

7 Formulating Assessment Strategies

8-9 (image-only slides; no transcript text)

10 Direct vs. Indirect Evidence. Direct: what the student can actually do or demonstrate they know; can be witnessed with one’s own eyes; the setting is structured/contained. Indirect: what students say they can do; focus on the learning process or environment; things from which learning is inferred; the setting is not easily contained/structured.

11 Qualitative vs. Quantitative. Qualitative: words; categorization of performance into groups; broad emergent themes; holistic judgments. Quantitative: numbers; individual components and scores; easier calculations, comparisons, and presentation to a public audience.

12 Formative vs. Summative. Formative: assessment for learning; “in progress”; provides corrective feedback; establishes foundational learning for the next step. Summative: assessment for evaluative purposes; “after the fact”; determines progress, achievement, or proficiency and readiness for the next step, role, or learning experience.

13 Means of Assessment (Quantitative Judgments). Cognitive: standardized exams; locally developed exams. Attitudes/beliefs: opinion surveys of students, graduates, and employers.

14 Means of Assessment (Qualitative Judgments). Cognitive: embedded classroom assignments. Behavior/performances (skills applications): portfolios; public performances; juried competitions; internships; simulations; practical demonstrations. Attitudes/beliefs: focus groups.

15 Step 3: Means of Assessment, Grades (see handout: Using the Grading Process for Assessment). Evaluation of individual students = assessment; the focus is the individual, not groups of students; a summative, not formative, act; objectivity of a single evaluator vs. a group; generally not accepted as direct evidence. Uses of the grading process: an agreed-upon course exam or part of an exam; a row-and-column model for assignments.

16 Individual Student Scores and SLO Assessment: Embedded Assignments. Total down the column for individual grading; analyze across the row for assessment of the intended outcomes for the group. (Jim Nichols)
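A minimal sketch of the row-and-column idea, in Python. The score matrix, student names, and outcome labels below are hypothetical and not taken from the workshop; the point is only that column totals serve individual grading while row averages serve SLO assessment of the group.

```python
# Hypothetical embedded-assignment score matrix (names, outcomes, and
# numbers invented for illustration). Each row is an assignment tied to
# an intended outcome (SLO); each column is a student.
students = ["Ann", "Ben", "Cara"]
scores = {
    "SLO 1: research essay": [88, 72, 95],
    "SLO 2: lab report":     [75, 80, 90],
    "SLO 3: final exam":     [82, 68, 88],
}

# Total down the column: one grade per individual student.
for i, name in enumerate(students):
    total = sum(row[i] for row in scores.values())
    print(f"{name}: course total = {total}")

# Analyze across the row: one result per intended outcome for the group.
for outcome, row in scores.items():
    print(f"{outcome}: group average = {sum(row) / len(row):.1f}")
```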

17 Interpreting Results: How Good Is Good Enough? Norm Referencing: comparing student achievement against other students doing the same task. Criterion Referencing: criteria and standards of judgment developed within the institution.
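To make the distinction concrete, a small Python illustration. The scores, the 70-point cut-off, and the percentile reading are invented for the example, not drawn from the workshop materials.

```python
# Hypothetical scores and cut-off, invented for illustration only.
scores = [55, 62, 70, 74, 78, 81, 85, 90, 93, 97]
student_score = 74

# Norm referencing: judge the score against other students doing the
# same task (here, the share of peers scoring at or below it).
percentile = 100 * sum(s <= student_score for s in scores) / len(scores)

# Criterion referencing: judge the score against a standard the
# institution itself has set, regardless of how peers performed.
criterion = 70
meets_standard = student_score >= criterion

print(f"Norm-referenced reading: {percentile:.0f}th percentile")
print(f"Criterion-referenced reading: "
      f"{'meets' if meets_standard else 'falls below'} the {criterion}-point standard")
```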

18 Are Results Valid and Reliable? Validity; reliability; authentic assessment; important questions or easy questions; does it inform teaching and learning?

19 How Does Assessment Data Inform Decision-Making? Goal: making sound curricular and pedagogical decisions based on evidence. Assessment questions are tied to instructional goals; assessment methods yield valid and reliable data; a variety of measures are considered; assessment is an ongoing cycle.

20 Assessment Process

