Planning For, Interpreting & Using Assessment Data Gary Williams, Ed.D. Instructional Assessment Specialist, Crafton Hills College

2 Planning For, Interpreting & Using Assessment Data Gary Williams, Ed.D., Instructional Assessment Specialist, Crafton Hills College, gwilliams@craftonhills.edu Fred Trapp, Ph.D., Administrative Dean, Institutional Research/Academic Services, Long Beach City College, ftrapp@lbcc.edu October 2007

3 Goals of the Presentation “Demystify” the assessment process Provide practical approaches and examples for assessing student learning Answer questions posed by attendees about their assessment challenges

4 What’s It All About? An ongoing process aimed at understanding and improving student learning. Faculty making learning expectations explicit and public. Faculty setting appropriate standards for learning quality.

5 What’s It All About? Systematically gathering, analyzing, and interpreting evidence to determine how well student performance matches agreed-upon faculty expectations and standards. Using results to document, explain, and improve teaching and learning performance. - Tom Angelo, AAHE Bulletin, November 1995

6 Roles of Assessment “We assess to assist, assess to advance, assess to adjust”: –Assist: provide formative feedback to guide student performance –Advance: summative assessment of student readiness for what’s next –Adjust: continuous improvement of curriculum and pedagogy - Ruth Stiehl, The Assessment Primer: Creating a Flow of Learning Evidence (2007)

7 Formulating Questions for Assessment Curriculum designed backwards; students’ journey forward: –What do students need to DO “out there” that we’re responsible for “in here”? (Stiehl) Subsequent roles in life (work, further study, etc.) –How do students demonstrate the intended learning now? –What kinds of evidence must we collect, and how do we collect it?

8 Assessment Questions & Strategies: Factors to Consider Meeting Standards –Does the program meet or exceed certain standards? –Criterion-referenced, commonly against state or national standards Comparing to Others –How does the student or program compare to others? –Norm-referenced, against other students, programs, or institutions

9 Assessment Questions & Strategies: Factors to Consider Measuring Goal Attainment –Does the student or program do a good job at what it sets out to accomplish? –Internal reference: goals and educational objectives compared to actual performance. –Formative and student-centered. –Professional judgment about evidence is common.

10 Assessment Questions & Strategies: Factors to Consider Developing Talent and Improving Programs –Has the student or program improved? –How can the student’s program and learning experience be improved even further? –Formative and developmental. –Variety of assessment tools and sources of evidence.

11 Choosing Assessment Tools Depends upon the “unit of analysis” –Course –Program –Degree/general education –Co-curricular Also depends upon overall learning expectations

12 Formulating Assessment Strategies

15 Direct vs. Indirect Evidence Direct –What students can actually do or demonstrate they know –Observable with one’s own eyes –Setting is structured/contained Indirect –What students say they can do –Focus on the learning process or environment –Things from which learning is inferred –Setting is not easily structured/contained

16 Qualitative vs. Quantitative Qualitative –Words –Categorization of performance into groups –Broad emergent themes –Holistic judgments Quantitative –Numbers –Individual components and scores –Easier to calculate, compare, and present to a public audience
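
The two approaches can also work together: holistic, qualitative ratings can be tallied into numbers for easier comparison and public presentation. Here is a minimal Python sketch of that bridge (not from the presentation; the rubric category names and ratings are hypothetical):

```python
from collections import Counter

# Hypothetical holistic rubric ratings assigned to a set of student essays.
ratings = ["Proficient", "Developing", "Proficient", "Exemplary",
           "Developing", "Proficient", "Beginning", "Proficient"]

# Tally the qualitative categories into quantitative frequencies.
counts = Counter(ratings)
total = len(ratings)
for category in ["Exemplary", "Proficient", "Developing", "Beginning"]:
    pct = 100 * counts[category] / total
    print(f"{category:<10} {counts[category]:>2} students ({pct:.0f}%)")
```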

17 Formative vs. Summative Formative –Assessment for learning –“In progress” –Provides corrective feedback –Establishes foundational learning for the next step Summative –Assessment for evaluative purposes –“After the fact” –Determines progress/achievement/proficiency –Readiness for the next step/role/learning experience

18 Means of Assessment (Quantitative Judgments) Cognitive –Standardized exams –Locally developed exams Attitudes/beliefs –Opinion surveys of students, graduates, employers

19 Means of Assessment (Qualitative Judgments) Cognitive –Embedded classroom assignments Behavior/performances (skills applications) –Portfolios –Public performances –Juried competitions –Internships –Simulations –Practical demonstrations Attitudes/beliefs –Focus groups

20 Interpreting Results: How Good Is Good Enough? Norm Referencing –Comparing student achievement against that of other students doing the same task Criterion Referencing –Comparing student achievement against criteria and standards of judgment developed within the institution
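
To make the two reference points concrete, here is a minimal Python sketch (not from the presentation; the exam scores and the 70% cutoff are hypothetical) showing how the same set of results reads under each approach:

```python
# Hypothetical exam scores (percent correct) for one class.
scores = [55, 62, 68, 71, 74, 78, 81, 85, 90, 94]

# Criterion referencing: compare each student to a fixed standard
# set within the institution (here, an assumed 70% cutoff).
CUTOFF = 70
met_standard = sum(s >= CUTOFF for s in scores)
print(f"Criterion-referenced: {met_standard}/{len(scores)} "
      f"students met the {CUTOFF}% standard")

# Norm referencing: compare a student to the other students doing
# the same task, e.g., as a percentile rank within the group.
def percentile_rank(score, group):
    """Percent of the group scoring at or below this score."""
    return 100 * sum(s <= score for s in group) / len(group)

print(f"Norm-referenced: a score of 74 falls at the "
      f"{percentile_rank(74, scores):.0f}th percentile")
```

Criterion referencing answers “did students meet our standard?”; norm referencing answers “how does this student stand relative to peers?”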

21 Are Results Valid and Reliable? Validity Reliability Authentic assessment Are we asking important questions or merely easy ones? Do the results inform teaching and learning?
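
Reliability, in particular, can be estimated rather than asserted. As one common illustration (the presentation does not prescribe a method; the item scores below are hypothetical), Cronbach’s alpha gauges the internal consistency of a locally developed exam:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: internal-consistency reliability of a test.

    items: 2-D array, rows = students, columns = test items.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-point items scored for five students (1 = correct).
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
# Values near 1 suggest the items measure the same thing consistently.
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```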

22 How Does Assessment Data Inform Decision-Making? Goal: making sound curricular and pedagogical decisions based on evidence. Assessment questions are tied to instructional goals. Assessment methods yield valid and reliable data. A variety of measures are considered. Assessment is an ongoing cycle.

23 Assessment Process

24 Collaboration Among Faculty, Administration & Researchers Assessment, the auto, and a road trip: an analogy –Who should drive the car? –Who provides the car, gas, insurance and maintenance? –Who brings the maps, directions, repair manual, tool kit, first aid kit, and stimulates the conversation along the journey?

25 Why Faculty Are the Drivers Faculty have the primary responsibility for facilitating learning (delivery of instruction) Faculty are already heavily involved in assessment (classroom, matriculation) Faculty are the content experts Who knows better than faculty what students should learn?

26 Who Provides the Car and Keeps Gas in It? Administrators!

27 Role of Administrators Establish that an assessment program is important at the institution Ensure college’s mission and goals reflect a focus on student learning Institutionalize the practice of data-driven decision making (curriculum change, pedagogy, planning, budget, program review) Create a neutral, safe environment for dialogue

28 Where Does IR Sit in the Car?

29 Roles of Researchers Serve as a resource on assessment methods Assist in the selection/design and validation of assessment instruments Provide expertise on data collection, analysis, interpretation, reporting, and use of results Facilitate dialogue: train and explain Help faculty improve their assessment efforts

30 Faculty DON’Ts… Avoid the SLO process or rely on others to do it for you. Rely on outdated evaluation/grading models to tell you how your students are learning. Use only one measure to assess learning. Criticize or inhibit the assessment efforts of others.

31 Faculty DOs... Participate in the SLO assessment cycle. Make your learning expectations explicit. Use assessment opportunities to teach as well as to evaluate. Dialogue with colleagues about assessment methods and data. Focus on assessment as a continuous improvement cycle.

32 Questions From the Field…

