Presentation on theme: "Tips and guidelines" — Presentation transcript:

1 Tips and guidelines

2
• Too much time
• First, do no harm: many things do appear to be going well. Let's not screw them up.
• Why help others evaluate us?
• Actually the premise is not that faculty are doing a bad job, but that they are looking for ways to do a better job.
• We already make "data-driven" decisions
• Why document for others?

3
• Common sense
• Authority / seniority / expert status
• Logic
• Structured, empirical observation, a.k.a. "the scientific method"
• Middle States refers to a "culture of evidence" in which the fourth of these has pride of place.

4
MISSION: General learning goals
OBJECTIVES: Specific competencies, skills, and knowledge that support the learning goals
PROCESSES: Sites and methods for achieving objectives
ASSESSMENT: Do processes, as designed and implemented, lead to realization of objectives? Which processes work best?

5
• Ineffable outcomes
▪ Hard to measure or hard to defend?
▪ Take a long, cold look at agency
• Diversity of student learning goals
▪ Faculty / discipline-centered or student-centered?
• Student inputs versus student outcomes
▪ "Exposure" / opportunities / curricular requirements as ends in themselves versus processes leading to specific learning outcomes

6
• Goal statements should be comprehensive, organized, and integrated.
• But pick one or two goals / assessment loops to implement initially:
▪ Something that has cross-cutting implications for your department
▪ Something you're already doing
▪ Something you already believe is an area for improvement

7
• Senior capstone / thesis requirement
▪ Rubric-based assessment linked to departmental goals
▪ Be sure to design in "closing the loop"
• Core course(s) for the major
▪ Goal statements here are usually clear, agreed upon, and often more competency-based (as opposed to content-based).
▪ You may be able to compare multiple sections of the same course (assess the impact of experiments with differing pedagogy, etc.)

8
• Focus on a single competency
▪ Should be core to your departmental learning goals
▪ Choose one that is eminently assessable, even (gasp!) perhaps with standardized tests
▪ Quantitative reasoning
▪ Oral communication
• Curriculum matrix
▪ A grid showing how all departmental courses support all learning goals; can provide a useful map for later efforts at assessment.

9
• Grades often are not explicitly linked to learning goals for the assignment or course.
• Grades often do not communicate strengths and weaknesses to the student; written comments are more "rubric-like" than the grade.
• Standards vary across departments and across faculty within departments.
• Grades are about individual student performance; assessment is about departmental performance.

10
• Cross-rater consistency is key.
• Training and category definitions need to be explicit, often with explicit examples of student work.
• Can be enhanced by using persons other than faculty to apply the rubrics to student work.
• If copies of student work are archived, rubric-based assessment can be post hoc.
• Thoughtful sampling of student work can make the task manageable.
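The cross-rater consistency this slide emphasizes is commonly quantified with an inter-rater reliability statistic such as Cohen's kappa, which measures agreement between two raters beyond what chance alone would produce. A minimal sketch, using hypothetical rubric scores (1–4 scale) that two raters might assign to the same ten student papers:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Proportion of items on which the two raters gave the same score.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: probability both raters pick the same category
    # if each scored independently according to their own score frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores for ten papers from two trained raters.
a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
b = [3, 4, 2, 2, 3, 1, 4, 2, 3, 3]
print(round(cohens_kappa(a, b), 2))  # → 0.72
```

Values near 1 indicate strong agreement; low values suggest the rater training or category definitions need tightening before the rubric results can be trusted.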

11
• At a minimum, archive any produced materials.
• Some reporting to the Provost documenting cycle(s) of closing the loop: a "one-pager".
• Of course, more comprehensive reports are useful in many ways:
▪ Grant support (which is plentiful in this arena)
▪ Accountability to external agencies (MSCHE)
▪ Avoiding re-inventing the wheel in future generations

12 You may find these useful

13
• Useful
• Cost-effective / sustainable
• Reasonably accurate and truthful
▪ Direct versus indirect measures
▪ Multiple methods
• Planned, goal-oriented
• Systemic, coordinated across levels
▪ How do course goals relate to dept goals?
▪ How do dept goals relate to institutional goals?

14
MISSION: "Situate a good research question within the existing literature"
OBJECTIVES: Effective use of research databases; identifies gaps in extant literature; cites appropriately, etc.
PROCESSES: Research methods course; lit. review of research papers in dept.; library research practices
ASSESSMENT: Rubric-based assessment of the literature review portion of the research methods course and across written papers in the dept.; score on a library-organized test of research practices

15 Goal: Psychology majors will have a clear understanding of the logic of scientific inquiry and of psychological research methods.

16
Assertion: Bryn Mawr graduates pursue graduate education at higher rates than at peer institutions. Multiple data sources, all individually flawed, but collectively convincing since they all point in the same direction:
• PhD rates (consistent with the assertion, but don't include non-PhD degrees)
• Alumni survey data (consistent with the assertion, but are self-report and have a potentially high non-response bias)
• Career services one-year-out survey (consistent with the assertion, but does not capture data beyond one year out)
• Senior survey data (consistent with the assertion, but are self-report and reflect "planned" grad school, not actual attendance at the time of the survey)
• Student clearinghouse data (consistent with the assertion, but don't have good peer-institution data)

17
• Some web links:
▪ Middle States expectations for assessment: ations pdf
▪ Bryn Mawr IR website
▪ Past self-studies, links to other schools, external reports: creditation.html
▪ General resources: ssment.html

