
1
Dr. Marsha Watson, Director of Assessment
Dr. Kenny Royal, Assistant Director of Measurement & Analysis
Dr. Julie F. Johnson, Assessment Specialist

2
Completion Dates

Date             Not actively engaged in           Actively engaged in
                 program-level assessment          program-level assessment
Sept 2009        Program-level student learning outcomes revised and/or updated
Dec 2009         Assessment strategy in place
Jan-Mar 2010     Assessment strategy implemented
April 2010       Assessment results available for faculty reflection and action
May 2010         First cycle completed and         At least one cycle completed
                 improvement plans submitted       and improvement plans submitted
September 2010   First annual LEARNING Improvement awards announced
May 2011         Two cycles completed              At least two cycles completed
August 2011      SACS Compliance Audit begins
September 2011   Second annual LEARNING Improvement awards announced

3
• How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students?
• What evidence do you have that students achieve your stated learning outcomes?
• In what ways do you analyze and use evidence of student learning?
• How do you ensure shared responsibility for student learning and for assessment of student learning?
• How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning?
• In what ways do you inform the public and other stakeholders about what and how well your students are learning?

4
• University Assessment
  ◦ Campus-wide assessment of student learning at the program level (e.g., General Education)
  ◦ University assessment is the primary charge of the Office of Assessment
• University assessment is separate and distinct from evaluation of teaching effectiveness
  ◦ Evaluation of teaching effectiveness is the responsibility of departments/colleges
• Assessment data are analyzed and reported only in the aggregate
• You can’t assess everything all the time!
  ◦ Plan for assessment that is practical, given current time and resource constraints
  ◦ Assess 1 or 2 outcomes per year

5
• Assessment vs. evaluation
  ◦ Assessment requires us to “take a step back” from the interaction between student and teacher
  ◦ Grades are evaluations and are generally not used for assessment
• Team approach to evaluation
  ◦ Essentially a juried assessment, in that more than one individual is scoring/evaluating
  ◦ A periodic, objective validation process of some kind is required to ensure validity and reliability (see the sketch below)
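One concrete way to run such a periodic validation is to quantify inter-rater agreement among jurors. The sketch below is not from the original slides: it computes Cohen's kappa, a standard chance-corrected agreement statistic, for two hypothetical jurors scoring the same student artifacts on a 4-point rubric. All names and scores are illustrative.

```python
# Minimal sketch of an inter-rater reliability check for juried scoring.
# Hypothetical data; Cohen's kappa is one common choice, not the only one.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of artifacts scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions per level.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    levels = set(rater_a) | set(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in levels)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (1 = beginning ... 4 = exemplary) from two jurors
juror_1 = [3, 2, 4, 3, 1, 2, 3, 4]
juror_2 = [3, 2, 3, 3, 1, 2, 4, 4]
print(f"Cohen's kappa: {cohens_kappa(juror_1, juror_2):.2f}")  # ~0.65
```

By a common rule of thumb, values above roughly 0.6 indicate substantial agreement; lower values usually prompt rubric norming before the scores are used.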

6
• Three levels of assessment
  ◦ Course
  ◦ Program
    - Undergraduate majors/programs
    - General education program
    - Graduate majors/programs
  ◦ Institutional
• Course, program, and institutional outcomes should be aligned, but are not identical

7
• Focused on curricular and environmental improvement
• Formative and summative, direct and indirect methods
• Curriculum mapping, program improvement

8
• Focus on broad skills developed over time
  ◦ Not restricted to a single course or learning experience
• Demonstrate acquisition of specific disciplinary/professional knowledge and skills necessary after graduation
  ◦ Ask: “What makes a graduate of the program able to function and learn in a specific discipline/profession after the degree?”
• Measurable
  ◦ Confirmable through evidence

9
• Measures must be appropriate to outcomes
  ◦ Avoid cumbersome data-gathering
  ◦ Use both direct and indirect methods
    - Indirect methods measure a proxy for student learning
    - Direct methods measure actual student learning
  ◦ “Learning” = what students know (content knowledge) + what they can do with what they know

10
• Information that tells you something directly or indirectly about the topic of interest
• Evidence is neutral: neither “good” nor “bad”
  ◦ Requires context to be meaningful
• Two types of assessment evidence
  ◦ Direct (“authentic”) and indirect
• Best practice calls for multiple methods

11
• Students show achievement of learning goals through performance of knowledge and skills:
  ◦ Scores and pass rates on licensure/certification exams
  ◦ Capstone experiences
    - Individual research projects, presentations, performances
    - Collaborative (group) projects/papers that tackle complex problems
  ◦ Score gains between entry and exit
  ◦ Ratings of skills provided by internship/clinical supervisors
  ◦ Substantial course assignments that require performance of learning
  ◦ Portfolios

12
• Indirect methods measure proxies for learning
  ◦ Data from which you can make inferences about learning but that do not demonstrate actual learning, such as perception or comparison data
  ◦ Surveys
    - Student opinion/engagement surveys
    - Student ratings of their own knowledge and skills
    - Employer and alumni surveys, national and local
  ◦ Focus groups / exit interviews
  ◦ Course grades
  ◦ Institutional performance indicators
    - Enrollment data
    - Retention rates, placement data
    - Graduate/professional school acceptance rates

13
• Create a visual map (a data-structure sketch follows the template below):
  ◦ Lay out program courses and learning outcomes (competencies) on a grid
    - Refer to the examples in the handouts
  ◦ Identify the courses in which each competency is:
    - Introduced
    - Reinforced
    - Emphasized

14
Basic Program Map Template

Outcomes     Course #1       Course #2   Course #3       Course #4   Course #5
             (Baseline                   (Mid-Program                (Capstone
             Assessment)                 Assessment)                 Assessment)
Outcome 1    I               R           R               E           R
Outcome 2    R               R           E
Outcome 3    I               E           R               E
Outcome 4    E               R           R

I = outcome is introduced; baseline, formative assessment
R = outcome is reinforced; formative assessment
E = outcome is emphasized; summative assessment
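For programs that keep their map in a spreadsheet or script rather than on paper, the template translates directly into a small data structure. The sketch below is hypothetical and not part of the handout; it mirrors the template above and runs a simple sanity check that every outcome is slated for a summative (E) assessment somewhere in the sequence.

```python
# Minimal sketch of the program map template as a data structure.
# Course and outcome names mirror the template; real programs would
# substitute their own courses, outcomes, and I/R/E placements.
COURSES = ["Course #1", "Course #2", "Course #3", "Course #4", "Course #5"]

# I = introduced, R = reinforced, E = emphasized; "" = outcome not addressed
program_map = {
    "Outcome 1": ["I", "R", "R", "E", "R"],
    "Outcome 2": ["R", "R", "E", "",  "" ],
    "Outcome 3": ["I", "E", "R", "E", "" ],
    "Outcome 4": ["E", "R", "R", "",  "" ],
}

for outcome, cells in program_map.items():
    if "E" not in cells:
        print(f"{outcome}: no summative (E) assessment planned")
    else:
        where = [c for c, mark in zip(COURSES, cells) if mark == "E"]
        print(f"{outcome}: summative assessment in {', '.join(where)}")
```

A check like this makes curriculum-map gaps (an outcome never emphasized, or emphasized before it is introduced) visible at a glance as the map is revised.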

15
• Lets you discover the evidence you already have, such as:
  ◦ Institutional Research data
  ◦ Student Life data
  ◦ Exit surveys (seniors)
  ◦ Alumni surveys
• Start with the obvious, but don’t stop there

16
• Institutional history
  ◦ “We’ve already done that, and it didn’t tell us anything!”
• Territory and politics
  ◦ Fighting for scant resources
• Institutional policy/culture about sharing information
  ◦ “I don’t want somebody ‘policing’ my classrooms!”

17
• Does the evidence address student learning issues appropriate to the institution?
• Does the evidence tell you something about how well the institution is accomplishing its mission and goals?
  ◦ The questions you have about student learning should guide your choice of appropriate existing evidence and identify gaps where a new type of evidence might be needed

18
• “This is a lot of work!”
  ◦ Use an evidence inventory to help faculty understand how existing academic practices yield evidence
  ◦ Keep expectations reasonable, given limited time and resources
• Remember: it is not necessary to gather all the evidence all of the time

19
• “How do I know you won’t use this against me?”
  ◦ Be consistent and firm in the message that assessment is not faculty evaluation and that results will be reported only in the aggregate
  ◦ Remember: assessment results will link to the allocation of resources, ideally through the strategic planning process

20
• Assessment is only a means to an end
  ◦ The purpose of assessment is continuous improvement of student learning
• The assessment cycle is complete when assessment results have been used successfully for evidence-based decision making

21
• Articulate expectations in the form of student learning outcomes
• Measure achievement of expectations
• Collect and analyze data
• Use evidence to improve learning
• Assess the effectiveness of improvement

22
• Unit Assessment Plan Template (handout)
  ◦ Use this template as a foundation for your unit assessment plan, revising and reshaping as necessary

