1 BC Draft Assessment Plan Academic Senate Assessment Committee.

1 BC Draft Assessment Plan
Academic Senate Assessment Committee

2 Why Learning Outcomes Assessment?
Become the best learning institution possible:
- Focus on student success and improved learning
- Describe and validate what we do
- Provide data to make evidence-based decisions
- Comply with WASC Accreditation Standards
- Comply with the new Master Plan
- Improve accountability to the public

3 What Is Learning Outcomes Assessment?
ACCJC accreditation standards define assessment as:
- any method an institution uses to gather evidence for evaluating quality
- any data that is direct (from the students, on learning) or indirect (about the students, assuming learning)
- including both quantitative and qualitative data

4 Direct and Indirect Assessment
Direct (measures actual student learning):
- Essays, exams, recitals, performances, products, speeches, portfolios, projects
Indirect (measures an assumed indicator of learning):
- Retention, success, transfer, graduation rates
- Alumni and community surveys, employer surveys, student satisfaction surveys

5 Draft Assessment Plan Team
(∞ = BC Senate Assessment Committee; * = attendee at AAHE Conference)
Greg Baxley* ∞, Lara Baxley* ∞, Greg Chamberlain*, Michael Einhaus ∞, Julie Esch* ∞, Janet Fulks* ∞, Sue Granger-Dickson* ∞, Jack Hernandez ∞, Joyce Kirst ∞, Ed Knudson*, Clark Parsons*, Chris Romanowich* ∞, Liz Rozell* ∞, Jerry Scheerer, Rachel Vickrey* ∞
InSites facilitators: Elizabeth O'Connell, Beverly Parsons

6 Draft BC Assessment Plan
- Class and course assessment: Senate Assessment Committee & Curriculum Committee
- Program assessment: program review
- Institutional assessment: institutional effectiveness
- Student services & learning support services assessment

7 Course Level Outcomes
Each faculty member:
- Selects at least one class
- Assesses at least one or two class SLOs per year
- Implements and documents changes made to improve the class
- Coordinates with program assessment plans
Goal: assess all SLOs within a course over a 4-6 year cycle

8 Class and Course Outcomes
1. Website for submission of SLOs
   Strategies: submission form; SLO criteria
   Measurable progress: website submissions
   Attainment: Feb 1, early submissions; Mar 1, all SLOs submitted
   Notes: 144 submissions by Feb 1 (64% of faculty; 83% of departments)
2. Provide feedback and review
   Strategies: group review; develop methodology; communication with faculty
   Measurable progress: April-May 04
   Attainment: feedback given to all who request it
   Notes: IEC conscripted the Assessment Committee
3. Workshops on class/course assessment plans
   Strategies: plan workshops summer 04; present at opening day, Fall 04
   Measurable progress: attendance at workshops; survey faculty for improved help and workshops
   Attainment: Summer/early Fall 04, workshops held
   Notes: develop workshops and strategy
4. Initiate dialogue on course SLOs
   Strategies: faculty within departments begin to discuss course SLOs
   Measurable progress: submit plan to website, person, or committee
   Attainment: Fall 04, official curriculum review document with SLOs
   Notes: this dialogue will be ongoing
5. Write class/course assessment plans
   Strategies: faculty write an assessment plan for one class
   Measurable progress: submit plan to website
   Attainment: Nov 04, plans submitted
   Notes: need a data collection method; check plans
6. Implement assessment plans
   Strategies: faculty implement one class assessment plan
   Measurable progress: assessment data
   Attainment: Spring 05, plans implemented
   Notes: need to change curriculum review format
7. Finalize course SLOs
   Strategies: coordinate course SLOs within each department
   Measurable progress: faculty dialogue
   Attainment: Spring 05 (Mar-Apr), finalize course SLOs for curriculum review
   Notes: continue dialogue; refinement of SLOs
8. Develop course assessment plans
   Strategies: use successful plans as models
   Measurable progress: faculty input; ongoing dialogue
   Attainment: Spring 06, course assessment plans due
   Notes: submit assessment plans and data for courses
9. Close the loop
   Strategies: gather data through mixed methods from faculty and departments on how SLOs affected education
   Measurable progress: collection of artifacts and records of reports
   Attainment: information used to change class delivery or outcomes success
   Notes: need dialogue to help new faculty and improve what we are doing

9 How Do Assessment and Grading Differ?
Grading:
- Individual instructor's definition
- Purpose: to rank students
- Less systematic
- Includes data other than student learning (e.g., effort or participation)
- No external accountability
Assessment:
- Produced through dialogue
- Purpose: to improve learning
- Systematic, with criteria (e.g., rubrics)
- Focuses on specific outcomes and competencies
- Based upon criteria communicated to the public for accountability

10 Program Level Outcomes
- Define program: pathways, departments, service areas, or other
- Build a course and program SLO alignment matrix
- Assess at least one or two program SLOs per year
- Implement and document changes made to improve program outcomes
Goal: assess all program SLOs, as part of program review, within a program review cycle

11 Course and Program SLO Alignment Matrix
- Program SLOs are learning outcomes for a program that will be measured to assess the effectiveness of that course of study
- Analyze effectiveness both formatively (F) and summatively (S)

Course               Program SLOs 1-5 (F = formative, S = summative)
Chem 11 Chemistry    F F F F
Bio 14 Anatomy       F F S
Bio 15 Physio        F S F S
Bio 16 Micro         S S

12 Program Level Outcomes
- Define program: pathways, departments, service areas, or other
- Build a course and program SLO alignment matrix
- Assess at least one or two program SLOs per year
- Implement and document changes made to improve program outcomes
Goal: assess all program SLOs, as part of program review, within a program review cycle

13 Program Assessment & Program Review
1. Program definition
   Strategies: degrees, prerequisites, programs, departments, learning support services, GE outcomes, AB1725, based on SLOs
   Progress: Spring 04
   Attainment: all programs defined
   Notes: unit plans will help
2. Program review format
   Strategies: use Leicester, WASC accreditation
   Progress: Spring 04
   Attainment: new format completed
   Notes: revised program review will link course outcomes with institutional strategic initiatives
3. Develop program SLOs and assessment plans; begin assessment training
   Strategies: use a matrix of courses related to SLOs to depict program outcomes; once SLOs and assessment plans are developed, assessment should begin on some of the SLOs
   Progress: Spring 04-Fall 04
   Attainment: all programs will have SLOs and an assessment plan
   Notes: begin implementing assessment plans Fall 05
4. Representative pilot programs use the new program review format
   Strategies: pilot programs will assess at least one or two outcomes; select and train pilot participants to use the new program review format
   Progress: Fall 04
   Attainment: pilot programs complete the new program review and now have some assessment data
   Notes: evaluate and revise the review format; program review should be easier with program outcomes assessed
5. Pilot program presentation
   Strategies: pilot programs will present program reviews to the IEC
   Progress: Spring 05
   Attainment: new program review format is tested
   Notes: ongoing revision and improvement of the process
6. New program review
   Strategies: select and fully implement program review; schedule reviews and begin the cycle
   Progress: Fall 05-Spring 07
   Attainment: new program review process cycle takes place
7. Close the loop
   Strategies: gather data via mixed methods on how program assessment and review affect education
   Progress: collection of data, artifacts, reports, interviews, surveys
   Attainment: data used to modify class delivery and/or outcomes success
   Notes: essential to document information for accreditation

14 Institutional Level Outcomes
Institutional SLOs are based upon the four BC strategic initiatives:
1. Improve student access, retention, and success
2. Provide effective learning and earning pathways for students
3. Support student learning through appropriate technology
4. Support student learning through streamlined systems and processes

15 Pathway for Institutional Assessment
- Develop institutional SLOs, effectiveness indicators (EIs), and assessment methods
- Assess at least one or two institutional SLOs/EIs each year and communicate the results
- Implement and document changes made to improve institutional outcomes
Goal: assess institutional SLOs/EIs within an accreditation cycle
Words of wisdom: Don't measure with a micrometer, and don't measure everything! Use valuable but low-effort assessment methods.

16 Institutional Level Outcomes: Progress
1. Determine the group responsible for the task
   Strategies: build a new subcommittee/task force from College Council/Senate
   Progress: begin Spring 04
   Attainment: group formed and acknowledged by Senate/College Council and President's Cabinet
2. Define both student learning and effectiveness indicator (EI) outcomes (EIs are productivity/compliance focused)
   Strategies: develop assessable outcomes based on the four strategic initiatives
   Progress: some assessment under way (tracking data for retention/success, Noel Levitz, PFE goals, others)
   Attainment: list of outcomes sent to the campus community
3. Coordinate current assessment methods with the four initiatives
   Strategies: develop and pilot assessment tools and mechanisms; research current practices and off-the-shelf products
   Progress: pilot evaluation and modifications
   Attainment: use assessment data for better planning and funding allocation
4. Annually communicate institutional outcomes, strengths and weaknesses
   Strategies: BC Research website; presentation of access, retention, and success trend data
   Progress: collection of data, artifacts, and records of reports
   Attainment: annual presentation of the Institutional Effectiveness Report to College Council/Senate and the Board; evidence of impact upon planning and budgeting

17 Draft BC Assessment Plan
- Class and course assessment: Senate Assessment Committee & Curriculum Committee
- Program assessment: program review
- Institutional assessment: institutional effectiveness
- Student services & learning support services assessment

18 Overall Time Line
Course assessment:
- Fall 03: SLO workshops; SLOs for one class
- Fall 04-Spring 05: assessment workshops; class assessment plans; assess 1 or 2 class SLOs; new curriculum format
- Fall 05-Spring 06: course assessment; course assessment plans; assess 1 or 2 course SLOs; close the loop (modify and improve)
Program assessment:
- Spring 04: define program; new program review and assessment format
- Fall 04: program matrices; pilot new format
- Spring 05: present pilot
- Fall 05: new program review and assessment
Institutional assessment:
- Spring 04: institutional committee; define effectiveness indicators
- Fall 05: develop pilot and assessment tools; present pilot data
- Spring 06: use assessment for budgeting and planning

19 How Does Assessment Improve Learning?
- Helps align programs of study: prerequisites have SLOs that directly connect to sequential classes
- Provides accountability to students, the public, and other instructors
- Should link to budgeting and planning
- Should improve instruction
- Should improve institutional effectiveness
- Should help communicate clear expectations for students, faculty, and staff

20 Why Assessment?
It's all about improved student learning.

