1 Creating a Culture of Accountability and Assessment at an Urban Community College Diane B. Call, President; Paul Marchese, VP Academic Affairs; Liza Larios, Dean of Human Resources & Labor Relations; Stuart M. Asser, Chairman of Engineering Technology

3 QCC College Demographics  A constituent campus of the City University of New York  391 Full-Time Faculty / Approx. 500 Teaching Faculty  Enrollment of over 16,000 (Fall 2014)  Over 40% of students speak a language other than English

4 Foundational Truths  Assessment has become an integral part of college accreditation.  Assessment/Accountability will become even more important in the future.  Colleges and programs are at different places in the maturity of their assessment processes.  Colleges and programs have different resources available to them (e.g., number of faculty, availability of assessment expertise, time).  Colleges and programs have faculty who have different levels of expertise in their understanding of good assessment practice.

5 Division of Strategic Planning, Assessment and Institutional Effectiveness  Vice President of Strategic Planning, Assessment and Institutional Effectiveness  Dean for Accreditation, Assessment, and Institutional Effectiveness  Senior Director of Institutional Research  Assessment Manager  Office of Institutional Research and Assessment

6 Assessment: The What  Assessment is one or more processes that identify, collect, and prepare data to evaluate the attainment of student outcomes and program educational objectives.  Effective assessment uses relevant direct, indirect, quantitative, and qualitative measures as appropriate to the objective or outcome being measured.  Appropriate sampling methods should be used as part of an assessment process.
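
To make the "identify, collect, and prepare data" language concrete, here is a minimal Python sketch of sampling scored artifacts for one outcome. The 1-4 rubric scale, pool size, and sample size are all assumptions for illustration, not QCC's actual process.

```python
import random

# Hypothetical pool of scored student artifacts for one outcome.
# Each record: (student_id, rubric_score) on an assumed 1-4 scale.
artifacts = [(f"S{i:03d}", random.randint(1, 4)) for i in range(120)]

# "Appropriate sampling": score a random sample rather than every artifact.
SAMPLE_SIZE = 30
sample = random.sample(artifacts, SAMPLE_SIZE)

# Prepare the data for the evaluation step: scores only, ready to summarize.
scores = [score for _, score in sample]
print(f"Sampled {len(scores)} of {len(artifacts)} artifacts; "
      f"mean rubric score = {sum(scores) / len(scores):.2f}")
```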

7 Evaluation: The How  Evaluation is one or more processes for interpreting the data and evidence accumulated through assessment processes.  Evaluation determines the extent to which student outcomes and program educational objectives are being attained.  Evaluation results in decisions and actions regarding program improvement (closing the loop).
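
A minimal sketch of the evaluation step, assuming a hypothetical rubric threshold (3 out of 4) and attainment target (70%); neither number comes from the slides.

```python
# Hypothetical inputs: rubric scores produced by the assessment step.
scores = [4, 3, 2, 3, 4, 1, 3, 3, 2, 4]

THRESHOLD = 3   # assumed minimum rubric score that counts as "attained"
TARGET = 0.70   # assumed program target: 70% at or above the threshold

attained = sum(1 for s in scores if s >= THRESHOLD) / len(scores)
print(f"{attained:.0%} of sampled students met the performance indicator")

# Evaluation drives decisions and actions (closing the loop).
if attained < TARGET:
    print("Below target: flag this outcome for program improvement.")
else:
    print("Target met: document the result and continue monitoring.")
```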

8 Continuous Improvement: The When  The program must regularly use appropriate, documented processes for assessing and evaluating.  The results of these evaluations must be systematically used as input for implementing enhancements to the program.

9 Creating the Assessment Climate  Discussion at department meetings  Assessment is necessary: accreditation, grants  Decouple from faculty evaluation  Create a committee  Keep all faculty informed  Test the assessment process

10 Assessment Activities  Form a committee to draft the objectives, outcomes, performance indicators, rubric templates, and curriculum map  Faculty review/revise the committee drafts, define the assignment, exam, or project for each rubric, and perform the evaluation  The committee summarizes the data and reports to the department  The department evaluates the results for continuous improvement
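
One of the committee's drafts is a curriculum map. As an illustration only, the sketch below models a curriculum map as a course-by-outcome table; the course codes, outcome names, and I/R/M (introduced/reinforced/mastered) levels are hypothetical.

```python
# Hypothetical curriculum map: which courses address which student
# learning outcomes, and at what level (I=introduced, R=reinforced,
# M=mastered). Course codes and outcomes are illustrative.
curriculum_map = {
    "ET-110": {"Teamwork": "I", "Problem Solving": "I"},
    "ET-210": {"Teamwork": "R", "Problem Solving": "R"},
    "ET-310": {"Teamwork": "M", "Problem Solving": "M"},
}

# A quick check a committee might run: is every outcome mastered
# somewhere in the program?
outcomes = {o for levels in curriculum_map.values() for o in levels}
for outcome in sorted(outcomes):
    mastered = [c for c, lv in curriculum_map.items() if lv.get(outcome) == "M"]
    print(f"{outcome}: mastered in {mastered or 'NO COURSE (gap!)'}")
```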

11 Assessment Processes  Program Educational Objectives (should support Mission Statement)  General Education/Student Learning Outcomes  Assessment  Evaluation  Continuous Improvement (Closing the Loop)

12 Assessment Relationships  Program Educational Objective: Graduates will be able to demonstrate the ability to solve complex problems and participate in a team-based environment  Student Learning Outcome: Ability to function effectively on a team  Performance Indicators: researches and gathers information; fulfills duties of team roles; shares in the work of the team; listens to other teammates
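
The objective-to-outcome-to-indicator hierarchy on this slide maps naturally onto a nested data structure. The following sketch renders the slide's own example in Python; the field names are illustrative, not a prescribed schema.

```python
# The slide's example, expressed as a nested structure an assessment
# office might store; field names are illustrative only.
program_objective = {
    "objective": ("Graduates will demonstrate the ability to solve complex "
                  "problems and participate in a team-based environment"),
    "outcomes": [
        {
            "outcome": "Ability to function effectively on a team",
            "performance_indicators": [
                "Researches and gathers information",
                "Fulfills duties of team roles",
                "Shares in the work of the team",
                "Listens to other teammates",
            ],
        },
    ],
}

# Indicators are the measurable leaves: data collection attaches here.
for outcome in program_objective["outcomes"]:
    for indicator in outcome["performance_indicators"]:
        print(f"Measure: {indicator}")
```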

13 Objectives vs. Outcomes Program educational objectives and student outcomes are similar but not the same. What are some of the differences?  Degree of specificity  Role of constituents  Types of measurements possible  Cycles of data collection

14 Importance of Well-Stated Performance Indicators  Provide faculty with clear direction for implementation in the classroom  Make expectations explicit to students (great pedagogy)  Focus data collection

15 Close the Loop! The biggest mistake made by people trying to create a continuous improvement process is not closing the loop.  Take the measurements.  Compare measurements from year to year.  Share these with all of the faculty and the Industrial Advisory Board.  Make and document changes based on the measurements!
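
As an illustration of "compare measurements from year to year," the sketch below trends one indicator's attainment rate across cycles; the rates and the 70% target are invented for the example.

```python
# Hypothetical attainment rates for one performance indicator by year.
attainment_by_year = {2012: 0.64, 2013: 0.68, 2014: 0.73}

TARGET = 0.70  # assumed program target

years = sorted(attainment_by_year)
for prev, curr in zip(years, years[1:]):
    delta = attainment_by_year[curr] - attainment_by_year[prev]
    status = "above" if attainment_by_year[curr] >= TARGET else "below"
    print(f"{prev} -> {curr}: {delta:+.0%} change ({status} target)")

# Closing the loop: share the trend with faculty and the advisory
# board, then document the curricular change made in response.
```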

16 Faculty Role Is Critical to Success! Outcomes assessment is a human process that must be owned by the faculty, who must  Develop the student outcomes  Develop the performance indicators  Evaluate the results of assessment  Identify and design areas for improvement  Implement changes  Assess impact

17 Resistance to Engaging in Assessment  Motivation to participate is highly personal: each individual sees assessment in terms of how it affects them and their work.  People resist things that they perceive not to be in their best interests.  Resistance is an expression of power: the ability to not get what you don't want.

18 Motivating/Supporting Faculty Participation  Create understanding that assessment is necessary for accreditation  Tenure/Promotion/Recruitment  New Faculty Institute  Create a Faculty Assessment Institute  Create an Assessment Committee in the Faculty Senate  Provide resources/grants for assessment projects  Encourage faculty to attend accreditation and assessment conferences

19 Assessment Method Truisms  There will always be more than one way to measure any student outcome  No single method is good for measuring a wide variety of student abilities  There is generally an inverse relationship between the quality of measurement methods and their expediency  An "ideal" method is one that best balances program needs, satisfactory validity, and affordability (time, effort, and money)  It is important to pilot test a method to see if it is appropriate for your program  It is crucial to use a multi-method/multi-source approach to maximize validity and reduce the bias of any one approach

20 Things I Wish I Had Known…  Capitalize on what you are already doing  One size does not fit all  You don't have to measure everything all the time  More data is not always better  Pick your battles  Take advantage of local resources  Don't wait for perfection  Go for the early win  Decouple assessment from faculty evaluation

21 Questions WWW.QCC.CUNY.EDU/Assessment

