The Academic Program Review Bridging Standards 7 and 14 Middle States Annual Conference December 10, 2010.


1 The Academic Program Review Bridging Standards 7 and 14 Middle States Annual Conference December 10, 2010

2 Presenters Mr. H. Leon Hill Director of Institutional Research Dr. Joan E. Brookshire Associate Vice President of Academic Affairs

3 Overview Framework to Address the APRs Structure/Challenges/Approach Examples of Metrics Current Action Plan Integration of End User Technology Next Steps Benefits of Our Approach Questions

4 Assessment Cycle (2005): Plan to meet. Meet to plan. Report out on planning. Plan to meet. Meet to plan.

5 What we had to build on: Strong focus on programs. State-mandated 5-year academic program review in need of revision. Institutional Effectiveness Model (IEM) with performance indicators benchmarked through state and national databases.

6 Mission Strategic Initiative: Access & Success Institutional Effectiveness

7 IEM Needed a way to assess how the College was performing on key metrics in relation to prior years/semesters and compared to other institutions. Historical/trend data. Benchmark data: Pennsylvania & national peers.

8 Institutional Effectiveness Model

9 Where we started Restructured the Academic Program Review process Incorporated the use of technology

10 Goal of the restructuring Measure student performance as evidenced by results of assessment of student learning outcomes. Measure program performance as evidenced by comparison of program performance to overall college performance on specific key indicators (current and aspirational).

11 Challenges Usual issues with assessment in general. Faculty had little knowledge of the College's performance indicators. Organizational separation of assessment of institutional and student learning outcomes.

12 Approach Began by building it backwards from the IEM by mapping out specific core indicators to program data, making additions where needed.

13 Examples of Metrics Used for APR

14 Targets for Graduation Rate: Caution <19%; Acceptable 19%-23%; Aspirational >23%

15

16 Targets for Transfer Rate: Caution <29%; Acceptable 29%-32%; Aspirational >32%
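The caution/acceptable/aspirational bands on the two target slides can be sketched as a small classifier. This is an illustrative example, not part of the presentation; the function name and return labels are assumptions, while the thresholds come from the slides.

```python
def classify(rate: float, caution_below: float, aspirational_above: float) -> str:
    """Place a rate into the Caution / Acceptable / Aspirational bands
    used on the target slides (thresholds passed in as fractions)."""
    if rate < caution_below:
        return "Caution"
    if rate > aspirational_above:
        return "Aspirational"
    return "Acceptable"

# Graduation rate: Caution <19%, Acceptable 19%-23%, Aspirational >23%
print(classify(0.17, 0.19, 0.23))  # Caution
# Transfer rate: Caution <29%, Acceptable 29%-32%, Aspirational >32%
print(classify(0.33, 0.29, 0.32))  # Aspirational
```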

17 Definitions of Success & Retention Success = grades of (A, B, C & P) / (A, B, C, D, P, F & W) Retention = grades of (W) / (A, B, C, D, P, F & W)
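The two definitions above translate directly into a rate calculation over grade counts. A minimal sketch, assuming grade counts per course are available as a dictionary; the function name and sample data are hypothetical, and "Retention" is computed exactly as the slide defines it (the W share of all grades).

```python
def rates(grades: dict) -> tuple:
    """Compute (success, retention) per the slide's definitions:
    Success = (A+B+C+P) / (A+B+C+D+P+F+W); Retention = W / (A+B+C+D+P+F+W)."""
    total = sum(grades.get(g, 0) for g in ("A", "B", "C", "D", "P", "F", "W"))
    success = sum(grades.get(g, 0) for g in ("A", "B", "C", "P")) / total
    retention = grades.get("W", 0) / total
    return success, retention

# Hypothetical grade distribution for one course section
success, retention = rates({"A": 10, "B": 8, "C": 6, "D": 3, "P": 1, "F": 4, "W": 8})
print(f"{success:.1%}, {retention:.1%}")  # 62.5%, 20.0%
```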

18

19

20 Added a curricular analysis How well program goals support the college's mission. How well individual course outcomes reinforce program outcomes. How well instruction aligns with the learning outcomes.

21 Evidence of closing the loop: Specific assessment results. Changes made based on the assessment findings. Changes made to the assessment plan.

22 Action Plan Outcomes expected as a result of appropriate action steps. Timelines and persons responsible for each action step. Resources needed with specific budget requests. Evaluation plan with expected benefits.

23 Bottom Line Is there sufficient evidence that the program learning outcomes are being met? Is there sufficient evidence that the program is aligned with the college on specific key indicators?

24 The Framework Planning and Budgeting (Standard 2) APR Action Plan APR Annual Report Annual Academic Planning Assessment Results Curriculum Committee President's Office & BOT

25 Addition of Technology Worked in concert with Information Technology to integrate iStrategy with the ERP (Datatel). This implementation let end users obtain the data needed for program assessment without a middleman (IR and/or IT).

26

27

28

29

30 Next Steps in the Evolution of College and Program Outcomes

31 Example of APR Report Card

32 Examples of Course Success

33 Success in ACC
Term            2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Success       61.4%    57.1%    55.4%    55.3%    51.4%    44.2%    48.3%
% Non-Success   38.6%    42.9%    44.6%    44.7%    48.6%    55.8%    51.7%
# Success
# Non-Success

34 Success in ACC
Term                2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Female Success    63.3%    57.5%    58.8%    57.7%    57.3%    51.8%    58.7%
% Male Success      59.8%    56.8%    53.2%    53.6%    47.2%    39.1%    42.1%
Female Success
Male Success

35 Success in Math
Term            2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Success       53.6%    46.3%    47.3%    45.7%    44.8%    43.3%    47.4%
% Non-Success   46.4%    53.7%    52.7%    54.3%    55.2%    56.7%    52.6%
Success
Non-Success

36 Success in Math
Term                         2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% African American Success   42.6%    37.7%    38.5%    25.8%    26.9%    29.9%    34.7%
% Caucasian Success          58.2%    51.8%    50.3%    52.7%    53.5%    48.4%    52.0%
African American Success
Caucasian Success
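The subgroup slide above invites a gap analysis across terms. A short sketch of that computation, using the percentages from slide 36; the variable names are illustrative, and the first term label is inferred from the consecutive fall-term sequence.

```python
# Success-rate gap (percentage points) between the two groups on slide 36.
fall_terms = ["2003/FA", "2004/FA", "2005/FA", "2006/FA", "2007/FA", "2008/FA", "2009/FA"]
african_american = [42.6, 37.7, 38.5, 25.8, 26.9, 29.9, 34.7]  # % success in Math
caucasian = [58.2, 51.8, 50.3, 52.7, 53.5, 48.4, 52.0]

gaps = [round(c - a, 1) for a, c in zip(african_american, caucasian)]
for term, gap in zip(fall_terms, gaps):
    print(f"{term}: gap of {gap} percentage points")
```

The gap widens sharply in 2006/FA, which is exactly the kind of pattern the restructured APR is meant to surface for faculty.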

37 Benefits Builds a bridge between Standards 7 and 14. Better data. Putting data in the hands of faculty actively engages them in using data for decisions/planning. IR time is better used.

38 Annual planning cycle developed. Built a culture of assessment in several of the academic divisions. Curricular changes that align with graduation initiative. Curricular and program improvement. Created a college-wide model for improvement of student learning.

39 Evolution of the Dashboard Creation of a Student Success Dashboard. Metrics: course-level success and retention (developmental and college-level); persistence (fall to spring and fall to fall); progression of various cohorts of students; college-level success in Math or English after developmental Math or English; graduation; transfer.
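One of the dashboard metrics listed above, fall-to-fall persistence, can be sketched as a simple set intersection over enrollment records. This is a hedged illustration, not the presenters' implementation; the cohort data and identifiers are hypothetical.

```python
# Fall-to-fall persistence: share of a fall cohort enrolled the following fall.
fall_2008_cohort = {"s01", "s02", "s03", "s04", "s05"}   # hypothetical student IDs
fall_2009_enrolled = {"s02", "s04", "s05", "s09"}

persisters = fall_2008_cohort & fall_2009_enrolled
persistence_rate = len(persisters) / len(fall_2008_cohort)
print(f"Fall-to-fall persistence: {persistence_rate:.0%}")  # 60%
```

Fall-to-spring persistence is the same calculation with the spring term's enrollment set.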

40 Graphic Representation for the SSD

41

42 Final Thoughts It's not perfect, but it works for us. Do the research on which tools are appropriate for your college. Assessment of the core curriculum. Launching of assessment software. It all starts with asking the right question. PRR 2010.

43 Questions

44 Presenters Mr. H. Leon Hill Director of Institutional Research Dr. Joan E. Brookshire Associate Vice President of Academic Affairs

