
1 We are using texting polls. Please follow the directions below.

2 Program Review: A systematic model of outcomes assessment and program improvement
Kathleen Gorski, Ed.D., and Margaret Stemler, Ed.D.

3 National Louis University

4 Founded in 1886
College of Professional Studies and Advancement: Allied Health, Applied Behavioral Sciences, Human Services, Business and Management
National College of Education: School of Teacher Preparation, School of Advanced Professional Programs
Campus locations: Illinois (Chicago, Elgin, Lisle, Skokie, and Wheeling); Florida (Tampa)
Enrollment: 7,000
Gender: 80% female, 20% male
Diversity: 44% minority
Age: 35

5

6 Best Practices of Program Review
Student success data
Student learning outcomes
A review by external experts
Future employment opportunities for students
Analysis of the program’s curriculum
Faculty teaching quality
Aligned with the budgeting process

7 Best Practices of Program Review
Review of the alignment between the program’s mission and the mission of the department and institution
Review of the program’s alignment with regional and specialized accreditation standards
Alignment with industry standards
Input from the community, including employers’ satisfaction with the graduates they hire

8 Program Review History
National Louis University

9 The original review process was a report that faculty spent months completing. Deans and program faculty reviewed the material and considered the review complete until the next five-year cycle.

10 The 30-Page Report
Culture of Quality
Financial Viability
Innovative Teaching and Scholarship
Community Engagement
Service and Operational Excellence
The required content was meaningful. However…

11 When the report was complete,
It was PUT ON A SHELF

12 NLU thought they were doing the right thing, but it wasn’t working
They were asking good questions in their report.

13 How did they know it wasn’t working?
Prior to 2012:
Few or no students were enrolled in several programs
Some programs were not aligned with the University mission and vision
The University did not have the resources to invest in the programs to gain an appropriate and sustainable market share
There were overall debt concerns

14

15 Program Prioritization
A remarkable and courageous departure from the way NLU had done business

16 Academic Portfolio Prioritization
Programs were eliminated, reengineered, or maintained.
The University used an engaging, transparent process that involved stakeholders from throughout the institution. The end result reduced overall budgets by 15% and repositioned the institution to support strategic priorities that would advance growth and improve customer service.

17 High-Level Academic Vision
We will position NLU for a vibrant, sustainable future where…
Excellence permeates our work and we have the evidence to prove it
Students are engaged in learning that connects conceptual learning and practice; they are exposed to real-life situations
Programs are relevant and market driven
The NLU experience advances student professional identity, enhances career goal achievement, and instills a commitment to lead and to serve
They created a high-level academic vision to move to the future.

18 As a result, a team was selected to create the new program review process.
It included 20 faculty from across university colleges and programs and 5 administrators.
I was part of the council creating the new review process.

19 What we knew:

20 Faculty Perspectives:
The 30-page report took up to a year to complete; it duplicated other efforts, was cumbersome, and didn’t result in improvement
The five-year schedule was disruptive to their annual work
There were concerns about additional program eliminations

21 It wasn’t necessary to reinvent the wheel
Culture of Quality
Financial Viability
Innovative Teaching and Scholarship
Community Engagement
Service and Operational Excellence

22 Based on faculty perspectives, we needed to:

23 A new process was created, incorporating best practices
Program review criteria were redefined
Faculty and administration worked together to select preferred metrics
A rubric for program review was created
A decision was made to use existing reports as evidence instead of completing ANOTHER report

24 Section I: Program Effectiveness
Best practice included: student success data, faculty teaching quality, and graduate employment information.
The data criteria include:
Demand: enrollment trends
Quality: graduation rates, teaching effectiveness, scholarly output
Size, scope, and productivity: average class size, retention rates, percentage online, length of program
Expenses and revenue/resources generated: program revenue and margins
What is Needs Improvement? What is Acceptable? What is Effective? What is Distinctive? Why are we being compared to other programs?
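To make the four data criteria concrete, here is a minimal sketch (Python, invented for illustration; none of these field names or values come from NLU’s materials) of how one program’s annual data packet might be organized:

```python
from dataclasses import dataclass

@dataclass
class ProgramHealthData:
    """Hypothetical record grouping the four data criteria for one program."""
    # Demand
    enrollment_trend: float        # e.g., year-over-year change in enrollment
    # Quality
    graduation_rate: float         # proportion, 0.0-1.0
    teaching_effectiveness: float  # e.g., mean course-evaluation score
    scholarly_output: int          # e.g., faculty publications this year
    # Size, scope, and productivity
    average_class_size: float
    retention_rate: float          # proportion, 0.0-1.0
    percent_online: float          # proportion of credits delivered online
    program_length_terms: int
    # Expenses and revenue/resources generated
    revenue: float
    margin: float

# All example values are made up.
example = ProgramHealthData(
    enrollment_trend=0.04, graduation_rate=0.71, teaching_effectiveness=4.3,
    scholarly_output=2, average_class_size=17.5, retention_rate=0.82,
    percent_online=0.40, program_length_terms=6,
    revenue=1_250_000.0, margin=180_000.0,
)
```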

25 Program Health

26 Section II: Learning Outcomes
Best practice included: incorporating learning outcomes assessment
Institutional and program-level assessment
Curriculum map

27

28 Section III: Program Impact, Rationale, and Differentiation
Best practice included: input from the community
Academic research
Service to the community

29 Section IV: Opportunity Analysis and Planning
Best practice included: review of the alignment between the program’s mission and the mission of the department and institution, aligned with the strategic plan and the budget.
Program strengths
Opportunities for improvement
Actions that will leverage opportunities
Action alignment to the strategic plan
Resources needed to implement actions/improvements

30 Deans went on to comment on the reports
Faculty wanted to know that their data was reviewed by their dean.

31

32 Ratings and Benchmarking

33 Rubric Ratings
Improvement required
Acceptable with improvement
Effective
Distinctive

34 Why Are We Being Rated?
What does Needs Improvement mean? What is Acceptable? What is Effective? What is Distinctive?

35

36 Please define the following words:

37 Quality and Effectiveness became Health

38 “Health” refocused the purpose from elimination to support
Areas of: Concern, Competence, Excellence
The ratings of improvement required, effectiveness, and distinction became areas of concern, competence, and excellence. Benchmarks were created to identify each program’s area. The goal wasn’t for everything to look wonderful; the goal was to be thoughtful and transparent. Programs with concerns would be assisted in the budgeting process.
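As a minimal sketch of the benchmark idea, assuming invented thresholds (the presentation does not give the actual benchmarks), a single metric could be mapped to an area like this:

```python
def classify_area(value: float, competence_min: float, excellence_min: float) -> str:
    """Map one metric value to a health area using two hypothetical benchmarks.

    Below the competence benchmark -> Area of Concern; at or above the
    excellence benchmark -> Area of Excellence; otherwise -> Area of Competence.
    """
    if value >= excellence_min:
        return "Area of Excellence"
    if value >= competence_min:
        return "Area of Competence"
    return "Area of Concern"

# Example: a retention rate benchmarked at 70% (competence) and 85% (excellence).
print(classify_area(0.82, competence_min=0.70, excellence_min=0.85))
# -> Area of Competence
```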

39 Annual Report / 5-Year Review Overview
All programs complete a report each year.
Program faculty reflect upon the data and set program goals together by December 1.
The dean or designee responds to each report by January 15.
Responses are reviewed across college programs and assist with program improvement decisions.
Data is shared at the university level to look for trends and ensure continuous improvement.

40 5-Year Review
Programs in their 5-year review year were required to present to the University Program Review Council:
Strengths were shared
Opportunities for improvement were shared
Assessment results and discussion were shared

41 Let’s Talk About Assessment

42

43 Question: How is assessment incorporated and used in your process?

44 NLU’s Assessment Process
1. All programs identified an entry and an exit assignment
2. Assignments were embedded in the course
3. The Assessment Office extracted all of the identified assignments
4. Assignments were stripped of identifiers and set up in the LMS course shell for faculty to assess
5. Online norming sessions were conducted prior to faculty assessment
6. Scores were entered into the grade book. All artifacts were scored twice; if the two scores were not normed, the assignment was sent for a third read
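Step 6 amounts to a small adjudication rule. Here is a sketch in Python, assuming an invented agreement tolerance and an invented resolution rule for the third read (the presentation specifies neither):

```python
def final_score(score_a: float, score_b: float,
                third_read=None, tolerance: float = 1.0):
    """Adjudicate two independent rubric scores for one artifact.

    If the two scores agree within `tolerance` (an assumed value), average
    them. Otherwise the artifact needs a third read, and the third score is
    averaged with the closer of the first two (an assumed resolution rule).
    """
    if abs(score_a - score_b) <= tolerance:
        return (score_a + score_b) / 2
    if third_read is None:
        raise ValueError("Scores not normed: artifact needs a third read")
    closer = min((score_a, score_b), key=lambda s: abs(s - third_read))
    return (closer + third_read) / 2

print(final_score(3.0, 3.5))                  # normed -> 3.25
print(final_score(2.0, 4.0, third_read=3.5))  # third read -> 3.75
```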

45

46

47

48

49

50 The binder is no longer on the shelf!
What did we do with the reports?

51 Administrative Review
After the Program Reviews are submitted, the College leadership teams conduct their own reviews. The teams focus on:
Key performance indicators
Identified strengths and weaknesses
Goals for the following year
Reflections and recommendations are made to the program faculty via comments on the program.

52 A Deeper Dive: What We Were Looking For
Trends across the level (GRAD/UND), departments, schools, and college
Curriculum development
Academic support
Staffing
Professional development
Realistic goals and plans for the upcoming year
New ideas
Alignment with college/NLU strategic plans
Growth opportunities

53 System Planning and Budgeting Cycle
October–November: Program health analysis
December: Annual Program Reports submitted
January: Administrative review; written feedback to Program Chairs
February: College leadership teams meet with Program Chairs and faculty to discuss gaps, recommendations, and plans for the following year
March: Program plans finalized, including staffing, curriculum development, student learning, enrollment/outreach, student experience (retention and satisfaction), and professional development
April: Program action plans prioritized and incorporated into college-level plans and budget proposal
May: Faculty complete workload plans and goals for the following year based on program- and college-level action plans
June: Budget approved and planning finalized

54 System Planning and Budgeting Cycle
Program Health Analysis
Program Planning
College Planning and Budgeting
Faculty Workload and Goals
NLU Strategic Plan and Initiatives

55 How did we simplify and improve? We became systematic
Programs receive data on an annual basis
Programs don’t write an extensive report; they complete a web form during a faculty meeting
Programs review university assessment data
Programs can set goals as they relate to their place within the university portfolio
Results of the area ratings inform the budget cycle, so programs most in need can receive funds
All goals are linked to the mission
Accountability on goals and the opportunity to share best practices
The form was designed to be completed in a meeting or two. Faculty received university assessment data, and program data was looped into the time frame. Benchmarks were created to assist with the area ratings. Programs up for their 5-year review met with the council and shared best practices.

56 Unintended Consequences

57 Collaboration and Communication Improved


