
1 Assessment & Review of Graduate Programs – Doctoral
Duane K. Larick & Michael P. Carter
North Carolina State University
Council of Graduate Schools Pre-Meeting Workshop, December 2006

2 Assessment and Review
• Outline of Presentation
- Why review/assess graduate programs
- A review process incorporating periodic external reviews and continuous program assessment

3 Marilyn J. Baker
Revised and updated by: Margaret King, Duane Larick, and Michael Carter, NC State University

4 Background Information About Our Audience
• How many of you are responsible for graduate program review at your institutions?
• How many of you have this as a new responsibility?
• How many of you have recently changed (or are considering changing) your procedure?

5 Why Review/Assess Graduate Programs?
• The primary purpose should be to improve the quality of graduate education on our campuses
- By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-seeing, and as apolitical as possible

6 Why Review/Assess Graduate Programs?
• External Considerations
- To help satisfy calls for accountability
  - Especially at the state level
- Requirement for regional accreditation, licensure, etc.

7 SACS Principles of Accreditation
• Core Requirement #5: "The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission."

8 Why Review/Assess Graduate Programs?
• Internal Considerations
- Meet long-term (strategic) college and institutional goals
  - Creation of new degree programs
  - Elimination of existing programs
  - Funding allocation/reallocation
- Advance understanding of graduate education and the factors influencing it
  - Aids in identification of "common" programmatic needs

9 Why Review/Assess Graduate Programs?
• Internal Considerations
- Creates an opportunity to focus on key issues impacting graduate education
  - Causes of retention/attrition among students and faculty
- Meet short-term (tactical) objectives or targets at the program level
  - Documents achievements of faculty and students
  - Indicates the degree to which program outcomes have been achieved
  - Suggests areas for improvement
  - Helps chart new programmatic directions

10 So the Questions We Need to Ask Ourselves Are:
• What are we currently doing?
• Why are we currently doing it?
• Is what we are currently doing accomplishing the external goals described above?
• Is what we are currently doing accomplishing the internal goals described above?
• Is there a better way?

11 Graduate Program Review – A Two-Phase Process
• Periodic formal review of graduate programs (external review)
• Outcomes-based assessment (internal review that is a continuous and ongoing process)

12 Key Features of Formal Reviews
• Evaluative, not just descriptive
• Forward-looking: focus on improvement of program, not just current status
• Based on program's academic strengths and weaknesses, not just ability to attract funding
• Objective
• Independent, stands on its own
• Action-oriented: clear, concrete recommendations to be implemented

13 Questions Answered by Formal Review
• Is the program advancing the state of the discipline or profession?
• Is its teaching and training of students effective?
• Does it meet institutional goals?
• Does it respond to the profession's needs?
• How is it assessed by experts in the field?

14 Issues to be Resolved Before Beginning
• Locus of control
• Graduate-only or comprehensive program review
• Counting (and paying) the costs
• Master's and doctoral programs
• Coordination with accreditation reviews
• Scheduling the reviews
• Multidisciplinary and interdisciplinary programs

15 Key Elements of a Successful Program Review
• Clear, Consistent Guidelines
- The purpose of graduate program review
- The process to be followed
- Guidelines for materials to be included in each phase
- A generic agenda for the review
- The use to which results will be put

16 Key Elements of a Successful Program Review
• Administrative Support
- Departmental resources: time, funding, secretarial help, etc.
- Central administrative support for the larger review process
- Adequate and accurate institutional data, consistent across programs

17 Key Elements of a Successful Program Review
• Program Self-Study
- Engage the program faculty in a thoughtful evaluation of:
  - The program's purpose(s)
  - The program's effectiveness in achieving these purposes
  - The program's overall quality
  - The faculty's vision for the program

18 Key Elements of a Successful Program Review
• Surveys/Questionnaires
- Surveys from current students, faculty, alumni, and employers
- Factors to be considered:
  - Time and expense to develop, distribute, and collect responses
  - Likely response rate
  - Additional burden on respondents
  - Uniqueness of information to be gained

19 Key Elements of a Successful Program Review
• Student Participation
- Complete confidential questionnaires
- Provide input into the self-study
- Be interviewed collectively and individually by the review team
- Serve on review teams and standing committees

20 Key Elements of a Successful Program Review
• Review Committee
- On-Campus Representation
  - A representative of the Graduate School
  - An internal reviewer from a field that gives him/her some understanding of the program(s) being reviewed
- External Reviewer(s)
  - Number of reviewers depends on the scope and kind of review
  - Selection process can vary; programs can have input but should not make the final decision

21 Key Elements of a Successful Program Review
• Final Report by Review Team
- Brief overview of the program
- Strengths of the program
- Areas for improvement
- Recommendations for improvement

22 Key Elements of a Successful Program Review
• Program Faculty's Response to Report
- Clear up errors or misunderstandings
- Respond to the recommendations (have implemented, will implement, will consider implementing, cannot implement and why)

23 Key Elements of a Successful Program Review
• Implementation
- One or more meetings of key administrators (department, college, graduate school, and university) to discuss recommendations
- An action plan or memorandum of understanding drawn up and agreed on by all participants
- Discussion of the recommendations with program faculty for implementation
- Integration of the action plan into the institution's long-range planning and budget process

24 Key Elements of a Successful Program Review
• Follow-Up
- An initial report on progress toward implementation of the action plan (1 or 2 years out)
- Follow-up reports until the action plan is implemented or priorities change
- Discussion of recommendations and implementation in the self-study for the next review

25 Questions Relative to External Program Review?

26 What is Outcomes-Based Assessment?
• It is a process that engages program faculty in asking three questions about their programs:
- What are our expectations for the program?
- To what extent is our program meeting our expectations?
- How can we improve our program to better meet our expectations?
• It is a process that provides program faculty the means to answer these questions:
- By creating objectives and outcomes for their program
- By gathering and analyzing data to determine how well the program is meeting the objectives and outcomes
- By applying the results of their assessment toward improving their program

27 What is Outcomes-Based Assessment? continued
• It entails a shift in emphasis from inputs to outcomes
• It is continuous rather than periodic
• It involves regular reports of program assessment to the institution
• Its results are used by the program and institution for gauging improvement and for planning

28 What is Outcomes-Based Assessment? continued
• Faculty generate program objectives and outcomes
• Faculty decide how outcomes will be assessed
• Faculty assess outcomes
• Faculty use assessment findings to identify ways of improving their programs

29 Benefits of Outcomes Assessment
• It provides the groundwork for increased responsiveness and agility in meeting program needs
• It gives faculty a greater sense of ownership of their programs
• It provides stakeholders a clearer picture of the expectations of programs
• It helps institutions meet accreditation requirements

30 SACS Criteria for Accreditation
Section 3 (Comprehensive Standards), #16: "The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results."

31 Drive Toward Greater Accountability on Our Campus
• Professional accreditation agencies (e.g., engineering, social work, business)
• Undergraduate assessment
• Assessment of general education

32 Outcomes Assessment: A Process
• Phase I: Identifying Objectives and Outcomes
• Phase II: Creating Assessment Plans
• Phase III: Implementing Assessment Plans
• Phase IV: Reporting Assessment Results

33 A Procedure for Implementing Outcomes Assessment
• Identify pilot programs to create assessment materials for each phase
• Use pilot materials as a basis for workshops for Directors of Graduate Programs (DGPs) for each phase
• Offer individual support to DGPs as they create materials and assess their programs
• Create online tools to aid DGPs

34 Phase I: Identifying Objectives and Outcomes

35 What Are Objectives? Program objectives are the general goals that define what it means to be an effective program.

36 Three Common Objectives
• Developing students as successful professionals in the field
• Developing students as effective researchers in the field
• Maintaining/enhancing the overall quality of the program

37 What Are Outcomes? Program outcomes are specific faculty expectations for each objective that define what the program needs to achieve in order to meet the objectives.

38 Example for Objective 1: Professional Development
1. To enable students to develop as successful professionals for highly competitive positions in industry, government, and academic departments, the program aims to provide a variety of experiences that help students to:
a. achieve the highest level of expertise in XXXX, mastery of the knowledge in their fields, and the ability to apply associated technologies to novel and emerging problems
b. present research to local, regional, national, and international audiences through publications in professional journals and conference papers given in a range of venues, from graduate seminars to professional meetings
c. participate in professional organizations, becoming members and attending meetings
d. broaden their professional foundations through activities such as teaching, internships, fellowships, and grant applications

39 Example for Objective 2: Effective Researchers
2. To prepare students to conduct research effectively in XXXX in a collaborative environment, the program aims to offer a variety of educational experiences that are designed to develop in students the ability to:
a. read and review the literature in an area of study in a way that reveals a comprehensive understanding of the literature
b. identify research questions/problems that are pertinent to a field of study and provide a focus for making a significant contribution to the field
c. gather, organize, analyze, and report data using a conceptual framework appropriate to the research question and the field of study
d. interpret research results in a way that adds to the understanding of the field of study and relates the findings to teaching and learning in science
Etc.

40 Example for Objective 3: Quality of Program
3. To maintain and improve the program's leadership position nationally and internationally, the program aims to:
a. continue to be nationally competitive by attracting high-quality students
b. provide effective mentoring that encourages students to graduate in a timely manner
c. place graduates in positions in industry and academia
d. maintain a nationally recognized faculty that is large enough and appropriately distributed across XXXX disciplines to offer students a wide range of fields of expertise

41 Phase II: Creating Assessment Plans

42 Four Questions for Creating an Assessment Plan
1. What types of data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

43 Types of Data Used
1. Take advantage of what you are already doing:
- Preliminary exams
- Proposals
- Theses and dissertations
- Defenses
- Student progress reports
- Student course evaluations
- Faculty activity reports
- Student exit interviews

44 Types of Data Used
2. Use resources of the Graduate School and the institutional analysis unit:
- Enrollment statistics
- Time-to-degree statistics
- Student exit data
- Ten-year profile reports
- Alumni surveys

45 Types of Data Used
3. Use your imagination to find other types of data:
- Dollar amount of support for faculty
- Student activity reports
- Faculty surveys

46 Data: Two Standards to Use in Identifying Data
1. Meaningful: Data should provide information that is suitable for assessing the outcome
2. Manageable: Data should be reasonable to obtain (time, effort, ability, availability, resources)

47 Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

48 Sources of Data
• Students
• Faculty
• Graduate School
• Graduate Program Directors
• Department Heads
• Registration and Records
• Advisory Boards
• University Planning and Analysis

49 Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

50 Frequency of Data Collection
• Every semester
• Annually
• Biennially
• When available from individual graduate students:
- At the preliminary exam
- At the defense
- At graduation

51 Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze the data?

52 Creating a Timeline for Analyzing Assessment Data
• By objective: year 1, objective 1; year 2, objective 2; year 3, objective 3; year 4, objective 1; etc. (a three-year cycle; see the sketch below)
• More pressing outcomes earlier and less pressing ones later
• Outcomes that are easier to assess earlier, and outcomes requiring more complex data gathering and analysis later
• Approximately the same workload in each year of the assessment cycle
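To make the rotation concrete, here is a minimal sketch (ours, not from the original deck) of scheduling one objective per year on a three-year cycle; the objective names and the 2006 start year are hypothetical placeholders.

```python
# Sketch: rotating three-year assessment cycle.
# Objective names and start year are hypothetical examples.
OBJECTIVES = [
    "Objective 1: Professional development",
    "Objective 2: Effective researchers",
    "Objective 3: Quality of program",
]

def objective_for_year(year: int, start_year: int = 2006) -> str:
    """Return the objective scheduled for assessment in the given year."""
    return OBJECTIVES[(year - start_year) % len(OBJECTIVES)]

for y in range(2006, 2012):
    print(y, objective_for_year(y))  # cycles 1, 2, 3, 1, 2, 3
```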

53 Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?

54 Assessment Plan
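The slide above presumably displayed a completed plan as a table, which the transcript does not preserve. As one hedged illustration of how the four planning questions combine into such a plan, here is a small sketch; every outcome, data type, source, and frequency below is invented for the example, not drawn from NC State's actual plans.

```python
# Sketch: an assessment plan as one row per outcome, answering the four
# planning questions. All entries are hypothetical examples.
from dataclasses import dataclass

@dataclass
class PlanRow:
    outcome: str        # what the program expects to achieve
    data: str           # 1. what data do we gather?
    source: str         # 2. where do the data come from?
    frequency: str      # 3. how often are the data collected?
    analysis_year: int  # 4. when in the cycle are the data analyzed?

plan = [
    PlanRow("Students present and publish research",
            "Student activity reports/CVs", "Students", "Annually", 1),
    PlanRow("Students conduct research effectively",
            "Prelim and defense rubrics", "Faculty committees",
            "At each exam", 2),
    PlanRow("Timely degree completion",
            "Time-to-degree statistics", "Graduate School", "Annually", 3),
]

for row in plan:
    print(f"Year {row.analysis_year}: {row.outcome} <- "
          f"{row.data} ({row.source}; {row.frequency})")
```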

55 Phase III: Implementing Assessment Plans
Collecting, Analyzing, and Evaluating Data and Improving the Program

56 Collecting Data Goal: To have data readily accessible when it is time to analyze the data.

57 Typical Modes of Data Collection
• Rubrics for prelims and defenses
• Student activity reports/CVs
• Statistics provided by the Graduate School
• Faculty activity reports
• Student exit surveys or interviews

58 Suggestions for Collecting Data
• Identify the kinds of data you need to collect, who is responsible for collecting them, and when they are to be collected.
• Determine where the data are to be stored and check periodically to be sure the data are up to date.
• Make data collection and storage as much a departmental routine as possible.

59 Analyzing Data Goal: To put data into a form that will allow faculty to use them to evaluate the program.

60 Spreadsheet for Rubrics for Prelims and Defenses
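The original slide showed a spreadsheet image that is not preserved in the transcript. As a rough stand-in, this sketch (standard-library Python; the criteria names and 1–5 scale are hypothetical) shows the kind of aggregation such a spreadsheet supports: averaging rubric scores per criterion, separately for prelims and defenses.

```python
# Sketch: aggregating rubric scores from prelims and defenses.
# Criteria names and the 1-5 scale are hypothetical examples.
from statistics import mean

# Each record holds one student's rubric scores from one exam.
records = [
    {"exam": "prelim",  "literature_review": 4, "research_question": 3, "data_analysis": 4},
    {"exam": "prelim",  "literature_review": 5, "research_question": 4, "data_analysis": 3},
    {"exam": "defense", "literature_review": 4, "research_question": 5, "data_analysis": 5},
]

CRITERIA = ("literature_review", "research_question", "data_analysis")

for exam_type in ("prelim", "defense"):
    rows = [r for r in records if r["exam"] == exam_type]
    averages = {c: round(mean(r[c] for r in rows), 2) for c in CRITERIA}
    print(exam_type, averages)
```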

61 Graphs from Graduate School Statistics

62 Evaluating Data Goal: To use the data to judge the extent to which the program is meeting faculty expectations.

63 Suggestions for Evaluating Data
• In most cases, the primary criterion for evaluation is faculty expectations. Allow faculty to discuss their expectations as a way of defining criteria for evaluation.
• Guide faculty discussion by asking them to identify strengths of the program and areas of concern.
• Evaluation is typically a judgment call; encourage faculty to trust their judgments.

64 Making Decisions for Improving the Program Goal: To apply what has been learned in evaluating the data to identifying actions that address areas of concern.

65 Suggestions for Making Decisions for Improving Programs
• Lead faculty in brainstorming; try to elicit multiple suggestions for actions.
• All suggestions should be evaluated for feasibility and validity (do they offer a good chance of affecting the area of concern?).
• It's OK to conclude that change is not yet warranted and that more data need to be collected.
• Also encourage faculty to address the need for changes in assessment procedures.

66 Phase IV: Reporting Assessment Results

67 Reporting Assessment Results Goal: To submit a report every two years in which you summarize your assessment process and findings.

68 Creating a Timeline for Reporting Assessment Data
• Standard practice appears to call for an annual or biennial assessment report
• Longer cycles undermine the continuous and ongoing nature of assessment
• When possible, coordinate the reporting cycle with any pre-existing external review program

69 Two Purposes of Assessment Reports
1. Primary: To maintain a record of assessment and improvements for you and subsequent DGPs, to be used for self-studies, accreditation agencies, boards of advisors, etc.
2. Secondary: To provide evidence of a process of accountability for the university.

70 Questions to Guide Reports
1. What outcomes were you scheduled to assess during the present biennial reporting period? What outcomes did you assess?
2. What data did you collect? Summarize your findings for these data.
3. What did you and your faculty learn about your program and/or your students from the analysis of the data? What areas of concern have emerged?

71 Questions to Guide Reports
4. As a result of your assessment, what changes, if any, have you and your faculty implemented or considered implementing to address areas of concern?
5. What outcomes are you planning to assess for the upcoming biennial reporting period?

73 What We Have Learned
• The process of change takes time
• Communication is the key to success
• It is important to pilot assessment processes before taking them to all graduate programs

74 What We Have Learned continued
• This kind of review process must be ground-up (faculty) rather than top-down (administration)
• This kind of review process requires significant human resources
- Training, data collection, analysis, and interpretation, etc.
- A key to our success is how much of this can be institutionalized

75 Managerial Tools Created for Program Review - Website

76 Assessment and Review – Connecting the Two
• Both must be owned by the faculty
- The self-study required for formal program review must have input from the entire faculty
- The resulting "action plan" must also be agreed on by the faculty in the program
- The objectives, outcomes, and assessment plan for outcomes-based assessment must have buy-in and participation from all faculty

77 Assessment and Review – Connecting the Two
• Continuous and ongoing review should inform and enhance formal program review
- The formal review self-study should include a summary of assessment findings and changes implemented. Ideally, these incremental improvements will have resulted in a stronger program and fewer "surprises" at the time of the formal review.

78 Assessment and Review – Connecting the Two
• The formal review process may suggest additional or revised program outcomes and assessment measures
- The formal review self-study should include an outline of the program outcomes and assessment plan for reviewer comment

79 Questions & Discussion

80 Managerial Tools Created for Program Review – Website (screenshot slides 80–82)

83 Managerial Tools Created for Program Review – Review Document Management (screenshot slides 83–86)

