Inventing a New Wheel: Assembling a Campus-Wide Doctoral Program Review
Marilyn Blaustein, Krisztina Filep, Alan McArdle, Heather Young
Office of Institutional Research, University of Massachusetts Amherst

Slide 2: Overview
- Background and Purpose
- Doctoral Review Process
- Data Sources
  - Academic Analytics and Google Scholar
  - Institutional Data (Admissions, Graduate Tracking, HR)
  - National Research Council
- Creation and Distribution of Reports
- Results To Date
- Lessons Learned
- Looking Ahead

Slide 3: UMass Amherst Facts
- The flagship campus of the UMass system, and the only public Carnegie very-high-research university in the state
- Enrollment: over 28,000 students (Fall 2012)
  - 78% undergraduate, 22% graduate students
  - 2,600 students in doctoral programs
- Over 50 academic departments
  - Over 100 undergraduate majors
  - 76 master's and 50 doctoral programs
- Research and development expenditures over $181 million (FY 2011)

Slide 4: Background
- Program reviews every 5-7 years
  - Include the graduate component of a program
- A comprehensive doctoral review had never been done (with the exception of participation in the NRC assessment)
- 2006: National Research Council assessment; results published Fall 2010

Slide 5: Timeline
- New chancellor on campus with renewed aspirations of reaching the level of other public AAU institutions
  - Ohio State had just completed a doctoral review
  - Chancellor charges the Dean of the Graduate School with conducting a review for UMass Amherst
- New provost: Framework for Excellence
  - Increase size of faculty to 1,200 by 2020
  - Double federal research awards/expenditures
  - Increase post-doctoral appointments by 50%
  - Increase doctorates awarded to 375 degrees/year

Slide 6: Timeline (continued)
- Advisory committee formed to oversee the doctoral review
  - Graduate Dean of Ohio State invited to campus
  - RFP out for faculty scholarly productivity data
- (Fall) Contract with Academic Analytics finalized
- (Fall) Appointment of a Special Assistant to the Provost charged with oversight of the review
  - Developed conceptual design and data requirements
- (January) IR charged with producing statistical reports
  - Reports sent to programs February 1, 2012
- Fall 2012: New chancellor with mandate for a strategic plan

Slide 7: Purpose and Principles
- The Doctoral Review was meant to provoke discussion: of strengths and weaknesses, of future directions, of impediments to growth or improvement.
- Guiding principles:
  - Transparent: open access to the data
  - Flexible: programs able to propose peers
  - Fair: programs compared to other programs in the same field (not to each other within the university)

Slide 8: The Process
- Discussions with deans, program chairs, and graduate program directors
- Doctoral programs:
  - Participated in identification of the peer list
  - Reviewed faculty lists
  - Had full access (all programs) to research productivity data (AA portal)
- Limitation: comparative data were retrospective
  - Institutional data reflected the current academic year (although comparative data were from the NRC)

Slide 9: Peers
- Recommended peer list: primarily AAU public universities without a medical school (13 institutions)
- For each program, peers that did not offer the program were removed (see the sketch below)
- Each program was allowed to add or substitute peers
- Peer groups for programs ranged from 5 to 13 institutions (70 additional institutions were selected as peers for at least one program)
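A minimal sketch of that peer-selection logic in Python, assuming hypothetical institution names and data structures; the deck does not describe how the lists were actually stored.

```python
# BASE_PEERS stands in for the recommended list of 13 institutions; programs_offered maps
# each peer to the doctoral programs it offers; requested_changes holds each program's
# approved additions/substitutions. All names here are placeholders.
BASE_PEERS = ["Peer University A", "Peer University B", "Peer University C"]  # 13 in practice

programs_offered = {
    "Peer University A": {"Chemistry", "History"},
    "Peer University B": {"Chemistry"},
    "Peer University C": {"History"},
}

requested_changes = {
    "History": {"add": ["Peer University D"], "drop": []},
}

def peer_group(program: str) -> list[str]:
    """Build the comparison group for one doctoral program."""
    # Start from the recommended list, keeping only peers that offer the program.
    peers = [p for p in BASE_PEERS if program in programs_offered.get(p, set())]
    # Apply the program's approved additions and substitutions.
    changes = requested_changes.get(program, {"add": [], "drop": []})
    peers = [p for p in peers if p not in changes["drop"]] + changes["add"]
    return peers

print(peer_group("History"))  # ['Peer University A', 'Peer University C', 'Peer University D']
```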

Slide 10: Academic Analytics
- Private company, founded in 2005
- Goal: provide universities with an annual release of accurate data on faculty performance in a comparative, disciplinary context
- Collects data independently
- Contains comparative data on Ph.D. programs from 380+ institutions
- Faculty classified into 172 disciplines
- Data in 6 areas of research activity: books, journal publications, journal citations, proceeding citations, awards, and grants

Slide 11: Academic Analytics (continued)

Slide 12: Faculty Lists
- Academic Analytics mines institutional data sources (online directories, catalogs, websites)
  - Lists are updated annually
  - Member institutions review the list and make changes (non-member institutions may review lists as well)
- Faculty to include were tenure-stream faculty and individuals expected to perform research
  - Programs were asked to review
  - Programs were able to add other faculty actively involved in research (e.g. emeritus professors, faculty from research affiliates)

Slide 13: Google Scholar
- For selected disciplines in the Humanities, Social Sciences, Management, and Education, productivity measures were not adequately captured by Academic Analytics
  - AA has since expanded bibliographic coverage, but it may still not be sufficient
- Conducted our own study of citations in Google Scholar
  - A graduate student employee used the 'Publish or Perish' software to obtain productivity data for thousands of faculty (ours and those of our peers)
  - Needed to cross-reference faculty websites to make sure citations for the right person were being counted (see the sketch below)
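As an illustration of that cross-referencing step, here is a rough pandas sketch that matches an exported citation file against a faculty roster and flags missing or ambiguous matches for manual checking. The file names and column names are hypothetical; the actual work relied on Publish or Perish exports and hand checks against faculty web pages.

```python
import pandas as pd

# Hypothetical inputs: "pop_export.csv" with one row per author query (columns assumed:
# Author, Citations) and "faculty_roster.csv" for the program's faculty list (columns
# assumed: Name, Department, HomePage).
pop = pd.read_csv("pop_export.csv")
roster = pd.read_csv("faculty_roster.csv")

def norm(name: str) -> str:
    """Normalize names so 'Smith, J.' and 'J. Smith' have a chance of matching."""
    parts = name.replace(",", " ").split()
    return " ".join(sorted(p.lower().strip(".") for p in parts))

pop["key"] = pop["Author"].map(norm)
roster["key"] = roster["Name"].map(norm)

merged = roster.merge(pop[["key", "Citations"]], on="key", how="left")

# Common names and initials collide, so anything that matched more than one query,
# or did not match at all, is flagged for manual review against the faculty web page.
counts = merged.groupby("key")["Citations"].transform("count")
merged["needs_manual_check"] = merged["Citations"].isna() | (counts > 1)

merged.to_csv("citations_by_faculty.csv", index=False)
```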

Slide 14: Student Data
- Institutional admissions data (the same data used for standard reporting)

Slide 15: Graduate Tracking Data
- Used our updated graduate tracking system to build a comprehensive picture of doctoral student progress

Slide 16: Compiling the Report
- Some data in Excel (faculty lists, peers, notes, Google Scholar)
- Academic Analytics data downloaded from the web portal
- Student data (Admissions, Grad Tracking)
- One way these sources can be pulled together per program is sketched below
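A hedged sketch of assembling those three kinds of sources for one program with pandas. The file names, column names, and the program_report_data helper are placeholders for illustration, not the workbook structure actually used.

```python
import pandas as pd

# Hypothetical extracts standing in for the Excel workbooks, the Academic Analytics
# portal download, and the student-data pulls.
faculty = pd.read_excel("faculty_lists.xlsx")          # columns: Program, Name, Rank
peers = pd.read_excel("peer_lists.xlsx")               # columns: Program, PeerInstitution
aa = pd.read_csv("academic_analytics_download.csv")    # columns: Program, Institution, Pubs, Grants
students = pd.read_csv("grad_tracking_extract.csv")    # columns: Program, Cohort, Enrolled, Graduated

def program_report_data(program: str) -> dict:
    """Assemble the pieces needed for one program's statistical report."""
    peer_set = set(peers.loc[peers["Program"] == program, "PeerInstitution"])
    return {
        "faculty": faculty[faculty["Program"] == program],
        # Keep AA rows only for this program at UMass Amherst and its approved peers.
        "aa_comparative": aa[(aa["Program"] == program)
                             & (aa["Institution"].isin(peer_set | {"UMass Amherst"}))],
        "students": students[students["Program"] == program],
    }

# One call per doctoral program; the pieces then feed the report template.
report = program_report_data("Chemistry")
```

Keeping this assembly in code rather than in hand-built spreadsheets is what makes the easy re-runs with corrected data, noted later under Lessons Learned, possible.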

Slide 17: Compiling the Report (continued)

Slide 18: Compiling the Report (continued)

Slide 19: Doctoral Program Reports
- 14-page reports, containing:
  - Guidance and instructions
  - List of peers
  - List of faculty, faculty characteristics, and comparison
  - Academic Analytics data shown in 27 charts (grants and grant $, books, publications, citations, and awards)
  - Quintile analysis of the faculty and grant $ to size relationship (sketched below)
  - Student characteristics (demographics, size of class)
  - Admission trends and GRE scores
  - Completion data: graduation rates, outcomes, median milestone times
- Summary of doctoral student post-graduation activities as reported by programs
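The slide does not spell out the quintile calculation; one plausible reading, sketched below in Python with made-up numbers, buckets the programs in a peer group into quintiles by faculty size and compares grant dollars per faculty member across those quintiles.

```python
import pandas as pd

# Hypothetical per-program comparison table: one row per institution in the peer group,
# with faculty headcount and total grant dollars. All figures are invented for the example.
df = pd.DataFrame({
    "institution": [f"Peer {i}" for i in range(1, 11)] + ["UMass Amherst"],
    "faculty": [12, 18, 25, 30, 14, 40, 22, 16, 35, 28, 24],
    "grant_dollars": [1.1e6, 2.3e6, 3.0e6, 4.2e6, 0.9e6, 6.5e6, 2.8e6, 1.5e6, 5.1e6, 3.6e6, 2.9e6],
})

# Grant dollars per faculty member, then quintile membership by program size.
df["grants_per_faculty"] = df["grant_dollars"] / df["faculty"]
df["size_quintile"] = pd.qcut(df["faculty"], 5, labels=[1, 2, 3, 4, 5])

# Average per-capita grant funding within each size quintile, and where UMass Amherst falls.
summary = df.groupby("size_quintile", observed=True)["grants_per_faculty"].mean()
print(summary)
print(df.loc[df["institution"] == "UMass Amherst",
             ["faculty", "grants_per_faculty", "size_quintile"]])
```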

Slide 20: Doctoral Program Reports (continued)
- Highlight reports, primarily for deans, were 2 pages with a subset of information
  - Only graphical data
  - All productivity data shown as counts per capita
  - Different information was presented depending on the school/college
- Both the full and highlight reports showed unadjusted, unweighted data
  - Did not report the Faculty Scholarly Productivity Index (FSPI)

Slide 22: Distribution
- Distributed electronically
- Each program received its own report
  - An Excel macro sent the same message to each program with the appropriate attachment (a rough equivalent is sketched below)
- All reports and highlights were available on a restricted shared drive
  - Access was provided to deans, department/program chairs, and graduate program directors
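The distribution itself was done with an Excel macro; purely as a rough equivalent, the Python sketch below sends each program its own report as an attachment. The addresses, file paths, and mail server are placeholders.

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

# Hypothetical mapping of program to contact address and report file.
reports = {
    "Chemistry": ("gpd-chemistry@example.edu", Path("reports/Chemistry_doctoral_report.pdf")),
    "History": ("gpd-history@example.edu", Path("reports/History_doctoral_report.pdf")),
}

with smtplib.SMTP("mail.example.edu") as smtp:  # placeholder mail server
    for program, (address, path) in reports.items():
        msg = EmailMessage()
        msg["Subject"] = f"Doctoral Program Review report: {program}"
        msg["From"] = "institutional.research@example.edu"
        msg["To"] = address
        msg.set_content("Please find your program's doctoral review report attached.")
        # Attach this program's report only, so each program sees just its own data.
        msg.add_attachment(path.read_bytes(), maintype="application",
                           subtype="pdf", filename=path.name)
        smtp.send_message(msg)
```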

Slide 23: Results To Date
- Program self-assessments
  - Programs were able to question the data and/or address flaws
  - Most programs submitted a self-assessment
  - Many identified their own areas for improvement, proposed a plan, and have started acting
  - Self-assessments were sent to deans
- Deans' reports
  - Most deans' reports have been received
  - Very different styles of reports
- Advisory Committee review
  - Tasked by the Chancellor to "identify areas of emphasis" (i.e., where can a modest investment make a difference?)

Slide 24: Looking Ahead
- Recommendations of the Advisory Committee
- Program-level actions to improve performance
- Sub-committee of the Graduate Council to discuss time to degree and completion rates (already in place)
- Tie-in with comprehensive program review
- Updating reports with new data
  - Already done for highlight reports

Slide 25: Lessons Learned
- Transparency and involvement of programs paid off in terms of buy-in
  - Having an involved senior faculty member responsible for the review who worked with programs lent credibility to the process (but not necessarily the data)
  - Ongoing reviews and access to new data keep programs engaged
- The ability to provide feedback to the data source provider improves data quality
- The time invested in creating a good database is well worth it, and automating report production allowed for easy re-runs with corrected data

Slide 26: Questions?
For more information:
- Website:
- Marilyn Blaustein
- Krisztina Filep