The Evolution of a QEP: One Institution’s Mid-Cycle Report


1 The Evolution of a QEP: One Institution’s Mid-Cycle Report
Learning Without Borders: Internationalizing the Gator Nation
University of Florida, Gainesville, Florida
Timothy S. Brophy, Professor and Director of Institutional Assessment
Matthew Jacobs, Associate Professor and Director of Undergraduate Programs
Cynthia Tarter, Assistant Director of Undergraduate Programs

2 Today’s learning outcomes
Using UF’s QEP as a foundation, we will:
• Examine our process for modifying the QEP while it is being implemented;
• Discuss and critique the process of data analysis and plan modification; and
• Interpret the utility of the UF process model as it applies to your institution.

3 UF’s QEP: Learning Without Borders
Institutional planning and development
The initiatives:
• Study Abroad
• On-campus course development
• Campus Life
• International Scholars Program
• International Calendar and Marketing

4 UF QEP: Student Learning Framework

5 Approach to implementation
The “plan” and the “reality”:
• The institutional level – integration and buy-in
• The personnel level – staffing assessments
• The faculty level – engagement and collaboration
• The student level – benefits and opportunities
[Slide graphic: QEP staffing chart – Director, QEP (Dean); Assistant Director (Admin); Graduate Assistant (Assessment); Director, Undergraduate Academic Programs]

6 UF Process Model: Assessment and Data Reporting
Assessment and Institutional Effectiveness Data Reporting cycle:
• Establish mission, goals, and outcomes
• Assessment planning
• Implement the plan and gather data
• Interpret and evaluate the data
• Modify and improve
February – Assessment plans submitted for the next academic year (AY)
October – Assessment data, results, and use of results for the previous AY reported

7 Process Model: UF Assessment System
• Develop academic assessment plans and data reports
• System entry: submit for institutional review
• Implement plan and collect data
• System entry: submit reports

8 Validity and Reliability

9 Validity
Validity is “a unitary concept – it is the degree to which all the accumulated evidence supports the intended interpretation of test scores for the proposed use” (APA/AERA/NCME, Standards for Educational and Psychological Testing, 2014).
For institutional assessment, the evidence is the SLO data (the “test scores”), and the proposed use of these data is to determine the degree to which students in the program have met an SLO.
Interpretation – the faculty set criteria for success and make inferences from the SLO data about the degree to which their students achieve the SLO.
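To make the inference step concrete, here is a minimal sketch (illustrative only, not from the presentation), assuming hypothetical rubric scores on a 1–4 scale and a hypothetical faculty-set criterion that 70% of students score 3 or higher:

    # Criterion-based SLO inference (hypothetical data and thresholds).
    def slo_criterion_met(scores, passing_score=3, target_rate=0.70):
        """Return the observed success rate and whether the faculty-set
        criterion for the SLO is met."""
        rate = sum(s >= passing_score for s in scores) / len(scores)
        return rate, rate >= target_rate

    # Example: rubric scores for ten students on one SLO
    scores = [4, 3, 2, 3, 4, 3, 1, 3, 4, 2]
    rate, met = slo_criterion_met(scores)
    print(f"Success rate: {rate:.0%}; criterion met: {met}")
    # Success rate: 70%; criterion met: True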

10 Checking for Validity at the institutional level
• All plans and data reports are reviewed by Institutional Assessment staff
• All measures of goals and SLOs are reviewed to ensure that they lead to data pertinent to the goal or outcome (validity)
• If there are questions, the plan or report is returned for modification or clarification

11 Reliability/Precision
In its broadest sense, “reliability refers to the consistency of scores across replications of a testing procedure…this is always important…and the need for precision increases as the consequences of decisions and interpretations grow in importance” (APA/AERA/NCME, Standards for Educational and Psychological Testing, 2014).
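The Standards frame reliability/precision as consistency across replications. One widely used index for a multi-item assessment is Cronbach’s alpha; the sketch below is an illustrative choice of index, not one named in the presentation, and uses hypothetical data:

    # Cronbach's alpha: internal-consistency reliability for a k-item measure.
    # Hypothetical data, shown only to make "consistency across replications" concrete.
    def cronbach_alpha(item_scores):
        """item_scores: one list of scores per item, aligned by student."""
        k = len(item_scores)
        n = len(item_scores[0])

        def var(xs):  # sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        # Total score per student across all items
        totals = [sum(item[i] for item in item_scores) for i in range(n)]
        return (k / (k - 1)) * (1 - sum(var(it) for it in item_scores) / var(totals))

    # Example: three rubric items scored for five students
    items = [[3, 4, 2, 4, 3],
             [3, 4, 3, 4, 2],
             [2, 4, 2, 3, 3]]
    print(round(cronbach_alpha(items), 2))  # 0.83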

12 Checking for Reliability at the institutional level
• Reliability of SLO assessments is the responsibility of the academic program – in this case, the QEP staff
• Faculty have access to the built-in reliability functions of our Learning Management System (Canvas); they can program the LMS to collect data on their program SLOs for content assessments
• We monitor the reliability of the QEP measures that are administered institutionally – the institutionally developed IntCrit and IntComm instruments
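The slides do not specify how reliability of the institutionally administered IntCrit and IntComm measures is monitored; for rater-scored rubrics, one plausible check is inter-rater agreement such as Cohen’s kappa. A sketch with hypothetical ratings:

    # Cohen's kappa: chance-corrected agreement between two raters.
    # Hypothetical method and data; the presentation does not state its procedure.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        ca, cb = Counter(rater_a), Counter(rater_b)
        # Agreement expected by chance, from each rater's score distribution
        expected = sum(ca[c] * cb[c] for c in ca) / (n * n)
        return (observed - expected) / (1 - expected)

    # Example: two raters scoring eight student artifacts on a 1-4 rubric
    a = [3, 4, 2, 3, 3, 4, 2, 3]
    b = [3, 4, 2, 3, 2, 4, 2, 3]
    print(round(cohens_kappa(a, b), 2))  # 0.81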

13 Getting to results: Data collection and analysis
The process and challenges – incentives and obligations
• Student surveys – by cohorts, program participants
• Faculty – course SLO data
• Identifying other sources of data
Interpreting results
• Were you asking the right questions?
• Anomalies and outliers? (see the sketch after this list)
• Factors inhibiting growth/change
Key findings
• Study abroad and curriculum enhancement grants
• Applying global student learning outcomes across multiple disciplines
• Students’ intercultural interaction and active learning of different cultural norms
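The “anomalies and outliers” check can be made concrete with a simple fence rule. Below is a sketch using interquartile-range fences on hypothetical survey-scale means; this is a common screening heuristic, not the presenters’ stated method:

    # Flag values outside Tukey-style IQR fences (crude quartile picks for brevity).
    def iqr_outliers(values, k=1.5):
        xs = sorted(values)
        n = len(xs)
        q1, q3 = xs[n // 4], xs[(3 * n) // 4]
        spread = q3 - q1
        lo, hi = q1 - k * spread, q3 + k * spread
        return [v for v in values if v < lo or v > hi]

    # Example: cohort means on a 5-point survey scale
    means = [3.2, 3.4, 3.1, 3.3, 1.0, 3.5, 3.2, 4.9]
    print(iqr_outliers(means))  # [1.0, 4.9]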

14 Mid-cycle Modifications: Use of results
• A more strategic approach
• Study abroad program development and course development – unit-based vs. individual faculty; curriculum integration; diversity
• Marketing – different constituencies require different approaches
• On-campus engagement

15 Conclusions: Busting myths
• It is OK to change and adapt based on assessment and analysis
• Faculty participation
• Campus-wide knowledge and engagement
• Energy-intensive administration and outreach
• Anticipation and mitigation of challenges
• Flexibility and sensitivity to challenges
• Solution-oriented and collaborative
• Use of results to modify and improve the plan

16 Contact Information and Q&A
Timothy S. Brophy, Director of Institutional Assessment and Professor, Music Education – University of Florida, Office of the Provost
Matthew F. Jacobs, Director of Undergraduate Academic Programs and the International Studies Program, Associate Professor of History – University of Florida International Center and College of Liberal Arts and Sciences
Cynthia M. Tarter, Assistant Director of Undergraduate Academic Programs and the International Studies Program

