Lily Hwang, Director, Institutional Research
Juliana Lancaster, Director, Institutional Effectiveness
A 4-year state college in the University System of Georgia
Authorized by the Georgia Legislature in May 2005
President hired in September 2005
Campus opened with 118 students and 10 faculty in August 2006
Home of the Grizzlies!
Students:
  Fall 2007 enrollment: 787 headcount
  Spring 2007 enrollment: 867 headcount
  Fall 2008 enrollment: 1,563 headcount
Faculty (Fall 2008):
  Full-time instructional faculty: 120
  Part-time instructional faculty: 10
Facilities:
  6 buildings: A, B, C, D (Student Services Ctr), E (Valentine Bldg), F (Fitness Ctr); Building E not yet occupied
  Total: 474,351 square feet
  Parking deck: 734 cars
  Total acreage: >200
Four degree programs: BBA, Business; BS, Biology; BS, Information Technology; BS, Psychology
Commitment at every level to student learning and effectiveness
Institutional focus on interdisciplinary/integrated education
Openness to going “outside the box” – provided there is a plan for assessment
Created the opportunity for a ground-up design of an INSTITUTIONAL assessment plan and of well-integrated institutional research functions
Initial Design
The First Full Year
Lessons Learned
Next Steps
Advantages of starting from scratch:
  Strong executive-level support for and understanding of IE
  Limited number of programs and offices at start-up
  Absence of legacy or standing processes and structures
Disadvantages of starting from scratch:
  Absence of legacy or standing processes and structures
  Each individual brings a different set of assumptions and expectations
  Rapid growth and hiring leads to a continuous need for explanation/education
To achieve “…ongoing, integrated, and institution-wide research-based planning and evaluation processes…” [SACS], we needed:
  Structure and resources
  Broad buy-in, consensus, and agreement
  Working “ground rules”:
    Institution-wide and pervasive
    Integrated with the institution’s mission and strategic plan
    Faculty/staff participation and basic control
    Interdisciplinary and developmental assessment of student learning
Program-level student learning outcomes and assessment plans
General Education curriculum designed around student learning outcomes
Agreement to develop and assess for institutional student learning outcomes
Agreement to integrate curricular and co-curricular student learning efforts
Leading to: Integrated Educational Experience (IEE) Student Learning Outcome Goals for GGC
Conceptual Relationships Among Outcome Goals and Objectives (diagram). Elements shown: Integrated Educational Experience SLO Goals; Institutional Goals; Administrative Unit Outcome Goals; General Education Goals; Program of Study Goals; Course Goals; Lesson Objectives; Student Affairs Goals; Student Affairs Activity Goals.
Organizational Structure to Manage the Resulting Flood of Data (diagram):
  IEE Goal Teams (interdisciplinary): operationally define and plan assessments; integrated review of program findings
  IEE Assessment Review Committee: communication; integrated review of IEE assessment results
  Assessment Steering Committee: integrated review of all assessment results; strategic analysis of results and their impact on strategic plans
  Administrative Review Committee
  General Education Committee
  General Education Goal Teams
  Program Goal Teams
Planning
  All operating units, both academic and administrative, developed assessment plans
  Academic units focused on course-level, embedded assessments
  All faculty and numerous staff engaged in discussing and planning assessment
  Goal teams developed operational definitions of each institution-level student learning outcome (General Education and IEE)
Execution
  All units attempted to fully execute their assessment plans:
    Some outcomes were not measurable
    Some measures called for unobtainable data
  All units were able to collect valid data on at least one outcome
  Most units were able to identify at least one needed action in response to assessment:
    60% identified needed changes in curriculum or operations
    34% identified needed changes in assessment plans
Challenges & Lessons Learned Implementing program-level assessment plans while still developing the institutional framework Communicating the history of and basis for having both General Education and IEE student learning outcomes at the institutional level Articulating the initial task of the Goal Teams: To operationally define each Student Learning Outcome Managing expectations at multiple levels
Next Steps
  Review the conceptual and actual relationships between the two sets of institution-wide student learning outcomes
  Initiate a campus-wide discussion about whether to make changes and, if so, what those changes might be
  Continue developing a broad base of informed, skilled individuals across campus to lead assessment efforts
  Continue efforts to establish systematic, manageable assessment at all levels
Unique Setting/Environment
Major Tasks
Major Challenges
Institutional Environment
  Banner-hosted institution: the technical environment is hosted at a central location, the Office of Information & Instructional Technology (OIIT)
  Internal support available for IR: a core data manager (Banner functional role) and a programmer (IT)
Major Tasks
  Learn the legacy data systems, e.g., the Student Information Reporting System (SIRS) and Curriculum Inventory Reporting (CIR)
  Learn the USG reports and their definitions, e.g., the Semester Enrollment Report (SER)
  Learn the new Academic Data Mart (ADM) systems
  Produce reports (routine and ad hoc; internal and external)
  Produce the College Fact Book
New Major Tasks
  Began IPEDS reporting
  Began many other surveys:
    CUPA Faculty Salary Survey (begun earlier)
    National Postsecondary Student Aid Study (no data to report due to non-Title IV status at the data point)
    Consortium for Student Retention Data Exchange (CSRDE)
    National Student Clearinghouse (supported by USG)
Major Challenges
  Entering during the transitional period from the legacy data system to the new ADM system, which allowed only a very brief learning curve
  Learning alongside other units, e.g., the Registrar’s Office and Human Resources, which required close working relationships
Example: a collaborative effort to establish a CIP code list representing GGC’s teaching disciplines/areas. Why is this important for GGC? GGC does not have departments; its academic structure is School >> Major (program) >> Track/Concentration.
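The sketch below is a minimal, purely illustrative way to represent that School >> Major >> Track structure and derive a CIP list from it; the school names, program groupings, and CIP assignments are assumptions for illustration, not GGC’s actual records.

```python
# Illustrative only: school names, program groupings, and CIP assignments
# are assumptions for this sketch, not GGC's actual structure.
from dataclasses import dataclass, field


@dataclass
class Program:
    name: str
    cip_code: str                                  # CIP code used for reporting
    tracks: list[str] = field(default_factory=list)  # optional concentrations


@dataclass
class School:
    name: str
    programs: list[Program] = field(default_factory=list)


schools = [
    School("School of Business", [Program("BBA, Business", "52.0201")]),
    School("School of Science and Technology", [
        Program("BS, Biology", "26.0101"),
        Program("BS, Information Technology", "11.0103"),
    ]),
    School("School of Liberal Arts", [Program("BS, Psychology", "42.0101")]),
]

# Derive the CIP list representing the teaching disciplines/areas.
cip_list = {p.cip_code: p.name for s in schools for p in s.programs}
for code, program in sorted(cip_list.items()):
    print(code, program)
```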
IE and IR
  As does every unit of GGC, IR operates within the college framework that IE facilitates and monitors
  Specific tasks for IR in support of IE operations:
    Institutional information requests for accreditation purposes
    Information support for assessment projects, e.g., NSSE and course evaluations
  Anticipated tasks for IE in support of IR:
    Providing benchmark and assessment data for the Fact Book
    Collaboration in the design of specific studies
Presenters:
  Juliana Lancaster, Director, Institutional Effectiveness (jlancaster@ggc.usg.edu)
  Lily Hwang, Director, Institutional Research (lhwang@ggc.usg.edu)