Crowd-Sourcing Innovative Practices: Assessing Integrative Learning at Large Research Institutions.

1 Crowd-Sourcing Innovative Practices: Assessing Integrative Learning at Large Research Institutions

2 Mo Noonan Bischof, Assistant Vice Provost, mabischof@wisc.edu; Amy Goodburn, Associate Vice Chancellor, agoodburn1@unl.edu; Nancy Mitchell, Director, Undergraduate Education, nmitchell1@unl.edu

3 LEAP Integrative Learning Synthesis and advanced accomplishment across general and specialized studies demonstrated through the application of knowledge, skills, and responsibilities to new settings and complex problems.

4 Challenge: Assessing Integrative Learning. Can/should the same assessment tools be used for assessing within a course, a unit, and/or an institution? Does integrative learning apply to integrating knowledge and skills within a discipline, among disciplines, or both? How can we align quality improvement across levels while respecting disciplinary purposes and values?

5 UW-Madison Learning Community: 21,615 employees (2,177 faculty; 1,635 instructional academic staff; 1,261 research academic staff; 5,291 graduate assistants) and 42,820 students (29,118 undergraduates; 9,183 graduate students; 2,774 professional students; 1,745 non-degree students)

6 Annually: 7,400 new undergraduates; 29,500 enrolled undergraduates; 6,500 Bachelor's degree graduates

7 13 academic schools/colleges with distributed responsibility and governance; ~500 academic programs at all levels; 134 Bachelor's-level degree programs. (Chart legend: annual degrees, binned 1-49, 50-99, 100-199, 200-299, and more than 300.)

9 Diagram: program-level learning goals and assessments alongside institutional-level learning goals and assessments, which include WI-X and the Essential Learning Outcomes (ELOs)

10 Why pilot the AAC&U VALUE Rubrics? Identified gap: institutional-level assessment with a direct-measure approach. Evaluates student learning across programs; aligns with the AAC&U Essential Learning Outcomes; aligns with the VSA/College Portrait demonstration project. First pilot project summer 2012, second pilot 2013. Main goal: bring faculty across disciplines together to evaluate student work.

11 AAC&U VALUE Rubric Project: Scorers, Artifacts, Rubrics. Scorers: a cohort of 25 faculty with cross-disciplinary representation and a focus on faculty engagement. Rubrics: the AAC&U VALUE written communication rubric. Artifacts: a "value-added" approach comparing first-year students and students near graduation.

12 Written Communication VALUE Rubric. Selected written communication for ease of identifying artifacts across disciplines/programs. Dimensions: Context and Purpose for Writing; Content Development; Genre and Disciplinary Convention; Sources and Evidence; Control of Syntax and Mechanics.

13 Artifacts: "Value-added" Approach. Goal was to collect 350 artifacts at each level, first-year (FYR) and near-graduation (NGR). Identified 52 courses with high numbers of FYR and NGR students that seemed likely to have a suitable writing assignment; 22 courses (41 instructors) had a suitable assignment and agreed to participate. Invited 2,450 students to submit artifacts and collected 451 submissions.

14 Scorers: Faculty Engagement. A 1.5-day workshop in June 2013; set ground rules; 3 structured rounds intended to get faculty familiar with the rubric and to "test" scorer agreement; asked faculty to think beyond their field/discipline; each scorer rated about 40 artifacts. Discussion revealed challenges with the 4-point scale and with what counts as "mastery."

15 *The Zmw score is from the Mann-Whitney U test; Zmw scores greater than 1.96 indicate that the two groups are significantly different at the 0.05 level.
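
To illustrate the kind of comparison behind this footnote, the sketch below runs a two-sided Mann-Whitney U test on rubric scores and reports the normal-approximation z score that the 1.96 threshold refers to. The score lists and variable names are hypothetical; the project's actual data are not part of this transcript.

```python
# Minimal sketch of the Mann-Whitney comparison described on the slide.
# The rubric scores below are made up for illustration only.
from scipy.stats import mannwhitneyu

fyr_scores = [2, 2, 3, 1, 2, 3, 2, 2, 3, 2]   # hypothetical first-year (FYR) scores, 1-4 scale
ngr_scores = [3, 3, 4, 2, 3, 3, 4, 3, 2, 3]   # hypothetical near-graduation (NGR) scores, 1-4 scale

# Nonparametric test suited to ordinal rubric scores
u_stat, p_value = mannwhitneyu(fyr_scores, ngr_scores, alternative="two-sided")

# Normal approximation for the z score reported on the slide (no tie correction here)
n1, n2 = len(fyr_scores), len(ngr_scores)
mean_u = n1 * n2 / 2
sd_u = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
z_mw = (u_stat - mean_u) / sd_u

print(f"U = {u_stat:.1f}, p = {p_value:.3f}, z = {z_mw:.2f}")
print("Groups differ at the 0.05 level" if abs(z_mw) > 1.96 else "No significant difference at 0.05")
```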

17 Summary Findings. The percentage of nearly graduating students judged proficient or better (a score of 3 or 4 on the 4-point scale) on each dimension was fairly high, ranging from 64% to 83%; across all dimensions it was 74.7%. Differences between first-year and nearly graduating students were only weakly significant. Inter-scorer reliability was problematic (the "mastery" issue): overall, 67% of scorer pairs showed weak agreement or systematic disagreement.
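
As a rough illustration of these two metrics, the sketch below computes the share of artifacts scored proficient or better and a pairwise scorer-agreement statistic. The slides do not say which agreement measure was used, so quadratic-weighted Cohen's kappa is assumed here as one common choice for ordinal rubric scores, and all scores are hypothetical.

```python
# Illustrative only: percent proficient and pairwise scorer agreement on a 4-point rubric.
# The agreement statistic (quadratic-weighted Cohen's kappa) is an assumption; the
# slides do not specify how scorer agreement was measured.
from sklearn.metrics import cohen_kappa_score

ngr_scores = [3, 4, 2, 3, 3, 4, 3, 2, 4, 3]   # hypothetical near-graduation scores, one dimension
pct_proficient = 100 * sum(score >= 3 for score in ngr_scores) / len(ngr_scores)
print(f"Proficient or better (3 or 4): {pct_proficient:.1f}%")

# Agreement between two scorers who rated the same set of artifacts
scorer_a = [3, 4, 2, 3, 3, 4, 3, 2, 4, 3]
scorer_b = [2, 4, 2, 3, 2, 3, 3, 2, 4, 4]
kappa = cohen_kappa_score(scorer_a, scorer_b, weights="quadratic")
print(f"Quadratic-weighted kappa: {kappa:.2f}")   # values below ~0.4 are often read as weak agreement
```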

18 What did we learn? Importance of assignment (artifact) development; adapt the rubric to the program mix and/or campus culture (language, learning outcomes); engagement of faculty produced high-quality discussions (ground rules/calibration). Next steps: continue to engage faculty at program and disciplinary levels. Contact information: Mo Noonan Bischof, Assistant Vice Provost, University of Wisconsin-Madison, mabischof@wisc.edu. More about our project: http://apir.wisc.edu/valuerubricproject.htm

20 University of Nebraska-Lincoln: Research One, Big Ten Conference, land-grant; 24,000 students; 8 independent colleges

21 Achievement-Centered Education (ACE): 10 student learning outcomes (30 credits); 600 courses across 67 departments; transferable across 8 colleges; requires assessment of collected student work

22 UNL Assessment Context: review of each ACE course on a 5-year cycle; biennial review of all undergraduate degree programs; 50 disciplinary program accreditations; 10-year North Central/HLC accreditation

23 ACE 10 Generate a creative or scholarly product that requires broad knowledge, appropriate technical proficiency, information collection, synthesis, interpretation, presentation, and reflection.

24 HLC Quality Initiative: ACE 10 Project. 25 faculty across colleges meet monthly to: explore methods and tools for assessing work; develop a community to share ideas; connect ACE 10 and degree program assessment; develop a process for creating the assessment report; create a team of assessment "ambassadors"

25 Discussing Assessment Practices

27 A Common Rubric: disciplinary vs. institutional goals

29 Inquiry Project Results: abandoned the idea of piloting a common rubric; revised the syllabus to focus on processes, not tools; developed a poster session for public sharing; streamlined ACE and program review processes; creating a process for the 5-year ACE program review

30 Group Discussion: How do you address differences across disciplinary norms and cultures? How can program/disciplinary assessments inform institutional assessment, and vice versa? What strategies can you use to develop shared goals and understanding? What are some effective practices for supporting and sustaining faculty and staff engagement?
