Presentation transcript:

1 For this session, please arrange yourselves by "size of institution" (see table markers):
- Below 1,000 students
- 1,000-2,500 students
- 2,500-5,000 students
- Above 5,000 students

2 Who We Are
- Two schools, one academic program
- Liberal Arts, Residential, Catholic, Benedictine
- 3,900+ undergraduates
- 300+ FTE faculty
- 80% of faculty tenured or tenure track

3 CSB/SJU First-Year Seminar
- Required of all first-year students (approximately 1,000 per year)
- Year-long course
- 16 students in each class; students stay in their cohort for the year
- Goals
  - Critical thinking, reading, writing, discussion, public speaking, information literacy
- First semester emphasizes essays
- Second semester emphasizes the research paper
- 65 sections, 50 faculty
- Faculty-determined content topics
- Faculty recruited from across disciplines (HUM/SS/NS/FA)
- Both tenured and contingent faculty

4 FYS Assessment, 1990s-2008
- Assessment of student work done annually
- Pre and post essays
- Holistic grading
- Little/no incentive for student effort
- Assessment not clearly tied to directives in Faculty Assembly motion
- Results not shared with faculty or used

5 New Vision
- Meaningful assessment
  - To shape faculty development efforts
  - To improve student learning
- Create clearer understanding of whole faculty's expectations for course

6 New Aspirations
- Create culture of assessment
  - Don't just do assessment, but see it as useful
- Enhance sense of a joint endeavor
- Create sense of community among 50+ faculty teaching in program

7 One Result
- Decision on key areas for assessment
- Rubric
  - Used to evaluate research papers produced in second semester

8 Activity: Table Conversations
- Please take a few moments to have a brief conversation at your table about the following:
  - What do you do at your school?
  - What do you see in our program?
  - What would you like us to explore/explain in our remaining time together?

9 Possible Areas
- Creating Culture of Assessment
- Process
- Results
- On-going Issues

11 Creating a Culture of Assessment: Community Understanding
- Faculty conversations on goals for FYS
- Faculty conversations on how best to assess in ways that fit the Faculty mandate
- Faculty conversations on creation of key areas

12 Creating a Culture of Assessment: Essential Step
- FYS faculty buy-in on three key areas for assessment
  - Ability to Present a Clear Argument
  - Ability to Address Different Points of View
  - Ability to Use Evidence in a Convincing Manner
- Understood from the beginning that we were "aiming high"
- Creation of rubric
  - FYS faculty discussions and revisions

13 Creating a Culture of Assessment: Re-branding the "A" Word
- Emphasizing that it is a faculty-driven process
- Not punitive, but helping teachers develop the skills necessary to improve student learning in key areas
- Reminders that assessment is not the same as grading
- Worked from the assumption that if good students weren't meeting our goals, then we needed to find more effective teaching strategies (or change our goals)
- Not really interested in results from students who didn't put in much effort

14 Creating a Culture of Assessment: Disseminating Results
- FYS faculty receive student scores from their section (numerical and written comments)
- Aggregate results are shared at the FYS department meeting
- Multiple conversations on "best practice" in areas of greater difficulty
  - Led by faculty with ideas that have been successful
- Training sessions led by director and team
  - Week-long May workshop plus several in-semester meetings

16 Assessment Process
- Evaluate research papers done in second semester
  - Draws together multiple strands taught over the year
  - Major portion of grade, so high degree of student effort
- Select three papers with the highest grade from each section
  - Do "best" students meet our assessment goals?

17 Assessment Teams
- Experienced FYS faculty
  - Most are graduates of Teagle Grant Assessment Training
- Day-long training/conversation for inter-rater reliability
  - Revisions of rubric considered
- Four teams of two (workflow sketched below)
  - Read individually, score, and compare ratings
  - Provide numerical rating and 4-5 sentence explanation of why
  - When readers disagree, the paper goes to a third reader
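
A minimal sketch of the paired-reading workflow described on this slide, written in Python under stated assumptions: the rubric labels are taken from slide 12, but the 1-3 score scale, the zero-disagreement tolerance, and the function names (needs_third_reader, resolve) are illustrative inventions, not the program's actual rubric or scoring rules.

from typing import Dict, Optional, Tuple

# Illustrative rubric areas (from slide 12); the 1-3 scale is an assumption.
RUBRIC_AREAS = [
    "Ability to Present a Clear Argument",
    "Ability to Address Different Points of View",
    "Ability to Use Evidence in a Convincing Manner",
]

def needs_third_reader(score_a: int, score_b: int, tolerance: int = 0) -> bool:
    """Two readers score independently; any gap beyond the tolerance flags the paper for a third reader."""
    return abs(score_a - score_b) > tolerance

def resolve(score_a: int, score_b: int, score_c: Optional[int] = None) -> float:
    """Average the paired readers' scores, folding in a third reader's score when one was needed."""
    scores = [score_a, score_b] + ([score_c] if score_c is not None else [])
    return sum(scores) / len(scores)

# Example paper: the pair agrees on two areas and splits on one.
paper: Dict[str, Tuple[int, int]] = {
    RUBRIC_AREAS[0]: (3, 3),  # agreement
    RUBRIC_AREAS[1]: (1, 3),  # disagreement -> third reader
    RUBRIC_AREAS[2]: (2, 2),  # agreement
}

for area, (a, b) in paper.items():
    if needs_third_reader(a, b):
        print(f"{area}: send to third reader")
    else:
        print(f"{area}: final score {resolve(a, b):.1f}")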

19 What We Found Out
- Gradual improvement in ALL categories of assessment (combining Exceptional and Acceptable categories)

                                            2009     2012
  Ability to Present a Clear Argument       56.40%   70.80%
  Ability to Use Different Points of View   49.30%   61.40%
  Ability to Use Evidence                   60.80%   79.10%

20 Thinking About Results
- Scores in "Ability to Address Different Points of View" (critical thinking) were lower
- Recognition that level of intellectual development makes this harder for traditional-age first-year students
  - Tend to rely on authority or "everyone has a right to their own opinion"
  - Makes it difficult to wrestle with an argument that doesn't fit the student's view

21 Response to Student Results
- Promote understanding of typical responses for the first-year level
  - Rather than seeing this as a permanent state
- Seek ways of "nudging" students forward
  - Emphasis on small steps
  - Structured teaching of critical thinking over the year

22 Assessment Impact on FYS Faculty
- Accountability focused attention on core learning goals
- Vast majority of faculty began to adhere to page/source requirements, ending wide variability between sections
- Increased sense of common purpose
  - Greater attendance at FYS department meetings
  - Greater attendance at workshop and other training
  - More conversations among faculty ACROSS disciplines
  - More FYS sections where faculty were doing "paired" work

23 Greater Willingness to Cooperate

                                                                 2009   2014
  Percent of Faculty Who Turned in Research Papers for Scoring   45%    98%

25 On-Going Concerns: Senior Faculty
- Some tenured faculty not as engaged
  - Tend to teach one FYS every 3-4 years
  - Departmental affiliation takes precedence
- Don't attend training or meetings as frequently
  - Therefore less likely to share wisdom
- Don't modify expectations to fit goals
- Assessment results often lower

26 On-Going Questions: 2014 Results

                                            2009     2012     2014
  Ability to Present a Clear Argument       56.40%   70.80%   60.80%
  Ability to Use Different Points of View   49.30%   61.40%   52.40%
  Ability to Use Evidence                   60.80%   79.10%   73.60%

27 Thoughts on 2014 Results
- No significant change in the profile of the class
- Inter-rater reliability over time
  - Most of the team has read for 4-5 years
  - Have made adjustments in the gloss on the rubric
  - Immersed in goals, so unconsciously expect more?
- Need to include earlier papers in the annual meeting to establish inter-rater consistency
  - See if there has been a shift in standards

28 Examination of 2014 Data
- May be due to a slight change in who was teaching
- Experience matters
- Sections taught by "veteran" faculty averaged
  - 2.2 "exceptional" ratings per class, out of 9 possible (three papers scored on three rubric areas)
  - 2.2 "unsatisfactory" ratings per class, out of 9 possible
- Sections taught by "inexperienced" faculty averaged
  - 1.1 "exceptional" ratings
  - 4.2 "unsatisfactory" ratings

29 For Further Information
- Ken Jones (kjones@csbsju.edu)
- John Kendall (jkendall@csbsju.edu)
- To download a copy of this presentation or any related material, please visit the following webpage at CSB/SJU: http://employees.csbsju.edu/jkendall/

