
1 CIRTL Network Meeting
www.cirtl.net/networkgroup
September 19, 2013
11:00-12:30 ET / 10:00-11:30 CT / 9:00-10:30 MT / 8:00-9:30 PT
This meeting will be recorded.
Begin by running the Audio Setup Wizard: select Tools > Audio > Audio Setup Wizard, or select the audio wizard icon.
Backup call-in: 1 (571) 392-7703, passcode 967 353 205 578#

2 Agenda
Announcements
Blackboard Collaborate phone & room changes
Topic 1: Planning Local Program Evaluations (Part 1) (Ann Austin)
 – Derek Bruff
 – Don Gillian-Daniel
 – Rique Campa
 – Laura Border
Part 2: Friday, September 27, 2013, 11:00 ET / 10:00 CT / 9:00 MT / 8:00 PT

3 Changes to Blackboard Collaborate
Each event series has its own room – no worries about scheduling.
Each room has its own phone bridge – participants can initiate the call on their own.
The CIRTL Events page links each event to its associated room (http://www.cirtl.net/events).
What we call Room 1 and Room 2 will go away at the end of September. We will update the meeting agenda pages to link to the new “CIRTL Admin & Network Meetings” room.
Next week’s Evaluation session will stay in BBC2 – Room 2 (the room we are in today).


7 Overview of the Planning Process for Evaluating CIRTL Institution-Level Programs and Outcomes
Today’s workshop:
 – Overview of the planning process for evaluating institution-level CIRTL programs
 – Examples of approaches to institution-level evaluations
 – Next steps
Teleconference – Friday, September 27
Network meeting – October 10-11

8 Overview of Evaluation Plans
History: the start of a collaborative process for evaluation across the Network.
Guiding question: What is the impact on participants of involvement in CIRTL programs?
Philosophical approach:
 – Alignment
 – Collaboration

9 [Diagram: CIRTL Outcomes, Programs, Evaluation]

10 Developing Evaluation Plans: Network Discussions
In the coming months, beginning with the Network meeting, we will be involved in Network discussions on:
 – Clarifying and refining the CIRTL outcomes
 – Program outcomes: developing a common instrument to gather basic data about the outcomes of CIRTL institution-level programs
 – Impact on participants: developing a common instrument to focus on the learning experiences of participants at different levels of the CIRTL outcomes (starting with those who do TAR projects)
 – Focused institution-level evaluations: developing plans for local institutional program evaluation

11 Today’s Focus
Examples of evaluation approaches used at the local institution level:
 – Interviews – Derek Bruff (Vanderbilt)
 – Rubrics – Don Gillian-Daniel (UW)
 – Pre/post surveys – Rique Campa (MSU)
 – Monthly monitoring – Laura Border (Colorado)

12 Interviews for Assessment Derek Bruff, Vanderbilt University

13 Methodology
Draft interview questions.
Select interview participants.
Conduct and document interviews.
Analyze interviews for patterns and outliers.
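The slide leaves the analysis step at “patterns and outliers” without prescribing any tooling. As a purely illustrative sketch – assuming interview notes have already been hand-coded with theme tags; the tag names below are invented and are not part of the CIRTL protocol – a simple theme tally can surface both:

```python
from collections import Counter

# Hypothetical hand-coded themes per interview; tags are made up for illustration.
coded_interviews = {
    "fellow_01": ["study_design", "peer_feedback", "time_constraints"],
    "fellow_02": ["study_design", "student_diversity"],
    "fellow_03": ["peer_feedback", "study_design", "student_diversity"],
}

# Count how many interviews mention each theme (once per interview).
theme_counts = Counter(tag for tags in coded_interviews.values() for tag in set(tags))

# Themes mentioned by most participants suggest patterns; singletons are
# candidates for closer reading as outliers.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} of {len(coded_interviews)} interviews")
```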

14 Case Study: TAR Fellows ’09 & ’10
Teaching-as-Research – How did you plan your study? What data did you collect?
Learning Communities – How would you rate the helpfulness of sharing ideas with other TAR participants, and why?
Learning through Diversity – In what ways has participation in the TAR program caused you to consider a variety of learning strategies and differences in students’ learning?
Box of Chocolates – What would you consider the most valuable thing(s) you learned from your TAR project?

15 Pros and Cons
Pros:
 – Thoughtful responses
 – Unexpected findings
 – Stories to share
Cons:
 – Time, time, time
 – Small n
 – Tough choices about rigor
Image: “Sounds,” Elena Pérez Melgarejo, Flickr (CC)

16 Using rubrics to evaluate student learning
Don Gillian-Daniel, Associate Director, Delta Program in Research, Teaching & Learning, University of Wisconsin-Madison

17 Description of the method

18 How we used the strategy and our rationale
Rubrics are used to provide “standardized” criteria for evaluating Teaching-as-Research projects across both students and cohorts.
TAR project/Action Plan rubric categories include:
 – Introduction and Research Question
 – Literature Background and Support
 – Project Objectives
 – Evidence/Assessments
 – Project Approach
Students receive the rubric during the semester to facilitate project development.
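The slide lists the rubric categories but not how scores are recorded or combined. A minimal sketch – assuming, purely for illustration, a 1–4 scale per category and simple averaging across reviewers, neither of which is stated to be the Delta Program’s actual scheme – might look like this:

```python
# Rubric categories taken from the slide; the 1-4 scale and averaging rule
# are illustrative assumptions, not the Delta Program's documented scoring.
CATEGORIES = [
    "Introduction and Research Question",
    "Literature Background and Support",
    "Project Objectives",
    "Evidence/Assessments",
    "Project Approach",
]

def average_scores(reviews: list[dict[str, int]]) -> dict[str, float]:
    """Average each category's score across multiple reviewers."""
    return {cat: sum(r[cat] for r in reviews) / len(reviews) for cat in CATEGORIES}

# Two hypothetical reviewers scoring one TAR project.
reviewer_a = dict(zip(CATEGORIES, [3, 2, 4, 3, 3]))
reviewer_b = dict(zip(CATEGORIES, [4, 2, 3, 3, 4]))

for category, score in average_scores([reviewer_a, reviewer_b]).items():
    print(f"{category}: {score:.1f}")
```

Averaging per category (rather than a single overall score) keeps the comparison across students, cohorts, and reviewers that the slide emphasizes.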

19 The benefits and limitations of using this strategy
Benefits:
 – Allows for more straightforward comparison across student projects
 – Promotes consistency among multiple reviewers
 – Provides students with clear expectations during project development
 – Allows for cross-institution comparison of projects
Limitations:
 – Development can be time intensive (use someone else’s as a starting point!)
 – Iterative – use will promote review & revision of the rubric, so it will change with time

20 Description: Pre- & Post-Program Surveys
Michigan State University
For low-, medium-, and high-engagement programs.
Electronic program registration system – collects some demographic information (some overlap with the pre-survey).
Pre- and post-program surveys (paper) are used to evaluate whether cognitive and behavioral objectives of a specific program were met (open-ended & Likert-scale questions).
Pre-program surveys: Demographics – Who attended? Cognitive – What do they know? Behavioral – What do they do?
Post-program surveys: Cognitive – What do they know? Behavioral – What will they do?
IRB approval: Exempt

21 How did MSU use the strategy and why? Pre- & Post-Program Surveys
Michigan State University
For low- and medium-engagement programs, paper pre-program surveys are given to participants as they sign in (by a GS employee not associated with the program).
 – Tell participants why you are conducting surveys.
 – 1 page, front/back – keep it simple, keep it short!
 – Surveys are returned before the program starts; return rates are very high.
Post-program surveys (paper) are distributed at the end of the program.
 – 1 page, front/back – keep it simple, keep it short!
Some common questions appear on both the pre- and post-program surveys (Likert scale), and some common questions are shared among all programs.

22 Common Pre- & Post-Program Survey Questions (among all programs)
Michigan State University
Pre-:
 – Demographics – e.g., gender, age, ethnic/racial group, international or domestic, dept./college, degree or post-doc, years in graduate school
Post-:
 – “What 3 skills or ideas from this workshop do you think will be most useful to you?”
 – “How do you plan to use them?”
 – “What more would you like to know about this topic?”
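The slides do not specify how the common Likert items are compared. One simple illustration – assuming a 1–5 scale and matched pre/post responses from the same participants, which the slides do not state – is to look at the mean shift on each shared item:

```python
# Hypothetical matched responses to one common Likert item (1-5 scale);
# the data are invented for illustration only.
pre_scores  = [2, 3, 2, 4, 3, 2]
post_scores = [4, 4, 3, 5, 4, 3]

def mean(xs):
    return sum(xs) / len(xs)

shift = mean(post_scores) - mean(pre_scores)
print(f"Pre mean:  {mean(pre_scores):.2f}")
print(f"Post mean: {mean(post_scores):.2f}")
print(f"Mean shift on this item: {shift:+.2f}")
```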

23 Benefits, Limitations? Pre- & Post-Program Surveys
Michigan State University
Benefits:
 – Simple to implement, and participants complete them
 – Allows MSU to evaluate potential impacts of programs and who attended
 – Provides data to the GS for future planning, and to colleges and departments to improve graduate education
Limitations:
 – What are the long-term impacts of the short-term gains quantified with pre- and post-program surveys?
 – Do short-term gains following programs stimulate involvement in other professional development programs (i.e., thresholds)?

24 TAR Progress Reports at CU Boulder Laura L. B. Border, Director, Graduate Teacher Program

25 TAR Fellow Monthly Progress Reports
Description: a one-page form that tracks the following:
 – IRB training, submission, completion
 – Research progress, complications, and resources
 – Meetings with the TIGER Team and faculty mentor
 – Attendance at TIGER Teaches meetings
 – Attendance at other relevant meetings or workshops

26 Cross-Disciplinary Strategy
All TAR Fellows submitted monthly reports.
The form allowed the TIGER Team to track work progress; provide guidance across multiple disciplines as TAR Fellows moved through proposal writing, IRB submission and approval, research implementation, and analysis; and offer guidance on publication opportunities.
The form allows monitoring, problem identification and solutions, and follow-up to assure project completion.
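The slide describes the form’s monitoring role rather than any software. As a hedged sketch – assuming only a simple log of which months each Fellow has reported, with invented names and months – flagging Fellows who need follow-up could look like this:

```python
# Hypothetical submission log: which monthly reports each TAR Fellow has
# turned in so far. Names and months are invented for illustration.
MONTHS = ["Sep", "Oct", "Nov", "Dec"]
submissions = {
    "Fellow A": ["Sep", "Oct", "Nov", "Dec"],
    "Fellow B": ["Sep", "Nov"],
    "Fellow C": ["Sep", "Oct", "Nov"],
}

# Anyone missing a month's report is flagged, mirroring the form's role in
# problem identification and follow-up toward project completion.
for fellow, months in submissions.items():
    missing = [m for m in MONTHS if m not in months]
    status = "up to date" if not missing else f"follow up: missing {', '.join(missing)}"
    print(f"{fellow}: {status}")
```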

27 Outcomes
Benefits:
 – We were able to maintain continuous monitoring and communication with TAR Fellows.
 – 15 TAR Fellows completed reports and projects.
 – The form created discussion and contact.
Limitations:
 – None so far – we thought Fellows would complain, but they liked it.

28 Next Session
Part 2: Friday, September 27, 2013
Time: 11:00 a.m. ET / 10:00 a.m. CT / 9:00 a.m. MT / 8:00 a.m. PT
Please post your questions at http://www.cirtl.net/evaluationquestions (also reachable via the Discussion Forum in the Network Group).

29 Discussion Forum

