CIRTL Network Meeting (www.cirtl.net/networkgroup)
September 19, 2013, 11:00-12:30 ET / 10:00-11:30 CT / 9:00-10:30 MT / 8:00-9:30 PT
This meeting will be recorded.


CIRTL Network Meeting
September 19, 2013, 11:00-12:30 ET / 10:00-11:30 CT / 9:00-10:30 MT / 8:00-9:30 PT
This meeting will be recorded.
Begin by running the Audio Setup Wizard: select Tools > Audio > Audio Setup Wizard, or select the audio wizard icon.
Backup Call-in: Call-in Number: 1 (571) Passcode: #

Agenda
Announcements: Blackboard Collaborate phone & room changes
Topic 1: Planning Local Program Evaluations (Part 1) (Ann Austin; with Derek Bruff, Don Gillian-Daniel, Rique Campa, Laura Border)
Part 2: Friday, September 27, 2013, 11:00 ET / 10:00 CT / 9:00 MT / 8:00 PT

Changes to Blackboard Collaborate
Each event series has its own room, so there are no worries about scheduling.
Each room has its own phone bridge, so participants can initiate the call on their own.
The CIRTL Events page links each event to its associated room.
What we call Room 1 and Room 2 will go away at the end of September. We will update the meeting agenda pages to link to the new “CIRTL Admin & Network Meetings” room.
Next week’s Evaluation session will stay in BBC2 – Room 2 (the room we are in today).

Overview of the Planning Process for Evaluating CIRTL Institution-Level Programs and Outcomes
Today’s workshop:
– Overview of the planning process for evaluating institution-level CIRTL programs
– Examples of approaches to institution-level evaluations
– Next steps: teleconference on Friday, September 27; Network meeting on October 10-11

Overview of Evaluation Plans
History: the start of a collaborative process for evaluation across the Network.
Guiding question: What is the impact on participants of involvement in CIRTL programs?
Philosophical approach:
– Alignment
– Collaboration

[Diagram: alignment of CIRTL Outcomes, Programs, and Evaluation]

Developing Evaluation Plans: Network Discussions
In the coming months, beginning with the Network meeting, we will be involved in Network discussions on:
– Clarifying and refining the CIRTL outcomes
– Program outcomes: developing a common instrument to gather basic data about the outcomes of CIRTL institution-level programs
– Impact on participants: developing a common instrument focused on the learning experiences of participants at different levels of the CIRTL outcomes (starting with those who do TAR projects)
– Focused institution-level evaluations: developing plans for local institutional program evaluation

Today’s Focus
Examples of evaluation approaches used at the local institution level:
– Interviews: Derek Bruff (Vanderbilt)
– Rubrics: Don Gillian-Daniel (UW-Madison)
– Pre/post surveys: Rique Campa (MSU)
– Monthly monitoring: Laura Border (Colorado)

Interviews for Assessment Derek Bruff, Vanderbilt University

Methodology
1. Draft interview questions.
2. Select interview participants.
3. Conduct and document interviews.
4. Analyze interviews for patterns and outliers (a toy sketch of this step follows).
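To make the last step concrete, here is a toy Python sketch of tallying hand-assigned codes across transcripts to surface common patterns and rare outliers. The codes and transcripts are invented for illustration; they are not Vanderbilt's data, and the actual qualitative coding is done by a human reader.

```python
# Toy sketch of "analyze interviews for patterns, outliers" (illustrative only).
# Assumes each transcript has already been hand-coded with theme labels;
# the labels and data below are invented, not Vanderbilt's.
from collections import Counter

coded_transcripts = [
    ["planning", "data-collection", "peer-feedback"],
    ["peer-feedback", "student-diversity"],
    ["planning", "peer-feedback"],
]

theme_counts = Counter(code for transcript in coded_transcripts for code in transcript)

# Frequent themes suggest patterns; codes that appear once are candidate outliers.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```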

Case Study: TAR Fellows ’09 & ’10
Teaching-as-Research: How did you plan your study? What data did you collect?
Learning Communities: How would you rate the helpfulness of sharing ideas with other TAR participants, and why?
Learning through Diversity: In what ways has participation in the TAR program caused you to consider a variety of learning strategies and differences in students’ learning?
Box of Chocolates: What would you consider the most valuable thing(s) you learned from your TAR project?

Pros and Cons
Pros:
– Thoughtful responses
– Unexpected findings
– Stories to share
Cons:
– Time, time, time
– Small n
– Tough choices about rigor
Image: “Sounds,” Elena Pérez Melgarejo, Flickr (CC)

Using rubrics to evaluate student learning
Don Gillian-Daniel, Associate Director, Delta Program in Research, Teaching & Learning, University of Wisconsin-Madison

Description of the method

How we used the strategy and our rationale
Rubrics are used to provide “standardized” criteria for evaluating Teaching-as-Research projects across both students and cohorts.
TAR project/Action Plan rubric categories include:
– Introduction and Research Question
– Literature Background and Support
– Project Objectives
– Evidence/Assessments
– Project Approach
Students receive the rubric during the semester to facilitate project development. (A scoring sketch follows this list.)
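As a minimal sketch of how such a rubric might be scored, the Python below averages each category across reviewers, in line with the consistency and comparison goals described here. The category names come from the slide; the 1-4 scale, the reviewer data, and the aggregation are illustrative assumptions, not Delta's actual instrument.

```python
# Minimal rubric-scoring sketch. Category names are from the slide; the
# 1-4 scale and the sample scores are assumptions for illustration only.
from statistics import mean

CATEGORIES = [
    "Introduction and Research Question",
    "Literature Background and Support",
    "Project Objectives",
    "Evidence/Assessments",
    "Project Approach",
]

def aggregate_scores(reviews):
    """Average each rubric category across reviewers (scores assumed 1-4)."""
    return {cat: mean(review[cat] for review in reviews) for cat in CATEGORIES}

# Two hypothetical reviewers scoring one student's TAR project.
reviewer_a = dict(zip(CATEGORIES, [3, 2, 4, 3, 3]))
reviewer_b = dict(zip(CATEGORIES, [3, 3, 4, 2, 3]))

for cat, score in aggregate_scores([reviewer_a, reviewer_b]).items():
    print(f"{cat}: {score:.1f}")
```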

The benefits and limitations of using this strategy
Benefits:
– Allows for more straightforward comparison across student projects
– Promotes consistency among multiple reviewers
– Provides students with clear expectations during project development
– Allows for cross-institution comparison of projects
Limitations:
– Development can be time intensive (use someone else’s as a starting point!)
– Iterative: use will promote review and revision of the rubric, so it will change with time

Description: Pre- & Post-Program Surveys (Michigan State University)
Used for low-, medium-, and high-engagement programs.
An electronic program registration system collects some demographic information (with some overlap with the pre-survey).
Pre- and post-program surveys (paper) evaluate whether the cognitive and behavioral objectives of a specific program were met (open-ended & Likert-scale questions).
Pre-program surveys: Demographics (who attended?), Cognitive (what do they know?), Behavioral (what do they do?)
Post-program surveys: Cognitive (what do they know?), Behavioral (what will they do?)
IRB approval: Exempt
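As a toy illustration of how pre/post Likert responses can be compared, the Python sketch below computes the mean shift on one item. The item wording, the 1-5 scale, and the numbers are invented assumptions; MSU's actual instruments are paper surveys with their own items and scales.

```python
# Illustrative pre/post Likert comparison. The item, the 1-5 scale, and
# the responses are invented; they do not come from MSU's surveys.
from statistics import mean

item = "I can state measurable learning objectives"
pre_responses = [2, 3, 2, 4, 3]    # hypothetical pre-program ratings
post_responses = [4, 4, 3, 5, 4]   # hypothetical post-program ratings

pre_mean, post_mean = mean(pre_responses), mean(post_responses)
print(item)
print(f"  pre = {pre_mean:.2f}, post = {post_mean:.2f}, shift = {post_mean - pre_mean:+.2f}")
```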

How did MSU use the strategy and why? Pre- & Post-Program Surveys
For low- and medium-engagement programs, paper pre-program surveys are given to participants as they sign in (by a GS employee not associated with the program).
– Tell participants why you are conducting the surveys.
– One page, front/back: keep it simple, keep it short!
– Surveys are returned before the program starts; return rates are very high.
Post-program surveys (paper) are distributed at the end of the program.
– One page, front/back: keep it simple, keep it short!
Some questions are common to the pre- and post-program surveys (Likert scale), and some questions are common to all programs.

Common Pre- & Post-Program Survey Questions (among all programs)
Pre: Demographics, e.g., gender, age, ethnic/racial group, international or domestic, dept./college, degree or post-doc, years in graduate school.
Post: “What 3 skills or ideas from this workshop do you think will be most useful to you?” “How do you plan to use them?” “What more would you like to know about this topic?”

Benefits and Limitations: Pre- & Post-Program Surveys
Benefits:
– Simple to implement, and participants complete them
– Allows MSU to evaluate potential impacts of programs and who attended
– Provides data to the GS for future planning, and to colleges and departments to improve graduate education
Limitations:
– What are the long-term impacts of the short-term gains quantified with pre- and post-program surveys?
– Do short-term gains following programs stimulate involvement in other professional development programs (i.e., thresholds)?

TAR Progress Reports at CU Boulder Laura L. B. Border, Director, Graduate Teacher Program

TAR Fellow Monthly Progress Reports
Description: a one-page form that tracks the following (a hypothetical structured version appears after this list):
– IRB training, submission, completion
– Research progress, complications, and resources
– Meetings with the TIGER Team and faculty mentor
– Attendance at TIGER Teaches meetings
– Attendance at other relevant meetings or workshops
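Purely as an illustration, the form's fields could be captured as a structured record like the Python sketch below. The field names paraphrase the slide; the types, status values, and sample data are assumptions, not CU Boulder's actual form.

```python
# Hypothetical record mirroring the one-page monthly form described above.
# Field names paraphrase the slide; types and sample values are assumed.
from dataclasses import dataclass, field

@dataclass
class MonthlyProgressReport:
    fellow: str
    month: str
    irb_status: str                 # e.g., "training", "submitted", "approved"
    research_progress: str          # progress, complications, resources needed
    mentor_meetings: int            # meetings with the TIGER Team / faculty mentor
    attended_tiger_teaches: bool
    other_workshops: list[str] = field(default_factory=list)

report = MonthlyProgressReport(
    fellow="A. Fellow",
    month="2013-10",
    irb_status="submitted",
    research_progress="Pilot survey drafted; awaiting IRB approval",
    mentor_meetings=2,
    attended_tiger_teaches=True,
    other_workshops=["Assessment workshop"],
)
print(report)
```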

Cross-Disciplinary Strategy
All TAR Fellows submitted monthly reports.
The form allowed the TIGER Team to track work progress; to provide guidance across multiple disciplines as TAR Fellows moved through proposal writing, IRB submission and approval, research implementation, and analysis; and to offer guidance on publication opportunities.
The form also supports monitoring, problem identification and solutions, and follow-up to ensure project completion.

Outcomes
Benefits:
– We maintained continuous monitoring of, and communication with, TAR Fellows.
– 15 TAR Fellows completed reports and projects.
– The form created discussion and contact.
Limitations: none observed. We thought the Fellows would complain, but they liked it.

Next Session
Part 2: Friday, September 27, 2013, 11:00 ET / 10:00 CT / 9:00 MT / 8:00 PT
Please post your questions at the URL below. (It is also reachable via the Discussion Forum in the Network Group.)

Discussion Forum