Replacing “Ready, Aim, Fire” with “Research, Inform, Action”

Today’s Session The Kansas City STEM Alliance and the Kansas City Area Education Research Consortium collaborate with schools and nonprofits to recruit and transition youth into STEM programs across the metropolitan area. Come explore how research with local students, schools, out-of-school programs, and community volunteers informs program decisions and illustrates a return on investment of the resources leveraged by businesses, K-12 schools, higher education, and foundations. Also participate in discussions about how what Kansas City is learning can be translated to inform efforts in your region.

Our Vision The vision of the KC STEM Alliance is a diverse, innovative, and sustainable STEM workforce. By developing an environment that leverages the strengths of educators, STEM organizations, and local industry, we can create a collaborative network to encourage and sustain STEM careers. Working together will allow this…

KC STEM Alliance Partners

Building on Our Success: More than 27 area school districts and higher education partners provide curriculum and professional development for teachers. The nationally recognized Project Lead the Way (PLTW) and US FIRST programs reach 15,000 students in the Kansas City metro area; since 2006, well over 20,000 students have participated in these two programs. Significant resources come from the private and public sectors and from school districts.

Our Strategy Ongoing program evaluation and data collection Increase participation in existing initiatives Seek out new partnerships and opportunities Raise awareness about STEM in Kansas City Secure support from area companies and organizations

Mission Statement Our shared goal is to improve P-20 education for all students in the Kansas City metropolitan area by providing powerful tools for data-driven educational research, evaluation, and implementation. Partners: schools and school districts, collaborating universities, leading community organizations, local education agencies, foundations, chambers and economic development entities, community and other colleges, and the state education departments of Kansas and Missouri.

Beyond Descriptive KC-AERC works with local education programs to help create measures of program effects on students: Traditional quantifiable measures – e.g., reading or math gains on pre- and post-tests Non-quantifiable skill measures – e.g., socio-emotional skills, attitudes toward science

Methods Discuss desired program goals and effects with program leaders Research appropriate instruments for analyzing effects on students Devise an appropriate statistical model – mixed quantitative and qualitative methods Collaborate with program leaders on the final design

Quantitative Methods Treatment and control groups to identify whether effects can be linked to the program – the “gold standard” for establishing causality, though it is often difficult to find and engage a control group Regression analysis to factor a variety of background characteristics into the results Pre- and post-tests to identify growth through the program (t-tests, ANOVA)
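The pre/post comparison above can be sketched with a paired t-test on per-student gains. A minimal stdlib-only sketch, assuming purely illustrative scores (this is not KC-AERC data):

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post test scores for the same ten students (illustrative only).
pre = [62, 70, 55, 80, 66, 73, 58, 90, 77, 64]
post = [68, 74, 60, 82, 65, 79, 63, 94, 80, 70]

# Paired t-test works on each student's gain: t = mean(d) / (sd(d) / sqrt(n)).
gains = [b - a for a, b in zip(pre, post)]
n = len(gains)
t_stat = mean(gains) / (stdev(gains) / math.sqrt(n))

print(f"mean gain: {mean(gains):.1f} points, t = {t_stat:.2f} (df = {n - 1})")
# → mean gain: 4.0 points, t = 5.72 (df = 9)
```

In practice a library routine such as `scipy.stats.ttest_rel` would also report the p-value; the point here is only that pairing each student's pre and post scores controls for individual baseline differences.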

Qualitative Methods Focus groups to gather thoughts from those closest to the subject area Quantitative methods to identify strong and weak areas of the program, followed by qualitative investigation of why some aspects work Interviews with leaders for in-depth details of program implementation

KC STEM Alliance Initial Research Questions What are the effects of STEM programs on high school achievement? What are the effects of PLTW, FIRST Robotics, and other STEM programs on post-secondary educational outcomes? Why do students participate or not participate in STEM activities? How much in-class variability is there in the implementation of PLTW, as measured by scores on the exit exams? How much of this variability can be explained by differences in students’ academic backgrounds (other math courses taken, grades earned, etc.)? Design Methodology
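The “how much variability is explained by background” question is an R² question. A minimal sketch on synthetic data, assuming a single invented background measure (prior math GPA) and invented exam scores:

```python
import random
import statistics

# Synthetic data for illustration only: exam score driven by GPA plus noise.
random.seed(0)
gpa = [round(random.uniform(2.0, 4.0), 2) for _ in range(50)]
exam = [40 + 12 * g + random.gauss(0, 5) for g in gpa]

# Simple least-squares fit, then R^2 = 1 - SS_res / SS_tot:
# the share of exam-score variance explained by the background measure.
mx, my = statistics.mean(gpa), statistics.mean(exam)
slope = sum((x - mx) * (y - my) for x, y in zip(gpa, exam)) / sum(
    (x - mx) ** 2 for x in gpa
)
intercept = my - slope * mx
ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(gpa, exam))
ss_tot = sum((y - my) ** 2 for y in exam)
r_squared = 1 - ss_res / ss_tot

print(f"R^2 = {r_squared:.2f}")
```

With real data the regression would include several background characteristics (courses taken, grades earned, etc.); the leftover, unexplained variance is what could plausibly be attributed to differences in PLTW implementation.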

What did we learn in the first six months? Lower participation rates among minorities (compared to the overall school population – FIRST) Low female participation in STEM engineering programs, but high female participation in biomedical programs Higher participation of females in urban districts Urban programs are less robust in enrollment, participation, funding, and support

What issues were raised by these pilots? Process data for program improvement: –Comprehensive program implementation with school districts –Definitions of program participation –Understanding within-group variation in the number of PLTW courses completed –Understanding FIRST Robotics participation systematically –PLTW course outcome data Outcome data for program impact: –High school data formats and availability –Data on college outcomes

Logic Model for KC STEM Alliance Evaluation

Highlights of FIRST Robotics Surveys –In FIRST Robotics, the majority of participants are White and male. –The majority have lived in the United States their whole lives, come from households where English is spoken, and have highly educated parents (i.e., college degree or beyond) who in turn hold high expectations for their children. –Slightly more than half of the students surveyed (54%) were participating in FLL for the first time.

Highlights of FIRST Robotics Surveys FIRST coaches reported: 72% participated in a FIRST competition; of those, 44% participated in more than one competition. The number of mentors per team ranged from 0 to 15, with 24% of coaches reporting their teams did not have even one mentor. Every coach whose team had mentors reported that the mentors helped with “mechanical component design”; fewer than half of coaches reported that their mentors helped with marketing, business plans, computer applications, or website development.

Highlights of FIRST Robotics Surveys FIRST volunteers/mentors (N = 77) reported: The majority of event volunteers were White and male. Almost all volunteers lived in the Kansas City metropolitan area. The majority were employed full-time and represented a variety of occupations. When asked about their satisfaction with their work in the FIRST Robotics program, responses were generally very positive; however, responses were less positive on questions about the perceived effectiveness of the information provided about the volunteer job.

KC 2012 FIRST Robotics Mentors

Think with a partner: What is the difference between “EVALUATION” and “ASSESSMENT”?

Typically, “EVALUATION” = program level and “ASSESSMENT” = individual level. TYPES of DATA: Participant level (interest, engagement, achievement) Program level (quality activities, professional development) Systems level (policy changes)

How can you make this happen for your organization?

Research, Inform, Action: a two-way street. From evaluation planning to implementation to utilization: –Step 1: Program Purpose –Step 2: Data –Step 3: Plan to Obtain Data –Step 4: Communication/Share Data

Step 1: Document the nature and purpose of the program What is the program? What is the need? How is the program meeting the need? What are the goals of the program in terms of measurable outcomes? –Short-term –Long-term Note: Increases in knowledge, skills, and attitudes are laudable goals.

Research, Inform, Action Inputs → Outputs → Measurable Outcomes/Goals → Evidence/Possible Sources of Evidence (Data)
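One lightweight way to keep these four components aligned is to record the logic model as plain data so it can be checked and reported on programmatically. A sketch only; every entry below is hypothetical and not drawn from the KC STEM Alliance’s actual model:

```python
# Hypothetical logic model expressed as plain data (all entries are invented
# examples, not the KC STEM Alliance's real inputs/outputs/outcomes).
logic_model = {
    "inputs": ["volunteer mentors", "PLTW curriculum", "industry funding"],
    "outputs": ["robotics teams fielded", "students completing PLTW courses"],
    "outcomes": {
        "short_term": ["increased STEM interest"],
        "long_term": ["STEM degree enrollment"],
    },
    "evidence": ["attendance records", "pre/post surveys", "transcript data"],
}

# Walking the structure makes it easy to confirm every stated outcome
# has at least one planned source of evidence.
all_outcomes = logic_model["outcomes"]["short_term"] + logic_model["outcomes"]["long_term"]
for outcome in all_outcomes:
    print(outcome, "-> evidence:", ", ".join(logic_model["evidence"]))
```

Kept in a shared file, a structure like this doubles as the checklist for Steps 2 and 3 below: each outcome without an evidence source is a gap in the data plan.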

How can you make this happen for your organization?

Step 2: Identify data needed to measure progress against goals Choose just the important data; avoid collecting data you will not use Look for data among the people you directly serve Consider: –Who, when, and how often participants attend –What participants know and can do Note: Evidence of learning is more valuable than what participants say they learned.

Step 3: Develop a plan to obtain the data with current resources Brainstorm with the team: –What constitutes success? –What evidence can be collected? –What notes/data can frontline staff record? –What data do you need assistance with gathering and understanding? Revisit goals to see if they capture everything that is important.

Step 4: Share how the data will be used Provide feedback to stakeholders about how to improve what we do Tailor information to different audiences: –Current and future funders –Schools/school districts –Parents/students/families –Participants

Research, Inform, Action: a two-way street. From evaluation planning to implementation to utilization: –Step 1: Program Purpose –Step 2: Data –Step 3: Plan to Obtain Data –Step 4: Communication/Share Data

Replacing “Ready, Aim, Fire” with “Research, Inform, Action” Dr. Leigh Anne Taylor Knight, Executive Director, KC-AERC Laura Loyacono, Director, KC STEM Alliance