Performance Measures for PHCAST Demonstration Projects
Host: Don Pratt
Monday, April 11, 2011, 9:30 – 10:30 AM EDT

Presentation transcript:

Slide 1: Performance Measures for PHCAST Demonstration Projects
Host: Don Pratt
Monday, April 11, 2011, 9:30 – 10:30 AM EDT

Slide 2: Agenda
1. Performance measurement involves...
2. Performance measure examples
3. Anatomy of a performance measure
4. Additional considerations
5. Group discussion: What can you track for your project?
Purpose of today's session: Share information and ideas about what you are already measuring (and what you may be thinking of measuring) to assess the performance of your project. This session is an opportunity to share in the process of identifying some common performance measures.

Slide 3: Performance measurement involves...
- Tracking work completed (activities/outputs)
- Measuring outcomes (usually "soft" and short-term measures)
- Routine/cyclical data collection
- Comparing observed results to performance targets
- Using results to improve program management and outcomes
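To make the last two bullets concrete, here is a minimal sketch of comparing observed results against performance targets. The measure names and numbers are invented for illustration and are not PHCAST data or requirements.

```python
# Minimal sketch: comparing observed results to performance targets.
# All measure names and values below are hypothetical examples.

targets = {
    "participants_trained": 100,      # output target (count)
    "pct_training_met_needs": 80.0,   # outcome target (percent of trainees)
}

observed = {
    "participants_trained": 87,
    "pct_training_met_needs": 84.5,
}

for measure, target in targets.items():
    actual = observed.get(measure)
    if actual is None:
        print(f"{measure}: no data collected yet")
    else:
        status = "target met" if actual >= target else "below target"
        print(f"{measure}: observed {actual} vs. target {target} ({status})")
```

A routine comparison like this, repeated at each data-collection cycle, is what turns raw counts into information a project can act on.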

Slide 4: Performance Measure Examples
- Output: Develop new training curriculum aligned with competency models (end of year one).
- Output: Train X participants using new curriculum (middle of year two).
- Output: Collect feedback from participants and instructors to assess implementation (middle of year two).
- Outcome: X percent of trainees report training met their learning needs (end of year two).

Slide 5: Anatomy of a Performance Measure
Output: Train X participants using new curriculum (middle of year two).
- What will occur? Participants will be trained using the new curriculum.
- How much will be accomplished? X participants will be trained.
- When will it occur? By the middle of year two.
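The three parts named on this slide (what, how much, when) can be captured as a simple record. The sketch below is one hypothetical way to do so; the field names are ours, and since the slide leaves the target as "X", the value 100 is purely illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerformanceMeasure:
    # The three parts of a measure named on the slide.
    what: str                       # what will occur
    how_much: int                   # how much will be accomplished (the target)
    when: str                       # when it will occur
    observed: Optional[int] = None  # filled in once data are collected

    def on_track(self) -> Optional[bool]:
        """True/False once results are observed; None before any data exist."""
        if self.observed is None:
            return None
        return self.observed >= self.how_much

# The output measure from the slide; 100 stands in for the slide's "X".
train_participants = PerformanceMeasure(
    what="Participants trained using the new curriculum",
    how_much=100,
    when="Middle of year two",
)
print(train_participants.on_track())  # None until observed counts are recorded
```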

Slide 6: Additional Considerations
- What schedule will you follow for data collection, and who are the partners in this process? (planning and logistics)
- Who will provide the data? (data source)
- What evidence will you rely on to know if results occurred? (indicators)
- What instruments will you use to collect the data? (tools)
- How will you ensure data quality? (reliability, validity)
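One way to keep these considerations together for each measure is a short data-collection plan. The sketch below is a hypothetical example; every field name and value is invented for illustration.

```python
# Minimal sketch: a data-collection plan for a single measure, mirroring the
# considerations on the slide. Every value below is a hypothetical example.

collection_plan = {
    "measure": "Percent of trainees reporting the training met their learning needs",
    "schedule": "End of each training cohort",                     # planning and logistics
    "partners": ["Training sites", "Evaluation staff"],            # who is involved
    "data_source": "Trainee post-training survey",                 # who provides the data
    "indicator": "Percent answering 'agree' or 'strongly agree'",  # evidence that results occurred
    "tool": "Standard participant feedback form",                  # instrument
    "quality_checks": ["Pilot the survey items", "Monitor response rates"],  # reliability, validity
}

for field, value in collection_plan.items():
    print(f"{field}: {value}")
```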

Slide 7: What results do you currently track in the areas of curriculum, certification, and trainees? How do you track these results? Are there other results you don't currently track but would like to track?

Slide 8: How do you envision using this information to improve your program?

Slide 9: What are some areas you think would be best addressed through more formal evaluation efforts (and why)?