Quality Assessment: Informing Practice
July 31, 2006

By the end of this session, we will...
– Participate, learn, and have fun!
– Answer:
  – Why is it important to ask?
  – How do we "inform our practice" through four stages?
  – Will assessing make a difference?
  – Do I have the skills to begin successfully and feel good about what I am doing?

Why Bother?
If you always do...
Our need to know:
– How are we doing?
– How well are our students doing?
– How do we know?
– Have we made a difference?
– Have we met our goals?
Answering in a systematic way gives us credibility.

Why Bother? Asking:
– Provides "informed answers": we can speak knowledgeably to the question, "How do you know?"
– Demonstrates that we are serving our students:
  – Interested in knowing whether we delivered what we promised, which means listening
  – Gathering evidence to use for improvement
– Contributes to our own learning

Natural Process
Asking after an event occurs:
– What was good about it?
– What was not so good?
– What will we do differently next time? Why?

Guiding Principles
– There is no single "right answer"
– A process of learning together
– A value-added process through synergy

Informing Practice: Assessment Cycle
Grounded in the mission statement and strategic goals:
1. Setting Measurable Goals
2. Planning to Reach Goals
3. Data Collection
4. Data Analysis, Reporting, and Action

1. Setting Measurable Goals
Planning questions, each paired with an example goal:
– What do we want our students to be able to know and do?
  Example: Students will participate in an effective experience that develops their interpersonal and leadership skills.
– What are observable and measurable outcomes (behaviors to track) that will let us know what our students know and can do?
  Example: Students will be able to rate the two aspects of the experience and demonstrate an example of applying their skills of collaboration and evaluating their leadership actions.
– What tasks will students engage in that demonstrate what we expect of them?
  Example: Students will engage in experiences with two aspects where they will apply their skills of collaboration and evaluate their leadership actions.
– What tool is used to measure the indicator?
  Example: Tools may be a survey, interview, observation checklist, etc., based on the outcomes.

2. Planning to Reach Goals
Advice from "the expert":
– Review goals
– Design the data collection
– Plan the data analysis, reporting, and action
– Set future goals

3. Data Collection
Planning for success:
– Purpose
– Process: Who? What? How? When?
– Lessons learned

4. Data Analysis, Reporting, and Action
– Results: analyze the data to learn what was said
– Report and communicate to "close the loop"
– Make action plans for the future
This enables data-driven decision making and completes the cycle.

3. Data Collection (review)
Planning for success:
– Purpose
– Process: Who? What? How? When?
– Lessons learned

Purpose
Clearly write: "The purpose of our survey is to determine / discover..."
State:
– What do we want to know?
– What will we do with the information? How will we use the assessment results for improvement?
The purpose statement informs our data collection design and becomes the letter to participants.

Process – Who?
Who do we ask?
– As researchers, we cannot assume that we know what everyone is thinking.
– Ask the people who are able to answer the questions from their perspective.

Process – What?
What do we ask?
– Review the purpose statement
Pilot:
– Do the questions work?
– What information will they give us?
– Will the information inform our decision making?
– Be cognizant of the participants' time

Process – How?
How do we ask to get information?
– Open-ended question format
– Closed-ended question format:
  – Likert (feelings / attitude / opinion) scale of 1 to 5, 1 to 7, 1 to 4, or others
  – Yes / No answers
– Paper or electronic
– Focus groups: use scripts, a recorder or cross-check, and a skilled interviewer
Consider the need for:
– A consent letter or Institutional Review Board (IRB) approval?
– Anonymous or confidential responses?
(One way to represent these question formats is sketched below.)
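To make the two question formats concrete, here is a minimal sketch of how open- and closed-ended items might be represented for later analysis. The class names and fields are illustrative assumptions, not part of the presentation.

```python
# A minimal sketch (hypothetical structure, not from the presentation) of
# open- and closed-ended survey questions, ready to collect coded responses.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LikertQuestion:
    """Closed-ended item on a 1..5 agreement scale."""
    text: str
    scale: Tuple[str, ...] = ("Strongly disagree", "Disagree", "Neutral",
                              "Agree", "Strongly agree")
    responses: List[int] = field(default_factory=list)  # coded 1..5

@dataclass
class OpenQuestion:
    """Open-ended item collecting free text."""
    text: str
    responses: List[str] = field(default_factory=list)

survey = [
    LikertQuestion("The experience developed my collaboration skills."),
    OpenQuestion("What would you do differently next time, and why?"),
]
survey[0].responses.append(4)
survey[1].responses.append("Allow more time for group reflection.")
```

Keeping closed-ended answers as integer codes makes the later analysis step (tallies, means, correlations) straightforward, while open-ended text is kept separate for qualitative review.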

Process – When?
When do we ask?
– Immediately, or risk the "time heals" syndrome
– Later, to benefit from reflection
Check Survey Central:
– Has it been asked before?
– Avoid "survey fatigue"

Lessons Learned (overview)
– Critique survey examples
– Analyze the response population
– Letter to participants
– Pilot
– Scales and ratings

Lessons Learned: Critique survey examples

Lessons Learned: Response population analyzed
– What is a good response rate?
– Sample size calculator (a worked example follows below)
– Population profile
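What counts as a good response rate depends on how precise the results need to be. Below is a minimal sketch of the arithmetic behind typical online sample size calculators (Cochran's formula with a finite-population correction); the population figure and defaults are invented for illustration.

```python
# A minimal sketch of the standard sample-size calculation: Cochran's formula
# for an infinite population, then a finite-population correction.
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Responses needed for a given population size.
    z = 1.96 corresponds to 95% confidence; p = 0.5 is the most
    conservative assumed proportion; margin is the error tolerated."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)           # finite correction
    return math.ceil(n)

print(sample_size(500))   # -> 218 responses needed from a population of 500
```

The example shows why small populations still need high response rates: 218 of 500 is over 40%, so the letter of invitation and follow-up reminders matter.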

Lessons Learned: Letter to Participants
– Content
– Message

Lessons Learned: Write an excellent letter of invitation to participate
– Identify yourself and explain why the survey is happening
– Say what will be done with the results
– Note changes made in the past
– State that "data will be reported in aggregate form only"
– Incentives? Signatory?
– A personal connection makes a difference
Remember: be empathetic.

Lessons Learned: Pilot
– Is the wording clear?
– Do the questions "work"?
  – What information will they give us?
  – Is the information meaningful?
– Be cognizant of the participants' time

Lessons Learned: Likert Scale
– 5-point scale: responses cluster at the mid-point (a sketch of spotting this follows below)
– 7-point or higher scale: many choices
– 4-point scale: forces an opinion
– Define each level
– When a rating is given, ask why:
  – Rating was low: the "why" captures what went wrong, not only negative responses.
  – Rating was high: the "why" shows how to replicate the good.
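As one illustration of the mid-point cluster on a 5-point scale, here is a minimal sketch that tallies a single Likert item and flags heavy use of the neutral mid-point. The response data and the 40% threshold are invented assumptions.

```python
# A minimal sketch: tally one 5-point Likert item as a text histogram and
# flag mid-point clustering. Data and threshold are invented for illustration.
from collections import Counter

responses = [3, 4, 3, 3, 5, 3, 2, 3, 4, 3]   # coded 1 (low) .. 5 (high)
counts = Counter(responses)

for level in range(1, 6):                     # simple text histogram
    print(f"{level}: {'#' * counts.get(level, 0)}")

midpoint_share = counts.get(3, 0) / len(responses)
if midpoint_share > 0.40:
    print(f"{midpoint_share:.0%} chose the mid-point; consider a 4-point "
          "(forced-opinion) scale or a follow-up 'why' question.")
```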

Lessons Learned: Constructing questions
– Keep them short and clear; avoid misinterpretations.
– Consider pairing statements to check reliability and validity (see the sketch below).
– Avoid "and"; this limits each statement to a single issue.
– Pilot.
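One way to act on the pairing advice: word the same idea twice, once reversed, reverse-code the second item, and check that answers agree. The sketch below is a hypothetical consistency check with invented data; it uses statistics.correlation from the standard library (Python 3.10+).

```python
# A minimal sketch of a paired-statement consistency check. Data are
# invented; statistics.correlation requires Python 3.10+.
from statistics import correlation

item_a = [5, 4, 2, 5, 3, 4]            # "The session was valuable."
item_b_raw = [1, 2, 4, 1, 3, 2]        # "The session was not valuable."
item_b = [6 - r for r in item_b_raw]   # reverse-code on a 1..5 scale

r = correlation(item_a, item_b)
print(f"r = {r:.2f}")                  # near 1.0 suggests consistent answers
```

A low correlation between paired items suggests respondents misread one of the statements, which is exactly the kind of problem a pilot should surface.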

Will assessing make a difference?
Data have contributed to our need to know...
– We are…
– Our students are…
– We know…
– We made a difference in these ways…
– The goals we met are…
We have the evidence!

Informing Practice: Quality Assessment
July 31, 2006
Written and produced by Halyna Kornuta