DISSERTATION DEFENSE Title: “Implementing the standard-based assessment: Developing and validating a set of laboratory tasks in high school biology”.

Presentation transcript:

DISSERTATION DEFENSE Title: “Implementing the standard-based assessment: Developing and validating a set of laboratory tasks in high school biology” by Gouranga Saha, State University of New York at Buffalo, U.S.A.

SCIENCE ASSESSMENT STANDARDS Educational reform documents identify understanding of important relationships, processes, mechanisms, and applications of concepts as critical science learning outcomes. Assessment standards emphasize that all assessments should align closely with these intended science learning goals.

LITERATURE REVIEW • Science Education Reform Efforts • Constructivist Paradigm • Impact on Assessment • Performance-based Assessment Tasks • Authentic Laboratory Practical Tasks • Assessment of Biology Learning Outcomes

RESEARCH QUESTIONS How can laboratory-based performance tasks be designed and developed to ensure that they are doable by all students for whom they are intended? Do student responses to these tasks validly represent the intended process skills that the new biology learning standards expect students to acquire? And are these tasks psychometrically sound, both as individual tasks and as a set?

METHODOLOGY • Designing the tasks • Developing the tasks • Sampling the subjects • Collecting data • Analyzing data

Designing Tasks • Pooling Tasks • Brainstorming • Modifying existing tasks

Developing Tasks • Trial Testing

Trial Testing • Micro Testing • Mini Testing • Pilot Testing • Field Testing

PIPELINE ANALOGY

SCORING RUBRIC • From subjective to objective • Holistic
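A minimal sketch of how a holistic rubric can be made more objective by tying each score point to an explicit descriptor; the 0–4 scale, descriptor wording, and function name below are illustrative assumptions, not the rubric actually used in the study.

    # Hypothetical holistic rubric: each score point carries an explicit descriptor,
    # so different raters applying the same descriptors should converge on the same score.
    HOLISTIC_RUBRIC = {
        4: "Complete, accurate procedure; data recorded and interpreted correctly",
        3: "Mostly correct procedure; minor gaps in data recording or interpretation",
        2: "Partial procedure; data recorded but interpretation weak or missing",
        1: "Attempted the task; procedure and data largely incorrect",
        0: "No meaningful response",
    }

    def describe_score(rater_judgment: int) -> str:
        """Map a rater's holistic judgment (0-4) to its descriptor for auditing."""
        if rater_judgment not in HOLISTIC_RUBRIC:
            raise ValueError("Score must be an integer from 0 to 4")
        return HOLISTIC_RUBRIC[rater_judgment]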

ADDRESSING THE STANDARDS • Content standards • Process standards

TASK AS AN ASSESSMENT INSTRUMENT • Nature • Items to tap science process skills

Sampling the Subjects • Randomization Process
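A minimal sketch of one way the randomization step could be carried out in code; the roster, sample size, group split, and seed below are hypothetical, not the sampling frame used in the study.

    import random

    # Hypothetical roster of eligible biology students (names are placeholders).
    roster = [f"student_{i:03d}" for i in range(1, 121)]

    random.seed(42)                     # fixed seed so the draw can be reproduced
    sample = random.sample(roster, 30)  # simple random sample of 30 students

    # Randomly assign the sampled students to the two task forms being trial-tested.
    random.shuffle(sample)
    form_a, form_b = sample[:15], sample[15:]
    print(len(form_a), len(form_b))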

Collecting Data • Conducting the Tests • Scoring Student Responses

Analyzing Data • Collating the raw data • Organizing the data • Analysis
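A minimal sketch of the collating/organizing step, assuming scored responses arrive as rows of (student, task, item, rater, score); the file name and column order are hypothetical.

    import csv
    from collections import defaultdict

    # Organize raw score rows into a nested lookup:
    # scores[student][task][item][rater] = score
    scores = defaultdict(lambda: defaultdict(lambda: defaultdict(dict)))

    with open("scored_responses.csv", newline="") as f:  # hypothetical file
        for student, task, item, rater, score in csv.reader(f):
            scores[student][task][item][rater] = int(score)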

Percent Agreement • Coefficient ‘r’ • Item-wise analysis • ‘r’ across tasks and across items of each task • Convergent and divergent evidence • ‘r’ across items of the same skill category
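A minimal sketch of the two inter-rater statistics named above, computed in plain Python; the example score vectors are invented for illustration.

    from math import sqrt

    def percent_agreement(rater1, rater2):
        """Share of responses on which the two raters gave exactly the same score."""
        matches = sum(a == b for a, b in zip(rater1, rater2))
        return 100.0 * matches / len(rater1)

    def pearson_r(x, y):
        """Pearson correlation between two raters' scores (or two items/tasks)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    rater1 = [3, 4, 2, 4, 3, 1, 2, 4]   # invented holistic scores
    rater2 = [3, 4, 3, 4, 3, 1, 2, 3]
    print(percent_agreement(rater1, rater2), round(pearson_r(rater1, rater2), 3))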

Analysis (contd.) • Application of G-Theory • Differential item analysis • Skill-category ANOVA • Interpretive analysis
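A minimal sketch of a single-facet (persons × raters) G-study: variance components are estimated from the two-way ANOVA mean squares and combined into a generalizability coefficient. The score matrix is invented, and this is only one simple G-theory design, not necessarily the one used in the study.

    def g_study(scores):
        """scores[p][r]: persons x raters score matrix (fully crossed, no replication)."""
        n_p, n_r = len(scores), len(scores[0])
        grand = sum(map(sum, scores)) / (n_p * n_r)
        row_means = [sum(row) / n_r for row in scores]
        col_means = [sum(scores[p][r] for p in range(n_p)) / n_p for r in range(n_r)]

        ss_total = sum((scores[p][r] - grand) ** 2 for p in range(n_p) for r in range(n_r))
        ss_p = n_r * sum((m - grand) ** 2 for m in row_means)
        ss_r = n_p * sum((m - grand) ** 2 for m in col_means)
        ss_res = ss_total - ss_p - ss_r

        ms_p = ss_p / (n_p - 1)
        ms_r = ss_r / (n_r - 1)
        ms_res = ss_res / ((n_p - 1) * (n_r - 1))

        var_res = ms_res                          # person x rater interaction + error
        var_p = (ms_p - ms_res) / n_r             # true-score (person) variance
        var_r = (ms_r - ms_res) / n_p             # rater (facet) variance
        g_coef = var_p / (var_p + var_res / n_r)  # relative generalizability coefficient
        return var_p, var_r, var_res, g_coef

    matrix = [[4, 3], [2, 2], [3, 3], [4, 4], [1, 2]]  # invented: 5 students x 2 raters
    print(g_study(matrix))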

Summary • Gowin’s V-Diagram