
Assessment & Evaluation

Assessment
Answers questions related to individuals: "What did the student learn?" Uses tests and other activities to determine student knowledge and ability. For example:
- Identifies adverbs
- Adds two-digit numbers
- Defines democracy
Def: assessment - to determine the amount of [e.g., knowledge]

Evaluation
Answers global questions: "How effective is Program X?" and "What difference does it make?"
- Formative
- Summative
Def: evaluation - to determine the significance or worth by careful appraisal and study

Types of Tests
Used to evaluate changes in skills and knowledge.

Test Types: Norm-Referenced
Compare an individual's performance to the performance of other people. Require items of varying difficulty, and assume that not everybody is going to "get it" - the point is to discern those who "got it" from those who didn't.

Normal Distribution

Test Types: Norm-Referenced
Norm-referenced tests compare the individual to the group.
- Accomplished statistically by "norming" the test with large numbers of people.
Consider:
- You're evaluating a GRE preparation class.
- GRE scores for the group of 100 enrolled students average as follows.
- Can you make recommendations to the school that will help them improve the instruction? Why or why not?
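The norming idea above can be sketched in a few lines: once a norm group's mean and standard deviation are known, any examinee's score is reported relative to that group (a z-score or percentile rank), not against a fixed standard. The scores below are illustrative placeholders, not real GRE norms.

```python
from statistics import NormalDist, mean, stdev

# Hypothetical norm group gathered when the test was "normed".
norm_scores = [480, 510, 520, 540, 550, 560, 580, 600, 620, 640]

mu = mean(norm_scores)       # norm-group mean
sigma = stdev(norm_scores)   # norm-group standard deviation

def percentile_rank(score: float) -> float:
    """Percent of the norm group expected to score below `score`,
    assuming scores are roughly normally distributed."""
    return NormalDist(mu, sigma).cdf(score) * 100

# An examinee scoring 600 is placed relative to the group:
z = (600 - mu) / sigma
pr = percentile_rank(600)
```

Note that the examinee's standing changes if the norm group changes, which is exactly why norm-referenced scores say little about mastery of specific objectives.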

Test Types: Criterion-Referenced
Compares an individual's performance to an acceptable standard of performance for specified tasks. Requires completely specified objectives.
- Asks: Can this person do what the objectives specify?
Results in yes-no decisions about competence.

Test Types: Criterion-Referenced
Applications:
- Diagnosis of individual skill deficiencies
- Certification of skills
- Evaluation and revision of instruction
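A criterion-referenced scoring rule reduces to a per-objective cut score and a yes-no decision, as sketched below. The objective names echo the assessment slide; the 80% cut is an illustrative assumption, not a recommended standard.

```python
CUT = 0.80  # assumed acceptable standard of performance per objective

def mastery(results: dict) -> dict:
    """Map {objective: (items_correct, items_total)} to yes/no
    mastery decisions against the cut score."""
    return {obj: correct / total >= CUT
            for obj, (correct, total) in results.items()}

decisions = mastery({
    "identifies adverbs": (9, 10),       # 90% -> mastered
    "adds two-digit numbers": (6, 10),   # 60% -> not yet
})
```

Unlike the norm-referenced case, the decision for one student does not depend on how anyone else performed.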

Ideas in Testing
- Measurement error
- Validity
- Reliability

Measurement Error
Many causes:
- mechanical or scoring errors
- poor wording (confusing, ambiguous items)
- poor subject matter or content coverage (a validity issue)
- score variation from one time to another (a reliability issue)
- score variation across "equivalent" forms
- test administration procedures
- inter-rater disagreement
- mood of the student

Validity
Does the test assess what's important? Does it really tap the skill and knowledge linked to the real world? (content validity)
Types:
- Content validity - assessed by a panel of experts
- Face validity
- Construct validity
- Predictive validity (e.g., SAT, GRE)

Reliability
Are the scores produced by the test trustworthy and stable over time?
Assessed by:
- parallel (equivalent) forms or test-retest
- internal consistency
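Test-retest (and parallel-forms) reliability is conventionally estimated as the Pearson correlation between two sets of scores from the same examinees. The five examinees' scores below are made-up numbers chosen to illustrate a highly reliable test:

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation: the usual estimate of test-retest
    or parallel-forms reliability."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Illustrative scores for five examinees on two administrations.
first = [70, 75, 80, 85, 90]
retest = [72, 74, 79, 88, 91]
r = pearson_r(first, retest)  # near 1.0 -> scores are stable
```

Values near 1.0 indicate stable rank ordering across administrations; internal-consistency estimates (e.g., Cronbach's alpha) address the same question using a single administration's items.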

Utility of Test Scores
Selection & screening (before instruction):
- mastery of prerequisites - for remediation
- mastery of terminal objectives - for acceleration
Individual diagnosis and prescription (along the way)
Practice (along the way)
Grades & summative scores (at or after the end):
- promotion
- certification and licensure
Administrative:
- course evaluation
- trainer accountability

Finding Tests
- Evaluator-constructed
- Commercially available
- Hybrids: localization, repurposing
What is the main issue with repurposing a validated test?