EDM 311 – Educational Testing, Measurement and Program Evaluation


1 EDM 311 – Educational Testing, Measurement and Program Evaluation
Karen Luz Y. Teves, PhD.

2 Course Description This graduate course is about the concepts, principles, and procedures of measurement and evaluation, proper classroom test construction and administration, criterion-referenced and norm-referenced measures, technical and practical issues related to test development and use, marking procedure/grading systems, testing program and the use of statistical procedures in evaluation.

3 Learning Objectives
The graduate student should, upon completion of the course, be able to:
- understand the nature, characteristics, differences, and uses of measurement and evaluation
- understand the types and uses of norm-referenced and criterion-referenced tests
- learn and apply the principles of proper construction and administration of tests
- understand the purpose of item analysis and demonstrate the basic skills of conducting and interpreting item analysis
- understand the different types and uses of assessment and demonstrate the ability to conceptualize, construct, and score alternative assessments

4 Learning Objectives
- learn and use basic statistical concepts, calculations, and analyses
- identify and practice the qualities desired in measurement and assessment procedures
- understand and appropriately apply various marking systems within the context of classroom assessment
- understand the major strategies for assigning student academic achievement grades, the factors that influence grading, and the influence of assessment strategies on student performance
- interpret the meaning of derived scores from standardized tests and other measures, and review test content with respect to intended use
- distinguish between validity and reliability, and apply strategies for estimating each

5 Course Content
- Introduction to Educational Assessment, Measurement and Evaluation
- Tests and Their Uses in Educational Assessment; Constructing, Administering and Scoring Tests
- Assessment of the Cognitive, Psychomotor and Affective Domains of Learning
- Test Item Data Use and Analysis
- Grading; Correlating, Organizing and Summarizing Test Scores; Test Reliability
- Assessing Student Outcomes; Grading and Reporting Student Performance

6 Learning Activities and Requirements
Class Participation and Practice Exercises – in between and after lectures, the professor will design different types of activities, such as topical situation critiques and discussions, practice exercises from selected data sets, and testing and statistical computations.
Development of a Criterion-Referenced Test – students will develop a criterion-referenced test for a subject, grade/year level and topic of their choice, following the practices and methods discussed in the lectures. A rubric will be provided for students to follow in designing their tests.
Development of an Alternative Assessment – students will develop an alternative assessment for a subject, grade/year level and topic of their choice, following the practices and methods discussed in the lectures. A rubric will be provided for students to follow in designing their assessments.
Pass the Final Examination.

7 Educational Assessment, Measurement and Evaluation - An Introduction
Measurement – in education, the quantification of what students learned, through the use of tests, questionnaires, rating scales, checklists and other devices. E.g., a teacher who gives his class a 10-item quiz is undertaking measurement of what his students learned from the lesson.
Assessment – the full range of information gathered and synthesized by teachers about their students and their classrooms, through observation, assignments, tests and reports/outputs.
Evaluation – the process of making judgments, assigning value or deciding on the worth of students' performance. E.g., a teacher assigns a grade to the quiz score obtained by the student.
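The quiz example above can be sketched in code to separate the two ideas: measurement produces a number, evaluation judges that number against a standard. This is an illustrative sketch only; the answer key, the student's answers, and the 75% passing cutoff are all hypothetical, not from the source.

```python
# Illustrative sketch: measurement vs. evaluation on a 10-item quiz.
# The answer key, student answers, and 75% passing cutoff are hypothetical.

def measure(answers, key):
    """Measurement: quantify performance as the number of correct items."""
    return sum(a == k for a, k in zip(answers, key))

def evaluate(score, n_items, passing_pct=75):
    """Evaluation: judge the measured score against a standard."""
    return "Pass" if 100 * score / n_items >= passing_pct else "Fail"

key     = ["a", "c", "b", "d", "a", "b", "c", "d", "a", "b"]  # 10-item quiz
answers = ["a", "c", "b", "d", "a", "b", "c", "a", "a", "c"]

score = measure(answers, key)              # measurement yields a number (8)
print(score, evaluate(score, len(key)))    # evaluation yields a judgment
```

Note that the number 8 by itself carries no judgment; only the evaluation step declares it a pass or a fail.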

8 Test and Measurement
Test: 1. A measurement instrument 2. Designed to elicit a specific sample of behavior
Measurement: 1. Quantification: assigning numbers 2. Characteristics: abilities, traits, attributes, constructs 3. Rules and procedures: must be replicable

9 Measurement – how much does a student learn or know?
Assessment – how much change has occurred in the student's acquisition of skills, knowledge or values before and after a given learning experience?
Evaluation – how good, adequate or desirable is the student's performance?

10 Evaluation: 1. Systematic 2. Gathering of information 3. Making decisions: value judgments
Measurement = Testing + Quantitative Data
Evaluation = Testing + Quantitative Data + Qualitative Data + Judgment (Bachman, 1990)

11 Purposes of Educational Assessment, Measurement and Evaluation
- Improvement of student learning
- Identification of students' strengths and weaknesses
- Verification of the accomplishment of an instructional objective through the use of a particular teaching strategy
- Verification of the effectiveness of the curriculum
- Assessment and improvement of teaching effectiveness
- Communication with and involvement of parents in their children's learning
- Diagnosis of the nature of learning difficulties
- Measurement of student achievement
- Prediction of an individual's level of achievement in future activities, or prediction of one measure from another
- Motivation

12 Types of Evaluation
1. Formative evaluation
It is an ongoing classroom process that keeps students and educators informed of students’ progress toward program learning objectives. The main purpose of formative evaluation is to improve instruction and student learning.

13 2. Summative evaluation
It occurs most often at the end of a unit. The teacher uses summative evaluation to determine what has been learned over a period of time, to summarize student progress, and to report to students, parents and educators on progress relative to curriculum objectives.

14 3. Diagnostic evaluation
It usually occurs at the beginning of the school year or before a new unit. It identifies students who lack prerequisite knowledge, understanding or skills. Diagnostic testing also identifies student interests. Diagnostic evaluation provides information essential to teachers in designing appropriate programs for all students.

15 Bloom's Taxonomy
Bloom's Taxonomy provides an important framework for teachers to use to focus on higher order thinking. By providing a hierarchy of levels, this taxonomy can assist teachers in designing performance tasks, crafting questions for conferring with students, and providing feedback on student work.

16 Cognitive Objectives
From simple to complex: Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation

17 Knowledge
The remembering of previously learned material. Examples of learning objectives:
- know common terms
- know specific facts
- know methods and procedures
- know basic concepts
- know principles
Typical question/task words: who, what, why, when, omit, where, which, choose, find, how, define, label, show, spell, list, match, name

18 Comprehension
The ability to grasp the meaning of material. Examples of learning objectives:
- understand facts and principles
- interpret verbal materials
- interpret charts and graphs
- translate verbal material to mathematical formulae
- justify methods and procedures
Typical question/task words: compare, contrast, demonstrate, interpret, explain, extend, illustrate, infer, outline, relate, rephrase, classify

19 Application
The ability to use learned material in new and concrete situations. Examples of learning objectives:
- apply concepts and principles to new situations
- apply laws and theories to practical situations
- solve mathematical problems
- construct graphs and charts
- demonstrate the correct usage of a method or procedure
Typical question/task words: apply, build, choose, construct, develop, interview, make use of, organize, experiment with, plan, select, solve, utilize

20 Analysis
The ability to break down material into its component parts. Examples of learning objectives:
- recognize unstated assumptions
- recognize logical fallacies in reasoning
- distinguish between facts and inferences
- evaluate the relevancy of data
- analyze the organizational structure of a work
Typical question/task words: analyze, categorize, classify, compare, contrast, discover, dissect, divide, examine, inspect, distinguish, simplify

21 Evaluation
The ability to judge the value of material for a given purpose, based on definite criteria. Examples of learning objectives:
- judge the logical consistency of written material
- judge the adequacy with which conclusions are supported by data
- judge the value of a work by the use of internal criteria (organization) or external standards of excellence
Typical question/task words: award, choose, conclude, criticize, decide, defend, determine, dispute, evaluate, judge, justify, measure, appraise

22 Synthesis
The ability to put parts together to form a new whole. Examples of learning objectives:
- write a well-organized theme
- give a well-organized speech
- write a creative short story
- propose a plan for an experiment
- integrate learning from different areas into a plan for solving a problem
Typical question/task words: build, choose, combine, compile, compose, construct, create, design, develop, estimate, formulate, imagine, invent

23 Synthesis (Example) Write a paragraph summarizing the text you have read. Your summary should have a topic sentence defining the problem, some of the causes, some of the effects, and a conclusion.

24 Principles of Evaluation
Evaluation should be:
1. Based on clearly stated objectives
2. Comprehensive
3. Cooperative
4. Used judiciously
5. A continuous and integral part of the teaching-learning process

25 Qualities of a Good Measuring Instrument
Validity: the extent to which the instrument measures what it is intended to measure.
Reliability: the consistency with which an instrument measures a given variable.
Objectivity: the extent to which independent and competent examiners agree on what constitutes a good answer for each element of a measuring instrument.
Practicability: the overall simplicity of the use of a test, both for the test constructor and for students.

26 Validity and Reliability
Validity: Does the test measure what it is supposed to measure?
Reliability: Will this test yield stable scores over repeated administrations?

27 Approaches to Evaluation
Criterion-referenced (c-r): a standard is used to determine whether someone has attained a specified level of performance.
Norm-referenced (n-r): a standard is used to judge an individual's performance in relation to the performances of other members of a well-defined group.

28 Criterion-referenced (c-r) standards are useful for setting performance standards for all
Norm-referenced (n-r) standards are valuable for comparisons among individuals when the situation requires a degree of sensitivity or discrimination in ability

29 Norm-referenced standards
- developed by testing a large group of people
- descriptive statistics are used to develop the standards
- percentile ranks are a common norming method
Major concern: the characteristics of the group used to develop the norms may not result in desirable norms
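Since the slide names percentile ranks as a common norming method, here is a minimal sketch of one common definition: a score's percentile rank is the percentage of the norm group scoring below it, plus half of those scoring at it. The norm-group scores are invented, and other percentile-rank conventions exist.

```python
# Sketch of percentile-rank norming (one common convention):
# rank = 100 * (count below + 0.5 * count at) / group size.
# The norm-group scores below are hypothetical.

def percentile_rank(score, norm_group):
    below = sum(s < score for s in norm_group)   # scored lower
    at    = sum(s == score for s in norm_group)  # tied with this score
    return 100 * (below + 0.5 * at) / len(norm_group)

norm_group = [55, 60, 62, 67, 70, 70, 74, 78, 81, 90]  # invented norms
print(percentile_rank(70, norm_group))  # rank of a raw score of 70
```

This also illustrates the slide's "major concern": the same raw score of 70 would receive a very different rank if the norm group were unusually strong or weak.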

30 Criterion-referenced standards
- a predetermined standard of performance shows whether the individual has achieved a desired level of performance
- the individual's performance is not compared with that of other individuals
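By contrast with the norm-referenced approach, a criterion-referenced judgment can be sketched as a fixed cutoff applied to each learner independently. The 80% mastery cutoff, the student names, and the 20-item test below are all hypothetical.

```python
# Minimal criterion-referenced sketch: each learner is judged against a
# predetermined standard, never against the other learners.
# The 80% cutoff and the scores are hypothetical.

MASTERY_CUTOFF = 0.80  # predetermined standard of performance

def mastered(correct, total, cutoff=MASTERY_CUTOFF):
    """True if the learner meets the fixed criterion, regardless of peers."""
    return correct / total >= cutoff

scores = {"Ana": 17, "Ben": 12, "Cara": 20}  # correct items out of 20
for name, correct in scores.items():
    print(name, "mastered" if mastered(correct, 20) else "not yet")
```

Note that every learner, or none, can meet the criterion; there is no ranking, which is exactly the property the slide emphasizes.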

31 “Evaluation is the process of giving meaning to a measurement by judging it against some standard”

