
Formal Assessment Week 6 & 7

Formal Assessment Formal assessment typically takes the form of a paper-and-pencil or computer-based test. These tests are standardized because each student receives exactly the same assessment with the same instructions. The outcome is graded, and the answers are scored against a standard based on information the student is expected to possess.

Formal Assessment Norm Referenced Tests One form of formal assessment is the norm referenced test. A norm referenced test, or normative assessment, is an evaluation in which those being tested are compared to a sample group of their peers. This sample group is called the normative sample. Each test taker's performance is therefore compared to the ability of his or her peers.

Formal Assessment Norm Referenced Tests (NRTs) are typically multiple choice, but they can also be open-ended or short-answer exams. Norm referenced tests are exams in which the student's score is reported as a percentile, showing where he or she scored relative to the normative group. NRTs are designed to rank test takers in comparison to the other students who took the test, generally in the same year or at a given time.
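
As a rough illustration of how a percentile rank relates one score to a normative sample, here is a minimal Python sketch. The normative scores and the student score are hypothetical, invented only for this example; real norming uses much larger samples and published conversion tables.

def percentile_rank(score, norm_sample):
    """Percent of the normative sample scoring at or below the given score."""
    at_or_below = sum(1 for s in norm_sample if s <= score)
    return 100.0 * at_or_below / len(norm_sample)

# Hypothetical normative sample and student score.
normative_sample = [42, 55, 61, 63, 67, 70, 72, 75, 78, 81, 84, 88, 91, 95]
student_score = 78
print(f"Percentile rank: {percentile_rank(student_score, normative_sample):.0f}")
# Prints 64: the student scored at or above about 64% of the normative group.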

Formal Assessment Problems With Norm Referenced Tests These tests can be biased against certain students depending on who made up the normative group. For example, a normative group of suburban children might bias the test against inner-city students. If the normative group were upper-middle-class Christian students, the test might be biased against everyone else, particularly students of lower socioeconomic status and non-Christian students.

Formal Assessment Problems with NRTs Many mistakes can be made by using this type of test to make ultimate decisions about a student's placement, graduation, or retention, because it assesses only a small portion of a student's educational ability. These assessments focus much of their attention on information learned through memorization, rote, and routine learning. NRTs often encourage teachers to focus on facts rather than higher-order thinking skills, and they can diminish educational and academic expectations.

Formal Assessment Restraints Norm referenced tests are usually given within certain time limits. This can be a problem for some students, because having only 30 minutes for 20 questions can cause anxiety. NRTs also typically do not allow students to go back over a section once that section has been completed. This can be problematic for slower test takers who may not have finished one section but finished another section early.

Formal Assessment Accuracy These assessments present only a few questions out of the many possibilities, so the score is taken as representative of how a given student would have done on all possible questions. However, when discussing scores and the accuracy of scores, we must discuss validity and reliability.

Formal Assessment Validity and reliability are extremely important in assessment. Assessing the bias of an assessment requires looking at its validity and reliability.

Formal Assessment Reliability Reliability concerns the consistency of the assessment: each time the assessment is given, does it assess the same thing? If you are at the market and weigh your tomatoes and they weigh 2.2 lbs., you expect that when the cashier weighs them they will also weigh 2.2 lbs. Why? It is assumed that the scale is reliable and will always weigh the tomatoes accurately.

Formal Assessment Types of Reliability One type of reliability is internal consistency, which is measured by comparing one part (or half) of the assessment to another part (or half). Another type is alternate form, which is measured by creating two forms of the same test; the correlation between the scores on the two forms indicates the reliability. A third type is stability, which is measured by giving the same assessment twice; the correlation between the scores from the two administrations determines the reliability.
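
Here is a minimal sketch, with made-up data, of two of these estimates: a split-half internal consistency estimate (comparing odd- and even-numbered items, with the standard Spearman-Brown correction) and a stability (test-retest) estimate based on the correlation between two administrations.

from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """Correlate odd-item and even-item half scores, then apply Spearman-Brown."""
    odd_halves = [sum(items[0::2]) for items in item_scores]
    even_halves = [sum(items[1::2]) for items in item_scores]
    r_half = pearson_r(odd_halves, even_halves)
    return 2 * r_half / (1 + r_half)

# Hypothetical data: each row is one student's item scores (0 = wrong, 1 = right).
items = [
    [1, 1, 1, 0, 1, 1],
    [1, 0, 1, 1, 0, 1],
    [0, 0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1, 0],
]
print(f"Internal consistency (split-half): {split_half_reliability(items):.2f}")

# Stability: the same (hypothetical) students tested twice.
first = [78, 62, 85, 90, 71]
second = [80, 60, 83, 92, 69]
print(f"Stability (test-retest): {pearson_r(first, second):.2f}")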

Formal Assessment Validity Validity refers to the accuracy of an assessment: in other words, does the assessment measure what it is supposed to measure? Validity is more important than reliability.

Formal Assessment Types of Validity Content validity concerns whether the instructional objectives and the content of the test match. A cumulative exam for a class can have content validity with respect to the overall course objectives; an assessment covering only two or three weeks of material would not.

Formal Assessment Types of Validity Criterion validity concerns how well an assessment relates to an external criterion, either concurrently or predictively. For example, if a 10th-grade English final correlates highly with a statewide 10th-grade English assessment, this shows high concurrent validity.
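
A minimal sketch of checking concurrent validity for the example above, using invented scores: correlate each student's class final with his or her score on the statewide assessment.

from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical paired scores for eight students.
class_final = [88, 74, 91, 65, 80, 95, 70, 84]
statewide = [85, 70, 94, 60, 78, 97, 73, 82]

r = correlation(class_final, statewide)
print(f"Concurrent validity estimate (Pearson r): {r:.2f}")
# A value near 1.0 would indicate that students rank similarly on both measures.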

Formal Assessment Types of Validity Construct validity shows the extent to which an assessment corresponds, as predicted by a rationale or theory, with other variables.

Quick Quiz 1. What is reliability? 2. What is validity? 3. What is the difference between reliability and validity? 4. Why would validity be more important than reliability? 5. In your own words, tell how reliability and validity would be used in norm referenced testing. 6. Would you use norm referenced tests? Why or why not?

Assignment for Week 6 & 7 1. Read PowerPoint notes. 2. Read Ch. 4; Master's students: read additional pages and use the Reading Report Form. 4. Take the Week 6 Quiz. 5. Write up: Do you think that norm referenced testing should be used in Christian education? Explain. How would this assessment help in your ministry? (This is Journal 6 & 7 for online students.) 6. Begin working on the writing assignment (see Week 9).