Assessing the Affective Domain


Chapters 9 & 12 “The time has come,” the Walrus said, “To talk of many things . . . .” As teachers, we tend to believe that how we feel affects how we think . . . of course, cognitive psychology tells us it is the other way around . . . so, let’s explore! For purposes of discussion, let’s use the passé phrase “Affective Domain” for this important sub-surface area related to achievement. In this chapter we explore ways to obtain and analyze this type of useful information.

“Affective Domain” Explorations . . . Interest Inventories / Attitude Surveys Ability and Aptitude Tests Creativity Tests Personality Tests Non-test Indicators & Unobtrusive Measures

Let’s begin with a little attitude . . . Satisfaction Surveys / Self-Assessment Reports. Could be used with individuals in your class, grade level, building, or school district. Organizing the survey: Who is the target (students, parents, public)? What questions will be asked (school climate, achievement)? How will it be administered (in class, sent home, telephone)? Typical survey statements: I believe I am doing well in class. My child’s teacher really knows my child. Teachers teach me in a way that makes me want to learn. I feel my tax money is being well spent.

Some thoughts on . . . Student Self-Reports and Self-Assessment. May encourage students to develop skills in self-awareness and self-assessment. Like any self-report, honesty is an issue; the classroom needs to have a positive atmosphere. Don’t use these to determine a student’s grade; anonymous data collection can ensure this. Best used for your own feedback. In a nonthreatening environment, there is a positive correlation between self-reports of achievement and actual achievement as measured on academic tests. Most assessments of this nature use a Likert Scale . . . let’s learn a bit about this scale.

We’ve got class, some classroom ideas on using . . . Inventories/Surveys, and the Likert Scale. The Likert Scale is the most common method used in assessment for the areas in the Affective Domain. It is both simple and flexible. A Likert Scale can be created for any topic on which you want to assess students’ interests, attitudes, opinions, or feelings. Simply: Define an affective domain topic related to your classroom. Think of different facets of the topic. Generate a series of favorable and unfavorable statements regarding the topic. These are sometimes called “survey items,” and the whole group is often called a “survey” or “inventory.” Develop the response scale for the survey. Administer the survey. Score the results. Identify and eliminate items that fail to function in accord with the other items (i.e., look for bad items).

Rensis Likert (1903–1981) (pronounced 'Lick-urt') Likert, born and raised in Cheyenne, WY, was training to be an engineer with the Union Pacific Railroad when the Great Railroad Strike of 1922 occurred. The lack of communication between the two parties made a profound impression on him and may have led him to study conflict management and organizational theory for most of his life. In 1926, he graduated from the University of Michigan, Ann Arbor. He returned there in 1946 as professor of psychology and sociology. In addition to his famous “Likert Scale” he is noted for his management dictum that “The greater the loyalty of a group toward the group, the greater is the motivation among the members to achieve the goals of the group, and the greater the probability that the group will achieve its goals.”

A Likert-type item may have many . . . Response Label Variations

Creating Scores for Likert-type Items Assign number values to the scale points; add them up to suggest an individual’s overall attitude score. (See below). If you have many people’s opinions, you can also add the numbers by opinion topic and then divide by the number of respondents to get an average attitude score (See President Concerns, 2008).
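The two scoring steps above (sum within one respondent; average each item across a group) are simple arithmetic. A minimal Python sketch, with hypothetical items and response values that are not from the deck:

```python
# Hypothetical responses from one student on a 5-point scale
# (1 = Strongly Disagree ... 5 = Strongly Agree).
responses = {
    "I believe I am doing well in class.": 4,
    "Teachers teach me in a way that makes me want to learn.": 5,
    "My teacher really knows me.": 3,
    "I feel comfortable asking questions in class.": 4,
}

# One respondent's overall attitude score: sum the item values.
overall_score = sum(responses.values())  # 16 out of a possible 20

# With many respondents, average each item across the group to get
# a per-item (per-topic) average attitude score.
group = [
    {"item1": 4, "item2": 5},
    {"item1": 2, "item2": 3},
    {"item1": 3, "item2": 4},
]
item_averages = {
    item: sum(person[item] for person in group) / len(group)
    for item in group[0]
}
# item_averages == {"item1": 3.0, "item2": 4.0}
```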

Reverse Wording Option when . . . Creating and Scoring Likert-type Items To avoid having some students straight-line their responses, state some statements in a reverse direction. Be sure to remember you did this when you total the points. (See Below) Also see the Rosenberg Self-Esteem Scale.
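Reverse-worded items are typically re-coded before totaling: on a scale running from min to max, a response r is replaced by (min + max) − r. A short Python sketch of this re-coding (the function name and example statements are hypothetical):

```python
SCALE_MIN, SCALE_MAX = 1, 5  # a 5-point Likert scale

def score_item(response: int, reverse_worded: bool = False) -> int:
    """Points an item contributes to the total attitude score.

    For a reverse-worded statement, flip the scale so that agreeing
    with a negative statement scores like disagreeing with a
    positive one.
    """
    if reverse_worded:
        return (SCALE_MIN + SCALE_MAX) - response
    return response

# "I enjoy this class" answered 4, plus the reverse-worded
# "I dread coming to this class" answered 2 (re-coded to 4):
total = score_item(4) + score_item(2, reverse_worded=True)  # 8
```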

Pitfalls to Avoid . . . when creating a survey for your classroom or school. Be sure you and your students know the purpose of the survey and how the information will be used. (e.g., Will individual responses be confidential?) Keep it short (e.g., generally one page is sufficient). Beware of lingo or jargon terms (e.g., Do you favor inclusion?). Watch out for ambiguous meaning (e.g., Which class is best?). Do not ask more than one question at a time (e.g., Do you favor more homework and more library assignments?). Avoid loaded or leading questions (e.g., Do you believe that it is important to treat your fellow students fairly?). Make sure that fixed-response questions have a place for every possible answer (e.g., Would you prefer to study history or economics?). Place the more sensitive questions at the end of the survey. Run the survey by other professionals before you distribute it. If necessary, obtain clearance from your principal or school district. Don't reward or punish students based on their responses.

Interests, Attitudes and Opinion Assessment: . . . some closing questions What about student faking? May choose “socially desirable” response. May try to please or shock the teacher. Main remedy is non-threatening environment. How stable are students’ interests, attitudes and opinions? May depend on the topic and person. We do expect to change them . . . (or do you?). What about using constructed or free-response measures? Can be used. Not often used in practice.

And a closing example . . . Career Interest Inventories These tests attempt to match a person’s personality and interests with a specific work environment and/or career. Problems: Honesty 1 . . . “Would you rather compute wages for payroll records or read to a blind person?” Which is more socially acceptable? Honesty 2 . . . Knowing where the questions are leading . . . I want to go into Special Education, so I know I should choose “read to a blind person.” Is there a connection between what one would like to do and what one would really be good at doing? What about the idea of “learning to like” on the job and/or in developing new interests? Widely used inventories (taking more than one is recommended): Strong, Kuder.

Mental Ability Tests . . . usual purpose of ability testing is prediction Mental ability (also called intelligence, aptitude, learning ability, academic potential, cognitive ability, ad infinitum). We have already discussed the IQ as a normed score . . . Let’s look a little deeper. Theories about Mental Ability Unitary theory; “g” Multiple, independent abilities (about 7) e.g., verbal, numerical, spatial, perceptual Hierarchical theory – currently dominant [See next slide]

Hierarchical Theory of Mental Ability

Individually Administered Mental Ability Tests General features One-on-one administration Requires advanced training for administration Usually about 1 hour Mixture of items Examples WISC-IV Stanford-Binet

Group Administered Mental Ability Tests General Features Administered to any size group Types of items: content similar to individually administered tests, but in multiple-choice format Examples Elementary/secondary Others (e.g., SAT, ACT, GRE)

Learning Disabilities The basic definition is simple, to wit: there is a discrepancy between measured intelligence and measured achievement. But enter the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). The DSM-IV organizes each psychiatric diagnosis into five levels (axes) relating to different aspects of disorder or disability. Located at the Axis I level are developmental and learning disorders. Common Axis I disorders include phobias, depression, anxiety disorders, bipolar disorders, learning disabilities (like reading disorder, mathematics disorder, disorder of written expression, and ADHD), and communications disorders (like stuttering). The DSM-IV states that the manual is produced for the completion of federal legislative mandates and that its use by people without clinical training can lead to inappropriate application of its contents. Appropriate use of the diagnostic criteria is said to require extensive clinical training, and its contents “cannot simply be applied in a cookbook fashion.”

. . . Among Professionals The American Psychiatric Association (APA) has stated clearly that its “diagnostic labels” are primarily for use as a “convenient shorthand” among professionals. Do you think the “among professionals” phrase as used by the APA applies to high school teachers whose field is not Special Education? Implications?

IDEA 2004 . . . . . . children with learning disabilities in high school. The Individuals with Disabilities Education Act (IDEA) is a law ensuring services to children with disabilities throughout the nation. Children and youth (ages 3-21) receive special education and related services under IDEA Part B. In updating the IDEA, Congress found that the education of children with disabilities, including learning disabilities (LD), can be made more effective by having high expectations for such children and ensuring their access to the general education curriculum in the regular classroom to the maximum extent possible. If students with LD are going to succeed in school, they must have access to teachers who know the general curriculum, as well as support from teachers trained in instructional strategies and techniques that address their specific learning needs. Unfortunately, studies have shown that students with LD are often the victims of a watered-down curriculum and teaching approaches that are neither individualized nor proven to be effective.

Creativity (aka Creative Thinking) Definition? More than 60 different definitions of creativity can be found in the psychological literature. Other terms one often hears associated with creative thinking are: divergent thinking, originality, ingenuity, unusualness. Creative thinking is generally considered to involve the creation or generation of ideas, processes, experiences, or objects; critical thinking is concerned with their evaluation. In measuring creativity, we typically use constructed-response items, looking for one or more of the “creativity” characteristics. The format is similar to that of an essay: the student is given a prompt . . . except now we are looking for divergent thinking, not convergent thinking.

Example “Prompts” for creative thinking General All the uses for (a common object) All the words beginning with (a letter) Captions or titles for . . . Field specific (tailored to content) How would the U.S. differ today if . . . Different ending for a play or story Diverse descriptions for a work of art

Creativity: Scoring . . . a menu of five primary ways to score responses. Count – sum the number of ideas or responses. Count with quality rating – each response has a quality rating (e.g. 1-3); sum the ratings. Single best response – scan all responses, find the student’s “best” response, rate only that response using a quality rating scale (e.g. 1-3). Originality – the response(s) provided is/are infrequently seen (you might need experience to determine this). Different perspectives – count opposing ideas the student generates in responding to a prompt.
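The first three options on this scoring menu are straightforward to compute. A Python sketch, using hypothetical student responses to an "all the uses for a brick" prompt and hypothetical teacher-assigned ratings:

```python
# Hypothetical responses and teacher-assigned quality ratings
# (1 = routine ... 3 = highly original).
responses = ["paperweight", "doorstop", "base for a sculpture", "drum"]
quality_ratings = [1, 1, 3, 2]

count_score = len(responses)                # option 1: plain count of ideas
quality_score = sum(quality_ratings)        # option 2: count with quality rating
best_response_score = max(quality_ratings)  # option 3: single best response
```

Originality and different-perspectives scoring (options 4 and 5) need human judgment about how rare or opposed the ideas are, so they are not reducible to a one-liner in the same way.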

Standardized Tests for Creativity Nationwide, many school districts use standardized creativity tests for the purpose of screening and identifying gifted students. One of the most often used is the Torrance Tests of Creative Thinking (TTCT). It is produced in two forms: “Thinking Creatively with Pictures” & “Thinking Creatively with Words.” Check it out at the “Scholastic Testing Service – Gifted” website. http://www.ststesting.com/2005giftttct.html

In the News . . Cleveland Plain Dealer, January 22, 2008, post on Gifted Education in Ohio “No federal law requires school districts to identify or serve gifted students -- unlike special education for children with disabilities. That leaves it up to the individual states, and only 31 of them require districts to provide gifted services, according to the National Association for Gifted Children. Ohio is not among them.” In Ohio, “districts are only required to identify gifted students. Roughly 16 percent of the state's public school enrollment is classified as gifted. But last school year, only 26 percent of those students received either full or partial services, according to data filed with the Ohio Department of Education . . . . Research shows that while some gifted students do well without special services, the majority need more than the usual classroom experience.”

As Paul Harvey might say, And now, the rest of the story . . . Teachers like to use “rest of” variations in teaching. These tasks (aka tests) are often called “projective techniques.” Students are asked to be creative and think about what came before, what might happen next, or how a story might end. Before leaving this area, let’s take a look at some “rest of” tests. To the top right is an example of the Rorschach Ink Blot Test. What do you think it measures?

The Thematic Apperception Test . . . what do you think this test measures?

The “Draw a Person” Test . . . What do you think this test measures?

Behavior Rating Scales Many professionals express the need to move away from norm-referenced measures and recommend utilizing a more functional assessment approach. Members of the counseling field also are advocating for behavioral assessment alternatives to more formal procedures. The following recommendations for educators are appropriate when considering implementing a behavior rating scale: Have a variety of people who know the child complete the scale (e.g., caregivers, parents, teachers). Make sure ratings of the child’s behavior are being collected in a number of different environments. Before using a particular rating scale, make sure it reflects the overall goals of the assessment process. Take care that information about the student is not skewed toward the negative. Be aware that scales reflect perceptions about students; multiple informants and inter-rater reliability checks can corroborate or contradict these perceptions.

Areas often covered on a . . . Behavior Rating Scale Aggression Anger Anxiety Depression Hyperactivity Inattention Opposition Withdrawal

Example of a simple . . . Behavior Rating Scale

This student . . . (0 = Never, 1 = Sometimes, 2 = Often, 3 = Always)
1. Arrives late for class    0 1 2 3
2. Daydreams                 0 1 2 3
3. Does sloppy work          0 1 2 3
4. Talks inappropriately     0 1 2 3
5. Disrespects authority     0 1 2 3
6. Completes work late       0 1 2 3
7. Seeks attention           0 1 2 3
8. Hits other students       0 1 2 3
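Scoring such a scale is simple arithmetic, and the earlier advice about using multiple raters can be checked the same way. A sketch with hypothetical ratings from two raters on the eight items above:

```python
# Ratings (0 = Never ... 3 = Always) on the eight items, from two raters.
teacher = [1, 2, 0, 1, 0, 2, 1, 0]
parent =  [0, 2, 1, 1, 0, 1, 2, 0]

# Higher totals indicate more frequently observed problem behavior.
total_teacher = sum(teacher)  # 7

# Average the two raters item by item.
per_item_avg = [(t + p) / 2 for t, p in zip(teacher, parent)]

# A crude inter-rater agreement check: the fraction of items on which
# the two raters gave identical ratings.
agreement = sum(t == p for t, p in zip(teacher, parent)) / len(teacher)  # 0.5
```

Published instruments use properly normed scoring and formal reliability statistics; this is only the underlying idea.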

Non-test Indicators as . . . important sources of data on student accomplishment. These indicators serve to remind us that schools are pursuing goals other than high test scores. The Ohio School Report Card includes both test and non-test indicators. Examples of Data Collected: Routine Record Indexes: absentee rates; tardiness rates; graduation rates; discipline rates; athletic, club, and volunteer participation rates; which teachers students took for class. Destination after high school: what are the connections among grades, test scores, and routine records to later “achievements”? College? Where (selective or open admission)? Scholarship? Stay with it or drop out? Workforce? Type of job? Pay? Fired (why)? Other (jail, unemployed, etc.)? Is the school complicit? IF YOU CAN’T READ THIS, THANK A TEACHER . . . .

Non-test indicators include . . . Unobtrusive measures Unobtrusive measures are assessments that occur in the normal environment, where the persons involved are oblivious to the assessment. Examples: Student graffiti - desktops, lockers, restrooms . . . Library books checked out in Spanish . . . Winners at YSU History Day . . . Hits on the class website . . . The best source of unobtrusive data is gathered daily, in the classroom, by teachers like you, as they teach.

How about . . . An unobtrusive measure for yourself. Stress & the Biodot . . . Notice the scale differences on the two cards displayed

Practical Advice Include objectives related to interests and attitudes in your planning and in your assessments. Identify and use a few non-test indicators of student accomplishment. Practice making up simple scales for measuring interests and attitudes using the Likert method. Apply concepts of reliability and validity to all these tests. Gain experience in developing prompts calling for divergent thinking and in scoring the responses.

Terms/Concepts to Review and Study on Your Own (1) cognitive outcomes non-cognitive outcomes convergent thinking divergent thinking faking Likert method non-test indicator unobtrusive measure

Terms/Concepts to Review and Study on Your Own (2) behavior rating scale DSM-IV hierarchical model projective technique self-report inventory