

Presentation on theme: "The process of collecting information about students and classrooms for the purpose of making instructional decisions."— Presentation transcript:


2 The process of collecting information about students and classrooms for the purpose of making instructional decisions.

3 • Observations – rating forms, narrative descriptions, checklists, logs and anecdotal notes
• Performance samples – work products, artifacts
• Tests – informal reading inventories, end-of-unit tasks, teachers’ quizzes

4 Assessment Decision Cycle for Student Learning – a cycle of Assessment, Instructional Design, Instruction, and Reflection.

5 ASSESSMENT FUNCTIONS
Determine students’ needs, interests, and current knowledge/skills
Make instructional decisions
Monitor instruction to provide teacher and students with feedback and progress information
Evaluate student outcomes and performance
Accumulate a body of evidence of student achievement
Evaluate unit outcomes and overall programs

6 • Pre-Assessment: Prior to building the unit –
◦ What do students already know, do, and what are their attitudes/dispositions?
◦ What do students need to know, do, and be like?
• Determine students’ prior knowledge and experiences
• Determine students’ needs and interests
• Embedded Assessments: During the unit –
◦ Linked to each lesson plan
◦ Determine students’ progress
◦ Confirm or modify instructional decisions
• Post-Assessment: At the end of the unit –
◦ What do the students know, do, and what are their attitudes/dispositions?
◦ Did students reach their targeted learner outcomes?
• Did your unit work?
• Where do we go from here?

7 Informal – when students are evaluated on a daily and informal basis using observations, anecdotal notes and checklists
Formal – when students are evaluated through precise and thorough quizzes, written tests or alternative assessments

8 • Valid – measures what it claims to measure
• Reliable – produces dependable, consistent scores for persons who take it more than once over a time period
• Objective – eliminates biases, prejudgments and personal feelings

9 • Diagnostic – used at the outset of a unit, semester or year to identify problems and assess prior knowledge
• Feedback – used during instruction to provide corrective feedback to students
• Reporting – used at the end of a unit or semester to determine progress or grades and make judgments about student achievement

10 • Assess all instructional objectives
• Cover all cognitive domains
• Use appropriate test items
• Make tests valid and reliable
• Use tests to improve learning

11 • Begin with the least difficult questions
• Make test items reflect instructional objectives and content taught
• Watch the vocabulary of the test itself
• Make it possible for everyone to demonstrate what they have learned
• Make test directions clear
• Place all items of the same type together
• Include all the information and materials students will need for the test
• Include several test items for each objective
• Make more items than you will need and use the best ones
• Design questions that use both high and low cognitive levels
• Use tests to improve learning

12 • Informal and student-centered measures
• Selected response measures
• Constructed response measures
• Performance and portfolio measures

13 • KWL charts
• Show-what-you-know charts
• Self-assessment measures such as graphs of progress toward acquisition of standards
• Paper and pencil measures such as tickets to leave, journal entries, log entries, question of the day

14 • Teacher observation data recorded anecdotally (in the form of notes) or as more discrete measures (frequency counts, absences, duration, fluency)
• Subject-specific assessment guided by teaching and learning (use of the problem-solving process, use of the scientific method)

15 • Used to ascertain students’ mastery of larger domains of content
• Measure only lower-order kinds of cognitive capabilities

16 • True & False
• Matching
• Multiple-choice
• Completion/sentence stems

17 • Fill-in-the-Blanks – when measuring students’ abilities to recall factual information
• Multiple-Choice – when measuring objective information of either factual or higher-level analytic skills
• Matching – when measuring student recall of a fairly large amount of factual information
• True/False – when the content calls for students to compare alternatives
• Short answer – when measuring higher-level analytic skills
• Essay – when measuring higher-level thought processes and creativity

18 • 50% chance of correct guess (see the worked example below this list)
• Keep a balance between true and false
• Avoid broad generalizations (never, always)
• Use clear language and avoid using terms denoting degree (large, long time, regularly)
• Avoid using negative statements
• Underline the word that makes it true or false
• Encourage revision of statements that are false
• Keep true and false items the same length
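A worked illustration of that 50% guessing chance (added as an example; the 10-item quiz length and the 7-correct threshold are assumed for illustration and do not come from the slides): the probability that a student who guesses every item reaches 7 or more correct out of 10 follows from the per-item 1/2 chance.

% LaTeX math block: probability of scoring >= 7 of 10 true/false items by pure guessing
% Assumptions (hypothetical): n = 10 items, each guess correct with probability 1/2
\[
P(X \ge 7) \;=\; \sum_{k=7}^{10} \binom{10}{k} \left(\frac{1}{2}\right)^{10}
          \;=\; \frac{120 + 45 + 10 + 1}{1024} \;\approx\; 0.17
\]

So roughly one purely guessing student in six would clear 70% on a 10-item true/false quiz, which is the guessing concern the first bullet points to.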

19 • Never use all of the above or none of the above
• Avoid negatively stated stems
• Distribute the order of correct answers randomly
• Make the wording simple and clear
• Use appropriate distracters
• Make sure there is only one correct answer
• Make sure all distracters are plausible
• Use either sentence stems or questions
• Separate the stem from the possible answers
• Use three to four possible responses

20 • Include no more than 10 items to be matched
• Make the phrases in the descriptors list longer than the phrases in the options list
• Put definitions on the left and words on the right
• Make directions clear on how to match
• Underline the key word (person, place, etc.)
• Make sure all options are plausible distracters
• Specify in the directions whether options can be used more than once
• Put it all on a single page

21 • Provide a single-word answer or a brief, definite statement
• Supply enough context to give it meaning
• Omit insignificant words
• Avoid textbook language
• Provide clues, if necessary
• Provide enough blanks for each word
• Put the blank toward the end
• Allow students to use a word or sentence bank
• Provide first-letter clues

22 • Elicit responses more closely approximating the kinds of behavior students must display in real life
• Require students to perform

23 • Short answer
• Essay

24 • Call for students to supply a word, phrase or sentence in response to either a direct question or an incomplete statement
• Suitable for assessing relatively simple kinds of learning outcomes, such as those focused on students’ acquisition of knowledge
• Students have to produce a correct answer, not just recognize it
• More difficult to score

25 • Employ direct questions rather than incomplete sentences
• Nurture concise responses with short blanks
• Limit to one or two blanks
• Provide first-letter clues when necessary

26 • Gauges a student’s ability to synthesize, evaluate and compose
• Difficult to score
• A restricted-response item limits the form and content of the response
• An extended-response item provides students with more latitude in responding

27 • Make the wording of a question as clear as possible
• Provide guidance on how students should use their time
• Write a sample answer ahead of time and assign points to various parts of the answer
• Have students justify their answers
• Allow students more time, if needed
• Provide sentence stems or word banks
• Use holistic scoring

28 • Score responses holistically and/or analytically
◦ Holistic scoring uses general criteria
◦ Analytic scoring sets degrees of acceptability for each criterion
• Prepare a tentative scoring key in advance of judging students’ responses
• Score all responses to one item before scoring responses to the next item
• Evaluate items anonymously
• Decide on the importance of mechanics

29 • The backbone for post-assessments of units
• Designed to promote enduring understanding
• Tied to real-life, authentic, functional activities
• Experientially based
• Age appropriate
• Differentiated
◦ Content
◦ Product
◦ Process
• Clear criteria for performance

30 • PERFORMANCE ASSESSMENT – requires students to demonstrate that they can perform tasks.
• AUTHENTIC ASSESSMENT – requires students to apply and extend what they know or can do in relation to a significant and engaging problem or question about real life.
• PORTFOLIO – a purposeful collection of student work that exhibits a student’s effort and achievement over a period of time.

31 • Written work like lab reports, book reports, research papers, journals, etc.
• Oral work like class discussions, panels, debates, simulations, games, etc.
• Performances like speeches, role playing, presentations of visual materials, etc.

32 • Challenge students to:
◦ Tackle project work regularly and frequently
◦ Judge their own work
◦ Collaborate and converse with others
◦ Distinguish a real audience for their work beyond the classroom teacher
◦ Continue their learning and development over time
◦ Understand what it means to do better

33 Rubrics
Specify varying levels of quality for a specific assignment
Usually used with complex, long-term, performance-based assignments or assessments
Have two features:
◦ Specify what counts – the criteria
◦ Illustrate gradations in the quality of work, using descriptors for strong, middling, and problematic student work

34 Why Rubrics?
Easy to explain
Support learning of meta-cognition through self-assessment, monitoring, and self-management (Goodrich, 1996)
Provide students with feedback about strengths and areas for improvement
Support development of specific skills (e.g., writing – Andrade, 1999)

35 How do you develop a rubric?
Deconstruct the complex, final performance into subsets of skills
With students, look at models of good and poor work, and determine together what differentiates one from another
List the criteria (what counts), considering level of content understanding, process skills, standards, technology, format, etc.
Pack and unpack the criteria until you can formulate and create the categories to be judged (format, organization, items, etc.)
Use a generic form
Use kid-friendly language
Articulate levels of quality (Yes; Yes, but; No, but; No)

36 Sample Criterion: Briefly summarize the plot of the story
Yes, I briefly summarized the plot using significant details.
Yes, I summarized the plot, but I included some unnecessary details or left out key information.
No, I didn’t summarize the plot, but I did include some details from the story.
No, I didn’t summarize the plot.

37 • http://pblchecklist.4teachers.org/
• http://rubistar.4teachers.org/index.php
• http://ncsu.edu/midlink/ho.html
• http://www.odyssey.on.ca/%7Eelaine.coxon/rubrics.htm
• http://www.rubrics4teachers.com/
• http://www.teach-nology.com/web_tools/rubrics/
• http://www.idecorp.com/assessrubric.pdf
• http://landmarks4schools.org/classweb/tools/rubric_builder.php3
• http://school.discovery.com/schrockguide/assess.html

38 • Group all items of similar format together
• Arrange items from easy to hard
• Space the items for easy reading
• Keep items and options on the same page
• Position illustrations near descriptions
• Decide whether to use a separate answer sheet
• Check test directions
• Provide space for name and date

