1 Michael C. Rodriguez
Formative Assessments that Support Assessment for Learning (DDDM)
Quantitative Methods in Education, Department of Educational Psychology

2 Measurement Essentials
New attention in the measurement community is on building ASSESSMENT FOR LEARNING into all forms of assessment, including large-scale assessment and classroom assessment.

3 Assessment for Learning
Assessment should be consistent with our understanding of learning in the subject matter; we need a model of learning to guide assessment design.

4 Model of Learning
Effective assessment for learning requires a model of how the subject matter is learned. For example, research on how statistical reasoning develops, and on the development of specific skills and understanding of statistics content, provides the background needed to develop strong assessments.

5 Model of Learning
A model of learning can describe the learning process and the developmental stages of understanding, knowing, and doing.
A model of learning can distinguish novice learners from expert learners, identifying the nature of proficiency and the prerequisite skills for progression.

6 Model of Learning
A model of learning can provide keys to the kinds of knowledge and skills that are required for achieving content standards or learning objectives. This allows us to describe the features of tasks that illuminate these aspects of knowledge and skills.

7 Learning Statistics
Ideas from research on teaching and learning statistics include:
1. Importance of context
2. Importance of sequencing tasks and knowledge structures
3. Importance of using multiple representations of ideas and concepts

8 Essential: Purpose
1. Clearly define your purpose:
   a. Progress monitoring (formative assessment)
   b. Objective/instructional feedback
   c. Grading (summative assessment)
   d. Placement

9 Essential: Blueprint
2. Create an assessment blueprint:
   a. Content to be covered
   b. Cognitive tasks to be assessed
   c. Format of items
   d. Number of items (given time limits)

10 Quality MC Items
Content                Knowledge   Comprehension   Application   Total
Central Tendency                                                 25%
Variability                                                      50%
Shape of Distribution                                            25%
Total                  20%         30%             50%           100%
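Once a test length is fixed, the blueprint's marginal percentages translate directly into item counts. The sketch below shows that arithmetic; the 20-item length and the assumption that each interior cell simply follows its row-by-column weights are illustrative, not taken from the slides.

```python
# A minimal sketch (not from the slides): turning blueprint percentages into item
# counts for a fixed-length test. The content and cognitive weights reuse the
# marginal percentages from the table above; the 20-item length and the
# assumption that each cell equals (row weight x column weight) are illustrative.

content_weights = {
    "Central Tendency": 0.25,
    "Variability": 0.50,
    "Shape of Distribution": 0.25,
}
cognitive_weights = {"Knowledge": 0.20, "Comprehension": 0.30, "Application": 0.50}
total_items = 20  # assumed test length, set by the available testing time

# Allocate items to each content-by-cognitive-level cell proportionally.
# Rounding can shift a cell by an item, so the printed total is worth checking.
blueprint = {
    (content, level): round(total_items * c_w * l_w)
    for content, c_w in content_weights.items()
    for level, l_w in cognitive_weights.items()
}

for (content, level), n_items in blueprint.items():
    print(f"{content:22s} | {level:13s} | {n_items} items")
print("Total allocated:", sum(blueprint.values()))
```

For the assumed 20 items this yields, for example, 10 Variability items split 2/3/5 across Knowledge, Comprehension, and Application.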

11 Essential: Item Quality
3. Design effective items and tasks:
   a. Use accepted principles of item writing
   b. Try out new item types
   c. Review items prior to use (peer review)

12 Writing MC Items
Questions should require students to consider novel contexts.
Use reference materials (graphical displays) that are authentic.
Options should be plausible, reflecting common errors or misconceptions.
Use only the number of options you need or can develop (three options are sufficient).

13 Developing CR Items
Use CR tasks to assess thinking and skills that cannot easily be measured by MC items (worth the cost and effort).
Assumptions necessary to respond correctly should be related to the content demands of the assessment.
Avoid ambiguous task features; provide full opportunity for students to perform and let them know what is expected.

14 Assessments for Learning
Formative assessments are specifically designed to support, enhance, and improve learning. Assessments are only formative if they can inform teaching and learning, which requires a feedback loop to students and teachers.

15 Formative Assessments:
Provide an organizational framework for content, knowledge, and skills; organize content based on the structure of the assessment.
Confirm “storage” of knowledge by solidifying the connections among different pieces of knowledge.
Shape study behavior.
Enhance academic motivation and effort through provision of feedback.

16 Formative Assessments:
Enhance the quality and strength of skills by providing unique opportunities to display knowledge and skills.
Explicitly articulate and communicate learning objectives and achievement targets, which teachers typically define only vaguely.
Confirm the importance of hard work, time spent studying, and effort.

17 Formative Assessments:
Demonstrate the kinds of thinking and processes valued by the instructor.
Allow students to communicate their thinking about the content and process, conveying understanding and misunderstanding.
Confirm one’s own level of understanding and ability to respond on demand.
Provide opportunities for students to identify their own strengths and weaknesses.

18 References
Embretson, S. E. (2007). Construct validity: A universal validity system or just another test evaluation procedure? Educational Researcher, 36(8), 449-455.
Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3-13.
Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-334.

19 Additional References
Popham, W. J. (2003). Test better, teach better: The instructional role of assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
Brookhart, S. (2009). Grading (2nd ed.). Allyn & Bacon.
Instructional Tools in Educational Measurement and Statistics (ITEMS) for School Personnel: http://items.education.ucsb.edu/pages/modules.html

20 Table 1. Most Commonly Cited Reasons that MN P–12 Educators Engaged in Data-Driven Decision Making
Reason                                                         Count
Provides individualized effort and intervention to students    33
Determines if school goals are being met                       30
Assesses current and future needs (proactive planning)         27
Engages in continuous improvement                               19
Identifies causes of problems                                   17
Decides what needs to change                                    16
Meets accountability requirements of NCLB                       14
Aligns instruction (or work) to standards, goals, objectives   13
Places student or determines eligibility for special services  11

21 Table 2. Most Commonly Cited Kinds of Data that MN P–12 Educators Used in Decision-Making Processes
Kinds of Data                                  Count
State test results                             26
Informal assessments                           25
Classroom assessments                          23
Other achievement test data                    21
District test results                          19
Student background information                 17
Behavior records                               16
Classroom grades                               12
Attendance                                     11
Previous student-school history information    10
Student surveys                                10
Discipline referrals                            5

22 Table 3.1. Outcomes Achieved through Data-Driven Decision Making Cited by MN P–12 Educators
Improved transition times to alleviate issues during passing time
Instituted periodic data-review meetings
Uncovered motivation issues from discrepancies between Minnesota Comprehensive Assessments and district assessments
Improved summaries of great quantities of information through effective graphical displays
Observed improvements in academic and behavioral outcomes
Eliminated some individualized intervention programs
Improved identification of students struggling with specific academic content

23 Table 3.2. Outcomes Achieved through Data-Driven Decision Making Cited by MN P–12 Educators
Improved school improvement plans with stronger objectives
Reduced special education referrals, inappropriate referrals
Improved communication with parents
Increased graduation rates and number of National Merit Scholars
Increased scores on Advanced Placement exams
Achieved 100% graduation of English-language learners
Identified and implemented effective reading program, targeting the skills needed to improve performance
Developed study skills plans for middle-school students
Received positive parent feedback and response to data presented in informative ways

24 Table 4.1. Recommendations for Improving School-Based DDDM Cited by MN P–12 Educators
Expand the use of achievement data
Shift data discussions from school Adequate Yearly Progress (a requirement of the No Child Left Behind legislation) to student performance, focusing more attention on the students with the greatest needs
Provide ongoing training and support for all school staff to engage in DDDM
Implement building-wide progress-monitoring practices
Improve the reporting time of achievement data collected at the state and district levels

25 Table 4.2. Recommendations for Improving School-Based DDDM Cited by MN P–12 Educators
Improve methods of data presentation for teachers: innovative data displays that make important features of the data meaningful and accessible (a minimal sketch follows this list)
Implement a Response to Intervention model to target interventions to the students who need them most
Develop longitudinal models of student progress on class work that provide more diagnostic information about strengths and weaknesses
Gather student engagement data to address solution strategies
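As a companion to the data-display recommendation above, the sketch below shows one way to make counts like those in Table 2 meaningful and accessible: a sorted horizontal bar chart that puts the most-cited data sources at the top. It assumes matplotlib is available; the labels and figure size are illustrative choices, not part of the original presentation.

```python
# A minimal sketch (assumes matplotlib is installed; labels are illustrative):
# a sorted horizontal bar chart of the Table 2 counts, so the most-cited data
# sources are read first.
import matplotlib.pyplot as plt

data_sources = {
    "State test results": 26,
    "Informal assessments": 25,
    "Classroom assessments": 23,
    "Other achievement test data": 21,
    "District test results": 19,
    "Student background information": 17,
    "Behavior records": 16,
    "Classroom grades": 12,
    "Attendance": 11,
    "Previous student-school history": 10,
    "Student surveys": 10,
    "Discipline referrals": 5,
}

# Sort ascending so the largest bars end up at the top of the chart.
labels, counts = zip(*sorted(data_sources.items(), key=lambda kv: kv[1]))

fig, ax = plt.subplots(figsize=(7, 5))
ax.barh(labels, counts)
ax.set_xlabel("Number of educators citing the data source")
ax.set_title("Kinds of data used in decision making (Table 2)")
fig.tight_layout()
plt.show()
```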

