Michael C. Rodriguez Formative Assessments that Support Assessment for Learning (DDDM) Quantitative Methods in Education Department of Educational Psychology
Measurement Essentials New attention in the measurement community is focused on building ASSESSMENT FOR LEARNING into all forms of assessment, including large-scale assessment and classroom assessment.
Assessment for Learning Assessment should be consistent with our understanding of learning in the subject matter – we need a model of learning to provide a guide for assessment design.
Model of Learning Effective assessment for learning requires a model of learning the subject matter. For example, research on the development of statistical reasoning, or on the development of specific skills and understanding of statistics content, provides the background needed to develop strong assessments.
Model of Learning A model of learning can describe the learning process and the developmental stages of understanding, knowing, and doing. A model of learning can distinguish novice learners from expert learners, identifying the nature of proficiency and the prerequisite skills for progression.
Model of Learning A model of learning can provide keys to the kinds of knowledge and skills that are required for achieving content standards or learning objectives. This allows us to describe the features of tasks that illuminate these aspects of knowledge and skills.
Learning Statistics Ideas from research on teaching and learning statistics include:
1. Importance of context
2. Importance of sequencing tasks and knowledge structures
3. Importance of using multiple representations of ideas and concepts
Essential: Blueprint
2. Create an assessment blueprint
a. Content to be covered
b. Cognitive tasks to be assessed
c. Format of items
d. Number of items (given time limits)
Quality MC Items
Content                  Knowledge   Comprehension   Application   Total
Central Tendency                                                    25%
Variability                                                         50%
Shape of Distribution                                               25%
Total                       20%          30%            50%        100%
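The blueprint percentages above can be turned into item counts once a test length is fixed. Below is a minimal sketch of that arithmetic; the function name, the largest-remainder rounding rule, and the 30-item test length are illustrative assumptions, not taken from the presentation.

```python
# Hypothetical helper: allocate item counts from blueprint percentages.
# The rounding rule (largest remainder) is one reasonable choice, not
# something prescribed by the presentation.

def allocate_items(weights, total_items):
    """Distribute total_items across categories in proportion to weights.

    Largest-remainder rounding ensures the counts sum exactly to total_items.
    """
    raw = {k: w * total_items for k, w in weights.items()}
    counts = {k: int(v) for k, v in raw.items()}
    # Give each leftover item to the category with the largest fractional part.
    leftover = total_items - sum(counts.values())
    for k in sorted(raw, key=lambda k: raw[k] - counts[k], reverse=True)[:leftover]:
        counts[k] += 1
    return counts

# Content-area weights from the blueprint slide, applied to a 30-item test.
content_weights = {"Central Tendency": 0.25,
                   "Variability": 0.50,
                   "Shape of Distribution": 0.25}
print(allocate_items(content_weights, 30))
```

The same call with the cognitive-level weights (Knowledge 20%, Comprehension 30%, Application 50%) would fill in the other margin of the blueprint.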
Essential: Item Quality
3. Design effective items & tasks
a. Use accepted principles of item writing
b. Try out new item types
c. Review items prior to use – peer review
Writing MC Items
- Questions should require students to consider novel contexts
- Use reference materials (graphical displays) that are authentic
- Options should be plausible – based on common errors or misconceptions
- Use only the number of options you need or can develop (three is sufficient)
Developing CR Items
- Use CR tasks to assess thinking and skills that cannot easily be measured by MC items (worth the cost and effort)
- Assumptions necessary to respond correctly should be related to the content demands of the assessment
- Avoid ambiguous task features – provide full opportunity for students to perform, and let them know what is expected
Assessments for Learning Formative assessments are specifically designed to support, enhance, and improve learning. Assessments are only formative if they can inform teaching and learning – requiring a feedback loop to students and teachers.
Formative Assessments:
- Provide an organizational framework for content, knowledge, and skills – organize content based on the structure of the assessment.
- Confirm “storage” of knowledge by solidifying the connections among different pieces of knowledge.
- Shape study behavior.
- Enhance academic motivation and effort through provision of feedback.
Formative Assessments:
- Enhance the quality and strength of skills by providing unique opportunities to display knowledge and skills.
- Explicitly articulate and communicate learning objectives and achievement targets – typically only vaguely defined by teachers.
- Confirm the importance of hard work, time spent studying, and effort.
Formative Assessments:
- Demonstrate the kinds of thinking and processes valued by the instructor.
- Allow students to communicate their thinking about the content and process, conveying understanding and misunderstanding.
- Confirm one’s own level of understanding and ability to respond on demand.
- Provide opportunities for students to identify their own strengths and weaknesses.
References
Embretson, S. E. (2007). Construct validity: A universal validity system or just another test evaluation procedure? Educational Researcher, 36(8).
Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2).
Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3).
Additional References
Popham, W. J. (2003). Test better, teach better: The instructional role of assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
Brookhart, S. (2009). Grading (2nd ed.). Allyn & Bacon.
Instructional Tools in Educational Measurement and Statistics (ITEMS) for School Personnel
Table 1. Most Commonly Cited Reasons that MN P–12 Educators Engaged in Data-Driven Decision Making (reason: count)
- Provides individualized effort and intervention to students: 33
- Determines if school goals are being met: 30
- Assesses current and future needs – proactive planning: 27
- Engages in continuous improvement: 19
- Identifies causes of problems: 17
- Decides what needs to change: 16
- Meets accountability requirements of NCLB: 14
- Aligns instruction (or work) to standards, goals, objectives: 13
- Places student or determines eligibility for special services: 11
Table 2. Most Commonly Cited Kinds of Data that MN P–12 Educators Used in Decision-Making Processes (kind of data: count)
- State test results: 26
- Informal assessments: 25
- Classroom assessments: 23
- Other achievement test data: 21
- District test results: 19
- Student background information: 17
- Behavior records: 16
- Classroom grades: 12
- Attendance: 11
- Previous student-school history information: 10
- Student surveys: 10
- Discipline referrals: 5
Table 3.1. Outcomes Achieved through Data-Driven Decision Making Cited by MN P–12 Educators
- Improved transition times to alleviate issues during passing time
- Instituted periodic data-review meetings
- Uncovered motivation issues from discrepancies between Minnesota Comprehensive Assessments and district assessments
- Improved summaries of great quantities of information through effective graphical displays
- Observed improvements in academic and behavioral outcomes
- Eliminated some individualized intervention programs
- Improved identification of students struggling with specific academic content
Table 3.2. Outcomes Achieved through Data-Driven Decision Making Cited by MN P–12 Educators
- Improved school improvement plans with stronger objectives
- Reduced special education referrals, particularly inappropriate referrals
- Improved communication with parents
- Increased graduation rates and number of National Merit Scholars
- Increased scores on Advanced Placement exams
- Achieved 100% graduation of English-language learners
- Identified and implemented an effective reading program, targeting the skills needed to improve performance
- Developed study skills plans for middle-school students
- Received positive parent feedback and response to data presented in informative ways
Table 4.1. Recommendations for Improving School-Based DDDM Cited by MN P–12 Educators
- Expand the use of achievement data
- Move data discussions from school Adequate Yearly Progress (a requirement of the No Child Left Behind legislation) to student performance, and focus more attention on those students with the greatest needs
- Provide ongoing training and support in DDDM for all school staff
- Implement building-wide progress-monitoring practices
- Improve reporting time of achievement data collected at the state and district levels
Table 4.2. Recommendations for Improving School-Based DDDM Cited by MN P–12 Educators
- Improve methods of data presentation for teachers: innovative data displays that make important features of the data meaningful and accessible
- Implement a Response to Intervention model to target interventions to the students who need them most
- Develop longitudinal models of student progress on class work that provide more diagnostic information about strengths and weaknesses
- Gather student engagement data to inform solution strategies