Assessing K-12 Students’ Learning Progression of Carbon Cycling Using Different Types of Items 2010 NARST Presentation Written by: Jing Chen and Charles W. Anderson.

Similar presentations
Assessing Student Performance

Assessment types and activities
Test Development.
Assessment Assessment should be an integral part of a unit of work and should support student learning. Assessment is the process of identifying, gathering.
CH. 9 MEASUREMENT: SCALING, RELIABILITY, VALIDITY
DQC Workshop Detroit Airport Westin - November 20, 2010 Wright Room, Westin Hotel.
An inquiry learning progression for carbon-transforming processes Dr. Jenny Dauer Michigan State University Department Teacher Education.
© 2008 McGraw-Hill Higher Education. All rights reserved. CHAPTER 16 Classroom Assessment.
Measurement and Data Quality
PARCC Information Meeting FEB. 27, I Choose C – Why We Need Common Core and PARCC.
COPYRIGHT WESTED, 2010 Calipers II: Using Simulations to Assess Complex Science Learning Diagnostic Assessments Panel DRK-12 PI Meeting - Dec 1–3, 2010.
Teaching Experiments and a Carbon Cycle Learning Progression 2009 AERA Presentation Written by: Lindsey Mohan and Andy Anderson (Michigan State University)
A Cross-cultural Study: Comparing Learning Progression for Carbon-Transforming Processes of American and Chinese Students 2010 NARST Presentation Written.
A Framework for Inquiry-Based Instruction through
Elementary Assessment Data Update Edmonds School District January 2013.
Instrumentation.
Copyright © 2012 Wolters Kluwer Health | Lippincott Williams & Wilkins Chapter 14 Measurement and Data Quality.
Unanswered Questions in Typical Literature Review 1. Thoroughness – How thorough was the literature search? – Did it include a computer search and a hand.
Angela H. DeBarger PI Carlos Ayala PI Jim Minstrell PI Chemistry Facets: Formative Assessment to Improve Student Understanding in Chemistry This material.
Completion, Short-Answer, and True-False Items
Teachers’ Uses of Learning Progression- Based Tools for Reasoning in Teaching about Water in Environmental Systems Kristin L. Gunckel, University of Arizona.
Learning Progressions in Environmental Science Literacy Presentation at the Learning Progressions in Science (LeaPS) Conference, Iowa City, IA. Written.
Overview: Using Learning Progressions Research to Teach for Environmental Science Literacy Presentation Written by: Anderson, C.W (Michigan State University)
Karen Draney (University of California, Berkeley) Lindsey Mohan (Michigan State University) Philip Piety (University of Michigan) Jinnie Choi (University.
The Impact of Including Predictors and Using Various Hierarchical Linear Models on Evaluating School Effectiveness in Mathematics Nicole Traxel & Cindy.
Endangered Species Conservation as a Context for Understanding Student Thinking about Genetic Diversity 2011 NARST Presentation Written by: Shawna McMahon.
Learning Progressions Immersion Activity Power point presented to teachers during professional development to help teachers learn about learning progressions.
This research is supported in part by three grants from the National Science Foundation: Developing a research-based learning progression for the role.
EDU 8603 Day 6. What do the following numbers mean?
Carbon Dioxide Process Tool Power point Presentation to accompany Carbon Teaching Experiment Written by: Jonathon Schramm A, Eric Keeling B, Dijanna Figueroa.
Promise and Problems of Learning Progression-guided Interventions Hui Jin, Hyo Jeong Shin, Michele Johnson, Jinho Kim.
Understanding of Carbon Cycling: An Interview Study in the US and China 2009 NARST Presentation Written by : Hui Jin, Li Zhan, Charles W. Anderson (Michigan.
A comparison study on American and Chinese secondary students’ learning progression for carbon cycling in socio- ecological systems 2009 AERA Presentation.
School Improvement and Educational Accountability M. David Miller University of Florida.
ENVIRONMENTAL LITERACY PROJECT This research is supported in part by grants from the National Science Foundation: Developing a Research-based Learning.
An Analysis of Three States Alignment Between Language Arts and Math Standards and Alternate Assessments Claudia Flowers Diane Browder* Lynn Ahlgrim-Delzell.
Analyzing students’ learning performances in terms of practices for developing accounts Hui Jin, Jiwon Kim and Charles W. Anderson.
Copyright © 2008 Wolters Kluwer Health | Lippincott Williams & Wilkins Chapter 17 Assessing Measurement Quality in Quantitative Studies.
This research is supported in part by grants from the National Science Foundation: Developing a Research-based Learning Progression for the Role of Carbon.
This research is supported in part by three grants from the National Science Foundation: Developing a research-based learning progression for the role.
ENVIRONMENTAL LITERACY Developing a Learning Progression for Energy and Causal Reasoning in Socio-ecological Systems 2010 NARST Presentation Written by:
Student Understanding of Species Diversity & Function in Ecosystems 2011 NARST Presentation Written by: Jonathon Schramm and Brook Wilke (Michigan State.
The Effects of Teaching Materials and Teachers’ Approaches on Student Learning about Carbon- transforming Processes Li Zhan, Dante Cisterna, Jennifer Doherty,
Connections between students’ explanations and interpretations of arguments from evidence Allison L. Freed 1, Jenny M. Dauer 1,2, Jennifer H. Doherty 1,
ENVIRONMENTAL LITERACY PROJECT This research is supported in part by grants from the National Science Foundation: Developing a Research-based Learning.
Learning Progressions to Inform the Development of Standards 2009 AERA Presentation Written by: Charles W. (Andy) Anderson & Lindsey Mohan (Michigan State.
Copyright © 2014 Wolters Kluwer Health | Lippincott Williams & Wilkins Chapter 11 Measurement and Data Quality.
Investigating Mass Gain and Mass Loss Power point Power point to accompany Carbon Teaching Experiment Written by: Jonathon Schramm A, Eric Keeling B, Dijanna.
Defining an Occasion of Sensemaking
Long Term Ecological Research Math Science Partnership
Tracing Matter Process Tools
Validity and Reliability
Reliability & Validity
Charles W. Anderson Michigan State University
Powers of Ten—Air Power Point Presentation
Cellular Respiration Power point
American and Chinese Secondary Students’ Written Accounts of Carbon Cycling in Socio-ecological Systems Jing Chen1, Charles, W. Anderson1, & Xinghua Jin2.
General Level Structure
Written by: Jennifer Doherty, Cornelia Harris, Laurel Hartley
(Michigan State University)
Teaching Experiments and a Carbon Cycle Learning Progression
Jennifer Doherty, Karen Draney and Andy Anderson
Long Term Ecological Research Math Science Partnership
Powers of 10 Poster with animation
Validation of a Multi-Year Carbon Cycle Learning Progression
Supporting Material for the Biodiversity Teaching Experiment
Components of Productive Level 3 Reasoning
EPAS Educational Planning and Assessment System By: Cindy Beals
Vernier Probe Difficulties Power point (Sample Data)
2009 AERA Annual Meeting, San Diego
Presentation transcript:

Assessing K-12 Students’ Learning Progression of Carbon Cycling Using Different Types of Items
2010 NARST Presentation
Written by: Jing Chen and Charles W. Anderson (Michigan State University); Jinnie Choi, Yong Sang Lee, and Karen Draney (University of California, Berkeley)
Culturally relevant ecology, learning progressions and environmental literacy
Long Term Ecological Research Math Science Partnership, April 2010
Disclaimer: This research is supported by a grant from the National Science Foundation: Targeted Partnership: Culturally relevant ecology, learning progressions and environmental literacy (NSF ). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Assessing K-12 Students’ Learning Progression of Carbon Cycling Using Different Types of Items
Jing Chen, Charles W. Anderson (Michigan State University)
Jinnie Choi, Yong Sang Lee, Karen Draney (University of California, Berkeley)

PURPOSE OF THIS STUDY
Purpose:
- Analyze how well ordered multiple-choice (OMC) and multiple true-false (MTF) items differentiate students among achievement levels, by comparing students’ responses to these items with their responses to the same problems set in open-ended (OE) formats.
- Inform the development of OMC and MTF items so that they measure students’ achievement levels more reliably.
OMC and MTF items:
- OMC item: each of the possible answer choices is linked to a particular developmental level of student understanding of the construct being measured (Briggs, Alonzo, Schwab, & Wilson, 2006).
- MTF item: a set of related statements that students judge to be true or false.

RESEARCH QUESTIONS
- How well can OMC and MTF items diagnose students’ achievement levels?
- How well does an assessment with mixed item types measure students’ performance?

THEORETICAL FRAMEWORK
Carbon cycle learning progression framework

Explaining progress variable:
Level 4: Linking processes with matter and energy as constraints
Level 3: Changes of molecules and energy forms with unsuccessful constraints
Level 2: Force-dynamic accounts with hidden mechanisms
Level 1: Macroscopic force-dynamic accounts

Naming progress variable:
Level 4: Scientific statements
Level 3: Scientific words for organic molecules, energy forms, and chemical change
Level 2.5: Easier scientific words with mixed meanings
Level 2: Hidden mechanism words
Level 1.5: Easier hidden mechanism words
Level 1: Words about actors, enablers, and results

Related studies:
- Test scores based on OMC items have greater validity than test scores based on traditional multiple-choice (TMC) items, without sacrificing reliability; there is a weak to moderate positive correlation between students’ scores on OMC items and their scores on traditional MC items (Briggs, Alonzo, Schwab, & Wilson, 2006).
- Compared to OE items, OMC items appear to provide more precise diagnoses of students’ learning progression levels and to be more valid, eliciting students’ conceptions more similarly to cognitive interviews (Alonzo & Steedle, 2008).
- There is inconsistency in students’ responses to items in different formats: students performed differently on multiple-choice and short-answer items (Steedle, 2006).

RESEARCH METHOD
Participants and assessments:
- Students from grade 4 through high school; 454 pre-test and 550 post-test assessments (181 elementary, 377 middle, 446 high school)
- 8 OMC items, 8 MTF items, and their paired OE items were used in this study (32 items in total)
- Items address matter and energy transformation in 5 macroscopic events and in large-scale events

RESEARCH METHOD
Data analysis:
- OMC items: cross-tabulation; Spearman rank-order correlation between OMC and OE levels
- MTF items: cross-tabulation; point-biserial correlation between MTF and OE responses
- All items: IRT partial credit model; IRT analysis and qualitative evaluation by a group of researchers
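The two correlation statistics named above can be sketched in a few lines of pure Python. This is a hypothetical illustration, not the study’s actual analysis code, and all student data below are invented: Spearman rank-order correlation (for ordinal OMC vs. OE levels) is the Pearson correlation of the ranks, and point-biserial correlation (for a yes/no MTF choice vs. an OE score) is the Pearson correlation with a 0/1 variable.

```python
from math import sqrt

def ranks(values):
    """Assign average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rho: Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

def point_biserial(binary, scores):
    """Point-biserial: Pearson correlation with a 0/1 variable."""
    return pearson([float(b) for b in binary], scores)

# Invented levels for 8 students on one paired OMC/OE item:
omc_levels = [1, 2, 2, 3, 1, 3, 2, 4]
oe_levels  = [1, 2, 3, 3, 1, 4, 2, 4]
print(round(spearman(omc_levels, oe_levels), 3))

# Invented MTF choice (1 = circled "Yes" for water) vs. OE explaining score:
water_yes = [1, 1, 0, 0, 1, 0, 0, 1]
oe_score  = [1, 2, 3, 4, 1, 3, 2, 2]
print(round(point_biserial(water_yes, oe_score), 3))
```

In practice a statistics library would be used; the hand-rolled versions here just make the definitions concrete.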

RESULTS OF OMC ITEMS
BREAD ITEM: A loaf of bread was left uncovered for two weeks. Three different kinds of mold grew on it. Assuming that the bread did not dry out, which of the following is a reasonable prediction of the weight of the bread and mold together?
A) The mass has increased, because the mold has grown. (Level 1)
B) The mass remains the same as the mold converts bread into biomass. (Level 2)
C) The mass decreases as the growing mold converts bread into energy. (Level 2)
D) The mass decreases as the mold converts bread into biomass and gases. (Level 3)
[Cross-tabulation table: OE response levels vs. OMC response levels, with OMC level 1 (A), level 2 (B, C), and level 3 (D)]
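The cross-tabulation step for an item like this amounts to counting how often each OE level co-occurs with each OMC option level. A minimal sketch, with invented student responses (the real cell counts are not shown here):

```python
from collections import Counter

# (OE level, OMC level) pairs for ten hypothetical students on the BREAD item
pairs = [(1, 1), (1, 1), (2, 2), (2, 1), (2, 2),
         (3, 2), (3, 3), (3, 3), (2, 3), (1, 2)]

crosstab = Counter(pairs)

# Print a small table: rows = OE level, columns = OMC level (A=1, B/C=2, D=3)
levels = [1, 2, 3]
print("OE\\OMC " + " ".join(f"{c:>3}" for c in levels))
for r in levels:
    print(f"{r:>6} " + " ".join(f"{crosstab[(r, c)]:>3}" for c in levels))
```

Counts concentrated on the diagonal would indicate that the OMC options diagnose the same levels as the OE responses.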

RESULTS OF OMC ITEMS
Correlation between OMC scores and paired OE item scores:
[Table: sample size and correlation for each OMC item (WTLOS, BODYTE, BREAD, MATCH, LEAVES, STOREN, TROP, LIGHT) paired with its OE naming (_N) and explaining (_E) items; ** = correlation significant at the .01 level, 2-tailed]
There are weak to moderate positive correlations between students’ OMC levels and their OE levels. The correlation between overall OMC scores and overall OE naming scores is .654, and the correlation between overall OMC scores and overall OE explaining scores is .755.

RESULTS OF OMC ITEMS
Reliability of OMC and OE items:
- OMC: 5 items administered in high school form C
- OE: 5 paired items administered in high school form C
- Cronbach’s alpha (N of cases = 138)
- Expected reliability for a longer test, using the Spearman-Brown prophecy formula (a test with 15 OMC items)
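The two reliability quantities on this slide can be computed as follows. This is an illustrative sketch with invented item scores, not the study’s data: Cronbach’s alpha from per-item score vectors, and the Spearman-Brown prophecy formula for predicting the reliability of a lengthened test (e.g. going from 5 to 15 OMC items is a length factor of 3).

```python
def cronbach_alpha(item_scores):
    """item_scores: one inner list of student scores per item."""
    k = len(item_scores)
    n = len(item_scores[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(item) for item in item_scores)
    totals = [sum(item[p] for item in item_scores) for p in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

def spearman_brown(rel, length_factor):
    """Predicted reliability when the test is lengthened by length_factor."""
    return (length_factor * rel) / (1 + (length_factor - 1) * rel)

# Five hypothetical OMC items scored 0-3 for six students:
scores = [
    [1, 2, 2, 3, 0, 1],
    [1, 2, 3, 3, 1, 1],
    [0, 2, 2, 3, 1, 2],
    [1, 1, 2, 2, 0, 1],
    [1, 2, 3, 3, 1, 2],
]
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
print(round(spearman_brown(alpha, 3.0), 3))  # predicted alpha for 15 items
```

The prophecy formula shows why the implications below call for more OMC items: a modest per-item reliability rises substantially as the test gets longer.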

RESULTS OF OMC ITEMS
[Figures: item difficulties for OMC items and paired OE items (naming and explaining); test information comparison between a test of the 8 OMC items only and a test of the 8 paired OE items only]
Comparing the item difficulties for the 8 OMC items and the 8 paired OE items (both naming and explaining), OMC items are easier than OE naming items, which in turn are easier than OE explaining items. The test containing the 8 OE items yields more information at the higher ability range than a test containing the 8 paired OMC items.
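The IRT model behind these difficulty and information estimates is the partial credit model named in the methods. A minimal sketch of its category probabilities, with invented step difficulties (the study’s actual parameter estimates are not reproduced here):

```python
from math import exp

def pcm_probs(theta, deltas):
    """Partial credit model: P(X = 0..m) for ability theta and step
    difficulties delta_1..delta_m.

    The numerator for category x is exp(sum_{j<=x} (theta - delta_j));
    category 0 has an empty sum, i.e. numerator 1.
    """
    nums = [1.0]
    s = 0.0
    for d in deltas:
        s += theta - d
        nums.append(exp(s))
    total = sum(nums)
    return [v / total for v in nums]

# A polytomous item scored 0-3 with made-up step difficulties:
deltas = [-1.0, 0.2, 1.5]
for theta in (-2.0, 0.0, 2.0):
    print(theta, [round(p, 3) for p in pcm_probs(theta, deltas)])
```

As theta increases, probability mass shifts toward higher categories; test information at each ability level is then built up from these category probabilities across items, which is what the information-curve comparison on this slide summarizes.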

RESULTS OF MTF ITEMS
PLANT ENERGY ITEM: Which of the following are sources of energy for plants? Circle yes or no for each of the following:
a) Water Yes / No
b) Light Yes / No
c) Air Yes / No
d) Nutrients in soil Yes / No
e) They make their own energy Yes / No
Explain what you think is energy for plants.
Students’ choices for “sunlight,” “air,” and “plants create their own energy” do not correlate significantly with their levels on the paired OE item. Students who circled No for water (water0) and No for nutrients (nutrients0) are more likely to be at higher levels than students who circled Yes for these two indicators.

RESULTS OF MTF ITEMS
MTF items and their indicators (A-G); only some Y/N answer keys survive in this transcript:

ITEM            INDICATORS                                    KEY (as preserved)
TREE GROWTH     SUN, WATER, AIR, SOIL
HUMAN ENERGY    SUN, WATER, NUTRIENTS, FOOD, EXERC., CO2      N N Y N N N
PLANT ENERGY    SUN, WATER, AIR, NUTRIENTS, OWN ENERGY        N N
HUMAN GROW      SUN, WATER, AIR, FOOD
RUN             FOOD, WATER, AIR, ENERGY, SLEEP               Y
CARBON          SOIL, PLANT, CO2, INSECT, ENERGY
ANIMAL WINTER   HEAT, WATER + GAS, WASTE, OTHER MATERIALS     Y
STONE           WARM, RUN, CLOTH, FOOD                        Y

RESULTS OF ALL ITEMS
The classical item discrimination index is the correlation between students’ scores on an item and their total scores. There is a moderate to strong correlation between students’ scores on most items and their total scores. The red-dot items are the OMC items; they generally show a lower correlation between item score and total score than the OE items do.
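The discrimination index described above can be sketched directly: correlate each item’s scores with the total score. This is an illustration with invented scores; the corrected variant, which subtracts the item from the total so the item is not correlated with itself, is a common refinement and an assumption here, not necessarily what the study used.

```python
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def discrimination(item, all_items, corrected=True):
    """Item-total correlation for one item, given all item score rows."""
    n = len(item)
    totals = [sum(row[p] for row in all_items) for p in range(n)]
    if corrected:
        # remove the item's own contribution to the total
        totals = [t - s for t, s in zip(totals, item)]
    return pearson(item, totals)

# Six students, three hypothetical items scored 0-3:
items = [
    [0, 1, 2, 2, 3, 3],
    [1, 1, 2, 3, 2, 3],
    [0, 2, 1, 2, 3, 2],
]
for i, it in enumerate(items):
    print(i, round(discrimination(it, items), 3))
```

A low value for an OMC item, as reported on this slide, means students’ option choices track overall ability less closely than their OE responses do.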

[Figure: average person location for each score]
All items fit the model well (for each item, the weighted MNSQ is within the range .77 to 1.33). The graph indicates that our assessment items can fairly differentiate students of different abilities among levels.

CONCLUSIONS AND IMPLICATIONS
Conclusions:
- OMC items are generally easier than OE items, and they do not measure students at high ability levels as precisely as OE items do.
- OMC items have weak to moderate correlations with OE items that measure the same content, and lower reliability than OE items.
- The choices in many MTF items were not associated with students’ levels in the OE format.
- The assessment as a whole differentiates students among levels well.
Implications:
1. Though it is hard to write OMC options at higher achievement levels without using “science-y” terminology, we still need to develop options for students at high levels.
2. The design of MTF items needs to be informed by more research, so that the options are better indicators that actually differentiate students among levels.
3. The number of OMC items in a test should be relatively large in order to obtain an amount of test information (test reliability) equivalent to a test with OE items or with mixed item types.

Thank you for your attention!


WHY STUDY THIS?
- OMC and MTF items can be used in large-scale assessments more easily than OE items.
- OMC and MTF items are informed by educational research: options in OMC items represent typical student understandings, from the least to the most scientifically sophisticated accounts found in science education research.
- OMC and MTF items are more diagnostic than traditional multiple-choice and true-false items, and can be used to give teachers quick feedback.
- However, OMC and MTF items do not give students the opportunity to construct their own responses.
- Given these advantages and limitations, analyzing how well OMC and MTF items diagnose students’ achievement levels can direct their future use.