
1 Assessing K-12 Students’ Learning Progression of Carbon Cycling Using Different Types of Items
2010 NARST Presentation
Written by: Jing Chen and Charles W. Anderson (Michigan State University); Jinnie Choi, Yong Sang Lee, Karen Draney (University of California, Berkeley)
Culturally relevant ecology, learning progressions and environmental literacy
Long Term Ecological Research Math Science Partnership, April 2010
Disclaimer: This research is supported by a grant from the National Science Foundation: Targeted Partnership: Culturally relevant ecology, learning progressions and environmental literacy (NSF-0832173). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

2 Assessing K-12 Students’ Learning Progression of Carbon Cycling Using Different Types of Items
Jing Chen, Charles W. Anderson (Michigan State University)
Jinnie Choi, Yong Sang Lee, Karen Draney (University of California, Berkeley)

3 PURPOSE OF THIS STUDY
Purpose:
- Analyze how well ordered multiple-choice (OMC) and multiple true-or-false (MTF) items differentiate students among achievement levels, by comparing students’ responses to these items with their responses to the same problems set in open-ended (OE) formats.
- Inform the development of OMC and MTF items so that they measure students’ achievement levels more reliably.
OMC and MTF items:
- OMC item: each of the possible answer choices is linked to a particular developmental level of student understanding of the construct being measured (Briggs, Alonzo, Schwab, & Wilson, 2006).
- MTF item: a set of related statements that students judge to be true or false.

4 RESEARCH QUESTIONS
- How well can OMC and MTF items diagnose students’ achievement levels?
- How well does our mixed-format set of items assess students’ performance?

5 THEORETICAL FRAMEWORK
Carbon cycle learning progression framework
Explaining progress variable:
  Level 4: Linking processes with matter and energy as constraints
  Level 3: Changes of molecules and energy forms with unsuccessful constraints
  Level 2: Force-dynamic accounts with hidden mechanisms
  Level 1: Macroscopic force-dynamic accounts
Naming progress variable:
  Level 4: Scientific statements
  Level 3: Scientific words of organic molecules, energy forms, and chemical change
  Level 2.5: Easier scientific words with mixed meanings
  Level 2: Hidden mechanism words
  Level 1.5: Easier hidden mechanism words
  Level 1: Words about actors, enablers, and results
Related studies:
- Test scores based on OMC items have greater validity than test scores based on traditional multiple-choice (TMC) items, without sacrificing reliability. There is a weak to moderate positive correlation between students’ scores on OMC items and their scores on traditional MC items (Briggs, Alonzo, Schwab, & Wilson, 2006).
- Compared to OE items, OMC items appear to provide more precise diagnoses of students’ learning progression levels and to be more valid, eliciting students’ conceptions more similarly to cognitive interviews (Alonzo & Steedle, 2008).
- There is inconsistency in students’ responses to items in different formats; students performed differently on multiple-choice and short-answer items (Steedle, 2006).

6 RESEARCH METHOD
Participants:
- Students in grades 4 to 12.
- 1004 assessments (454 pre, 550 post; 181 elementary, 377 middle, 446 high school) collected during 2008-2009.
Assessments:
- 48 items in total; this study uses 8 OMC items, 8 MTF items, and their paired OE items (32 items in total).
- Items address matter and energy transformation in 5 macroscopic events and in large-scale events.

7 RESEARCH METHOD
Data analysis:
- OMC items: cross-tabulation; Spearman rank-order correlation between OMC and OE levels.
- MTF items: cross-tabulation; point-biserial correlation between MTF and OE responses.
- All items: IRT partial credit model; IRT analysis combined with qualitative evaluation by a group of researchers.
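The correlation analyses listed on this slide can be reproduced with standard statistical libraries. Below is a minimal sketch using scipy; the score arrays are hypothetical placeholders, not the study’s data.

```python
# Minimal sketch of the correlation analyses described on the slide above,
# using hypothetical, already-coded responses (not the study's dataset).
from scipy.stats import spearmanr, pointbiserialr

# Hypothetical paired scores: level of the chosen OMC option vs. level of the paired OE response
omc_levels = [1, 2, 2, 3, 1, 3, 2, 3]
oe_levels  = [1, 1, 2, 3, 2, 3, 2, 2]

rho, p = spearmanr(omc_levels, oe_levels)
print(f"Spearman rank-order correlation: {rho:.3f} (p = {p:.3f})")

# Hypothetical MTF data: one dichotomous indicator (Yes = 1 / No = 0) vs. OE level
mtf_choice = [1, 0, 0, 1, 1, 0, 0, 1]
oe_level   = [1, 3, 2, 1, 2, 3, 3, 1]

r_pb, p_pb = pointbiserialr(mtf_choice, oe_level)
print(f"Point-biserial correlation: {r_pb:.3f} (p = {p_pb:.3f})")
```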

8 RESULTS OF OMC ITEMS
BREAD ITEM: A loaf of bread was left uncovered for two weeks. Three different kinds of mold grew on it. Assuming that the bread did not dry out, which of the following is a reasonable prediction of the weight of the bread and mold together?
A) The mass has increased, because the mold has grown. (Level 1)
B) The mass remains the same as the mold converts bread into biomass. (Level 2)
C) The mass decreases as the growing mold converts bread into energy. (Level 2)
D) The mass decreases as the mold converts bread into biomass and gases. (Level 3)
Cross-tabulation of OE response levels (rows) by OMC response levels (columns):
OE level | OMC level 1 (A) | OMC level 2 (B, C) | OMC level 3 (D)
0        | 1               | 3                  | 4
1        | 42              | 15                 | 2
2        | 12              | 43                 | 2
3        | 0               | 0                  | 12
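As an illustration of how an OMC item such as BREAD can be scored against the progression and cross-tabulated with its paired OE item, here is a minimal sketch. The option-to-level map follows the item above; the response lists are hypothetical, not the study’s data.

```python
# Score an OMC item via its option-to-level map and cross-tabulate against OE levels.
import pandas as pd

option_level = {"A": 1, "B": 2, "C": 2, "D": 3}   # OMC options keyed to progression levels

omc_choices = ["A", "D", "B", "C", "D", "A", "B", "D"]   # hypothetical OMC answers
oe_levels   = [1, 3, 2, 2, 2, 1, 1, 3]                   # hypothetical coded OE levels

df = pd.DataFrame({
    "omc_level": [option_level[c] for c in omc_choices],
    "oe_level": oe_levels,
})

# Rows: OE response level, columns: OMC response level (as in the table above)
print(pd.crosstab(df["oe_level"], df["omc_level"]))
```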

9 RESULTS OF OMC ITEMS
Correlation between OMC scores and paired OE item scores:
OMC item | Paired OE item | Sample size | Correlation
WTLOS    | WTLOS_N        | 306         | .272**
WTLOS    | WTLOS_E        | 306         | .342**
BODYTE   | BODYTE_N       | 138         | .313**
BODYTE   | BODYTE_E       | 138         | .476**
BREAD    | BREAD_N        | 137         | .581**
BREAD    | BREAD_E        | 137         | .651**
MATCH    | MATCH_N        | 283         | .217**
MATCH    | MATCH_E        | 283         | .258**
LEAVES   | LEAVES_N       | 306         | .300**
LEAVES   | LEAVES_E       | 306         | .320**
STOREN   | STOREN_E       | 287         | .360**
TROP     | TROP_E         | 269         | .411**
LIGHT    | LIGHT_E        | 289         | .219**
TOTAL    | TOTAL_N        |             | .654**
TOTAL    | TOTAL_E        |             | .755**
** Correlation is significant at the .01 level (2-tailed).
There are weak to moderate positive correlations between students’ OMC levels and their OE levels. The correlation between overall OMC scores and overall OE naming scores is .654, and the correlation between overall OMC scores and overall OE explaining scores is .755.

10 RESULTS OF OMC ITEMS
Reliability of OMC and OE items (5 items each, administered on high school form C; N of cases = 138):
- Cronbach’s alpha: OMC items = .441; paired OE items = .704.
- Expected reliability for a longer test (Spearman-Brown prophecy formula): 0.703 for a test with 15 OMC items.
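The Spearman-Brown figure on this slide can be checked directly: lengthening the 5-item OMC set by a factor of k = 3 (to 15 items) at alpha = .441 predicts a reliability of about .703. A small worked sketch:

```python
# Worked check of the Spearman-Brown prophecy value reported on the slide above.
def spearman_brown(reliability: float, k: float) -> float:
    """Predicted reliability when test length is multiplied by a factor k."""
    return k * reliability / (1 + (k - 1) * reliability)

alpha_omc_5 = 0.441                               # Cronbach's alpha for the 5 OMC items
print(round(spearman_brown(alpha_omc_5, 3), 3))   # -> 0.703, matching the slide
```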

11 RESULTS OF OMC ITEMS
[Figures: item difficulties for the OMC items and paired OE items (naming and explaining); test information comparison between a test of the 8 OMC items only and a test of the 8 paired OE items only.]
Comparing the item difficulties for the 8 OMC and 8 paired OE items (both naming and explaining), OMC items are easier than OE naming items, which are easier than OE explaining items. The test containing the 8 OE items provides more information in the higher ability range than the test containing the 8 paired OMC items.
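The test information comparison can be illustrated under a partial credit model, where an item’s information at ability theta equals the conditional variance of its score (with discrimination fixed at 1). The sketch below uses made-up thresholds, not the study’s estimates; they are chosen only so that the OE-style items have more score categories and higher difficulty, reproducing the qualitative pattern described above.

```python
# Illustrative test information comparison under the partial credit model (PCM).
import numpy as np

def pcm_probs(theta, deltas):
    """Category probabilities for a partial credit item with step thresholds `deltas`."""
    cum = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    expcum = np.exp(cum - cum.max())          # subtract max for numerical stability
    return expcum / expcum.sum()

def item_information(theta, deltas):
    """PCM item information = conditional variance of the item score at theta."""
    p = pcm_probs(theta, deltas)
    scores = np.arange(len(p))
    mean = (scores * p).sum()
    return (scores**2 * p).sum() - mean**2

# Hypothetical thresholds: OMC-like items are easier with fewer categories;
# OE-like items have more categories and higher thresholds.
omc_items = [[-1.0, 0.0]] * 8                  # 3-category items
oe_items  = [[-0.5, 0.5, 1.5]] * 8             # 4-category items

for theta in (-1.0, 0.0, 1.0, 2.0):
    info_omc = sum(item_information(theta, d) for d in omc_items)
    info_oe  = sum(item_information(theta, d) for d in oe_items)
    print(f"theta={theta:+.1f}  OMC test info={info_omc:.2f}  OE test info={info_oe:.2f}")
```

With these assumed thresholds, the OMC-style test yields more information at low ability and the OE-style test yields more at high ability, mirroring the comparison on the slide.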

12 RESULTS OF MTF ITEMS
PLANT ENERGY item: Which of the following are sources of energy for plants? Circle yes or no for each of the following:
a) Water Yes / No
b) Light Yes / No
c) Air Yes / No
d) Nutrients in soil Yes / No
e) They make their own energy Yes / No
Explain what you think is energy for plants.
Students’ choices for “sunlight”, “air”, and “plants create their own energy” do not correlate significantly with their levels on the paired OE item. Students who circled No for water (water0) and No for nutrients (nutrients0) are more likely to be at higher levels than students who circled Yes for these two indicators.

13 RESULTS OF MTF ITEMS: MTF ITEMS (INDICATORS)
Each MTF item and its answer options (A-G); Y/N entries mark the responses that served as indicators of students’ levels:
TREE GROWTH: SUN, WATER, AIR, SOIL
HUMAN ENERGY: SUN, WATER, NUTRIENTS, FOOD, EXERCISE, CO2, O2 (N N Y N N N)
PLANT ENERGY: SUN, WATER, AIR, NUTRIENTS, OWN ENERGY (N for WATER, N for NUTRIENTS)
HUMAN GROW: SUN, WATER, AIR, FOOD
RUN: FOOD, WATER, AIR, ENERGY, SLEEP (Y)
CARBON: SOIL, PLANT, CO2, INSECT, ENERGY
ANIMAL WINTER: HEAT, WATER + GAS, WASTE, OTHER MATERIALS (Y)
STONE: WARM, RUN, CLOTH, FOOD (Y)

14 RESULTS OF ALL ITEMS
The classical item discrimination index is the correlation between students’ scores on an item and their total scores. There is a moderate to strong correlation between students’ scores on most items and their total scores. The red dots mark the OMC items; they generally show lower correlations between item score and total score than the OE items do.
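The discrimination index described here is simply an item-total correlation. A minimal sketch, using a hypothetical score matrix rather than the study’s data:

```python
# Classical item discrimination: correlate each item score with the total score.
import numpy as np

scores = np.array([          # rows = students, columns = items (hypothetical data)
    [0, 1, 1, 2],
    [1, 1, 2, 3],
    [0, 0, 1, 1],
    [1, 2, 2, 3],
    [0, 1, 0, 1],
])

total = scores.sum(axis=1)
for j in range(scores.shape[1]):
    # Correlation of the item score with the total score, as described on the slide.
    # (Software often reports the corrected version, using the total minus the item.)
    r = np.corrcoef(scores[:, j], total)[0, 1]
    print(f"item {j + 1}: item-total correlation = {r:.2f}")
```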

15 Average person location for each score. All items fit the model well (for each item, the weighted MNSQ is within the range .77 to 1.33). This graph indicates that our assessment items differentiate fairly well among students of different abilities across levels.

16 CONCLUSIONS and IMPLICATIONS
Conclusions:
- OMC items are generally easier than OE items; OMC items do not measure students at high ability levels as precisely as OE items do.
- OMC items have weak to moderate correlations with OE items that measure the same content, and lower reliability compared to OE items.
- The choices in many MTF items were not associated with students’ levels on the OE format.
- The assessment as a whole differentiates students among levels well.
Implications:
1. Although it is hard to write OMC options at higher achievement levels without using “science-y” terminology, we still need to develop options for students at the highest levels.
2. The design of MTF items needs to be informed by more research, so that the options are better indicators that actually differentiate students among levels.
3. The number of OMC items in a test should be relatively large in order to obtain an amount of test information (test reliability) equivalent to a test with OE items or with mixed item types.

17 Thank you for your attention!

18 abcd 1851117 2316510 300 1 40010

19 WHY STUDY THIS?
- OMC and MTF items can be used in large-scale assessments more easily than OE items.
- OMC and MTF items are informed by educational research: the options in OMC items represent typical student understandings found in science education research, from the least to the most scientifically sophisticated accounts.
- OMC and MTF items are more diagnostic than traditional MC and true-or-false items and can be used to give teachers quick feedback.
- However, OMC and MTF items do not give students the opportunity to construct their own responses.
- Given these advantages and limitations, analyzing how well OMC and MTF items diagnose students’ achievement levels can guide their future use.

