
1 R.K. Thornton Effective methods for the use, creation, analysis, and interpretation of short-answer student conceptual evaluations. Ronald Thornton, Professor of Physics and Education, Center for Science & Math Teaching, Tufts University

2 R.K. Thornton What was I thinking? I’ll paint your house and walk your dog as well.

3 R.K. Thornton In Defense of Thoughtful Multiple Choice Conceptual Assessment Ronald Thornton Professor of Physics and Education Center for Science & Math Teaching Tufts University

4 R.K. Thornton Modest Suggestions from a Chemically Illiterate Physicist. Ronald Thornton Professor of Physics and Education Center for Science & Math Teaching Tufts University

5 R.K. Thornton Center for Science and Math Teaching, Tufts University: Educational Research, Computer Tool Development, Curriculum Development, Teacher & Professor Education

6 R.K. Thornton Funding: NSF (National Science Foundation) and FIPSE (Fund for the Improvement of Postsecondary Education), US Department of Education

7 R.K. Thornton Wouldn’t it be nice if teachers could understand what students know from a simple conceptual evaluation, and knew what to do to help the student learn?

8 R.K. Thornton What use might this talk be? If you intend to develop a chemistry concept inventory, these suggestions may help you make it more useful. If you intend to use a chemistry concept inventory, these ideas should help you pick a useful one.

9 R.K. Thornton We have spent years creating effective learning environments for introductory science (physics) courses (curricula, tools, pedagogical methods, group structures) and developing methods of conceptual evaluation to measure student learning and guide our progress.

10 R.K. Thornton Why Multiple Choice? More easily administered to large numbers of students. Evaluation takes less time. Student responses can be reliably evaluated even by the inexperienced. Can be designed to guide instruction. With proper construction, student views can be evaluated from the pattern of answers, changes over time can be seen, and the frequency of student views can be measured. Multiple choice combined with open response can help the teacher/researcher explicate the student’s response.

11 R.K. Thornton Why not? Every “good” educator knows multiple choice questions are no good. Badly constructed multiple choice can give misleading results. Unless very carefully constructed, multiple choice will not identify student thinking. The choices may be inappropriate when used with different audiences.

12 R.K. Thornton First steps Why do you want to make (use) a conceptual evaluation? In what conceptual area do you want to know how students think?

13 R.K. Thornton Why? There are prerequisite areas of conceptual knowledge that students need to know to actually understand chemistry.

14 R.K. Thornton What? Three modest suggestions. Explore student beliefs in the atomic nature of matter (students may say atoms exist, but few believe it in any functional manner). Explore student beliefs in the dynamic nature of equilibrium (most students seem to have a static model). Explore student beliefs about the difference between heat energy and temperature (most students do not clearly make this distinction).

15 R.K. Thornton Our research has shown: Student conceptual responses can be context dependent. Student domains of applicability can be different from those of a scientist. Students (and scientists) can hold apparently inconsistent views simultaneously (and it doesn’t mean they are stupid). Conceptual transitions are not instantaneous. There is statistical evidence of a hierarchy of student conceptual views. You can do more with large-scale conceptual evaluation than just generating a single number.

16 R.K. Thornton Good Practice for the Construction of Conceptual Multiple Choice All answers, "right or wrong," should help evaluate student views. Derive the choices in the questions from student answers to free-response questions and from student interviews. Check to see that students almost always find an answer that they are satisfied with; random answers should be few. Ask similar questions in different representations. Check results with different student populations. (more)
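One of the checks above ("random answers should be few") is easy to automate once responses are tabulated. A minimal sketch, assuming a hypothetical response table in which a blank or an "other/none of these" mark means the student found no choice they were satisfied with:

```python
# Minimal sketch with made-up data: flag items where many students leave the
# question blank or mark "other", i.e. fail to find a satisfying choice.
import pandas as pd

# One row per student, one column per item; None = blank, "X" = "other".
responses = pd.DataFrame({
    "q1": ["A", "B", "A", None, "C"],
    "q2": ["D", "D", "X", "B", "B"],
})

unclassified = responses.isna() | (responses == "X")
rate_per_item = unclassified.mean()   # fraction of students per question
print(rate_per_item)                  # items well above a few percent deserve rewriting
```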

17 R.K. Thornton Good Practice (continued) Look at correlations among questions and use patterns to understand student thinking. Understand the implications of “correct” and “incorrect” answers for student performance on other tasks. Check for gender differences. Identify circumstances for “false positive” answers. If at all possible, construct the evaluation so it is useful to guide instruction.
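Two of these checks (correlations among questions, gender differences) lend themselves to a quick first pass once responses are scored. A minimal sketch on simulated data; the column names, scoring, and grouping below are assumptions for illustration, not the published FMCE analysis:

```python
# Minimal sketch: inter-item correlations and a gender comparison on total score.
import numpy as np
import pandas as pd

# scored: one row per student, one column per item; 1 = target response, 0 = otherwise.
rng = np.random.default_rng(0)
scored = pd.DataFrame(rng.integers(0, 2, size=(200, 10)),
                      columns=[f"q{i}" for i in range(1, 11)])
gender = pd.Series(rng.choice(["F", "M"], size=200))

item_corr = scored.corr()                    # correlations among questions
total = scored.sum(axis=1)
by_gender = total.groupby(gender).agg(["mean", "std", "count"])
print(item_corr.round(2))
print(by_gender)
```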

18 R.K. Thornton Multiple Choice Conceptual Evaluation Conceptual evaluation for kinematics (description of motion) and dynamics (force and motion, which is well characterized by Newton’s Laws): the Force & Motion Conceptual Evaluation (FMCE). Conceptual evaluation for heat energy and temperature: the Heat and Temperature Conceptual Evaluation (HTCE). Both developed by the Center for Science and Math Teaching at Tufts.

19 R.K. Thornton Using the FMCE as an example Student answers correlate well (well above 90%) with written short answers in which students explain the reason for their choices. Almost all students pick choices that we can associate with a relatively small number of student models (Conceptual Dynamics, R.K. Thornton, in ICUPE proceedings, edited by Redish). Testing with smaller student samples shows that those who can pick the “correct” graph under these circumstances are almost equally successful at drawing the graph correctly without being presented with choices.
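The agreement figure quoted here (well above 90%) comes from comparing each student's multiple-choice answer with the model a researcher assigns to that student's written explanation. A minimal sketch of that comparison with invented codes:

```python
# Minimal sketch with made-up codes: agreement between the model implied by a
# student's multiple-choice answer and the model coded from their written explanation.
mc_model      = ["newtonian", "impetus", "newtonian", "impetus", "newtonian"]   # from MC choice
written_model = ["newtonian", "impetus", "newtonian", "newtonian", "newtonian"]  # from short answer

agree = sum(a == b for a, b in zip(mc_model, written_model)) / len(mc_model)
print(f"MC/short-answer agreement: {agree:.0%}")   # the talk reports well above 90% on real data
```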

20 R.K. Thornton FMCE as example Because we are able to identify statistically most student views from the pattern of answers (and because there are very few random answers), we are also able to identify students with less common beliefs about motion and follow up with opportunities for interviews or open-ended responses to help us understand student thinking. The use of an easily administered and robust multiple choice test has also allowed us and others to track changes in student views of dynamics and to separate the effects of various curricular changes on student learning.
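Identifying views from the pattern of answers amounts to matching each student's choices against a small set of expected answer patterns and flagging anyone who matches none of them. A minimal sketch; the model names, answer keys, and matching rule below are invented for illustration and are not the published FMCE scheme:

```python
# Minimal sketch: map a student's pattern of choices onto a small set of views,
# and flag uncommon patterns for interview or open-ended follow-up.
model_keys = {
    "newtonian": {"q1": "A", "q2": "E", "q3": "B"},   # hypothetical keys
    "impetus":   {"q1": "C", "q2": "B", "q3": "D"},
}

def classify(answers, min_match=2):
    """Return the best-matching model, or None if no model matches well enough."""
    best, best_hits = None, 0
    for model, key in model_keys.items():
        hits = sum(answers.get(q) == choice for q, choice in key.items())
        if hits > best_hits:
            best, best_hits = model, hits
    return best if best_hits >= min_match else None

student = {"q1": "C", "q2": "B", "q3": "A"}
print(classify(student) or "uncommon pattern -> follow up with interview")
```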

21 R.K. Thornton FMCE as example Use multiple representations: The Force Graph questions require explicit knowledge of coordinate systems and graphs but require little reading. The Force Sled questions use natural language and make no explicit reference to a coordinate system or graphs.

22 R.K. Thornton [figure-only slide]

23 [figure-only slide]

24 Comparison with short answer As with all the questions on the test, students who answered correctly were also able to describe in words why they picked the answers they did. Statistically, one of the last questions to be answered in a Newtonian manner is the force on a cart rolling up a ramp as it reverses direction at the top (question 9).

25 R.K. Thornton Back to best practices. Consider: All answers, "right or wrong," should help evaluate student views. Derive the choices in the questions from student answers to free-response questions and from student interviews. Check to see that students almost always find an answer that they are satisfied with; random answers should be few. Look at correlations among questions and use patterns to understand student thinking.

26 R.K. Thornton An example from the H&T Conceptual Evaluation Distinguishes different student models for the relationship between heat and temperature.

27 [figure-only slide]

28 R.K. Thornton Results by category

29 R.K. Thornton What about one-number results? Not my favorite, but useful in some situations. Let’s compare the performance of 350 RPI students in the beginning physics course on the FMCE and the FCI.

30 R.K. Thornton [figure-only slide]

31 Still one number Let’s compare the performance of 350 RPI students in the beginning physics course on the FMCE and the FCI.

32 R.K. Thornton Correlation Coefficient 0.791

33 R.K. Thornton Correlation Coefficient 0.8309

34 R.K. Thornton Are the evaluations the same? Yes? Very high correlations (about 0.8 pre and post with different instructional methods). Yes? A high score on one implies a high score on the other. No? FCI fractional scores are almost always higher than FMCE scores. No? The evaluations are measuring different things. No? A low score on the FMCE (non-Newtonian student) does not imply a low score on the FCI. Let’s look at a group of non-Newtonian students.
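The comparison on this slide can be reproduced in outline from paired scores: compute the overall correlation, then ask whether a low FMCE score actually predicts a low FCI score. A minimal sketch on simulated scores; the 0.4 "non-Newtonian" cut and the simulated offset between the two tests are assumptions chosen only for illustration:

```python
# Minimal sketch with simulated scores: overall FMCE/FCI correlation, and the
# conditional question "does a low FMCE score imply a low FCI score?"
import numpy as np

rng = np.random.default_rng(1)
fmce = rng.uniform(0, 1, 350)                              # fractional scores, hypothetical
fci = np.clip(fmce + rng.normal(0.15, 0.12, 350), 0, 1)    # FCI tends to come out higher

r = np.corrcoef(fmce, fci)[0, 1]                           # Pearson correlation coefficient
low_fmce = fmce < 0.4                                      # assumed "non-Newtonian" cut
frac_low_fci_given_low_fmce = np.mean(fci[low_fmce] < 0.4)
print(f"r = {r:.2f}; P(low FCI | low FMCE) = {frac_low_fci_given_low_fmce:.2f}")
```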

35 R.K. Thornton [figure-only slide]

36 The conceptual threshold effect (looking at pre-post correlations)

37 R.K. Thornton [figure: "Pre/Post Evaluation: The Threshold Effect." FMCE post-test vs. pre-test (before instruction) scores, Tufts University calculus-based physics (N=181): Spring 1994 (N=48), Spring 1995 (N=37), Spring 1997 (N=43), Spring 1998 (N=53).]
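The threshold effect in this figure can be illustrated by plotting each student's post-test score against their pre-test score. A minimal sketch on simulated data; the 0.2 threshold and the gain model below are assumptions chosen only to make the shape of the plot visible:

```python
# Minimal sketch with simulated data: post- vs. pre-test scatter showing a
# conceptual threshold (students above the threshold gain much more).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
pre = rng.uniform(0, 1, 181)
threshold = 0.2                                    # assumed, for illustration only
post = np.where(pre > threshold,
                np.clip(pre + rng.uniform(0.2, 0.5, 181), 0, 1),
                np.clip(pre + rng.uniform(0.0, 0.2, 181), 0, 1))

plt.scatter(pre, post, s=10)
plt.xlabel("FMCE before instruction (fraction correct)")
plt.ylabel("FMCE after instruction (fraction correct)")
plt.title("Pre/post scores and a conceptual threshold (simulated)")
plt.show()
```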

38 R.K. Thornton [chart: University Physics Courses Before Instruction. Average college and university results: % of students understanding the concepts of velocity, acceleration, and force before instruction.]

39 R.K. Thornton [chart: University Physics Courses After Normal Instruction. Average college and university results: % of students understanding the concepts of velocity, acceleration, and force, before instruction and after traditional instruction.]

40 R.K. Thornton Physics & Science Courses Using New Methods We have evidence of substantial, persistent learning of such physical concepts by a large number of students in varied contexts in courses and laboratories that use methods I am about to describe. Such methods also work for students who have traditionally had less success in physics and science courses: women and girls, minority students, and those who are badly prepared.

41 R.K. Thornton [chart: University Physics Courses After New Methods. Average college and university results: % of students understanding the concepts of velocity, acceleration, and force, before instruction, after traditional instruction, and after new methods.]

42 R.K. Thornton Our Instructional and Assessment Philosophy: “I still don’t have all of the answers, but I’m beginning to ask the right questions.”

