The Genetics Concept Assessment: a new concept inventory for genetics
Michelle K. Smith, William B. Wood, and Jennifer K. Knight
Science Education Initiative and Dept. of MCD Biology, University of Colorado, Boulder (1566/B17)

Presentation transcript:

Abstract

We have designed, developed, and validated a 25-question Genetics Concept Assessment (GCA) to test achievement of nine broad learning goals in majors and non-majors undergraduate genetics courses. Written in everyday language with minimal jargon, the GCA is intended for use as a pre-test and post-test (embedded in the final exam) to measure student learning gains. The GCA has been reviewed by genetics experts, validated by student interviews, and taken by over 600 students at three institutions. Normalized learning gains on the GCA are positively correlated with averaged exam scores, suggesting that the GCA measures understanding of topics relevant to instructors. Statistical analysis of our results shows that differences in the item difficulty (P) and item discrimination (D) index values between different questions on pre- and post-tests are useful in combination for identifying concepts that are well or poorly learned during a course.

Table 1. Steps in development of the GCA
1. Review the literature on common misconceptions in genetics.
2. Interview genetics faculty and develop a set of learning goals that most instructors would consider vital to understanding genetics.
3. Develop and administer a pilot assessment based on known and perceived misconceptions.
4. Reword jargon, replace distracters with student-supplied incorrect answers, and rewrite questions answered correctly by more than 70% of students on the pre-test.
5. Validate and revise the GCA through 33 student interviews and input from 10 genetics experts at several institutions.
6. Give the current version of the GCA to a total of 607 students in both majors and non-majors genetics courses at three different institutions.
7. Statistically evaluate the GCA by measuring item difficulty, item discrimination, and reliability.

Physics instruction has been improved by the use of carefully developed multiple-choice concept inventories (CIs) that assess student conceptual understanding. Such assessments have been used to show that student learning can be increased substantially by substituting interactive group activities in class for traditional lectures. For introductory biology, a few CIs are available for comparison of different instructional approaches, but more are needed for different subdisciplines, such as genetics. We developed the Genetics Concept Assessment (GCA) [1] for this purpose. The GCA assesses understanding of basic concepts likely to be taught in courses for both majors and non-majors. It is designed to be administered at the start of a course as a pre-test and at the end of the course as a post-test, in order to measure student learning gains. [2] The development, statistical evaluation, and some uses of the GCA are described in the tables and figures below.

Fig. 1. Correlations of pre-test scores, post-test scores, and learning gains vs. average exam score in the majors genetics course at U. Colorado. The equation of the trend line and R² values are displayed on each graph.

Figure 2. Item difficulty index (P) values for each question on the GCA pre-test and post-test, based on combined responses from 607 students in five different genetics courses. Questions are grouped according to learning goal (LG) (see Table 2). P = fraction of correct answers; lower values indicate more difficult questions. Increases in P from pre-test to post-test indicate learning of the corresponding concepts.

Figure 3. Item discrimination index (D) values for questions on the GCA pre-test and post-test. Results were calculated from the same data set as in Figure 2.
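The two item statistics plotted in Figures 2 and 3 can be computed directly from scored responses. A minimal sketch, with made-up data; note that the poster does not specify how students were grouped for D, so the common top/bottom 27% split by total score is assumed here:

```python
def item_difficulty(responses, item):
    """Item difficulty index P for one question: the fraction of students
    answering it correctly. `responses` is a list of per-student 0/1 answer
    lists (1 = correct)."""
    return sum(s[item] for s in responses) / len(responses)

def item_discrimination(responses, item, frac=0.27):
    """Item discrimination index D for one question: P(top group) minus
    P(bottom group), where the groups are the top and bottom `frac` of
    students ranked by total score (an assumed grouping; the poster does
    not state how D was calculated)."""
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, int(len(ranked) * frac))
    def p(group):
        return sum(s[item] for s in group) / len(group)
    return p(ranked[:k]) - p(ranked[-k:])

# Hypothetical scored responses: 10 students x 3 questions (1 = correct)
students = [
    [1, 1, 1], [1, 1, 1], [1, 1, 0], [1, 0, 1], [1, 0, 0],
    [0, 1, 0], [1, 0, 0], [0, 0, 1], [0, 0, 0], [0, 0, 0],
]
print(item_difficulty(students, 0))      # 0.6
print(item_discrimination(students, 0))  # 1.0
```

A question with D near 1.0, as here, is answered correctly almost only by the strongest students, which is the pattern described below for concepts that weaker students have not yet mastered.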
Questions with higher D values discriminate more effectively between students whose overall test scores identify them as strong or weak. Questions that show high D values on the pre-test (only strong students answer them correctly) and low D values on the post-test (most students answer correctly, e.g., #8) correspond to concepts on which most students gained understanding during the course. Questions with high D values on both the pre- and post-tests (e.g., #12) correspond to concepts that primarily only the stronger students understood at the end of the course, suggesting a need for improved instruction. Questions are grouped according to learning goal (LG) (see Table 2).

Table 2. Learning goals (see table to the right of the main poster).

Table 3. Expert opinions on the 25 GCA questions. For each query to the experts, the table reports the number of questions reaching >90%, >80%, and >70% expert agreement. The queries were:
- The question tests achievement of the specified learning goal.
- The information in the question is scientifically accurate.
- The question is clear and precise.

Table 4. Mean pre-test, post-test, and learning gain scores for students, TAs/LAs, and genetics experts [1]

Group      n [2]   Mean pre-test   Mean post-test   Mean learning gain
Students     607         % (0.6)      74% (0.7)         57% (1.0)
TAs/LAs       18    77% (3.7)         88% (3.8)         40% (12.1)
Experts       10    NA                93% (5.2)         NA

[1] All scores are shown +/- SE (in parentheses).
[2] Number of people who took the GCA. Students were enrolled in either majors or non-majors genetics courses at three different institutions. TAs and LAs were graduate and undergraduate students, respectively, at CU. Scores of genetics experts from several institutions who took the GCA are included for comparison.

Notes:
1. A complete description of this project, including references to the work cited above, can be found in Smith, M. K., Wood, W. B., and Knight, J. K. (2008) CBE-Life Sciences Education 7: (Also published in the 2008 Highlights Issue.)
2. All learning gains reported here are normalized so that gains for students with differing prior knowledge (pre-test scores) can be meaningfully compared, using the formula

    g = 100 x (post-test score - pre-test score) / (100 - pre-test score)

where g is the normalized learning gain as a percent of the possible gain.

Table 5. Statistical criteria for evaluating assessments

Criterion                       Accepted range [1]   GCA pre-test   GCA post-test
Reliability [2]                 NA
Item difficulty index (P)
Item discrimination index (D)   >

[1] Note that the accepted ranges for P and D are for standardized tests such as the SAT. These criteria are not appropriate for evaluation of CIs.* For example, it is precisely the wide range of D values that makes this statistic useful in judging the success of instruction (Figures 3 and 4).
[2] Coefficient of stability, test-retest method.
* See reference 1 under Notes, lower left.

Conclusion. The GCA is a validated, reliable conceptual assessment instrument that can be used by genetics instructors to measure their students' learning gains as well as to identify strengths and weaknesses in their teaching approaches for specific concepts. We hope that use of the GCA will become widespread, and we will be happy to supply the complete set of questions, with answers, on request (see right panel). We will continue to revise the GCA as we obtain feedback from GCA users and will validate significant revisions before they are incorporated. A growing number of institutions are planning to use the GCA, and analysis of combined data from these users will be posted online at

Figure 4. Using item difficulty (P) and item discrimination (D) index values to compare two modes of instruction for effectiveness. Bars show P and D values for a question on mitochondrial inheritance in two majors courses taught by different instructors.
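The normalized gain formula in note 2 is easy to compute; a small sketch (the scores below are illustrative only, not data from the study):

```python
def normalized_gain(pre, post):
    """Normalized learning gain as a percent of the possible gain:
    g = 100 * (post - pre) / (100 - pre), with scores given in percent."""
    if pre >= 100:
        raise ValueError("pre-test score must be below 100%")
    return 100 * (post - pre) / (100 - pre)

# Two hypothetical students with the same 30-point raw gain but
# different prior knowledge: the second earned all of the available gain.
print(normalized_gain(40, 70))   # 50.0
print(normalized_gain(70, 100))  # 100.0
```

Dividing by the available headroom (100 minus the pre-test score) is what makes gains comparable across students who start from different baselines.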
Students in the two courses showed similar P and D values on the pre-test, but on the post-test, students in Course 1 had a significantly (p < 0.05) higher P value and a lower D value than students in Course 2. These results suggest that Course 1 was more effective in promoting student learning of the concept addressed in this question (see legends to Figures 2 and 3).
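The poster does not state which statistical test produced the p < 0.05 comparison of P values between the two courses. One plausible choice for comparing two proportions of correct answers is a two-proportion z-test, sketched here with illustrative numbers (not the study's data):

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test: one plausible way to ask whether
    item difficulty P differs between two courses of sizes n1 and n2.
    Returns (z statistic, two-sided p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal-tail p-value via the error function (Phi(x) = 0.5*(1+erf(x/sqrt(2))))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative only: 80% vs 60% correct on one question, 150 students each
z, p = two_proportion_z(0.80, 150, 0.60, 150)
print(round(z, 2), p < 0.05)  # 3.78 True
```

With real GCA data one would run such a test per question, keeping in mind that repeating it across 25 questions invites multiple-comparison corrections.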