Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS)
Benjamin Zwickl, Heather Lewandowski & Noah Finkelstein
University of Colorado Physics Department, JILA

How do our labs impact students? “Traditional introductory laboratory courses generally do not capture the creativity of STEM disciplines. They often involve repeating classical experiments to reproduce known results, rather than engaging students in experiments with the possibility of true discovery. Students may infer from such courses that STEM fields involve repeating what is known to have worked in the past rather than exploring the unknown.” - PCAST, Engage To Excel (2012)

New Lab Environments
Emphasis: Error analysis (Guide to the Expression of Uncertainty in Measurement)
Emphasis: Conceptual learning (RealTime Physics, single-photon experiments)
Emphasis: Scientific abilities (I.S.L.E.)

Assessment Alternatives
How much did the following aspects of the course help you in your learning?
As a result of your work in this class, what gains did you make in your understanding of each of the following concepts?
As a result of your work in this class, what gains did you make in the following skills?
Customizable!

Assessment Alternatives
Embed assessment in your course activities:
1. Define proficiency in different learning goals
2. Create tasks that align with those goals
3. Students carry out the tasks and get feedback

Towards a Common Evaluation Tool
2004: CLASS (Colorado Learning Attitudes about Science Survey)
2012: E-CLASS
1) A common evaluation tool that can be applied to the existing variety of lab experiences
2) Assesses an aspect of the lab that can be evaluated easily
3) Aligns with a common learning goal

Transformation Model
What should students learn? → Consensus learning goals
What are students learning? → Assessments
What approaches improve student learning? → Research-based curriculum development
Department faculty and PER postdocs

Development of Learning Goals: 21 faculty, literature, community

E-CLASS
1) A new survey focused on experimental physics
2) Validated for all levels of university students
3) A common evaluation tool that can be applied to a variety of lab experiences across the country

E-CLASS Dimensions
Affect, argumentation, confidence, experimental design, math-physics-data connection, modeling the measurement system, physics community, purpose of labs, statistical uncertainty, systematic error, troubleshooting

Validation: Students
19 interviews with college students at all levels. Students took the survey and then explained how they answered it.
Student concern: “What do I think vs. what should I think?” → Add: “What would a physicist say?”
Student concern: is that about the lab class or their research lab? → Modify: “What would a physicist say about their research?”
Student concern: what about theorists? → Final: “What would experimental physicists say about their research?”

E-CLASS Design
Paired questions
Actionable evidence for the instructor

Example: Modeling the measurement system (personal/expert pairs and impressions-of-course items shown)
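
To make the paired-question design concrete, here is a minimal sketch in Python of how one statement can yield a personal response and an expert-perspective response, and how each can be checked against the faculty consensus. The field names, function names, and the example statement are hypothetical illustrations, not the actual E-CLASS instrument or analysis code.

```python
# Hypothetical sketch of an E-CLASS-style paired item (not the actual survey code).
# Each statement is asked twice: "What do YOU think?" and
# "What would experimental physicists say about their research?"

LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def collapse(response: str) -> int:
    """Collapse a 5-point Likert response to -1 (disagree), 0 (neutral), +1 (agree)."""
    value = LIKERT[response]
    return (value > 3) - (value < 3)

def score_pair(personal: str, expert_view: str, expert_consensus: int) -> dict:
    """Compare a student's personal view, and their guess of the expert view,
    with the faculty consensus (+1, 0, or -1) for one statement."""
    return {
        "personal_matches_expert": collapse(personal) == expert_consensus,
        "knows_expert_view": collapse(expert_view) == expert_consensus,
    }

# Illustrative statement and consensus (made up for this sketch):
# "I try to understand how the measurement apparatus works," consensus = agree (+1).
print(score_pair(personal="neutral", expert_view="agree", expert_consensus=+1))
# -> {'personal_matches_expert': False, 'knows_expert_view': True}
```

The gap between the two flags is what makes the pairing actionable for an instructor: a student can know the expert view without sharing it personally.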

Validation: Faculty
Six faculty with active experimental research labs took the 30-statement survey to establish the expert view.
22 statements: 100% agreement
5 statements: 5 matching responses + 1 neutral
2 statements: 4 matching responses
  “Working in a group is an important part of doing physics experiments.”
  “When I encounter difficulties in the lab, my first step is to ask an expert, like the instructor.”
1 statement did not have consensus (2 Agree, 3 Neutral, 1 Disagree):
  “Nearly all students are capable of doing a physics experiment if they work at it.”
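
As a rough illustration of the consensus check described above, the sketch below tallies faculty agreement per statement. It is only partly grounded in the slide: the 2 Agree / 3 Neutral / 1 Disagree breakdown is quoted above, while the other responses and the consensus threshold of 4 are assumptions made for illustration.

```python
# Sketch: tally faculty responses per statement to establish the "expert view".
from collections import Counter

faculty_responses = {
    "Working in a group is an important part of doing physics experiments.":
        ["agree", "agree", "agree", "agree", "neutral", "disagree"],
    "Nearly all students are capable of doing a physics experiment if they work at it.":
        ["agree", "agree", "neutral", "neutral", "neutral", "disagree"],
}

for statement, responses in faculty_responses.items():
    view, n_matching = Counter(responses).most_common(1)[0]
    print(f"{n_matching}/{len(responses)} chose '{view}': {statement}")
    if n_matching < 4:  # illustrative consensus threshold, not the authors' criterion
        print("   -> no clear expert consensus; candidate for revision or removal")
```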

Final Survey
23 core concepts/practices (evaluated in all 4 quadrants)
7 additional ideas related to affect, confidence, and the purpose of labs (assessed only using the personal/expert pairs)
Pre-test: 60 items (5-7 minutes)
Post-test: 106 items (10-15 minutes)
[Diagram: each core concept is asked from the Personal and Expert views; the post-test adds the Grade-importance and Frequency-of-practice views]
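
The item counts above are consistent with each statement being asked from several views: 23 core statements × 2 views plus 7 additional ideas × 2 views gives the 60 pre-test items, and 23 × 4 plus 7 × 2 gives the 106 post-test items. The sketch below lays out that structure; the prompt wordings are paraphrases assumed for illustration, not the deployed survey text.

```python
# Sketch (hypothetical structure and wording): how one core concept/practice
# expands into the question "quadrants" of the pre- and post-test.
PRE_VIEWS = ["personal", "expert"]                         # asked before the course
POST_VIEWS = ["personal", "expert", "grade", "frequency"]  # asked after the course

def build_items(core_statement: str, post: bool) -> list:
    """Return the question variants generated for one core statement."""
    prompts = {
        "personal":  f"What do YOU think? {core_statement}",
        "expert":    f"What would experimental physicists say about their research? {core_statement}",
        "grade":     f"How important was this for earning a good grade? {core_statement}",
        "frequency": f"How often did you do this in this class? {core_statement}",
    }
    return [prompts[view] for view in (POST_VIEWS if post else PRE_VIEWS)]

# 23 core statements * 4 views + 7 affect/confidence/purpose ideas * 2 views = 106 post-test items
for question in build_items("I try to understand how the measurement apparatus works.", post=True):
    print(question)
```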

Example: Question asking (personal, expert, grade-importance, and practice-frequency responses shown)

Example: Research confidence (personal and expert responses shown)

Example: Experimental confidence (personal and expert responses shown)

Initial Implementation (Spring 2012, post-test)
Statement: “Doing error analysis (such as calculating the propagated error) usually helps me understand my results better.”
1) Students understand that physicists see error analysis as important, but they do not find it helpful themselves.
2) Is it unhelpful because they are not evaluated on this aspect?

Perceived Importance for Grade
1) Students rated error analysis as important for earning a good grade, but not helpful for understanding their data.
2) Maybe it would have been helpful if they had actually engaged in the process frequently?

Practice Frequency for Class
1) Students engaged in the practice frequently, consistent with their perception of its importance for the grade.
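
Pulling the last three slides together, the sketch below summarizes one statement across the three post-course views for a class. The numbers are made up purely to mirror the qualitative pattern reported above (low personal helpfulness, high grade importance, high practice frequency).

```python
# Sketch with made-up data: summarize the error-analysis statement across a class.
def fraction_agree(responses):
    """Fraction of students answering 4 or 5 on a 5-point scale."""
    return sum(r >= 4 for r in responses) / len(responses)

# Hypothetical post-course responses for
# "Doing error analysis ... usually helps me understand my results better."
personal  = [2, 3, 2, 4, 3, 2, 3, 2]   # personally found it helpful
grade     = [5, 4, 5, 4, 4, 5, 4, 5]   # important for earning a good grade
frequency = [5, 4, 4, 5, 4, 4, 5, 4]   # did it frequently in the class

for label, data in [("helpful to me", personal),
                    ("important for grade", grade),
                    ("done frequently", frequency)]:
    print(f"{label:>20}: {fraction_agree(data):.0%} of students agree")
```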

Offer to the Community
We would like to offer the survey to as many students as possible in diverse environments and courses.
We will…
1) provide the online administration of the survey,
2) deal with IRB approval for the survey at CU-Boulder,
3) give you a report with analysis of the anonymized students’ responses, and
4) provide a list of students that completed the survey.
All you have to do is…
1) contact us as soon as possible before the semester starts,
2) give a very small amount of credit or extra credit to incentivize participation, and
3) fill out a short factual survey about your university, department, and course.

Conclusions
Manuscript submitted to PERC: “Development and Validation of the Colorado Learning Attitudes about Science Survey for Experimental Physics”
spot.colorado.edu/~bezw0974/E-CLASS Survey.html
1) The E-CLASS survey has the promise to become a broadly useful assessment tool for measuring one class of learning outcomes.
2) We urge the community of physics lab instructors to implement this easy and hopefully informative assessment.
3) There are many other learning outcomes that must be measured, and we are actively working on those assessment tools.