Comparative effectiveness of research ethics teaching methods Michael Kalichman and Dena Plemmons UC San Diego Research on Research Integrity Annual Meeting.


Comparative effectiveness of research ethics teaching methods
Michael Kalichman and Dena Plemmons, UC San Diego
Research on Research Integrity Annual Meeting, Niagara Falls, NY, May 16, 2009

Goal
Assess the effectiveness of teaching research ethics.
Challenges
- Different teaching objectives
- Different institutions and audiences
- Different instructors
Revised goal: assess the relative effectiveness of different methods for teaching research ethics.

Approach: The Course
The course: Scientific Ethics (UC San Diego)
- 10 sessions, 1.5 hrs per week
- 3 sections each week
- 2 instructors (DP, MK)
Teaching methods
- Week 1: Lecture
- Weeks 2-10, one of three methods per section:
  - Lecture + small-group case discussion
  - Lecture + role play
  - Case-based lecture

Approach: The Students
Students
- Graduate students, some postdocs
- Biology, Neurosciences, Other (e.g., Bioengineering)
- UC San Diego, Other (Salk, The Scripps Research Institute)
Number
- Total = 57
- 18-20 assigned per section

Approach: Randomization

Week | Instructor | Section 1 | Section 2 | Section 3
  1  |    MK      |      Introductory Lecture (all sections)
  2  |    DP      |     L     |     C     |     R
  3  |    MK      |     C     |     R     |     L
  4  |    DP      |     R     |     L     |     C
  5  |    MK      |     L     |     C     |     R
  6  |    DP      |     C     |     R     |     L
  7  |    MK      |     R     |     L     |     C
  8  |    DP      |     L     |     C     |     R
  9  |    MK      |     C     |     R     |     L
 10  |    DP      |     R     |     L     |     C

L = Lecture + small-group case discussion
R = Lecture + role-play
C = Case-based lecture
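The rotation above is a simple Latin-square-style design: each week the three methods shift by one section, so every section experiences every method three times over weeks 2-10, with the two instructors alternating. A minimal sketch of that rotation (an illustration of the schedule's logic, not the authors' code):

```python
# Rotation scheme for weeks 2-10: the three methods cycle across the
# three sections, and the two instructors alternate weekly.
METHODS = ["L", "C", "R"]      # L = lecture+case discussion, C = case-based lecture, R = lecture+role-play
INSTRUCTORS = ["DP", "MK"]     # alternating weekly, starting with DP in week 2

def schedule(weeks=range(2, 11)):
    rows = []
    for i, week in enumerate(weeks):
        offset = i % 3  # shift the method assignment by one section each week
        methods = [METHODS[(offset + s) % 3] for s in range(3)]
        rows.append((week, INSTRUCTORS[i % 2], methods))
    return rows

for week, instructor, methods in schedule():
    print(week, instructor, methods)
```

Running this reproduces the table: week 2 (DP) is L/C/R, week 3 (MK) is C/R/L, and so on, with each section receiving each method exactly three times.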

Approach: Outcomes
1. Pre-test: 18 multiple-choice questions (2 for each of 9 topics)
2. Weekly quiz: 5 multiple-choice questions, 3 attitude questions about the topic, 2 attitude questions about the session (enjoyable? useful?)
3. Post-test: same as pre-test
4. Final evaluation: satisfaction, perspectives

Overall Impact of Course: Perceptions
How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course?
- "not at all"
- "not much"

Overall Impact of Course: Perceptions
How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course?
- "great increase in knowledge"
- "definitely more aware and now know options available"
- "the course stressed topics that are easy to ignore, ...useful for recognizing similar situations before it's too late"

Overall Impact of Course: Perceptions
How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course?
- "...I will be able to more effectively deal with many types of situations"
- "learned new strategies for dealing with complex ethical issues"
- "I am now better prepared to deal with problems like the ones discussed in class"

Overall Impact of Course: Perceptions
How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course?
- "It has helped me to see other perspectives and change my mind on several topics."
- "it increase[d] my moral outrage"

Sample Question #1
Research data are the property of:
A. The institution which employs those who are collecting the data.
B. The person(s) who collect the data.
C. The head of the research group that collects the data.

Sample Question #2
The primary basis for requirements for review of human subjects research is in regulations created by the:
A. State government
B. Federal government
C. University

Overall Impact of Course: Knowledge
Knowledge improved (P < 0.001). However, the improvement was modest:
- Pre-test median = 12.0
- Post-test median = 13.0
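The slide does not name the statistical test behind P < 0.001. As an illustration only (not the authors' analysis), a paired pre/post comparison of this kind can be done with an exact two-sided sign test using just the standard library; the scores below are hypothetical, not the study's data:

```python
import math

def sign_test_p(pre, post):
    """Two-sided exact sign test for paired scores (ties are dropped)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    k = sum(d > 0 for d in diffs)  # number of students who improved

    def upper_tail(k, n):
        # P(X >= k) for X ~ Binomial(n, 0.5): the null of no systematic change
        return sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n

    p_one = min(upper_tail(k, n), upper_tail(n - k, n))
    return min(1.0, 2 * p_one)

# Hypothetical paired test scores (out of 18) for ten students
pre  = [11, 12, 12, 10, 13, 12, 11, 14, 12, 13]
post = [12, 13, 13, 12, 13, 14, 12, 15, 13, 14]
print(sign_test_p(pre, post))  # small p: nearly every student improved
```

A Wilcoxon signed-rank test (e.g. `scipy.stats.wilcoxon`) would use the magnitudes of the differences as well and is a common alternative for median-based pre/post comparisons.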

Effects of Methods
Knowledge: no statistically significant difference among methods for any of the 9 test weeks.
Attitudes: no statistically significant difference among methods for any of the 9 test weeks.

Perceptions of Methods
Useful or enjoyable?
- No statistically significant difference among methods for 5 of the 9 test weeks.
- In 4 of the 9 test weeks, Lecture+Roleplay was judged more enjoyable than Case-Based Lecture and/or Lecture+Case Discussion.
- In 3 of the 9 test weeks, Lecture+Roleplay was judged more effective than Case-Based Lecture and/or Lecture+Case Discussion.
However, when the methods were compared across all weeks (two-factor ANOVA), there was no effect of method.
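The across-weeks comparison is described as a two-factor ANOVA, presumably with method and week as the factors. As a sketch of the mechanics only (not the authors' analysis code), here is a minimal balanced two-way ANOVA with one observation per cell and no interaction term, returning the F statistics; converting F to a p-value requires an F-distribution CDF (e.g. `scipy.stats.f.sf`, not shown):

```python
def two_way_anova_F(table):
    """F statistics for a balanced two-factor design with one observation
    per cell. Rows = levels of factor A (e.g. method), cols = factor B
    (e.g. week). Returns (F_rows, F_cols)."""
    a, b = len(table), len(table[0])
    grand = sum(sum(row) for row in table) / (a * b)
    row_means = [sum(row) / b for row in table]
    col_means = [sum(table[i][j] for i in range(a)) / a for j in range(b)]

    # Partition the total sum of squares into row, column, and error terms
    ss_rows = b * sum((m - grand) ** 2 for m in row_means)
    ss_cols = a * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in table for x in row)
    ss_err = ss_total - ss_rows - ss_cols

    df_r, df_c, df_e = a - 1, b - 1, (a - 1) * (b - 1)
    ms_err = ss_err / df_e
    return (ss_rows / df_r) / ms_err, (ss_cols / df_c) / ms_err
```

With method as the row factor (3 levels) and week as the column factor (9 levels), "no effect of method" corresponds to an F_rows small enough that its p-value exceeds the chosen significance level.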

Student Preference - Methods
Which of the methods did you find to be most useful for meeting the goals of this course?

Student Preference - Methods
Which of the methods did you find to be most enjoyable for meeting the goals of this course?

Student Preference - Methods In future courses, would you recommend using:

Student Preference - Comments
"I really disliked the role-playing. I didn't think it was beneficial at all."
13 of 51 respondents (>25%) specifically commented on not liking the role-play exercises.

Summary and Conclusions
Summary
- Student perceptions: positive impact on knowledge, skills, and attitudes
- Knowledge improved, but is it worth the cost?
- No difference among methods for knowledge or attitudes
- Lecture+roleplay was considered more enjoyable and/or useful during several weeks of the course, but was the least liked overall at the end of the course
- Mixed methods preferred
Conclusions
1. Impact of course on attitudes needs to be assessed
2. Teachers may be more important than methods