The Influence of the Use of an Open-Ended Classroom Response System on Student Outcomes
Hoky Min, Gregory K. W. K. Chung, Rebecca Buschang, Lianna Johnson, William Kaiser
California Educational Research Association Annual Meeting, Rancho Mirage, CA – December 5, 2008

2 / 29 Overview of Talk
- Background of research (Greg)
- Survey constructs (Greg)
- Analyses (Hoky)
- Results (Hoky)
- Implications and next steps (Greg)

3 / 29 Background
- Develop a classroom response system
  - Developed at UCLA (electrical engineering, Bill Kaiser): 3I – Individualized, Interactive Instruction
  - Different from clickers: focus on the process of problem solving, not just the final answer
- Test the machinery
  - To what extent can teachers make use of real-time student responses?
  - How do students perceive the experience?
  - How does the mode affect student learning?

4 / 29 Overview of Research
- Develop and validate a survey measure of students' perceptions of the processes experienced with a classroom response system
- Examine the technical quality of the measure
- Examine the relation between perceptions and outcomes

5 / 29 3I: Individualized, Interactive Instruction
- Use computers to help with immediate feedback and formative assessment
- Typical lesson:
  - Present a problem / question / prompt …
  - Students type their responses
  - Teacher interprets the student responses and adjusts instruction immediately: moves on, reviews, elaborates, discusses, …
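To make the present/respond/adjust cycle above concrete, here is a minimal, hypothetical Python sketch of that loop. The function names, prompt, student names, and action list are illustrative assumptions; this is not the actual 3I software.

```python
# Hypothetical sketch of the present -> respond -> adjust cycle outlined above.
# All names, prompts, and actions are illustrative; this is not the 3I system.
from typing import Dict, List

ACTIONS = ("move on", "review", "elaborate", "discuss")

def collect_responses(prompt: str, students: List[str]) -> Dict[str, str]:
    """Each student types an open-ended response to the posted prompt."""
    print(f"Prompt: {prompt}")
    return {name: input(f"  {name}'s response: ") for name in students}

def instructor_decision(responses: Dict[str, str]) -> str:
    """Show every response to the instructor, who chooses how to adjust."""
    for name, text in responses.items():
        print(f"{name}: {text}")
    choice = ""
    while choice not in ACTIONS:
        choice = input(f"Adjust instruction ({', '.join(ACTIONS)}): ").strip().lower()
    return choice

if __name__ == "__main__":
    answers = collect_responses("Sketch how a voltage divider works.", ["Ana", "Ben"])
    print("Next step:", instructor_decision(answers))
```

The point of the sketch is the division of labor the slide describes: the software only collects and displays open-ended work, while the instructor interprets it and decides the next instructional move.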

6 / 29 [image slide; no transcript text]

7 / 29 [image slide; no transcript text]

8 / 29 Student’s View

9 / 29 Instructor’s View

10 / 29 Instructor Perceptions
- All students in the session participated; interaction improved drastically
- Clear and immediate feedback
- The rate of receiving questions and observing responses to problems is much higher than in conventional sessions
- From the instructor's perspective, the method exceeds the interactivity of one-on-one instruction

11 / 29 Student Perceptions
- Interviewed students and gathered written responses during pilot tests
  - Learning, interaction, interest
  - Comfort participating
  - Engagement
- Developed survey items based on the qualitative data
- Examine technical quality of the measure and its relation to student outcomes (this study)

12 / 29 Scales
- Learning
- Interaction
- Interest
- Comfort participating
- Engagement
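Slides 11 and 12 mention checking the technical quality of these five perception scales. A routine first check is internal consistency; the sketch below computes Cronbach's alpha for one scale using simulated responses, since the actual item wording and data are not in the deck. All column names are hypothetical.

```python
# Minimal sketch: internal-consistency (Cronbach's alpha) check for one
# survey scale. Item names and data are simulated, not the study's.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are items of one scale."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated responses for a hypothetical 4-item "Engagement" scale
rng = np.random.default_rng(0)
latent = rng.normal(size=200)
sim = pd.DataFrame(
    {f"engage{i}": latent + rng.normal(scale=0.8, size=200) for i in range(1, 5)}
)
print(round(cronbach_alpha(sim), 2))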

13 / 29 Method
- Undergraduate: genetics
  - 59 students
  - 3I used for weekly discussion sessions (9 weeks)
- Middle school: summer-school remedial math
  - 104 students (6th, 7th, and 8th grade)
  - 3I used for guided practice sessions (twice over 4 weeks)
- Minimal instructor training

Analyses and Results

15 / 29 What We Did…

16 / 29 Research Questions
- To what extent does the survey measure students' perceptions of the use of the technology in class?
- To what extent do students' perceptions influence their class achievement?

17 / 29 Structural Equation Modeling (SEM)
- A statistical technique for testing hypotheses, theories, and models about relationships among variables
- Latent variables: theoretical constructs underlying performance or scores on measures
- Observed variables: scores or performance on measures
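The definitions above correspond to the standard LISREL-style equations for a structural equation model, written here in their general textbook form (not necessarily the exact parameterization used in this study):

```latex
% Measurement model: observed indicators x, y load on latent constructs \xi, \eta
x = \Lambda_x \xi + \delta, \qquad y = \Lambda_y \eta + \varepsilon
% Structural model: relations among the latent constructs
\eta = B \eta + \Gamma \xi + \zeta
```

Here \Lambda_x and \Lambda_y are factor loadings, B and \Gamma are structural coefficients among and between the latent constructs, and \delta, \varepsilon, \zeta are error terms.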

18 / 29 Structural Equation Modeling (SEM): Measurement model (confirmatory factor analysis)

19 / 29 Structural Equation Modeling (SEM): Structural regression model
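The measurement and structural regression models on slides 18 through 25 appear only as path diagrams in this transcript. As a rough illustration of how such a model can be specified in code, the sketch below uses the open-source semopy package with hypothetical factor, item, and predictor names; it is not the authors' actual specification.

```python
# Hypothetical structural regression model in lavaan-style syntax via semopy.
# Factor, item, and predictor names are illustrative, not the study's variables.
import pandas as pd
import semopy

MODEL_DESC = """
Learning =~ learn1 + learn2 + learn3
Engagement =~ engage1 + engage2 + engage3
Comfort =~ comfort1 + comfort2 + comfort3
Achievement ~ Learning + Engagement + Comfort + prior_knowledge
Learning ~ prior_knowledge
Engagement ~ prior_knowledge
Comfort ~ prior_knowledge
"""

# Item-level survey responses plus prior-knowledge and achievement scores
data = pd.read_csv("survey_and_outcomes.csv")

model = semopy.Model(MODEL_DESC)
model.fit(data)          # maximum-likelihood estimation by default
print(model.inspect())   # parameter estimates, standard errors, p-values
```

The first block of the description is the measurement model (items loading on latent perception factors); the remaining lines are the structural regressions, including an observed prior-knowledge covariate of the kind the summary slide describes.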

20 / 29 Measurement Model (College)

21 / 29 Structural Regression Model (College): Model 1

22 / 29 Structural Regression Model (College): Model 2

23 / 29 Measurement Model (Middle School)

24 / 29 Structural Regression Model (Middle School): Model 1

25 / 29 Structural Regression Model (Middle School): Model 2

26 / 29 Summary of Findings
- For both the college and middle-school samples, the survey measures are valid indicators of students' perceptions of the learning processes evoked by the use of 3I
- Students' perceptions do not predict class achievement
- Students' perceptions and class achievement are both affected by their existing knowledge of the subject

27 / 29 Implications
Why was there no relation between students' perceptions of classroom processes and outcomes? Possible reasons:
- Classroom interaction doesn't matter
- Poor measure
- Duration of use was too short
- Ceiling effect with the university students
- Relative coverage of content (with respect to the outcome) in 3I sessions was much less than in lectures
- Instructor training

28 / 29 Next Steps
- Improve instructor support
  - Develop structured problem sets a priori: common errors, possible knowledge gaps behind the errors, instructional strategies
- Experimental design
  - With 3I vs. without 3I (business as usual), controlling for content
  - Challenging

30 / 29 Perceived Learning
- The sessions helped to reinforce what I had learned from lectures and the book. It was a good way to solidify any potential questions I may have had regarding specific circuits. Using the computer based tools was a nice alternative to pencil and paper or white-boarding.
- I think that the answer to this question is based on the type of individual. From my perspective, it is easier for me to take notes on problems and go over it at a later time, individually. I felt some pressure when solving the problems in a group setting.

31 / 29 Perceived Comfort
- I think that maintaining anonymity is very crucial in the interaction aspect of the discussion. Many, including myself, may feel a little embarrassed asking a "dumb" question but w/ this method, I don't feel that people will hesitate to ask those questions.
- The whole "instant messaging" system was cool, but seemed impersonal. Also, it felt intimidating to message the professor. It seemed to make more sense if we just asked the questions in person rather than messaging.

32 / 29 Perceived Engagement
- I was definitely more prone to sit and give my full attention in this section than I am normally in any discussion. I did not fall asleep, where normally I will doze off during normal discussion.
- I think it's a lot easier to pay attention because I feel like I actually have to do the problem myself, rather than sit back and let some brainy kid figure it out for me, like I will tend to do when I feel lazy normally.
- Whenever we were assigned a problem to do, I always ended up taking out a piece of paper and pencil to write out the problem. Having the problem on the computer made it harder to see the whole problem because the screen was too small to fit the problem into the screen.

33 / 29 Typical Approach
- Whole-group instruction
- Difficult to get immediate feedback from students
- Feedback is usually only from a few students
- Not all students may be engaged