Challenges of Quantitative Reasoning Assessment
Donna L. Sundre, Center for Assessment and Research Studies
www.jmu.edu/assessment/

Presentation transcript:

Challenges of Quantitative Reasoning Assessment Donna L. Sundre Center for Assessment and Research Studies

After almost 20 years, I can tell you:
- Teaching faculty must be involved
- One size does not fit all
- We need to be intentional
- There is no easy way to do this

If we want to assess Quantitative Reasoning… we're going to have to use some.

Think about Desired Test Use
- One test cannot fulfill all needs
- Accountability vs. program improvement
- Our philosophy: "If we conduct quality assessment, accountability will take care of itself."
- So far, this has been true with SACS, AACSB, ABET, NCATE, SCHEV, and many others.

It's all about inferences…
- What population?
- Who should we sample?
- How should we sample?
- What is the construct?
- What are the learning objectives?
- Are the students motivated to perform well?
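
To make the sampling questions above concrete, here is a minimal sketch (my addition, not JMU's actual procedure) of drawing a stratified random sample so that score-based inferences generalize to the intended population. The roster, strata, and sample sizes are invented for illustration.

```python
# Hypothetical stratified random sampling sketch; not JMU's actual sampling plan.
import random

def stratified_sample(students, strata_key, n_per_stratum, seed=2024):
    """Randomly draw n_per_stratum students from each stratum (e.g., each college)."""
    rng = random.Random(seed)
    strata = {}
    for s in students:
        strata.setdefault(s[strata_key], []).append(s)
    sample = []
    for members in strata.values():
        k = min(n_per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Invented roster: the population is all entering first-year students.
roster = [{"id": i, "college": c} for i, c in enumerate(["Arts", "Science", "Business"] * 100)]
qr_sample = stratified_sample(roster, "college", n_per_stratum=25)
print(len(qr_sample))  # 75 examinees, balanced across the three colleges
```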

[Diagram: Population and Sample]

[Diagram: Construct and Test]

A Thought Experiment:
- Create a group of 3 people that you DO NOT KNOW
- Begin a discussion of Quantitative Reasoning (QR) to answer the following questions:
  - What population do you want to make inferences about?
  - How do you conceptualize QR for this group?

What Did We Learn?
- Populations can be defined at many levels:
  - A classroom of students
  - Students in a given major
  - A university general education program
  - High school students across the nation
  - Adult learners
- Each level involves very different inferences
- Each requires different sampling

How Does the Construct Change?
- We may be interested in Quantitative Reasoning at all levels, but the construct will change
- Classroom inferences:
  - Relate to focused learning objectives
  - Very useful for informing instruction and learning
  - Generally, easier to write items

Algebra Test

Inferences About Majors
- Learning objectives become more global
- It becomes harder to explicate and agree on the desired outcomes
- Creating assessment methods requires more cooperation and real involvement
- This work takes time and commitment

Mathematics Major Test

Inferences About General Education
- It can take years to define meaningful learning objectives
- Most institutions have not really completed this step successfully
- Very difficult to write good items
- Items cannot privilege one course over another
- Takes you back to the construct again and again and again…

Inferences About General Education: Quantitative Reasoning
- General education program inferences: learning objectives will become more global; this is the essence of general education
- Gaining faculty involvement requires an infrastructure and support
- Generally, much more difficult to write good items
- Faculty write better items outside their area of expertise

Quantitative Reasoning Test

Assessment Days at JMU: Twice a Year
- Fall Assessment Day: all entering 1st-year students in August
- Spring Assessment Day: all students with credit hours; student ID numbers are used for assignments
- Spring Assessment Day: classes are canceled
  - No time or room conflicts!
  - This day is also used for assessment in the major
- We hire and train community and student proctors
- Result: excellent data collection!

Data Collection Scheme: Repeated Measures
[Diagram: Cohort 1, Cohort 2, Cohort 3]
Students in each cohort are tested twice on the same instrument: once as incoming freshmen and again in the second semester of the sophomore year.
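
A minimal sketch of the repeated-measures layout just described, with invented scores: each cohort is tested as entering first-year students (time 1) and again in the sophomore year (time 2), and the quantity of interest is the within-student gain.

```python
# Invented pre/post scores illustrating the cohort layout; not real JMU data.
from statistics import mean

cohorts = {
    "Cohort 1": {"time1": [14, 16, 15, 18], "time2": [18, 19, 17, 22]},
    "Cohort 2": {"time1": [15, 13, 17, 16], "time2": [19, 15, 20, 18]},
}

for name, scores in cohorts.items():
    # Gain for each student = sophomore-year score minus entering score.
    gains = [t2 - t1 for t1, t2 in zip(scores["time1"], scores["time2"])]
    print(f"{name}: mean gain = {mean(gains):.2f}")
```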

Stages of the Assessment Process
1. Establishing objectives
2. Selecting or designing methods
3. Collecting credible information
4. Analyzing and maintaining information
5. Using information for teaching and learning improvement
*Regardless of the level of assessment required (a single learning objective, a course, a curriculum, or an entire program), the process is the same.

Stages of the Assessment Process
[Diagram: a continuous cycle of Establishing Objectives → Selecting/Designing Instruments → Collecting Information → Analyzing/Maintaining Information → Using Information]

Stage 1: Establishing Program Objectives
This is the hardest step! To create a successful assessment program, clear program goals and objectives must be established and agreed upon. Objectives drive the assessment process; assessment methods are based on the objectives being measured. Student learning objectives form the assessment engine!

Stage 2: Selecting or Designing Instruments
Clear learning objectives will determine what assessment method is best. An appropriate instrument must be used to conduct meaningful assessment:
- Pre-existing instruments can be found at other universities or from other sources
- If existing instruments do not closely match the objectives being assessed, an instrument can be created
- Define expected outcomes for every instrument

Stage 3: Not Just Any Data Will Do…
If we want faculty and the public to pay attention to the results, we need credible evidence. To obtain credible evidence:
- We need all students to participate
- We need good instrumentation:
  - A representative sample from the content domain
  - Reliability and validity (see the sketch below)
- We also need students who are motivated to perform
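
As one concrete illustration of the reliability evidence listed above, here is a minimal sketch (not part of the original slides) of Cronbach's alpha computed from an item-score matrix; the 0/1 responses are invented.

```python
# Cronbach's alpha as one common internal-consistency index; invented responses.
from statistics import variance

def cronbach_alpha(item_scores):
    """item_scores: rows are examinees, columns are scored items."""
    k = len(item_scores[0])                                         # number of items
    item_vars = [variance([row[j] for row in item_scores]) for j in range(k)]
    total_var = variance([sum(row) for row in item_scores])         # variance of total scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 1, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```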

Stage 4: Different Analytic Methods
- Group differences: do we see expected differences in performance across student groups?
- Relationships: do we see relationships between performance and grades in relevant courses?
- Growth: do student performances change over time?
- Competency: do students meet faculty performance expectations?
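
The four analytic questions above can each be asked of the same score file. This is a minimal sketch using SciPy's standard tests and invented scores; the actual analyses behind these slides may differ.

```python
# Four analytic questions on invented QR scores; illustrative only.
from scipy import stats

# 1. Group differences: do groups we expect to differ actually differ?
completed_qr = [21, 24, 19, 26, 23, 25]   # finished QR coursework
no_qr_yet    = [17, 15, 20, 16, 18, 19]   # have not taken QR coursework
print(stats.ttest_ind(completed_qr, no_qr_yet, equal_var=False))

# 2. Relationships: do test scores track grades in relevant courses?
course_grades = [3.3, 3.7, 2.7, 4.0, 3.3, 3.7]
print(stats.pearsonr(completed_qr, course_grades))

# 3. Growth: do the same students change from time 1 to time 2?
time1 = [15, 18, 14, 20, 17, 19]
time2 = completed_qr                      # same students, retested later
print(stats.ttest_rel(time2, time1))

# 4. Competency: what proportion meets the faculty-set cut score?
cut_score = 22
print(sum(score >= cut_score for score in completed_qr) / len(completed_qr))
```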

Stage 5: Using Information for Program Improvement
This is where the infrastructure must come into play:
- Committees that work, not just meet
- This SHOULD be intellectually stimulating!
- Involves feedback from faculty members and careful consideration of the assessment results
- I meet with QR/SR faculty every 2 weeks!
Examples of using information for program improvement: curricular change, resource allocation or reallocation, changes in instructional delivery and emphasis, and course resequencing.

What Have We Learned? Here's a Sampler:
- It's very difficult to create a good test
- We are on our 9th version; it does get better
- Our faculty wrote this test; they like it
- The test isn't about QR use in physics and chemistry; it assesses process and thinking
- Our students like the test; they feel like they have a chance to perform well
- They like tables, charts, pictures, and graphs
- We think it's about general education

What Have We Learned? Here are a few more findings:
- Entering 1st-year students are not a pre-test
- Students do change significantly with more related coursework
- Correlations between QR scores and grades in QR courses are positive
- Students completing their QR coursework don't perform to the level our faculty would like
- AP and JMU grades are good predictors of QR; transfer credit hours are not

What Have We Learned? Here are a few more findings:
- Our test items map to the objectives of other institutions (a computation sketch follows below):
  - Truman State University: 100%
  - Michigan State University: 98%
  - Virginia State University: 97%
  - St. Mary's University (TX): 92%
  - Virginia Tech: 84%
  - Virginia CC System: 78%
- Our NSF grant helped us to advance assessment of QR and SR nationally
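
A minimal sketch (my construction, not the project's actual mapping study) of how a coverage percentage like the institutional figures above can be computed: the share of test items that raters could match to at least one of a partner institution's QR objectives. The item IDs and mappings are invented.

```python
# Invented item-to-objective mapping; illustrates the coverage calculation only.
item_to_objectives = {
    "item_01": ["interpret graphs"],
    "item_02": ["proportional reasoning", "estimation"],
    "item_03": [],                      # raters found no matching objective
    "item_04": ["probability"],
}

mapped = sum(1 for objs in item_to_objectives.values() if objs)
coverage = 100 * mapped / len(item_to_objectives)
print(f"{coverage:.0f}% of items map to the partner institution's objectives")  # 75%
```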

Let's Open the Session for Questions
- You may have advice for the group
- If you want more information, go to www.jmu.edu/assessment/ and look under assessment resources
- Contact me (Donna Sundre) at: