
1 CHOOSING AN ASSESSMENT APPROACH An Inventory of Available Options Ursula Waln, Director of Student Learning Assessment, Central New Mexico Community College

2 START WITH A BASIC QUESTION What do you want to know?

3 SOME BASIC ASSESSMENT QUESTIONS
1. Are our students meeting our standards?
2. How do our students compare to their peers?
3. How much are we impacting student learning?
4. Have our changes made a difference?
5. Are we maximizing student learning potential?

4 1. ARE OUR STUDENTS MEETING OUR STANDARDS?
This question calls for a standards-based assessment approach, a.k.a. competency-based or criterion-based assessment.
Examples of standards:
- Course student learning outcomes (SLOs)
- Program SLOs
- CNM General Education SLOs
- NMHED General Education Common Core Competencies
- Accreditation standards
- Professional standards (licensure, certification, discipline-based association standards)

5 STANDARDS-BASED ASSESSMENT
- Often driven by a need for accountability
- Evaluated based on established targets for achievement
  - May be internally derived: What is good enough?
  - May be externally referenced, defined, and/or mandated
- Typically summative
  - Done at the end of the unit, course, series, or program
- May be internal (such as a final exam) or external (such as an evaluation done by a supervisor in a field placement)

6 STANDARDS-BASED ASSESSMENT
- May be an add-on assessment (such as a licensure exam or a program portfolio) or may be embedded within course assignments, quizzes, or tests
- Typically a direct assessment of student proficiency (such as a test) but may be indirect (such as an employer rating)

7 2. HOW DO OUR STUDENTS COMPARE TO THEIR PEERS?
This question calls for benchmark comparisons.
Benchmark: something that can be used to judge the quality or level of other, similar things; a point of reference from which measurements may be made.
Examples of benchmarks:
- Published norms from standardized assessments
- National means and percentiles on standardized exams
- Statistically derived target response rates established by survey producers (based on national or comparison-group means, percentiles, effect size, etc.)
- Statistics from or about comparable or ‘aspirational’ institutions or programs

8 BENCHMARK COMPARISONS
- Often driven by a need for accountability (a ‘best practices’ expectation)
- Also supports institutional self-assessment and marketing interests
- Often limited by the availability of benchmarking data
- Usually evaluated based on where scores fall relative to those of the peer comparison groups (see the sketch below)
- Though not technically a benchmark, looking at whether the majority of institutions or programs share a characteristic can also inform peer comparisons
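To illustrate what "where scores fall" can mean in practice, here is a minimal Python sketch of percentile placement against published norms. It is not from the original presentation; the national mean, standard deviation, and local scores are all invented for illustration.

```python
from statistics import NormalDist, mean

# Hypothetical published norms from a standardized assessment (invented values).
NATIONAL_MEAN = 151.2
NATIONAL_SD = 14.8

# Hypothetical scores from our own program on the same instrument.
local_scores = [148, 162, 155, 171, 143, 158, 166, 150]
local_mean = mean(local_scores)

# Treat the local mean as a score within the national distribution
# (a simplification) to see roughly where it falls.
z = (local_mean - NATIONAL_MEAN) / NATIONAL_SD
percentile = NormalDist().cdf(z) * 100

print(f"Local mean: {local_mean:.1f}")
print(f"z relative to national norms: {z:+.2f}")
print(f"Approximate national percentile: {percentile:.0f}")
```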

9 BENCHMARK COMPARISONS
- May be formative (such as testing or surveying incoming students) or summative (such as testing or surveying students who are graduating or otherwise moving to a next level)
- Typically external (administered outside of the curriculum), e.g., comparing mean standardized test scores to national means
- Typically an add-on (not part of the instructional process)
- May be direct (as in tests) or indirect (as in surveys)

10 3. HOW MUCH ARE WE IMPACTING STUDENT LEARNING?
This question calls for a value-added assessment approach, using either longitudinal or cross-sectional comparisons (see the sketch below):
- Longitudinal: measuring something at the beginning and again at the end of the instructional period (unit, course, series, or program); statistical analysis looks at case-by-case gains
- Cross-sectional: comparing advanced students’ outcomes to those of beginning students; both groups can be assessed at the same time, but this assumes the two groups come in with comparable characteristics and have equivalent experiences (which is rarely entirely true); statistical analysis compares the means of the two groups
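The slide's distinction between case-by-case gains and group-mean comparisons maps onto paired versus independent-samples tests. Below is a minimal, hedged sketch in Python using scipy; all score values are hypothetical.

```python
from statistics import mean
from scipy import stats

# Longitudinal design: the same students measured before and after the
# instructional period, so the analysis looks at case-by-case gains.
pre  = [52, 61, 48, 70, 55, 63, 59, 66]   # hypothetical pre-test scores
post = [61, 72, 55, 78, 60, 74, 65, 71]   # hypothetical post-test scores
gains = [b - a for a, b in zip(pre, post)]
t_paired, p_paired = stats.ttest_rel(post, pre)
print(f"Mean gain: {mean(gains):.1f} (paired t = {t_paired:.2f}, p = {p_paired:.3f})")

# Cross-sectional design: different beginning and advanced students
# assessed at the same time, so the analysis compares group means.
beginning = [50, 58, 47, 64, 53, 61]      # hypothetical scores
advanced  = [63, 70, 59, 75, 66, 72]      # hypothetical scores
t_ind, p_ind = stats.ttest_ind(advanced, beginning)
print(f"Mean difference: {mean(advanced) - mean(beginning):.1f} "
      f"(independent t = {t_ind:.2f}, p = {p_ind:.3f})")
```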

11 VALUE-ADDED ASSESSMENT
- Often driven by a need for accountability
- Sometimes considered a fair way of assessing educational impact irrespective of students’ prior academic preparation (looks at achievement gains rather than at closing the achievement gap)
- Can be useful in quantifying students’ responses to instruction
- Data interpretation can be complicated by lack of student motivation on pre-assessments and by incoming transfers
- Evaluated based on the size of the gains observed

12 VALUE-ADDED ASSESSMENT
- Involves, by definition, both formative and summative assessment, e.g.:
  - Pre- and post-tests using the same measurement/instrument
  - Surveys comparing responses of entering and exiting students
- May be internal or external
- May be an add-on or embedded assessment
- May be direct (as in a test or assignment) or indirect (as in a survey or interview)

13 4. HAVE OUR CHANGES MADE A DIFFERENCE?
This question calls for an investigative assessment approach.
- Follows a scientific model, with varying degrees of formality
- May range from evaluating minor instructional alterations at the course level to conducting full-fledged, publishable research
- IRB approval may be necessary if the investigation is not purely operational in nature, i.e., is experimental and/or may be used for a thesis or any other purpose outside of CNM
- Often driven by faculty interest in improving student learning outcomes (and/or a desire for professional advancement)

14 INVESTIGATIVE ASSESSMENT
- Evaluated based on observed differences between groups
- Usually compares outcomes following a change to baseline outcomes obtained prior to the change (see the sketch below); the measurement/instrument needs to be consistent
- Inferring a causal relationship between a change made and any difference in learning outcomes may be supported if:
  - A statistically significant correlation exists
  - The difference in the outcome occurs only after the change is implemented
  - There are no confounding factors, such as the introduction of other interventions, notable differences in student demographics, or environmental changes
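As a hedged illustration of comparing post-change outcomes to a baseline, the following Python sketch reports both a significance test and an effect size. The scores are invented, and, as the slide notes, statistics alone cannot rule out confounding factors.

```python
from statistics import mean, stdev
from scipy import stats

# Hypothetical scores on the same assignment before and after an
# instructional change (the measurement instrument stays consistent).
baseline    = [68, 72, 65, 74, 70, 69, 73, 66]
post_change = [73, 78, 70, 81, 75, 74, 79, 72]

t, p = stats.ttest_ind(post_change, baseline)

# Cohen's d with a pooled standard deviation: how large is the
# difference, independent of sample size?
n1, n2 = len(baseline), len(post_change)
pooled_sd = (((n1 - 1) * stdev(baseline) ** 2
              + (n2 - 1) * stdev(post_change) ** 2) / (n1 + n2 - 2)) ** 0.5
d = (mean(post_change) - mean(baseline)) / pooled_sd

print(f"t = {t:.2f}, p = {p:.3f}, Cohen's d = {d:.2f}")
# A significant difference supports, but does not by itself establish,
# a causal link: confounding factors must be ruled out by design.
```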

15 INVESTIGATIVE ASSESSMENT
- Typically (but not always) summative
- May be internal (as in a writing assignment) or external (as in looking at next-level success)
- May be an add-on or embedded assessment
- May be direct (as in evaluations of student performance) or indirect (as in surveys, interviews, or next-level success rates)

16 5. ARE WE MAXIMIZING STUDENT LEARNING POTENTIAL?
This question calls for a process-oriented assessment approach.
- Examines the dynamics of student learning in relation to a desired outcome, analyzing inputs and context to determine under what conditions students learn best
- Often driven by faculty interest in improving student learning outcomes

17 PROCESS-ORIENTED ASSESSMENT
- Evaluated based on increases in the proportion of students demonstrating learning gains (see the sketch below)
- Involves both formative and summative assessment
- Typically internal
- Typically embedded
- May be direct or indirect
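Since process-oriented assessment is judged by increases in the proportion of students demonstrating gains, a two-proportion z-test is one plausible way to compare conditions. This sketch is not from the presentation; the counts are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical counts of students demonstrating learning gains under
# two instructional conditions (all numbers invented).
gains_a, n_a = 34, 60   # condition A
gains_b, n_b = 45, 62   # condition B

p_a, p_b = gains_a / n_a, gains_b / n_b
pooled = (gains_a + gains_b) / (n_a + n_b)
se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Proportion demonstrating gains: A = {p_a:.0%}, B = {p_b:.0%}")
print(f"Two-proportion z = {z:.2f}, p = {p_value:.3f}")
```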

18 GIVE IT YOUR OWN TWIST To Fit the Circumstance

19 WHAT’S YOUR ANGLE?
Rather than adopt an approach, adapt what’s useful to fit your needs:
- Building on the basic assessment question, what specifically do you want to find out?
- What options are available to you?
- Does the general approach associated with the basic question fit your interests? Will a somewhat different or more eclectic approach work better?
- Are you assessing at the program level or the course level?

20 PROGRAM-LEVEL RECAP
Basic Question                                    | General Approach
1. Are our students meeting our standards?        | Standards-based assessment
2. How do our students compare to their peers?    | Benchmark comparisons
3. How much are we impacting student learning?    | Value-added assessment
4. Have our changes made a difference?            | Investigative assessment
5. Are we maximizing student learning potential?  | Process-oriented assessment

21 BEYOND COURSE OUTCOMES
Program-level assessment can employ extra-curricular measures, such as:
- Surveys, interviews, and focus groups involving students, faculty, field supervisors, employers, and/or community members
- Pre- and post-assessments (tests or surveys)
- Benchmark comparisons on commercially developed surveys and/or standardized tests (or other types of data comparisons with peer programs)
- Analyses of factors associated with differences in outcomes (demographics, service utilization, enrollment and attendance patterns, etc.)
- Analysis of next-level success indicators (transfer, employment)

22 COURSE-LEVEL ASSESSMENT
Basic Question, Re-Phrased                                    | Same General Approach
1. Are the students in my class meeting the necessary standards?  | Standards-based assessment
2. How do the students in my class compare to their peers?        | Benchmark comparisons
3. How much of what I teach are the students learning?            | Value-added assessment
4. Have my changes made a difference?                             | Investigative assessment
5. Am I maximizing student learning potential?                    | Process-oriented assessment

23 OUTCOME ALIGNMENTS
Base decisions about what to assess at the course level on alignments between course-level student learning outcomes and program-level student learning outcomes.
(Original slide diagram: a Program SLO linked to an aligned SLO in each of Course 1, Course 2, and Course 3; a sketch of this mapping follows.)
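One lightweight way to operationalize such an alignment map is a simple lookup structure. The Python sketch below is purely illustrative; the SLO identifiers are invented and do not correspond to actual CNM outcomes.

```python
# Hypothetical curriculum map: each program-level SLO keyed to the
# course-level SLOs aligned with it (identifiers are invented).
alignment_map = {
    "Program SLO 1: Communicate effectively in writing": [
        "Course 1 SLO 3",
        "Course 2 SLO 2",
        "Course 3 SLO 1",
    ],
}

# Roll course-level results up under the program SLO they support.
for program_slo, course_slos in alignment_map.items():
    print(program_slo)
    for slo in course_slos:
        print(f"  assessed via {slo}")
```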

24 RELEVANCE
Base decisions about how to assess at the course level on what the instructor wants to know.
(Original slide diagram: a flow beginning with the Program SLO:)
1. Identifying a Course SLO that’s aligned to the Program SLO
2. The instructor deciding what he/she wants to know
3. The instructor planning the course-level assessment
4. The instructor conducting the course-level assessment
5. The instructor interpreting and applying the findings at the course level
6. The program faculty pooling and interpreting all related findings

25 TAKE IT UP A STEP Capture a More Complete Portrait through Multiple Assessments

26 BUILD A COLLAGE
- Student learning is dynamic, and assessments are like snapshots of student learning
- More than one of the basic assessment questions may apply
- A variety of snapshots viewed together gives a more comprehensive impression than a single snapshot
- Having multiple points of evidence, taken from various perspectives over time, supports better decisions about the usefulness of any individual bit of assessment data


