Assessment across A Culture of Inquiry Peggy Maki, Ph.D. Education Consultant Specializing in Assessment Presented at Farmingdale State College September 27, 2013

Foci: I. A Problem-based Framework for RFPs; II. The Principles and Processes of Assessing The Efficacy of Your Educational Practices; III. Elements of An RFP 2

What Is the Problem for First-Year Physics Students? How to restructure incorrect understanding of physics concepts became the work of physics faculty at the University of Colorado (PhET project). That is, physics faculty became intellectually curious about how they could answer this question to improve students’ performance over the chronology of their learning. 3


A. What Research Tells Us about Learners: Learners Create Meaning. Egocentricity, sociocentricity, narrow-mindedness, routinized habits. 5

6 Meta-cognitive processes (thinking about one's thinking) are a significant means of reinforcing learning. Learning involves creating relationships between short-term and long-term memory.

7 Practice in various contexts creates deep or enduring learning. Inert learning; activated learning. Transfer of new knowledge into different contexts is important to deepen understanding.

Threshold Concepts: pathways central to the mastery of a subject or discipline that change the way students view a subject or discipline, prompting students to bring together various aspects of a subject that they heretofore did not view as related (Land, Meyer, & Smith). 8

People learn differently and may hold onto folk or naive knowledge, incorrect concepts, misunderstandings, and false information. Deep learning occurs over time. Transference. 9

Learning Progressions: knowledge-based, web-like interrelated actions or behaviors or ways of thinking, transitioning, self-monitoring. May not be developed successfully in a linear progression, thus necessitating formative assessment along the trajectory of learning. Movements towards increased understanding (Hess). 10

11 Deep Learning Occurs When Students Are Engaged in Their Learning

12 Learning Strategies of Successful Students: writing beyond what is visually presented during a lecture; identifying clues to help organize information during a lecture; evaluating notes after class; reorganizing notes after class

Comparing note-taking methods with peers; using one's own words while reading to make notes; evaluating one's understanding while reading; consolidating reading and lecture notes. Source: Calvin Y. Yu, Director of Cook/Douglass Learning Center, Rutgers University 13

How well do your students … integrate, transfer, analyze, (re)apply, re-use, synthesize, restructure previous incorrect learning… 14

Within a course or module or learning experience? Along the chronology of their studies and educational experiences? From one subject or topic or focus or context to another one such as from an exercise to a case study or internship? 15

Integrated Learning… 16 (diagram: cognitive, affective, and psychomotor domains intersecting as forms of representation within contexts)

17 A Problem-based Assessment Framework
1. Identify The Outcome or Outcomes You Will Assess
2. State the Research or Study Question You Wish to Answer
3. Conduct a Literature Review about That Question
4. Develop a Plan to Collect Direct and Indirect Assessment Results that Will Answer Your Question
5. Analyze and Interpret Students' Work and Students' Responses
6. Collaboratively Discuss Ways to Innovate Pedagogy or Educational Practices
7. Implement Agreed-upon Changes and Reassess
8. Share Developments Within and Outside The Institution to Build Knowledge about Educational Practices

II. The Principles and Processes of Assessing The Efficacy of Your Educational Practices. What do you expect your students to demonstrate, represent, or produce by the end of your course or educational experience, by the end of your program of study, or by the end of students' undergraduate or graduate studies? What chronological barriers or difficulties do students encounter as they learn, from the moment they matriculate? How well do you identify and discuss those barriers with students and colleagues and then track students' abilities to overcome them so that increasingly "more" students achieve at higher levels of performance? 18

A student learning outcome statement is a complete sentence that describes what you expect students to demonstrate, represent, produce or do as a result of your teaching and learning practices. It relies on active verbs that describe what you expect students to demonstrate, and it becomes the basis of determining how you actually will assess that expectation. 19

Purposes of Student Learning Outcome Statements: Orient Students to the College's and Each Program's Expectations upon Their Entry into The College or into Their Major Program of Study (FYE?); Enable Students to Identify Where and How They Have Learned or Are Learning across The Institution; Position Students to Make Connections Between and Among Their Learning Experiences along Their Educational Journey; Lead to Collaborative Agreement about Direct and Indirect Methods to Assess Students' Achievement of Outcomes 20

Cognitive Levels of Learning: Revised Bloom's Taxonomy (Handout 1). Create, Evaluate, Analyze, Apply, Understand, Know (remember) 21 (Lorin et al.)

Student Learning Outcome Statements: Institution-level Outcomes (GE); Program- or Department-level Outcomes (including GE); Course Outcomes / Service Outcomes / Educational Opportunities Outcomes (including GE) 22

Institution-Level (for example, FSC's GE): CRITICAL THINKING (REASONING). Students will: (1) identify, analyze, and evaluate arguments as they occur in their own or others' work; and (2) develop well-reasoned arguments. Department- or Program-level (FSC's Nursing): Integrate evidence-based findings, research, and nursing theory in decision making in nursing practice. 23

Course- or Educational Experience-level: Integrate concepts into systems (BCS 101: Pullan); Analyze human agency (Reacting to the Past: Menna); Think critically (EGL 101 and BUS 109: Shapiro and Singh) 24

25 B. Develop Curricular and Co-curricular Maps: Help us determine coherence among our educational practices that enables us, in turn, to design appropriate assessment methods (See Handouts 2-3); Identify gaps in learning opportunities that may account for students' achievement levels; Provide a visual representation of students' learning journey

Help students make meaning of the journey and hold them accountable for their learning over time; Help students develop their own learning map, especially if they chronologically document learning through eportfolios (See Handouts 2-3). (A brief sketch of a curricular map treated as data follows.) 26
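A curricular map can be handled as simple data: courses on one axis, program outcomes on the other. The following is a minimal Python sketch of that idea; the course codes, outcome labels, and I/R/M (introduce/reinforce/master) markings are invented for illustration and are not taken from the handouts.

```python
# Hypothetical curricular map: each course lists the program outcomes
# it introduces (I), reinforces (R), or asks students to master (M).
curriculum_map = {
    "BCS 101": {"critical thinking": "I", "integration": "I"},
    "BCS 210": {"critical thinking": "R"},
    "BCS 340": {"integration": "M"},
}

program_outcomes = ["critical thinking", "integration", "quantitative reasoning"]

# Outcomes no course addresses: gaps in learning opportunities.
covered = {outcome for levels in curriculum_map.values() for outcome in levels}
gaps = [outcome for outcome in program_outcomes if outcome not in covered]
print("Uncovered outcomes:", gaps)  # -> ['quantitative reasoning']

# Outcomes that are introduced somewhere but never brought to mastery.
never_mastered = [
    o for o in program_outcomes
    if o in covered and "M" not in {c.get(o) for c in curriculum_map.values()}
]
print("Introduced but never mastered:", never_mastered)  # -> ['critical thinking']
```

Even this small structure makes the two questions in the slides above (where are the gaps, and where does the journey lead?) answerable by inspection rather than impression.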

C. Focus on the challenges, obstacles, or "tough spots" that students encounter – Research or Study Questions in Your RFPs 27. Often: collaboratively developed; open-ended; coupled with learning outcome statements (Reference RFPs); developed at the beginning of the assessment process

The Seeds of Research or Study Questions: informal observations around the water cooler; results of previous assessment along the chronology of learning or at the end of students' studies; use of a Taxonomy of Weaknesses, Errors, or Fuzzy Thinking (see Handout 4) 28

Some Examples of Research/Study Questions: What kinds of erroneous ideas, concepts, or misunderstandings predictably interfere with students' abilities to learn or may account for difficulties they encounter later on? What unsuccessful approaches do students take to solve representative disciplinary, interdisciplinary, or professional problems? Counter that with learning about how successful students solve problems. 29

What conceptual or computational obstacles inhibit students from shifting from one form of reasoning to another form, such as from arithmetic reasoning to algebraic reasoning? What kinds of cognitive difficulties do students experience across the curriculum as they are increasingly asked to build layers of complexity? 30 See Handout 5. What is your research/study question? Reference Annotated RFPs

31 D. Review your Course or Sequence of Courses in a Program or Department for Alignment (See Handouts 6, 7-10, 11)
Program outcomes
Course or experience outcomes
Criteria/standards to assess outcome (see )
Course design: pedagogy, learning context
Assignments (see syllabus example, )
Student feedback

E. Identify or Design Assessment Methods that Provide Evidence of Product and Process 32. Direct Methods, including some that provide descriptive data about students' meaning-making processes, such as "Think Alouds"; Indirect Methods, including some that provide descriptive data, such as Small Group Instructional Design or the salgsite.org survey; Institutional data (course-taking patterns, percentages or usage rates accompanied with observation or documentation of impact)

Some Direct Methods to Assess Students' Learning Processes: Think Alouds (Pasadena City College, "How Jay Got His Groove Back and Made Math Meaningful," Cho & Davis); word edit bubbles; observations in flipped classrooms; students' deconstruction of a problem or issue (PLEs in eportfolios can reveal this: tagging, for example) 33

Student recorder's list of trouble spots in small group work, or students' identification of trouble spots they encountered in an assignment; results of conferencing with students; results of asking open-ended questions about how students approach a problem or address challenges; use of reported results from adaptive or intelligent technology; observations based on 360-degree classroom design (students show work as they solve problems) 34

Use of reported results from adaptive or intelligent technology; focus on hearing about or seeing the processes and approaches of successful and not-so-successful students; analysis of "chunks of work" as part of an assignment because you know what will challenge or stump students in those chunks 35

Some Direct Assessment Methods to Assess Students' Products: scenarios, such as online simulations; critical incidents; mind mapping; questions, problems, prompts 36

Problem with solution (any other solutions?); chronological use of case studies; chronological use of muddy problems; analysis of video; debates; data analysis or data conversion 37

Visual documentation (videotape, photograph, media presentation); observation of students or other users in representative new or revised practices (what kinds of difficulties or challenges do they continue to face?); assessment of the quality of X, such as proposals, based on criteria and standards of judgment; comparison of "before" and "after" results against criteria and standards of judgment 38

Documentation of areas of improved or advanced ability, behavior, or results using a scoring rubric that identifies traits or characteristics at different levels of performance (Refer to Handouts 7-10); asking students to respond to a scenario to determine changed behavior (such as in judicial decisions or in decision making about behaviors or choices) 39

Sentence or story completion scenarios (consider the validity of responses in relation to actual behavior) Process/841process6bl1c4bf.htm; other for FSC? 40

Some Indirect Methods that Probe Students' Learning Experiences and Processes: SALG (salgsite.org), Student Assessment of Their Learning Gains; Small Group Instructional Design; interviews with students about their learning experiences, about how those experiences did or did not foster desired learning and about the challenges they faced and continue to face. (Refer to Handout 12 for a list of direct and indirect methods of assessment you might use to assess your students' learning/development) 41

F. Chronologically Collect and Assess Evidence of Student Learning: Baseline (at the beginning), to learn about what students know or how they reason when they enter a program; Formative (along the way), to ascertain students' progress or development against agreed-upon expectations; Summative (at the end), to ascertain students' levels of achievement against agreed-upon expectations. 42

Referring to pages or to Handout 12, identify both direct and indirect methods you might use to gauge evidence of the efficacy of your educational practice(s) based on baseline evidence: professional or legal standards/expectations for performance, such as those established by the Council for the Advancement of Standards, the Association of Governing Boards, The New Leadership Alliance for Student Learning and Accountability, or AAC&U's VALUE rubrics 43

Determine the kinds of inferences you will be able to make based on each method and the problem you are trying to solve. Identify other institutional data that might be useful when you interpret results, such as judiciary board sanctions or other records. 44

Your Method of Sampling: ask yourself what you want to learn about your students and when you want to learn it: all students; random sampling of students; stratified random sampling based on your demographics (informative about patterns of performance that can be addressed for specific populations, such as non-native speakers) 45
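To make the sampling options concrete, here is a minimal Python sketch of stratified random sampling of student work by one demographic attribute. The roster, group labels, and per-group sample size are hypothetical; a real plan would set sizes proportionally or according to the precision needed.

```python
import random

# Hypothetical roster: each record is (student_id, demographic_group).
roster = [
    ("s001", "native speaker"), ("s002", "non-native speaker"),
    ("s003", "native speaker"), ("s004", "non-native speaker"),
    ("s005", "native speaker"), ("s006", "native speaker"),
]

def stratified_sample(records, group_index, per_group, seed=1):
    """Draw the same number of students from every demographic stratum."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    strata = {}
    for record in records:
        strata.setdefault(record[group_index], []).append(record)
    sample = []
    for group, members in strata.items():
        k = min(per_group, len(members))  # guard against small strata
        sample.extend(rng.sample(members, k))
    return sample

print(stratified_sample(roster, group_index=1, per_group=2))
```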

Scoring. Faculty: determine when work will be sampled; identify who will score student work (faculty, emeritus faculty, advisory board members, others?); establish a time and place to norm scorers for inter-rater reliability on the agreed-upon scoring rubric. (See Handout 13) 46
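Norming sessions are commonly checked with a simple agreement statistic. Below is a hedged sketch that computes percent agreement and Cohen's kappa for two scorers rating the same set of papers; the rubric levels and scores are invented for illustration and are not drawn from Handout 13.

```python
from collections import Counter

# Hypothetical rubric scores two faculty gave the same ten papers.
rater_a = ["Emerging", "Developing", "Proficient", "Developing", "Emerging",
           "Proficient", "Exemplary", "Developing", "Emerging", "Proficient"]
rater_b = ["Emerging", "Developing", "Developing", "Developing", "Emerging",
           "Proficient", "Exemplary", "Proficient", "Emerging", "Proficient"]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Agreement expected by chance, from each rater's marginal distribution.
count_a, count_b = Counter(rater_a), Counter(rater_b)
expected = sum(count_a[level] * count_b[level] for level in count_a) / (n * n)

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```

Percent agreement alone overstates reliability when one rubric level dominates; kappa corrects for chance agreement, which is why norming discussions often report both.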

G. Report on Results of Scoring Evidence of Student Learning. The Office of IR, Director of Assessment, or "designated other": analyzes and represents scoring or testing results that can be aggregated and disaggregated to represent patterns of achievement and to answer the guiding research or study question(s); develops a one-page Assessment Brief 47
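As one illustration of aggregating and disaggregating rubric results for a one-page brief, here is a minimal Python sketch; the records, rubric levels, and population labels are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical scored records: (rubric level, student population).
scores = [
    ("Emerging", "transfer"), ("Developing", "native"), ("Emerging", "native"),
    ("Proficient", "transfer"), ("Emerging", "transfer"), ("Exemplary", "native"),
]

def distribution(levels):
    """Percentage of work at each rubric level, for tables or bar charts."""
    counts = Counter(levels)
    total = sum(counts.values())
    return {level: round(100 * n / total, 1) for level, n in counts.items()}

# Aggregate: the overall pattern of achievement.
print("All students:", distribution(level for level, _ in scores))

# Disaggregate: the pattern by student population.
by_population = defaultdict(list)
for level, population in scores:
    by_population[population].append(level)
for population, levels in by_population.items():
    print(population, distribution(levels))
```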

The Assessment Brief Is organized around issues of interest, not the format of the data (narrative or verbal part of the brief). Reports results using graphics and comparative formats (visual part of the brief, such as trends over time, for example, or achievement based on representative populations). 48


51 Results based on scoring students' written work (bar chart): Emerging 61%, Developing 20%, Proficient 11%, Exemplary 8%

H. Establish Soft Times and Neutral Zones for Faculty and Other Professionals to Interpret Analyzed Results or To Hear About Your Interpretation of Results: identify patterns against criteria and cohorts (if possible); tell the story that explains the results, triangulating with other reported data, such as results of student surveys. 52

Determine what you wish to change or revise, or how you want to innovate, and develop a timetable to reassess once changes are implemented. (See Handouts ) 53

54 Collaboratively Agree on and Re-Assess Changes: implement agreed-upon changes; re-assess to determine the efficacy of changes; focus on collective effort (what we do and how we do it)

III. Elements of An RFP 55. State your outcome or outcomes for a time period (cycle of inquiry); identify the research or study question you will try to answer within the context of current literature; identify two methods you will use to assess that outcome or set of outcomes

Identify your baseline data or initial state (where you started); identify the criteria and standards of judgment you will use to chart progress (professional or agreed-upon performance standards, scoring rubrics); identify when you will collect data 56

Calendar when you will analyze and interpret results; identify when you will submit a report that briefly describes your interpretation of results, further needed actions, and conclusions; tell the story that explains the results based on triangulating evidence and data you have collected; calendar when further actions will be taken, including plans to reassess to determine the efficacy of those further actions. (See sample plan in Handouts 14-15) 57

What if we…. Collaboratively use what we learn from this approach to assessment to design the next generation of curricula, pedagogy, instructional design, educational practices, and assignments to help increasingly more students successfully pass through trouble spots or overcome learning obstacles; 58

and, thereby, collaboratively commit to fostering students’ enduring learning in contexts other than the ones in which they initially learned. (See Handout 16 to identify where you see the need to build your department’s or program’s assessment capacity.) 59

Works Cited
Cho, J., and Davis, A. Pasadena City College. "How Jay Got His Groove Back and Made Math Meaningful."
Hess, K. Developing and Using Learning Progressions as a Schema for Measuring Progress. National Center for Assessment.
Land, R., Meyer, J.H.F., and Smith, J., Eds. Threshold Concepts and Transformational Learning. Rotterdam: Sense Publishers.
Lorin, A.W., Krathwohl, D.R., Airasian, P.W., and Cruikshank, K.A., Eds. (2000). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Boston, MA: Allyn and Bacon. 60

Maki, P. Assessing for Learning: Building a Sustainable Commitment Across the Institution, 2nd ed. VA: Stylus Publishing, LLC.
National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.
Yu, C. Y. "Learning Strategies Characteristic of Successful Students."