
1 Assessment across A Culture of Inquiry
Peggy Maki, Ph.D.
Education Consultant Specializing in Assessment
Presented at Farmingdale State College, September 27, 2013
pmaki86@gmail.com

2 Foci
I. A Problem-based Framework for RFPs
II. The Principles and Processes of Assessing the Efficacy of Your Educational Practices
III. Elements of an RFP

3 What Is the Problem for First-Year Physics Students? How to restructure incorrect understanding of physics concepts became the work of physics faculty at the University of Colorado (PhET project). That is, physics faculty became intellectually curious about how they could answer this question to improve students' performance over the chronology of their learning.


5 A. What Research Tells Us about Learners
Learners Create Meaning
Egocentricity
Sociocentricity
Narrow-mindedness
Routinized habits

6 Meta-cognitive processes (thinking about one's thinking) are a significant means of reinforcing learning
Learning involves creating relationships between short-term and long-term memory

7 Practice in various contexts creates deep or enduring learning
Inert learning; activated learning
Transfer of new knowledge into different contexts is important to deepen understanding

8 Threshold Concepts: pathways central to the mastery of a subject or discipline that change the way students view a subject or discipline, prompting students to bring together various aspects of a subject that they heretofore did not view as related (Land, Meyer, & Smith).

9 People learn differently and may hold onto folk or naive knowledge, incorrect concepts, misunderstandings, false information
Deep learning occurs over time
Transference

10 Learning Progressions: knowledge-based, web-like interrelated actions or behaviors or ways of thinking, transitioning, self-monitoring. May not develop successfully in linear progression, thus necessitating formative assessment along the trajectory of learning. Movements towards increased understanding (Hess).

11 Deep Learning Occurs When Students Are Engaged in Their Learning

12 Learning Strategies of Successful Students
Writing beyond what is visually presented during a lecture
Identifying clues to help organize information during a lecture
Evaluating notes after class
Reorganizing notes after class

13 Comparing note-taking methods with peers
Using one's own words while reading to make notes
Evaluating one's understanding while reading
Consolidating reading and lecture notes
Source: Calvin Y. Yu, Director of Cook/Douglass Learning Center, Rutgers University

14 How well do your students…
Integrate
Transfer
Analyze
(Re)Apply
Re-use
Synthesize
Restructure previous incorrect learning…

15 Within a course or module or learning experience?
Along the chronology of their studies and educational experiences?
From one subject or topic or focus or context to another, such as from an exercise to a case study or internship?

16 Integrated Learning….
Cognitive
Affective
Psychomotor
Forms of Representation within Contexts

17 A Problem-based Assessment Framework
1. Identify the Outcome or Outcomes You Will Assess
2. State the Research or Study Question You Wish to Answer
3. Conduct a Literature Review about That Question
4. Develop a Plan to Collect Direct and Indirect Assessment Results that Will Answer Your Question
5. Analyze and Interpret Students' Work and Students' Responses
6. Collaboratively Discuss Ways to Innovate Pedagogy or Educational Practices
7. Implement Agreed-upon Changes and Reassess
8. Share Developments Within and Outside the Institution to Build Knowledge about Educational Practices

18 II. The Principles and Processes of Assessing the Efficacy of Your Educational Practices
What do you expect your students to demonstrate, represent, or produce by the end of your course or educational experience, by the end of your program of study, or by the end of students' undergraduate or graduate studies?
What chronological barriers or difficulties do students encounter as they learn, from the moment they matriculate?
How well do you identify and discuss those barriers with students and colleagues and then track students' abilities to overcome them so that increasingly "more" students achieve at higher levels of performance?

19 A student learning outcome statement is a complete sentence that describes what you expect students to demonstrate, represent, produce, or do as a result of your teaching and learning practices. It relies on active verbs that describe what you expect students to demonstrate, and it becomes the basis for determining how you will actually assess that expectation.

20 Purposes of Student Learning Outcome Statements
Orient Students to the College's and Each Program's Expectations upon Their Entry into the College or into Their Major Program of Study (FYE?)
Enable Students to Identify Where and How They Have Learned or Are Learning across the Institution
Position Students to Make Connections Between and Among Their Learning Experiences along Their Educational Journey
Lead to Collaborative Agreement about Direct and Indirect Methods to Assess Students' Achievement of Outcomes

21 Cognitive Levels of Learning: Revised Bloom's Taxonomy (Handout 1)
Create
Evaluate
Analyze
Apply
Understand
Know (remember)
(Lorin et al.)

22 Student Learning Outcome Statements
Institution-level Outcomes (GE)
Program- or Department-level Outcomes (including GE)
Course Outcomes / Service Outcomes / Educational Opportunities Outcomes (including GE)

23 Institution-Level (for example, FSC's GE)
CRITICAL THINKING (REASONING): Students will: (1) identify, analyze, and evaluate arguments as they occur in their own or others' work; and (2) develop well-reasoned arguments.
Department- or Program-level (FSC's Nursing)
Integrate evidence-based findings, research, and nursing theory in decision making in nursing practice.

24 Course- or Educational Experience-level
Integrate concepts into systems (BCS 101: Pullan)
Analyze human agency (Reacting to the Past: Menna)
Think critically (EGL 101 and BUS 109: Shapiro and Singh)

25 B. Develop Curricular and Co-curricular Maps
Help us determine coherence among our educational practices that enables us, in turn, to design appropriate assessment methods (See Handouts 2-3)
Identify gaps in learning opportunities that may account for students' achievement levels
Provide a visual representation of students' learning journey

26 Help students make meaning of the journey and hold them accountable for their learning over time
Help students develop their own learning map, especially if they chronologically document learning through eportfolios (See Handouts 2-3; a gap-check sketch follows below)
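Where a department keeps its map in machine-readable form, the gap check these bullets describe can be automated. A minimal sketch, assuming a hypothetical map keyed by course and outcome; the courses and outcomes are borrowed from slide 24, and the I/R/M (introduced/reinforced/mastered) legend is an assumption, not necessarily the scheme in FSC's handouts:

```python
# Hypothetical curricular map: rows are courses, columns are program outcomes.
# "I" = introduced, "R" = reinforced, "M" = mastered; this legend is an
# assumption -- Handouts 2-3 may use a different scheme.
curricular_map = {
    "BCS 101":  {"Critical Thinking": "I", "Analyze Human Agency": None},
    "EGL 101":  {"Critical Thinking": "R", "Analyze Human Agency": "I"},
    "Capstone": {"Critical Thinking": "M", "Analyze Human Agency": None},
}

# Gap check: flag outcomes that no course reinforces or brings to mastery,
# the kind of gap that may account for students' achievement levels.
outcomes = {o for row in curricular_map.values() for o in row}
for outcome in sorted(outcomes):
    levels = {row.get(outcome) for row in curricular_map.values()}
    if not levels & {"R", "M"}:
        print(f"Gap: no course reinforces or masters '{outcome}'")
```

Run against this illustrative map, the check reports that "Analyze Human Agency" is introduced but never reinforced or mastered.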

27 C. Focus on the challenges, obstacles, or "tough spots" that students encounter – Research or Study Questions in Your RFPs
Often:
Collaboratively developed
Open-ended
Coupled with learning outcome statements (Reference RFPs)
Developed at the beginning of the assessment process

28 The Seeds of Research or Study Questions
Informal observations around the water cooler
Results of previous assessment along the chronology of learning or at the end of students' studies
Use of a Taxonomy of Weaknesses, Errors, or Fuzzy Thinking (see Handout 4)

29 Some Examples of Research/Study Questions
What kinds of erroneous ideas, concepts, or misunderstandings predictably interfere with students' abilities to learn or may account for difficulties they encounter later on?
What unsuccessful approaches do students take to solve representative disciplinary, interdisciplinary, or professional problems? Counter that with learning about how successful students solve problems.

30 What conceptual or computational obstacles inhibit students from shifting from one form of reasoning to another form, such as from arithmetic reasoning to algebraic reasoning?
What kinds of cognitive difficulties do students experience across the curriculum as they are increasingly asked to build layers of complexity?
See Handout 5. What is your research/study question? Reference Annotated RFPs

31 D. Review Your Course or Sequence of Courses in a Program or Department for Alignment (See Handouts 6, 7-10, 11)
Program outcomes
Course or experience outcomes
Criteria / standards to assess outcome (see )
Course design: pedagogy, learning context
Assignments (see syllabus example, )
Student feedback

32 E. Identify or Design Assessment Methods that Provide Evidence of Product and Process
Direct methods, including some that provide descriptive data about students' meaning-making processes, such as "Think Alouds"
Indirect methods, including some that provide descriptive data, such as Small Group Instructional Design or the salgsite.org survey
Institutional data (course-taking patterns, percentages or usage rates accompanied with observation or documentation of impact)

33 Some Direct Methods to Assess Students' Learning Processes
Think Alouds: Pasadena City College, "How Jay Got His Groove Back and Made Math Meaningful" (Cho & Davis)
Word edit bubbles
Observations in flipped classrooms
Students' deconstruction of a problem or issue (PLEs in eportfolios can reveal this - tagging, for example)

34 Student recorder's list of trouble spots in small group work, or students' identification of trouble spots they encountered in an assignment
Results of conferencing with students
Results of asking open-ended questions about how students approach a problem or address challenges
Use of reported results from adaptive or intelligent technology
Observations based on 360-degree classroom design (students show work as they solve problems)

35 Use of reported results from adaptive or intelligent technology
Focus on hearing about or seeing the processes and approaches of successful and not-so-successful students
Analysis of "chunks of work" as part of an assignment because you know what will challenge or stump students in those chunks

36 Some Direct Assessment Methods to Assess Students' Products
Scenarios, such as online simulations
Critical incidents
Mind mapping
Questions, problems, prompts

37 Problem with solution: Any other solutions?
Chronological use of case studies
Chronological use of muddy problems
Analysis of video
Debates
Data analysis or data conversion

38 Visual documentation: videotape, photograph, media presentation
Observation of students or other users in representative new or revised practices: what kinds of difficulties or challenges do they continue to face?
Assessment of the quality of X, such as proposals, based on criteria and standards of judgment
Comparison of "before" and "after" results against criteria and standards of judgment

39 Documentation of areas of improved or advanced ability, behavior, or results using a scoring rubric that identifies traits or characteristics at different levels of performance (Refer to Handouts 7-10)
Asking students to respond to a scenario to determine changed behavior (such as in judicial decisions or in decision making about behaviors or choices)

40 Sentence or story completion scenarios (consider the validity of responses in relation to actual behavior)
http://www.ryerson.ca/~mjoppe/Research Process/841process6bl1c4bf.htm
Other for FSC?

41 Some Indirect Methods that Probe Students' Learning Experiences and Processes
SALG (salgsite.org): Student Assessment of Their Learning Gains
Small Group Instructional Design
Interviews with students about their learning experiences: about how those experiences did or did not foster desired learning, about the challenges they faced and continue to face
(Refer to Handout 12 for a list of direct and indirect methods of assessment you might use to assess your students' learning/development)

42 F. Chronologically Collect and Assess Evidence of Student Learning
Baseline (at the beginning): to learn about what students know or how they reason when they enter a program
Formative (along the way): to ascertain students' progress or development against agreed-upon expectations
Summative (at the end): to ascertain students' levels of achievement against agreed-upon expectations

43 Referring to pages 34-42 or to Handout 12, identify both direct and indirect methods you might use to gauge evidence of the efficacy of your educational practice(s) based on baseline evidence:
Professional or legal standards/expectations for performance, such as those established by the Council for the Advancement of Standards, the Association of Governing Boards, The New Leadership Alliance for Student Learning and Accountability, or AAC&U's VALUE rubrics

44 Determine the kinds of inferences you will be able to make based on each method and the problem you are trying to solve
Identify other institutional data that might be useful when you interpret results, such as judiciary board sanctions or other records

45 Your Method of Sampling
Ask yourself what you want to learn about your students and when you want to learn it:
All students
Random sampling of students
Stratified random sampling based on your demographics: informative about patterns of performance that can be addressed for specific populations, such as non-native speakers (a sampling sketch follows below)
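For departments that pull samples programmatically, a minimal sketch of the stratified option, assuming a hypothetical machine-readable roster with a demographic flag (the field names and group labels are illustrative; nothing in the presentation prescribes an implementation):

```python
import random
from collections import defaultdict

def stratified_sample(students, stratum_key, fraction, seed=42):
    """Draw the same fraction of students from each demographic stratum.

    students: list of dicts, e.g. {"id": "S001", "native_speaker": False}
    stratum_key: the demographic field to stratify on
    fraction: proportion of each stratum to sample (between 0 and 1)
    """
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    strata = defaultdict(list)
    for s in students:
        strata[s[stratum_key]].append(s)

    sample = []
    for group in strata.values():
        k = max(1, round(len(group) * fraction))  # at least one per stratum
        sample.extend(rng.sample(group, k))
    return sample

# Illustrative roster: 200 students, 1 in 4 flagged as non-native speakers
roster = [{"id": f"S{i:03d}", "native_speaker": (i % 4 != 0)} for i in range(200)]
print(len(stratified_sample(roster, "native_speaker", fraction=0.2)))  # ~40 students
```

Sampling each stratum at the same rate keeps small populations, such as non-native speakers, visible in the results instead of letting a simple random draw under-represent them.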

46 Scoring
Faculty:
Determine when work will be sampled.
Identify who will score student work (faculty, emeritus faculty, advisory board members, others?).
Establish time and place to norm scorers for inter-rater reliability on an agreed-upon scoring rubric. (See Handout 13; a reliability-check sketch follows below)
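After norming, one quick check of inter-rater reliability is to compare two scorers' rubric levels on the same set of papers. A minimal sketch, assuming the four rubric levels reported later in this deck; percent agreement is paired with Cohen's kappa, which discounts the agreement expected by chance (the scores themselves are made up for illustration):

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Share of papers on which two raters assigned the same rubric level."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Agreement corrected for the agreement expected by chance."""
    n = len(r1)
    observed = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)  # each rater's level frequencies
    expected = sum(c1[lvl] * c2[lvl] for lvl in set(r1) | set(r2)) / (n * n)
    return (observed - expected) / (1 - expected)

rater_a = ["Emerging", "Developing", "Emerging", "Proficient", "Developing"]
rater_b = ["Emerging", "Developing", "Proficient", "Proficient", "Emerging"]
print(f"agreement: {percent_agreement(rater_a, rater_b):.2f}")  # 0.60
print(f"kappa:     {cohens_kappa(rater_a, rater_b):.2f}")       # 0.41
```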

47 G. Report on Results of Scoring Evidence of Student Learning
Office of IR, Director of Assessment, or "designated other":
Analyzes and represents scoring or testing results that can be aggregated and disaggregated to represent patterns of achievement and to answer the guiding research or study question(s) (a sketch of this step follows below)
Develops a one-page Assessment Brief
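A minimal sketch of the aggregate/disaggregate step, assuming scored results are exported to a flat table; the column names and the use of pandas are illustrative assumptions, not part of the presentation:

```python
import pandas as pd

# Hypothetical export from the scoring session: one row per scored paper
scores = pd.DataFrame({
    "student_id": ["S001", "S002", "S003", "S004", "S005", "S006"],
    "cohort":     ["FY", "FY", "Transfer", "FY", "Transfer", "FY"],
    "level":      ["Emerging", "Developing", "Emerging",
                   "Proficient", "Developing", "Emerging"],
})

# Aggregated: overall distribution of rubric levels
print(scores["level"].value_counts(normalize=True))

# Disaggregated: the same distribution broken out by cohort,
# to surface patterns of achievement for specific populations
print(pd.crosstab(scores["cohort"], scores["level"], normalize="index"))
```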

48 The Assessment Brief
Is organized around issues of interest, not the format of the data (the narrative or verbal part of the brief)
Reports results using graphics and comparative formats (the visual part of the brief), such as trends over time or achievement based on representative populations



51 Results based on scoring students' written work (bar chart): Emerging 61%, Developing 20%, Proficient 11%, Exemplary 8%
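The visual part of a brief like the chart above can be generated straight from the scored distribution. A minimal matplotlib sketch using the percentages reported on the slide (the rendering choices and output filename are assumptions):

```python
import matplotlib.pyplot as plt

levels = ["Emerging", "Developing", "Proficient", "Exemplary"]
pct = [61, 20, 11, 8]  # percentages reported on the slide

fig, ax = plt.subplots()
ax.bar(levels, pct)
ax.set_ylabel("Percent of students")
ax.set_ylim(0, 70)
ax.set_title("Results based on scoring students' written work")
for i, p in enumerate(pct):
    ax.annotate(f"{p}%", (i, p), ha="center", va="bottom")  # label each bar
plt.savefig("assessment_brief_chart.png", dpi=150)
```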

52 H. Establish Soft Times and Neutral Zones for Faculty and Other Professionals to Interpret Analyzed Results or to Hear About Your Interpretation of Results
Identify patterns against criteria and cohorts (if possible)
Tell the story that explains the results: triangulate with other reported data, such as results of student surveys

53 Determine what you wish to change or revise, or how you want to innovate, and develop a timetable to reassess once changes are implemented. (See Handouts )

54 Collaboratively Agree on and Re-Assess Changes
Implement agreed-upon changes
Re-assess to determine the efficacy of changes
Focus on collective effort: what we do and how we do it

55 III. Elements of an RFP
State Your Outcome or Outcomes for a Time Period (cycle of inquiry)
Identify the research or study question you will try to answer within the context of current literature
Identify two methods you will use to assess that outcome or set of outcomes

56 Identify your baseline data or initial state (where you started)
Identify the criteria and standards of judgment you will use to chart progress (professional or agreed-upon performance standards, scoring rubrics)
Identify when you will collect data

57 Calendar when you will analyze and interpret results
Identify when you will submit a report that briefly describes your interpretation of results, further needed actions, and conclusions. Tell the story that explains the results based on triangulating the evidence and data you have collected
Calendar when further actions will be taken, including plans to reassess to determine the efficacy of those further actions. (See sample plan in Handouts 14-15)

58 What if we….
Collaboratively use what we learn from this approach to assessment to design the next generation of curricula, pedagogy, instructional design, educational practices, and assignments to help increasingly more students successfully pass through trouble spots or overcome learning obstacles;

59 and, thereby, collaboratively commit to fostering students' enduring learning in contexts other than the ones in which they initially learned. (See Handout 16 to identify where you see the need to build your department's or program's assessment capacity.)

60 Works Cited
Cho, J., and Davis, A. 2008. "How Jay Got His Groove Back and Made Math Meaningful." Pasadena City College. http://www.cfkeep.org/html/stitch.php?s=13143081975303&id=18946594390037
Hess, K. 2008. Developing and Using Learning Progressions as a Schema for Measuring Progress. National Center for Assessment. http://www.nciea.org/publications/CCSSO2_KH08.pdf
Land, R., Meyer, J.H.F., and Smith, J., Eds. 2010. Threshold Concepts and Transformational Learning. Rotterdam: Sense Publishers.
Lorin, A.W., Krathwohl, D.R., Airasian, P.W., and Cruikshank, K.A., Eds. 2000. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Boston, MA: Allyn and Bacon.

61 Maki, P. 2010. Assessing for Learning: Building a Sustainable Commitment Across the Institution. 2nd Ed. VA: Stylus Publishing, LLC.
National Research Council. 2002. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.
Yu, C.Y. "Learning Strategies Characteristic of Successful Students." In Maki, P. 2010, p. 139.

