The Driver Behind Assessment: Intellectual Curiosity about Research or Study Questions that Lead to Changes in Pedagogy. Peggy L. Maki


1 The Driver Behind Assessment: Intellectual Curiosity about Research or Study Questions that Lead to Changes in Pedagogy
Peggy L. Maki, PeggyMaki@aol.com
Assessment Consultant, Editor, and Writer
Presented at the University Faculty Conference, Pepperdine University, Hyatt Regency Newport Beach, October 3, 2008

2 Foci
• Research on Learning that Informs Teaching, Learning, and Assessment
• Case Studies that Illustrate How Assessment Results Lead to Effective Changes in Pedagogy and Educational Practices
• Research or Study Questions that Guide Inquiry into Student Learning

3
• The Design or Selection of Direct and Indirect Methods and Standards and Criteria of Judgment
• Development of a Plan to Answer Your Research or Study Question and Develop Research-Based Curricula: A Plan Used in Research

4 How People Actually Learn
“I reverted to what I learned about trigonometry from how I learned trigonometry in my home country. I could never follow what the American faculty member was telling us to do—I learned it differently.” (international student)
“I was supposed to diagnose a patient the way the faculty member described, but that’s not how I really did it at all. Yet I still was the only one in my class to present the correct diagnosis. I never diagnosed the way I was taught but always made the correct diagnosis.” (neurologist)

5
• “I still use my fingers to count.” (student)
• “I never did well on memory tests about dosages of medicine to prescribe because I knew as a veterinarian that I would be able to look up the dosages. I would ask instead: ‘Observe me diagnosing an ailment, identifying the treatment, and then looking up the dosage needed.’” (veterinarian)

6 Some Things We Know about Learning That Inform the Relationship among Teaching, Learning, and Assessment
• Learning is a complex process of interpretation, not a linear process
• Learners create meaning rather than receive it
• Knowledge is socially constructed (hence the importance of peer-to-peer interaction in high-impact practices such as learning communities and service learning)

7
• People learn differently and prefer certain ways of learning (learning inventories, such as Solomon and Felder: http://www.engr.ncsu.edu/learningstyles/ilsweb.html; teaching style inventories, such as Pratt: http://www.teachingperspectives.com/html/tpi_frames.htm)
• Deep learning occurs over time (transference)

8 Approaches to Learning
• Surface Learning
• Deep Learning

9
• Meta-cognitive processes (thinking about one’s thinking) are a significant means of reinforcing learning
• Learning involves creating relationships between short-term and long-term memory

10
• Transfer of new knowledge into different contexts is important to deepen understanding
NRC. 2001. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.

11 What lines of inquiry can we explore to design the next generation of curricular and co-curricular designs that respond to what we are learning, and can learn, about student learning to improve student achievement?

12 The four case studies in front of you illustrate various ways in which faculty have changed pedagogy, instructional design, and strategies to improve student learning, based on students’ performance in assigned work or at agreed-upon times that capture students’ learning along the chronology of their studies. After collectively identifying a problem in student learning, faculty pursued the reasons for the problems they identified and then developed alternative ways to improve student learning.

13 At your tables, assign various people to read these four case studies. Then, through collaborative discussion at your table, identify problems you may already have consistently seen in student work, or identify how you might work together to identify common problems through your assessment of student learning that would prompt collaborative discussion about improving teaching and learning.

14 Research or Study Questions that Guide Inquiry into Student Learning
Couple and align learning outcome statements with a research or study question about the efficacy of educational practices along the chronology of students’ learning.

15 Levels of Learning Outcome Statements
• Institution-level Outcome Statements (including GE or Core)
• Department-, Program-, or Certificate-level Outcome Statements
• Course/Service/Educational Experience Outcome Statements

16 Characteristics of Outcome Statements
• Describe learning desired within a context
• Rely on active verbs (create, compose, calculate, construct, apply, for example), with a focus on the highest levels by the time students graduate
• Emerge from our collective intentions over time

17
• Can be assessed quantitatively or qualitatively during students’ undergraduate and graduate careers
• Can be mapped to curricular and co-curricular practices (ample, multiple, and varied opportunities to learn over time)
• Are written at the course, program, or institution level

18 Distinguishing between Objectives and Outcomes
• Objectives state overarching expectations, such as: Students will develop effective oral communication skills. OR Students will understand different economic principles.

19 Quantitatively Literate Graduates, According to the MAA, Should Be Able to:
1. Interpret mathematical models such as formulas, graphs, tables, and schematics, and draw inferences from them.
2. Represent mathematical information symbolically, visually, numerically, and verbally.
3. Use arithmetical, algebraic, geometric, and statistical methods to solve problems.

20
4. Estimate and check answers to mathematical problems in order to determine reasonableness, identify alternatives, and select optimal results.
5. Recognize that mathematical and statistical methods have limits.
The Mathematical Association of America (Quantitative Reasoning for College Graduates: A Complement to the Standards, 1996). (http://www.maa.org/pubs/books/grs.html)

21 Ethics: Students should be able to… (institution-level)
• Identify and analyze real-world ethical problems or dilemmas, and identify those affected by the dilemma.
• Describe and analyze the complexity and importance of choices that are available to the decision-makers concerned with this dilemma.

22
• Articulate and acknowledge their own deeply held beliefs and assumptions as part of a conscious value system.
• Describe and analyze their own and others’ perceptions and ethical frameworks for decision-making.
• Consider and use multiple choices, beliefs, and diverse ethical frameworks when making decisions to respond to ethical dilemmas or problems.
California State University Monterey Bay: University Learning Requirements, 2002

23 Ways to Articulate Outcomes
• Adapt from professional organizations
• Derive from the mission of the institution/program/department/service
• Derive from students’ work

24
• Derive from an ethnographic process
• Derive from an exercise focused on listing one or two outcomes “you attend to”
• Draw from taxonomies, such as Bloom’s

25 How well does your outcome statement meet the characteristics of a good statement? (Refer to slides 16-17.)
Ask the person next to you to apply the characteristics of a good outcome statement to your outcome statement(s); then discuss that person’s assessment of your statements. How might each of you improve your statements?

26 External Validation
• Advisory boards
• Recent alums
• Surveys of individuals in a field
• Developments in professional organizations such as AAC&U

27 Sample Research or Study Questions that You Can Join to Your Outcome Statements…
How do students…
• come to know material that we teach?
• represent their learning to themselves?
• initially construct meaning in a field, discipline, or even a course?
• create mental models?

28
• integrate new learning into previous learning?
• store and draw on previous learning?
• reposition or expand their understanding?
• develop dispositions to learn over time?
• reuse or apply stored learning or transfer it?

29
• build layers of complexity in their learning (conceptual complexity, for example)?
• reposition or modify or change altogether long-held understandings, misunderstandings, or beliefs?
• learn or don’t learn as a result of demands of time or “coverage”?
• develop spiritual behaviors, actions, attitudes, values?

30 How well do your students…
• Integrate
• Transfer
• Apply or re-apply
• Re-use
• Synthesize
• Re-position
…their understanding of their GE outcomes or outcomes in their major program of study?

31
• Within a course
• Along the chronology of students’ studies and educational experiences
• From one discipline or topic or focus to another
• From one context or situation to another, such as from courses to the co-curriculum

32 Integrated Learning…
• Cognitive
• Affective
• Expressive
• Psychomotor

33 Questions about Pedagogy or Other Educational Practices in Promoting…
• Recall and recognition
• Transfer
• Integration
• Synthesis
• Application and re-application
• Use and re-use
• Change in perspective or understanding
• Sustained learning

34 What Do You Want to Discover about Teaching and Learning? Discovery Questions
• Efficacy of kinds of pedagogy (problem-based, experiential, didactic) that promote complex problem solving in a discipline
• Efficacy of the theory behind your teaching and instructional design
• Efficacy of curricular or relevant course(s) design or co-curricular design

35
• Efficacy of the use of educational experiences (service learning and learning communities, for example)
• Efficacy of intentional scaffolding through online or face-to-face instruction along the curriculum
• Efficacy of the use of out-of-course assistance, such as tutorials or software programs
• Efficacy of instructional design (computer-based, for example)

36
• What strategies enable students to develop strong conclusions (use of graphic organizers, for example)?
• What kinds of representational models develop complex conceptual understanding? Or: what kinds of representations are conducive to learning in your field? (Physics)
• What are the relationships between students’ study habits and deep learning?

37
• To what extent do students engage in and develop higher-order thinking skills and critical reflection in a discipline or across GE?
• What strategies enable students to transition from thinking arithmetically to thinking algebraically?
• How do students’ beliefs affect conceptual development?

38
• What strategies enable students to overcome learning barriers or obstacles?
• How do students’ levels of cognition affect their conceptual development?
• How do educators’ epistemological views in their fields, translated into instructional design, foster enduring student learning?

39
• How well do students transfer their early learning in a discipline or profession into their later learning?
• How well do students transfer learning from GE courses into their major program of study?
• How well do students transfer their GE learning or major program learning into life outside the classroom, such as community service?

40
• How well do digital dialogue games or other forms of technology stimulate students’ reasoning or conceptual change?
• When students reposition their understanding, is it based on belief revision or on conceptual change and restructured knowledge (talk-alouds)?
• How effective are hypermedia technologies in fostering complex problem solving?

41
• What strategies do students use to restructure naïve or intuitive theories?
• How well do students build their own knowledge based on the use of instructional multimedia designs?
• What strategies do successful students use to read and interpret texts, visuals, and maps?

42
• What barriers do students face when they read and interpret texts, etc.? What strategies help them overcome those barriers (vocabulary, discourse organization, comprehension, math)? (Philosophy example)
• How well do interactive discussions help students construct knowledge?

43 What Is the Question You Want to Answer about One of Your GE or Program-Level Outcomes?
• What’s your study question? Or
• What’s your research question?
________________________________________

44 What Other Data Might You Need to Answer Your Question?
• Baseline exercises, such as concept inventories used in Physics, case studies used over time, or simulations used over time
• Maps or inventories of practice
• Surveys or interviews with students about their learning

45
• Transcript analyses of course-taking patterns
• Participation in co-curricular programs
• Educator interviews

46
• SALG results
• Syllabi analyses of kinds of in-class assessments or methods of teaching/learning
• Student think-alouds

47 Think-Alouds…
Quellmalz and Haydel (2003) found in cognitive analyses of think-alouds that “students were more likely to use schematic and strategic knowledge on performance assessments than on multiple-choice items.” Assessment approaches that require students to construct and explain thinking as they solve problems can measure distinct components of inquiry and problem solving, including stating research questions, posing hypotheses, planning and conducting investigations, gathering evidence, analyzing data, considering disconfirming evidence, and communicating interpretations.
http://serc.carleton.edu/files/NAGTworkshops/Assess/QuellmalzEssay/pdf

48 The Design or Selection of Direct and Indirect Methods and Standards and Criteria of Judgment
“Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary.”
National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.

49
• Assumptions Underlying Teaching / Actual Practices
• Assumptions Underlying Assessment Tasks / Actual Tasks

50 Shulman on Assessment Methods
“… the first lesson regarding an assessment is to take responsibility for locating its unavoidable insufficiencies in relation to what it claims it can measure…. We do not seek one perfect measurement instrument, but an array of indicators that can be understood in relation to one another.”
Lee Shulman. “Principles for the Uses of Assessment in Policy and Practice.”

51 What Tasks Elicit the Learning You Desire?
• Tasks that require students to select among possible answers (multiple-choice tests)?
• Tasks that require students to construct answers (students’ problem-solving and interdisciplinary thinking abilities)?

52 When Will or Do You Seek Evidence?
• Formative (along the way)? For example, to ascertain progress or development against pedagogy
• Summative (at the end)? For example, to ascertain the level of final achievement

53 Direct Methods
• Focus on how students represent or demonstrate their learning (meaning making)
• Align with students’ learning and assessment experiences
• Align with curricular and co-curricular design, verified through mapping

54 Possible Assessment Methods Higher Education Can Use to Learn More about How Students Learn (see also the handout on methods)
• Student Assessment of Learning Gains (SALG): asks students to identify ways they actually learned across components or elements of a lesson or course; could be extended across the program of study
• Online journals that record how students make meaning/solve problems
• Wikis (knowledge-building sites)
• Classroom Response System (CRS): clickers

55
• Assessment checkpoints based on layers of learning in vertical themes (skill layers, factual layers, theoretical layers, conceptual layers, interpretive layers, knowledge layers, logic layers, methods layers, reasoning layers)
• Online discussion boards
• Small Group Instructional Diagnosis (SGID), conducted by someone other than the faculty member teaching the course: http://www.ntff.com/ntff/sgid.doc

56
• Resulting patterns from engagement with interactive computer-simulated tasks that provide data on patterns of actions, decisions, etc., and branch students forward or backward
• Knowledge, decision, or procedural maps (e.g., the spider concept map at http://classes.aces.uiuc.edu/aces100/mind/c_m2.html)

57 What Criteria Will Be Applied to Student Achievement so That You Can Make Inferences about Students’ Achievement of Your Outcome?
• Skills
• Knowledge
• Habits of mind (disciplinary or interdisciplinary habits of mind)
• Ways of knowing

58
• Dispositions (spiritual?)
• Research strategies/approaches
• Disciplinary conventions
• Ways of problem solving (including increasingly complex problems)

59 How Well Do These Criteria Align with…
• Teaching practices
• Learning practices (how we position students to learn)
• Frequency of feedback
• Students’ learning histories
• Design and coherence of the curriculum and co-curriculum (multiple and diverse opportunities to learn)

60 Development of a Plan to Answer Your Research or Study Question and Develop Research-Based Curricula: A Plan Used in Research

61 Contributions from Research on Student Learning Based on…
• Deconstruction of a unit, course, or the curriculum into layers, components, or elements
• Experimentation with pedagogy based on assumptions about how students learn those layers, components, or elements
• Assessment of student learning after each layer, component, or element to ascertain the efficacy of specific kinds of pedagogy for each layer

62
• Element-based, component-based, or layer-based student-directed questions
• Think-alouds that ask students to construct and explain thinking

63 Example: Deconstruct the curriculum based on vertical themes, such as in a medical program:
• Nutrition
• Pain
• Disability
• Life cycle
• Personal development
• Communication
• Evidence-based practice
• Ethics & legal responsibilities
• Psychological aspects of clinical practice
• Pharmacology and therapeutics
• Public health
Then:
• Align pedagogy or forms of instruction with learning points or elements
• Assess learning along the learning points or elements to ascertain the efficacy of pedagogy or instructional design for learners

64
• Deconstruct themes into elements, layers, or components across the curriculum
• Identify chronological pedagogy or forms of instruction along those layers
• Develop assessment methods that align with pedagogy or instructional design in each layer or component
• Use focus groups or surveys of students’ responses to the pedagogy related to each layer or element

65 Example: VaNTH ERC (Vanderbilt-Northwestern-Texas-Harvard/MIT Engineering Research Center)
Focuses on “real life” challenges in professional education. Turner and Thomas argue in Clear and Simple as the Truth that writing skills are most successfully taught when they are integrated with genuine (rather than contrived) activities that build on past learning, create a real need for the new skills, and offer an opportunity to learn those skills. As they explain: “Intellectual activities lead to skills, but skills do not generate intellectual activities” (p. 4).

66 According to Hirsch et al., “Instead of writing essays, papers, and exams, students write to faculty and clients to communicate important information about their projects: for example, they write mission statements, report on client meetings, synthesize the results of research, prepare progress reports, and create slides for PowerPoint presentations. Thus, as a communication course, EDC sends a strong, clear message to students: communication is an integral part of the design enterprise, not merely a superficial matter of editing. Clear communication advances creative problem-solving, the heart of engineering design.” (Hirsch et al., p. 4)

67 Results…
• Request student performance analyses that can be aggregated and disaggregated according to your research or study question, such as performance based on students’ course-taking patterns, different pedagogies, or different contexts for learning
• Request narrative interpretation of student performance (recall Case 4)

70 “It is always possible to defend the inspirational lecturer, the importance of academic individuality, the value of pressuring students to work independently, but we cannot defend a mode of operation that actively undermines a professional approach to teaching. Teachers need to know more than just their subject. They need to know the ways it can come to be understood, the ways it can be misunderstood, what counts as understanding; they need to know how individuals experience the subject. But they are neither required nor enabled to know these things.” (Diana Laurillard, p. 6)

71 Selected Resources
Hirsch, P., Kelso, D., Shwom, B., Troy, J., Walsh, J. Redefining Communication Education for Engineers: How the NSF/VaNTH ERC is Experimenting with a New Approach. Northwestern University, Session 2261 (copy available at www.vanth/docs/016_2001.pdf).
Holbert, N. (February 2008). “Shooting Aliens: The Gamer's Guide to Thinking.” Educational Leadership, Vol. 65, No. 5.
Laurillard, D. (1993). Rethinking University Teaching: A Framework for the Effective Use of Educational Technology. London: Routledge.

72 Maki, P. 2004. Assessing for Learning: Building a Sustainable Commitment across the Institution. VA: Stylus Publishing, LLC. (to be revised in 2009)
National Research Council. 2001. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.
Physics Education Technology Project (http://phet.colorado.edu/web-pages/publicatons/phet_aapt-04pd)

73 Quellmalz and Haydel. 2003. Center for Technology in Learning at SRI International. Available at: http://serc.carleton.edu/files/NAGTworkshops/Assess/QuellmalzEssay/pdf
Shulman, L. 2006. “Principles for the Uses of Assessment in Policy and Practice.” President’s Report to the Board of Trustees of the Carnegie Foundation for the Advancement of Teaching. CA: Stanford. Available at: www.teaglefoundation.org/learning/resources.aspx#assessment
Material presented in this workshop will be integrated into Maki’s 2009 2nd edition of Assessing for Learning. Stylus Publishing, VA.

