
1 Developing and Assessing General Education Learning Outcomes: A Collaborative Commitment across the Institution
Workshop at Miami Dade College, November 21, 2005
Peggy Maki, PeggyMaki@aol.com

2 Workshop Foci
- Building a Culture of Evidence across the Institution
- Grounding Assessment of GE in Teaching and Learning
- Collaboratively Developing Learning Outcome Statements: Claims about Student Learning

3 Workshop Foci (cont'd)
- Validating Learning Outcome Statements through Maps and Inventories of Educational Practice
- Designing or Selecting Valid Assessment Methods that Align with Students' Educational Experiences
- Developing Standards and Criteria of Judgment

4 Workshop Foci (cont'd)
- Analyzing and Interpreting Results of Student Work
- Closing the Inquiry Loop

5 Mission/Purposes → Learning Outcome Statements → How well do students achieve our outcomes? → Gather Evidence → Interpret Evidence → Enhance teaching/learning; inform institutional decision-making, planning, budgeting → (back to Mission/Purposes)

6 Your Learning Outcomes
- Articulate some GE learning outcome statements that align with what and how students learn in your programs and services
- Map GE outcome statements to assure students have diverse and multiple opportunities to learn
- Identify some direct and indirect assessment methods to capture student learning

7 Your Learning Outcomes (cont'd)
- Develop some standards and criteria of judgment to score student work
- Identify when and where to assess and how to collect evidence of student learning
- Identify when, and by whom, evidence of student learning will be assessed

8 Your Learning Outcomes (cont'd)
- Identify possible times across the institution when colleagues can come together to interpret results and reach consensus about ways to improve student learning
- After implementing changes, identify when you will reassess the efficacy of those changes

9 Building a Culture of Evidence
Contributors to learning: faculty; tutors; student services staff; mentors; intern advisors; academic advisers; athletic coaches; advisory board members; librarians and resource staff; support services staff; graduate students and teaching assistants; lab assistants; spiritual leaders; peers.

10 R.W. Emerson, "Intellect," Essays (1841): "How can we speak of the action of the mind under any divisions, as of its knowledge, of its ethics, of its works, and so forth, since it melts will into perception, knowledge into act? Each becomes the other. Itself alone is. Its vision is not like the vision of the eye, but is union with the things known."

11 How do you learn? List several strategies you use to learn:
____________________________________

12 Grounding Assessment of GE in Teaching and Learning
- Learning is a complex process of interpretation, not a linear process
- Learners create meaning, as opposed to receiving meaning
- Knowledge is socially constructed (hence the importance of peer-to-peer interaction)
National Research Council. Knowing What Students Know, 2001.

13 Grounding Assessment of GE in Teaching and Learning (cont'd)
- Learning involves creating relationships between short-term and long-term memory
- Transfer of new knowledge into different contexts is important to deepen understanding
- Practice in various contexts creates expertise

14 Grounding Assessment of GE in Teaching and Learning (cont'd)
- People learn differently and prefer certain ways of learning
- Deep learning occurs over time (transference)
- Meta-cognitive processes (thinking about one's own thinking and ways of knowing) are a significant means of reinforcing learning

15 Integration of learning and development over time: Cognitive, Affective, Psychomotor

16 Specific Questions that Guide Assessment
- What do you expect your students to know and be able to do by the end of their education at your institution?
- What do the curricula and other educational experiences "add up to"?
- What do you do in your classes or in your programs or services to promote the kinds of learning or development that the institution seeks?

17 Questions (cont'd)
- Which students benefit from various classroom teaching strategies or educational experiences?
- What educational processes are responsible for the intended student outcomes the institution seeks?
- How can you help students make connections between classroom learning and experiences outside of the classroom?

18 Questions (cont'd)
- What pedagogies and educational experiences develop knowledge, abilities, habits of mind, ways of knowing and problem solving, and dispositions?
- How are the curriculum and co-curriculum designed to develop knowledge, abilities, habits of mind, ways of knowing, and dispositions?

19 Questions (cont'd)
- How do you intentionally build upon what each of you teaches or fosters to achieve programmatic and institutional objectives (contexts for learning)?
- What methods of assessment capture desired student learning, that is, methods that align with pedagogy, content, and curricular and instructional design?

20 Common Categories of GE Learning
- Writing
- Speaking
- Quantitative reasoning
- Problem solving, critical thinking

21 Common Categories of GE Learning (cont'd)
- Leadership
- Lifelong learning
- Ethical awareness; social responsibility
- Teamwork
- Global perspectives; multiple perspectives

22 Mesa Community College Categories (AZ)
- Written and oral communication
- Critical thinking/problem solving
- Numeracy
- Arts and humanities
- Scientific inquiry
- Information literacy
- Cultural diversity

23 Inventory from MDC's Student Services (last Friday)
- Writing
- Speaking
- Reading comprehension
- Critical thinking/problem solving
- Quantitative reasoning/problem solving
- Technology
- Application of knowledge
- Proficiency in a chosen field

24 Inventory from MDC's Student Services (cont'd)
- Cultural literacy
- Globalism
- Teamwork/solo work
- Self-initiative/independence
- Social responsibility
- Ethical awareness
- Leadership
- Ability to adapt to environments/changes

25 Categories under which students learn and develop
List several categories under which you believe students learn or develop as a result of MDC's GE program:
____________________________________

26 Inventory Based on Nov. 21 Cross-Campus Group Work
- Writing
- Speaking
- Listening
- Quantitative reasoning, including the ability to assess and evaluate
- Critical thinking
- Ethical awareness; personal/social responsibility, including cultural dimensions

27 Inventory (cont'd)
- Environmental ethics
- Computer/information literacy
- Cultural literacies
- Problem solving
- Problem posing
- Financial responsibility
- Workforce skills
- Knowledge about self, others, community, world

28 Inventory (cont'd)
- Leadership
- Active learning (self)
- Teamwork
- Ability to link across the curriculum and experiences
- Time management
- Global perspectives/diversity
- Cultural sensitivity
- Interpersonal skills
- Adaptability

29 Inventory (cont'd)
- Scientific thinking/methods
- Appreciation of the arts, including a global perspective
- Life skills

30 Collaboratively Developing Learning Outcome Statements
Learning outcome statements describe what students should be able to demonstrate, represent, or produce based on how and what they learn at the institution through multiple, varied, and intentional learning opportunities.

31 Learning outcome statements:
- Rely on active verbs, such as create, compose, calculate, develop, build, evaluate, and translate, that target what we expect students to be able to demonstrate
- Emerge from what we value and from how we teach and how students learn; that is, they emerge from our educational practices and are developed through consensus
- Are written for a course, program, service, or the institution

32 Learning outcome statements (cont'd):
- Can be mapped to the curriculum and co-curriculum
- Can be assessed quantitatively or qualitatively

33 Levels of Learning Outcome Statements
- Institution-level outcome statements, including GE
- Department-, program-, and certificate-level outcome statements
- Course/service/educational-experience outcome statements

34 Distinguishing between Objectives and Outcomes
Objectives state overarching expectations, such as:
- Students will develop effective oral communication skills, OR
- Students will understand different economic principles.

35 Mesa Outcomes under Arts and Humanities
- Demonstrate knowledge of human creations
- Demonstrate an awareness that different contexts or world views produce different human creations
- Demonstrate an understanding and awareness of the impact that a piece has on the relationship and perspective of the audience
- Demonstrate an ability to evaluate human creations

36 Capital Community College (CT)
- Communicate effectively
- Reason scientifically and/or quantitatively
- Think critically
- Develop a global perspective
(See handout)

37 Ethics: Students should be able to…
- Identify and analyze real-world ethical problems or dilemmas, and identify those affected by the dilemma
- Describe and analyze the complexity and importance of choices that are available to the decision-makers concerned with this dilemma

38 Ethics (cont'd)
- Articulate and acknowledge their own deeply held beliefs and assumptions as part of a conscious value system
- Describe and analyze their own and others' perceptions and ethical frameworks for decision-making
- Consider and use multiple choices, beliefs, and diverse ethical frameworks when making decisions to respond to ethical dilemmas or problems
California State University Monterey Bay: University Learning Requirements, 2002

39 Example from ACRL
The information literate student evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system.
ONE OUTCOME: The student examines and compares information from various sources in order to evaluate reliability, validity, accuracy, timeliness, and point of view or bias.

40 Quantitatively Literate Graduates, according to the MAA, Should Be Able to:
1. Interpret mathematical models such as formulas, graphs, tables, and schematics, and draw inferences from them.
2. Represent mathematical information symbolically, visually, numerically, and verbally.
3. Use arithmetical, algebraic, geometric, and statistical methods to solve problems.

41 (cont'd)
4. Estimate and check answers to mathematical problems in order to determine reasonableness, identify alternatives, and select optimal results.
5. Recognize that mathematical and statistical methods have limits.
http://www.ma.org/pubs/books/qrs.html
The Mathematical Association of America (Quantitative Reasoning for College Graduates: A Complement to the Standards, 1996). See also the AMATYC draft, 2006.

42 Writing
- See NCTA Guidelines
- See the WPA Outcomes in the attachments for outcomes at the end of the first year of writing

43 Ways to Articulate Outcomes
- Adapt from professional organizations
- Derive from the mission of the institution/program/department/service
- Derive from students' work

44 Ways to Articulate Outcomes (cont'd)
- Derive from an ethnographic process
- Derive from an exercise focused on listing one or two outcomes "you attend to"
- Consult taxonomies

45 Taxonomies That May Help You Develop Outcome Statements
- Bloom's Taxonomy: cognitive, psychomotor, affective
- Webb's Taxonomy: depth of knowledge
- Shulman's Taxonomy: table of learning

46 Depth of Knowledge (Webb)
- Recall and recognition
- Processing skills and concepts
- Strategic thinking
- Extended thinking (complex reasoning, planning, design)

47 Dimensions of Knowledge
- Facts
- Procedures: series of step-by-step actions and decisions that result in the achievement of a task
- Processes: flow of events or activities that describe the big picture

48 Dimensions of Knowledge (cont'd)
- Concepts: class of items, words, or ideas known by a common name
- Principles: guidelines, rules, parameters
- Metacognitive: knowledge of one's own cognition

49 Shulman's Taxonomy
- Engagement (active learning)
- Knowledge and understanding
- Performance, practice, or action (act in and on the world)
- Reflection and critique (cease action to discover or "make progress")

50 Shulman's Taxonomy (cont'd)
- Judgment and design: consider context, even restraints
- Commitment and identity: move inward and connect outward
http://www.carnegiefoundation.org/elibrary/docs/printable/making_differences.htm

51 Exercise: Write one or two GE learning outcome statements under a category of learning
___________________________________

52 Exercise: How well do your learning outcome statements meet the criteria for well-written outcome statements (see handout)?

53 Validating Learning Outcome Statements through Maps and Inventories of Practice
Maps and inventories:
- Reveal how we translate outcomes into educational practices offering students multiple and diverse opportunities to learn
- Help us to identify appropriate times to assess those outcomes
- Identify gaps in learning or opportunities to practice

54 Maps and inventories (cont'd):
- Help students understand our expectations of them
- Place ownership of learning on students
- Enable students to develop their own maps or learning chronologies

55 Collaborative Development of a Curricular-Co-Curricular Map
(Template: each row is an outcome; columns are courses and educational experiences; each cell is coded I, R, or E.)

             Course | Educational experience
Outcome 1:          |
Outcome 2:          |
Outcome 3:          |
Outcome 4:          |
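For readers who want to work with such a map programmatically, here is a minimal sketch (not part of the workshop materials) that stores the grid as a Python dictionary and flags outcomes students never get to practice beyond the introduction. The course codes and the legend I = introduced, R = reinforced, E = emphasized are illustrative assumptions, not MDC's actual curriculum.

```python
# Minimal sketch: a curricular map as a dict of outcome -> {course: code}.
# Course codes and the I/R/E legend (introduced/reinforced/emphasized)
# are hypothetical placeholders.
curriculum_map = {
    "Outcome 1: communicate effectively in writing": {
        "ENC1101": "I", "ENC1102": "R", "Capstone": "E",
    },
    "Outcome 2: reason quantitatively": {
        "MAT1033": "I", "STA2023": "R",
    },
    "Outcome 3: think critically": {
        "PHI2010": "I",
    },
}

# A gap: an outcome that is introduced but never reinforced or emphasized,
# i.e., students lack multiple and diverse opportunities to learn it.
for outcome, coverage in curriculum_map.items():
    if not set(coverage.values()) & {"R", "E"}:
        print(f"Gap: {outcome!r} is introduced but never reinforced or emphasized")
```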

56 Inventories of Educational Practice
- Provide in-depth information about how students learn along the continuum of their studies
- Identify the range of educational practices and assessment experiences that contribute to learning outcomes
(See handouts)

57 Exercise: How will you use maps and inventories?
- Discuss how you will go about the process of developing a curricular or curricular-co-curricular map and how you will label people's entries
- Discuss how you might use inventories of educational practices

58 Designing or Selecting Valid Assessment Methods that Align with Students' Educational Experiences
"Every assessment is also based on a set of beliefs about the kinds of tasks or situations that will prompt students to say, do, or create something that demonstrates important knowledge and skills. The tasks to which students are asked to respond on an assessment are not arbitrary."
National Research Council. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press, 2001, p. 47.

59 Design or Select Assessment Methods that Prompt Students to:
- Transfer, integrate, apply, synthesize
- Value interdependence among courses and experiences
- Re-use and re-configure what they have learned (even to re-position their understanding)
- Self-reflect on their emerging learning

60 For example, do students:
- Apply business principles to a student-run organization?
- Apply principles of effective writing to a proposal for an independent study or project?
- Explore multiple perspectives in solving a campus issue or problem?
- Self-reflect on principles underlying their actions or decisions?

61 (Diagram) Assumptions Underlying Teaching / Actual Practices / Assumptions Underlying Assessment Tasks / Actual Tasks

62 (Diagram, cont'd) Inference Drawing / Validity of the Method

63 What Tasks Elicit the Learning You Desire?
- Tasks that require students to select among possible answers (multiple-choice tests)?
- Tasks that require students to construct answers (revealing students' problem-solving and thinking abilities)?
Question: Consider the contexts for each of these kinds of tasks in your work.

64 When Do You Seek Evidence?
- Formative (along the way): for example, to ascertain progress or development
- Summative (at the end): for example, to ascertain mastery level of achievement

65 Direct Methods of Assessment
- Focus on how students represent or demonstrate their learning (meaning making)
- Align with students' learning and assessment experiences
- Align with curricular and co-curricular design verified through mapping

66 Direct Methods of Assessment (cont'd)
- Invite collaboration in design (faculty, students, tutors?)

67 Standardized Instruments
- Psychometric approach: values quantitative methods of interpretation
- History of validity and reliability
- Quick and easy adoption and efficient scoring
- One possible source of evidence of learning

68 Standardized Instruments Do Not Usually Provide
- Evidence of the strategies, processes, ways of knowing, understanding, and behaving that students draw upon to represent learning
- Evidence of the complex and diverse ways in which humans construct and generate meaning
- Highly useful results that relate to pedagogy, curricular design, and sets of educational practices

69 Authentic, Performance-based Methods
- Focus on integrated learning
- Directly align with students' learning and previous assessment experiences
- Provide opportunity for students to generate responses, as opposed to selecting responses
- Provide opportunity for students to reflect on their performance

70 Authentic, Performance-based Methods Do Not Provide
- Immediate reliability and validity (unless there has been a history of use)
- Easy scoring, unless closed-ended questions are used

71 Direct Methods across Students' Learning Chronology
- On-line tools
- Critical events
- Assemblage of learning objects
- Virtual learning environments or situations (including chatrooms and resource rooms)

72 Direct Methods (cont'd)
- Scenarios
- Storyboards
- Self-directed group projects
- Magic box
- Personal and annotated websites

73 Direct Methods (cont'd)
- Log book or journal tasks that explore an issue over time
- Event analysis
- Video clips
- Case studies over time, as students move through courses and educational experiences

74 Direct Methods (cont'd)
- Externally or internally juried/reviewed projects
- Oral defense
- E-portfolio
- Aristotle's finger exercises
- Interpreting visual material or data

75 Direct Methods (cont'd)
- Representation: concept mapping or problem solving (3-D)
- Practice of artists' maquettes
- Mining data
- Students' drawings and models (perceptual work enhances understanding and analytical ability)

76 Direct Methods (cont'd)
- Chronological tasks that prompt students to stretch over time
- Drawing on knowledge/understanding to solve a problem in a different context
- Problems with solutions: are there other solutions?
- Team-based projects
- Self-reflections

77 Direct Methods (cont'd)
- Magnifying or reducing to seek wider implications and relationships (a la Lewis Thomas)
- Professional/disciplinary practices
- Embedded assignments

78 Direct Methods (cont'd)
- Performance on national licensure examinations
- Locally developed tests

79 Indirect Methods of Assessment That Can Be Combined with Direct Methods
- Programs or courses selected by students
- Focus groups (representative of the population)
- Interviews (representative of the population)
- Surveys

80 Other Sources of Information that May Be Useful in Your Interpretation
- CSSE results
- Grades
- Participation rates or persistence in support services

81 Other Sources (cont'd)
- Course-taking patterns
- Students' majors
- Transcript analyses or audits (co-curricular transcript?)

82 Exercise: Using the handout, determine the degree of alignment of the direct and indirect methods you may use to assess your outcome statements.

83 Developing Standards and Criteria of Judgment
Scoring rubrics:
- Consist of a set of criteria that identifies the expected characteristics of a text and the levels of achievement along those characteristics
- Are criterion-referenced, providing a means to assess the multiple dimensions of student learning
- Are collaboratively designed based on how and what students learn (based on curricular-co-curricular coherence)

84 Scoring rubrics (cont'd):
- Are aligned with the ways in which students have received feedback (students' learning histories)
- Are used by students to develop work and to understand how their work meets standards (can provide a running record of achievement)

85 Scoring rubrics (cont'd):
- Are used by raters to derive patterns of student achievement and to identify strengths and weaknesses
- Take two forms: analytic and holistic

86 Interpretation through Scoring Rubrics
Criteria descriptors (ways of thinking, knowing, or behaving represented in the work):
- Creativity
- Self-reflection
- Originality
- Integration
- Analysis
- Disciplinary logic

87 Criteria descriptors (traits of the performance, work, or text):
- Coherence
- Accuracy or precision
- Clarity
- Structure

88 Performance descriptors (describe how well students execute each criterion or trait along a continuum of score levels):
- Exemplary / Commendable / Satisfactory / Unsatisfactory
- Excellent / Good / Needs Improvement / Unacceptable
- Expert / Practitioner / Apprentice / Novice

89 Development of Scoring Rubrics
- Emerging work in professional and disciplinary organizations
- Research on learning (from novice to expert)
- Student work

90 Development of Scoring Rubrics (cont'd)
- Interviews with students
- Experience observing students' development

91 Consider the following guidelines as you develop a scoring rubric for one or more of our outcomes:
- Identify the purpose of the rubric: for student feedback, for justifying a grade, or for program-level understanding about student learning
- Identify the overall format: analytic or holistic?

92 Guidelines (cont'd)
- Identify the full range of criteria you will assess, with indicators for these criteria
- Identify the performance descriptors; within each cell, identify the leveled performance
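To make the analytic format concrete, here is a minimal sketch (an illustration, not an instrument from the workshop) that represents a rubric as criteria mapped to performance descriptors and converts a rater's level choices into numeric scores. All criteria, descriptors, and level names are hypothetical placeholders.

```python
# Minimal sketch of an analytic rubric: criteria -> {level: descriptor}.
# Criteria, levels, and descriptors are illustrative placeholders.
LEVELS = ["Exemplary", "Commendable", "Satisfactory", "Unsatisfactory"]

rubric = {
    "Coherence": {
        "Exemplary": "Ideas flow logically; transitions guide the reader throughout.",
        "Commendable": "Mostly logical flow with occasional weak transitions.",
        "Satisfactory": "Organization is discernible but uneven.",
        "Unsatisfactory": "No discernible organization.",
    },
    "Accuracy": {
        "Exemplary": "All claims are supported and factually correct.",
        "Commendable": "Minor factual slips that do not affect the argument.",
        "Satisfactory": "Several unsupported or imprecise claims.",
        "Unsatisfactory": "Pervasive factual errors.",
    },
}

def score(ratings: dict[str, str]) -> dict[str, int]:
    """Map a rater's level choice per criterion to a number (4 = Exemplary)."""
    return {crit: len(LEVELS) - LEVELS.index(level) for crit, level in ratings.items()}

print(score({"Coherence": "Commendable", "Accuracy": "Exemplary"}))
# -> {'Coherence': 3, 'Accuracy': 4}
```

Keeping descriptors alongside scores is one way to give students the rubric as feedback while still yielding numbers that raters can compare across a program.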

93 Pilot-testing the Scoring Rubric
- Apply the rubric to student work to assure you have identified all the dimensions, with no overlap
- Schedule inter-rater reliability times:
  - independent scoring
  - comparison of scoring
  - reconciliation of responses
  - repeat cycle
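The comparison step can be quantified with an agreement statistic. Below is a minimal sketch of Cohen's kappa, one common choice for two raters; the workshop does not prescribe a particular statistic, and the sample scores and the 0.6 rule of thumb are illustrative assumptions.

```python
# Minimal sketch: Cohen's kappa for two raters scoring the same set of work.
from collections import Counter

def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    n = len(rater_a)
    # Observed agreement: fraction of papers both raters scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical: two raters independently scoring ten papers on a 4-level scale.
a = ["E", "C", "C", "S", "E", "U", "C", "S", "E", "C"]
b = ["E", "C", "S", "S", "E", "U", "C", "C", "E", "C"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # ~0.71; below ~0.6, reconcile and rescore
```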

94 Analyzing and Interpreting Results
- Seek patterns against criteria and cohorts
- Build in institution-level and program-level discourse
- Tell the story that explains the results; triangulate with other data, such as CSSE results or participation rates

95 Analyzing and Interpreting Results (cont'd)
- Be able to aggregate and disaggregate data to guide focused interpretation (see the sketch below)
- Collectively determine what you wish to change
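As one way to support that aggregation and disaggregation, here is a minimal sketch using pandas; the cohorts, outcomes, and scores are fabricated purely for illustration.

```python
# Minimal sketch: aggregate and disaggregate rubric scores with pandas.
# Cohort labels, outcomes, and scores are illustrative placeholders.
import pandas as pd

scores = pd.DataFrame({
    "cohort":  ["day", "day", "evening", "evening", "online", "online"],
    "outcome": ["writing", "quantitative reasoning"] * 3,
    "score":   [3.4, 2.8, 3.1, 2.2, 3.6, 2.5],  # mean rubric scores, 4-point scale
})

# Aggregate: a single institution-wide mean hides where the weaknesses are.
print(scores["score"].mean())

# Disaggregate: per-outcome, per-cohort means guide focused interpretation.
print(scores.groupby(["outcome", "cohort"])["score"].mean())
```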

96 Examples of Changes
- Increased attention to weaving experiences across the institution, a program, or a department to improve student achievement
- Changes in advising based on assessment results
- Closer monitoring (tracking) of student achievement

97 Examples of Changes (cont'd)
- Faculty and staff development to learn how to integrate experiences that contribute to improved student learning
- Changes in pedagogy and curricular and co-curricular design
- Development of modules to assist learning; use of technology; self-paced learning; supplemental learning

98 Closing the Inquiry Loop to Learn
- Implement agreed-upon changes
- Re-assess to determine the efficacy of changes
- Focus on collective effort: what we do and how we do it

99 Structures
- Assessment committees at the institution and department or program levels
- Development of task forces to assume responsibilities

100 Assessment Committee
- Working group that identifies collective expectations for learning
- Working group that develops outcome statements to guide cycles of inquiry
- Working group that selects or designs assessment methods to capture learning over time
- Working group that collects and analyzes students' work or responses
- Collective community interpretations of results
- Collective community contributions about ways to adapt, revise, or innovate practices to improve student learning
- Collective community dialogue designed to build on institutional learning

101 Communication: Collaborative Interpretation
- Disciplinary work groups
- Cross-disciplinary work groups
- Formal opportunities to share program-level findings at the institution level, and institution-level findings at the program level

102 Communication: Decision-making Bodies
- Planning (short- and long-term)
- Budgeting
- Decision-making
- Allocation of resources

103 Human, Financial, and Technological Support
- Graduate students or part-time support to assist with the development of methods, research on methods, and collection or analysis
- Analysis of results
- Faculty and staff development or resources to support efforts
- Development of technology to house results or to draw from existing data

104 Exercise: Describe the structures, processes, decisions, and channels and forms of communication that currently exist at MDC, as well as your ideas for deepening the commitment to assessment (see handout).

105 "What and how students learn depends to a major extent on how they think they will be assessed."
John Biggs, Teaching for Quality Learning at University: What the Student Does. Society for Research into Higher Education & Open University Press, 1999, p. 141.

106 Works Cited
Biggs, J. (1999). Teaching for Quality Learning at University: What the Student Does. Society for Research into Higher Education & Open University Press.
Maki, P. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution. Sterling, VA: Stylus Publishing, LLC, and the American Association for Higher Education.
National Research Council. (2001). Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press.

