6 Why did Lucy get a C?
Write down at least one question that comes to mind when you read the cartoon.
What is Lucy questioning?
How could the teacher avoid questions regarding his/her basis for grading?
7 What are some questions?
How do we clearly communicate performance expectations to students?
What is "fair" or equitable? Do we give some students a "break" because their parents get faulty coat hangers?
Do we reward effort? If so, how do we define it?
Do we advantage or disadvantage certain students by how we measure performance/achievement?
If students don't perform well on assessments, does that mean we did not teach well? How do we know whether or not we did? Do faculty evaluations provide a better picture of how well we teach?
Is it important to have students create coat hanger sculptures? Or to perform well on any of our assessments? Important to whom? Why?
What kinds of student outcomes do we value personally as teachers and collectively as a faculty?
8 Focusing on Student Learning
"Institutional assessment should not be concerned about valuing what can be measured, but, instead, about measuring that which is valued."
-- Banta, T.W., et al.
9 Goals for today… Participants will be able to:
Explain how student assessment can enhance student engagement/learning and promote program development;
Begin or continue the process to:
a) Articulate goals for student learning;
b) Translate goals into instructional objectives stated in terms of student learning outcomes;
c) Identify or develop assessment tasks;
d) Select or develop evaluation tools.
Describe the changing context of assessment, accreditation, and accountability in higher education.
10 Starting with what we do well… Being accountable:Institutional Effectiveness
11 “Community colleges are prominent among the leaders in higher education in establishing indicators of institutional effectiveness, gathering benchmark data, and using findings to improve the satisfaction of students and other community constituents.” Trudy Banta, Editor’s Notes, 1995
12 Assessment, planning, and budget are integrated. Objectives established by departments during program review become the basis for budget allocations.
13 North Carolina Community College Performance Measures
Progress of basic skills students
Passing rates on licensure/certification exams
Goal completion for completers
Employment rate of graduates
Performance of college transfer students
Passing rates in developmental courses
Success rate of developmental students in subsequent college-level courses
Student satisfaction
Retention, graduation rates
Employer satisfaction
Business/industry satisfaction with services provided
Program enrollment
14 Forsyth Tech’s Record…
Basic skills students meet state benchmarks (82% compared to system average of 79%)
Aggregate passing rate on licensure/certification exams equals the NC pass rate of 86%
Employment rate is reported at 99.05%
90% of students pass developmental courses (state: 80%)
Satisfaction of completers = 93% (state: 97%)
Business/industry satisfaction with services = 100% (state: 100%)
15 Back to Trudy Banta… Peterson (1999): 2,524 non-proprietary postsecondary institutions (AS/BS); 1,393 (55%) responded.
“Compared to all institutions, associate of arts institutions are less likely to collect cognitive and affective data, less likely to use student-centered methods in collecting data, and less likely to conduct studies of student performance…”
16 How would this change what we do?
Outcomes-based (MBO) assessment: “Outcomes (objectives on the tactical plans), developed by all instructional departments and administrative and educational support service departments, are statements describing what each department’s staff/faculty members desire to be the results of their efforts.” -- Annual FTCC Plan
Assessing Student Learning Outcomes: Directly examining the knowledge, skills, and abilities that a student has attained at key points in his or her progress through a set of higher education experiences and in the first years of practice.
17 Creating Assessment Systems -- Shared Commitments
Faculty share a commitment to:
A set of student learning outcomes;
Common assessments within programs and across course sections;
Collecting, compiling, analyzing, reporting, and using the results of assessments of student learning:
To improve candidate performance
To improve programs
To improve policies and procedures.
18 What am I going to get out of this?
Shared expectations of student performance
Clear communication of expectations
Enhanced student performance
Ability of students, faculty, and programs/departments to self-advocate
A system of program evaluation which includes evaluation of student learning outcomes.
19 Why? What are some of the potential benefits of establishing student assessment systems?
20 Why assess student learning?
Enhances Student Engagement
Continuous improvement of Curriculum, Instruction, and Student Performance
Promotes Professional Community (inquiry, reflection, scholarship of practice)
Enables students, faculty, programs, and institutions to Self-Advocate -- able to participate in data-based decision-making
Better reflects the complexity, extent, and impact of Faculty Work
Helps us achieve our Institutional Mission
Develops Public Trust
21 Student Engagement
Embedded assessment: the student has to be actively engaged (cannot be a passive learner)
Clear expectations, models of performance
Self-evaluation
Assessing knowledge, skills, and dispositions required for practice (meaningful)
Can use products of assessments in job search, etc.
22 Improve Curriculum, Instruction, Student Performance
JMU: a “major dividend of ongoing assessment has been greater faculty involvement. This process ensures that curriculum decisions remain in the hands of those who deliver the curriculum.”
curriculum = assessment = curriculum… “real-time,” ongoing examination and improvement of teaching and curriculum
Provides a focus for instruction.
Clarifies expectations of students.
Ensures we provide appropriate opportunities for reaching expectations.
Clearly communicates target, average, below-average, and unsatisfactory performance.
Avoids overlap or oversight; ensures comprehensiveness of curriculum.
Grounds curriculum in practice.
23 Inquiry, Growth, and Professional Community
Dialog: values, commitments, what matters most, common expectations, level of performance, opportunities for learning…
Become a learning organization
Cross-disciplinary connections
Leadership opportunities among faculty
Establish connections with practitioners
Highlight successes of students, programs, faculty
24 Self-Advocacy
Students have concrete evidence of what they know and can do
Faculty are able to document their impact on students
Contributes to the scholarship of practice
Programs, divisions, and institutions have an array of information
25 Provides a Fuller Picture of Faculty Work
Outcomes such as graduation/retention rates don’t always capture the impact of faculty efforts
Helps us “tell our story”:
efforts in providing learning opportunities
the complex and multidimensional nature of learning
Helps make students more accountable for their part in the learning process
26 Enhance Public Trust
General public, potential students and families, politicians…
Federal, state, and public push for accountability
The role of anecdotes
Quality assurance: accreditation
27 Accomplish Institutional Mission
Graduates of Forsyth Tech are technically skilled, regionally and globally oriented, and prepared for lifelong learning and full civic engagement and employment.
28 Assessment and Accountability
Professional standards boards have moved from “input” measures, to management by objectives, to documentation of student outcomes; and have moved from examining institutions to examining programs.
29 Assessment and Accountability
Council for Higher Education Accreditation (CHEA)
American Association of Community Colleges
Commission on Accreditation of Allied Health Education Programs
Joint Review Committee on Educational Programs in Diagnostic Medical Sonography
Joint Review Committee on Educational Programs in Nuclear Medicine Technology
Joint Review Committee on Education in Radiologic Technology
North Carolina Board of Nursing
Technology Accreditation Commission of the Accreditation Board for Engineering and Technology
30 Council for Higher Education Accreditation
“Accrediting organizations are responsible for establishing clear expectations that institutions and programs will routinely define, collect, interpret, and use evidence of student learning outcomes.”
31 Council for Higher Education Accreditation: “More specifically: regularly gather and report concrete evidence about what students know and can do as a result of their respective courses of study, framed in terms of established learning outcomes and supplied at an appropriate level of aggregation. Supplement this with information about other dimensions of effective institutional or program performance… Prominently feature relevant evidence of student learning outcomes.”
-- Statement of Mutual Responsibilities for Student Learning Outcomes: Accreditation, Institutions, and Programs (September 2003)
Give examples of members of CHEA.
32 SACS
3.4.1 The institution demonstrates that each educational program for which academic credit is awarded (a) is approved by the faculty and the administration, and (b) establishes and evaluates program and learning outcomes.
3.5.1 The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies.
33 Commission on Accreditation of Allied Health Education Programs “Evaluations of students must be conducted on a recurrent basis and with sufficient frequency to provide both the students and program faculty with valid and timely indications of the students’ progress toward and achievement of the competencies and learning domains stated in the curriculum.”
34 ABET Technology Accreditation Commission
“Each engineering technology program must have in place published educational objectives consistent with mission and with ABET criteria…”
“…Must utilize multiple assessment measures in a process that provides documented results to demonstrate that the program objectives and outcomes are being met…”
[Examples include]: “student portfolios; student performance in project work and activity-based learning; national exams; employer and graduate surveys…”
35 Joint Review Committee on Education in Radiologic Technology
“A program’s goals are a more specific expression of the program’s intended student learning outcomes. The goals should be written using behavioral terms and should address the cognitive, affective, and psychomotor domains. They must be measurable, preferably through the use of more than one measurement tool.”
-- JRCERT Guide for Program Analysis (05/05)
36 Summarizing…
Identify, communicate, and assess clear, measurable student learning outcomes:
Behavioral statements; cognitive, affective, and psychomotor learning domains
Based on institutional, departmental/program, state, and national standards
Establish a system for directly assessing student achievement of objectives:
Multiple assessments across time, including graduate and employer surveys
Collect, compile, report, and use results to improve student performance, programs, and the organization
37 Identifying Student Outcomes
Translating program goals and instructors’ intentions into instructional objectives stated in terms of student learning outcomes.
38 Communicating goals and objectives
Beginning with some examples and the importance of verbs…
39 Clear, Observable Behavior (can’t measure what you can’t see)
40 Taxonomy of Learning Domains
“Behavioral… cognitive, affective, psychomotor” -- using a framework to guide us: Benjamin Bloom’s Taxonomy of Learning Domains
Cognitive: mental skills (Knowledge)
Affective: growth in feelings or emotional areas (Attitudes)
Psychomotor: manual or physical skills (Skills)
41 Writing Instructional Objectives
There are a number of approaches to writing instructional objectives:
Mager -- behavioral objectives
Eisner -- expressive objectives
Gronlund -- general/specific objectives
42 Writing Instructional Objectives
Mager proposes writing specific statements about observable outcomes that can be built up to become a curriculum (an inductive approach).
An example of a behavioral objective: Given 3 minutes of class time, the student will solve 9 out of 10 multiplication problems of the type: 5 X 4 = _____.
43 Writing Behavioral Objectives
The three parts of a behavioral objective:
Condition: In an oral presentation,
Behavior: the student will paraphrase Dr. Martin Luther King’s I Have a Dream address,
Criterion: mentioning at least 3 of the 5 major points discussed in class.
44 Writing Instructional Objectives
Eisner proposes that not all instructional objectives should focus on outcomes; some should focus on the learning process itself (expressive objectives).
Examples of expressive objectives:
a) Students will attend a live symphony performance.
b) Students will use multiplication in everyday activities.
45 Writing Instructional Objectives Gronlund proposes starting with a general statement and providing specific examples of topics to be covered or behaviors to be observed (a deductive approach).
46 Stating Instructional Objectives: Curricular Questions
Create a basic document in a spreadsheet.
Enter text and values into an application.
Write formulas to calculate simple and multi-segment problems.
Format cells to display data appropriately.
Embed charts into the spreadsheet.
Display the sheet in worksheet and formula views.
Print the document in both views and in landscape or portrait format.
47 Writing Instructional Objectives
Examples of general/specific objectives:
Students will detect the use of stereotypes.
  identify situations in which stereotypes might emerge
  recall or identify indicators or clues of stereotyping:
    use of overgeneralization or exaggeration
    linking features together that are not logically linked (blonds are dumb)
    use of vague words (shifty)
    use of extremes or absolutes (never, none, all)
    absence of individual attributes or variations
  locate other information and examples which counter stated characteristics
  determine if communication includes indicators of stereotyping
48 Writing Instructional Objectives
Examples of general/specific objectives (affective):
Displays a scientific attitude:
  demonstrates curiosity in identifying problems
  seeks natural causes of events
  demonstrates open-mindedness when seeking answers
  suspends judgment until all possible evidence is obtained
  respects evidence from credible sources
  shows objectivity in analyzing evidence and drawing conclusions
  seeks ways to verify results
  shows willingness to revise conclusions as new evidence becomes available
49 Writing Instructional Objectives
Examples of general objectives:
Write an essay.
Apply systematic strategies to monitor and improve personal health.
Set up and operate graphics design equipment.
Apply principles of radiation safety and protection.
Process appointments in a timely and accurate manner.
Develop a basic database using a database application.
Other examples?
50 Task #3: Trying our hand…. Translating goals and intentions into instructional objectives stated in terms of student learning outcomes
51 Stating Instructional Objectives
Principles of electricity.
Comprehends principles of electricity.
Topic vs. student learning outcome: the first is merely a topic; the second states what the student will be able to do.
52 Stating Instructional Objectives
Comprehends assigned reading material.
To increase students’ reading ability.
Describe the student’s learning behavior rather than the teacher’s teaching behavior.
53 Stating Instructional Objectives
Gains knowledge of basic principles of radiation safety and protection.
Applies basic principles of radiation safety and protection in new situations.
#1 describes the learning process rather than the learning outcome (knows, develops skills in, acquires, understands, learns). Another pitfall is describing the learning activity: create a diorama, read seven journal articles, etc.
54 Stating Instructional Objectives
Explains the scientific method and applies it effectively.
“Explains and applies” -- avoid more than one verb. Students may be able to do one but not the other; in that case, is the objective met?
55 Guided Practice: Breaking verbs down into specific learning objectives
Students will evaluate [an Articles of Incorporation, a patient care plan, a nuclear medicine image, an essay].
Determine the purpose for analyzing;
Identify the criteria to use;
Identify idealized standards for each criterion;
Examine the information and identify evidence related to the criteria;
Judge the degree of match of the evidence with the idealized standards;
State the results of the analysis by summarizing patterns and giving examples of meeting/not meeting standards within the criteria.
56 Stating Specific Instructional Objectives: Guided Practice
Generic objectives can often guide the development of content-specific objectives.
Students will demonstrate their knowledge of legal principles regarding the formation and maintenance of corporations or partnerships.
Comprehends basic principles:
  States the principle in own words
  Identifies examples of the principle
  Distinguishes between correct and incorrect applications of the principle
  Predicts an outcome based on the principle
-- Gronlund, 2004, p. 19
57 Stating Instructional Objectives: Curricular Questions
The nuclear medicine technologist provides patient care:
Acquires adequate knowledge of the patient’s medical history
Provides for proper comfort and care before, during, and after procedures
Recognizes surgical and disease factors that may create artifacts or variants on images
Identifies when data acquisition or data processing protocol must be modified
Provides safe and sanitary conditions
Establishes and maintains good communication with each patient
If these were overall program goals, how could we describe expectations for students when they are beginning their program and starting to learn about patient care?
Knows basic terms; comprehends concepts and principles; applies principles; interprets, evaluates.
58 Task #3: Stating General and Specific Instructional Objectives In groups of 2-3, select or state a general instructional objective.Identify specific learning objectives for the general learning objective.
59 Evaluation Tools
Checklist (check, +/−):
  __ 1. Sands and prepares surface properly.
Numerical rating scale:
  a) Uses tools correctly for each task.
Numerical rating scale with descriptors:
  Selects appropriate equipment.
  __ Needs to be told what to use   __ Needs some help in selecting equipment   __ Selects proper equipment independently
Frequency rating scale (3 = always, 2 = sometimes, 1 = never):
  a) Pays attention when problems are explained.
60 Rubrics: Indicators of Quality
Criteria (Thesis, Structure, Use of Evidence, Analysis, Logic and Argumentation), each described at Performance Levels 1-4:
Thesis
  1: Unclear or unidentifiable topic.
  2: Possibly vague and/or overly simplistic topic; needs considerable revision.
  3: Clearly stated, potentially strong topic with some revisions.
  4: Clearly stated, strong topic; original; insightful.
Structure
  1: Unclear, often because thesis is weak or non-existent; transitions confusing or unclear; few topic sentences.
  2: Generally unclear; wanders or jumps from point to point; generally lacks transitions; some paragraphs without clear topics.
  3: Generally clear and logical; may lack a few clear transitions, or some paragraphs may not be clear and solid.
  4: Ideas flow logically; excellent transitions from point to point; paragraphs support solid topic sentences.
Use of Evidence
  1: Very few or very weak examples or explanations; general failure to support statements, or evidence not relevant; quotes poorly integrated.
  2: Examples and explanations support some points; points often lack evidence; quotes may be poorly integrated.
  3: Examples and explanations support most points; some evidence does not support a point or may appear not relevant.
  4: Relevant materials support several points; reasons supported with good explanations and examples; excellent integration of quoted material.
Analysis
  1: Lack of evidence, or very few or very weak examples; weak attempts to relate to argument.
  2: Evidence is often merely stated and not explained or connected to the argument.
  3: Evidence often related to argument or reasons, though not always clearly or completely explained.
  4: Clearly relates evidence to reasons; poses new ways to think of thesis.
Logic and Argumentation
  1: Simplistic view of topic; no effort to grasp alternative views; might contain logical fallacies.
  2: Logic might fail at times; argument might be unclear; might not address opposing views, or states them but does not address them.
  3: Acknowledges opposing views and addresses several aspects of them, but not always in a complete manner.
  4: Anticipates and defuses counter-arguments; all ideas flow logically; makes novel connections which illuminate thesis.
61 Student Assessment Plans
Identify program-level student learning outcomes. Identify decision points as students progress and the assessments used to make decisions.
General Program-Level Student Learning Outcomes, by Phase I / Phase II / Follow-Up:
Content knowledge -- Phase I: assessment such as comprehensive exams in courses; Phase II: assessment such as licensure exam; Follow-Up: employer survey, alumni survey.
Professional skills -- Phase I: assessment such as class demonstrations, simulations, or projects; Phase II: assessment such as more involved clinical experiences.
62 Student Assessment Plans
Individual student inventory:
Identify program-level student learning outcomes.
Instructors document which outcomes students demonstrate within courses/clinical experiences.
63 Summary…
Objectives = student learning outcomes
Objectives: focus instruction, guide learning, provide criteria/standards for assessment, convey instructional intent to others, and enable us to evaluate instruction.
Frameworks (Bloom’s Taxonomy): comprehensiveness
Verbs are the operative words!
Pitfalls: topic, teacher behavior, learning process/activity
Evaluation tools convey expectations for student performance (various levels of specificity)
Programs develop plans for documenting student performance
64 NEXT STEPS: Task #4: Teaching Goals Inventory
Helps college teachers become more aware of what they want to accomplish in individual courses and across programs.
65 NEXT STEPS:
Review, update, or develop course-based assessment:
Identify general instructional objectives
Identify specific instructional objectives
Identify assessment tasks
Develop evaluation tools
Review, update, or develop program-level student assessment plan:
Identify program- or departmental student learning outcomes
Identify a systematic process for assessing outcomes and for collecting, compiling, analyzing, reporting, and using the assessment data.