
1 Developing and Linking Objectives at the Program and Course Levels Presentation to Fulton-Montgomery Community College October 5, 2006

2 Presenter Patricia Francis, Assistant Provost for University Assessment and Academic Initiatives SUNY System Administration

3 Session Objectives To briefly review multiple benefits of learning outcomes assessment: accountability, improvement of teaching and learning, community-building, and faculty development To describe strategies for developing program and course objectives, with an emphasis on linking these objectives across levels To discuss the development of assignments and criteria for the purpose of assessing students’ mastery of objectives

4 Some Introductory Information Establishing a Rationale for Assessment and Some Assessment “Basics”

5 The Multiple Benefits of Learning Outcomes Assessment Extrinsic benefits: assessment as accountability – To Middle States – To external program certifiers and accreditors – To boards, legislators, and the public at large The more important, intrinsic benefits – Assessment as improvement – Assessment as community-building – Assessment as faculty development

6 Some Important Assessment “Basics” Establishing congruence among institutional, programmatic, and course objectives, learning opportunities, and assessments Distinguishing among goals, objectives, and outcomes What’s wrong with grades? Using a variety of measures, both quantitative and qualitative, in search of convergence Having a priori standards for all measures Utilizing course-embedded techniques (i.e., as opposed to a “stand-alone” approach)

7 Advantages of Course-Embedded Assessment Least time- and labor-intensive Direct, necessary involvement of faculty Student motivation assured Face validity of measures assured (i.e., “authentic” assessment) And, most important, its implications for immediate and direct feedback to individual faculty (and, therefore, for “closing the loop”)

8 Assessment’s “Four Steps” 1. Setting objectives: “What you say you do” 2. Curriculum mapping: “How you do what you say you do” 3. Assessment: “How you know you are doing what you say you do” 4. “Closing the loop”: “What you do next based on results”

9 Program-Level Assessment From Mission to Objectives

10 Establishing a Mission Statement: Some General Suggestions Be comprehensive, including all major functions and services as well as all constituents served Be sure statement reflects institutional mission as well as other major all-college planning initiatives The loftier the better!

11 Sample Mission Statement: “The mission of the Bachelor of Music in Composition is to train composers who exhibit a promising level of creativity — but few of whom have significant formal musical training — to develop their personal artistic qualities and to acquire a comprehensive skill set so that they are prepared, as graduating seniors, to pursue advanced studies in composition or, in exceptional circumstances, to develop professional careers.”

12 Developing Program Objectives: Some General Suggestions Involve all faculty teaching in program Program objectives should reflect program Mission Statement (and vice-versa) Best objectives result from faculty-negotiated agreement about what students in the program should “be like” upon completing program Focus on five or so core objectives to begin with – that’s plenty!

13 Developing Program Objectives: Basic Questions in Getting Started What do you expect of students in terms of knowledge, skills, behavior, and attitudes? What achievements do you expect of graduates in your field? What profiles of your alumni do you have, or could you develop, with respect to the issues you believe are important?

14 Developing Program Objectives: Some Typical Areas of Interest Knowledge of content Communication ability (written and oral) Information literacy ability (library use and computer proficiency) Quantitative reasoning Critical thinking Analytic and interpretative ability

15 Specific Guidelines for Setting Program Objectives Three Basic Rules

16 Rule #1: Identify Overarching Concepts, Not “Course-Level” Objectives Good Example: “Students will demonstrate the ability to formulate hypotheses, analyze data, and draw conclusions.” Poor Example: “Students will demonstrate the ability to perform an ANOVA.”

17 Rule #2: State Objectives Using Concrete Language and Action Verbs Good Example: “Students will acquire and demonstrate knowledge and skills necessary to solve complex business problems in one or more areas of emphasis.” Poor Example: “Our objective is to enhance students’ intellectual growth.”

18 Rule #3: Focus on Results, Not Process Good Example: “Students will demonstrate clear and effective oral communication skills.” Poor Example: “Students will successfully complete four Oral Intensive courses.”

19 Linking Program Objectives to Assessment With objectives in mind, establish clear and measurable a priori success indicators, setting expectations at reasonable but challenging levels Good Example: “At least 65% of Business Majors taking the MFT will score at or above the 50th percentile.” Poor Example: “Eighty percent of students will complete the Business Major’s MFT.”

20 Other Examples of Assessment Criteria at the Program Level Nursing – “At least 90% of graduates will score at or above the national average on the NCLEX-RN examination.” Business Administration – “At the end of the program, 85% of students will successfully defend orally their Capstone Report based upon a comprehensive case problem assigned by program faculty.” Fine Arts – “Seventy-five percent of senior portfolios, reviewed by art department faculty, will reveal the ability to create and critically analyze one’s own work as evidenced by overall evaluation scores at or above a 4 on the 5-point rubric.”
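Criteria like these can be checked mechanically once scores are compiled. The following is a minimal Python sketch, not part of the presentation; the score list, threshold, and proportion are hypothetical and only illustrate how an a priori criterion such as the MFT example above might be evaluated.

def criterion_met(scores, threshold, required_proportion):
    # True if at least `required_proportion` of scores are at or above `threshold`
    meeting = sum(1 for s in scores if s >= threshold)
    return meeting / len(scores) >= required_proportion

# Hypothetical MFT percentile scores for ten Business majors (illustrative only)
mft_percentiles = [38, 52, 71, 64, 90, 47, 55, 80, 62, 49]

# "At least 65% of majors will score at or above the 50th percentile"
print(criterion_met(mft_percentiles, threshold=50, required_proportion=0.65))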

21 Course-Level Assessment Linking Program Objectives, Course Objectives, Assignments, and Assessment Activities

22 Course Objectives Should: Reflect program goals and objectives Be developed within the context of the program and, ideally, involve all faculty who teach the course Be more concrete and specific than program objectives, which are more general and overarching Be measurable Be included on course syllabi

23 Questions To Ask 1. Which program objectives are most appropriately covered in your course? (no course can – or needs to – do everything!) 2. How can you effectively “translate” the program objectives into course-level objectives? 3. What specific activities do you provide to your students that enable them to achieve these objectives? 4. What specific assignments enable you to determine their level of achievement? 5. Based on the outcomes, what can you conclude about that level of achievement, and do you need to do anything differently the next time?

24 “Categories” of Course Objectives Cognitive – what do you want students to know? Behavioral – what competencies do you want them to demonstrate? Attitudinal – are there particular values you want them to adopt?

25 Sample Program Objectives – Sociology Department Student understands and can explain major theories of social behavior. Student understands the nature and purposes of social research and understands different methodological techniques. Student can apply theories and research methods to an applied situation. Student can describe these issues effectively in oral and written form.

26 Linking Program and Course Objectives to Assignments and Assessment (Handout #1) Important questions when developing assignments: 1. Are all objectives being assessed (not necessarily in one assignment)? 2. Do the assignments yield useful results for each objective? 3. Are standards appropriate (i.e., too high, too low)? 4. Will results yield useful information to feed back into course and teaching and to program evaluation?

27 Course-Level “Translation” of Program Objectives (and Corresponding Assessment Measures) Value of fairly broad student learning goals that can then be operationalized more specifically at the course level (especially when different courses are involved) Importance of making sure that “compound” outcomes can be broken down into discrete parts, each of which yields a distinct sub-score (holistic scoring is not useful) Advantages of using uniform assignments and assessment measures and criteria, at least for different sections of the same course If not, external validation of assignments and measures (e.g., by other faculty or a committee) is important
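To illustrate the sub-score point, here is a minimal sketch (the outcome parts and scores are hypothetical, not taken from the presentation) of recording a “compound” outcome so that each discrete part yields its own result rather than a single holistic score.

# Illustrative only: one sub-score per discrete part of a compound outcome
rubric_scores = {
    "formulate hypotheses": 4,
    "analyze data": 3,
    "draw conclusions": 5,
}

# Each part is reported separately, so strengths and weaknesses stay visible
for part, score in rubric_scores.items():
    print(f"{part}: {score}/5")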

28 Other Examples from Assessment Plans – Linking Objectives, Measures, and Criteria (Handout #2) Shows the variety of approaches SUNY campuses are using to assess GE learning outcomes (including a standardized test used by one campus) Includes actual criteria identified by faculty or departments to be used in determining course/program effectiveness

29 Then What? Several ways to “close the loop” – At the program level, this entails compiling results from across courses and course sections (i.e., “data extraction”) so that, ultimately, overall conclusions regarding program effectiveness – and the possible need for change – can be reached – At the individual course level, assessment results are another form of feedback individual faculty members can use to inform their teaching – Can also refer to “assessing the assessment” – was the measure sufficient? Were standards too high or low? What’s important is that you use assessment data!
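To make the program-level “data extraction” step concrete, here is a minimal sketch; the section names, scores, and the 75% criterion are hypothetical and not drawn from the presentation. It pools section-level rubric results and checks one overall program criterion.

# Illustrative only: pool hypothetical rubric scores across course sections and
# check a program-level criterion (75% of students at 4 or above on a 5-point rubric)
section_scores = {
    "SOC301-01": [4, 5, 3, 4],
    "SOC301-02": [5, 4, 4, 2],
    "SOC450-01": [4, 4, 5, 5],
}

all_scores = [s for scores in section_scores.values() for s in scores]
proportion = sum(1 for s in all_scores if s >= 4) / len(all_scores)
status = "met" if proportion >= 0.75 else "not met"
print(f"{proportion:.0%} of students at or above 4; criterion {status}")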

30 Developing and Linking Objectives at the Program and Course Levels Presentation to Fulton-Montgomery Community College October 5, 2006

