International Faculty Workshop for Continuous Program Improvement
Moscow, September 28-29, 2006 (presentation transcript)
Slide 2: International Faculty Workshop for Continuous Program Improvement

Slide 3: Introductions

Slide 4: Continuous Program Improvement
- Moderator: Gloria Rogers, Associate Executive Director, Professional Services, ABET, Inc.
- Facilitator: David Hornbeck, Adjunct Accreditation Director for Technology, ABET, Inc.

Slide 5: ABET Faculty Workshop to Promote Continuous Quality Improvement in Engineering Education

Slide 6: Workshop Expectations

Slide 7: Workshop Will Develop:
1. An understanding of program development and management based on learning outcomes.
2. An awareness of definitions and linkages among:
   - Program Educational Objectives
   - Program Outcomes
   - Assessment
   - Evaluation
   - Constituencies

Slide 8: Workshop Will Develop:
3. An awareness of assessment tools and their:
   - Variety
   - Assets
   - Utility
   - Relevance
   - Limitations
4. An understanding of the structure and cyclic nature of Continuous Quality Improvement:
   - Planning
   - Implementation
   - Assessment
   - Evaluation
   - Feedback
   - Change

Slide 9: Workshop Format
- We utilize both small-group and plenary sessions
- We introduce concepts via critique of case-study examples
- We apply concepts through group preparation of example scenarios
- We share results and develop understanding through interactive plenary sessions

Slide 10: Workshop Day 1
- Identify attributes of effective educational objectives
- Identify attributes of effective program outcomes
- Investigate key components of effective assessment plans and processes
- Prepare written program outcomes

Slide 11: Workshop Day 2
- Investigate the attributes of a variety of assessment tools
- Develop assessment and evaluation plans for the program educational objectives
- Develop assessment and evaluation plans for the set of program outcomes
- Summarize points of learning
- Discuss lessons learned by ABET in its experience with outcomes-based criteria

Slide 12: Workshop Procedures
A. Record all your work produced in small-group sessions
B. Identify recorded work by table and breakout-room number
C. Reporting in plenary sessions: each group selects a leader, a recorder, and a reporter for each exercise
D. A workbook of all material and exercises will be provided to each participant

Slide 13: Introduction to ABET Continuous Program Improvement

Slide 14: Goal of ABET: To promote Continuous Quality Improvement in Applied Sciences, Computing, Engineering, and Technology education through faculty guidance and initiative.

Slide 15: Accreditation Reform: The Paradigm Shift

Slide 16: Philosophy
- Institutions and programs define missions and objectives
  - Focus on the needs of their constituents
  - Enable program differentiation
  - Encourage creativity in curricula
- Emphasis on outcomes
  - Skills and knowledge required for professional practice
  - Technical and non-technical elements
- Programs demonstrate that they are
  - Meeting their objectives
  - Satisfying accreditation criteria

Slide 17: Emphases
- Practice of Continuous Improvement
  - Input of constituencies
  - Process reliability and sustainability
  - Outcomes, Objectives, and Assessment
  - Technical and professional knowledge required by the profession
- Resources linked to Program Objectives
  - Students
  - Faculty and support personnel
  - Facilities
  - Institutional support and funding

Slide 18: Primary Expectations of Programs
- Adequate preparation of graduates for engineering careers
- Effective Continuous Quality Improvement processes

Slide 19: The Focus
- Meaningful Educational Objectives
- Effective Program Outcomes
- Practical Assessment Tools
- Effective and Sustainable Assessment Plan
- Robust and Credible Evaluation Plan

Slide 20: ABET Definitions
- Program Educational Objectives: broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve within the first few years after graduation.
- Program Outcomes: narrower statements that describe what students are expected to know and be able to do by the time of graduation. These are the skills, knowledge, and behaviors that enable graduates to achieve the Program Educational Objectives, and they are acquired by students as they progress through the program.

Slide 21: ABET Definitions
- Assessment: processes to identify, collect, and prepare data that are needed to evaluate the achievement of Program Outcomes and Program Educational Objectives.
- Evaluation: processes that interpret data accumulated through assessment. Evaluation determines the extent to which Program Outcomes or Program Educational Objectives are being achieved, and it results in decisions and actions that improve a program.

Slide 22: Continuous Quality Improvement is a systematic pursuit of excellence and satisfaction of the needs of constituencies in a dynamic and competitive environment.

Slide 23: Continuous Quality Improvement
- Must be systematic and systemic
- Is the dynamic behavior of an organization
- Must be shared at all organizational levels
- May be motivated by external factors
- Must be sustained by internal behavior
- Requires that the continuous pursuit of excellence determine the philosophies, plans, policies, and processes of the organization
- Requires continuous interaction between internal and external constituencies
- Focuses on the needs of constituencies

Slide 24: CQI Starts with Basic Questions
- Who are our constituencies?
- What services do we provide?
- Do constituencies understand our objectives?
- What services, facilities, and policies are necessary to ensure that we continue to satisfy our constituencies?
- Do our suppliers and institutional leadership understand and support our needs?

Slide 25: More Basic Questions
- What steps do we perform to provide our services?
- Are our constituencies satisfied with our services?
- How do we measure our effectiveness?
- How do we use these measures to continuously improve our services?
- Are we achieving our objectives and improving?

Slide 26: Assessment: Foundation of CQI
- Assessment of inputs and processes establishes the capability or capacity of a program
- Assessment of outcomes measures how effectively that capability has been used
- Outcomes assessment improves:
  - Effectiveness
  - Learning
  - Accountability

Slide 27: CQI as an Operating Philosophy
- Quality improvement comes from within the institution
- Continuous improvement requires the planned integration of objectives, performance metrics, and assessment
- Continuous improvement is cyclical
- Assessment of performance is the baseline for future assessment
- Educational objectives, mission, and the needs of constituencies must be harmonized to achieve CQI

Slide 28: Role of ABET Accreditation: ABET accreditation provides periodic external assessment in support of the continuous quality improvement program of the institution.

Slide 29: Potential Constituencies
- Students, parents, employers, faculty, alumni
- Industry advisors, accrediting agencies
- Educational administration: department, school, college, etc.
- Government agencies: local, state, federal
- Transfer colleges that supply students
- Graduate programs that accept graduates
- Donors, contributors, supporters

Slide 30: Step 1: Who are your constituencies?
- Identify possible constituencies.
- What are the expectations of each constituency?
- How will constituencies be satisfied?
- When will constituencies be satisfied?
- What relative priority do constituencies hold?
- How will constituencies be involved in your CQI?

Slide 31: Pick Your Constituencies
- Select no more than three constituencies to focus on for the workshop exercises
- Assign a person to represent each of these constituencies at each table
- Consider what influence the choice of constituencies will have on Educational Objectives and Outcomes

Slide 32: Objectives: Exercise 1

Slide 33: Outcomes: Exercise 2

Slide 34: Report Out on Exercise 1 and Exercise 2

Slide 35: Objectives Summary
- Each addresses one or more needs of a constituency
- Must be understandable by the constituency being served
- Should be limited to a manageable number of statements
- Should be broader statements than the Program Outcomes
- Every Objective must be supported by at least one Program Outcome

Slide 36: Outcomes Summary
- Each describes an area of knowledge and/or skill that a person can demonstrate
- Should be stated so that a student can demonstrate it upon completion of the program and before graduation
- Must be a unit of knowledge or skill that supports at least one Educational Objective
- Collectively, Outcomes define the skills and knowledge imparted by the degree program
- Outcome statements normally do not include measures or performance expectations

Slide 37: Assessment Basics

Slide 38: Program Assessment of Student Learning © (Gloria Rogers, Ph.D., Associate Executive Director, Professional Services, ABET, Inc.)

Slide 39: Foundational Truths
- Programs are at different places in the maturity of their assessment processes
- Programs have different resources available to them (e.g., number of faculty, availability of assessment expertise, time)
- Each program has faculty who are at different places in their understanding of good assessment practice

Slide 40: Hierarchy of Assessment Learning
The slide maps Bloom's levels (Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation) to novice, intermediate, and advanced assessment practitioners, described as: "Everyone who makes a presentation is an expert and I am a sponge" (novice); "I apply what I have learned and begin to analyze the effectiveness of my assessment processes"; and "I can take what I have learned and put it in context. I begin to question what I hear, challenge assumptions and make independent decisions about effective practices for my program."

Slide 41: Inputs, Processes, Outputs, and Outcomes
- Input (what comes into the system?): faculty background; student background; educational resources
- Processes (what are we doing with the inputs?): programs and services offered, populations served; policies, procedures, governance; faculty teaching loads and class size
- Outputs (how many?): publication numbers and faculty development activities; credit hours delivered; statistics on resource availability and participation rates; student grades, graduation rates, employment statistics
- Outcomes (what is the effect?): student learning and growth; faculty publication citation data; faculty development; what have students learned, what skills have they gained, what attitudes have they developed?

Slide 42: Inputs and Processes
- Input: faculty background; student background; educational resources
- Processes: programs and services offered, populations served; policies, procedures, governance; faculty teaching loads and class size
- Assessment of inputs and processes only establishes the capability or capacity of a program (how many courses and what is "covered," the background of faculty, the nature of facilities, etc.)

Slide 43: Outputs
- Publication numbers and faculty development activities; credit hours delivered
- Statistics on resource availability and participation rates
- Student grades; graduation rates; employment statistics
- Assessment of outputs serves as an indirect measure, or proxy, for effectiveness; outputs provide general indicators of achievement.

Slide 44: Outcomes
- Student learning and growth
- Faculty publication citation data; faculty development
- What have students learned? What skills have they gained? What attitudes have they developed?
- Assessment of outcomes provides direct measures of how effectively that capability or capacity has been used for individual learning and growth.

Slide 45: Taxonomy of Approaches to Assessment (Terenzini, Journal of Higher Education, Nov/Dec 1989)
Three dimensions: the purpose of assessment (why? learning/teaching, formative, versus accountability, summative), the level of assessment (who? individual versus group), and the object of assessment (what? knowledge, skills, attitudes and values, behavior).
- Individual level, learning/teaching purpose: competency-based instruction; assessment-based curriculum; individual performance tests; placement (advanced placement tests, vocational preference tests, other diagnostic tests)
- Individual level, accountability purpose ("gatekeeping"): admissions tests, rising junior exams, comprehensive exams, certification exams
- Group level, accountability purpose (campus and program evaluation): program reviews, retention studies, alumni studies, "value-added" studies
- Group level, learning/teaching purpose: program enhancement
- Individual assessment results may be aggregated to serve program evaluation needs

Slide 46: ABET Terms (term, definition, and some other terms used for the same concept)
- Objectives: broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve. (Other terms: goals, outcomes, purpose, etc.)
- Outcomes: statements that describe what students are expected to know and be able to do by the time of graduation. (Other terms: objectives, standards, etc.)
- Performance Criteria: specific, measurable statements identifying the performance(s) required to meet the outcome; confirmable through evidence. (Other terms: performance indicators, standards, rubrics, specifications, metrics, outcomes, etc.)
- Assessment: processes that identify, collect, use, and prepare data that can be used to evaluate achievement. (Other term: evaluation)
- Evaluation: the process of reviewing the results of data collection and analysis and determining the value of the findings and the action to be taken. (Other term: assessment)

Slide 47: Assessment for Quality Assurance © (diagram, Gloria Rogers, ABET, Inc.)
The cycle links the Mission and Constituents to Educational Objectives and Learning Outcomes; measurable performance criteria and educational practices/strategies lead to Assessment (collection and analysis of evidence) and Evaluation (interpretation of evidence), which provide feedback for continuous improvement.

Slide 48: Classroom Assessment © (example subject: Strength of Materials; G. Rogers, ABET)
- Focus: evaluate individual student performance (grades); evaluate teaching and learning
- Context: subject matter, faculty member, pedagogy, student, facility
- Concepts: terminology, material properties, beams, torsion, columns, fatigue
- Topics: stress, strain, tensile strength, ductility, shear force, bending moment, angle of twist, power transmission, Euler buckling, crack growth, S-N curves
- Timeline: one semester or quarter

Slide 49: From Objective to Performance Criteria (G. Rogers, ABET, Inc.)
- Objective: work effectively with others
- Outcome: an ability to function on a multi-disciplinary team
- Performance criteria: researches and gathers information; fulfills the duties of team roles; shares work equally; listens to other teammates; makes contributions; takes responsibility; values other viewpoints

Slide 50: Program Assessment (adapted from Terenzini et al., 1994, 1995)
Student pre-college traits and the institutional context shape educational outcomes over a multi-year timeline, with reciprocal causation and environmental factors, through:
- Classroom experience: pedagogy; facilities; climate; faculty and student characteristics
- Out-of-class experiences: co-curricular activities; co-ops; internships; support services
- Coursework and curricular patterns: classes chosen; major

Slide 51: Differences Between Classroom and Program Assessment
- Degree of complexity
- Time span
- Accountability for the assessment process
- Cost
- Level of faculty buy-in
- Level of precision of the measure

Slide 52: "Work Effectively in Teams" rubric (blank scoring form)
Scale: Unsatisfactory (1), Developing (2), Satisfactory (3), Exemplary (4), with a Score column and an Average row.
Criteria: Contribute (research and gather information); Take responsibility (fulfill team role's duties); Share equally; Value others' viewpoints (listen to other teammates).

Slide 53: "Work Effectively in Teams" rubric (scale: Unsatisfactory 1, Developing 2, Satisfactory 3, Exemplary 4)
- Contribute: research and gather information
  1: Does not collect any information that relates to the topic.
  2: Collects very little information; some relates to the topic.
  3: Collects some basic information; most relates to the topic.
  4: Collects a great deal of information; all relates to the topic.
- Take responsibility: fulfill team role's duties
  1: Does not perform any duties of the assigned team role.
  2: Performs very few duties.
  3: Performs nearly all duties.
  4: Performs all duties of the assigned team role.
- Share equally
  1: Always relies on others to do the work.
  2: Rarely does the assigned work; often needs reminding.
  3: Usually does the assigned work; rarely needs reminding.
  4: Always does the assigned work without having to be reminded.
- Value others' viewpoints: listen to other teammates
  1: Is always talking; never allows anyone else to speak.
  2: Usually does most of the talking; rarely allows others to speak.
  3: Listens, but sometimes talks too much.
  4: Listens and speaks a fair amount.
An Average row summarizes the criterion scores.

Slide 54: Developing Performance Criteria
Two essential parts:
- Content reference: the subject content that is the focus of instruction (e.g., steps of the design process, a chemical reaction, the scientific method)
- Action verb: directs students to a specific performance (e.g., "list," "analyze," "apply")

Slide 55: Levels of learning (Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation) mapped to novice, intermediate, and expert performance, and to where the curriculum should introduce, reinforce, and have students demonstrate or create.

Slide 56: Clarity of Performance Criteria
- Use action verbs consistent with the appropriate level of learning
- Consult a reference table of verbs by level (an illustrative sketch follows)
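
The verb reference table itself is not reproduced in this transcript. As a rough illustration only (not ABET's official list), commonly cited action verbs for each Bloom level can be kept in a small lookup so that writers of performance criteria can check that a chosen verb matches the intended level of learning; the verb lists below are assumptions, not source material.

```python
# Illustrative only: commonly cited action verbs for each level of Bloom's
# taxonomy; not ABET's official reference table.
BLOOM_VERBS = {
    "Knowledge":     ["list", "define", "identify", "state"],
    "Comprehension": ["explain", "describe", "summarize", "classify"],
    "Application":   ["apply", "demonstrate", "compute", "use"],
    "Analysis":      ["analyze", "compare", "differentiate", "test"],
    "Synthesis":     ["design", "create", "formulate", "integrate"],
    "Evaluation":    ["evaluate", "judge", "justify", "critique"],
}

def levels_for_verb(verb: str) -> list[str]:
    """Return the Bloom levels at which a given action verb is typically used."""
    verb = verb.lower()
    return [level for level, verbs in BLOOM_VERBS.items() if verb in verbs]

if __name__ == "__main__":
    print(levels_for_verb("design"))  # ['Synthesis']
```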

Slide 57: Writing Measurable Outcomes: Exercise 3

Slide 58: Report Out on Exercise 3

Slide 59: Examples: www.engrng.pitt.edu/~ec2000

Slide 60: (image slide; no transcript text)

Slide 61: What is an "acceptable" level of performance? Developing scoring rubrics

Slide 62: What is a rubric, anyway?
A rubric is a set of categories that define and describe the important components of the work being completed, critiqued, or assessed. Each category contains a gradation of levels of completion or competence, with a score assigned to each level and a clear description of the performance needed to attain the score at that level.

Slide 63: Purpose of the Rubric (what do you want it to do?)
- Information to and about student competence (analytic): communicate expectations; diagnose for the purpose of improvement and feedback
- Overall examination of the status of student performance (holistic)

Slide 64: Generic or Task-Specific?
- Generic: a general rubric that can be used across similar performances (e.g., across all communication tasks or all problem-solving tasks); big-picture approach; an element of subjectivity
- Task-specific: can be used only for a single task; focused approach; less subjective

Slide 65: How Many Points on the Scale?
- Consider both the nature of the performance and the purpose of scoring
- Three to six points are recommended to describe student achievement at a single point in time
- If the focus is a developmental curriculum (growth over time), more points are needed (e.g., 6 to 11)

Slide 66: Rubric Template
Student outcome: ________. Columns give a numeric scale with a descriptor for each level; each performance-criterion row lists the identifiable performance characteristics reflecting that level.

Slide 67: "Effective Writing Skills" rubric (scale: Unsatisfactory 1, Developing 2, Satisfactory 3, Exemplary 4)
- Content: supporting detail
  1: Includes inconsistent or few details, which may interfere with the meaning of the text.
  2: Includes some details, but may include extraneous or loosely related material.
  3: Provides adequate supporting detail to support the solution or argument.
  4: Provides ample supporting detail to support the solution or argument.
- Organization: organizational pattern
  1: Little evidence of organization or any sense of wholeness or completeness.
  2: Achieves little completeness and wholeness, though organization is attempted.
  3: Organizational pattern is logical and conveys completeness and wholeness with few lapses.
  4: Organizational pattern is logical and conveys completeness and wholeness.
- Style: language and word choice
  1: Has limited or inappropriate vocabulary for the audience and purpose.
  2: Limited and predictable vocabulary, perhaps not appropriate for the intended audience and purpose.
  3: Uses effective language and appropriate word choices for the intended audience and purpose.
  4: Uses effective language; makes engaging, appropriate word choices for the audience and purpose.
- Standard English
  1: Does not follow the rules of standard English.
  2: Generally does not follow the rules of standard English.
  3: Generally follows the rules of standard English.
  4: Consistently follows the rules of standard English.
An Average row summarizes the criterion scores.

Slide 68: "Work Effectively in Teams" rubric, repeated from Slide 53 as the example for the results that follow.
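
A minimal sketch of how a rubric such as the teamwork example above can be stored and scored, assuming a simple Python representation (the data structure and helper function are illustrative, not part of any ABET tool): each criterion maps to four level descriptors, a rater records a 1-4 score per criterion, and the Average row is the mean of those scores.

```python
from statistics import mean

# Hypothetical encoding of the "Work Effectively in Teams" rubric above.
# Descriptor lists are indexed by level: 1 (Unsatisfactory) .. 4 (Exemplary).
TEAMWORK_RUBRIC = {
    "Research & gather information": [
        "Does not collect any information that relates to the topic.",
        "Collects very little information; some relates to the topic.",
        "Collects some basic information; most relates to the topic.",
        "Collects a great deal of information; all relates to the topic.",
    ],
    "Fulfill team role's duties": [
        "Does not perform any duties of the assigned team role.",
        "Performs very few duties.",
        "Performs nearly all duties.",
        "Performs all duties of the assigned team role.",
    ],
    # remaining criteria omitted for brevity
}

def average_score(scores: dict[str, int]) -> float:
    """Average the per-criterion ratings (each 1-4) into the rubric's summary row."""
    for criterion, score in scores.items():
        if not 1 <= score <= 4:
            raise ValueError(f"{criterion}: score {score} is outside the 1-4 scale")
    return mean(scores.values())

# Example: one rater's scores for one team member.
print(average_score({"Research & gather information": 3,
                     "Fulfill team role's duties": 4}))  # 3.5
```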

Slide 69: Example of Results: at a level expected of a student who will graduate?

Slide 70: Example of Results: Teaming Skills (1. Research and gather information; 2. Fulfill team role's duties; 3. Shares equally; 4. Listens to teammates)

Slide 71: Example of Results: Communication Skills (1. Research and gather information; 2. Fulfill team role's duties; 3. Shares equally; 4. Listens to teammates)

Slide 72: Linking Results to Practice
- Development of a curriculum map
- Linking curriculum content and pedagogy to the knowledge, practice, and demonstration of learning outcomes

Slide 73: Course-level survey of outcome coverage
For each outcome and performance criterion, faculty indicate one of: Outcome Explicit (the outcome is explicitly stated as a learning outcome for this course); Demonstrate Competence (students are asked to demonstrate competence on this outcome through homework, projects, tests, etc.); Formal Feedback (students are given formal feedback on their performance on this outcome); or Not Covered (the outcome is not addressed in these ways in this course). Clicking "view rubric" shows the scoring rubric for that performance criterion.
- Recognition of ethical and professional responsibilities: 1. Demonstrate knowledge of professional codes of ethics. 2. Evaluate the ethical dimensions of professional engineering, mathematical, and scientific practices.
- An ability to work effectively in teams: 1. Share responsibilities and duties, and take on different roles when applicable. 2. Analyze ideas objectively to discern feasible solutions by building consensus. 3. Develop a strategy for action.
- An ability to communicate effectively in oral, written, graphical, and visual forms: 1. Identify the readers/audience, assess their previous knowledge and information needs, and organize/design information to meet those needs. 2. Provide content that is factually correct, supported with evidence, explained with sufficient detail, and properly documented. 3. Test reader/audience response to determine how well ideas have been relayed. 4. Submit work with a minimum of errors in spelling, punctuation, grammar, and usage.

Slide 74: Curriculum map for Communication Skills (example course grid by year and term; credit hours in parentheses)
- Fall: 1st year: CM 111 Chem I (4), EM 100 Life Skills (1), EM 104 Graph Comm (2), RH 131 Fresh Comp (4), MA 111 Calc I (5). 2nd year: CH 01 Cons Principles (4), CM 251 O Chem I (4), MA 221 DE I (4), HSS Elective (4), CH 200 Career P I (0). 3rd year: CH 414 Heat Transfer (4), CH 415 Materials (4), CM 225 A Chem I (4), CH 304 Thermo II (4), Elective (4). 4th year: CH 400 Career P III (0), CH 401 Mass II (4), CH 403 Lab II (2), CH 404 Kinetics (4).
- Winter: 1st year: CM 113 Chem II (4), PH 111 Physics I (4), HSS Elective (4), MA 112 Calc II (5), MS 120 M. History (1). 2nd year: CH 202 ChE Proc Calc (4), CM 252 O Chem II (4), MA 222 DE II (4), EM 101 Statics I (2), HSS Elective (4). 3rd year: CH 300 Career P II (0), CM 360 P Chem (4), CH 305 Mass I (4), MA 227 Statistics (4). 4th year: CH 406 Design I (4), CH 408 Lab III (2), CH 440 P Control (4), HSS Elective (4).
- Spring: 1st year: CM 115 Chem III (4), CS 100 Program. (2), EM 103 Int Design (2), MA 113 Calc III (5), PH 112 Physics II (4). 2nd year: CH 301 Fluids (4), Elective (4), HSS Elective (4), CH 303 Thermo I (4), HSS Elective (4). 3rd year: EE 206 EEE (4), CH 402 ChE Lab I (1), Elective (4). 4th year: CH 407 Design II (4), CH 409 Prof Prac (1), HSS Elective (4), Elective (Des) (4), Elective (free) (4).
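
As a sketch of how such a map can be made queryable (the course codes are taken from the example above, but the coverage flags and the helper function are hypothetical), each course can record whether an outcome is explicit, demonstrated, and formally assessed, and coverage gaps then fall out of a simple query.

```python
# Hypothetical curriculum map: course -> outcome -> coverage flags.
# Course codes come from the example map above; the flags are illustrative.
CURRICULUM_MAP = {
    "RH 131 Fresh Comp": {"communication": {"explicit": True, "demonstrated": True, "feedback": True}},
    "CH 406 Design I": {"communication": {"explicit": True, "demonstrated": True, "feedback": False},
                        "teamwork": {"explicit": False, "demonstrated": True, "feedback": False}},
    "CH 409 Prof Prac": {"ethics": {"explicit": True, "demonstrated": False, "feedback": False}},
}

def courses_assessing(outcome: str) -> list[str]:
    """Courses where students both demonstrate the outcome and receive formal feedback."""
    return [course for course, outcomes in CURRICULUM_MAP.items()
            if outcomes.get(outcome, {}).get("demonstrated")
            and outcomes.get(outcome, {}).get("feedback")]

print(courses_assessing("communication"))  # ['RH 131 Fresh Comp']
```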

Slide 75: Assessment Methods

Slide 76: Assessment Methods
- Written surveys and questionnaires
- Exit and other interviews
- Standardized exams
- Locally developed exams
- Archival records
- Focus groups
- Portfolios
- Simulations
- Performance appraisal
- External examiner
- Oral exams
- Behavioral observations

Slide 77: Direct Measures: direct measures provide for the direct examination or observation of student knowledge or skills against measurable learning outcomes.

Slide 78: Indirect Measures: indirect measures of student learning ascertain the opinion or self-report of the extent or value of learning experiences.

Slide 79: Direct versus Indirect Methods
- Direct: exit and other interviews, standardized exams, locally developed exams, portfolios, simulations, performance appraisal, external examiner, oral exams, behavioral observations
- Indirect: written surveys and questionnaires, exit and other interviews, archival records, focus groups

Slide 80: Tools: Exercise 4

Slide 81: Assignment
- After you have shared methods, choose at least two methods (preferably three) that are appropriate for the performance criteria chosen
- Include at least one direct measure
- Use an overhead transparency to record your findings
- Include your rationale for the decision

Slide 82: Report Out on Exercise 4

Slide 83: Validity
- Relevance: the assessment option measures the educational outcome as directly as possible
- Accuracy: the option measures the educational outcome as precisely as possible
- Utility: the option provides formative and summative results with clear implications for educational program evaluation and improvement

Slide 84: "Bottom Lines" (ABET)
- All assessment options have advantages and disadvantages
- The "ideal" methods are those that best balance program needs, satisfactory validity, and affordability (time, effort, and money)
- It is crucial to use a multi-method, multi-source approach to maximize validity and reduce the bias of any one approach

Slide 85: Assessment Method Truisms
- There will always be more than one way to measure any learning outcome
- No single method is good for measuring a wide variety of different student abilities
- There is generally an inverse relationship between the quality of measurement methods and their expediency
- It is important to pilot test a method to see whether it is appropriate for your program

Slide 86: Data Collection Process
- Why? Know your question
- What? Focus on a few criteria for each outcome
- Who? Students (cohorts); faculty (some)
- When?

Slide 87: Sampling
For program assessment, sampling is acceptable, and even desirable, for programs of sufficient size, provided the sample is representative of all students.
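
For a program large enough to sample, a simple random draw from the cohort roster keeps the scoring workload manageable. The sketch below is illustrative (the roster, fraction, and seed are placeholders), and a stratified draw could be substituted if representativeness across tracks or cohorts is a concern.

```python
import random

def sample_for_assessment(roster: list[str], fraction: float = 0.25,
                          seed: int | None = None) -> list[str]:
    """Draw a simple random sample of students whose work will be scored.

    `roster` is the full list of students in the cohort; `fraction` is the
    share of the cohort to score (the 25% default is purely illustrative).
    """
    rng = random.Random(seed)
    k = max(1, round(len(roster) * fraction))
    return rng.sample(roster, k)

# Hypothetical cohort of 40 students; score work from 10 of them.
cohort = [f"student_{i:02d}" for i in range(1, 41)]
print(sample_for_assessment(cohort, fraction=0.25, seed=2006))
```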

Slide 88: Data Collection
How do objectives differ from outcomes in the data collection process? The timeline shows a multi-year cycle: define outcomes and map the curriculum (Year 1), collect data (Year 2), evaluate and design improvements (Year 3), then implement improvements and continue data collection (Year 4 and beyond).

Slide 89: Rotating assessment schedule (2003-04 through 2008-09)
Each learning outcome is scheduled for assessment in designated years:
- A recognition of ethical and professional responsibilities
- An understanding of how contemporary issues shape and are shaped by mathematics, science, and engineering
- An ability to recognize the role of professionals in the global society
- An understanding of diverse cultural and humanistic traditions
- An ability to work effectively in teams
- An ability to communicate effectively in oral, written, graphical, and visual forms

Slide 90: Closing the Loop (annual calendar, January through December)
- The institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to department heads.
- The evaluation committee receives and evaluates all data, makes a report, and refers recommendations to the appropriate areas.
- The institute acts on the recommendations of the evaluation committee.
- Reports of actions taken by the institute and the targeted areas are returned to the evaluation committee for iterative evaluation.

Slide 91: Student Learning Outcomes at the Program Level © (planning template, grogers@abet.org)
For each learning outcome, the template records: performance criteria; strategies; assessment method(s); context for assessment; time of data collection; assessment coordinator; evaluation of results; results (with date); actions (with date); and second-cycle results (with date).
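
The template amounts to one structured record per learning outcome. A sketch of that record is shown below; the field names mirror the slide, while the class name and example values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class OutcomeAssessmentPlan:
    """One row of the program-level planning template (fields mirror the slide)."""
    learning_outcome: str
    performance_criteria: list[str]
    strategies: list[str]            # educational practices used to develop the outcome
    assessment_methods: list[str]    # should include at least one direct measure
    context_for_assessment: str      # e.g., a capstone course or laboratory
    time_of_data_collection: str
    assessment_coordinator: str
    results: dict[str, str] = field(default_factory=dict)  # date -> summary of results
    actions: dict[str, str] = field(default_factory=dict)  # date -> action taken

# Hypothetical example, not an ABET-provided plan:
plan = OutcomeAssessmentPlan(
    learning_outcome="Ability to work effectively in teams",
    performance_criteria=["Shares work equally", "Listens to other teammates"],
    strategies=["Team-based design projects"],
    assessment_methods=["Scoring rubric (direct)", "Exit survey (indirect)"],
    context_for_assessment="Capstone design course",
    time_of_data_collection="Spring term, year 4",
    assessment_coordinator="Program assessment committee chair",
)
print(plan.learning_outcome)
```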

Slide 92: Checklist
- The assessment question is known and explicit
- Outcomes are defined and the number of performance criteria is manageable
- Data are efficiently and systematically collected
- Assessment methods are appropriate to the program context
- Results are evaluated
- Evaluation is more than looking at the results of learning outcomes
- Action is appropriate

Slide 93: Things I Wish I Had Known
- Capitalize on what you are already doing
- One size does not fit all
- You don't have to measure everything all the time
- More data are not always better
- Pick your battles
- Take advantage of local resources
- Don't wait for perfection
- Go for the early win
- Decouple assessment from faculty evaluation

Slide 94: (image slide; no transcript text)

Slide 95: Tools to help you work through the assessment process
- Assessment of student learning outcomes
- Assessment processes in business and industry
- Assessment rubrics
- Electronic portfolios
- Assessment terminology
- Using grades for assessment
- Using surveys and questionnaires for assessment
- Data collection
- General assessment articles and presentations
- Assessment workshops and conferences

Slide 96: (image slide; no transcript text)

Slide 97: April 13-14, 2007: www.rose-hulman.edu/assessent2007

Slide 98: ABET Lessons Learned

Slide 99: ABET Lessons Learned (1/6)
- Start as soon as possible
- Develop a comprehensive plan
- Begin implementing the plan as quickly as possible
- Do not allow the early steps to consume excessive time and create delays in the process
- Close Continuous Improvement loops as soon as possible
- Use consultants with caution; there can be positive and negative effects

Slide 100: ABET Lessons Learned (2/6)
- It is extremely important to define terminology
- When reported to constituents or external evaluators, evidence should be organized by Outcomes and Objectives rather than by courses
- Evidence should show that evaluation and assessment processes are in place and working
- The accumulation of experience with outcomes assessment and continuous improvement will build confidence among all constituencies

Slide 101: ABET Lessons Learned (3/6)
- Coordination between program assessment and institutional assessment can enhance both
- When presenting information for accreditation reviews:
  - Descriptions of the CI process should be accompanied by evidence of data reduction, analysis, and the resulting actions
  - Text should be used to explain, interpret, and strengthen tabular or statistical data

Slide 102: ABET Lessons Learned (4/6)
- Each program should have some unique Outcomes that differ from those in the accreditation criteria and those of other programs at the same institution; the absence of unique Outcomes can imply that the program does not have a clear sense of mission
- The most successful programs are those whose faculty members have participated in training sessions and communicated with faculty at other institutions
- It is important for the program administration to be aware and supportive of Continuous Improvement activities

Slide 103: ABET Lessons Learned (5/6)
- Continuous Improvement programs should employ a variety of assessment tools with a mixture of short and long time cycles
- Surveys should be only one of several evaluation tools used in Continuous Improvement
- Requirements for faculty, facilities, etc. should be linked to objectives, outcomes, and Continuous Improvement
- There has been no apparent relationship between the degree of success and the size of the institution

Slide 104: ABET Lessons Learned (6/6)
Programs that have successfully implemented Continuous Improvement have had two characteristics in common:
- At least one faculty member who is highly committed to developing and guiding implementation
- Sincere involvement of the faculty members in the program

Slide 105: Introduction to ABET

Slide 106: Introduction to ABET Accreditation
- Federation of 28 professional societies
- Board of Directors representing those societies
- Four commissions:
  - Applied Science Accreditation Commission (ASAC)
  - Computing Accreditation Commission (CAC)
  - Engineering Accreditation Commission (EAC)
  - Technology Accreditation Commission (TAC)
- Accreditation Council: representatives of each commission; coordination and harmonization of processes

Slide 107: Accreditation Process
- Commission responsibilities: conduct evaluations of programs; determine accreditation actions
- Commission makeup: commissioners are volunteers appointed by the societies; commissioners chair accreditation teams
- Accreditation team: a chair plus one Program Evaluator for each program; Program Evaluators (PEVs) are volunteers from the societies

Slide 108: ABET Accreditation
- Federation of 28 professional societies
- Board of Directors represents those societies
- Four commissions: Applied Science (ASAC), Computing (CAC), Engineering (EAC), and Technology (TAC) Accreditation Commissions
- Accreditation Council: representatives of each commission; coordination and harmonization of processes

Slide 109: ABET Accreditation Statistics
- ASAC: 72 total programs accredited; 15 programs evaluated in 2004-05; +57% change in number of programs, 1995-2005
- CAC: 240 total programs accredited; 70 evaluated in 2004-05; +85% change, 1995-2005
- EAC: 1,793 total programs accredited; 373 evaluated in 2004-05; +18% change, 1995-2005
- TAC: 740 total programs accredited; 206 evaluated in 2004-05; -16% change, 1995-2005

Slide 110: ABET Longitudinal Study

Slide 111: Engineering Change: A Study of the Impact of EC2000*
- Lisa R. Lattuca, Project Director and Co-PI
- Patrick T. Terenzini, Co-PI
- J. Fredericks Volkwein, Co-PI
- Pennsylvania State University, Center for the Study of Higher Education
*EC2000 = outcomes-based accreditation criteria for the Engineering Accreditation Commission of ABET

Slide 112: Key Questions
1. What impact, if any, has EC2000 had on the preparation of graduating seniors to enter the engineering profession?
2. What impact, if any, has EC2000 had on practices that may be related to changes in student preparation?

Slide 113: Significance of the Engineering Change Study
- The first national study of the impact of outcomes-based accreditation in the U.S.
- A model for assessments in other ABET commissions
- A pre-EC2000 benchmark (1994) for graduating seniors' preparation
- The first post-EC2000 data point (2004) on graduating seniors' preparation

Slide 114: Engineering Change: Studying the Impact of EC2000 (conceptual model)
EC2000 is hypothesized to drive program changes (curriculum and instruction; faculty culture; policies and practices; continuous improvement), which shape student experiences (in class and out of class), which in turn affect outcomes (student learning on criteria 3.a-k and employer ratings).

Slide 115: Engineering Disciplines Examined: aerospace, chemical, civil, computer, electrical, industrial, mechanical

Slide 116: Data Sources and Response Rates (target population; number of responses; response rate)
- Programs: 203; 147; 72%
- Faculty: 2,971; 1,243; 42%
- Deans: 40; 40; 98%+
- 1994 graduates (pre-EC2000): 13,054; 5,494; 42%
- 2004 graduates (post-EC2000): 12,921; 4,330; 34%
- Employers: target population unknown; 1,622 responses; response rate N/A

Slide 117: Conclusions
- Recent graduates are measurably better prepared than those of a decade ago on all nine EC2000 outcomes
- The most substantial improvements are in societal and global issues, applying engineering skills, group skills, and ethics and professionalism
- Changes in faculty practices are empirically linked to these gains in preparation
- Although 25% of employers report decreases in problem-solving skills, 80% still consider graduates adequately or well prepared in that skill area

Slide 118: Conclusions
- A complex array of changes in programs, faculty practices, and student experiences systematically enhances student learning
- These changes are consistent with what one would expect to see if EC2000 were having an impact
- Changes at the classroom level are particularly effective in promoting the a-k learning outcomes

Slide 119: Conclusions
- Students also learn engineering skills through out-of-class experiences
- A faculty culture that supports assessment and continuous improvement is also important
- Most deans' comments echoed the study findings: EC2000 is an accelerant for change in engineering programs

Slide 120: Looking Forward
- ABET has set the stage for systematic, continuous review of engineering education
- Engineering Change provides important evidence that an outcomes-based model is an effective quality-assurance mechanism
- The evidence arrives just in time to inform the national debate

Slide 121: ABET Participation Project

Slide 122: Participation Project Pilot Report, July 22, 2006

Slide 123: Partnership to Advance Volunteer Excellence (PAVE)
Design and implement a comprehensive and effective program that optimizes the use of the expertise and experience of the volunteer professionals who participate in ABET's outcomes-based accreditation process.

Slide 124: Key Components
- Develop a competency model for Program Evaluators
- Design a more effective recruitment and selection process
- Design a more effective training process
- Design a method of performance assessment and improvement

Slide 125: What Are Competencies?
- Competencies are behaviors (which include knowledge, skills, and abilities) that define a successful PEV (program evaluator)
- They set expectations
- They align with vision, values, and strategy
- They drive continuous improvement

Slide 126: Competencies
- Effective communicator: easily conducts face-to-face interviews; writes clearly and succinctly; presents focused, concise oral briefings
- Professional: conveys a professional appearance; is committed to contributing and adding value; is considered a person of high integrity and ethical standards

Slide 127: Competencies
- Interpersonally skilled: friendly and sets others at ease; listens and places input into context; open-minded and avoids personal bias; forthright (does not hold back what needs to be said); adept at pointing out strengths and weaknesses in a non-confrontational manner
- Technically current: demonstrates the required technical credentials for the position; engaged in lifelong learning and current in the field

Slide 128: Competencies
- Organized: focused on meeting deadlines; focuses on critical issues and avoids minutiae; displays take-charge initiative; takes responsibility and works under minimal supervision
- Team oriented: readily accepts input from team members; works with team members to reach consensus; values team success over personal success

Slide 129: Becoming an ABET Program Evaluator (three-phase process flow)
- The member society selects a PEV candidate via the competency model and assigns a mentor
- The candidate works through and successfully completes preliminary on-line modules
- The candidate attends and successfully completes visit-simulation training, supported by society support facilitators and a lead facilitator
- The candidate attends program-specific training (society); an observer visit is optional
- The society approves the PEV for assignment

Slide 130: Training Pilot
- Pre-work CD with checks for understanding: mentor assigned; self-study; complete pre-visit forms
- 1.5 days simulating a campus visit: Sunday team meeting; display materials and lab interview; draft-statement homework; Monday night meeting

Slide 131: Evaluation Pilot
Performance appraisal forms:
- Describe how competencies are demonstrated before and during the visit
- Provide performance metrics
- Require comments for ratings below "met expectations"
- Are completed by peers, the team chair, and the program

Slide 132: Partnership to Advance Volunteer Excellence
- Determine the best implementation strategies together
- Information sharing, action planning, and collaboration to carry the good work forward
- Increase the value of accreditation for your programs

Slide 133: Points of Learning

Slide 134: Questions & Answers

