
1 JIC ABET WORKSHOP No. 3
Venue: M038  Date: Monday, April 18, 2011  Time: 10:00 AM
Guidelines on Criterion 4: Continuous Improvement
I- PEOs Assessment
II- SOs Assessment
III- Performance Indicators, Attributes and Course Learning Objectives
IV- Student Exit Survey Questionnaire
V- Alumni Survey Questionnaire
VI- Employers Survey Questionnaire
Presented by: JIC ABET COMMITTEE

2 I- PEOs ASSESSMENT

3 1. A listing and description of the assessment processes used to gather the data upon which the evaluation of each program educational objective is based
2. Examples of data collection processes may include, but are not limited to, employer surveys, graduate surveys, focus groups, industrial advisory committee meetings, or other processes that are relevant and appropriate to the program
3. The frequency with which these assessment processes are carried out
4. The expected level of attainment for each of the program educational objectives
5. Summaries of the results of the evaluation processes and an analysis illustrating the extent to which each of the program educational objectives is being attained
6. How the results are documented and maintained

4 Performance targets: the target criteria for the outcome indicators. Examples:
- The [average score, score earned by at least 80%] of the program graduates on the [standardized test, standardized test item, capstone design report, portfolio evaluation] must be at least 75/100.
- The [median rating for, rating earned by at least 80% of] the program graduates on the [self-rating sheet, peer rating sheet, senior survey, alumni survey, employer survey, final oral presentation] must be at least [75/100, 4.0 on a 1-5 Likert scale, "Very good"].
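The two target styles above set different bars: one on the group average, the other on the share of graduates clearing a threshold. A minimal Python sketch, with invented scores and hypothetical function names, shows how each would be checked:

```python
# Minimal sketch of the two target styles above; scores are invented.

def average_target_met(scores, minimum=75.0):
    # Style 1: the average score must be at least `minimum`.
    return sum(scores) / len(scores) >= minimum

def proportion_target_met(scores, minimum=75.0, proportion=0.80):
    # Style 2: at least `proportion` of graduates must score `minimum` or more.
    passing = sum(1 for s in scores if s >= minimum)
    return passing / len(scores) >= proportion

capstone_scores = [82, 91, 74, 88, 79, 95, 70, 85]  # invented data
print(average_target_met(capstone_scores))     # True  (mean is 83.0)
print(proportion_target_met(capstone_scores))  # False (6/8 = 75% pass)
```

Note that the same data can satisfy one style of target and miss the other, which is why the target style should be stated explicitly for each indicator.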

5 Program Educational Objectives for the 2006-2011 ABET Cycle

6 5. Summaries of the results of the evaluation processes and an analysis illustrating the extent to which each of the program educational objectives is being attained

7 Results 2007: All students who graduated in 2002-2006 were surveyed. There were 308 graduates, and we were able to locate email addresses for 225 of them (73%). There were 98 respondents (44%). Of this number, 88 (89%) were practicing engineering technology, 8 (8%) were in graduate school, and the remainder were in other fields. The survey asked the alumni whether or not they had had an opportunity to demonstrate each of the objectives. The results are presented in Table 4.2.
Table 4.2. 2007 Alumni Survey Results: Percent of Graduates Who Indicated That They Were Prepared
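The two percentages quoted above use different denominators, which is easy to miss; a two-line sketch of the arithmetic:

```python
# The 2007 figures quoted above, with the arithmetic made explicit: located
# emails are a share of all graduates, while the response rate is a share of
# located emails.
graduates, emailed, respondents = 308, 225, 98

print(f"emails located: {emailed / graduates:.0%}")    # 73%
print(f"response rate:  {respondents / emailed:.0%}")  # 44%
```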

8 2007 Evaluation of Alumni Survey results: This was the first cycle where we used an electronic survey format (SurveyMonkey) to poll our alumni on their achievement of the program educational objectives. We attribute the positive response rate to the fact that we were able to streamline the assessment process and better track who had responded and who had not. The overall survey results indicate that the alumni are meeting the program educational objectives. However, there was some concern that the recently graduated classes (2005 and 2006) were not as positive in their responses as the alumni who had been out three years or more. On further analysis and a review of the written comments, it is clear that the quality of the work experience increases with time, and many of the recent graduates had not yet had experiences which provided an opportunity to demonstrate some of the program educational objectives (e.g., work in cross-functional teams, confront an ethical issue, get involved in service activities). 2007 Actions taken: The faculty were satisfied with the results and concluded that the alumni were meeting the program educational objectives and that there was no need to take any action at this time. However, there was some concern about the engagement of recent graduates in service activities. This is an area that we will continue to monitor.

9 Results 2010: All students who graduated in 2005-2009 were surveyed. There were 312 graduates, and we were able to locate email addresses for 240 of them (77%). There were 96 respondents (40%). Of this number, 89 (93%) were practicing engineering technology, 5 (5%) were in graduate school, and the remainder were in other fields. The survey asked the alumni whether or not they had had an opportunity to demonstrate each of the objectives. The results are presented in Table 4.4.

10 2010 Evaluation of Alumni Survey results: Overall, the survey results indicate that the alumni continue to meet the program educational objectives. In the 2007 evaluation there was some concern that the most recently graduated classes (2005 and 2006) were not as positive in their responses as the alumni who had been out three years or more. In this survey we again surveyed the 2005 and 2006 graduates and were able to validate our belief that some objectives are best demonstrated after graduates have been out for two or more years. This analysis clearly demonstrates that as graduates gain more work-related experience, their responses become more positive. This is demonstrated by comparing the responses of the 2005 and 2006 graduates when surveyed in 2007 and again in 2010. This comparison is shown in Table 4.5.

11 2010 Actions taken: Based on the survey results, the performance targets were either met or exceeded, so no actions are being taken at this time. Copies of all surveys and the survey methodology will be available in the ABET resource room at the time of the visit.

12 Summary of Advisory Committee Discussions: Every other year the Engineering Technology Advisory Committee reviews and discusses the program educational objectives and the attributes that are demonstrated by the program graduates. The advisory committee is made up of employers (over half of whom are also alumni) and graduates. They meet with faculty yearly to discuss curricular and resource issues as well as current trends and issues in the discipline. In the even-numbered years, they discuss their personal experiences or their experiences with the program graduates as they relate to the program educational objectives. The objectives that are the primary focus are the application of engineering technology principles (Obj. 1), the ability to work in cross-functional teams (Obj. 3), continued learning (Obj. 5), and ethical conduct (Obj. 4). They have consistently agreed that the program educational objectives are being met. The minutes from their meetings are summarized, are available for review in the ABET resource room, and will be available at the time of the visit.

13 6. How the results are documented and maintained

14 Documentation: The assessment and evaluation documentation is in digital format and is maintained by the office administrator. It is accessible on the intranet, and all faculty can review and comment on any of the continuous quality improvement (CQI) processes. All comments are reviewed annually as a part of the program educational objectives and student outcomes review processes. The CQI website will be made available to the team at the time of the ABET visit.

15 II- SOs ASSESSMENT

16 1. A listing and description of the assessment processes used to gather the data upon which the evaluation of each student outcome is based
2. Examples of data collection processes may include, but are not limited to, specific exam questions, student portfolios, internally developed assessment exams, senior project presentations, nationally normed exams, oral exams, focus groups, industrial advisory committee meetings, or other processes that are relevant and appropriate to the program
3. The frequency with which these assessment processes are carried out
4. The expected level of attainment for each of the student outcomes
5. Summaries of the results of the evaluation process and an analysis illustrating the extent to which each of the student outcomes is being attained
6. How the results are documented and maintained

17 The assessment of student outcomes is done on a six-year cycle. The schedule used for the current ABET review cycle is illustrated in Table 4.6.

18 Although data are collected only every three years, assessment activities take place for each outcome every year. The cycle of activity is shown in Table 4.7. Each outcome has been mapped to the engineering technology courses as depicted in Table 4.8. This map was used to make decisions about where the summative data would be collected.
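One way such an outcome-to-course map might be represented is sketched below. All course numbers except ET 4090 are invented, and the "collect summative data in the last mapped course" rule is an assumption consistent with the end-of-program assessments described in the following slides, not a stated policy:

```python
# Hypothetical outcome-to-course map in the spirit of Table 4.8. Course
# numbers other than ET 4090 are invented for illustration.
outcome_map = {
    "SO1: identify, analyze, and solve problems": ["ET 2010", "ET 3050", "ET 4090"],
    "SO2: conduct tests and experiments":          ["ET 2200", "ET 3350"],
}

# Assumed rule: summative (end-of-program) data are collected in the last
# course to which an outcome is mapped.
for outcome, courses in outcome_map.items():
    print(f"{outcome} -> summative data in {courses[-1]}")
```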

19 (Image-only slide; no text captured in the transcript.)

20 Results for each student outcome are reported separately in the following tables, and all supporting documentation will be available in the ABET resource room at the time of the visit. Each table represents the activity for the current ABET accreditation cycle. Each outcome table includes the performance indicators, the courses and/or co-curricular activities (educational strategies) that provide students an opportunity to demonstrate each indicator, where summative data are collected, the timetable, the method of assessment, and the performance target. Each table is followed by a graph showing the results with a three-cycle trend line. Student Outcome #1: ability to identify, analyze, and solve engineering technology problems

21 Assessment Results (direct measures) 2007: For summative assessment (end of program), the decision was made to focus on direct assessment for all indicators. Summative data for Indicators #1 and #2 were collected in the Engineering Technology Design I course (ET 4090), where students are asked to develop their statement of the problem and project planning documentation. For Indicator #3, the assessment was completed in the second-semester design course (ET 4092) as a part of the final assessment of the course. The percent of students who demonstrated each of the criteria was as follows: Indicator #1: 80%; Indicator #2: 80%; and Indicator #3: 84%. Evaluation and Actions 2008: The assessment results were reviewed by the faculty who are responsible for the Senior Design sequence. A presentation was made at the faculty retreat held in August 2008. Although the students made progress on Indicator #1 from the previous assessment in 2004 (up from 74%), there was still concern that their problem statements did not reflect an adequate understanding of what was expected. The decision was made to provide students with examples of both poorly and well-written problem statements and require them to analyze the difference. They would then be asked to do a self-assessment of how well their problem statements reflected what they identified in the well-written statements and to submit their analysis with their problem statement. In a review of the results for Indicator #2, it was determined that the students were performing significantly better than in the previous assessment (68%) and that the faculty would continue to monitor the students' progress in the following year (2008-09). This improvement was attributed to the fact that the faculty had implemented a two-session sequence in ET 4090 on project planning, with direct feedback to students in the planning process using the rubric used to assess Indicator #2. Faculty members are satisfied that students are meeting the expectations for Indicator #3. The use of industry-based problems with industry mentors has improved the quality of students' solutions and their ability to recognize the constraints that affect their solutions.

22 Second-Cycle Results (direct measures) 2010: Summative data for this cycle were collected in the same courses as in the 2007 cycle. Based on the actions taken as a result of the 2008 evaluation process, the following results were found: Indicator #1 up 14 percentage points (94%); Indicator #2 up 4 points (84%); and Indicator #3 unchanged (84%). Faculty will discuss these findings at the August 2010 faculty retreat and report them at the time of the ABET site visit. Figure 4.9. Trend line for Student Outcome #1: ability to identify, analyze, and solve engineering technology problems
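The changes quoted above are percentage-point differences between the two cycles; a short sketch of the comparison using the reported figures:

```python
# Indicator results from the two cycles reported above (percent of students
# demonstrating each indicator).
results_2007 = {"Indicator #1": 80, "Indicator #2": 80, "Indicator #3": 84}
results_2010 = {"Indicator #1": 94, "Indicator #2": 84, "Indicator #3": 84}

for indicator, earlier in results_2007.items():
    latest = results_2010[indicator]
    print(f"{indicator}: {latest}% ({latest - earlier:+d} points)")
# Indicator #1: 94% (+14 points)
# Indicator #2: 84% (+4 points)
# Indicator #3: 84% (+0 points)
```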

23 Display materials available at the time of the visit in the ABET resource room:
- Rubrics used by faculty to assess the indicators
- Indicator #1 sample problem statements documentation
- Indicator #2 project planning guide
- Senior survey questions with results and faculty evaluation of results
- Minutes of the faculty retreats where actions were taken in 2008 and 2011

24 III- Performance Indicators, Attributes and Course Learning Objectives

25 The cognitive domain (Bloom, 1956) involves knowledge and the development of intellectual skills. This includes the recall or recognition of specific facts, procedural patterns, and concepts that serve in the development of intellectual abilities and skills. There are six major categories, listed below in order from the simplest behavior to the most complex. The categories can be thought of as degrees of difficulty: the first ones must normally be mastered before the next ones can take place.
Bloom's Taxonomy (simplest to most complex): Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation

26 (Image-only slide; no text captured in the transcript.)

27 - Knowledge: Name Bloom's six levels of the cognitive domain
- Comprehension: Explain each cognitive level
- Application: Write course learning outcomes using Bloom's Taxonomy
- Analysis: Categorize the course learning outcomes into the six levels
- Synthesis: Develop a course plan using Bloom's six cognitive levels
- Evaluation: Critique the effectiveness of each cognitive level in implementing the course plan
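A minimal sketch of putting these levels to work: tagging a course learning outcome by its leading action verb. The verb lists are abbreviated and illustrative, not an official taxonomy:

```python
# Abbreviated, illustrative verb lists for Bloom's six cognitive levels.
BLOOM_VERBS = {
    "Knowledge":     {"name", "list", "define"},
    "Comprehension": {"explain", "summarize"},
    "Application":   {"write", "apply", "solve"},
    "Analysis":      {"categorize", "compare"},
    "Synthesis":     {"develop", "design"},
    "Evaluation":    {"critique", "judge"},
}

def bloom_level(outcome: str) -> str:
    # Classify a learning outcome by its first (action) word.
    verb = outcome.split()[0].lower()
    for level, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return level
    return "unclassified"

print(bloom_level("Categorize the course learning outcomes into six levels"))
# -> Analysis
```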

28 Outcome elements (Performance Indicators)
Outcome 3b: ability to design and conduct experiments, as well as analyze and interpret data
Outcome elements: designing experiments, conducting experiments, analyzing data, interpreting data
Outcome elements are the different abilities specified in a single outcome that would generally require different assessment measures.

29 Outcome Attributes (Measures)
Outcome 3e: ability to identify, formulate, and solve engineering problems
Outcome elements: problem identification; problem statement construction and system definition; problem formulation and abstraction; information and data collection; model translation; validation; experimental design; solution development or experimentation; interpretation of results; implementation; documentation; feedback and improvement
Example attributes (problem identification): describes the engineering problem to be solved; visualizes the problem through a sketch or diagram; outlines problem variables, constraints, resources, and given information to construct a problem statement; appraises the problem statement for objectivity, completeness, relevance, and validity
Outcome attributes are actions that explicitly demonstrate mastery of the abilities specified in an outcome or outcome element.
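Each attribute is something an assessor can check for directly, so the element-to-attribute structure maps naturally onto a rubric. A sketch, restricted to the one element whose attributes survive in this transcript:

```python
# Element-to-attribute structure for Outcome 3e, trimmed to the attributes
# listed above; each attribute becomes one checklist row in a rubric.
outcome_3e = {
    "Problem identification": [
        "Describes the engineering problem to be solved",
        "Visualizes the problem through a sketch or diagram",
        "Outlines variables, constraints, resources, and given information",
        "Appraises the problem statement for objectivity, completeness, "
        "relevance, and validity",
    ],
}

for element, attributes in outcome_3e.items():
    print(element)
    for attribute in attributes:
        print(f"  [ ] {attribute}")
```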

30-44 (Image-only slides; no text captured in the transcript.)

45 Course learning objectives (CLOs) are instructional objectives: statements of observable student actions that serve as evidence of the knowledge, skills, and attitudes acquired in a course.
Examples: The students will be able to:
- Solve a second-order ordinary differential equation with specified initial conditions using MATLAB
- Design and carry out an experiment to measure a tensile strength and determine a 95% confidence interval for its true value
- Define the four stages of team functioning and outline the responsibilities of a team coordinator, recorder, checker, and process monitor

46 Course learning objectives (CLOs) are instructional objectives: statements of observable student actions that serve as evidence of the knowledge, skills, and attitudes acquired in a course.
Learning objectives should begin with observable action words (such as explain, outline, calculate, model, design, and evaluate) and should be as specific as possible, so that an observer would have no trouble determining whether and how well students have accomplished the specified task. Words like "know," "learn," "understand," and "appreciate" may be suitable for use in educational objectives or program or course outcomes, but not in learning objectives.
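A minimal sketch of this guidance as a check: flag objectives whose leading verb is one of the non-observable words called out above. Both word lists are illustrative, not exhaustive:

```python
# Illustrative word lists drawn from the guidance above; neither is exhaustive.
OBSERVABLE = {"explain", "outline", "calculate", "model", "design",
              "evaluate", "solve", "define"}
NON_OBSERVABLE = {"know", "learn", "understand", "appreciate"}

def check_objective(objective: str) -> str:
    verb = objective.split()[0].lower()
    if verb in NON_OBSERVABLE:
        return f"rewrite: '{verb}' is not observable"
    if verb in OBSERVABLE:
        return "ok"
    return "review: leading verb not recognized"

print(check_objective("Understand ordinary differential equations"))
# rewrite: 'understand' is not observable
print(check_objective("Solve a second-order ordinary differential equation"))
# ok
```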

47 IV- Student Exit Survey Questionnaire

48 Student Exit Survey Template
Jubail Industrial College is striving to monitor and improve the quality of its academic programs. Therefore, we would appreciate receiving your opinion about your program of study during the period you spent at the college. Your views and opinions are crucial for future improvements to the quality of the program and will be treated with confidentiality. We hope that your answers are both honest and constructive and will add value to the learning experience at JIC.
Please rate from 5 = “Strongly Agree” to 1 = “Strongly Disagree”

49 (Image-only slide; no text captured in the transcript.)

50 V- Alumni Survey Questionnaire

51 (Image-only slide; no text captured in the transcript.)

52 II. Surveying the Student Outcomes and Program Educational Outcomes
Please rate from 5 = “Strongly Agree” to 1 = “Strongly Disagree”
Student Outcomes
1. I have developed an ability to apply the knowledge, techniques, skills, and modern tools of the discipline to engineering technology activities.
2. I have developed an ability to apply knowledge of mathematics, science, engineering, and technology to engineering technology problems that require practical knowledge.
3. I have developed an ability to conduct standard tests and measurements, and to conduct, analyze, and interpret experiments.
4. I have developed an ability to function effectively as a member of a technical team.
5. I have developed an ability to identify, analyze, and solve simple engineering technology problems.
6. I have developed an understanding of the need for, and an ability to engage in, self-directed continuing professional development.
7. I have developed an ability to apply written, oral, and graphical communication in both technical and nontechnical environments.
8. I have developed an ability to identify and use appropriate technical literature.
9. I have developed an understanding of and a commitment to address professional and ethical responsibilities, including a respect for diversity.
10. I have developed a commitment to quality, timeliness, and continuous improvement.
11. I learned the skills needed to effectively locate, retrieve, and evaluate information.
Program Educational Outcomes
12. The ELET program has provided me with an adequate background to practice my profession as an Electrical Technician with confidence.
13. The ELET program has provided me with adequate training for improving my personal skills (e.g., teamwork, leadership, oral and written communication skills) in the workplace.
14. The ELET program has provided me with adequate opportunities to help me understand and appreciate the importance of superior work ethics in the practice of my profession.
15. The ELET program has provided me with adequate opportunities to help me understand and appreciate the importance of good character in the practice of my profession.
16. The ELET program has provided me with an adequate ability and motivation to continuously improve my technical skills.
17. The ELET program has helped me in achieving my current level of success.
18. The ELET program has provided me with an adequate background that I can build on to continue studies for the BS degree.
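When responses to these items come in, each item's 1-5 ratings are typically averaged and compared against a target. A sketch with invented ratings and an assumed 4.0 target (the 4.0 figure echoes the earlier performance-target examples; it is not stated for this survey):

```python
# Invented 1-5 Likert ratings for two of the items above; the 4.0 target is
# an assumption borrowed from the earlier performance-target examples.
responses = {
    1: [5, 4, 4, 5, 3],  # item 1: apply knowledge, techniques, skills, tools
    5: [3, 4, 3, 3, 4],  # item 5: identify, analyze, solve problems
}

TARGET = 4.0
for item, ratings in responses.items():
    mean = sum(ratings) / len(ratings)
    flag = "" if mean >= TARGET else "  <- below target"
    print(f"Item {item}: mean {mean:.1f}{flag}")
# Item 1: mean 4.2
# Item 5: mean 3.4  <- below target
```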

53 Comments
- Which staff or faculty member had the greatest positive impact on you, professionally and/or personally?
- Please list up to three major strengths of the EE program.
- Please list up to three areas for improvement in the EE program.
- Please write below any additional comments concerning the EE program.

54 VI- Employers Survey Questionnaire

55 (Image-only slide; no text captured in the transcript.)

