Use of Surveys N J Rao and K Rajanikanth

A Recap

Evaluation of attainment of POs and PSOs is based on Direct and Indirect Methods.

Direct Methods: The performance of students in different assessments (Internal, University) → Evaluation of attainment of COs → Evaluation of attainment of POs and PSOs based on the mappings from COs to POs and PSOs.

Indirect Methods: Program Exit Surveys, Alumni Surveys, and Employer Surveys are used to evaluate the attainment of POs and PSOs.

Attainment of POs and PSOs

Evaluations of attainment of POs and PSOs based on Direct and Indirect Methods are combined to arrive at the Final Evaluation.

Example: PO 5 (Modern Tool Usage)
– Evaluation based on Direct Methods: Level 2

Attainment of POs and PSOs (2)

– Evaluation based on Indirect Methods (3 Surveys): 2.67
– Combined Evaluation: (w1 x 2) + (w2 x 2.67)

The weights w1 and w2 need to be decided by the Institute. Typical values are 0.8 and 0.2 respectively. With these values, the combined value is (0.8 x 2) + (0.2 x 2.67) = 2.13 (between Level 2 and Level 3).
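The weighted combination above can be sketched in a few lines of Python. The function name and the default weights are illustrative only; the weights are the Institute's decision.

```python
# Sketch of combining Direct and Indirect evaluations of a PO/PSO.
# The 0.8 / 0.2 defaults are the "typical values" from the slide,
# not mandated ones; the function name is our own.
def combine_attainment(direct_level, indirect_avg, w1=0.8, w2=0.2):
    """Return the weighted combination w1 * direct + w2 * indirect."""
    return w1 * direct_level + w2 * indirect_avg

# PO 5 example: Direct = Level 2, Indirect average = 2.67
print(round(combine_attainment(2, 2.67), 2))  # 2.13
```

The result, 2.13, lies between Level 2 and Level 3, as stated above.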

Attainment of PEOs

Evaluation of attainment of PEOs is generally based only on Indirect Methods: Alumni Surveys and Employer Surveys are generally used. Thus the data from Surveys is used for evaluating the attainment of POs and PSOs as well as PEOs, but the actual responses useful for these two different purposes are not identical.

Program Exit Survey - 1

Personal Details:
– Name
– Duration at the Institute (From... To...)
– Program of Study
– Rural / Urban Background
– Placement Status
– Status in GATE / GRE /

(What follows are sample questions only.) Responses are on a scale of 1 (worst) to 5 (best) where relevant (other ranges are possible, of course).

Program Exit Survey - 2

– Level of comfort in working in groups
– Level of confidence in formulating imprecise real-world problems as formal engineering problems
– Opportunities provided for acquiring leadership skills
– Communication skills and interpersonal skills acquired during your stay in the Institute
– Nature of final-year project: (Research, Implementation, Fabrication, Purely theoretical, ...)

Program Exit Survey - 3

– Confidence in applying concepts of Mathematics and Computing in solving problems
– Usefulness of professional core courses during job interviews
– Availability and adequacy of modern tools in the laboratories
– Opportunities provided for working in multi-disciplinary project teams
– Usefulness of Mathematics, professional core courses and electives in competitive exams like GATE, GRE, etc.

Program Exit Survey - 4

– Level of understanding of the need to factor in sustainability, ethical, health, public safety, and environmental issues in the solutions developed by you
– Opportunities for working on real-life problems during the program
– Extent of opportunities available for applying project management principles in academic activities undertaken by you during the program
– Extent of usefulness of Basic Science and Engineering Science courses in problem solving

Program Exit Survey - 5

– New tools (outside the formal curriculum) learnt
– Extent of acquisition of critical analysis competency in solving complex engineering problems (PG?)
– Opportunities available for working on projects with research focus (PG?)
– Open suggestions for improving the quality of the program

Alumni Survey - 1

Personal Details:
– Name
– Duration at the Institute (From... To...)
– Program of Study
– Rural / Urban Background

Alumni Survey - 2

On a scale of 1 (worst) to 5 (best) where relevant (other ranges are possible). These are sample questions only:
– Current Position; Organization
– Initial Position; Organization
– Promotions, organizations in which you worked along with the period in each organization, rewards, awards, projects handled, etc.
– Publication of Research Papers, White Papers, etc.
– Level of comfort in working in groups – initially and at present

Alumni Survey - 3

– Enhancement of qualifications (higher degrees, certificate courses, etc.), knowledge, skills, etc. (workshops, training programs, etc.)
– Level of confidence and success in formulating imprecise real-world problems as formal engineering problems – initially and now
– Success in leadership roles (preparedness at program exit, success in on-site trainings, etc.)
– Communication skills (level of acquisition during the program, usefulness in the job, additional acquisitions during work, etc.)

Alumni Survey - 4

– Level of interpersonal skills
– Ease with modern tools
– Learning curve with new tools
– New tools learnt during the job

Alumni Survey - 5

– Your assessment of the need for professional ethics in work
– Comfort level with application of concepts of Mathematics, Engineering, ... in solving real problems
– Usefulness of professional core courses in your professional practice
– Relevance of professional electives to your profession so far

Alumni Survey - 6

– Ability to factor in sustainability, ethical, health, public safety, and environmental issues in the solutions developed by you
– Extent of application of project management principles in the projects handled/being handled by you
– Extent of usefulness of Basic Science and Engineering Science courses in understanding problems you have solved so far in your career
– Open suggestions for improving the quality of the Program

Employer Survey - 1

Organization Details: ...
Employee Details:
– Name
– Current Position
– Date of Joining the Organization
– Position at the time of joining

Employer Survey - 2

With respect to our graduates, please indicate your assessment on the following:
– Ability to work well in groups
– Publication of Research Papers, White Papers, etc.
– Level of confidence and success in formulating imprecise real-world problems as formal engineering problems
– Success in leadership roles
– Communication skills

Employer Survey - 3

– Interpersonal skills
– Ability to learn and use new and modern tools
– Ethical behavior
– Ability to factor in sustainability, ethical, health, public safety, and environmental issues in the solutions developed

Employer Survey - 4

– Extent of application of project management principles in the projects handled/being handled by him/her
– Extent of critical analysis competency exhibited in solving complex engineering problems
– Enthusiasm in participating in your CSR activities
– Any specific negative traits observed
– Open suggestions for improving the quality of our graduates

Using the Survey Data - 1

Using the survey data for evaluating the attainment of a PO, PSO, or PEO is the same. Example: PO 5 (Modern Tool Usage)

1. Identify the responses that are relevant to this PO from each survey. Example:
– "Rate the ability to learn and use new and modern tools" from the Employer Survey
– "New tools (outside the formal curriculum) learnt" from the Program Exit Survey, and so on...

Using the Survey Data - 2

2. With data from only one type of survey, find the average rating for one relevant question.
Example (cont'd): Using the Program Exit Survey, 50 people answered the example question given earlier: 6 rated 1 (low), 35 rated 4, and 9 rated 5. So, the average is (6 x 1 + 35 x 4 + 9 x 5) / 50 = 3.82.
3. Repeat for all other relevant questions from the same survey.
Example (cont'd): Assume there are 3 other relevant questions and their average ratings are 3.91, 4.15, and 4.88. The final average rating from this survey is (3.82 + 3.91 + 4.15 + 4.88) / 4 = 4.19.
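The averaging step above can be sketched as follows. The data layout (a mapping from rating to respondent count) and the function name are our assumptions, not part of the slides.

```python
# Average rating for one survey question, given rating -> respondent counts.
# A sketch; the dict layout is an assumption for illustration.
def average_rating(counts):
    """counts: dict mapping a rating (1-5) to the number of respondents."""
    total = sum(rating * n for rating, n in counts.items())
    return total / sum(counts.values())

# Slide example: 50 respondents; 6 rated 1, 35 rated 4, 9 rated 5.
print(average_rating({1: 6, 4: 35, 5: 9}))  # 3.82
```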

Using the Survey Data - 3

5. Set target levels of attainment.
6. Example: Average value from a Survey is
– < 3 → Level 1
– ≥ 3 and < 4 → Level 2
– ≥ 4 → Level 3
(Other ranges are possible; discuss in the department and record the justifications for setting the target levels the way they are set.)
7. So, the attainment of PO 5 from the survey under consideration is 4.19 → Level 3.
8. Repeat with other types of Surveys if relevant.
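The quantization rule above can be written as a small helper. The cut-offs are the department's choice; the defaults here simply mirror the example ranges.

```python
# Quantize an average survey rating into an attainment level.
# Defaults mirror the slide's example: < 3 -> Level 1,
# 3 to < 4 -> Level 2, >= 4 -> Level 3. Cut-offs are configurable
# because each department sets (and justifies) its own targets.
def to_level(avg, low=3.0, high=4.0):
    if avg < low:
        return 1
    if avg < high:
        return 2
    return 3

print(to_level(4.19))  # 3 (Level 3, as in the PO 5 example)
```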

Using the Survey Data - 4

9. Compute the grand average as the Final Value of Attainment of this PO.
Example: Attainment of PO 5
– From Program Exit Survey: Level 3
– From Alumni Survey: Level 3
– From Employer Survey: Level 2
– Final Value: (3 + 3 + 2) / 3 = 2.67
10. Repeat this for each PO, PSO, and PEO.

Surveys useful for POs and PSOs: Program Exit Survey, Alumni Survey, Employer Survey. Surveys useful for PEOs: Alumni Survey, Employer Survey.

Using the Survey Data - 5

Alternative approach for combining results from different surveys:
– Previous approach: the result of each survey was immediately quantized into one of the 3 levels.
– Alternatively: we can retain the average value computed for each survey (without quantizing), find the grand average value from all the relevant surveys, and then quantize.

Example: Attainment of PO 5. Values from Program Exit Survey, Alumni Survey, and Employer Survey are respectively 4.19, 4.32, and 3.79 → Grand Average = 4.1 → Level 3
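The alternative approach can be sketched directly from the numbers above (a minimal sketch; variable names are ours):

```python
# Alternative approach: average raw survey values first, quantize last.
# Values and cut-offs are taken from the slides above.
survey_averages = [4.19, 4.32, 3.79]  # Exit, Alumni, Employer surveys
grand = sum(survey_averages) / len(survey_averages)

# Same cut-offs as before: < 3 -> 1, 3 to < 4 -> 2, >= 4 -> 3.
level = 1 if grand < 3 else (2 if grand < 4 else 3)
print(round(grand, 1), level)  # 4.1 3
```

Note that this yields Level 3, whereas quantizing each survey first gave 2.67; the two approaches can legitimately disagree near a cut-off.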

Using the Survey Data: Exercise

Course Surveys

Course Surveys: Mid-Course; Course-End. Written / Electronic; Signed / Anonymous.

Mid-Course Survey:
– Typically, about a month after the start of the course; can be repeated after another month
– Useful for corrections in course delivery

Course-End Survey:
– At the end of the course
– Useful for "closing the quality loop"
– May be used in computing course attainment, though the manual does not explicitly recognize this approach

Mid-Course Survey - 1

Helpful for mid-course corrections. Typical questions to be answered by all the students (on a scale of 1 to 5 – most negative to most positive response):
– COs are clear
– Pace of coverage is comfortable

Mid-Course Survey - 2 – Instruction is aligned to COs – Questions are encouraged – Good access to learning resources – Examples are worked out well – Good communication skills (of Faculty) – Supportive attitude (of Faculty)

Course-End Survey - 1

Helpful for "closing the loop"; can be used in computing attainments of COs. Questions generally cover:
– Course Management
– Learning Environment
– Attainment of COs
– Instructor characteristics

Course-End Survey - 2

Typical questions to be answered by all the students (on a scale of 1 to 5 – most negative to most positive response):
– COs were clear
– Instructional activities helped in attaining COs
– Pace of coverage was comfortable

Course-End Survey - 3

– Questions were encouraged
– Had good access to learning resources
– Examples were worked out well and were also useful for Examinations
– Instructor had good communication skills
– Instructor's attitude was supportive
– How much did you learn?
– Any specific CO(s) that you are not confident of? (Tick them in the list below)
– The course helped you in improving your problem-solving abilities

Using the Survey Data

Find the average rating for one relevant question.
Example: For a question related to CO3, of the 65 answers: 6 rated 1 (low), 54 rated 4, and 5 rated 5. So, the average is (6 x 1 + 54 x 4 + 5 x 5) / 65 = 3.8. It corresponds to (as per our own settings) Level 2 (medium).

Repeat for all other relevant questions. The final attainment of that CO is the average of all these values. This process is repeated for all the COs.

Combining Direct & Indirect Evaluations

The attainment levels obtained by direct methods and the course-end survey can be combined to get the final level of attainment. The relative weights need to be decided upon (90% and 10%, to 80% and 20%?).

Example: CO2
– Direct method (University Examination + Internal Assessment): 1.9
– Based on Course-End Survey: 2
– Final Value: (0.9 x 1.9) + (0.1 x 2) = 1.91
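The CO2 example can be checked with the same weighted-average pattern used for POs. The 0.9 / 0.1 split mirrors the example; the function name and the weight split are illustrative, since the department decides the actual weights.

```python
# Combine direct-method and course-end-survey CO attainments.
# w_direct = 0.9 matches the slide's example; the split is the
# department's decision (anywhere from 90/10 to 80/20 is suggested).
def final_co_attainment(direct, survey, w_direct=0.9):
    """Weighted average of direct and survey-based CO attainment."""
    return w_direct * direct + (1 - w_direct) * survey

print(round(final_co_attainment(1.9, 2), 2))  # 1.91
```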