Progress in Graduate Attribute Assessment in Mechanical Engineering at McMaster University
M.F. Lightstone
July 2014


Organizational Structure

Faculty of Engineering Graduate Attributes Committee
– Formed in early 2010 to develop the strategy and guidelines for graduate attribute assessment
– One member from each department
– Fall 2010: a set of 'indicators' for each attribute was developed

Mechanical Engineering Departmental Graduate Attributes Committee
– Formed in early 2011
– Mandate: assist faculty with curriculum mapping, measurement of indicators, continuous improvement, rubric development, and documentation of measurements

Mechanical Engineering Progress – Early 2011

Curriculum mapping of indicators was performed based on the 2010/11 calendar. For each course and indicator, the instructor assessed:
0 – indicator not covered
1 – mentioned
2 – taught and graded
3 – significant part of the course

The mapping revealed areas for continuous improvement:
– Technical communications
– Professionalism
– Ethics
– Teamwork
– Conflict resolution
– Sustainability
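The gap analysis behind this 0–3 mapping scale can be sketched in a few lines of code. The snippet below is an illustrative sketch only: the course names, indicator names, and scores are invented for the example, and the threshold of 2 ("taught and graded") as the bar for adequate coverage is an assumption based on the scale above, not the department's documented rule.

```python
# Hypothetical curriculum-mapping gap analysis: each (course, indicator)
# pair carries the instructor's 0-3 score; an indicator is a "gap" if
# no core course reaches the chosen threshold. All data here is
# illustrative, not actual McMaster mapping results.
coverage = {
    "MECH ENG 2W04": {"design": 1, "communication": 0, "ethics": 0},
    "MECH ENG 3F04": {"design": 2, "communication": 1, "ethics": 0},
    "MECH ENG 4M06": {"design": 3, "communication": 2, "ethics": 1},
}

def find_gaps(coverage, threshold=2):
    """Return indicators that no course covers at or above `threshold`."""
    indicators = {ind for scores in coverage.values() for ind in scores}
    return sorted(
        ind for ind in indicators
        if max(scores.get(ind, 0) for scores in coverage.values()) < threshold
    )

print(find_gaps(coverage))  # -> ['ethics']
```

With the sample data, only "ethics" never reaches level 2 anywhere in the curriculum, so it is flagged for continuous-improvement action.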

Mechanical Engineering Progress: Continuous Improvement (2011/12)

Major changes made to the curriculum in 2011/12:
– New course MECH ENG 2A03 ("Design Communication"): includes a module on technical communication. Also added AUs for CEAB input-based assessment.
– ENGINEER 4A03 ("Engineering & Social Responsibility"): includes professionalism and ethics. Now a required course.

Mechanical Engineering Progress: Continuous Improvement (2011/12)

The capstone design course (MECH ENG 4M06) incorporated additional lectures on:
– Professionalism and ethics (Ross Judd)
– Library research (Andrew Colgoni, McMaster Library)
– Teamwork, conflict resolution, and emotional intelligence (Sonia Hawrylyshyn, Manager, Employee Career Services, Human Resources, McMaster)
– Success in the workplace (Dr. John Mackinnon, Vice-President Engineering, AMEC-NSS)

Students were tested on the lecture content.

Measurement – 2011/12 – Trial Year

MECH ENG 4M06 – Capstone design:
– Introduced rubrics for assessment of presentations (two per year) and the final report (April 2012).
– Rubrics were carefully written to incorporate the 'indicators' associated with: engineering design, communication, sustainability, economics and project management.
– Created a web-based survey to assess indicators associated with "individual and team work".

Measurement (2011/12 – Trial), continued

Knowledge base for engineering (indicator: "competence in engineering fundamentals"). Measured in core courses spanning years 2–4:
– MECH ENG 4V03 "Thermo-fluids Sys. Des." (fall 2011)
– MECH ENG 3F04 "Numerical Methods" (fall 2011)
– MECH ENG 2W04 "Thermodynamics" (winter 2012)
– MECH ENG 4R03 "Control Systems" (winter 2012)

Measurement method: performance on specific questions on tests and exams.
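Measuring an indicator from tagged exam questions typically amounts to binning each student's mark on the flagged question(s) into performance levels. The sketch below is a hypothetical illustration: the four level names and the percentage cut-offs are assumptions chosen for the example, not the department's actual rubric, and the marks are invented.

```python
# Illustrative sketch: classify each student's score on a tagged exam
# question into performance levels. The level names and cut-offs
# (50%, 60%, 80%) are assumed for this example, not McMaster's scheme.
def bin_scores(scores, max_mark):
    """Count how many scores (out of max_mark) fall into each level."""
    levels = {"below": 0, "marginal": 0, "meets": 0, "exceeds": 0}
    for s in scores:
        pct = s / max_mark
        if pct < 0.5:
            levels["below"] += 1
        elif pct < 0.6:
            levels["marginal"] += 1
        elif pct < 0.8:
            levels["meets"] += 1
        else:
            levels["exceeds"] += 1
    return levels

# Hypothetical marks on one exam question graded out of 10
marks = [3, 5, 6, 7, 8, 9, 9, 10, 4, 6]
print(bin_scores(marks, 10))
```

The resulting counts per level are what would feed into a course-level GA measurement report for the "competence in engineering fundamentals" indicator.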

2011/12 – What did we learn?

– Professors were unclear about how to do the measurements.
– Training faculty on how to actually do GA measurements was critical.
– We needed a step-by-step method that would remove the "fuzziness" and also provide some background on the "jargon" used.

Workshop on GA Measurement – April 2012

Ken, Carlos, and Marilyn developed a faculty-wide workshop on how to actually do GA measurement. It includes:
– Background on graduate attributes, indicators, learning outcomes, Bloom's taxonomy, …
– How to create a rubric
– How to do the measurement: the logistics, with examples
– What to include in the final report
– The importance of continuous improvement

The first workshop was given on April 25, 2012, and has been repeated numerous times, both at the Faculty of Engineering level and for individual departments.

At the Department Level

Measurements are essentially done at the departmental level and organized by the departmental GA committee (with some guidance from the Faculty-level committee). In Mechanical Engineering, measurements are done for all CORE courses. Key point: ALL professors are expected to participate!

Departments are responsible for:
– Developing a "Measurement Plan" for each academic year
– Reviewing the quality of the report that each professor submits (a checklist is provided)
– Creating a "year-end" report that integrates the measurements taken from individual courses

Sharing documentation and processes between departments is important. To organize all the material, Marilyn developed a website that facilitates sharing of key documents and processes.

Summary of Timeline

Fall 2010 – Developed indicators (Faculty level)
Early 2011 – Indicator mapping (Department level)
2011/12 – Trial Year
– Addressed gaps in the mapping (Department level)
– Trial measurements in a few courses (Department level)
– GA measurement workshop (Faculty level, April 2012)
2012/13 – Nearly Real Measurements
– Measured indicators for 6 attributes in all core courses
– Instructors were still learning, so not all reports were of the same quality
– GA measurement workshop given again, plus presentations by each department (May 2013)
– Summer 2013: the Faculty streamlined the indicator list and the department did a remapping
– Note that Marilyn was on sabbatical that year
2013/14 – Real Measurements
– Measured indicators for the remaining attributes in core courses (plus "Knowledge Base", measured every year)
– Help from Minha on teamwork and communication
2014/15 – CEAB Evaluation Year
– Bring "stakeholders" into the departmental GA committee
– Prepare for the site visit in the fall

Important Points

– Provide clear training to professors on how to do the measurements.
– Sharing documentation, rubrics, and processes between departments is critical.
– Document measurement results and store the documents centrally.
– Check for continuous improvement (have professors made the changes identified in the previous measurement?).

Documents that U. Guelph might want to see:
– GA measurement workshop presentation
– Checklist for GA reports
– Sample measurement report
– Sample 'checklist' for reviewing GA reports
– GA suggested-management-structure document by Marilyn