NMSU Pathways Workshop September 18-20, 2014 Design Thinking, Low Res Prototyping, Assessment and ABET Mashup.

Lisa Getzler-Linn, Director, Baker Institute for Entrepreneurship, Creativity and Innovation, Integrated Product Development Program
Integrating assessment of innovation, creativity, and entrepreneurial skills in the undergraduate engineering curriculum

Integrated Product Development Program (IPD): Authentic, experiential learning through projects with established companies, local entrepreneurs, and student entrepreneurs.
- More than 19 years of operation
- More than 250 industry sponsors
- Approximately 3,000 students in over 400 project teams
- 2014 project year: 32 teams, 210 students, 15 majors, 18 team advisors

Assessment of Student Performance in IPD
Assessment of student performance in an experiential, problem-based, multidisciplinary team project course can be direct and authentic, but it is a challenge: a large part of the learning is unstructured, and the body of knowledge students are expected to apply is variable. Tools used to assess a student's performance should represent all meaningful aspects of that performance and provide equitable grading standards. IPD has developed direct, authentic, and formative measurement tools that are tied directly to course learning objectives.

Student Performance Assessment Primer
- Direct measures: tools used to measure what a student can do.
- Indirect measures: tools used to measure what is perceived by the student.
- Authentic measures: tools used to measure an act that occurs in a real setting as opposed to a simulation.
- Performance criteria: the standards by which student performance is measured.
- Formative assessment: tools that measure attainment and provide feedback so the student may adjust, improve, or reiterate their work product, performance, or behavior.
- Summative assessment: tools that measure skill attainment or knowledge gained over a period of time, where the measurement is taken at the end of the process.

IPD Objectives:
- Design effective solutions to industry problems
- Demonstrate an understanding of technical entrepreneurship
- Participate in and lead an interdisciplinary product development team
- Effectively communicate through written, oral, and graphical presentations
- Develop engineering design solutions in a broad global context
- Address aesthetics and ergonomics issues in product development
- Develop a value statement for the product to be developed
- Design, create, and evaluate technical and financial feasibility studies
- Experience project management, including people and financial resources

How and What to Measure?
Lisa Getzler-Linn, Integrated Product Development Program; John B. Ochs, Integrated Product Development Program; Todd A. Watkins, College of Business & Economics
Can we measure a student's understanding of the underlying process, entrepreneurial mindset, use of higher-order skills, and willingness to immerse themselves in the product development/innovation journey? What "measurable moments" occur during the lifecycle of an IPD project? Which are the appropriate assessment tools for each measurable moment? Behaviors, attitudes, mindset? What about project artifacts?

IPD + Assessment = Rubrics
"A well articulated and publicly visible rubric can become the meeting ground that facilitates a shared understanding of what the students should know and be able to do" (Bush & Timms, 2000).
A rubric is:
- An assessment tool used to create a clear picture of what is expected of students in a given task or deliverable.
- Designed to both set expectations and measure the learning that has occurred.
- Used by IPD to directly measure the authentic learning that has occurred during the life of the team project, as well as to give formative feedback.
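To make the idea concrete, here is a minimal sketch, in Python, of how a rubric of the kind described above could be represented and scored. The criterion names, level descriptors, weights, and point scale are hypothetical illustrations, not IPD's actual rubric content.

```python
# Illustrative sketch of a rubric as a data structure.
# Criterion names, descriptors, and weights are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str               # what is being measured, tied to a learning objective
    weight: float           # relative importance within the rubric
    levels: dict[int, str]  # score -> descriptor visible to students

PRESENTATION_RUBRIC = [
    Criterion("Problem definition", 0.4, {
        1: "Problem restated from the sponsor brief with no evidence of discovery",
        3: "Customer and stakeholder needs identified with some supporting evidence",
        5: "Needs quantified and tied to specifications and constraints",
    }),
    Criterion("Communication", 0.6, {
        1: "Disorganized; claims not supported by project artifacts",
        3: "Clear structure; most claims supported by evidence",
        5: "Compelling narrative grounded in documented, authentic events",
    }),
]

def score(rubric: list[Criterion], ratings: dict[str, int]) -> float:
    """Weighted score on the rubric's point scale, given one rating per criterion."""
    total_weight = sum(c.weight for c in rubric)
    return sum(c.weight * ratings[c.name] for c in rubric) / total_weight

# Example: an advisor rates one team's Presentation #1.
print(score(PRESENTATION_RUBRIC, {"Problem definition": 3, "Communication": 5}))  # 4.2
```

Publishing the level descriptors is what makes the rubric "publicly visible": students can see in advance what evidence separates one score from the next.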

The IPD Toolset of Rubrics
Rubrics are used by all 18 team advisors to grade all 200 students across all 32 teams, and measure:
- Student performance by team: Midterm Presentation, Final Presentation, Written Reports, Posters and Briefs
- Individual student performance: Notebook, Contribution to Project
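As a sketch of how team-level and individual-level rubric scores might roll up into a single course grade, the following example uses the deliverables named above; the weights and the 0-5 scale are assumptions for illustration only, not IPD's actual grading policy.

```python
# Illustrative only: combining team-level and individual-level rubric scores.
# Deliverable names follow the slide above; weights and the 0-5 scale are assumed.

TEAM_WEIGHTS = {          # graded once per team, shared by all members
    "midterm_presentation": 0.15,
    "final_presentation": 0.20,
    "written_reports": 0.15,
    "posters_and_briefs": 0.10,
}
INDIVIDUAL_WEIGHTS = {    # graded per student
    "notebook": 0.20,
    "contribution_to_project": 0.20,
}

def course_grade(team_scores: dict[str, float],
                 individual_scores: dict[str, float]) -> float:
    """Weighted average of team and individual rubric scores (0-5 scale)."""
    grade = sum(w * team_scores[k] for k, w in TEAM_WEIGHTS.items())
    grade += sum(w * individual_scores[k] for k, w in INDIVIDUAL_WEIGHTS.items())
    return grade

# Example: one student on a team that scored well on its presentations.
team = {"midterm_presentation": 4.0, "final_presentation": 4.5,
        "written_reports": 3.5, "posters_and_briefs": 4.0}
individual = {"notebook": 3.0, "contribution_to_project": 4.5}
print(round(course_grade(team, individual), 2))  # 3.93
```

Keeping the team and individual components separate lets multiple advisors apply the same rubrics consistently while still differentiating individual contributions within a team.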

The IPD Toolset of Rubrics: activity - artifact - criteria
Spring semester, Month #1:
- Activity: Background and overview of industry, company, and competitive landscape. Problem definition, business opportunity, and technical contextualization of the problem, including customer and stakeholder identification and needs, plus current practices, specifications, and constraints.
- Artifact: Presentation #1, in which the team describes, discusses, and presents evidence of the above.
- Criteria: Rubric, a team measure for capturing the artifacts, experiences, and authentic moments when the students actually discovered these, and measuring the level to which they did so.

Presentation #1 – team’s first attempt

How and Why – not so much What
Spring semester, Month #2, Presentation #2:
- Generating concepts, then combining and selecting the one(s) that will solve the technical problem in a business context through innovative, appropriate means, and the process followed to do so.
- Technical analyses of concept(s) through modeling, simulation, and mock-up development to create a clear path toward recognizing parameters, performance characteristics, and user requirements.
- Tying the customer/stakeholder needs back to the concept selection process and quantifying those needs.

Presentation #2 – deeper dive

Record, Reflect, Reiterate
Spring semester, Individual Lab or Maker Notebook: This living document is used throughout the project both as a record of work done by the individual student as a member of the course/team/project and as a legal record of intellectual property if invention occurs. Reflection on the design process has been included as a metric that measures professional skills beyond those of a student in a course. Notebooks are collected three times per semester and the rubric is applied. One grade is given at the end for the overall document and the process followed.

Individual Notebook Rubric

Individual Contribution to Team

Project Artifacts
Presentations, Weekly Briefs, and Executive Summaries: Team artifacts with measures that capture the professional skills of graphical, oral, and written communication in the context of presenting evidence of the project's status. Through the ability to communicate the actual events that led to the discoveries and solutions to the problem, student learning is achieved and measured, covering both presentation skills and the authentic events that are documented.

Presentation Rubric

Program Assessment: Indirect
IPD uses surveys, interviews, and focus groups to gather student feedback on program efficacy for the purpose of continuous improvement. Areas covered are course objectives as related to project deliverables, students' understanding of their own capabilities as a result of the course, and student satisfaction with faculty, staff, process, and facilities. For sponsor satisfaction, surveys and individual interviews are conducted as an external evaluation of the above metrics. IPD provides assessment data, documents, and protocols to participating departments and colleges for accreditation purposes.

Design Your Assessment Tools: Focus
Higher-order skills such as creativity, innovation, communication, critical thinking, and design thinking can be measured.
- What are you measuring for? Attainment of knowledge? Application of techniques? Evidence of work accomplished?
- Which skills should be measured for grading purposes?
- Are there activities that indicate that learning has occurred?
- How should domain knowledge be measured?

Design Your Assessment Tools: Define
- Purpose of assessment: grading? learning outcomes? pre/post?
- Student performance, self-efficacy, or program efficacy?
- Direct or indirect?
- Formative or summative?
- Type of learning being measured: experiential, authentic?
- Multiple graders or one faculty member?
- Learning objectives? Criteria for each objective?

Thank You!
Lisa Getzler-Linn, Director
Baker Institute for Entrepreneurship, Creativity and Innovation
Integrated Product Development Program
Lehigh University, 11 E Packer Avenue, Bethlehem, PA