NMSU Pathways Workshop September 18-20, 2014 Design Thinking, Low Res Prototyping, Assessment and ABET Mashup.


1 NMSU Pathways Workshop September 18-20, 2014 Design Thinking, Low Res Prototyping, Assessment and ABET Mashup

2 Integrating Assessment of Innovation, Creativity and Entrepreneurial Skills in the Undergraduate Engineering Curriculum
Lisa Getzler-Linn, Director, Baker Institute for Entrepreneurship, Creativity and Innovation; Integrated Product Development Program

3 Integrated Product Development Program (IPD)
Authentic, experiential learning through projects with established companies, local entrepreneurs and student entrepreneurs.
More than 19 years; more than 250 industry sponsors; roughly 3,000 students in over 400 project teams.
2014 project year: 32 teams, 210 students, 15 majors, 18 team advisors.

4 Assessment of Student Performance in IPD
Assessment of student performance in an experiential, problem-based, multidisciplinary team project course, where a large part of the learning is unstructured and the body of knowledge expected to be applied is variable, can be direct and authentic, but it is a challenge.
Tools used to assess a student's performance should represent all meaningful aspects of that performance and provide equitable grading standards.
IPD has developed direct, authentic and formative measurement tools that are tied directly to the course learning objectives.

5 Student Performance Assessment Primer
Direct measures: tools used to measure what a student can do.
Indirect measures: tools used to measure what is perceived by the student.
Authentic measures: tools used to measure an act that occurs in a real setting as opposed to a simulation.
Performance criteria: the standards by which student performance is measured.
Formative assessment: tools that measure attainment and provide feedback so the student may adjust, improve or reiterate their work product, performance or behavior.
Summative assessment: tools that measure skill attainment or knowledge gained during a period of time, where the measurement is taken at the end of the process.

6 IPD Objectives
Design effective solutions to industry problems.
Demonstrate an understanding of technical entrepreneurship.
Participate in and lead an interdisciplinary product development team.
Communicate effectively through written, oral and graphical presentations.
Develop engineering design solutions in a broad global context.
Address aesthetics and ergonomics issues in product development.
Develop a value statement for the product to be developed.
Design, create and evaluate technical and financial feasibility studies.
Experience project management, including people and financial resources.

7 How and What to Measure?
Lisa Getzler-Linn, Integrated Product Development Program; John B. Ochs, Integrated Product Development Program; Todd A. Watkins, College of Business & Economics
Can we measure a student's understanding of the underlying process, entrepreneurial mindset, use of higher-order skills, and willingness to immerse themselves in the product development/innovation journey?
What "measurable moments" occur during the lifecycle of an IPD project?
Which are the appropriate assessment tools for each measurable moment? Behaviors, attitudes, mindset? What about project artifacts?

8 IPD + Assessment = Rubrics
"A well articulated and publicly visible rubric can become the meeting ground that facilitates a shared understanding of what the students should know and be able to do" (Bush & Timms, 2000).
A rubric is:
An assessment tool used to create a clear picture of what is expected of students in a given task or deliverable.
Designed to both set expectations and measure the learning that has occurred.
Used by IPD to directly measure the authentic learning that has occurred during the life of the team project, as well as to give formative feedback.
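A rubric of this kind can be sketched as a simple weighted scoring structure. The criteria names, weights and four-level scale below are illustrative assumptions for the sketch, not the actual IPD rubrics:

```python
# Minimal sketch of a rubric as a weighted scoring structure.
# Criteria names, weights and the 1-4 level scale are illustrative
# assumptions, not taken from the actual IPD rubric set.

RUBRIC = {
    # criterion: (weight, what a top score looks like)
    "problem_definition":    (0.40, "problem, stakeholders and constraints fully evidenced"),
    "evidence_of_discovery": (0.35, "authentic discovery moments documented, not just results"),
    "communication":         (0.25, "clear written, oral and graphical presentation"),
}

def rubric_score(ratings):
    """Weighted score on the 1-4 scale; ratings maps criterion -> level (1-4)."""
    assert set(ratings) == set(RUBRIC), "every criterion must be rated"
    return sum(weight * ratings[name] for name, (weight, _) in RUBRIC.items())

print(rubric_score({"problem_definition": 3,
                    "evidence_of_discovery": 4,
                    "communication": 2}))  # 0.40*3 + 0.35*4 + 0.25*2 = 3.1
```

Publishing the weights and level descriptions alongside the scores is what makes the rubric both an expectation-setting device and a formative feedback channel.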

9 The IPD Toolset of Rubrics
Rubrics are used by all 18 team advisors to grade all 200 students across all 32 teams, and measure:
Student performance by team: Midterm Presentation, Final Presentation, Written Reports, Posters and Briefs.
Individual student performance: Notebook, Contribution to Project.
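With 18 advisors applying the same rubrics, one practical concern behind "equitable grading standards" is grader calibration. As an illustrative sketch (the advisor names and scores are invented), each advisor's mean rubric score can be compared against the cohort mean to flag drift:

```python
# Sketch of a grader-calibration check across advisors sharing one rubric.
# Advisor names, scores and the 0.5 drift threshold are invented examples.
from statistics import mean

scores_by_advisor = {
    "advisor_a": [3.1, 3.4, 2.9, 3.6],
    "advisor_b": [2.2, 2.5, 2.1, 2.4],  # consistently low: calibration flag
    "advisor_c": [3.0, 3.2, 3.3, 2.8],
}

# Mean over every score given by any advisor.
cohort_mean = mean(s for scores in scores_by_advisor.values() for s in scores)

for advisor, scores in scores_by_advisor.items():
    drift = mean(scores) - cohort_mean
    flag = "  <-- review calibration" if abs(drift) > 0.5 else ""
    print(f"{advisor}: mean={mean(scores):.2f}, drift={drift:+.2f}{flag}")
```

A check like this does not replace advisor norming sessions; it only surfaces which graders to talk to first.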

10 The IPD Toolset of Rubrics: Activity, Artifact, Criteria
Spring semester, Month #1.
Activity: background and overview of the industry, company and competitive landscape; problem definition, business opportunity and technical contextualization of the problem, including customer and stakeholder identification and needs, plus current practices, specifications and constraints.
Artifact: Presentation #1, in which the team describes, discusses and presents evidence of the above.
Criteria: a team-level rubric for capturing the artifacts, experiences and authentic moments when the students actually made these discoveries, and for measuring the level to which they did so.

11 Presentation #1 – team’s first attempt

12 How and Why – not so much What
Spring semester, Month #2, Presentation #2:
Generating concepts, then combining and selecting the one(s) that will solve the technical problem in a business context through innovative, appropriate means, and documenting the process followed to do so.
Technical analyses of the concept(s) through modeling, simulation and mock-up development to create a clear path toward recognizing parameters, performance characteristics and user requirements.
Tying the customer/stakeholder needs back to the concept selection process and quantifying those needs.

13 Presentation #2 – deeper dive

14 Record, Reflect, Reiterate
Spring semester, Individual Lab or Maker Notebook: this living document is used throughout the project both as a record of work done by the individual student as a member of the course/team/project and as a legal record of intellectual property if invention occurs.
Reflection on the design process is included as a metric that measures professional skills beyond those of a student in a course.
Notebooks are collected three times per semester and the rubric is applied. One grade is given at the end for the overall document and the process followed.

15 Individual Notebook Rubric

16 Individual Contribution to Team

17 Project Artifacts
Presentations, Weekly Briefs and Executive Summaries: team artifacts with measures to capture the professional skills of graphical, oral and written communication in the context of presenting evidence of the project's status.
Through the ability to communicate the actual events that led to the discoveries and solutions to the problem, student learning is both achieved and measured: the presentation skills themselves and the authentic events that are documented.

18 Presentation Rubric

19 Program Assessment: Indirect
IPD uses surveys, interviews and focus groups to gather student feedback on program efficacy, for the purpose of continuous improvement. Areas covered are course objectives as related to project deliverables, students' understanding of their own capabilities as a result of the course, and student satisfaction with faculty, staff, process and facilities.
For sponsor satisfaction, surveys and individual interviews are conducted as an external evaluation of the above metrics.
IPD provides assessment data, documents and protocols to participating departments and colleges for accreditation purposes.

20 Design Your Assessment Tools: Focus
Higher-order skills like creativity, innovation, communication, critical thinking and design thinking can be measured.
What are you measuring for? Attainment of knowledge? Application of techniques? Evidence of work accomplished?
Which skills should be measured for grading purposes?
Are there activities that indicate that learning has occurred?
How should domain knowledge be measured?

21 Design Your Assessment Tools: Define
Purpose of assessment: grading? learning outcomes? pre/post?
Student performance, self-efficacy or program efficacy?
Direct or indirect? Formative or summative?
Type of learning being measured: experiential? authentic?
Multiple graders or one faculty member?
Learning objectives? Criteria for each objective?

22 Thank You!
Lisa Getzler-Linn, Director
Baker Institute for Entrepreneurship, Creativity and Innovation
Integrated Product Development Program
Lehigh University, 11 E Packer Avenue, Bethlehem, PA 18015
lig4@lehigh.edu

