Graduate attribute assessment as a COURSE INSTRUCTOR Brian Frank and Jake Kaupp CEEA Workshop W2-1B.

Presentation transcript:

Graduate attribute assessment as a COURSE INSTRUCTOR Brian Frank and Jake Kaupp CEEA Workshop W2-1B

WHY?

Course instructor goals within CEAB program improvement processes:
- Develop a sustainable process to evaluate performance against expectations
- Facilitate a long-term collaboration with colleagues

CEAB requirements include:
a) Indicators that describe specific abilities expected of students
b) A mapping of where attributes are developed and assessed within the program
c) A description of the assessment tools used to measure student performance (reports, exams, oral presentations, ...)
d) An evaluation of measured student performance relative to program expectations
e) A description of the program improvement resulting from the process

Graduate attributes required:
1. Knowledge base for engineering
2. Problem analysis
3. Investigation
4. Design
5. Use of engineering tools
6. Individual and team work
7. Communication skills
8. Professionalism
9. Impact on society and environment
10. Ethics and equity
11. Economics and project management
12. Lifelong learning

Program improvement cycle: program objectives and indicators → mapping the curriculum → collecting data → analyzing and interpreting → curriculum and process improvement. Guiding question: what do you want to know about the program? Course involvement feeds each stage of this cycle.

Within a course, learning outcomes are linked to learning & teaching activities (to meet outcomes) and to assessment (to assess outcomes). John Biggs (1999): What the Student Does: teaching for enhanced learning, Higher Education Research & Development, 18:1, 57-75.

Course level: learning outcomes are linked to learning & teaching activities (to meet outcomes) and to assessment (to assess outcomes). Program level: the program's indicators and its special features and questions shape the course outcomes, and course assessment supplies the program's data.

WHAT WORKS to improve learning? 800 meta-analyses; 50,000+ studies; 250+ million students. Hattie, J. (2009). The Black Box of Tertiary Assessment: An Impending Revolution. In L. H. Meyer, S. Davidson, H. Anderson, R. Fletcher, P. M. Johnston, & M. Rees (Eds.), Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research. Wellington, New Zealand: Ako Aotearoa.

"When teachers claim that they are having a positive effect on achievement or when a policy improves achievement this is almost a trivial claim: virtually everything works. One only needs a pulse and we can improve achievement." (J. Hattie, 2009). Hattie, J. (2009). The Black Box of Tertiary Assessment: An Impending Revolution. In L. H. Meyer, S. Davidson, H. Anderson, R. Fletcher, P. M. Johnston, & M. Rees (Eds.), Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research. Wellington, New Zealand: Ako Aotearoa.

Mapping indicators to a course

Two models: course outcomes mapped to the program's indicators, OR the program's indicators adopted directly as course outcomes.

Assume: indicators have been mapped to courses.

Attribute | Indicator | Code | (D)evelop/(A)ssess | Course
Knowledge base | Create mathematical descriptions or expressions to model a real-world problem | 3.01-FY1 | D, A | APSC-171
Knowledge base | Select and describe appropriate tools to solve mathematical problems that arise from modeling a real-world problem | 3.01-FY2 | D, A | APSC-171
Knowledge base | Use solution to mathematical problems to inform the real-world problem that gave rise to it | 3.01-FY3 | D, A | APSC-171
Problem analysis | Identifies known and unknown information, uncertainties, and biases when presented a complex ill-structured problem | 3.02-FY1 | D, A | APSC-100
Problem analysis | Creates process for solving problem including justified approximations and assumptions | 3.02-FY2 | D, A | APSC-100
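As a rough illustration of how such a mapping can be kept in a form that is easy to query, here is a minimal Python sketch (not the presenters' tooling; the rows are taken from the example table above, and the record structure and field names are assumptions):

```python
# Minimal sketch: an indicator-to-course mapping stored as plain records.
# Rows follow the example table above; field names are illustrative only.
from collections import namedtuple

Mapping = namedtuple("Mapping", "attribute indicator code role course")

mappings = [
    Mapping("Knowledge base",
            "Create mathematical descriptions or expressions to model a real-world problem",
            "3.01-FY1", "D,A", "APSC-171"),
    Mapping("Problem analysis",
            "Identifies known and unknown information, uncertainties, and biases",
            "3.02-FY1", "D,A", "APSC-100"),
]

def courses_assessing(code, rows):
    """Courses in which the given indicator code is assessed ('A' in the role)."""
    return sorted({r.course for r in rows if r.code == code and "A" in r.role})

print(courses_assessing("3.02-FY1", mappings))  # ['APSC-100']
```

A structure like this also makes it easy to list, for each course, which indicators the instructor is expected to develop versus assess.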

Indicators in your course:
1. Applies prescribed process for solving complex problems (3.02-FY1)
2. Selects and applies appropriate quantitative model and analysis to solve problems (3.02-FY2)
3. Evaluates validity of results and model to describe limitations and quantify error (3.02-FY3)
4. Composes structured document following prescribed format using standard grammar and mechanics (3.07-FY1)
5. Analyzes quantitative data to reach supported conclusion with explicit uncertainty (3.03-FY1)

Develop and assess indicators to answer questions.

Course level: learning outcomes are linked to learning & teaching activities (to meet outcomes) and to assessment (to assess outcomes), drawing on the program's indicators and feeding the program's data. Tool: the course planning matrix.

APSC-100 (Engineering Practice I) course learning outcomes:
1. Applies prescribed process for solving complex problems (3.02-FY1)
2. Selects and applies appropriate quantitative model and analysis to solve problems (3.02-FY2)
3. Evaluates validity of results and model to describe limitations and quantify error (3.02-FY3)
4. Composes structured document using standard grammar and mechanics (3.07-FY1)
5. Analyzes quantitative data to reach supported conclusion with explicit uncertainty (3.03-FY1)

Week | Learning objectives | Instructional approach and content | Learning activity | Assessment
1 | 4, 5 | Lecture: motivation, course overview, models | Lecture: group activity to consider a model for the elevator failure problem | CLA/Cornell critical thinking pretest (CLO7)
2 | 1, 2, 3, 8 | Pre-studio: MATLAB online module 1; Lecture: complex problem solving, risk, hazard analysis | Lecture: group activity to develop a process for resolving the elevator failure problem; Pre-studio: MATLAB online readiness quiz (no grades) | MATLAB quiz #1; OHS online test (CLO6)
3 | 8, 9 | Pre-studio: MATLAB online module 2; Lecture: argumentation, brainstorming | Lecture: analyze past assignments for effective argument; MATLAB studio: importing data (problem #2) | MATLAB quiz #2
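One practical use of a planning matrix in this form is a quick consistency check: does every course learning outcome targeted during the term also show up in at least one assessment? A small sketch under that assumption follows; the week data are hypothetical placeholders, not the APSC-100 schedule.

```python
# Sketch: checking that every course learning outcome (CLO) targeted in some
# week of the planning matrix is also assessed somewhere in the term.
# The week data below are hypothetical placeholders, not the APSC-100 schedule.
weeks = {
    1: {"targets": {4, 5}, "assesses": {5, 7}},
    2: {"targets": {1, 2, 3}, "assesses": {1, 6}},
    3: {"targets": {2, 3}, "assesses": {2}},
}

targeted = set().union(*(w["targets"] for w in weeks.values()))
assessed = set().union(*(w["assesses"] for w in weeks.values()))

print("Targeted but never assessed:", sorted(targeted - assessed))  # [3, 4]
print("Assessed but never targeted:", sorted(assessed - targeted))  # [6, 7]
```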

Assessment measures & Teaching and learning activities

Assessment measures:
- Local written exam (e.g. question on final)
- Standardized written exam (e.g. Force Concept Inventory)
- Performance appraisal (e.g. lab skill assessment)
- Simulation (e.g. emergency simulation)
- Behavioural observation (e.g. team functioning)
- External examiner (e.g. reviewer on design projects)
- Oral exam (e.g. design project presentation)
- Focus group
- Surveys and questionnaires
- Oral interviews
- Portfolios (student-maintained material)
- Archival records (registrar's data, records, ...)

Teaching and learning activities:
- Design project
- Online module
- Lecture with embedded activities
- Laboratory investigation
- Problem-based learning
- Experiential (service learning, co-op)
- Computer simulation/animation
- Reciprocal teaching

BREAKOUT 1 DEVELOP A COURSE PLAN This presentation and sample indicators:

SCORING EFFICIENTLY AND RELIABLY

Course grading vs. outcomes assessment

Why not use grades to assess outcomes? A student transcript (Electric Circuits I, Electromagnetics I, Signals and Systems I, Electronics I, Electrical Engineering Laboratory, Engineering Communications, Engineering Economics, ..., Electrical Design Capstone) cannot answer questions such as:
- How well does the program prepare students to solve open-ended problems?
- Are students prepared to continue learning independently after graduation?
- Do students consider the social and environmental implications of their work?
- What can students do with knowledge (recall vs. evaluate)?
Course grades aggregate assessment of multiple objectives and provide little information for program improvement.

When assessing students, the scoring needs to be:
- Valid: the scores measure what they are supposed to measure
- Reliable: the results would be consistent when repeated with the same subjects under the same conditions (but with different graders)
Expectations must also be clear to students, colleagues, and external reviewers.
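Reliability can be checked directly by having two graders score the same sample of submissions and comparing their level assignments. A minimal sketch, with invented scores, computing percent agreement and Cohen's kappa by hand rather than with a statistics library:

```python
# Sketch: a reliability check between two graders who scored the same six
# submissions on a four-level rubric. Scores below are invented for illustration.
from collections import Counter

LEVELS = ["not demonstrated", "marginal", "meets", "exceeds"]

grader_a = ["meets", "meets", "marginal", "exceeds", "meets", "not demonstrated"]
grader_b = ["meets", "marginal", "marginal", "exceeds", "meets", "marginal"]

n = len(grader_a)
p_observed = sum(a == b for a, b in zip(grader_a, grader_b)) / n

# Chance agreement estimated from each grader's marginal level frequencies.
count_a, count_b = Counter(grader_a), Counter(grader_b)
p_expected = sum(count_a[level] * count_b[level] for level in LEVELS) / n**2

kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"Percent agreement: {p_observed:.2f}, Cohen's kappa: {kappa:.2f}")
```

Low agreement on a shared sample is a signal to revise descriptors or run a calibration session before full grading.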

RUBRICS:
- Reduce variation between graders (increase reliability)
- Describe clear expectations for both instructor and students (increase validity)
A rubric has dimensions (indicators) and a scale of levels of mastery, e.g. not demonstrated / marginal / meets expectations / exceeds expectations.

Dimensions (indicators) | Not demonstrated | Marginal | Meets expectations | Exceeds expectations
Indicator 1 | Descriptor 1a | Descriptor 1b | Descriptor 1c | Descriptor 1d
Indicator 2 | Descriptor 2a | Descriptor 2b | Descriptor 2c | Descriptor 2d
Indicator 3 | Descriptor 3a | Descriptor 3b | Descriptor 3c | Descriptor 3d
Particular levels on the scale can be identified as the threshold performance and the target performance.

ANALYTIC rubric for grading oral presentations (Assessing Academic Programs in Higher Education, Allen 2004). Each criterion is scored within the point range shown, and a Score column records the result.

Criterion | Below expectation | Satisfactory | Exemplary
Organization | No apparent organization. Evidence is not used to support assertions. (0-2) | The presentation has a focus and provides some evidence that supports conclusions. (3-5) | The presentation is carefully organized and provides convincing evidence to support conclusions. (6-8)
Content | The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled. (0-2) | The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic. (5-7) | The content is accurate and complete. Listeners are likely to gain new insights about the topic. (10-13)
Style | The speaker appears anxious and uncomfortable, and reads notes rather than speaks. Listeners are largely ignored. (0-2) | The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood. (3-6) | The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners. (7-9)

Criterion | 0-3 (not demonstrated) | 4 (marginal) | 5-6 (meets expectations) | 7-8 (outstanding) | Mark
Purpose and style | Unclear purpose, very hard to understand. | Challenging to understand; tone and style inappropriate for the audience. | Clear purpose is met. Formal tone and style appropriate to audience. | Professional tone and style. Authoritative and convincing. | /8
Coherence and format (sequence, transitions, formatting) | Poorly organized; rambling, lacks unity; inconsistent writing/formatting; many gaps or redundancies. | Organization sometimes unclear; significant gaps or redundancies, formatting problems; some wordy expressions, lacks transitions. | Organized, appropriate sections, uniformly and correctly formatted; little irrelevant information. | Focused, logically organized; skillful and varied transitions. Professionally formatted. No irrelevant information. | /8
Graphical communications | Figures and tables not related to text, don't contribute to report; difficult to follow. | Some figures and tables not discussed in text; figure/table captions missing; incomplete list of tables/figures. | Figures and tables referred to in text, captioned. Appropriate lists of figures/tables. | Figures and tables professionally formatted, integrated into text, complementing the text. | /8
Etc. ...
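Once a rubric like this is in use, recording the per-dimension marks separately (rather than only a total) preserves the indicator-level information. A small sketch, assuming the 0-8 bands above; the dimension names and marks are illustrative, not real data:

```python
# Sketch: recording per-dimension rubric marks (each out of 8, as in the rubric
# above) and reporting the level each mark falls into. Dimension names and the
# marks themselves are illustrative.
BANDS = [(0, 3, "not demonstrated"), (4, 4, "marginal"),
         (5, 6, "meets expectations"), (7, 8, "outstanding")]

def level_for(mark):
    """Return the rubric level label for a mark out of 8."""
    for low, high, label in BANDS:
        if low <= mark <= high:
            return label
    raise ValueError(f"mark {mark} is outside the 0-8 range")

report = {"Purpose and style": 6,
          "Coherence and format": 4,
          "Graphical communications": 7}

for dimension, mark in report.items():
    print(f"{dimension}: {mark}/8 ({level_for(mark)})")
print(f"Total: {sum(report.values())}/{8 * len(report)}")
```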

OBSERVABLE STATEMENTS OF PERFORMANCE ARE IMPORTANT

BREAKOUT 2 CREATE ONE DELIVERABLE AND RUBRIC FOR YOUR COURSE

… AND CONFERENCE PRESENTATIONS

Scale (level of mastery): Below Expectations (major errors or lack of depth; unacceptable quality) / Marginal (some significant errors or lack of depth; satisfactory quality) / Meets Expectations (appropriate depth, few errors; good quality) / Exemplary (exceptional depth and accuracy; outstanding quality)

(rows omitted)

Development and Analysis of Solution:
- Conceptualization: variety and quality of design solutions considered
- Data: appropriate tools used to collect, analyze, and present data
- Detailed Design: design decisions supported with appropriate justification
- Predictions: appropriate tools used to predict performance of final device

(rows omitted)

Level of mastery scale: Below Expectations (major errors or lack of depth; unacceptable quality) / Marginal (some significant errors or lack of depth; satisfactory quality) / Meets Expectations (appropriate depth, few errors; good quality) / Exemplary (exceptional depth and accuracy; outstanding quality)

Development and Analysis of Solution

Data: appropriate tools used to collect and analyze data. Descriptors, from lowest to highest level:
- No physical prototyping is used in the project.
- Physical prototyping tools are described but in very limited detail. There may be errors in the use of the tools.
- Physical prototyping tools are described but only limited detail is included.
- Appropriate tools for physical prototyping are selected and used correctly.
- Ideal tools for physical prototyping are selected and used correctly.

Detailed Design: design decisions supported with appropriate justification. Descriptors, from lowest to highest level:
- There is no evidence of the application of engineering knowledge.
- There is little evidence of the application of engineering knowledge.
- There is some evidence of the application of engineering knowledge.
- There is adequate evidence of the application of engineering knowledge.
- There is good evidence of the application of engineering knowledge.

Performance Predictions: appropriate tools used to predict performance. Descriptors, from lowest to highest level:
- Discrepancies between predictions and actual performance are not explained.
- Discrepancies are mentioned, but reasons for the discrepancies are not explained or are incorrect.
- Discrepancies in results are explained, but reasons for the discrepancies are incomplete.
- Discrepancies are explained. The accuracy and/or assumptions in the prediction are partially described.
- Discrepancies are well justified. The accuracy and assumptions in the prediction approaches are explained and considered.

Outcome | Not demonstrated | Marginal | Meets expectations | Exceeds expectations
3.01: Newtonian mechanics | remembers | understands | synthesizes | evaluates
3.02: Defines problem | remembers | analyzes | evaluates | creates
3.03: Designs investigation | remembers | understands | analyzes | creates

CALIBRATION FOR GRADERS

CASE STUDY: VALUE FOR INSTRUCTOR

Look for trends over a semester. (Engineering Graduate Attribute Development (EGAD) Project)
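A minimal sketch of what that can look like in practice: collect (deliverable, indicator, level) records over the term and compare the level distribution on successive deliverables. The records below are invented for illustration, not EGAD data:

```python
# Sketch: comparing the distribution of rubric levels for one indicator across
# successive deliverables in a term. Records are invented; in practice they
# would come from course gradebook exports.
from collections import Counter

records = [  # (deliverable, indicator code, level on a 1-4 scale)
    ("Report 1", "3.02-FY2", 2), ("Report 1", "3.02-FY2", 3), ("Report 1", "3.02-FY2", 2),
    ("Report 2", "3.02-FY2", 3), ("Report 2", "3.02-FY2", 3), ("Report 2", "3.02-FY2", 4),
]

by_deliverable = {}
for deliverable, code, level in records:
    by_deliverable.setdefault(deliverable, []).append(level)

for deliverable, levels in by_deliverable.items():
    mean = sum(levels) / len(levels)
    print(f"{deliverable}: mean level {mean:.2f}, distribution {dict(Counter(levels))}")
```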

What is "good" performance? Pitfalls to avoid when writing descriptors:
- Johnny B. "Good": what is "good" performance?
- Narrow: is the description applicable to all submissions?
- Out of alignment: is the descriptor aligned with the objective?
- Bloomin' complex: Bloom's taxonomy is not meant as a scale!

PROBLEMS YOU WILL FIND…

IT TAKES TIME

INITIALLY STUDENTS MAY NOT LOVE IT

SO… COLLABORATION IS IMPORTANT

CONTINUE COLLABORATION NETWORK AND SURVEY

Graduate attribute assessment as a COURSE INSTRUCTOR Brian Frank and Jake Kaupp CEEA Workshop W2-1B

MODELS FOR SUSTAINING CHANGE

HIGH IMPACT ACTIVITIES:
- First-year experiences
- Broad integrating themes
- Learning communities
- Writing-intensive courses
- Undergraduate research
- Diversity/global learning
- Collaborative projects
- Community-based learning
- Capstone courses
High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter, George D. Kuh, Washington, DC: AAC&U, 2008.

CONCEPTUAL FRAMEWORK John Biggs (1999): What the Student Does: teaching for enhanced learning, Higher Education Research & Development, 18:1, 57-75

ACTIVITIES FOR LEARNING

Educational approach | Learning
Lecture | Reception of content
Concept mapping | Structuring, overview
Tutorial | Elaboration, clarification
Field trip | Experiential knowledge, interest
Learning partners | Resolve differences, application
Project | Integration, self-management

John Biggs (1999): What the Student Does: teaching for enhanced learning, Higher Education Research & Development, 18:1, 57-75

Example: Knowledge assessment. A calculus instructor asked questions on the exam that specifically targeted three indicators for "Knowledge":
1. "Create mathematical descriptions or expressions to model a real-world problem"
2. "Select and describe appropriate tools to solve mathematical problems that arise from modeling a real-world problem"
3. "Use solution to mathematical problems to inform the real-world problem that gave rise to it"
(Engineering Graduate Attribute Development (EGAD) Project)

Example (cont'd): The student can create and/or select mathematical descriptions or expressions for simple real-world problems involving rates of change and processes of accumulation (overlaps with problem analysis). Context: calculating the intersection of two trajectories. (Engineering Graduate Attribute Development (EGAD) Project)

CHECKLIST FOR INDICATORS