“Strategies for Harnessing Information Technology to Facilitate Institutional Assessment”


“Strategies for Harnessing Information Technology to Facilitate Institutional Assessment”
8th Improving Student Learning Symposium, “Improving Student Learning Strategically”
September 5, 2000
Gloria M. Rogers, Ph.D.
Institutional Research, Planning, and Assessment
Rose-Hulman Institute of Technology, Terre Haute, Indiana, USA
gloria.rogers@rose-hulman.edu

Overview
- Use of models to guide institutional strategies for improving student learning
- Assessing student learning
- Best practices for student assessment
- Brief history of RHIT process
- Assessment model/taxonomy
- A case study - demonstration
- Benefits to teaching/learning
- Assessment method truisms
- Barriers to faculty involvement
- Advice from the field

- Use of Principles of Best Practice for Assessment of Student Learning in guiding development of an assessment “system”
- Value of using models to guide practice
- Recognition of local constraints (inputs and outcomes)

Rose-Hulman Institute of Technology
Terre Haute, Indiana, USA
- 1600+ undergraduate students
- B.S. degrees in engineering, science, and mathematics
- Median SAT scores 1350 (700M, 650V)
- 80%+ engineering students

Brief History
- Presidential Commission of faculty, staff, and students appointed in Spring of 1996 to develop a plan for the assessment of student outcomes
- Provide for continuous quality improvement
- Meet outcomes-based accreditation standards: regional (NCA) and program (ABET)

Accreditation Requirements: Assessment for Continuous Improvement
- Institutional Mission and Constituents
- Educational Goals & Objectives
- Program Outcomes
- Measurable Performance Criteria
- Educational Practices/Strategies
- Assessment: Collection, Analysis of Evidence
- Evaluation: Interpretation of Evidence
- Feedback for Continuous Improvement

Taxonomy of Approaches to Assessment (Terenzini, JHE Nov/Dec 1989)
Three dimensions:
- Purpose of assessment (Why?): learning/teaching (formative) vs. accountability (summative)
- Level of assessment (Who?): individual vs. group
- Object of assessment (What?): knowledge, skills, attitudes & values, behavior

Examples by category:
- Placement: competency-based instruction, assessment-based curriculum, individual performance tests, advanced placement tests, vocational preference tests, other diagnostic tests
- “Gatekeeping”: admissions tests, rising junior exams, comprehensive exams, certification exams
- Campus and program evaluation: program reviews, retention studies, alumni studies, “value-added” studies
- Program enhancement: individual assessment results may be aggregated to serve program evaluation needs

Rose-Hulman’s Mission
To provide students with the world’s best undergraduate education in engineering, science, and mathematics in an environment of individual attention and support.

- Input: Recruit highly qualified students, faculty, and staff
- Quality: Provide an excellent learning environment
- Climate: Encourage the realization and recognition of the full potential of all campus community members
- Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
- Resources: Provide resource management & development that supports the academic mission

Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
- Ethics and professional responsibility
- Understanding of contemporary issues
- Role of professionals in the global society and ability to understand diverse cultural and humanistic traditions
- Teamwork
- Communication skills
- Skills and knowledge necessary for mathematical, scientific, and engineering practice
- Interpret graphical, numerical, and textual data
- Design and conduct experiments
- Design a product or process to satisfy a client's needs subject to constraints

Why portfolios?
- Authentic assessment
- Capture a wide variety of student work
- Involve students in their own assessment
- Professional development for faculty

Why “electronic” portfolios?
- Student-owned laptop computer program since 1995
- Classrooms, residence halls, common areas, library, fraternity houses all wired
- Access
- Efficient
- Cost effective
- Asynchronous assessment

RosE-Portfolio Structure (user roles and functions)
- Admin: user management, group management, system configuration, criteria tree, activity management
- Student: submit, review, search, dynamic resume, access control
- Faculty: submit, review, search, curriculum map, PTR portfolio
- Advisor: view advisee’s portfolio, search advisee’s portfolio
- Rater: rating sessions, inter-rater reliability, feedback, rating management
- Employer: view, search

Show Me!

Assessment of student material
- Faculty work in teams; each team assesses one learning objective
- Score holistically, using emerging rubrics:
  - Does the reflective statement indicate an understanding of the criterion?
  - Does the reflective statement demonstrate or argue for the relevance of the submitted material to the criterion?
  - Does the submitted material meet the requirements of the criterion at a level appropriate to a graduating senior at R-HIT?

Show Me!

Example of Results
- Understand criterion?
- Submission relevant to criterion?
- Meet standards for R-HIT graduate?

Example of Results: Does the submission meet the standards for a graduate of R-HIT?
- Appropriate for audience
- Organization
- Content factually correct
- Test audience response
- Grammatically correct

Linking Results to Practice
- Development of Curriculum Map
- Linking curriculum content/pedagogy to knowledge, practice, and demonstration of learning outcomes
Show Me!
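A curriculum map of the kind described can be represented as a simple course-to-outcome table and queried for coverage. The sketch below is illustrative only: the course codes and mapping are hypothetical, and the levels follow the knowledge/practice/demonstration distinction from the slide.

```python
# Hypothetical course-to-outcome mapping (course -> {outcome: level}).
curriculum_map = {
    "EM103": {"Communication": "practice", "Ethics": "knowledge"},
    "RH131": {"Communication": "demonstration"},
    "CM201": {"Ethics": "practice", "Communication": "knowledge"},
}

def coverage(cmap, outcome):
    """Count how many courses address an outcome, broken down by level."""
    counts = {}
    for course, outcomes in cmap.items():
        level = outcomes.get(outcome)
        if level:
            counts[level] = counts.get(level, 0) + 1
    return counts

print(coverage(curriculum_map, "Communication"))
print(coverage(curriculum_map, "Ethics"))
```

Aggregating these counts across all courses is what produces results like the Fall 1999-2000 summaries that follow: it shows at a glance whether an outcome such as Ethics is only ever introduced at the knowledge level and never practiced or demonstrated.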

Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Communication Skills

Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Ethics

Closing the loop (annual cycle)
- The Institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to department heads.
- The Evaluation Committee receives and evaluates all data, makes a report, and refers recommendations to appropriate areas.
- The Institute acts on the recommendations of the Evaluation Committee.
- Reports of actions taken by the Institute and the targeted areas are returned to the Evaluation Committee for iterative evaluation.

Primary focus
It is not about electronic portfolios. It is about:
- supporting teaching and learning
- faculty and student development
- the transformation of the teaching/learning environment

Benefits to teaching
- Faculty are asked to reflect on learning outcomes in relation to practice
- Consider the value of stated outcomes: the right ones? the right performance criteria?
- Individual faculty role in creating the context for learning
- Develop a common language and understanding of program/institutional outcomes
- Explicit accountability
- Promotes interdisciplinary discussions/collaborations

Benefits to learning
- Students review their own progress as it relates to expected learning.
- Portfolios provide a way for students to make learning visible and become the basis for conversations and other interactions among students and faculty.
- Learning is viewed as an integrated activity, not isolated courses.
- Students learn to value the contributions of out-of-class experiences.
- Student reflections are metacognitive as they appraise their own ways of knowing.
- Promotes a sense of personal ownership over one’s accomplishments.

Assessment method truisms
- There will always be more than one way to measure any outcome.
- No single method is good for measuring a wide variety of different student abilities.
- There is a consistently inverse relationship between the quality of measurement methods and their expediency.
- Pilot testing matters: see whether a method is good for your program (students & faculty).

Barriers to implementation
- Faculty: current workload; lack of incentive to participate in the process (rewards); “what’s in it for me” (cost/benefits)
- Institutional/program leadership: lack of vision for the program/institutional assessment process (no existing, efficient models); cost/benefit unknown; difficulty of restructuring the reward system to facilitate faculty participation

Process deficiencies
- Lack of understanding of the dynamics of organizational change
- Absence of “tools” to facilitate collaborative work
Portfolio deficiencies
- Ill-defined purpose
- Lack of efficient ways to manage the portfolio process
- Systematic review of portfolio contents is ill-defined or non-existent
- Student and faculty roles not clear
- Portfolio process not integrated into the teaching/learning environment
Resource deficiencies
- Expertise in portfolio development
- Development of “authentic” portfolio

Advice from the field
- E=MC2
- You cannot do it all - prioritize
- All assessment questions are not equal
- One size does not fit all
- It’s okay to ask directions
- Take advantage of local resources
- Don’t wait until you have a “perfect” plan
- Decouple from faculty evaluation

DEMO Site http://www.rose-hulman.edu/ira/reps/