
1 “Strategies for Harnessing Information Technology to Facilitate Institutional Assessment”
8th Improving Student Learning Symposium: Improving Student Learning Strategically
September 5, 2000
Gloria M. Rogers, Ph.D.
Institutional Research, Planning, and Assessment
Rose-Hulman Institute of Technology, Terre Haute, Indiana, USA

2 Overview
Use of models to guide institutional strategies for improving student learning
Assessing student learning
Best practices for student assessment
Brief history of the RHIT process
Assessment model/taxonomy
A case study: demonstration
Benefits to teaching/learning
Assessment method truisms
Barriers to faculty involvement
Advice from the field

3 Use of Principles of Best Practice for Assessment of Student Learning in guiding development of assessment “system”
Value of using models to guide practice
Recognition of local constraints
(Diagram: inputs → outcomes)

4 Rose-Hulman Institute of Technology
Terre Haute, Indiana, USA
1600+ undergraduate students
B.S. degrees in engineering, science, and mathematics
Median SAT score 1350 (700 Math, 650 Verbal)
80%+ engineering students

5 Brief History
Presidential Commission of faculty, staff, and students appointed in Spring 1996 to develop a plan for the assessment of student outcomes:
Provide for continuous quality improvement
Meet outcomes-based accreditation standards: regional (NCA) and program (ABET)

6 Accreditation Requirements and Assessment for Continuous Improvement
(Flow diagram linking: institutional mission; constituents; educational goals & objectives; measurable performance criteria; educational practices/strategies; program outcomes; assessment (collection, analysis of evidence); evaluation (interpretation of evidence); feedback for continuous improvement)

7 Taxonomy of Approaches to Assessment (Terenzini, Journal of Higher Education, Nov/Dec 1989)
(Diagram: a three-dimensional taxonomy)
Purpose of assessment (Why?): learning/teaching (formative) vs. accountability (summative)
Object of assessment (What?): knowledge, skills, attitudes & values, behavior
Level of assessment (Who?): individual vs. group
Individual + formative: competency-based instruction, assessment-based curriculum, individual performance tests; placement (advanced placement tests, vocational preference tests, other diagnostic tests)
Individual + summative: “gatekeeping” (admissions tests, rising junior exams, comprehensive exams, certification exams)
Group + formative: program enhancement (individual assessment results may be aggregated to serve program evaluation needs)
Group + summative: campus and program evaluation (program reviews, retention studies, alumni studies, “value-added” studies)

8 Rose-Hulman’s Mission
To provide students with the world’s best undergraduate education in engineering, science, and mathematics in an environment of individual attention and support.

9 Input: Recruit highly qualified students, faculty, and staff
Quality: Provide an excellent learning environment
Climate: Encourage the realization and recognition of the full potential of all campus community members
Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
Resources: Provide resource management & development that supports the academic mission

10 Instill in our graduates skills appropriate to their professions and life-long learning
Outcomes:
Ethics and professional responsibility
Understanding of contemporary issues
Role of professionals in the global society and ability to understand diverse cultural and humanistic traditions
Teamwork
Communication skills
Skills and knowledge necessary for mathematical, scientific, and engineering practice
Interpret graphical, numerical, and textual data
Design and conduct experiments
Design a product or process to satisfy a client's needs subject to constraints

11 Why portfolios?
Authentic assessment
Capture a wide variety of student work
Involve students in their own assessment
Professional development for faculty

12 Why “electronic” portfolios?
Student-owned laptop computer program since 1995
Classrooms, residence halls, common areas, library, and fraternity houses all wired
Access
Efficient
Cost effective
Asynchronous assessment

13 RosE-Portfolio Structure
User roles and their functions:
Admin: user management, group management, system configuration, criteria tree, activity management
Student: submit, review, search, dynamic resume, access control
Faculty: submit, review, search, curriculum map, PTR portfolio
Advisor: view advisee’s portfolio, search advisee’s portfolio
Rater: rating sessions, rating management, inter-rater reliability, feedback
Employer: view, search
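To make the role/function matrix above concrete, here is a minimal sketch of how such an access model might be represented. The role and function names are taken from the slide; the data structure, naming, and check function are illustrative assumptions, not the actual RosE-Portfolio implementation.

```python
# Hypothetical sketch of the RosE-Portfolio role/function matrix.
# Role and action names come from the slide; everything else is assumed.

PERMISSIONS = {
    "admin":    {"user_management", "group_management", "system_configuration",
                 "criteria_tree", "activity_management"},
    "student":  {"submit", "review", "search", "dynamic_resume", "access_control"},
    "faculty":  {"submit", "review", "search", "curriculum_map", "ptr_portfolio"},
    "advisor":  {"view_advisee_portfolio", "search_advisee_portfolio"},
    "rater":    {"rating_sessions", "rating_management",
                 "inter_rater_reliability", "feedback"},
    "employer": {"view", "search"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

assert can("student", "submit")        # students submit portfolio material
assert not can("employer", "submit")   # employers may only view and search
```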

14 Show Me!

15 Assessment of student material
Faculty work in teams; each team assesses one learning objective
Score holistically, using emerging rubrics:
Does the reflective statement indicate an understanding of the criterion?
Does the reflective statement demonstrate or argue for the relevance of the submitted material to the criterion?
Does the submitted material meet the requirements of the criterion at a level appropriate to a graduating senior at R-HIT?
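A minimal sketch of how team ratings against the three rubric questions above might be recorded and summarized, including the inter-rater reliability the RosE-Portfolio tracks. The questions come from the slide; the record layout, yes/no scale, and simple percent-agreement measure are assumptions for illustration.

```python
# Hypothetical sketch of holistic portfolio rating against the three
# rubric questions on the slide. Layout and scale are assumed.

from dataclasses import dataclass

QUESTIONS = ("understands_criterion", "material_relevant", "meets_senior_level")

@dataclass
class Rating:
    submission_id: str
    rater: str
    scores: dict  # question -> True/False (holistic yes/no judgment)

def percent_yes(ratings, question):
    """Share of ratings answering 'yes' to one rubric question."""
    votes = [r.scores[question] for r in ratings]
    return sum(votes) / len(votes)

def percent_agreement(ratings_a, ratings_b, question):
    """Fraction of submissions on which two raters gave the same answer
    (paired by submission id) -- a crude inter-rater reliability check."""
    b_by_id = {r.submission_id: r for r in ratings_b}
    pairs = [(ra, b_by_id[ra.submission_id]) for ra in ratings_a
             if ra.submission_id in b_by_id]
    same = sum(ra.scores[question] == rb.scores[question] for ra, rb in pairs)
    return same / len(pairs)

a = [Rating("s1", "rater_a", {q: True for q in QUESTIONS})]
b = [Rating("s1", "rater_b", {q: True for q in QUESTIONS})]
print(percent_agreement(a, b, "meets_senior_level"))  # 1.0
```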

16 Show Me!

17 Example of Results (chart)
Understand criterion?
Submission relevant to criterion?
Meet standards for R-HIT graduate?

18 Example of Results (chart): Does submission meet the standards for a graduate of R-HIT?
Appropriate for audience
Organization
Content factually correct
Test audience response
Grammatically correct

19 Linking Results to Practice
Development of Curriculum Map
Linking curriculum content/pedagogy to knowledge, practice, and demonstration of learning outcomes
Show Me!
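A minimal sketch of a curriculum map like the one demonstrated here, assuming each course reports whether an outcome is addressed as knowledge (K), practice (P), or demonstration (D), the three levels named on the slide. Course codes, outcome names, and the sample data are hypothetical, not RHIT's actual map.

```python
# Hypothetical curriculum map: course -> {outcome -> level}, where the
# level is K (knowledge), P (practice), or D (demonstration).

CURRICULUM_MAP = {
    "ME301": {"communication": "P", "ethics": "K"},
    "CHEM1": {"communication": "K"},
    "EE250": {"ethics": "D", "communication": "D"},
    # ... one entry per course/lab (181 in Fall 1999-2000)
}

def coverage(outcome):
    """Count how many courses address an outcome at each level."""
    counts = {"K": 0, "P": 0, "D": 0}
    for levels in CURRICULUM_MAP.values():
        if outcome in levels:
            counts[levels[outcome]] += 1
    return counts

print(coverage("communication"))  # {'K': 1, 'P': 1, 'D': 1}
```

Aggregating these counts across all courses yields coverage charts like the Communication Skills and Ethics results on the next two slides.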

20 Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Communication Skills (chart)

21 Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Ethics (chart)

22 Closing the Loop
(Annual timeline: Winter, Spring, Summer, Fall)
The Evaluation Committee receives and evaluates all data; it makes a report and refers recommendations to the appropriate areas.
The Institute acts on the recommendations of the Evaluation Committee.
Reports of actions taken by the Institute and the targeted areas are returned to the Evaluation Committee for iterative evaluation.
The Institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to Department Heads.

23 Primary focus
It is not about electronic portfolios. It is about:
supporting teaching and learning
faculty and student development
the transformation of the teaching/learning environment

24 Benefits to teaching
Faculty are asked to reflect on learning outcomes in relation to practice
Consider the value of stated outcomes: the right ones? the right performance criteria?
Individual faculty role in creating the context for learning
Develop a common language and understanding of program/institutional outcomes
Explicit accountability
Promotes interdisciplinary discussions/collaborations

25 Benefits to learning
Students review their own progress as it relates to expected learning.
Portfolios provide a way for students to make learning visible and become the basis for conversations and other interactions among students and faculty.
Learning is viewed as an integrated activity, not isolated courses.
Students learn to value the contributions of out-of-class experiences.
Student reflections are metacognitive as they appraise their own ways of knowing.
Promotes a sense of personal ownership over one’s accomplishments.

26 Assessment method truisms
There will always be more than one way to measure any outcome
No single method is good for measuring a wide variety of different student abilities
There is a consistently inverse relationship between the quality of measurement methods and their expediency
Pilot testing is important to see whether a method is good for your program (students & faculty)

27 Barriers to implementation
Faculty:
current workload
lack of incentive to participate in the process (rewards)
“what’s in it for me” (cost/benefits)
Institutional/program leadership:
lack of vision for the program/institutional assessment process (no existing, efficient models)
cost/benefit unknown
difficulty of restructuring the reward system to facilitate faculty participation

28 Deficiencies
Process deficiencies:
lack of understanding of the dynamics of organizational change
absence of “tools” to facilitate collaborative work
Portfolio deficiencies:
ill-defined purpose
lack of efficient ways to manage the portfolio process
systematic review of portfolio contents is ill-defined or non-existent
student and faculty roles not clear
portfolio process not integrated into the teaching/learning environment
Resource deficiencies:
expertise in portfolio development
development of an “authentic” portfolio

29 Advice from the field (E=MC²)
You cannot do it all: prioritize
All assessment questions are not equal
One size does not fit all
It’s okay to ask directions
Take advantage of local resources
Don’t wait until you have a “perfect” plan
Decouple from faculty evaluation

30 DEMO Site

