Presentation transcript:

Dr. Geri Cochran Director, Institutional Research, Assessment and Planning

• It’s all about what is important to you
  ◦ Identifying what is important: the values that guide what you are doing
  ◦ Using those values as a basis for evaluating what you are doing
  ◦ Taking what you have learned from that evaluation to improve what you are doing, in order to better achieve your values

Program assessment is a form of evaluation in which the participants in a program judge its effectiveness in achieving their values and then use what they have learned to improve that effectiveness.

Premise: Assessment is linked directly to value.

Propositions:
1. What we assess indicates what we value.
2. What we value should guide what we assess.

A form of evaluation in which the values of the participants in a program are made explicit as expectations for what should “come out” of their actions, and those actions are evaluated according to the extent to which they achieve the expected outcomes.

[Diagram: “Improving Programs” cycle: Program Outcomes, Program Criteria, Program Values]

• Focusing on the value of education shifts our attention from inputs to outcomes.
• What “comes out” of an educational experience must be directly or indirectly observable to be assessed.
• Program assessment belongs to the program; the purpose of outcomes assessment is to improve programs by encouraging evidence-based decision-making by the people in the program.

• Professional Readiness:
  ◦ Ability to work in the professional environment
  ◦ Ability to create a compelling portfolio to profile themselves for industry opportunities
  ◦ Time management and the ability to produce quality work in a specified time frame
  ◦ Ability to work within industry protocols to develop and execute projects

• Professional Communication:
  ◦ Ability to use the language specific to the area of design, production, or management used by working professionals
  ◦ Ability to use artistic vocabulary for collaboration
  ◦ Ability to communicate ideas both verbally and through presentation materials

• Creative Collaboration:
  ◦ Ability to work effectively with a team on projects
  ◦ Ability to respect working relationships
  ◦ Ability to follow set protocols according to industry standards

 Finance & Budgeting:  Ability to manage resources - time, materials, personnel, and facilities  Problem-Solving:  Ability to combine research and resources in solving production problems  Ability to problem-solve within resource constraints  Specialty Skills:  According to their specialty, ability to apply technical and artistic skills to elevate the effectiveness of the finished work.

• What evidence should be gathered for assessing outcomes?
• What are the sources of the evidence for the outcomes?
• How often is the evidence to be collected?

• Relatively direct (SACS preferred)
  ◦ Writing assignments
  ◦ Oral presentations
  ◦ Design projects
  ◦ Artistic compositions
  ◦ Essay exams
• Relatively indirect
  ◦ Surveys
  ◦ Internship reports
  ◦ Employer surveys

• Evidence should be meaningful: information that is appropriate for assessing a particular outcome.
• Evidence should be manageable: reasonable to attain and evaluate (in time, effort, and availability).

• List of outcomes
• Evidence to be collected
• Source of the evidence
• Frequency of collection of the evidence

| Outcomes | Evidence | Source | Frequency |
| --- | --- | --- | --- |
| Ability to work in a professional environment | Production evaluations, internship reports | Students, faculty, professionals in the field | Semester & annually |
| Ability to create a compelling portfolio | Selection of work in portfolio, sample of designs, etc. | Students | Annually |

• Is the evidence specific enough in describing the form of the evidence and the venue for collection?
• Does the plan rely mainly on direct evidence?
• Is the evidence meaningful for the particular outcome?
• Is the evidence manageable (reasonable to collect and evaluate)?

[Diagram: “Improving Programs” cycle, revisited: Program Outcomes, Program Criteria, Program Values]

• Goal: To use the evidence as a basis for judging the extent to which the program is meeting the members’ values for the program.

• Goal: To apply what has been learned in evaluating the program toward identifying actions to address areas of concern.

• As a result of your assessment, what changes, if any, have you implemented to address areas of concern (in the program or in the assessment of the program)?

• What outcomes are you planning to assess for the next reporting cycle?

• How can I help?

Geri Cochran