Dr. Geri Cochran Director, Institutional Research, Assessment and Planning

It's all about what is important to you:
• Identifying what is important, the values that guide what you are doing.
• Using those values as a basis for evaluating what you are doing.
• Taking what you have learned from that evaluation to improve what you are doing in order to better achieve your values.

Program assessment is a form of evaluation by which participants in a program judge its effectiveness in achieving their values and use what they have learned to improve the program’s effectiveness in achieving those values.

Premise: Assessment is linked directly to value.
Propositions:
1. What we assess indicates what we value.
2. What we value should guide what we assess.

A form of evaluation in which the participants' values for a program are made explicit as expectations for what should "come out" of their actions, and those actions are evaluated according to the extent to which they achieve the expected outcomes.

[Diagram: Improving Programs, Program Outcomes, Program Criteria, Program Values]

• Focusing on the value of education shifts our attention from inputs to outcomes.
• What "comes out" of an educational experience must be directly or indirectly observable to be assessed.
• Program assessment belongs to the program; the purpose of outcomes assessment is to improve programs by encouraging evidence-based decision-making by people in the program.

Professional Readiness: Students will be ready for the professional environment by demonstrating versatility and a professional disposition through their knowledge of current and viable performance-based techniques.

Artistry: Students will be able to synthesize their knowledge of technical skills, creative imagination, and aesthetic knowledge in preparation for performance, emphasizing the elements of musicality, phrasing, and characterization.

Physicality: Students will execute choreography with clarity, energy, athleticism, and breath, utilizing a basic understanding of anatomical principles and dance alignment.

Disposition: Students will practice the craft of dance with a disciplined and self-motivated approach. Students will be able to collaborate and work as part of a team. Students will be fluent in dance vocabulary, terminology, and professional standards and protocol.

• What evidence should be gathered for assessing outcomes?
• What are the sources of the evidence for the outcomes?
• How often is the evidence to be collected?

• Relatively direct (SACS preferred)
  ◦ Writing assignments
  ◦ Oral presentations
  ◦ Design projects
  ◦ Artistic compositions
  ◦ Essay exams
• Relatively indirect
  ◦ Surveys
  ◦ Internship reports
  ◦ Employer surveys

• Evidence should be meaningful: information that is appropriate for assessing a particular outcome.
• Evidence should be manageable: reasonable to attain and evaluate (time, effort, availability).

• List of outcomes
• Evidence to be collected
• Source of evidence
• Frequency of collection of evidence

Outcome: Students will practice the craft of dance with clarity, energy, athleticism and breath
Evidence: Production evaluations, performance reviews, instructor reports, etc.
Source: Students, faculty, professionals in the field
Frequency: Semester & annually

Outcome: Students will be ready for the professional environment by demonstrating versatility…
Evidence: Selection of work in portfolio, videos of performances, etc.
Source: Students, faculty
Frequency: As appropriate, but collected annually to show growth
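One way to keep a plan like this manageable from cycle to cycle is to record each row as structured data. The sketch below is purely illustrative, not a prescribed format: the `PlanEntry` class, its field names, and the `direct` flag are assumptions introduced here to mirror the example entries above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PlanEntry:
    """One row of an assessment plan: outcome, evidence, source, and frequency."""
    outcome: str
    evidence: List[str]
    sources: List[str]
    frequency: str
    direct: bool  # True if the evidence is relatively direct (e.g., performances, projects)

# Illustrative entries based on the example plan above.
plan = [
    PlanEntry(
        outcome="Students will practice the craft of dance with clarity, energy, athleticism and breath",
        evidence=["Production evaluations", "Performance reviews", "Instructor reports"],
        sources=["Students", "Faculty", "Professionals in the field"],
        frequency="Semester & annually",
        direct=True,
    ),
    PlanEntry(
        outcome="Students will be ready for the professional environment by demonstrating versatility",
        evidence=["Selection of work in portfolio", "Videos of performances"],
        sources=["Students", "Faculty"],
        frequency="Collected annually to show growth",
        direct=True,
    ),
]
```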

• Is the evidence specific enough in describing the form of the evidence and the venue for collection?
• Does the plan rely mainly on direct evidence?
• Is the evidence meaningful for the particular outcome?
• Is the evidence manageable (reasonable to collect and evaluate)?
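If plan entries are kept in a structure like the earlier sketch, the checklist questions about completeness and reliance on direct evidence can be checked mechanically. The helper below is hypothetical and only a sketch: `review_plan` and the 50% threshold for "mainly direct" are assumptions, not part of any required review process.

```python
def review_plan(plan):
    """Apply two of the checklist questions to a list of PlanEntry records (illustrative)."""
    issues = []
    for entry in plan:
        # Flag entries missing evidence, a source, or a collection frequency.
        if not (entry.evidence and entry.sources and entry.frequency):
            issues.append(f"Incomplete entry: {entry.outcome}")
    # Flag plans where fewer than half of the entries rest on direct evidence.
    if plan and sum(e.direct for e in plan) / len(plan) < 0.5:
        issues.append("The plan does not rely mainly on direct evidence.")
    return issues

# Example: review_plan(plan) returns an empty list when both checks pass.
```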

[Diagram: Improving Programs, Program Outcomes, Program Criteria, Program Values]

Goal: To use the evidence as a basis for judging the extent to which the program is meeting the members' values for the program.

Goal: To apply what has been learned in evaluating the program toward identifying actions to address areas of concern.

As a result of your assessment, what changes, if any, have you implemented to address areas of concern (in the program or in the assessment of the program)?

What outcomes are you planning to assess for the next reporting cycle?

How can I help?
Geri Cochran