Data, Exhibits and Performance-based Assessment Systems David C. Smith, Dean Emeritus College of Education University of Florida


Accountability and Assessment are With Us It is obvious that we are in an age of accountability, and accountability inevitably involves assessment. Like the Ice Age, there is little reason to believe that it will pass quickly.

We Need Data What data do you have to support your convictions and contentions? How do you respond to important questions regarding program effectiveness and efficiency when you are asked, “What evidence do you have?” and “How do you know?”

Often the Right Behavior for the Wrong Reasons Think about assessment and the development of an assessment system as an opportunity rather than a problem or burden. (Not just for NCATE.)

Your Mind-set An assessment system will not tell you what you should do. Strive for data-informed decisions rather than data-driven decisions.

Essential Considerations in Design Have you thought deeply about the purpose, mission and vision of the organization and related them to the assessment system? Is your conceptual framework reflected in your assessment system? There are implications for what you choose to include in your assessment system.

Assessment System Issues to Consider Do (or will) you have a blueprint or framework for the design of your assessment system? What criteria will you use in creating it? What does (or will) it look like? How was (or how will) it be created? Consider the language in Standard 2.

Inevitable Tension in Assessment The need to accommodate your own situation versus the need to compare with others in similar situations. It is necessary to compare both within and across institutions. (Internal relative productivity and comparison with counterpart units.)

Multiple Measures Multiple measures can be valuable. Intentional redundancy can be critical. (Aircraft instruments.) Sometimes it is a matter of perspective: it is valuable to look at a problem from more than one angle. (Headcount and FTE faculty and candidates.) Sometimes it is a matter of timing: what are the key points at which to assess? (At a minimum, entrance, exit and follow-up.)
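As a minimal sketch of one use of multiple measures, the example below checks whether two measures of the same skill agree. The measure names and scores are hypothetical, invented for illustration only; it uses only the Python standard library (3.10+).

```python
from statistics import correlation  # Pearson's r; available in Python 3.10+

rubric_scores      = [3.4, 2.9, 3.8, 3.1, 3.6]  # lesson-plan rubric, 1-4 scale
supervisor_ratings = [3.2, 3.0, 3.9, 2.8, 3.7]  # clinical supervisor ratings, same scale

# Strong positive agreement supports both measures; weak agreement is
# itself a finding worth investigating.
r = correlation(rubric_scores, supervisor_ratings)
print(f"Agreement between the two measures: r = {r:.2f}")
```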

Problems that I regularly see: People have difficulty in creating an assessment system. People think more about collecting data than about the structure of their assessment system and the kinds of data to include in it. They often want to do it for the wrong reason – for accreditation rather than as a tool to evaluate and improve what they are doing.

Other Problems People have difficulty in using candidate data in three ways. (The aggregation issue.) People are not aware of meaningful data that already exist and can be imported into their assessment system; then they can focus their effort on data that they need to generate. People often do not know how to use data well.

People often do not consider examining relationships among data sets. (FTE and headcount enrollment; enrollment and the cost to generate a student credit hour (SCH).) Time is a problem. It is not realistic to expect that busy people can create and maintain an assessment system “on top of everything else”. It is very difficult to develop, implement, maintain and revise an assessment system without additional resources. Resources, both human and technological, are needed. The allocation of resources is a measure of institutional priority.
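As an illustration of examining relationships among data sets, the sketch below computes two of the ratios mentioned above. The program names and figures are invented for the example, not drawn from the presentation.

```python
programs = [
    # program, headcount, FTE enrollment, student credit hours (SCH), budget in dollars
    {"name": "Elementary Ed", "headcount": 240, "fte": 180.0, "sch": 5400, "budget": 1_350_000},
    {"name": "Special Ed",    "headcount": 90,  "fte": 75.0,  "sch": 2250, "budget": 700_000},
]

for p in programs:
    ratio = p["headcount"] / p["fte"]      # headcount-to-FTE: a rough indicator of part-time enrollment
    cost_per_sch = p["budget"] / p["sch"]  # cost to generate one student credit hour
    print(f"{p['name']}: headcount/FTE = {ratio:.2f}, cost per SCH = ${cost_per_sch:.2f}")
```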

Collecting and Using Data It is one thing to collect data. It is another thing to be discriminating in collecting data. And still another thing to know how to use data.

Proactive Data We are not good at being proactive, or creative, in generating data. Be proactive: give people information that they did not ask for but that informs them more deeply about the effectiveness of your organization. Think carefully about what creative and informative data you might want to include in your assessment system.

Aggregation and Design Issues – Timing Admission. Early in the program. Mid-program. Pre-student teaching. Exit. Follow-up.

Aggregation and Design Issues – Content Candidate data: demographic, qualitative, and performance (knowledge, skills, dispositions), including evidence of a positive effect on student learning. Resource and productivity data: people, budget, space, equipment.
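One possible way to structure candidate-level records so that the content categories above, and the assessment points from the previous slide, become explicit fields. This is a sketch under assumed names; every field and identifier here is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceScores:
    knowledge: float         # e.g., a content-area exam score
    skills: float            # e.g., a lesson-planning rubric score
    dispositions: float      # e.g., a dispositions rubric score
    student_learning: float  # evidence of a positive effect on student learning

@dataclass
class CandidateRecord:
    candidate_id: str
    program: str
    demographics: dict = field(default_factory=dict)
    # One set of scores per assessment point: "admission", "mid-program", "exit", ...
    performance: dict = field(default_factory=dict)

# Example usage with invented values:
record = CandidateRecord("C-001", "Elementary Ed")
record.performance["exit"] = PerformanceScores(3.6, 3.4, 3.8, 3.2)
```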

Aggregation and Design Issues – Levels of Data Course/faculty. Program. Department/cost center. Unit. Institution.

Aggregation and Design Issues – Sets and Sub-sets Courses and their faculty roll up into programs; programs into departments (cost centers); departments, along with support centers, into the unit; and units into the institution.

Candidate Performance Assessment Choose a question about knowledge, skills, or dispositions. How would you measure individual performance? How would you aggregate the data to the program and the unit? If appropriate, how would you compare the unit data with parallel institutional data?
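A minimal sketch of the aggregation step, using invented scores for one skills measure: individual candidate scores are rolled up to program means and then to a unit mean. The programs and numbers are illustrative assumptions.

```python
from statistics import mean

# (program, candidate score) pairs for one measure, e.g., a lesson-plan rubric scored 1-4.
scores = [
    ("Elementary Ed", 3.4), ("Elementary Ed", 2.9), ("Elementary Ed", 3.8),
    ("Secondary Ed", 3.1), ("Secondary Ed", 3.6),
]

# Program level: mean score per program.
by_program = {}
for program, score in scores:
    by_program.setdefault(program, []).append(score)
for program, values in sorted(by_program.items()):
    print(f"{program}: mean = {mean(values):.2f} (n = {len(values)})")

# Unit level: mean across all candidates in all programs.
print(f"Unit: mean = {mean(score for _, score in scores):.2f}")
```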

Knowledge The candidates are well-grounded in the content they teach. The candidates possess the professional knowledge to practice competently. The candidates possess technological knowledge for professional and instructional purposes.

Skills The candidates can plan an effective lesson. The candidates can give timely and effective feedback to their students. The candidates appropriately address the needs of diverse learners and students with special needs. The candidates have a positive effect on student learning.

Dispositions The candidates have a passion for teaching. The candidates genuinely care about their students. The candidates believe that all their students can learn. The candidates are reflective practitioners.

Informing Through Exhibits Provide data through exhibits. The conceptual framework. Evidence of candidate performance. Portfolios. Evidence of a positive effect on student learning. A picture is worth a thousand words: clinical sites, maps, posters of events.

Exhibits Reflect a Climate Exhibits can be user-friendly. Access to documents. Electronic support. Videotapes. Workstations. CDs. Creature comforts. Pictures of campus events. Faculty publications. Location, location, location.

Not everything is easily measured. “It doesn’t make sense to think that you have to measure with a micrometer if you are going to mark with a piece of chalk and cut with an axe.”

Do not make high-stakes decisions based on soft data. Consider “directionality” in analyzing data.

What matters and what matters most? (The need to know and the nice to know.) There are major implications for assessment system design and data elements.

Some of the least valuable data are the most easily gathered. Some of the most important things may be the most difficult to measure.

What you do not measure is a profound statement about what you do not value.

People in an organization focus on what is measured, not on what is said to be important. Consider the impact of single measures of performance in P-12 schools.

Assessing Your Assessment System What data will you include? How essential is it? How important is it? In considering your data: How will you collect it? How will you analyze it? How will you use it?

Assessing Your Assessment System Is your assessment system too large? Is your assessment system too small? Does it have the data you need? Does it have data you do not use?

Creating an assessment system is a creative task; it is also tedious and time-consuming.