Assessing When Numbers Don’t Count Binghamton University March 23, 2007.


Today’s Objectives
- Define what a discourse-based, or qualitative, assessment method is
- Review a few of these methods
- Work with some case studies to better understand how these methods might be used
- Discuss the uses and limitations of these methods, especially with regard to assessment

The Pressure to Assess
- This week, the Spellings Commission meets to discuss several issues; one of them is student learning outcomes
- There is consistent pressure to use standardized tests and surveys; we are currently using one of them, the NSSE
- Through the use of standardized tests and surveys, we gain perspective on how we compare to other institutions
- We might be able to gain a “value added” perspective

Weaknesses of the Standardized Test/Survey Approach
- We obtain a “macro” perspective, but may not gain a “micro” perspective
- We have little control over the questions
- “Value added” is still difficult to establish with standardized tests
- Faculty see little value in using standardized tests, especially in interdisciplinary studies and the liberal arts
- Standardized tests/surveys often assume they can measure most of what concerns faculty with respect to teaching and learning

Forging the Middle Ground: Discourse-Based Assessment Methods
- Allow for the discovery of the unanticipated
- Relevant to interdisciplinary study
- Maximize faculty/staff input when discourse is highly valued
- Good to use when the number of objectives outweighs the amount of time available to assess student learning
- Great contextualizer
- When performed carefully and compared with other methods, a great return on investment

Types of Qualitative Assessment
- Focus groups
- Expert panels
- Open-ended surveys
- Ethnographic studies (participant observations)
- Portfolio reviews
- Primary trait scoring
- Delphi panels

Qualitative Assessment is an Inductive Process
Defined Focus → Observations → Analysis → Summary Report → Comparison with Other Assessments → Action

Expert Panel
- A type of focus group
- Focus is (for assessment purposes) on a particular assignment or performance
- Not a simple conversation; it is done methodically, systematically, and with precision
- Often involves convenience or random samples of a homogeneous population
- Must include carefully written questions
- Might include a survey or other assessment technique as part of the process

Krueger’s 10 Quality Factors in Focus Group (Expert Panel) Research
- Clarity of purpose
- Appropriate environment
- Sufficient resources
- Appropriate participants
- Skillful moderator
- Effective questions
- Careful data handling
- Systematic and verifiable analysis
- Appropriate presentation
- Honoring the participant, clarity, and method

Expert Panel Procedure
Select Focus → Select Experts → Write Questions → Use Additional Method? → Logistics → Conduct Expert Panel (How to Assess Agreement?) → Report Results

Final Thoughts on Expert Panels
- A good method for assessing “ineffable outcomes”
- Better when expert panel questions and conversations are grounded in standards and/or student learning objectives
- Also good when the specific focus is on a particular assignment or performance
- It is advisable to use a secondary method either prior to or during the expert panel
- Does not provide anonymity among respondents

Primary Trait Scoring
- Focus is upon one particular assignment, performance, etc., that is reflective of several aggregate student learning outcomes
- Rate each outcome according to a scale (e.g., proficient, satisfactory, unsatisfactory)
- The idea is to look for trends, not numbers, that spark discussion

Primary Trait Scoring--Procedure
- Choose an assignment in which students demonstrate summative knowledge, skills, or competencies
- Carefully rate student performance according to the scale
- Place checkmarks in each column
- Look for visual trends
- Discuss why these trends occur, on what basis the ratings were made, and what specific issues are revealed through the analysis
- Combine with other findings, or make plans for action
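As an illustrative sketch of the tallying step (the element names, scale labels, and counts below are hypothetical examples, not data from the presentation), the checkmark table can be built from raw ratings like this:

```python
from collections import Counter

# Hypothetical ratings: one rating per student for each learning-outcome
# element of a single summative assignment (names and counts are illustrative).
ratings = {
    "Financial analysis": ["Excellent"] * 4 + ["Satisfactory"] * 6 + ["Unsatisfactory"] * 2,
    "Organizational integrity": ["Excellent"] * 6 + ["Satisfactory"] * 5 + ["Unsatisfactory"],
}

scale = ["Excellent", "Satisfactory", "Unsatisfactory"]
for element, scores in ratings.items():
    counts = Counter(scores)
    # One checkmark per student rating, so visual trends stand out at a glance
    row = "  ".join(f"{level}: {'√' * counts[level]}" for level in scale)
    print(f"{element:26} {row}")
```

Reading the rows side by side makes the trend visible without reducing it to a single score, which is the point of the method.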

Example of Primary Trait Analysis

Element | Excellent | Satisfactory | Unsatisfactory
Demonstrate an ability to research the financial integrity of a business plan through accounting/financial analysis | √√√√ | √√√√√√ | √√
Demonstrate an ability to assess a plan’s organizational integrity; that is, a business plan demonstrates how an organization will be built and sustained to undergird the success of the business being organized | √√√√√√ | √√√√√ | √

Primary Trait Analysis—Final Thoughts
- A method to get faculty or staff talking about what assessment results mean
- A good starting point toward developing a rubric
- Enables discussion, which can lead to further discovery
- “Simple, stress free, and easy”

Delphi Panel Introductory Exercise
- Divide into 3 groups
- Get out a piece of paper and individually write down: “What do students have the most difficulty with when they first come to college (as first-year students)?”
- Try to create frequency counts: combine like answers and tally them
- Discuss

Questions for Groups
- What do these say about the difficulty students might have when they start?
- Take a look at the most “popular” answers—do these ordinarily achieve “majority vote status”?
- Even in cases where “majority vote status” is achieved, might less popular answers indicate group consensus?

Introduction to the Delphi Method
- A combination of at least 3 methods: open-ended survey, closed-ended survey, and expert panel
- Unlike the expert panel, attempts to maximize anonymity of respondents to control for power dynamics among them
- Assumes highly motivated groups of experts (faculty or staff) willing to participate in more than one round of questions

Introduction to the Method
1. Find a homogeneous group of experts who can comment either on one assignment or on specific student learning outcomes
2. Create an open-ended survey in which respondents are asked to identify strengths and weaknesses in student performance in reference to a specific standard or student learning outcome
3. Content-analyze responses by combining like responses, placing how many times each was mentioned in parentheses
4. “Cut and paste” these onto a survey, and ask respondents to indicate to what extent they agree with each on a 4- or 5-point scale
5. Report those responses that indicate consensus
6. If needed, move to a 3rd round, in which respondents rank these consensus items

Strengths of the Delphi Method
- A way of addressing “ineffable outcomes”
- Can be used to designate the most agreed-upon student learning objectives that faculty have communicated
- Can be used to gather information from employers, internship supervisors, alumni, etc. about specific items of interest

Limitations of the Delphi Method
- Can be time consuming
- Takes some knowledge of statistics
- Not a method that can be used by itself; usually results need to be compared with direct assessments of student learning

Today’s Activities
- Separate into three groups; select a group note taker
- If you have not already, read the case study packets
- As a group, discuss the questions at the end of the case study (debate, applaud, etc.); do something active
- Write answers on the provided sheet of paper