Assessing Student Learning Outcomes in Student Development – Part II Student Development Division Retreat SUNY Oneonta May 27, 2008.



Presenter: Patty Francis, Associate Provost for Institutional Assessment and Effectiveness

Topics/Activities for Today
- Continued themes from the May 9 division meeting
- Developing a unit assessment plan
- Writing good outcome statements
- Measuring student learning outcomes
- Hands-on opportunity to develop outcome statements/measures for your department

Continued Themes from May 9 Meeting

Assessment Basics and Rationale
- Assessment as an ongoing, iterative process which must culminate with "closing the loop"
- Importance of congruence in the assessment process
- Assessment as an opportunity for ongoing dialogue, professional development and, most important, improving student services

Developing a Unit Assessment Plan

Overall, Unit Objectives Would Reflect:
- Institutional effectiveness performance indicators
- Documentation of all services and programs offered
- Tracking of use of services (and by whom)
- Student satisfaction with services/programs
- Direct impact of services/programs on students

Writing Good Outcome Statements

Issues to Address at the Outset
- Consistency between college, division, and unit mission statements (and, ultimately, outcome statements)
- Do all staff have the opportunity to provide input?
- Who are all your constituents?
- What results do you expect your programs and services to produce?

Components of a Good Outcome Statement
- Who is the target?
- What domain of student development is the target?
- What change is expected?

But Other Things to Keep in Mind
- Do you have a program/activity in place to bring about the outcome?
- Can the desired change in students be measured?
- How will you know you were successful?
- Do external standards apply (e.g., in the case of external accreditation/certification)?

Outcome Writing Rule #1: Focus on Student, Not Program
- Good example: "Student workers will identify, provide, and implement technical equipment that is appropriate for specific union events."
- Poor example: "College union work study program will teach student workers how to select and set up equipment for union events."

Outcome Writing Rule #2: State Outcomes Using Concrete Language and Action Verbs
- Good example: "Students will negotiate necessary academic accommodations with faculty members."
- Poor example: "Our objective is to enhance students' independence and self-confidence."

Outcome Writing Rule #3: Focus on Results, Not Process
- Good example: "Students will demonstrate increased job search skills (e.g., letter and resume writing, interviewing, employer research)."
- Poor example: "Students will participate in three 2-hour sessions on letter writing, interviewing, and employer research."

Measuring Student Learning Outcomes

Basic Principles
- Measurement techniques must be rigorous, since unreliable data are of minimal value
- Best to use a variety of quantitative and qualitative measures
  - Quantitative measures are easier, but often not as rich
  - Qualitative measures are often more informative, but require a check on scoring (e.g., rubrics)
- Rely as much as possible on existing data sources

Types of Information to Include
1. Normative/benchmark information whenever possible (external sources, own performance over time)
2. Locally collected data
   - Survey data
     - SUNY SOS
     - NSSE
     - Local surveys of student satisfaction/perceptions
   - National benchmarking data (e.g., ACUHO/EBI, ACUI/EBI)
   - Performance-based data
   - Focus groups

General Types of Measures (Maki, 2004)
- Direct measures – students actually demonstrate learning so that evaluators can match results to expectations
  - Standardized tests
  - Authentic, performance-based measures – embedded into students' actual educational context
- Indirect measures – students' perceptions of learning
  - Should never be used as the sole indicator

From Measures to Criteria
- Assessment criteria reflect your expectations about student performance (i.e., how you know you were successful)
- Set criteria at reasonable but challenging levels
- Criteria often take these forms:
  - "90% of students will..."
  - "80% of students will score at least 70% on..."
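A criterion of the form "80% of students will score at least 70%" reduces to a simple proportion check. The sketch below illustrates that arithmetic; the function name, cutoff values, and score list are invented for illustration and are not part of the original workshop materials.

```python
# Minimal sketch: check whether a quantitative assessment criterion is met.
# Example criterion: "80% of students will score at least 70%."

def criterion_met(scores, min_score=70, required_proportion=0.80):
    """Return True if the required proportion of scores reaches the cutoff."""
    if not scores:
        return False
    passing = sum(1 for s in scores if s >= min_score)
    return passing / len(scores) >= required_proportion

# Invented illustration data: 8 of these 10 scores are >= 70, so the
# proportion (0.80) just meets the 80% criterion.
sample_scores = [85, 72, 90, 65, 78, 88, 70, 55, 95, 74]
print(criterion_met(sample_scores))  # True
```

A unit would substitute its own cutoff and target proportion for each outcome statement.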

Authentic, Performance-Based Assessment: The Value of Rubrics
- Rubrics provide a reliable means of rating student performance in a more qualitative way
- Steps in developing rubrics:
  - Use entire staff to help develop, and be as specific as possible in differentiating between performance levels
  - Use existing rubrics as a guide
  - Pilot test to assure scoring is reliable
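For the pilot-test step, one simple (and admittedly coarse) reliability check is exact percent agreement: the share of students on whom two raters assign the same rubric level. The sketch below assumes two invented rating lists using the four performance levels from the rubric examples; it is an illustration, not a method prescribed in the workshop.

```python
# Minimal sketch: exact percent agreement between two raters applying the
# same rubric to the same set of students (an invented illustration).

def percent_agreement(rater_a, rater_b):
    """Proportion of students on whom both raters assigned the same level."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("Need two equal-length, non-empty rating lists")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Invented pilot-test data: the raters agree on 4 of 5 students.
rater_a = ["Meets", "Exceeds", "Approaches", "Meets", "Fails to Meet"]
rater_b = ["Meets", "Meets", "Approaches", "Meets", "Fails to Meet"]
print(percent_agreement(rater_a, rater_b))  # 0.8
```

Low agreement in a pilot would signal that the rubric's level descriptions need sharper differentiation before full use.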

Rubric Example #1: Recreation & Sport Services (Univ. of W. Florida)

Variable: Respect
- Fails to Meet: Makes negative comments about others and recreation. Engages in disrespectful behavior during work and/or participation.
- Approaches: Keeps self from making negative comments about others and recreation. Sometimes engages in disrespectful behavior during work and/or participation activities.
- Meets: Sometimes makes positive comments about others and recreation. Generally engages in positive behavior during work and/or participation opportunities. May engage in disrespectful behavior during work and/or participation activities but learns from the experience and corrects behavior.
- Exceeds: Frequently makes positive comments about others and recreation. Generally engages in positive behavior during work and/or participation opportunities. Rarely engages in disrespectful behavior during work and/or participation activities and always learns from the experience and corrects behavior.

Rubric Example #2: Career Services (Interviewing) (Univ. of W. Florida)

Variable: Appearance
- Fails to Meet: Did not dress in proper interview attire or was messy and wrinkled - wore shorts, flip flops, hat, t-shirt, or other apparel that was improper
- Approaches: Wore casual khakis or jeans with a semi-casual shirt or blouse. Candidate may have not worn tie or professional heels
- Meets: Professional appearance, but had one or two items that stood out to be less professional - for example, too short of skirt
- Exceeds: Wore exceptional interview attire - conservative suit, proper length skirt, well-pressed garments, and overall well groomed

Variable: First Impression
- Fails to Meet: May have been late to the interview, appeared too casual or uninterested initially, provided poor handshake or greeting, etc.
- Approaches: Was on time to the interview, but offered poor greeting, handshake, or appearance, etc.
- Meets: Was on time to the interview; looked interested and had a professional appearance - may need to practice greeting or handshake
- Exceeds: Was on time to the interview, offered a professional appearance, and gave a great handshake and greeting

Variable: Non-Verbal Behavior
- Fails to Meet: May have used many distracting hand gestures or no eye contact - candidate appeared completely uninterested or heavily slouched
- Approaches: May have used distracting hand gestures or may have slouched in the interview for approximately half of the interview time
- Meets: Used some distracting hand gestures or may have slouched some of the time during the interview, but overall looked interested
- Exceeds: Used appropriate hand gestures and displayed good posture and appropriate eye contact - generally illustrated interest

Variable: Communication
- Fails to Meet: May have organized thoughts in an unclear way, used slang, or may have been too concise or brief with answers given
- Approaches: May need some work with organizing thoughts and using examples to strengthen issues discussed
- Meets: Organizes thoughts well, uses proper grammar, but may still need to use examples
- Exceeds: Gives solid introductions and conclusions to thoughts, expresses self in proper grammar, and uses examples

Congruence in the Assessment Process: A Detailed Example (Handout #1)

Your Turn! (Handout #2)