Mary Allen Qatar University September 2011

Workshop participants will be able to:  draft/revise learning outcomes  develop/analyze curriculum maps  develop/refine sustainable, multi-year assessment plans  develop/refine rubrics and calibrate reviewers  analyze assessment results  use a variety of strategies to close the loop  evaluate the impact of improvement actions

 an ongoing process designed to monitor and improve student learning

Faculty:  Develop SLOs  Verify curriculum alignment  Develop an assessment plan  Collect evidence  Assess evidence and reach a conclusion  Close the loop

 Standard 3.3.1

 Program goals  Cohesive curriculum  How students learn  Course structure and pedagogy  Faculty instructional role  Assessment  Campus support for learning

 Direct vs. indirect assessment  Value-added vs. absolute learning outcomes  Authentic assessment  Formative vs. summative assessment  Triangulation

 Clarify what faculty want students to learn  Clarify how each outcome can be assessed

 Knowledge  Skills  Attitudes/Values/Predispositions

 CLO  PLO  ILO

 List of goals and outcomes  List of outcomes  Typically 6-8 outcomes in all

1. Active verbs
2. Simple language
3. Real vs. aspirational
4. Aligned with mission
5. Avoid compound outcomes
6. Outcomes vs. learning processes
7. Focus on high-priority learning

 Coherence  Synthesizing experiences  Ongoing practice of learned skills  Opportunities to develop increasing sophistication and to apply what is learned

 I = Introduced  D = Developed & Practiced with Feedback  M = Demonstrated at the Mastery Level Appropriate for Graduation

 Curriculum Map 2  Curriculum Map 3

 CLOs that align with relevant PLOs  Faculty can provide artifacts for assessment  Faculty teach courses consistent with the map

 Focuses faculty on curriculum cohesion  Guides course planning  Allows faculty to identify potential sources of assessment evidence  Allows faculty to identify where they might close the loop

 Except for NCATE-accredited programs

 Who?  What?  Where?  When?  Why?

 Relevant samples  Representative samples  Reasonably-sized samples
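Drawing a reasonably sized sample can be sketched in a few lines. This is an illustrative example only; the artifact IDs and the sample size of 30 are invented assumptions, not figures from the workshop.

```python
import random

# Hypothetical pool of collected student artifacts (names are invented).
artifact_ids = [f"essay_{i:03d}" for i in range(1, 201)]  # 200 essays collected

random.seed(42)  # fixed seed so the same sample can be redrawn later
sample = random.sample(artifact_ids, k=30)  # review 30 rather than all 200

print(len(sample))  # number of artifacts to distribute among readers
```

A fixed seed lets the assessment coordinator document and reproduce exactly which artifacts were reviewed.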

 Anonymity  Confidentiality  Privacy  Informed consent

Find examples of:  Direct assessment  Indirect assessment  Formative assessment  Summative assessment  Authentic assessment  Triangulation

 PLO  When to assess  What direct and indirect evidence to collect  Who will collect the evidence  How evidence will be assessed  How decisions will be made

 Valid  Reliable  Actionable  Efficient and cost-effective  Engages students  Interesting to faculty  Triangulation

 Published tests  Locally-developed tests  Embedded assessment  Portfolios

 Surveys  Interviews  Focus groups

 Holistic  Analytic

 Rubric Packet  AAC&U VALUE Rubrics  Specialized Packets

 Efficiency  Defines faculty expectations  Well-trained reviewers use the same criteria  Criterion-referenced judgments  Ratings can be done by multiple people

 Assess while grading  Assess in a group

 Columns are used for assessment  Faculty can adapt an assessment rubric in different ways  Faculty maintain control over their own grading

 Grading may require extra criteria  Grading requires more precision  Calibrate when doing assessment

 Speed up grading  Clarify expectations to students  Reduce student grade complaints  Improve the reliability and validity of assessments and grades  Make grading and assessment more efficient and effective  Help faculty create better assignments

 Below Expectations  Needs Improvement  Meets Expectations  Exceeds Expectations

 Adapt an already-existing rubric  Analytic method

 Consider starting at the extremes  Some words I find useful

 One reader/document.  Two independent readers/document.  Paired readers.

 Collect the assessment evidence and remove identifying information.  Develop and pilot test the rubric.  Select exemplars of weak, medium, and strong student work.  Consider pre-programming a spreadsheet so data can be entered and analyzed during the reading and participants can discuss results immediately.
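The "pre-programmed spreadsheet" idea can be sketched in plain Python: ratings are entered per criterion during the reading, and summaries are ready for discussion immediately. The criterion names and scores below are invented for illustration; the 1-4 scale follows the Below/Needs Improvement/Meets/Exceeds levels above.

```python
from statistics import mean

# Invented example data: rubric ratings for five student artifacts,
# entered as readers finish each document (1 = Below Expectations,
# 4 = Exceeds Expectations).
ratings = {
    "thesis":       [3, 4, 2, 3, 3],
    "evidence":     [2, 3, 2, 2, 3],
    "organization": [4, 3, 3, 4, 3],
}

# Per-criterion mean and percent of artifacts at "Meets" or above,
# available the moment the last score is entered.
summary = {
    criterion: {
        "mean": round(mean(scores), 2),
        "pct_meets_or_exceeds": round(100 * sum(s >= 3 for s in scores) / len(scores)),
    }
    for criterion, scores in ratings.items()
}

for criterion, stats in summary.items():
    print(criterion, stats)
```

With results tabulated this way, participants can move straight from scoring to the "how good is good enough?" discussion.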

 Correlation  Discrepancy Index

 How good is good enough?

 Celebrate!  Change pedagogy  Change curriculum  Change student support  Change faculty support  Change equipment/supplies/space

1. Focus on what is important.
2. Don’t try to do too much at once.
3. Take samples.
4. Pilot test procedures.
5. Use rubrics.
6. Close the loop.
7. If you rely on adjunct faculty, include them in assessment.
8. Keep a written record.