1 Covering Assessment in LIS Education and in the Profession
Megan Oakleaf, Peter Hernon, Karin De Jager
Library Assessment Conference, August 2008

2 Panel Overview
Introduction
Some examples of what LIS education is doing
Assessment
Student outcomes
Student learning outcomes
The Assessment Toolkit
Assessment in Context
A view from South Africa

3 LIS Education
The status of using student learning outcomes should guide programs
Role of research at the master's level:
Research: application of the inquiry process
Evaluation: library centric; examination of a program/service for summative/formative evaluation
Assessment: connects libraries and broader organizations to stakeholder expectations and requirements

4 Key Stakeholders behind Assessment
Government: federal, state
Accreditation: regional accrediting organizations, program accreditors

5 Critical Issues
How do we build research as a more essential activity within LIS education?
How do we build research as a more essential activity among libraries and librarians?

6 Partnerships
An institution
LIS schools/programs
Cross-disciplinary partners
Library
Campus IR

7 Other Partnerships
LIS education
Libraries within the US
Key professional associations: CE workshops, speaking, scholarship
Issues: How to address research and assessment?

8 Assessment
1. Accountability: meeting institutional mission effectiveness and institutional fiscal efficiency
Aggregate statistics on groups of students: graduation rates, retention rates, transfer rates, employment rates for a graduating class
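To make the accountability measures above concrete, here is a minimal sketch, not part of the original slide, of how such aggregate cohort statistics might be computed; the cohort counts and field names are hypothetical.

```python
# Illustrative only: compute aggregate accountability statistics for one
# graduating cohort. All counts below are made-up example values.

def rate(part: int, whole: int) -> float:
    """Return part/whole as a percentage, guarding against an empty cohort."""
    return 100.0 * part / whole if whole else 0.0

cohort = {
    "entering_students": 250,        # hypothetical entering class size
    "retained_year_2": 210,          # still enrolled at the start of year 2
    "transferred_out": 15,           # left for another institution
    "graduated": 180,                # completed the program
    "employed_at_graduation": 150,   # employed members of the graduating class
}

print(f"Retention rate:  {rate(cohort['retained_year_2'], cohort['entering_students']):.1f}%")
print(f"Transfer rate:   {rate(cohort['transferred_out'], cohort['entering_students']):.1f}%")
print(f"Graduation rate: {rate(cohort['graduated'], cohort['entering_students']):.1f}%")
print(f"Employment rate: {rate(cohort['employed_at_graduation'], cohort['graduated']):.1f}%")
```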

9 Assessment Cycle for Student Learning Outcomes
Cycle diagram, grounded in institutional mission, vision, and values: identify outcomes, planning (assessment plan), gather evidence, interpret evidence, use the results, review outcomes.
LibQUAL+ is irrelevant here.

10 Student Learning Outcomes: Examples
Conceptual: leadership, critical thinking, problem solving, information literacy, global citizenship, values (moral, etc.)
Skills: oral/written communication, foreign language communication, technological sophistication, quantitative reasoning ability
Other

11 Assessment
2. Educational quality and improvement (e.g., student learning)
What should students learn (in a program)?
How well are they learning it (in that program)?
What measures and procedures does the institution use to determine that it is effective?
To what extent does the institution offer evidence that demonstrates its effectiveness to the public?
What does the institution plan to do with this evidence to improve outcomes?

12 Student learning outcomes exist at the following levels: course, program, institutional
Use of rubrics
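As an illustration of how "use of rubrics" might look in practice, here is a minimal scoring sketch that is not from the deck; the criteria, performance levels, and point values are invented.

```python
# Illustrative only: apply a simple analytic rubric to one piece of student work.
# The criteria, performance levels, and scores are hypothetical examples.

RUBRIC = {
    # criterion -> points awarded for each performance level
    "identifies information need": {"beginning": 1, "developing": 2, "proficient": 3},
    "evaluates sources":           {"beginning": 1, "developing": 2, "proficient": 3},
    "cites sources correctly":     {"beginning": 1, "developing": 2, "proficient": 3},
}

def score(ratings: dict) -> int:
    """Sum the points for the level assigned to each rubric criterion."""
    return sum(RUBRIC[criterion][level] for criterion, level in ratings.items())

student_ratings = {
    "identifies information need": "proficient",
    "evaluates sources": "developing",
    "cites sources correctly": "proficient",
}
max_score = sum(max(levels.values()) for levels in RUBRIC.values())
print(score(student_ratings), "out of", max_score)
```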

13 Direct Methods
Embedded course assessment (performance on assignments, etc.; minute paper)
Portfolio assessment
Performance (internships, practicum, student teaching)
Professional jurors or evaluators
Capstone course/experience
Experimental research designs, with pre- and post-testing
Use of standardized tests
Think-aloud protocol
Directed conversation
Videotape/audiotape evaluation
Analysis of theses/dissertations/senior papers (content analysis, interviews, or oral defense)
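One of the direct methods above, experimental designs with pre- and post-testing, reduces in its simplest form to a gain calculation. The sketch below is illustrative only; the student identifiers and scores are made up.

```python
# Illustrative only: average raw gain between a pretest and a posttest
# for the same students. Scores are hypothetical.

pre_scores  = {"s1": 55, "s2": 62, "s3": 48, "s4": 71}
post_scores = {"s1": 70, "s2": 75, "s3": 66, "s4": 78}

gains = [post_scores[s] - pre_scores[s] for s in pre_scores]
print(f"Mean pretest:  {sum(pre_scores.values()) / len(pre_scores):.1f}")
print(f"Mean posttest: {sum(post_scores.values()) / len(post_scores):.1f}")
print(f"Mean gain:     {sum(gains) / len(gains):.1f} points")
```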

14 Indirect Methods
Surveys (self-reporting) and self-assessments
Curriculum and syllabus evaluation
Exit interviews
Observation
Other

15 Simmons MLIP Leadership Model
The curriculum and assessment activities are guided by a leadership model adapted from a model developed by the National Center for Healthcare Leadership. The model consists of twenty-five distinct leadership competencies in three broad areas: Transformation, Accomplishment, and People.
http://web.simmons.edu/~phdml/docs/phdmlip_models.pdf

16 Assessment in Context
Learning in context is authentic & meaningful. Students apply skills as they would in the real world.
Learning in context is active. Students construct meaning and knowledge: they do not have meaning or knowledge handed to them in a book or lecture. Learning, then, is a process of students making sense of how things fit together; factual and procedural knowledge is built along the way (Shavelson & Baxter, 1996).
Learning in context is open-ended & acknowledges more than one right approach/answer (Shepard, 1996).

17 Example LIS Assignment: Planning, Marketing, & Assessing Library Services
Assignment tasks:
Locate a new or recently revised library service & a host librarian
For the service, develop a project management plan, a marketing plan, and an assessment plan
Present final plans to class & host librarian

18 Library Service Examples
Virtual/IM reference
Downloadable audio
Gaming programs
Single service points
Information commons
Portals/blogs/wikis
LibGuides
Digitization
Orientations & outreach
Book clubs & summer reading programs
Cafes/coffee bars

19 Assessment Plan Outline
Service goals & link to strategic plan
Literature review
Service outcomes
Target audience
Methods & tools for evidence collection
Recommendations for pilot assessment
Analysis of evidence (data plan)
How assessors will know the outcome has been met
Result scenarios & decision-making indicators
Recommendations for reporting
Responsible parties
Timeline
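The outline above maps naturally onto a structured template. As a hedged sketch, not part of the assignment itself, the code below represents one student plan as a record and flags any sections left empty; all field names and example values are invented.

```python
# Illustrative only: represent the assessment plan outline as a structured
# record and flag any sections a student left empty. Example values are invented.
from dataclasses import dataclass, fields

@dataclass
class AssessmentPlan:
    service_goals: str          # service goals & link to strategic plan
    literature_review: str
    service_outcomes: str
    target_audience: str
    methods_and_tools: str      # methods & tools for evidence collection
    pilot_recommendations: str
    data_plan: str              # analysis of evidence
    success_criteria: str       # how assessors will know the outcome has been met
    result_scenarios: str       # result scenarios & decision-making indicators
    reporting: str
    responsible_parties: str
    timeline: str

plan = AssessmentPlan(
    service_goals="Support the library's strategic goal of 24/7 reference",
    literature_review="Summary of IM-reference studies",
    service_outcomes="Users get accurate answers within 5 minutes",
    target_audience="Undergraduate distance students",
    methods_and_tools="Transcript analysis; short pop-up survey",
    pilot_recommendations="Two-week pilot during midterms",
    data_plan="Code transcripts against an accuracy checklist",
    success_criteria="80% of sampled transcripts rated accurate",
    result_scenarios="",        # left blank on purpose to show the completeness check
    reporting="Brief report to the host librarian",
    responsible_parties="Student team + host librarian",
    timeline="Fall semester",
)

missing = [f.name for f in fields(plan) if not getattr(plan, f.name).strip()]
print("Sections still missing:", missing or "none")
```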

20 Libraries Impacted (2007-2008)
Syracuse University Libraries; Lemoyne College Libraries; SUNY ESF Libraries; SUNY Cortland Libraries; SUNY Brockport Libraries; SUNY Binghamton Libraries; SUNY Upstate Medical University Libraries; Mid-York Library System (NY); Cazenovia Public Library (NY); Roanoke City Libraries (VA); Loyola University Libraries; Cornell University Libraries; University of Utah Libraries; Enoch Pratt Free Library (MD); Northwestern University Libraries; Supreme Court Library (NV); Rockefeller University Libraries; UT – San Antonio Libraries; Brandeis University Libraries; 26 NYC-area public school libraries; Wantagh Public Library (NY); Brooklyn Public Library (NY); Jervis Public Library (NY); Fayetteville Free Library (NY); Cicero-North Syracuse HS Library; Fletcher Free Library (VT); RIT Libraries; US Military Academy Library; Celebration School Library (FL); Deschutes Public Libraries (OR); Middlebury College Libraries (??); University of New Hampshire Libraries; Regent University Libraries; Wake County Public Libraries (NC); Whitesboro High School Library; Norwood-Norfolk Central HS Library; New England Law Library Consortium; YouthBuild Charter School Library (PA); Wellesley College Libraries; Onondaga County Public Libraries (NY); University of Rochester Libraries; Mott Road Elementary School Library (NY); Vogelson Public Library (NJ); LeMoyne Elementary School Library (NY); Groton Elementary School Library (NY); Paine Memorial Library (NY); Drew University Libraries; Broome County Public Library; New York University Libraries; Green Mountain Library Consortium; Oneida-Herkimer BOCES School Library System; Schoharie Free Public Library (??); University of Virginia Libraries; Boston College Libraries; Oneida Castle Elementary School Library; Andrew J. Lanza Library (??); George C. Marshall European Center for Security Studies

21 Assessment Impact Examples
Librarians move forward on projects. Nearly all librarians say they'll enact student plans, in part or in whole.
"We have paid thousands to consultants who have produced reports that don't come anywhere near the level of detail and professionalism that these students provided for us gratis."
"If we were to move on this we could have a family-centered program at the [children's hospital] that would become a national model." (to the hospital president and others, from the chair of pediatrics)
"If you were wondering if your project was ever touched – most certainly! Your project has been the backbone of my knowledge and launching point for inquiry. Hopefully in 2-3 months you will see these items [downloadable audio] in the catalog and in our marketing." (to a student, from Wake Public Libraries, NC)

22 Assessment Impact Examples
Students gain professional positions.
Student named Federal Library Technician of the Year.
Student recommended as chair of assessment committee at New England Law Library Consortium.

23 Courses at Selected LIS Programs
UIUC: Evaluating Programs and Services
Michigan: Evaluation of Systems and Services; Outcome-Based Evaluation of Programs and Services
Rutgers: Evaluation of Library and Information Services & Systems
Indiana: Evaluation of Resources and Services
Texas: Administration
Wisconsin: Information Services Management
Hawaii: Teaching Information Technology Literacy
Florida State: Planning, Evaluation & Financial Management
ECU: theme & component throughout the program

24 From the SA point of view, two implicit assumptions:
The workplace requires evaluation & assessment activities from librarians
Library schools are teaching some of the competencies required for these activities
Little evidence of either in local practice

25 Reasons
No standardized data collection is required from libraries
Inevitable result: not a strong culture of assessment evident on the SA library scene
If evaluation & assessment are not a high priority in libraries, they are almost self-evidently not a high priority in library schools either

26 Library Education in SA
Schools/departments are generally small & threatened with closure
Reduced in number during the last 10 years from 18 to 12, with more closures in sight
Some that remain have merged with other disciplines in order to survive, or evolved other survival strategies, e.g. diversifying into adjacent areas like knowledge or records management, media studies & publishing

27 Two Kinds of Qualifications
English-speaking universities: mainly a postgraduate diploma after a bachelor's degree, to ensure that students have some subject specialization
Other universities: a first degree in librarianship, with somewhat less emphasis on subject specialization
The two qualifications were initially envisaged as equal (both took 4 years to complete), but a 3-year qualification in information studies has gradually emerged, requiring much less subject specialization

28 Implications
Librarians are rather technicist in orientation
Focus on the practicalities of obtaining, managing & providing resources
Frequently not enough subject expertise to be regarded as equals by faculty
Tend to concentrate on undergraduate needs & the information literacy of a very diverse & frequently underprepared student body
Library performance measurement may be regarded with suspicion: fear that one's own institution might be shown up as of lesser quality than others

29 University of Cape Town
Postgraduate diploma: a small course on performance measurement & evaluation
6 teaching periods: objectives of performance evaluation, approaches to measuring, a few informal case studies & examples of processes & procedures
The eventual need for evaluation skills in workplaces is emphasized
Self-study projects on, e.g., measuring in ILL departments, assessment of infolit competencies & information needs, statistics for electronic resources & web usability studies

30 Yet a Growing Demand for Evidence of Quality
The SA Council for HE mandates national institutional quality audits
Libraries must provide evidence of quality & of the impact of services on teaching & research
Some assistance from CHELSA
Considerable interest in PM7 in 2007: ca. 70 librarians from SA (out of a total of nearly 200)
Influence of LibQUAL+, though its language & structure are very difficult at institutions where English is not the first language of the student body

31 Also a Problem with Research in SA
SA research output has declined since the 1990s
Respected researchers are generally pale & male & about to retire
Results of our LibQUAL+ evaluation (2005) were loud & clear: postgraduates & researchers are not happy with Library resources & services
Both faculty & postgraduates (i.e. both current and future researchers) rated all of Information Control below minimum expectations
Serious & sustained interventions are required to support & enhance the research enterprise
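The finding that Information Control was rated below minimum expectations reflects LibQUAL+-style gap scoring, in which each item receives minimum, desired, and perceived ratings. The sketch below computes the usual adequacy and superiority gaps; the item names and numbers are invented for illustration, not taken from the 2005 survey.

```python
# Illustrative only: LibQUAL+-style gap scores for a few Information Control
# items. Ratings (1-9 scale) are invented; a negative adequacy gap means the
# perceived service level fell below users' minimum expectations.

items = {
    # item: (minimum, desired, perceived) mean ratings
    "Print/electronic journal collections": (6.8, 8.2, 6.3),
    "Easy-to-use access tools":             (6.5, 8.0, 6.1),
    "Information easily accessible for independent use": (6.7, 8.1, 6.4),
}

for name, (minimum, desired, perceived) in items.items():
    adequacy = perceived - minimum      # > 0 means minimum expectations are met
    superiority = perceived - desired   # usually negative; 0 would mean desires met
    print(f"{name}: adequacy gap {adequacy:+.1f}, superiority gap {superiority:+.1f}")
```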

32 Novel Intervention: SA Library Education
An ambitious library project to support researchers in the Library
A consortium of 3 large academic libraries, funded by Carnegie
Intended to catch up with what was not learnt in library school
Program for librarians: total immersion in the research enterprise
Monitoring & measuring ALL activities is essential for improvement

33 No Research Support without Evaluation & Measurement
A two-week Academy for 6 mid-career librarians from each institution, for 2 (or 3) years
The best possible researchers talking about their own research, from a wide range of disciplines & very different epistemologies
Each participant also produces a potentially publishable research paper with a data collection, measurement, or assessment component
Research involves finding out & counting & measuring to understand what is really going on, whether in libraries or elsewhere in the research enterprise

34 Bibliography
Shavelson, Richard J., and Gail P. Baxter. "Linking Assessment with Instruction." A Handbook for Student Performance Assessment in an Era of Restructuring. Eds. Robert E. Blum and Judith A. Arter. Alexandria, Virginia: Association for Supervision and Curriculum Development, 1996. IV-7:1 - IV-7:6.
Shepard, Lorrie A. "Why We Need Better Assessments." A Handbook for Student Performance Assessment in an Era of Restructuring. Eds. Robert E. Blum and Judith A. Arter. Alexandria, Virginia: Association for Supervision and Curriculum Development, 1996. I-2:2 - I-2:7.

