
1 Assessment Methods
Office of Institutional Research, Planning, and Assessment
Neal F. McBride, Ed.D., Ph.D., Associate Provost

2 How would you assess these SLOs?
SLO 1: Graduates are able to critique a brief draft essay, pointing out the grammatical, spelling, and punctuation errors, and to offer appropriate suggestions to correct the identified deficiencies.
SLO 2: Senior undergraduate psychology majors perform above the national average on the GRE Psychology Subject Test.
Method for SLO 1: In a “capstone course” during the final semester before graduation, students are required to critique a supplied essay containing predetermined errors; a 3-person faculty panel evaluates the critiques (criterion: appropriate suggestions to remediate 90% of the errors).
Method for SLO 2: The GRE Psychology Subject Test, completed during the senior year and required for graduation. Compare the program's average scores with the average scores of all examinees nationwide, as sketched below.
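A minimal sketch of that score comparison in Python, assuming hypothetical program scores and an illustrative national mean (neither figure comes from the presentation or from ETS):

```python
# Sketch of the SLO 2 check: is the program mean above the national mean?
from statistics import mean

# Hypothetical senior-year GRE Psychology Subject Test scores for the program
program_scores = [640, 610, 700, 660, 590, 720, 650]

NATIONAL_MEAN = 615  # illustrative placeholder, not an actual ETS figure

program_mean = mean(program_scores)
print(f"Program mean: {program_mean:.1f}  National mean: {NATIONAL_MEAN}")
print("SLO met" if program_mean > NATIONAL_MEAN else "SLO not met")
```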

3 Assessment Methods
Assessment methods are ways to ascertain (“measure”) student achievement levels associated with stated student learning outcomes (SLOs). “Outcome” is a generic term for goals, objectives, and/or aims.

4 Basis for Selecting Appropriate Assessment Methods
[Diagram: Mission and Vision → University Student Outcomes → Student Learning Outcomes → Assessment Methods]
A specific assessment method (or methods) is selected for a specific outcome... “How do I ‘measure’ this outcome?”

5 Assessment Methods
Assessment methods include both direct and indirect approaches... We'll define these terms in a few minutes. First, let's explore a few criteria or considerations to keep in mind as you select appropriate assessment methods...

6 Qualitative Versus Quantitative Methods
Qualitative assessment collects data that do not lend themselves to quantitative methods but rather to interpretive criteria; the “data” or evidence are often representative words, pictures, descriptions, examples of artistic performance, etc.
Quantitative assessment collects representative data that are numerical and lend themselves to numerical summary or statistical analysis.
Programs are free to select assessment methods appropriate to their discipline or service... their choices must be valid and reliable.

7 Valid and Reliable Methods
Valid: The method is appropriate to the academic discipline and measures what it is designed to measure.
Reliable: The method yields consistent data each time it is used, and persons using the method are consistent in implementing the method and interpreting the data.
Basic aim: “defensible methods”
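One way to check the rater-consistency side of reliability is simple percent agreement between two scorers of the same student work. A minimal sketch, assuming hypothetical 4-point rubric scores; the data and this particular check are illustrative, not a method the presentation prescribes:

```python
# Percent agreement between two raters scoring the same eight papers
# (hypothetical 4-point rubric scores).
rater_a = [3, 4, 2, 3, 4, 1, 3, 2]
rater_b = [3, 4, 2, 2, 4, 1, 3, 3]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
agreement = matches / len(rater_a)
print(f"Inter-rater agreement: {agreement:.0%}")  # 75% for these scores
```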

8 Locus of Assessment
Embedded assessment: “measurement” strategies included as part of the requirements within existing courses, internships, or other learning experiences; “double duty” assessment (e.g., “critical assignments”)
Ancillary assessment: “measurement” strategies added on, or in addition, to the requirements within existing courses, internships, or other learning experiences; “additional duty” assessment

9 Sources for Finding Assessment Methods
- Professional associations and organizations
- Other programs/departments at CBU
- Similar programs/departments at other universities
- Published resources, e.g., Dunn, D. S., Mehrotra, C. M., & Halonen, J. S. (2004). Measuring Up: Educational Assessment Challenges and Practices for Psychology. Washington, DC: APA.
- The web, in general or for your specific area: http://www.liberalarts.wabash.edu/assessment
- A literature search by a professional librarian
- Personal experience, yours or your colleagues'

10 When selecting any assessment method, here are some questions to consider carefully:
- Does it “fit” the SLO?
- Did the faculty or student services staff select the method, and are they willing to participate in its use?
- Will all students in the program or provided service be included in the assessment (ideally, yes), or a sample of students (maybe)?
- How much time is required to complete the assessment method? Determine how this affects faculty, staff, and students.

11 More questions to consider:
- When and where will the assessment be administered?
- Are there financial costs? Are program and/or university resources available?
- Is the method used at one point in time (cross-sectional method) or with students over several points in time (longitudinal method)?
- Do the program faculty/staff have the skills and/or knowledge necessary to use the method and analyze the results?
- Most importantly... WHO is responsible for making certain the assessment is accomplished?

12 TIP: Ideally, as you write or rewrite SLOs, keep in mind the question: “What method(s) can I use to assess this SLO?” Why is this tip potentially useful?

13 Direct Methods
Direct assessment methods are “measurement” strategies that require students to actively demonstrate achievement levels related to institutional and program-specific learning outcomes.
Direct assessment methods focus on collecting evidence of student learning or achievement directly from students, using work they submit (an assignment, exam, term paper, etc.) or by observing them as they demonstrate learned behaviors, attitudes, skills, or practice.

14 Direct Methods: Examples
Capstone or senior-level projects, papers, presentations, performances, portfolios, or research evaluated by faculty or external review teams... effective as assessment tools when the student work is evaluated in a standard manner, focusing on student achievement of program-level outcomes.
Exams: locally developed comprehensive exams or entry-to-program exams, national standardized exams, certification or licensure exams, or professional exams.
Internship or practicum: evaluations of student knowledge and skills from internship supervisors, faculty overseers, or from student participants themselves. This may include written evaluations from supervisors focused on specific knowledge or skills, or evaluation of student final reports or presentations from internship experiences.

15 Direct Methods, Continued
Portfolios (hard-copy or web-based): reviewed by faculty members from the program, faculty members from outside the program, professionals, visiting scholars, or industrial boards.
Professional jurors or evaluators: engaged to evaluate student projects, papers, portfolios, exhibits, performances, or recitals.
Intercollegiate competitions: useful for assessment when students are asked to demonstrate knowledge or skills related to the expected learning outcomes within appropriate programs.
Course assessments: projects, assignments, or exam questions that directly link to program-level expected learning outcomes and are scored using established criteria; common assignments may be included in multiple sections taught by various professors (assuming prior agreement).

16 Direct Methods: Advantages
- Require students to actively demonstrate knowledge, attitudes, and/or skills
- Provide data to directly measure expected outcomes
- Demand less abstract interpretation
- Usually “easier” to administer
Direct methods are always our first choice; indirect methods support but cannot replace direct methods.

17 Achievement Levels or Criteria
- Rarely does every student achieve all SLOs completely (100%); nor can we expect this
- What “level” of achievement is acceptable? Identified in the “OPlan”
- Rubrics recognize varying achievement levels
- Rubrics are a scoring method or technique appropriate to many assessment methods

18 A Rubric Example
Outcome: Correctly analyzes research data
- 1 Novice: Limits analysis to correct basic descriptive analysis
- 2 Developing: Selects and executes correct basic statistical analyses
- 3 Proficient: Selects, articulates, and executes an inferential statistical analysis
- 4 Accomplished: Selects, articulates, and executes the statistical analysis suitable to the research question
CBU utilizes 4-point rubrics, with the specific level criteria appropriate to the outcome in question. A sketch of this rubric as a scoring structure appears below.
Excellent resource: Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus.
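A minimal sketch of how this 4-point rubric might be recorded and summarized in code; the level descriptions come from the slide, while the cohort scores are hypothetical:

```python
# The slide's 4-point rubric as a lookup table, plus a summary of
# hypothetical panel scores for one cohort on this outcome.
from statistics import mean

RUBRIC = {
    1: "Novice: limits analysis to correct basic descriptive analysis",
    2: "Developing: selects and executes correct basic statistical analyses",
    3: "Proficient: selects, articulates, and executes an inferential statistical analysis",
    4: "Accomplished: selects, articulates, and executes the analysis suitable to the research question",
}

scores = [3, 4, 2, 3, 3, 4]  # hypothetical scores, one per student
print(f"Mean rubric score: {mean(scores):.2f}")
for level in sorted(set(scores)):
    print(f"Level {level} ({scores.count(level)} students): {RUBRIC[level]}")
```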

19 Guidelines for Implementing Embedded, Direct Assessment
- Link class assignments to both SLOs and course objectives
- If multiple sections of the same course exist and the intent is to aggregate data across sections, ensure that the assessment is the same in all sections (same assignment and grading process)
- Make certain faculty collaboration underpins assessment across multiple course sections
- Tell students which assignment(s) are being used for SLO assessment as well as course assessment... Why?

20 Indirect Methods
Methods requiring the faculty and student life staff to infer actual student abilities, knowledge, and values rather than observing direct evidence of learning or achievement.
Indirect assessment is gathering information through means other than looking at actual samples of student work... e.g., surveys, exit interviews, and focus groups.
Indirect methods provide the perceptions of students, faculty, or other people (often alumni or employers) who are interested in the program, service, or institution.
Indirect methods expand on or confirm what is discovered after first using direct methods.

21 Indirect Methods, Continued
Exit interviews and student surveys: to provide meaningful assessment information, exit interviews and/or student surveys should focus on students' perceived learning (knowledge, skills, abilities) as well as students' satisfaction with their learning experiences, including such things as internships, participation in research, independent projects, numbers of papers written or oral presentations given, and familiarity with discipline tools.

22 Indirect Methods, Continued
Faculty surveys: aimed at getting feedback about faculty perceptions of student knowledge, skills, values, academic experiences, etc.
Alumni surveys: aimed at evaluating perceptions of knowledge, skills, and values gained while studying in a particular program... such surveys frequently target alumni who are 1 and 5 years post-graduation and include program-specific questions.

23 Indirect Methods, Continued
Surveys of employers/recruiters: aimed at evaluating specific competencies, skills, or outcomes.
Tracking student data: related to enrollment, persistence, and performance... may include graduation rates, enrollment trends, transcript analysis (tracking what courses students take and when they take them), and tracking student academic performance overall and in particular courses. A small sketch of one such metric appears below.
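For instance, a cohort graduation rate is one easily tracked metric. A minimal sketch, assuming hypothetical student records (the record layout and values are illustrative):

```python
# Graduation rate for one entering cohort, from hypothetical records.
students = [
    {"id": "S1", "entered": 2018, "graduated": True},
    {"id": "S2", "entered": 2018, "graduated": False},
    {"id": "S3", "entered": 2018, "graduated": True},
    {"id": "S4", "entered": 2018, "graduated": True},
]

grad_rate = sum(s["graduated"] for s in students) / len(students)
print(f"2018 cohort graduation rate: {grad_rate:.0%}")  # 75% here
```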

24 Indirect Methods, Continued
External reviewers provide peer review of academic programs, and the method is widely accepted for assessing curricular sequences, course development and delivery, as well as faculty effectiveness... using external reviewers is a way to assess whether student achievement reflects the standards set forth in student learning and capacity outcomes... skilled external reviewers can be instrumental in identifying program strengths and weaknesses, leading to substantial curricular and structural changes and improvements.

25 Indirect Methods, Continued
Curriculum and syllabus analysis: examining whether the courses and other academic experiences are related to the stated outcomes... often accomplished in a chart or “map,” as sketched below.
Syllabus analysis is an especially useful technique when multiple sections of a course are offered by a variety of instructors... it provides assurance that each section covers essential points without prescribing the specific teaching methods used in helping the students learn the outcomes.
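A curriculum “map” can be as simple as a table of which courses address which outcomes. A minimal sketch, where the course numbers and SLO labels are hypothetical:

```python
# A curriculum map as a simple table: which courses address which SLOs.
CURRICULUM_MAP = {
    "PSY 101": {"SLO1"},
    "PSY 305": {"SLO1", "SLO2"},
    "PSY 490": {"SLO2", "SLO3"},  # capstone
}

ALL_SLOS = {"SLO1", "SLO2", "SLO3", "SLO4"}
covered = set().union(*CURRICULUM_MAP.values())
print("Outcomes not covered by any course:", ALL_SLOS - covered)  # {'SLO4'}
```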

26 Indirect Methods, Continued
Keeping records of, or observing, students' use of facilities and services... these data can be correlated with test scores and/or course grades.
Example: logs maintained by students or staff members documenting time spent on course work, interactions with faculty and other students, internships, and the nature and frequency of use of the library, computer labs, etc.

27 Advantages of Indirect Methods
- Relatively easy to administer
- Provide clues about what could/should be assessed directly
- Able to flesh out subjective areas that direct assessments cannot capture
- Particularly useful for ascertaining values and beliefs
- Surveys can be given to many respondents at the same time

28 Indirect Methods Advantages, Continued
- Surveys are useful for gathering information from alumni, employers, and graduate program representatives
- Exit interviews and focus groups allow questioning students face-to-face; exploring and clarifying answers is done more easily
- External reviewers can bring objectivity to assessment and answer questions the program or department wants answered, or questions based on discipline-specific national standards

29 Disadvantages of Indirect Methods
- Indirect methods provide only impressions and opinions, not “hard” evidence of learning
- Impressions and opinions may change over time and with additional experience
- Respondents may tell you what they think you want to hear
- Survey return rates are often low and, consequently, not representative

30 Indirect Methods Disadvantages, Continued
- You cannot assume those who did not respond would have responded in the same way as those who did
- Exit interviews take considerable time to complete
- Focus groups usually involve a limited number of respondents who are not representative
- Unless the faculty agree upon the questions asked during exit interviews and focus groups, there may not be consistency in responses

31 Suggestions for Implementing Indirect, Ancillary Assessment
- Use “purposeful samples” when it is not possible to include all students (including everyone is always the first choice)
- Offer incentives to participants
- Anticipate low turnout and therefore over-recruit
- Plan logistics and question design carefully (i.e., for surveys, interviews, focus groups)
- Train group moderators and survey interviewers

32 Implementation Suggestions, Continued
- Consider using web-based or telephone interviews or focus groups as well as face-to-face ones
- Set time limits for focus groups and interviews
- Develop and provide very careful, explicit directions
- Be mindful of FERPA regulations when using archival records
- Only use archival records that are relevant to specific outcomes

33 Implementing Assessment in General
- Capitalize on what you are already doing
- Integrate embedded assessment as much as possible
- Schedule ancillary assessment during regular class times or times when students are present
- Make assessment a graduation requirement
- Plan an “assessment day”
- Seek to make assessment a routine activity within your curriculum or student services programs

34 REVIEW: Assessment Strategy Combinations
Depending on the specific SLO, there are four assessment strategies or frames:
- Embedded, direct assessment
- Embedded, indirect assessment
- Ancillary, direct assessment
- Ancillary, indirect assessment
REMEMBER: There is more than one way to assess any given SLO! It's your choice as long as it is valid and reliable.

