Student Learning Objectives Pilot Test


1 Student Learning Objectives Pilot Test
SLO Learning Goals and Quality Assessment Aurora Public Schools Fall 2013

2 Introductions
Center for Transforming Learning and Teaching: Catalyzing and co-creating the transformation of learning environments through the use of assessment so that all are engaged in learning and empowered to contribute positively in a global society.
Facilitator/Trainer: Julie Oxenford O'Brian
Coach/Trainer: Mary Beth Romke
The organization providing this professional development today is the Center for Transforming Learning and Teaching. CTLT builds on and extends the work of the C2D3 project, which ended this August. To learn more about CTLT, visit our web site.
Facilitator: Introduce yourselves to your group, highlighting 3-4 key points about your career/expertise. Have table groups introduce themselves with one thing they know about themselves as learners.

3 Check-In
Check in with your table group:
How was your experience identifying an SLO Learning Goal? Were you able to specify associated standards, determine the cognitive complexity, write your goal as an objective statement, and determine whether the goal represents the learning needs of your students? Did you try out engaging your students with success criteria?
Capture remaining questions about identifying SLO Learning Goal(s) on a sticky note.

4 Purpose of Session Two Finalize SLO Learning Goals. Introduce the roles of Assessment in the SLO Process and the key characteristics of Quality Assessment.

5 SLO Components (developed across Days One through Five)
Learning Goal: Standards Reference; Rationale; Success Criteria
Measures: Evidence Sources; Alignment of Evidence Collection and Scoring
Performance Targets: Baseline Data; Performance Groups; Rationale for Targets
Progress Monitoring: Check Points; Progress Monitoring Evidence Sources; Instructional Strategies
SLO Results: Student Performance Results; Targets Met; Teacher Performance

6 Materials
I'd like to take you through a quick tour of your materials.
Have folks put a sticky note on their Note Catcher; let them know this is the document they will continually return to during the day.

7 Learning Outcomes: Session Two
Finalize an SLO Learning Goal including writing a rationale for the goal. Understand the role of assessment in SLOs. Define assessment and the key components of assessment. Identify a variety of methods (informal and formal) for collecting data about student learning. Describe the relationship between the method of assessment used and the information gained. Identify baseline data sources. Engage in learning activity during this session. Complete follow-up readings and tasks.

8 Activity: Monitoring your learning
Turn to Progress Monitoring (Note Catcher, p. 2-3). Re-write today's learning outcomes in language that has meaning for you. Create a bar graph that describes where you currently believe you are in relationship to each learning target. Leave the "Reflections" column blank for now.
Chart columns: Learning Target | I don't know what this is | I need more practice | I've got it | I could teach someone about it | Reflections
Example learning target: "Identify a variety of methods (informal and formal) for collecting data about student learning." In my words: "I can describe how to collect learning data and list several different options."

9 Day Two Agenda
Finalize SLO Learning Goals
Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Baseline Data Sources

10 SLO Learning Goal Process (Day One)
Identify the "big ideas" for the grade level and content area.
Identify learning goals associated with at least one "big idea" that would be achieved across several units, and/or which have related goals in prior or subsequent grade levels. These become candidates for the SLO Learning Goal.
Determine which standards are associated with each candidate SLO Learning Goal.
Prioritize possible Learning Goals based on the learning needs of the student population (identifying two or three top priorities).
Determine the cognitive complexity (depth of knowledge) of the priority SLO Learning Goals. Eliminate candidate SLO Learning Goals with a depth of knowledge less than 3 for secondary and less than 2 for elementary.
Select the SLO Learning Goal. Describe the rationale for your selection.

11 SLO Rubric
A tool for evaluating the quality of SLOs. Used by:
Teachers in the development of each SLO component.
Supervisors as they vet SLOs with teachers.
District leaders to investigate the quality of SLOs being developed.
The SLO Pilot will try out this draft tool.

12 SLO Rubric Consider your SLO Learning Goal.
Is your SLO Learning Goal consistent with the Learning Goal component definition on the Rubric? Does it meet the criteria for “acceptable quality”? Note: you should not have completed a rationale yet. Take a few minutes to make any needed revisions to meet the “acceptable quality” criteria.

13 Effective Feedback is Clear, descriptive, criterion-based, and indicates: √ how the learning goal differed from that reflected in quality criteria, and √ how the receiver of the feedback can move forward (what they might do next to improve).

14 Provide feedback about SLO Learning Goals
Choose a partner (different grade level and/or content area). Exchange your SLO Learning Goals with your partner. Consider: To what degree does her/his Learning Goal meet the acceptable quality criteria? Is it clear how the Learning Goal relates to the identified standards? Is the Learning Goal at an appropriate level of cognitive rigor (DOK level)? How could the components of the Learning Goal be improved? Share your feedback with your partner.

15 SLO Learning Goal Rationale
Acceptable Quality Criteria for Rationale: Clearly explains why the learning goal is an appropriate focus or need for students to learn. Clearly explains how the learning goal addresses high expectations (DOK no less than 3 for secondary and no less than 2 for elementary).

16 Learning Goal Rationale Outline
Justify that your SLO Learning Goal is at the right level (that it is an educational objective).
State its cognitive complexity (as measured by Depth of Knowledge) and that it is DOK 3 or higher for secondary and DOK 2 or higher for elementary.
Describe the data justifying that the learning goal is a need for the identified student population.

17 Practice: Your Rationale
Write a rationale for your SLO Learning goal. Capture your rationale: On the SLO form or In the note catcher for today (p. 3).

18 Share your Rationale Stand up and find someone you haven’t spoken with today. Share your SLO Learning Goal statement and your rationale. Provide just-in-time feedback to your partner about his/her rationale. Make any needed revisions to your rationale.

19 Day Two Agenda
Finalize SLO Learning Goals
Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Baseline Data Sources

20 Defining Educational Assessment
What is assessment? Write your working definition of assessment in your note catcher (p. 4). Activating peers as resources: Find a partner Share your definition Update your definition (if appropriate)

21 Defining Educational Assessment
Terms used synonymously in education: assessment, educational measurement, and testing. Educational Assessment is. . . A process by which educators use students’ responses to specially created or naturally occurring stimuli to draw inferences about the students’ knowledge and skills. A process of reasoning from evidence. Pellegrino, J., Chudowsky, N., and Glaser, R. Eds. (2001). Knowing what students know: The science and design of educational assessment. Washington DC: National Academy Press.

22 Assessment Components
The aspect(s) of student learning that are to be assessed (cognition). The tasks used to collect evidence about students’ achievement (observation). The approach used to analyze and interpret the evidence resulting from the tasks (interpretation). Pellegrino, J., Chudowsky, N., and Glaser, R. Eds. (2001). Knowing what students know: The science and design of educational assessment. Washington DC: National Academy Press.

23 Assessment Triangle Cognition Observation Interpretation (Tools, p. 1)
Pellegrino, J., Chudowsky, N., and Glaser, R. Eds. (2001). Knowing what students know: The science and design of educational assessment. Washington DC: National Academy Press.

24 Assessment Quality Work with your table group to list three considerations for assessment quality. Capture in your note catcher (p. 6). Prepare to share your list. . .

25 Characteristics of Quality Assessment
Accuracy: The assessment instrument measures what it is supposed to measure.
Consistency: Multiple data sources result in the same inferences.
Fairness (bias): All students can access the materials in the assessment instrument and have the chance to show what they know.
Motivation: Students want to show what they know.
Instructional importance and utility: The use(s) for the results justify the investment of time and effort involved.
(Tools, p. 6)

26 Testing Axioms Turn to Testing Axioms (Tools, p. 5).
Talk with a partner about the following: Do you agree/disagree with each axiom? What are the implications for using externally developed tests for SLOs? What are the implications for other classroom uses of test results? Grading? These axioms guide most large-scale assessment development (e.g. TCAP, Interim assessments).

27 Assessment Results ≠ Learning
Assessment results measure learning, but are not direct observations of learning. All assessment instruments measure only a sample of the learning we care about. All assessment results include “error” in their measurement of students’ learning. Increasing assessment quality = reducing the error in our measurement of students’ learning.

28 Quality Assessment Criteria
Select a partner and turn to the Quality Assessment Criteria. Individually and silently read the first row of the quality criteria. Turn to your partner and “say something” about the criteria: A summary of what you have read. A connection to something else. An elaboration or explanation of what you have read. Silently read the next row of quality criteria. Continue until you have read and “said something” about each of the quality criteria.

29 Day Two Agenda
Finalize SLO Learning Goals
Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Baseline Data Sources

30 Assessment in Student Learning Objectives
As part of the SLO Process, we use multiple evidence sources (data collected from a variety of assessment instruments) to “reason from evidence” about: Student learning in relationship to our Learning Goal at the beginning of the instructional interval (baseline data). Student progress towards the Learning Goal during the instructional interval (progress monitoring/formative assessment). Student learning in relationship to our Learning Goal at the end of the instructional interval (summative assessment). Teacher contribution to Student Learning Growth (aggregation of results across students in the class/course).

31 Levels of Objectives
SLO Learning Goals are educational objectives; lesson objectives or targets are instructional objectives.
Level of Objective: Global | Educational | Instructional
Scope: Broad | Moderate | Narrow
Time needed to learn: Two or more years (often many) | Weeks, months, or academic year | Hours or days
Purpose or function: Provide vision | Design curriculum | Prepare lesson plans
Example of use: Plan a multi-year curriculum (e.g. elementary reading) | Plan units of instruction | Plan daily lessons, activities, experiences and exercises
A Taxonomy for Learning, Teaching, and Assessing: A revision of Bloom's taxonomy of educational objectives, 2001. Tools, p. 7

32 Definitions of Formative Assessment
"An assessment activity can help learning if it provides information to be used as feedback by teachers, and by their students in assessing themselves and each other, to modify the teaching and learning activities." (Black, Harrison, Lee, Marshall & Wiliam, 2003)
"Formative assessment is a planned process in which assessment-elicited evidence of students' status is used by teachers to adjust their ongoing instructional procedures or by students to adjust their current learning tactics." (Popham, 2008)
The authors of one of the studies had this to say in defining the kind of assessment practices (at the classroom level) that help to transform students into learners. In his book Transformative Assessment, Popham adds the idea that formative assessment is a planned process.
Read the quotes. Ask whether they would add anything to the quotes, such as parents as users of the assessment information. How do these quotes fit with their ideas? Buzz for 5 minutes.
Tools, p. 9

33 Formative Assessment Episode
Determine the learning goal/target.
Gather/collect information about learning (in relationship to the target(s)).
Analyze and interpret the gathered information about learning.
Use the learning information to improve teaching and/or learning.
Any assessment includes these distinct activities. Read the slide. Today we are moving back to how we gather information about learning.

34 Formative Assessment Episodes
Cycle: Learning Goal/Target → Collecting Learning Information → Analyzing Learning Information → Interpreting Learning Information → Using Learning Information
Oxenford-O'Brian, 2013. Tools, p. 11

35 Summative vs. Formative Assessment
Summative uses: Ranking/Sorting; Certifying Competence; Grading; Accountability
Formative uses: Questioning; Clarifying Targets w/ Learners; Providing Useful Feedback; Self- & Peer-Assessment; Setting Goals & Monitoring Progress; Planning & Evaluating Instruction; Adjusting Learning Activity
Common to both: Defining the Learning Target(s); Collecting data about learning; Analyzing & Interpreting
Tools, p. 15

36 Assessment in the SLO Form
Take out the SLO Form and SLO Component Descriptions. Where in the form will you capture information about assessment occurring as part of the SLO Process? Measuring and Scoring - how you will observe and interpret student learning at the end of the instructional interval. Performance Targets/Baseline Data – how you will observe and interpret student learning at the beginning of the instructional interval. Progress Monitoring – how you will observe and interpret student learning during the instructional interval (progress towards the Learning Goal(s)). Results – how student learning results are aggregated into a teacher performance rating.

37 Day Two Agenda
Finalize SLO Learning Goals
Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Baseline Data Sources

38 Data Collection Methods
How we “collect data” determines our assessment method. Use sticky notes to write down all of the strategies you currently use to collect data about student learning. List as many as you can, capturing one per sticky note.

39 Jigsaw Reading: Collecting Data
Select a partner and assign readings (one per person): Evidence of Learning (Davies, 2000) – Tools, p. 21. Assessment, Testing, Measurement and Evaluation (Russell & Airasian, 2012) – Tools, p. 27. As you read, highlight: Assessment Methods: descriptions of different categories of data collection techniques or sources of evidence. Examples of strategies for each data collection method. Share the descriptions and examples with your partner.

40 Data Collection Strategies
Work as a table group. Group your data collection strategies (sticky notes) into the following categories of data collection methods: Observation Questioning Student Products Put similar strategies together.

41 Student Products – additional categories
Not all student products yield the same type of data about learning. Additional “assessment methods” that can be part of student products include: Selected Response Short Constructed Response Extended Constructed Response Performance/Demonstration Portfolio

42 Informal vs. Formal Methods
Informal Assessment Methods: collected in the moment; take less time; may or may not be planned ahead of time; individual, small group, or full class; may or may not result in documented evidence. Methods: Observation and Questioning.
Formal Assessment Methods: structured; take more time; planned in advance; usually full class; result in documented evidence of student learning. Methods: Student Products.

43 Assessment Methods Continuum
From informal to formal (complexity of information and time both increase along the continuum): Observation; Questioning (individual, group, full class); then Student Products: Selected Response, Short Constructed Response, Extended Constructed Response, Demonstration or Performance, Portfolio.
Tools, p. 33

44 Organizing based on Continuum
Sort your “student product” strategy examples into the additional categories of the continuum. You may need to clarify some of your examples. If a strategy doesn’t fit into one of the categories, put it in an “other” category. Turn to the Assessment Methods Continuum (Note Catcher, p. 9-10). Make notes about assessment methods: Clarifications about the category Example strategies

45 Day Two Agenda
Finalize SLO Learning Goals
Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Baseline Data Sources

46 Accuracy → Alignment
Are the data we collect providing information about the learning goals we care about? This is often referred to as "alignment". Alignment includes:
To what degree do the assessment tasks/items include the type of thinking/skills included in the learning goal?
To what degree do the assessment tasks include the knowledge/concepts included in the learning goal?
Are the assessment tasks as cognitively complex (DOK) as the learning goal?

47 Accuracy Starts with the Learning Goal
Accurate assessment depends on knowing the kind of thinking, and the complexity of the thinking, that the learning goal or target asks of students. Clarifying the type of thinking and the cognitive complexity of learning goals/targets helps us select a method of assessment that measures what we're looking for.

48 This means. . . Deconstruct the learning goals/targets (identifying the skills/type of thinking and the content/knowledge). Categorize the type of thinking required by the learning goal/target (using Revised Bloom’s Taxonomy). Establish the cognitive complexity of the learning goal/target (using the Depth of Knowledge Framework). Remember we already did this!

49 Appropriate Assessment Method(s)
Once we are clear on the thinking and depth of knowledge required by a learning goal/target (deconstructing), we can better determine which assessment methods to use to collect data about student learning in relationship to the goal/target. Not every assessment method is equally accurate for assessing every type of goal/target.

50 Aligning Learning Goals and Assessment Methods
Use the blank "Learning Goal to Method Match" table (Note Catcher, p. ). Fill in why you think each cell represents a "match" or not. Compare your completed table to a partner's table. Prepare to share out questions/conflicts.

51 Cognitive Processes vs. Assessment Methods
This table (Tools, p. 35) matches types of learning target (Remember, Understand, Apply, Analyze, Evaluate, Create) against assessment methods (Observation, Questioning, Selected Response, Short Constructed Response, Extended Constructed Response, Demonstration or Performance), noting the best method for each in general. Highlights: observation captures remembering only if student talk is factual; questioning works when the questions target the given process; selected and short constructed response are good for assessing remembered facts; extended constructed response is good for assessing conceptual knowledge but may make specific gaps hard to distinguish; demonstration or performance is too time-consuming for recall; for Create, selected response works only to assess pre-requisite knowledge, and extended constructed response only if what is being created is a written product.

52 Depth of Knowledge vs. Assessment Methods
This table (Tools, p. 36) matches DOK levels (1: Recall and Reproduce; 2: Skills and Concepts; 3: Strategic Thinking/Reasoning; 4: Extended Thinking) against the same assessment methods. Highlights: observation captures DOK 1 only if student talk is factual; questioning works across levels depending on the questions; selected response is good for assessing remembered facts; short constructed response works for reproducing procedures and some skills; demonstration or performance is too time-consuming for recall but suits some skills; several methods become difficult to use at the complexity of DOK 3 and DOK 4.

53 Activity: Practice Matching Learning Goals to Assessment Methods
Work with your content/grade level group. Take out your SLO Learning Goal(s). Use the “Learning Goal and Assessment Methods Match” table (Note Catcher, p. 11). Identify the assessment method(s) you will use for your SLO Learning Goal(s) and explain why.

54 Day Two Agenda
Finalize SLO Learning Goals
Quality Assessment Practice
Using Assessment for SLOs
Data Collection Methods
Goal Method Match
Baseline Data Sources

55 What is baseline data?
Student learning data collected before or at the beginning of the instructional period: measures of student learning that relate to your SLO Learning Goal. Could include:
TCAP results (by student) from last year for current students.
District interim/benchmark assessment results from the beginning of the year.
Results from other district- or school-wide assessments.
Results from classroom assessments.

56 Why analyze baseline data?
Evaluate how much student performance varied at the beginning of the instructional period.
Determine if students can/should be put into more than one group based on their initial performance.
Establish a "baseline" from which student learning growth can be measured for different performance groups.
The purpose is not to establish an initial score for every student.

57 What evidence do you have?
Talk with a partner. . . What sources of evidence (assessment results) are available about student learning in relationship to your SLO Learning Goal from the beginning of the instructional interval? How closely does each evidence source align with the SLO learning goal? How formal was the data collection method? When was data collected? How was it scored? How many evidence sources do you have?

58 Triangulation

59 How much baseline data? Consider all of your evidence sources about student learning in relationship to your SLO Learning Goal collected before or near the beginning of the instructional period. Prioritize them. List your top three evidence sources on the “Baseline Data” chart (Note Catcher, pg. 14). Describe your level of confidence in your top three evidence sources.

60 Analyzing baseline data
For each evidence source (Baseline Data handout):
Determine what scores or metrics the evidence source provides.
Describe the performance of the student population, or the class (e.g. 80% of the students were proficient, 15% were partially proficient, and 5% were unsatisfactory).
Consider the range of student performance (low to high). Is the variability in student performance enough to form more than one group of students based on their performance? If yes, describe the performance of the groups of students (2-4).

61 Combining evidence sources
In your Note Catcher (pg. 14): Identify the number of performance groups you will have. Identify a label for each (e.g. low, medium, high). Describe student performance for each evidence source by performance group. Create a combined description of student performance for each performance group.

62 Assign Students to Performance Groups
The simplest case is one performance group: student performance does not vary, and baseline performance can be characterized for the student population as a whole.
With more than one performance group, assign students (by name) to each group. How will you assign students for whom performance was inconsistent across evidence sources? How will you assign students for whom not all baseline data is available?
Use the Performance Group Descriptions chart (Note Catcher, p. 16).

63 Before we see you again. . . Identify appropriate assessment methods for your SLO Learning Goal example. Bring at least one example instrument (if available) for the content area for your example learning goal(s).

64 Reflect and Consider your Learning
Return to your Progress Monitoring (Note Catcher). Did you move to the right in your self-assessment? Add to your graph. Make any notes about your own learning in the "Reflections" column.

65 Give us Feedback!!
Oral: Share one "ah-ha!"
Written: Use sticky notes.
+ Aspects of this session that you liked or that worked for you.
Things you will change in your practice, or that you would change about this session.
? Questions that you still have or things we didn't get to today.
Ideas, ah-has, innovations.
Leave your written feedback on the parking lot.

