Interpreting and Using the ACT Aspire Interim Results


1 Interpreting and Using the ACT Aspire Interim Results
Dr. Jorge Peña, Director of School Improvement and Accreditation, Archdiocese of Chicago, November 2017

2 Workshop Objectives Objective 1- Build assessment literacy: understand the skills the Interim assessments measure Objective 2- Build assessment literacy: understand how results are reported Objective 3- Use results to make adjustments to instruction

3 ACT Aspire Periodic Technical Manual
Build Assessment Literacy: Understand the Skills the Assessment Measures. Scavenger Hunt: ACT Aspire Periodic Technical Manual

4 Debrief Scavenger Hunt
What skills do the Interim tests measure? What did we learn about the complexity and rigor of the tasks the Interim tests measure?

5 What skills do the Interim tests measure?
Reading: Key Ideas and Details, Craft and Structure – Inference and Knowledge Integration
Math: Geometry, Number and Operations in Base 10, Operations and Algebraic Thinking, Grade Level Progress (17-20), Number and Operations – Fractions, Modeling, Integrating Essential Skills, Measurement and Data, and Foundations – questions from the year before (8-10)
Science: Interpretation of Data, Evaluation of Models, Inferences, and Experimental Results, Scientific Investigation

6 What did we learn about the complexity and rigor of the tasks the Interim tests measure?
DOK (Depth of Knowledge) DOK 1 – Low on Bloom's Taxonomy (Remembering and Understanding) DOK 2 – Mid Bloom's Taxonomy (Applying and Analyzing) DOK 3 – High Bloom's Taxonomy (Evaluating and Creating)

7 Depth of Knowledge: Reading Interim
DOK 1 – 20-35% (at most 4 questions) DOK 2 – 40-70% DOK 3 – 40-70%

8 Depth of Knowledge: Math Interim
DOK 1 – 12-21% DOK 2 – 44-51% DOK 3 – 32-40%

9 Depth of Knowledge: Science Interim
DOK 1 – 5-20% DOK 2 – 43-62% DOK 3 – 20-33%
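These blueprint ranges can also serve as a benchmark for the complexity of locally assigned work. As a rough illustration, the Python sketch below tallies the DOK levels of a hypothetical set of items and checks whether the mix falls inside the Math Interim ranges quoted above; the item ratings are made up for the example, and the Reading or Science ranges can be substituted as needed.

```python
# Sketch: tally DOK levels for a set of items (hypothetical ratings) and
# compare the mix to the Math Interim target ranges from the slide above.
from collections import Counter

item_dok_levels = [1, 2, 2, 3, 2, 1, 2, 3, 2, 3, 1, 2]  # hypothetical item ratings
target_ranges = {1: (12, 21), 2: (44, 51), 3: (32, 40)}  # percent of items, Math Interim

counts = Counter(item_dok_levels)
total = len(item_dok_levels)

for dok in (1, 2, 3):
    pct = 100 * counts.get(dok, 0) / total
    low, high = target_ranges[dok]
    status = "within range" if low <= pct <= high else "outside range"
    print(f"DOK {dok}: {pct:.0f}% of items (target {low}-{high}%) -> {status}")
```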

10 Workshop Objectives Objective 1- Build assessment literacy: understand the skills the Interim assessments measure Objective 2- Build assessment literacy: understand how results are reported Objective 3- Use results to make adjustments to instruction

11 Ladder of Inference I observe objectively Helps stay focused on evidence

12 Ladder of Inference I select data from what I observe & I add meaning to what I have selected I observe objectively Helps stay focused on evidence

13 I make assumptions based on the meaning I have added
Ladder of Inference I make assumptions based on the meaning I have added I select data from what I observe & I add meaning to what I have selected I observe objectively Helps stay focused on evidence

14 I draw conclusions which prompt feelings
Ladder of Inference I draw conclusions which prompt feelings I make assumptions based on the meaning I have added I select data from what I observe & I add meaning to what I have selected I observe objectively Helps stay focused on evidence

15 I take action based on my beliefs and feelings
Ladder of Inference I take action based on my beliefs and feelings I draw conclusions which prompt feelings I make assumptions based on the meaning I have added I select data from what I observe & I add meaning to what I have selected I observe objectively Helps stay focused on evidence

16 Ladder of Inference Climbing too quickly up the ladder
I Select: some data - ACT Aspire Interim results - Conversations with teachers

17 Ladder of Inference Climbing too quickly up the ladder
I add: interpretation - Students are doing poorly in reading I Select: some data - Quarterly reading benchmarks - Conversations with teachers

18 Ladder of Inference Climbing too quickly up the ladder
I draw: conclusions - Curriculum is not adequate I add: interpretation - Students are doing poorly in reading I Select: some data - Quarterly reading benchmarks - Conversations with teachers

19 Ladder of Inference Climbing too quickly up the ladder I take: actions
-We need a new reading curriculum I draw: conclusions - Curriculum is not adequate I add: interpretation - Students are doing poorly in reading I Select: some data - Quarterly reading benchmarks - Conversations with teachers

20 Ladder of Inference Disciplined Approach I Select: some data
- ACT Aspire Interim results - Quizlets - Teachers’ notes from guided reading groups

21 Ladder of Inference I add: interpretation
Disciplined Approach I add: interpretation - Analysis of Quizlets and Interim results indicates that many student errors are due to guessing at words I Select: some data - ACT Aspire Interim results - Quizlets - Teachers’ notes from guided reading groups

22 Ladder of Inference I draw: conclusions
Disciplined Approach I draw: conclusions - Students are not regularly applying word-level decoding skills I add: interpretation - Analysis of Quizlets and Interim results indicates that many student errors are due to guessing at words I Select: some data - ACT Aspire Interim results - Quizlets - Teachers’ notes from guided reading groups

23 Ladder of Inference I take: actions
Disciplined Approach I take: actions - Several possible actions (analyze the complexity of tasks, re-teach the skill), but actions should be tied specifically to the multiple sources of data. I draw: conclusions - Students are not regularly applying word-level decoding skills I add: interpretation - Analysis of Quizlets and Interim results indicates that many student errors are due to guessing at words I Select: some data - ACT Aspire Interim results - Quizlets - Teachers’ notes from guided reading groups

24

25

26 Report: Subject Proficiency by Group 30,000 feet altitude view

27 Understand the Interim to Summative Concordance Tables

28 Report: Subject Proficiency by Group 30,000 feet altitude view

29 Report: Subject Proficiency by Group 30,000 feet altitude view

30 Report: Subject Proficiency by Student 15,000 feet altitude view

31 Report: Student Performance 10,000 feet altitude view

32 Report: Skills Proficiency by Student 5,000 feet altitude view

33 Report: Response/Content Analysis Plane has landed

34 Report: Response/Content Analysis

35 Report: Response/Content Analysis

36 Ensure you are at the “school” level

37 Interim Reports- Educator

38 Workshop Objectives Objective 1- Build assessment literacy: understand the skills the Interim assessments measure Objective 2- Build assessment literacy: understand how results are reported Objective 3- Use results to make adjustments to instruction

39 Adjusting Instruction and Improving Student Learning
There are only three ways to improve student learning. The first is to increase the level of knowledge and skill the teacher brings to the instructional process. The second is to increase the level of complexity of the content that students are asked to learn. And the third is to change the role of the student in the instructional process. That’s it. If you are not doing these three things, you are not improving instruction and learning. Source: Instructional Rounds in Education (2009), p. 24 Table Talk: What resonates with you about this quote?

40 The teacher and the student in the face of the content.
The relationship between the teacher, the student, and the content – not the qualities of any one of them by itself. Increases in student learning occur only as a consequence of improvements in the level of content, teachers’ knowledge and skill, and student engagement. If you change any single element of the instructional core, you have to change the other two.

41 Tasks predict student performance
Think of the task as the “ceiling” of what we would expect students to know Tasks with low cognitive demands generate low cognitive student responses Tasks with high cognitive demands generate high cognitive student responses Tasks are high leverage because they predict student performance Source: University of Chicago’s Consortium on School Research

42 Harvard Graduate School of Education researchers analyzed tasks for each grade level
X-axis is the grade level, 0 is Kindergarten Y-axis is the average assignment rating for the tasks assigned to students

We expect students to be assigned tasks that match the grade level; in other words, grade 5 students are assigned tasks for grade 5

44 The orange line shows ratings for the task
The orange line shows ratings for the task. In grade 5, the average task was rated a 4.34, which corresponds to grade 4, third month. What do you notice about the average assignment rating across the grade levels?

45 I notice that primary grade tasks match the grade level

I also notice that gaps in grade-level task ratings emerge in grade 4 through grade 8. As we know, students are taught how to read in grades K through 3; then instruction shifts in grade 4 to reading to learn.

I also notice that when students enter high school, the grade-level task rating increases by a year and a half

48 Consider Levels of Complexity: Higher Order Thinking Skills

49 Consider Levels of Complexity: Bloom’s Taxonomy

50 Consider Levels of Complexity: Webb’s Depth of Knowledge

Whole-group or small-group instruction: What percent of students should demonstrate proficiency on a skill in order to decide between continuing whole-group instruction and providing small-group instruction? The optimal proficiency threshold for adjusting to small-group instruction is 66%.
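One way to apply this rule is to compute the proficiency rate for each skill and flag anything under the threshold. The sketch below is a minimal illustration with hypothetical skill names and counts; a skill on which fewer than 66% of students demonstrated proficiency is flagged for small-group re-teaching.

```python
# Sketch of the 66% decision rule with hypothetical per-skill results:
# skill name -> (students proficient, students assessed).
THRESHOLD = 66  # percent proficient

skill_results = {
    "Key Ideas and Details": (18, 25),
    "Craft and Structure": (14, 25),
    "Integration of Knowledge": (22, 25),
}

for skill, (proficient, assessed) in skill_results.items():
    rate = 100 * proficient / assessed
    plan = "continue whole-group instruction" if rate >= THRESHOLD else "provide small-group re-teaching"
    print(f"{skill}: {rate:.0f}% proficient -> {plan}")
```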

52 What is the success rate?
The optimal success rate for fostering student achievement is 80%. A success rate of 80% shows that students are learning the material while still being challenged.

53 Questions to answer when analyzing a task
What is the actual work that students are being asked to do? What do you have to know in order to engage the task? What is the actual product of the task? What is the distribution of performance among students in the class on the task? If you were a student and did the task, what would you know and be able to do? At grade-level or faculty meetings, analyze tasks for the skills on the CSIP.

54 Questions to answer when analyzing a task
What is the actual work that students are being asked to do? What do you have to know in order to engage the task? What is the actual product of the task? What is the distribution of performance among students in the class on the task? If you were a student and did the task, what would you know and be able to do?

55 Analyze an ACT Aspire Interim test item
Use the Response/Content Analysis Report Identify a skill with a proficiency rate less than 66% Analyze a test item for that skill by answering the five questions: What is the actual work that students are being asked to do? What do you have to know in order to engage the task? What is the actual product of the task? What is the distribution of performance among students in the class on the task? If you were a student and did the task, what would you know and be able to do?
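If the Response/Content Analysis results can be exported to a spreadsheet, a short script can surface the skills that fall below the 66% threshold before the item analysis begins. This is a sketch only: the file name and column headers ("Skill", "PercentProficient") are assumptions, so adjust them to match the actual export layout.

```python
# Sketch: list skills below the 66% proficiency threshold from a hypothetical
# CSV export of the Response/Content Analysis report. Column names are assumed.
import csv

THRESHOLD = 66.0

with open("response_content_analysis.csv", newline="") as f:
    rows = list(csv.DictReader(f))

below = [
    (row["Skill"], float(row["PercentProficient"]))
    for row in rows
    if float(row["PercentProficient"]) < THRESHOLD
]

# Lowest-performing skills first; each is a candidate for the five-question item analysis above.
for skill, pct in sorted(below, key=lambda item: item[1]):
    print(f"{skill}: {pct:.0f}% proficient")
```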

Review Next Steps: Bring training back to the faculty at school. Build assessment literacy: identify the skills the ACT Aspire Interim assessments measure. Build assessment literacy: understand how results are reported. Collaborate with teachers to analyze ACT Aspire Interim results. Compare/contrast the complexity of tasks between the ACT Aspire Interim test items and student tasks assigned by teachers. Identify the skills that require small-group instruction for re-teaching. Re-teach the skills that have a proficiency rate less than 66%. Document and showcase the improvement journey: meeting agendas with notes, pictures, videos. Direct questions to Dr. Jorge Peña.

