Local Assessment Validity Study Rayne A. Sperling Jonna M. Kulikowich Penn State Research Team
October 17, 2008
Penn State Research Team
Rayne A. Sperling, Associate Professor of Education; Jonna M. Kulikowich, Professor of Education; Crystal Ramsay, Graduate Research Assistant; Toni Betaudier, Graduate Research Assistant; Kelli Higley, Psychometrics and Assessment Assistant; Whitney Zimmerman, Assessment Assistant
Expert Advisory Board
Julie Coiro, Assistant Professor, University of Rhode Island; Kim Gattis, American Institutes for Research; Nell Sedransk, Associate Director, National Institute of Statistical Sciences
Regulation Guiding the Research
"Students shall demonstrate proficiency in reading, writing and mathematics on either the State assessments administered in grade 11 or 12 or local assessment aligned with academic standards and State assessments under § 4.52 (relating to local assessment system) at the proficient level or better to graduate."
Initial Questions
For those students who failed to reach proficiency on the PSSA during the 11th grade administration or the 12th grade retake administration, what are the local assessments used to measure proficiency?
– Description of assessments used
– Greater understanding of the alignment of local assessments to proficiency statements
– Greater understanding of district practices
Collecting Local Assessments
– Two requests by PDE (7/28/08, 8/12/08)
– Materials and practices submitted by October 7, 2008 included in the study
– Date of receipt stamped on submissions
– Documents boxed and shipped to Penn State; 100% agreement between documents received by PDE and those examined by Penn State
– Documents stored by randomly-generated ID codes in the Penn State research suite
Database (~85% return rate)
– Set-up and analysis plan
– School district information (non-exhaustive): school & school district; location of district; size of district; percent proficient on PSSA; student characteristics
– Local assessment information: alignment
– Local assessment information: practices
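The district-level fields listed above can be sketched as a simple record type. This is a minimal illustration only: the slides do not specify a schema, so every field name below is a hypothetical stand-in for whatever the research team actually stored.

```python
from dataclasses import dataclass, field

@dataclass
class DistrictRecord:
    """One row in the study database (illustrative fields only)."""
    district_id: str                 # randomly-generated ID code, not the district name
    location: str                    # e.g., rural / suburban / urban
    size: int                        # district enrollment
    pct_proficient_pssa: float       # percent proficient on the PSSA
    alignment_ratings: dict = field(default_factory=dict)   # local assessment: alignment
    practices: list = field(default_factory=list)           # local assessment: practices

# Hypothetical example record
record = DistrictRecord("R-1042", "rural", 1200, 61.5)
```

Storing submissions under randomly generated IDs, as the slides describe, keeps the analysis file free of identifying district names.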
February 2009
Purpose
Local Assessment Information – Nature of the materials (non-exhaustive):
– Commercially-published materials: tests, remedial programs
– Intermediate Unit assessments
– District-created tests
– Teacher-created tests
Phase I
Local Assessment Information – Nature of the materials (cont.)
– Item and task types: multiple-choice items, matching, essays, performance tasks, portfolios
– Courses and curriculum
– Preparation programs and tutoring programs
Phase I
Local Assessment Information – Nature of the district practices (non-exhaustive). Some districts:
– Enroll students in a course.
– Enroll students in a tutoring program.
– Use a senior project to demonstrate proficiency.
– Use course enrollment and final grade to demonstrate proficiency.
Phase I
Local Assessment Information – Nature of the district practices (cont.). Some districts:
– Report they are starting to work on a PSSA practice.
– Use individualized computer-based assessment plans.
– Retest students on tests and/or individual anchors.
– Do NOT include PSSA proficiency as part of graduation requirements.
Phase I
Research Goals for Phase I
– Catalogue and describe local assessments submitted
– Describe reported practices related to PSSA proficiency as a graduation requirement
Phase I
Do the local assessments align with PSSA academic standards in Mathematics and in Reading?
– Convene Expert Panels (October 27-29, 2008)
– Panel teams for Mathematics
– Panel teams for Reading
– Rubric for alignment to standards (4-point scale)
– Rubric for evaluation of appropriateness of practices as a measure of proficiency (4-point scale)
Phase I
Standards-alignment rubric (to be informed through the expert panels), keyed to the proficiency columns of the PSSA standards:
– 0 = No content areas represented; no alignment of outcomes to standards.
– 1 = Some content areas represented; some outcomes are aligned.
– 2 = Many to most content areas represented; most outcomes are aligned.
– 3 = All content areas represented; all outcomes are aligned.
Phase I
District practices rubric (to be informed through the expert panels):
– 0 = No, this type of course/curriculum is not suitable to evaluate proficiency.
– 1 = The sequence of practices is not clear. Some tests used could become part of an assessment system that can evaluate proficiency; however, more information is needed.
– 2 = There are some very good elements in the sequence of practices, but improvement is still possible in developing an assessment system that can evaluate proficiency.
– 3 = Yes, the sequence of steps or procedures used in the local assessment system is clear, and the local assessment can be used to evaluate the proficiency of student performance.
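Once panel teams assign scores on these 0-3 rubrics, one routine first check (not described in the slides, but standard practice for multi-rater rubric scoring) is inter-rater agreement. A minimal Python sketch with entirely hypothetical scores from two hypothetical reviewers:

```python
# Hypothetical panel ratings on a 0-3 rubric, one score per
# submission from each of two reviewers (illustrative data only).
rater_a = [3, 2, 2, 0, 1, 3, 2, 1]
rater_b = [3, 2, 1, 0, 1, 3, 2, 2]

# Exact agreement: fraction of submissions scored identically.
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Adjacent agreement: scores within one rubric point of each other,
# a common secondary criterion for 4-point rubric scales.
adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / len(rater_a)
```

With these made-up scores, exact agreement is 0.75 and adjacent agreement is 1.0; in practice a chance-corrected index such as Cohen's kappa would also be reported.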
Phase I
Report (due date of preliminary draft: Nov. 10). Content (non-exhaustive):
– Frequency/percentage tables
– Histograms
– Cross-tabulation tables
– Sequence-of-practice charts
– Statistical analyses: descriptive summaries, rank-correlation coefficients, inferential statistics as appropriate
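The rank-correlation coefficients listed above can be computed without external libraries. The sketch below implements Spearman's rho (Pearson correlation on average ranks, which handles tied scores such as 0-3 rubric ratings); the pairing of alignment ratings with PSSA percent-proficient is a hypothetical example of how such a coefficient might be used here, and the data are invented.

```python
def ranks(values):
    """Assign 1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend over a run of tied values
        avg = (i + j) / 2 + 1           # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: expert alignment ratings (0-3 rubric) and district
# percent-proficient on the PSSA, one pair per district.
alignment = [3, 1, 2, 0, 2, 3, 1]
pct_proficient = [82, 55, 70, 48, 66, 90, 60]
rho = spearman(alignment, pct_proficient)
```

Because rubric scores are ordinal, a rank-based coefficient like this is a more defensible choice than Pearson's r on the raw scores.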
Phase II
Purposive sample of districts
– Representing district characteristics
– Representing assessment types and practices
Case study design (n ≈ 20 schools; final list TBD)
Phase II
Site visitation and focused data collection by Penn State Research Team members
– Interviews with stakeholders
– Collection of artifacts to inform assessment characteristics and use practices: course and curriculum materials; assessments; criteria used to establish proficiency; scoring keys and rubrics; student samples with assigned scores/ratings
Phase II
Site visitation and focused data collection by Penn State Research Team members (cont.)
– Grounded Theory research design to evaluate interview protocols
– Use of descriptive and inferential statistics to determine if proficiency criteria are valid