1 Deconstructing Standard 2a Dr. Julie Reffel Valdosta State University

2 Element 2a. Assessment System
Element 2a focuses on the design, development, implementation, and evaluation of the unit assessment system. The unit collaborates with members of its professional community in designing and managing an assessment system with multiple, comprehensive measures to monitor candidate performance, program quality, and unit operations. The assessment system should reflect the conceptual framework and professional and state standards. The system is regularly evaluated by the professional community to ensure the fairness, accuracy, consistency, and freedom from bias of its assessment procedures and unit operations.

3 Key criteria for meeting the expectations of Element 2a
The unit's assessment system:
Reflects the conceptual framework
Identifies comprehensive/integrated measures
Monitors candidate performance and unit operations
Includes multiple assessments and transition points
Includes fair, accurate, consistent assessments with efforts to eliminate bias (including unit operations)
Involves the professional education community in the design, development, and evaluation of the assessment system

4 Sub-elements of Standard 2a (1)
The professional education unit has an assessment system that reflects the conceptual framework and professional and state standards and is regularly evaluated by its professional community. 8 AFIs (Areas for Improvement) were cited from

5 Sub-elements of Standard 2a (2)
The professional education unit’s system includes a comprehensive and integrated set of assessment and evaluation measures to monitor candidate performance and manage and improve the professional education unit’s operations and preparation programs. 4 AFIs were cited from

6 Sub-elements of Standard 2a (3)
Decisions about candidate performance are based on multiple assessments made at admission into preparation programs, appropriate transition points, and preparation program completion. 0 AFIs were cited from

7 Sub-elements of Standard 2a (4)
The professional education unit has taken effective steps to eliminate bias in assessments and is working to establish the fairness, accuracy, and consistency of its assessment procedures and professional education unit operations. 5 AFIs were cited from

8 Scenario One Assessments are conducted by various assessors (site supervising teachers and university site supervisors). Full-time and part-time faculty who are responsible for completing the assessment instruments meet each semester to discuss instrument ratings, expectations for instrument use, and the rating scales. The supervising teachers are invited to meet each semester with the division chair to discuss instrument use. The unit has not yet adopted a process for determining the accuracy and consistency in use of the instruments it currently employs. While there is discussion about what the indicators mean, there have not been any inter-rater studies or other activities to confirm consistency in use and accuracy of data. One example of an effort to ensure fairness: in the event of a low rating (3 or below on a 5-point scale), the DSC supervisor may request that another supervisor conduct an independent review to affirm or disaffirm the original ratings.
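The inter-rater studies the scenario says are missing can be illustrated with a small sketch. Below is a minimal example, using hypothetical ratings and plain Python (none of it reflects the unit's actual instruments or data), of how two assessors' scores for the same candidates might be compared using percent agreement and Cohen's kappa to gauge consistency in instrument use.

```python
# Minimal sketch: comparing two assessors' ratings of the same candidates.
# The ratings and the 5-point scale below are hypothetical illustrations.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of candidates on whom both raters gave the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for the level expected by chance."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement if the two raters scored independently.
    expected = sum(
        (counts_a[score] / n) * (counts_b[score] / n)
        for score in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical ratings on a 5-point scale for ten candidates.
supervising_teacher = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
university_supervisor = [4, 4, 3, 4, 3, 5, 4, 3, 5, 5]

print("Percent agreement:", percent_agreement(supervising_teacher, university_supervisor))
print("Cohen's kappa:", round(cohens_kappa(supervising_teacher, university_supervisor), 2))
```

A unit could run a check like this each semester on a shared set of candidate work to document consistency, rather than relying only on discussion of what the indicators mean.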

9 Findings (Scenario One)
Unacceptable, Acceptable, or Target? Acceptable with an Area for Improvement: The unit has not developed a systematic method for assuring that assessments are fair, consistent, and free of bias. Rationale: Although in some situations varied scorers or a second scorer may be used for specific assessments, there have been limited attempts to ensure reliability across all assessments. Additionally, assessments have not been examined for bias.

10 Scenario Two The application of assessments at multiple transition points provides reliable data about candidates. Data are collected regularly through an electronic system for ease in reporting aggregate and individual candidate results. Faculty members engage in discussion of findings from candidate assessments and surveys, but the process of using data to inform decisions about candidates and program delivery is not systematic or planned.
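As an illustration of the kind of reporting described here, the sketch below (hypothetical field names and records, not the unit's actual system) shows aggregate results by transition point alongside an individual candidate's results.

```python
# Minimal sketch: reporting aggregate and individual candidate results.
# Field names and records are hypothetical.
import pandas as pd

records = pd.DataFrame([
    {"candidate": "A01", "transition_point": "admission",  "score": 4},
    {"candidate": "A02", "transition_point": "admission",  "score": 3},
    {"candidate": "A01", "transition_point": "completion", "score": 5},
    {"candidate": "A02", "transition_point": "completion", "score": 4},
])

# Aggregate report: mean score and number of candidates at each transition point.
print(records.groupby("transition_point")["score"].agg(["mean", "count"]))

# Individual report: one candidate's scores across transition points.
print(records[records["candidate"] == "A01"])
```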

11 Findings (Scenario Two)
Unacceptable, Acceptable, or Target? Acceptable with an Area for Improvement: The program does not have a systematic, planned process for evaluating assessment results, courses, programs, and clinical experiences. Rationale: Faculty report a number of examples where results from candidate assessments have informed their decisions about program matters; however, actual analysis of data to determine implications for programmatic improvement occurs irregularly.

12 Scenario Three The unit has identified key assessments to evaluate both unit operations and program quality. The unit plans to use the data from these assessments to inform program and unit improvement. Candidate performance and progress are the focus at the program level, and unit operations and program quality are the focus at the unit level. While the unit has identified these key assessments, with frequency indicators for collection of data, there is no articulated plan at this time for how the assessments will be collected, analyzed, used, and shared in a systematic manner for program quality and unit operations.

13 Findings (Scenario Three)
Unacceptable, Acceptable, or Target? Acceptable with an Area for Improvement: The professional education unit has not fully operationalized the information technologies it has identified to maintain its assessment system. Rationale: Even though the unit has identified multiple information technology systems (BANNER, LiveText, and Access), there is not a well-articulated, comprehensive, integrative mechanism for utilizing those systems to collect, analyze, and evaluate data. At the time of the on-site visit, the unit provided an updated data collection schedule and timeline that includes the assessment data to be collected within each information technology system; however, the system has not been fully integrated or operationalized.

14 Scenario Four The assessment system was reviewed to ensure it allowed for effective and efficient student transitions through the program. The evaluation system was adjusted to meet the new timelines for students, but the key transition points remained. Changes were approved by the faculty and taken to the Advisory Council. The unit stated that training is provided to clinical faculty and school personnel in the use of evaluation tools and rubrics; however, there was not consistent evidence in the exhibits or in interviews to validate this statement. While the assessment tools were very consistent in structure around the five domains and in the use of a standardized scoring scale, evidence was lacking that training in the use of scoring guides occurred to ensure assessment procedures were fair, accurate, and consistent.

15 Findings (Scenario Four)
Unacceptable, Acceptable, or Target? Unacceptable at the advanced level

16 Rubric for Evaluating Evidence, pg. 1
Characteristic 1
Unacceptable: The unit has not involved its professional community in the development of its assessment system.
Acceptable: The unit has an assessment system that reflects the conceptual framework and professional and state standards and is regularly evaluated by its professional community.
Target: The unit, with the involvement of its professional community, is regularly evaluating the capacity and effectiveness of its assessment system, which reflects the conceptual framework and incorporates candidate proficiencies outlined in professional and state standards.

Characteristic 2
Unacceptable: The unit's assessment system is limited in its capacity to monitor candidate performance, unit operations, and programs.
Acceptable: The unit's system includes comprehensive and integrated assessment and evaluation measures to monitor candidate performance and manage and improve the unit's operations and programs.
Target: The unit regularly examines the validity and utility of the data produced through assessments and makes modifications to keep abreast of changes in assessment technology and in professional standards.

17 Rubric for Evaluating Evidence, pg. 2
Characteristic 3
Unacceptable: The assessment system does not reflect professional, state, and institutional standards.
Acceptable: Faculty have access to candidate assessment data and/or to data collection system information.
Target: Faculty have access to and are trained to understand candidate assessment data and/or the data collection system.

Characteristic 4
Unacceptable: Candidate assessment data are not shared with candidates or faculty to help them reflect on and improve their performance and preparation programs.
Acceptable: Candidate assessment data are shared regularly with candidates and faculty to help them reflect on and improve their performance and preparation programs.
Target: Candidate assessment data are regularly aggregated and disaggregated by the unit to help faculty and candidates reflect on and improve their performance and preparation programs.

18 AFIs related to the development of the Assessment System
The unit assessment system is not aligned with the unit’s conceptual framework.
The assessment system has not been developed in collaboration with the professional community.
The unit has not implemented procedures to ensure fairness, accuracy, and consistency in the assessment of candidate performance.

19 AFIs related to the development of the Assessment System (continued)
The assessment system does not indicate how data will be regularly analyzed to improve candidate performance, program quality, and unit operations.
The unit assessment system does not clearly address candidate outcomes identified in the conceptual framework.
Assessments of candidates’ knowledge, skills, and dispositions in courses are not clearly and consistently linked to candidate performances described in the conceptual framework.

20 Thank You!

