Overview of ODRC’s ORAS Quality Assurance Program


1 Overview of ODRC’s ORAS Quality Assurance Program
Accurate Assessments: Overview of ODRC's ORAS Quality Assurance Program

2 Overview
What is Quality Assurance?
ORAS File Review
Development of the QA Program
Results of the QA Program
Recommendations

A binder containing some of the QA materials and reports is being passed around the room. Please be sure to return it to the front of the room before you leave.

3 What is Quality Assurance?
A planned and systematic pattern of actions necessary to provide adequate confidence that the product optimally fulfills customer expectations.
Proactive
Improve
Confidence
Credibility

(Click for graphics to appear.) A good QA system should be:
Proactive: activities done before the product is delivered, to assure customers of good quality. The process should help us get better at making our product.
Improve: improves work processes and efficiency.
Confidence: increases confidence in the product.
Credibility: increases credibility in our business.

4 Why quality assurance? Create a culture of quality
Good fidelity = replication of results
Mission: "reduce recidivism"

Quality Assurance Video (click the clip art target icon): demonstrates the necessity of quality in any profession. Corrections in Ohio is a multi-million-dollar industry and, like other industries, when we fail there is a cost and the potential for loss of life. Like the examples in the video, if corrections agencies do not conduct effective quality assurance, we will fail our mission.
1. Create a culture of quality throughout the organization. Everyone needs to be invested. (Click to make text appear.) Discuss DRC culture issues before and after the QA process. Before: strong commitment by leadership; end users resistant ("big brother watching"); perceived "gotcha"; POs not used to being observed and evaluated on skills. After: POs more comfortable after experiencing the process; skill assessment and individualized assistance to make improvements; the strict training approach was well received by skeptical staff.
2. Implementing the assessment process as recommended by UC is vital if we want to replicate the results of their study. An accurate assessment is the first step in reducing recidivism. Often, convenience wins out over conscience: staff are concerned with volume and deadlines, and so lose fidelity in the assessment process, negatively impacting results. QA encourages fidelity.
3. Per UC, correctional services and interventions can be effective in reducing recidivism; the most effective are based on the principles of effective intervention: the risk principle (who), the need principle (what), treatment (how), and program integrity (how well). ORAS sets up the risk/need (who/what).
***Stay focused on desired outcomes, always mindful of the overall mission. What is the first step in effectively reducing recidivism? An accurate assessment.

5 Supporting ORAS Quality
FAQ
Flow charts
Internal audit standards
Quality Assurance Reports
Case file reviews
Quality Assurance Program

1. The FAQ and process flow charts were created to assist users in implementing ORAS. (Open the FAQ link and show where to access it.)
2. Internal audit standards were created for all areas of DRC using ORAS (prisons, APA, Parole Board). Audit standards focus on compliance with policy, business rules, and/or contract requirements. (Open the internal audit standards link to show.)
3. QA reports were requested that sort data by tool, agency, unit/dorm, or individual. Reports cover override information and workflow information (who did assessments, reassessments due, distribution of risk, distribution of need, gap analysis, etc.); ad hoc reports will allow you to build a customized report. The reports have been in the test site; testing identified some deficiencies, and the vendor is working to resolve those issues. Expect the QA reports in late summer. (Open QA Reports to explain where to find them.) A sketch of this kind of report follows this list.
4. Case file review: supervisors of end users check 12 areas of ORAS.
5. A QA Program was created that incorporates a direct observation of the skills needed to complete accurate assessments.
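While the vendor reports are still in testing, the same aggregations can be prototyped locally. A minimal sketch, assuming assessment records can be exported with unit and override columns; the column names and data below are hypothetical, not the vendor's schema:

```python
import pandas as pd

# Hypothetical export of assessment records: one row per completed ORAS
# assessment, flagging whether the scored risk level was overridden.
assessments = pd.DataFrame({
    "unit":     ["A", "A", "B", "B", "B"],
    "tool":     ["CST", "CST", "CST", "PAT", "CST"],
    "override": [False, True, False, False, True],
})

# Override information by unit, one of the report types listed above.
report = assessments.groupby("unit").agg(
    total=("tool", "size"),
    overrides=("override", "sum"),
)
report["override_rate"] = report["overrides"] / report["total"]
print(report)
```

The same groupby pattern extends to the other listed reports (distribution of risk, reassessments due) by swapping the grouped and aggregated columns.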

6 Dirty Dozen of ORAS QA
Keep these 12 things clean for good assessments.

Show the link to the "Dirty Dozen of ORAS QA" form (clip art icon) and explain the form. The form may be laminated and placed on every supervisor's desk (similar to the RIB card; show it). Supervisors/analysts will be trained to conduct the QA and provided forms to track results.
1. Disproportionate distribution of risk. UC baseline for the CST: 15% low, 40% moderate, 37% high, 8% very high. APA numbers show approximately 70% at the low risk level, a potential red flag.
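The baseline comparison above lends itself to a simple automated check. A minimal sketch, assuming counts by risk level can be pulled per agency or unit; the threshold and the example counts are illustrative, not ODRC values:

```python
# UC validation baseline for the ORAS-CST, as cited on the slide.
UC_BASELINE = {"low": 0.15, "moderate": 0.40, "high": 0.37, "very high": 0.08}

def drift_flags(counts: dict, threshold: float = 0.15) -> list:
    """Return risk levels whose observed share differs from the UC
    baseline by more than `threshold` (an arbitrary illustrative cutoff)."""
    total = sum(counts.values())
    flags = []
    for level, expected in UC_BASELINE.items():
        observed = counts.get(level, 0) / total
        if abs(observed - expected) > threshold:
            flags.append(f"{level}: observed {observed:.0%} vs baseline {expected:.0%}")
    return flags

# Hypothetical counts echoing the red flag on the slide (~70% low risk).
print(drift_flags({"low": 700, "moderate": 200, "high": 80, "very high": 20}))
```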

7 Keeping the Dirty Dozen Clean
Duplicates
Sources used
Verifying collateral information
Notes

Open each of the four links and discuss:
1. Duplicates: be sure we are not creating duplicates in the system (17,000 duplicates currently). The ISR is also available on this screen.
2. Sources used: everything available should be checked along with the offender interview.
3. Collateral information: access investigations via the gateway.
4. Notes: help in QA efforts and add confidence when assessments are being shared across agencies.
File review is a quick check on quality. It is NOT proactive; it is a cursory attempt to catch any obvious problems. The best gauge of quality is a direct observation of the assessment process. Thus, DRC created a QA Program.

8 Building a QA Program
Pilot site evaluations
Benchmarking
Literature review
University of Cincinnati

Benchmarking:
Other states: Ray Wahl, Deputy Court Administrator, Utah State Courts; Jerry Bartruff, Iowa Department of Corrections; Frederick Klunk, PA Board of Probation & Parole.
Other agencies: Ramsey County LSI-R evaluation; DYS; Federal Probation Department.
Literature review:
"Effectiveness of cognitive behavioral treatment for adult offenders: A methodological, quality-based review" (MacKenzie & Hickman)
"Evaluation of the Reasoning & Rehabilitation Cognitive Skills Development Program as Implemented in Juvenile ISP in Colorado" (Pullen)
"Cognitive Behavioral Interventions: Process Evaluation Report," North Carolina Department of Correction (Price)
The Carey Group (Mark Carey)
University of Cincinnati: review and revisions.

9 Building Blocks of a QA Program
Goal = accurate assessments
What contributes to the goal? Fidelity to the assessment process; good interview skills
How will the QA program be delivered? Create measures/tools; timelines
Implement the QA program
Analyze and share results
Evaluate the program and make adjustments

1. DRC reached consensus on the goal and the activities that contribute most to it: fidelity to the assessment process and good interview skills.
2. Fidelity: some staff looked for convenience in the process: telephone interviews, file-review assessments (missing offender participation completely), not using an interview guide, not using the scoring guide, allowing an offender to fill out the tool (instead of the self-report form) and then discussing it during the interview, etc. Many of these shortcuts will not show up in data collection or reports; you only find them via direct observation. Can staff demonstrate the process correctly when watched? Good interview skills also require observation. Choose between in-person observation, video, or audio tapes; the best results come through direct observation of the interview.
3. The process required a timeline, established with consideration of resources, when implementation would be effective for each user group, etc. APA: last quarter of …; Parole Board and CC: second quarter of …; prisons: last quarter of 2012.

10 Program Delivery
QA Coordinator
Random selection of staff
Scheduling
Communication
Work flow
Selection of evaluators

Consider how the program will be delivered, step by step, along with the potential concerns of evaluators, end users, and managers.
1. Coordinator: oversees the process. Data cleaning, resolving scheduling issues, monthly check-ins with the evaluators, handling switches to ensure objective random selection of staff, and responding to and resolving problems and concerns. Maintains the database tracking information (which evaluations are left, etc.). Serves as the resource person and ensures communication. ***Discuss the "I'll take your access" example.
2. Random selection: two officers from every unit are selected by the QA Coordinator, objectively and at random (see the sketch after this list). This lets end users feel more comfortable, knowing there wasn't a "reason" they were selected.
3. Scheduling: the logistics of how to schedule observations. Ensure minimal disruption to everyone's work while maintaining equal representation of the staff.
4. Communication: needs to be constant.
5. Work flow: what needs to happen before, during, and after the observations: paperwork flow, analysis, sharing results, etc. The Coordinator set up the framework, evaluators implemented it, paperwork was passed off to the research department, research conducted the analysis, and results were published.
6. Selection of evaluators: the accuracy and value of QA are directly linked to the knowledge and skills of the QA evaluator.
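A minimal sketch of the random-selection step, assuming the Coordinator has a roster per unit; the unit names and officers below are hypothetical placeholders:

```python
import random

# Hypothetical unit rosters; in practice these would come from staffing data.
rosters = {
    "Unit A": ["Officer 1", "Officer 2", "Officer 3", "Officer 4"],
    "Unit B": ["Officer 5", "Officer 6", "Officer 7"],
}

rng = random.Random()  # seed the generator if a reproducible draw is needed
# Draw two officers from every unit, the selection rule described above.
selection = {unit: rng.sample(staff, k=2) for unit, staff in rosters.items()}
for unit, officers in selection.items():
    print(unit, "->", officers)
```

Drawing by code rather than by hand makes it easy to show skeptical staff that the selection really was objective.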

11 Direct Observation Tool and Measures
ORAS Direct Observation Tool
Performance Measurement Guide
(Open each link and explain the forms.) The Direct Observation Tool is not pass/fail; any area that scores a 1 or 2 needs improvement.

12 Implement
Pre-brief meeting
Direct observation of the interview
Individual scoring of the tool
Debrief and coaching
Results to the QA Coordinator

Pre-brief: clearly communicate the goal of QA; be sure staff are comfortable and understand the process. Debrief/coaching included supervisors, to expose them to the QA process and show them how to administer QA in their own units. Results were published to all levels of staff: end users received immediate feedback and coaching when they were evaluated, and the reports show trend analysis and their input into the recommendations. This increases staff buy-in.

13 Communication
Goals
Program process
Results
End users
All levels of management

Make a plan for communication and talk about QA! Complete transparency: an explanation of the process and tools was provided to end users, supervisors, and administrators; immediate feedback went to the end user; and updates were provided to the oversight committee, evaluators, etc.
1. Goal: ensure accurate assessments.
2. Program process: a training focus; not part of the disciplinary process, personnel evaluations, etc.
3. Results: shared immediately with the PO via debrief/coaching with their evaluator, with supervisors included in the debrief/coaching. Reports distributed to management include recommendations for improvement. Results are used to shape training, policy, and expectations in the ORAS process.
4. End users: (Open the End Users link to show the Coach clip: how not to make end users feel.) End users received e-mails about the process, which included copies of the tools being used for the evaluation. We maintained transparency. Memos explaining the process were also sent to supervisors, administrators, etc. Supervisors were encouraged to talk about the process with their staff and in staff meetings. We attended multiple meetings to discuss the process and answer questions. We tried to build support for a "culture of QA" through constant reminders and discussion.
5. Some supervisors/administrators wanted to use the process negatively: to progress discipline against problem staff with continuous performance problems. We reminded them this is not a process for discipline; the focus is skill assessment and skill building. Poor performers will be given multiple opportunities to increase their proficiency. We do not want staff to feel like they are being "slapped." We always tell folks that two things contribute to poor performance: a lack of the necessary skills to complete the task, corrected with additional training and support; and a lack of motivation to complete the task, corrected with performance evaluations and discipline. QA is intended for skill building.

14 Selection of Evaluators
Subject matter experts / certified ORAS trainers
IPC skills
Recommendation from their immediate supervisor
Completion of evaluator training

SMEs included end users. There were no union issues at DRC. The QA Program has a training focus similar to other required proficiencies: firearm certification is required, and peers may be certified instructors who administer the test, judge proficiency level, and provide remedial training as needed. The QA Program uses the same concept. Evaluators needed above-average IPC skills: a demonstrated ability to train and coach staff, experience as trainers in other areas, other QA and coaching responsibilities, etc. Recommendation from their immediate supervisor: supervisors were asked to confirm that the recommended evaluator had the knowledge, skills, and ability to participate in this process and to deal effectively with skeptical or negative staff.

15 Evaluator Training
Direct Observation Tool
Performance Measurement Guide
Double coding
General process
Common rating errors
Providing good, balanced feedback
Giving and receiving criticism

Evaluator training included details of the process flow; how to use the Direct Observation Tool and Performance Measurement Guide; and how to provide feedback, give and receive criticism, listen effectively, etc. (Show the link to the QA Training Packet, clip art icon.) Evaluators need to know how to communicate effectively to achieve maximum benefit with the end users.

16 Evaluation of the QA Program
Survey evaluation
Evaluators (19)
Assessors (116)
Do survey results demonstrate gaps?
Perceived value versus real value
Share results

Helpful hints to increase the value of a survey: forced essay questions, three reminders prior to the deadline, and reassurance that results would be reviewed and used to improve the overall process. Follow-through is essential: adjust the program based on the feedback to ensure the intended outcomes are achieved (i.e., skill enhancement, an increase in accurate assessments, etc.). At DRC, we are currently in round 2 of the ORAS QA, focusing on Parole Board staff; several changes were made to the QA program based on the survey results. Share results to inform training, policy makers, managers, etc.

17 Survey Results
Value of the pre-brief meeting: 65% of assessors and 59% of evaluators rated it successful or very successful.
Concentrated areas of coaching: assessors and evaluators rated using the interview guide and using collateral information highest.
Did assessors learn anything new? 61% of assessors and 90% of evaluators responded yes.
Ability of the evaluators to add value: 82% of assessors rated the evaluators as having the ability or a strong ability to add value.
How important are accurate ORAS assessments? 65% of assessors rated them as important or very important.

Evaluators: 19, with 100% responding to the survey. Assessors (parole officers): 116, with 66% responding. The survey asked questions about process, content, and opinions; there were some consistent answers and some gaps in the responses.

18 Survey Comments from End Users
What are the barriers to completing accurate ORAS assessments?
"The length of time it takes to complete the interview and input the information into the portal. Also, the program runs slow."
"Clients not being fully truthful."
"Not having an investigation in the file."
"Time consuming when you have a lot of other things to do."

19 Survey Comments from End Users
What was most valuable about the ORAS QA process?
"Getting reassurance that I was doing some things right."
"The tool has potential if done accurately."
"Helped understand my struggles were being felt by others also being evaluated."
"One-on-one discussion and feedback."
"Don't think we need this; too concerned about QA and not the time-consuming ORAS tool."
I left out my personal favorite: one person responded that they would have received more benefit from staring at the fabric of their cubicle walls.

20 Survey Comments from End Users
How would you improve the ORAS QA process?
"It was fairly upfront and open. The evaluator did a good job with me."
"Maybe assess everyone at some point."
"I think the ORAS QA process can be very effective, if administered by someone that has actually completed an ORAS assessment of some kind."
"Have the assessor sit in on more than one assessment."
"Figure out a way across the state to bring all officers to the same page, period. Good luck!"

21 QA Results
Trends
Interview skills
Reliability
Direct Observation Tool
Double coding
Survey

(Open each of the two links and discuss results.) Interview skills: this is not a pass-or-fail test; the overall score on the tool is for quantification only. Any area where staff scored a 1 or 2 should receive some type of training or coaching to improve it. The identified trends were consistent with the pilot evaluation results. Weak domains: Peer Associations (Domain 6) and Criminal Attitudes and Behavioral Patterns (Domain 7). These are subjective, dynamic areas to score, and they are also primary risk factors. Reliability was checked through double coding, sketched below. Additional analysis is being done with the control data that was collected: how do the type of training, amount of experience, and availability of collateral information impact results?
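The slide names double coding as the reliability check. A minimal sketch of how paired ratings could be compared, assuming each double-coded assessment yields one risk rating per rater; the ratings below are hypothetical, and Cohen's kappa is one standard agreement statistic (the source does not say which statistic DRC used):

```python
from collections import Counter

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical double-coded risk ratings for six assessments.
assessor  = ["low", "moderate", "moderate", "high", "low", "moderate"]
evaluator = ["low", "moderate", "high",     "high", "low", "low"]
print(f"kappa = {cohens_kappa(assessor, evaluator):.2f}")  # ~0.52 here
```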

22 Lessons learned
Resources
Subject matter experts
Have a QA Coordinator
Explain the QA process to the offender
Scoring Guide
Have a follow-up plan ready to implement
Staff support: supervisors need to be more knowledgeable
Sustainability

What did we do well? We partnered with DRC's Bureau of Research to help ensure viable outcomes. Where do we need to improve? A culture of quality assurance will require long-term planning: how will we sustain QA efforts and prevent "drift"?

23 Next Steps
Interview training
Booster trainings focused on weak areas and/or changes
RIB rules card
Supervisor staffing requirements
Individual Improvement Plan
Consider assessor models
Continue QA efforts statewide with all user groups

1. Requested that the ORAS QA Training Committee/CTA assist with training needs. Interview training: building rapport, extracting needed information, controlling the length of the interview, handling difficult offenders, etc. Booster trainings for the weak domains.
2. Supervisors complete file-review QA during staffings, required monthly; results are tracked and analyzed. (Open link/Dirty Dozen clip art.)
3. Four APA staff were identified as needing substantial training and support to reach our goal of completing accurate assessments. Individual Improvement Plan: working one-on-one, conducting file QA on their cases, and increasing training requirements. (Open link.)
4. Throughout the state, units are testing the "assessor model" to complete ORAS. It is easier to focus training, support, and QA on a smaller number of staff, though logistics may not make it possible in every area.
APA Field Services is responsive to the QA results and is committed to taking corrective actions to achieve its goal of "accurate assessments."

24 How to create a QA Program
Goals
Concerns/barriers for staff
Resources
Process flow
Direct observation
Training
Communication
Implementation
Evaluation
Follow up: use results to drive positive change

25 Good Quality Assurance
Proactive
Improve
Confidence
Credibility

26 Questions?
Deborah Herubin
John Geras

