
Overview of ODRC’s ORAS Quality Assurance Program.


1 Overview of ODRC’s ORAS Quality Assurance Program

2  What is Quality Assurance?  ORAS File Review  Development of the QA Program  Results of the QA Program  Recommendations

3  Proactive  Improve  Confidence  Credibility  A planned and systematic pattern of actions necessary to provide adequate confidence that the product optimally fulfills customer expectations.

4  Create a culture of quality  Good fidelity = replication of results  Mission: “reduce recidivism”

5  FAQ  Flow charts  Internal audit standards  Quality Assurance Reports  Case file reviews  Quality Assurance Program

6 Keep these 12 things clean for good assessments

7  Duplicates  Sources used  Verifying collateral information  Notes

8  Pilot site evaluations  Benchmarking  Literature Review  University of Cincinnati

9  Goal = accurate assessments  What contributes to the goal? ◦ Fidelity to the assessment process ◦ Good interview skills  How will the QA program be delivered? ◦ Create measures/tools ◦ Timelines  Implement the QA program  Analyze and share results  Evaluate program and make adjustments.

10  QA Coordinator  Random selection of staff  Scheduling  Communication  Work flow  Selection of Evaluators
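The random selection of staff noted above can be drawn programmatically. Below is a minimal sketch using only the Python standard library; the staff names, sample size, and function name are illustrative assumptions, not part of ODRC's actual process.

```python
# Minimal sketch: randomly selecting assessors for an ORAS QA review cycle.
# The staff list, sample size, and seed are illustrative assumptions only.
import random

def select_staff_for_review(staff, sample_size, seed=None):
    """Return a random sample of staff members for this QA cycle."""
    rng = random.Random(seed)  # seeded so the draw can be reproduced if needed
    return rng.sample(staff, min(sample_size, len(staff)))

assessors = ["Assessor A", "Assessor B", "Assessor C", "Assessor D", "Assessor E"]
print(select_staff_for_review(assessors, 2, seed=2024))
```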

11  ORAS Direct Observation Tool  Performance Measurement Guide

12  Pre-brief meeting  Direct Observation of the interview  Individual Scoring of the tool  Debrief and Coaching  Results to the QA Coordinator

13  Goals  Program process  Results  End users  All levels of management

14  Subject matter experts/Certified ORAS Trainers  IPC skills  Recommendation from their immediate supervisor.  Completion of evaluator training.

15  Direct Observation Tool  Performance Measurement Guide  Double coding  General process  Common rating errors  Providing good feedback that is balanced  Giving and receiving criticism

16  Survey Evaluation ◦ Evaluators (19) ◦ Assessors (116) ◦ Do survey results demonstrate gaps? Perceived value versus real value.  Share results

17  Value of the pre-brief meeting ◦ 65% of the assessors and 59% of the evaluators rated it successful or very successful.  Concentrated areas of coaching ◦ Assessors and evaluators rated using the interview guide and using collateral information highest.  Did assessors learn anything new? ◦ 61% of the assessors and 90% of the evaluators responded yes.  Ability of the evaluators to add value ◦ 82% of the assessors rated the evaluators as having the ability or strong ability to add value.  How important are accurate ORAS assessments? ◦ 65% of the assessors rated it as important or very important.

18  What are the barriers to completing accurate ORAS assessments? ◦ The length of time it takes to complete the interview and input the information into the portal. Also, the program runs slowly. ◦ Clients not being fully truthful. ◦ Not having an investigation in the file. ◦ Time consuming when you have a lot of other things to do.

19  What was most valuable about the ORAS QA process? ◦ Getting reassurance that I was doing some things right. ◦ The tool has potential if done accurately. ◦ Helped me understand my struggles were being felt by others also being evaluated. ◦ One-on-one discussion and feedback. ◦ Don’t think we need this; too concerned about QA and not the time-consuming ORAS tool.

20  How would you improve the ORAS QA process? ◦ It was fairly upfront and open. The evaluator did a good job with me. Maybe assess everyone at some point. ◦ I think the ORAS QA process can be very effective if administered by someone who has actually completed an ORAS assessment of some kind. ◦ Have the assessor sit in on more than one assessment. ◦ Figure out a way across the state to bring all officers to the same page, period. Good luck!

21  Domains 6 and 7  Interview Skills  Trends  Survey  Interview Skills Reliability:  Direct Observation Tool  Double Coding  Survey
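Double coding, listed above as a reliability check, pairs two independent scorings of the same interview so their agreement can be measured. Below is a minimal sketch of that calculation using percent agreement and Cohen's kappa; the item ratings and function names are illustrative assumptions, not ODRC's scoring method or results.

```python
# Minimal sketch: summarizing inter-rater agreement from double-coded items.
# The ratings below are invented for illustration; they are not ODRC data.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of items scored identically by both raters."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance agreement (Cohen's kappa)."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - chance) / (1 - chance)

# Two evaluators independently score the same interview's items (0-2 scale).
evaluator = [0, 1, 1, 2, 0, 1, 2, 0]
assessor  = [0, 1, 2, 2, 0, 1, 2, 1]
print(percent_agreement(evaluator, assessor))            # 0.75
print(round(cohens_kappa(evaluator, assessor), 2))       # ~0.63
```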

22  Resources  Subject Matter Experts  Have a QA Coordinator  Explain QA process to the offender  Scoring Guide  Have follow up plan ready to implement  Staff support – supervisors need to be more knowledgeable.  Sustainability

23  Interview training  Booster trainings focused on weak areas and/or changes  RIB rules card  Supervisor staffing requirements  Individual Improvement Plan  Consider assessor models  Continue QA efforts statewide with all user groups

24  Goals  Concerns/barriers for staff  Resources  Process flow  Direct Observation  Training  Communication  Implementation  Evaluation  Follow up – use results to impact positive change

25  Proactive  Improve  Confidence  Credibility

26 Deborah Herubin John Geras

