Overview of ODRC’s ORAS Quality Assurance Program

Presentation transcript:

Accurate Assessments: Overview of ODRC’s ORAS Quality Assurance Program

Overview

- What is Quality Assurance?
- ORAS File Review
- Development of the QA Program
- Results of the QA Program
- Recommendations

Note: A binder containing some of the QA materials and reports is being passed around the room. Please be sure to return it to the front of the room before you leave.

What is Quality Assurance?

A planned and systematic pattern of actions necessary to provide adequate confidence that the product optimally fulfills customer expectations.

- Proactive
- Improve
- Confidence
- Credibility

(Click for graphics to appear.) A good QA system should be:
- Proactive: activities done before the product is delivered, to assure good quality for the customers.
- Improve: the process should help us get better at making our product, improving work processes and efficiency.
- Confidence: increases confidence in the product.
- Credibility: increases credibility in our business.

Why quality assurance?

- Create a culture of quality
- Good fidelity = replication of results
- Mission: "reduce recidivism"

Quality Assurance Video (click the clip-art target icon) demonstrates the necessity of quality in any profession. Corrections in Ohio is a multi-million dollar industry and, like other industries, when we fail there is a cost and the potential for loss of life. Like the examples in the video, if corrections agencies do not conduct effective quality assurance, we will fail our mission.

1. Create a culture of quality throughout the organization. Everyone needs to be invested. (Click to make text appear.) Discuss DRC culture issues before and after the QA process. Before: strong commitment by leadership, but end users were resistant ("big brother watching") and perceived a "gotcha"; POs were not used to being observed and evaluated on skills. After: POs were more comfortable after experiencing the process; skill assessment and individualized assistance helped them make improvements; the strict training approach was well received by skeptical staff.

2. Implementing the assessment process as recommended by UC is vital if we want to replicate the results in their study. An accurate assessment is the first step in reducing recidivism. Often, convenience wins out over conscience: staff are concerned with volume and deadlines, and therefore lose fidelity in the assessment process, negatively impacting results. QA encourages fidelity.

3. Per UC, correctional services and interventions can be effective in reducing recidivism; the most effective are based on the principles of effective intervention: the risk principle (who), the need principle (what), treatment (how), and program integrity (how well). ORAS sets up the risk/need (who/what). Stay focused on desired outcomes, always mindful of the overall mission. What is the first step in effectively reducing recidivism? An accurate assessment.

Supporting ORAS Quality

- FAQ
- Flow charts
- Internal audit standards
- Quality Assurance Reports
- Case file reviews
- Quality Assurance Program

1. The FAQ and process flow charts were created to assist users in implementing ORAS. (Open the FAQ link and show where to access it.)
2. Internal audit standards were created for all areas of DRC using ORAS (prisons, APA, Parole Board). Audit standards focus on compliance with policy, business rules, and/or contract requirements. (Open the internal audit standards link to show.)
3. QA Reports requested: sort data by tool, agency, unit/dorm, or individual. Reports include override information and work flow information (who did assessments, reassessments due, distribution of risk, distribution of need, gap analysis, etc.). Ad hoc reports will allow you to build a customized report. Reports have been in the test site; testing identified some deficiencies, and the vendor is working to resolve those issues. Expect the QA reports in late summer. (Open QA Reports to explain where to find them.)
4. Case file review: supervisors of end users check 12 areas of ORAS.
5. Created a QA Program that incorporates direct observation of the skills needed to complete accurate assessments.

Dirty Dozen of ORAS QA

Keep these 12 things clean for good assessments.

Show the link to "The Dirty Dozen of ORAS QA" form (clip-art icon) and explain the form. The form may be laminated and placed on every supervisor's desk (similar to the RIB card – show). Supervisors/analysts will be trained to conduct the QA and provided forms to track results.

1. Disproportionate distribution of risk. UC baseline for the CST:
   - 15% low
   - 40% moderate
   - 37% high
   - 8% very high
   APA numbers show approximately 70% at the low risk level: a potential red flag (see the sketch below).
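The distribution check in point 1 is easy to make routine. Below is a minimal sketch in Python, assuming the UC CST baseline percentages from the slide; the observed counts and the 10-point flag threshold are illustrative assumptions, not DRC's actual rule.

```python
# Hypothetical distribution check: compare an observed ORAS-CST risk
# distribution against the UC baseline from the slide. Counts and the
# flag threshold below are illustrative assumptions.
UC_BASELINE = {"low": 0.15, "moderate": 0.40, "high": 0.37, "very high": 0.08}

def distribution_gaps(observed_counts):
    """Per-category gap: observed share minus UC baseline share."""
    total = sum(observed_counts.values())
    return {cat: observed_counts[cat] / total - share
            for cat, share in UC_BASELINE.items()}

# Roughly the pattern flagged on the slide: ~70% scored low risk.
apa_counts = {"low": 700, "moderate": 180, "high": 90, "very high": 30}
for cat, gap in distribution_gaps(apa_counts).items():
    flag = "  <-- potential red flag" if abs(gap) > 0.10 else ""
    print(f"{cat:9s} gap vs. baseline: {gap:+.1%}{flag}")
```

Run against the illustrative counts, the check flags low, moderate, and high as far off baseline, which is exactly the kind of pattern the slide describes.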

Keeping the Dirty Dozen Clean

- Duplicates
- Sources used
- Verifying collateral information
- Notes

Open each of the four links and discuss:
1. Duplicates – be sure we are not creating duplicates in the system (17,000 duplicates currently). Also, the ISR is available on this screen. (A duplicate-check sketch follows below.)
2. Sources used – everything available should be checked along with the offender interview.
3. Collateral information – access investigations via the gateway.
4. Notes – help in QA efforts and add confidence when assessments are being shared across agencies.

File review is a quick check on quality. It is NOT proactive: a cursory attempt to catch any obvious problems. The best gauge of quality is through direct observation of the assessment process. Thus, DRC created a QA Program.
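Since duplicates only surface when someone looks for them, here is a tiny sketch of a duplicate check. The record structure and field names (offender_id, tool, date) are illustrative assumptions, not the real portal schema.

```python
# Hypothetical duplicate check over assessment records. Field names
# (offender_id, tool, date) are illustrative, not the real schema.
from collections import Counter

def find_duplicates(records):
    """Return (offender_id, tool, date) keys that occur more than once."""
    counts = Counter((r["offender_id"], r["tool"], r["date"]) for r in records)
    return [key for key, n in counts.items() if n > 1]

records = [
    {"offender_id": "A123", "tool": "CST", "date": "2012-05-01"},
    {"offender_id": "A123", "tool": "CST", "date": "2012-05-01"},  # duplicate entry
    {"offender_id": "B456", "tool": "CST", "date": "2012-05-02"},
]
print(find_duplicates(records))  # [('A123', 'CST', '2012-05-01')]
```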

Building a QA Program

- Pilot site evaluations
- Benchmarking
- Literature review
- University of Cincinnati

Benchmarking:
- Other states: Ray Wahl, Deputy Court Administrator, Utah State Courts; Jerry Bartruff, Iowa Department of Corrections; Frederick Klunk, PA Board of Probation & Parole
- Other agencies: Ramsey County LSI-R evaluation; DYS; Federal Probation Department

Literature review:
- Effectiveness of cognitive behavioral treatment for adult offenders: A methodological, quality-based review (MacKenzie & Hickman)
- Evaluation of the Reasoning & Rehabilitation Cognitive Skills Development Program as Implemented in Juvenile ISP in Colorado (Pullen)
- Cognitive Behavioral Interventions: Process Evaluation Report. North Carolina Department of Correction (Price)
- The Carey Group (Mark Carey)

University of Cincinnati – review/revisions.

Building Blocks of a QA Program

- Goal = accurate assessments
- What contributes to the goal? Fidelity to the assessment process; good interview skills
- How will the QA program be delivered?
- Create measures/tools
- Timelines
- Implement the QA program
- Analyze and share results
- Evaluate the program and make adjustments

1. DRC reached consensus on the goal and the activities that contributed most to it: fidelity to the assessment process and good interview skills.
2. Fidelity: some staff looked for convenience in the process: telephone interviews, file-review assessments (missing offender participation completely), not using an interview guide, not using the scoring guide, allowing an offender to fill out the tool (instead of the self-report form) and then discussing it during the interview, etc. Many of these "shortcuts" will not show in data collection, reports, etc.; you only find them via direct observation. Can staff demonstrate the process correctly when watched? Good interview skills also require observation: choose between in-person, video, or audio tapes. The best results come through direct observation of the interview.
3. The process required a timeline, established with consideration of resources, when implementation was effective for each user group, etc. APA: last quarter of 2011. Parole Board and CC: second quarter of 2012. Prisons: last quarter of 2012.

Program Delivery

- QA Coordinator
- Random selection of staff
- Scheduling
- Communication
- Work flow
- Selection of evaluators

Consider how the program will be delivered, step by step. Consider the potential concerns of evaluators, end users, and managers.

1. Coordinator – oversees the process: data cleaning, resolving scheduling issues, monthly check-ins with the evaluators, handling switches to ensure objective random selection of staff, and responding to and resolving problems and concerns. Maintains the database tracking information (which evaluations were left, etc.). Serves as a resource person and ensures communication. (Discuss the "I'll take your access" example.)
2. Random selection – two officers from every unit, selected by the QA Coordinator. Objective and random. This allows end users to be more comfortable, knowing there wasn't a "reason" they were selected (see the sketch below).
3. Scheduling – the logistics of how to schedule observations. Ensure minimal disruption to everyone's work while maintaining equal representation of the staff.
4. Communication – needs to be constant.
5. Work flow – what needs to happen before, during, and after the observations: paperwork flow, analysis, sharing results, etc. The coordinator set up the framework, evaluators implemented it, paperwork was passed off to the research department, research conducted the analysis, and results were published.
6. Selection of evaluators – the accuracy and value of QA are directly linked to the knowledge and skills of the QA evaluator.
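The random draw in point 2 is simple to make objective and auditable. A minimal sketch, assuming unit rosters are available as lists; the unit and officer names are made up.

```python
# Hypothetical sketch of the coordinator's random draw: two officers
# from every unit. A fixed seed makes the draw reproducible for audit.
import random

def draw_officers(unit_rosters, per_unit=2, seed=None):
    """Randomly select per_unit officers from each unit's roster."""
    rng = random.Random(seed)
    return {unit: rng.sample(roster, per_unit)
            for unit, roster in unit_rosters.items()}

unit_rosters = {
    "Unit A": ["Adams", "Baker", "Chen", "Diaz"],
    "Unit B": ["Evans", "Ford", "Gray", "Hill", "Iqbal"],
}
print(draw_officers(unit_rosters, seed=2012))
```

Publishing the seed and the draw procedure is one way to show staff there was no "reason" behind any selection.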

Direct Observation Tool and Measures

- ORAS Direct Observation Tool
- Performance Measurement Guide

(Open each link and explain the form.) The Direct Observation Tool is not pass/fail; any area that scores a 1 or 2 needs improvement (see the sketch below).
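Because the tool is coaching-oriented rather than pass/fail, the only mechanical step is flagging the low-scoring areas. A sketch under assumed names: the area labels and rating scale below are illustrative, not the actual form.

```python
# Hypothetical read-out of a completed Direct Observation Tool:
# flag any area scored 1 or 2 for follow-up coaching. Area names
# and the rating scale are illustrative assumptions.
def coaching_areas(scores, threshold=2):
    """Return areas scored at or below the coaching threshold."""
    return [area for area, score in scores.items() if score <= threshold]

observation = {
    "builds rapport": 4,
    "uses interview guide": 2,
    "verifies collateral information": 1,
    "uses scoring guide": 3,
}
print(coaching_areas(observation))
# ['uses interview guide', 'verifies collateral information']
```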

Implement

- Pre-brief meeting
- Direct observation of the interview
- Individual scoring of the tool
- Debrief and coaching
- Results to the QA Coordinator

Pre-brief: clearly communicate the goal of QA; be sure staff are comfortable and understand the process. Debrief/coaching included supervisors, to expose them to the QA process and teach them how to administer QA in their own units. Results were published to all levels of staff: end users received immediate feedback and coaching when they were evaluated. The reports show trend analysis and their input into the recommendations. This increases staff buy-in.

Communication

- Goals
- Program process
- Results
- End users
- All levels of management

Make a plan for communication – talk about QA! Complete transparency: an explanation of the process and tools was provided to end users, supervisors, and administrators; immediate feedback went to the end user; updates were provided to the oversight committee, evaluators, etc.

1. Goal – ensure accurate assessments.
2. Program process – a training focus, not part of the disciplinary process, personnel evaluations, etc.
3. Results – shared immediately with the PO via debrief/coaching with their evaluator. Supervisors were included in the debrief/coaching. Reports distributed to management include recommendations for improvement. Results are used to shape training, policy, and expectations in the ORAS process.
4. End users (open the End Users link to show the Coach clip – how not to make end users feel) received emails about the process, including copies of the tools being used for the evaluation. We maintained transparency. Memos explaining the process were also sent to supervisors, administrators, etc. Supervisors were encouraged to talk about the process with their staff and in staff meetings. We attended multiple meetings to discuss the process and answer questions. We tried to build support for a "culture of QA" through constant reminders and discussion.
5. Some supervisors/administrators wanted to use the process negatively: to progress discipline on problem staff with continuous performance problems. We reminded them this is not a process for discipline: the focus is skill assessment and skill building. A poor performer will be given multiple opportunities to increase their proficiency. We do not want staff to feel like they are being "slapped". We always tell folks that two things contribute to poor performance: a lack of the necessary skills to complete the task, corrected with additional training and support; and a lack of motivation to complete the task, corrected with performance evaluations and discipline. QA is intended for skill building.

Selection of Evaluators

- Subject matter experts / certified ORAS trainers
- IPC skills
- Recommendation from their immediate supervisor
- Completion of evaluator training

SMEs included end users; there were no union issues at DRC. The QA Program has a training focus similar to other required proficiencies. For example, firearm certification is required, and peers may be certified instructors who administer the test, judge proficiency level, and provide remedial training as needed. The QA Program uses the same concept. Evaluators needed above-average IPC skills: a demonstrated ability to train and coach staff, experience as trainers in other areas, other QA and coaching responsibilities, etc. Recommendation from their immediate supervisor: supervisors were asked to ensure that the recommended evaluator had the knowledge, skills, and ability to participate in this process and to deal effectively with skeptical or negative staff.

Evaluator Training

- Direct Observation Tool
- Performance Measurement Guide
- Double coding
- General process
- Common rating errors
- Providing good feedback that is balanced
- Giving and receiving criticism

Evaluator training included details of the process flow; how to use the Direct Observation Tool and Performance Measurement Guide; and how to provide feedback, give and receive criticism, listen effectively, etc. Show the link to the QA Training Packet (clip-art icon). Evaluators need to know how to communicate effectively to achieve maximum benefit with the end users.

Evaluation of the QA Program

- Survey evaluation: evaluators (19), assessors (116)
- Do survey results demonstrate gaps? Perceived value versus real value.
- Share results

Helpful hints to increase the value of a survey: forced essay questions, three reminders prior to the deadline, and reassurance that results would be reviewed and used to improve the overall process. Follow-through is essential: adjust the program based on the feedback to ensure the intended outcomes are achieved (i.e., skill enhancement, an increase in accurate assessments, etc.). At DRC, we are currently in round 2 of the ORAS QA, focusing on Parole Board staff. Several changes were made to the QA program based on the survey results. Share results to inform training, policy makers, managers, etc.

Survey Results

- Value of the pre-brief meeting: 65% of the assessors and 59% of the evaluators rated it successful or very successful.
- Concentrated areas of coaching: assessors and evaluators rated using the interview guide and using collateral information highest.
- Did assessors learn anything new? 61% of the assessors and 90% of the evaluators responded yes.
- Ability of the evaluators to add value: 82% of the assessors rated the evaluators as having the ability or strong ability to add value.
- How important are accurate ORAS assessments? 65% of the assessors rated it as important or very important.

Evaluators: 19; 100% responded to the survey. Assessors (parole officers): 116; 66% responded to the survey. We asked questions about process, content, and opinions. There were some consistent answers and some gaps in the responses (see the sketch below).
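One way to surface the "perceived versus real value" gaps mentioned on the previous slide is to line up assessor and evaluator answers to the same question. A sketch using the percentages reported above; the 20-point flag threshold is an assumption, not part of the actual analysis.

```python
# Hypothetical gap check between assessor and evaluator survey answers.
# Percentages come from the slide; the flag threshold is an assumption.
survey = {
    "pre-brief successful":  {"assessors": 0.65, "evaluators": 0.59},
    "learned something new": {"assessors": 0.61, "evaluators": 0.90},
}
for question, answers in survey.items():
    gap = answers["evaluators"] - answers["assessors"]
    flag = "  <-- gap worth discussing" if abs(gap) >= 0.20 else ""
    print(f"{question:22s} assessors {answers['assessors']:.0%}, "
          f"evaluators {answers['evaluators']:.0%}, gap {gap:+.0%}{flag}")
```

On these numbers, "learned something new" (61% vs. 90%) is the kind of perception gap worth a follow-up conversation.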

Survey Comments from End Users

What are the barriers to completing accurate ORAS assessments?
- "The length of time it takes to complete the interview and input the information into the portal. Also the program runs slow."
- "Clients not being fully truthful."
- "Not having an investigation in the file."
- "Time consuming when you have a lot of other things to do."

Survey Comments from End Users

What was most valuable about the ORAS QA process?
- "Getting reassurance that I was doing some things right."
- "The tool has potential if done accurately."
- "Helped understand my struggles were being felt by others also being evaluated."
- "One on one discussion and feedback."
- "Don't think we need this, too concerned about QA and not the time consuming ORAS tool."

Left out my personal favorite: one person responded that they would have received more benefit from staring at the fabric of their cubicle walls.

Survey Comments from End Users

How would you improve the ORAS QA process?
- "It was fairly upfront and open. The evaluator did a good job with me."
- "Maybe assess everyone at some point."
- "I think the ORAS QA process can be very effective, if administered by someone that has actually completed an ORAS assessment of some kind."
- "Have the assessor sit in on more than one assessment."
- "Figure out a way across the state to bring all officers to the same page period. Good Luck!"

QA Results

Trends: Direct Observation Tool, double coding, survey, interview skills, reliability.

(Open each link and discuss results.) Interview skills: not a pass-or-fail test. The overall score on the tool is to quantify only; any area where staff scored a 1 or 2 should receive some type of training or coaching to improve that area. Identified trends were consistent with the pilot evaluation results. Weak domains: Peer Associations (domain 6) and Criminal Attitudes and Behavioral Patterns (domain 7). These are subjective, dynamic areas to score, and they are also primary risk factors. Additional analysis is being done with the control data that was collected: how do the type of training, amount of experience, and availability of collateral information impact results? A double-coding reliability sketch follows below.
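Double coding means the evaluator and the assessor score the same interview independently, and the two score sets are then compared. A minimal reliability sketch using simple percent agreement; the item scores are invented for illustration, and a formal analysis might use a statistic such as Cohen's kappa instead.

```python
# Hypothetical double-coding reliability check: percent agreement
# between two independent scorings of the same interview. Item
# scores below are invented for illustration.
def percent_agreement(scores_a, scores_b):
    """Share of items on which the two coders gave the same score."""
    assert len(scores_a) == len(scores_b)
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Item-level scores on one domain, e.g. Peer Associations (domain 6):
evaluator_scores = [2, 1, 0, 2, 1]
assessor_scores  = [2, 1, 1, 2, 0]
print(f"Agreement: {percent_agreement(evaluator_scores, assessor_scores):.0%}")  # 60%
```

Low agreement concentrated in domains 6 and 7 would corroborate the trend noted above, since those subjective domains are the hardest to score consistently.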

Lessons Learned

- Resources
- Subject matter experts
- Have a QA Coordinator
- Explain the QA process to the offender
- Scoring Guide
- Have a follow-up plan ready to implement
- Staff support – supervisors need to be more knowledgeable
- Sustainability

What did we do well? Partnered with DRC's Bureau of Research to help ensure viable outcomes. Where do we need to improve? A culture of quality assurance will require long-term planning: how will we sustain QA efforts and prevent "drift"?

Next Steps

- Interview training
- Booster trainings focused on weak areas and/or changes
- RIB rules card
- Supervisor staffing requirements
- Individual Improvement Plan
- Consider assessor models
- Continue QA efforts statewide with all user groups

1. Requested the ORAS QA Training Committee/CTA to assist with training needs. Interview training: building rapport, extracting needed information, controlling the length of the interview, handling difficult offenders, etc. Booster trainings for weak domains.
2. Supervisors complete file-review QA during staffing. Required monthly; results are tracked and analyzed. (Open link / Dirty Dozen.)
3. Four APA staff were identified as needing substantial training and support to reach our goal of completing accurate assessments. Individual Improvement Plan: working one on one, conducting file QA on their cases, and increasing training requirements. (Open link.)
4. Throughout the state, there are units testing the "assessor model" to complete ORAS. It is easier to focus training, support, and QA on a smaller number of staff, though logistics may not make it possible in every area. APA Field Services is responsive to the QA results and is committed to taking corrective actions to achieve their goal of "accurate assessments".

How to Create a QA Program

- Goals
- Concerns/barriers for staff
- Resources
- Process flow
- Direct observation
- Training
- Communication
- Implementation
- Evaluation
- Follow-up – use results to drive positive change

Good Quality Assurance

- Proactive
- Improve
- Confidence
- Credibility

Questions?

Deborah Herubin – Deborah.Herubin@odrc.state.oh.us – 614-995-0181
John Geras – John.Geras@odrc.state.oh.us