Army DCIPS 2010 Performance Management and Bonus Process Review


Agenda
• Purpose
• Key Timeline Data
• Lessons Learned
• Organizations Analyzed
• Army Aggregate Data
• Bonus Group Results
• Challenges

Purpose
The purpose of this brief is to provide a comprehensive overview of the 2010 Performance Management and Bonus Process. It also presents the 2009 bonus data set so the 2010 results can be correlated with key decisions made during the process.
BLUF: Meaningful distinction in performance management produced bonus results within expected outcomes.

Key Timeline Data
• The earliest bonus pool was conducted on 16 Nov 10
• The last bonus pool was conducted on 20 Dec 10
• The payout for 2010 was conducted on 27 Jan 11
• Manual RPAs were processed after 28 Feb 11
• Initial analysis conducted by USD(I), 1 Mar 11
• Bonus data call sent to the community, 1 Mar 11
• Final data received from the community, 27 Apr 11
• Final review completed and reported, 18 May 11
• Comments/data received from 22 organizations for the analysis

Key Impacts – Three Each, Positive and Negative

Positive:
• Meaningful distinction in the Performance Management Process resulted in fair and equitable distributions of ratings under DCIPS vs. TAPES
- An approximate 40% increase in both Successful and Excellent ratings
- An approximate 80% decrease from the Outstanding rating rate under TAPES in 2009
• Minimal reconsideration requests at the HQDA, G-2 level
- Only 6 requests required a G-2 ruling/determination
• The incorporation of the PRA review added validity to the process
- The second-level review reinforced leadership involvement in the PM process

Negative:
• Minimal DCIPS training was a significant shortfall for HLRs, managers, and military raters
• The 50% Bonus Rule proved problematic and restricted the ability to adequately reward the workforce
• The automated tools supporting the process required modification
- The PAA Tool allowed HLRs to approve reports prior to PRA approval
- CWB/DPAT required changes during the process to function properly

Lessons Learned
• The 50% Bonus Rule did not give organizations maximum flexibility to adequately reward employees
• Rater consistency training is required for a shared understanding of the ratings categories
• Managers needed more training on writing SMART objectives
• Employees needed more training on writing self-assessments
• Bonus guidance should be released earlier in the PM process to give organizations adequate time to develop business rules
• Training bonus board members just prior to the start of the boards supported the process
• Identifying alternate board members provided continuity throughout the process
• Raters need greater awareness that the Performance Elements must be assessed as well as the Objectives

Organizational Data Reviewed

Organization    Employees    Organization    Employees
AFRICOM         21           MEDCOM          55
ATEC            81           NETCOM          105
AMC             425          OAA             42
FORSCOM         137          SMDC            56
HQDA, G-2       183          TRADOC          804
HT JCOE         110          USA AFRICA      21
IMCOM           290          USA EUROPE      66
INSCOM          2379         USA NORTH       12
JIATF SOUTH     213          USA SOUTH       22
JIEDDO          15           USA PACIFIC     65
JSOC            131          USA SOC         160
TOTAL EMPLOYEES: 5393

Organizational data was based on PRA certification and data as reported. Totals do not include employees in the following categories: transition, new hires with less than 90 days, and offline evaluations.

Army Aggregate Report for Employees – Overall Summary, FY10 Performance Cycle

Overall Workforce Considered:            5393
Number of Bonus Pools:                   140
Average Overall Rating:                  3.78
Average Bonus Budget Percentage:         1.77%
Average Bonus Amount:                    $2,813
Number of QSIs:                          258
Percent of Workforce Receiving a Bonus:  47%

Bonus Group Results – General Data

Total Employees:                  5393
Average Rating:                   3.78
Average Percent of Employees Receiving a Bonus: 47%
Total Employees Receiving a QSI:  258
Average Bonus Amount:             $2,813
Mode Bonus Amount:                $2,450
Lowest Bonus Amount:              $195
Highest Bonus Amount:             $10,000
Number of Bonus Pools:            140

Overall Ratings Distribution – Visual Representation
[Chart: percent of rated workforce by overall rating. 60% of the employee ratings were between 3.3 and …]
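As an illustrative aside, the general-data figures above are simple descriptive statistics over the bonus records. A minimal sketch in Python, using hypothetical records rather than the actual Army data:

from statistics import mean, mode

# Hypothetical bonus records: (overall rating, bonus amount or None if no bonus).
records = [
    (3.8, 2450), (4.1, 3200), (3.2, None), (3.6, 2450),
    (4.6, 10000), (2.4, None), (3.9, 195), (3.4, None),
]

# Only employees with a bonus count toward the bonus statistics.
bonuses = [amt for _, amt in records if amt is not None]

print(f"Total employees:         {len(records)}")
print(f"Average rating:          {mean(r for r, _ in records):.2f}")
print(f"Percent receiving bonus: {len(bonuses) / len(records):.0%}")
print(f"Average bonus:           ${mean(bonuses):,.0f}")
print(f"Mode bonus:              ${mode(bonuses):,}")
print(f"Lowest / highest bonus:  ${min(bonuses):,} / ${max(bonuses):,}")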

Number of Quality Step Increases (QSIs) Awarded
[Chart: QSIs awarded by organization.]
*The 190 QSIs awarded within INSCOM are approximately 8% of INSCOM's total population (190 of 2,379).
Total QSIs awarded were less than 5% of the population (258 of 5,393, or 4.8%), vs. 12% in 2009.

Range of Bonuses, Lowest to Highest
[Chart: bonus amounts – the range of bonuses across commands.]

Employees Rated Minimally Successful (Level 2)
[Chart: overall rating percentage by organization.]

Employees Rated Successful (Level 3)
[Chart: overall rating percentage by organization.]

Employees Rated Excellent (Level 4)
[Chart: overall rating percentage by organization.]

Employees Rated Outstanding (Level 5)
[Chart: overall rating percentage by organization.]

Employees Rated Successful and Above
[Chart: overall combined rating percentage by organization; all numbers represent percentages.]

Overall Percentage by Ratings Category
[Chart: total percentage by rating category.]

Overall Comparison: 2009 (TAPES) vs. 2010 (DCIPS)
[Chart: total percentage by rating category, 2009 vs. 2010.]

Percentages by Individual Ratings
[Chart: total percentage by numerical rating.]
Range of Ratings:
• Less than 2.0 – Unsuccessful
• 2.0 to 2.5 – Minimally Successful
• 2.6 to 3.5 – Successful
• 3.6 to 4.5 – Excellent
• 4.6 to 5.0 – Outstanding
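The banding is a straightforward threshold lookup. A minimal sketch, assuming ratings are reported in tenths as in the bands above (illustrative only, not an official DCIPS tool):

def rating_category(score: float) -> str:
    """Map a DCIPS numerical rating to its category per the bands above."""
    if score < 2.0:
        return "Unsuccessful"
    if score <= 2.5:
        return "Minimally Successful"
    if score <= 3.5:
        return "Successful"
    if score <= 4.5:
        return "Excellent"
    return "Outstanding"  # 4.6 to 5.0

# The one-tenth distinction noted in the Challenges slide:
print(rating_category(3.5))  # Successful
print(rating_category(3.6))  # Excellent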

DCIPS-Wide Component Ratings
[Chart: percent of DCIPS workforce by rating, all DCIPS organizations.]

Performance Management Program Challenges
• Management of the 50% Bonus Rule (see the sketch after this list)
• Performance Objectives that were not SMART enough
• Poorly written Self Reports of Accomplishments (SRAs)
• Explaining the ratings distinction within and across ratings categories (3.5 Successful to 3.6 Excellent, a one-tenth difference)
• Lack of PRA "teeth" in the process – no ability to direct changes
• Lack of bonus pool training for all Data Administrators
• Multiple user guides made the process difficult
• The automated tools (numerous "Flash Updates")
- PAA – premature approvals in the system
- CWB – the import tool missed key data points needed for a successful upload into DCPDS
- DCPDS – CWB uploads for bonus payout were problematic
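To make the 50% Bonus Rule challenge concrete: with recipients capped at half the pool population, a board can be barred from paying employees it judged bonus-worthy. A hedged sketch with hypothetical numbers (the 50% cap is from the brief; everything else is invented for illustration):

def max_recipients(pool_size: int, cap: float = 0.50) -> int:
    """Maximum head count payable under the 50% Bonus Rule cap."""
    return int(pool_size * cap)

pool_size = 80        # hypothetical bonus pool population
budget = 100_000      # hypothetical bonus pool budget, in dollars
deserving = 52        # employees the board judged bonus-worthy

allowed = max_recipients(pool_size)     # 40 of 80 under the rule
shut_out = max(0, deserving - allowed)  # 12 bonus-worthy employees go unpaid
print(f"Bonus-worthy but not payable: {shut_out}")
print(f"Average bonus for those paid: ${budget / allowed:,.0f}")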

Back-Up Data

DCIPS-Wide Component Funding
[Chart: funding by component.]

DCIPS-Wide Component Bonuses
[Chart: bonuses by component.]