Scoring: Categories 1 – 6 (Process Categories). Examiners select a score (0–100) to summarize their observed strengths and opportunities for improvement.

Similar presentations
2012 EXAMINER TRAINING Examples of NERD Comment Formatting

Key Performance Indicators KPI’s
Using Baldrige to Create Organizational Alignment & Integration
EVALUATOR TIPS FOR REVIEW AND COMMENT WRITING The following slides were excerpted from an evaluator training session presented as part of the June 2011.
June 2002QPRC 2002, Tempe, Arizona A Workshop on Assessing to the Baldrige Criteria Cheryl L. Jennings, Motorola Lynn Kelley, Textron.
2014 Baldrige Performance Excellence Program | Introduction to the Baldrige Criteria Baldrige Performance Excellence Program |
2014 Baldrige Performance Excellence Program | Self-Assessing Your Organization with the Baldrige Criteria.
HR Manager – HR Business Partners Role Description
PROJECT TITLE Project Leader: Team: Executive Project Sponsor (As Required): Date: Month/Day/Year 110/17/2014 V1.
Plateau Competency Management and Assessment Overview v 5.8.
Comment Writing Exercise Return Examiner Training.
1 MQA Feedback Report Update Board of Education February 19, 2015 Building learners of tomorrow…
2010 AHCA/NCAL National Quality Award Program - Gold Overview - Jeri Reinhardt Ed McMahon Tim Case.
Tennessee Center for Performance Excellence Section 3 – Evaluating Results.
Do You Know ???.
Planning and Strategic Management
JUDGES MEETING 1. Judges Meeting The Judges Meeting –Each Team will be assigned a date and time. Each Senior Examiner will be expected to present their.
2015 Baldrige Performance Excellence Program | Baldrige Performance Excellence Program | 2015 Self-Assessing Your Organization with.
NEW YORK ARMY NATIONAL GUARD Organizational Self-Assessment Workshop Mr. Edwin Perez ARNG G5 Business Transformation Office 6-8 April 2015Day 1.
Copyright Cengage Learning 2013 All Rights Reserved 1 Chapter 2: Strategic Planning for Competitive Advantage Prepared & Designed by Laura Rush, B-Books,
Dr. Robert Mayes University of Wyoming Science and Mathematics Teaching Center
performance INDICATORs performance APPRAISAL RUBRIC
Quality and Accreditation (1/3) Certification of Kingdom Tower for “ FIT for PURPOSE ” Series of tests of “Fitness” of sub-systems: Foundation Sub-Systems.
2014 Baldrige Performance Excellence Program | Polishing Feedback Comments Sample 1: Process Strength.
Creating a Learning Organization Through the AHCA/NCAL Quality Award Program Demi Haffenreffer, RN, MBA President Haffenreffer & Associates, Inc.
EXAMINER TRAINING 2012 EXAMINER TRAINING Introduction to Evaluating a Baldrige Application Presented by The Granite State Quality Council and The.
Creating Sustainable Organizations The Baldrige Performance Excellence Program Sherry Martin HIV Quality of Care Advisory Committee September 13, 2012.
Strategic Planning Module Preview This PowerPoint provides a sample of the Strategic Planning Module PowerPoint. The actual Strategic Planning PowerPoint.
1. 2 Why is the Core important? To set high expectations –for all students –for educators To attend to the learning needs of students To break through.
Best Practices in Application Writing Kay Kendall Excellence at Work Conference February 25, Abby Rd. Westford, MA
Applicant Name RMPEx Site Visit Opening Meeting Team Leader - Team Members –
District Workforce Module Preview This PowerPoint provides a sample of the District Workforce Module PowerPoint. The actual Overview PowerPoint is 62 slides.
Strategy Review, Evaluation, and Control Chapter Nine.
2010 AHCA/NCAL National Quality Award Program - Silver Award Overview - Session Two Lance Reynolds Kevin Warren Tim Case.
Application Workshop – Session One April 26, 2011.
Classroom Assessments Checklists, Rating Scales, and Rubrics
Application Workshop – Session Five June 21, 2011.
303KM Project Management1 Chapter 2: The Project Management in Context of Organization Environment.
March 26-28, 2013 SINGAPORE CDIO Asian Regional Meeting and Workshop on Engineering Education and Policies for Regional Leaders Programme Evaluation (CDIO.
Geelong High School Performance Development & Review Process in 2014.
1. Housekeeping Items June 8 th and 9 th put on calendar for 2 nd round of Iowa Core ***Shenandoah participants*** Module 6 training on March 24 th will.
2010 AHCA/NCAL National Quality Award Program - Silver Overview - Session One Lance Reynolds Kevin Warren Tim Case.
2008 AHCA/NCAL National Quality Award Program - Step III Overview - Jon Frantsvog Ira Schoenberger Tim Case.
2007 Faculty & Staff Denison Organizational Culture Survey.
 To identify & evaluate whether its resources have got any strategic value or not a firm generally uses various approaches  The approaches are
Factor0–5%10–25%30–45%50–65%70–85%90–100% Approach No systematic approach to Item requirements is evident; information is anecdotal. The beginning of a.
Quality Function Deployment. Example Needs Hierarchy.
Tennessee Center for Performance Excellence Section 2 – Process Evaluation Factors.
Baldrige National Quality Program Baldrige Background l Results l Baldrige Program Impacts Legal Aid Group March 11, 2002.
Chapter 12 Translating Expectations to Specifications CEM 515: Project Quality Management Prof. Abdulaziz A. Bubshait King Fahd University of Petroleum.
Chapter 3 Designing a Competitive Business Model and Building a Solid Strategic Plan.
Welcome, Examiners! Washington State Quality Award Return Examiner Training 2009.
Albemarle County’s Departmental Assessment Process Who, What, When and How? Lori Allshouse, County Executive Department John Freeman, Department of Social.
Department of Defense Voluntary Protection Programs Center of Excellence Development, Validation, Implementation and Enhancement for a Voluntary Protection.
2016 Baldrige Performance Excellence Program | Writing High-Quality Feedback for 2016 Baldrige Award Applicants.
Entrepreneurial Strategies. A Major Shift... From financial capital to intellectual capital – Human – Structural – Customer.
Presented by Deborah Eldridge, CAEP Consultant
Strategic Management and the Entrepreneur-Over view
Finalizing Award Recommendations
Sustaining Continuous Improvement
Applicant Name RMPEx Site Visit Opening Meeting
Strategy Review, Evaluation, and Control
Sterling Examiner Preparation
Strategic Planning Setting Direction Retreat
Conducting a Self Assessment and Developing Your CQI Plan
Writing and Using the Accountability Report in an Academic Setting
Presentation transcript:

Scoring

Scoring Categories 1 – 6 (Process Categories)
Examiners select a score (0–100) to summarize their observed strengths and opportunities for improvement (OFIs).
Scoring Guidelines are provided for Approach, Deployment, Learning, and Integration.
Scores are assigned for each Item.

Scoring Guidelines for Approach (SCORE – PROCESS)
0%–5%: No SYSTEMATIC APPROACH is evident; information is ANECDOTAL. (A)
10%–25%: The beginning of a SYSTEMATIC APPROACH to the BASIC REQUIREMENTS of the Item is evident. (A)
30%–45%: An EFFECTIVE, SYSTEMATIC APPROACH, responsive to the BASIC REQUIREMENTS of the Item, is evident. (A)
50%–65%: An EFFECTIVE, SYSTEMATIC APPROACH, responsive to the OVERALL REQUIREMENTS of the Item, is evident. (A)
70%–85%: An EFFECTIVE, SYSTEMATIC APPROACH, responsive to the MULTIPLE REQUIREMENTS of the Item, is evident. (A)
90%–100%: An EFFECTIVE, SYSTEMATIC APPROACH, fully responsive to the MULTIPLE REQUIREMENTS of the Item, is evident. (A)

[Slide diagram: Category/Item questions and the organization, showing the BASIC, OVERALL, and MULTIPLE requirement levels]

Scoring Guidelines for DEPLOYMENT (SCORE – PROCESS)
0%–5%: Little or no DEPLOYMENT of an APPROACH is evident.
10%–25%: The APPROACH is in the early stages of DEPLOYMENT in most areas or work units, inhibiting progress in achieving the BASIC REQUIREMENTS of the Item.
30%–45%: The APPROACH is DEPLOYED, although some areas or work units are in early stages of DEPLOYMENT.
50%–65%: The APPROACH is well DEPLOYED, although DEPLOYMENT may vary in some areas or work units.
70%–85%: The APPROACH is well DEPLOYED, with no significant gaps.
90%–100%: The APPROACH is fully DEPLOYED without significant weaknesses or gaps in any areas or work units.

Scoring Guidelines for LEARNING (SCORE – PROCESS)
0%–5%: An improvement orientation is not evident; improvement is achieved through reacting to problems.
10%–25%: Early stages of a transition from reacting to problems to a general improvement orientation are evident.
30%–45%: The beginning of a SYSTEMATIC APPROACH to evaluation and improvement of KEY PROCESSES is evident.
50%–65%: A fact-based, SYSTEMATIC evaluation and improvement PROCESS and some organizational LEARNING are in place for improving the efficiency and EFFECTIVENESS of KEY PROCESSES.
70%–85%: Fact-based, SYSTEMATIC evaluation and improvement and organizational LEARNING are KEY management tools; there is clear evidence of refinement and INNOVATION as a result of organizational-level ANALYSIS and sharing.
90%–100%: Fact-based, SYSTEMATIC evaluation and improvement and organizational LEARNING are KEY organization-wide tools; refinement and INNOVATION, backed by ANALYSIS and sharing, are evident throughout the organization.

Scoring Guidelines for INTEGRATION (SCORE – PROCESS)
0%–5%: No organizational ALIGNMENT is evident; individual areas or work units operate independently.
10%–25%: The APPROACH is ALIGNED with other areas or work units largely through joint problem solving.
30%–45%: The APPROACH is in early stages of ALIGNMENT with your basic organizational needs identified in response to the Organizational Profile and other Process Items.
50%–65%: The APPROACH is ALIGNED with your organizational needs identified in response to the Organizational Profile and other Process Items.
70%–85%: The APPROACH is INTEGRATED with your organizational needs identified in response to the Organizational Profile and other Process Items.
90%–100%: The APPROACH is well INTEGRATED with your organizational needs identified in response to the Organizational Profile and other Process Items.
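The four ADLI guideline tables above share the same six scoring bands. Purely as an illustration (not part of the original training slides), the Python sketch below encodes the band boundaries with the Approach descriptors, abridged from the table above, and looks up the band a given score falls in.

```python
# Illustrative sketch only: the six scoring bands shared by the ADLI guideline
# tables above, keyed here with the Approach descriptors (abridged).
APPROACH_BANDS = [
    ((0, 5),    "No SYSTEMATIC APPROACH is evident; information is ANECDOTAL"),
    ((10, 25),  "Beginning of a SYSTEMATIC APPROACH to the BASIC REQUIREMENTS"),
    ((30, 45),  "EFFECTIVE, SYSTEMATIC APPROACH responsive to the BASIC REQUIREMENTS"),
    ((50, 65),  "EFFECTIVE, SYSTEMATIC APPROACH responsive to the OVERALL REQUIREMENTS"),
    ((70, 85),  "EFFECTIVE, SYSTEMATIC APPROACH responsive to the MULTIPLE REQUIREMENTS"),
    ((90, 100), "EFFECTIVE, SYSTEMATIC APPROACH fully responsive to the MULTIPLE REQUIREMENTS"),
]

def approach_band(score: int) -> str:
    """Return the guideline descriptor for the band containing the given score.

    Scores are assumed to be assigned in 5% increments, so every valid score
    falls inside exactly one band.
    """
    for (low, high), descriptor in APPROACH_BANDS:
        if low <= score <= high:
            return descriptor
    raise ValueError(f"{score}% is not a valid 5%-increment score")

print(approach_band(55))  # -> OVERALL REQUIREMENTS descriptor
```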

Overall Score for Categories 1 – 6 Items
An applicant's score for approach [A], deployment [D], learning [L], and integration [I] depends on its ability to demonstrate the characteristics associated with that score.
Although there are four factors to score for each Item, only one overall score is assigned.
The examination team selects the “best fit” score, which is likely to fall between the highest and lowest scores for the ADLI factors.
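Purely as an illustration (not from the slides), the sketch below expresses this “best fit” guidance in Python: the candidate overall scores are the valid 5% increments lying between the lowest and highest of the four ADLI factor scores, and choosing one value within that range remains examiner judgment.

```python
# Illustrative sketch of the "best fit" rule for a single Process Item score.
# The four ADLI factor scores bound the overall score; picking the exact value
# within that range is examiner judgment, not a formula.
VALID_SCORES = range(0, 101, 5)  # scores are assigned in 5% increments

def best_fit_candidates(approach: int, deployment: int, learning: int, integration: int) -> list[int]:
    """Overall scores consistent with the ADLI factor scores."""
    factors = (approach, deployment, learning, integration)
    low, high = min(factors), max(factors)
    return [s for s in VALID_SCORES if low <= s <= high]

# The example on the next slide: both applicants score 60% for A, D, and I, and 20% for L.
print(best_fit_candidates(60, 60, 20, 60))  # candidates run from 20 to 60
# Within that range, the team judged Applicant A at 50% and Applicant B at 30%.
```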

“Best Fit” Example
Consider two applicants, both scoring 60% for A, D, and I, and 20% for L.

Applicant | Market Growth | Technology | Competitors
A | 10%/year | Stable | Stable
B | 2X/year | Major change | Many new

Item scores:
Applicant A: 50% (Learning could lead to incremental results improvements)
Applicant B: 30% (Learning is needed to sustain the organization)

6. Independent Review: LeTCI – Results Items
Objective: Be able to evaluate a Results Item for independent review.

Using a Table of Expected Results to Identify Category 7 OFIs
Purpose: Identify results that the applicant hasn't provided.
Why? Because applicants like to show results that are favorable, but may not always show results that are important.

Table of Expected Results

Title of Expected Result | Source Reference | Category 7 Reference | Result Reference
(Describe the expected result and segments) | (List where in the application you learned this was important to report in Cat 7) | (List where in Cat 7 the result belongs) | (Identify the page or figure that contains the results)

Examples:
Status of Action Plans (% of Action Plans on target) | OP pg. 71, 2.1a2 | | Not found
Workforce segments | OP p. 6 | 7.4: Expect to see segmentation by FT, PT, and on-call staff | Found in 7.4-4, 6, 10; Not found in 7.4-7, 8, 9

Process for Using a Table of Expected Results
As you read the application, watch for processes the applicant cites as important, measures that are discussed, segments that are used, and so on.
As you find examples of these, enter them in columns 1–3 of the table of expected results.
When you review Category 7, check it against the expected results and complete column 4.
Based on the entries in column 4, one or more OFIs can be drafted citing the missing expected results.
Note: These are results for which the applicant has set an expectation in the Organizational Profile and Categories 1–6. They are NOT results that the examiner would like to see.
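Purely as an illustration (the entries below are hypothetical, not taken from any applicant), this Python sketch shows one way to hold the table of expected results and to list the rows whose column 4 is still “Not found”, i.e., the candidates for missing-results OFIs.

```python
# Illustrative sketch of a Table of Expected Results; all entries are hypothetical.
from dataclasses import dataclass

@dataclass
class ExpectedResult:
    title: str                             # column 1: expected result and segments
    source_reference: str                  # column 2: where the application sets the expectation
    category7_reference: str               # column 3: where in Category 7 the result belongs
    result_reference: str = "Not found"    # column 4: page/figure that contains the result

def missing_results(table: list[ExpectedResult]) -> list[ExpectedResult]:
    """Rows still marked 'Not found' after the Category 7 review (OFI candidates)."""
    return [row for row in table if row.result_reference == "Not found"]

table = [
    ExpectedResult("Status of action plans (% on target)", "OP; Item 2.2", "7.1"),
    ExpectedResult("Workforce segments (FT, PT, on-call)", "OP", "7.4",
                   result_reference="Found in Figures 7.4-4, 7.4-6, and 7.4-10"),
]

for row in missing_results(table):
    print(f"OFI candidate: no results reported for '{row.title}' (expected per {row.source_reference})")
```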

Evaluation Factors for Category 7 (Results)
Le = Performance Levels: numerical information that places an organization's results on a meaningful measurement scale. Performance levels permit evaluation relative to past performance, projections, goals, and appropriate comparisons.
T = Trends: numerical information indicating the direction, rate, and breadth of performance improvements. A minimum of three data points is needed to begin to ascertain a trend; more data points are needed to define a statistically valid trend.
C = Comparisons: establishing the value of results by their relationship to similar or equivalent measures. Comparisons can be made to the results of competitors, industry averages, or best-in-class organizations.
I = Integration: connection to important customer, product and service, market, process, and action plan performance measurements identified in the Organizational Profile and in Process Items.
G = Gaps: absence of results addressing specific areas of Category 7 Items, including the absence of results on key measures discussed in Categories 1–6.
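As an illustration only (not part of the slides), the sketch below applies the minimum-three-data-points rule for the T factor and makes a rough trend call from the first and last points; a real evaluation would also weigh the rate and breadth of improvement and statistical validity.

```python
# Illustrative sketch of a rough trend check for the T (Trends) factor.
def trend_direction(values: list[float], higher_is_better: bool = True) -> str:
    """Rough trend call; at least three data points are needed to begin assessing a trend."""
    if len(values) < 3:
        return "insufficient data (fewer than 3 data points)"
    delta = values[-1] - values[0]
    if delta == 0:
        return "flat"
    improving = (delta > 0) == higher_is_better
    return "improving" if improving else "adverse"

print(trend_direction([82.0, 85.5, 88.1]))                       # improving
print(trend_direction([4.1, 3.6, 3.2], higher_is_better=False))  # improving (lower is better)
print(trend_direction([96.0, 93.5]))                             # insufficient data
```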

[Slide graphic: sample results chart illustrating the evaluation factors, with the performance level (Le), trend, and comparison marked; an arrow denotes the “good” trend direction]

Results Strengths and OFIs
Strengths are identified if the following are observed:
Performance levels [Le] are equivalent to or better than comparatives and/or benchmarks,
Trends [T] show consistent improvement, and
Results are linked [I] to key requirements.
Opportunities for improvement are identified if:
Performance levels [Le] are not as good as comparatives,
Trends [T] show degrading performance,
Comparisons [C] are not shown,
Results are not linked [I] to key requirements, or
Results are not provided [G] for key processes and/or action items.
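Purely as an illustration (measure values and thresholds are hypothetical), the sketch below restates the bullets above as a rough classification of a single reported measure into a strength signal or an OFI signal.

```python
# Illustrative sketch of the strength/OFI signals listed above; inputs are hypothetical.
def classify_result(level: float, comparison: float | None,
                    trend: str, linked_to_key_requirement: bool) -> str:
    """Rough strength/OFI call for one reported measure, mirroring the bullets above."""
    if comparison is None:
        return "OFI signal: no comparison is shown"
    if trend == "adverse":
        return "OFI signal: trend shows degrading performance"
    if level < comparison:
        return "OFI signal: performance level is not as good as the comparative"
    if not linked_to_key_requirement:
        return "OFI signal: result is not linked to a key requirement"
    if trend == "improving":
        return "Strength signal: level at or above the comparative, improving trend, linked to a key requirement"
    return "Review further against the scoring guidelines"

print(classify_result(level=93.0, comparison=90.0, trend="improving",
                      linked_to_key_requirement=True))
```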

Scoring a Results Item
Scoring a Results Item is similar to scoring a Process Item.
Scoring Guidelines are provided for Le, T, C, and I.
A single “best fit” score is selected for each Results Item.

Results Scoring for Level [Le] (SCORE – RESULTS)
0%–5%: There are no organizational PERFORMANCE RESULTS or poor RESULTS in areas reported.
10%–25%: A few organizational PERFORMANCE RESULTS are reported; there are some improvements and/or early good PERFORMANCE LEVELS in a few areas.
30%–45%: Improvements and/or good PERFORMANCE LEVELS are reported in many areas addressed in the Item requirements.
50%–65%: Improvement TRENDS and/or good PERFORMANCE LEVELS are reported for most areas addressed in the Item requirements.
70%–85%: Current PERFORMANCE LEVELS are good to excellent in most areas of importance to the Item requirements.
90%–100%: Current PERFORMANCE LEVELS are excellent in most areas of importance to the Item requirements.

Results Scoring for Trend [T] (SCORE – RESULTS)
0%–5%: TREND data either are not reported or show mainly adverse TRENDS.
10%–25%: Little or no TREND data are reported, or many of the TRENDS shown are adverse.
30%–45%: Early stages of developing TRENDS are evident.
50%–65%: No pattern of adverse TRENDS and no poor PERFORMANCE LEVELS are evident in areas of importance to your organization's KEY MISSION or business requirements.
70%–85%: Most improvement TRENDS and/or current PERFORMANCE LEVELS have been sustained over time.
90%–100%: Excellent improvement TRENDS and/or consistently excellent PERFORMANCE LEVELS are reported in most areas.

Results Scoring for Comparison [C] (SCORE – RESULTS)
0%–5%: Comparative information is not reported.
10%–25%: Little or no comparative information is reported.
30%–45%: Early stages of obtaining comparative information are evident.
50%–65%: Some TRENDS and/or current PERFORMANCE LEVELS, evaluated against relevant comparisons and/or BENCHMARKS, show areas of good to very good relative PERFORMANCE.
70%–85%: Many to most reported TRENDS and/or current PERFORMANCE LEVELS, evaluated against relevant comparisons and/or BENCHMARKS, show areas of leadership and very good relative PERFORMANCE.
90%–100%: Evidence of industry and BENCHMARK leadership is demonstrated in many areas.

Results Evaluation: Key Concepts – Integration
Results align with key factors, e.g.:
Strategic challenges
Workforce requirements
Vision, mission, and values
Results are presented for:
Key processes
Key products and services
Strategic accomplishments
What examples can you think of? Strong integration? Not-so-strong integration?

Results Evaluation: Key Concepts – Integration
Strong integration: results are presented for
Key areas addressing strategic challenges
Key competitive advantages
Key customer requirements
Not-so-strong integration:
Results are missing for the above
Results are presented that the examiner can't match to Process Items or the Organizational Profile

Results Scoring for Integration [I] (SCORE – RESULTS)
0%–5%: RESULTS are not reported for any areas of importance to your organization's KEY MISSION or business requirements.
10%–25%: RESULTS are reported for a few areas of importance to your organization's KEY MISSION or business requirements.
30%–45%: RESULTS are reported for many areas of importance to your organization's KEY MISSION or business requirements.
50%–65%: Organizational PERFORMANCE RESULTS address most KEY CUSTOMER, market, and PROCESS requirements.
70%–85%: Organizational PERFORMANCE RESULTS address most KEY CUSTOMER, market, PROCESS, and ACTION PLAN requirements.
90%–100%: Organizational PERFORMANCE RESULTS fully address KEY CUSTOMER, market, PROCESS, and ACTION PLAN requirements.

“Key” Results
“Key” refers to the elements or factors most critical to achieving the intended outcome.
In terms of results, look for:
Those responsive to the Criteria requirements
Those most important to the organization's success
Those that are essential for the organization to pursue or monitor in order to achieve its desired outcome

THANK YOU! Your support and participation as Examiners help us all by helping WSQA fulfill its mission!