Evaluating and Managing Performance


1 Evaluating and Managing Performance
…it's not just about vendors
Yukon Procurement Conference, February 16, 2015

2 Purpose
Provide an overview of:
- what "Vendor Performance Management" is, and is not
- some examples from public sector agencies in Canada (and their different objectives)

3 What is "Vendor Performance Management"?
It is:
- about improving success in meeting contract outcomes
- one element of "Vendor Management":
  - aligning procurement, contract terms and project management to meet organizational objectives
  - monitoring/mitigating vendor risks
  - aggregating, assessing and communicating vendor performance information
  - actively managing relationships

It is NOT:
- code for simply increasing criticism of suppliers
- surprise report cards
- only about "bidder barring"

4 Benefits
- Increase accountability
- Help achieve best value for taxpayers
(Office of the Procurement Ombudsman, 2010)
- Increase competitive advantage
- Improve stakeholder satisfaction
- Increase performance visibility
(Survey Analytics, 2011)

"Measuring, monitoring, evaluating and reporting on vendor performance creates an atmosphere that fosters better communication and results in improved government-vendor relationships." (Office of the Procurement Ombudsman, Procurement Practices Review)

5 How is it done?
Most effective approaches integrate tools across all phases of the procurement and contract lifecycle. For example, buyers can:
- ensure a clear statement of functional and performance requirements
- establish key performance indicators (KPIs)
- use in-contract evaluations, monitoring procedures, and measurement of performance against KPIs
- evaluate and maintain records on supplier performance
- provide feedback to suppliers

7 Which tool(s) to pick?
Each tool:
- has different advantages/disadvantages
- requires a different level of investment to implement and maintain
- has a different likelihood of success, depending on:
  - market characteristics (e.g. market size and health, maturity of vendors)
  - the capacity/maturity of an organization to properly implement and support it

8 Public Sector Activity
Recent survey (Ontario)*:
- 92% of suppliers and buyers think VPM is an important activity
- 25% of organizations have VPM activities

Becoming increasingly common across Canada; some examples include:
- PWGSC
- Canada Revenue Agency
- Correctional Service of Canada
- Nfld Department of Transportation and Works
- Infrastructure Ontario
- Defence Construction Canada
- Province of BC
- Ministry of Government Services (Ontario)

*Malatest & Assoc. (2012), Vendor Performance Management Study

The Malatest study found that, of the public sector organizations surveyed that did have vendor management tools in place, the majority used performance clauses in contracts to help manage vendor performance (94%), as well as a variety of templates and forms (75%), progress meetings (70%), performance documentation (65%), and third-party verification (35%). In addition, approximately 80% of respondents (with virtually no difference between buyers and suppliers) thought performance should be measured on every contract (and not, for example, just those triggered by poor performance or by a dollar threshold). The study also proposed that the most important elements of performance to measure, in order of priority, were:
1. Quality of deliverable
2. Effective communication throughout the engagement
3. Budget/cost control
4. Availability of resources to carry out the contract
5. Quality of resources
6. Maintaining timelines/deadlines

9 Defence Construction Canada (DCC)
Driver: "train (not penalize) contractors/consultants to meet expectations of DCC"
Overall rating = sum of points from equally weighted criteria:
- Administration / Contract Management
- Execution / Project Management
- Quality of Workmanship
- Completion / Close Out / Time
- Health & Safety
Scale: Unacceptable (0-5), Not Satisfactory (6-10), Satisfactory (11-16), Superior (17-20)

Defence Construction Canada (DCC) is a federal Crown agency that provides contracting, construction contract management and related infrastructure services to the Department of National Defence. Note that the slide shows the criteria for contractors; there is a slightly different list for consultants.

10 DCC - continued
- Bidding privileges are suspended for any score of 5 or less in one category, or a second occurrence of a total score less than 50%
- "Failures" are relatively rare
- Heavy reliance on documentation: clear evidence is required that DCC's own practices didn't contribute
- Generally considered by staff and suppliers to be successful; staff believe it contributes to their ability to be an attractive client in a competitive market

Staff at DCC indicated that overall the system is considered to be working well. They rely on an evidence-based approach to document performance concerns and will pursue suspensions only in cases where the documentation clearly indicates that DCC's own contract management practices have not contributed to the poor vendor performance. The system is appreciated by the majority of suppliers and contributes to the organization's ability to be an attractive client in a competitive market.
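The DCC suspension rule can be sketched in a few lines. This is an illustrative reconstruction, not DCC's actual system: it assumes each of the five equally weighted criteria is scored on the 0-20 scale shown on the previous slide (so the maximum total is 100), and all names are invented for the example.

```python
# Illustrative sketch of the DCC bidding-suspension rule (assumed scoring:
# each of the five criteria is rated 0-20, for a maximum total of 100).

CRITERIA = [
    "Administration / Contract Management",
    "Execution / Project Management",
    "Quality of Workmanship",
    "Completion / Close Out / Time",
    "Health & Safety",
]
MAX_PER_CRITERION = 20

def should_suspend(scores: dict, prior_total_below_50_pct: bool) -> bool:
    """Return True if bidding privileges would be suspended.

    Triggers (per the slide):
      - any single category scored 5 or less, or
      - a second occurrence of a total score below 50% of the maximum.
    """
    if any(scores[c] <= 5 for c in CRITERIA):
        return True
    total = sum(scores[c] for c in CRITERIA)
    below_50 = total < 0.5 * MAX_PER_CRITERION * len(CRITERIA)
    return below_50 and prior_total_below_50_pct
```

For example, a contractor scoring 8 on every criterion (total 40, below 50%) would not be suspended on a first occurrence, but would be on a second.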

11 Infrastructure Ontario
- Driven in part by a desire to reduce the time spent evaluating a high volume of responses to RFPs/tenders
- Required for all contracts >$100,000
- Vendor Performance Rating = average of that vendor's Scorecards over a three-year period
- Vendor performance is rated as:
  1 – Consistently falls far below expectations (<25% of expectations met)
  2 – Frequently misses expectations (<50%)
  3 – Mostly meets expectations but sometimes misses (<75%)
  4 – Consistently meets expectations (100% of requirements)
  5 – Exceeds expectations
- Ratings are applied during the final evaluation and are worth a minimum of 10%; excellent service can lead to extended time on the prequalified list

A Vendor Performance Rating ("VPR") is an average of all of that vendor's (final) Scorecards over a three-year period. If the vendor does not have a scorecard for each of the three years, its VPR will be based on the average ratings for the years for which scorecards are available, or, if it has none, it will be assigned a rating based on the average of all IO vendors' ratings from the previous fiscal year (the "Global Average").
1 – Consistently falls far below expectations (<25% of expectations met). Performance jeopardized the achievement of contract requirements, despite additional oversight.
2 – Frequently misses expectations (<50%). A number of performance issues required IO to provide additional oversight to ensure that contract requirements were met.
3 – Mostly meets expectations but sometimes misses (<75%). There are very minor performance issues, but the vendor or service provider has otherwise met the contract requirements.
4 – Consistently meets expectations (100% of requirements). There are no performance issues and the vendor or service provider has met the contract requirements.
5 – Exceeds expectations. The vendor or service provider has demonstrated a performance level in measurable excess of contract requirements (e.g. provided tangible recommendations for improvement, proactively addressed issues before they arose, etc.).
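The VPR averaging rule above, including the Global Average fallback, amounts to a one-line calculation. The sketch below is illustrative only; the function name and signature are invented, not part of IO's system.

```python
# Illustrative sketch of the Infrastructure Ontario Vendor Performance
# Rating (VPR): the average of a vendor's final Scorecard ratings (1-5)
# over a three-year period, falling back to the Global Average (the mean
# of all IO vendors' ratings from the previous fiscal year) when the
# vendor has no scorecards at all.

def vendor_performance_rating(scorecards: list, global_average: float) -> float:
    """scorecards: the 1-5 ratings available from the last three years
    (may cover fewer than three years, or be empty)."""
    if not scorecards:
        return global_average  # no history: use the Global Average
    return sum(scorecards) / len(scorecards)
```

For example, a vendor with scorecards of 4, 5 and 3 gets a VPR of 4.0, while a first-time vendor simply inherits the Global Average.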

12 BC – Internal Reference Check (2011)
- Required for services contracts >$10M
- Conducted at the beginning of a prequalification process; results in pass/fail
- Common indicators include:
  - Performance: contractual standards and service target levels
  - Cost performance
  - Schedule performance
  - Team

This internal reference check must review a vendor's performance on all contracts with a value of $1 million or greater that the vendor has had with the Province in the past three years. The policy arose from concerns that poor performance by a vendor on one government contract often had no consequences for its ability to bid on, or be selected for, future opportunities within government. The pressure was public/political, not driven by staff.
- Performance: contractual standards and service target levels (has the contractor met contractual standards and service target levels? Was the client satisfied with the contractor's performance of the services? Other considerations include professional behaviour, client management, and timely meeting of administrative requirements, e.g. reporting, record keeping, etc.)
- Cost performance (were the services provided within budget or close to the cost estimates? Did the contractor manage expenses appropriately? How did the contractor manage cost overruns?)
- Schedule performance (timeliness of completion, meeting of target levels, interim and final milestones)
- Team (evaluation of parent companies, affiliates, major subcontractors, management, key personnel, predecessors)

BC's experience suggests that staff support for the tools is a critical consideration. The internal reference check procedure is not considered to be particularly successful: there is significant reluctance among staff both to record and to rely on documentation of poor performance, at least in the case of these high-value and typically highly visible strategic contracts, to eliminate a vendor from a procurement process. The policy does not appear to have had a significant impact on the procurements it is intended to affect.

The Vendor Scorecards, however, appear to be off to a more successful start. SPO has been using a 'deal health check' measurement tool on the major strategic deals for the past two years in an informal way, i.e. not required in the contracts themselves and not intended for public reporting purposes. The 'health check' has some significant similarities to the Vendor Report Card in that it assesses similar areas and involves some two-way ratings. Through this approach SPO has gained the input and acceptance of the vendors involved, which will help with the launch of the Report Card as a formal performance management and public reporting tool in the coming year. The usefulness of the tool in supporting a proactive and constructive approach to identifying and managing performance issues led SPO to extend implementation beyond the vendors associated with the strategic deals alone to those involved in helping SPO design, implement and manage them.

13 BC – Vendor Scorecard (2014)
- Two new tools: one for "strategic deals", one for the consultants that support them
- Driver: recognition that improving performance requires clear and ongoing communication concerning performance expectations
- Rating is carried out by both parties and aims to foster discussion about areas needing improvement or differences of opinion
- For consultants: 2 or more of 8 questions rated unsatisfactory = 1 demerit; 3 demerits = potential removal from the RFQ list

SPO has developed two Vendor Scorecard tools. The first is for use in assessing the relationship with, and performance of, the vendors involved in the strategic deals themselves. The current portfolio of these strategic deals consists of 13 major projects with a total contract value of $5.9 billion and annual contract expenditures of around $550 million. The second tool is designed to assess the performance of consultants involved in supporting SPO's mandate (including negotiations, procurement support, training, development of tools and analysis of deals, etc.), which involves several dozen individuals and firms, initially selected through an RFQ process, and expenditures of $2-4 million annually.

At the conclusion of each contract, the contract manager and the direct client contact (i.e. the person most closely linked to the work performed by the consultant) jointly complete a consultant Rating Sheet. A copy of the completed Rating Sheet is provided to the consultant, and a debriefing meeting is held if requested. A Rating Sheet receives a "Fail" if two or more of the eight questions receive a mark of "Sometimes / Rarely / Never / No / Unsatisfactory". Each failed Rating Sheet is a "demerit" and is recorded in the vendor's SPO Vendor Scorecard. Three recorded demerits may result in the consultant (individual or firm) being removed from the RFQ List of Qualified Suppliers. Companies can reapply to subsequent RFQs issued by the SPO once the current RFQ has expired (in 2016). Removal from the RFQ list does not preclude vendors from bidding on other procurements issued by government. Each SPO Vendor Scorecard will be retained and held by the SPO and may be used for future reference-checking purposes.
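The Rating Sheet fail/demerit logic described in the notes above can be sketched as follows. This is an illustrative reconstruction with invented names; the actual SPO forms and marks may differ.

```python
# Illustrative sketch of the SPO consultant Rating Sheet / demerit rule:
# a sheet fails if two or more of its eight questions receive an
# unsatisfactory mark; three recorded demerits may trigger removal from
# the RFQ List of Qualified Suppliers.

UNSATISFACTORY_MARKS = {"Sometimes", "Rarely", "Never", "No", "Unsatisfactory"}

def sheet_fails(answers: list) -> bool:
    """answers: the marks for the eight Rating Sheet questions."""
    assert len(answers) == 8, "a Rating Sheet has eight questions"
    unsatisfactory = sum(1 for a in answers if a in UNSATISFACTORY_MARKS)
    return unsatisfactory >= 2  # each failed sheet = one demerit

def removal_candidate(demerits: int) -> bool:
    """Three demerits may result in removal from the RFQ list."""
    return demerits >= 3
```

Note that removal is discretionary ("may result in"), so `removal_candidate` flags a threshold rather than an automatic outcome.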

14 Questions?
Note: TEB is piloting a survey initiative for consultants designed to increase communication about performance feedback for both parties, covering:
- managing costs
- maintaining the project schedule
- writing clearly / complying with contract specifications
- managing support staff / third-party contractor(s)
- communicating effectively
- overall management abilities (e.g. problem solving, documenting and minimizing risk)
- areas where most effective or needing improvement
- overall satisfaction
Additional question for the Project Manager: To what extent did the final product or service satisfy the terms of the contract?

