Contract management – new approaches
Measuring Contractor Performance
Nick Capon, Centre for Enterprise Research and Innovation
Acknowledgements to Stephen DeBoise, Portsmouth City Council
Scope
– Apologies for the absence of Richard Tonge
– Recent research at Portsmouth City Council
– Current methods they use; strengths and weaknesses
– Proposed improvements
Definitions?
Before an order is placed:
– 'Contractor Evaluation'
– Pre-Qualification Questionnaire (PQQ)
After an order is placed:
– 'Contractor Appraisal', or
– 'Vendor Rating'
Benefits of measurement?
– Joint understanding of customer needs
– Motivating improvement
– Cost of control minimised
– Reduced waste and complaints
– Benchmarking of contractor performance
– Demonstrating control of the vendor base
[Diagram: customer, your organisation, contractor and other stakeholders align aims and priorities to deliver value]
Challenges – excuses?
– Workload – labour and data intensive
– Subjectivity – is there evidence? do rewards/penalties cause bias? do the measures recorded depend on how they are explained?
– Motivation – measures can create argument rather than benefit
– Historical nature – slow
– Investment needed to mechanise data collection and communication
Theory – QTCC
If the contractor is not critical to continued success, measure:
– Quality
– Time
– Cost
– Communication
Theory – 7 'C's
If the contractor is critical, measure:
– Competency
– Capacity
– Commitment
– Control
– Cash resources
– Cost
– Consistency
Source: "Measure for Measure", Supply Management, 1 February 2001, p. 39
What to measure?
Or the same QTCC in greater detail:
– Cost: prices, target costs, non-performance costs, savings achieved per year
– Quality: end-customer complaints and feedback, SPC capability analysis and SPC trend reporting, SLA achievement, documented service design improvements
– Time: source reliability, staff turnover, compliance with procedures, financial stability, our total workload as a percentage of contractor turnover (<30%), on-time delivery
– Communication: relationships, understanding of needs and values, communication delays
Source: Purchasing Principles and Management, Bailey and Farmer, 2005
What do we measure now?
– Outputs – achieve specification. Organisation concern: consistency; contractor concern: trends, review
– Outcomes – survey of clients. Organisation concern: difficult; contractor concern: low response
– Process – would we work with you again? Organisation concern: ethics, innovation; contractor concern: must compare
What would we like to measure?
Contractor priorities:
1. Outputs, compliance with specification
2. Sustainability, continuity
3. Value for money
4. Innovation
5. Competition
6. Partnership, shared values
7. Skills, best practice
Organisation priorities:
1. Partnership, shared values
2. Outcomes for service users
3. Communication, trust
4. Understanding of needs
5. Value for money
6. Sustainable company
What would help?
What can the organisation do to help the contractor?
– Time to plan
– Information sharing, also between organisation departments
– Reduce complexity
– Transparency
– Stable, agreed expectations
– Feedback, clarity
What can the contractor do to help the organisation?
– Soft market testing to stimulate new suppliers
– Willingness to change measures to suit
When? At need identification, at ITT, at PQQ, at tender, at SLA, at contract review
Measure of Overall Satisfaction
(9 = Excellent, >6 = Good, >0 = Improvement required, <0 = Failing to perform)
– Customer perception (outcomes): satisfaction survey >90% / >75% / >60% / >0, plus complaints low. Score 3, 2, 0, -1
– Process expectations (subjective): Exceeds / Meets / Mostly / None. Score 3, 2, 0, -1. Evidence required
– Contract requirements (outputs, service levels defined in specs): Exceeds / Meets / Mostly / None. Score 3, 2, 0, -1. Contractor data, plus periodic independent check
Conclusion: 'Least good at, best at…'
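The banding above can be expressed as a small calculation. A minimal sketch, assuming each of the three components (outcomes, process, outputs) is rated Exceeds/Meets/Mostly/None and scored 3/2/0/-1 as on the slide; the function name is illustrative, and treating a total of exactly 0 as 'Failing to perform' is my assumption, since the slide only defines <0.

```python
# Sketch of the overall-satisfaction banding: three component ratings
# are converted to scores, summed, and mapped to a band.
COMPONENT_SCORES = {"Exceeds": 3, "Meets": 2, "Mostly": 0, "None": -1}

def overall_satisfaction(outcomes: str, process: str, outputs: str) -> tuple[int, str]:
    """Sum the three component scores and return (total, band)."""
    total = sum(COMPONENT_SCORES[r] for r in (outcomes, process, outputs))
    if total == 9:
        band = "Excellent"
    elif total > 6:
        band = "Good"
    elif total > 0:
        band = "Improvement required"
    else:
        band = "Failing to perform"  # slide says <0; 0 treated as failing here (assumption)
    return total, band

print(overall_satisfaction("Exceeds", "Meets", "Meets"))  # → (7, 'Good')
```

One design point the banding makes visible: "Mostly" scores 0 rather than 1, so a contractor who only mostly meets requirements across the board lands in "Improvement required" or worse, never "Good".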
How to measure?
– Check goal alignment: what does the contractor think your priorities are?
– Remove your role and allow the end customer to communicate directly with the contractor where possible. Examples: website feedback from customers (a measure plus a quote); a travel agent's website of hotels
– Self-assessment by the contractor of trends: encourages involvement, reduces workload; SPC trends are more important than KPIs
– Independent assessor for depth, where the customer is not web literate. Example: Help the Aged assessing care homes
– 360-degree feedback
Constraints
– Resources in the organisation: to create three appropriate measures for each contract, and a simple, transparent database to update results
– Training for contractors
– A template for contractors to provide information
– Sustaining the scheme
Pilot testing – Outcomes
– Volume of valid complaints, compared to a target agreed with the contractor
– Asking customers: 'How likely are you to recommend us to a friend (0-100%)?' and, if required, 'What extra should we have done to get a 100% score?'
– Method: user surveys, comparison with benchmarks
Pilot testing – Outputs
– Meet timescale: non-achievement, missed/late deliveries; rectification response time
– Quality: capability, maintaining adequate resources and skills; work completed in sufficient detail; safety, environment, discrimination
– Price: variations to contract pricing; number of cost-saving improvements
Pilot testing – Process
– Problem resolution, including 360° feedback
– Communication response
– Invoice accuracy
– Technical innovation
– Ongoing financial stability
– Cultural ethos/values the same as ours
Subjective assessment, with evidence
Illustrative current practice – Outcomes
[Chart: % of respondents who measure complaints (reactive, volume) and satisfaction (proactive, %)]
Methods:
– Complaints: unsolicited praise/criticism by letter, newspaper or telephone; real-time update of a shared web database; minuted monthly if serious
– Satisfaction: proactive survey
Illustrative current practice – Outputs
[Chart: % who measure meeting timescale; rectification response time; non-achievement, missed deliveries; maintaining adequate resources and skills (quality); work completed in sufficient detail; safety, environment, discrimination; financial overspend, changes/variations to contract price; number of cost-saving improvements]
Methods:
– Self-assessment by contractor
– Some use SPC to monitor trends
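Where contractors self-assess with SPC, a simple control chart over monthly counts makes the distinction between a real trend and normal noise concrete. A minimal sketch, assuming a c-chart (limits at mean ± 3√mean, the standard SPC chart for event counts); the function name and the sample data are illustrative, not taken from the pilot.

```python
import math

def c_chart_signals(counts):
    """Return indices of periods whose count falls outside c-chart control limits."""
    c_bar = sum(counts) / len(counts)      # centre line: average count per period
    sigma = math.sqrt(c_bar)               # Poisson-based standard deviation
    ucl = c_bar + 3 * sigma                # upper control limit
    lcl = max(0.0, c_bar - 3 * sigma)      # lower control limit, floored at zero
    return [i for i, c in enumerate(counts) if c > ucl or c < lcl]

# Monthly missed deliveries: only the spike in month 6 breaches the limits.
missed = [4, 3, 5, 4, 6, 3, 14, 4]
print(c_chart_signals(missed))  # → [6]
```

This is the sense in which "SPC trends are more important than KPIs": months 0-5 vary between 3 and 6 without signalling anything, so only the genuine outlier triggers a review conversation.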
Illustrative current practice – Process
[Chart: % who measure problem resolution (including 360° feedback); communication response; invoice accuracy; technical innovation; ongoing financial stability; cultural ethos the same as ours]
Methods:
– Verbal dialogue
– Monthly minuted discussion
How reported?
– Monthly report, face-to-face discussion
– Geographical analysis to direct improvement action
– SPC charts to highlight significant issues
Action resulting?
– Financial sharing of improvements / penalties for failure
– Focus for improvement action
– Planned, transparent sharing of vendor-rating results (via an IT system) with the rest of the organisation, who might buy from the same contractor
Questions?