THE ADVANCED IRB FORUM Monday 23 June Kevin Ryan

AREAS TO BE COVERED
Issues in validating internal ratings systems
How are supervisors reacting – emerging thoughts
The FSA Consultative Paper
–Sections on Validation, External Models & Data, Data Quality, Assessment Horizon & PD estimation

NOT TO BE COVERED
A detailed ‘how to do it’ guide to getting your IRB systems approved by the FSA
We do not know enough at this stage
–“We recognise a lot of work remains in this complex area with many challenges for firms and us”
It is the responsibility of firms to validate their rating systems
–Firms ‘validate’; supervisors ‘approve’ (or certify)

STYLISED SUPERVISORS’ VIEW OF RATING SYSTEMS
Data and validation are a challenge!
Good work has been done, but it has not been part of firms’ mainstream activity, so:
–Even in the best firms, few people understand the issues
–Knowledge and oversight by senior management and control functions are quite low
–Resources for improvements can be difficult to obtain, so trade-offs are made
–Limited appetite for fresh thinking or for challenging assumptions
Appetite for short-cut solutions, eg agency ratings

A MODEL IS PROVED BY ITS PERFORMANCE? SMALL SAMPLE SIZE
In the common binomial test, construct a confidence interval around the estimated PD
Suppose a PD estimate is 100bp, and you want to be 95% confident that the actual PD is between 80 and 120bp
You need 9500 borrowers in that grade
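The arithmetic behind that figure can be sketched with the usual normal approximation to the binomial (my own illustration; the parameter values are those on the slide):

```python
from math import ceil

# Normal approximation to the binomial: the 95% confidence interval around
# an estimated PD has half-width z * sqrt(p * (1 - p) / n).
# Solve for the n that narrows the half-width to the target of 20bp.
z = 1.96            # two-sided 95% normal quantile
p = 0.01            # estimated PD of 100bp
half_width = 0.002  # actual PD within 80-120bp, i.e. +/- 20bp

n = ceil(z**2 * p * (1 - p) / half_width**2)
print(n)  # 9508, i.e. the slide's "9500 borrowers"
```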

A MODEL IS PROVED BY ITS PERFORMANCE? SMALL SAMPLE SIZE
Using the common binomial test, what levels of true PD are consistent with zero observed defaults, at a 95% confidence level?
–If you have 500 borrowers in a grade, any PD up to 1.25%
–If you have 80 in a grade, a PD as high as 6% or 7%
(The binomial test doesn’t work if PD × borrowers is below 5)
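For reference, the independent-defaults version of this bound can be sketched as follows (my own illustration, not from the slides). Note that under independence the bounds come out tighter than the figures quoted above; wider figures are consistent with the point on the next slide that correlated defaults widen confidence bounds.

```python
# Assuming independent defaults, the largest true PD consistent with zero
# observed defaults at 95% confidence solves (1 - p)**n = 0.05.
def max_pd_zero_defaults(n, alpha=0.05):
    return 1 - alpha ** (1.0 / n)

print(f"{max_pd_zero_defaults(500):.2%}")  # 0.60% with 500 borrowers
print(f"{max_pd_zero_defaults(80):.2%}")   # 3.68% with 80 borrowers
```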

A MODEL IS PROVED BY ITS PERFORMANCE? DEFAULTS ARE CORRELATED
The distribution you can expect from defaults is many periods with actual defaults below the mean, and fewer periods (typically clustered together) with actual defaults well above the mean
IMPLICATIONS
–Confidence bounds are wider than binomial
–More difficult to interpret
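That clustering can be illustrated with a one-factor (Vasicek-style) simulation, the standard way of modelling correlated defaults; the parameter values below are illustrative assumptions, not from the slides.

```python
import math
import random
from statistics import NormalDist

N = NormalDist()

def yearly_default_rate(n=1000, pd=0.01, rho=0.15, rng=random):
    """Default rate for one period under a one-factor (Vasicek) model."""
    z = rng.gauss(0, 1)                # systematic factor for the period
    threshold = N.inv_cdf(pd)          # default threshold matching the PD
    # PD conditional on the systematic factor (Vasicek formula)
    p_cond = N.cdf((threshold - math.sqrt(rho) * z) / math.sqrt(1 - rho))
    return sum(rng.random() < p_cond for _ in range(n)) / n

rng = random.Random(0)
rates = sorted(yearly_default_rate(rng=rng) for _ in range(1000))
# Most periods fall below the 1% mean PD, with a fat right tail of bad
# periods well above it - exactly the shape described on the slide.
print(f"median {rates[500]:.2%}, 95th percentile {rates[950]:.2%}")
```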

A MODEL IS PROVED BY ITS PERFORMANCE? IMPLICATIONS
“Regardless of the efforts used by banks, …there will still remain some portfolios where there will likely never be sufficient default data to calibrate PDs with any degree of statistical significance” Internal Ratings Validation Study
“We’re going to need to be pragmatic for some years to come”

WHAT CAN BE DONE? BROADER APPROACH TO VALIDATION, TO SUPPLEMENT OUTCOMES ANALYSIS
–Logic and conceptual soundness of the approach
–Statistical testing prior to use
–Monitoring of process – are the methods being applied as intended?
–Benchmarking – compare to relevant alternatives
David Wright, FRB, 19 June

WHAT CAN BE DONE? MORE ON OUTCOMES ANALYSIS, INCLUDING
–More exploration and use of tolerance levels by firms, which requires a better understanding of the distribution of expected defaults
–Supervisory interest in the information content of transition matrices
–What can be learnt from other industries? Insurance experience of modelling rare events?
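As a sketch of what the transition-matrix analysis mentioned above involves, the following estimates a one-year rating transition matrix from observed migrations (the grades and migration data are invented for illustration):

```python
from collections import Counter

grades = ["A", "B", "C", "D"]  # D = default
# (grade at start of year, grade at end of year) for each borrower
migrations = [("A", "A"), ("A", "A"), ("A", "B"), ("B", "B"), ("B", "A"),
              ("B", "C"), ("C", "C"), ("C", "D"), ("C", "C"), ("C", "B")]

counts = Counter(migrations)
starts = Counter(g for g, _ in migrations)
# Row g of the matrix: observed frequency of ending in each grade h
matrix = {g: {h: counts[(g, h)] / starts[g] for h in grades}
          for g in grades if starts[g]}
print(matrix["C"])  # {'A': 0.0, 'B': 0.25, 'C': 0.5, 'D': 0.25}
```

The bottom-right entry of each row is the observed default rate from that grade, so the matrix carries back-testing information as well as migration behaviour.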

FSA APPROACH – FSA CP KEY POINTS
Accountability
–Firms’ responsibility to validate and submit documentation with the IRB application
–Application to be signed by the chief executive; we expect he will have similar questions to supervisors
Independence
–Approved by a senior committee
–Independent staff to participate

FSA APPROACH – FSA CP KEY POINTS
Scope
–To cover all portfolios, but depth will vary with significance
Materiality considerations
–Methods may vary, especially by portfolio
–If more reliance can be placed on back-testing, we should need less additional evidence; if less reliance on back-testing, more additional evidence is needed

FSA APPROACH – FSA CP KEY POINTS
Scope
–Must assess the accuracy of the overall output of the system, not just the inputs
–Will need to take account of overrides and judgmental adjustments to any underlying statistical models
–Based on statistical analysis of the rating system, related internal data, and third-party and publicly available information

FSA APPROACH – FSA CP
Coverage
–Full documentation
–Clarity on what the rating system is aiming to predict, and the expected distributions
–Take full account of adjustments between unbiased estimates and those used in the regulatory capital calculation: conservatism, cyclical effects, double default effects

FSA APPROACH – FSA CP
Coverage
–Clearly set out standards of control
–Clearly set out limitations of the approach; assume that there will be candour
–Include work to demonstrate both the ability to rank into grades (discriminate) and to estimate a PD etc (calibrate)
–Include steps to be taken in the event that quantitative tests for power and accuracy are breached

FSA APPROACH – FSA CP
Discrimination/ranking
–The firm must justify that its system shows a ‘high degree of power in line with industry norm for portfolios of that nature’
–We do not specify what test should be used, eg Gini or other, or its level, but we consult on such a test
–We expect firms to strive for the best model they can, other things being equal

FSA APPROACH – FSA CP
Discrimination/ranking – issues
–Some see discrimination as key to model building
–Some research suggests high discrimination is needed to achieve accurate PD measures
–But achievable levels depend on features of the portfolio and the number of defaults
–High discrimination is good, but may reflect over-fitting
–Firms must use targets, but are reluctant to admit to them
–Importance of expertise to interpret the results
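The Gini coefficient mentioned above as a candidate discrimination test can be sketched as follows (a standard construction, not an FSA-prescribed test): it equals 2 × AUC − 1, where AUC is the probability that a randomly chosen defaulter was scored riskier than a randomly chosen survivor. The scores and outcomes below are a toy portfolio for illustration.

```python
def gini(scores, defaulted):
    """Gini / accuracy ratio; scores: higher = riskier, defaulted: bools."""
    bad = [s for s, d in zip(scores, defaulted) if d]
    good = [s for s, d in zip(scores, defaulted) if not d]
    # AUC: chance a defaulter is scored riskier than a survivor (ties = 0.5)
    wins = sum((b > g) + 0.5 * (b == g) for b in bad for g in good)
    auc = wins / (len(bad) * len(good))
    return 2 * auc - 1

scores    = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
defaulted = [True, True, False, True, False, False, False, False]
g = gini(scores, defaulted)
print(round(g, 3))  # 0.867: high but imperfect ranking (1.0 is perfect)
```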

FSA APPROACH – FSA CP
Calibration
–A standardised ‘scorecard’ is proposed
–The firm is to justify its estimate against its own historic experience and external sources
–Take account of factors which may be expected to lead to differences, eg conservatism, cycle effects
–We do not specify tests for assessing accuracy, but we consult on such a test
–If actuals are not consistent with estimates, firms must justify the differences and/or take steps to improve

FSA APPROACH – FSA CP
Data quality
–“Data quality was not seen as the major obstacle to validation in the long term” Internal Ratings Validation Study
–Some evidence that cleaning the data produces more powerful models, while greater accuracy is self-evident
–Missing defaults?
–“We recognise that the data accuracy challenge for IRB is significant” FSA CP
–“We propose quantifiable targets to cover completeness and accuracy that will rise over time”

FSA APPROACH – FSA CP
External models and data
–In principle supportive, as they can supplement internal data and models; external vendors may also operate to higher standards than firms
–But external models carry the explicit risk of a black box, with limitations not known and application to inappropriate portfolios
–There is a limit to how much vendors will reveal
–We are trialling a ‘vendor grid’ which would standardise the information to be provided, aimed at reducing these risks

FSA APPROACH – FSA CP
Pooled data
–In principle supportive as, like external data, pooling can supplement internal experience, without the commercial confidentiality issues of external vendors
–“Issues and challenges facing pooling initiatives
Consistent data definitions for all
Legal restrictions – data protection and confidentiality
Potential increase in systemic risk” Internal Ratings Validation Study

FSA APPROACH – FSA CP
Expert judgment
–“There is a feeling that not enough time has been spent on discussing acceptable validation techniques for these types of systems” Internal Ratings Validation Study
–We will look to accommodate such systems, but the validation challenges increase
–Are there some cases where it is not feasible to produce quantified estimates?
–Or use ranking with conservative estimates of losses

BENCHMARKING
A complement to back-testing
Conceptually two types, although the differences between them may be blurred in practice
–Relative benchmarks – compare estimates between firms to identify outliers
–Absolute benchmarks – compare with an external benchmark given some credibility
We propose some element of benchmarking in our proposed approaches to discrimination (industry standard) and calibration (comparison with external sources)
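Relative benchmarking, as described above, amounts to comparing firms' estimates for shared exposures and flagging outliers; a minimal sketch follows. The firm names, figures and the factor-of-two outlier rule are all invented for illustration.

```python
import statistics

# Hypothetical PD estimates (in %) from five firms for one shared obligor
estimates = {
    "Firm A": 0.45, "Firm B": 0.52, "Firm C": 0.48,
    "Firm D": 1.60, "Firm E": 0.50,
}

peer = statistics.median(estimates.values())
# Flag any firm more than a factor of two away from the peer median
outliers = {f: e for f, e in estimates.items() if e > 2 * peer or e < peer / 2}
print(outliers)  # {'Firm D': 1.6}
```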

BENCHMARKING – POSSIBLE INITIATIVES
Private sector services to benchmark ratings and/or estimates
Some international interest in requiring firms to rate ‘test portfolios’
FSA considering targeted benchmarking at obligor level where back-testing is difficult and the amounts are large
–Very early stage of thinking, but we would welcome feedback
–Could be run by industry, the FSA or another body

SOME VALIDATION CHALLENGES
Scale of the task given the possible number of models
–Materiality to the firm v materiality to the market
–Importance of consistency
Need for firms and supervisors to increase expertise
–Even statistical models require expert judgment
–How much do supervisors need to know?
Can we give enough guidance to allow objective self-assessment?
What is the scope and appetite for improved standards?

SOME VALIDATION CHALLENGES
What is the supervisory standard?
–Can we identify a definite pass or a definite fail?
–Benchmarks which must be beaten to qualify, or which, if beaten, are sufficient?
–Implications for expert judgment approaches, or for methods which ‘perform’ less well?
–Other objectives may be the avoidance of systemic risk, and allowing entry to stimulate competition
How do we incentivise firms to improve?
–Rising standards or Pillar 2 requirements
–Too much conservatism will take away incentives
What about LGD and EAD?