Boot Camp on Reinsurance Pricing Techniques - Loss Sensitive Treaty Features
August 2009
Kathy H. Garrigan, FCAS, MAAA


Table of Contents
- Loss Sensitive Feature defined
- Uses for Loss Sensitive Features
- Examples of Loss Sensitive Features
- Valuing Loss Sensitive Features in a treaty
- Questions?

What is a Loss Sensitive Treaty Feature?
A loss sensitive term or feature is a provision within a reinsurance contract that causes the ceded premium, ceded loss, or ceding commission to vary based on the loss experience of that contract.
Why have such a feature?
- It allows cedants to share in the ceded experience, which can motivate them to care more about the results!
- It works to bridge the gap that may exist between the reinsurer's loss pick and the cedant's loss pick.
Such features are generally "worth something," and it's the reinsurance pricing actuary's role to figure out what that value is:
- How will this feature add to or subtract from the underwriting ratio?
- Does the feature make sense along with the rest of the deal structure?
- Can you present more than one structure option to the cedant that has the same value to the reinsurer?

Types of Loss Sensitive Features
Features that cause ceded premium to vary based on loss experience:
- Reinstatement Provisions
- Swing Rated Contracts
- No Claims Bonus
Features that cause ceding commission to vary based on loss experience:
- Profit Commission
- Sliding Scale Commission
Features that cause ceded loss to vary based on loss experience:
- Annual Aggregate Deductibles (AADs)
- Loss Ratio Corridors
- Loss Ratio Caps

Which reinsurance structures might have these features?
Pro Rata / QS Treaties:
- Profit Commission
- Sliding Scale Commission
- Loss Corridor
- Loss Ratio Cap
Excess of Loss (XOL) Treaties:
- Reinstatements
- Swing Rating Provisions
- No Claims Bonuses (if anywhere, Cat XOLs)
- Annual Aggregate Deductibles

Profit Commission
A very common loss sensitive feature on both Quota Shares and XOLs. The cedant can receive a defined percentage of the "profit" on the reinsurance contract, where profit is often defined as (Premium - Loss - Commission - Reinsurer's Margin).
A "50 after 10" PC with a ceding commission of 30% has the formula PC = 50% x (1 - .3 - .1 - LR) = 50% x (.6 - LR). The cedant therefore receives some profit commission for any loss ratio result better than 60%.
What is the EXPECTED cost of the PC? Let's assume our ELR is 60% - and we know no PC is paid at a 60%. Does that mean the expected cost of the PC is zero?

Profit Commission (continued)
Answer: just because no PC is paid at the expected loss ratio, that seldom means the EXPECTED cost of the profit commission is zero. Put another way, the cost of the PC at the EXPECTED loss ratio is not equal to the EXPECTED cost of the PC.
WHY? (And believe me, underwriters will want to know why.)
- 60% is the EXPECTED loss ratio - that doesn't mean we believe every possible loss ratio outcome for this treaty is a 60%.
- There is a probability distribution of potential outcomes AROUND that 60% expected loss ratio.
- It is possible (and maybe even likely) that the loss ratio in any year could be far less than 60%, where we would be paying on the PC.
- Therefore there is a COST to this feature, which we will work to calculate in the coming slides.
Note a PC only goes one way: the cedant receives money when the deal is running profitably, but does not pay out more money when the deal isn't running profitably.

Cost of PC calculation - a specific extreme case
An EQ-exposed California property QS:
- If there's no EQ, we expect a 40% loss ratio every year, every scenario
- Cat (EQ) ELR = 30%: a 90% chance of no EQ, and a 10% chance of an EQ where the resulting LR = 300%
- Ceding commission = 30%; PC terms are "50% after 10%"
If there is NO EQ, we know the LR = 40%, so the PC value = .5 x (1 - .4 - .3 - .1) = 10%. If there IS an EQ, there is no PC.
So what is our EXPECTED cost of the PC? 0% PC, 10% of the time (EQ), plus 10% PC, 90% of the time (no EQ) - or 9% of premium.
Because of the "skewed" nature of cat business, PCs are not common on it. If you are going to have a huge loss every 10 years, the reinsurer wants to keep as much premium as possible the other 9 years.
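The EQ example above can be checked with a few lines of code. This sketch uses only the figures from the slide (40% non-EQ loss ratio, 10% EQ probability with a 300% loss ratio, "50% after 10%" PC, 30% ceding commission):

```python
# "50 after 10" profit commission with a 30% ceding commission.
def profit_commission(lr, pc_share=0.50, margin=0.10, ceding_comm=0.30):
    """PC as a % of premium: share of (1 - commission - margin - LR), floored at 0."""
    return max(pc_share * (1 - ceding_comm - margin - lr), 0.0)

# Two scenarios: no EQ (90% chance, 40% LR) and EQ (10% chance, 300% LR).
scenarios = [(0.90, 0.40), (0.10, 3.00)]
expected_pc = sum(p * profit_commission(lr) for p, lr in scenarios)
print(f"Expected PC cost: {expected_pc:.1%}")  # prints "Expected PC cost: 9.0%"
```

The 9% matches the slide's answer: a 10% PC paid 90% of the time, nothing paid in the EQ year.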

Cost of a Loss Sensitive Feature: General Approach
Build an aggregate loss distribution:
- Judgmentally select loss ratio outcomes and assign each a probability of happening, or
- Fit the data to an aggregate distribution (like a lognormal), or fit frequency data separately from severity data and combine them (hardcore curve fitting is beyond the scope of this presentation)
Then:
- Apply the loss sensitive terms to each point on the loss distribution, or to each simulated year
- Calculate a probability weighted average cost (or savings) of the loss sensitive arrangement
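The probability-weighted step above can be sketched generically. Here `terms` is any function mapping a loss ratio to a cost (or savings) as a percent of premium - a PC formula, a slide, a corridor, and so on:

```python
def expected_feature_cost(distribution, terms):
    """distribution: list of (probability, loss_ratio) pairs; probabilities sum to 1.
    terms: function mapping a loss ratio to a feature cost (% of premium)."""
    assert abs(sum(p for p, _ in distribution) - 1.0) < 1e-9
    return sum(p * terms(lr) for p, lr in distribution)
```

Every loss sensitive valuation in the remaining slides is an instance of this one-liner with a different `terms` function plugged in.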

Valuing the cost of a PC of 50% after 10%, with a 30% ceding commission and a 60% expected LR
As you can see, the expected PC cost does not equal the PC cost at the expected loss ratio.
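The slide's calculation table isn't reproduced in the transcript, so the five-point distribution below is purely illustrative (my own numbers, chosen so the mean loss ratio is 60%). It shows the point of the slide: the expected PC cost is positive even though the PC at the expected loss ratio is zero.

```python
# "50 after 10" PC with a 30% ceding commission, as on the slide.
def pc(lr):
    return max(0.50 * (1 - 0.30 - 0.10 - lr), 0.0)

# Hypothetical discrete distribution with a 60% mean loss ratio.
dist = [(0.2, 0.20), (0.2, 0.40), (0.2, 0.60), (0.2, 0.80), (0.2, 1.00)]
elr = sum(p * lr for p, lr in dist)                    # 60% expected loss ratio
expected_pc_cost = sum(p * pc(lr) for p, lr in dist)   # 6% of premium
pc_at_elr = pc(elr)                                    # 0% - no PC paid at a 60%
```

The good years (20% and 40% loss ratios) trigger PC payments that the bad years never claw back, so the feature costs 6 points of premium on average.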

WHY NOT???
Your loss distribution - the loss ratios and the probabilities assigned to them - is what determines the value of your loss sensitive feature. In general, the more "skewed" your distribution is, the more apt you are to have to give money back to the cedant: you need a LOT of "better than average" scenarios to balance out the few AWFUL scenarios you may have, and it's those "better than average" scenarios that usually mean you are giving money back to the cedant. (Recall our "extreme" case earlier - a 10% chance of getting crushed by an EQ means 90% of the time you don't.)
Since your expected loss sensitive value (cost or savings) is a function of your loss distribution, determining the appropriate distribution is important (BUT NOT EASY!).

What if your loss distribution is shaped like this?

Or like this?

Other Loss Sensitive Features on QSs
Pro Rata / QS Treaties:
- Profit Commission (already covered)
- Sliding Scale Commission
- Loss Corridor
- Loss Ratio Cap

Sliding Scale Commission
A ceding commission is set at a "provisional" level at the beginning of a contract. This provisional ceding commission corresponds to a certain loss ratio in the contract ("30 at a 60").
- The ceding commission INCREASES if the contract's loss ratio is lower than the loss ratio that corresponds to the provisional, usually subject to a maximum commission.
- The ceding commission DECREASES if the loss ratio is higher than the loss ratio that corresponds to the provisional, usually subject to a minimum commission.
As you can see, the ceding commission "slides" in the opposite direction from the loss ratio. A slide is particularly useful when the reinsurer's and the insurer's loss picks differ: "put your money where your mouth is."

Sliding Scale Example
- Provisional ceding commission = 20% (at a 65% loss ratio)
- If the loss ratio is less than 65%, the commission increases by 1 point for each point decrease in loss ratio, to a maximum commission of 25% (at a 60%). This is said to slide "1 to 1" and is considered a "steep slide."
- If the loss ratio is greater than 65%, the commission decreases by 1/2 point for each 1 point increase in loss ratio, to a minimum commission of 15% (at a 75%). This is said to slide "1/2 to 1."
Similar to the PC: at a 60% ELR, is the expected ceding commission equal to the ceding commission at the expected loss ratio?
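The slide's terms translate directly into a commission function (20% provisional at a 65% loss ratio, sliding 1:1 down to a 25% max at 60% and 1/2:1 up to a 15% min at 75%):

```python
def sliding_scale_commission(lr):
    """Ceding commission for the slide's example terms, as a % of premium."""
    if lr <= 0.60:
        return 0.25                        # max commission
    if lr <= 0.65:
        return 0.20 + (0.65 - lr) * 1.0    # slides "1 to 1"
    if lr <= 0.75:
        return 0.20 - (lr - 0.65) * 0.5    # slides "1/2 to 1"
    return 0.15                            # min commission
```

At the 60% ELR the commission is already at its 25% max, so any probability mass on worse loss ratios pulls the expected commission below 25% - which answers the slide's closing question: no, they are not equal.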

Valuing a Sliding Scale Commission

Loss Ratio Corridor
A loss ratio corridor is a provision that forces the ceding company to retain losses that would otherwise be ceded to the reinsurance treaty. It is useful when there is a difference in loss ratio picks, but it is not nearly as common as a slide.
For example, the ceding company could keep 100% of the losses between a 75% and an 85% loss ratio - a "10 point corridor attaching at 75%":
- If the gross loss ratio = 75%, the ceded loss ratio = 75% (the corridor doesn't attach)
- If the gross loss ratio = 80%, the ceded loss ratio also = 75%: the corridor "attaches" at 75%, the ceding company takes all losses between 75% and 80% (5 points), so the ceded loss ratio is still 75%
- If the gross loss ratio = 85%, the ceded loss ratio STILL = 75%: the corridor attaches and is fully exhausted for the full 10 points
- If the gross loss ratio = 100%, then the ceded loss ratio = ???
The corridor doesn't have to be 100% assumed by the cedant - it can be any percentage.
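The corridor mechanics in the bullets above reduce to one formula - subtract from the gross loss ratio whatever portion falls inside the corridor. A minimal sketch, using the slide's 10-point corridor attaching at 75%:

```python
def ceded_lr_with_corridor(gross_lr, attach=0.75, width=0.10):
    """Ceded loss ratio when the cedant keeps 100% of losses inside the corridor."""
    retained_in_corridor = min(max(gross_lr - attach, 0.0), width)
    return gross_lr - retained_in_corridor

# gross 75% -> ceded 75%; gross 80% -> 75%; gross 85% -> 75%
# gross 100% -> 90%: the corridor is fully exhausted, the cedant kept 10 points
```

This also answers the slide's "???": once the gross loss ratio passes 85%, the corridor is used up and every additional point is ceded again, so a 100% gross loss ratio cedes 90%.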

Loss Ratio Cap
This is the maximum loss ratio that can be ceded to the treaty. Once losses hit or exceed the cap, that's it.
Example: 200% loss ratio cap
- If the loss ratio before the cap is 150%, the ceded loss ratio is 150%
- If the loss ratio before the cap is 300%, the ceded loss ratio is 200%
Caps are useful for new / start-up operations where the limit-to-premium ratio may be dreadfully imbalanced - e.g., a new umbrella program offering $10M policy limits but only planning to write $3M in premium the first year. A cap may be the only way for such a reinsurance treaty to get placed, particularly on start-up business: while the cap is generally high, at least the deal's downside is limited.
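The cap is the simplest feature of all - a minimum of the gross loss ratio and the cap, shown here with the slide's 200% example:

```python
def ceded_lr_with_cap(gross_lr, cap=2.00):
    """Ceded loss ratio under a loss ratio cap (200% in the slide's example)."""
    return min(gross_lr, cap)

# 150% before the cap -> 150% ceded; 300% before the cap -> 200% ceded
```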

Determining an Aggregate Distribution - 3 Methods
Judgmentally select loss ratio outcomes and corresponding probabilities whose weighted average equals your expected loss ratio:
- May be the easiest to explain to underwriters
- Easy, but may not contain enough bad scenarios if you're basing your points on experience - be careful!
Fit a statistical distribution to on-level loss ratios:
- Reasonable for pro rata (QS) treaties; most actuaries use a lognormal distribution here
- Reflects the skewed distribution of loss ratios and is relatively easy to use
- Loss ratios are assumed to follow a lognormal distribution, which means the natural logs of the loss ratios are normally distributed
Determine an aggregate distribution by modeling the frequency and severity pieces separately and either convoluting them or simulating them together:
- Typically used for excess of loss (XOL) treaties
- A lognormal doesn't make sense if you can have zero losses, and is very likely not skewed enough anyway; XOL can be "hit or miss" (literally)
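For the second method, the lognormal is usually parameterized by matching a selected mean (the ELR) and CV via method of moments. A short sketch of that conversion:

```python
import math

def lognormal_params(mean, cv):
    """Method-of-moments lognormal fit: returns (mu, sigma) of the
    underlying normal, so that ln(LR) ~ Normal(mu, sigma)."""
    sigma2 = math.log(1 + cv**2)
    mu = math.log(mean) - sigma2 / 2
    return mu, math.sqrt(sigma2)

# e.g. a 60% ELR with a 25% CV:
mu, sigma = lognormal_params(0.60, 0.25)
```

With `mu` and `sigma` in hand you can read off percentiles or simulate loss ratios directly.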

Judgmentally selected aggregate loss distribution

Lognormal Distribution: ELR = 60%, SD = 10%

Lognormal Distribution: ELR = 60%, SD = 20%

Lognormal Distribution: ELR = 60%, SD = 30%

Is the resulting distribution reasonable?
Compare the resulting distribution to historical results:
- On-level loss ratios should be the focus, but don't completely ignore untrended ultimate loss ratios. Sure, I see the rate action they've taken and how trends play in, but how have they actually done, and how volatile have those results been?
- Do your on-level loss ratios capture enough cat or shock loss potential?
- Do you think your historical results are predictive of future results?
Show the distribution to your underwriters and see what they think. Ultimately, sometimes a judgmentally selected discrete distribution makes the most sense - and can be the easiest to explain to underwriters.

Creating distributions when there's cat exposure
If you have a treaty with significant catastrophe exposure, you should consider modeling the noncat loss ratio separately from the cat loss ratio:
- Noncat could probably be a straightforward lognormal
- Cat is probably MUCH more skewed - think of the earlier CA EQ example: a large chance nothing bad happens, a small chance you get clobbered
Combining the cat and noncat pieces is easy to do during a simulation, particularly if you assume they are independent: simulate a noncat result, then a cat result, and just add them together at the end.
With both noncat and cat present, it's very difficult to find one distribution that addresses all your needs. Since a very skewed distribution generally leads to a higher loss sensitive cost, be careful not to underestimate the value - these things all add up! Even a high CV on your lognormal probably won't help: sure, you get some high loss ratio outcomes that could represent your cat scenarios, but you also get a ton of very good outcomes - probably too many - in order to weight back to your expected loss ratio. You could introduce a minimum loss ratio (a truncated distribution), but then you won't weight back to your expected loss ratio without some adjusting.
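The simulate-and-add approach described above is a few lines of code. The parameters here are illustrative, not from the slides: a lognormal noncat loss ratio with a 40% mean, plus an EQ-style cat piece with a 10% annual hit probability and a 300% loss ratio if hit, assumed independent:

```python
import math
import random

def simulate_loss_ratio(rng, noncat_mean=0.40, noncat_cv=0.15,
                        cat_prob=0.10, cat_lr_if_hit=3.00):
    """One simulated year: independent noncat (lognormal) + cat (Bernoulli)."""
    sigma2 = math.log(1 + noncat_cv**2)
    mu = math.log(noncat_mean) - sigma2 / 2
    noncat = rng.lognormvariate(mu, math.sqrt(sigma2))
    cat = cat_lr_if_hit if rng.random() < cat_prob else 0.0
    return noncat + cat

rng = random.Random(0)
sims = [simulate_loss_ratio(rng) for _ in range(100_000)]
# the simulated mean should be close to 0.40 + 0.10 * 3.00 = 0.70
```

A single lognormal calibrated to that 70% mean would struggle to reproduce both the tight noncat years and the rare 300%-plus cat years this mixture generates.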

What about process and parameter uncertainty?
Process uncertainty is the random fluctuation of results around the expected value due purely to the random nature of insurance - not every year is going to be the same!
Parameter uncertainty is the fluctuation of results because the parameters used to determine our expected value are never going to be perfect:
- Are your trend, rate change, and loss development assumptions reasonable?
- For this book, are past results a good indication of future results? Changes in mix of business? Changes in management or philosophy? Is the book growing? Shrinking? Stable?
The selected CV should generally be greater than what is indicated: 5 to 10 years of data does not reflect a full range of possibilities. Anything with cat exposure really emphasizes this - when was the last CA EQ?

Addressing Parameter Uncertainty: One Suggestion
Instead of just choosing one expected loss ratio, choose 3 (or more), and assign weights to the new ELRs so that they weight back to your original ELR. For example, if your ELR is a 60%, maybe there's a 1/3 chance your true mean is 50% and a 1/3 chance your true mean is 70%.
Simulate the true mean by randomly choosing among the 50%, 60%, and 70%. Once you've randomly chosen that mean, model using the lognormal based on that chosen mean and your selected CV. Note the CV will handle your process variance. You're all set!
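The suggested mixture is straightforward to simulate: draw the "true" mean from {50%, 60%, 70%} with equal weight (so the overall mean stays 60%), then draw a lognormal around whichever mean came up. The 20% CV below is an assumed process-risk selection:

```python
import math
import random

def simulate_with_parameter_risk(rng, means=(0.50, 0.60, 0.70), cv=0.20):
    """Parameter risk: random true mean; process risk: lognormal around it."""
    true_mean = rng.choice(means)
    sigma2 = math.log(1 + cv**2)
    mu = math.log(true_mean) - sigma2 / 2
    return rng.lognormvariate(mu, math.sqrt(sigma2))

rng = random.Random(1)
sims = [simulate_with_parameter_risk(rng) for _ in range(100_000)]
# overall simulated mean should be close to the original 60% ELR
```

The resulting distribution has the same 60% mean but a fatter spread than a single lognormal with the same CV, which is exactly the point.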

Loss Sensitive Features on XOLs
Excess of Loss (XOL) Treaties:
- Profit Commission
- Reinstatements
- Swing Rating Provisions
- No Claims Bonuses (if anywhere, Cat XOLs)
- Annual Aggregate Deductibles
- Loss Ratio Cap

Swing Rating Provisions
Ceded premium is dependent on loss experience: the reinsurer receives initial premium based on a provisional rate, and that rate swings up or down with the loss experience in accordance with the terms of the contract.
Typical swing rated terms:
- Provisional rate = 10%
- Minimum rate / margin = 3%
- Maximum rate = 15%
- Losses loaded at 1.1
- Ceded rate = min/margin + (ceded loss / SPI) x 1.1, subject to the maximum rate of 15%
Swing rating is common on medical malpractice XOLs and not really seen anywhere else. Note this feature is an adjustment to PREMIUM, not COMMISSION.
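The slide's rate formula can be sketched directly (3% minimum rate/margin, 15% maximum, losses loaded at 1.1; SPI is the subject premium income):

```python
def swing_rate(ceded_loss, spi, margin=0.03, max_rate=0.15, load=1.1):
    """Adjusted ceded rate: margin plus loaded loss cost, capped at the max rate."""
    return min(margin + (ceded_loss / spi) * load, max_rate)

# No ceded loss           -> rate stays at the 3% minimum
# Heavy ceded losses      -> rate is capped at the 15% maximum
```

Note the minimum rate equals the margin here, so with zero losses the reinsurer keeps exactly its 3% margin.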

Swing Rating Example

Annual Aggregate Deductible
The annual aggregate deductible (AAD) refers to layer losses that the cedant retains that would otherwise be ceded to the treaty. These are the FIRST losses that get paid in a layer - similar to a loss corridor, but an AAD is always the first losses.
Example: the reinsurer provides a $500,000 x $500,000 excess of loss contract, and the cedant retains an AAD of $750,000. This means the cedant keeps the first $750,000 of layer losses.
- Total loss to the layer = $500,000? The cedant retains the entire $500,000 (because the AAD is $750,000); no loss is ceded to reinsurers.
- Total loss to the layer = $1M? The cedant retains the entire AAD of $750,000; the reinsurer pays $250,000.
If the cedant requests a $500,000 AAD for a treaty, should the actuary reduce his expected layer losses of $1M by $500,000?
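The AAD split above is a simple min/max on aggregate layer losses; a sketch of the slide's $750,000 AAD on the $500K x $500K layer:

```python
def apply_aad(total_layer_loss, aad=750_000):
    """Split aggregate layer losses into (cedant-retained, ceded) under an AAD."""
    retained = min(total_layer_loss, aad)
    ceded = total_layer_loss - retained
    return retained, ceded

# 500K of layer loss -> cedant keeps all 500K, nothing ceded
# 1M of layer loss   -> cedant keeps 750K, reinsurer pays 250K
```

Note the ceded amount is `max(loss - AAD, 0)` applied to the *aggregate* annual layer loss, not claim by claim - which is why the expected savings is not simply the AAD itself.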

Likely no! (But you knew that, right?) As with any of these examples, differently shaped distributions will result in different savings.

No Claims Bonus
A no claims bonus provision can be added to an excess of loss contract - it's exactly what it sounds like, and very binary: if there are no losses, the cedant can receive a small percentage of premium back.
Since any pro rata or QS contract is apt to have loss ceded to it - these structures cover losses of all sizes, not just large losses - a no claims bonus doesn't make sense there.
It is not a typical feature to see. You might see a small no claims bonus on property catastrophe XOLs, but only to the tune of a 10% bonus.

Limited Reinstatement Provisions
Many excess of loss treaties have reinstatement provisions. Such provisions dictate how many times the cedant can use the risk limit of the treaty.
- Reinstatements can be free or paid - the contract will tell you - but choosing to reinstate is almost always mandatory.
- Just because a reinstatement is paid doesn't mean the cedant owes 100% of the reinsurance premium all over again - it could be 1st @ 50%, 2nd @ 75%, etc.
- Catastrophe treaties tend to have "1 @ 100%": you can reinstate the limit once for the full reinsurance premium.
Limited reinstatements are an implied treaty aggregate limit, or treaty cap: Treaty Aggregate Limit = Risk Limit x (1 + number of reinstatements).
Example: a $1M x $1M layer with one reinstatement. After the cedant uses up the first $1M limit, they get a second limit: Treaty Aggregate Limit = $1M x (1 + 1) = $2M.
Reinstatement premium can simply be viewed as additional premium that reinsurers receive depending on loss experience.
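A sketch of these mechanics for a "1 @ 100%" treaty. The pro-rata-to-amount convention below (reinstatement premium in proportion to the limit used) is an assumption for illustration - actual contracts also use pro rata as to time or fixed percentages per reinstatement:

```python
def reinstatement_outcome(agg_loss, risk_limit, premium,
                          reinstatements=1, rate=1.00):
    """Returns (ceded loss, reinstatement premium) for a limited-reinstatement XOL.
    rate is the reinstatement cost as a fraction of original premium (1.00 = "@100%")."""
    agg_limit = risk_limit * (1 + reinstatements)   # implied treaty aggregate limit
    ceded = min(agg_loss, agg_limit)
    # limit used (up to the reinstatable amount) is reinstated pro rata to amount
    reinstated = min(ceded, reinstatements * risk_limit)
    reinst_premium = (reinstated / risk_limit) * rate * premium
    return ceded, reinst_premium

# $1M x $1M, 1 @ 100%, $200K premium:
#   $2.5M of layer losses -> ceded capped at the $2M aggregate limit,
#   and the cedant owes a full $200K reinstatement premium
```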

Limited Reinstatement Example

Valuing a Limited Reinstatement Provision

Rating on a Multi-Year Block
As you can see from all of the structures presented, each year's results stand on their own. For a PC, the cedant can have a great year 1 and receive a large profit commission in return. But then maybe year 2 is awful - and no PC is paid. Over these two years the cedant may be thrilled, but the reinsurer could be in bad shape.
To work to bridge that gap, loss sensitive features can instead be evaluated using the total treaty experience across multiple years. This allows for a smoothing of results - and a smoothing of profit commission paid. This is called rating on a multi-year block.
Modeling a multi-year block implies tightening up your loss distribution: 3 years of data will tend more toward your mean than just one year - the law of large numbers kicking in - so consider a lower standard deviation, etc.

Deficit / Credit Carryforward Provision
Another way to effect some loss sensitive smoothing, this time for sliding scale commission deals, is to use a deficit or credit carryforward provision:
- If the loss ratio is SO good that the cedant receives the max ceding commission anyway, the amount by which the loss ratio beats the loss ratio at the max rolls into the next year's calculation. This is a credit carryforward.
- If the loss ratio is SO bad that the cedant receives the min ceding commission anyway, the amount by which the loss ratio exceeds the loss ratio at the min rolls into the next year's calculation. This is a deficit carryforward.
Similar to a multi-year block, this provision works to smooth out loss sensitive results.
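A sketch of a carryforward applied to the earlier sliding scale terms (20% provisional at 65%, 25% max at 60%, 15% min at 75%). The convention here - carrying the excess loss ratio points into next year's loss ratio - is one common formulation I've assumed for illustration; actual contract wordings vary:

```python
def slide(lr):
    """The earlier sliding scale example terms."""
    if lr <= 0.60:
        return 0.25
    if lr <= 0.65:
        return 0.20 + (0.65 - lr)
    if lr <= 0.75:
        return 0.20 - (lr - 0.65) * 0.5
    return 0.15

def commissions_with_carryforward(loss_ratios):
    """Year-by-year commissions with deficit/credit carryforward in LR points."""
    carry, out = 0.0, []
    for lr in loss_ratios:
        adj = lr + carry
        out.append(slide(adj))
        if adj > 0.75:
            carry = adj - 0.75    # deficit: worse-than-min points roll forward
        elif adj < 0.60:
            carry = adj - 0.60    # credit: better-than-max points roll forward
        else:
            carry = 0.0
    return out
```

For example, a 90% year followed by a 60% year yields the minimum commission both years: the 15 deficit points from year 1 drag year 2's adjusted loss ratio up to 75%, whereas without the carryforward year 2 would have earned the 25% max.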

Modeling Frequency and Severity Separately
While a lognormal distribution is relatively easy to use, it is not usually appropriate for XOL treaties:
- It does not reflect the "hit or miss" nature of many excess contracts
- It understates the probability of zero loss
- It may understate the potential for losses MUCH greater than the expected loss
Modeling frequency and severity separately is more common for XOL:
- Simulation (most common): Excel with @RISK, or broker products such as MetaRisk or ReMetrica
- Numerical methods
You can still use a lognormal for a very "working" layer - one where you expect A LOT of claims with great certainty.
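A minimal simulation sketch of the frequency/severity approach: Poisson claim counts, lognormal ground-up severities, each claim slotted into a 500K x 500K layer. All parameters are illustrative assumptions, and the Poisson draw uses Knuth's algorithm since the standard library has no Poisson generator:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's algorithm - fine for small claim-count means."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_layer_loss(rng, freq_mean=3.0, sev_mu=12.0, sev_sigma=1.5,
                        attach=500_000, limit=500_000):
    """One simulated year of losses to a 500K x 500K layer."""
    n = poisson(rng, freq_mean)
    total = 0.0
    for _ in range(n):
        ground_up = rng.lognormvariate(sev_mu, sev_sigma)
        total += min(max(ground_up - attach, 0.0), limit)
    return total

rng = random.Random(0)
sims = [simulate_layer_loss(rng) for _ in range(10_000)]
```

Unlike a fitted lognormal, a large share of the simulated years come out at exactly zero - the "hit or miss" behavior the bullets above describe.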

Common Frequency Distributions
The Poisson is an easy-to-use distribution for modeling expected claim counts:
- It assumes the mean (lambda) and the variance of the claim count distribution are equal
- It is a discrete distribution: number of claims = 0, 1, 2, 3, etc.
Despite the Poisson's ease of use, the negative binomial is generally preferred:
- Same form as the Poisson, except that lambda is no longer considered fixed but rather has a gamma distribution around it
- The variance is greater than the mean (unlike the Poisson, where they are equal)
- It reflects a little more parameter uncertainty regarding the true mean claim count, and the extra variability is more in line with historical experience
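In practice you usually select a claim count mean and variance and convert them to the negative binomial's standard (r, p) parameters. A sketch of that conversion (the variance must exceed the mean, which is exactly what distinguishes it from the Poisson):

```python
def neg_binomial_params(mean, variance):
    """Method-of-moments negative binomial fit: mean = r(1-p)/p, variance = r(1-p)/p^2."""
    assert variance > mean, "negative binomial requires variance > mean"
    p = mean / variance
    r = mean * p / (1 - p)    # equivalently mean**2 / (variance - mean)
    return r, p

# e.g. a selected claim count mean of 3 with variance 4.5:
r, p = neg_binomial_params(3.0, 4.5)
```

As the variance approaches the mean, r grows without bound and the distribution collapses back toward the Poisson.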

Common Severity Distributions
- Lognormal
- Mixed Exponential (currently used by ISO)
- Pareto
- Truncated Pareto (used by ISO before moving to the Mixed Exponential)
CAVEAT: if you are fitting a severity distribution to actual claims, don't forget about loss development! (Maybe use ISO curves instead of building your own.)

Concluding Remarks
- There are many loss sensitive features available that can be used to make a reinsurance treaty acceptable to both the cedant and the reinsurer.
- It's up to the actuary to value the requested features and explain the results to underwriters.
- Depending on the shape of your loss distribution, your loss sensitive feature's expected cost or savings can vary greatly.
- A little sensitivity testing on a range of distributions can go a long way!