BootstrapODPSample Not as Expected #408
I am currently attempting to run a Bootstrap model on an incurred triangle, but the ultimate results are not as expected. There is a massive amount of variability in the results, most noticeably for the earlier origin years. It does not make sense to me that OY 2002 would vary at all, considering it is already fully developed in the triangle. I assume I must be calling a function wrong, or perhaps there are some parameters I need to change. The bootstrap and chain ladder IBNR do seem to converge well enough, and a different Excel-based bootstrap shows the expected trend. Hoping someone may be able to help; I have researched this for days to no avail!

Code:
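A minimal sketch of the kind of workflow described above (not the poster's actual code, which was not included; the sample triangle, n_sims and random_state are illustrative stand-ins):

import chainladder as cl

# Hypothetical stand-in for the poster's incurred triangle.
tri = cl.load_sample('genins')

# Resample the triangle with the ODP bootstrap, then fit a chain ladder to every simulation.
sims = cl.BootstrapODPSample(n_sims=1000, random_state=42).fit_transform(tri)
model = cl.Chainladder().fit(sims)

ultimates = model.ultimate_  # one set of simulated ultimates per resampled triangle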
Hi @gracehymas, thanks for the question. A couple of things to consider:

According to CAS Monograph 4, the Bootstrap model requires that incremental values be positive, a property that is often violated by incurred triangles. I believe (but haven't checked) that BootstrapODPSample throws these incrementals away, which would lead to bias in the resampled triangles. The monograph does offer a suggestion to get around this (see the excerpt in the monograph), but unfortunately those suggestions are not yet implemented in chainladder-python. I suspect this is causing some of the bias, since your triangle does look like it would have negative incrementals.

As for the ultimates on OY 2002: these are the ultimates for each resampled triangle. Because every simulation re-draws the whole triangle, including the latest diagonal, the simulated ultimates vary even for an origin year that is fully developed. The IBNR for that origin year, on the other hand, is zero in every simulation:

import chainladder as cl
tri = cl.load_sample('clrd')['CumPaidLoss'].groupby('LOB').sum().loc['wkcomp']
sims = cl.BootstrapODPSample().fit(tri).resampled_triangles_
cl_model = cl.Chainladder().fit(sims)
ibnr_dist = cl_model.ibnr_.iloc[:, :, 0].to_frame()  # earliest origin IBNR
ibnr_dist.max() == ibnr_dist.min() == 0

I suppose, following from the above example, you could translate this to an ultimate distribution by:

ult_dist = tri.latest_diagonal + cl_model.ibnr_
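Following on from the ult_dist suggestion, a rough sketch of how the resulting ultimate distribution could be summarized, reusing the wkcomp example above; it assumes triangle arithmetic broadcasts the single latest diagonal across the simulated IBNR, and random_state is added only for reproducibility:

import chainladder as cl

# Same wkcomp example as above.
tri = cl.load_sample('clrd')['CumPaidLoss'].groupby('LOB').sum().loc['wkcomp']
sims = cl.BootstrapODPSample(random_state=42).fit(tri).resampled_triangles_
cl_model = cl.Chainladder().fit(sims)

# Ultimate distribution as suggested: actual latest diagonal plus simulated IBNR.
ult_dist = tri.latest_diagonal + cl_model.ibnr_

# Distribution for the earliest origin year, mirroring the .iloc[:, :, 0] pattern above.
earliest_ult = ult_dist.iloc[:, :, 0].to_frame()
print(earliest_ult.describe())  # constant across simulations, since that origin's IBNR is zero

Built this way, the fully developed origin year shows no variability at all, which matches the behaviour the original post expected.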
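On the first point about negative incrementals: the poster's triangle is not shown here, but as an illustration of the kind of check involved (using the clrd sample's IncurLoss column as a stand-in), one could look for negative incremental values like this:

import chainladder as cl

# Stand-in incurred triangle; the ODP bootstrap assumes incremental values are positive.
incurred = cl.load_sample('clrd')['IncurLoss'].groupby('LOB').sum().loc['wkcomp']

incremental = incurred.cum_to_incr()  # convert cumulative values to incremental
frame = incremental.to_frame()        # origin-by-development DataFrame

print((frame < 0).sum().sum(), "negative incremental cells")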