Quantitative Consulting has many interesting research publications. Some were completed for company purposes, while others are outcomes of our employees' academic research. Some of the content is accessible only after you register your email address; the rest can be downloaded directly from this site.
We hope you will find what you are looking for!
The paper analyzes a two-factor credit risk model that captures default and recovery rate variation, their mutual correlation, and their dependence on various explanatory variables. At the same time, it allows the unexpected credit loss to be computed analytically. We propose and empirically implement an estimation of the model based on aggregate and exposure-level Moody’s default and recovery data. The results confirm the existence of a significantly positive correlation between default and recovery rates. We empirically compare the unexpected loss estimates based on the reduced two-factor model with Monte Carlo simulation results and with the outputs of the current regulatory formula. The results show very good performance of the proposed analytical formula, which could feasibly replace the current regulatory formula.
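The regulatory benchmark referred to above is the Basel II one-factor (Vasicek) asymptotic formula. As a rough sketch of what the paper's analytical output is compared against, the snippet below computes a regulatory-style unexpected loss; the parameter values are illustrative only, not taken from the paper:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal CDF and inverse CDF

def stressed_pd(pd, rho, q=0.999):
    # Vasicek one-factor PD conditional on the q-quantile of the systematic factor
    return N.cdf((N.inv_cdf(pd) + rho ** 0.5 * N.inv_cdf(q)) / (1 - rho) ** 0.5)

def unexpected_loss(pd, lgd, rho, q=0.999):
    # regulatory-style unexpected loss per unit of exposure: stressed EL minus EL
    return lgd * (stressed_pd(pd, rho, q) - pd)

# illustrative inputs only, not values from the paper
print(round(unexpected_loss(pd=0.02, lgd=0.45, rho=0.15), 4))
```

Note that the one-factor formula stresses PD only; the paper's two-factor model additionally lets the recovery rate co-move with the same downturn.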
A Two-Factor Model for PD and LGD Correlation. University of Economics, Working Paper. The paper proposes a two-factor model to capture retail portfolio probability of default (PD) and loss given default (LGD) parameters, and in particular their mutual correlation. We argue that the standard one-factor models underlying the Basel II formula, and used by a number of studies, cannot adequately capture the correlation between PD and LGD on a large (asymptotic) portfolio. Parameters of the proposed model are estimated using the Markov Chain Monte Carlo (MCMC) method on a sample of real banking data. The results confirm positive stand-alone PD and LGD correlations and indicate a positive mutual PD-LGD correlation. The estimated Bayesian MCMC distributions of the parameters show that the stand-alone correlations are strongly significant, with a lower significance of the mutual correlation, probably because the observed time period is too short.
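As an illustration of the estimation technique only (not the paper's actual two-factor likelihood), here is a minimal random-walk Metropolis MCMC sampler that recovers a single correlation parameter from simulated bivariate normal data; the data, prior, and tuning values are all invented for the example:

```python
import math
import random

rng = random.Random(3)

# toy data: 400 draws from a bivariate standard normal with true rho = 0.5
TRUE_RHO = 0.5
data = []
for _ in range(400):
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    data.append((z1, TRUE_RHO * z1 + (1 - TRUE_RHO ** 2) ** 0.5 * z2))

def log_lik(rho):
    # bivariate standard normal log-likelihood in the correlation (constants dropped)
    if not -0.99 < rho < 0.99:
        return -math.inf
    c = 1 - rho ** 2
    return sum(-0.5 * (x * x - 2 * rho * x * y + y * y) / c - 0.5 * math.log(c)
               for x, y in data)

# random-walk Metropolis with a flat prior on rho
rho, ll, samples = 0.0, log_lik(0.0), []
for step in range(5000):
    prop = rho + rng.gauss(0, 0.05)
    llp = log_lik(prop)
    if math.log(rng.random()) < llp - ll:  # accept/reject step
        rho, ll = prop, llp
    if step >= 1000:  # keep draws only after burn-in
        samples.append(rho)

post_mean = sum(samples) / len(samples)
```

The retained draws approximate the posterior distribution of the parameter, which is what allows the significance statements quoted in the abstract.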
Survival Analysis in LGD Modeling. IES Working Paper. The paper proposes an application of survival time analysis to estimation of the Loss Given Default (LGD) parameter. The main advantage of the survival analysis approach over classical regression methods is that it can exploit partial (censored) recovery data. The model is also modified in order to improve performance on the appropriate goodness-of-fit measures. Empirical testing shows that the Cox proportional hazards model applied to LGD modeling performs better than linear and logistic regressions. In addition, a significant improvement is achieved with the modified “pseudo” Cox LGD model.
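The paper's estimator is the Cox proportional hazards model; as a simpler self-contained illustration of how censored observations are exploited, the sketch below implements a Kaplan-Meier estimator on hypothetical workout durations, where an event means the recovery process completed and a censored record means it was still in progress when observed:

```python
def kaplan_meier(times, events):
    # times: observed workout durations; events: 1 = recovery completed,
    # 0 = censored (workout still open at the end of the observation window)
    pairs = sorted(zip(times, events))
    at_risk, s, curve, i = len(pairs), 1.0, [], 0
    while i < len(pairs):
        t, completed, censored = pairs[i][0], 0, 0
        while i < len(pairs) and pairs[i][0] == t:
            if pairs[i][1]:
                completed += 1
            else:
                censored += 1
            i += 1
        if completed:
            s *= 1 - completed / at_risk  # survival drops only at event times
            curve.append((t, s))
        at_risk -= completed + censored  # censored cases leave the risk set here
    return curve

# hypothetical durations in months; two accounts are censored (still in workout)
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

The point of the example is the risk-set bookkeeping: a censored account contributes information up to its censoring time instead of being discarded, which is exactly the advantage over classical regression noted in the abstract.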
The paper follows a previous study of capital requirement sensitivity to the definition of default (Witzany, 2008). We study the phenomenon in the context of single-factor models where default and loss given default are driven by one systematic factor and by one or more idiosyncratic factors. In this theoretical framework we propose and analyze a relatively simple remedy: the LGD parameter is estimated as a quantile of possible losses given default at the same probability level as is used to stress the probability of default (PD).
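Assuming a one-factor Vasicek-type (probit) model in which LGD is driven by the systematic factor, an illustrative assumption rather than necessarily the paper's exact specification, the proposed quantile-based LGD can be sketched as:

```python
from statistics import NormalDist

N = NormalDist()

def stressed_lgd(elgd, rho_lgd, q=0.999):
    # q-quantile of LGD when LGD is a probit transform of one systematic factor;
    # the functional form and the parameters below are illustrative assumptions
    return N.cdf((N.inv_cdf(elgd) + rho_lgd ** 0.5 * N.inv_cdf(q)) / (1 - rho_lgd) ** 0.5)

# expected LGD 45%, illustrative LGD correlation 10%,
# stressed at the same 99.9% level used for the PD stress
print(round(stressed_lgd(0.45, 0.10), 3))
```

Using the same probability level for the LGD quantile as for the PD stress is the remedy described in the abstract: both parameters are then conditioned on the same adverse realization of the systematic factor.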
Estimating LGD Correlation. University of Economics, Working Paper. The paper proposes a new method to estimate the correlation of account-level Basle II Loss Given Default (LGD). The correlation determines the probability distribution of portfolio-level LGD in the context of a copula model, which is used to stress the LGD parameter as well as to estimate the LGD discount rate and other parameters. Given historical LGD observations, we apply the maximum likelihood method to find the best correlation estimate. The method is applied and analyzed on a large real data set of unsecured retail account-level LGDs and the corresponding monthly series of average LGDs. The correlation estimate comes relatively close to the PD regulatory correlation. It is also tested for stability using the bootstrap method and used in an efficient formula to estimate ex ante one-year stressed LGD, i.e. one-year LGD quantiles at any reasonable probability level.
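A simplified sketch of the idea, assuming a one-factor Gaussian model for the monthly average LGD series (the paper's estimator also uses account-level data): under this assumption the probit-transformed averages are normal with variance rho/(1-rho), so the maximum likelihood estimate of the correlation has a closed form.

```python
import random
from statistics import NormalDist

N = NormalDist()

def simulate_avg_lgds(elgd, rho, n_periods, seed=7):
    # one-factor Gaussian model (illustrative):
    # average LGD_t = Phi((Phi^-1(elgd) + sqrt(rho) * Z_t) / sqrt(1 - rho))
    rng = random.Random(seed)
    a = N.inv_cdf(elgd) / (1 - rho) ** 0.5
    b = (rho / (1 - rho)) ** 0.5
    return [N.cdf(a + b * rng.gauss(0, 1)) for _ in range(n_periods)]

def mle_lgd_correlation(avg_lgds):
    # probit-transformed averages are normal with variance v = rho / (1 - rho),
    # so the maximum likelihood estimate is rho = v / (1 + v)
    y = [N.inv_cdf(x) for x in avg_lgds]
    mu = sum(y) / len(y)
    v = sum((yi - mu) ** 2 for yi in y) / len(y)  # MLE variance of the transforms
    return v / (1 + v)

series = simulate_avg_lgds(elgd=0.45, rho=0.2, n_periods=600)
rho_hat = mle_lgd_correlation(series)
```

The simulation step stands in for the historical monthly series used in the paper; with real data only `mle_lgd_correlation` would be applied.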
Basle II Capital Requirement Sensitivity to the Definition of Default. Risk Management. Prague: Oeconomica, 2008. The paper is motivated by a disturbing observation: the outcome of the regulatory formula depends significantly on the definition of default used to measure the probability of default (PD) and loss given default (LGD) parameters. Basel II regulatory capital should estimate, at a certain probability level, unexpected credit losses on banking portfolios, and so it should not depend on a particular definition of default that does not change real historical and expected losses. The paper provides an explanation of the phenomenon based on the Merton default model and tests it using a Monte Carlo simulation. Finally, we perform a structural-model-based simulation to test the hypothesis that scoring functions developed with a soft definition of default have weaker predictive power than those developed with a hard definition of default.
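A toy Monte Carlo illustration of the core observation (the thresholds and severity below are invented, not the paper's calibration): a soft default definition also flags cures that generate no loss, so the measured PD rises and the measured LGD falls, while their product, the expected loss, is unchanged.

```python
import random

random.seed(42)

n = 100_000
assets = [random.gauss(0.0, 1.0) for _ in range(n)]  # Merton-style asset returns

T_SOFT, T_HARD = -1.0, -1.5  # illustrative thresholds; soft flags more accounts
LOSS = 0.6                   # severity on a genuine (hard) default; cures lose nothing

def pd_lgd_el(threshold):
    defaults = [a for a in assets if a < threshold]
    losses = [LOSS if a < T_HARD else 0.0 for a in defaults]
    pd = len(defaults) / n              # measured default frequency
    lgd = sum(losses) / len(defaults)   # measured average loss given default
    return pd, lgd, pd * lgd            # expected loss per unit exposure

pd_soft, lgd_soft, el_soft = pd_lgd_el(T_SOFT)
pd_hard, lgd_hard, el_hard = pd_lgd_el(T_HARD)
```

Feeding the two (PD, LGD) pairs into the regulatory formula yields different capital despite identical expected losses, which is the sensitivity the paper sets out to explain.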