Quantitative Consulting has published many interesting research papers. Some were completed for company purposes, while others are outcomes of our employees' academic research. Some of the content is accessible only after registering your email address; the rest can be downloaded directly from this site.

We hope you will find what you are looking for!


Basel II regulation, correlation, default rates, Recovery rates

The paper analyzes a two-factor credit risk model that captures default and recovery rate variation, their mutual correlation, and their dependence on various explanatory variables. At the same time, it allows the unexpected credit loss to be computed analytically. We propose and empirically implement an estimation of the model based on aggregate and exposure-level Moody’s default and recovery data. The results confirm the existence of a significantly positive default and recovery rate correlation. We empirically compare the unexpected loss estimates based on the reduced two-factor model with Monte Carlo simulation results and with the current regulatory formula outputs. The results show a very good performance of the proposed analytical formula, which could feasibly replace the current regulatory formula.
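For reference, the "current regulatory formula" the paper benchmarks against is the Basel II one-factor (Vasicek) capital formula. A minimal Python sketch, without the maturity adjustment and with illustrative parameter values:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def basel_unexpected_loss(pd, lgd, rho, alpha=0.999):
    """Unexpected loss per unit of exposure under the one-factor
    (Vasicek) model behind the Basel II IRB formula: LGD times the
    99.9%-stressed PD, minus the expected loss rate."""
    stressed_pd = N.cdf((N.inv_cdf(pd) + rho ** 0.5 * N.inv_cdf(alpha))
                        / (1.0 - rho) ** 0.5)
    return lgd * (stressed_pd - pd)

# Illustrative inputs: PD = 2%, LGD = 45%, asset correlation 12%
k = basel_unexpected_loss(0.02, 0.45, 0.12)
```

The two-factor model of the paper extends this setup by letting the recovery (LGD) side vary and correlate with defaults, rather than treating LGD as a fixed input.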

11.02.2013

implied volatility, model-free volatility, realized volatility, Volatility forecasting

In this paper we test the forecasting ability and the information content of 8 popular volatility forecasting models (EWMA, GARCH, FIGARCH, ARIMA-RV, ARFIMA-RV, HAR-RV, Black-Scholes implied volatility and model-free volatility). The models are applied to 5 years of daily data on the EUR/USD exchange rate in order to forecast the realized volatility over 1-day, 5-day and 20-day horizons. The best forecasting results were achieved by the models based on option prices (Black-Scholes implied volatility and model-free volatility), followed by the realized volatility models incorporating long memory (ARFIMA-RV and HAR-RV). Among these models, ARFIMA-RV dominated the others especially on the longer horizons, where it surpassed even the option-based models. The tests of information content showed that the option-based models do not subsume all of the information contained in the econometric models (ARFIMA and HAR). We therefore created several hybrid models (combining option-based and time series forecasts) that achieved, on average, better results than any of the basic models on their own.
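As an illustration, the simplest model on the list, EWMA, forecasts tomorrow's variance as an exponentially weighted average of past squared returns. A minimal sketch; the decay λ = 0.94 is the conventional RiskMetrics daily value, not a figure taken from the paper:

```python
def ewma_variance(returns, lam=0.94):
    """One-step-ahead EWMA variance forecast:
    sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2."""
    var = returns[0] ** 2  # crude initialization from the first return
    for r in returns[1:]:
        var = lam * var + (1.0 - lam) * r ** 2
    return var

# Hypothetical daily returns; volatility forecast is the square root.
vol_forecast = ewma_variance([0.001, -0.004, 0.002, 0.003]) ** 0.5
```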

01.12.2012

Exponential smoothing, Holt–Winters method, irregular time series, seasonal indices, trigonometric functions

The paper suggests a generalization of the widely used Holt–Winters smoothing and forecasting method for seasonal time series. The general concept of seasonality modeling is introduced for both the additive and the multiplicative case. Several special cases are discussed, including linear interpolation of seasonal indices and the use of trigonometric functions. Both methods are fully applicable to time series with irregularly observed data (only the special case of missing observations has been covered up to now). Moreover, they sometimes outperform the classical Holt–Winters method even for regular time series. A simulation study and real data examples compare the suggested methods with the classical one.
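For orientation, the classical additive Holt–Winters recursions that the paper generalizes look as follows. This is a sketch for regularly observed series only; the initialization choices and smoothing constants are illustrative assumptions:

```python
def holt_winters_additive(y, m, alpha=0.5, beta=0.1, gamma=0.3, h=1):
    """One h-step-ahead forecast with the classical additive
    Holt-Winters method on a regularly observed series of period m."""
    # Initialize level and trend from the first two seasons,
    # seasonal indices from detrended first-season deviations.
    first = sum(y[:m]) / m
    second = sum(y[m:2 * m]) / m
    level, trend = first, (second - first) / m
    season = [y[i] - (first + (i - (m - 1) / 2) * trend) for i in range(m)]
    for t in range(m, len(y)):
        s_old = season[t % m]
        new_level = alpha * (y[t] - s_old) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - new_level) + (1 - gamma) * s_old
        level = new_level
    return level + h * trend + season[(len(y) + h - 1) % m]

# Exact linear-trend-plus-seasonal data: the forecast should track it.
pattern = [3.0, -1.0, -4.0, 2.0]
series = [0.5 * t + pattern[t % 4] for t in range(60)]
forecast = holt_winters_additive(series, m=4)  # next true value is 33.0
```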

01.01.2012

jump-diffusion, MCMC, Monte Carlo, Stochastic volatility, value at risk

We formulate a bivariate stochastic volatility jump-diffusion model with correlated jumps and volatilities. An MCMC Metropolis-Hastings sampling algorithm is proposed to estimate the model’s parameters and latent state variables (jumps and stochastic volatilities) given observed returns. The methodology is successfully tested on several artificially generated bivariate time series and then on the two most important Czech domestic financial market time series, the FX (CZK/EUR) and stock (PX index) returns. Four bivariate models with and without jumps and/or stochastic volatility are compared using the deviance information criterion (DIC), confirming the importance of incorporating jumps and stochastic volatility into the model.
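The Metropolis-Hastings step at the core of such an MCMC estimation can be sketched on a toy one-dimensional target; the paper's actual sampler works on the full set of parameters and latent jumps and volatilities:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=7):
    """Random-walk Metropolis-Hastings sampler for an unnormalized
    log-density; returns the chain of visited states."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        log_ratio = log_target(proposal) - log_target(x)
        # Accept with probability min(1, target(proposal)/target(x)).
        if rng.random() < math.exp(min(0.0, log_ratio)):
            x = proposal
        samples.append(x)
    return samples

# Toy target: standard normal log-density (up to an additive constant).
draws = metropolis_hastings(lambda v: -0.5 * v * v, x0=0.0, n_samples=5000)
chain_mean = sum(draws) / len(draws)
```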

01.11.2011

binary logistic regression, coefficient of determination, Gini coefficient, maximum likelihood, ordinary least squares

Estimation and goodness-of-fit criteria in the logistic regression model, Bachelor thesis, MFF UK. This bachelor thesis describes the binary logistic regression model and the estimation of its parameters by the maximum likelihood method. An algorithm for the least squares method is then proposed. The goodness-of-fit part defines the Lorenz curve, the Gini coefficient, the C-statistic, the Kolmogorov-Smirnov statistic and the coefficient of determination R2, and derives their relation to different sample correlation coefficients. The typical relation between the Gini coefficient and the Kolmogorov-Smirnov statistic is derived, and newly also its relation to the coefficient of determination R2, via a model of normally distributed scores of bad and good clients. These theoretical results are verified on three real data sets.
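The empirical Gini coefficient discussed in the thesis can be computed directly from the scores of bad and good clients as 2·AUC − 1. A brute-force sketch with hypothetical score values:

```python
def gini_coefficient(scores_bad, scores_good):
    """Gini = 2*AUC - 1, where AUC is the probability that a randomly
    chosen bad client has a lower score than a randomly chosen good
    client (ties counted as one half)."""
    concordant = 0.0
    for b in scores_bad:
        for g in scores_good:
            if b < g:
                concordant += 1.0
            elif b == g:
                concordant += 0.5
    auc = concordant / (len(scores_bad) * len(scores_good))
    return 2.0 * auc - 1.0

# Perfectly separating scores give Gini = 1, identical scores give 0.
perfect = gini_coefficient([1, 2], [3, 4])
```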

23.05.2011

LIBOR market model, short-rate models, stochastic models of interest rates

Stochastic interest rates modeling, Diploma thesis, MFF UK. This thesis studies different stochastic models of interest rates. The theoretical part describes short-rate models, the HJM framework and the LIBOR Market model. It focuses in detail on the widely known short-rate models, i.e. the Vašíček, Hull-White and Ho-Lee models, and on the LIBOR Market model. This part concludes with the valuation of interest rate options and model calibration to real data. The analytical part analyses the valuation of a real non-standard interest rate derivative using the different models; part of this valuation is a comparison among the models, both in terms of general valuation and in terms of capturing the dynamics of interest rates. The aim of the thesis is to describe different stochastic models of interest rates and, mainly, to compare them with each other.
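A short sketch of the simplest model on the list, simulating a Vašíček short-rate path via its exact Gaussian transition; all parameter values below are illustrative, not calibrated:

```python
import math
import random

def vasicek_path(r0, kappa, theta, sigma, T, n_steps, seed=3):
    """Simulate dr = kappa*(theta - r)dt + sigma dW using the exact
    Gaussian transition density of the Vasicek process."""
    rng = random.Random(seed)
    dt = T / n_steps
    decay = math.exp(-kappa * dt)
    # Exact conditional standard deviation over one step of length dt.
    std = sigma * math.sqrt((1.0 - decay ** 2) / (2.0 * kappa))
    r, path = r0, [r0]
    for _ in range(n_steps):
        r = theta + (r - theta) * decay + std * rng.gauss(0.0, 1.0)
        path.append(r)
    return path

# Start above the long-run mean theta; the rate reverts towards it.
path = vasicek_path(r0=0.10, kappa=2.0, theta=0.03, sigma=0.01,
                    T=5.0, n_steps=500)
```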

11.04.2011

Kalman filter, outliers, robust smoothing and forecasting

Recursive time series methods are very popular due to their numerical simplicity. Their theoretical background is usually based on Kalman filtering in state space models (mostly in dynamic linear systems). In time series practice, however, one must frequently face outlying values (outliers), which require special methods of robust statistics. In the paper, a simple robustification of the Kalman filter is suggested using truncation of the recursive residuals. This concept is then applied mainly to various types of exponential smoothing (recursive estimation in Box–Jenkins models with outliers is also mentioned). The methods are demonstrated on simulated data.
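The suggested robustification can be illustrated on simple exponential smoothing: the recursive residual is truncated before it enters the update, so a single gross outlier cannot drag the level away. The running scale estimate below is an assumption of this sketch, not the paper's exact choice:

```python
def robust_exp_smoothing(y, alpha=0.3, k=2.5):
    """Simple exponential smoothing with the recursive residual
    truncated at +/- k times a running residual scale."""
    level = y[0]
    scale = 1.0  # crude initial residual scale (sketch assumption)
    levels = [level]
    for obs in y[1:]:
        resid = obs - level
        bound = k * scale
        clipped = max(-bound, min(bound, resid))  # truncation step
        level += alpha * clipped
        # Track scale from clipped residuals; in practice a floor
        # would be needed to keep the bound from collapsing to zero.
        scale = 0.95 * scale + 0.05 * abs(clipped)
        levels.append(level)
    return levels

# A flat series with one gross outlier: the level barely moves.
series = [10.0] * 30 + [100.0] + [10.0] * 10
smoothed = robust_exp_smoothing(series)
```

Plain exponential smoothing on the same series would jump to roughly 37 at the outlier; the truncated update keeps the level close to 10.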

01.03.2011

backtesting, expected shortfall, extreme value theory, GARCH estimations, Risk measures, Value-at-Risk

A Comparison of EVT and Standard VaR Estimations, University of Economics, Working Paper. In this paper, extreme value theory (EVT) is applied to estimating low quantiles of the P/L distribution and the results are compared to common VaR methodologies. The fundamental theory behind EVT is built up, and the peaks-over-threshold method is used for modeling the tail of the distribution of losses with the Generalized Pareto Distribution (GPD). The different VaR methods are then compared using backtesting procedures. Practical issues such as time-varying volatility of returns and multivariate time series (portfolios of financial instruments) are covered.
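The peaks-over-threshold quantile estimator used in such a comparison has a closed form once the GPD parameters are fitted. A sketch with illustrative (not fitted) parameter values:

```python
def gpd_var(u, beta, xi, n, n_exceed, alpha=0.99):
    """VaR at level alpha from a GPD(xi, beta) fitted to the n_exceed
    losses exceeding threshold u, out of n observations in total."""
    return u + (beta / xi) * (((n / n_exceed) * (1.0 - alpha)) ** (-xi) - 1.0)

def gpd_es(u, beta, xi, n, n_exceed, alpha=0.99):
    """Expected shortfall implied by the same GPD tail (requires xi < 1)."""
    var = gpd_var(u, beta, xi, n, n_exceed, alpha)
    return var / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)

# Illustrative tail fit: threshold 2% loss, 50 exceedances out of 1000.
var99 = gpd_var(u=2.0, beta=0.5, xi=0.2, n=1000, n_exceed=50)
es99 = gpd_es(u=2.0, beta=0.5, xi=0.2, n=1000, n_exceed=50)
```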

23.02.2011

KMV model, Merton model, option pricing theory, rating

KMV model in terms of the Czech capital market, Diploma thesis, University of Economics. The thesis focuses on the possibilities of quantifying credit risk using the concept of the KMV model. The introduction outlines the basic approaches to measuring credit risk. The following chapters describe the nature of the KMV model with a focus on its application to the Czech capital market, including a self-calibration of the model. In the analytical part, the quantification of credit risk using the KMV model is carried out for selected companies traded on the Prague Stock Exchange. The results obtained are then confronted with the official rating grades of the agency Moody's.

12.02.2011

correlation, MCMC, recovery rate, regulatory capital

A Two-Factor Model for PD and LGD Correlation, University of Economics, Working Paper. The paper proposes a two-factor model to capture retail portfolio probability of default (PD) and loss given default (LGD) parameters and, in particular, their mutual correlation. We argue that the standard one-factor models standing behind the Basel II formula and used by a number of studies cannot capture well the correlation between PD and LGD on a large (asymptotic) portfolio. Parameters of the proposed model are estimated using the Markov Chain Monte Carlo (MCMC) method on a sample of real banking data. The results confirm positive stand-alone PD and LGD correlations and indicate a positive mutual PD × LGD correlation. The estimated Bayesian MCMC distributions of the parameters show that the stand-alone correlations are strongly significant, with a lower significance of the mutual correlation, probably due to the short observed time period.

07.02.2011

Credit derivatives market, Diploma thesis, University of Economics. The main objective of this thesis is to acquaint the reader with the main types of credit derivatives, with the way they function and with the main valuation principles. The theoretical part focuses on the development of the credit derivatives market, with a more detailed description of the subjects operating on it. The analysis focuses on how the financial crisis influenced these subjects and their credit derivatives portfolios. Newly suggested regulatory changes are also described. The thesis concludes with an estimate of the impact of these regulatory changes on the size of the credit derivatives market.

26.01.2011

Credit Risk Management and Modeling, Textbook. The main goal of this textbook is to provide an overview of credit risk management as well as modeling approaches from the point of view of a classical commercial bank, savings institution, finance company, or other financial market player. It is targeted mainly at current or future credit risk managers and analysts working in the credit risk departments of those institutions. It will also be useful to those overseeing, or working on, the loan underwriting side, as well as to middle and top managers. Our point of view will usually be that of a universal bank, since the risk management organization and techniques of the other institutions can be described as an appropriate subset of those of universal financial institutions. The book can also be useful for corporate risk managers, although risk management organization and modeling techniques there are normally significantly simpler than those of a bank.

14.12.2010

CreditMetrics, CreditRisk+, KMV, Monte Carlo simulation, risk modeling

Portfolio Credit Risk Modeling, Diploma thesis, University of Economics. This thesis focuses on state-of-the-art credit models largely implemented by banks into their risk-assessment and complementary valuation system frameworks. The reader is provided with both theoretical and applied (practical) approaches, giving a clear notion of how the selected portfolio models perform in a real-world environment. Our study comprises the CreditMetrics, CreditRisk+ and KMV models. In the first part of the thesis our intention is to clarify the main features and modeling principles of each scrutinized model, and to suggest hypotheses about its strengths and drawbacks. Subsequently, in the applied part, we test the models in a lab environment but with real-world market data, with considerable emphasis on model calibration. This enables us to confirm or reject the assumptions made in the theoretical part.

14.09.2010

Black-Scholes formula, Brownian motion, collective investment, guaranteed investment funds, Monte Carlo simulation, option valuation, volatility estimation

Analysis of guaranteed investment funds, Diploma thesis, University of Economics. This thesis focuses on guaranteed investment funds, which have become very popular among investors in the Czech Republic in recent years. The reason for this popularity is the conservativeness of the typical domestic investor, who appreciates a lower bound on the value of his investment. Guaranteed funds characteristically have a complex structure, and valuing their profitability based solely on intuition is therefore impossible. This analysis tries to answer the question of whether investing in these funds is reasonable. A large part of the thesis is dedicated to option theory and option valuation methods, including the famous Black-Scholes formula, as guaranteed investment funds have the characteristics of an option. Because of the complicated structure of these products, the analysis itself is done by Monte Carlo simulation.

11.06.2010

data weighing, polls on voting intention, quota sample, sampling surveys, statistical error

Statistical error in representative samples from population, Bachelor thesis, MFF UK. This thesis deals with statistical error estimation in sampling surveys. The aim was to find corrections of statistical error estimates in situations where the data are weighted or originate from quota samples. Theoretical considerations are used to derive a more accurate statistical error estimate in the case of a quota sample with one quota variable. The validity of the adjusted estimate is tested by simulations in the case of weighted data. Finally, the application and comparison of three different methods, together with the construction of empirical error estimates, is demonstrated on real poll data.

27.05.2010

Basel II, Basel III, risk management

Presentation from the Breakfast with SAS held on 28th April, 2010. After an overview of the key risk management principles and tools, the presentation focuses on challenging tasks related to Basel II implementation, in particular LGD, EAD, and correlation estimates. The last part then looks at the latest BCBS proposals leading to the Basel III regulatory package, trying to learn a lesson from the recent financial crisis.

28.04.2010

correlation, recovery rate, regulatory capital

Survival Analysis in LGD Modeling, IES Working Paper. The paper proposes an application of survival time analysis methodology to the estimation of the Loss Given Default (LGD) parameter. The main advantage of the survival analysis approach compared to classical regression methods is that it allows partial recovery data to be exploited. The model is also modified in order to improve its performance on the appropriate goodness-of-fit measures. The empirical testing shows that the Cox proportional hazards model applied to LGD modeling performs better than linear and logistic regressions. In addition, a significant improvement is achieved with the modified “pseudo” Cox LGD model.
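How survival analysis exploits partial recovery data can be seen in the basic Kaplan-Meier estimator, where an unfinished recovery process enters as a censored observation. This is a sketch of the general mechanism; the paper itself works with the Cox model:

```python
def kaplan_meier(times, completed):
    """Kaplan-Meier survival curve. completed[i] is True when the i-th
    recovery process finished within the observation window, False when
    it is censored (only partially observed) -- censored cases still
    count in the at-risk set up to their last observed time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        n, finished = at_risk, 0
        # Process all observations sharing the same time point.
        while i < len(order) and times[order[i]] == t:
            finished += 1 if completed[order[i]] else 0
            at_risk -= 1
            i += 1
        if finished:
            survival *= 1.0 - finished / n
            curve.append((t, survival))
    return curve

# Four hypothetical workouts; the third is censored at time 3.
curve = kaplan_meier([1, 2, 3, 4], [True, True, False, True])
```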

01.02.2010

correlation, recovery rate, regulatory capital

The paper follows a previous study of capital requirement sensitivity to the definition of default (Witzany, 2008). We study the phenomenon in the context of single-factor models where default and loss given default are driven by one systemic factor and by one or more idiosyncratic factors. In this theoretical framework we propose and analyze a relatively simple remedy: requiring that the LGD parameter be estimated as a quantile of possible losses given default at the same probability level as that used to stress the probability of default (PD).

01.09.2009

derivatives, Monte Carlo simulations

Derivatives pricing using Monte Carlo simulations, Diploma thesis, University of Economics. This work focuses on methods of variance reduction for estimates generated by Monte Carlo simulation, and on their application. The theoretical part describes six methods mentioned in the current literature. These methods are applied to the valuation of one type of barrier option, the so-called up-and-out option. The work first uses the widely applied model of the underlying asset price process, a diffusion process with constant parameters (geometric Brownian motion); a model with stochastic volatility is then applied. Finally, the variance reduction methods are applied and discussed.
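One widely used variance reduction technique, antithetic variates, can be sketched on the up-and-out call itself: each Gaussian draw is reused with its sign flipped and the two path payoffs are averaged. The sketch assumes GBM dynamics, discrete barrier monitoring, and illustrative parameters:

```python
import math
import random

def up_and_out_call_mc(s0, strike, barrier, r, sigma, T,
                       n_steps=50, n_paths=4000, seed=1):
    """Monte Carlo price of a discretely monitored up-and-out call
    under geometric Brownian motion, using antithetic variates."""
    rng = random.Random(seed)
    dt = T / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        s_plus = s_minus = s0
        out_plus = out_minus = False
        for _ in range(n_steps):
            z = rng.gauss(0.0, 1.0)
            s_plus *= math.exp(drift + vol * z)
            s_minus *= math.exp(drift - vol * z)  # antithetic twin path
            out_plus = out_plus or s_plus >= barrier
            out_minus = out_minus or s_minus >= barrier
        pay_plus = 0.0 if out_plus else max(s_plus - strike, 0.0)
        pay_minus = 0.0 if out_minus else max(s_minus - strike, 0.0)
        total += 0.5 * (pay_plus + pay_minus)
    return math.exp(-r * T) * total / n_paths

price = up_and_out_call_mc(s0=100.0, strike=100.0, barrier=130.0,
                           r=0.02, sigma=0.2, T=1.0)
```

Pairing each path with its mirror image cancels much of the sampling noise, since a payoff above the mean on one path tends to be matched by one below the mean on its twin.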

01.09.2009

expected shortfall, extreme value theory, value at risk

Analysis and Comparison of Different Value at Risk Models for Nonlinear Portfolio. Diploma Thesis, MFF UK. The thesis describes Value-at-Risk (VaR) and Expected Shortfall (ES) models for measuring market risk. The parametric method, Monte Carlo simulation, and historical simulation (HS) are presented. The second part of the thesis analyzes Extreme Value Theory (EVT). The fundamental theory behind EVT is built up, and the peaks-over-threshold (POT) method is introduced. The POT method is then used for modelling the tail of the distribution of losses with the Generalized Pareto Distribution (GPD), and is simultaneously illustrated on VaR and ES calculations for the PX Index. Practical issues such as the multiple-day horizon, conditional volatility of returns, and backtesting are also discussed. Subsequently, the application of the parametric method, HS and EVT is demonstrated on a sample nonlinear portfolio designed in Mathematica and the results are discussed.
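The historical simulation approach compared there reduces, in its basic form, to an empirical quantile of losses plus the average of the tail beyond it. A sketch; the quantile index convention below is one of several used in practice:

```python
def hs_var_es(losses, alpha=0.99):
    """Historical-simulation VaR (empirical alpha-quantile of losses)
    and expected shortfall (mean loss at or beyond the VaR)."""
    ordered = sorted(losses)
    idx = min(int(alpha * len(ordered)), len(ordered) - 1)
    var = ordered[idx]
    tail = ordered[idx:]  # losses at or beyond the VaR
    return var, sum(tail) / len(tail)

# Losses 1..100 at the 75% level: VaR = 76, ES = mean(76..100) = 88.
var75, es75 = hs_var_es([float(i) for i in range(1, 101)], alpha=0.75)
```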

30.06.2009