Quantitative Consulting has many interesting research publications. Some of them were completed for company purposes, while others are outcomes of our employees' academic research. Some of the content is accessible only after registering your email address; the rest is downloadable directly from this site.
We hope you will find what you are looking for!
Quantitative investment strategies are often selected from a broad class of candidate models estimated and tested on historical data. Standard statistical techniques to prevent model overfitting, such as out-of-sample back-testing, turn out to be unreliable when the selection is based on the results of too many models tested on the same holdout sample. There is an ongoing discussion about how to estimate the probability of back-test overfitting and how to adjust expected performance indicators such as the Sharpe ratio to properly reflect the effect of multiple testing. We propose a Bayesian approach that consistently yields the desired robust estimates based on an MCMC simulation. The approach is tested on a class of technical trading strategies where the naïve approach can select a seemingly profitable strategy.
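The selection bias behind back-test overfitting can be illustrated with a minimal sketch: every candidate strategy below is pure noise with a true Sharpe ratio of zero, yet naïve selection of the best back-test still reports an attractive figure. All names and parameters are illustrative, not taken from the paper.

```python
import random
import statistics

def sharpe(returns):
    """Annualized Sharpe ratio of a daily return series (zero risk-free rate)."""
    mu = statistics.mean(returns)
    sd = statistics.stdev(returns)
    return (mu / sd) * (252 ** 0.5)

def best_of_n(n_strategies=200, n_days=500, seed=7):
    """Pick the best in-sample Sharpe among n purely random strategies.

    Every strategy has a true Sharpe of zero, so any positive value
    reported here is pure selection bias (back-test overfitting)."""
    rng = random.Random(seed)
    best = float("-inf")
    for _ in range(n_strategies):
        rets = [rng.gauss(0.0, 0.01) for _ in range(n_days)]
        best = max(best, sharpe(rets))
    return best

# a seemingly attractive Sharpe ratio produced from pure noise
print(round(best_of_n(), 2))
```

The more candidate strategies are tested, the larger the best in-sample Sharpe ratio tends to be, which is exactly the effect the proposed Bayesian adjustment is designed to correct.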
A Comparison of EVT and Standard VaR Estimations, University of Economics, Working Paper. In this paper, extreme value theory (EVT) is applied to estimate low quantiles of the P/L distribution, and the results are compared to common VaR methodologies. The fundamental theory behind EVT is built up, and the peaks-over-threshold method is used to model the tail of the loss distribution with the Generalized Pareto Distribution (GPD). The different VaR methods are then compared using backtesting procedures. Practical issues such as time-varying volatility of returns and multivariate time series (portfolios of financial instruments) are covered.
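The peaks-over-threshold calculation can be sketched as follows. For brevity the GPD is fitted by the method of moments rather than the more usual maximum likelihood, and the loss data are synthetic; both are assumptions of this sketch, not the paper's choices.

```python
import random
import statistics

def gpd_var(losses, u, p):
    """Peaks-over-threshold VaR estimate at confidence level p.

    Exceedances over threshold u are fitted with a Generalized Pareto
    Distribution via the method of moments, then plugged into the
    standard POT quantile formula."""
    exc = [x - u for x in losses if x > u]
    n, nu = len(losses), len(exc)
    m, v = statistics.mean(exc), statistics.variance(exc)
    xi = 0.5 * (1 - m * m / v)          # GPD shape (method of moments)
    beta = 0.5 * m * (m * m / v + 1)    # GPD scale (method of moments)
    return u + (beta / xi) * ((n / nu * (1 - p)) ** (-xi) - 1)

rng = random.Random(1)
# synthetic losses: a normal body plus occasional large tail losses
losses = [abs(rng.gauss(0, 1)) + (rng.random() < 0.05) * rng.expovariate(0.5)
          for _ in range(5000)]
u = sorted(losses)[int(0.9 * len(losses))]   # 90th percentile threshold
print(gpd_var(losses, u, 0.99))              # 99% VaR from the fitted tail
```

The same fitted tail also yields Expected Shortfall and higher quantiles without re-estimating the whole distribution, which is the main appeal of the POT approach.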
This master’s thesis compares the logistic regression algorithm and the decision tree algorithm for building scoring models in financial institutions. The theoretical part of the thesis describes both algorithms and their application in model building. The practical part uses both algorithms to build models on a real dataset and then compares which algorithm gives better results. The thesis focuses on an applied description of the problem and therefore does not include precise mathematical definitions.
We consider counterparty credit risk in interest rate swap (IRS) contracts in the presence of an adverse dependence between the default time and interest rates, the so-called wrong-way risk. The IRS credit valuation adjustment (CVA) semi-analytical formula based on the Gaussian copula assumption, presented in Černý and Witzany, is further replaced by a Fréchet copula (for extreme dependence), based mainly on the work of Cherubini, called the modified approach. The results of all three CVA calculation approaches are compared in a numerical study, where we find that our semi-analytical formulas (the Gaussian copula and the modified approach) provide more accurate information on the IRS CVA price.
A Two-Factor Model for PD and LGD Correlation, University of Economics, Working Paper. The paper proposes a two-factor model to capture retail portfolio probability of default (PD) and loss given default (LGD) parameters and, in particular, their mutual correlation. We argue that the standard one-factor models behind the Basel II formula, used by a number of studies, cannot capture well the correlation between PD and LGD on a large (asymptotic) portfolio. Parameters of the proposed model are estimated using the Markov Chain Monte Carlo (MCMC) method on a sample of real banking data. The results confirm positive stand-alone PD and LGD correlations and indicate a positive mutual PD-LGD correlation. The estimated Bayesian MCMC distributions of the parameters show that the stand-alone correlations are strongly significant, with a lower significance of the mutual correlation, probably due to the short observed time period.
This paper investigates the drivers of cross-currency basis spreads, which were historically close to zero but have widened significantly since the start of the financial crisis. Credit and liquidity risk, as well as supply and demand, have often been cited as general factors driving cross-currency basis spreads; however, these spreads may widen beyond what is normally explained by such variables. We suggest market proxies for EUR/USD basis swap spread drivers and build multiple regression and cointegration models to explain their significance during three different historical periods of basis widening. The most important drivers of the cross-currency basis spreads appear to be short- and medium-term EU financial sector credit risk indicators and, to a slightly lesser extent, short- and medium-term US financial sector credit risk indicators. Other important drivers are market volatility for the short-end basis spread, the EUR/USD exchange rate for the medium-term basis spread, and, to a lesser extent, the Fed/ECB balance sheet ratio.
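The regression step can be sketched in miniature. The CDS-index proxy and the sensitivity below are synthetic stand-ins, not estimates from the paper; a real study would use several regressors and cointegration tests.

```python
import random
import statistics

def simple_ols(x, y):
    """Intercept and slope of y = a + b*x by ordinary least squares."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

rng = random.Random(3)
# hypothetical proxy: an EU financial sector CDS index level, in basis points
cds = [rng.uniform(50, 300) for _ in range(250)]
# synthetic basis spread: widens (more negative) as credit risk rises
basis = [-0.1 * c - 5 + rng.gauss(0, 5) for c in cds]
a, b = simple_ols(cds, basis)
print(round(b, 3))   # recovered sensitivity, close to the true -0.1
```

In the paper the analogous regressions are multiple rather than simple, and residual stationarity is checked so that the relationship is a genuine cointegration rather than a spurious trend.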
Analysis and Comparison of Different Value at Risk Models for Nonlinear Portfolio. Diploma Thesis, MFF UK. The thesis describes Value-at-Risk (VaR) and Expected Shortfall (ES) models for measuring market risk. The parametric method, Monte Carlo simulation, and historical simulation (HS) are presented. The second part of the thesis analyzes extreme value theory (EVT). The fundamental theory behind EVT is built up, and the peaks-over-threshold (POT) method is introduced. The POT method is then used to model the tail of the loss distribution with the Generalized Pareto Distribution (GPD) and is illustrated on VaR and ES calculations for the PX Index. Practical issues such as the multiple-day horizon, conditional volatility of returns, and backtesting are also discussed. Subsequently, the application of the parametric method, HS, and EVT is demonstrated on a sample nonlinear portfolio designed in Mathematica, and the results are discussed.
Analysis of guaranteed investment funds, Diploma thesis, University of Economics. This thesis focuses on guaranteed investment funds, which have become very popular among investors in the Czech Republic in recent years. The reason for this popularity is the conservativeness of a typical domestic investor, who appreciates the lower bound on the value of his investment. Guaranteed funds characteristically have a complex structure, and valuing their profitability solely on intuition is therefore impossible. This analysis tries to answer the question of whether investing in these funds is reasonable. A large part of the thesis is dedicated to option theory and option valuation methods, including the famous Black-Scholes formula, as guaranteed investment funds have the characteristics of an option. Owing to the complicated structure of these products, the analysis itself is done by Monte Carlo simulation.
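A sketch of the Monte Carlo valuation for a stylized guaranteed fund that pays the maximum of a guaranteed amount and a participation in the index gain. All contract parameters (guarantee level, participation rate) are illustrative; real products typically layer several such option features.

```python
import math
import random

def guaranteed_fund_value(s0=100, guarantee=100, part=0.8, r=0.03,
                          sigma=0.2, t=5, n_paths=20000, seed=42):
    """Risk-neutral Monte Carlo value of a stylized guaranteed fund.

    The terminal index level follows geometric Brownian motion, and the
    investor receives max(guarantee, s0 + participation * index gain),
    i.e., the guarantee plus an embedded call option."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0, 1)
        st = s0 * math.exp((r - 0.5 * sigma ** 2) * t
                           + sigma * math.sqrt(t) * z)
        total += max(guarantee, s0 + part * (st - s0))
    return math.exp(-r * t) * total / n_paths

print(round(guaranteed_fund_value(), 2))
```

The payoff decomposes into the discounted guarantee plus a fraction of a plain call, so this simple case can be checked against the Black-Scholes formula; the simulation approach becomes necessary once the real products' path-dependent features are added.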
A review of the vast literature on asset pricing tests in advanced and emerging markets suggests that no consensus has yet been reached on the right approach to dealing with emerging-market specifics such as thin trading, market concentration, and high volatility. This paper considers a class of linear factor models, best known through the Capital Asset Pricing Model (CAPM) and its subsequent modifications, which are the main topic of this paper. Despite numerous research papers criticising traditional linear models and attempting to overcome their embedded limitations, practitioners as well as academics repeatedly return to existing models such as the CAPM. Andersen, Bollerslev, Diebold and Wu (2006) argue that reports of the model's death were exaggerated: first, because the model often works well despite its wrinkles, and second, because more advanced multi-factor models that offer a better statistical fit lack an economic explanation of the variables and their interpretation in terms of systematic risk.
This opens the floor for a discussion about the choice of asset pricing models, specifically whether the traditional CAPM should be applied or alternative models such as the D-CAPM preferred. This paper offers statistical testing of the traditional CAPM, the Fama-French CAPM, and the D-CAPM on a set of indices and portfolios using GMM two-step simple and multiple regressions. Results have shown that, on average, the downside beta tends to perform better than the traditional beta in both emerging and developed markets. However, caution is warranted in the use of systematic risk measures, as in the case of emerging markets total risk measures such as semivariance and standard deviation may be preferable. Overall, unconditional models should not be the centre of the discussion, as many research papers, along with this master's thesis, have shown that beta is non-constant over time, which confirms the general finding of non-constant volatility. The last chapter of this paper therefore looks at key conditional models.
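The downside beta used in the D-CAPM can be computed as below, using Estrada's below-mean co-moment definition. The data are synthetic and built so that the asset is more sensitive in down markets; the paper's own estimation uses GMM on real index data.

```python
import random
import statistics

def beta(asset, market):
    """Traditional CAPM beta: covariance over market variance."""
    ma, mm = statistics.mean(asset), statistics.mean(market)
    num = sum((a - ma) * (m - mm) for a, m in zip(asset, market))
    return num / sum((m - mm) ** 2 for m in market)

def downside_beta(asset, market):
    """Estrada's downside beta: co-movement in below-mean states only."""
    ma, mm = statistics.mean(asset), statistics.mean(market)
    num = sum(min(a - ma, 0) * min(m - mm, 0) for a, m in zip(asset, market))
    return num / sum(min(m - mm, 0) ** 2 for m in market)

rng = random.Random(5)
market = [rng.gauss(0.0005, 0.01) for _ in range(1000)]
# asset reacting more strongly in down markets than in up markets
asset = [(1.5 if m < 0 else 0.5) * m + rng.gauss(0, 0.002) for m in market]
# the downside beta exceeds the traditional beta for such an asset
print(round(beta(asset, market), 2), round(downside_beta(asset, market), 2))
```

For a symmetric asset the two betas coincide; the gap between them is precisely the asymmetry that makes the D-CAPM attractive in volatile emerging markets.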
Basle II Capital Requirement Sensitivity to the Definition of Default. Risk management. Prague: Oeconomica, 2008. The paper is motivated by a disturbing observation: the outcome of the regulatory formula significantly depends on the definition of default used to measure the probability of default (PD) and loss given default (LGD) parameters. Basel II regulatory capital should estimate, at a given probability level, unexpected credit losses on banking portfolios, and so it should not depend on a particular definition of default that does not change real historical and expected losses. The paper provides an explanation of the phenomenon based on the Merton default model and tests it using a Monte Carlo simulation. Finally, we perform a structural-model-based simulation to test the hypothesis that scoring functions developed with a soft definition of default provide weaker predictive power than those developed with a hard definition of default.
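The sensitivity can be seen directly in the Basel II IRB formula. The two parameter pairs below have an identical expected loss (PD x LGD = 1%), as if the same portfolio were measured under a softer and a harder definition of default, yet the capital requirement differs. The numbers are illustrative, and the retail form of the formula (no maturity adjustment) is used.

```python
from statistics import NormalDist

def basel_capital(pd_, lgd, rho):
    """Basel II IRB unexpected-loss capital requirement (retail form,
    i.e., without the maturity adjustment): LGD * (WCDR - PD), where
    WCDR is the 99.9% worst-case default rate in the Vasicek model."""
    n = NormalDist()
    wcdr = n.cdf((n.inv_cdf(pd_) + rho ** 0.5 * n.inv_cdf(0.999))
                 / (1 - rho) ** 0.5)
    return lgd * (wcdr - pd_)

# soft definition: more defaults observed (higher PD), smaller loss per
# default (lower LGD); hard definition: the opposite. Expected loss is
# PD * LGD = 1% in both cases, yet the capital charges differ.
k_soft = basel_capital(pd_=0.04, lgd=0.25, rho=0.15)
k_hard = basel_capital(pd_=0.02, lgd=0.50, rho=0.15)
print(round(k_soft, 4), round(k_hard, 4))
```

The nonlinearity of the worst-case default rate in PD is what breaks the invariance, which is the phenomenon the paper explains through the Merton model.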
Cointegrated Time Series Models, Diploma thesis, MFF UK. The thesis deals with the concept of cointegration, which represents an appropriate tool for the analysis of nonstationary processes. The thesis concentrates on the models commonly used in the cointegration analysis of time series. It describes the direct connection between the error-correction (EC) model and the autoregressive distributed lags (ADL) model. The theoretical results are applied to real data in a demand-for-money model.
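The error-correction idea can be sketched with the Engle-Granger two-step logic on simulated data. The cointegrating slope of 2 and all noise scales below are artificial choices for the sketch.

```python
import random

rng = random.Random(2)
# a nonstationary common trend and two series cointegrated around it
x, y, trend = [], [], 0.0
for _ in range(2000):
    trend += rng.gauss(0, 1)             # random walk (common trend)
    x.append(trend + rng.gauss(0, 0.5))
    y.append(2.0 * trend + rng.gauss(0, 0.5))

# step 1: cointegrating regression y = b*x; the residual z = y - b*x
# is stationary even though x and y individually are not
b = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
z = [yi - b * xi for xi, yi in zip(x, y)]

# step 2: error-correction regression  dy_t = a * z_{t-1} + e_t
dy = [y[t] - y[t - 1] for t in range(1, len(y))]
zlag = z[:-1]
a = sum(zi * d for zi, d in zip(zlag, dy)) / sum(zi * zi for zi in zlag)
print(round(b, 2), round(a, 2))  # b near 2; a negative (error correction)
```

The negative coefficient on the lagged residual is the error-correction term: deviations from the long-run relation are gradually pulled back, which is exactly the mechanism linking the EC and ADL representations.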
Credit derivatives market, Diploma thesis, University of Economics. The main objective of this thesis is to acquaint the reader with the main types of credit derivatives, how they function, and the main valuation principles. The theoretical part focuses on the development of the credit derivatives market, with a more detailed description of the subjects operating in it. The analysis focuses on how the financial crisis influenced these subjects and their credit derivatives portfolios. Newly suggested regulatory changes are also described. The thesis concludes with an estimate of the impact of the regulatory changes on the size of the credit derivatives market.
Credit Risk Management and Modeling, Textbook. The main goal of this textbook is to provide an overview of credit risk management as well as modeling approaches from the point of view of a classical commercial bank, savings institution, finance company, or other financial market player. It is targeted mainly at current or future credit risk managers and analysts working in the credit risk departments of those institutions. It will also be useful to those overseeing, or working on, the loan underwriting side, as well as to middle and top managers. Our point of view will usually be that of a universal bank, since the risk management organization and techniques of the other institutions can be described as an appropriate subset of those of a universal financial institution. The book can also be useful for corporate risk managers, whose risk management organization and modeling techniques are, however, normally significantly simpler than those of a bank.
Decomposition Methods for Time Series with Irregular Observations, Diploma thesis, MFF UK. This work deals with extensions of classical exponential smoothing methods for univariate time series with irregular observations. Previously developed extensions of simple exponential smoothing, the Holt method, the Holt-Winters method, and double exponential smoothing are presented. An alternative to Wright's modification of simple exponential smoothing for irregular data is proposed. A program implementing most of the presented methods is part of the work.
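A minimal sketch of exponential smoothing with irregular spacing. It uses a simple time-decay discount, (1 - alpha) ** gap, as an illustrative assumption; Wright's modification additionally tracks an evolving weight and is not reproduced exactly here.

```python
def smooth_irregular(times, values, alpha=0.3):
    """Simple exponential smoothing for irregularly spaced observations.

    The discount applied to the old level is (1 - alpha) ** gap, so a
    long gap between observations makes the level adapt faster to the
    new data point, as intuition requires."""
    level = values[0]
    out = [level]
    for i in range(1, len(values)):
        gap = times[i] - times[i - 1]
        decay = (1 - alpha) ** gap
        level = decay * level + (1 - decay) * values[i]
        out.append(level)
    return out

# irregular observation times with a large gap before the level shift
times = [0, 1, 2, 10, 11]
values = [10.0, 10.0, 10.0, 20.0, 20.0]
print([round(s, 2) for s in smooth_irregular(times, values)])
```

With equally spaced data (all gaps equal to one) the recursion reduces to ordinary simple exponential smoothing, so this is a genuine extension rather than a different method.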
Derivative products in the domestic market – legislation, use, options. Essay for the Banking course, IES FSV UK. This work briefly describes the principles, practical use, and classification of financial derivatives. It also deals with the legislative definition of financial derivatives and their real potential application in the domestic market (under the legislation in force in 2003).
Derivatives pricing using Monte Carlo simulations, Diploma thesis, University of Economics. This work focuses on methods of reducing the variance of estimates generated by Monte Carlo simulation and on their application. The theoretical part contains a description of six methods covered in the current literature. These methods are applied to the valuation of one type of barrier option, the so-called up-and-out option. The work first considers the widely used model of the underlying asset price process, a diffusion with constant parameters (geometric Brownian motion), and then applies a model with stochastic volatility. Finally, the variance reduction methods are applied and discussed.
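One of the variance-reduction techniques typically covered, antithetic variates, can be sketched for the up-and-out call. The model is plain geometric Brownian motion with a discretely monitored barrier, and all parameters are illustrative.

```python
import math
import random

def up_and_out_call(s0, k, barrier, r, sigma, t, steps, n_pairs, seed=0):
    """Monte Carlo price of an up-and-out call with antithetic variates.

    Each pair of paths reuses the same normal draws with opposite signs;
    averaging within the pair reduces the variance of the estimator."""
    rng = random.Random(seed)
    dt = t / steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    payoffs = []
    for _ in range(n_pairs):
        zs = [rng.gauss(0, 1) for _ in range(steps)]
        pair = []
        for sign in (1, -1):            # antithetic pair: z and -z
            s, knocked = s0, False
            for z in zs:
                s *= math.exp(drift + sign * vol * z)
                if s >= barrier:        # barrier hit: option knocks out
                    knocked = True
                    break
            pair.append(0.0 if knocked else max(s - k, 0.0))
        payoffs.append(0.5 * (pair[0] + pair[1]))   # average the pair
    return math.exp(-r * t) * sum(payoffs) / n_pairs

print(round(up_and_out_call(100, 100, 130, 0.02, 0.2, 1.0, 50, 5000), 3))
```

The up-and-out payoff is a natural test case for variance reduction because the knock-out makes the payoff distribution strongly skewed, so a crude estimator converges slowly.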
We formulate a bivariate stochastic volatility jump-diffusion model with correlated jumps and volatilities. An MCMC Metropolis-Hastings sampling algorithm is proposed to estimate the model's parameters and latent state variables (jumps and stochastic volatilities) given observed returns. The methodology is successfully tested on several artificially generated bivariate time series and then on the two most important Czech financial market time series: FX (CZK/EUR) and stock (PX index) returns. Four bivariate models, with and without jumps and/or stochastic volatility, are compared using the deviance information criterion (DIC), confirming the importance of incorporating jumps and stochastic volatility into the model.
Estimating LGD Correlation. University of Economics, Working Paper. The paper proposes a new method to estimate the correlation of account-level Basle II loss given default (LGD). The correlation determines the probability distribution of portfolio-level LGD in the context of a copula model, which is used to stress the LGD parameter as well as to estimate the LGD discount rate and other parameters. Given historical LGD observations, we apply the maximum likelihood method to find the best correlation estimate. The method is applied and analyzed on a large real data set of unsecured retail account-level LGDs and the corresponding monthly series of average LGDs. The correlation estimate comes relatively close to the PD regulatory correlation. It is also tested for stability using the bootstrapping method and used in an efficient formula to estimate ex ante one-year stressed LGD, i.e., one-year LGD quantiles at any reasonable probability level.
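Under a one-factor Gaussian copula with a probit-normal portfolio LGD, a stressed one-year LGD quantile has a closed Vasicek-type form. The sketch below is our illustration of such a formula under those assumptions; the paper's exact parametrization may differ.

```python
from statistics import NormalDist

def stressed_lgd(mean_lgd, rho, q=0.99):
    """Vasicek-type portfolio LGD quantile at probability level q.

    Assumes a one-factor Gaussian copula driving account-level LGDs with
    correlation rho, and a probit-normal distribution of the portfolio
    LGD; both are modelling assumptions of this sketch."""
    n = NormalDist()
    return n.cdf((n.inv_cdf(mean_lgd) + rho ** 0.5 * n.inv_cdf(q))
                 / (1 - rho) ** 0.5)

# a 99% stressed LGD for an illustrative mean LGD of 45% and rho of 10%
print(round(stressed_lgd(0.45, 0.1, 0.99), 3))
```

The stressed quantile increases in both the mean LGD and the correlation, which is why an accurate correlation estimate matters for the stress figure.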
The paper analyzes a two-factor credit risk model that captures default and recovery rate variation, their mutual correlation, and their dependence on various explanatory variables. At the same time, it allows the unexpected credit loss to be computed analytically. We propose and empirically implement an estimation of the model based on aggregate and exposure-level Moody's default and recovery data. The results confirm the existence of a significantly positive default and recovery rate correlation. We empirically compare the unexpected loss estimates based on the reduced two-factor model with Monte Carlo simulation results and with the current regulatory formula outputs. The results show a very good performance of the proposed analytical formula, which could feasibly replace the current regulatory formula.
We compare two approaches to stochastic volatility and jump estimation in the EUR/USD time series: the non-parametric power-variation approach using high-frequency returns, and the parametric Bayesian approach (MCMC estimation of SVJD models) using daily returns. We find that both methods identify the continuous stochastic volatility similarly, but not the jump component. First, the jumps identified by the non-parametric high-frequency estimators are much more numerous than those found by the Bayesian method on daily data. More importantly, we find that the daily jump probabilities assigned by the two methods are virtually rank-uncorrelated (Spearman rank correlation of 0.0148), meaning that the two methods do not identify jumps on the same days. In fact, the jump probabilities inferred by the non-parametric approach are not strongly correlated even with the daily realized variance or the daily squared returns, indicating that the discontinuous price changes (jumps) observed at high frequencies may not be distinguishable from continuous volatility at the daily frequency. As an additional result, we find strong evidence of jump size dependence and jump clustering (based on a self-exciting Hawkes process) for the jumps identified by the non-parametric method (the shrinkage estimator).
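The non-parametric side of the comparison rests on realized variance versus bipower variation. A minimal sketch with one injected jump follows; the 5-minute grid, the noise scale, and the jump size are all artificial.

```python
import math
import random

def realized_and_bipower(returns):
    """Daily realized variance (RV) and bipower variation (BV).

    RV = sum of squared intraday returns picks up both continuous
    volatility and jumps; BV = (pi/2) * sum of adjacent absolute-return
    products is asymptotically robust to jumps, so RV - BV estimates
    the jump contribution to the day's variance."""
    rv = sum(r * r for r in returns)
    mu1 = math.sqrt(2 / math.pi)          # E|Z| for standard normal Z
    bv = (1 / mu1 ** 2) * sum(abs(returns[i]) * abs(returns[i - 1])
                              for i in range(1, len(returns)))
    return rv, bv

rng = random.Random(9)
intraday = [rng.gauss(0, 0.001) for _ in range(288)]   # 5-minute returns
intraday[100] += 0.01                                  # inject one jump
rv, bv = realized_and_bipower(intraday)
print(rv > bv)   # the jump inflates RV but barely moves BV
```

A formal jump test then compares the scaled difference RV - BV to its asymptotic distribution; the sketch only shows the raw gap that such tests are built on.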