Quantitative Consulting has many interesting research publications. Some were completed for company purposes, others are outcomes of our employees' academic research. Some of the content is accessible only after registering your email address; the rest is downloadable directly from this site.
We hope you will find what you are looking for!
Estimating LGD Correlation. University of Economics, Working Paper. The paper proposes a new method to estimate the correlation of account-level Basle II Loss Given Default (LGD). The correlation determines the probability distribution of portfolio-level LGD in the context of a copula model, which is used to stress the LGD parameter as well as to estimate the LGD discount rate and other parameters. Given historical LGD observations, we apply the maximum likelihood method to find the best correlation estimate. The method is applied and analyzed on a large real data set of unsecured retail account-level LGDs and the corresponding monthly series of average LGDs. The correlation estimate comes relatively close to the PD regulatory correlation. It is also tested for stability using the bootstrap method and used in an efficient formula to estimate ex ante one-year stressed LGD, i.e. one-year LGD quantiles at any reasonable probability level.
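A stressed-LGD quantile of the kind mentioned above can be sketched with the standard single-factor Gaussian copula (Vasicek-type) formula. This is a generic illustration of such a formula, not necessarily the exact one derived in the paper, and the parameter values are made up:

```python
from statistics import NormalDist

def stressed_lgd(mean_lgd, rho, alpha):
    """One-year stressed LGD quantile under a single-factor Gaussian
    copula (Vasicek-type formula). Generic illustration only."""
    nd = NormalDist()
    num = nd.inv_cdf(mean_lgd) + rho ** 0.5 * nd.inv_cdf(alpha)
    return nd.cdf(num / (1 - rho) ** 0.5)

# Hypothetical inputs: average LGD 45%, correlation 0.10, 99.9% level
print(round(stressed_lgd(0.45, 0.10, 0.999), 3))
```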
Estimations of Market and Credit Value at Risk. Bachelor thesis, MFF UK. This work studies statistical estimation of market and credit risk by the risk measure called Value at Risk. It describes the definition of Value at Risk, its estimation for market risk by the variance-covariance method, the historical simulation method and the Monte Carlo simulation method, and its estimation for credit risk by the most widely known methods CreditMetrics, CreditRisk+ and KMV. This part ends with the historical development and calculation of capital adequacy. The analytical part analyses the main advantages and disadvantages of Value at Risk on the example of a portfolio composed of exchange-traded funds. The aim of this work is to describe Value at Risk as a whole, describe its advantages and analyse its disadvantages.
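The historical simulation method mentioned above reduces to taking an empirical loss quantile of past portfolio P&L. A minimal sketch, with an artificial P&L series (real implementations interpolate quantiles and scale to the holding period):

```python
def historical_var(pnl, alpha=0.99):
    """Historical-simulation VaR: the empirical alpha-quantile of the
    loss distribution built from past portfolio P&L observations."""
    losses = sorted(-x for x in pnl)      # losses as positive numbers
    k = int(alpha * len(losses))          # index of the alpha-quantile
    return losses[min(k, len(losses) - 1)]

# Toy example: 100 artificial daily P&L observations
pnl = [(-1) ** i * (i % 7) * 0.5 for i in range(100)]
print(historical_var(pnl, alpha=0.95))
```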
Weak convergence in D[0,1]. Presentation for the PhD seminar Stochastic modeling in economics and finance, MFF UK, 2008. The presentation covers weak convergence in the space D[0,1].
Exponential Smoothing for Irregular Time Series. Poster for the Robust 2008 conference. The poster highlights problems resulting from irregularly spaced observations and suggests possible solutions. Besides already published methods, it presents the author's own methods or modifications (exponential smoothing of order m, the irregularly observed ARIMA(0, 1, 1) process, a modified Holt's method, and a Holt-Winters method modeling seasonality by trigonometric functions).
Exponential Smoothing, Bachelor thesis, MFF UK. The thesis deals with two exponential smoothing type methods for non-seasonal time series with a local linear trend. The main part of the thesis is a theoretical derivation of the MSE and the autocorrelation coefficient of forecasting errors when the Holt method is used, for all combinations of smoothing constants, on time series generated by an ARIMA(0, 2, 2) process with all combinations of its parameters. The theoretically derived formulae are also applied to the Brown method, verified via simulations and tried on real time series. Practical conclusions related to both methods are formulated.
Improved Holt Method for Irregular Time Series. WDS´08 Proceedings of Contributed Papers, Part I, 2008. The paper suggests an improvement of the Holt method for irregular time series as presented by Wright. The modification deals with the problem of observations that lie close together in time. A simulation study compares the performance of the original and the improved method.
Step by Step Credit Risk Model Construction. Bachelor thesis, MFF UK. The aim of the work is to outline the principles of scoring model construction. It describes the logistic regression method, the estimation of its parameters and their significance testing. On the basis of odds ratios of variables it defines the Independence model as an estimate of the conditional odds of a client's ability to pay. It generalizes this model by adding individual weights to groups and categories of client characteristics, arriving at the WOE model and the Full logistic model. The work also studies how to measure the diversification power of the models by the Lorenz curve and Somers' d statistic as an estimate of the Gini coefficient. The described methods are applied to a practical scoring model construction, and the suitability and diversification power of the introduced models are compared on real data.
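The weight of evidence and the Gini coefficient (Somers' d) mentioned above can be sketched as follows; the function names and the example data are illustrative only, not taken from the thesis:

```python
from math import log

def woe(goods, bads, total_goods, total_bads):
    """Weight of evidence of one category of a client characteristic:
    log of the ratio of the category's share of goods to its share of bads."""
    return log((goods / total_goods) / (bads / total_bads))

def gini(scores, defaults):
    """Gini coefficient (Somers' d) of a score: 2*AUC - 1, computed by
    comparing all good/bad pairs. O(n^2) -- fine for an illustration."""
    goods = [s for s, d in zip(scores, defaults) if d == 0]
    bads = [s for s, d in zip(scores, defaults) if d == 1]
    conc = sum((g > b) + 0.5 * (g == b) for g in goods for b in bads)
    return 2 * conc / (len(goods) * len(bads)) - 1

# A score that perfectly separates goods (0) from bads (1) has Gini 1
print(gini([0.9, 0.8, 0.4, 0.3], [0, 0, 1, 1]))
```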
Irregular Periodic Time Series. Dissertation progress report, talk at the doctoral seminar Stochastic modelling in economics and finance, MFF UK, 2008.
Exponential Smoothing for Irregular Time Series. Kybernetika journal, 2008. The paper deals with extensions of exponential smoothing methods for univariate irregular time series. An alternative to Wright's modification of simple exponential smoothing is suggested. Exponential smoothing of order m for irregular data is derived in two different ways. Maximum likelihood parameter estimation for forecasting methods in irregular time series is suggested. The suggested methods are compared with the existing ones in a numerical simulation study.
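One common formulation of simple exponential smoothing for irregular data weights each observation by a discount factor raised to the elapsed time; Wright's method evaluates an average of this kind recursively. A direct-form sketch under that assumption, with made-up observation times:

```python
def ses_irregular(times, values, alpha=0.3):
    """Simple exponential smoothing for irregularly spaced data:
    a weighted average whose weights decay as (1-alpha)^(elapsed time).
    Direct form for illustration; Wright's recursion is equivalent."""
    beta = 1 - alpha
    t_last = times[-1]
    weights = [beta ** (t_last - t) for t in times]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# Observations at irregular times: the long gap between t=1.0 and
# t=3.5 down-weights the early observations more strongly
times = [0.0, 1.0, 3.5, 4.0]
values = [10.0, 12.0, 11.0, 13.0]
print(ses_irregular(times, values, alpha=0.3))
```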
Cointegrated Time Series Models, Diploma thesis, MFF UK. The thesis deals with the concept of cointegration, which is an appropriate tool in the analysis of nonstationary processes. It concentrates on the models commonly used in cointegration analysis of time series and describes the direct connection between the error-correction (EC) model and the autoregressive distributed lags (ADL) model. The theoretical results are applied to real data in a demand-for-money model.
Basle II Capital Requirement Sensitivity to the Definition of Default. Risk management. Prague : Oeconomica, 2008. The paper is motivated by a disturbing observation: the outcome of the regulatory formula significantly depends on the definition of default used to measure the probability of default (PD) and the loss given default (LGD) parameters. Basel II regulatory capital should estimate, at a certain probability level, unexpected credit losses on banking portfolios, and so it should not depend on a particular definition of default that does not change real historical and expected losses. The paper provides an explanation of the phenomenon based on the Merton default model and tests it using a Monte Carlo simulation. Finally, we perform a structural-model-based simulation to test the hypothesis that scoring functions developed with a soft definition of default have weaker predictive power than those developed with a hard definition of default.
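A Monte Carlo default-probability estimate under a generic Merton-type model (default when the simulated asset value ends below the debt level) can be sketched as follows. The parameters are hypothetical and this is not the paper's exact simulation set-up:

```python
import random
from math import exp, sqrt

def merton_pd_mc(a0, debt, mu, sigma, horizon=1.0, n=100_000, seed=1):
    """Monte Carlo default probability in a Merton-type model:
    asset value follows geometric Brownian motion; default occurs
    when the terminal value falls below the debt level."""
    rng = random.Random(seed)
    drift = (mu - 0.5 * sigma ** 2) * horizon
    vol = sigma * sqrt(horizon)
    defaults = sum(
        a0 * exp(drift + vol * rng.gauss(0, 1)) < debt for _ in range(n)
    )
    return defaults / n

# Hypothetical firm: assets 100, debt 70, 5% drift, 25% volatility
print(merton_pd_mc(a0=100, debt=70, mu=0.05, sigma=0.25))
```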
Valuation of Convexity Related Derivatives. IES WP 4/2008. We investigate the valuation of derivatives whose payoff is defined as a nonlinear, though close to linear, function of tradable underlying assets. Derivatives involving Libor or swap rates in arrears, i.e. rates paid at the wrong time, are a typical example. It is generally tempting to replace the future unknown interest rates with the forward rates. We show rigorously that this is indeed not possible in the case of Libor or swap rates in arrears. We obtain a precise, fully analytical formula applicable to a wide class of convexity related derivatives and illustrate the techniques and the different results on a case study of a real-life controversial exotic swap.
Talk at the doctoral seminar Stochastic modelling in economics and finance, MFF UK, 2007. The topic is the search for an investment portfolio with maximal expected investor utility, exponential utility and relative entropy.
Decomposition Methods for Time Series with Irregular Observations, Diploma thesis, MFF UK. The work deals with extensions of classical exponential smoothing type methods for univariate time series with irregular observations. Extensions of simple exponential smoothing, the Holt method, the Holt-Winters method and double exponential smoothing developed in the past are presented. An alternative to Wright's modification of simple exponential smoothing for irregular data is suggested. A program implementing most of the presented methods is part of the work.
Geometrical Properties of the r-neighborhood of Brownian Motion and Related Random Structures. Doctoral dissertation, MFF UK, 2007. The thesis focuses mainly on the study of a particular random compact set called the Wiener sausage. Heuristically, it is the trace of a moving spherical object in Euclidean space along a Brownian trajectory up to a finite time. Apart from the expected volume and surface area, their asymptotic behaviour is also studied. Moreover, the main characteristics of the Boolean model of Wiener sausages are derived.
Markowitz model. Term exercise for the subject Optimization II with applications in finance, MFF UK, 2006. The Markowitz model (1952) is used to find an optimal equity portfolio in terms of maximizing profit and minimizing risk (i.e. a multicriteria optimization problem). The model is applied to real data from the Czech stock market; the portfolio optimization itself is carried out for five different sets of restrictive conditions.
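For the two-asset case, the global minimum-variance portfolio of the Markowitz model has a simple closed form, which a toy sketch can illustrate (the variances and covariance below are made up, not the Czech market data used in the exercise):

```python
def min_variance_weights(var1, var2, cov):
    """Global minimum-variance weights for a two-asset Markowitz
    portfolio: w1 = (var2 - cov) / (var1 + var2 - 2*cov)."""
    w1 = (var2 - cov) / (var1 + var2 - 2 * cov)
    return w1, 1 - w1

# Hypothetical annualized variances 0.04 and 0.09, covariance 0.01:
# most weight goes to the less volatile asset
print(min_variance_weights(0.04, 0.09, 0.01))
```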
Maintaining price stability in the Czech Republic and the euro area. Essay for the subject European Economic Policies, IES FSV UK. The work summarizes the relatively short history of inflation targeting in the Czech Republic and describes the mechanism, including the so-called inflation outlook, which is its key attribute. It then analyzes the role of fiscal policy in the context of inflation targeting. The last chapter is devoted to maintaining price stability in the euro area, carried out under the supervision of the European Central Bank.
Estimating the annual average radon concentration in a building. Seminar work for the subject Econometrics, MFF UK. The aim of this work is to estimate the annual average radon concentration in a building from shorter observations. Real data from the State Office for Radiation Protection were available (the radon concentration in the observed building and basic meteorological data).
Summary of statistical errors in survey sampling estimation methods. Published in the yearbook Statistika ČSÚ, 2005. The article studies several procedures for estimating the statistical error of relative frequencies. It focuses on different types of suitable probability distributions (hypergeometric, binomial, Poisson), including possible approximations (normal distribution). It also suggests alternative resampling methods for error estimation (bootstrap and jackknife) and methods for error estimation under cluster sampling. The work also deals with the error estimation of other commonly used characteristics such as averages and standard deviations.
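The bootstrap error estimation mentioned above can be sketched for a relative frequency; the sample below is artificial, and for a 0/1 sample the result should come close to the binomial standard error sqrt(p(1-p)/n):

```python
import random

def bootstrap_se(sample, stat, n_boot=2000, seed=42):
    """Bootstrap standard error of a statistic: resample with
    replacement, recompute the statistic, and take the standard
    deviation of the replicates."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resample = [rng.choice(sample) for _ in sample]
        reps.append(stat(resample))
    mean = sum(reps) / n_boot
    return (sum((r - mean) ** 2 for r in reps) / (n_boot - 1)) ** 0.5

# Relative frequency of ones in an artificial 0/1 sample of size 100
data = [1] * 30 + [0] * 70
se = bootstrap_se(data, lambda s: sum(s) / len(s))
print(round(se, 3))
```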
Seigniorage in Continuous Time. Article published in the Politická ekonomie journal, 2004. The government is able to acquire real goods by printing money. Because the government does not create wealth by printing money, this revenue, the seigniorage, comes at the expense of the public, as the purchasing power of monetary units decreases with the issue of new money. The authors use a model of auctions to which the public comes with its money and the government with the newly issued money. The value of goods acquired by the government in such an auction equals the newly printed money divided by the sum of the newly printed money and the money spent by the public. From this auction model, the authors develop a formula for seigniorage in continuous time. The seigniorage calculated in this way is lower than that calculated under the assumption of discrete changes in economic variables.
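The auction rule above can be illustrated numerically; the amounts are hypothetical, and the fraction is read as the government's share of the auctioned goods:

```python
def seigniorage_share(new_money, public_money):
    """Share of the auctioned goods the government obtains by bidding
    newly printed money against the money spent by the public."""
    return new_money / (new_money + public_money)

# If the public brings 900 units and the government prints 100,
# the government captures 100/1000 of the goods on sale
print(seigniorage_share(100, 900))
```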