Quantitative Consulting has published a number of interesting research papers. Some were completed for company purposes, while others are outcomes of our employees' academic research. Some of the content is accessible only after registering your email address; the rest can be downloaded directly from this site.
We hope you will find what you are looking for!
The profitability of a trading system based on the momentum-like effects of price jumps was tested on the time series of 7 assets (the EUR/USD, GBP/USD, USD/CHF and USD/JPY exchange rates and Light Crude Oil, E-Mini S&P 500 and VIX Futures), in each case for 7 different frequencies (ranging from 1-minute to 1-day), over a period of more than 20 years (for all assets except the VIX) ending in the second half of 2015. The proposed trading system entered long and short trades in the direction of price jumps, at the closing price of the period in which the jump occurred. The position was held for a fixed number of periods that was optimized on the in-sample period. Jumps were identified with the non-parametric L-Estimator, whose inputs (the period used for the local volatility calculation and the confidence level used for jump detection) were also optimized on the in-sample period. The proposed system achieved promising results for the 4 currency markets, especially at the 15-minute and 30-minute frequencies, at which 3 of the 4 tested currencies turned profitable (with the highest profits achieved by USD/CHF, followed by EUR/USD and GBP/USD), with profits totaling up to 30-50% p.a. in a high-leverage scenario, or 15-25% p.a. in a low-leverage scenario. Additionally, the 5-minute frequency turned profitable for USD/CHF and the 4-hour frequency for GBP/USD, while the 1-minute frequency was unprofitable in all cases due to commissions, and the 1-day frequency contained too few jumps to draw any conclusions. As for the futures markets, the system achieved profits only on the Light Crude Oil market, at the 1-hour, 4-hour and 1-day frequencies, with profits totaling up to 20% p.a. with high leverage or 10% p.a. with low leverage. For USD/JPY, E-Mini S&P 500 Futures and VIX Futures the system mostly ended with a loss.
We attribute this (in the latter two cases) to a rising market risk premium following negative jumps, which works against the jump-momentum effect exploited by the system.
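The abstract does not spell out the detection rule, but a minimal sketch of an L-Estimator-style jump test (in the spirit of the Lee-Mykland statistic) illustrates the two inputs that the study optimizes in-sample: the local-volatility window and the confidence level. The function name and default values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def detect_jumps(returns, window=16, conf=0.99):
    """Flag returns as jumps when they are too large relative to a local
    bipower-variation volatility estimate (Lee-Mykland-style L-statistic).
    `window` and `conf` are the two inputs the study optimizes in-sample."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    c = np.sqrt(2.0 / np.pi)                     # E|Z| for Z ~ N(0, 1)
    sigma = np.full(n, np.nan)
    for i in range(window, n):
        w = np.abs(r[i - window:i])
        # bipower variation: robust to jumps inside the local window
        sigma[i] = np.sqrt(np.mean(w[1:] * w[:-1]) / c ** 2)
    L = np.abs(r) / sigma
    # Gumbel-based critical value for the maximum of n such statistics
    an = np.sqrt(2.0 * np.log(n))
    Cn = an / c - (np.log(np.pi) + np.log(np.log(n))) / (2.0 * c * an)
    Sn = 1.0 / (c * an)
    threshold = Cn + Sn * (-np.log(-np.log(conf)))
    return L > threshold
```

On such a flagged period, the system described above would enter a trade in the direction of the jump at the period's closing price; the first `window` observations are never flagged because no local volatility estimate exists for them.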
Three different classes of data mining methods (k-Nearest Neighbour, Ridge Regression and Multilayer Perceptron Feed-Forward Neural Networks) are applied for the purpose of quantitative trading on 10 simulated time series, as well as on real-world time series of 10 currency exchange rates ranging from 1 November 1999 to 12 June 2015. Each method is tested in multiple variants. The k-NN algorithm is applied alternatively with the Euclidean, Manhattan, Mahalanobis and Maximum distance functions. Ridge Regression is applied in linear and quadratic form, and the Feed-Forward Neural Network is applied with either 1, 2 or 3 hidden layers. In addition, Principal Component Analysis (PCA) is alternatively applied for dimensionality reduction of the predictor set, and the meta-parameters of the methods are optimized on the validation sample. In the simulation study, a Stochastic-Volatility Jump-Diffusion model, extended alternatively with 10 different non-linear conditional mean patterns, is used to simulate the asset price behaviour to which the tested methods are applied. The results show that no single method was able to profit on all of the non-linear patterns in the simulated time series; instead, different methods worked well for different patterns. Either past price movements or past returns were used as predictors. When past price movements were used, quadratic ridge regression achieved the most robust results, followed by some of the k-NN methods. When past returns were used, the k-NN based methods were the most consistently profitable, followed by the linear and quadratic ridge regressions. Neural networks, while able to profit on some of the time series, did not achieve a profit on most of the others. No evidence was found that PCA improves the results of the tested methods in a systematic way. In the second part of the study, the models were applied to empirical foreign exchange rate time series.
Overall, the profitability of the methods was rather low, with most of them ending with a loss on most of the currencies. The most profitable currency was EURUSD, followed by EURJPY, GBPJPY and EURGBP. The most successful methods were the linear ridge regression and the Manhattan distance based k-NN method, both of which ended with profits for most of the time series (unlike the other methods). Finally, a forward selection procedure using the linear ridge regression was applied to extend the original predictor set with technical indicators. The selection procedure achieved limited success, improving the out-sample results of the linear ridge regression model but not of the other models.
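For concreteness, the quadratic ridge regression variant — the most robust one in the price-movement setting — amounts to expanding the lagged predictors with squares and cross-products and fitting ridge in closed form. The sketch below is a generic illustration; the function names and the penalty value are assumptions, not the study's actual configuration.

```python
import numpy as np

def quadratic_features(X):
    """Expand the predictor matrix with squares and pairwise products --
    the 'quadratic' part of the quadratic ridge regression variant."""
    n, k = X.shape
    cols = [X, X ** 2]
    for i in range(k):
        for j in range(i + 1, k):
            cols.append((X[:, i] * X[:, j]).reshape(-1, 1))
    return np.hstack(cols)

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge estimate; centring leaves the intercept un-penalized."""
    Xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - Xm, y - ym
    beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ yc)
    return beta, ym - Xm @ beta

def ridge_predict(X, beta, intercept):
    return X @ beta + intercept
```

The sign of the prediction would give the trade direction; the penalty `lam` is exactly the kind of meta-parameter the study optimizes on the validation sample.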
This paper investigates the drivers of cross-currency basis spreads, which were historically close to zero but have widened significantly since the start of the financial crisis. Credit and liquidity risk, as well as supply and demand, have often been cited as general factors driving cross-currency basis spreads; however, these spreads may widen beyond what is normally explained by such variables. We suggest market proxies for the drivers of the EUR/USD basis swap spread and build a multiple regression and cointegration model to test their significance during three different historical periods of basis widening. The most important drivers of the cross-currency basis spreads appear to be short- and medium-term EU financial sector credit risk indicators and, to a slightly lesser extent, short- and medium-term US financial sector credit risk indicators. Other important drivers are market volatility for the short-end basis spread and the EUR/USD exchange rate for the medium-term basis spread, and, to a lesser extent, the Fed/ECB balance sheet ratio.
Quantitative investment strategies are often selected from a broad class of candidate models estimated and tested on historical data. Standard statistical techniques to prevent model overfitting, such as out-sample back-testing, turn out to be unreliable when the selection is based on the results of too many models tested on the holdout sample. There is an ongoing discussion on how to estimate the probability of back-test overfitting and how to adjust expected performance indicators such as the Sharpe ratio so that they properly reflect the effect of multiple testing. We propose a Bayesian approach that consistently yields the desired robust estimates based on an MCMC simulation. The approach is tested on a class of technical trading strategies where the naïve approach would select a seemingly profitable strategy.
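The multiple-testing problem the paper addresses is easy to reproduce. The toy simulation below (our illustration, not the paper's MCMC procedure) generates many strategies with zero true edge, selects the one with the best in-sample Sharpe ratio, and shows that the holdout performance of the selected strategy carries none of it.

```python
import numpy as np

def annual_sharpe(pnl):
    """Annualized Sharpe ratio from daily P&L (252 trading days)."""
    return pnl.mean(axis=-1) / pnl.std(axis=-1) * np.sqrt(252)

rng = np.random.default_rng(7)
n_strategies, n_days = 200, 252
# zero-edge strategies: any observed Sharpe ratio is pure luck
insample = rng.normal(0.0, 0.01, size=(n_strategies, n_days))
holdout = rng.normal(0.0, 0.01, size=(n_strategies, n_days))

best = int(np.argmax(annual_sharpe(insample)))
print("selected in-sample Sharpe:", round(float(annual_sharpe(insample)[best]), 2))
print("its holdout Sharpe:       ", round(float(annual_sharpe(holdout)[best]), 2))
```

The selected in-sample Sharpe ratio is typically around 2.5-3 purely by chance, while the holdout Sharpe of the same strategy is statistically indistinguishable from zero — exactly the bias the proposed Bayesian adjustment is meant to remove.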
A non-parametric approach to financial time series jump estimation, using the L-Estimator, is compared with a parametric approach utilizing a Stochastic-Volatility Jump-Diffusion (SVJD) model, estimated with MCMC and extended with Particle Filters to estimate the out-sample evolution of its latent state variables, such as the jump occurrences. The comparison is performed on simulated time series with different kinds of dynamics, including Poisson jumps, self-exciting Hawkes jumps with long-term clustering, as well as co-jumps. In addition, a comparison is performed on the real-world daily time series of 4 major currency exchange rates. The results of the simulation study show that for in-sample estimation, the MCMC-based parametric approach significantly outperforms the L-Estimator. For the out-sample estimates, based on a combination of MCMC and Particle Filters and used to sequentially estimate the jump occurrences immediately at the times at which they occur, the parametric approach achieves an accuracy similar to the non-parametric one in simulations with relatively large Poisson jumps, and it outperforms the non-parametric approach in the case of Hawkes jumps when the jumps are large. On the other hand, the L-Estimator provides better results than the parametric approach in all of the cases when the simulated jumps are small (1% or less), regardless of the jump process dynamics. The application of the methods to foreign exchange rate time series further shows that the estimates of the parametric method may be biased when large outlier jumps occur in the time series, as well as when the stochastic volatility grows too high (as happened during the crisis). In both of these cases, the non-parametric L-Estimator based approach seems to provide more robust jump estimates, less influenced by the mentioned issues.
The thesis covers a wide range of topics from credit risk modeling, with the emphasis put on the pricing of claims subject to default risk. Starting with a separate general contingent claim pricing framework, the key topics are classified into three fundamental parts: firm-value models, reduced-form models, and portfolio problems, with a possible finer sub-classification. Every part provides a theoretical discussion, a proposal of self-developed methodologies, and related applications designed to be close to real-world problems.
The text also reveals several new findings from various fields of credit risk modeling. In particular, it is shown (i) that the stock option market is a good source of credit information, (ii) how the reduced-form modeling framework can be extended to capture more complicated problems, (iii) that the double t copula together with a self-developed portfolio modeling framework outperforms the classical Gaussian copula approaches. Many other, partial findings are presented in the relevant chapters and some other results are also discussed in the Appendix.
A review of the vast literature on asset pricing tests in advanced and emerging markets suggests that no consensus has yet been reached on the right approach for dealing with emerging market specifics such as thin trading, market concentration and high volatility. This paper considers the class of linear factor models, best known through the Capital Asset Pricing Model (CAPM) and its subsequent modifications, which are the ultimate topic of this paper. Despite numerous research papers criticising traditional linear models and attempting to overcome their embedded limitations, practitioners as well as academics return repeatedly to existing models such as the CAPM. Anderson, Bollerslev, Diebold and Wu (2006) argue that the death of the model was greatly exaggerated: firstly, because the model often works well despite its wrinkles, and secondly, because more advanced multi-factor models that offer a better statistical fit lack the economic explanation of the variables and their interpretation in terms of systematic risk.
This opens the floor for a discussion about the choice of asset pricing models, specifically whether the traditional CAPM should be applied or whether alternative models such as the D-CAPM should be preferred. This paper offers statistical tests of the traditional CAPM, the Fama-French CAPM and the D-CAPM on a set of indices and portfolios, using GMM two-step simple and multiple regressions. The results show that, on average, downside beta tends to perform better than traditional beta in both emerging and developed markets. However, caution should be exercised in the use of systematic risk measures, as in emerging markets total risk measures such as semivariance and standard deviation can be preferable. Overall, unconditional models should not be the centre of the discussion, as many research papers, along with this master thesis, have shown that beta is non-constant over time, which confirms the general finding of non-constant volatility. The last chapter of this paper therefore looks at key conditional models.
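For concreteness, the downside beta that distinguishes the D-CAPM from the traditional CAPM can be computed as follows. This is an Estrada-style definition over downside deviations — a sketch for illustration, not necessarily the exact estimator used in the thesis.

```python
import numpy as np

def beta(asset, market):
    """Traditional CAPM beta: covariance with the market over market variance."""
    a = np.asarray(asset, dtype=float) - np.mean(asset)
    m = np.asarray(market, dtype=float) - np.mean(market)
    return np.mean(a * m) / np.mean(m ** 2)

def downside_beta(asset, market):
    """D-CAPM downside beta: the same ratio computed only from downside
    deviations (returns truncated at their means from above)."""
    a = np.minimum(np.asarray(asset, dtype=float) - np.mean(asset), 0.0)
    m = np.minimum(np.asarray(market, dtype=float) - np.mean(market), 0.0)
    return np.mean(a * m) / np.mean(m ** 2)
```

For symmetric return distributions the two measures roughly coincide; they diverge for assets whose co-movement with the market is concentrated in downturns, which is precisely the case the downside measure is designed to capture.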
This thesis analyses the impact of announcements of macroeconomic indicators in the United States on the price development of VIX Futures, S&P 500 Futures and the EUR/USD FX rate. The theoretical part contains the construction and description of the individual markets. The empirical part investigates the reaction of market prices 1, 10 and 30 minutes after the announcement of an individual indicator value to the market surprise, defined as the difference between the reported value and analysts' expectations. We tried to find a systematic reaction of market participants and to measure the pace at which new information is absorbed into the market price. We found only a minimum of situations in which the market move could be explained as a linear function of the market surprise. However, there were several cases in which the market did not adjust to the announced information quickly and was inefficient over a short period. In the second part of the empirical research we tested all significant models on out-sample data. The goal was to determine whether the market inefficiencies persisted and a stable profit could be achieved. We analysed the gross performance and then the net performance including all transaction costs. Finally, we defined simple trading rules with the purpose of stabilizing profits and lowering the riskiness of the trades. For the VIX Futures and EUR/USD markets we achieved a small loss and a negligible profit, respectively. For S&P 500 Futures we obtained profitable strategies for all selected indicators; the total profit was high, with a very low volatility of the invested capital.
In this study we analyse the relationship between the classical approach to the valuation of linear interest rate derivatives and the post-crisis approach, in which the valuation better reflects credit and liquidity risk and the economic costs of the transaction on top of the risk-free rate. We discuss collateralization as a method to diminish counterparty credit risk, its impact on derivatives pricing, and how overnight indexed swap (OIS) rates became the market standard for discounting future derivatives' cash flows. We show that using one yield curve both to estimate the forward rates and to discount the expected future cash flows is no longer possible in an arbitrage-free market. We review in detail three fundamental interest rate derivatives (the interest rate swap, the basis swap and the cross-currency swap) and derive discount factors used for calculating the present value of expected future cash flows that are consistent with market quotes. We also investigate the drivers behind basis spreads, in particular credit and liquidity risk and supply and demand forces, and show how they impact the valuation of derivatives. We analyse Czech swap rates and propose an estimation of the CZK OIS curve and approximate discount rates in the case of cross-currency swaps. Finally, we discuss inflation markets and the consistent valuation of inflation swaps.
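The core of the dual-curve point — forwards projected off one curve, cash flows discounted on OIS — fits in a few lines. The sketch below uses flat continuously compounded toy curves; real bootstrapped curves, day-count conventions and the CZK specifics discussed in the study are deliberately left out.

```python
import numpy as np

def df(zero_rate, t):
    """Discount factor from a flat continuously compounded zero rate (toy curve)."""
    return np.exp(-zero_rate * t)

def simple_forward(proj_rate, t1, t2):
    """Money-market forward rate over [t1, t2] implied by the projection curve."""
    return (df(proj_rate, t1) / df(proj_rate, t2) - 1.0) / (t2 - t1)

def irs_pv_receiver(fixed_rate, pay_times, ois_rate, proj_rate, notional=1.0):
    """PV of a receiver-fixed IRS under dual-curve valuation: floating cash
    flows projected off `proj_rate`, everything discounted on the OIS curve."""
    pv, t_prev = 0.0, 0.0
    for t in pay_times:
        tau = t - t_prev
        fwd = simple_forward(proj_rate, t_prev, t)
        pv += notional * tau * (fixed_rate - fwd) * df(ois_rate, t)
        t_prev = t
    return pv
```

With a single curve (`proj_rate == ois_rate`) the swap struck at the implied forward prices to zero; once the projection curve sits above OIS — the post-crisis situation — the same fixed rate no longer makes the swap par, which is the one-curve inconsistency the study describes.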
We consider counterparty credit risk in interest rate swap (IRS) contracts in the presence of an adverse dependence between the default time and interest rates, the so-called wrong-way risk. The IRS credit valuation adjustment (CVA) semi-analytical formula based on the Gaussian copula assumption, presented in Černý and Witzany, is further replaced by a Fréchet copula (for extreme dependence) mainly based on the work of Cherubini, called the modified approach. The results of all three CVA calculation approaches are compared in a numerical study, where we find that our semi-analytical formulas (the Gaussian copula and the modified approach) provide more accurate information on the IRS CVA price.
We compare two approaches to stochastic volatility and jump estimation in the EUR/USD time series: the non-parametric power-variation approach using high-frequency returns, and the parametric Bayesian approach (MCMC estimation of SVJD models) using daily returns. We find that both methods identify the continuous stochastic volatility similarly, but they differ in identifying the jump component. Firstly, the jumps estimated using the non-parametric high-frequency estimators are much more numerous than those found by the Bayesian method using daily data. More importantly, we find that the daily jump probabilities assigned by the two methods are virtually not rank-correlated (the Spearman rank correlation is 0.0148), meaning that the two methods do not identify jumps on the same days. In fact, the jump probabilities inferred using the non-parametric approach are not much correlated even with the daily realized variance and the daily squared returns, indicating that the discontinuous price changes (jumps) observed at high frequencies may not be distinguishable (from the continuous volatility) at the daily frequency. As an additional result, we find strong evidence of jump size dependence and jump clustering (based on the self-exciting Hawkes process) for the jumps identified using the non-parametric method (the shrinkage estimator).
Quantitative methods to assess the creditworthiness of loan applicants are vital for the profitability and the transparency of the lending business. With the total loan volumes typical for traditional financial institutions, even the slightest improvement in credit scoring models can translate into substantial additional profit. Yet for regulatory reasons, and due to the potential model risk, banks tend to be reluctant to replace logistic regression as the industry standard with new algorithms. This does not stop researchers from examining such new approaches, though. This thesis discusses the potential of support vector machines to become an alternative to logistic regression in credit scoring. Using a real-life credit data set obtained from the P2P lending platform Bondora, scoring models were built to compare the discrimination power of support vector machines against the traditional approach. The results of the comparison were ambiguous. Linear support vector machines performed worse than logistic regression and their training consumed much more time. On the other hand, support vector machines with a non-linear kernel performed better than logistic regression and the difference was statistically significant at the 95% level. Despite this success, several factors prevent SVMs from widespread application in credit scoring, higher training times and lower robustness of the method being two of the major drawbacks. Considering the alternative algorithms which became available in the last 10 years, support vector machines cannot be recommended as a standalone method for credit risk models.
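The "discrimination power" being compared is typically measured by the area under the ROC curve. A minimal rank-based AUC (equivalent to the Mann-Whitney U statistic) is enough to score any pair of models on a holdout sample — a generic sketch, not the thesis's evaluation code.

```python
import numpy as np

def auc(scores, labels):
    """AUC via the Mann-Whitney U statistic: the probability that a randomly
    chosen defaulter (label 1) receives a higher risk score than a randomly
    chosen good client (label 0). Assumes continuous scores (no tie handling)."""
    s = np.asarray(scores, dtype=float)
    y = np.asarray(labels, dtype=int)
    ranks = np.empty(len(s))
    ranks[np.argsort(s)] = np.arange(1, len(s) + 1)
    n_bad = int(y.sum())
    n_good = len(y) - n_bad
    u = ranks[y == 1].sum() - n_bad * (n_bad + 1) / 2.0
    return u / (n_good * n_bad)
```

An AUC of 0.5 is a coin flip and 1.0 is perfect separation; a statistically significant difference such as the one reported in the thesis would correspond to formally comparing two such AUCs, e.g. with a DeLong-type test.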
This master’s thesis presents a comparison of the logistic regression algorithm and the decision tree algorithm for building the scoring models of financial institutions. The theoretical part of the thesis focuses on the description of both algorithms and their application in model building. The practical part of the thesis uses both algorithms to build models on a real dataset and then compares which algorithm gives better results. The thesis is focused on an applied description of the problem, and therefore it does not include precise mathematical definitions.
The thesis compares various approaches to the modelling of the term structure of interest rates. Several models are built, following two general frameworks: a dynamic Nelson-Siegel approach and an affine class of models. Based on an evaluation of the dynamic properties of the estimated models, particularly in terms of impulse responses and forecasting performance, the effects of an explicit inclusion of macroeconomic variables in the models are tested. The thesis shows that the benefit of such a macro-finance extension of the models varies in time and also differs between the two approaches. However, it is shown that the models can be considered complementary, as the particular approaches are useful under different macroeconomic conditions and financial market situations.
Moreover, unusually long maturities are included in the term structure of interest rates, and some of the models are shown to be able to forecast these maturities as well, particularly in certain periods of time.
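To make the Nelson-Siegel building block concrete: for a fixed decay parameter λ, the curve is linear in its three factors (level, slope, curvature), so each day's cross-section of yields can be fitted by OLS. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

def ns_loadings(maturities, lam=0.5):
    """Nelson-Siegel factor loadings for level, slope and curvature."""
    x = lam * np.asarray(maturities, dtype=float)
    slope = (1.0 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(x), slope, slope - np.exp(-x)])

def fit_nelson_siegel(maturities, yields, lam=0.5):
    """OLS fit of (beta0, beta1, beta2) to one cross-section of yields."""
    X = ns_loadings(maturities, lam)
    beta, *_ = np.linalg.lstsq(X, np.asarray(yields, dtype=float), rcond=None)
    return beta

def ns_yield(maturities, beta, lam=0.5):
    return ns_loadings(maturities, lam) @ beta
```

In the dynamic version the fitted factor series is typically modelled by a VAR, which is where macroeconomic variables can be included explicitly; evaluating `ns_yield` beyond the fitted maturities is one simple way to extend the curve to the unusually long maturities mentioned above.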
The thesis primarily deals with modifications of exponential smoothing type methods for univariate time series with periodicity and/or certain types of irregularities. A modified Holt method for irregular time series, robust to the problem of time-close observations, is suggested. A general concept of seasonality modeling is introduced into the Holt-Winters method, including a linear interpolation of seasonal indices and the usage of trigonometric functions as special cases (both methods are applicable to irregular observations). The DLS estimation of a linear trend with seasonal dummies is investigated and compared with the additive Holt-Winters method. An autocorrelated term is introduced as an additional component in the time series decomposition. The suggested methods are compared with the classical ones using real data examples and/or simulation studies.
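The flavour of a Holt method adapted to irregular observation times can be sketched with Wright-style gap-dependent smoothing weights. The thesis's robust modification for time-close observations goes further than this; the recursion below is a generic illustration with assumed parameter values.

```python
import numpy as np

def holt_irregular(times, values, alpha=0.3, beta=0.1):
    """Holt smoothing for irregularly spaced data: the effective smoothing
    weights adapt to the gap q between observations, so long gaps discount
    the old level and trend more heavily (Wright-style recursion)."""
    t = np.asarray(times, dtype=float)
    y = np.asarray(values, dtype=float)
    level = y[0]
    trend = (y[1] - y[0]) / (t[1] - t[0])        # initial trend from first two points
    a, b = alpha, beta
    forecasts = [level]
    for i in range(1, len(y)):
        q = t[i] - t[i - 1]
        a = a / (a + (1.0 - alpha) ** q)          # gap-adjusted weights
        b = b / (b + (1.0 - beta) ** q)
        pred = level + q * trend                  # one-step-ahead forecast
        forecasts.append(pred)
        new_level = a * y[i] + (1.0 - a) * pred
        # note: the division by q blows up for time-close observations --
        # the very instability the suggested robust modification targets
        trend = b * (new_level - level) / q + (1.0 - b) * trend
        level = new_level
    return np.array(forecasts), level, trend
```

On a noiseless linear series the recursion reproduces the data exactly regardless of the spacing, which is a convenient sanity check for any implementation.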
The thesis concentrates on the property of linearity in time series models, its definitions and the possibilities of testing it. The presented tests focus mainly on the time domain; they are based on various statistical methods such as regression, neural networks and random fields. Their implementation in the R software is described. The advantages and disadvantages of tests which are implemented in more than one package are discussed. The second topic of the thesis is additivity in nonlinear models. The definition is introduced, as well as tests developed for detecting its presence. Several tests (of both linearity and additivity) have been implemented in R for the purposes of simulations. The last chapter deals with the application of the tests to real data.
This thesis discusses the phenomenon of currency crises; in particular, it is devoted to the empirical identification of crisis periods. As a crisis indicator, we utilize an exchange market pressure index, which has proved to be a very powerful tool for quantifying exchange market pressure. Since the construction of the exchange market pressure index is crucial for further analysis, we pay special attention to the different approaches to its construction. In the majority of the existing literature on exchange market pressure models, a currency crisis is defined as a period of time in which the exchange market pressure index exceeds a predetermined level. In contrast to this, we incorporate a probabilistic approach using extreme value theory. Our goal is to prove that stochastic methods are more accurate, in other words that they are more reliable instruments for crisis identification. We illustrate the application of the proposed method on a selected sample of four central European countries, namely the Czech Republic, Hungary, Poland and Slovakia, over the period 1993-2012, or 1993-2008 respectively. The choice of the sample is motivated by the fact that these countries underwent transition reforms to market economies at the beginning of the 1990s and therefore could have been exposed to speculative attacks on their newly established currencies. These countries are often assumed to be a relatively homogeneous group at a similar stage of the integration process. Thus, a resembling development of exchange market pressure, particularly during the last third of the estimation period, would not be surprising.
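One common construction of the index (Eichengreen-Rose-Wyplosz style precision weights, with the deterministic threshold rule that the thesis contrasts against its EVT-based alternative) can be sketched as follows; the component list and the multiplier k are illustrative assumptions.

```python
import numpy as np

def emp_index(fx_depreciation, reserve_change, rate_change):
    """Exchange market pressure as an inverse-volatility-weighted sum of
    currency depreciation, reserve losses and interest rate increases,
    so that no single volatile component dominates the index."""
    comps = [np.asarray(fx_depreciation, dtype=float),
             -np.asarray(reserve_change, dtype=float),   # reserve losses add pressure
             np.asarray(rate_change, dtype=float)]
    return sum(c / c.std() for c in comps)

def crisis_flags(emp, k=2.0):
    """Deterministic rule from the literature: a crisis is any period in
    which the index exceeds its mean by k standard deviations."""
    return emp > emp.mean() + k * emp.std()
```

The probabilistic approach advocated in the thesis replaces the fixed mean-plus-k-sigma cut-off with a tail-probability criterion derived from extreme value theory, which is where the claimed accuracy gain comes from.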
The bank regulation embodied in the Basel II Accord has opened up a new era in estimating recovery rates, or complementarily loss given default, in the retail lending credit evaluation process. In this paper we investigate the properties of survival analysis models applied to recovery rates in order to predict loss given default for retail lending. We compare the results to standard techniques such as linear and logistic regressions and discuss the pros and cons of the respective methods. The study is performed on a real dataset of a major Czech bank.
This thesis discusses a statistical modeling approach based on extreme value theory to describe the behaviour of large claims of an insurance portfolio. We focus on threshold models, which analyze exceedances of a high threshold. This approach has gained popularity in recent years, compared with the much older methods based directly on the extreme value distributions. The method is illustrated using the group medical claims database recorded over the years 1997, 1998 and 1999 and maintained by the Society of Actuaries. We aim to demonstrate that the proposed model outperforms classical parametric distributions and thus makes it possible to estimate high quantiles and the probable maximum loss more precisely.
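The threshold model in question fits a Generalized Pareto Distribution (GPD) to the exceedances over a high threshold. A minimal method-of-moments version is sketched below — such an analysis would typically use maximum likelihood instead, and all numbers here are illustrative.

```python
import numpy as np

def fit_gpd_moments(excesses):
    """Method-of-moments fit of the Generalized Pareto Distribution to the
    excesses over the threshold: shape xi and scale sigma from the sample
    mean and variance (valid for xi < 1/2)."""
    x = np.asarray(excesses, dtype=float)
    m, v = x.mean(), x.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (1.0 + m * m / v)
    return xi, sigma

def pot_quantile(p, xi, sigma, threshold, p_exceed):
    """High quantile of the full claim distribution implied by the
    peaks-over-threshold fit; p_exceed is the empirical probability
    of exceeding the threshold."""
    if abs(xi) < 1e-12:
        return threshold + sigma * np.log(p_exceed / (1.0 - p))
    return threshold + sigma / xi * ((p_exceed / (1.0 - p)) ** xi - 1.0)
```

Estimating, say, the 99.9% quantile this way uses only the tail data above the threshold, which is why peaks-over-threshold models tend to beat a single classical parametric distribution fitted to the whole claim sample.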
The credit value adjustment (CVA) of OTC derivatives is an important part of the Basel III credit risk capital requirements and of current accounting rules. Its calculation is not an easy task: not only is it necessary to model the future value of the derivative, but also the probability of default of the counterparty. Another complication arises when the calculation incorporates wrong-way risk, i.e. a negative dependence between the underlying asset and the default time. A semi-analytical CVA formula simplifying the interest rate swap (IRS) valuation with counterparty credit risk (CCR), including the wrong-way risk, is derived and analyzed in the paper. The formula is based on the fact that the CVA of an IRS can be expressed using the swaption price. The link between the interest rates and the default time is represented by a Gaussian copula with a constant correlation coefficient. Finally, the results of the semi-analytical approach are compared with the results of a complex simulation study.
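Setting the wrong-way-risk copula aside, the independence-case CVA that such a formula extends can be sketched in a few lines; the hazard rate, LGD and exposure profile below are illustrative inputs. For an IRS, the discounted expected exposure at each date corresponds to the price of a European swaption on the remaining swap, which is the link the semi-analytical formula exploits.

```python
import numpy as np

def cva_independent(expected_exposure, times, hazard, lgd=0.6, r=0.02):
    """Unilateral CVA without wrong-way risk: LGD times the sum over the time
    grid of discounted expected exposure weighted by the marginal default
    probability of each interval (flat hazard-rate default model)."""
    t = np.asarray(times, dtype=float)
    ee = np.asarray(expected_exposure, dtype=float)
    survival = np.exp(-hazard * t)
    marginal_pd = np.empty_like(t)
    marginal_pd[0] = 1.0 - survival[0]
    marginal_pd[1:] = survival[:-1] - survival[1:]
    discount = np.exp(-r * t)
    return lgd * np.sum(discount * ee * marginal_pd)
```

Wrong-way risk breaks the plain product inside the sum: with a Gaussian copula linking rates and the default time, the exposure relevant in each interval becomes the exposure conditional on default occurring there, which is what the paper's formula captures and the simulation study verifies.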