Workshop on

**Risk Measures & Risk Management: General Aspects**

**May 9-10-11, 2005**

**EURANDOM, Eindhoven, The Netherlands**

**ABSTRACTS**

**Alexandre Adam**

**Hedging Deposit Accounts: New Perspectives**

We provide a framework for the interest rate risk management of non-maturing deposit accounts. While both business and interest rate risk drive the earnings, only the second source of risk can be hedged in financial markets. We first consider investment strategies in the swap market that minimize the variance of quarterly results. An analytical expression of the strategy is provided. Banks also consider risk measures based on quantiles, such as VaR and expected shortfall. We therefore consider some alternative strategies that are likely to improve hedging performance with respect to these risk measures.
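
The variance-minimising idea can be sketched with a toy calculation; everything below is our own illustration (names, numbers and the one-factor setup are assumptions, not the authors' framework):

```python
import random

# Minimal sketch: if quarterly earnings E are exposed to rate moves and S is
# the P&L of a unit swap position, the static hedge notional minimising
# Var(E - h*S) is h* = Cov(E, S) / Var(S).
def min_variance_hedge(earnings, swap_pnl):
    n = len(earnings)
    me, ms = sum(earnings) / n, sum(swap_pnl) / n
    cov = sum((e - me) * (s - ms) for e, s in zip(earnings, swap_pnl)) / n
    var = sum((s - ms) ** 2 for s in swap_pnl) / n
    return cov / var

random.seed(0)
rate_moves = [random.gauss(0.0, 1.0) for _ in range(10_000)]
swap_pnl = rate_moves                                     # unit rate sensitivity
earnings = [2.0 * r + random.gauss(0.0, 0.5) for r in rate_moves]

h = min_variance_hedge(earnings, swap_pnl)  # recovers the earnings' rate exposure
```

The unhedgeable business risk appears here as the residual noise in `earnings`, which no choice of `h` can remove.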

**Hansjoerg Albrecher**

Joint work with J. Teugels

**Asymptotic Analysis of Measures of Variation**

The sample coefficient of variation and the sample dispersion are
two examples of widely used measures of variation.

We show that their applicability in practice heavily depends on the existence of
sufficiently many moments of the underlying distribution. In particular, a set
of results is derived that illustrates the behavior of these measures of
variation when such a moment condition is not satisfied. As a side product, this
leads to a new method for estimating the extreme value index of Pareto-type
tails.
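
The moment condition is easy to see numerically; the following is our own illustration (distributions and sample sizes are assumptions, not from the talk):

```python
import random

# The sample coefficient of variation CV_n = s_n / xbar_n settles down when
# the second moment exists (Exp(1) has true CV = 1), but for a Pareto tail
# with index alpha = 1.5 the variance is infinite and CV_n is unstable,
# drifting upwards with the sample size.
def sample_cv(xs):
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / (n - 1)
    return (s2 ** 0.5) / m

random.seed(1)
exp_cv = sample_cv([random.expovariate(1.0) for _ in range(50_000)])

alpha = 1.5                                    # finite mean, infinite variance
par_cv = sample_cv([random.random() ** (-1.0 / alpha) for _ in range(50_000)])
# exp_cv is near 1; par_cv is much larger and varies wildly across samples
```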

**Mark Davis**

**Current Topics in Credit Risk**

In recent years there has been a huge increase in the trading of financial risk and in redistribution of risk through structured financial products such as CDOs (collateralized debt obligations). This talk will concentrate on the area in which the most important current problems lie, namely in portfolio credit risk. The challenge here is to get good models of the joint risk in a large (50-100) portfolio of credit-risky assets. We start with the relationship between credit spreads and default probabilities and describe the currently popular approach of constructing joint default distributions by the use of copulas. We then move on to dynamic models and discuss various approaches aimed at capturing the interaction effects.

**Michel Denuit**

**Axiomatic aspects and applications of distortion risk measures in actuarial science**

Different sets of axioms leading to distortion risk measures are presented. Properties of these risk measures are recalled. In particular, sums of correlated or heterogeneous risks are considered. An application to the pricing of longevity bonds (in the context of the securitization of longevity risk) is also discussed.

**Robert Elliott**

Joint work with Carlton Osakwe

**Option Pricing for Pure Jump Processes with Markov Switching Compensators**

The paper proposes a model for asset prices which is the exponential of a pure jump process with an N-state Markov switching compensator. This model extends that of Madan and Konikov. Such a process has a good chance of capturing all empirical stylized features of stock price dynamics. A closed form representation of its characteristic function is given. Option pricing is formulated in Fourier transform space.

**Tom Fischer**

**Differentiability of Risk Measures: Applications, Problems, Remedies**

When a risk measure is differentiable with respect to the units of the measured portfolio, there are interesting applications for its gradient. For certain measures, the gradient has turned out to be the unique fair per-unit allocation principle. Another application is even more straightforward: gradient search can be used for risk and performance optimization. In this talk, we will have a closer look at such applications of differentiability. Theoretical problems arising within this context will be addressed. As remedies, we present allocation and optimization concepts which can be useful when dealing with such problems in a more practical context.

**Hans Föllmer**

**Convex risk measures and robust projections**

**Hans U. Gerber**

**Optimal Dividends in the Brownian Motion Model with Credit and Debit
Interest**

The income process of a company is modeled by a Brownian motion, and in addition, the surplus earns investment income at a constant rate of credit interest. Dividends are paid to the shareholders according to a barrier strategy. It is shown how the expected discounted value of the dividends and the optimal dividend barrier can be calculated; Kummer's confluent hypergeometric differential equation plays a key role in this context. An alternative assumption is that business can go on after ruin, as long as it is profitable. When the surplus is negative, a higher rate of debit interest is applied. Several numerical examples document the influence of the parameters on the optimal dividend strategy.

(This talk is based on a joint paper with Jun Cai and Hailiang Yang).

**Marc Goovaerts**

Joint work with Roger Laeven

**Decision principles derived from risk measures**

**Werner Hürlimann**

**Distortion risk measures and economic capital**

To provide incentive for active risk management, it is argued that a sound coherent distortion risk measure should preserve some higher degree stop-loss orders, at least the degree three convex order. Such risk measures are called tail-preserving risk measures. It is shown that under some common axioms and other plausible conditions, a tail-preserving coherent distortion risk measure identifies necessarily with the Wang right-tail measure or the expected value measure. This main result is applied to derive an optimal economic capital formula.
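
A distortion risk measure is easy to evaluate on an empirical distribution; the following sketch is our own illustration, using the concave distortion g(u) = sqrt(u) merely as an example of a tail-loading choice:

```python
# For a non-negative loss with survival function S, a distortion risk measure
# is rho_g(X) = integral of g(S(x)) over [0, inf).  On sorted losses
# x_(1) <= ... <= x_(n) this becomes a weighted sum of order statistics:
#   rho_g = sum_i x_(i) * ( g((n-i+1)/n) - g((n-i)/n) ).
def distortion_measure(losses, g):
    xs = sorted(losses)
    n = len(xs)
    return sum(x * (g((n - i) / n) - g((n - i - 1) / n))
               for i, x in enumerate(xs))

losses = [1.0, 2.0, 3.0, 10.0]
mean = distortion_measure(losses, lambda u: u)           # g(u)=u recovers the mean
loaded = distortion_measure(losses, lambda u: u ** 0.5)  # concave g loads the tail
# loaded >= mean, as any concave (coherent) distortion must give
```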

**Zinoviy Landsman**

**Elliptical families and tilting: premium, allocations, from family to
copulas**

Exponential tilting, as a bivariate version of the Esscher premium, is
regarded (Wang, 2002) as a convenient tool in risk measurement and portfolio
allocation. The main component of this measure is the variance-covariance
structure of the multivariate distribution, which makes it especially attractive
for a multivariate normal portfolio, since such a portfolio is uniquely determined by the
underlying variance-covariance structure.

The assumption about the normal distribution nowadays seems too restrictive. We
note, however, that if the distribution deviates from the normal, the allocation
methods based on exponential tilting fail to reflect this deviation even if
the distribution still preserves the same variance-covariance structure as
normal.

We suggest the elliptical tilting as a natural generalization of the exponential
tilting, which coincides with the latter if the underlying distribution is
normal, as a tool for deriving a portfolio decomposition formula for the
multivariate elliptical family. The traditional variance premium is also
generalized, and is now affected by the shape of the distribution.

The next step in weakening the restrictions of the model is in considering the
elliptical copulas with symmetric marginals. This generalization makes it
possible, for example, to drop the assumption of identically distributed
marginals, a main disadvantage of the elliptical family. The suggested premium
principle can be
modified for this case as well.

*Reference*

Wang, S. (2002) "A Set of New Methods and Tools for Enterprise Risk Capital
Management and Portfolio Optimization", 2002 CAS Summer Forum, Dynamic Financial
Analysis Discussion papers.

**Helmut Mausser**

**VaR Contributions in a Conditional-Independence Credit Framework**

An obligor's VaR contribution is given by a conditional expectation, namely its expected loss when the portfolio loss equals the VaR. In a conditional-independence framework, which captures correlated credit transitions by relating obligors to a common set of factors, VaR contributions are typically estimated using Monte Carlo simulation. We discuss some of the challenges associated with this task and describe how L-estimators and importance sampling can be applied to the problem.
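
A crude Monte Carlo sketch of the estimation problem (our own simplification; the L-estimators and importance sampling of the talk are more refined): since the event {portfolio loss = VaR} has probability near zero in simulation, obligor losses are averaged over scenarios whose portfolio loss ranks near the VaR scenario.

```python
import random

def var_contributions(scenarios, alpha, window=25):
    # scenarios: list of per-obligor loss vectors; average each obligor's loss
    # over the scenarios ranked within +/- window of the VaR rank.
    order = sorted(range(len(scenarios)), key=lambda k: sum(scenarios[k]))
    var_rank = int(alpha * len(scenarios))
    picked = order[var_rank - window: var_rank + window]
    m = len(scenarios[0])
    return [sum(scenarios[k][j] for k in picked) / len(picked) for j in range(m)]

random.seed(2)
scenarios = []
for _ in range(20_000):
    z = random.gauss(0.0, 1.0)       # common factor -> conditional independence
    scenarios.append([max(0.0, z + random.gauss(0.0, 0.3)),
                      max(0.0, z + random.gauss(0.0, 0.3))])

contrib = var_contributions(scenarios, alpha=0.99)
# the contributions add up (approximately) to the portfolio VaR itself
```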

**Thomas Mikosch**

**Long-range dependence effects and ARCH modelling**

**Wlodzimierz Ogryczak**

**From Stochastic Dominance to Convex and Coherent Risk Measures**

Two methods are frequently used for modeling the choice among uncertain outcomes: stochastic dominance and mean-risk approaches. The former is based on the axiomatic model of risk-averse preferences but does not provide a convenient computational recipe; it is, in fact, a multiple criteria model with a continuum of criteria. The mean-risk approaches are appealing to decision makers because they quantify risk in the lucid form of scalar risk measures. For typical dispersion statistics used as risk measures, however, the mean-risk approach may lead to inferior conclusions if the normality of the distribution is not guaranteed. Several risk measures can nevertheless be combined with the mean itself into complementary achievement criteria, thus generating SSD-consistent performance (safety) measures.

The goal of this paper is twofold. First, we review the SSD consistency results for risk measures. In particular, we show that the commonly used polyhedral risk measures can be derived from the basic SSD shortfall criteria, although for measures using quantile tail characteristics of the distribution one needs to exploit duality relations of convex analysis to develop the quantile model of stochastic dominance. We classify the models with respect to the use of deviational-type risk measures (to be minimized) or the complementary achievement-type safety measures (to be maximized), where the latter are in harmony with the SSD order. This allows us to identify some potentially useful new measures still deserving further research.

Second, we reexamine convexity properties of the measures. Again, we demonstrate that the SSD-based safety measures become coherent measures after a simple change of sign, while their complementary deviational measures are convex. We also discuss general conditions on deviational risk measures sufficient to provide the SSD consistency of the corresponding safety measures. We address open problems of risk measures consistent with the higher degree stochastic dominance orders.

**Soumik Pal**

**A Unified View of Hedging and Risk Management**

**Jostein Paulsen**

**Statistical estimation in insurance models**

For the past five years I have been involved in developing a pricing model for a major marine insurance company. Statistical methods based on historical data containing claims information, deductibles, sums insured and other covariates have been developed. We discuss the problem of finding good statistical models as well as picking the right (transformed) covariates. Theoretical and practical issues of such a process are considered, and possible extensions will also be discussed. Some of these present very interesting statistical problems. The consequences of the model choices on prices are illustrated, and various "measures" of whether the models are adequate (also for reinsurance purposes) are included. The models that are presented are actively used by the company both for direct insurance as well as for reinsurance.

**Georg Ch. Pflug**

**Acceptability, risk capital and risk deviation functionals: Primal and
dual properties for one- and multiperiod models**

We define acceptability, risk capital and risk deviation functionals and their interrelation. Convexity (concavity) properties allow us to define dual representations, which characterize the properties in an easy way. Moreover, we study the extension to multiperiod models. An important aspect in multiperiod models is the information process. We show how information decreases risk, i.e., we identify the shadow prices for information.

**Svetlozar Rachev**

**Momentum strategies using risk-adjusted stock ranking criteria**

Contemporary and previous studies on momentum strategies use a simple cumulative or total return criterion for stock ranking on monthly data. In our study we extend the momentum methodology by applying risk-adjusted criteria for momentum portfolio construction. Our alternative criteria take the form of risk-return ratios which conform to properties of coherent risk measures and, in a different form, capture the risk of the tail distribution. Alternative risk-adjusted stock ranking criteria are applicable when stock returns are not normally distributed, and they facilitate the use of daily data. Replacing the cumulative return by the risk-adjusted criterion, we also utilize the ratio as the objective function in the portfolio optimization problem and obtain optimal risky winner and loser portfolios. We find that risk-adjusted stock ranking criteria are able to generate more profitable momentum strategies than those based on the usual cumulative or total return criterion. Moreover, our results are robust to transaction costs for both equal-weighted and optimized-weighted strategies. In particular, our alternative ratios outperform the cumulative return and the Sharpe ratio across all strategies, measured by total realized return and independent performance measures over the observed period.
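
A sketch of one risk-return ratio of this general kind (our own illustration; the paper's exact criteria differ): mean return divided by the expected shortfall of the loss distribution.

```python
def expected_shortfall(returns, alpha=0.95):
    # average of the worst (1 - alpha) fraction of losses
    losses = sorted((-r for r in returns), reverse=True)
    k = max(1, int(round((1 - alpha) * len(losses))))
    return sum(losses[:k]) / k

def risk_adjusted_ranking(history, alpha=0.95):
    # history: {name: list of returns}; higher ratio = stronger "winner"
    score = {name: (sum(r) / len(r)) / expected_shortfall(r, alpha)
             for name, r in history.items()}
    return sorted(score, key=score.get, reverse=True)

history = {
    "steady": [0.01] * 60 + [-0.01] * 40,   # mean 0.002, mild loss tail
    "wild":   [0.03] * 60 + [-0.04] * 40,   # same mean 0.002, heavy loss tail
}
ranking = risk_adjusted_ranking(history)    # "steady" ranks above "wild"
```

With equal mean returns, the heavier loss tail of `"wild"` pushes it down the ranking, which a pure cumulative-return criterion would not detect.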

**Alexander Schied**

**Model uncertainty, risk measures, and optimal strategies**

Financial market models are typically not entirely precise in their
description of real world price processes. Instead of one optimal model, there
may be a variety of models that describe the different features of the market
more or less accurately. Thus, investors are facing model uncertainty. In this
talk, we discuss a particular class of investor preferences that take into
account aversion against both model uncertainty and risk in its classical form.
It turns out that these preference structures lead to certain utility
functionals based on risk measures. We then discuss the problem of constructing
optimal investment strategies for these utility functionals. It turns out that
this problem can formally be reduced to a classical utility maximization problem
for a representative model, but that in many cases this representative model may
admit arbitrage opportunities.

**Hanspeter Schmidli**

**Optimisation Problems in Non-Life Insurance**

During the last few years, control of the classical risk process has been
considered by several authors. We discuss two such control problems here: (1)
optimal dividend payments, and (2) minimisation of the ruin probability.

The method is the Hamilton-Jacobi-Bellman approach. In particular, we discuss
the first problem and the difficulties arising from the facts that the value
function is not necessarily differentiable and that the solution to the
Hamilton-Jacobi-Bellman equation is not unique.

**Dmitrii Silvestrov**

**Reinsurance Analyser**

An experimental program system for the analysis and comparison of reinsurance contracts is presented. The approach realised in this program is based on global stochastic modelling of various flows of claims with different types of claim and inter-claim time distributions. These flows are processed with the use of different types of reinsurance contracts. The parameters of the contracts are balanced by average-(re)insurer-payment type parameters. The contracts are then compared by additional risk and other characteristics.
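
A toy version of the balancing step (our own sketch; the program system described is far richer): fix a proportional quota so that the expected reinsurer payment equals that of a given excess-of-loss retention, then compare the cedent's retained risk under the two balanced contracts.

```python
import random

random.seed(4)
claims = [random.paretovariate(2.5) for _ in range(100_000)]  # heavy-ish tail

retention = 3.0
xl_ceded = [max(0.0, c - retention) for c in claims]          # excess of loss
quota = sum(xl_ceded) / sum(claims)                           # balance average payments
qs_ceded = [quota * c for c in claims]                        # quota share

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

retained_xl = [c - x for c, x in zip(claims, xl_ceded)]
retained_qs = [c - x for c, x in zip(claims, qs_ceded)]
# same average ceded amount, but excess of loss caps each retained claim at
# the retention, so the cedent's retained variance is smaller
```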

**Dirk Tasche**

**Basel II extended: The multi-factor version of the Basel II credit
portfolio model**

The risk weight functions on which the Basel II "Internal Ratings Based Approach" is based were developed by considering a special credit portfolio model, the so-called "Asymptotic Single Risk Factor Model'' (Gordy, 2003, "A Risk-Factor Model Foundation for Ratings-Based Bank Capital Rules"). This model is characterised by its computational simplicity and the property that the risk weights of single credits depend only upon the characteristics of these credits, but not upon the composition of the portfolio ("Portfolio Invariance"). As a consequence, the model can reflect neither exposure concentrations nor segmentation effects (say by industry branches).

The model's inability to detect exposure concentrations entails a potential underestimation of the risk inherent in the portfolio, whereas its failure to recognize the diversification effects following from segmentation could result in a potential overestimation of portfolio risk. The Basel Committee decided to deal in Pillar 2 of the Revised Framework (see http://www.bis.org/publ/bcbs107.htm ) with the potential underestimation of concentration risk. Hence there is no automatic extension of capital requirements for concentration risks; rather, banks will have to demonstrate to the supervisors that they have established appropriate procedures to keep concentrations under control. An alternative, quantitative way of tackling the exposure concentration issue was suggested in Emmer and Tasche (2005, "Calculating Credit Risk Capital Charges with the One-Factor Model").

In the present paper, we suggest a minimal -- in the spirit of Emmer and Tasche (2005) -- extension of the Basel II model that allows one to study the effects of segmentation on portfolio risk. Admitting several risk factors instead of a single factor only, and applying the same transition to the limit as described in Gordy (2003), we arrive at versions of the model that lift the no-segmentation restriction. Deriving the formulae for capital requirements in the asymptotic multi-risk-factor setting then represents the main contribution of the paper to the subject. From a computational point of view, the resulting formulae are more demanding than in the one-factor case, and -- necessarily, as otherwise diversification effects could not be recognised -- they are no longer portfolio invariant. Examples calculated with a two-factor model indicate that there can be a substantial reduction of capital requirements through the diversification effects generated by segmentation.
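
The single-factor core behind the risk weight functions has the well-known Vasicek/Gordy closed form; the parameter values below are illustrative, not the Basel II calibration:

```python
from statistics import NormalDist

# Conditional on the systematic factor sitting at its adverse q-quantile, a
# credit with unconditional PD p and asset correlation rho defaults with
# probability
#   Phi( (Phi^{-1}(p) + sqrt(rho) * Phi^{-1}(q)) / sqrt(1 - rho) ).
N = NormalDist()

def asrf_stressed_pd(p, rho, q=0.999):
    return N.cdf((N.inv_cdf(p) + (rho ** 0.5) * N.inv_cdf(q))
                 / (1.0 - rho) ** 0.5)

stressed = asrf_stressed_pd(p=0.01, rho=0.12)
# the stressed PD exceeds the unconditional 1% and increases with rho, and it
# depends only on the credit's own (p, rho) -- portfolio invariance
```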

**Hailiang Yang**

**Optimal Consumption Strategy in a Discrete-Time Model with Credit Risk**

This paper analyzes the consumption problem of a risk-averse investor in a discrete-time model. We assume that the return of the risky asset is credit-ranking sensitive and that the credit ranking is described by a Markov chain with an absorbing state representing default. We formulate the investor's decision as a problem in optimal stochastic control. A closed-form expression of the optimal consumption strategy is obtained. In addition, we investigate the impact of credit risk on the optimal strategy. We employ tools from the theory of stochastic orders to obtain properties of the optimal strategy.

*Last modified:
24-02-09
Maintained by L. Coolen*