
Working Group on RISK

Working Group on (quantitative) Risk Analysis / Uncertainty
- ESSEC Business School - 
with the support of the group BFA (Banque Finance Assurance) of the SFdS (Société Française de Statistique) 
& the support of the French Institute of Actuaries (IA)

Organizer: Prof. Marie Kratz

The main objective is to solve concrete problems affecting industry, or questions of public interest, in the field of (quantitative) risk analysis, by creating new tools and methods.

The relevant expertise is today spread across various domains (financial, industrial, academic, ...); the intention behind the creation of this group is therefore:

  • to organize a structured dynamic in the field of risk analysis
  • to facilitate exchanges between professionals and academics
  • to generate possible collaborations on identified problems
  • to create and to work on research projects

To this end, regular meetings will include:

  • presentations of articles or chapters of books
  • informal presentations/discussions
  • invited seminars

Each participant may propose a specific theme, work on a given article, or a project.

The presentation slides are generally available: see the attachments at the end of the page.

Our one-hour meetings will generally take place twice a month, on Fridays at 12:30 pm, at EEE - ESSEC La Défense.

If you are interested and want to be part of the mailing list, please contact Marie Kratz.

Note: since March 2012, the French Institute of Actuaries (IA) has supported the WG Risk and counts it as part of its PPC program (Perfectionnement Professionnel Continu). Participation in a meeting is worth 6 PPC points.

Calendar of the meetings in 2013-2014 (follow this link):

Calendar of the meetings in 2012-2013:

3rd Term:

Colloquium of the International Actuarial Association

June 24-26, 2013,  Lyon 

Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.


Romain Speisser (Actuary, ESSEC Business School)
June 13, 12:30 pm, La Défense EEE, room 138

Evaluation of the risk of a pandemic and construction of a partial internal model for health insurance under Solvency II

Except for world wars, pandemic risk surely constitutes the most threatening risk in terms of human lives, as epitomized by the 1918 Spanish influenza pandemic, which resulted in 40 to 50 million deaths in a few months' time. Since a severe pandemic would have large consequences on insurers around the world, Solvency II specifically targets this risk through two sub-modules of its standard formula, namely the SCR Life CAT and the SCR Health CAT Pandemic.

Mainly two types of models exist to assess the impact of a pandemic: actuarial models based on historical data, and epidemiological models which simulate the spread of the disease. In this presentation, we introduce one model of each type used in practice, and try to fix their inconsistencies in order to use them in two distinct partial internal models for the pandemic risk in life insurance and in health insurance.

Vincent Ruol (Actuary, Inspector of Social Affairs)
May 31, 12:30 pm, La Défense EEE, room 203

Historical perspective and prospects of regulation in insurance

Insurance is a particularly regulated business. The first attempts at such control were already set up at the end of the 19th century, and public intervention has regularly increased since. Its forms have been deeply reformed, especially during the last quarter of the 20th century. Above all, the regulator's role is set to evolve deeply with the introduction of the new European Solvency 2 directive and the creation of the French regulatory authority (ACP).

Christoph Hummel (Dr. Head of non-life & Modelling - Secquaero Advisors)
May 22, 12:30 pm, La Défense EEE, room 104

Multiple dependencies of risks: Models for actuarial practice

An adequate description of dependencies between risk factors is essential in the assessment of the risk capital with internal models. In this talk, a simple modelling tool kit providing insight into a variety of stochastic dependence structures is introduced. By means of unconventional examples, some approaches used in practice in the assessment of the capital requirements for Solvency II are challenged. The talk concludes with a discussion of practical applications of the methods presented.

Giovanni Puccetti (Prof. Univ. of Firenze, Italy)
April 22, 12:30 pm, La Défense EEE, room 220

The rearrangement algorithm: A new tool for computing bounds on risk measures 

We introduce a numerical algorithm which allows for the computation of sharp upper and lower bounds on the Value-at-Risk/Expected Shortfall of high-dimensional risk portfolios having fixed marginal distributions. These bounds can be interpreted as a measure of model uncertainty induced by possible dependence scenarios. 

This is a joint work with Paul Embrechts (ETH Zürich) and L. Rüschendorf (Freiburg Univ.).
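As a concrete illustration of the idea above, here is a minimal numerical sketch of the rearrangement step (not the authors' reference implementation; the Pareto marginals, level and discretisation size are illustrative choices): the upper tail of each marginal is discretised into a column, each column is repeatedly ordered oppositely to the sum of the others, and the minimum row sum approximates the sharp upper VaR bound.

```python
import numpy as np

def rearrangement_var_bound(quantiles, alpha=0.99, N=2000, iters=50):
    """Approximate the sharp upper bound on VaR_alpha(X1+...+Xd) over
    all dependence structures with the given marginal quantile
    functions (a sketch of the rearrangement algorithm)."""
    # discretise the upper tail [alpha, 1) of each marginal into N points
    probs = alpha + (1 - alpha) * (np.arange(N) + 0.5) / N
    X = np.column_stack([q(probs) for q in quantiles])
    rng = np.random.default_rng(0)
    for j in range(X.shape[1]):
        rng.shuffle(X[:, j])                   # random start
    for _ in range(iters):
        for j in range(X.shape[1]):
            others = X.sum(axis=1) - X[:, j]
            # oppositely order column j w.r.t. the sum of the others
            ranks = np.argsort(np.argsort(-others))
            X[:, j] = np.sort(X[:, j])[ranks]
    return X.sum(axis=1).min()

# three Pareto(2) risks, quantile function F^{-1}(p) = (1-p)^{-1/2} - 1
q_pareto = lambda p: (1 - p) ** (-0.5) - 1
bound = rearrangement_var_bound([q_pareto] * 3, alpha=0.99)
comonotonic = 3 * q_pareto(0.99)               # additive benchmark, = 27
print(bound, comonotonic)
```

The gap between the bound and the comonotonic value is one way to read the "model uncertainty induced by possible dependence scenarios" mentioned in the abstract: with the marginals fixed, the worst-case VaR exceeds even the fully comonotonic one.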

Conference AssurFinance2013

April 9, 2013,  Lyon 

Organized with the support of CREAR


2nd Term:
Walter Farkas (Prof., ETH Zürich, Switzerland)
March 22, 12:30pm, La Défense EEE, room 101 

Capital requirements with defaultable securities

The adequacy of the capitalization of a financial institution is typically defined in terms of acceptance sets of financial positions. Risk measures are used to determine the minimum amount of capital - the so-called capital requirement - that has to be raised and invested in a portfolio of a prespecified class of tradable assets to make a position acceptable. We allow for general acceptance sets and general positive eligible (or "reference") assets, which include defaultable bonds, options, or limited-liability assets. Since the payoff of these assets is not bounded away from zero, the resulting capital requirements cannot be transformed into cash-invariant risk measures by a simple change of numeraire. However, extending the range of eligible assets is important because, as exemplified by the recent financial crisis, the existence of default-free securities may not be a realistic assumption to make.

We study finiteness and continuity properties of capital requirements in this general context. We show how to reduce risk measures with respect to multiple eligible assets to risk measures with respect to a single eligible asset by properly enlarging the acceptance set. Risk measures with respect to multiple eligible assets are shown to be non-trivial when no acceptability arbitrage is possible, i.e. when not every position can be made acceptable by adding a zero-cost portfolio of eligible assets. We derive a theorem on the structure of closed convex acceptance sets based solely on the external characterization of general closed convex sets. A distinguishing feature of our approach is that convex risk measures are represented as the supremum of an objective function that depends exclusively on the acceptance set, where the supremum is taken over a set that varies with the choice of the class of eligible assets.
We apply our results to capital requirements based on Value-at-Risk and Tail-Value-at-Risk acceptability, the two most important acceptability criteria in practice.  
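The finiteness issue discussed in the abstract can be illustrated with a toy Monte Carlo computation (the distributions and parameters below are hypothetical choices for illustration, not the authors' setting): under VaR-based acceptability, the capital requirement with a default-free eligible asset can be found by bisection, while an eligible asset that defaults precisely in the bad states can make the requirement infinite.

```python
import numpy as np

rng = np.random.default_rng(42)
n, alpha = 200_000, 0.01          # VaR 99% acceptability criterion

# hypothetical terminal net value of the position to be made acceptable
X = rng.normal(-1.0, 5.0, n)

def capital_requirement(X, payoff, price, alpha, m_max=1e4):
    """Smallest capital m such that X + (m/price)*payoff is acceptable,
    i.e. P(X + (m/price)*payoff < 0) <= alpha; found by bisection on m.
    Returns inf when even m_max does not achieve acceptability."""
    def acceptable(m):
        return np.mean(X + (m / price) * payoff < 0) <= alpha
    if not acceptable(m_max):
        return np.inf
    lo, hi = 0.0, m_max
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if acceptable(mid) else (mid, hi)
    return hi

# default-free eligible asset: pays 1 in every state
riskfree = np.ones(n)
# defaultable eligible asset: pays 0 in (most of) the worst states
default = (X < np.quantile(X, 0.02)) & (rng.random(n) < 0.9)
defaultable = np.where(default, 0.0, 1.0)

cr_riskfree = capital_requirement(X, riskfree, price=0.95, alpha=alpha)
cr_defaultable = capital_requirement(X, defaultable, price=0.90, alpha=alpha)
print(cr_riskfree, cr_defaultable)
```

Here the defaultable bond fails with probability 0.9 in the worst 2% of states, so the probability of loss stays above 1% no matter how much of it is bought: no finite capital makes the position acceptable.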

Michael Schmutz (Dr., Univ. of Bern, Switzerland)

March 4, 12:30pm, La Défense EEE, room 220

Static and (symmetry based) semi-static replication strategies in actuarial science 

Recently there has been an increasing interest in static and semi-static replication strategies in actuarial science, e.g. in the context of Guaranteed Minimum Withdrawal Benefits, which play an important role in the North American actuarial industry. Static and semi-static replication strategies trace their roots to financial mathematics. We will discuss some basic aspects of static replication. Furthermore, the main mechanism of symmetry-based semi-static hedging strategies will be explained, and conditions under which they can be applied will be discussed.

Annamaria Olivieri (Prof., Univ. of Parma, Italy)

February 18, 12:30pm, La Défense EEE, room 138

Sharing the longevity risk between annuitants and annuity provider

The benefits provided by conventional life annuity and pension products are guaranteed in the face of adverse experience in respect of investment returns or mortality rates. Facing these risks by charging high premiums can make life annuities even less attractive than they are currently perceived to be by potential customers. Appropriate solutions should be suggested, aiming at providing retirees with effective alternatives to income drawdown (or "self-annuitization"). We focus on the risk of unanticipated mortality improvements, that is, the non-diversifiable aggregate longevity risk. Non-conventional life annuities can be defined, aiming at linking, to some extent, the annuity benefit to the mortality experienced in the group of annuitants, and/or in the market of life annuities (or pensions), and/or in the population. This link implies sharing the risks arising from experienced mortality between annuitants and annuity provider. Various approaches can be adopted in order to link annuity benefits to mortality experience, or to updated forecasts of future mortality trends. We propose a rather general model that aims at providing a unifying point of view from which several practicable schemes, sharing the common purpose of transferring part of the longevity risk to the annuitants, can be analyzed and compared. We consider the possibility of changing the annuity benefit by relating the benefit itself to the experienced mortality, or to updated mortality forecasts, or both. We suggest that the experienced mortality can be directly measured, or adjusted through an inference procedure. Further, we investigate the possibility of mitigating the reduction in the annuity benefit due to unanticipated mortality improvements through investment profit participation. This is joint work with Ermanno Pitacco, University of Trieste, Italy.

Naji Freiha (Actuary, Head of Risks & Capital Adequacy - DEXIA Group)
February 4, 12:30 pm, La Défense EEE, room 138

Integrated Risks & Capital Management 

The banking industry is facing severe economic and regulatory capital shortages, due to more stringent regulations for capital and various pressures by investors and rating-agencies. Hence, integrated risk measurements and capital management are becoming of key importance to risk executives and general managers.

Current practices in capital and integrated risk management rely on two metrics: regulatory and economic capital. Their simultaneous use generates complexity in daily risk decisions. It hinders commonly shared internal risk assessments and practical capital management, from board-level directions to their execution by the different risk committees.
The aim of the presentation is to describe a new framework (IRCM: Integrated Risks & Capital Management) and its implementation process, which allows one to:

1. Quantify jointly the impacts of risks on earnings, liquidity, funding and on both economic and regulatory capital, hence fully aligning the two metrics through integrated risk maps,

2. Bring better coherence and complementary comparisons between stress testing approaches and economic capital estimations built on quantitative models.

The IRCM enhances the readability of risks at executive committee and management board levels. It also creates risk metrics common to financial management, accounting and risk management. Quantitative risk management concepts and techniques are of paramount importance in this framework for scenario analysis, risk measures and eventually for decision support.

Marie Kratz (Prof., ESSEC Business School)
January 24, 12:30pm, La Défense EEE, room 203

There is a VaR beyond usual approximations

A normal approximation is often chosen in practice for the unknown distribution of the yearly log-returns of financial assets, justified by the CLT (Central Limit Theorem) when assuming aggregation of iid observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the 2008/2009 crisis to be an inadequate approximation when dealing with risk measures; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of risk measures when working on financial data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. It may also be useful for a better estimation of the capital when using Monte Carlo simulations, for which convergence may be an issue. We explore new approaches to handle this problem, numerically as well as theoretically, based on properties of upper order statistics. We compare them with existing methods, for instance one based on the Generalized Central Limit Theorem.
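The underestimation described above can be seen in a minimal numerical sketch (the Student-t stand-in and the quantile level are illustrative assumptions, not the study's data): a normal approximation with matched mean and variance sits far below the empirical high quantile of a heavy-tailed sample.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
# yearly log-returns drawn from a heavy-tailed Student-t with 3 d.o.f.
# (a stand-in for the heavy-tailed financial data discussed in the talk)
losses = rng.standard_t(df=3, size=1_000_000)

# normal approximation with matched mean and variance (the "CLT choice")
normal_var = NormalDist(losses.mean(), losses.std()).inv_cdf(0.999)
# empirical 99.9% quantile of the heavy-tailed sample
empirical_var = np.quantile(losses, 0.999)

print(normal_var, empirical_var)
```

For t(3), the true 99.9% quantile is roughly twice the matched-normal one, so a capital figure read off the normal approximation would miss about half of the risk at this level.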

1st Term:

Laila Elbahtouri (Actuary, SCOR, Paris)

December 17, 12:30pm, La Défense EEE, room 201

Explicit diversification benefit for dependent risks

Understanding the diversification benefit (DB) is essential to the business model of reinsurance companies. It allows for efficient risk management, capital optimization and competitive pricing. This is why it is important to model and compute the diversified Risk Adjusted Capital (RAC) of a portfolio from the marginal distributions of the risks and their dependence structure. The purpose of this paper is to provide explicit DB formulas and the capital allocation to the various stand-alone risks. We do this by means of mixing techniques (Oakes, 1989). Examples for the bivariate case will be given for various survival copulas (Clayton and Gumbel) and marginals (Pareto and Weibull), which allow the calculation and allocation of the DB.


The numerical convergence of this quantity is tested against the analytical result and shown to be good for marginals with relatively light tails (tail indices below or equal to 2), and slow for the risk measure (tail Value-at-Risk) but reasonable for the DB when the marginals are very fat-tailed (tail index close to 1). Further research will be outlined. Work done in collaboration with M. Dacorogna (SCOR) and M. Kratz (ESSEC).
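The mixing construction mentioned above (Oakes, 1989) can be sketched by Monte Carlo. The talk derives explicit analytical formulas; this toy version only simulates one bivariate case, generating the Clayton survival copula through a gamma frailty, attaching Pareto(2) marginals, and measuring the DB with VaR as the risk measure (all parameter choices are illustrative).

```python
import numpy as np

rng = np.random.default_rng(7)
n, theta, alpha = 1_000_000, 1.0, 0.99

# Clayton dependence via the mixing (gamma-frailty) construction:
# U_i = (1 + E_i/Lambda)^(-1/theta), Lambda ~ Gamma(1/theta), E_i ~ Exp(1)
lam = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)
E = rng.exponential(size=(n, 2))
U = (1.0 + E / lam[:, None]) ** (-1.0 / theta)

# attach Pareto(2) marginals through the survival function S(x) = (1+x)^-2,
# so small U (where Clayton clusters) corresponds to large joint losses
X = U ** (-1.0 / 2.0) - 1.0

var_sum = np.quantile(X.sum(axis=1), alpha)      # diversified RAC (VaR here)
var_standalone = 2 * (0.01 ** (-0.5) - 1)        # 2 * VaR_99% = 18
db = 1 - var_sum / var_standalone                # diversification benefit
print(var_sum, db)
```

One can check the marginals analytically: P(X_i > x) = E[exp(-((1+x)^2 - 1) Lambda)] = (1+x)^-2, i.e. exactly Pareto(2), which is what makes the mixing technique convenient for explicit formulas.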

Dirk Tasche (Dr., FSA, London)

November 30, 12:30pm, La Défense EEE, room 220 

Is Expected Shortfall a better risk measure than VaR?

The Basel Committee on Banking Supervision recently suggested that Expected Shortfall could replace Value-at-Risk to improve the measurement of risk in the trading book. We discuss the conceptual and practical pros and cons of this proposal, arguing that actually the combined use of the two risk measures might be the way forward.
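For readers unfamiliar with the two measures, a small empirical sketch (with illustrative loss distributions): Expected Shortfall at level alpha averages the losses beyond the VaR, so it is never smaller than VaR and reacts much more strongly to heavy tails, which is the crux of the trading-book debate.

```python
import numpy as np

def var_es(losses, alpha=0.99):
    """Empirical Value-at-Risk and Expected Shortfall at level alpha
    (losses counted as positive numbers)."""
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()        # average loss beyond VaR
    return var, es

rng = np.random.default_rng(0)
light = rng.exponential(1.0, 1_000_000)                 # light-tailed losses
heavy = (1 - rng.random(1_000_000)) ** (-1 / 2.5) - 1   # Pareto(2.5) losses

v_l, e_l = var_es(light)
v_h, e_h = var_es(heavy)
print(f"light: VaR={v_l:.2f} ES={e_l:.2f}   heavy: VaR={v_h:.2f} ES={e_h:.2f}")
```

The ES/VaR ratio is close to 1 for the exponential sample but markedly larger for the Pareto one, illustrating why ES is more sensitive to tail heaviness while VaR alone can look deceptively similar across the two.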


Franck Adekambi (Dr., School of Statistics & Actuarial Science, Univ. of Witwatersrand, Johannesburg, South Africa)
November 23, 12:30pm, La Défense EEE, room 103

The Asymptotic Ruin Problem In Health Care Insurance With Interest

Ramsay (1984), in a paper published in Insurance: Mathematics and Economics entitled «The asymptotic ruin problem when the healthy and sick periods form an alternating renewal process», found expressions for the probabilities of sickness and health, the first moment of the aggregate amount of premium received up to the end of the n-th healthy period and of the aggregate amount of benefit paid out up to the end of the n-th sickness period, n = 1, 2, 3, ..., and an approximation of the probability of ruin, but ignoring the effect of the force of interest.
This paper aims to consider the Ramsay model modified by the inclusion of an interest rate. Upper bounds for the ultimate ruin probability are derived by martingale and recursive techniques. Examples will be given where the lengths of the healthy and sickness periods follow particular distributions.


November, 19, 2012, La Défense EEE, Room 203

Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.

Scientific program and Registration Form

Rama Cont (Dr., CNRS & Imperial College, London)
October 26, 12:30 pm, EEE - ESSEC La Défense, room 220

Demystifying Black Swans : fire sales, price impact and endogenous risk

Prices, volatilities and correlation parameters often exhibit erratic behavior and extreme fluctuations during market crises. The traditional approach has been to either model these occurrences as "extreme" events or statistical outliers, or entirely dismiss them as 'black swans', impossible to model quantitatively. We argue that many such 'black swans' are in fact manifestations of endogenous market instabilities that arise as a result of feedback effects between price behavior and  the resulting supply/demand dynamics generated by market participants. We propose some simple models which allow quantitative modeling of such endogenous risks and present some applications to the Quant Crash of August 2007 and the  Great Deleveraging following the collapse of Lehman Brothers. 

Michel Dacorogna (Dr., Group Deputy CRO, SCOR)
October 12, 12:30pm, SCOR (5 avenue Kléber - 75116 Paris), auditorium Kléber (with the support of SCOR Group)

Adding Time Diversification to Risk Diversification: the Case for Equalization Reserves for Natural Catastrophes

The business of insurance is based on a simple concept of spreading the risks endured by an individual amongst as large a group of persons as possible. As long as he can evaluate the expected loss with a reasonable accuracy, the insurer is able to ask for a price that will cover this loss, plus a premium to pay the capital he needs to set aside. This capital is provided to ensure the payment of loss up to a certain probability. Over the years the insurance business has prospered, and has become a central actor in the economies of developed countries. In the early thirties, mathematicians such as Kolmogorov or Fisher were able to explain successful mechanics of the insurance industry through the law of large numbers.
In the last thirty years, practitioners and mathematicians have come to recognize that certain risks do not follow the law of large numbers. Natural catastrophes are one such risk category. By nature, such events are of low frequency but high severity. In practice this means that most underwriting years will end up with few losses. However, if a natural catastrophe does occur, the loss will probably be much higher than the expected value. In such cases the mathematical functions of expectation and variance do not always converge. The addition of one extra event will increase the value of both functions, even if their estimation was based on many years of previous event occurrences.
Catastrophe risk is a difficult issue for insurers. The insurer wants to benefit from the high return this business can generate but does not want to pay the price of its very high volatility. The solution to this problem is to invest in a program that includes a certain form of savings during the years where the losses are benign. This allows the insurer to face the obligations when a big loss occurs, mitigating the high volatility catastrophe insurance, or even the need to raise new capital. This savings program is called “time diversification or equalization reserves” to differentiate it from existing capital. The difference between equalization reserves and capital is twofold: there will be no return on this money above the risk free rate; and no new business will be written against it.
Equalization reserves have been banned by the new accounting rules and regulations on the basis that, if no loss occurs during the contract period, the money belongs to the shareholder who originally put his capital at risk for this transaction. This argument is perfectly valid if it is possible to reasonably estimate loss and average claims over a given period, i.e. for loss distributions showing low volatility. With high-severity, low-frequency losses, we have already seen that this is not the case. Those insurers who did not experience a major claim in such a line have simply been lucky. Effectively the investor is playing roulette. At some point he risks losing his capital to pay back a large claim, as the premium the insurer will get during that particular year will certainly not cover it. Moreover, he would have no chance of recouping some of his losses because the company would become insolvent once it has lost its capital, and would not be able to write any new business.
Using a model of the balance sheets of two firms subjected to the same risks [1], one that is allowed to keep equalization reserves and one that pays out all the extra gains as dividends, we analyze the value of both firms from the point of view of the shareholders. Both companies start with the same capital. We subject both companies to the same losses over a period of thirty years and analyze the cash flows resulting from their business in terms of the Sharpe ratio and the Merton model, which gives the value of the call option on the assets of the company. We explore two different distributions, lognormal and Fréchet, with various tails, but all with the same Value-at-Risk at 99%. First, we see that, within certain rules, it is possible to build up equalization reserves. Second, in all the cases, we show that the company that holds equalization reserves has a higher value for the investor. We present and discuss details of the model and the results.
[1] Work done in collaboration with H.J. Albrecher (EPFL), M. Moller (SCOR) and S. Sahiti (EPFL & SCOR)

Calendar of the meetings  in 2011-2012:

3rd Term:

Davide Canestraro (Dr., SCOR)
June 21, 12:30pm, La Défense EEE (Cnit), room 104

PrObEx and Internal Model - calibrating dependencies among risks in Non-Life

The latest financial crisis has dramatically shown that dependence among risks cannot be ignored. (Re)insurance companies may use copula models in order to prudently account for dependence (especially in the tail) within their internal models. Once a certain copula model has been chosen, actuaries are left with the issue of estimating the copula parameter(s). However, standard statistical estimation techniques generally fail to provide a reliable estimate if data is scarce. In order to reduce the parameter uncertainty, we developed a Bayesian model to calibrate copula parameters: PrObEx. We provide an introduction to such a key innovation in SCOR’s internal model, discussing both the mathematical aspects and the practical implementation for calibrating dependencies in Non-Life. Our method can be used also in other contexts (e.g. Economy, Life, etc.).
For this presentation on PrObEx, D. Canestraro was awarded the prize "for the best Scientific Presentation by a Young Actuary" at the First European Congress of Actuaries (Brussels, June 7-8, 2012).
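PrObEx itself is proprietary, but the general idea of Bayesian calibration of a copula parameter from scarce data can be sketched generically. The version below is entirely illustrative (grid posterior, a Clayton copula, a Gaussian prior standing in for expert judgement) and is not SCOR's model:

```python
import numpy as np

rng = np.random.default_rng(3)

def clayton_sample(theta, n):
    """Conditional-inversion sampling from a bivariate Clayton copula."""
    u, w = rng.random(n), rng.random(n)
    v = ((w ** (-theta / (1 + theta)) - 1) * u ** (-theta) + 1) ** (-1 / theta)
    return u, v

def clayton_loglik(theta, u, v):
    """Log-likelihood of the bivariate Clayton copula density."""
    s = u ** (-theta) + v ** (-theta) - 1
    return (np.log1p(theta) - (1 + theta) * np.log(u * v)
            - (2 + 1 / theta) * np.log(s)).sum()

# scarce data plus an informative prior: posterior evaluated on a grid
u, v = clayton_sample(theta=2.0, n=100)
grid = np.linspace(0.2, 6.0, 300)
log_prior = -0.5 * ((grid - 1.0) / 1.5) ** 2     # N(1, 1.5^2) "expert" prior
log_post = log_prior + np.array([clayton_loglik(t, u, v) for t in grid])
post = np.exp(log_post - log_post.max())
post /= post.sum()
theta_hat = (grid * post).sum()                  # posterior mean
print(theta_hat)
```

With only 100 observations, the posterior blends the prior with the data; as more data arrive, the likelihood dominates, which is the point of such Bayesian calibrations when standard estimation is unreliable.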

Philipp Arbenz (Dr., SCOR Zürich)

May 24, 12:30pm, La Défense EEE, room 138

High dimensional risk aggregation: a hierarchical approach with copulas.


Risk aggregation is highly relevant for solvency calculations, risk management and capital allocation. From a statistical and computational point of view, most risk aggregation methodologies are very challenging in high dimensions. A prudent approach requires the consideration of dependencies between all risks, which has made copulas popular. However, the common copula classes are restrictive and too symmetric. We present a hierarchical risk aggregation method which is flexible in high dimensions. With this method it suffices to specify low-dimensional copulas for each aggregation step in the hierarchy. Arbitrary copulas and margins can be combined. An efficient algorithm for numerical approximation is presented.
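One way to realise such a hierarchical aggregation numerically is sample reordering: at each node, the marginal samples are rearranged so that their ranks follow a low-dimensional copula sample. The sketch below is a simulation illustration under assumed ingredients (Gaussian copulas, lognormal margins, a two-level hierarchy), not the talk's approximation algorithm:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

def couple(x, y, u, v):
    """Reorder the marginal samples x and y so that their ranks follow
    the ranks of the copula sample (u, v), then return the aggregate."""
    x_re = np.empty_like(x)
    y_re = np.empty_like(y)
    x_re[np.argsort(u)] = np.sort(x)   # i-th smallest x where u is i-th smallest
    y_re[np.argsort(v)] = np.sort(y)
    return x_re + y_re

def gauss_pair(rho):
    """A correlated bivariate normal sample; only its ranks are used,
    so it plays the role of a Gaussian copula sample."""
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    return z1, z2

# three lognormal risks aggregated along a two-level hierarchy
x1, x2, x3 = (rng.lognormal(0.0, s, n) for s in (0.5, 0.8, 1.0))
s12 = couple(x1, x2, *gauss_pair(0.6))     # step 1: aggregate x1 and x2
total = couple(s12, x3, *gauss_pair(0.3))  # step 2: couple aggregate with x3
var_total = np.quantile(total, 0.995)
var_comonotonic = sum(np.quantile(x, 0.995) for x in (x1, x2, x3))
print(var_total, var_comonotonic)
```

Only a bivariate copula is specified at each node, which is what keeps the approach workable in high dimensions: a tree over d risks needs d - 1 low-dimensional copulas rather than one d-dimensional object.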

Sofiane Aboura (MCF, Univ. Paris Dauphine)

May 9, 12:30pm, La Défense EEE, room 138

Disentangling Crashes from Tail Events

The study of tail events has become a central preoccupation for academics, investors and policy makers, given the recent financial turmoil. However, the question of what differentiates a crash from a tail event remains open. This work elaborates a new definition of stock market crash from a risk management perspective, based on an augmented extreme value theory methodology. An empirical test on the French stock market (1968-2008) indicates that it experienced only two crashes in 2007-2008 among the 12 identified over the whole period.

2nd Term: 

Bertrand Maillet (Head of Research at ABN AMRO, and MCF, Univ. Orléans)
March 28, 12:30 pm, La Défense EEE, room 101

Risk Models-at-Risk

The recent experience from the global financial crisis has raised serious doubts about the accuracy of standard risk measures as a tool to quantify extreme downward risks. Risk measures are hence subject to "model risk" due, e.g., to specification and estimation uncertainty. Therefore, regulators have proposed that financial institutions assess this "model risk" but, as yet, there is no accepted approach for computing such a risk. We propose a general framework to compute risk measures robust to model risk, while focusing on the Value-at-Risk (VaR). The proposed procedure aims at empirically adjusting the imperfect quantile estimate, based on a backtesting framework that assesses the quality of VaR models through the frequency, independence and magnitude of violations. We also provide a fair comparison between the main risk models, using the same metric, which corresponds to the corrections required by model risk.

Wolfgang Dick (Prof., ESSEC Business School)

March 14, 12:30 pm, La Défense EEE, room 237

Recognition and measurement risks in financial statements. 

Financial statements are supposed to give a true and fair view of the financial position and economic performance of an entity. This requires recognizing and measuring all its assets and liabilities. Under IFRS, recognition and measurement of these elements depend on assumptions, estimations, etc., about what will happen in the future. Depending on management's decisions about these assumptions and estimations, the true and fair view disclosed in the financial statements may differ and impact analysts' judgments. This presentation will introduce the key concepts of financial statement presentation under IFRS and highlight some of the related recognition and measurement risks.

Olivier Lopez (MCF, ISUP-LSTA, UPMC Paris 6)

March 7 (instead of Feb. 29), 12:30 pm, La Défense EEE (Cnit), room 102

Multiple change-point detection in a Poisson process, with applications to non-life insurance

We consider an inhomogeneous Poisson process with unknown intensity. This intensity is approximated by a piecewise constant function. We provide new non-asymptotic results on estimating this intensity. The techniques that we consider are applied to a non-life insurance portfolio, in order to monitor the stability of the claim process.
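The underlying likelihood computation can be sketched for the simplest case of a single change point (the talk treats multiple change points with non-asymptotic guarantees; the rates and horizon below are illustrative): the split is chosen by maximising the profile Poisson log-likelihood over candidate split times.

```python
import numpy as np

def best_changepoint(times, T):
    """Single change-point MLE for a Poisson process observed on [0, T]
    with piecewise-constant intensity: maximise the profile
    log-likelihood over candidate split points."""
    times = np.sort(times)
    best, best_tau = -np.inf, None
    for tau in np.linspace(0.05 * T, 0.95 * T, 500):
        n1 = np.searchsorted(times, tau)          # events before the split
        n2 = len(times) - n1                      # events after the split
        ll = 0.0
        if n1:
            ll += n1 * np.log(n1 / tau) - n1      # profile: lam1 = n1/tau
        if n2:
            ll += n2 * np.log(n2 / (T - tau)) - n2
        if ll > best:
            best, best_tau = ll, tau
    return best_tau

# synthetic claim arrivals: intensity 2 on [0, 50), then 6 on [50, 100]
rng = np.random.default_rng(11)
t1 = np.cumsum(rng.exponential(1 / 2, 300)); t1 = t1[t1 < 50]
t2 = 50 + np.cumsum(rng.exponential(1 / 6, 600)); t2 = t2[t2 < 100]
tau_hat = best_changepoint(np.concatenate([t1, t2]), T=100)
print(tau_hat)
```

Plugging the profile MLEs lam1 = n1/tau and lam2 = n2/(T - tau) into the Poisson log-likelihood reduces each segment's contribution to n log(n/length) - n, which is what the loop evaluates.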


Corina Constantinescu (Prof., Univ. of Liverpool)
February 7, 12:30 pm, La Défense EEE, room 236

Risk processes with premium adjusted to solvency targets

The traditional point of view of ruin theory is reversed: rather than studying the probability of ruin as a function of the initial reserve under a fixed premium, we adjust the premium so as to obtain a given ruin probability (solvency requirement) for a fixed initial reserve (the financial capacity of the insurer).

Thierry Roncalli (Head of Research & Development - Lyxor Asset Management & Univ. Evry)
January 18, 12:30 pm, La Défense EEE, room 236

Portfolio Optimization versus Risk-Budgeting Allocation.

Portfolio allocation is generally based on optimization methods (minimum variance, Markowitz, Merton, Black-Litterman, etc.). The first part of this presentation shows that portfolio optimization faces several drawbacks in terms of concentration, stability and management. We will show that risk-budgeting techniques are an alternative method which appears more robust. In particular, the second part of the presentation will focus on one of the simplest risk-budgeting methods, where all risk budgets are the same. In this case, we obtain the ERC (Equal Risk Contribution) portfolio. After giving the mathematical properties of the ERC portfolio, we will present some applications to managing equity funds (like alternative-weighted indexes) and diversified funds (like risk parity funds). In the third part of the presentation, we will focus on risk-budgeting methods where the risk budgets are not the same. We will generalize some properties of the ERC portfolio and present an application to managing sovereign credit risk in bond portfolios.
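A minimal sketch of how ERC weights can be computed (a simple damped fixed-point iteration under an illustrative covariance matrix; production implementations typically rely on dedicated optimisation routines rather than this toy scheme):

```python
import numpy as np

def erc_weights(cov, iters=500):
    """Equal Risk Contribution portfolio via a damped fixed-point
    iteration: each weight is pushed towards the inverse of its marginal
    risk, so that all risk contributions equalise at the fixed point."""
    n = cov.shape[0]
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        marginal = cov @ w                 # marginal risk of each asset
        w = np.sqrt(w / marginal)          # damped (geometric-mean) update
        w /= w.sum()
    return w

def risk_contributions(w, cov):
    """Decomposition of portfolio volatility: contributions sum to it."""
    sigma = np.sqrt(w @ cov @ w)
    return w * (cov @ w) / sigma

vols = np.array([0.10, 0.20, 0.30])
corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.2],
                 [0.3, 0.2, 1.0]])
cov = np.outer(vols, vols) * corr
w = erc_weights(cov)
rc = risk_contributions(w, cov)
print(w, rc)
```

At the fixed point, w_i (cov @ w)_i is the same for every asset, which is exactly the ERC condition; lower-volatility assets end up with larger weights, as in risk parity funds.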


1st Term: 

Pierre Miehe (Deputy CEO, ACTUARIS International)

December 14, 12:30 pm, SCOR auditorium (with the support of SCOR Group), La Défense

Risk Management in Insurance: What technical solutions to answer the huge modeling requirements?  


The new European regulation for insurance, "Solvency II", encourages big and medium-size insurers to implement internal models for the calculation of their required capital.

All risks have to be modeled simultaneously. For the estimation of tails and extreme events, at least 10,000 simulations are recommended. For big companies this can theoretically imply runtimes of several weeks... This becomes even worse with the Simulations-in-Simulations issue.

The aim of this presentation is to review a range of solutions to deal with this huge number of simulations, through the analysis of adapted random generators, model optimizations, and potential simplified "artificial intelligence" implementations.

Jean-Gabriel Attali (Dr, Consultant, Formerly Strategy Analyst at Exane Derivatives)
November 30, 12:30 pm, La Défense EEE, room 101

Risk measurement and its limits in asset management                

Risk management is a key point in the area of asset management. The aim of this presentation is to address various aspects of risk as well as its measurement and control. In particular, we will focus on the limits of some risk measures, in the sense that risk is often underestimated when measured ex post, especially when the manager is very active or uses derivative products. We will show that managing ex-post risk is easy when it is measured by volatility (or VaR), even in times of crisis, simply by managing the exposure of the fund. Unfortunately, this technique does not ensure the safety of principal, which remains the main objective of the asset management industry.

Paul Deheuvels (Prof., UPMC Paris 6)

November 16 (Wednesday), 12:15 pm, La Défense EEE, room 104

On Difficulties of Risk Modelling and Portfolio Analysis

The statistical modelling of insurance claims is typically achieved through lognormal models, which often fit well to the central portion of a number of data sets. What is meant by the "central portion" of observations embraces the set of values truncated below and above by "appropriate" threshold levels. The most troublesome claim values are, as could be expected, large observations, but small values may generate other technical difficulties as well, which we will not consider here in detail. Large values typically fall in the domain of reinsurance, and are often quite well interpreted by Pareto-type models. The main difficulty here comes from the available "large" observations, which compose, most of the time, data sets so small as to render their statistical analysis troublesome. We have considered recently a number of portfolios, each composed of a few thousand observations, and observed, in each case, that only a small percentage of the data could be well interpreted by Pareto-type models. Such models become difficult to fit when the number of observations falling in this range is, for example, of the order of 5 to 20. Another class of technical problems to cope with is, first, to find the right statistical assessment methods to compare the respective risks of several portfolios with each other, and second, to assess the goodness-of-fit of a given portfolio with respect to a specified model. In particular, when some large observations fall into the domain of Pareto distributions, the finiteness of claim-value variances is often questionable. Practically all the data sets which we have considered lead to estimations of Pareto indices for large claims corresponding to finite expectations, yet infinite variances. This, in itself, suffices to support the idea that most "usual" statistical comparison methods (such as those using the Student test technology) are ineffective.
We shall illustrate these questions through the analysis of a real data set, and propose some new methods to bring solutions to the corresponding problems.
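As a hedged illustration of the regime described above (not from the talk itself), the following minimal sketch fits the tail index of simulated Pareto-type claims with the classical Hill estimator; the sample size, threshold level k and true index are all assumed values chosen for the example.

```python
import math
import random

# Illustrative only: Hill estimator for the tail index of Pareto-type claims.
# A tail index between 1 and 2 mimics the "finite expectation, infinite
# variance" regime mentioned in the abstract; all numbers are made up.

random.seed(42)
alpha_true = 1.5          # tail index: E[X] finite, Var[X] infinite
n, k = 100_000, 2_000     # sample size and number of upper order statistics

# Simulate Pareto(alpha) claims via inverse transform: X = U^(-1/alpha)
claims = [random.random() ** (-1.0 / alpha_true) for _ in range(n)]

# Hill estimator: average log-excess of the k largest observations
# over the (k+1)-th largest, then invert to estimate alpha
order = sorted(claims, reverse=True)
hill = sum(math.log(order[i] / order[k]) for i in range(k)) / k
alpha_hat = 1.0 / hill

print(f"estimated tail index: {alpha_hat:.2f}")  # should be close to 1.5
```

With only 5 to 20 genuinely “large” observations, as in the portfolios discussed above, k becomes so small that this estimator is extremely volatile, which is precisely the difficulty the abstract points to.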

Daniel Zajdenweber (Prof., Univ. Paris X Nanterre)

November 2, 12:30 pm, La Défense EEE, room 101

Extremal Events in a Bank's Operational Losses


Operational losses are true dangers for banks, since the maximal values signalling default are difficult to predict. This risky situation is unlike default risk, whose maximum values are limited by the amount of credit granted. For example, our data from a very large US bank show that this bank could suffer, on average, more than four major losses a year. This bank had seven losses exceeding hundreds of millions of dollars among its 52 documented losses of more than $1 million during the 1994-2004 period. The tail of the loss distribution (a Pareto distribution without expectation, whose characteristic exponent is 0.95 < 1) shows that this bank can fear extreme operational losses ranging from $1 billion to $11 billion, at probabilities situated respectively between 1% and 0.1%. The corresponding annual insurance premiums are evaluated to range between $350 M and close to $1 billion.

Keywords: Bank operational loss, value at risk, Pareto distribution, insurance premium, extremal event. 

Joint work with H. Dahen and G. Dionne (Journal of Operational Risk, 2010).
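The quoted loss range is consistent with a Pareto tail: for P(X > x) proportional to x^(-0.95), dividing the exceedance probability by 10 multiplies the quantile by 10^(1/0.95) ≈ 11.3, so roughly $1 billion at 1% scales to about $11 billion at 0.1%. A minimal arithmetic check (the dollar anchor is taken from the abstract; the rest is illustrative):

```python
# Sanity check of the Pareto quantile scaling quoted in the abstract.
alpha = 0.95                      # tail exponent from the abstract (< 1)
x_1pct = 1e9                      # ~$1 billion loss at 1% probability

# Pareto tail: P(X > x) = (x / x_m) ** (-alpha)
# => the quantile ratio for probabilities p1, p2 is (p1 / p2) ** (1 / alpha)
x_01pct = x_1pct * (0.01 / 0.001) ** (1 / alpha)

print(f"loss at 0.1%: ${x_01pct / 1e9:.1f} billion")  # ≈ $11.3 billion
```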

Jean-Philippe Bruneton (NaXys, Namur Univ., Belgium; formerly Risk Analyst at SCOR Switzerland, Financial modelling team)

October 20 (Thursday), 12:30 pm, La Défense EEE, room 101

Diversification benefit in Gaussian Aggregation Trees

Insurance risks are often aggregated together with the help of copulas: risks whose individual marginals are known are tied together via some function (the copula). In mathematical language, their joint distribution is defined via the copula. We focus our study on the aggregation of risks within so-called hierarchical trees: a tree of aggregation refers to applying such a process several times, linking some marginals together, then linking the resulting marginals together, and so on, until the total portfolio is modelled. We study in particular the "Gaussian tree", where both marginals and copulas are Gaussian. This enables exact analytical results for the random variable describing the total portfolio, and therefore the exact computation of the diversification benefit (the release in solvency capital due to the diversification of risks, taking into account their interdependencies). Such a toy model enables us to study the impact of the width and depth of the tree on the diversification benefit. We show that "tight trees" diversify better than "fat trees". We also present numerical results that support this conclusion outside the Gaussian world: lognormal trees aggregated via Gaussian or Clayton copulas show the same overall behavior.
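In the all-Gaussian case the aggregate is again Gaussian, so the diversification benefit is analytic. The sketch below (not the speaker's code; volatilities, correlation and confidence level are assumed for illustration) computes it for the simplest two-risk node of such a tree, as one minus the ratio of the aggregated VaR to the sum of standalone VaRs.

```python
from statistics import NormalDist

# Illustrative sketch: diversification benefit for two zero-mean Gaussian
# risks tied by a Gaussian copula. All parameter values are assumed.
sigma1, sigma2, rho, p = 10.0, 15.0, 0.3, 0.995

# Standalone capital: sum of individual VaRs at level p
var1 = NormalDist(0, sigma1).inv_cdf(p)
var2 = NormalDist(0, sigma2).inv_cdf(p)

# Aggregated capital: the sum is Gaussian with the variance below
sigma_sum = (sigma1**2 + 2 * rho * sigma1 * sigma2 + sigma2**2) ** 0.5
var_sum = NormalDist(0, sigma_sum).inv_cdf(p)

# Diversification benefit: relative release in solvency capital
benefit = 1 - var_sum / (var1 + var2)
print(f"diversification benefit: {benefit:.1%}")  # prints "18.5%"
```

Iterating this node-by-node computation down a tree is what allows the exact study of width versus depth in the Gaussian toy model.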

Calendar of the meetings  in 2010-2011:

3rd Term: 

Blaise Bourgeois (Head of Life Product Risks, AXA, Group Risk Management, Paris)
June 16, 12:30 pm, La Défense EEE, room 138

Risk Neutral Valuation in Insurance         

Insurance markets have traditionally been presented as incomplete markets, using statistical approaches based on real-world measures to price and evaluate risks, rather than market-consistent valuation methods (the risk-neutral framework). Whilst this may continue to hold to some extent for Property & Casualty business, the Life & Savings sector has progressively adapted its valuation and risk-measurement methodologies and tools to a world more akin to modern financial markets than to traditional actuarial reserving methods. This paper presents the building blocks of the risk-neutral valuation methods and tools now commonly applied by most major L&S insurance companies in Europe. From 2013 onwards, this framework will form the backbone of the new Solvency II regulation.
Key words: Market-Consistent Embedded Value, Replicating Portfolio, Economic Capital, Dynamic Hedging, Completing Insurance Markets

Invitation to the Conference on Mathematical Modeling of Systemic Risk
June 7, 2011 (IHP, Paris)

Nizar Touzi (Prof., Ecole Polytechnique, Palaiseau)

May 19, 12:30 pm, La Défense EEE, room 201

Model independent bounds under calibration constraints: a stochastic control approach

We develop a stochastic control approach for the derivation of model-independent bounds for derivatives under various calibration constraints. Unlike the previous literature, our formulation seeks the optimal no-arbitrage bounds given the knowledge of the distribution at some (or several) points in time. By convex duality techniques, this problem is converted into an optimal transportation problem along controlled stochastic dynamics. We also provide precise connections with the Azéma-Yor solution of the Skorohod embedding problem, and we obtain some extensions.

Susanne Emmer (Dr., UBS, Zürich) 

May 5, 12:30 pm, La Défense EEE, room 101

Credit Stress Loss

After a short overview of currently used credit risk measures and the stress-testing landscape, we give an introduction to different stress-testing methodologies applied to the wealth management and business banking portfolios.

Giles Brennand (Prof., The Chinese University of Hong Kong, and Independent Consultant)
April 21, 12:30 pm, La Défense EEE, room 101

Towards Modern Risk Management

Risk management specialists, practitioners and academics, have paid much attention to estimation of the likelihood of untoward events. Consideration of the fundamental nature of risk as an essentially human construct identifies many other aspects of risk management that deserve at least as much attention as the likelihood of untoward events.

Jean-Paul Renne (Banque de France, Paris)
April 7, 12:30 pm, La Défense EEE, room 236 

Default, Liquidity and Crises: an econometric framework

In this paper, we present a general discrete-time affine framework aimed at jointly modeling yield curves associated with different debtors. The underlying fixed-income securities may differ in terms of credit quality and/or in terms of liquidity. The risk factors follow conditionally Gaussian processes, with drifts and variance-covariance matrices that are subject to regime shifts described by a Markov chain with (historical) non-homogeneous transition probabilities. While flexible, the model remains tractable. In particular, bond prices are given by quasi-explicit formulas. Various numerical examples are proposed, including a sector-contagion model and credit-rating modeling.

2nd Term: (there will be only a few meetings, since I will be working abroad; we will meet again on a regular basis in the 3rd term).

Hansjoerg Albrecher (Prof., HEC Lausanne, Switzerland)
March 24, 12:30 pm, SCOR (La Défense), Auditorium (with the support of the SCOR Group)

On refracted stochastic processes and the analysis of insurance risk                                                                   

We show a somewhat surprising identity for first-passage probabilities of spectrally negative Lévy processes that are refracted at their running maximum, and discuss extensions of this identity and its applications in the study of insurance risk processes in the presence of tax payments. In addition, we discuss a statistic related to the sample coefficient of variation, which leads to an alternative simple method for estimating the extreme value index of Pareto-type tails from corresponding iid claim data with infinite variance.

Laila Elbahtouri (Actuary, SCOR)
Feb. 3, 12pm-1pm, La Défense EEE, room 104

Discussion on the paper by Albrecher et al. "Explicit ruin formulas with dependence among risks", Insurance: Mathematics and Economics 48 (2011) 265-270

Doug Andrews (Senior Lecturer at Univ. of Southampton & Actuary (Fellow of the IA & Canadian IA, SOA))
1pm-2pm, La Défense EEE, room 104

Risk Management considerations for the Canada Pension Plan: a case study

What are the risks that the Canada Pension Plan faces and how are they managed? This presentation begins with a review of the governance structure, the actuarial valuation method and the automatic balancing mechanism. It then examines particular assumptions, such as fertility, mortality, migration, productivity and investment. It explains the risks, discusses the methods used to derive the assumptions, and outlines some of the sensitivity tests used to quantify the risk.

Olivier Wintenberger (MCF, Univ. Paris Dauphine)                                                                                          

Jan. 19, 11:15 am, Cergy ESSEC campus, room N305 

Limit laws for sums of weakly dependent data with infinite variances  


Many econometric models take into account two stylized facts: the dependence structure of the time series (absence of correlations, dependence of the squares) and the high volatility (power-law tails). If the dependence is not too strong, the sum is asymptotically distributed not as a normal law but as another type of stable law. Thus the dependence and the power-law marginals drive the limit law in a very intricate way. In work with K. Bartkiewicz, A. Jakubowski and T. Mikosch, we succeeded in giving a non-standard central limit theorem in which we clearly determine, in the limiting stable law, what is due to the dependence and what is due to the marginal law. We apply our result to the sum of the stationary solution of the GARCH(1,1) model.

Jan. 12, 12:30pm, La Défense EEE, room 202: Discussion on projects by groups

1st Term: 

Véronique Maume-Deschamps
 (Prof., ISFA Lyon)

Dec. 16,  12:30 pm, La Défense EEE, room 236

Some multivariate risk indicators; estimation and application to reserve allocation

We consider some risk indicators of vectorial risk processes. These indicators are expected sums of penalties that each line of business would have to pay due to its temporary potential insolvency. The dependency between lines of business is taken into account. By using stochastic algorithms, we may estimate the minimum of these risk indicators under a fixed total capital constraint. This minimization may apply to reserve allocation.

Désiré Binam (Dr., Front-Office Consultant)

Dec. 2,  11:15 am, La Défense EEE room 237

How the trading technology changes the market microstructure

The evolution of trading technologies has deeply modified the microstructure of today’s financial markets. We will see how the technology is changing the trading process and strategy and how players deal with new concepts such as Direct Market Access (DMA), Smart Order Routing (SOR), High Frequency Trading (HFT) and Algorithmic trading.


Philippe Soulier (Prof., Univ. Paris Nanterre)

Nov. 25, 12:30pm, La Défense EEE, room 334

Prediction for time series after catastrophic events

We consider the problem of prediction in a time series when the conditioning event is extreme. The quantities of interest are the limiting conditional distribution of future events given that the past was extreme, and the normalizing functions needed to obtain non-degenerate limit laws. I will consider two classes of processes: GARCH-type processes and stochastic volatility processes. The main difference between them is the presence or absence of clustering of extremes.

Nov. 4, 12:30pm, La Défense EEE, room 237

- Discussion on some projects.

- Invitation to the Finance Department Seminar on  Nov. 8,  4:30 pm - ESSEC Campus Cergy Pontoise (room N 305) by Heitor Almeida (Univ. Illinois), Aggregate Risk and the Choice between Cash and Lines of Credit.


Cristina Butucea (Prof., Univ. Marne La Vallée)

Oct. 14, 12:30pm, La Défense, room 236

Various ways to summarize a curve with a number: estimation and applications 

Curve estimation is very frequently studied in various areas. We shall discuss the pros and cons of parametric versus nonparametric estimation of curves. A good compromise in many examples is to recover less information about the underlying function to be estimated (summarize it) using the nonparametric techniques. We shall discuss several ways to do so and focus on excess mass estimation.

Calendar of the meetings  in 2009-2010:

2nd Term: 

M. Bardos & M. Kratz, for the BFA (Banque Finance Assurance) group, SFdS
April 9, 9am - 5pm, IHP (Institut Henri Poincaré, 75005 Paris)

Workshop on "Financial Regulation"
(download the program; see the attachments).

Laurent Ferrara (Banque de France & Univ. Paris Ouest)
April 1, 6pm, La Défense EEE, room 236

Assessing the recession risk anticipated by financial markets

Conference IDEI/SCOR, "Integration of Extremal Events in Quantitative Risk Management"
March 19, 2010

Marie Kratz (Prof., Essec Business School)
March 4, 12:30pm, La Défense EEE, room 138

On alarm systems. Applications for insurance companies and for the Health Surveillance Institute

Guillaume Chevillon (Prof., Essec Business School)
Feb. 4, 12:30pm, La Défense EEE, room 104

Informal presentation 

Sir David F. Hendry (Nuffield College, Oxford)
Jan. 28, 6pm, La Défense EEE, room 201

Empirical Model Discovery

The talk summarizes a great deal of recent research, and explains how it facilitates the discovery of empirical models, greatly reducing the risks from model mis-specification and data contamination. Model evaluation can be seen as discovering what is wrong; robust statistics as discovering which sub-sample is reliable; non-parametric methods as discovering the functional form; and model selection as discovering which model best matches the given criteria. However, the high dimensionality, non-linearity, inertia, endogeneity, evolution, and abrupt change characteristic of economic data, which interact to make empirical modelling difficult and pose substantive risks of ending with an incorrect representation, make it essential to tackle all of these jointly. Automatic methods enable formulation, selection, estimation, and evaluation on a scale well beyond the powers of humans alone, including when there are more candidate variables than observations, while allowing theory models to be embedded in the discovery process. Live computer illustrations using Autometrics show the remarkable power and feasibility of this exciting approach.

Michel M. Dacorogna (Dr., Head of SCOR Group Financial Modeling)
Jan. 21, 12:30pm, La Défense EEE, room 104

There will be a next crisis surprising us someday – how can we be prepared to survive it?

1st Term:

Arthur Charpentier (Prof., Univ. Rennes 1 & Ecole Polytechnique)
Dec. 17, 12pm, La Défense EEE, amphi 138

Extremes and Dependence in the context of Solvency II for Insurance Companies

Conference RISK (Risk Intelligence Symposium & Knowledge) 
Dec. 3, 2009 
organized by OTC Conseil & Univ. Paris Ouest; Caisse des dépôts et consignations – Paris.

Fernando Oliveira (Prof., Essec Business School)
Nov. 19, 10am, N305 (campus Cergy)

Informal Presentation

Fabrice Cavarretta (Prof., Essec Business School)
Nov. 3, 2009, 11:30am, Le Club (campus Cergy)
Distinguishing Extreme vs. Average Effects in Nascent Firms: an Organizational Risk approach to resources

Oct. 29, 2009, 10am, N305 (campus Cergy)

Discussion on the objectives of this Working Group
