- ESSEC Business School - with the support of the group BFA (Banque Finance Assurance) of the SFdS (Société Française de Statistique) & the support of the French Institute of Actuaries (IA)
Organizer: Prof. Marie Kratz
The many skills relevant to risk analysis are today distributed across various domains (financial, industrial, academic, ...); the intention behind the creation of this group is therefore:
- to organize a structured dynamics in the field of risk analysis
- to facilitate exchanges between professionals and academics
- to generate possible collaborations on identified problems
- to create and to work on research projects
To this end, the meetings include:
- presentations of articles or book chapters
- informal presentations/discussions
- invited seminars
Each participant may propose a specific theme, work on a given article, or a project. The presentation slides are generally available: see the attachments at the end of the page. Our one-hour meetings generally take place twice a month, on Fridays, at 12:30 pm, at EEE - ESSEC La Défense.
If you are interested and would like to join the mailing list, please contact Marie Kratz. Note: since March 2012, the French Institute of Actuaries (IA) has supported the WG Risk and considers it part of its PPC program (Perfectionnement Professionnel Continu). Participation in a session counts for 6 PPC points.
http://crear.essec.edu/working-group-on-risk/meeting-schedule-2013-2014
Calendar of the meetings in 2012-2013:
3rd Term:
Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.
Romain Speisser (Actuary, ESSEC Business School)
June 13, 12:30 pm, La Défense EEE, room 138
Evaluation of the risk of a pandemic and construction of a partial internal model for health insurance under Solvency II
Vincent Ruol (Actuary, Inspector of Social Affairs)
May 31, 12:30 pm, La Défense EEE, room 203
Historical perspective and prospects of regulation in insurance
Insurance is a particularly regulated business. The first attempts at such control were already set up at the end of the 19th century, and public intervention has since regularly increased. Its forms have been deeply reformed, especially during the last quarter of the century. Above all, the regulator's role is expected to evolve deeply with the introduction of the new European directive Solvency II and the creation of the French regulatory authority (ACP).
Christoph Hummel (Dr., Head of Non-Life & Modelling - Secquaero Advisors)
May 22, 12:30 pm, La Défense EEE, room 104
Multiple dependencies of risks: models for actuarial practice
Giovanni Puccetti (Prof., Univ. of Firenze, Italy)
April 22, 12:30 pm, La Défense EEE, room 220
The rearrangement algorithm: a new tool for computing bounds on risk measures
organized with the support of CREAR
2nd Term:
Walter Farkas (Prof., ETH Zürich, Switzerland)
March 22, 12:30 pm, La Défense EEE, room 101
Capital requirements with defaultable securities
Static and (symmetry-based) semi-static replication strategies in actuarial science
Recently there has been increasing interest in static and semi-static replication strategies in actuarial science, e.g. in the context of Guaranteed Minimum Withdrawal Benefits, which play an important role in the North American actuarial industry. Static and semi-static replication strategies trace their roots to financial mathematics. We will discuss some basic aspects of static replication. Furthermore, the main mechanism of symmetry-based semi-static hedging strategies will be explained, and conditions under which they can be applied will be discussed.
Sharing the longevity risk between annuitants and annuity provider
The benefits provided by conventional life annuity and pension products are guaranteed in the face of adverse experience in respect of investment returns or mortality.
Naji Freiha (Actuary, Head of Risks & Capital Adequacy - DEXIA Group)
February 4, 12:30 pm, La Défense EEE, room 138
Integrated Risks & Capital Management
Marie Kratz (Prof., ESSEC Business School)
January 24, 12:30 pm, La Défense EEE, room 203
There is a VaR beyond usual approximations
A normal approximation is often chosen in practice for the unknown distribution of the yearly log returns of financial assets, justified by the CLT (Central Limit Theorem) under an assumption of aggregation of iid observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the 2008/2009 crisis to be an inadequate approximation when dealing with risk measures; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of risk measures for financial data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. It may also be useful for a better estimation of the capital when using Monte Carlo simulations, for which convergence may be an issue. We explore new approaches to this problem, numerically as well as theoretically, based on properties of upper order statistics, and compare them with existing methods, for instance one based on the Generalized Central Limit Theorem.
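As a small numerical illustration of the underestimation discussed above (a toy setting with assumed parameters, not the estimators developed in the talk), one can compare a normal approximation of a high quantile with the empirical quantile and a classical extreme-value (Weissman-type) extrapolation based on upper order statistics, on simulated heavy-tailed losses:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Simulated heavy-tailed losses: Pareto with tail index alpha = 2.5
alpha = 2.5
losses = rng.pareto(alpha, size=100_000) + 1.0   # support [1, inf)

p = 0.995  # level of the quantile (VaR)

# 1) Normal approximation: fit mean/std, read off the Gaussian quantile
var_normal = NormalDist(float(losses.mean()), float(losses.std())).inv_cdf(p)

# 2) Empirical quantile
var_emp = float(np.quantile(losses, p))

# 3) Weissman extrapolation from the k upper order statistics,
#    with the Hill estimator of the tail index 1/alpha
x = np.sort(losses)
n, k = len(x), 1000
hill = float(np.mean(np.log(x[n - k:]) - np.log(x[n - k - 1])))  # ~ 1/alpha
var_weissman = float(x[n - k - 1] * (k / (n * (1 - p))) ** hill)

# The true quantile is (1 - p)**(-1/alpha) ~ 8.3; the normal fit
# underestimates it, while the tail-based estimate stays close.
print(var_normal, var_emp, var_weissman)
```

The tail-based estimate extrapolates beyond the bulk of the data using only the largest observations, which is exactly where the Gaussian fit is blind.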
Laila Elbahtouri (Actuary, SCOR, Paris)
Understanding the diversification benefit (DB) is essential to the business model of reinsurance companies. It allows for efficient risk management, capital optimization and competitive pricing. This is why it is important to model and compute the diversified Risk Adjusted Capital (RAC) of a portfolio from the marginal distributions of the risks and their dependence structure. The purpose of this paper is to provide explicit DB formulas and the capital allocation to the various stand-alone risks. We do this by means of mixing techniques (Oakes, 1989). Examples for the bivariate case will be given according to survival copulas (Clayton and Gumbel) and marginals (Pareto and Weibull) which allow the calculation and the allocation of DB. The numerical convergence of this quantity is tested against the analytical result and shown to be good for marginals with relatively light tails (tail indices below or equal to 2), and slow for the risk measure (tail Value-at-Risk) but reasonable for the DB when the marginals are very fat-tailed (tail index close to 1). Further research will be outlined. Work done in collaboration with M. Dacorogna (SCOR) and M. Kratz (ESSEC).
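The mixing construction referred to above can be sketched with a short Monte Carlo illustration (made-up parameters, plain VaR instead of the paper's risk-adjusted capital, and no claim to reproduce its explicit formulas): a survival Clayton copula is sampled through its gamma frailty representation, combined with Pareto marginals, and the diversification benefit is read off the resulting quantiles:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
theta = 2.0   # Clayton dependence parameter (illustrative value)
alpha = 2.5   # Pareto tail index of each marginal (illustrative value)

# Clayton copula via gamma mixing (frailty construction, Oakes 1989):
# S ~ Gamma(1/theta), E_i ~ Exp(1), U_i = (1 + E_i/S)**(-1/theta)
S = rng.gamma(1.0 / theta, size=n)
E = rng.exponential(size=(2, n))
U = (1.0 + E / S) ** (-1.0 / theta)

# Survival-copula convention: plug U into the Pareto survival function
# P(X > x) = x**(-alpha), x >= 1  =>  X = U**(-1/alpha)
X = U ** (-1.0 / alpha)

# Diversification benefit at level p, measured here with VaR
p = 0.99
var_sum = float(np.quantile(X[0] + X[1], p))
var_standalone = float(np.quantile(X[0], p) + np.quantile(X[1], p))
db = 1.0 - var_sum / var_standalone
print(f"diversification benefit at {p:.0%}: {db:.3f}")
```

With strong upper-tail dependence the benefit shrinks towards zero; lowering theta in the sketch increases it.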
Ramsay (1984), in a paper published in Insurance: Mathematics and Economics entitled «The asymptotic ruin problem when the healthy and sick periods form an alternating renewal process», found expressions for the probabilities of sickness and health, the first moment of the aggregate amount of premium received up to the end of the n-th healthy period and of the aggregate amount of benefit paid out up to the end of the n-th sickness period, n = 1, 2, 3, ..., and an approximation of the probability of ruin, but ignoring the effect of the force of interest. This paper aims to consider the Ramsay model modified by the inclusion of an interest rate. Upper bounds for the ultimate ruin probability are derived by martingale and recursive techniques. Examples will be given where the lengths of the periods of sickness and health follow particular distributions.
Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.
Rama Cont (Dr., CNRS & Imperial College, London)
Demystifying Black Swans: fire sales, price impact and endogenous risk
Prices, volatilities and correlation parameters often exhibit erratic behavior and extreme fluctuations during market crises. The traditional approach has been to either model these occurrences as "extreme" events or statistical outliers, or entirely dismiss them as 'black swans', impossible to model quantitatively. We argue that many such 'black swans' are in fact manifestations of endogenous market instabilities that arise as a result of feedback effects between price behavior and the resulting supply/demand dynamics generated by market participants. We propose some simple models which allow quantitative modeling of such endogenous risks and present some applications to the Quant Crash of August 2007 and the Great Deleveraging following the collapse of Lehman Brothers.
References:
R Cont, L Wagalath (2011)
Running for the exit: short selling and endogenous correlation in financial markets, to appear in Mathematical Finance.
R Cont, L Wagalath (2012)
The business of insurance is based on a simple concept: spreading the risks endured by an individual amongst as large a group of persons as possible. As long as he can evaluate the expected loss with reasonable accuracy, the insurer is able to ask for a price that will cover this loss, plus a premium to pay for the capital he needs to set aside. This capital is provided to ensure the payment of losses up to a certain probability. Over the years the insurance business has prospered and has become a central actor in the economies of developed countries. In the early thirties, mathematicians such as Kolmogorov or Fisher were able to explain the successful mechanics of the insurance industry through the law of large numbers.
In the last thirty years practitioners and mathematicians have come to recognize that certain risks do not follow the law of large numbers. Natural catastrophes are one such risk category. By nature, such events are of low frequency but high severity. In practice this means that most underwriting years will end with few losses. However, if a natural catastrophe does occur, the loss will probably be much higher than the expected value. In such cases the mathematical functions of expectation and variance do not always converge: the addition of one extra event will increase the value of both functions, even if their estimation was based on many years of previous event occurrences.
Catastrophe risk is a difficult issue for insurers. The insurer wants to benefit from the high return this business can generate but does not want to pay the price of its very high volatility. The solution to this problem is to invest in a program that includes a certain form of savings during the years where the losses are benign. This allows the insurer to face its obligations when a big loss occurs, mitigating the high volatility of catastrophe insurance, or even the need to raise new capital. This savings program is called “time diversification” or “equalization reserves” to differentiate it from existing capital. The difference between equalization reserves and capital is twofold: there will be no return on this money above the risk-free rate, and no new business will be written against it.
Equalization reserves have been banned by the new accounting rules and regulations on the basis that, if no loss occurs during the contract period, the money belongs to the shareholder who originally put his capital at risk for this transaction. This argument is perfectly valid if it is possible to reasonably estimate loss and average claims over a given period, i.e. for loss distributions showing low volatility. With high-severity, low-frequency losses, we have already seen that this is not the case. Those insurers who did not experience a major claim in such a line have simply been lucky. Effectively the investor is playing roulette. At some point he risks losing his capital to pay back a large claim, as the premium the insurer receives during that particular year will certainly not cover it. Moreover, he would have no chance of recouping some of his losses, because the company would become insolvent once it has lost its capital and would not be able to write any new business.
Using a model of the balance sheets of two firms subject to the same risks1, one that is allowed to keep equalization reserves and one that pays out all the extra gains as dividends, we analyze the value of both firms from the point of view of the shareholders. Both companies start with the same capital. We submit both companies to the same losses over a period of thirty years and analyze the cash flows resulting from their business in terms of the Sharpe Ratio and the Merton model that would give the value of the call option on the assets of the company. We explore two different distributions, lognormal and Fréchet, with various tails, but all with the same Value-at-Risk at 99%. First, we see that, within certain rules, it is possible to build up equalization reserves. Second, in all the cases, we show that the company that holds equalization reserves has a higher value for the investor. We present and discuss details of the model and the results.
1- Work done in collaboration with H.J. Albrecher (EPFL), M. Moller (SCOR) and S. Sahiti (EPFL & SCOR)
D. Canestraro (SCOR, Zürich)
La Défense EEE (Cnit), room 104
PrObEx and Internal Model - calibrating dependencies among risks in Non-Life
The latest financial crisis has dramatically shown that dependence among risks cannot be ignored. (Re)insurance companies may use copula models in order to prudently account for dependence (especially in the tail) within their internal models. Once a certain copula model has been chosen, actuaries are left with the issue of estimating the copula parameter(s). However, standard statistical estimation techniques generally fail to provide a reliable estimate if data are scarce. In order to reduce the parameter uncertainty, we developed a Bayesian model to calibrate copula parameters: PrObEx. We provide an introduction to this key innovation in SCOR's internal model, discussing both the mathematical aspects and the practical implementation for calibrating dependencies in Non-Life. Our method can also be used in other contexts (e.g. Economy, Life, etc.).
D. Canestraro was awarded the prize “for the best Scientific Presentation by a Young Actuary” at the First European Congress of Actuaries (Brussels, June 7-8, 2012) for this presentation on PrObEx.
The study of tail events has become a central preoccupation for academics, investors and policy makers, given the recent financial turmoil. However, the question of what differentiates a crash from a tail event remains unresolved. This work elaborates a new definition of a stock market crash from a risk-management perspective, based on an augmented extreme value theory methodology. An empirical test on the French stock market (1968-2008) indicates that it experienced only two crashes, in 2007-2008, among the 12 tail events identified over the whole period.
Bertrand Maillet (Head of Research at ABN AMRO, and MCF, Univ. Orléans)
March 28, 12:30 pm, La Défense EEE, room 101
The recent experience from the global financial crisis has raised serious doubts about the accuracy of standard risk measures as tools to quantify extreme downward risks. Risk measures are hence subject to a “model risk” due, e.g., to specification and estimation uncertainty. Regulators have therefore proposed that financial institutions assess this “model risk”, but, as yet, there is no accepted approach for computing such a risk. We propose a general framework to compute risk measures robust to model risk, focusing on the Value-at-Risk (VaR). The proposed procedure empirically adjusts the imperfect quantile estimate based on a backtesting framework, assessing the quality of VaR models in terms of the frequency, independence and magnitude of violations. We also provide a fair comparison between the main risk models, using the same metric corresponding to the corrections required for model risk.
March 14, 12:30 pm, La Défense EEE, room 237
Financial statements are supposed to give a true and fair view of the financial position and economic performance of an entity. This requires recognizing and measuring all its assets and liabilities. Under IFRS, recognition and measurement of these elements depend on assumptions, estimates, etc. about what will happen in the future. Depending on management's decisions about these assumptions and estimates, the true and fair view disclosed in the financial statements may differ and impact analysts' judgments. This presentation will introduce the key concepts of financial statement presentation under IFRS and highlight some of the related recognition and measurement risks.
March 7 (instead of Feb. 29), 12:30 pm, La Défense EEE (Cnit), room 102
We consider an inhomogeneous Poisson process with unknown intensity. This intensity is approximated by a piecewise constant function. We provide new non-asymptotic results on estimating this intensity. The techniques we consider are applied to a non-life insurance portfolio, in order to monitor the stability of the claim process.
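A piecewise-constant intensity estimate of the kind discussed here can be sketched as follows (the sinusoidal intensity and the bin choice are assumptions for illustration, not the talk's estimator): simulate an inhomogeneous Poisson process by thinning, then estimate the intensity on each bin by the event count divided by the bin width:

```python
import numpy as np

rng = np.random.default_rng(2)

# Inhomogeneous Poisson process on [0, T] with intensity lam(t),
# simulated by thinning a homogeneous process with rate M >= max lam
T, M = 20.0, 9.0

def lam(t):
    return 5.0 + 4.0 * np.sin(t)   # assumed intensity for the illustration

n_cand = rng.poisson(M * T)
cand = rng.uniform(0.0, T, n_cand)              # homogeneous candidate times
keep = rng.uniform(0.0, M, n_cand) < lam(cand)  # accept with prob lam/M
times = np.sort(cand[keep])

# Piecewise-constant estimator: on each bin, lam_hat = count / width
bins = np.linspace(0.0, T, 21)                  # 20 bins of width 1
counts, _ = np.histogram(times, bins)
lam_hat = counts / np.diff(bins)
print(lam_hat.round(1))
```

In practice the number and placement of bins is the delicate choice; the non-asymptotic results of the talk address precisely how to control the resulting estimation error.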
February 7, 12:30 pm, La Défense EEE, room 236
The traditional point of view of ruin theory is reversed: rather than studying the probability of ruin as a function of the initial reserve under a fixed premium, we adjust the premium so as to obtain a given ruin probability (solvency requirement) for a fixed initial reserve (the financial capacity of the insurer).
Thierry Roncalli (Head of Research & Development - Lyxor Asset Management & Univ. Evry)
January 18, 12:30 pm, La Défense EEE, room 236
Portfolio allocation is generally based on optimization methods (minimum variance, Markowitz, Merton, Black-Litterman, etc.). The first part of this presentation shows that portfolio optimization faces several drawbacks in terms of concentration, stability and management. We will show that risk-budgeting techniques are an alternative method which appears more robust. In particular, the second part of the presentation will focus on one of the simplest risk-budgeting methods, when the risk budgets are all the same. In this case, we obtain the ERC (Equal Risk Contribution) portfolio. After giving the mathematical properties of the ERC portfolio, we will present some applications to managing equity funds (like alternative-weighted indexes) and diversified funds (like risk parity funds). In the third part of the presentation, we will focus on risk-budgeting methods when the risk budgets are not the same. We will generalize some properties of the ERC portfolio and present an application to managing sovereign credit risk in bond portfolios.
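The ERC portfolio described above can be computed numerically. The sketch below (with an illustrative covariance matrix, not data from the talk) uses cyclical coordinate descent on the convex reformulation min 0.5*y'Cy - sum_i log(y_i); its solution, once normalized, equalizes the risk contributions w_i * (C w)_i:

```python
import numpy as np

def erc_weights(cov, n_iter=200):
    """Equal Risk Contribution weights via cyclical coordinate descent on
    the convex problem min 0.5*y'Cy - sum(log y); then w = y / sum(y)."""
    n = cov.shape[0]
    y = 1.0 / np.sqrt(cov.diagonal())    # start from inverse-vol weights
    for _ in range(n_iter):
        for i in range(n):
            b = cov[i] @ y - cov[i, i] * y[i]           # off-diagonal part
            # first-order condition: C_ii*y_i**2 + b*y_i - 1 = 0
            y[i] = (-b + np.sqrt(b * b + 4.0 * cov[i, i])) / (2.0 * cov[i, i])
    return y / y.sum()

# Illustrative covariance matrix (made-up numbers)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = erc_weights(cov)
rc = w * (cov @ w)            # risk contributions w_i * (C w)_i
print(w, rc / rc.sum())       # each normalized contribution is ~ 1/3
```

The stationarity condition of the reformulated problem is y_i * (C y)_i = 1 for every asset, which is exactly the equal-contribution property after normalization.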
December 14, 12:30 pm, SCOR auditorium (with the support of SCOR Group), La Défense
The new European regulation for insurance, “Solvency II”, encourages insurers of big/medium-size companies to implement internal models for the calculation of their required capital. All risks have to be modeled simultaneously. For the estimation of tails and extreme events, at least 10,000 simulations are recommended. For big companies this can theoretically imply runtimes of several weeks... This becomes even worse with the Simulations-in-Simulations issue. The aim of this presentation is to review a range of solutions to deal with this huge number of simulations, through the analysis of adapted random generators, model optimizations, and potential simplified “artificial intelligence” implementations.
Jean-Gabriel Attali (Dr., Consultant, formerly Strategy Analyst at Exane Derivatives)
November 30, 12:30 pm, La Défense EEE, room 101
Risk measurement and its limits in asset management
Management of risk is a key point in the area of asset management. The aim of this presentation is to address various aspects of risk as well as its measurement and control. In particular, we will focus on the limits of some risk measures, in the sense that risk is often underestimated when it is measured ex post, especially when the manager is very active or uses derivative products. We will show that the management of ex-post risk is easy when it is measured by volatility (or VaR), even in times of crisis, simply by managing the exposure of the fund. Unfortunately, this technique does not ensure the safety of principal, which remains the main objective of the asset management industry.
November 2, 12:30 pm, La Défense EEE, room 101
Abstract:
Diversification benefit in Gaussian Aggregation Trees
June 7, 2011 (IHP Paris) see: http://www.proba.jussieu.fr/pageperso/ramacont/SystemicRiskParis2011/
Susanne Emmer (Dr., UBS, Zürich)
Credit Stress Loss
Giles Brennand (Prof., the Chinese University of Hong Kong, and Independent Consultant)
Towards Modern Risk Management
Jean-Paul Renne (Banque de France, Paris)
April 7, 12:30 pm, La Défense EEE, room 236
Default, Liquidity and Crises: an econometric framework
In this paper, we present a general discrete-time affine framework aimed at jointly modeling yield curves associated with different debtors. The underlying fixed-income securities may differ in terms of credit quality and/or liquidity. The risk factors follow conditionally Gaussian processes, with drifts and variance-covariance matrices that are subject to regime shifts described by a Markov chain with (historical) non-homogeneous transition probabilities. While flexible, the model remains tractable. In particular, bond prices are given by quasi-explicit formulas. Various numerical examples are proposed, including a sector-contagion model and credit-rating modeling.
On refracted stochastic processes and the analysis of insurance risk
Discussion Insurance: Mathematics and Economics 48 (2011) 265-270
Risk Management considerations for the Canada Pension Plan: a case study
Limit laws for sums of weakly dependent data with infinite variances
Véronique Maume-Deschamps (Prof., ISFA Lyon)
Some multivariate risk indicators; estimation and application to reserve allocation
How the trading technology changes the market microstructure
- Invitation to the Finance Department Seminar (see https://sites.google.com/a/essec.edu/seminaires-dept-finance/ for the abstract).
Workshop on "Financial Regulation" (download the program; see the attachments).
Conference IDEI/Scor, "Integration of Extremal Events in Quantitative Risk Management"
On alarm systems: applications for insurance companies and for health surveillance institutes
Empirical Model Discovery
The talk summarizes a great deal of recent research and explains how it facilitates the discovery of empirical models, greatly reducing the risks from model mis-specification and data contamination. Model evaluation can be seen as discovering what is wrong; robust statistics as discovering which sub-sample is reliable; non-parametric methods as discovering the functional form; and model selection as discovering which model best matches the given criteria. However, the high dimensionality, non-linearity, inertia, endogeneity, evolution, and abrupt change characteristic of economic data, which interact to make empirical modelling difficult and pose substantive risks of ending with an incorrect representation, make it essential to tackle all of these jointly. Automatic methods enable formulation, selection, estimation, and evaluation on a scale well beyond the powers of humans alone, including when there are more candidate variables than observations, while allowing theory models to be embedded in the discovery process. Live computer illustrations using Autometrics show the remarkable power and feasibility of this exciting approach.
There will be a next crisis surprising us someday – how can we be prepared to survive it?
Arthur Charpentier (Prof., Univ. Rennes 1 & Ecole Polytechnique)
Dec. 17, 12 pm, La Défense EEE, amphi 138
Extremes and Dependence in the context of Solvency II for Insurance Companies
Conference RISK (Risk Intelligence Symposium & Knowledge), Dec. 3, 2009, organized by OTC Conseil & Univ. Paris Ouest; Caisse des dépôts et consignations - Paris.
Fernando Oliveira (Prof., ESSEC Business School)
Nov. 19, 10 am, N305 (campus Cergy)
Informal Presentation
Fabrice Cavarretta (Prof., ESSEC Business School)
Nov. 3, 2009, 11:30 am, Le Club (campus Cergy)
Distinguishing Extreme vs. Average Effects in Nascent Firms: an Organizational Risk approach to resources
Oct. 29, 2009, 10 am, N305 (campus Cergy)
Discussion on the objectives of this Working Group
