Working Group on (quantitative) Risk Analysis / Uncertainty
- ESSEC Business School -
with the support of the group BFA (Banque Finance Assurance) of the SFdS (Société Française de Statistique)
& the support of the French Institute of Actuaries (IA)
Organizer: Prof. Marie Kratz
The main objective is to solve concrete problems affecting industry, or to address subjects of public interest, in the field of (quantitative) risk analysis, by creating new tools or methods.
Since the relevant skills are today distributed across various domains (financial, industrial, academic, ...), the intention behind the creation of this group is to bring them together.
To this end, regular meetings will include:
Each participant may propose a specific theme, work on a given article, or a project.
The presentation slides are in general available: see the attachments at the end of the page.
Our one-hour meetings will generally take place twice a month, on Fridays at 12:30 pm, at EEE - ESSEC La Défense.
If you are interested and want to be part of the mailing list, please contact Marie Kratz.
Note: since March 2012, the French Institute of Actuaries (IA) has supported the WG Risk and considers it part of its PPC program (Perfectionnement Professionnel Continu). Attending a session counts for 6 PPC points.
Calendar of the meetings in 2013-2014 (following this link):
Calendar of the meetings in 2012-2013:
June 24-26, 2013, Lyon
Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.
Romain Speisser (Actuary, Essec Business School)
June 13, 12:30 pm, La Défense EEE, room 138
Evaluation of the risk of a pandemic and construction of a partial internal model for health insurance under Solvency II
Except for world wars, pandemic risk surely constitutes the most threatening risk in terms of human lives, as epitomized by the 1918 Spanish influenza pandemic, which resulted in 40 to 50 million deaths in a few months' time. Since a severe pandemic would have large consequences on insurers around the world, Solvency II specifically targets this risk through two sub-modules of its standard formula, namely the SCR Life CAT and the SCR Health CAT Pandemic. Two main types of models exist to assess the impact of a pandemic: actuarial models based on historical data, and epidemiological models which simulate the spread of the disease. In this presentation, we introduce one model of each type used in practice, and try to fix their inconsistencies in order to use them in two distinct partial internal models for the pandemic risk in life insurance and health insurance.
Vincent Ruol (Actuary, Inspector of Social Affairs)
May 31, 12:30 pm, La Défense EEE, room 203
Historical perspective and prospective of regulation in insurance
Insurance is a particularly regulated business. The first attempts at such control date back to the end of the 19th century, and public intervention has increased steadily since. Its forms have been deeply reformed, especially during the last quarter of the century. Above all, the regulator's role is set to evolve profoundly with the introduction of the new European Solvency II directive and the creation of the French regulatory authority (ACP).
Christoph Hummel (Dr. Head of non-life & Modelling - Secquaero Advisors)
May 22, 12:30 pm, La Défense EEE, room 104
Multiple dependencies of risks: Models for actuarial practice
An adequate description of dependencies between risk factors is essential in the assessment of the risk capital with internal models. In this talk, a simple modelling tool kit providing insight into a variety of stochastic dependence structures is introduced. By means of unconventional examples, some approaches used in practice in the assessment of the capital requirements for Solvency II are challenged. The talk concludes with a discussion of practical applications of the methods presented.
Giovanni Puccetti (Prof. Univ. of Firenze, Italy)
April 22, 12:30 pm, La Défense EEE, room 220
The rearrangement algorithm: A new tool for computing bounds on risk measures
We introduce a numerical algorithm which allows for the computation of sharp upper and lower bounds on the Value-at-Risk/Expected Shortfall of high-dimensional risk portfolios having fixed marginal distributions. These bounds can be interpreted as a measure of model uncertainty induced by possible dependence scenarios.
This is a joint work with Paul Embrechts (ETH Zürich) and L. Rüschendorf (Freiburg Univ.).
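The core rearrangement step can be sketched in a few lines. The following is a minimal illustration (our own toy grid and Pareto marginals, not the authors' implementation): each column of a matrix of upper-tail quantiles is iteratively re-sorted oppositely to the sum of the other columns, which flattens the row sums and yields an estimate of the worst-case VaR.

```python
import numpy as np

def rearrangement_algorithm(X, tol=1e-9, max_iter=100):
    """Iteratively re-sort each column oppositely to the sum of the other
    columns, flattening the row sums (the rearrangement idea of the talk)."""
    X = X.copy()
    prev_spread = np.inf
    for _ in range(max_iter):
        for j in range(X.shape[1]):
            others = X.sum(axis=1) - X[:, j]
            ranks = np.argsort(np.argsort(-others))   # rank 0 = largest "others"
            X[:, j] = np.sort(X[:, j])[ranks]         # smallest value to largest rest
        spread = np.ptp(X.sum(axis=1))
        if prev_spread - spread < tol:                # stop when no improvement
            break
        prev_spread = spread
    return X

# toy setup: d = 3 Pareto(2) risks, upper tail above the 95% level, n grid points
n, d, level, theta = 50, 3, 0.95, 2.0
p = level + (1 - level) * (np.arange(n) + 0.5) / n   # probability grid in (level, 1)
q = (1 - p) ** (-1 / theta) - 1                      # Pareto(2) quantiles
X0 = np.column_stack([q] * d)                        # comonotone starting matrix
XR = rearrangement_algorithm(X0)
var_upper = XR.sum(axis=1).min()                     # estimate of the worst-case VaR
print(round(var_upper, 2))
```

Each column remains a permutation of the same discretized marginal throughout; only the dependence between columns is rearranged.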
April 9, 2013, Lyon
organized with the support of CREAR
Walter Farkas (Prof., ETH Zürich, Switzerland)
March 22, 12:30pm, La Défense EEE, room 101
Capital requirements with defaultable securities
The adequacy of the capitalization of a financial institution is typically defined in terms of acceptance sets of financial positions. Risk measures are used to determine the minimum amount of capital - the so-called capital requirement - that has to be raised and invested in a portfolio of a prespecified class of tradable assets to make a position acceptable. We allow for general acceptance sets and general positive eligible (or "reference") assets, which include defaultable bonds, options, or limited liability assets. Since the payoff of these assets is not bounded away from zero, the resulting capital requirements cannot be transformed into cash-invariant risk measures by a simple change of numeraire. However, extending the range of eligible assets is important because, as exemplified by the recent financial crisis, the existence of default-free securities may not be a realistic assumption to make. We study finiteness and continuity properties of capital requirements in this general context. We show how to reduce risk measures with respect to multiple eligible assets to risk measures with respect to a single eligible asset by properly enlarging the acceptance set. Risk measures with respect to multiple eligible assets are shown to be non-trivial when no acceptability arbitrage is possible, i.e. when not every position can be made acceptable by adding a zero-cost portfolio of eligible assets. We derive a theorem on the structure of closed convex acceptance sets based solely on the external characterization of general closed convex sets. A distinguishing feature of our approach is that convex risk measures are represented as the supremum of an objective function that depends exclusively on the acceptance set, where the supremum is taken over a set that varies with the choice of the class of eligible assets.
We apply our results to capital requirements based on Value-at-Risk and Tail-Value-at-Risk acceptability, the two most important acceptability criteria in practice.
March 4, 12:30pm, La Défense EEE, room 220
Static and (symmetry-based) semi-static replication strategies in actuarial science
Recently there has been an increasing interest in static and semi-static replication strategies in actuarial science, e.g. in the context of Guaranteed Minimum Withdrawal Benefits, which play an important role in the North American actuarial industry. Static and semi-static replication strategies trace their roots to financial mathematics. We will discuss some basic aspects of static replication. Furthermore, the main mechanism of symmetry-based semi-static hedging strategies will be explained, and the conditions under which they can be applied will be discussed.
Sharing the longevity risk between annuitants and annuity provider
The benefits provided by conventional life annuity and pension products are guaranteed in the face of adverse experience in respect of investment returns or mortality rates. Facing these risks by charging high premiums can make life annuities even less attractive than they are currently perceived to be by potential customers. Appropriate solutions should be suggested, aiming at providing retirees with effective alternatives to income drawdown (or "self-annuitization"). We focus on the risk of unanticipated mortality improvements, that is, the non-diversifiable aggregate longevity risk. Non-conventional life annuities can be defined, aiming at linking, to some extent, the annuity benefit to the mortality experienced in the group of annuitants, in the market of life annuities (or pensions), and/or in the population. This link implies sharing the risks arising from experienced mortality between annuitants and annuity provider. Various approaches can be adopted to link annuity benefits to mortality experience, or to updated forecasts of future mortality trends. We propose a rather general model that aims at providing a unifying point of view from which several practicable schemes, sharing the common purpose of transferring part of the longevity risk to the annuitants, can be analyzed and compared. We consider the possibility of changing the annuity benefit by relating the benefit itself to the experienced mortality, to updated mortality forecasts, or both. We suggest that the experienced mortality can be measured directly, or adjusted through an inference procedure. Further, we investigate the possibility of mitigating the reduction in the annuity benefit due to unanticipated mortality improvements through investment profit participation. This is joint work with Ermanno Pitacco, University of Trieste, Italy.
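As a purely illustrative toy (our own numbers and linking rule, not the authors' model), one simple scheme links the annuity benefit to the ratio of the survival anticipated at issue to the survival actually experienced, so that unanticipated longevity improvements reduce the benefit paid:

```python
# Hypothetical death probabilities: the group experiences lighter mortality
# (more survivors) than the pricing basis anticipated.
anticipated_q = [0.010, 0.011, 0.012, 0.013]   # pricing-basis death probabilities
observed_q    = [0.008, 0.009, 0.010, 0.011]   # experienced mortality (lighter)

b0 = 100.0                                      # initial annual benefit
S_ant, S_obs = 1.0, 1.0
for q_ant, q_obs in zip(anticipated_q, observed_q):
    S_ant *= 1.0 - q_ant                        # survival anticipated at issue
    S_obs *= 1.0 - q_obs                        # survival actually experienced
    benefit = b0 * S_ant / S_obs                # more survivors than priced => lower benefit
    print(round(benefit, 2))
```

The ratio transfers part of the aggregate longevity risk to the annuitants: if mortality matches the pricing basis, the benefit stays at its initial level.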
Naji Freiha (Actuary, Head of Risks & Capital Adequacy - DEXIA Group), February 4, 12:30 pm, La Défense EEE, room 138
The banking industry is facing severe economic and regulatory capital shortages, due to more stringent capital regulations and various pressures from investors and rating agencies. Hence, integrated risk measurement and capital management are becoming of key importance to risk executives and general managers.
Current practices in capital and integrated risk management rely on two metrics: regulatory and economic capital. Their simultaneous use generates complexity in daily risk decisions. It hinders commonly shared internal risks assessments and practical capital management from the board level directions to their execution by different risk committees.
The IRCM enhances the readability of risks at executive committee and management board levels. It also provides common risk metrics for financial management, accounting and risk management. Quantitative risk management concepts and techniques are of paramount importance in this framework for scenario analysis, risk measures and, eventually, decision support.
Marie Kratz (Prof., ESSEC Business School)
January 24, 12:30pm, La Défense EEE, room 203
There is a VaR beyond usual approximations
A normal approximation is often chosen in practice for the unknown distribution of the yearly log returns of financial assets, justified by the CLT (Central Limit Theorem) when assuming aggregation of iid observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the 2008/2009 crisis to be an inadequate approximation when dealing with risk measures; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of risk measures when working on financial data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. It may also be useful for a better estimation of the capital when using Monte Carlo simulations, for which convergence may be an issue. We explore new approaches to this problem, numerically as well as theoretically, based on properties of upper order statistics, and compare them with existing methods, for instance one based on the Generalized Central Limit Theorem.
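A deterministic back-of-the-envelope comparison illustrates the underestimation problem (the tail index, aggregation level and quantile levels below are our own illustrative choices, not the data of the talk): for a sum of heavy-tailed losses, the CLT-based normal quantile falls increasingly short of a first-order heavy-tail approximation as the confidence level grows.

```python
from statistics import NormalDist

# Aggregate n iid Pareto-type losses with survival function
# P(X > x) = (1 + x)**(-a), tail index a = 2.5 (finite variance, heavy tail).
a, n = 2.5, 52
mu = 1 / (a - 1)                       # mean of one loss
var = a / ((a - 1) ** 2 * (a - 2))     # variance (finite since a > 2)

for alpha in (0.999, 0.9999):
    # CLT-based normal approximation of the alpha-quantile of the sum
    var_normal = n * mu + NormalDist().inv_cdf(alpha) * (n * var) ** 0.5
    # first-order heavy-tail (subexponential) approximation: P(S > s) ~ n * s**(-a)
    var_tail = (n / (1 - alpha)) ** (1 / a)
    print(alpha, round(var_normal, 1), round(var_tail, 1))
```

The gap between the two approximations widens as alpha increases, which is exactly the regime relevant for capital estimation.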
Understanding the diversification benefit (DB) is essential to the business model of reinsurance companies. It allows for efficient risk management, capital optimization and competitive pricing. This is why it is important to model and compute the diversified Risk Adjusted Capital (RAC) of a portfolio from the marginal distributions of the risks and their dependence structure. The purpose of this paper is to provide explicit DB formulas and the capital allocation to the various stand-alone risks. We do this by means of mixing techniques (Oakes, 1989). Examples for the bivariate case will be given using survival copulas (Clayton and Gumbel) and marginals (Pareto and Weibull), which allow the calculation and allocation of the DB.
The numerical convergence of this quantity is tested against the analytical result and shown to be good for marginals with relatively light tails (tail indices below or equal to 2); it is slow for the risk measure (tail Value-at-Risk) but reasonable for the DB when the marginals are very fat-tailed (tail index close to 1). Further research will be outlined. Work done in collaboration with M. Dacorogna (SCOR) and M. Kratz (ESSEC).
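The mixing (frailty) construction of Oakes (1989) mentioned above lends itself to a short Monte Carlo sketch of the DB. The parameters below (survival Clayton with theta = 2, Pareto tail index 2.5, TVaR at 99%) are our illustrative choices, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n, theta, a = 200_000, 2.0, 2.5   # sample size, Clayton parameter, Pareto tail index

# Oakes (1989) frailty representation of the Clayton copula:
# U_i = (1 + E_i / V)^(-1/theta), with V ~ Gamma(1/theta) and E_i iid Exp(1).
V = rng.gamma(1 / theta, size=n)
E = rng.exponential(size=(n, 2))
U = (1 + E / V[:, None]) ** (-1 / theta)

# Pareto(a) marginals by inverting the survival function P(X > x) = (1 + x)**(-a);
# using U as a survival probability makes the copula of X the survival Clayton.
X = U ** (-1 / a) - 1

alpha = 0.99
def tvar(s, alpha):                       # tail Value-at-Risk (expected shortfall)
    q = np.quantile(s, alpha)
    return s[s >= q].mean()

rac_sum = tvar(X.sum(axis=1), alpha)      # diversified capital
rac_standalone = sum(tvar(X[:, i], alpha) for i in (0, 1))
DB = 1 - rac_sum / rac_standalone
print(round(DB, 3))
```

Since TVaR is subadditive, the DB is non-negative; it vanishes only in the comonotone case.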
The Basel Committee on Banking Supervision recently suggested that Expected Shortfall could replace Value-at-Risk to improve the measurement of risk in the trading book. We discuss the conceptual and practical pros and cons of this proposal, arguing that actually the combined use of the two risk measures might be the way forward.
(Dr., School of Statistics & Actuarial Science, Univ. of Witwatersrand, Johannesburg, South Africa)
The Asymptotic Ruin Problem In Health Care Insurance With Interest
Ramsay (1984), in a paper published in Insurance: Mathematics and Economics entitled "The asymptotic ruin problem when the healthy and sick periods form an alternating renewal process", found expressions for the probabilities of sickness and health, the first moment of the aggregate amount of premium received up to the end of the n-th healthy period and of the aggregate amount of benefit paid out up to the end of the n-th sickness period, n = 1, 2, 3, ..., and an approximation of the probability of ruin, but ignoring the effect of the force of interest.
This paper considers the Ramsay model modified by the inclusion of an interest rate. Upper bounds for the ultimate ruin probability are derived by martingale and recursive techniques. Examples will be given where the lengths of the periods of sickness and health follow particular distributions.
November 19, 2012, La Défense - EEE, Room 203
Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.
Rama Cont (Dr., CNRS & Imperial College, London)
Demystifying Black Swans: fire sales, price impact and endogenous risk
R. Cont, L. Wagalath (2011), Running for the exit: short selling and endogenous correlation in financial markets, to appear in Mathematical Finance.
R. Cont, L. Wagalath (2012)
Michel Dacorogna (Dr., Group Deputy CRO, SCOR), October 12 (with the support of the SCOR Group)
Adding Time Diversification to Risk Diversification: the Case for Equalization Reserves for Natural Catastrophes
The business of insurance is based on a simple concept: spreading the risks endured by an individual amongst as large a group of persons as possible. As long as he can evaluate the expected loss with reasonable accuracy, the insurer is able to ask for a price that will cover this loss, plus a premium to pay for the capital he needs to set aside. This capital is provided to ensure the payment of losses up to a certain probability. Over the years the insurance business has prospered, and has become a central actor in the economies of developed countries. In the early thirties, mathematicians such as Kolmogorov or Fisher were able to explain the successful mechanics of the insurance industry through the law of large numbers.
In the last thirty years, practitioners and mathematicians have come to recognize that certain risks do not follow the law of large numbers. Natural catastrophes are one such risk category. By nature, such events are of low frequency but high severity. In practice this means that most underwriting years will end up with few losses. However, if a natural catastrophe does occur, the loss will probably be much higher than the expected value. In such cases the mathematical expectation and variance do not always converge. The addition of one extra event will increase the value of both estimates, even if their estimation was based on many years of previous event occurrences.
Catastrophe risk is a difficult issue for insurers. The insurer wants to benefit from the high return this business can generate but does not want to pay the price of its very high volatility. The solution to this problem is to invest in a program that includes a certain form of savings during the years when the losses are benign. This allows the insurer to face his obligations when a big loss occurs, mitigating the high volatility of catastrophe insurance, or even the need to raise new capital. This savings program is called "time diversification" or "equalization reserves" to differentiate it from existing capital. The difference between equalization reserves and capital is twofold: there will be no return on this money above the risk-free rate, and no new business will be written against it.
Equalization reserves have been banned by the new accounting rules and regulations on the basis that, if no loss occurs during the contract period, the money belongs to the shareholder who originally put his capital at risk for this transaction. This argument is perfectly valid if it is possible to estimate losses and average claims reasonably over a given period, i.e. for loss distributions showing low volatility. With high-severity, low-frequency losses, we have already seen that this is not the case. Those insurers who did not experience a major claim in such a line have simply been lucky. Effectively, the investor is playing roulette. At some point he risks losing his capital to pay back a large claim, as the premium the insurer will get during that particular year will certainly not cover it. Moreover, he would have no chance of recouping some of his losses because the company would become insolvent once it has lost its capital, and would not be able to write any new business.
Using a model of the balance sheets of two firms subjected to the same risks (1), one that is allowed to keep equalization reserves and one that pays out all extra gains as dividends, we analyze the value of both firms from the point of view of the shareholders. Both companies start with the same capital. We subject both companies to the same losses over a period of thirty years and analyze the resulting cash flows in terms of the Sharpe ratio and the Merton model, which gives the value of the call option on the assets of the company. We explore two different distributions, lognormal and Fréchet, with various tails, but all with the same Value-at-Risk at 99%. First, we see that, within certain rules, it is possible to build up equalization reserves. Second, in all cases, we show that the company that holds equalization reserves has a higher value for the investor. We present and discuss details of the model and the results.
1- Work done in collaboration with H.J. Albrecher (EPFL), M. Moller (SCOR) and S. Sahiti (EPFL & SCOR)
Calendar of the meetings in 2011-2012:
Davide Canestraro (Dr., SCOR), La Défense EEE (Cnit), room 104
PrObEx and Internal Model - calibrating dependencies among risks in Non-Life
D. Canestraro has been awarded for this presentation.
The study of tail events has become a central preoccupation for academics, investors and policy makers, given the recent financial turmoil. However, the question of what differentiates a crash from a tail event remains open. This work elaborates a new definition of stock market crash from a risk management perspective, based on an augmented extreme value theory methodology. An empirical test on the French stock market (1968-2008) indicates that it experienced only two crashes in 2007-2008 among the 12 identified over the whole period.
Wolfgang Dick (Prof., ESSEC Business School)
March 14, 12:30 pm, La Défense EEE, room 237
Recognition and measurement risks in financial statements.
Olivier Lopez (MCF, ISUP-LSTA, UPMC Paris 6)
March 7 (instead of Feb. 29), 12:30 pm, La Défense EEE (Cnit), room 102
Multiple change-point detection in a Poisson process, with applications to non-life insurance
We consider an inhomogeneous Poisson process with unknown intensity. This intensity is approximated by a piecewise constant function. We provide new non-asymptotic results on estimating this intensity. The techniques we consider are applied to a non-life insurance portfolio, in order to monitor the stability of the claim process.
Corina Constantinescu (Prof., Univ. of Liverpool)
February 7, 12:30 pm, La Défense EEE, room 236
Risk processes with premium adjusted to solvency targets
Thierry Roncalli (Head of Research & Development - Lyxor Asset Management)
January 18, 12:30 pm, La Défense EEE, room 236
Portfolio Optimization versus Risk-Budgeting Allocation.
Portfolio allocation is generally based on optimization methods (minimum variance, Markowitz, Merton, Black-Litterman, etc.). The first part of this presentation shows that portfolio optimization faces several drawbacks in terms of concentration, stability and management. We will show that risk-budgeting techniques are an alternative method which appears more robust. In particular, the second part of the presentation will focus on one of the simplest risk-budgeting methods, where all risk budgets are the same. In this case, we obtain the ERC (Equal Risk Contribution) portfolio. After giving the mathematical properties of the ERC portfolio, we will present some applications to managing equity funds (like alternative-weighted indexes) and diversified funds (like risk parity funds). In the third part of the presentation, we will focus on risk-budgeting methods where the risk budgets are not the same. We will generalize some properties of the ERC portfolio and present an application to managing sovereign credit risk in bond portfolios.
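One way to compute the ERC portfolio is cyclical coordinate descent on a log-barrier formulation: at the optimum, every asset's risk contribution w_i (Σw)_i is the same. The sketch below uses a hypothetical 3-asset covariance matrix and is our illustration, not taken from the talk:

```python
import numpy as np

def erc_weights(cov, n_iter=200):
    """ERC weights via cyclical coordinate descent on the log-barrier problem
    min 0.5 * w'Σw - c * sum(log w_i); weights are normalized at the end."""
    n = cov.shape[0]
    w = np.full(n, 1.0 / n)
    c = 1.0
    for _ in range(n_iter):
        for i in range(n):
            b = cov[i] @ w - cov[i, i] * w[i]   # sum_{j != i} cov[i,j] * w[j]
            # first-order condition cov[i,i]*w_i^2 + b*w_i - c = 0, positive root:
            w[i] = (-b + np.sqrt(b * b + 4.0 * cov[i, i] * c)) / (2.0 * cov[i, i])
    return w / w.sum()

# hypothetical 3-asset covariance matrix (illustration only)
vol = np.array([0.10, 0.20, 0.30])
corr = np.array([[1.0, 0.5, 0.2],
                 [0.5, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])
cov = np.outer(vol, vol) * corr

w = erc_weights(cov)
rc = w * (cov @ w)        # risk contributions, up to the common factor 1/sigma(w)
print(np.round(w, 3))
print(np.round(rc / rc.sum(), 3))   # approximately [0.333, 0.333, 0.333]
```

At the fixed point w_i (Σw)_i = c for all i, so risk contributions are equal, and the final normalization preserves that equality.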
Pierre Miehe (Deputy CEO, ACTUARIS International)
December 14, 12:30 pm, SCOR auditorium (with the support of SCOR Group), La Défense
Risk Management in Insurance: What technical solutions to answer the huge modeling requirements?
All risks have to be modeled simultaneously. For the estimation of tails and extreme events, at least 10,000 simulations are recommended. For big companies this can theoretically imply runtimes of several weeks… This becomes even worse with the simulations-within-simulations issue.
The aim of this presentation is to review a range of solutions for dealing with this huge number of simulations, through the analysis of adapted random generators, model optimizations, and potential simplified "artificial intelligence" implementations.
Jean-Gabriel Attali (Dr., Consultant, formerly Strategy Analyst at Exane Derivatives), November 30, 12:30 pm, La Défense EEE, room 101
Risk measurement and its limits in asset management
Risk management is a key point in the area of asset management. The aim of this presentation is to address various aspects of risk as well as its measurement and control. In particular, we will focus on the limits of some risk measures, in the sense that risk is often underestimated when it is measured ex post, especially when the manager is very active or uses derivative products. We will show that the management of ex-post risk is easy when it is measured by volatility (or VaR), even in times of crisis, simply by managing the exposure of the fund. Unfortunately, this technique does not ensure the safety of principal, which remains the main objective of the asset management industry.
Paul Deheuvels (Prof., UPMC Paris 6)
November 16 (Wednesday), 12:15 pm, La Défense EEE, room 104
On Difficulties of Risk Modelling and Portfolio Analysis
The statistical modelling of insurance claims is typically achieved through lognormal models, which often fit well to the central portion of a number of data sets. What is meant by the “central portion” of observations embraces the set of values, truncated below and above by “appropriate” threshold levels. The most troublesome claim-values are, as could be expected, large observations, but small values may generate other technical difficulties as well, which we will not consider here in detail. Large values typically fall in the domain of reinsurance, and are often quite well interpreted by Pareto-type models. The main difficulty comes here from the available “large” observations, which compose, most of the time, data sets so small as to render their statistical analysis troublesome. We have considered recently a number of portfolios, each composed of a few thousand observations, and observed, in each case, that only a small percentage of the data could be well interpreted by Pareto-type models. Such models become difficult to fit when the number of observations falling in this range is, for example, of the order of 5 to 20. Another class of technical problems to cope with is, first, to find the right statistical assessment methods to compare between each other the respective risks of several portfolios, and second, to assess the goodness-of-fit of a given portfolio with respect to a specified model. In particular, when some large observations fall into the domain of Pareto distributions, the finiteness of claim-values variances is often questionable. Practically all the data sets which we have considered lead to estimations of Pareto indexes for large claims corresponding to finite expectations, but yet, to infinite variances. This, in itself, suffices to support the idea that most “usual” statistical comparison methods (such as that using the Student test technology) are ineffective. 
We shall illustrate these questions through the analysis of a real data set, and propose some new methods to bring solutions to the corresponding problems.
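For the Pareto-type fitting of large claims discussed above, one standard tool (not necessarily the speaker's method) is the Hill estimator of the tail index; a minimal sketch on simulated claims with finite expectation but infinite variance:

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimator of the Pareto tail index alpha from the k largest observations."""
    x = np.sort(data)[::-1]                 # descending order statistics
    logs = np.log(x[:k + 1])
    gamma = np.mean(logs[:k] - logs[k])     # Hill estimate of gamma = 1/alpha
    return 1.0 / gamma

# simulated Pareto claims, P(X > x) = x**(-1.5) for x >= 1:
# tail index 1.5, hence finite expectation but infinite variance
rng = np.random.default_rng(42)
claims = rng.pareto(1.5, size=5000) + 1.0
alpha_hat = hill_estimator(claims, k=200)
print(round(alpha_hat, 2))   # typically close to 1.5, i.e. in the (1, 2) range
```

An estimate between 1 and 2 is exactly the regime described in the abstract: the fitted model has a finite mean but an infinite variance, so Student-type comparison methods break down.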
November 2, 12:30 pm, La Défense EEE, room 101
difficult to predict. This risky situation is unlike default risk, whose maximum values are limited by the amount of credit granted. For example, our data from a very large US bank show that this bank could suffer, on average, more than four major losses a year. This bank had seven losses exceeding hundreds of millions of dollars over its 52 documented losses of more than $1 million during the 1994-2004 period. The tail of the loss distribution (a Pareto distribution without expectation, whose characteristic exponent lies between 0.95 and 1) shows that this bank can fear extreme operational losses ranging from $1 billion to $11 billion, at probabilities situated respectively between 1% and 0.1%. The corresponding annual insurance premiums are evaluated to range
Jean-Philippe Bruneton (NaXys, Namur Univ., Belgium; formerly Risk Analyst at SCOR Switzerland, Financial Modelling team)
October 20 (Thursday), 12:30 pm, La Défense EEE, room 101
Calendar of the meetings in 2010-2011:
Blaise Bourgeois (Group Risk Management, Paris) June 16, 12:30 pm, La Défense EEE, room 138
Risk Neutral Valuation in Insurance
Insurance markets have traditionally been presented as incomplete markets, using statistical approaches based on real-world measures to price & evaluate risks, rather than market-consistent valuation methods (aka the risk-neutral framework). Whilst this may continue to hold to some extent for Property & Casualty business, the Life & Savings sector has progressively adapted its valuation & risk measurement methodologies and tools to a world more akin to modern financial markets than to traditional actuarial reserving methods. This paper presents the building blocks of these risk-neutral valuation methods / tools now commonly applied by most major L&S insurance companies in Europe. This framework will form, from 2013 onwards, the backbone of the new Solvency II regulation. Key words: Market-Consistent Embedded Value, Replicating Portfolio, Economic Capital, Dynamic Hedging, Completing Insurance Markets
Invitation to the Conference on Mathematical Modeling of Systemic Risk, June 7, 2011 (IHP Paris); see: http://www.proba.jussieu.fr/pageperso/ramacont/SystemicRiskParis2011/
May 19, 12:30 pm, La Défense EEE, room 201
Susanne Emmer (Dr., UBS, Zürich)
May 5, 12:30 pm, La Défense EEE, room 101
After a short overview of currently used credit risk measures and of the stress-testing landscape, we give an introduction to different stress-testing methodologies, which are applied to the wealth management and business banking portfolio.
Giles Brennand (Prof., the Chinese University of Hong Kong, and Independent Consultant), April 21, 12:30 pm, La Défense EEE, room 101
Towards Modern Risk Management
Jean-Paul Renne (Banque de France, Paris)
April 7, 12:30 pm, La Défense EEE, room 236
Default, Liquidity and Crises: an econometric framework
In this paper, we present a general discrete-time affine framework aimed at jointly modeling yield curves associated with different debtors. The underlying fixed-income securities may differ in terms of credit quality and/or in terms of liquidity. The risk factors follow conditionally Gaussian processes, with drifts and variance-covariance matrices that are subject to regime shifts described by a Markov chain with (historical) non-homogeneous transition probabilities. While flexible, the model remains tractable. In particular, bond prices are given by quasi-explicit formulas. Various numerical examples are proposed, including a sector-contagion model and credit-rating modeling.
2nd Term: (there will be only a few meetings since I will be working abroad; we will meet again on a regular basis in the 3rd term).
Hansjoerg Albrecher (Prof., HEC Lausanne, Switzerland) March 24, 12:30 pm, SCOR (La Défense), Auditorium (with the support of the SCOR Group)
On refracted stochastic processes and the analysis of insurance risk
We show a somewhat surprising identity for first-passage probabilities of spectrally negative Lévy processes that are refracted at their running maximum, and discuss extensions of this identity and its applications in the study of insurance risk processes in the presence of tax payments. In addition, we discuss a statistic related to the sample coefficient of variation, which leads to an alternative simple method for estimating the extreme value index of Pareto-type tails from corresponding iid claim data with infinite variance.
Laila Elbahtouri (SCOR, Actuary) Feb. 3, 12pm-1pm, La Défense EEE, room 104
Discussion on the paper by Albrecher et al. "Explicit ruin formulas with dependence among risks", Insurance: Mathematics and Economics 48 (2011) 265-270
Doug Andrews (Senior Lecturer at Univ. of Southampton & Actuary (Fellow of the IA & Canadian IA, SOA)), 1pm-2pm, La Défense EEE, room 104
Risk Management considerations for the Canada Pension Plan: a case study
What are the risks that the Canada Pension Plan faces and how are they managed? This presentation begins with a review of the governance structure, the actuarial valuation method and the automatic balancing mechanism. It then examines particular assumptions, such as fertility, mortality, migration, productivity and investment. It explains the risks, discusses the methods used to derive the assumptions, and outlines some of the sensitivity tests used to quantify the risk.
Jan. 19, 11:15 am, Cergy ESSEC campus, room N305
Jan. 12, 12:30pm, La Défense EEE, room 202: Discussion on projects by groups
Véronique Maume-Deschamps (Prof., ISFA Lyon)
Dec. 16, 12:30 pm, La Défense EEE, room 236
Désiré Binam (Dr. Consultant Front-Office)
Dec. 2, 11:15 am, La Défense EEE room 237
How the trading technology changes the market microstructure
The evolution of trading technologies has deeply modified the microstructure of today’s financial markets. We will see how the technology is changing the trading process and strategy and how players deal with new concepts such as Direct Market Access (DMA), Smart Order Routing (SOR), High Frequency Trading (HFT) and Algorithmic trading.
Philippe Soulier (Prof., Univ. Paris Nanterre),
Nov. 25, 12:30pm, La Défense EEE, room 334
conditioning event is extreme. The quantities of interest are the limiting conditional distribution of future events given that the past was extreme, and the normalizing functions needed to obtain non-degenerate limit laws. I will consider two classes of processes: GARCH-type and stochastic volatility processes. The main difference is the presence or absence of clustering of extremes.
Nov. 4, 12:30pm, La Défense EEE, room 237
- Discussion on some projects.
- Invitation to the Finance Department Seminar on
(see https://sites.google.com/a/essec.edu/seminaires-dept-finance/ for the abstract).
Cristina Butucea (Prof., Univ. Marne La Vallée)
Oct. 14, 12:30pm, La Défense, room 236
Various ways to summarize a curve with a number: estimation and applications
Calendar of the meetings in 2009-2010:
M. Bardos & M. Kratz for the BFA (Banque Finance Assurance) group, SFdS. April 9, 9am - 5pm: IHP (Institut Henri Poincaré, 75005 Paris),
Workshop on "Financial Regulation" (download the program; see the attachments).
Laurent Ferrara (Banque de France et Univ. Paris Ouest) April 1, 6pm, La Défense EEE, room 236
Assessing the recession risk anticipated by financial markets
Conference IDEI/Scor, "Integration of Extremal Events in Quantitative Risk Management" March 19, 2010 http://www.idei.fr/conference/conf_scor.html
Marie Kratz (Prof., Essec Business School) March 4, 12:30pm, La Défense EEE , room 138
On alarm systems. Applications for insurance companies and health surveillance institutes
Guillaume Chevillon (Prof., Essec Business School) Feb. 4, 12:30pm, La Défense EEE , room 104
Sir David F. Hendry (Nuffield College, Oxford) Jan. 28, 6pm, La Défense EEE, room 201
Empirical Model Discovery
The talk summarizes a great deal of recent research, and explains how it facilitates the discovery of empirical models, greatly reducing the risks from model mis-specification and data contamination. Model evaluation concerns discovering what is wrong; robust statistics as discovering which sub-sample is reliable; non-parametric methods as discovering the functional form; and model selection as discovering which model best matches the given criteria. However, the high dimensionality, non-linearity, inertia, endogeneity, evolution, and abrupt change characteristic of economic data, which interact to make empirical modelling difficult and pose substantive risks of ending with an incorrect representation, make it essential to tackle all of these jointly. Automatic methods enable formulation, selection, estimation, and evaluation on a scale well beyond the powers of humans alone, including when there are more candidate variables than observations, while allowing theory models to be embedded in the discovery process. Live computer illustrations using Autometrics show the remarkable power and feasibility of this exciting approach.
Michel M. Dacorogna (Dr., Head of SCOR Group Financial Modeling) Jan. 21, 12:30pm, La Défense EEE, room 104
There will be a next crisis surprising us someday – how can we be prepared to survive it?
Arthur Charpentier (Prof., Univ. Rennes 1 & Ecole Polytechnique)
Dec. 17, 12pm, La Défense EEE, amphi 138
Extremes and Dependence in the context of Solvency II for Insurance Companies
Conference RISK (Risk Intelligence Symposium & Knowledge)
Dec. 3, 2009
organized by OTC Conseil & Univ. Paris Ouest; Caisse des dépôts et consignations – Paris.
Fernando Oliveira (Prof., Essec Business School)
Nov. 19, 10am, N305 (campus Cergy)
Fabrice Cavarretta (Prof., Essec Business School)
Nov.3, 2009 11:30am, Le Club (campus Cergy)
Distinguishing Extreme vs. Average Effects in Nascent Firms: an Organizational Risk approach to resources
Oct. 29, 2009 10am, N305 (campus Cergy)
Discussion on the objectives of this Working Group