Working Group on (quantitative) Risk Analysis / Uncertainty
- ESSEC Business School -
with the support of the group BFA (Banque Finance Assurance) of the SFdS (Société Française de Statistique)
& the support of the French Institute of Actuaries (IA)
Organizer: Prof. Marie Kratz
The main objective is to solve concrete problems affecting industry, or to address subjects of public interest, in the field of (quantitative) risk analysis, by creating new tools or methods.
The relevant skills are today distributed across various domains (financial, industrial, academic, ...); the intention behind the creation of this group is to bring them together.
To this end, regular meetings will include:
Each participant may propose a specific theme, work on a given article, or work on a project.
The presentation slides are in general available: see the attachments at the end of the page.
Our one-hour meetings generally take place twice a month, on Fridays at 12:30pm, at EEE - ESSEC La Défense.
If you are interested and want to be part of the mailing list, please contact Marie Kratz.
Note: since March 2012, the French Institute of Actuaries (IA) supports the WG Risk and considers it part of its PPC program (Perfectionnement Professionnel Continu: continuing professional development). Participation in a conference counts for 6 PPC points.
Calendar of the meetings in 2012-13:
Please note the new website of the Working Group on Risk (from February 2013):
February 4, 12:30pm, La Défense EEE, room 138
Actuary, Head of Risks & Capital Adequacy (DEXIA Group)
Abstract: The banking industry is facing severe economic and regulatory capital shortages, due to more stringent regulations for capital and various pressures by investors and rating-agencies. Hence, integrated risk measurements and capital management are becoming of key importance to risk executives and general managers.
Current practices in capital and integrated risk management rely on two metrics: regulatory and economic capital. Their simultaneous use generates complexity in daily risk decisions. It hinders commonly shared internal risk assessments and practical capital management, from board-level directions to their execution by the different risk committees.
The IRCM enhances the readability of risks at executive committee and management board level. It also creates risk metrics common to financial management, accounting and risk management. Quantitative risk management concepts and techniques are of paramount importance in this framework for scenario analysis, risk measures and, eventually, decision support.
March 4, 12:30pm, La Défense EEE, room 220, Michael Schmutz (Dr.)
March 22, 12:30pm, La Défense EEE, Walter Farkas (Prof., ETH Zürich)
Abstract: A normal approximation is often chosen in practice for the unknown distribution of the yearly log returns of financial assets, justified by the CLT (Central Limit Theorem) under the assumption of aggregation of iid observations in the portfolio model. Such a modeling choice, in particular the use of light-tailed distributions, proved during the crisis of 2008/2009 to be an inadequate approximation when dealing with risk measures; as a consequence, it leads to a gross underestimation of the risks. The main objective of our study is to obtain the most accurate evaluations of risk measures for financial data in the presence of heavy tails, and to provide practical solutions for accurately estimating high quantiles of aggregated risks. It may also be useful for a better estimation of the capital when using Monte-Carlo simulations, for which convergence may be an issue. We explore new approaches to handle this problem, numerically as well as theoretically, based on properties of upper order statistics. We compare them with existing methods, for instance one based on the Generalized Central Limit Theorem.
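As a rough numerical illustration of this underestimation (a hypothetical Monte-Carlo sketch with an assumed Pareto tail index, not the method of the talk), one can compare the empirical high quantile of a sum of iid Pareto risks with its CLT-based normal approximation:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
a, n, n_sims = 2.2, 52, 100_000    # assumed Pareto tail index, risks per sum, simulations

# Simulate n_sims realizations of a sum of n iid Pareto(a) risks (scale 1)
sums = (rng.pareto(a, size=(n_sims, n)) + 1.0).sum(axis=1)

q = 0.999
empirical = np.quantile(sums, q)

# CLT-based normal approximation with matched mean and standard deviation
mu, sigma = sums.mean(), sums.std()
normal_approx = mu + sigma * NormalDist().inv_cdf(q)
```

On such heavy-tailed data the empirical 99.9% quantile exceeds the normal approximation, which is precisely the gross underestimation the abstract refers to.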
January 24, 12:30pm, La Défense EEE, room 203, Marie Kratz (Prof., ESSEC Business School): There is a VaR beyond usual approximations
December 17, 12:30pm, La Défense EEE, room 201, Laila Elbahtouri (Actuary, SCOR, Paris): Explicit diversification benefit for dependent risks
Abstract: Understanding the diversification benefit (DB) is essential to the business model of reinsurance companies. It allows for efficient risk management, capital optimization and competitive pricing. This is why it is important to model and compute the diversified Risk Adjusted Capital (RAC) of a portfolio from the marginal distributions of the risks and their dependence structure. The purpose of this paper is to provide explicit DB formulas and the capital allocation to the various stand-alone risks. We do this by means of mixing techniques (Oakes, 1989). Examples in the bivariate case will be given for survival copulas (Clayton and Gumbel) and marginals (Pareto and Weibull) which allow the calculation and allocation of the DB.
The numerical convergence of this quantity is tested against the analytical result; it is shown to be good for marginals with relatively light tails (tail indices below or equal to 2), and slow for the risk measure (tail Value-at-Risk) but reasonable for the DB when the marginals are very fat-tailed (tail index close to 1). Further research will be outlined. Work done in collaboration with M. Dacorogna (SCOR) and M. Kratz (ESSEC).
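A minimal Monte-Carlo sketch of the DB in the bivariate case, sampling a (plain) Clayton copula via its gamma-frailty mixing representation with Pareto marginals; all parameter values are illustrative assumptions, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
theta = 2.0    # assumed Clayton parameter (positive dependence)
a = 2.0        # assumed Pareto tail index of each marginal

# Gamma-frailty (mixing) sampling of the Clayton copula
v = rng.gamma(1.0 / theta, size=n)       # frailty variable
e = rng.exponential(size=(2, n))
u = (1.0 + e / v) ** (-1.0 / theta)      # uniform marginals with Clayton dependence

# Transform to Pareto(a) marginals: F^{-1}(u) = (1 - u)^{-1/a}
x = (1.0 - u) ** (-1.0 / a)

level = 0.99
rac = lambda s: np.quantile(s, level)    # risk-adjusted capital taken here as VaR 99%
db = 1.0 - rac(x[0] + x[1]) / (rac(x[0]) + rac(x[1]))   # diversification benefit
```

The paper derives this quantity in closed form; the Monte-Carlo version above is only the benchmark against which such formulas would be tested.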
November 30, 12:30pm, La Défense EEE, room 220, Dirk Tasche (Dr., FSA, London): Is Expected Shortfall a better risk measure than VaR?
Abstract: The Basel Committee on Banking Supervision recently suggested that Expected Shortfall could replace Value-at-Risk to improve the measurement of risk in the trading book. We discuss the conceptual and practical pros and cons of this proposal, arguing that actually the combined use of the two risk measures might be the way forward.
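On simulated data the two measures are easy to compare; a minimal sketch with hypothetical heavy-tailed losses (not the Basel trading-book specification):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical heavy-tailed daily losses: Student-t with 3 degrees of freedom
losses = rng.standard_t(df=3, size=100_000)

alpha = 0.99
var = np.quantile(losses, alpha)       # Value-at-Risk: the 99% loss quantile
es = losses[losses >= var].mean()      # Expected Shortfall: mean loss beyond VaR
```

ES always dominates VaR and, unlike VaR, reacts to the severity of losses beyond the quantile, not just to their frequency.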
November 23, 12:30pm, La Défense EEE, room 103, Franck Adekambi (Dr., School of Statistics & Actuarial Science, University of the Witwatersrand, Johannesburg, South Africa): The Asymptotic Ruin Problem In Health Care Insurance With Interest
Abstract: Ramsay (1984), in a paper published in Insurance: Mathematics and Economics entitled "The asymptotic ruin problem when the healthy and sick periods form an alternating renewal process", found expressions for the probabilities of sickness and health, the first moment of the aggregate amount of premium received up to the end of the n-th healthy period and of the aggregate amount of benefit paid out up to the end of the n-th sickness period, n = 1, 2, 3, ..., as well as an approximation of the probability of ruin, ignoring however the effect of the force of interest.
This paper considers the Ramsay model modified by the inclusion of an interest rate. Upper bounds for the ultimate ruin probability are derived by martingale and recursive techniques. Examples will be given when the lengths of the sickness and healthy periods follow particular distributions.
November 19, 2012,
ESSEC Executive Education - La Défense - Room 203
Organization: Working Group on Risk - ESSEC & Swiss Life, with the support of the BFA group (SFdS) and IA.
October 26 (Friday), 12:30pm, EEE - ESSEC La Défense - room 220, Rama Cont (Dr., CNRS & Imperial College, London): Demystifying Black Swans: fire sales, price impact and endogenous risk
R. Cont, L. Wagalath (2011), Running for the exit: short selling and endogenous correlation in financial markets, to appear in Mathematical Finance.
R. Cont, L. Wagalath (2012)
October 12 (Friday), 12:30pm, SCOR (5 avenue Kléber - 75 116 Paris), auditorium Kléber,
(with the support of SCOR Group), Michel Dacorogna (Dr., Group Deputy CRO, SCOR): Adding Time Diversification to Risk Diversification, the Case for Equalization Reserves for Natural Catastrophes
Abstract: The business of insurance is based on a simple concept: spreading the risks endured by an individual amongst as large a group of persons as possible. As long as he can evaluate the expected loss with reasonable accuracy, the insurer is able to ask for a price that will cover this loss, plus a premium to pay for the capital he needs to set aside. This capital is provided to ensure the payment of losses up to a certain probability. Over the years the insurance business has prospered, and has become a central actor in the economies of developed countries. In the early thirties, mathematicians such as Kolmogorov or Fisher were able to explain the successful mechanics of the insurance industry through the law of large numbers.
In the last thirty years practitioners and mathematicians have come to recognize that certain risks do not follow the law of large numbers. Natural catastrophes are one such risk category. By nature, such events are of low frequency but high severity. In practice this means that most underwriting years will end up with few losses. However, if a natural catastrophe does occur, the loss will probably be much higher than the expected value. In such cases the mathematical functions of expectation and variance do not always converge. The addition of one extra event will increase the value of both functions, even if their estimation was based on many years of previous event occurrences.
Catastrophe risk is a difficult issue for insurers. The insurer wants to benefit from the high return this business can generate but does not want to pay the price of its very high volatility. The solution to this problem is to invest in a program that includes a certain form of savings during the years where the losses are benign. This allows the insurer to face its obligations when a big loss occurs, mitigating the high volatility of catastrophe insurance, or even the need to raise new capital. This savings program is called "time diversification" or "equalization reserves" to differentiate it from existing capital. The difference between equalization reserves and capital is twofold: there will be no return on this money above the risk-free rate; and no new business will be written against it.
Equalization reserves have been banned by the new accounting rules and regulations on the basis that, if no loss occurs during the contract period, the money belongs to the shareholder who originally put his capital at risk for this transaction. This argument is perfectly valid if it is possible to reasonably estimate losses and average claims over a given period, i.e. for loss distributions showing low volatility. With high-severity, low-frequency losses, we have already seen that this is not the case. Those insurers who did not experience a major claim in such a line have simply been lucky. Effectively the investor is playing roulette. At some point he risks losing his capital to pay back a large claim, as the premium the insurer will get during that particular year will certainly not cover it. Moreover, he would have no chance of recouping some of his losses, because the company would become insolvent once it has lost its capital, and would not be able to write any new business.
Using a model of the balance sheets of two firms submitted to the same risks1, one that is allowed to keep equalization reserves and one that pays all the extra gains as dividends, we analyze the value of both firms from the point of view of the shareholders. Both companies start with the same capital. We submit both companies to the same losses over a period of thirty years and analyze the cash flows resulting from their business in terms of the Sharpe Ratio and the Merton model that would give the value of the call option on the assets of the company. We explore two different distributions, lognormal and Fréchet, with various tails, but all with the same Value-at-Risk at 99%. First, we see that, within certain rules, it is possible to build up equalization reserves. Second, in all cases, we show that the company that holds equalization reserves has a higher value for the investor. We present and discuss details of the model and the results.
1- Work done in collaboration with H.J. Albrecher (EPFL), M. Moller (SCOR) and S. Sahiti (EPFL & SCOR)
Calendar of the meetings in 2011-12:
June 21, 12:30pm, La Défense EEE (Cnit), room 104, Davide Canestraro (Dr., SCOR Zürich): PrObEx and Internal Model - calibrating dependencies among risks in Non-Life
Abstract: The latest financial crisis has dramatically shown that dependence among risks cannot be ignored. (Re)insurance companies may use copula models in order to prudently account for dependence (especially in the tail) within their internal models. Once a certain copula model has been chosen, actuaries are left with the issue of estimating the copula parameter(s). However, standard statistical estimation techniques generally fail to provide a reliable estimate if data is scarce. In order to reduce the parameter uncertainty, we developed a Bayesian model to calibrate copula parameters: PrObEx. We provide an introduction to this key innovation in SCOR's internal model, discussing both the mathematical aspects and the practical implementation for calibrating dependencies in Non-Life. Our method can also be used in other contexts (e.g. economics, Life, etc.).
D. Canestraro has been awarded for this presentation on
May 24, 12:30pm, La Défense EEE, room 138, Philipp Arbenz (Dr., SCOR Zürich):
High dimensional risk aggregation: a hierarchical approach with copulas.
Risk aggregation is highly relevant for solvency calculations, risk management and capital allocation. From a statistical and computational point of view, most risk aggregation methodologies are very challenging in high dimensions. A prudent approach requires the consideration of dependencies between all risks, which caused the popularity of copulas. However, the common copula classes are restrictive and too symmetric. We present a hierarchical risk aggregation method which is flexible in high dimensions. With this method it suffices to specify low dimensional copulas for each aggregation step in the hierarchy. Arbitrary copulas and margins can be combined. An efficient algorithm for numerical approximation is presented.
May 9, 12:30pm, La Défense EEE, room 138, Sofiane Aboura (MCF, Univ. Paris Dauphine): Disentangling Crashes from Tail Events.
Abstract: The study of tail events has become a central preoccupation for academics, investors and policy makers, given the recent financial turmoil. However, the question of what differentiates a crash from a tail event remains open. This work elaborates a new definition of a stock market crash, taking a risk management perspective based on an augmented extreme value theory methodology. An empirical test on the French stock market (1968-2008) indicates that it experienced only two crashes in 2007-2008 among the 12 identified over the whole period.
March 28, 12:30 pm, La Défense EEE (Cnit), room 101, Bertrand Maillet (Head of Research at ABN AMRO, and MCF, Univ. Orléans): Risk Models-at-Risk.
Abstract: The recent experience of the global financial crisis has raised serious doubts about the accuracy of standard risk measures as a tool to quantify extreme downward risks. Risk measures are hence subject to a "model risk" due, e.g., to specification and estimation uncertainty. Regulators have therefore proposed that financial institutions assess this "model risk", but, as yet, there is no accepted approach for computing it. We propose a general framework to compute risk measures robust to model risk, focusing on the Value-at-Risk (VaR). The proposed procedure empirically adjusts the imperfect quantile estimate based on a backtesting framework, assessing the quality of VaR models through criteria such as the frequency, independence and magnitude of violations. We also provide a fair comparison between the main risk models, using the same metric corresponding to the corrections required by model risk.
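The violation-frequency criterion can be sketched as a toy backtest (hypothetical P&L and a historical-simulation VaR; the paper's actual adjustment procedure is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical daily P&L with fat tails
pnl = rng.standard_t(df=4, size=1500) * 0.01

window, level = 500, 0.99
violations = 0
for t in range(window, len(pnl)):
    # 99% historical-simulation VaR estimated on the rolling window
    var_t = -np.quantile(pnl[t - window:t], 1.0 - level)
    violations += pnl[t] < -var_t      # count days the loss exceeds the VaR forecast

hit_rate = violations / (len(pnl) - window)   # should be close to 1 - level = 1%
```

A well-specified model produces a hit rate close to 1%; a systematic excess (or deficit) of violations is one signal of the "model risk" the abstract discusses.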
March 14, 12:30 pm, La Défense EEE (Cnit), room 237, Wolfgang Dick (Prof., ESSEC): Recognition and measurement risks in financial statements.
Abstract: Financial statements are supposed to give a true and fair view of the financial position and economic performance of an entity. This requires recognizing and measuring all its assets and liabilities. Under IFRS, recognition and measurement of these elements depend on assumptions, estimations, etc., about what will happen in the future. Depending on management's decisions about these assumptions and estimations, the true and fair view disclosed in the financial statements may differ and impact analysts' judgments. This presentation will introduce the key concepts of financial statement presentation under IFRS and highlight some of the related recognition and measurement risks.
March 7 (instead of Feb. 29), 12:30 pm, La Défense EEE (Cnit), room 102, Olivier Lopez (MCF, ISUP-LSTA, UPMC Paris 6): Multiple change-point detection in a Poisson process with applications to non-life insurance.
Abstract: We consider an inhomogeneous Poisson process with unknown intensity, which is approximated by a piecewise constant function. We provide new non-asymptotic results on estimating this intensity. The techniques we consider are applied to a non-life insurance portfolio, in order to monitor the stability of the claim process.
February 7, 12:30 pm, La Défense EEE (Cnit), room 236, Corina Constantinescu (Prof., Univ. of Liverpool): Risk processes with premium adjusted to solvency targets.
Abstract: The traditional point of view of ruin theory is reversed: rather than studying the probability of ruin as a function of the initial reserve under a fixed premium, we adjust the premium so as to obtain a given ruin probability (solvency requirement) for a fixed initial reserve (the financial capacity of the insurer).
January 18, 12:30 pm, La Défense EEE (Cnit), room 236, Thierry Roncalli (Head of Research & Development, Lyxor Asset Management & Evry Univ.): Portfolio Optimization versus Risk-Budgeting Allocation.
Portfolio allocation is generally based on optimization methods (minimum variance, Markowitz, Merton, Black-Litterman, etc.). The first part of this presentation shows that portfolio optimization faces several drawbacks in terms of concentration, stability and management. We will show that risk-budgeting techniques are an alternative method which appears more robust. The second part of the presentation focuses on one of the simplest risk-budgeting methods, where all the risk budgets are the same; in this case, we obtain the ERC (Equal Risk Contribution) portfolio. After giving the mathematical properties of the ERC portfolio, we will present some applications to managing equity funds (like alternative-weighted indexes) and diversified funds (like risk parity funds). In the third part of the presentation, we will focus on risk-budgeting methods where the risk budgets are not the same. We will generalize some properties of the ERC portfolio and present an application to managing sovereign credit risk in bond portfolios.
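A compact sketch of the equal-budget case: computing an ERC portfolio by a simple damped fixed-point iteration on a toy covariance matrix (both the matrix and the iteration scheme are illustrative assumptions, not the speaker's algorithm):

```python
import numpy as np

# Hypothetical covariance matrix of three assets (vols 20%, 30%, 40%, mild correlations)
cov = np.array([[0.040, 0.012, 0.006],
                [0.012, 0.090, 0.015],
                [0.006, 0.015, 0.160]])

w = np.ones(3) / 3                 # start from equal weights
for _ in range(500):
    mrc = cov @ w                  # marginal risk contributions (up to a scalar factor)
    w = np.sqrt(w / mrc)           # damped step towards w_i * (cov @ w)_i = constant
    w /= w.sum()

rc = w * (cov @ w)                 # risk contributions
rc /= rc.sum()                     # at the ERC portfolio all contributions are equal
```

The fixed point satisfies w_i (Σw)_i = constant, i.e. every asset contributes the same share of portfolio variance, which is exactly the ERC condition.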
December 14, 12:30 pm, SCOR auditorium (with the support of SCOR Group), La Défense: Pierre Miehe (Deputy CEO, ACTUARIS International): Risk Management in Insurance: What technical solutions to answer the huge modeling requirements?
The new European regulation for insurance, "Solvency II", encourages big and medium-size insurance companies to implement internal models for the calculation of their required capital.
All risks have to be modeled simultaneously. For the estimation of tails and extreme events, at least 10,000 simulations are recommended. For big companies this can theoretically imply runtimes of several weeks; this becomes even worse with the Simulations-in-Simulations issue.
The aim of this presentation is to review a range of solutions to deal with this huge number of simulations, through the analysis of adapted random number generators, model optimizations, and potential simplified "artificial intelligence" implementations.
November 30 (Wednesday), 12:30 pm, La Défense EEE (Cnit), room 101: Jean-Gabriel Attali (Dr., Consultant, formerly Strategy Analyst at Exane Derivatives): Risk measurement and its limits in asset management.
Abstract: Risk management is a key point in the area of asset management. The aim of this presentation is to address various aspects of risk, as well as its measurement and control. In particular, we will focus on the limits of some risk measures, in the sense that risk is often underestimated when it is measured ex post, especially when the manager is very active or uses derivative products. We will show that managing ex-post risk is easy when it is measured by volatility (or VaR), even in times of crisis, simply by managing the exposure of the fund. Unfortunately, this technique does not ensure the safety of principal, which remains the main objective of the asset management industry.
November 16 (Wednesday), 12:15 pm, La Défense EEE (Cnit), room 104: Paul Deheuvels (Prof., UPMC Paris 6): On Difficulties of Risk Modelling and Portfolio Analysis.
Abstract: The statistical modelling of insurance claims is typically achieved through lognormal models, which often fit well the central portion of a number of data sets. What is meant by the "central portion" of observations is the set of values truncated below and above by "appropriate" threshold levels. The most troublesome claim values are, as could be expected, the large observations, but small values may generate other technical difficulties as well, which we will not consider here in detail. Large values typically fall in the domain of reinsurance, and are often quite well interpreted by Pareto-type models. The main difficulty here comes from the available "large" observations, which compose, most of the time, data sets so small as to render their statistical analysis troublesome. We have recently considered a number of portfolios, each composed of a few thousand observations, and observed, in each case, that only a small percentage of the data could be well interpreted by Pareto-type models. Such models become difficult to fit when the number of observations falling in this range is, for example, of the order of 5 to 20. Another class of technical problems to cope with is, first, to find the right statistical assessment methods to compare the respective risks of several portfolios with each other, and second, to assess the goodness-of-fit of a given portfolio with respect to a specified model. In particular, when some large observations fall into the domain of Pareto distributions, the finiteness of the claim-value variances is often questionable. Practically all the data sets we have considered lead to estimates of Pareto indexes for large claims corresponding to finite expectations but infinite variances. This, in itself, suffices to support the idea that most "usual" statistical comparison methods (such as those based on the Student test technology) are ineffective.
We shall illustrate these questions through the analysis of a real data set, and propose some new methods to bring solutions to the corresponding problems.
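The Pareto-index estimation discussed above is typically done with the Hill estimator; a self-contained sketch on simulated claims (the sample and parameters are assumptions for illustration, not the speaker's data):

```python
import numpy as np

rng = np.random.default_rng(7)
a_true = 1.5                                          # finite mean, infinite variance
claims = (1.0 - rng.random(5000)) ** (-1.0 / a_true)  # iid Pareto(a_true) claims

# Hill estimator of the tail index, built from the k largest order statistics
k = 200
xs = np.sort(claims)
gamma = np.mean(np.log(xs[-k:] / xs[-(k + 1)]))       # mean log-excess over the threshold
alpha_hill = 1.0 / gamma                              # estimated Pareto index
```

The estimator's standard error is of order gamma / sqrt(k), which illustrates the abstract's point: with only 5 to 20 usable tail observations instead of 200, the fit becomes very unreliable.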
November 2, 12:30 pm, La Défense EEE (Cnit), room 101: Daniel Zajdenweber (Prof., Univ. Paris X Nanterre): Extremal Events in Bank Operational Losses.
Operational losses are true dangers for banks, since the maximal values that would signal default are difficult to predict. This risky situation is unlike default risk, whose maximum values are limited by the amount of credit granted. For example, our data from a very large US bank show that this bank could suffer, on average, more than four major losses a year. This bank had seven losses exceeding hundreds of millions of dollars over its 52 documented losses of more than $1 million during the 1994-2004 period. The tail of the loss distribution (a Pareto distribution without expectation, with characteristic exponent 0.95 < 1) shows that this bank can fear extreme operational losses ranging from $1 billion to $11 billion, at probabilities situated respectively between 1% and 0.1%. The corresponding annual insurance premiums are evaluated to range between $350 M and close to $1 billion.
Keywords: Bank operational loss, value at risk, Pareto distribution, insurance premium, extremal event.
Joint work with H. Dahen and G. Dionne (Journal of Operational Risk, 2010).
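The quoted exceedance probabilities follow directly from the Pareto survival function; a sketch with the stated exponent (the $1M threshold normalization is an assumption for illustration):

```python
def pareto_sf(x, alpha=0.95, x_min=1.0):
    """P(single loss > x) for a Pareto distribution, x in units of $1M."""
    return (x / x_min) ** (-alpha)

p_1b = pareto_sf(1_000)     # probability that a single loss exceeds $1 billion
p_11b = pareto_sf(11_000)   # probability that a single loss exceeds $11 billion
```

Scaled by roughly four to five major losses per year, such per-loss probabilities land in the order of the 1% to 0.1% annual range cited above.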
October 20 (Thursday), 12:30 pm, La Défense EEE (Cnit), room 101: Jean-Philippe Bruneton (NaXys, Namur Univ., Belgium; formerly Risk Analyst at SCOR Switzerland, Financial Modelling team):
Diversification benefit in Gaussian Aggregation Trees.
Abstract: Insurance risks are often aggregated together with the help of copulas: risks whose individual marginals are known are tied together via some function (the copula). In mathematical language, their joint distribution is defined via the copula. We focus our study on the aggregation of risks within so-called hierarchical trees: an aggregation tree refers to applying such a process several times, linking some marginals together, then linking the resulting marginals together, and so on and so forth, until the total portfolio is modelled. We study in particular the "Gaussian Tree", where both marginals and copulas are Gaussian. This enables exact analytical results for the r.v. of the total portfolio, and therefore the exact computation of the diversification benefit (the release in solvency capital due to the diversification of risks, taking into account their interdependencies). Such a toy model enables us to study the impact of the width and depth of the tree on the diversification benefit. We show that "tight trees" diversify better than "fat trees". We also show some numerical results that support this conclusion outside the Gaussian world: lognormal trees aggregated via Gaussian or Clayton copulas show the same overall behavior.
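In the Gaussian tree the total portfolio is again Gaussian, so the diversification benefit is explicit; a sketch of a small two-level tree with hypothetical correlations:

```python
from math import sqrt
from statistics import NormalDist

# Two-level Gaussian tree: four standard-normal risks aggregated pairwise with
# correlation rho_leaf; the two pair-sums are then aggregated with correlation rho_root.
rho_leaf, rho_root = 0.3, 0.5

var_pair = 2.0 * (1.0 + rho_leaf)                # variance of each pair-sum
var_total = 2.0 * var_pair * (1.0 + rho_root)    # variance of the total portfolio

z = NormalDist().inv_cdf(0.99)                   # 99% standard-normal quantile
standalone_var_sum = 4.0 * z                     # sum of the four stand-alone 99% VaRs
portfolio_var = sqrt(var_total) * z              # 99% VaR of the Gaussian total

db = 1.0 - portfolio_var / standalone_var_sum    # diversification benefit
```

Because everything is Gaussian, db depends only on the tree's correlation structure, which is what makes the toy model tractable for studying tree shape.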
Calendar of the meetings in 2010-11:
June 16, 12:30 pm, La Défense EEE (Cnit), room 138: Blaise Bourgeois (Head of Life Product Risks, AXA, Group Risk Management, Paris): Risk Neutral Valuation in Insurance.
Abstract: Insurance markets have traditionally been presented as incomplete markets, using statistical approaches based on real-world measures to price and evaluate risks, rather than market-consistent valuation methods (aka the risk-neutral framework). Whilst this may continue to hold to some extent for Property & Casualty business, the Life & Savings sector has progressively adapted its valuation and risk measurement methodologies and tools to a world more akin to modern financial markets than to traditional actuarial reserving methods. This paper presents the building blocks of these risk-neutral valuation methods and tools, now commonly applied by most major L&S insurance companies in Europe. This framework will form, from 2013 onwards, the backbone of the new Solvency II regulation.
Key words: Market-Consistent Embedded Value, Replicating Portfolio, Economic Capital, Dynamic Hedging, Completing Insurance markets
June 7, Invitation to the Conference on Mathematical Modeling of Systematic Risk (IHP Paris)
May 19, 12:30 pm, La Défense EEE (Cnit), room 201:
Nizar Touzi (Prof., Ecole Polytechnique, Palaiseau): Model independent bounds under calibration constraints: a stochastic control approach.
Abstract: We develop a stochastic control approach for the derivation of model-independent bounds for derivatives under various calibration constraints. Unlike the previous literature, our formulation seeks the optimal no-arbitrage bounds given the knowledge of the distribution at some (or several) points in time. By convex duality techniques, this problem is converted into an optimal transportation problem along controlled stochastic dynamics. We also provide precise connections with the Azema-Yor solution of the Skorohod Embedding problem, and we obtain some extensions.
May 5, 12:30 pm, La Défense EEE (Cnit), room 101:
Suzanne Emmer (Dr., UBS, Zürich):
Abstract: After a short overview of currently used credit risk measures and the stress testing landscape we give an introduction to different stress testing methodologies which are applied to the wealth management and business banking portfolio.
Risk management specialists, practitioners and academics, have paid much attention to estimation of the likelihood of untoward events. Consideration of the fundamental nature of risk as an essentially human construct identifies many other aspects of risk management that deserve at least as much attention as the likelihood of untoward events.
April 7, 12:30 pm, La Défense EEE (Cnit), room 236: Jean-Paul Renne (Banque de France, Paris), Default, Liquidity and Crises: an econometric framework.
Abstract: In this paper, we present a general discrete-time affine framework aimed at jointly modeling yield curves associated with different debtors. The underlying fixed-income securities may differ in terms of credit quality and/or in terms of liquidity. The risk factors follow conditionally Gaussian processes, with drifts and variance-covariance matrices that are subject to regime shifts described by a Markov chain with (historical) non-homogeneous transition probabilities. While flexible, the model remains tractable. In particular, bond prices are given by quasi-explicit formulas. Various numerical examples are proposed, including a sector-contagion model and credit-rating modeling.
2nd Term: (there will be only a few meetings, since I will be working abroad; we will meet again on a regular basis in the 3rd term).
March 24, 12:30 pm, SCOR (La Défense), Auditorium (with the support of the SCOR Group): Hansjoerg Albrecher (Prof., HEC Lausanne, Switzerland), On refracted stochastic processes and the analysis of insurance risk.
Abstract: We show a somewhat surprising identity for first passage probabilities of spectrally-negative Levy processes that are
Feb. 3, 12pm-1pm, La Défense EEE (Cnit), room 104: Discussion by Laila Elbahtouri (SCOR, Actuary) on the paper by Albrecher et al., "Explicit ruin formulas with dependence among risks", Insurance: Mathematics and Economics 48 (2011) 265-270
1pm-2pm, La Défense EEE (Cnit), room 104: Doug Andrews (Senior Lecturer at Univ. of Southampton & Actuary (Fellow of the IA & Canadian IA, SOA)), Risk management considerations for the Canada Pension Plan: a case study.
Abstract: What are the risks that the Canada Pension Plan faces and how are they managed? This presentation begins with a review of the governance structure, the actuarial valuation method and the automatic balancing mechanism. It then examines particular assumptions, such as fertility, mortality, migration, productivity and investment. It explains the risks, discusses the methods used to derive the assumptions, and outlines some of the sensitivity tests used to quantify the risk.
Jan. 19, 11:15 am, Cergy ESSEC campus, room N305,
Olivier Wintenberger (MCF, Univ. Paris Dauphine), Limit laws for sums of weakly dependent data with infinite variances.
Abstract: Many econometric models take into account two stylized facts: the dependence structure of the time series (absence of correlations, dependence of the squares) and the high volatility (tails as power laws). If the dependence is not too strong, the asymptotic behavior of the sum is no longer distributed as a normal law but as another type of stable law. Thus, the dependence and the power-law marginals drive the limit law in a very intricate way. In a work done with K. Bartkiewicz, A. Jakubowski and T. Mikosch, we succeeded in giving a non-standard central limit theorem where we clearly determine, in the limiting stable law, what is due to the dependence and what is due to the marginal law. We apply our result to the sum of the stationary solution of the GARCH(1,1) model.
Jan. 12, 12:30pm, La Défense EEE, room 202: discussion on projects by groups
Dec. 16, 12:30 pm, La Défense EEE, room 236,
Véronique Maume-Deschamps (Prof., ISFA Lyon), Some multivariate risk indicators; estimation and application to reserve allocation.
Abstract: We consider some risk indicators of vectorial risk processes. These indicators are expected sums of penalties that each line of business would have to pay due to its temporary potential insolvency. The dependency between lines of business is taken into account. By using stochastic algorithms, we may estimate the minimum of these risk indicators under a fixed total capital constraint. This minimization may apply to reserve allocation.
Dec. 2, 11:15 am, La Défense EEE room 237,
Désiré Binam (Dr., Consultant Front-office), How the trading technology changes the market microstructure.
Abstract: The evolution of trading technologies has deeply modified the microstructure of today’s financial markets. We will see how the technology is changing the trading process and strategy and how players deal with new concepts such as Direct Market Access (DMA), Smart Order Routing (SOR), High Frequency Trading (HFT) and Algorithmic trading.
Nov. 25, 12:30pm, La Défense EEE, room 334,
Philippe Soulier (Prof., Univ. Paris Nanterre),
Abstract: We consider the problem of prediction in a time series when the conditioning event is extreme. The quantities of interest are the limiting conditional distribution of future events given that the past was extreme, and the normalizing functions needed to obtain non-degenerate limit laws. I will consider two classes of processes: GARCH-type and stochastic volatility processes. The main difference is the presence or absence of clustering of extremes.
Nov. 4, 12:30pm, La Défense EEE, room 237:
-Discussion on some projects.
-Invitation to the Finance Department Seminar on Nov. 8, 4:30 pm - ESSEC Campus Cergy Pontoise (room N 305) by Heitor Almeida (Univ. Illinois), Aggregate Risk and the Choice between Cash and Lines of Credit.
(see https://sites.google.com/a/essec.edu/seminaires-dept-finance/ for the abstract).
Oct. 14, 12:30pm, La Défense, room 236:
- Set up of the agenda and the topics/objectives for the trimester.
- Seminar by Cristina Butucea (Prof., Univ. Marne La Vallée), Various ways to summarize a curve with a number: estimation and applications.
Abstract: Curve estimation is very frequently studied in various areas. We shall discuss the pros and cons of parametric versus nonparametric estimation of curves. A good compromise in many examples is to recover less information about the underlying function to be estimated (summarize it) using the nonparametric techniques. We shall discuss several ways to do so and focus on excess mass estimation.
Calendar of the meetings in 2009-10:
April 9: 9am-5pm: IHP (Institut Henri Poincaré, 75005 Paris), workshop on "Financial Regulation"; organization: M. Bardos & M.Kratz for the BFA (Banque Finance Assurance) group, SFdS. (download the program; see the attachments).
April 1, 6pm, La Défense EEE, room 236: Laurent Ferrara (Banque de France et Univ. Paris Ouest), Assessing the recession risk anticipated by financial markets
March 19, Conference IDEI/Scor, "Integration of Extremal Events in Quantitative Risk Management": http://www.idei.fr/conference/conf_scor.html
March 4, 12:30pm, La Défense EEE, room 138: Marie Kratz (Prof., Essec), On alarm systems. Applications for insurance companies and for a health surveillance institute.
Feb. 18, 12:30pm, La Défense EEE, room 138: Marie Kratz (Prof., Essec), On alarm systems. Applications for insurance companies and for a health surveillance institute.
Feb. 4, 12:30pm, La Défense EEE , room 104: informal presentation by Guillaume Chevillon (Prof., Essec).
Jan. 28, 6pm, La Défense EEE, room 201: Sir David F. Hendry (Nuffield College, Oxford), Empirical Model Discovery.
Jan. 21, 12:30pm, La Défense EEE, room 104: Michel M. Dacorogna (Dr, Head of SCOR Group Financial Modeling), There will be a next crisis surprising us someday – how can we be prepared to survive it?
Dec. 17, 12pm, La Défense EEE, amphi 138: Arthur Charpentier (Prof., Univ. Rennes 1 & Ecole Polytechnique), Extremes and Dependence in the context of Solvency II for Insurance Companies.
Dec. 3, 8:30am: conference RISK (Risk Intelligence Symposium & Knowledge) organized by OTC Conseil & Univ. Paris Ouest; Caisse des dépôts et consignations – Paris.
Nov. 19, 10am, N305 (campus Cergy): informal presentation by Fernando Oliveira (Prof., Essec)
Nov.3, 11:30am, Le Club (campus Cergy): Fabrice Cavarretta (Prof., Essec), Distinguishing Extreme vs. Average Effects in Nascent Firms: an Organizational Risk approach to resources.
Oct. 29, 10am, N305 (campus Cergy): discussion on the objectives of this working group