Extreme value theory (EVT) is a statistical discipline for describing and understanding quantifiable rare events. It is especially well suited to describing the heavy tails of win and loss distributions.
Members of the Approximity team contributed to the following publications:
Advanced Extremal Models for Operational Risk
Managing risk lies at the heart of the financial services industry. Regulatory frameworks, such as Basel II for banking and Solvency 2 for insurance, mandate a focus on operational risk. In this paper we discuss some of the more recent Extreme Value Theory (EVT) methodology which may be useful for the statistical analysis of certain types of operational loss data. The key attraction of EVT is that it offers a set of ready-made approaches to the most difficult problem of operational risk analysis: how can risks that are both extreme and rare be modelled appropriately?
Extreme Value Theory can save your neck (whitepaper)
How high should an embankment be built so that the sea reaches its top only once in 100 years? How large might a possible stock market crash be tomorrow?
Many real-life questions require estimates of this kind, but since little or no data has been observed - extreme events are, by definition, rare - essential estimates are more often based on feeling than on fact. Extreme Value Theory (EVT) is the branch of statistics that deals with such rare situations and offers a scientific alternative to pure guesswork.
Summary: If you look at fat tails, consider using EVT, as EVT is too expensive to ignore. EVT, like any other model, is only an abstraction of reality and not a silver bullet: no science can replace experience, domain knowledge and human intuition, as our work in risk management for the finance and corporate industries has shown us again and again. Online HTML version
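The 100-year embankment question above is a return-level problem. A minimal sketch of how EVT answers it, assuming the classical block-maxima approach: fit a Generalized Extreme Value (GEV) distribution to annual maxima and read off its (1 - 1/T) quantile as the T-year return level. The data below is simulated for illustration, not taken from any of the papers:

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(42)
# Simulated 60 years of annual maximum sea levels (metres) -- illustrative only.
annual_maxima = gumbel_r.rvs(loc=4.0, scale=0.5, size=60, random_state=rng)

# Fit a Generalized Extreme Value distribution by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV:
# the level exceeded on average once every T years.
T = 100
return_level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
print(f"Estimated {T}-year return level: {return_level:.2f} m")
```

Note that the fitted quantile typically lies beyond the largest observation - precisely the extrapolation that EVT justifies and that pure empirical quantiles cannot provide.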
Generalized Additive Modelling for Sample Extremes
Aon Re Europe Science Team Meeting, EURANDOM, Eindhoven University of Technology, September 18-19, 2003, Statistical Issues in Actuarial Risk
In this talk, we describe smooth non-stationary generalized additive modelling for sample extremes, in which spline smoothers are incorporated into models for exceedances over high thresholds. Fitting is by maximum penalized likelihood estimation, with uncertainty assessed by differences of deviances and by bootstrap simulation. The approach is illustrated on simulated data in an insurance context.
Generalized Additive Models for Sample Extremes.
Submitted to the Journal of the Royal Statistical Society, Series C.
See the description of the above slides given at the Aon Re Europe Science Team Meeting.
Estimating Value-at-Risk for financial time series: an approach combining self-exciting processes and extreme value theory
RISK DAY 2002, Mini-Conference on Risk Management in Finance and Insurance, RiskLab, ETH and University of Zürich
We consider the modelling of rare events in financial time series, and introduce a marked point process model for the excesses of the time series over a high threshold that combines a self-exciting process for the exceedances with a mark (size) dependent process. This allows realistic models for rare events, in which recent events affect the current intensity for exceedances more than distant ones, but it also allows the intensity to depend on the marks of the events. Estimates of Value-at-Risk are derived for real datasets and backtested. Longer paper about the method
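As a rough illustration of the self-exciting idea, the conditional intensity of exceedances can be written as a baseline rate plus exponentially decaying contributions from past events, with larger marks (excess sizes) contributing more. The kernel below - exponential decay with a linear mark effect - is a common Hawkes-type choice used here as an assumption for illustration, not the paper's exact specification, and all parameter values are made up:

```python
import numpy as np

def conditional_intensity(t, event_times, marks, mu=0.5, alpha=0.3, beta=2.0, delta=0.1):
    """Hawkes-type conditional intensity for threshold exceedances at time t.

    Recent exceedances raise the intensity more than distant ones
    (exponential decay at rate beta), and larger marks (excess sizes)
    contribute more (linear boost delta * mark), on top of baseline mu.
    """
    past = event_times < t
    dt = t - event_times[past]
    return mu + np.sum((alpha + delta * marks[past]) * np.exp(-beta * dt))

# Illustrative exceedance times and excess sizes (marks) over a high threshold.
times = np.array([0.5, 1.2, 1.3, 3.0])
marks = np.array([0.8, 2.5, 0.4, 1.1])

print(conditional_intensity(1.35, times, marks))  # shortly after a cluster: elevated
print(conditional_intensity(2.9, times, marks))   # after a quiet spell: close to mu
```

The clustering of financial extremes (volatility clustering) is exactly what such an intensity captures: a large loss today makes another exceedance tomorrow more likely than the unconditional rate suggests.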
A statistical analysis of the share price of the SAIR group (1996-2001) from a risk manager's point of view.
Derivatives Use, Trading & Regulation, Volume 8, 2/2002
Over recent years, Extreme Value Theory (EVT) has been used to statistically analyse financial data showing clearly non-normal behaviour. Several examples from market, credit and operational risk have been discussed. In the present paper we look at the particular case of Swissair and quantify, using EVT, the extremal behaviour of its returns. For this, we go beyond traditional EVT and introduce new methodology such as smoothing and more advanced maximum likelihood techniques.
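The basic Peaks-Over-Threshold calculation behind such an analysis can be sketched as follows: fit a Generalized Pareto distribution (GPD) to losses above a high threshold and invert the standard tail estimator to obtain a high quantile (Value-at-Risk). The returns below are simulated heavy-tailed data, not the Swissair series, and the 95% threshold choice is an assumption:

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

rng = np.random.default_rng(1)
# Simulated daily returns with heavy tails -- stand-in data, not real prices.
returns = student_t.rvs(df=3, size=2000, random_state=rng) * 0.01
losses = -returns

# Peaks-over-threshold: keep losses above a high empirical quantile.
u = np.quantile(losses, 0.95)
excesses = losses[losses > u] - u

# Fit a GPD to the excesses (location fixed at 0, since these are excesses over u).
xi, _, sigma = genpareto.fit(excesses, floc=0)

# Standard POT tail estimator for Value-at-Risk at confidence level p:
#   VaR_p = u + sigma/xi * (((n / n_u) * (1 - p))**(-xi) - 1)
n, n_u, p = len(losses), len(excesses), 0.99
var_99 = u + sigma / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(f"99% VaR estimate: {var_99:.4f}")
```

A positive fitted shape parameter xi signals a heavy (Pareto-type) tail, which is exactly the non-normal behaviour the paper refers to; under normality xi would be zero.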
Between Data Science and Applied Data Analysis; Proceedings of the 26th Annual Conference of the Gesellschaft für Klassifikation e.V.; Springer-Verlag, 2003, pp. 387-394.
In recent years there have been a number of developments in the datamining techniques used to analyse the terabyte-sized logfiles produced by Internet-based applications. The information these datamining techniques provide allows knowledge engineers to direct business decisions rapidly. Current datamining methods, however, are generally efficient only when the information in the logfiles is close to the average. This means that when non-standard logfiles (extreme data) are studied, these methods give unrealistic and erroneous results. Non-standard logfiles often have a large bearing on the analysis of web applications, and the information they provide can impact new or even well-established services. In this paper aspects of recent Extreme Value Theory methodology are discussed, with particular emphasis on its application: a unique toolkit is provided with which to describe, understand and predict the non-standard fluctuations discovered in real-life Internet-sourced log data.
Smooth Extremal Models in Finance.
To appear in Journal of Risk and Insurance.
Extreme Value Theory (EVT) has developed very rapidly over the past two decades, both methodologically and with respect to applications. Whereas (non-life) actuaries have, at least implicitly, used EVT techniques for a long time, EVT has more recently entered the finance stage, mainly through the emergence of quantitative Risk Management, as a useful toolkit for describing non-standard (more precisely, non-normal) price fluctuations. The Peaks Over Threshold (POT) method is particularly flexible: it considers exceedances over a threshold U. Since each exceedance is associated with a specific event, it is possible to let the scale and shape parameters depend on covariates. For instance, insurance losses can be of different types (so-called lines); credit loss data will typically be a function of credit scores, business type, exogenous economic variables, time and other information. Operational losses will typically belong to various subclasses (fraud, system failures, back-office errors, ...) and their occurrence no doubt shows a non-constant (often stochastic) intensity, possibly depending on factors such as business cycles, transaction intensity, etc. Large losses may become more or less frequent over time, or indeed more or less severe. It is also well known that, in general, insurance and financial losses show cyclic behaviour. In this paper we discuss some of the more recent EVT methodology which may be useful in handling the presence of such covariates and the resulting modelling of extremal events.
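A minimal sketch of the covariate idea: let the GPD scale depend on a covariate through a log-link and fit everything by joint maximum likelihood. The two "lines of business" below and all parameter values are invented for illustration, and scipy's generic optimizer stands in for more sophisticated fitting:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genpareto

rng = np.random.default_rng(7)

# Excess losses over a threshold for two hypothetical lines of business:
# same tail index, but line B is twice as volatile (different GPD scale).
xi_true, scale_a, scale_b = 0.25, 1.0, 2.0
y_a = genpareto.rvs(xi_true, scale=scale_a, size=400, random_state=rng)
y_b = genpareto.rvs(xi_true, scale=scale_b, size=400, random_state=rng)
y = np.concatenate([y_a, y_b])
line_b = np.r_[np.zeros(400), np.ones(400)]   # covariate: 0 = line A, 1 = line B

def neg_loglik(params):
    xi, b0, b1 = params
    sigma = np.exp(b0 + b1 * line_b)          # log-link keeps the scale positive
    z = 1 + xi * y / sigma
    if np.any(z <= 0):
        return np.inf                          # outside the GPD support
    return np.sum(np.log(sigma) + (1 / xi + 1) * np.log(z))

res = minimize(neg_loglik, x0=[0.1, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 4000, "fatol": 1e-8, "xatol": 1e-8})
xi_hat = res.x[0]
s_a_hat, s_b_hat = np.exp(res.x[1]), np.exp(res.x[1] + res.x[2])
print(f"xi={xi_hat:.2f}, scale A={s_a_hat:.2f}, scale B={s_b_hat:.2f}")
```

The same log-link trick extends directly to continuous covariates (time, credit score, transaction intensity) by adding columns to the linear predictor; replacing the linear terms with spline smoothers leads to the generalized additive models discussed above.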