and the tail of the loss distribution. Overall, simplicity and transparency have
been favored over mathematical complexity, which has the benefit of increasing
model stability. The model also traces each contribution: every loss event and
every scenario has an identifiable impact on the resulting capital number. This
remarkable traceability makes the model better accepted by the business and
easier to act on.
[Figure 8.1 Model structure of an AMA bank. The diagram combines internal loss data, external loss data (ORX), BEICF and RCSA (frequency assessment of control failures) across the body and the tail of the distribution, with scenarios subject to a cut-off based on a frequency threshold; the output is VaR at 99.9% = regulatory capital. Source: Chapelle, A., University College London, 2015.]
Types of Models
Over the years, four types of models have evolved: stochastic, scenario-based, hybrid
and factor-based. Unfortunately, the last one is still in its infancy.
Stochastic Models
These models are part of the loss distribution approach (LDA); they are purely
quantitative and based on past losses. Statistical techniques are used to
extrapolate the future distribution of losses to the 99.9th percentile. LDA is the
most widespread approach and is now increasingly mixed with scenario-based data.
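As a purely illustrative sketch of that extrapolation step (the loss data here are simulated and the lognormal choice is an assumption, not a prescription):

```python
# Sketch: fit a severity distribution to past losses and read off an
# extreme quantile. Illustrative only; a full stochastic model combines
# frequency and severity, as described later in this chapter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical internal loss amounts (simulated for the example)
losses = rng.lognormal(mean=10, sigma=2, size=500)

# Maximum-likelihood lognormal fit, location pinned at zero
shape, loc, scale = stats.lognorm.fit(losses, floc=0)

# Extrapolate to the 99.9th percentile of severity
p999 = stats.lognorm.ppf(0.999, shape, loc, scale)
print(f"99.9% severity quantile: {p999:,.0f}")
```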
Scenario-based Models
Models using scenario assessment were described in the
previous chapter and are more qualitative. They are usually found in firms where the
lack of internal loss data prevents the use of stochastic modeling. Scenario-based
models tend to be more common in Europe and in the insurance industry, where
collecting loss data is not as established as in banking.
Hybrid Models
Hybrid models are currently the most common type of approach, and the most in
line with regulatory expectations and the four required inputs of an AMA model.
Hybrid models use a combination of data distributions from past incidents and
prospective losses from scenarios to derive operational loss distributions at a
99.9% confidence level. In essence, hybrid models are capital calculations based
on incident data, with adjustments to account for scenarios and for risk
management quality.
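To make the combination concrete, here is a minimal sketch of one naive way to blend the two sources; the scenario amounts are invented, and actual hybrid models weight and integrate scenarios far more rigorously than a simple concatenation:

```python
# Sketch: append scenario-assessed severe losses to internal loss data
# before fitting the severity distribution, so that the fitted tail
# reflects prospective events as well as past incidents.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
internal_losses = rng.lognormal(mean=10, sigma=1.8, size=400)

# Hypothetical scenario assessments: severe-but-plausible loss amounts
scenario_losses = np.array([5e6, 2e7, 8e7])

combined = np.concatenate([internal_losses, scenario_losses])
shape, loc, scale = stats.lognorm.fit(combined, floc=0)
print("99.9% severity quantile with scenarios:",
      round(stats.lognorm.ppf(0.999, shape, loc, scale)))
```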
Factor Models
Factor models explain the behavior of a variable from the values taken by its
various influencing factors. They are common in equity pricing: stock market
returns are predicted based on factors such as the risk of the asset (Capital Asset
Pricing Model), the size of the firm and its level of financial distress (Fama–French),
and the momentum of the price movement, upward or downward (Carhart). In 2000,
during the early days of operational risk modeling, factor modeling was one of the
possible paths for operational risk models: determining operational risk losses from
their explanatory variables (size of the firm, economic context, internal controls,
governance, culture, remuneration structures, etc.). A few academic studies were
published, but the nascent trend was quickly overtaken by the LDA and the swathes
of past loss data that were thrown at the issue. Additionally, the scarcity of data
related to internal control, environment, governance and culture, which are less
straightforward to capture, increased the difficulty of calibrating factor-based
models. Personally, I regret this turn of events. The current overall lack of stability
and confidence in operational risk models is certainly an argument for a fundamental
revision of operational risk modeling methods.
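As a rough sketch of the generic technique (the factors and coefficients below are invented for illustration; no published operational risk factor model is implied):

```python
# Sketch: a linear factor model, log_losses = alpha + b1*size
# + b2*controls + noise, estimated by ordinary least squares.
import numpy as np

rng = np.random.default_rng(2)
n = 200
size_factor = rng.normal(0.0, 1.0, n)    # hypothetical firm-size factor
control_index = rng.normal(0.0, 1.0, n)  # hypothetical control-quality index

# Simulated log of annual operational losses driven by the two factors
log_losses = (1.0 + 0.8 * size_factor - 0.6 * control_index
              + rng.normal(0, 0.5, n))

X = np.column_stack([np.ones(n), size_factor, control_index])
betas, *_ = np.linalg.lstsq(X, log_losses, rcond=None)
print("estimated [alpha, b_size, b_controls]:", betas.round(2))
```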
Loss Distribution Approach in a Nutshell
Modeling activities require a lot of observation data points. Models are a simplified,
theoretical representation of reality, built on repeated observations to derive stable
patterns and common laws governing the data observed.
When regulators decided to require banks to hold capital for operational risk, data
on incidents were barely collected – except for external fraud – and so were particularly
scarce. To increase the number of data points at their disposal to fit statistical
distributions, modelers called upon an old actuarial technique: the LDA. The principle
is to decompose risk events into two of their components: how often they occur
(frequency) and how much they cost (severity). Decomposing a single event into its
frequency and severity doubles the number of observation points available for
modeling. Modelers at the French bank Crédit Lyonnais published a seminal paper²
on applying LDA to operational risk and the approach has now become common practice.
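A minimal Monte Carlo sketch of this frequency/severity recombination follows; the Poisson and lognormal parameters are assumptions chosen for illustration, not calibrated values:

```python
# Sketch of the LDA: draw an annual number of losses (frequency), draw
# that many loss amounts (severity), sum them into an annual loss, and
# repeat; the 99.9% quantile of the simulated annual losses plays the
# role of the regulatory capital figure.
import numpy as np

rng = np.random.default_rng(7)
n_years = 100_000
lam = 25              # assumed average number of loss events per year
mu, sigma = 9.0, 2.0  # assumed lognormal severity parameters

counts = rng.poisson(lam, size=n_years)
annual_losses = np.array(
    [rng.lognormal(mu, sigma, size=k).sum() for k in counts])

var_999 = np.quantile(annual_losses, 0.999)
print(f"Simulated 99.9% annual loss quantile: {var_999:,.0f}")
```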
² Frachot, A., Georges, P. and Roncalli, T. (2001) "Loss Distribution Approach for
Operational Risk," Groupe de Recherche Opérationnelle, Crédit Lyonnais, France,
Working Paper.