The following is a review of the Market Risk Measurement and Management principles designed to address the
learning objectives set forth by GARP®. Cross-reference to GARP assigned reading—Dowd, Chapter 4.
READING 2
NON-PARAMETRIC APPROACHES
Study Session 1
EXAM FOCUS
This reading introduces nonparametric estimation and bootstrapping (i.e., resampling).
The key difference between these approaches and parametric approaches discussed in
the previous reading is that with nonparametric approaches the underlying distribution
is not specified, and the analysis is data driven rather than assumption driven. For example,
historical simulation is limited by the discreteness of the data, but nonparametric
analysis “smooths” the data points to allow for any VaR confidence level between
observations. For the exam, pay close attention to the description of the bootstrap
historical simulation approach as well as the various weighted historical simulation
approaches.
MODULE 2.1: NONPARAMETRIC APPROACHES
Nonparametric estimation does not make restrictive assumptions about the underlying
distribution like parametric methods, which assume very specific forms such as normal
or lognormal distributions. Nonparametric estimation lets the data drive the
estimation. The flexibility of these methods makes them excellent candidates for VaR
estimation, especially if tail events are sparse.
Bootstrap Historical Simulation Approach
LO 2.a: Apply the bootstrap historical simulation approach to estimate coherent
risk measures.
The bootstrap historical simulation is a simple and intuitive estimation procedure. In
essence, the bootstrap technique draws a sample from the original data set, records the
VaR from that particular sample, and “returns” the data. This procedure is repeated many
times, recording a sample VaR on each iteration. Since the data is always “returned” to the
data set, this procedure is akin to sampling with replacement. The best VaR estimate
from the full data set is the average of all sample VaRs.
This same procedure can be performed to estimate the expected shortfall (ES). An ES is
computed for each drawn sample by slicing the tail region into n slices and
averaging the VaRs at the n − 1 intermediate quantiles. This is exactly the same procedure
described in the previous reading. Similarly, the best estimate of the expected shortfall
for the original data set is the average of all of the sample expected shortfalls.
Empirical analysis demonstrates that the bootstrapping technique consistently
provides more precise estimates of coherent risk measures than historical simulation
on raw data alone.
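To make the procedure concrete, the following is a minimal Python sketch (not from the assigned reading) that bootstraps VaR and ES from a vector of daily returns. For simplicity, the ES of each draw is computed as the average loss beyond that draw’s VaR rather than by slicing the tail into discrete quantiles; the confidence level, number of draws, and simulated data are illustrative assumptions.

```python
import numpy as np

def bootstrap_var_es(returns, confidence=0.95, n_draws=1000, seed=42):
    """Bootstrap VaR and ES: resample with replacement, record each sample's
    risk measures, and average across all samples."""
    rng = np.random.default_rng(seed)
    losses = -np.asarray(returns)                 # work with losses (positive = loss)
    sample_vars, sample_es = [], []
    for _ in range(n_draws):
        # Draw a sample of the same size as the original data, with replacement.
        resample = rng.choice(losses, size=losses.size, replace=True)
        var = np.quantile(resample, confidence)   # sample VaR at the chosen confidence level
        tail = resample[resample >= var]          # losses at or beyond the sample VaR
        sample_vars.append(var)
        sample_es.append(tail.mean())             # sample ES = average tail loss
    # The best estimates for the full data set are the averages of the sample measures.
    return np.mean(sample_vars), np.mean(sample_es)

# Illustration on 1,000 simulated daily returns.
returns = np.random.default_rng(0).normal(0.0005, 0.01, 1000)
var_95, es_95 = bootstrap_var_es(returns)
print(f"Bootstrap 95% VaR: {var_95:.4f}, 95% ES: {es_95:.4f}")
```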
Applying Nonparametric Estimation
LO 2.b: Describe historical simulation using non-parametric density estimation.
The clear advantage of the traditional historical simulation approach is its simplicity.
One obvious drawback, however, is that the discreteness of the data does not allow for
estimation of VaRs between data points. If there were 100 historical observations, then
it is straightforward to estimate VaR at the 95% or the 96% confidence levels, and so
on. However, this method is unable to incorporate a confidence level of 95.5%, for
example. More generally, with n observations, the historical simulation method only
allows for n different confidence levels.
One of the advantages of nonparametric density estimation is that the underlying
distribution is free from restrictive assumptions. Therefore, the existing data points can
be used to construct a “smoothed” distribution that allows for VaR calculation at any confidence level.
The simplest adjustment is to connect the midpoints between successive histogram
bars in the original data set’s distribution. See Figure 2.1 for an illustration of this
surrogate density function. Notice that by connecting the midpoints, the lower bar
“receives” area from the upper bar, which “loses” an equal amount of area. In total, no
area is lost, only displaced, so we still have a valid probability density function, just with
a modified shape. The shaded area in Figure 2.1 represents a possible confidence
interval, which can be utilized regardless of the size of the data set. The major
improvement of this nonparametric approach over the traditional historical simulation
approach is that VaR can now be calculated for a continuum of points in the data set.
Figure 2.1: Surrogate Density Function
Following this logic, one can see that the linear adjustment is a simple solution to the
interval problem. A more complicated adjustment would involve connecting curves,
rather than lines, between successive bars to better capture the characteristics of the
data.
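The midpoint-connecting idea can be implemented by assigning each ordered loss the cumulative probability at the center of its histogram “bar” and interpolating linearly between those points. The Python sketch below is one possible (assumed) implementation; the 95.5% level mirrors the example in the text.

```python
import numpy as np

def interpolated_var(returns, confidence):
    """VaR at an arbitrary confidence level via linear interpolation between
    the midpoints of the ordered losses (a simple surrogate density)."""
    losses = np.sort(-np.asarray(returns))        # ordered losses, smallest to largest
    n = losses.size
    # Each ordered loss is treated as the midpoint of its own probability "bar".
    midpoints = (np.arange(1, n + 1) - 0.5) / n
    # Linear interpolation allows confidence levels between observations (e.g., 95.5%).
    return np.interp(confidence, midpoints, losses)

returns = np.random.default_rng(0).normal(0.0, 0.01, 100)
print(interpolated_var(returns, 0.955))           # unattainable with plain historical simulation
```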
Weighted Historical Simulation Approaches
LO 2.c: Compare and contrast the age-weighted, the volatility-weighted, the
correlation-weighted, and the filtered historical simulation approaches.
The historical simulation approach discussed in Reading 1 uses the current and past n
observations (an arbitrary number) up to a specified cutoff point when computing the
current period VaR. Older observations beyond the cutoff date are assumed to have a
zero weight, and the relevant n observations each have an equal weight of (1 / n). While
simple in construction, there are obvious problems with this method.
Namely, why is the nth observation as important as all other observations, but the (n +
1)th observation is so unimportant that it carries no weight? Current VaR may have
“ghost effects” of previous events that remain in the computation until they disappear
(after n periods). Furthermore, this method assumes that each observation is
independent and identically distributed. This is a very strong assumption, which is
likely violated by data with clear seasonality (i.e., seasonal volatility). This reading
identifies four improvements to the traditional historical simulation method.
Age-Weighted Historical Simulation
The obvious adjustment to the equal-weighted assumption used in historical simulation
is to weight recent observations more and distant observations less. One method
proposed by Boudoukh, Richardson, and Whitelaw is as follows.1 Assume w(1) is the
probability weight for the observation that is one day old. Then w(2) can be defined as
λw(1), w(3) can be defined as λ^2 w(1), and so on. The decay parameter, λ, can take on
values 0 ≤ λ ≤ 1, where values close to 1 indicate slow decay. Since all of the weights
must sum to 1, we conclude that w(1) = (1 − λ) / (1 − λ^n). More generally, the weight for
an observation that is i days old is equal to:

w(i) = λ^(i−1) × (1 − λ) / (1 − λ^n)
The implication of the age-weighted simulation is to reduce the impact of ghost effects
and older events that may not reoccur. Note that this more general weighting scheme
suggests that historical simulation is a special case where λ = 1 (i.e., no decay) over the
estimation window.
PROFESSOR’S NOTE
This approach is also known as the hybrid approach.
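As a rough illustration of the weighting scheme (not part of the assigned reading), the sketch below computes an age-weighted VaR in Python. It assumes the return series is ordered most recent first, and the decay parameter of 0.98 is illustrative only; as λ approaches 1, the weights approach the equal 1/n weights of traditional historical simulation.

```python
import numpy as np

def age_weighted_var(returns, confidence=0.95, decay=0.98):
    """Age-weighted (hybrid) historical simulation VaR.
    Assumes returns[0] is the most recent observation.
    Weight of an observation i days old: w(i) = decay**(i-1) * (1 - decay) / (1 - decay**n).
    """
    losses = -np.asarray(returns)
    n = losses.size
    ages = np.arange(1, n + 1)                                     # 1 = most recent
    weights = decay ** (ages - 1) * (1 - decay) / (1 - decay ** n)
    # Sort losses from smallest to largest, carrying their weights along.
    order = np.argsort(losses)
    sorted_losses, sorted_weights = losses[order], weights[order]
    cum_weights = np.cumsum(sorted_weights)
    # VaR is the loss at which the cumulative weight first reaches the confidence level.
    idx = np.searchsorted(cum_weights, confidence)
    return sorted_losses[min(idx, n - 1)]

returns = np.random.default_rng(0).normal(0.0, 0.01, 500)          # most recent first (assumed)
print(age_weighted_var(returns, confidence=0.95, decay=0.98))
```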
Volatility-Weighted Historical Simulation
Another approach is to weight the individual observations by volatility rather than
proximity to the current date. This was introduced by Hull and White to incorporate
changing volatility in risk estimation.2 The intuition is that if recent volatility has
increased, then using historical data will underestimate the current risk level. Similarly,
if current volatility has markedly declined, the impact of older data from periods of
higher volatility will overstate the current risk level.
This process is captured in the following expression, which is used when estimating VaR
on day T. Each daily return, r_{t,i}, for asset i on day t is adjusted upward or downward
based on the then-current volatility forecast, σ_{t,i} (estimated from a GARCH or EWMA
model), relative to the current volatility forecast for day T, σ_{T,i}:

r*_{t,i} = (σ_{T,i} / σ_{t,i}) × r_{t,i}

Thus, the historical return is replaced with a larger (smaller) volatility-adjusted return,
r*_{t,i}, if current volatility exceeds (is below) the volatility forecast for day t. Now, VaR, ES, and any
other coherent risk measure can be calculated in the usual way after substituting
historical returns with volatility-adjusted returns.
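As an illustration only (the reading does not prescribe a particular implementation), the Python sketch below uses an EWMA filter as a stand-in for the GARCH model to produce day-by-day volatility forecasts and then applies the adjustment above. The λ of 0.94 and the use of the most recent forecast as the “current” volatility are assumptions.

```python
import numpy as np

def ewma_volatility(returns, lam=0.94):
    """EWMA (RiskMetrics-style) volatility forecast for each day in the sample."""
    returns = np.asarray(returns, dtype=float)
    var = np.empty_like(returns)
    var[0] = returns.var()                        # initialize with the full-sample variance
    for t in range(1, returns.size):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(var)

def volatility_weighted_returns(returns, lam=0.94):
    """Hull-White adjustment: r*_t = (sigma_T / sigma_t) * r_t, where sigma_T is
    taken here to be the most recent volatility forecast."""
    sigma = ewma_volatility(returns, lam)
    return np.asarray(returns, dtype=float) * (sigma[-1] / sigma)

# The adjusted series is then fed into the usual historical simulation VaR/ES calculation.
```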
There are several advantages of the volatility-weighted method. First, it explicitly
incorporates volatility into the estimation procedure in contrast to other historical
methods. Second, the near-term VaR estimates are likely to be more sensible in light of
current market conditions. Third, the volatility-adjusted returns allow for VaR
estimates that can exceed the largest loss in the historical data set.
Correlation-Weighted Historical Simulation
As the name suggests, this methodology incorporates updated correlations between
asset pairs. This procedure is more complicated than the volatility-weighting approach,
but it follows the same basic principles. Since the corresponding LO does not require
calculations, the exact matrix algebra would only complicate our discussion. Intuitively,
the historical correlation (or equivalently variance-covariance) matrix needs to be
adjusted to the new information environment. This is accomplished, loosely speaking,
by “multiplying” the historic returns by the revised correlation matrix to yield updated
correlation-adjusted returns.
Let us look at the variance-covariance matrix more closely. In particular, we are
concerned with diagonal elements and the off-diagonal elements. The off-diagonal
elements represent the current covariance between asset pairs. On the other hand, the
diagonal elements represent the updated variances (covariance of the asset return with
itself) of the individual assets.
Notice that updated variances were utilized in the previous approach as well. Thus,
correlation-weighted simulation is an even richer analytical tool than volatility-
weighted simulation because it allows for updated variances (volatilities) as well as
covariances (correlations).
Filtered Historical Simulation
The filtered historical simulation is the most comprehensive, and hence most
complicated, of the nonparametric estimators. The process combines the historical
simulation model with conditional volatility models (like GARCH or asymmetric
GARCH). Thus, the method combines the attractions of the traditional historical
simulation approach with the sophistication of models that incorporate changing
volatility. In simplified terms, the model is flexible enough to capture conditional
volatility and volatility clustering as well as a surprise factor that could have an
asymmetric effect on volatility.
The model forecasts volatility for each day in the sample period, and the realized returns
are standardized by dividing by the corresponding volatility forecasts. Bootstrapping these
standardized returns, rescaled by the current volatility forecast, is then used to simulate
returns that incorporate the current volatility level. Finally, the VaR is identified from
the simulated distribution. The methodology can be extended over longer holding
periods or for multi-asset portfolios.
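A minimal sketch of this logic for a one-day VaR is shown below. It is an assumption-laden simplification: an EWMA filter stands in for the GARCH (or asymmetric GARCH) model, and the parameter values and simulated data are illustrative only.

```python
import numpy as np

def filtered_hs_var(returns, confidence=0.95, lam=0.94, n_sims=10000, seed=1):
    """Filtered historical simulation for a one-day VaR."""
    rng = np.random.default_rng(seed)
    returns = np.asarray(returns, dtype=float)
    # 1. Volatility forecast for each day in the sample (EWMA used in place of GARCH).
    var = np.empty_like(returns)
    var[0] = returns.var()
    for t in range(1, returns.size):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    sigma = np.sqrt(var)
    # 2. Standardize realized returns by their own volatility forecasts.
    std_returns = returns / sigma
    # 3. Forecast of the current (next-day) volatility.
    sigma_next = np.sqrt(lam * var[-1] + (1 - lam) * returns[-1] ** 2)
    # 4. Bootstrap the standardized returns and rescale to the current volatility level.
    sims = sigma_next * rng.choice(std_returns, size=n_sims, replace=True)
    # 5. Read VaR off the simulated loss distribution.
    return np.quantile(-sims, confidence)

returns = np.random.default_rng(0).normal(0.0, 0.01, 750)
print(filtered_hs_var(returns, confidence=0.99))
```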
In sum, the filtered historical simulation method uses bootstrapping and combines the
traditional historical simulation approach with rich volatility modeling. The results are
then sensitive to changing market conditions and can predict losses outside the
historical range. From a computational standpoint, this method is very reasonable even
for large portfolios, and empirical evidence supports its predictive ability.
Advantages and Disadvantages of Nonparametric Methods
LO 2.d: Identify advantages and disadvantages of non-parametric estimation
methods.
Any risk manager should be prepared to use nonparametric estimation techniques.
There are some clear advantages to nonparametric methods, but there is some danger
as well. Therefore, it is incumbent on the risk manager to understand the advantages, the
disadvantages, and the appropriateness of the methodology for the analysis at hand.
Advantages of nonparametric methods include the following:
Intuitive and often computationally simple (even on a spreadsheet).
Not hindered by parametric violations of skewness, fat tails, et cetera.
Avoids complex variance-covariance matrices and dimension problems.
Data is often readily available and does not require adjustments (e.g., financial
statement adjustments).
Can accommodate more complex analysis (e.g., by incorporating age-weighting with
volatility-weighting).
Disadvantages of nonparametric methods include the following:
Analysis depends critically on historical data.
Volatile data periods lead to VaR and ES estimates that are too high.
Quiet data periods lead to VaR and ES estimates that are too low.
Difficult to detect structural shifts/regime changes in the data.
Cannot accommodate plausible large impact events if they did not occur within the
sample period.
Difficult to estimate losses significantly larger than the maximum loss within the data
set (historical simulation cannot; volatility-weighting can, to some degree).
Need sufficient data, which may not be possible for new instruments or markets.
MODULE QUIZ 2.1
1. Johanna Roberto has collected a data set of 1,000 daily observations on equity returns. She is
concerned about the appropriateness of using parametric techniques as the data appears skewed.
Ultimately, she decides to use historical simulation and bootstrapping to estimate the 5% VaR.
Which of the following steps is most likely to be part of the estimation procedure?
A. Filter the data to remove the obvious outliers.
B. Repeated sampling with replacement.
C. Identify the tail region from reordering the original data.
D. Apply a weighting procedure to reduce the impact of older data.
2. All of the following approaches improve the traditional historical simulation approach for
estimating VaR except the:
A. volatility-weighted historical simulation.
B. age-weighted historical simulation.
C. market-weighted historical simulation.
D. correlation-weighted historical simulation.
3. Which of the following statements about age-weighting is most accurate?
A. The age-weighting procedure incorporates estimates from GARCH models.
B. If the decay factor in the model is close to 1, there is persistence within the data set.
C. When using this approach, the weight assigned on day i is equal to w(i) = λ^(i−1) × (1 − λ) / (1 − λ^i).
D. The number of observations should at least exceed 250.
4. Which of the following statements about volatility-weighting is true?
A. Historic returns are adjusted, and the VaR calculation is more complicated.
B. Historic returns are adjusted, and the VaR calculation procedure is the same.
C. Current period returns are adjusted, and the VaR calculation is more complicated.
D. Current period returns are adjusted, and the VaR calculation is the same.
5. All of the following items are generally considered advantages of nonparametric estimation
methods except:
A. ability to accommodate skewed data.
B. availability of data.
C. use of historical data.
D. little or no reliance on covariance matrices.
KEY CONCEPTS
LO 2.a
Bootstrapping involves repeatedly sampling from the original data set with replacement.
Each draw (subsample) yields a coherent risk measure (VaR or ES). The average of the
risk measures across all samples is then the best estimate.
LO 2.b
The discreteness of historical data reduces the number of possible VaR estimates since
historical simulation cannot adjust for significance levels between ordered
observations. However, nonparametric density estimation allows the original histogram
to be modified to fill in these gaps. The process connects the midpoints between
successive columns in the histogram. The area is then “removed” from the upper bar
and “placed” in the lower bar, which creates a “smooth” function between the original
data points.
LO 2.c
One important limitation to the historical simulation method is the equal weight
assumed for all data in the estimation period, and zero weight otherwise. This arbitrary
methodology can be improved by using age-weighted simulation, volatility-weighted
simulation, correlation-weighted simulation, and filtered historical simulation.
The age-weighted simulation method adjusts the most recent (distant) observations to
be more (less) heavily weighted.
The volatility-weighting procedure incorporates the possibility that volatility may
change over the estimation period, which may understate or overstate current risk by
including stale data. The procedure replaces historic returns with volatility-adjusted
returns; however, the actual procedure of estimating VaR is unchanged (i.e., only the
data inputs change).
Correlation-weighted simulation updates the variance-covariance matrix between the
assets in the portfolio. The off-diagonal elements represent the covariance pairs while
the diagonal elements update the individual variance estimates. Therefore, the
correlation-weighted methodology is more general than the volatility-weighting
procedure by incorporating both variance and covariance adjustments.
Filtered historical simulation is the most complex estimation method. The procedure
relies on bootstrapping of standardized returns based on volatility forecasts. The
volatility forecasts arise from GARCH or similar models and are able to capture
conditional volatility, volatility clustering, and/or asymmetry.
LO 2.d
Advantages of nonparametric models include: data can be skewed or have fat tails; they
are conceptually straightforward; there is readily available data; and they can
accommodate more complex analysis. Disadvantages focus mainly on the use of
historical data, which limits the VaR forecast to (approximately) the maximum loss in
the data set; they are slow to respond to changing market conditions; they are affected
by volatile (quiet) data periods; and they cannot accommodate plausible large losses if
not in the data set.
ANSWER KEY FOR MODULE QUIZ
Module Quiz 2.1
1. B Bootstrapping from historical simulation involves repeated sampling with
replacement. The 5% VaR is recorded from each sample draw. The average of the
VaRs from all the draws is the VaR estimate. The bootstrapping procedure does
not involve filtering the data or weighting observations. Note that the VaR from
the original data set is not used in the analysis. (LO 2.a)
2. C Market-weighted historical simulation is not discussed in this reading. Age-
weighted historical simulation weights observations higher when they appear
closer to the event date. Volatility-weighted historical simulation adjusts for
changing volatility levels in the data. Correlation-weighted historical simulation
incorporates anticipated changes in correlation between assets in the portfolio.
(LO 2.c)
3. B If the intensity parameter (i.e., decay factor) is close to 1, there will be persistence
(i.e., slow decay) in the estimate. The expression for the weight on day i has i in the
exponent when it should be n. While a large sample size is generally preferred,
some of the data may no longer be representative in a large sample. (LO 2.c)
4. B The volatility-weighting method adjusts historic returns for current volatility.
Specifically, return at time t is multiplied by (current volatility estimate /
volatility estimate at time t). However, the actual procedure for calculating VaR
using a historical simulation method is unchanged; it is only the inputted data that
changes. (LO 2.c)
5. C The use of historical data in nonparametric analysis is a disadvantage, not an
advantage. If the estimation period was quiet (volatile) then the estimated risk
measures may understate (overstate) the current risk level. Generally, the largest
VaR cannot exceed the largest loss in the historical period. On the other hand, the
remaining choices are all considered advantages of nonparametric methods. For
instance, the nonparametric nature of the analysis can accommodate skewed data,
data points are readily available, and there is no requirement for estimates of
covariance matrices. (LO 2.d)
1 Boudoukh, J., M. Richardson, and R. Whitelaw. 1998. “The best of both worlds: a hybrid approach to calculating
value at risk.” Risk 11: 64–67.
2 Hull, J., and A. White. 1998. “Incorporating volatility updating into the historical simulation method for value-at-
risk.” Journal of Risk 1: 5–19.