As the persistence parameter under EWMA is lowered, which of the following would be true:
The persistence parameter, λ, is the coefficient of the prior day's variance in EWMA calculations. A higher value of the persistence parameter tends to 'persist' the prior value of variance for longer. Consider an extreme example: if the persistence parameter is equal to 1, the variance under EWMA will never change in response to returns.
1 - λ is the coefficient of recent market returns. As λ is lowered, 1 - λ increases, giving a greater weight to recent market returns or shocks. Therefore, as λ is lowered, the model reacts faster to market shocks and gives higher weights to recent returns, while at the same time reducing the weight on prior variance, which will then persist for a shorter period.
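The update rule above can be sketched in a few lines. This is a minimal illustration with hypothetical numbers, not a production calculation; the variance and return values are chosen only to show the effect of lowering λ.

```python
# EWMA variance update: sigma^2_t = lam * sigma^2_{t-1} + (1 - lam) * r^2_{t-1}
def ewma_update(prev_var, latest_return, lam):
    """One-step EWMA variance update with persistence parameter lam."""
    return lam * prev_var + (1 - lam) * latest_return ** 2

# A lower lambda reacts faster to the same shock (hypothetical inputs):
prev_var = 0.0001   # prior daily variance (1% daily volatility)
shock = 0.03        # a 3% return on the latest day

high_persist = ewma_update(prev_var, shock, lam=0.94)  # RiskMetrics-style lambda
low_persist = ewma_update(prev_var, shock, lam=0.80)   # lower persistence

# The lower-lambda variance jumps more in response to the shock
print(high_persist, low_persist)
```

Running this, the λ = 0.80 variance rises well above the λ = 0.94 variance after the same 3% shock, which is exactly the faster reaction described above.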
The 99% 10-day VaR for a bank is $200mm. The average VaR for the past 60 days is $250mm, and the bank specific regulatory multiplier is 3. What is the bank's basic VaR based market risk capital charge?
The current Basel rules for the basic VaR based charge for market risk capital set market risk capital requirements as the maximum of the following two amounts:
1. 99%/10-day VaR,
2. Regulatory Multiplier x Average 99%/10-day VaR of the past 60 days
The 'regulatory multiplier' is a number between 3 and 4 (inclusive) calculated based on the number of 1% VaR exceedances in the previous 250 days, as determined by backtesting.
- If the number of exceedances is <= 4, then the regulatory multiplier is 3.
- If the number of exceedances is between 5 and 9, then the multiplier = 3 + 0.2*(N-4), where N is the number of exceedances.
- If the number of exceedances is >=10, then the multiplier is 4.
So you can see that in most normal situations the risk capital requirement will be dictated by the multiplier and the prior 60-day average VaR, because the product of these two will almost always be greater than the current 99% VaR.
The correct answer is therefore max($200mm, 3 × $250mm) = $750mm.
Interestingly, also note that a 99% VaR should statistically be exceeded on about 1% × 250 = 2.5 days out of 250, which means that even if the bank's VaR model is performing exactly as it should, the bank will still end up with a regulatory multiplier of 3.
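The multiplier schedule and the capital charge rule above can be put together in a short sketch. The function names are ours, and the figures are the question's own (in $mm); this follows the exceedance bands listed above.

```python
# Regulatory multiplier from the number of 1% VaR exceedances in 250 days
def regulatory_multiplier(n_exceedances):
    if n_exceedances <= 4:
        return 3.0
    if n_exceedances <= 9:
        return 3.0 + 0.2 * (n_exceedances - 4)
    return 4.0

# Basic VaR-based charge: max of current VaR and multiplier * 60-day average VaR
def capital_charge(current_var, avg_var_60d, n_exceedances):
    return max(current_var, regulatory_multiplier(n_exceedances) * avg_var_60d)

# The question's numbers: current VaR 200, 60-day average 250, multiplier 3
print(capital_charge(200, 250, n_exceedances=2))  # 750.0
```

With, say, 6 exceedances the multiplier would rise to 3 + 0.2 × 2 = 3.4 and the charge to 3.4 × $250mm = $850mm, showing how backtesting failures feed directly into the capital requirement.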
Which of the following are valid approaches for extreme value analysis given a dataset:
1. The Block Maxima approach
2. Least squares approach
3. Maximum likelihood approach
4. Peak-over-thresholds approach
For EVT, we use the block maxima or the peaks-over-threshold methods. These provide us the data points to be fitted: block maxima are fitted to a generalized extreme value (GEV) distribution, while peaks-over-threshold exceedances are fitted to a generalized Pareto distribution (GPD).
Least squares and maximum likelihood are methods that are used for curve fitting, and they have a variety of applications across risk management.
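The two data-selection approaches named above can be illustrated in a few lines. The sample returns, block size, and threshold here are hypothetical, chosen only to show how each method picks out its extreme observations.

```python
# Hypothetical daily returns; we analyze losses (positive = loss)
returns = [-0.01, 0.04, -0.06, 0.02, -0.08, 0.01, -0.03, 0.05,
           -0.02, -0.07, 0.03, -0.05]
losses = [-r for r in returns]

# Block maxima: take the largest loss within each fixed-size block
block_size = 4
block_maxima = [max(losses[i:i + block_size])
                for i in range(0, len(losses), block_size)]

# Peaks-over-threshold: take losses exceeding a chosen threshold
threshold = 0.04
pot_excesses = [loss - threshold for loss in losses if loss > threshold]

print(block_maxima)   # one maximum per block -> fit to a GEV distribution
print(pot_excesses)   # excesses over the threshold -> fit to a GPD
```

Note that POT typically uses more of the available extreme observations than block maxima, since several exceedances can fall in one block while only that block's single maximum would be kept.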
Financial institutions need to take volatility clustering into account:
1. To avoid taking on an undesirable level of risk
2. To know the right level of capital they need to hold
3. To meet regulatory requirements
4. To account for mean reversion in returns
Volatility clustering leads to levels of current volatility that can be significantly different from long-run averages. When volatility is running high, institutions need to shed risk; when it is running low, they can afford to increase returns by taking on more risk for a given amount of capital. An institution's response to changes in volatility can be to adjust risk, or capital, or both. Accounting for volatility clustering helps institutions manage their risk and capital, and therefore statements 1 and 2 are correct.
Regulatory requirements do not require volatility clustering to be taken into account (at least not yet). Therefore statement 3 is not correct, and neither is statement 4, which is completely unrelated to volatility clustering.