Replication files for "Time Varying Extremes", Ulrich Mueller and Mark Watson, September 2024.

************************ Data ******************* 

(1) The raw data files are in the folders \Data\'Application' where 'Application' = WeatherDamages, Returns, CitySize, FirmSize.
(2) The data are processed using Matlab files in the Matlab folder as follows:

(a) Weather Damages:
	Monthly capital stock is computed in Weather_Damages_Capital_Stock_Smooth.m
	Un-normalized monthly damages: Weather_Damages_Events_Data_Monthly.m
	Normalized damages: Weather_Damages_Events_Data_Monthly_Normalized.m
	
(b) Returns
  GARCH (1,1) model and standardized daily returns: Returns_Garch_Daily_Returns.m
  Find Biannual extremes: 
         Returns_Process_Data_Find_Extremes.m
  Write Returns to Excel Files for use in Stan: 
         Returns_Process_Data_Excel_Files.m
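For reference, the GARCH(1,1) standardization step in Returns_Garch_Daily_Returns.m amounts to dividing each return by its estimated conditional volatility. A minimal illustrative sketch in Python (not the authors' Matlab code; the parameter values below are placeholders, whereas the paper estimates them):

```python
import numpy as np

def garch11_standardize(r, omega, alpha, beta):
    """Standardize returns by GARCH(1,1) conditional volatility:
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    r = np.asarray(r, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = np.var(r)  # initialize the recursion at the sample variance
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return r / np.sqrt(sigma2)  # standardized returns

# Example with placeholder parameters (omega, alpha, beta are assumptions here)
rng = np.random.default_rng(0)
z = garch11_standardize(rng.standard_normal(1000), omega=0.05, alpha=0.05, beta=0.90)
print(z.shape)
```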
  
(c) City Size
    Process Data: 
         CitySize_Data.m
 
(d) Firm Size
  Process Data: 
         FirmSize_Data_process.m
  Write Data to Excel File for Stan: 
         FirmSize_Data_Write_Excel_for_Stan.m
  
*********************** Likelihood-Based Results Reported in Tables 1 and 2 *****************

These are computed in:
(a) WeatherDamages_Likelihood_Results.m  (Run twice, for normalized and un-normalized data)
(b) Returns_Likelihood_Results.m (Run twice, for largest and smallest returns)
(c) CitySize_Likelihood_Results_MultipleK.m (This carries out the analysis for several values of k. The paper reports results for k = 30.)
(d) FirmSize_Likelihood_Results_MultipleK.m (This carries out the analysis for several values of k. The paper reports results for k = 30.)

Critical values (used to produce 'adjusted test statistics') used for these tests are discussed below.


*************************** Bayes Analysis  **********************
The Stan directory contains Stan programs for the 'static' analysis, for the model with 4 random walks (alpha, xi, sigma, mu) discussed in the paper, and for various 'Model_1' and 'Model_2' models used in the Bayes factor calculations. These are in the 'Stan_Code' subdirectory.
Also, for each application, there is a subdirectory with R programs that compute the Stan HMC draws. These use the cmdstanr package.

The programs are:
(a) Static_Model.r: This is the Bayes analysis of the static (non-TVP) model. Results for this model are not reported in the paper.
(b) 4RW_Model.r: This is the TVP model with four RW parameters as described in the paper.
(c) 4RW_v_others.r:  This runs 'Model_1' and 'Model_2' for the constrained models examined in the Bayes factor calculations.

Bayes Factors Reported in Table 3:
These are computed in 
WeatherDamages_Bayes_Factor_Calculation.m
Returns_Bayes_Factor_Calculation.m
CitySize_Bayes_Factor_Calculation.m
and
FirmSize_Bayes_Factor_Calculation.m


******************************** Figures ****************************
Figure 1: WeatherDamages_Plot.m
Figure 2: Returns_Plot.m
Figure 3: CitySize_FirmSize_Plot.m
Figure 4: Plot_Paths_All_Illustrains.m (This reads in the HMC draws from 4RW_Model.r and produces the plot. It can also be modified to examine other features of the posterior.)
Figure 5: WeatherDamages_Plot_Appendix.m
Figure 6: Returns_Plot_Appendix.m


***************************** Critical Values / Adjustments for test statistics *********************

The critical value adjustment factors discussed in Sections 2 and 3 and Appendix B are computed using the Fortran program in the Fortran folder. The programs are described in the readme.txt file in that folder. The coefficients for the associated critical value/adjustment factors are saved in CSV files. These are available in the Matlab\CVS_Files folder. They have the following format:

**** Inference Using k largest observations:

Consider four tests:
 Test 0: The Nyblom L test for stability
 Test 2: The LR test for q_90 = q_90_0 (used for confidence intervals in Table 1)
 Test 3: The LR test for mu = sigma/xi
 Test 4: The LR test for mu = sigma/xi and xi = 1. 

For each test, the critical value adjustment depends on k and T. We produced adjustment factors for our applications: The SP500 Returns applications use k=1 and T = 194. The City and Firm applications use k = 30 and T = 4; we also used other values of k (and mention these results in the text), and the folder therefore contains files for T = 4 and several values of k.

For each (test,k,T) we computed the values of (a0,a1,a2) as described in the text and Appendix B. Results are in the CSV files labelled cvx_test?_k_T.csv, where ? refers to the test, so ? = (0,2,3,4), and the values of k and T are as described above.
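The cvx_test?_k_T.csv naming convention can be sketched as follows (an illustrative Python sketch; the single-row layout assumed when parsing the coefficients is an assumption, not a documented fact about the files):

```python
import csv
import io

def cv_filename(test, k, T):
    """Name of the coefficient file for a given (test, k, T), following
    the cvx_test?_k_T.csv pattern described above."""
    return f"cvx_test{test}_{k}_{T}.csv"

# e.g. Test 3, k = 1, T = 194 (the SP500 returns configuration)
name = cv_filename(3, 1, 194)
print(name)

# Parsing (a0, a1, a2): a stand-in string replaces the real file here,
# and a single row of three numbers is assumed.
sample = io.StringIO("0.1,0.2,0.3\n")
a0, a1, a2 = (float(x) for x in next(csv.reader(sample)))
```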

Also there is
  Test 1: The LR test for xi = xi_0 (used for confidence intervals in Table 1)

Here xi is known under the null, so we need a critical value for each value of xi_0 considered. These are computed and saved in cvs_test1_k_T.csv, where the first row shows the values of xi_0 and the second row shows the corresponding critical values. The critical values are a smooth function of xi_0, and the results shown in the table interpolate between the grid points contained in the file.
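The interpolation step for Test 1 can be sketched as follows (illustrative Python; the grid values and critical values below are made-up placeholders standing in for the two rows of the CSV file):

```python
import numpy as np

# Placeholder stand-ins for the file's contents:
# first row = values of xi_0, second row = corresponding critical values.
xi0_grid = np.array([-0.5, 0.0, 0.5, 1.0])
cv_grid = np.array([3.1, 3.4, 3.9, 4.6])

def cv_at(xi0):
    """Linearly interpolate the critical value between grid points."""
    return np.interp(xi0, xi0_grid, cv_grid)

print(cv_at(0.25))
```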


******* Inference Using Exceedances:

We consider the same five tests described above. As described in Section 2.5, for each of these tests we use a critical value that depends on xi_hat and three parameters (a0, a1, a2). The resulting adjustments depend on T (where T = 522 in the weather damage application) and on the value of tau(t). With the un-normalized weather data, tau(t) = tau is time invariant; with the normalized weather data, it is time varying. Thus we produce the files

cvx_test?_tau_522.csv and cvx_test?_tvt_522.csv where ? = (0,1,2,3,4) for the time-invariant tau ('tau') and time varying tau ('tvt') cases. 

******* Matlab Template for computing the critical value function
The Matlab script Compute_a0a1a2_CVFunction.m carries out the steps in Appendix B for computing the coefficients a0, a1, and a2 for Test 3, for inference using the largest k observations with k = 1 and T = 194. (Results from this differ slightly from those shown in cvx_test3_1_194.csv because of different random numbers and different optimization algorithms.)
