Christophe Gouel and Nicolas Legrand, "Estimating the Competitive
Storage Model with Trending Commodity Prices", Journal of Applied
Econometrics, Vol. 32, No. 4, 2017, pp. 744-763.
All files are zipped in the file gl-files.zip, which contains the
following folders:
`Analysis`, `Article`, `Data`, `Figures`, `R`, `Results`
All text files are in Unix format.
This readme file documents the data and the programs needed to
reproduce all the results in the article. Figures are drawn in
[R](https://cran.r-project.org/) and estimations are carried out in
[MATLAB](http://www.mathworks.com/).
The results can be replicated by running, in order:
1. The R file `LaunchBeforeEstimations.R`.
2. The makefile in the Analysis folder.
3. The R file `LaunchAfterEstimations.R`.
4. The makefile in the Analysis/MCstudy folder (to replicate the Monte-Carlo
experiments in the online appendix).
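The four steps above can be sketched as a small shell wrapper. This is a hypothetical convenience script, not part of the replication package: the folder and file names come from this readme, but the `Rscript` invocations and the assumption that the launch scripts sit in the `R` folder and run from the package root may need adjusting.

```shell
#!/bin/sh
# Hypothetical replication wrapper; folder and file names are taken from
# this readme, the Rscript calls and relative paths are assumptions.
set -e

replicate() {
    # Pass "echo" as the first argument for a dry run that only prints
    # the commands; call with no argument to actually execute them.
    run=${1:-}
    $run Rscript R/LaunchBeforeEstimations.R   # 1. pre-estimation R files
    $run make -C Analysis                      # 2. estimations in MATLAB
    $run Rscript R/LaunchAfterEstimations.R    # 3. post-estimation R files
    $run make -C Analysis/MCstudy              # 4. Monte Carlo experiments
}
```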
DATA
====
The data are available in the Data folder. Source data are in the
Data_Downloads subfolder. Other data files are secondary files created
by the programs.
File organization:
- `CopperConsumption-1960-2013.csv`: copper consumption, extracted from
the pdf of The World Copper Factbook 2014 available from [the
International Copper Study Group website](http://www.icsg.org/). Not
used in the paper, but some statistics are calculated from it in the R
files.
- `EXCHANGEPOUND_1900-2011.csv`: pound/dollar exchange rate. Used in
conjunction with the HPIM index, which is denominated in pounds.
- `gycpi-2011-01.csv`: updated Grilli and Yang data from [Stephan
Pfaffenzeller's website](http://www.stephan-pfaffenzeller.com/cpi.html).
- `hpim-1650-2014.csv`: historical price index of manufactures from
Harvey et al. (2010), The Review of Economics and Statistics. Not used
in the paper, but used in the programs for robustness checks.
- `Production.csv`: Production data from various sources (FAOSTAT, British
Geological Survey, World Copper Factbook).
- `StockVar1000-1961-2011.csv`: Stock variations from FAOSTAT.
- `uscpi-1800-2012.csv`: US CPI from the [Minneapolis Fed](https://www.minneapolisfed.org/community/teaching-aids/cpi-calculator-information/consumer-price-index-1800).
DEPENDENCIES
============
The results were obtained on a Linux computer running Ubuntu
12.04.5 (64-bit), using the following software and libraries. With 2
quad-core Intel Xeon E5345 (2.33 GHz) processors and 32 GB of RAM, all
estimations take several days to complete, because of the very large
number of likelihood evaluations required by the particle swarm
algorithm to find a global maximum. With a simpler optimization
solver, a local maximum could be obtained in a few seconds or minutes.
MATLAB R2014a (64-bit)
----------------------
MATLAB programs depend on the following packages:
- [econometric-tools commit 89da288](https://github.com/christophe-gouel/econometric-tools),
licensed under Expat, present in the `Analysis` folder.
- [PSwarm v2.1](http://www.norg.uminho.pt/aivaz/pswarm/), licensed under GNU
LGPL, present in the `Analysis` folder.
R version 3.3.1 (64-bit)
------------------------
List of packages from `sessionInfo()`:
- Base packages: base, datasets, graphics, grDevices, methods, splines, stats,
utils
- Other packages: chron 2.3-47, cmrutils 1.3, Formula 1.2-1, ggplot2
2.1.0, Hmisc 3.17-4, lattice 0.20-33, plyr 1.8.3, reshape2 1.4.1,
R.matlab 3.5.1, rms 4.5-0, SparseM 1.7, survival 2.39-4, xtable 1.8-2
- Loaded via a namespace (and not attached): acepack 1.3-3.3, cluster
2.0.4, codetools 0.2-14, colorspace 1.2-6, compiler 3.3.1, data.table
1.9.6, digest 0.6.9, foreign 0.8-66, grid 3.3.1, gridExtra 2.2.1,
gtable 0.2.0, labeling 0.3, latticeExtra 0.6-28, magrittr 1.5, MASS
7.3-45, Matrix 1.2-6, MatrixModels 0.4-1, multcomp 1.4-5, munsell
0.4.3, mvtnorm 1.0-5, nlme 3.1-128, nnet 7.3-12, polspline 1.1.12,
quantreg 5.24, RColorBrewer 1.1-2, Rcpp 0.12.4, R.methodsS3 1.7.1,
R.oo 1.20.0, rpart 4.1-10, R.utils 2.3.0, sandwich 2.3-4, scales
0.4.0, stringi 1.0-1, stringr 1.0.0, TH.data 1.0-7, tools 3.3.1, zoo
1.7-13
TABLE OF CONTENTS
=================
Analysis
--------
Folder that includes the main program files (in MATLAB):
- `AssessAutocor.m` Calculates the first two autocorrelation
coefficients for different parameterizations of the storage model (the
output is then processed by `FiguresAutocorr.R` to create **figure 2**).
- `Autocor.m` Computes the sample autocorrelations of a matrix of
time series x.
- `CompDataFeatures.m` Processes the various estimation outputs to
compute the percentiles of the distribution of the predicted data
features on all consecutive subsamples of length T and for all trend
specifications.
- `EstimationXXXXTrend.m` Script running the estimation of the storage
model with the XXXX trend.
- `ExportResults.m` Exports the tables of estimation results to LaTeX.
- `HPExamples.m` Applies `HPFilter.m` to the raw data (the data are
then processed in the R file of the same name to create **figure 1**)
and computes the descriptive statistics (**table A3**).
- `HPFilter.m` Applies a standard HP filter using a sparse matrix.
- `InitLogLikPb.m` Initializes the problem to be maximized.
- `invprctile.m` Calculates the inverse percentiles of a sample.
- `Makefile` Launches everything.
- `Legendre.m` Creates an orthogonal Legendre polynomial.
- `LogLikelihood.m` Calculates the log-likelihood for given price
observations and given parameters.
- `LogLikVectorized.m` Vectorizes the evaluation of the
log-likelihood, allowing parallel evaluation.
- `OverallInitialization.m` Initializes all the general parameters for the
estimations.
- `SimulStorage.m` Simulates the storage model once it has been solved.
- `SolveStorageDL.m` Solves the storage model with the Deaton and
Laroque approach.
- `startup.m` Adds folders to the path and hides some warnings.
- `TransPolyResults.m` Performs the AIC and LR tests and exports the
model results to LaTeX for models with polynomial trends (not in the
paper).
- `TransRCSResults.m` Performs the AIC and LR tests and exports the
model results to LaTeX (**tables 1-6, A4, and A6-A8**).
- `truncated_normal_rule.m` Discretizes the truncated normal
distribution of the supply shock.
- `UnconditionalProbaility.m` Calculates the probability of the first
observation for the simulated likelihood estimator.
- `WheatPrice.m` Prepares the price function of wheat without trend to be
plotted in R by `WheatPrice.R`.
### MCstudy
Folder that includes all the functions needed for the Monte-Carlo experiments.
- `FinalRes.m` Builds the final tables and exports the TeX files
(**tables A1-A2**).
- `InitMonteCarlo.m` Initialization of the MC experiments.
- `Makefile` Launches everything.
- `Simul_CML.m` MC experiments for the conditional ML.
- `Simul_UML.m` MC experiments for the unconditional ML.
- `startup.m` Adds folders to the path and hides some warnings.
R
---
- `BSrcspline.R` creates the restricted cubic basis spline matrices
for MATLAB.
- `DataPreparation.R` prepares the data for the estimations.
- `FiguresAutocorr.R` plots the first-order correlation (**figure 2**).
- `functions.R` defines some functions to save the figures.
- `HPExamples.R` plots example prices with their HP trends (**figure 1**).
- `LaunchAfterEstimations.R` launches all R files required after
estimations.
- `LaunchBeforeEstimations.R` launches all R files required before
estimations.
- `PlotPrices.R` plots all prices with their respective trends
(**figure A2**).
- `Production.R` calculates the coefficient of variation of production
(**table A5**).
- `WheatPrice.R` plots the price function for the model estimated
without trend for wheat (**figure A1**).
Please address any questions to:
Christophe Gouel, christophe.gouel [AT] grignon.inra.fr