Approximating multivariate Markov chains for bootstrapping through contiguous partitions / Cerqueti, Roy; Falbo, Paolo; Guastaroba, Gianfranco; Pelizzari, Cristian. - In: OR SPECTRUM. - ISSN 1436-6304. - 37:(2015), pp. 803-841.
Approximating multivariate Markov chains for bootstrapping through contiguous partitions
Cerqueti, Roy
2015
Abstract
This paper extends Markov chain bootstrapping to the case of multivariate continuous-valued stochastic processes. To this purpose, we follow the approach of searching for an optimal partition of the state space of an observed (multivariate) time series. The optimization problem is based on a distance indicator calculated on the transition probabilities of the Markov chain; this criterion aims at grouping states that exhibit similar transition probabilities. A second methodological contribution is the addition of a contiguity constraint, which forces states to be grouped only if they have “near” values in the state space. This requirement serves two important purposes: first, it allows a more intuitive interpretation of the results; second, it helps control the complexity of the problem, which explodes with the cardinality of the states. The computational complexity of the optimization problem is also addressed through the introduction of a novel Tabu Search algorithm, which improves both the quality of the solution found and the computing times with respect to a similar heuristic previously proposed in the literature. The bootstrap method is applied to two empirical cases: the bivariate process of prices and volumes of electricity in the Spanish market, and the trivariate process composed of prices and volumes of a US company stock (McDonald’s) and prices of the Dow Jones Industrial Average index. In addition, the method is compared with two other well-established bootstrap methods. The results show the good distributional properties of the present proposal, as well as a clear superiority in reproducing the dependence among the data.
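As a rough, self-contained illustration of the general idea behind Markov chain bootstrapping on a partitioned state space (not the authors' optimization or Tabu Search procedure), the Python sketch below discretizes a bivariate series with a quantile grid, estimates a first-order transition matrix, and resamples a bootstrap path. All function names, the grid partition, and the synthetic data are illustrative assumptions.

```python
# Minimal sketch, assuming a quantile-grid partition and a first-order chain;
# it illustrates Markov chain bootstrapping on a partitioned state space,
# not the paper's optimal-partition or Tabu Search procedure.
import numpy as np

rng = np.random.default_rng(0)

def quantile_partition(series, n_bins):
    """Assign each multivariate observation to a contiguous grid cell
    defined by per-component quantile cut points."""
    T, d = series.shape
    codes = np.zeros(T, dtype=int)
    for j in range(d):
        edges = np.quantile(series[:, j], np.linspace(0, 1, n_bins + 1)[1:-1])
        codes = codes * n_bins + np.searchsorted(edges, series[:, j])
    return codes  # integer state labels in {0, ..., n_bins**d - 1}

def transition_matrix(states, n_states):
    """Estimate first-order transition probabilities by relative frequencies;
    rows of unvisited states fall back to the uniform distribution."""
    P = np.zeros((n_states, n_states))
    for s, t in zip(states[:-1], states[1:]):
        P[s, t] += 1.0
    row_sums = P.sum(axis=1, keepdims=True)
    return np.divide(P, row_sums, out=np.full_like(P, 1.0 / n_states),
                     where=row_sums > 0)

def bootstrap_path(series, states, P, length):
    """Resample a path: draw the next state from P, then sample an observed
    data point belonging to that state."""
    by_state = {s: np.flatnonzero(states == s) for s in np.unique(states)}
    path, s = [], states[-1]
    for _ in range(length):
        s = rng.choice(len(P), p=P[s])
        idx = by_state.get(s)
        if idx is None or len(idx) == 0:      # unseen state: sample anywhere
            idx = np.arange(len(series))
        path.append(series[rng.choice(idx)])
    return np.asarray(path)

# Usage with synthetic bivariate data (stand-ins for prices and volumes)
X = rng.standard_normal((500, 2)).cumsum(axis=0)
codes = quantile_partition(X, n_bins=4)          # 16 contiguous grid cells
P = transition_matrix(codes, n_states=4 ** 2)
sample = bootstrap_path(X, codes, P, length=200)
print(sample.shape)                              # (200, 2)
```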
File | Type | License | Size | Format
---|---|---|---|---
OR-Spectrum.pdf (access restricted to archive administrators) | Publisher's version (published with the publisher's layout) | All rights reserved | 718.34 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.