represent multivariate probability distributions of time series in terms of each time series's dependence on the others. In general, it is computationally prohibitive to statistically infer an arbitrary model from data. However, if we constrain the model to have a tree topology, the corresponding learning algorithms become tractable. The expressive power of tree-structured distributions is low, since only n − 1 dependencies are explicitly encoded for an n-node tree. One way to improve the expressive power of tree models is to combine many of them in a mixture model. This work presents, and uses simulations to validate, extensions of the standard mixtures-of-trees model for i.i.d. data to the setting of time series data. We also consider the setting where the tree mixture itself forms a hidden Markov chain, which may be better suited to approximating time-varying seasonal data in the real world. Both of these models are evaluated on artificial data sets.
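As an illustration of the tractability claim above, the standard way to fit a tree-structured distribution is the Chow–Liu procedure: weight each pair of variables by their empirical mutual information and take a maximum-weight spanning tree, which by construction has exactly n − 1 edges. The sketch below is illustrative only and is not taken from this work; the toy data, function names, and use of Prim's algorithm are all assumptions.

```python
import itertools
import math
import random

def mutual_information(xs, ys):
    """Empirical mutual information between two discrete sequences."""
    n = len(xs)
    joint = {}
    for x, y in zip(xs, ys):
        joint[(x, y)] = joint.get((x, y), 0) + 1
    px = {v: xs.count(v) / n for v in set(xs)}
    py = {v: ys.count(v) / n for v in set(ys)}
    return sum(
        (c / n) * math.log((c / n) / (px[x] * py[y]))
        for (x, y), c in joint.items()
    )

def chow_liu_edges(data):
    """Maximum-weight spanning tree over pairwise MI (Prim's algorithm).

    data: one list per variable, all the same length.
    Returns n - 1 edges for n variables.
    """
    n = len(data)
    weights = {
        (i, j): mutual_information(data[i], data[j])
        for i, j in itertools.combinations(range(n), 2)
    }
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Pick the highest-MI edge crossing the cut.
        best = max(
            (e for e in weights if (e[0] in in_tree) != (e[1] in in_tree)),
            key=lambda e: weights[e],
        )
        edges.append(best)
        in_tree.update(best)
    return edges

random.seed(0)
# Toy data: 4 binary variables where X1 copies X0 and X3 copies X2.
x0 = [random.randint(0, 1) for _ in range(500)]
x2 = [random.randint(0, 1) for _ in range(500)]
data = [x0, x0[:], x2, x2[:]]
edges = chow_liu_edges(data)
print(len(edges))  # a tree over 4 variables has exactly 3 edges
```

A mixture of trees then combines several such trees (each with its own structure and mixture weight), recovering dependencies that no single spanning tree can express.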