Nonparametric identification and maximum likelihood estimation for hidden Markov models
Hajo Holzmann (University of Marburg)
Wednesday 12th October, 2016 15:00-16:00 Maths 203
Finite-state hidden Markov models (HMMs), also called Markov-dependent finite mixtures, form a popular and widely used model class for serially dependent observations with unobserved heterogeneity.
In this talk, nonparametric identification and maximum likelihood estimation for finite-state HMMs are investigated. We obtain identification of the parameters as well as of the order of the Markov chain if the transition probability matrix has full rank and is ergodic, and if the state-dependent distributions are all distinct, but not necessarily linearly independent. Based on this identification result, we develop nonparametric maximum likelihood estimation theory. First, we show that the asymptotic contrast, the Kullback--Leibler divergence of the hidden Markov model, also identifies the true parameter vector nonparametrically. Second, for classes of state-dependent densities which are arbitrary mixtures of a parametric family, we show consistency of the nonparametric maximum likelihood estimator. Here, identification of the mixing distributions need not be assumed.
Consequences for a posteriori clustering when using HMMs with state-dependent finite mixtures are also discussed.
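As background for the talk, the likelihood of a finite-state HMM, which the maximum likelihood estimator maximizes, can be evaluated with the scaled forward algorithm. The following is a minimal illustrative sketch for Gaussian state-dependent densities; it is not taken from the talk, and all function and parameter names are our own.

```python
import numpy as np

def normal_pdf(x, mean, sd):
    """Density of N(mean, sd^2), evaluated elementwise."""
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def hmm_loglik(x, Gamma, delta, means, sds):
    """Log-likelihood of a finite-state Gaussian HMM via the scaled forward algorithm.

    x     : 1-d array of observations
    Gamma : (K, K) transition probability matrix of the hidden Markov chain
    delta : (K,) initial state distribution
    means, sds : state-dependent normal parameters, one per state
    """
    K = len(delta)
    # state-dependent densities evaluated at each observation, shape (K, n)
    dens = np.array([normal_pdf(x, means[k], sds[k]) for k in range(K)])
    phi = delta * dens[:, 0]          # forward probabilities at time 1
    c = phi.sum()
    loglik = np.log(c)
    phi /= c                          # rescale to avoid numerical underflow
    for t in range(1, len(x)):
        phi = (phi @ Gamma) * dens[:, t]
        c = phi.sum()
        loglik += np.log(c)
        phi /= c
    return loglik
```

A simple sanity check: if all state-dependent densities coincide, the hidden states carry no information and the HMM log-likelihood reduces to the i.i.d. log-likelihood under that single density.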