In statistics, a hidden Markov random field is a generalization of a hidden Markov model. Instead of having an underlying Markov chain, hidden Markov random fields have an underlying Markov random field.

Suppose that we observe a random variable Y_i, where i ∈ S. Hidden Markov random fields assume that the probabilistic nature of Y_i is determined by the unobservable Markov random field X_i, i ∈ S. That is, given its neighbors N_i, X_i is independent of all other X_j (Markov property). The main difference from a hidden Markov model is that the neighborhood is not defined in one dimension but within a network, i.e. X_i is allowed to have more than the two neighbors that it would have in a Markov chain. The model is formulated in such a way that, given the field {X_i}, the observations Y_i are independent (conditional independence of the observable variables given the Markov random field).
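The sketch below, which is not taken from the article, illustrates how these assumptions are commonly put to work: hidden labels X on a 2-D grid follow a Potts-style Markov random field prior, each observation Y_i is Gaussian given X_i alone (so the joint density factorizes as P(x, y) = P(x) ∏_{i∈S} P(y_i | x_i)), and the labeling is estimated with iterated conditional modes (ICM), one common approximate MAP method for such models. The function name icm_hmrf and the parameters mus, sigma, and beta are illustrative assumptions, not part of the article.

    # Minimal sketch of a hidden Markov random field on a 2-D grid.
    # Assumptions (not from the article): Potts-style prior on the hidden
    # labels, shared-variance Gaussian emissions, ICM label updates.

    import numpy as np

    def icm_hmrf(y, mus, sigma=1.0, beta=1.0, n_iter=10):
        """Approximate the MAP hidden labeling of a grid HMRF with ICM.

        y     : (H, W) array of observations
        mus   : (K,) emission means, one per hidden state
        sigma : emission standard deviation (shared by all states)
        beta  : Potts coupling; larger values favour smoother labelings
        """
        H, W = y.shape
        K = len(mus)
        # Initialise each label with the state whose mean best explains y_i alone.
        x = np.argmin((y[..., None] - mus) ** 2, axis=-1)

        for _ in range(n_iter):
            for i in range(H):
                for j in range(W):
                    # 4-neighbourhood of site (i, j), clipped to the grid.
                    nbrs = [x[a, b]
                            for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                            if 0 <= a < H and 0 <= b < W]
                    # Local energy: Gaussian data term plus Potts disagreement penalty.
                    data = (y[i, j] - mus) ** 2 / (2 * sigma ** 2)
                    potts = beta * np.array([sum(k != n for n in nbrs) for k in range(K)])
                    x[i, j] = int(np.argmin(data + potts))
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        true_x = np.zeros((32, 32), dtype=int)
        true_x[:, 16:] = 1                      # two-region ground truth
        mus = np.array([0.0, 3.0])
        y = mus[true_x] + rng.normal(scale=1.0, size=true_x.shape)
        x_hat = icm_hmrf(y, mus, sigma=1.0, beta=1.5)
        print("label agreement:", (x_hat == true_x).mean())

On the synthetic two-region example in the demo block, the smoothing induced by the Potts term typically recovers most of the true labeling despite the noisy observations; this is the same mechanism that makes HMRFs useful for image segmentation.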

In the vast majority of the related literature, the number of possible latent states is treated as a user-defined constant. However, ideas from nonparametric Bayesian statistics, which allow the number of states to be inferred from the data, have also recently been investigated with success, e.g. in the infinite hidden Markov random field model.[1]

References

  1. Sotirios P. Chatzis, Gabriel Tsechpenakis, “The Infinite Hidden Markov Random Field Model,” IEEE Transactions on Neural Networks, vol. 21, no. 6, pp. 1004–1014, June 2010.
  • Yongyue Zhang, Stephen Smith, Michael Brady, “Hidden Markov Random Field Model and Segmentation of Brain MR Images,” FMRIB Technical Report TR00YZ1, Oxford Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), 11 May 2000.