**Outline**
**Hidden Markov Model in TensorFlow**
The HMM is a sequence model. A sequence model or sequence classifier is a model whose job is to assign a label or class to each unit in a sequence, thus mapping a sequence of observations to a sequence of labels. An HMM is a probabilistic sequence model: given a sequence of units (words, letters, morphemes, sentences, and so on), it computes a probability distribution over possible sequences of labels and chooses the best label sequence.
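As a concrete illustration of "choosing the best label sequence", here is a minimal NumPy sketch of Viterbi decoding, the dynamic-programming step an HMM uses for that choice. The toy transition and emission matrices are made up for illustration and are not taken from the linked repository.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for an observation sequence.

    obs : sequence of observation indices, length T
    pi  : (N,) initial state probabilities
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities, B[i, k] = P(obs k | state i)
    """
    N, T = len(pi), len(obs)
    delta = np.zeros((T, N))           # best path probability ending in each state
    psi = np.zeros((T, N), dtype=int)  # back-pointers

    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A   # (N, N): previous state x next state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]

    # Follow back-pointers from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

# Toy example: two hidden states, three observation symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2], pi, A, B))
```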
Sequence labeling tasks come up throughout speech and language processing, a fact that isn’t too surprising if we consider that language consists of sequences at many representational levels [1].
The code can be found at:
https://github.com/MarvinBertin/HiddenMarkovModel_TensorFlow
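The repository builds the HMM recursions directly out of TensorFlow ops. As a flavor of what that looks like (this is a hedged sketch using TensorFlow 2 eager ops, not the repository's actual API), here is the forward algorithm in log space, which computes the probability an HMM assigns to an observation sequence:

```python
import tensorflow as tf

def hmm_log_likelihood(obs, log_pi, log_A, log_B):
    """Log P(observations) via the forward algorithm in log space."""
    alpha = log_pi + tf.gather(log_B, obs[0], axis=1)        # (N,)
    for o in obs[1:]:
        # alpha[j] = logsumexp_i(alpha[i] + log_A[i, j]) + log_B[j, o]
        alpha = (tf.reduce_logsumexp(alpha[:, None] + log_A, axis=0)
                 + tf.gather(log_B, o, axis=1))
    return tf.reduce_logsumexp(alpha)

# Same toy parameters as above, now as TensorFlow constants in log space.
log_pi = tf.math.log(tf.constant([0.6, 0.4]))
log_A = tf.math.log(tf.constant([[0.7, 0.3],
                                 [0.4, 0.6]]))
log_B = tf.math.log(tf.constant([[0.5, 0.4, 0.1],
                                 [0.1, 0.3, 0.6]]))
print(hmm_log_likelihood([0, 1, 2], log_pi, log_A, log_B))
```

Working in log space avoids the numerical underflow that plain probability products run into on longer sequences.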
An example: The MusArt Music-Retrieval System [2]