Hidden Markov Model Example

authors:
Jacob Schreiber [jmschreiber91@gmail.com]
Nicholas Farn [nicholasfarn@gmail.com]

A simple example highlighting how to build a model from states, add transitions between them, run the standard algorithms, and then train the model on a sequence.


In [1]:
import random
from pomegranate import *

random.seed(0)

First we will create the two states of the model, one with a uniform distribution and one with a normal distribution.


In [2]:
state1 = State( UniformDistribution(0.0, 1.0), name="uniform" )
state2 = State( NormalDistribution(0, 2), name="normal" )
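
Each state wraps a distribution object, so the distributions can be queried directly. A quick sanity check (a hedged sketch; `log_probability` is the standard evaluation method on pomegranate distributions):


In [ ]:
print( state1.distribution.log_probability(0.5) )   # log(1.0) = 0.0 under Uniform(0, 1)
print( state2.distribution.log_probability(0.5) )   # log density under Normal(0, 2)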

We will then create the model by instantiating a HiddenMarkovModel, and add the states to it.


In [3]:
model = HiddenMarkovModel( name="ExampleModel" )
model.add_state( state1 )
model.add_state( state2 )

Now we'll add the transitions from the model's start state.


In [4]:
model.add_transition( model.start, state1, 0.5 )
model.add_transition( model.start, state2, 0.5 )

Next, the transitions between the two states.


In [5]:
model.add_transition( state1, state1, 0.4 )
model.add_transition( state1, state2, 0.4 )
model.add_transition( state2, state2, 0.4 )
model.add_transition( state2, state1, 0.4 )

Finally, the transitions into the model's end state. Each state's outgoing probabilities now sum to 1 (0.4 + 0.4 + 0.2).


In [6]:
model.add_transition( state1, model.end, 0.2 )
model.add_transition( state2, model.end, 0.2 )

To finalize the model, we "bake" it.


In [7]:
model.bake()
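
Baking finalizes the internal ordering of the states and normalizes the edges. A small sketch to inspect the result (hedged; `states`, `start_index`, and `end_index` are the same attributes used later in this notebook):


In [ ]:
print( [ state.name for state in model.states ] )
print( model.start_index, model.end_index )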

Next we'll generate a sample sequence from our model.


In [8]:
sequence = model.sample()
print(sequence)


[-0.42128476]
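
sample() returns a single list of emissions; its length varies because the model can transition into the end state after any step. A hedged usage sketch drawing a few more sequences:


In [ ]:
for _ in range(3):
    print( model.sample() )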

Now we'll run the forward algorithm on the sequence with our model.


In [9]:
print(model.forward( sequence )[ len(sequence), model.end_index ])


-3.9368559131089986
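
The forward matrix holds log probabilities, so the entry at [len(sequence), model.end_index] is the log probability of the whole sequence; exponentiating recovers the probability itself. A hedged sketch (log_probability is, to my understanding, pomegranate's convenience wrapper for the same quantity):


In [ ]:
import math
print( math.exp( model.forward( sequence )[ len(sequence), model.end_index ] ) )
print( model.log_probability( sequence ) )   # should match the forward result above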

Next we'll do the same with the backward algorithm.


In [10]:
print(model.backward( sequence )[0,model.start_index])


-3.9368559131089986
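
As expected, the backward algorithm yields the same total log probability as the forward algorithm, since both marginalize over all state paths. A quick consistency check (hedged sketch):


In [ ]:
fwd = model.forward( sequence )[ len(sequence), model.end_index ]
bwd = model.backward( sequence )[ 0, model.start_index ]
assert abs( fwd - bwd ) < 1e-9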

Then we'll run the forward-backward algorithm on the sequence.


In [11]:
trans, ems = model.forward_backward( sequence )
print(trans)
print(ems)


[[0. 0. 0. 1.]
 [0. 0. 0. 0.]
 [1. 0. 0. 0.]
 [0. 0. 0. 0.]]
[[  0. -inf]]
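
Here `trans` holds the expected number of transitions across each edge, and `ems` holds the log posterior probability of each state at each observation, with columns in the model's internal state order (the normal state comes first, as the serialization below shows). Exponentiating `ems` makes it easier to read (a hedged sketch):


In [ ]:
import numpy
# [[1., 0.]]: the sample falls outside Uniform(0, 1), so the normal state gets all the posterior mass
print( numpy.exp( ems ) )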

Finally we'll train our model on the example sequence.


In [12]:
model.fit( [ sequence ] )


Out[12]:
{
    "class" : "HiddenMarkovModel",
    "name" : "ExampleModel",
    "start" : {
        "class" : "State",
        "distribution" : null,
        "name" : "ExampleModel-start",
        "weight" : 1.0
    },
    "end" : {
        "class" : "State",
        "distribution" : null,
        "name" : "ExampleModel-end",
        "weight" : 1.0
    },
    "states" : [
        {
            "class" : "State",
            "distribution" : {
                "class" : "Distribution",
                "name" : "NormalDistribution",
                "parameters" : [
                    NaN,
                    NaN
                ],
                "frozen" : false
            },
            "name" : "normal",
            "weight" : 1.0
        },
        {
            "class" : "State",
            "distribution" : {
                "class" : "Distribution",
                "name" : "UniformDistribution",
                "parameters" : [
                    0.0,
                    1.0
                ],
                "frozen" : false
            },
            "name" : "uniform",
            "weight" : 1.0
        },
        {
            "class" : "State",
            "distribution" : null,
            "name" : "ExampleModel-start",
            "weight" : 1.0
        },
        {
            "class" : "State",
            "distribution" : null,
            "name" : "ExampleModel-end",
            "weight" : 1.0
        }
    ],
    "end_index" : 3,
    "start_index" : 2,
    "silent_index" : 2,
    "edges" : [
        [
            2,
            1,
            0.0,
            0.5,
            null
        ],
        [
            2,
            0,
            1.0,
            0.5,
            null
        ],
        [
            1,
            1,
            0.4,
            0.4,
            null
        ],
        [
            1,
            0,
            0.4,
            0.4,
            null
        ],
        [
            1,
            3,
            0.20000000000000004,
            0.2,
            null
        ],
        [
            0,
            0,
            0.0,
            0.4,
            null
        ],
        [
            0,
            1,
            0.0,
            0.4,
            null
        ],
        [
            0,
            3,
            1.0,
            0.2,
            null
        ]
    ],
    "distribution ties" : []
}

Then we'll repeat the algorithms from before on the retrained model.


In [13]:
print("Forward")
print(model.forward( sequence )[ len(sequence), model.end_index ])
print()
print("Backward")
print(model.backward( sequence )[ 0,model.start_index ])
print()
trans, ems = model.forward_backward( sequence )
print(trans)
print(ems)


Forward
nan

Backward
nan

[[nan nan  0. nan]
 [nan nan  0. nan]
 [nan nan  0.  0.]
 [ 0.  0.  0.  0.]]
[[nan nan]]
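
The NaNs trace back to the training data: fitting a NormalDistribution to a single observation leaves its variance undefined, as the NaN parameters in the serialization above show, and those NaNs poison every later computation. A hedged sketch of a non-degenerate run, rebuilding the model and training on many sampled sequences (the `sequences` variable is illustrative):


In [ ]:
state1 = State( UniformDistribution(0.0, 1.0), name="uniform" )
state2 = State( NormalDistribution(0, 2), name="normal" )

model = HiddenMarkovModel( name="ExampleModel" )
model.add_state( state1 )
model.add_state( state2 )
model.add_transition( model.start, state1, 0.5 )
model.add_transition( model.start, state2, 0.5 )
model.add_transition( state1, state1, 0.4 )
model.add_transition( state1, state2, 0.4 )
model.add_transition( state2, state2, 0.4 )
model.add_transition( state2, state1, 0.4 )
model.add_transition( state1, model.end, 0.2 )
model.add_transition( state2, model.end, 0.2 )
model.bake()

# Sample training data *before* fitting, so the normal state sees many observations
sequences = [ model.sample() for _ in range(100) ]
model.fit( sequences )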