Algorithms Exercise 3

Imports


In [1]:
%matplotlib inline
from matplotlib import pyplot as plt
import numpy as np

In [2]:
from ipywidgets import interact

Character counting and entropy

Write a function char_probs that takes a string and computes the probabilities of each character in the string:

  • First do a character count and store the result in a dictionary.
  • Then divide each character count by the total number of characters to compute the normalized probabilities.
  • Return the dictionary of characters (keys) and probabilities (values).

In [22]:
def char_probs(s):
    """Find the probabilities of the unique characters in the string s.
    
    Parameters
    ----------
    s : str
        A string of characters.
    
    Returns
    -------
    probs : dict
        A dictionary whose keys are the unique characters in s and whose values
        are the probabilities of those characters.
    """
    counts = {}
    for c in s:
        # Tally each character; get() supplies 0 the first time a character is seen.
        counts[c] = counts.get(c, 0) + 1
    total = len(s)
    # Normalize the counts into probabilities.
    probs = {c: n / total for c, n in counts.items()}
    return probs
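For comparison, the same counts can be computed more compactly with the standard library's collections.Counter. This is just an alternative sketch; char_probs_counter is an illustrative name, not part of the exercise:

In [ ]:
from collections import Counter

def char_probs_counter(s):
    """Same behavior as char_probs, with Counter doing the tallying."""
    total = len(s)
    # Counter(s) maps each character to its count; normalize to probabilities.
    return {c: n / total for c, n in Counter(s).items()}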

In [23]:
test1 = char_probs('aaaa')
assert np.allclose(test1['a'], 1.0)
test2 = char_probs('aabb')
assert np.allclose(test2['a'], 0.5)
assert np.allclose(test2['b'], 0.5)
test3 = char_probs('abcd')
assert np.allclose(test3['a'], 0.25)
assert np.allclose(test3['b'], 0.25)
assert np.allclose(test3['c'], 0.25)
assert np.allclose(test3['d'], 0.25)

The entropy is a quantitative measure of the disorder of a probability distribution. It is used extensively in Physics, Statistics, Machine Learning, Computer Science and Information Science. Given a set of probabilities $P_i$, the entropy is defined as:

$$H = -\sum_i P_i \log_2(P_i)$$

In this expression $\log_2$ is the base 2 log (np.log2), which is commonly used in information science. In Physics the natural log is often used in the definition of entropy.
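For example, a fair coin has $P = \{0.5, 0.5\}$, giving $H = -(0.5\log_2 0.5 + 0.5\log_2 0.5) = 1$ bit, while a certain outcome ($P_1 = 1$) gives $H = 0$.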

Write a function entropy that computes the entropy of a probability distribution. The probability distribution will be passed as a Python dict: the values in the dict will be the probabilities.

To compute the entropy, you should:

  • First convert the values (probabilities) of the dict to a Numpy array of probabilities.
  • Then use other Numpy functions (np.log2, etc.) to compute the entropy.
  • Don't use any for or while loops in your code.

In [70]:
def entropy(d):
    """Compute the entropy of a dict d whose values are probabilities."""
    # d.values is the method itself; call it and wrap it in list() so NumPy
    # gets a sequence of floats rather than a dict view.
    c = np.array(list(d.values()))
    # H = -sum_i P_i * log2(P_i), computed elementwise without loops.
    H = -np.sum(c * np.log2(c))
    return H

In [71]:
entropy({'a':0.5,'b':0.5})


Out[71]:
1.0


In [66]:
assert np.allclose(entropy({'a': 0.5, 'b': 0.5}), 1.0)
assert np.allclose(entropy({'a': 1.0}), 0.0)


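One edge case worth noting: np.log2(0) is -inf and 0 * -inf is nan, so a distribution containing a zero probability would make the formula above return nan. char_probs never produces zero probabilities, but a defensive variant could drop them first, using the convention that 0 log2(0) = 0. This is a sketch; entropy_safe and the masking step are assumptions, not part of the exercise:

In [ ]:
def entropy_safe(d):
    """Entropy of a dict of probabilities, ignoring zero entries."""
    c = np.array(list(d.values()))
    c = c[c > 0]  # mask out zeros so log2 never sees 0
    return -np.sum(c * np.log2(c))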

Use IPython's interact function to create a user interface that allows you to type a string into a text box and see the entropy of the character probabilities of the string.


In [21]:
def string_entropy(s):
    """Compute and print the entropy of the character distribution of s."""
    print(entropy(char_probs(s)))

# interact needs a function of s, not the result of calling entropy directly.
interact(string_entropy, s='Type string here')
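Passing s='Type string here' makes interact build a text box initialized to that string; each edit re-runs string_entropy on the new value and prints the updated entropy.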

In [ ]:
assert True # use this for grading the entropy interact