In [8]:
%matplotlib inline
from matplotlib import pyplot as plt
import numpy as np
In [9]:
from IPython.html.widgets import interact  # older import path; in current Jupyter use: from ipywidgets import interact
Write a function char_probs that takes a string and computes the probabilities of each character in the string:
In [13]:
def char_probs(s):
    """Find the probabilities of the unique characters in the string s.

    Parameters
    ----------
    s : str
        A string of characters.

    Returns
    -------
    probs : dict
        A dictionary whose keys are the unique characters in s and whose values
        are the probabilities of those characters.
    """
    probs = {}
    for c in s:
        # Probability of c = (occurrences of c) / (length of s)
        probs[c] = s.count(c) / len(s)
    return probs
In [14]:
test1 = char_probs('aaaa')
assert np.allclose(test1['a'], 1.0)
test2 = char_probs('aabb')
assert np.allclose(test2['a'], 0.5)
assert np.allclose(test2['b'], 0.5)
test3 = char_probs('abcd')
assert np.allclose(test3['a'], 0.25)
assert np.allclose(test3['b'], 0.25)
assert np.allclose(test3['c'], 0.25)
assert np.allclose(test3['d'], 0.25)
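An equivalent approach counts each character once with collections.Counter instead of calling s.count inside the loop. This is only an illustrative sketch; the helper name char_probs_counter is not part of the original notebook:

from collections import Counter

def char_probs_counter(s):
    """Sketch: same result as char_probs, using a single counting pass."""
    counts = Counter(s)
    return {c: n / len(s) for c, n in counts.items()}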
The entropy is a quantitative measure of the disorder of a probability distribution. It is used extensively in Physics, Statistics, Machine Learning, Computer Science and Information Science. Given a set of probabilities $P_i$, the entropy is defined as:
$$H = - \sum_i P_i \log_2(P_i)$$
In this expression $\log_2$ is the base-2 logarithm (np.log2), which is commonly used in information science. In Physics the natural log is often used in the definition of entropy.
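For example, a fair two-outcome distribution $P = (0.5, 0.5)$ gives
$$H = -\left(0.5 \log_2 0.5 + 0.5 \log_2 0.5\right) = -(-0.5 - 0.5) = 1 \text{ bit},$$
the maximum for two outcomes, while a certain outcome ($P_i = 1$) gives $H = 0$.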
Write a function entropy that computes the entropy of a probability distribution. The probability distribution will be passed as a Python dict: the values in the dict will be the probabilities.
To compute the entropy, you should:
Convert the values of the dict to a Numpy array of probabilities.
Use Numpy functions (np.log2, etc.) to compute the entropy.
Don't use any for or while loops in your code.
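One way to satisfy these constraints is to convert the dict values to a NumPy array and let NumPy do the arithmetic elementwise. A minimal vectorized sketch (the name entropy_vectorized is illustrative only, not part of the original notebook):

import numpy as np

def entropy_vectorized(d):
    """Sketch: entropy in bits of a dict of probabilities, with no Python loops."""
    p = np.array(list(d.values()))
    return -np.sum(p * np.log2(p))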
In [92]:
def entropy(d):
    """Compute the entropy of a dict d whose values are probabilities."""
    H = 0
    for p in d.values():
        # Accumulate P_i * log2(P_i) for each probability
        H += p * np.log2(p)
    return -H

entropy({'a': 0.5, 'b': 0.5})
Out[92]:
1.0
In [93]:
assert np.allclose(entropy({'a': 0.5, 'b': 0.5}), 1.0)
assert np.allclose(entropy({'a': 1.0}), 0.0)
Use IPython's interact function to create a user interface that allows you to type a string into a text box and see the entropy of the character probabilities of the string.
In [102]:
def z(x):
    """Print (and return) the entropy of the character distribution of x."""
    print(entropy(char_probs(x)))
    return entropy(char_probs(x))
In [103]:
interact(z, x='string');
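Because the default value passed for x is a string, interact renders a text box; editing the text calls z again and updates the displayed entropy.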
In [ ]:
assert True # use this for grading the entropy interact widget