In [1]:
%matplotlib inline
from matplotlib import pyplot as plt
import numpy as np
In [2]:
from IPython.html.widgets import interact  # in newer Jupyter/IPython versions this lives in ipywidgets: from ipywidgets import interact
Write a function char_probs that takes a string and computes the probabilities of each character in the string:
In [27]:
def char_probs(s):
    """Find the probabilities of the unique characters in the string s.

    Parameters
    ----------
    s : str
        A string of characters.

    Returns
    -------
    probs : dict
        A dictionary whose keys are the unique characters in s and whose values
        are the probabilities of those characters.
    """
    chars = list(s)
    chardic = {}
    probs = {}
    # Count the occurrences of each character.
    for ch in chars:
        if ch in chardic:
            chardic[ch] += 1
        else:
            chardic[ch] = 1
    # Convert counts to probabilities.
    for ch in chardic:
        probs[ch] = chardic[ch] / len(chars)
    return probs
In [28]:
test1 = char_probs('aaaa')
assert np.allclose(test1['a'], 1.0)
test2 = char_probs('aabb')
assert np.allclose(test2['a'], 0.5)
assert np.allclose(test2['b'], 0.5)
test3 = char_probs('abcd')
assert np.allclose(test3['a'], 0.25)
assert np.allclose(test3['b'], 0.25)
assert np.allclose(test3['c'], 0.25)
assert np.allclose(test3['d'], 0.25)
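As an aside, the same counting can be written more concisely with collections.Counter from the standard library. The sketch below is purely illustrative (the name char_probs_counter is not part of the assignment) and behaves the same as char_probs:
In [ ]:
from collections import Counter

def char_probs_counter(s):
    """Alternative to char_probs using collections.Counter (illustrative sketch)."""
    counts = Counter(s)                      # character -> number of occurrences
    n = len(s)
    return {ch: count / n for ch, count in counts.items()}

char_probs_counter('aabb')                   # {'a': 0.5, 'b': 0.5}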
The entropy is a quantitative measure of the disorder of a probability distribution. It is used extensively in Physics, Statistics, Machine Learning, Computer Science and Information Science. Given a set of probabilities $P_i$, the entropy is defined as:
$$H = -\sum_i P_i \log_2(P_i)$$
In this expression $\log_2$ is the base 2 log (np.log2), which is commonly used in information science. In Physics the natural log is often used in the definition of entropy.
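As a quick sanity check of this definition, take a fair coin with two outcomes of probability $0.5$ each:
$$H = -\left(0.5\log_2 0.5 + 0.5\log_2 0.5\right) = -(-0.5 - 0.5) = 1 \text{ bit},$$
which matches the intuition that one fair coin flip carries exactly one bit of information.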
Write a function entropy that computes the entropy of a probability distribution. The probability distribution will be passed as a Python dict: the values in the dict will be the probabilities.
To compute the entropy, you should:
- Convert the values of the dict to a Numpy array of probabilities.
- Use Numpy functions (np.log2, etc.) to compute the entropy.
- Don't use any for or while loops in your code.
In [29]:
def entropy(d):
    """Compute the entropy of a dict d whose values are probabilities."""
    probs = np.array(list(d.values()))
    # H = -sum_i p_i * log2(p_i), computed without explicit loops.
    H = -np.sum(probs * np.log2(probs))
    return H
In [30]:
assert np.allclose(entropy({'a': 0.5, 'b': 0.5}), 1.0)
assert np.allclose(entropy({'a': 1.0}), 0.0)
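One edge case the assertions above don't exercise: if a probability of exactly 0.0 appears in the dict, np.log2(0.0) is -inf and 0.0 * -inf is nan, so entropy returns nan instead of treating that term as contributing 0. A minimal defensive variant (entropy_safe is a hypothetical name, not required by this assignment) could drop zero entries before taking the log:
In [ ]:
def entropy_safe(d):
    """Entropy that ignores zero-probability terms (illustrative sketch)."""
    probs = np.array(list(d.values()))
    probs = probs[probs > 0]                 # drop p == 0 entries to avoid 0 * -inf = nan
    return -np.sum(probs * np.log2(probs))

entropy_safe({'a': 1.0, 'b': 0.0})           # compares equal to 0.0, instead of nan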
Use IPython's interact function to create a user interface that allows you to type a string into a text box and see the entropy of the character probabilities of the string.
In [33]:
def show_entropy(s):
    d = char_probs(s)
    H = entropy(d)
    print(H)

interact(show_entropy, s="Hello World");
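With the default string "Hello World" (11 characters, where 'l' appears three times, 'o' twice, and six other characters once each), the printed entropy should come out to roughly 2.85 bits; editing the text box recomputes it interactively.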
In [32]:
assert True # use this cell for grading the entropy widget