```
In [ ]:
```%matplotlib inline
from matplotlib import pyplot as plt
import numpy as np

```
In [ ]:
```from ipywidgets import interact  # formerly IPython.html.widgets in older IPython releases

Write a function `char_probs` that takes a string and computes the probabilities of each character in the string:

- First do a character count and store the result in a dictionary.
- Then divide each character count by the total number of characters to compute the normalized probabilities.
- Return the dictionary of characters (keys) and probabilities (values).

```
In [1]:
```def char_probs(s):
    """Find the probabilities of the unique characters in the string s.

    Parameters
    ----------
    s : str
        A string of characters.

    Returns
    -------
    probs : dict
        A dictionary whose keys are the unique characters in s and whose
        values are the probabilities of those characters.
    """
    # Count each character, then normalize by the string length.
    counts = {}
    for c in s:
        counts[c] = counts.get(c, 0) + 1
    n = len(s)
    return {c: k / n for c, k in counts.items()}

```
In [ ]:
```test1 = char_probs('aaaa')
assert np.allclose(test1['a'], 1.0)
test2 = char_probs('aabb')
assert np.allclose(test2['a'], 0.5)
assert np.allclose(test2['b'], 0.5)
test3 = char_probs('abcd')
assert np.allclose(test3['a'], 0.25)
assert np.allclose(test3['b'], 0.25)
assert np.allclose(test3['c'], 0.25)
assert np.allclose(test3['d'], 0.25)

The entropy is a quantitative measure of the disorder of a probability distribution. It is used extensively in Physics, Statistics, Machine Learning, Computer Science and Information Science. Given a set of probabilities $P_i$, the entropy is defined as:

$$H = -\sum_i P_i \log_2(P_i)$$

In this expression $\log_2$ is the base-2 logarithm (`np.log2`), which is commonly used in information science. In Physics the natural log is often used in the definition of entropy.
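As a quick numerical check of this definition, a fair coin with $P_1 = P_2 = 0.5$ should carry exactly one bit of entropy:

```python
import numpy as np

# Entropy of a fair coin: H = -(0.5*log2(0.5) + 0.5*log2(0.5)) = 1 bit
p = np.array([0.5, 0.5])
H = -np.sum(p * np.log2(p))
print(H)
```

A uniform distribution over $n$ outcomes gives $\log_2 n$ bits, the maximum possible, while a certain outcome ($P_i = 1$) gives zero.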

Write a function `entropy` that computes the entropy of a probability distribution. The probability distribution will be passed as a Python `dict`: the values in the `dict` will be the probabilities.

To compute the entropy, you should:

- First convert the values (probabilities) of the `dict` to a NumPy array of probabilities.
- Then use other NumPy functions (`np.log2`, etc.) to compute the entropy.
- Don't use any `for` or `while` loops in your code.

```
In [ ]:
```def entropy(d):
"""Compute the entropy of a dict d whose values are probabilities."""
# YOUR CODE HERE
raise NotImplementedError()

```
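One vectorized implementation matching the steps above is sketched here (one possible answer, not the official graded solution):

```python
import numpy as np

def entropy(d):
    """Compute the entropy of a dict d whose values are probabilities."""
    probs = np.array(list(d.values()))      # dict values -> NumPy array
    return -np.sum(probs * np.log2(probs))  # H = -sum_i P_i log2(P_i), no loops
```

Note that `np.log2(0)` produces `-inf` (and `0 * -inf` is `nan`), so this sketch assumes all probabilities are strictly positive, which holds for any dict produced by `char_probs`.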
In [ ]:
```assert np.allclose(entropy({'a': 0.5, 'b': 0.5}), 1.0)
assert np.allclose(entropy({'a': 1.0}), 0.0)

Use the `interact` function to create a user interface that allows you to type a string into a text box and see the entropy of the character probabilities of the string.

```
In [ ]:
```# YOUR CODE HERE
raise NotImplementedError()

```
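One way to fill in the cell above is sketched here, redefining the two helpers so the sketch is self-contained; the callback name `print_entropy` and the default string are illustrative choices, not part of the assignment:

```python
import numpy as np

try:
    from ipywidgets import interact  # modern home of IPython.html.widgets
except ImportError:                  # degrade gracefully outside a notebook
    interact = None

def char_probs(s):
    """Map each unique character of s to its probability."""
    n = len(s)
    counts = {}
    for c in s:
        counts[c] = counts.get(c, 0) + 1
    return {c: k / n for c, k in counts.items()}

def entropy(d):
    """Entropy in bits of a dict of probabilities."""
    probs = np.array(list(d.values()))
    return -np.sum(probs * np.log2(probs))

def print_entropy(s):
    """interact callback: show the entropy of the typed string."""
    if s:  # avoid division by zero on the empty string
        print(entropy(char_probs(s)))

if interact is not None:
    # A string default makes interact render a text box.
    interact(print_entropy, s='the quick brown fox')
```

Typing into the text box re-runs the callback on every keystroke, so the displayed entropy updates as the string changes.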
In [ ]:
```assert True # use this cell for grading the entropy interact widget