Gradients


In [4]:
from math import sin, cos
import matplotlib.pyplot as plt

In [5]:
def any_function(x):
    # an arbitrary smooth function to differentiate; defined for x > 0,
    # because the x**cos(x*2) term becomes complex for most negative x
    return x**2 + sin(x) - x**cos(x*2)

In [10]:
# given a single list, matplotlib treats it as the y-values
# and uses the list indices (0, 1, 2, 3) as the x-values
plt.plot([1,2,3,4])
plt.ylabel('some numbers')
plt.show()
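
As a quick sanity check (a sketch not in the original notebook; the step size h and the x-range are illustrative choices), we can approximate the derivative of any_function with a central finite difference and plot both curves:

In [ ]:
def numerical_derivative(f, x, h=1e-5):
    # central difference: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

xs = [0.1 + 0.01 * i for i in range(300)]  # stay at x > 0 so x**cos(x*2) is real
plt.plot(xs, [any_function(x) for x in xs], label='any_function')
plt.plot(xs, [numerical_derivative(any_function, x) for x in xs], label='approx. derivative')
plt.legend()
plt.show()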


Linear Regression Example

A simple Linear Regression based on Least Squares is usually solved in closed form via the normal equations, $\mathbf{w} = (\mathbf{X}^\top\mathbf{X})^{-1}\mathbf{X}^\top\mathbf{y}$ - i.e. it requires (relatively) expensive operations such as matrix multiplication and matrix inversion. When the data is large, this direct linear-algebra solution may simply be infeasible.
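
For reference, here is a direct NumPy sketch of the closed-form solution (the random data and the known weight vector are illustrative; np.linalg.solve is used instead of an explicit inverse):

In [ ]:
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 samples, 3 features
y = X @ np.array([1.0, -2.0, 0.5])     # targets generated from known weights

# solve the normal equations X^T X w = X^T y
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # should recover [1.0, -2.0, 0.5]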

Gradients come to the rescue! By running Gradient Descent on the Squared Loss $E = \frac{1}{2}\sum\limits_{\mathbf{x}_i \in \mathbf{X}} (f(\mathbf{x}_i) - y_i)^2$ we can hope to find the right weights iteratively, without any matrix inversion.

The derivative of $E$ with respect to a 'weight' $w_j$ of $f(\cdot)$ follows from the chain rule: $\frac{\partial E}{\partial w_j} = \sum\limits_{\mathbf{x}_i \in \mathbf{X}} (f(\mathbf{x}_i) - y_i)\,\frac{\partial f(\mathbf{x}_i)}{\partial w_j}$. For a linear model $f(\mathbf{x}) = \mathbf{w}^\top\mathbf{x}$ this reduces to $\sum\limits_{\mathbf{x}_i \in \mathbf{X}} (f(\mathbf{x}_i) - y_i)\,x_{ij}$.
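
Below is a minimal gradient-descent sketch for this loss. The synthetic 1-D dataset, the learning rate lr, and the epoch count are all illustrative assumptions, not part of the original notebook: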


In [ ]:
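import random

# synthetic data: y = 3x + noise, so the learned weight should end up near 3
random.seed(0)
X = [random.uniform(-1, 1) for _ in range(100)]
y = [3.0 * xi + random.gauss(0, 0.1) for xi in X]

w = 0.0    # single weight, f(x) = w * x
lr = 0.5   # learning rate (hypothetical choice)
for epoch in range(100):
    # dE/dw = sum_i (f(x_i) - y_i) * x_i, averaged over the dataset
    grad_w = sum((w * xi - yi) * xi for xi, yi in zip(X, y)) / len(X)
    w -= lr * grad_w
print(w)  # should be close to 3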

Notes

At times, especially for 'difficult' functions, it is necessary to compute the gradient using Automatic Differentiation.
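
One possible sketch, assuming the JAX library is available (autograd or PyTorch would serve the same purpose): its grad transform builds the derivative of a Python function automatically.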


In [ ]:
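import jax.numpy as jnp
from jax import grad

def any_function_jax(x):
    # same function as above, rewritten with jax.numpy so JAX can trace it
    return x**2 + jnp.sin(x) - x**jnp.cos(x * 2)

d_any_function = grad(any_function_jax)
print(d_any_function(1.5))  # derivative at x = 1.5, no finite differences needed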