In this lab, you will learn the basics of differentiation.

Estimated Time Needed: **25 min**

The following are the libraries we are going to use for this lab.

In [ ]:

```
# These are the libraries we will be using for this lab.
import torch
import matplotlib.pylab as plt
import torch.nn.functional as F
```

Let us create the tensor `x` and set the parameter `requires_grad` to `True` because we are going to take the derivative of the tensor.

In [ ]:

```
# Create a tensor x
x = torch.tensor(2.0, requires_grad = True)
print("The tensor x: ", x)
```

Then let us create a tensor according to the equation $ y=x^2 $.

In [ ]:

```
# Create a tensor y according to y = x^2
y = x ** 2
print("The result of y = x^2: ", y)
```

Then let us take the derivative with respect to `x` at `x = 2`.

In [ ]:

```
# Take the derivative. Try to print out the derivative at the value x = 2
y.backward()
print("The derivative at x = 2: ", x.grad)
```

The preceding lines perform the following operation:

$\frac{\mathrm{dy(x)}}{\mathrm{dx}}=2x$

$\frac{\mathrm{dy(x=2)}}{\mathrm{dx}}=2(2)=4$
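As a cross-check, the same derivative can also be computed with `torch.autograd.grad`, which returns the gradient directly instead of storing it in `x.grad` (a minimal sketch):

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 2

# torch.autograd.grad returns a tuple of gradients, one per input tensor
(dy_dx,) = torch.autograd.grad(y, x)
print(dy_dx)  # tensor(4.)
```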

Let us try to calculate the derivative for a more complicated function.

In [ ]:

```
# Calculate y = x^2 + 2x + 1, then find the derivative
x = torch.tensor(2.0, requires_grad = True)
y = x ** 2 + 2 * x + 1
print("The result of y = x^2 + 2x + 1: ", y)
y.backward()
print("The derivative at x = 2: ", x.grad)
```

The function is in the following form: $y=x^{2}+2x+1$

The derivative is given by:

$\frac{\mathrm{dy(x)}}{\mathrm{dx}}=2x+2$

$\frac{\mathrm{dy(x=2)}}{\mathrm{dx}}=2(2)+2=6$

Determine the derivative of $ y = 2x^3+x $ at $x=1$

In [ ]:

```
# Practice: Calculate the derivative of y = 2x^3 + x at x = 1
# Type your code here
```

Double-click **here** for the solution.
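One way to approach the practice problem, shown as a sketch in case the hidden solution is unavailable:

```python
import torch

x = torch.tensor(1.0, requires_grad=True)
y = 2 * x ** 3 + x
y.backward()

# dy/dx = 6x^2 + 1, so at x = 1 the derivative is 7
print(x.grad)  # tensor(7.)
```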

We can also calculate **Partial Derivatives**. Consider the function: $f(u,v)=vu+u^{2}$

Let us create the tensors `u` and `v`, and compute the tensor `f`:

In [ ]:

```
# Calculate f(u, v) = v * u + u^2 at u = 1, v = 2
u = torch.tensor(1.0, requires_grad=True)
v = torch.tensor(2.0, requires_grad=True)
f = u * v + u ** 2
print("The result of v * u + u^2: ", f)
```

This is equivalent to the following:

$f(u=1,v=2)=(2)(1)+1^{2}=3$

Now let us take the derivative with respect to `u`:

In [ ]:

```
# Calculate the derivative with respect to u
f.backward()
print("The partial derivative with respect to u: ", u.grad)
```

The expression is given by:

$\frac{\mathrm{\partial f(u,v)}}{\partial {u}}=v+2u$

$\frac{\mathrm{\partial f(u=1,v=2)}}{\partial {u}}=2+2(1)=4$

Now, take the derivative with respect to `v`:

In [ ]:

```
# The earlier call to f.backward() also computed the gradient with respect to v
print("The partial derivative with respect to v: ", v.grad)
```

The equation is given by:

$\frac{\mathrm{\partial f(u,v)}}{\partial {v}}=u$

$\frac{\mathrm{\partial f(u=1,v=2)}}{\partial {v}}=1$
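Putting the two partial derivatives together: a single `backward()` call on `f` fills in the gradients of every leaf tensor that requires them, so both `u.grad` and `v.grad` are available afterwards (a minimal sketch):

```python
import torch

u = torch.tensor(1.0, requires_grad=True)
v = torch.tensor(2.0, requires_grad=True)
f = u * v + u ** 2
f.backward()  # one backward pass populates the gradients of all leaf tensors

print(u.grad)  # df/du = v + 2u = 2 + 2 -> tensor(4.)
print(v.grad)  # df/dv = u = 1 -> tensor(1.)
```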

We can also take the derivative of a function evaluated at multiple values of `x`. Summing first gives a scalar `y` that `backward` can differentiate:

In [ ]:

```
# Calculate the derivative with multiple values
x = torch.linspace(-10, 10, 10, requires_grad = True)
Y = x ** 2
y = torch.sum(x ** 2)
```

We can plot the function and its derivative:

In [ ]:

```
# Take the derivative with respect to multiple values. Plot out the function and its derivative
y.backward()
plt.plot(x.detach().numpy(), Y.detach().numpy(), label = 'function')
plt.plot(x.detach().numpy(), x.grad.detach().numpy(), label = 'derivative')
plt.xlabel('x')
plt.legend()
plt.show()
```
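As a sanity check, the gradient computed above should match the analytic derivative $2x$ component by component (a sketch, recomputing the tensors from scratch):

```python
import torch

x = torch.linspace(-10, 10, 10, requires_grad=True)
y = torch.sum(x ** 2)
y.backward()

# the i-th gradient entry should equal the analytic derivative 2 * x_i
assert torch.allclose(x.grad, 2 * x.detach())
print("gradient matches 2x")
```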

The **ReLU** activation function is an essential function in neural networks. We can take its derivative as follows:

In [ ]:

```
import torch.nn.functional as F
```

In [ ]:

```
# Take the derivative of ReLU with respect to multiple values. Plot out the function and its derivative
x = torch.linspace(-3, 3, 100, requires_grad = True)
Y = F.relu(x)
y = Y.sum()
y.backward()
plt.plot(x.detach().numpy(), Y.detach().numpy(), label = 'function')
plt.plot(x.detach().numpy(), x.grad.detach().numpy(), label = 'derivative')
plt.xlabel('x')
plt.legend()
plt.show()
```
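The plotted derivative is a step function: 0 where $x < 0$ and 1 where $x > 0$. We can verify this numerically (a sketch; note this particular grid happens to skip $x = 0$, where ReLU is not differentiable):

```python
import torch
import torch.nn.functional as F

x = torch.linspace(-3, 3, 100, requires_grad=True)
F.relu(x).sum().backward()

# the derivative of ReLU is the indicator of x > 0
assert torch.allclose(x.grad, (x.detach() > 0).float())
print("ReLU derivative matches the step function")
```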

In [ ]:

```
# Practice: Calculate the derivative of f = u * v + (u * v) ** 2 at u = 2, v = 1
# Type the code here
```

Double-click **here** for the solution.
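One possible solution, shown as a sketch in case the hidden cell is unavailable: differentiate $f(u,v)=uv+(uv)^2$ by calling `backward()` once and reading both gradients.

```python
import torch

u = torch.tensor(2.0, requires_grad=True)
v = torch.tensor(1.0, requires_grad=True)
f = u * v + (u * v) ** 2
f.backward()

print(u.grad)  # df/du = v + 2*u*v**2 = 1 + 4 -> tensor(5.)
print(v.grad)  # df/dv = u + 2*v*u**2 = 2 + 8 -> tensor(10.)
```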

Joseph Santarcangelo has a PhD in Electrical Engineering, his research focused on using machine learning, signal processing, and computer vision to determine how videos impact human cognition. Joseph has been working for IBM since he completed his PhD.

Other contributors: Michelle Carey, Mavis Zhou