Objectives:

  • Implement the main activation functions
  • Build an intuitive understanding of how $w$ and $b$ influence the activation functions

Contents


In [ ]:
import numpy as np
import matplotlib.pyplot as plt
import ipywidgets as wg
from ipywidgets import interactive, fixed

%matplotlib inline

In [ ]:
def plot_interactive(w, b, func, ylim=fixed((0, 1)), show_der=False):
    """Plot func(w*x + b) and, optionally, its derivatives w.r.t. z and x."""
    plt.figure(0)

    x = np.linspace(-10, 10, num=1000)
    z = w*x + b       # pre-activation: linear combination of the input
    y = func(z)       # activation output

    plt.plot(x, y, color='blue')
    if show_der:
        der = func(z, derivative=True)
        y_der_z = der      # dy/dz
        y_der_x = w*der    # dy/dx = dy/dz * dz/dx (chain rule)
        plt.plot(x, y_der_z, color='red')
        plt.plot(x, y_der_x, color='green')
    plt.xlim(-10, 10)
    plt.ylim(ylim[0], ylim[1])
    plt.show()

Linear

$$y=x$$$$y^\prime = 1$$
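One possible solution for the exercise cell below is sketched here; the `derivative` keyword follows the signature that `plot_interactive` expects:

```python
import numpy as np

def linear(x, derivative=False):
    # Identity activation: y = x; its derivative is 1 everywhere.
    if derivative:
        return np.ones_like(x)
    return x
```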

In [ ]:


In [ ]:
interactive_plot = interactive(plot_interactive, w=(-2.0, 2.0), b=(-3, 3, 0.5), func=fixed(linear), ylim=fixed((-10, 10)))
interactive_plot

Sigmoid

$$y = \frac{1}{1+e^{-x}}$$$$y^\prime = y(1-y)$$
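A sketch of one possible implementation, using the identity $y^\prime = y(1-y)$ from the formula above:

```python
import numpy as np

def sigmoid(x, derivative=False):
    # Logistic sigmoid: maps any real input into (0, 1).
    y = 1.0 / (1.0 + np.exp(-x))
    if derivative:
        return y * (1.0 - y)  # y' = y(1 - y)
    return y
```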

In [ ]:


In [ ]:
interactive_plot = interactive(plot_interactive, w=(-2.0, 2.0), b=(-3, 3, 0.5), func=fixed(sigmoid))
interactive_plot

Tanh

$$y = \frac{e^x - e^{-x}}{e^x+e^{-x}}$$$$y^\prime = 1 - y^2$$
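One way to fill in the exercise cell, delegating to NumPy's numerically stable `np.tanh` and using $y^\prime = 1 - y^2$:

```python
import numpy as np

def tanh(x, derivative=False):
    # Hyperbolic tangent: maps any real input into (-1, 1).
    y = np.tanh(x)
    if derivative:
        return 1.0 - y**2  # y' = 1 - y^2
    return y
```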

In [ ]:


In [ ]:
interactive_plot = interactive(plot_interactive, w=(-2.0, 2.0), b=(-3, 3, 0.5), func=fixed(tanh), ylim=fixed((-2, 2)))
interactive_plot

Rectified Linear Unit (ReLU)

$$y = \max(0, x)$$$$\frac{\partial y}{\partial x} = \begin{cases}0 &,\ x \leq 0\\1 &,\ x > 0\end{cases}$$

Note: the derivative of ReLU at $x = 0$ does not exist mathematically, but by convention it is defined as 0.
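A possible implementation, adopting the convention above (derivative equal to 0 at $x = 0$):

```python
import numpy as np

def relu(x, derivative=False):
    # Rectified Linear Unit: zero for non-positive inputs, identity otherwise.
    if derivative:
        # By convention, the derivative at x = 0 is taken as 0.
        return np.where(x > 0, 1.0, 0.0)
    return np.maximum(0, x)
```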


In [ ]:


In [ ]:
interactive_plot = interactive(plot_interactive, w=(-2.0, 2.0), b=(-3, 3, 0.5), func=fixed(relu), ylim=fixed((-1, 10)))
interactive_plot

Leaky ReLU

$$y = \begin{cases}\alpha x &,\ x \leq 0\\x &,\ x > 0\end{cases}$$$$\frac{\partial y}{\partial x} = \begin{cases}\alpha &,\ x \leq 0\\1 &,\ x > 0\end{cases}$$
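A sketch for the exercise cell; the default $\alpha = 0.1$ is an assumption (the text does not fix a value), exposed as a keyword so it can be changed:

```python
import numpy as np

def leaky_relu(x, derivative=False, alpha=0.1):
    # Leaky ReLU: small slope alpha for non-positive inputs instead of zero.
    # alpha=0.1 is an assumed default, not mandated by the formulas above.
    if derivative:
        return np.where(x > 0, 1.0, alpha)
    return np.where(x > 0, x, alpha * x)
```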

In [ ]:


In [ ]:
interactive_plot = interactive(plot_interactive, w=(-2.0, 2.0), b=(-3, 3, 0.5), func=fixed(leaky_relu), ylim=fixed((-1, 10)))
interactive_plot

Exponential Linear Unit (ELU)

$$y = \begin{cases}\alpha(e^x -1) &,\ x \leq 0\\x &,\ x > 0\end{cases}$$$$\frac{\partial y}{\partial x} = \begin{cases}y + \alpha &,\ x \leq 0\\1 &,\ x > 0\end{cases}$$
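One possible implementation; note that for $x \leq 0$ the derivative $\alpha e^x$ can be written as $y + \alpha$, exactly as in the formula above. The default $\alpha = 1.0$ is an assumption:

```python
import numpy as np

def elu(x, derivative=False, alpha=1.0):
    # ELU: smooth exponential curve for non-positive inputs, identity otherwise.
    # alpha=1.0 is an assumed default, not mandated by the formulas above.
    y = np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
    if derivative:
        return np.where(x > 0, 1.0, y + alpha)  # alpha*e^x == y + alpha
    return y
```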

In [ ]:


In [ ]:
interactive_plot = interactive(plot_interactive, w=(-2.0, 2.0), b=(-3, 3, 0.5), func=fixed(elu), ylim=fixed((-2, 10)))
interactive_plot

Table of Activation Functions

References