Logistic Regression

In this lab, you will cover logistic regression using PyTorch.

Import the following modules:


In [ ]:
import torch.nn as nn
import torch
import matplotlib.pyplot as plt

Logistic Function

Set the random seed so the results are reproducible:


In [ ]:
torch.manual_seed(2)

Create a tensor of values ranging from -100 to 100 in steps of 0.1 and reshape it into a column:


In [ ]:
z=torch.arange(-100,100,0.1).view(-1, 1)
z

Create a sigmoid object:


In [ ]:
sig=nn.Sigmoid()

Apply the sigmoid object element-wise to the tensor z:


In [ ]:
yhat=sig(z)

Plot the results:


In [ ]:
plt.plot(z.numpy(),yhat.numpy())
plt.xlabel('z')
plt.ylabel('yhat')

Apply the built-in function torch.sigmoid element-wise and plot the results:


In [ ]:
yhat=torch.sigmoid(z)
plt.plot(z.numpy(),yhat.numpy())
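
Both approaches compute the same element-wise function. As a quick check, here is a minimal sketch that compares the output of the sig object with the output of torch.sigmoid on the tensor z defined above:


In [ ]:
# the module form and the function form should agree element-wise
print(torch.allclose(sig(z),torch.sigmoid(z)))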

Build a Logistic Regression Using nn.Sequential

Create a 1x1 tensor x that represents one data sample with one dimension, and a 2x1 tensor X that represents two data samples, each with one dimension:


In [ ]:
x=torch.tensor([[1.0]])
X=torch.tensor([[1.0],[100]])
print('x=',x)
print('X=',X)

Create a logistic regression object using nn.Sequential with a one-dimensional input:


In [ ]:
model=nn.Sequential(nn.Linear(1,1),nn.Sigmoid())

The object passes the input through the linear function and then applies the sigmoid function to the result.

In this case, the parameters are randomly initialized. You can view them in the following ways:


In [ ]:
print("list(model.parameters()):\n ", list(model.parameters()))
print( "\nmodel.state_dict():\n ",model.state_dict())

Make a prediction with one sample:


In [ ]:
yhat=model(x)
yhat

Calling the object applied the linear function and then the sigmoid function to the sample, that is, yhat = sigmoid(w*x + b).
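
You can reproduce the prediction manually. Here is a minimal sketch that reads the weight and bias from the state dictionary printed above:


In [ ]:
w=model.state_dict()['0.weight']   # weight of the nn.Linear layer
b=model.state_dict()['0.bias']     # bias of the nn.Linear layer
torch.sigmoid(w*x+b)               # should match model(x)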

Make a prediction with multiple samples:


In [ ]:
yhat=model(X)
yhat

Calling the object applied the same operation to each row of X, producing one prediction per sample.
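
For a batch, the linear part is a matrix product with one row of X per sample. Here is a minimal sketch, again reading the parameters from the state dictionary:


In [ ]:
w=model.state_dict()['0.weight']
b=model.state_dict()['0.bias']
torch.sigmoid(torch.mm(X,w.t())+b)   # should match model(X)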

Create a 1x2 tensor x that represents one data sample with two dimensions, and a 3x2 tensor X that represents three data samples, each with two dimensions:


In [ ]:
x=torch.tensor([[1.0,1.0]])
X=torch.tensor([[1.0,1.0],[1.0,2.0],[1.0,3.0]])
print('x=',x)
print('X=',X)

Create a logistic regression object using nn.Sequential with a two-dimensional input:


In [ ]:
model=nn.Sequential(nn.Linear(2,1),nn.Sigmoid())

The object will apply the sigmoid function to the output of the linear function.

In this case, the parameters are randomly initialized. You can view them in the following ways:


In [ ]:
print("list(model.parameters()):\n ", list(model.parameters()))
print( "\nmodel.state_dict():\n ",model.state_dict())

Make a prediction with one sample:


In [ ]:
yhat=model(x)
yhat

The operation computes the linear function w1*x1 + w2*x2 + b for the sample and then applies the sigmoid function to the result.
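
You can reproduce this prediction manually as well. Here is a minimal sketch that reads the 1x2 weight matrix and the bias from the state dictionary:


In [ ]:
w=model.state_dict()['0.weight']     # shape (1, 2)
b=model.state_dict()['0.bias']       # shape (1,)
torch.sigmoid(torch.mm(x,w.t())+b)   # should match model(x)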

Make a prediction with multiple samples:


In [ ]:
yhat=model(X)
yhat

The same operation is applied to each row of X, producing one prediction per sample.
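
Batching does not change the result for any individual sample. Here is a minimal sketch that compares the batch prediction with the predictions made row by row:


In [ ]:
# each row of model(X) equals the model applied to that row on its own
print(torch.allclose(model(X),torch.cat([model(X[i:i+1]) for i in range(X.shape[0])])))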

Build Custom Modules

In this section, you will build a custom module. It behaves identically to the model built with nn.Sequential; a quick check of this claim follows the class definition below.

Create a logistic regression custom module:


In [ ]:
class logistic_regression(nn.Module):
    # n_inputs: the number of input features per sample
    def __init__(self,n_inputs):
        super(logistic_regression,self).__init__()
        self.linear=nn.Linear(n_inputs,1)   # linear layer producing a single output
    def forward(self,x):
        # apply the linear function, then the sigmoid, to obtain a value between 0 and 1
        yhat=torch.sigmoid(self.linear(x))
        return yhat
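
As a quick check that the custom module matches nn.Sequential, here is a minimal sketch that copies the linear layer's parameters from one model into the other and compares the outputs (the tensor x_check is introduced only for this check):


In [ ]:
seq_model=nn.Sequential(nn.Linear(1,1),nn.Sigmoid())
custom_model=logistic_regression(1)
custom_model.linear.load_state_dict(seq_model[0].state_dict())   # copy weight and bias
x_check=torch.tensor([[2.0]])
print(torch.allclose(seq_model(x_check),custom_model(x_check)))  # expected: True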

Create a 1x1 tensor x that represents one data sample with one dimension, and a 3x1 tensor X that represents three data samples, each with one dimension:


In [ ]:
x=torch.tensor([[1.0]])
X=torch.tensor([[-100],[0],[100.0]])
print('x=',x)
print('X=',X)

Create a logistic regression model with a one-dimensional input:


In [ ]:
model=logistic_regression(1)

In this case, the parameters are randomly initialized. You can view them in the following ways:


In [ ]:
print("list(model.parameters()):\n ", list(model.parameters()))
print( "\nmodel.state_dict():\n ",model.state_dict())

Make a prediction with one sample:


In [ ]:
yhat=model(x)
yhat

Make a prediction with multiple samples:


In [ ]:
yhat=model(X)
yhat

Create a logistic regression object with a two-dimensional input:


In [ ]:
model=logistic_regression(2)

Create a 1x2 tensor x that represents one data sample with two dimensions, and a 3x2 tensor X that represents three data samples, each with two dimensions:


In [ ]:
x=torch.tensor([[1.0,2.0]])
X=torch.tensor([[100,-100],[0.0,0.0],[-100,100]])
print('x=',x)
print('X=',X)

Make a prediction with one sample:


In [ ]:
yhat=model(x)
yhat

Make a prediction with multiple samples:


In [ ]:
yhat=model(X)
yhat

About the Authors:

Joseph Santarcangelo has a PhD in Electrical Engineering. His research focused on using machine learning, signal processing, and computer vision to determine how videos impact human cognition.

Other contributors: Michelle Carey, Mavis Zhou


Copyright © 2018 cognitiveclass.ai. This notebook and its source code are released under the terms of the MIT License.