The perceptron is a classifier model defined by an input vector [X] and a weight vector [W]. At each step the model computes the weighted sum y(t) = W · X (the dot product of the weights and the inputs), and this sum is passed through a step function f (the Heaviside or the sign function is usually used):
With the Heaviside function:
f = 1 if y(t) > 0
f = 0 if y(t) <= 0
With the sign function:
f = 1 if y(t) > 0
f = -1 if y(t) <= 0
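Before the full implementation below, here is a minimal single-step sketch of the forward pass and of the weight update that the training loop relies on; the input, label, weights, and learning rate are illustrative values, not taken from the text:

import numpy as np

x = np.array([1, 0, 1])            # illustrative input vector
w = np.array([-0.4, 0.2, 0.1])     # illustrative weight vector
expected = 1                       # desired label for this input
eta = 0.2                          # learning rate

heaviside = lambda v: 0 if v <= 0 else 1

y = np.dot(w, x)                   # weighted sum y(t)
predicted = heaviside(y)           # step activation
error = expected - predicted       # target minus prediction
w = w + eta * error * x            # perceptron update rule
print(w)

The training loop in the next cell simply repeats this step over randomly chosen training examples.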
In [16]:
from random import choice
from numpy import array, dot, random
# Here is the implementation with the Heaviside function.
Heaviside = lambda x: 0 if x <=0 else 1
# For implementation with the sign function:
#step_ = lambda x: -1 if x < 0 else 1
training = [(array([0, 0, 1]), 0),
            (array([0, 1, 1]), 1),
            (array([1, 0, 1]), 1),
            (array([1, 1, 1]), 1)]
test = [array([0, 1, 1]), array([0, 0, 1])]

w = random.rand(3)      # random initial weights
print("w initial=", w)
errors = []             # error recorded at each iteration
eta = 2/10              # learning rate (0.2)
n = 50                  # number of training iterations

for i in range(n):
    x, expected = choice(training)    # pick a random training example
    result = dot(w, x)                # weighted sum y(t)
    error = expected - Heaviside(result)
    errors.append(error)
    if error != 0:
        w += eta * error * x          # perceptron learning rule
print("w final=", w)

print('Test Results')
for x in test:
    result = dot(x, w)
    print("{}: {} --> {}".format(x[:2], result, Heaviside(result)))
In [17]:
%matplotlib inline
from pylab import plot, ylim

# Plot the error recorded at each training iteration.
ylim([-1, 1])
plot(errors)
Out[17]: [plot of the error at each training iteration]
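pylab works here for a quick plot, but the same figure can be produced with matplotlib.pyplot directly, which is the more common idiom; a minimal equivalent sketch:

import matplotlib.pyplot as plt

plt.ylim([-1, 1])
plt.plot(errors)          # per-iteration error from the training loop above
plt.xlabel("iteration")
plt.ylabel("error")
plt.show()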