The first dataset we will use is the Hastie_10_2 dataset, generated with scikit-learn's make_hastie_10_2.
In [4]:
from sklearn import datasets
X, y = datasets.make_hastie_10_2(n_samples=12000, random_state=1)

# make a random train and test split
# (sklearn.cross_validation was renamed to sklearn.model_selection in newer scikit-learn versions)
from sklearn.model_selection import train_test_split
train_x, test_x, train_y, test_y = train_test_split(X, y, test_size=0.3, random_state=0)
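As a quick sanity check (added here for illustration, not part of the original notebook): make_hastie_10_2 produces 10 standard-normal features and a binary target in {-1, +1}, so with a 70/30 split we expect 8400 training and 3600 test samples.
In [ ]:
# inspect the shapes and label values of the splits
import numpy as np

print(train_x.shape, test_x.shape)   # expected: (8400, 10) (3600, 10)
print(np.unique(train_y))            # Hastie targets are -1 and +1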
First we train the network on the training set.
In [ ]:
%run NeuralNetwork
import pickle   # cPickle was merged into pickle in Python 3
import numpy as np

# the neural network is based on code by Riaan Zoetmulder
# train on the training split only, so the test set stays unseen
inputData = train_x
targetData = train_y
myNN = NN.NNetwork(len(inputData[1]), 60, 1, 0.1, 0.5)
myNN.backPropagation(np.asarray(inputData), np.asarray(targetData), 1000)

# save the trained state of the network with the highest pickle protocol
with open('NeuralNetwork.p', 'wb') as output_file:
    pickle.dump(myNN, output_file, -1)
If you have already trained the network, a pickle file with its trained state will be available. Below we test it on the test set.
In [ ]:
import pickle
# modelSelection defines accuracy(y_target, y_predict)
%run modelSelection
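A minimal sketch of the evaluation step follows. It assumes the pickled network exposes a prediction method, called runNetwork here as a hypothetical name (the NNetwork interface is not shown in this notebook), and uses the accuracy(y_target, y_predict) helper from modelSelection.
In [ ]:
import pickle
import numpy as np

# load the trained network saved in the previous step
with open('NeuralNetwork.p', 'rb') as input_file:
    myNN = pickle.load(input_file)

# predict on the held-out test set; 'runNetwork' is a hypothetical method
# name, replace it with the actual prediction method of NNetwork
predictions = np.asarray([myNN.runNetwork(x) for x in test_x])

# map the single continuous output back to the -1/+1 labels and score it
print(accuracy(test_y, np.sign(predictions)))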
Here I import some code to plot the (kernel) PCA. I use the PCA library from scikit-learn.
In [5]:
# this magic line must be in place, otherwise the notebook will not render the plots inline
%matplotlib inline
# %run executes the file, so all definitions in it end up in the memory of the kernel
%run PCA_visualization
kPCA_visualization2d(X, y)
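For reference, here is a self-contained sketch of what such a 2-D kernel-PCA plot can look like using scikit-learn's KernelPCA directly; the kernel and its parameters are illustrative assumptions, the actual choices live in PCA_visualization.
In [ ]:
import matplotlib.pyplot as plt
from sklearn.decomposition import KernelPCA

# project the data onto the first two kernel principal components
# (RBF kernel and gamma chosen as illustrative assumptions)
kpca = KernelPCA(n_components=2, kernel='rbf', gamma=0.1)
X_kpca = kpca.fit_transform(X[:2000])   # subsample: kernel PCA builds an n-by-n kernel matrix

plt.scatter(X_kpca[:, 0], X_kpca[:, 1], c=y[:2000], cmap='coolwarm', s=5)
plt.xlabel('1st kernel principal component')
plt.ylabel('2nd kernel principal component')
plt.show()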
Instead of import we use %run, which works because it stores all the functions defined in the file in the memory of the kernel. As you can see in the example below, this works.
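To make the mechanism concrete, notebook_import_test could contain something as simple as the following (hypothetical contents, shown only to illustrate how %run exposes a file's definitions to the kernel):

# notebook_import_test.py -- hypothetical contents for illustration
def print_import():
    # after %run notebook_import_test, this function is available
    # directly in the notebook's namespace
    print("print_import was loaded via %run")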
In [14]:
%run notebook_import_test
print_import()