In this Jupyter notebook, I will be looking at one of several choices for neural network modeling in the .NET ecosystem, the Accord.NET machine learning framework, which can be found here:

http://accord-framework.net/

The Auto MPG dataset from the UCI Machine Learning repository will be used for modeling, which can be found here:

https://archive.ics.uci.edu/ml/datasets/Auto+MPG

The following code will load some Accord.NET modules. Accord.NET contains a variety of signal processing and machine learning algorithms for the .NET environment. In this notebook, I will be using the Accord.MachineLearning and Accord.Neuro namespaces, so be sure to copy all required DLLs to the working directory of your Jupyter notebook. (For more information about how to load these packages into a Jupyter notebook, refer to the previous blog post):


In [1]:
#r "Accord.Neuro.dll"
#r "Accord.MachineLearning.dll"
open System
open System.IO
open Accord.MachineLearning
open Accord.Neuro

If you haven't worked with F# before, the syntax can be hard to understand at first. I strongly recommend "F# for Fun and Profit" (http://fsharpforfunandprofit.com/) for picking up the basics.
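
As a quick taste of the syntax used below, the |> (pipe) operator feeds the value on its left into the function on its right, and Array.map applies a function to every element of an array. A minimal illustration (not part of the notebook itself):

// These two bindings are equivalent; the pipe operator just lets the data
// flow left to right through a chain of functions.
let doubled1 = Array.map (fun x -> x * 2) [| 1; 2; 3 |]
let doubled2 = [| 1; 2; 3 |] |> Array.map (fun x -> x * 2)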

The following code will load the data from the Auto MPG CSV files. Before this step, I did some basic data cleaning on the file, which included converting the .data file to .csv, removing all rows that had missing "horsepower" data, removing the last column, and splitting the dataset into an X data file ("autox.csv") and a Y data file ("autoy.csv").
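
If you would rather script that cleaning step than do it by hand, a rough sketch is shown below. This is not the exact code I used; it assumes the raw auto-mpg.data file is whitespace-delimited, marks missing horsepower values with "?", and ends each line with the quoted car name, and the header strings are just illustrative:

// Read the raw file, drop rows with missing values, keep the first seven
// numeric fields (mpg through model year), and write separate X and Y files.
let raw = File.ReadAllLines("auto-mpg.data")
let cleaned =
    raw
    |> Array.filter (fun line -> not (line.Contains("?")))
    |> Array.map (fun line -> line.Split([| ' '; '\t' |], StringSplitOptions.RemoveEmptyEntries))
    |> Array.map (fun fields -> fields.[0..6])
File.WriteAllLines("autox.csv",
    Array.append [| "Cylinder,Displacement,Horsepower,Weight,Acceleration,Year" |]
                 (cleaned |> Array.map (fun f -> String.Join(",", f.[1..6]))))
File.WriteAllLines("autoy.csv",
    Array.append [| "MPG" |] (cleaned |> Array.map (fun f -> f.[0])))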


In [2]:
// This code will read all data from the file, skip the header row, and 
// split each row using a comma separator into 6 columns.
let xdata = File.ReadAllLines("C:\\Users\\ERIC\\Desktop\\Auto MPG\\autox.csv")
                |> Array.toSeq
                |> Seq.skip 1
                |> Seq.toArray
                |> Array.map ( fun t -> t.Split(',') 
                                        |> Array.map(fun t -> double t ))
// This code will read all data from the file, skip the header row, and 
// parse the single MPG value from each row.
let ydata = File.ReadAllLines("C:\\Users\\ERIC\\Desktop\\Auto MPG\\autoy.csv")
                |> Array.toSeq
                |> Seq.skip 1
                |> Seq.toArray
                |> Array.map ( fun t -> t.Split(',') 
                                        |> Array.map(fun t -> double t ))

Next, we'll divide the data into training and testing datasets, which is necessary for validation. Accord.NET has a "SplitSetValidation" class to easily accomplish this. The first parameter of "SplitSetValidation" is the number of samples (rows) in the dataset, and the second parameter is the proportion of data to use for training. I use the "TrainingSet" and "ValidationSet" indices from the "SplitSetValidation" object to grab the corresponding rows from the original arrays.

The "X" values used to predict "MPG" (Column 0) will be the "Cylinder", "Displacement", "Horsepower", "Weight", "Accleration", and "Year" columns (1 through 6).


In [3]:
// Divide the data into testing and training datasets
let split = SplitSetValidation(xdata.Length, 0.7)
let XTrain = [| for i in split.TrainingSet -> xdata.[i]|]
let yTrain = [| for i in split.TrainingSet -> ydata.[i]|]
let XTest = [| for i in split.ValidationSet -> xdata.[i]|]
let yTest = [| for i in split.ValidationSet -> ydata.[i]|]

First, we will construct a very simple neural network with a sigmoid activation function, using a single hidden layer of 10 nodes and an output layer with one node. The learning algorithm will be the Levenberg-Marquardt method.


In [4]:
// This function initializes a new neural network from the number of 
// inputs and the number of neurons in each layer after the input layer.
// The last entry of the neurons array is the output layer. For example,
// [| 10; 1 |] means one hidden layer with 10 nodes and an output layer
// with a single node.
let createNetwork (numInputs: int) (neurons: int[]) =   
    let act = SigmoidFunction()
    let network = ActivationNetwork(act, numInputs, neurons)
    Learning.LevenbergMarquardtLearning network 
    
// As an example, this code will initialize a neural network with 2 inputs, 
// one hidden layer of 10 nodes, and a single output node. Important note- 
// the neurons need to be an array, not a list. Create an array using 
// [| X; Y |] notation.
let nn1 = createNetwork 2 [| 10; 1 |]

Next, we will construct a function called "runNetwork" to train the neural network. To make the function a little easier to use, I'm just going to call the "createNetwork" function inside the "runNetwork" function. This saves lines of code when looping over many different neural network architectures.

Inside the function, I have created a very simple stopping criterion. Defining a stopping criterion allows training to stop early once certain conditions have been reached (for example, a low training error). Training starts from random initial weights; depending on those initial conditions and the local minima encountered in the error space, the network will likely achieve different results each time it is run. To keep the end goal in focus, what we're really interested in is the error on the testing data, not the training data. The following code will detect if the network is "stuck" in a local minimum; if so, we want to stop execution and check the testing error. If it's good enough, we're done. If not, we may need to adjust the learning rate so that the network does not get stuck in local minima as easily.


In [5]:
// This function will train the neural network until the stopping
// criteria is reached. It calls the createNetwork function above.
let runNetwork (neurons: int[]) (inputs: double[][]) (output: double[][]) (num_inp: int) =
    let nn = createNetwork num_inp neurons
    // The training loop tracks state that changes every epoch, so there is
    // not really a way to get around the use of "mutable" here, at least that I see.
    let mutable previous = 2.0
    let mutable error = 1.0
    let mutable check = true
    let mutable epoch = 0
    while check do
        error <- nn.RunEpoch(inputs, output)
        // RunEpoch returns the sum of squared errors- scale it down to a rough
        // per-sample error by taking the square root and dividing by the sample count
        error <- Math.Sqrt(error) / (float inputs.Length)
        // Keep training while the error is still changing by more than 0.00001 per epoch 
        // or is above 1.2, and while fewer than 100000 epochs have been run
        check <- ((Math.Abs(previous - error) > 0.00001) || (error > 1.2)) && (epoch < 100000)
        previous <- error
        epoch <- epoch + 1
    nn

// Run the neural network
let nn = runNetwork [| 10; 1 |] XTrain yTrain 6
let error = nn.RunEpoch(XTest, yTest)
let test_error = Math.Sqrt(error) / (float XTest.Length)
test_error


Out[5]:
1.011253676

Based on the testing error, we were able to predict MPG with a per-sample error of about 1.011, which is fairly good.
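
One thing to keep in mind is that RunEpoch also performs a weight update, so the line above trains on the test data while measuring it. If you want to measure the test error without that extra training pass, the Levenberg-Marquardt teacher also exposes a ComputeError method (assuming your Accord.NET version includes it) that evaluates the network on a dataset without changing the weights:

// ComputeError evaluates the current network on the given data without
// updating the weights (availability may vary by Accord.NET version).
let test_sse = nn.ComputeError(XTest, yTest)
let test_error2 = Math.Sqrt(test_sse) / (float XTest.Length)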

Alternatively, the data could have been loaded using Deedle, a data frame library for .NET that handles cleaning and preprocessing much like pandas does in Python. The following code demonstrates how this would have been done:


In [6]:
#r "Deedle.dll"
open Deedle

// Load the data from the CSV file
let df = Frame.ReadCsv "auto-mpg.csv" |> Frame.dropSparseRows
// Examine the column names
df.Columns.Keys
// Specify the X and Y columns to keep
let xcols = [ "Cylinder"; "Displacement"; "Horsepower"; "Weight"; "Acceleration"; "Year" ]
let ycols = ["MPG"]

// Divide the data into testing and training datasets
// (392 is the number of rows remaining after dropping rows with missing values)
let split = SplitSetValidation(392, 0.7)
let train = df.Rows.[split.TrainingSet]
let test = df.Rows.[split.ValidationSet]
let XTrain = train.Columns.[xcols] |> Frame.toArray2D
let yTrain = train.Columns.[ycols] |> Frame.toArray2D
let XTest = test.Columns.[xcols] |> Frame.toArray2D
let yTest = test.Columns.[ycols] |> Frame.toArray2D
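
One detail to watch: Frame.toArray2D produces 2D arrays (float[,]), while Accord's RunEpoch expects jagged arrays (double[][]). A small helper like the one below (the toJagged name is just my own, not part of either library) bridges the gap so the runNetwork function above can be reused with these frames:

// Convert a 2D array into the jagged array format that Accord expects.
let toJagged (a: float[,]) =
    [| for i in 0 .. Array2D.length1 a - 1 ->
         [| for j in 0 .. Array2D.length2 a - 1 -> a.[i, j] |] |]

let nn2 = runNetwork [| 10; 1 |] (toJagged XTrain) (toJagged yTrain) 6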