neuralcli
by Hugo Rut

Command Line Neural Network

Neuralcli provides a simple command line interface to a Python implementation of a simple classification neural network. Neuralcli offers a quick and easy way to get instant feedback on a hypothesis or to play around with one of the most popular concepts in machine learning today.

Installation

Installation of neuralcli is provided through pip, just run:

pip install neuralcli

If you don't have some of the libraries used, such as numpy or scikit-learn, the install may take some time while pip installs all the dependencies. After pip finishes the install, run `command -v neuralcli` to check that the executable has been successfully added.

Troubleshooting

When you run neuralcli for the first time you may get output similar to the following:

/usr/local/lib/python2.7/site-packages/matplotlib/font_manager.py:273: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')

This is just a warning from matplotlib and it will not appear the next time you run the command.

Additionally, matplotlib may throw another error that produces output similar to:

**RuntimeError**: Python is not installed as a framework

To fix this issue, follow the steps outlined here.

Use

Neuralcli comes bundled with three main commands.

Train

The train command takes a set of input features along with their expected outputs and performs backpropagation to learn the weights for a neural network. These weights can be saved to an output file for use in classification predictions later. The command takes the following arguments.

parameters:

| name | type | description | example |
| ---- | ---- | ----------- | ------- |
| X | file | A file path to a CSV which holds your training data | ./train.csv |
| Y | file | A file path to a CSV which holds the expected outputs for the training examples | ./expected.csv |

flags:

| name | type | description | default | example |
| ---- | ---- | ----------- | ------- | ------- |
| --lam | float | The regularization amount | 1 | 0.07 |
| --maxiter | int | The maximum number of iterations used to minimise the cost function | 250 | 30 |
| --output | string | A file path to save the minimised parameters to | nil | ./output.csv |
| --normalize | bool | Perform normalization on the training set | true | false |
| --verbose | bool | Output the training progress | true | false |

example:

$ neuralcli train ./X.csv ./Y.csv --output=./weights.csv --normalize=true

Once you run the train command the neural network will initialize and begin to learn the weights. If the --verbose flag is set to true you will see the training progress printed as it runs.
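
The X and Y files are plain numeric CSVs. As a hedged sketch (the one-example-per-row layout is an assumption, not something documented above), here is one way to generate ./X.csv and ./Y.csv from the iris dataset using numpy and scikit-learn, both of which neuralcli already depends on:

```python
# Sketch: writing X.csv / Y.csv for `neuralcli train` from the iris dataset.
# Assumes one training example per row in X and one integer class label per
# row in Y; check your own data against what neuralcli expects.
import numpy as np
from sklearn import datasets

iris = datasets.load_iris()

np.savetxt("./X.csv", iris.data, delimiter=",")              # 150 rows x 4 feature columns
np.savetxt("./Y.csv", iris.target, delimiter=",", fmt="%d")  # 150 class labels (0, 1 or 2)
```

The resulting files can then be passed directly to the train example above.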

Predict

The predict command takes a set of learned weights and a given input and predicts an output. The learned weights are loaded into the neural network from a file which holds them as a rolled 1 * n vector. For the predict command to work correctly these parameters need to be unrolled, so you must provide the sizes of the input layer, hidden layer, and output labels that you wish to unroll the parameters into.

parameters:

| name | type | description | example |
| ---- | ---- | ----------- | ------- |
| x | file | The file that holds the 1 * n row example that should be predicted | ./input.csv |
| params | file | The file that holds a 1 * n rolled parameter vector (saved from the train command) | ./output.csv |
| labels | int | The size of the output layer that the parameters were trained on | 3 |

flags:

| name | type | description | default | example |
| ---- | ---- | ----------- | ------- | ------- |
| --normalize | bool | Perform normalization on the training set | true | false |
| --sizeh | int | The size of the hidden layer if it differs from the input layer | nil | 8 |

example:

$ neuralcli predict ./x.csv 3 ./params.csv 

Neuralcli will then print a prediction as an integer corresponding to the index of your output labels, e.g. 0 corresponds to your first classification label.
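
To make the rolled versus unrolled distinction concrete, here is a minimal sketch of how a rolled parameter vector could be reshaped into the two weight matrices of a single-hidden-layer network and used for a prediction. The bias units and sigmoid activation below are assumptions about a typical implementation, not documented neuralcli internals:

```python
# Sketch: reshaping a rolled 1 * n parameter vector into the two weight
# matrices of a single-hidden-layer network and running a feed-forward
# prediction. Bias units and sigmoid activation are assumptions.
import numpy as np

input_size = 4   # number of columns in x.csv
hidden_size = 4  # --sizeh, assumed to default to the input layer size
labels = 3       # output layer size passed to `neuralcli predict`

theta = np.loadtxt("./params.csv", delimiter=",")

# With bias units the rolled vector holds
# hidden_size * (input_size + 1) + labels * (hidden_size + 1) values.
split = hidden_size * (input_size + 1)
theta1 = theta[:split].reshape(hidden_size, input_size + 1)
theta2 = theta[split:].reshape(labels, hidden_size + 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.loadtxt("./x.csv", delimiter=",")      # the 1 * n example to predict
a1 = np.concatenate(([1.0], x))               # add the bias unit
a2 = np.concatenate(([1.0], sigmoid(theta1 @ a1)))
a3 = sigmoid(theta2 @ a2)

print(int(np.argmax(a3)))  # the label index that `neuralcli predict` reports
```

The final argmax over the output layer is why the prediction is reported as the index of a label rather than the label itself.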

Test

The test command gives some primitive feedback about the correctness of your hypothesis by running a diagnostic check on the given data set and expected outputs. It plots the margin of prediction error against the increasing size of the training set. This can be useful to determine what is going wrong with your hypothesis, i.e. whether it is underfitting or overfitting the training set.

parameters:

| name | type | description | example |
| ---- | ---- | ----------- | ------- |
| X | file | A file path to a CSV which holds your training data | ./train.csv |
| Y | file | A file path to a CSV which holds the expected outputs for the training examples | ./expected.csv |

flags:

| name | type | description | default | example |
| ---- | ---- | ----------- | ------- | ------- |
| --lam | float | The regularization amount | 1 | 0.07 |
| --maxiter | int | The maximum number of iterations used to minimise the cost function | 250 | 30 |
| --normalize | bool | Perform normalization on the training set | true | false |
| --verbose | bool | Output the training progress | true | false |
| --step | int | The increments that the training set will be increased by | 10 | 100 |

example:

$ neuralcli test ./X.csv ./Y.csv --step=50 --normalize=true

Neuralcli will then run the test sequence, printing its progress as it increases the size of the training set.

After this runs it will print a plot of the hypothesis error against the size of the training set the weights were learned on, for example a learning curve plotted from the iris dataset.
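
For intuition about what the test command is doing, the sketch below reproduces the learning-curve idea: train on progressively larger slices of the data and plot prediction error against training set size. It uses scikit-learn's MLPClassifier as a stand-in network and is only an illustration of the technique, not neuralcli's own code:

```python
# Conceptual sketch of the learning curve behind `neuralcli test`: train on
# progressively larger slices of the data and plot prediction error against
# training set size. MLPClassifier is a stand-in, not neuralcli's network.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neural_network import MLPClassifier

X = np.loadtxt("./X.csv", delimiter=",")
Y = np.loadtxt("./Y.csv", delimiter=",")

# Shuffle so that every slice contains a mix of classes.
order = np.random.default_rng(0).permutation(len(X))
X, Y = X[order], Y[order]

step = 10  # mirrors the --step flag
sizes, errors = [], []
for m in range(step, len(X) + 1, step):
    clf = MLPClassifier(max_iter=250).fit(X[:m], Y[:m])
    sizes.append(m)
    errors.append(1.0 - clf.score(X, Y))  # prediction error on the full set

plt.plot(sizes, errors)
plt.xlabel("training set size")
plt.ylabel("prediction error")
plt.show()
```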
