
nimblenet

This is an efficient implementation of a fully connected neural network in NumPy. The network can be trained by a variety of learning algorithms: backpropagation, resilient backpropagation, and scaled conjugate gradient learning. The network has been developed with PyPy in mind.


Neural network written in Python (NumPy)

This is an implementation of a fully connected neural network in NumPy. By using the matrix approach to neural networks, this NumPy implementation is able to harness the power of the BLAS library and efficiently perform the required calculations. The network can be trained by a wide range of learning algorithms.
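To illustrate the matrix approach mentioned above (a minimal NumPy sketch, not nimblenet's internal code): propagating an entire batch of samples through one fully connected layer reduces to a single matrix product, which NumPy dispatches to BLAS.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))   # batch of 4 samples, 2 inputs each
W = rng.standard_normal((2, 3))   # weights of a layer with 3 neurons
b = np.zeros(3)                   # biases

# One matrix multiplication activates the layer for the whole batch.
activations = sigmoid(X @ W + b)  # shape (4, 3)
```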

Visit the project page or read the documentation.

The code has been tested.

Implemented learning algorithms:

  • Vanilla Backpropagation
  • Backpropagation with classical momentum
  • Backpropagation with Nesterov momentum
  • RMSprop
  • Adagrad
  • Adam
  • Resilient Backpropagation
  • Scaled Conjugate Gradient
  • SciPy’s Optimize
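For reference, the RMSprop update from the list above can be sketched in plain NumPy. This is a generic illustration of the algorithm with hypothetical parameter names; nimblenet's own implementation may differ in details.

```python
import numpy as np

def rmsprop_step(w, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    """One RMSprop update: scale the step by a running RMS of past gradients."""
    cache = decay * cache + (1.0 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

w     = np.array([1.0, -2.0])
cache = np.zeros_like(w)
grad  = np.array([0.5, -0.5])
w, cache = rmsprop_step(w, grad, cache)
```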

Installation


pip install nimblenet

Requirements

  • Python
  • NumPy
  • Optionally: SciPy

This library has been written with PyPy in mind. Use its JIT compiler to run this code blazingly fast.

Features:

  • Implemented with matrix operations to ensure high performance.
  • Dropout regularization is available to reduce overfitting. Implemented as described here.
  • Martin Møller's Scaled Conjugate Gradient for Fast Supervised Learning as published here.
  • PyPy friendly (requires pypy-numpy).
  • Features a selection of cost functions (error functions) and activation functions.
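The dropout regularization listed above can be sketched as inverted dropout in NumPy (a generic illustration of the technique, not nimblenet's code): units are zeroed at random during training and the survivors are rescaled so the expected activation stays the same.

```python
import numpy as np

def dropout(activations, p_keep, rng):
    """Inverted dropout: drop units with probability 1 - p_keep,
    rescale the survivors by 1 / p_keep."""
    mask = rng.random(activations.shape) < p_keep
    return activations * mask / p_keep

rng = np.random.default_rng(42)
a = np.ones((2, 5))
dropped = dropout(a, p_keep=0.8, rng=rng)
# Roughly 20% of the entries are zeroed; the rest become 1 / 0.8 = 1.25.
```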

Example Usage


    from nimblenet.activation_functions import sigmoid_function
    from nimblenet.cost_functions import cross_entropy_cost
    from nimblenet.learning_algorithms import RMSprop
    from nimblenet.data_structures import Instance
    from nimblenet.neuralnet import NeuralNet


    dataset        = [
        Instance( [0,0], [0] ), Instance( [1,0], [1] ), Instance( [0,1], [1] ), Instance( [1,1], [0] )
    ]

    settings       = {
        "n_inputs" : 2,
        "layers"   : [  (2, sigmoid_function), (1, sigmoid_function) ]
    }

    network        = NeuralNet( settings )
    training_set   = dataset
    test_set       = dataset
    cost_function  = cross_entropy_cost


    RMSprop(
            network,           # the network to train
            training_set,      # specify the training set
            test_set,          # specify the test set
            cost_function,     # specify the cost function to calculate error
        )
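The cross_entropy_cost used above is, in essence, binary cross-entropy. A standalone NumPy equivalent looks like this (an illustration of the cost, not nimblenet's exact signature):

```python
import numpy as np

def binary_cross_entropy(outputs, targets, eps=1e-12):
    """Mean binary cross-entropy between network outputs and targets.
    Outputs are clipped away from 0 and 1 to keep the logs finite."""
    outputs = np.clip(outputs, eps, 1.0 - eps)
    return -np.mean(targets * np.log(outputs)
                    + (1.0 - targets) * np.log(1.0 - outputs))

outputs = np.array([0.9, 0.1, 0.8, 0.2])
targets = np.array([1.0, 0.0, 1.0, 0.0])
cost = binary_cross_entropy(outputs, targets)  # small, since outputs are close
```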
