Announcing SuperNN 1.0.0


After five years of development, I am happy to announce version 1.0.0 of SuperNN. The source code and binaries are available here.

SuperNN is a C++ library that implements supervised training algorithms for artificial neural networks. It provides the classic backpropagation-based algorithms, including efficient variants such as iRPROP [Igel and Hüsken, 2000] and the second-order Neuron-by-Neuron method [Wilamowski and Yu, 2010].
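
To give a flavor of what iRPROP does, here is a minimal, library-independent sketch of the iRPROP- update rule: each weight keeps its own step size, which grows while the gradient keeps its sign and shrinks when the sign flips. This is not SuperNN code; the structure and the constants are simply the usual ones from the paper.

// Minimal sketch of the iRPROP- per-weight update (not SuperNN's API).
#include <algorithm>

struct IRpropState
{
    double step = 0.1;      // per-weight step size (Delta)
    double prev_grad = 0.0; // gradient seen in the previous epoch
};

void irprop_update(double &weight, double grad, IRpropState &s)
{
    const double eta_plus = 1.2, eta_minus = 0.5; // typical values
    const double step_min = 1e-6, step_max = 50.0;

    if (s.prev_grad * grad > 0) {
        // Same sign as before: accelerate.
        s.step = std::min(s.step * eta_plus, step_max);
    } else if (s.prev_grad * grad < 0) {
        // Sign change: a minimum was stepped over. Shrink the step and
        // zero the gradient so this update is skipped (the modification
        // that distinguishes iRPROP- from plain RPROP).
        s.step = std::max(s.step * eta_minus, step_min);
        grad = 0.0;
    }

    if (grad > 0)
        weight -= s.step;
    else if (grad < 0)
        weight += s.step;

    s.prev_grad = grad;
}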

SuperNN's only dependency, at compile time, is Eigen, and it is licensed under the LGPL. It also provides an exporter tool that generates standalone Java or C++ code, so trained networks can be used without SuperNN at runtime. The library has been tested on multiple GNU/Linux distributions, and on MS-Windows with VS2010 and VS2013.
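
To illustrate the idea behind the exporter (this is a hypothetical sketch, not its actual output), the generated code amounts to a plain feed-forward evaluation with the trained weights hard-coded, free of any external dependency:

// Hypothetical sketch of what exporter-style standalone code could look
// like; the layout and weight values below are made-up placeholders.
#include <cmath>

double evaluate_xor(double x0, double x1)
{
    // Hidden layer: 2 neurons, symmetric sigmoid (tanh) activation.
    const double w_h[2][3] = { // {bias, w0, w1} per neuron
        { 0.5,  1.3,  1.3},
        {-0.8,  2.1,  2.1},
    };
    double h[2];
    for (int i = 0; i < 2; ++i)
        h[i] = std::tanh(w_h[i][0] + w_h[i][1] * x0 + w_h[i][2] * x1);

    // Output layer: a single neuron.
    const double w_o[3] = {0.2, -1.7, 1.9}; // {bias, h0, h1}
    return std::tanh(w_o[0] + w_o[1] * h[0] + w_o[2] * h[1]);
}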

Feel free to submit suggestions and pull requests!

Usage example

The program below solves the classic exclusive-or problem. Since the training/test data is small, it can be embedded statically in the program, which makes it suitable as a standalone example.

#include <supernn>
#include <iostream>

using namespace SuperNN;
using namespace std;

int main()
{
    // Create the network: 2 inputs, 2 hidden neurons, 1 output
    Network net = Network::make_fcc(2, 2, 1);
    net.set_activation(ACT_SIGMOID_SYMMETRIC, 0.5); // outputs in (-1, 1)
    net.init_weights(-0.2, 0.2); // random initial weights in [-0.2, 0.2]

    // Create the Neuron-by-Neuron trainer and prepare the network
    NBN t;
    t.prepare(net);
    
    // Training data, in memory: 4 rows of {input 1, input 2, expected output}
    const double data[] = {
        -1.0, -1.0, -1.0,
        -1.0, +1.0, +1.0,
        +1.0, -1.0, +1.0,
        +1.0, +1.0, -1.0
    };
    const Data full(4, 3, data); // 4 rows, 3 columns
    
    // Train the network (desired error of 1e-5, at most 50 epochs)
    const unsigned e = t.train(net, full, 1e-5, 50);
    
    // Print some statistics
    cout << "Epochs        : " << e << endl;
    cout << "Classification: " << net.calc_class(full) << endl;
    
    return 0;
}

There are more examples in the tests folder of the source code package.