After five years of development, I am happy to announce version 1.0.0 of SuperNN. The source code and binaries are available here.
SuperNN is a C++ library that implements supervised training algorithms for artificial neural networks. It provides the classic backpropagation algorithm as well as more efficient variants, such as iRPROP [Igel and Hüsken, 2000] and the second-order Neuron-by-Neuron algorithm [Wilamowski and Yu, 2010].
SuperNN's only dependency is Eigen, and only at compile time; it is licensed under the LGPL. It also provides an exporter tool that generates standalone Java or C++ code, so trained neural networks can be used without requiring SuperNN at runtime. It has been tested on multiple GNU/Linux distributions and on MS-Windows with VS2010 and VS2013.
Feel free to submit suggestions and pull requests!
Usage example
The program below solves the classic exclusive-or problem. Because the training/test data set is small, it can be embedded statically in the program, which makes it well suited as a standalone example.
There are more examples in the tests folder of the source code package.