Commit Graph

5 Commits

Author SHA1 Message Date
David Monllaó
e83f7b95d5 Fix activation functions support (#163)
- Backpropagation now uses each neuron's activation function derivative instead of a hardcoded sigmoid derivative (see the sketch after this entry)
- Added missing activation function derivatives
- Sigmoid forced for the output layer
- Updated ThresholdedReLU default threshold to 0 (acts as a ReLU)
- Unit tests for derivatives
- Unit tests for classifiers using different activation functions
- Added missing docs
2018-01-09 11:09:59 +01:00
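
The first bullet of this commit concerns the derivative used in the backward pass. Below is a minimal Python sketch of the idea only, not the library's code (PHP-ML is written in PHP, and all names here are illustrative): the error signal for a neuron is scaled by that neuron's own activation derivative rather than by a hardcoded sigmoid derivative.

```python
import math

ACTIVATIONS = {
    # name: (forward, derivative written in terms of the forward output)
    "sigmoid": (lambda z: 1.0 / (1.0 + math.exp(-z)), lambda out: out * (1.0 - out)),
    "tanh":    (math.tanh,                            lambda out: 1.0 - out ** 2),
    "relu":    (lambda z: max(0.0, z),                lambda out: 1.0 if out > 0 else 0.0),
}

def output_delta(target, output, activation):
    """Error signal for one output neuron: (output - target) * activation'(output)."""
    _, derivative = ACTIVATIONS[activation]
    return (output - target) * derivative(output)

# With sigmoid hardcoded, a ReLU neuron would be scaled by out * (1 - out);
# dispatching on the activation gives each neuron its correct gradient.
print(output_delta(1.0, 0.7, "sigmoid"))  # -0.3 * 0.21 = -0.063
print(output_delta(1.0, 0.7, "relu"))     # -0.3 * 1.0  = -0.3
```
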
David Monllaó
c4ad117d28 Ability to update learningRate in MLP (#160)
* Allow people to update the learning rate after construction (see the sketch after this entry)

* Test for learning rate setter
2017-12-05 21:09:06 +01:00
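
As a rough illustration of what a learning-rate setter enables, here is a hypothetical Python sketch (the class and method names are assumptions, not PHP-ML's API): the step size can be changed between training rounds without rebuilding the model.

```python
class TinyModel:
    """Hypothetical one-weight model trained by gradient descent."""

    def __init__(self, learning_rate=0.1):
        self.learning_rate = learning_rate
        self.weight = 0.0

    def set_learning_rate(self, learning_rate):
        # The capability described by the commit: adjust the step size later on.
        self.learning_rate = learning_rate

    def train_step(self, x, target):
        prediction = self.weight * x
        gradient = (prediction - target) * x      # d(squared error)/d(weight), up to a constant
        self.weight -= self.learning_rate * gradient

model = TinyModel(learning_rate=0.5)
model.train_step(1.0, 1.0)       # large early step
model.set_learning_rate(0.05)    # e.g. decay the rate for later epochs
model.train_step(1.0, 1.0)       # smaller refinement step
print(round(model.weight, 4))    # 0.525
```
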
David Monllaó
b1d40bfa30 Change from theta to learning rate var name in NN (#159) 2017-11-20 23:39:50 +01:00
David Monllaó
de50490154 Neural networks partial training and persistence (#91)
* Neural networks partial training and persistence (see the sketch after this entry)

* Coding style fixes

* Add partialTrain to nn docs

* Test for invalid classes provided to partial training
2017-05-23 09:03:05 +02:00
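
A hedged Python sketch of the two capabilities named above, using hypothetical names rather than PHP-ML's API: partial training updates an existing model with an additional batch of samples, and persistence saves and restores the trained state.

```python
import pickle

class IncrementalClassifier:
    """Hypothetical classifier that keeps its state across training batches."""

    def __init__(self):
        self.label_counts = {}

    def partial_train(self, samples, labels):
        # Update with an additional batch instead of retraining from scratch.
        for label in labels:
            self.label_counts[label] = self.label_counts.get(label, 0) + 1

    def predict(self, samples):
        majority = max(self.label_counts, key=self.label_counts.get)
        return [majority for _ in samples]

model = IncrementalClassifier()
model.partial_train([[0], [1]], ["a", "b"])
model.partial_train([[2], [3]], ["b", "b"])      # later batch; earlier state is kept

with open("model.pkl", "wb") as fh:              # persistence: save the trained model ...
    pickle.dump(model, fh)
with open("model.pkl", "rb") as fh:              # ... and restore it later
    restored = pickle.load(fh)
print(restored.predict([[4]]))                   # ['b']
```
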
David Monllaó
4af8449b1c Neural networks improvements (#89)
* MultilayerPerceptron interface changes (see the sketch after this entry)

- Signature closer to other algorithms
- New predict method
- Remove desired error
- Move maxIterations to constructor

* MLP tests for multiple hidden layers and multi-class

* Update all MLP-related tests

* Coding style fixes

* Backpropagation included in multilayer-perceptron
2017-05-18 00:07:14 +02:00
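
To make the interface bullets concrete, here is a speculative Python outline of the shape they describe (all names are assumptions, not the library's signatures): the iteration budget moves into the constructor, the desired-error stop goes away, and prediction becomes a separate predict() call.

```python
class SketchMLP:
    """Illustrative skeleton only; it does not actually learn anything."""

    def __init__(self, hidden_layers, classes, max_iterations=10000):
        self.hidden_layers = hidden_layers
        self.classes = classes
        self.max_iterations = max_iterations   # set at construction; no desired-error argument

    def train(self, samples, targets):
        for _ in range(self.max_iterations):   # fixed iteration budget instead of an error threshold
            pass                               # one backpropagation pass per iteration in a real MLP

    def predict(self, samples):
        return [self.classes[0] for _ in samples]   # placeholder: always the first class

clf = SketchMLP(hidden_layers=[3, 3], classes=["a", "b", "c"], max_iterations=1000)
clf.train([[0.0, 1.0], [1.0, 0.0]], ["a", "b"])
print(clf.predict([[1.0, 1.0]]))   # ['a'] from the placeholder
```
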