- Backpropagation now uses each neuron's activation function derivative instead of a hardcoded sigmoid derivative (see the sketch after this list)
- Added missing activation function derivatives
- Sigmoid is still forced for the output layer
- Updated ThresholdedReLU default threshold to 0 (acts as a ReLU)
- Unit tests for derivatives
- Unit tests for classifiers using different activation functions
- Added missing docs
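A minimal sketch of what the first change amounts to, written in Python with hypothetical `Sigmoid`/`ThresholdedReLU` classes and a `derivative` method (none of these names are taken from the actual codebase): the backward pass asks the neuron's own activation function for its derivative instead of hardcoding the sigmoid's, and a `ThresholdedReLU` with its new default threshold of 0 behaves like a plain ReLU. The finite-difference check at the end is one plausible shape for the derivative unit tests.

```python
import numpy as np

# Hypothetical activation classes, for illustration only.
class Sigmoid:
    def compute(self, x):
        return 1.0 / (1.0 + np.exp(-x))

    def derivative(self, x):
        s = self.compute(x)
        return s * (1.0 - s)

class ThresholdedReLU:
    def __init__(self, threshold=0.0):
        # Default threshold of 0 makes this act as a plain ReLU.
        self.threshold = threshold

    def compute(self, x):
        return np.where(x > self.threshold, x, 0.0)

    def derivative(self, x):
        return np.where(x > self.threshold, 1.0, 0.0)

def backprop_delta(activation, net_input, error):
    # Before: error * sigmoid_derivative(net_input)  (hardcoded)
    # After:  the neuron's own activation supplies its derivative.
    return error * activation.derivative(net_input)

def test_sigmoid_derivative():
    # Finite-difference check: f'(x) ~ (f(x + h) - f(x - h)) / (2h)
    act, x, h = Sigmoid(), 0.3, 1e-6
    numeric = (act.compute(x + h) - act.compute(x - h)) / (2 * h)
    assert abs(act.derivative(x) - numeric) < 1e-6
```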
* Multiple training data sets allowed
* Tests with multiple training data sets
* Updated docs according to #38
Documented all models whose predictions are based on all of the training data provided.
Some models already supported multiple training data sets.
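A minimal sketch, assuming a hypothetical `KNNClassifier` with a `train()` method (names invented for illustration, not taken from the library), of what allowing multiple training data sets means in practice: each `train()` call accumulates samples instead of replacing them, so predictions draw on all data provided across calls.

```python
# Hypothetical classifier whose train() can be called repeatedly,
# accumulating samples, so predictions use all data seen so far.
class KNNClassifier:
    def __init__(self, k=3):
        self.k = k
        self.samples = []
        self.labels = []

    def train(self, samples, labels):
        # Each call appends rather than replaces, so multiple
        # training data sets can be supplied over time.
        self.samples.extend(samples)
        self.labels.extend(labels)

    def predict(self, sample):
        # Prediction is based on *all* training data provided so far.
        by_distance = sorted(
            range(len(self.samples)),
            key=lambda i: sum((a - b) ** 2 for a, b in zip(self.samples[i], sample)),
        )
        nearest = [self.labels[i] for i in by_distance[: self.k]]
        return max(set(nearest), key=nearest.count)

clf = KNNClassifier()
clf.train([[1, 1], [2, 2]], ["a", "a"])   # first data set
clf.train([[9, 9], [8, 8]], ["b", "b"])   # second data set
print(clf.predict([1.5, 1.5]))            # "a" -- uses samples from both calls
```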