Mirror of https://github.com/Llewellynvdm/php-ml.git, synced 2024-11-05 04:57:52 +00:00
Commit e83f7b95d5
- Backpropagation now uses each neuron's activation function derivative instead of a hardcoded sigmoid derivative (see the sketch below)
- Added missing activation function derivatives
- Sigmoid forced for the output layer
- Updated ThresholdedReLU default threshold to 0 (acts as a ReLU)
- Unit tests for derivatives
- Unit tests for classifiers using different activation functions
- Added missing docs
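To illustrate the idea behind the change, here is a minimal, self-contained PHP sketch rather than php-ml's actual classes or signatures: an activation function exposes both its output and its derivative, `ThresholdedReLU` defaults its threshold to 0 so it behaves like a plain ReLU, and the backward pass scales the error by whichever derivative the neuron's activation function provides instead of assuming sigmoid. Names such as `derivative()` and the constructor parameter `$theta` are illustrative assumptions, not the library's API.

```php
<?php
// Illustrative sketch only -- not php-ml's actual interfaces or signatures.

interface ActivationFunction
{
    public function compute(float $value): float;

    // Derivative expressed in terms of the already-computed output,
    // which is what a backward pass typically has at hand.
    public function derivative(float $computed): float;
}

final class Sigmoid implements ActivationFunction
{
    public function compute(float $value): float
    {
        return 1.0 / (1.0 + exp(-$value));
    }

    public function derivative(float $computed): float
    {
        // sigma'(x) = sigma(x) * (1 - sigma(x))
        return $computed * (1.0 - $computed);
    }
}

final class ThresholdedReLU implements ActivationFunction
{
    private $theta;

    // A default threshold of 0 makes this behave exactly like a plain ReLU.
    public function __construct(float $theta = 0.0)
    {
        $this->theta = $theta;
    }

    public function compute(float $value): float
    {
        return $value > $this->theta ? $value : 0.0;
    }

    public function derivative(float $computed): float
    {
        return $computed > $this->theta ? 1.0 : 0.0;
    }
}

// Backward pass for an output neuron: the error signal is scaled by the
// derivative of *that neuron's* activation function, not a hardcoded sigmoid.
function outputDelta(float $computed, float $target, ActivationFunction $fn): float
{
    return ($computed - $target) * $fn->derivative($computed);
}

$sigmoid = new Sigmoid();
echo outputDelta($sigmoid->compute(0.3), 1.0, $sigmoid), PHP_EOL;

$relu = new ThresholdedReLU(); // threshold 0 => ReLU behaviour
echo outputDelta($relu->compute(0.3), 1.0, $relu), PHP_EOL;
```

With this structure, adding a new activation function only requires supplying its `compute()` and `derivative()` pair; the backpropagation code stays unchanged.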
Changed paths:

- ActivationFunction
- Network
- Node
- LayerTest.php