Add docs for neural network

Arkadiusz Kondas 2016-08-14 19:14:56 +02:00
parent b1978cf5ca
commit 3599367ce8
7 changed files with 72 additions and 5 deletions

View File

@@ -10,7 +10,7 @@
![PHP-ML - Machine Learning library for PHP](docs/assets/php-ml-logo.png)
Fresh approach to Machine Learning in PHP. Algorithms, Cross Validation, Preprocessing, Feature Extraction and much more in one library.
Fresh approach to Machine Learning in PHP. Algorithms, Cross Validation, Neural Network, Preprocessing, Feature Extraction and much more in one library.
PHP-ML requires PHP >= 7.0.
@@ -62,6 +62,9 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* [Classification Report](http://php-ml.readthedocs.io/en/latest/machine-learning/metric/classification-report/)
* Workflow
* [Pipeline](http://php-ml.readthedocs.io/en/latest/machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/backpropagation/)
* Cross Validation
* [Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/random-split/)
* [Stratified Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/stratified-random-split/)

View File

@@ -3,7 +3,7 @@
"type": "library",
"description": "PHP-ML - Machine Learning library for PHP",
"license": "MIT",
"keywords": ["machine learning","pattern recognition","computational learning theory","artificial intelligence"],
"keywords": ["machine learning","pattern recognition","neural network","computational learning theory","artificial intelligence"],
"homepage": "https://github.com/php-ai/php-ml",
"authors": [
{

View File

@@ -62,6 +62,9 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* [Classification Report](machine-learning/metric/classification-report/)
* Workflow
* [Pipeline](machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](machine-learning/neural-network/backpropagation/)
* Cross Validation
* [Random Split](machine-learning/cross-validation/random-split/)
* [Stratified Random Split](machine-learning/cross-validation/stratified-random-split/)

View File

@@ -2,7 +2,7 @@
Classifier implementing the k-nearest neighbors algorithm.
### Constructor Parameters
## Constructor Parameters
* $k - number of nearest neighbors to scan (default: 3)
* $distanceMetric - Distance object, default Euclidean (see [distance documentation](math/distance/))
@@ -12,7 +12,7 @@ $classifier = new KNearestNeighbors($k=4);
$classifier = new KNearestNeighbors($k=3, new Minkowski($lambda=4));
```
### Train
## Train
To train a classifier, simply provide training samples and labels (as `array`). Example:
@@ -24,7 +24,7 @@ $classifier = new KNearestNeighbors();
$classifier->train($samples, $labels);
```
### Predict
## Predict
To predict a sample's label, use the `predict` method. You can provide one sample or an array of samples:

View File

@@ -0,0 +1,29 @@
# Backpropagation
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction with an optimization method such as gradient descent.
## Constructor Parameters
* $network (Network) - the network to train (for example, a MultilayerPerceptron instance)
* $theta (int) - network theta parameter
```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
$network = new MultilayerPerceptron([2, 2, 1]);
$training = new Backpropagation($network);
```
## Training
Example of XOR training:
```
$training->train(
$samples = [[1, 0], [0, 1], [1, 1], [0, 0]],
$targets = [[1], [1], [0], [0]],
$desiredError = 0.2,
$maxIterations = 30000
);
```
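Once training finishes, the trained network can be queried directly, continuing from the snippets above. This is a minimal sketch: it assumes `Backpropagation` updates the `$network` instance passed to its constructor in place, and uses the `setInput()`/`getOutput()` methods listed in the MultilayerPerceptron docs:

```
// Continues the example above: $network was trained in place by $training->train(...).
foreach ([[1, 0], [0, 1], [1, 1], [0, 0]] as $sample) {
    $network->setInput($sample);     // feed one XOR sample into the input layer
    print_r($network->getOutput());  // one value per output neuron; expected close to 1, 1, 0, 0
}
```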

View File

@@ -0,0 +1,29 @@
# MultilayerPerceptron
A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs.
## Constructor Parameters
* $layers (array) - layer configuration; each value represents the number of neurons in the corresponding layer
* $activationFunction (ActivationFunction) - neuron activation function
```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
$mlp = new MultilayerPerceptron([2, 2, 1]);
// 2 nodes in the input layer, 2 nodes in the hidden layer and 1 node in the output layer
```
## Methods
* setInput(array $input)
* getOutput()
* getLayers()
* addLayer(Layer $layer)
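For orientation, here is a rough sketch exercising these methods on an untrained network; it assumes `getLayers()` returns the layers as a plain array and `getOutput()` returns one value per output neuron:

```
$mlp = new MultilayerPerceptron([2, 2, 1]);

echo count($mlp->getLayers());  // 3 layers for the [2, 2, 1] configuration (assumed array return)

$mlp->setInput([1, 0]);         // feed a single sample into the input layer
print_r($mlp->getOutput());     // untrained output: one value for the single output neuron
```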
## Activation Functions
* BinaryStep
* Gaussian
* HyperbolicTangent
* Sigmoid (default)
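A non-default activation function can be passed as the second constructor parameter. The sketch below assumes the class path `Phpml\NeuralNetwork\ActivationFunction\HyperbolicTangent`, inferred from the library's namespace conventions rather than stated on this page:

```
use Phpml\NeuralNetwork\ActivationFunction\HyperbolicTangent;  // assumed class path
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;

// Same [2, 2, 1] topology as above, but with tanh neurons instead of the default Sigmoid.
$mlp = new MultilayerPerceptron([2, 2, 1], new HyperbolicTangent());
```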

View File

@@ -18,6 +18,9 @@ pages:
- Classification Report: machine-learning/metric/classification-report.md
- Workflow:
- Pipeline: machine-learning/workflow/pipeline.md
- Neural Network:
- Multilayer Perceptron: machine-learning/neural-network/multilayer-perceptron.md
- Backpropagation training: machine-learning/neural-network/backpropagation.md
- Cross Validation:
- RandomSplit: machine-learning/cross-validation/random-split.md
- Stratified Random Split: machine-learning/cross-validation/stratified-random-split.md