Neural networks improvements (#89)

* MultilayerPerceptron interface changes

- Signature closer to other algorithms
- New predict method
- Remove desired error
- Move maxIterations to constructor

* MLP tests for multiple hidden layers and multi-class

* Update all MLP-related tests

* Coding style fixes

* Backpropagation included in multilayer-perceptron
David Monllaó 2017-05-18 06:07:14 +08:00 committed by Arkadiusz Kondas
parent 7ab80b6e56
commit 4af8449b1c
18 changed files with 371 additions and 345 deletions


@@ -76,8 +76,7 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* Workflow
* [Pipeline](http://php-ml.readthedocs.io/en/latest/machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/backpropagation/)
* [Multilayer Perceptron Classifier](http://php-ml.readthedocs.io/en/latest/machine-learning/neural-network/multilayer-perceptron-classifier/)
* Cross Validation
* [Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/random-split/)
* [Stratified Random Split](http://php-ml.readthedocs.io/en/latest/machine-learning/cross-validation/stratified-random-split/)


@@ -65,8 +65,7 @@ Example scripts are available in a separate repository [php-ai/php-ml-examples](
* Workflow
* [Pipeline](machine-learning/workflow/pipeline)
* Neural Network
* [Multilayer Perceptron](machine-learning/neural-network/multilayer-perceptron/)
* [Backpropagation training](machine-learning/neural-network/backpropagation/)
* [Multilayer Perceptron Classifier](machine-learning/neural-network/multilayer-perceptron-classifier/)
* Cross Validation
* [Random Split](machine-learning/cross-validation/random-split/)
* [Stratified Random Split](machine-learning/cross-validation/stratified-random-split/)


@@ -1,30 +0,0 @@
# Backpropagation
Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks used in conjunction with an optimization method such as gradient descent.
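As a sketch of the math involved (the standard delta rule for sigmoid activations, which matches the sigma computation in this library's `Backpropagation` class), for an output neuron with output $o$ and target $t$:

$$\delta_{\text{out}} = o\,(1 - o)\,(t - o)$$

For a hidden neuron, the error is accumulated from the following layer's deltas through the connecting weights $w_{hk}$:

$$\delta_h = o_h\,(1 - o_h) \sum_k \delta_k\, w_{hk}$$

Each synapse weight is then adjusted by the theta parameter $\theta$ times the delta times the presynaptic output $x$:

$$\Delta w = \theta\, \delta\, x$$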
## Constructor Parameters
* $network (Network) - network to train (for example MultilayerPerceptron instance)
* $theta (int) - network theta parameter (the multiplier applied to each weight update, i.e. the learning rate)
```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
$network = new MultilayerPerceptron([2, 2, 1]);
$training = new Backpropagation($network);
```
## Training
Example of XOR training:
```
$training->train(
$samples = [[1, 0], [0, 1], [1, 1], [0, 0]],
$targets = [[1], [1], [0], [0]],
$desiredError = 0.2,
$maxIterations = 30000
);
```
You can train the neural network using multiple data sets; predictions will be based on all the training data.


@@ -0,0 +1,50 @@
# MLPClassifier
A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs.
## Constructor Parameters
* $inputLayerFeatures (int) - the number of input layer features
* $hiddenLayers (array) - array with the hidden layers configuration; each value represents the number of neurons in that layer
* $classes (array) - array with the different training set classes (array keys are ignored)
* $iterations (int) - number of training iterations
* $theta (int) - network theta parameter (the multiplier applied to each weight update, i.e. the learning rate)
* $activationFunction (ActivationFunction) - neuron activation function
```
use Phpml\Classification\MLPClassifier;
$mlp = new MLPClassifier(4, [2], ['a', 'b', 'c']);
// 4 nodes in input layer, 2 nodes in first hidden layer and 3 possible labels.
```
## Train
To train an MLP, simply provide training samples and labels (as arrays). Example:
```
$mlp->train(
$samples = [[1, 0, 0, 0], [0, 1, 1, 0], [1, 1, 1, 1], [0, 0, 0, 0]],
$targets = ['a', 'a', 'b', 'c']
);
```
## Predict
To predict a sample's label, use the predict method. You can provide a single sample or an array of samples:
```
$mlp->predict([[1, 1, 1, 1], [0, 0, 0, 0]]);
// returns ['b', 'c'];
```
## Activation Functions
* BinaryStep
* Gaussian
* HyperbolicTangent
* Sigmoid (default)
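
Any of the functions above can be passed as the `$activationFunction` constructor argument. A brief sketch, assuming the concrete classes live under the `Phpml\NeuralNetwork\ActivationFunction` namespace (as the class names above suggest):
```
use Phpml\Classification\MLPClassifier;
use Phpml\NeuralNetwork\ActivationFunction\HyperbolicTangent;

// 4 input features, one hidden layer of 2 neurons, 3 classes,
// 5000 training iterations and tanh activation instead of the default sigmoid.
$mlp = new MLPClassifier(4, [2], ['a', 'b', 'c'], 5000, new HyperbolicTangent());
```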


@@ -1,29 +0,0 @@
# MultilayerPerceptron
A multilayer perceptron (MLP) is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs.
## Constructor Parameters
* $layers (array) - array with the layers configuration; each value represents the number of neurons in that layer
* $activationFunction (ActivationFunction) - neuron activation function
```
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
$mlp = new MultilayerPerceptron([2, 2, 1]);
// 2 nodes in input layer, 2 nodes in first hidden layer and 1 node in output layer
```
## Methods
* setInput(array $input)
* getOutput()
* getLayers()
* addLayer(Layer $layer)
## Activation Functions
* BinaryStep
* Gaussian
* HyperbolicTangent
* Sigmoid (default)


@@ -21,8 +21,7 @@ pages:
- Workflow:
- Pipeline: machine-learning/workflow/pipeline.md
- Neural Network:
- Multilayer Perceptron: machine-learning/neural-network/multilayer-perceptron.md
- Backpropagation training: machine-learning/neural-network/backpropagation.md
- Multilayer Perceptron Classifier: machine-learning/neural-network/multilayer-perceptron-classifier.md
- Cross Validation:
- RandomSplit: machine-learning/cross-validation/random-split.md
- Stratified Random Split: machine-learning/cross-validation/stratified-random-split.md


@@ -0,0 +1,67 @@
<?php
declare(strict_types=1);
namespace Phpml\Classification;
use Phpml\Classification\Classifier;
use Phpml\Exception\InvalidArgumentException;
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
use Phpml\NeuralNetwork\ActivationFunction;
use Phpml\NeuralNetwork\Layer;
use Phpml\NeuralNetwork\Node\Bias;
use Phpml\NeuralNetwork\Node\Input;
use Phpml\NeuralNetwork\Node\Neuron;
use Phpml\NeuralNetwork\Node\Neuron\Synapse;
use Phpml\Helper\Predictable;
class MLPClassifier extends MultilayerPerceptron implements Classifier
{
/**
* @param mixed $target
* @return int
*/
public function getTargetClass($target): int
{
if (!in_array($target, $this->classes)) {
throw InvalidArgumentException::invalidTarget($target);
}
return array_search($target, $this->classes);
}
/**
* @param array $sample
*
* @return mixed
*/
protected function predictSample(array $sample)
{
$output = $this->setInput($sample)->getOutput();
$predictedClass = null;
$max = 0;
foreach ($output as $class => $value) {
if ($value > $max) {
$predictedClass = $class;
$max = $value;
}
}
return $this->classes[$predictedClass];
}
/**
* @param array $sample
* @param mixed $target
*/
protected function trainSample(array $sample, $target)
{
// Feed-forward.
$this->setInput($sample)->getOutput();
// Back-propagate.
$this->backpropagation->backpropagate($this->getLayers(), $this->getTargetClass($target));
}
}


@@ -66,6 +66,14 @@ class InvalidArgumentException extends \Exception
return new self('Invalid clusters number');
}
/**
* @return InvalidArgumentException
*/
public static function invalidTarget($target)
{
return new self('Target with value ' . $target . ' is not part of the accepted classes');
}
/**
* @param string $language
*
@@ -89,6 +97,15 @@ class InvalidArgumentException extends \Exception
*/
public static function invalidLayersNumber()
{
return new self('Provide at least 2 layers: 1 input and 1 output');
return new self('Provide at least 1 hidden layer');
}
/**
* @return InvalidArgumentException
*/
public static function invalidClassesNumber()
{
return new self('Provide at least 2 different classes');
}
}


@@ -71,7 +71,7 @@ abstract class LayeredNetwork implements Network
foreach ($this->getLayers() as $layer) {
foreach ($layer->getNodes() as $node) {
if ($node instanceof Neuron) {
$node->refresh();
$node->reset();
}
}
}


@@ -4,34 +4,93 @@ declare(strict_types=1);
namespace Phpml\NeuralNetwork\Network;
use Phpml\Estimator;
use Phpml\Exception\InvalidArgumentException;
use Phpml\NeuralNetwork\Training\Backpropagation;
use Phpml\NeuralNetwork\ActivationFunction;
use Phpml\NeuralNetwork\Layer;
use Phpml\NeuralNetwork\Node\Bias;
use Phpml\NeuralNetwork\Node\Input;
use Phpml\NeuralNetwork\Node\Neuron;
use Phpml\NeuralNetwork\Node\Neuron\Synapse;
use Phpml\Helper\Predictable;
class MultilayerPerceptron extends LayeredNetwork
abstract class MultilayerPerceptron extends LayeredNetwork implements Estimator
{
use Predictable;
/**
* @param array $layers
* @var array
*/
protected $classes = [];
/**
* @var int
*/
private $iterations;
/**
* @var Backpropagation
*/
protected $backpropagation = null;
/**
* @param int $inputLayerFeatures
* @param array $hiddenLayers
* @param array $classes
* @param int $iterations
* @param ActivationFunction|null $activationFunction
* @param int $theta
*
* @throws InvalidArgumentException
*/
public function __construct(array $layers, ActivationFunction $activationFunction = null)
public function __construct(int $inputLayerFeatures, array $hiddenLayers, array $classes, int $iterations = 10000, ActivationFunction $activationFunction = null, int $theta = 1)
{
if (count($layers) < 2) {
if (empty($hiddenLayers)) {
throw InvalidArgumentException::invalidLayersNumber();
}
$this->addInputLayer(array_shift($layers));
$this->addNeuronLayers($layers, $activationFunction);
$nClasses = count($classes);
if ($nClasses < 2) {
throw InvalidArgumentException::invalidClassesNumber();
}
$this->classes = array_values($classes);
$this->iterations = $iterations;
$this->addInputLayer($inputLayerFeatures);
$this->addNeuronLayers($hiddenLayers, $activationFunction);
$this->addNeuronLayers([$nClasses], $activationFunction);
$this->addBiasNodes();
$this->generateSynapses();
$this->backpropagation = new Backpropagation($theta);
}
/**
* @param array $samples
* @param array $targets
*/
public function train(array $samples, array $targets)
{
for ($i = 0; $i < $this->iterations; ++$i) {
$this->trainSamples($samples, $targets);
}
}
/**
* @param array $sample
* @param mixed $target
*/
abstract protected function trainSample(array $sample, $target);
/**
* @param array $sample
* @return mixed
*/
abstract protected function predictSample(array $sample);
/**
* @param int $nodes
*/
@@ -92,4 +151,15 @@ class MultilayerPerceptron extends LayeredNetwork
$nextNeuron->addSynapse(new Synapse($currentNeuron));
}
}
/**
* @param array $samples
* @param array $targets
*/
private function trainSamples(array $samples, array $targets)
{
foreach ($targets as $key => $target) {
$this->trainSample($samples[$key], $target);
}
}
}


@@ -68,7 +68,7 @@ class Neuron implements Node
return $this->output;
}
public function refresh()
public function reset()
{
$this->output = 0;
}


@@ -9,8 +9,6 @@ interface Training
/**
* @param array $samples
* @param array $targets
* @param float $desiredError
* @param int $maxIterations
*/
public function train(array $samples, array $targets, float $desiredError = 0.001, int $maxIterations = 10000);
public function train(array $samples, array $targets);
}


@@ -4,18 +4,11 @@ declare(strict_types=1);
namespace Phpml\NeuralNetwork\Training;
use Phpml\NeuralNetwork\Network;
use Phpml\NeuralNetwork\Node\Neuron;
use Phpml\NeuralNetwork\Training;
use Phpml\NeuralNetwork\Training\Backpropagation\Sigma;
class Backpropagation implements Training
class Backpropagation
{
/**
* @var Network
*/
private $network;
/**
* @var int
*/
@@ -27,96 +20,62 @@ class Backpropagation implements Training
private $sigmas;
/**
* @param Network $network
* @param int $theta
* @var array
*/
public function __construct(Network $network, int $theta = 1)
private $prevSigmas;
/**
* @param int $theta
*/
public function __construct(int $theta)
{
$this->network = $network;
$this->theta = $theta;
}
/**
* @param array $samples
* @param array $targets
* @param float $desiredError
* @param int $maxIterations
* @param array $layers
* @param mixed $targetClass
*/
public function train(array $samples, array $targets, float $desiredError = 0.001, int $maxIterations = 10000)
public function backpropagate(array $layers, $targetClass)
{
$samplesCount = count($samples);
for ($i = 0; $i < $maxIterations; ++$i) {
$resultsWithinError = $this->trainSamples($samples, $targets, $desiredError);
if ($resultsWithinError === $samplesCount) {
break;
}
}
}
/**
* @param array $samples
* @param array $targets
* @param float $desiredError
*
* @return int
*/
private function trainSamples(array $samples, array $targets, float $desiredError): int
{
$resultsWithinError = 0;
foreach ($targets as $key => $target) {
$result = $this->network->setInput($samples[$key])->getOutput();
if ($this->isResultWithinError($result, $target, $desiredError)) {
++$resultsWithinError;
} else {
$this->trainSample($samples[$key], $target);
}
}
return $resultsWithinError;
}
/**
* @param array $sample
* @param array $target
*/
private function trainSample(array $sample, array $target)
{
$this->network->setInput($sample)->getOutput();
$this->sigmas = [];
$layers = $this->network->getLayers();
$layersNumber = count($layers);
// Backpropagation.
for ($i = $layersNumber; $i > 1; --$i) {
$this->sigmas = [];
foreach ($layers[$i - 1]->getNodes() as $key => $neuron) {
if ($neuron instanceof Neuron) {
$sigma = $this->getSigma($neuron, $target, $key, $i == $layersNumber);
$sigma = $this->getSigma($neuron, $targetClass, $key, $i == $layersNumber);
foreach ($neuron->getSynapses() as $synapse) {
$synapse->changeWeight($this->theta * $sigma * $synapse->getNode()->getOutput());
}
}
}
$this->prevSigmas = $this->sigmas;
}
}
/**
* @param Neuron $neuron
* @param array $target
* @param int $targetClass
* @param int $key
* @param bool $lastLayer
*
* @return float
*/
private function getSigma(Neuron $neuron, array $target, int $key, bool $lastLayer): float
private function getSigma(Neuron $neuron, int $targetClass, int $key, bool $lastLayer): float
{
$neuronOutput = $neuron->getOutput();
$sigma = $neuronOutput * (1 - $neuronOutput);
if ($lastLayer) {
$sigma *= ($target[$key] - $neuronOutput);
$value = 0;
if ($targetClass === $key) {
$value = 1;
}
$sigma *= ($value - $neuronOutput);
} else {
$sigma *= $this->getPrevSigma($neuron);
}
@@ -135,28 +94,10 @@ class Backpropagation implements Training
{
$sigma = 0.0;
foreach ($this->sigmas as $neuronSigma) {
foreach ($this->prevSigmas as $neuronSigma) {
$sigma += $neuronSigma->getSigmaForNeuron($neuron);
}
return $sigma;
}
/**
* @param array $result
* @param array $target
* @param float $desiredError
*
* @return bool
*/
private function isResultWithinError(array $result, array $target, float $desiredError)
{
foreach ($target as $key => $value) {
if ($result[$key] > $value + $desiredError || $result[$key] < $value - $desiredError) {
return false;
}
}
return true;
}
}


@@ -1,80 +0,0 @@
<?php
declare(strict_types=1);
namespace Phpml\Regression;
use Phpml\Helper\Predictable;
use Phpml\NeuralNetwork\ActivationFunction;
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
class MLPRegressor implements Regression
{
use Predictable;
/**
* @var MultilayerPerceptron
*/
private $perceptron;
/**
* @var array
*/
private $hiddenLayers;
/**
* @var float
*/
private $desiredError;
/**
* @var int
*/
private $maxIterations;
/**
* @var ActivationFunction
*/
private $activationFunction;
/**
* @param array $hiddenLayers
* @param float $desiredError
* @param int $maxIterations
* @param ActivationFunction $activationFunction
*/
public function __construct(array $hiddenLayers = [10], float $desiredError = 0.01, int $maxIterations = 10000, ActivationFunction $activationFunction = null)
{
$this->hiddenLayers = $hiddenLayers;
$this->desiredError = $desiredError;
$this->maxIterations = $maxIterations;
$this->activationFunction = $activationFunction;
}
/**
* @param array $samples
* @param array $targets
*/
public function train(array $samples, array $targets)
{
$layers = $this->hiddenLayers;
array_unshift($layers, count($samples[0]));
$layers[] = count($targets[0]);
$this->perceptron = new MultilayerPerceptron($layers, $this->activationFunction);
$trainer = new Backpropagation($this->perceptron);
$trainer->train($samples, $targets, $this->desiredError, $this->maxIterations);
}
/**
* @param array $sample
*
* @return array
*/
protected function predictSample(array $sample)
{
return $this->perceptron->setInput($sample)->getOutput();
}
}


@@ -0,0 +1,129 @@
<?php
declare(strict_types=1);
namespace tests\Phpml\Classification;
use Phpml\Classification\MLPClassifier;
use Phpml\NeuralNetwork\Training\Backpropagation;
use Phpml\NeuralNetwork\Node\Neuron;
use PHPUnit\Framework\TestCase;
class MLPClassifierTest extends TestCase
{
public function testMLPClassifierLayersInitialization()
{
$mlp = new MLPClassifier(2, [2], [0, 1]);
$this->assertCount(3, $mlp->getLayers());
$layers = $mlp->getLayers();
// input layer
$this->assertCount(3, $layers[0]->getNodes());
$this->assertNotContainsOnly(Neuron::class, $layers[0]->getNodes());
// hidden layer
$this->assertCount(3, $layers[1]->getNodes());
$this->assertNotContainsOnly(Neuron::class, $layers[1]->getNodes());
// output layer
$this->assertCount(2, $layers[2]->getNodes());
$this->assertContainsOnly(Neuron::class, $layers[2]->getNodes());
}
public function testSynapsesGeneration()
{
$mlp = new MLPClassifier(2, [2], [0, 1]);
$layers = $mlp->getLayers();
foreach ($layers[1]->getNodes() as $node) {
if ($node instanceof Neuron) {
$synapses = $node->getSynapses();
$this->assertCount(3, $synapses);
$synapsesNodes = $this->getSynapsesNodes($synapses);
foreach ($layers[0]->getNodes() as $prevNode) {
$this->assertContains($prevNode, $synapsesNodes);
}
}
}
}
public function testBackpropagationLearning()
{
// Single layer 2 classes.
$network = new MLPClassifier(2, [2], ['a', 'b'], 1000);
$network->train(
[[1, 0], [0, 1], [1, 1], [0, 0]],
['a', 'b', 'a', 'b']
);
$this->assertEquals('a', $network->predict([1, 0]));
$this->assertEquals('b', $network->predict([0, 1]));
$this->assertEquals('a', $network->predict([1, 1]));
$this->assertEquals('b', $network->predict([0, 0]));
}
public function testBackpropagationLearningMultilayer()
{
// Multi-layer 2 classes.
$network = new MLPClassifier(5, [3, 2], ['a', 'b']);
$network->train(
[[1, 0, 0, 0, 0], [0, 1, 1, 0, 0], [1, 1, 1, 1, 1], [0, 0, 0, 0, 0]],
['a', 'b', 'a', 'b']
);
$this->assertEquals('a', $network->predict([1, 0, 0, 0, 0]));
$this->assertEquals('b', $network->predict([0, 1, 1, 0, 0]));
$this->assertEquals('a', $network->predict([1, 1, 1, 1, 1]));
$this->assertEquals('b', $network->predict([0, 0, 0, 0, 0]));
}
public function testBackpropagationLearningMulticlass()
{
// Multi-layer more than 2 classes.
$network = new MLPClassifier(5, [3, 2], ['a', 'b', 4]);
$network->train(
[[1, 0, 0, 0, 0], [0, 1, 0, 0, 0], [0, 0, 1, 1, 0], [1, 1, 1, 1, 1], [0, 0, 0, 0, 0]],
['a', 'b', 'a', 'a', 4]
);
$this->assertEquals('a', $network->predict([1, 0, 0, 0, 0]));
$this->assertEquals('b', $network->predict([0, 1, 0, 0, 0]));
$this->assertEquals('a', $network->predict([0, 0, 1, 1, 0]));
$this->assertEquals('a', $network->predict([1, 1, 1, 1, 1]));
$this->assertEquals(4, $network->predict([0, 0, 0, 0, 0]));
}
/**
* @expectedException \Phpml\Exception\InvalidArgumentException
*/
public function testThrowExceptionOnInvalidLayersNumber()
{
new MLPClassifier(2, [], [0, 1]);
}
/**
* @expectedException \Phpml\Exception\InvalidArgumentException
*/
public function testThrowExceptionOnInvalidClassesNumber()
{
new MLPClassifier(2, [2], [0]);
}
/**
* @param array $synapses
*
* @return array
*/
private function getSynapsesNodes(array $synapses): array
{
$nodes = [];
foreach ($synapses as $synapse) {
$nodes[] = $synapse->getNode();
}
return $nodes;
}
}


@@ -1,74 +0,0 @@
<?php
declare(strict_types=1);
namespace tests\Phpml\NeuralNetwork\Network;
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Node\Neuron;
use PHPUnit\Framework\TestCase;
class MultilayerPerceptronTest extends TestCase
{
public function testMultilayerPerceptronLayersInitialization()
{
$mlp = new MultilayerPerceptron([2, 2, 1]);
$this->assertCount(3, $mlp->getLayers());
$layers = $mlp->getLayers();
// input layer
$this->assertCount(3, $layers[0]->getNodes());
$this->assertNotContainsOnly(Neuron::class, $layers[0]->getNodes());
// hidden layer
$this->assertCount(3, $layers[1]->getNodes());
$this->assertNotContainsOnly(Neuron::class, $layers[1]->getNodes());
// output layer
$this->assertCount(1, $layers[2]->getNodes());
$this->assertContainsOnly(Neuron::class, $layers[2]->getNodes());
}
public function testSynapsesGeneration()
{
$mlp = new MultilayerPerceptron([2, 2, 1]);
$layers = $mlp->getLayers();
foreach ($layers[1]->getNodes() as $node) {
if ($node instanceof Neuron) {
$synapses = $node->getSynapses();
$this->assertCount(3, $synapses);
$synapsesNodes = $this->getSynapsesNodes($synapses);
foreach ($layers[0]->getNodes() as $prevNode) {
$this->assertContains($prevNode, $synapsesNodes);
}
}
}
}
/**
* @param array $synapses
*
* @return array
*/
private function getSynapsesNodes(array $synapses): array
{
$nodes = [];
foreach ($synapses as $synapse) {
$nodes[] = $synapse->getNode();
}
return $nodes;
}
/**
* @expectedException \Phpml\Exception\InvalidArgumentException
*/
public function testThrowExceptionOnInvalidLayersNumber()
{
new MultilayerPerceptron([2]);
}
}


@@ -46,7 +46,7 @@ class NeuronTest extends TestCase
$this->assertEquals(0.5, $neuron->getOutput(), '', 0.01);
$neuron->refresh();
$neuron->reset();
$this->assertEquals(0.88, $neuron->getOutput(), '', 0.01);
}


@@ -1,30 +0,0 @@
<?php
declare(strict_types=1);
namespace tests\Phpml\NeuralNetwork\Training;
use Phpml\NeuralNetwork\Network\MultilayerPerceptron;
use Phpml\NeuralNetwork\Training\Backpropagation;
use PHPUnit\Framework\TestCase;
class BackpropagationTest extends TestCase
{
public function testBackpropagationForXORLearning()
{
$network = new MultilayerPerceptron([2, 2, 1]);
$training = new Backpropagation($network);
$training->train(
[[1, 0], [0, 1], [1, 1], [0, 0]],
[[1], [1], [0], [0]],
$desiredError = 0.3,
40000
);
$this->assertEquals(0, $network->setInput([1, 1])->getOutput()[0], '', $desiredError);
$this->assertEquals(0, $network->setInput([0, 0])->getOutput()[0], '', $desiredError);
$this->assertEquals(1, $network->setInput([1, 0])->getOutput()[0], '', $desiredError);
$this->assertEquals(1, $network->setInput([0, 1])->getOutput()[0], '', $desiredError);
}
}