# KNearestNeighbors Classifier

Classifier implementing the k-nearest neighbors algorithm.

## Constructor Parameters

* $k - number of nearest neighbors to scan (default: 3)
* $distanceMetric - Distance object, default Euclidean (see [distance documentation](../../math/distance.md))

```
$classifier = new KNearestNeighbors($k=4);
$classifier = new KNearestNeighbors($k=3, new Minkowski($lambda=4));
```
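Passing the default metric explicitly is equivalent to the no-argument constructor. The short sketch below assumes the `Phpml\Classification\KNearestNeighbors` and `Phpml\Math\Distance\Euclidean` classes covered in the linked distance documentation:

```
use Phpml\Classification\KNearestNeighbors;
use Phpml\Math\Distance\Euclidean;

// Equivalent to new KNearestNeighbors(): 3 neighbors with Euclidean distance.
$classifier = new KNearestNeighbors(3, new Euclidean());
```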
## Train

To train the classifier, simply provide training samples and labels (as `array`). Example:

```
$samples = [[1, 3], [1, 4], [2, 4], [3, 1], [4, 1], [4, 2]];
$labels = ['a', 'a', 'a', 'b', 'b', 'b'];

$classifier = new KNearestNeighbors();
$classifier->train($samples, $labels);
```
You can train the classifier with multiple data sets; predictions will be based on all of the training data.
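As a minimal sketch of this (the two batches are just the example data above split in half), calling `train` more than once keeps accumulating training data:

```
use Phpml\Classification\KNearestNeighbors;

$classifier = new KNearestNeighbors();

// First data set.
$classifier->train([[1, 3], [1, 4], [2, 4]], ['a', 'a', 'a']);

// Second data set; the earlier samples and labels are kept,
// so predictions are based on both calls.
$classifier->train([[3, 1], [4, 1], [4, 2]], ['b', 'b', 'b']);
```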
## Predict
To predict a sample's label, use the `predict` method. You can provide a single sample or an array of samples:

```
$classifier->predict([3, 2]);
// returns 'b'

$classifier->predict([[3, 2], [1, 5]]);
// returns ['b', 'a']
```
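Putting the pieces together, a minimal end-to-end sketch using the same illustrative data as in the Train section might look like this:

```
use Phpml\Classification\KNearestNeighbors;

$samples = [[1, 3], [1, 4], [2, 4], [3, 1], [4, 1], [4, 2]];
$labels = ['a', 'a', 'a', 'b', 'b', 'b'];

$classifier = new KNearestNeighbors();
$classifier->train($samples, $labels);

// The 3 nearest neighbors of [3, 2] are all labeled 'b'.
echo $classifier->predict([3, 2]); // 'b'
```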