K-nearest Neighbors (k-NN)
K-nearest neighbors (k-NN) is a non-parametric statistical method that predicts from the k nearest training observations. k-NN classifiers are an example of what is called instance-based or memory-based supervised learning.
It can be applied to both classification and regression problems.
- When applied to classification problems, the prediction is the most common class among the k nearest training observations.
- When applied to regression problems, the prediction is the average of the target values of the k nearest training observations.
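The two prediction rules above can be sketched with scikit-learn's `KNeighborsClassifier` and `KNeighborsRegressor`. The toy 1-D dataset below is hypothetical, chosen only so the neighbor sets are easy to verify by hand:

```python
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

# Hypothetical 1-D training set: two well-separated clusters
X_train = [[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]]
y_class = [0, 0, 0, 1, 1, 1]               # class labels
y_reg = [1.5, 2.5, 3.5, 10.5, 11.5, 12.5]  # continuous targets

# Classification: majority vote among the k nearest neighbors
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_class)
print(clf.predict([[2.5]]))  # neighbors 2.0, 3.0, 1.0 -> class 0

# Regression: average target of the k nearest neighbors
reg = KNeighborsRegressor(n_neighbors=3).fit(X_train, y_reg)
print(reg.predict([[2.5]]))  # mean of 2.5, 3.5, 1.5 -> 2.5
```

For the query point 2.5, both models use the same three neighbors (2.0, 3.0, 1.0); only the aggregation step differs (vote vs. mean).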
Some key notes about this method:
- It makes few assumptions about the structure of the data
- It gives potentially accurate but sometimes unstable predictions
- It is sensitive to small changes in the training data
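The sensitivity noted above is easiest to see with k = 1, where moving a single training point can flip a prediction. A minimal sketch, using made-up 1-D data chosen so the distances are easy to check:

```python
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical 1-D data near a class boundary
X = [[0.0], [1.0], [4.0], [5.0]]
y = [0, 0, 1, 1]
query = [[2.4]]

# With k=1, the nearest point to 2.4 is 1.0 (distance 1.4) -> class 0
knn = KNeighborsClassifier(n_neighbors=1).fit(X, y)
print(knn.predict(query))

# Nudge one class-1 point from 4.0 to 3.5: it is now the nearest
# point (distance 1.1), so the same query flips to class 1
X_perturbed = [[0.0], [1.0], [3.5], [5.0]]
knn2 = KNeighborsClassifier(n_neighbors=1).fit(X_perturbed, y)
print(knn2.predict(query))
```

Larger k smooths this out somewhat, at the cost of blurring the decision boundary.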
Tags
Data Science
D2L
Dive into Deep Learning @ D2L
Learn After
Reference Video: K-Nearest Neighbors
Medium: Difference between K-Means and KNN
Math/Python Explanation: Difference Between K-Means and KNN
Machine Learning Basics with KNN Algorithm
KNN Regression
A Practical Introduction to K-Nearest Neighbors Algorithm for Regression (Reference)
KNN in practice
Reference video: K-Nearest Neighbors: Classification and Regression
sklearn.neighbors.KNeighborsClassifier
Classification Algorithm of K-Nearest Neighbors
K-Nearest Neighbors Advantages and Disadvantages
What class would a KNeighborsClassifier classify the new point as for k = 1 and k = 3?
Which of the following is true for the nearest neighbor classifier?
1-Nearest Neighbor Algorithm
Distance Function