KNN (K Nearest Neighbors)


Understanding KNN:

  • KNN is one of the simplest algorithms in machine learning, yet it is still widely used in real-world applications.
  • To find the nearest neighbors, the traditional and most common distance metric is Euclidean distance, although several other metrics are available, such as Manhattan distance and Minkowski distance. Euclidean distance is the "as the crow flies" measure, meaning the shortest direct route, while Manhattan distance is called "City Block" distance because it travels along street-like grid paths (see the distance sketch after this list).
  • KNN is a lazy learning process: it learns nothing from the training dataset up front (training is essentially zero cost). Instead, it keeps the training dataset in memory and, when we want to predict on test data points, computes the distance from every test data point to all training data points; this is why it is called a Lazy Learner. Other models, such as Logistic Regression and Decision Trees, are Eager Learners. A minimal classifier along these lines is sketched after this list.
  • KNN is a non-parametric model since it makes no assumption about the data distribution; it is instance-based learning, where the training data points themselves represent the knowledge.
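
To make the two distance metrics concrete, here is a minimal sketch using NumPy; the example points and function names are illustrative, not from any particular library:

```python
import numpy as np

def euclidean_distance(a, b):
    # "As the crow flies": square root of the sum of squared differences
    return np.sqrt(np.sum((a - b) ** 2))

def manhattan_distance(a, b):
    # "City Block": sum of absolute differences along each axis
    return np.sum(np.abs(a - b))

p = np.array([1.0, 2.0])
q = np.array([4.0, 6.0])
print(euclidean_distance(p, q))  # 5.0 -> the shortest direct route
print(manhattan_distance(p, q))  # 7.0 -> 3 blocks across + 4 blocks up
```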
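The lazy-learning behavior can also be sketched in a few lines: "fitting" only stores the training data, and all the distance work happens at prediction time. This is a rough illustration under those assumptions (class and variable names are made up, and a tiny toy dataset is used), not a production implementation:

```python
import numpy as np
from collections import Counter

class SimpleKNN:
    """Minimal KNN classifier: training just memorizes the data (lazy learning)."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X_train, y_train):
        # "Training" is essentially zero cost: keep the training set in memory
        self.X_train = np.asarray(X_train, dtype=float)
        self.y_train = np.asarray(y_train)
        return self

    def predict(self, X_test):
        predictions = []
        for x in np.asarray(X_test, dtype=float):
            # Euclidean distance from this test point to every training point
            distances = np.sqrt(np.sum((self.X_train - x) ** 2, axis=1))
            # Indices of the k nearest training points
            nearest = np.argsort(distances)[:self.k]
            # Majority vote among the labels of the k nearest neighbors
            labels = self.y_train[nearest]
            predictions.append(Counter(labels).most_common(1)[0][0])
        return np.array(predictions)

# Tiny made-up example
X = [[1, 1], [1, 2], [5, 5], [6, 5]]
y = ["red", "red", "blue", "blue"]
model = SimpleKNN(k=3).fit(X, y)
print(model.predict([[1.5, 1.5], [5.5, 5.0]]))  # expected: ['red' 'blue']
```

Note how the predict step, not the fit step, carries the cost: every prediction scans the whole training set, which is exactly why KNN is grouped with the lazy learners rather than the eager ones.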