A recent assignment in a machine learning class called for drawing the k-nearest-neighbor decision boundary for some given values of k, starting with k=1. The task involved using standard Euclidean distance between the given points to find each point's nearest neighbors and assign its class, and then drawing (by hand) the resulting figure.
We were given two figures of starting points to work from.
I did the assignment by hand since the plot was not that complex, and it was a great way to learn exactly what K-NN is doing... but then I thought, "Hmm, we can code this. It shouldn't be too hard or take too long!"... Famous last words...
But it wasn't as bad as I expected, and the results were great! I coded a K-NN solver in R using the 'knn()' function from the 'class' package. It was very straightforward; most of the work was simply preparing the data, handing it to 'knn()', and plotting the results with ggplot.
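The core of the script looks roughly like the sketch below. The point coordinates here are made up for illustration (the actual assignment data and code live in the repository); the idea is to classify a dense grid of points with 'knn()' at k=1 and let the filled grid trace out the decision boundary:

```r
library(class)    # provides knn()
library(ggplot2)

# Hypothetical labeled training points standing in for the assignment data
train <- data.frame(
  x = c(1, 2, 2.5, 5, 6, 6.5),
  y = c(1, 2.5, 1.5, 5, 4.5, 6),
  class = factor(c("A", "A", "A", "B", "B", "B"))
)

# Dense grid covering the plotting region; classifying every grid point
# makes the decision boundary visible
grid <- expand.grid(
  x = seq(0, 8, length.out = 200),
  y = seq(0, 8, length.out = 200)
)

# k = 1: each grid point takes the class of its single nearest neighbor
# (Euclidean distance)
grid$class <- knn(train = train[, c("x", "y")],
                  test  = grid,
                  cl    = train$class,
                  k     = 1)

# Shade the classified grid and overlay the original training points
ggplot() +
  geom_tile(data = grid, aes(x, y, fill = class), alpha = 0.3) +
  geom_point(data = train, aes(x, y, color = class), size = 3) +
  labs(title = "1-NN decision boundary")
```

Raising k is just a matter of changing the `k` argument and re-plotting.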
I added the code to my GitHub Machine Learning repository, and a sample output image of the solver is below, for k=1: