Build and train a K-Nearest Neighbors model, then evaluate its performance using actual test data. Learn how predictions align with real-world outcomes and explore methods of measuring model accuracy.
Key Insights
- Use a K-Nearest Neighbors classifier with three neighbors to train a machine learning model.
- Fit the model using the provided training data (`X_train`) along with its known classifications (`y_train`).
- Confirm the accuracy of model predictions by comparing them against the actual results from test data, with further evaluation methods detailed in the subsequent section.
This lesson is a preview from our Data Science & AI Certificate Online (includes software) and Python Certification Online (includes software & exam). Enroll in a course for detailed lessons, live instructor support, and project-based training.
Let's create our model, train it, and check its predictions. We'll call it `knn_model`; it's the `KNeighborsClassifier`, and we pass it `n_neighbors=3`. Run that code block, and now we'll train it, or, the other term for that, fit it. We fit it to the data by giving it the `X_train` data and the corresponding answers for the `X_train` data.
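The steps described above can be sketched as follows. The lesson's data-loading code isn't shown, so the iris dataset and the 80/20 split here are assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Assumed setup: the iris flower data, split into train and test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Create the classifier with three neighbors, then fit (train) it on
# the training features and their known labels.
knn_model = KNeighborsClassifier(n_neighbors=3)
knn_model.fit(X_train, y_train)
```

In scikit-learn, `fit` returns the trained estimator itself, which is why running the cell displays the model.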
And our model is trained. What we get back from `fit` is the trained model itself. Now let's actually take a look at its predictions.
We'll say I want you to now give me some predictions based on some test data. Hey, okay, given these flowers without the answers, what is your prediction as to where each one fits? Let's print out the model predictions, the predicted values, and the correct answers, and we'll make a list out of `y_test` to do that. All right, let's check that out.
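That prediction step might look like this minimal sketch; the iris dataset and the train/test split are assumptions, since the lesson's earlier setup code isn't shown:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Assumed setup: iris data and an 80/20 train/test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
knn_model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# Predict labels for the unseen test flowers, then print them
# side by side with the known correct answers.
predictions = knn_model.predict(X_test)
print("Predicted:", list(predictions))
print("Actual:   ", list(y_test))
```

Printing both lists makes it easy to spot where the model agrees or disagrees with the true labels.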
It seems pretty good: 1, 0, 0, 0, 2, 2, 1 predicted, and 1, 0, 0, 0, 2, 2, 1 actual. But there are 30 of these. We could eyeball them all eventually, but let's see how well the model actually performed.
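Rather than eyeballing all 30 by hand, a quick count of the matches works; this sketch again assumes the iris dataset and an 80/20 split (proper accuracy metrics are covered in the next section):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Assumed setup: iris data, 80/20 split, 3-neighbor classifier.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
knn_model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# Count how many test predictions agree with the known labels.
predictions = knn_model.predict(X_test)
matches = int(np.sum(predictions == y_test))
print(f"{matches} of {len(y_test)} predictions match")
```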
We'll measure it in many different ways in our next section.