Hyper-parameterization in the case of K-Fold Validation

In KNN, we call varying the value of k hyper-parameterization (hyperparameter tuning): we search for the value of k that produces the best model across training and testing.
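
For example, here's a rough sketch of what I mean by tuning k (assuming scikit-learn; the iris dataset and the range of k values are just placeholders):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Placeholder dataset; any classification dataset would do
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

best_k, best_score = None, 0.0
for k in range(1, 21):  # candidate values of the hyperparameter k
    model = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    score = model.score(X_test, y_test)  # accuracy on the held-out test set
    if score > best_score:
        best_k, best_score = k, score

print(f"best k = {best_k}, test accuracy = {best_score:.3f}")
```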

When we do K-fold validation to average out the error, would we also call that hyper-parameterization?


Hi @siddhantjawa18,

Hyper-parameterization refers to adjusting settings that are fixed before training and that control how the model learns, for example k in KNN or the learning rate in gradient-based models. K-fold validation, by contrast, is an evaluation method: it splits the dataset into multiple folds (usually 3 or more) and averages the training/testing error across them to get a more reliable estimate of model performance. Adjusting the number of folds doesn't change the model or how it learns; it only changes how accurately we estimate its error.
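
To illustrate the distinction, here's a minimal sketch (again assuming scikit-learn; the dataset and fold counts are arbitrary). The model's hyperparameter k stays fixed the whole time; only the way we estimate the error changes with the number of folds:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# k=5 is the model's hyperparameter; it is not being tuned here
model = KNeighborsClassifier(n_neighbors=5)

# Varying n_splits changes how we *estimate* the error, not the model itself
for n_splits in (3, 5, 10):
    cv = KFold(n_splits=n_splits, shuffle=True, random_state=42)
    scores = cross_val_score(model, X, y, cv=cv)
    print(f"{n_splits}-fold mean accuracy: {scores.mean():.3f}")
```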

So the answer is no: varying the number of folds isn't hyper-parameterization.
