Random Forest Out of Bag Error Estimate

If the Random Forest algorithm already computes an out-of-bag estimate of the generalization error, then why bother doing K-fold cross-validation when tuning this model? Could you do hyperparameter tuning using only the out-of-bag error estimate?

The out-of-bag (OOB) error is simply the error computed on samples a tree never saw during training. In Breiman's original implementation of the random forest algorithm, each tree is trained on a bootstrap sample of the training data, which contains about 2/3 of the distinct training points; the remaining roughly 1/3 are "out of bag" for that tree. As the forest is built, each training point can therefore be scored by the trees that did not use it (similar in spirit to leave-one-out cross-validation). The OOB estimate of the generalization error is the error rate of this out-of-bag classifier on the training set, comparing its predictions against the known labels y_i. This is the out-of-bag error estimate: an internal error estimate of a random forest, computed as the forest is being constructed.
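
To make the tuning question concrete, here is a minimal sketch of hyperparameter selection driven only by the OOB estimate, assuming scikit-learn's RandomForestClassifier (its oob_score=True option tracks OOB predictions while the forest is fit). The dataset and the max_features grid below are illustrative choices, not part of the original question.

```python
# Minimal sketch: tune a hyperparameter using only the OOB error estimate.
# Assumes scikit-learn; dataset and candidate grid are illustrative.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)

best_err, best_params = 1.0, None
for max_features in ["sqrt", "log2", None]:  # candidate values to tune
    clf = RandomForestClassifier(
        n_estimators=500,
        max_features=max_features,
        oob_score=True,   # track OOB predictions while the forest is built
        random_state=0,
    ).fit(X, y)
    oob_error = 1.0 - clf.oob_score_  # oob_score_ is the OOB accuracy
    if oob_error < best_err:
        best_err, best_params = oob_error, {"max_features": max_features}

print(best_params, best_err)
```

Each candidate setting gets its OOB error essentially for free from a single fit, with no separate K-fold loop, which is exactly the trade the question is asking about.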