Hello everybody...
Some friends and I have been studying the cross-validation topic in Izenman's book. On page 122 a doubt arose while reading. We were able to understand the bias and mean-squared-error behavior of 10-fold and 5-fold CV, but for leave-one-out we couldn't understand why the variance increases. Here is the quote from the book itself:
"As well as issues of computational complexity ,the difference between taking V = 5 or 10 and taking V = n is one of “bias versus variance.” The
leave-one-out rule yields an estimate of PE R that has low bias but high
variance (arising from the high degree of similarity between the leave-one-
out learning sets), whereas the 5–fold or 10–fold rule yields an estimate
of PE R with higher bias but lower mean squared error (and also lower
variance). "
Perhaps we misunderstood the text; if that's the case, we would appreciate your help. We also tried to probe the claim empirically with the small simulation below.
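This is not from the book, just a minimal sketch we put together, assuming scikit-learn's LinearRegression on synthetic data; the model, sample size, noise level, and number of replications are all our own arbitrary choices. It repeatedly draws datasets, computes the LOOCV, 5-fold, and 10-fold estimates of prediction error on each one, and then compares the variance of those estimates across datasets:

```python
# Compare the variability of LOOCV vs. 5-/10-fold CV estimates of
# prediction error across many simulated datasets (our own setup,
# not the book's example).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score, KFold, LeaveOneOut

rng = np.random.default_rng(0)
n, reps = 50, 200  # sample size per dataset, number of replications
estimates = {"LOOCV": [], "5-fold": [], "10-fold": []}

for _ in range(reps):
    # Draw one dataset from y = 2x + Gaussian noise
    X = rng.uniform(-1, 1, size=(n, 1))
    y = 2 * X.ravel() + rng.normal(scale=1.0, size=n)
    model = LinearRegression()
    for name, cv in [("LOOCV", LeaveOneOut()),
                     ("5-fold", KFold(5, shuffle=True, random_state=0)),
                     ("10-fold", KFold(10, shuffle=True, random_state=0))]:
        scores = cross_val_score(model, X, y, cv=cv,
                                 scoring="neg_mean_squared_error")
        estimates[name].append(-scores.mean())  # CV estimate of PE

for name, vals in estimates.items():
    vals = np.asarray(vals)
    print(f"{name:7s}  mean PE estimate = {vals.mean():.3f}, "
          f"variance across datasets = {vals.var():.4f}")
```

If the book's claim holds in this setting, the LOOCV row should show the larger variance across replications; we weren't sure whether this particular setup is the right one to exhibit the effect, which is part of why we're asking.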
Thanks in advance.