

Bootstrap and Jackknife
Bootstrap and jackknife algorithms don't really give you something for nothing. They give you something you previously ignored.

The jackknife is an algorithm for resampling from an existing sample to get estimates of the behavior of that single sample's statistics. An example of the jackknife would be to omit the first, 2nd, 3rd, ..., nth observation in turn from a sample of size n, and then compute the n leave-one-out averages, each based on the remaining n - 1 observations. The variance of these resampled averages (suitably scaled) is an estimate of the variance of the original sample mean.

The bootstrap is a generalization of the jackknife that resamples, with replacement, some number of times (say 1000) and computes the statistic of interest from each resample, thereby providing an estimate of the original sample statistic's variability. (For example, the 95th percentile is estimated by the 950th observation of the 1000 ordered resample statistics.)

The name, of course, comes from the method's apparent ability to pull itself up by its own bootstraps. (In Rudolf Erich Raspe's tale, Baron Munchausen had fallen to the bottom of a deep lake, and just as he was about to succumb to his fate he thought to pull himself up by his own bootstraps.)

Note: While this simple algorithm works quite well, it is termed the "naive" bootstrap because easily implemented improvements can be made to reduce potential bias.

Reference: Efron, Bradley and Robert J. Tibshirani, An Introduction to the Bootstrap, Chapman and Hall, 1993.
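Both procedures described above are a few lines of code. The following is a minimal sketch in Python (function names and parameters are mine, not from the reference): the jackknife computes the n leave-one-out averages and scales their spread by (n - 1)/n, and the naive bootstrap draws resamples with replacement and reads off an ordered resample statistic, e.g. the 950th of 1000 for the 95th percentile.

```python
import random

def jackknife_se_of_mean(sample):
    """Leave-one-out jackknife estimate of the standard error of the sample mean."""
    n = len(sample)
    total = sum(sample)
    # The n leave-one-out averages: omit the 1st, 2nd, ..., nth observation in turn.
    loo_means = [(total - x) / (n - 1) for x in sample]
    center = sum(loo_means) / n
    # Jackknife variance of the mean: (n - 1)/n times the sum of squared deviations.
    var = (n - 1) / n * sum((m - center) ** 2 for m in loo_means)
    return var ** 0.5

def bootstrap_percentile(sample, stat, n_resamples=1000, q=0.95, seed=0):
    """Naive bootstrap: resample with replacement, compute stat on each resample,
    and return the q-th percentile of the ordered resample statistics."""
    rng = random.Random(seed)
    stats = sorted(
        stat(rng.choices(sample, k=len(sample))) for _ in range(n_resamples)
    )
    # e.g. with q=0.95 and 1000 resamples, this picks the 950th ordered value.
    return stats[int(q * n_resamples) - 1]
```

For the mean, the jackknife standard error reduces exactly to the familiar s/sqrt(n), which is a convenient sanity check; the bootstrap's appeal is that the same resampling loop works unchanged for statistics (medians, ratios, correlations) that have no such closed-form formula.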


