General Instructions

To do each example, just click the Submit button. You do not have to type in any R instructions or specify a dataset. That's already done for you.

Mice

Bootstrapping the Sample Mean

Chapter 2 in Efron and Tibshirani.
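A sketch of what the submitted form computes, with our own choice of B and variable names; the data are the treatment-group mouse survival times as we recall them from Table 2.1 of Efron and Tibshirani.

```r
# Nonparametric bootstrap of the sample mean.
# Data: treatment-group mouse survival times (E&T Table 2.1, as we recall them).
x <- c(94, 197, 16, 38, 99, 141, 23)
theta.hat <- mean(x)   # the point estimate, about 86.86

nboot <- 1000          # number of bootstrap replicates (our choice)
theta.star <- replicate(nboot, mean(sample(x, replace = TRUE)))

sd(theta.star)         # bootstrap estimate of the standard error of the mean
```

The only randomness is in sample(x, replace = TRUE), which draws a resample of the same size as the data, with replacement.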

Comments

Bootstrapping Something Else

This example differs from the preceding one in only one respect: the estimator is a 15% trimmed mean rather than the ordinary mean. Everything else is the same.
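In code, the change is a single argument to mean (same mouse data and setup as before, variable names ours).

```r
# Nonparametric bootstrap of a 15% trimmed mean.
# mean(x, trim = 0.15) drops the most extreme 15% of the observations
# from each end before averaging; with n = 7 that is one observation per end.
x <- c(94, 197, 16, 38, 99, 141, 23)   # mouse treatment data, as before
theta.hat <- mean(x, trim = 0.15)      # trimmed-mean point estimate

nboot <- 1000
theta.star <- replicate(nboot, mean(sample(x, replace = TRUE), trim = 0.15))

sd(theta.star)   # bootstrap standard error of the trimmed mean
```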

Chapter 2 in Efron and Tibshirani.

Comments

Law (Correlation)

Nonparametric Bootstrap

Section 6.3 in Efron and Tibshirani.
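A sketch of the nonparametric bootstrap of the correlation coefficient. The key point is that we resample law schools, i.e. (LSAT, GPA) pairs, not the two variables separately. Data as we recall them from Table 3.1 of Efron and Tibshirani; B and variable names are our choices.

```r
# Law school data: average LSAT and GPA for 15 schools.
lsat <- c(576, 635, 558, 578, 666, 580, 555, 661, 651, 605,
          653, 575, 545, 572, 594)
gpa  <- c(3.39, 3.30, 2.81, 3.03, 3.44, 3.07, 3.00, 3.43, 3.36, 3.13,
          3.12, 2.74, 2.76, 2.88, 2.96)
n <- length(lsat)
rho.hat <- cor(lsat, gpa)   # sample correlation, about 0.776

nboot <- 1000
rho.star <- replicate(nboot, {
  i <- sample(n, replace = TRUE)   # resample schools, keeping pairs together
  cor(lsat[i], gpa[i])
})

sd(rho.star)   # nonparametric bootstrap standard error of the correlation
```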

Comments

Parametric Bootstrap

Section 6.5 in Efron and Tibshirani.
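For the parametric bootstrap, instead of resampling the data we fit a bivariate normal distribution to the law data and resample from the fitted distribution. A sketch (we use mvrnorm from the recommended package MASS to simulate from the fitted normal; law school data as we recall them from Table 3.1).

```r
# Parametric bootstrap of the correlation coefficient.
library(MASS)   # provides mvrnorm()

lsat <- c(576, 635, 558, 578, 666, 580, 555, 661, 651, 605,
          653, 575, 545, 572, 594)
gpa  <- c(3.39, 3.30, 2.81, 3.03, 3.44, 3.07, 3.00, 3.43, 3.36, 3.13,
          3.12, 2.74, 2.76, 2.88, 2.96)
law <- cbind(lsat, gpa)
n <- nrow(law)
rho.hat <- cor(lsat, gpa)

nboot <- 1000
rho.star <- replicate(nboot, {
  # simulate n observations from the bivariate normal fitted to the data
  x.star <- mvrnorm(n, mu = colMeans(law), Sigma = cov(law))
  cor(x.star[, 1], x.star[, 2])
})

sd(rho.star)   # parametric bootstrap standard error of the correlation
```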

Comments

Nonparametric Bootstrap Revisited

Section 6.3 in Efron and Tibshirani.

Comments

Having stolen the variance stabilizing transformation trick (hyperbolic tangent, in this case) from the theoreticians, we can apply it to the nonparametric bootstrap as well.

It is still true that

tanh(z.hat + c(-1, 1) * qnorm(0.975) * sd(z.star))

is a better bootstrap confidence interval for ρ than the naive interval

rho.hat + c(-1, 1) * qnorm(0.975) * sd(rho.star)

The hyperbolic tangent transformation is no longer exactly variance stabilizing. That depended on the population being normal. Nevertheless, it still does approximately the right thing, as can be seen from the histograms.
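Putting the pieces together, a sketch of the whole calculation. The Fisher z-transform atanh is the inverse of tanh; the data are the law school values as above, and B and the variable names are our choices.

```r
# Nonparametric bootstrap of the correlation on the variance-stabilized
# (Fisher z) scale, with the interval mapped back by tanh.
lsat <- c(576, 635, 558, 578, 666, 580, 555, 661, 651, 605,
          653, 575, 545, 572, 594)
gpa  <- c(3.39, 3.30, 2.81, 3.03, 3.44, 3.07, 3.00, 3.43, 3.36, 3.13,
          3.12, 2.74, 2.76, 2.88, 2.96)
n <- length(lsat)
rho.hat <- cor(lsat, gpa)
z.hat <- atanh(rho.hat)   # Fisher z-transform of the point estimate

nboot <- 1000
rho.star <- replicate(nboot, {
  i <- sample(n, replace = TRUE)   # resample schools, keeping pairs together
  cor(lsat[i], gpa[i])
})
z.star <- atanh(rho.star)

# 95% interval on the z scale, mapped back to the rho scale;
# it always stays inside (-1, 1), unlike the naive interval.
ci <- tanh(z.hat + c(-1, 1) * qnorm(0.975) * sd(z.star))
ci
```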

Testing (Students)

Section 7.2 in Efron and Tibshirani.
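Section 7.2 of Efron and Tibshirani bootstraps the student test score data, the estimator being the fraction of variance explained by the largest eigenvalue of the empirical covariance matrix. A sketch, assuming the score data are available as scor in the CRAN package bootstrap (that package and dataset name are our assumption about where the data live).

```r
library(bootstrap)   # assumed source of the score data 'scor' (88 students, 5 exams)

# theta: proportion of variance explained by the largest eigenvalue
# of the empirical covariance matrix of the scores
theta <- function(dat) {
  lambda <- eigen(cov(dat), symmetric = TRUE, only.values = TRUE)$values
  lambda[1] / sum(lambda)
}
theta.hat <- theta(scor)   # about 0.62 for these data

nboot <- 1000
theta.star <- replicate(nboot,
  theta(scor[sample(nrow(scor), replace = TRUE), ]))   # resample students (rows)

sd(theta.star)   # bootstrap standard error of the eigenvalue ratio
```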

Comments

The Moral of the Story

The bootstrap is not just for simple problems. Although much of what the book (and we) do is simple, for pedagogical reasons and to keep the main issues clear, the bootstrap also works in very complicated situations where there is no other way to approach the analysis.

The bootstrap keeps going after theory poops out.