Rules

No rules. This is practice.

Grades

No grades. This is practice.

Disclaimer

These practice problems are supplied without any guarantee that they will help you do the quiz problems. However, they were written after the quiz problems were written and with the intention that they would help.

These practice problems are also supplied without any guarantee that they are exactly or even nearly like the quiz problems. However, they are like at least some quiz problems in at least some respects.

Problem 1

This problem uses the data read in by


foo <- read.csv("http://www.stat.umn.edu/geyer/s17/3701/data/p5p1.csv")

which makes foo a data frame having one variable x, which is quantitative.

We assume these data are IID normal with mean θ and variance θ². Thus there is only one parameter.

Note that θ² is the variance, but the third argument of the R function dnorm is the standard deviation. There are three obvious estimators of θ: the sample mean, the sample standard deviation (since the standard deviation is |θ|), and the maximum likelihood estimator (MLE).

Calculate all three of these estimators.
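A sketch of the three calculations. Since the course data file is read over the network, the example below simulates data from the assumed model instead (θ = 2 is an arbitrary choice, not the value behind the real data); with the real data, replace the simulation by the read.csv call above.

```r
# simulated stand-in for the course data (assumption: theta = 2)
set.seed(42)
x <- rnorm(30, mean = 2, sd = 2)
# with the real data, instead do:
# foo <- read.csv("http://www.stat.umn.edu/geyer/s17/3701/data/p5p1.csv")
# x <- foo$x

# estimator 1: the sample mean estimates theta
theta.hat.mean <- mean(x)

# estimator 2: the sample standard deviation estimates the SD, which is |theta|
theta.hat.sd <- sd(x)

# estimator 3: the MLE, found by minimizing the negative log likelihood;
# writing the SD as sqrt(theta^2) keeps it valid for negative theta too
mlogl <- function(theta)
    - sum(dnorm(x, mean = theta, sd = sqrt(theta^2), log = TRUE))
theta.hat.mle <- nlm(mlogl, theta.hat.mean)$estimate

c(theta.hat.mean, theta.hat.sd, theta.hat.mle)
```

The three estimates will generally disagree; the MLE uses both the location and the spread of the data, which is why it differs from the other two.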

Problem 2

This problem continues the preceding problem, using the same data and model.

We know how to calculate confidence intervals based on the sample mean and the MLE, but not for the other estimator in Problem 1.

Calculate 95% confidence intervals centered at those two estimators.
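A sketch of both intervals, again with simulated data standing in for the course file (θ = 2 is an arbitrary choice). The interval centered at the sample mean uses the plug-in standard error sd(x)/sqrt(n); the interval centered at the MLE uses observed Fisher information, obtained as the Hessian of the negative log likelihood from nlm.

```r
# simulated stand-in for the course data (assumption: theta = 2)
set.seed(42)
x <- rnorm(30, mean = 2, sd = 2)
n <- length(x)
crit <- qnorm(0.975)   # two-sided 95% critical value

# interval centered at the sample mean, plug-in standard error
ci.mean <- mean(x) + c(-1, 1) * crit * sd(x) / sqrt(n)

# interval centered at the MLE, standard error from observed Fisher
# information (the Hessian of the negative log likelihood at the MLE)
mlogl <- function(theta)
    - sum(dnorm(x, mean = theta, sd = sqrt(theta^2), log = TRUE))
out <- nlm(mlogl, mean(x), hessian = TRUE)
se.mle <- 1 / sqrt(as.numeric(out$hessian))
ci.mle <- out$estimate + c(-1, 1) * crit * se.mle

ci.mean
ci.mle
```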

Problem 3

This problem continues the preceding problem, using the same data and model.

As in the example in Section 5.4 of the course notes on statistical models, part II, it is a bit surprising that the confidence interval based on the MLE is wider than the confidence interval based on the sample mean.

So calculate the confidence interval that uses expected Fisher information.

Hint: This is hard because the formula for the PDF has σ in it, and σ = |θ|, and the R function D does not know how to deal with the absolute value. So write this as sqrt(theta^2). The R function D does know how to deal with square root and powers.
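Differentiating the log density twice (by hand or with D, writing the SD as sqrt(theta^2) per the hint) and taking expectations using E(X) = θ and E(X²) = 2θ² gives expected Fisher information Iₙ(θ) = 3n/θ² for this model. A sketch assuming that result, again with simulated data in place of the course file:

```r
# simulated stand-in for the course data (assumption: theta = 2)
set.seed(42)
x <- rnorm(30, mean = 2, sd = 2)
n <- length(x)

# MLE as in Problem 1
mlogl <- function(theta)
    - sum(dnorm(x, mean = theta, sd = sqrt(theta^2), log = TRUE))
theta.hat <- nlm(mlogl, mean(x))$estimate

# expected Fisher information for this model: 3 n / theta^2,
# evaluated at the MLE (plug-in)
fish <- 3 * n / theta.hat^2
ci.expected <- theta.hat + c(-1, 1) * qnorm(0.975) / sqrt(fish)
ci.expected
```

Note the half-width is qnorm(0.975) · |θ̂| / sqrt(3n), which is narrower than the sample-mean interval's |θ̂| / sqrt(n) scaling.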

Problem 4

This problem continues the preceding problem, using the same data and model.

Make a "likelihood-based" confidence interval (a level set of the log likelihood) as in the example in Section 5.4.4 of the course notes on statistical models, part II.
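A sketch of the likelihood-based interval: its endpoints are where twice the log likelihood drops from its maximum by the chi-square critical value, and they can be found with uniroot. Simulated data again stand in for the course file, and the search brackets (theta.hat / 10 and 10 * theta.hat) are an assumption that happens to bracket the roots for these data.

```r
# simulated stand-in for the course data (assumption: theta = 2)
set.seed(42)
x <- rnorm(30, mean = 2, sd = 2)

# MLE as in Problem 1
mlogl <- function(theta)
    - sum(dnorm(x, mean = theta, sd = sqrt(theta^2), log = TRUE))
theta.hat <- nlm(mlogl, mean(x))$estimate

# the interval is the set of theta where twice the log likelihood is
# within qchisq(0.95, 1) of its maximum
crit <- qchisq(0.95, df = 1)
fun <- function(theta) 2 * (mlogl(theta) - mlogl(theta.hat)) - crit
lo <- uniroot(fun, lower = theta.hat / 10, upper = theta.hat)$root
hi <- uniroot(fun, lower = theta.hat, upper = 10 * theta.hat)$root
c(lo, hi)
```

Unlike the Wald-type intervals above, this interval need not be symmetric about the MLE.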