Rules

No rules. This is practice.

Grades

No grades. This is practice.

Disclaimer

These practice problems are supplied without any guarantee that they will help you do the quiz problems. However, they were written after the quiz problems were written and with the intention that they would help.

These practice problems are also supplied without any guarantee that they are exactly or even nearly like the quiz problems. However, they are like at least some quiz problems in at least some respects.

Problem 1

This problem is a redo of Practice Problems 6, Problem 1 (the solutions for which have been posted). The only difference is that we are going to parallelize the computations using the R function mclapply in the base package parallel (following Section 7.1 of the course notes on parallel computing).

You may have everything in your solution the same as in the solution for Practice Problems 6, Problem 1 except for the parallelization. You have to break the work up into multiple pieces, and mclapply does each piece when it operates on the corresponding component of the list it is given.

On unix (Linux or Mac OS X) you presumably want the number of pieces to be the number of cores that the R function detectCores in the R package parallel says your computer has.

On Microsoft Windows this method of parallelization does not work. But the documentation for mclapply says the function will work (but just not do any parallelization) if the optional argument mc.cores = 1 is supplied. This allows you to do this problem satisfactorily even if you have Windows. Also you know that you could actually do parallelization this way if you ever get a real computer (TM).

Don't forget to set the random number generator seed, as explained in the course notes on parallel computing, so that your results are reproducible.
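
To make the shape of this concrete, here is a minimal sketch. Everything in it is made up for illustration: the work is pretended to be nsim simulation replicates, and doit is a hypothetical worker function that processes one chunk of replicate indices; your actual computation comes from Practice Problems 6, Problem 1.

library(parallel)

ncores <- detectCores()
RNGkind("L'Ecuyer-CMRG")  # RNG that gives valid parallel streams
set.seed(42)

nsim <- 1000
# break the work into one chunk of replicate indices per core
chunks <- split(seq_len(nsim), rep(seq_len(ncores), length.out = nsim))

# hypothetical worker: does the computation for one chunk
doit <- function(idx) {
    sapply(idx, function(i) mean(rnorm(100)))  # stand-in for the real work
}

# on Windows supply mc.cores = 1 (runs serially, same answers)
results <- unlist(mclapply(chunks, doit, mc.cores = ncores))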

Problem 2

This problem is a redo of the preceding problem. The only difference is that we are going to parallelize the computations using the R function parLapply in the base package parallel (rather than the R function mclapply from the same package, which was used in the preceding problem), following Section 7.2 of the course notes on parallel computing.

Unlike in the preceding problem, this method should work equally well on Windows and unix.

In addition to the hints for the preceding problem, don't forget that you may need to use the R function clusterExport in the R package parallel. A sketch follows.
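
Here is the corresponding minimal sketch, reusing the hypothetical chunks and doit from the Problem 1 sketch; nboot below is a made-up example of a global variable a worker might need.

library(parallel)

cl <- makeCluster(detectCores())
clusterSetRNGStream(cl, 42)  # reproducible parallel RNG streams

# workers are fresh R processes, so global variables (and packages)
# that the worker function uses must be shipped to them explicitly
nboot <- 100  # made-up global that doit is imagined to use
clusterExport(cl, "nboot")

results <- unlist(parLapply(cl, chunks, doit))
stopCluster(cl)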

Problem 3

This problem is about Bayesian inference via Markov chain Monte Carlo (MCMC), which was covered in the course notes on that subject.

The statistical model is, like the example in the course notes, logistic regression, but simpler.

For data we will use the toy logistic regression data used in the package vignette for the CRAN package mcmc, which we will also use for doing MCMC.


vignette("demo", "mcmc")

Unlike the analysis in that package vignette, for the prior we are going to use what the course notes on Bayesian inference call the "method of made-up data" (which is a slight extension of the well known and widely used method of conjugate priors, which we did not explain in the course notes, and will not explain here).

If

p(θ) = exp(θ) / [1 + exp(θ)]
q(θ) = 1 / [1 + exp(θ)]
then we are going to use the prior

∏_{i = 1}^{p} p(β_i) q(β_i)

where β_i, i = 1, …, p are the logistic regression coefficients.

I assure you this is a proper prior. This comes from theory (Diaconis and Ylvisaker, 1979) that is beyond the scope of this course.
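
For this particular prior, propriety is also easy to check directly: each factor

p(θ) q(θ) = exp(θ) / [1 + exp(θ)]^2

is exactly the probability density function of the standard logistic distribution, so each factor integrates to one, and hence so does the product.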

The justification for this prior is the same as in the example in the course notes. It is the likelihood we would have if we had binomial data with one success and one failure for each parameter, with the parameter being the linear predictor for this made-up data (the logit of the success probability).
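
Concretely, the log unnormalized posterior might be coded as follows. This is a minimal sketch, assuming the model matrix and response are set up as in the vignette; the log prior for each coefficient is log[p(β_i) q(β_i)] = β_i − 2 log[1 + exp(β_i)], written with log1p for numerical stability, and the Metropolis scale here is just a made-up starting point for tuning, not a recommendation.

library(mcmc)
data(logit)  # toy logistic regression data used in the vignette

# model matrix and response, as in the vignette
out <- glm(y ~ x1 + x2 + x3 + x4, data = logit,
    family = binomial, x = TRUE)
x <- out$x
y <- out$y

# log unnormalized posterior = log likelihood + log prior
lupost <- function(beta) {
    eta <- as.numeric(x %*% beta)
    logp <- ifelse(eta < 0, eta - log1p(exp(eta)), - log1p(exp(- eta)))
    logq <- ifelse(eta < 0, - log1p(exp(eta)), - eta - log1p(exp(- eta)))
    logl <- sum(logp[y == 1]) + sum(logq[y == 0])
    # log of p(beta_i) q(beta_i), computed stably for large |beta_i|
    logprior <- sum(ifelse(beta < 0, beta - 2 * log1p(exp(beta)),
        - beta - 2 * log1p(exp(- beta))))
    logl + logprior
}

out.metrop <- metrop(lupost, rep(0, 5), nbatch = 1e4, scale = 0.4)
out.metrop$accept  # adjust scale until the acceptance rate is reasonable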

Follow the example in the vignette, except for using this different prior, until you have gotten some good simulations.

Then, following Section 9.4.6 of the course notes on Bayesian inference, make a plot of the estimated posterior PDF for each parameter except the intercept (nobody cares about that).
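
One simple way to do that last step is sketched below, assuming out.metrop$batch from a run like the one above holds the Markov chain with one column per coefficient (column 1 being the intercept). Note that the default bandwidth of the R function density is calibrated for independent samples; the course notes discuss how to deal with the autocorrelation in MCMC output.

# estimated posterior PDF for each coefficient except the intercept
for (j in 2:ncol(out.metrop$batch)) {
    plot(density(out.metrop$batch[, j]),
        main = paste0("posterior PDF of beta", j - 1))
}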