To do each example, just click the "Submit" button. You do not have to type in any R instructions or specify a dataset. That's already done for you.
For comparison, the second line gives the usual parametric analysis, based on the assumption of population normality and using Pearson's product-moment correlation coefficient (what many textbooks just call correlation, with no qualifying adjectives).
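As a concrete illustration, here is a rough sketch of what the two analyses look like (the data vectors are made-up placeholders, not the example's preloaded dataset, and the variable names x and y are assumptions):

    # placeholder data -- the real example supplies its own x and y
    x <- c(1.3, 2.8, 3.1, 4.7, 5.0, 6.4, 7.9, 8.2)
    y <- c(2.0, 2.5, 4.1, 3.9, 6.2, 5.8, 7.5, 9.0)

    # nonparametric test based on Kendall's tau
    cor.test(x, y, method = "kendall", alternative = "greater")

    # parametric test based on Pearson's product-moment correlation
    cor.test(x, y, alternative = "greater")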
Same thing, except that we don't have to bother with the alternative = "greater" argument.
Unfortunately, cor.test doesn't do confidence intervals for Kendall's tau, so we have to do them by hand in R.
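Here is a minimal sketch of one way to do that, under the assumption that a large-sample normal approximation for tau (based on its U-statistic form) is acceptable; the data vectors are placeholders, not the example data.

    # placeholder data -- the real example supplies its own x and y
    x <- c(1.3, 2.8, 3.1, 4.7, 5.0, 6.4, 7.9, 8.2)
    y <- c(2.0, 2.5, 4.1, 3.9, 6.2, 5.8, 7.5, 9.0)

    n <- length(x)
    tau.hat <- cor(x, y, method = "kendall")

    # average concordance of each observation with all the others
    # (the projection used in the U-statistic variance estimate)
    qbar <- numeric(n)
    for (i in 1:n)
        qbar[i] <- mean(sign(x[i] - x[-i]) * sign(y[i] - y[-i]))

    # estimated asymptotic variance of tau.hat and a 95% confidence interval
    var.tau <- 4 * var(qbar) * (n - 1) / n^2
    tau.hat + c(-1, 1) * qnorm(0.975) * sqrt(var.tau)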
For comparison, the second analysis (below the blank line) gives the usual parametric analysis, based on the assumption of population normality and using Pearson's product-moment correlation coefficient (what many textbooks just call correlation, with no qualifying adjectives). A reference is Lindgren, Statistical Theory, 4th ed., p. 427.
Unfortunately, the Spearman mode of the cor.test function doesn't do the right thing in the presence of ties. The reference cited for the algorithm it uses has no adjustment for ties.
Hollander and Wolfe give a very bizarre correction for ties that actually changes the point estimate (not just its estimated asymptotic variance), for reasons they do not explain. Rather than do that, we will do a Monte Carlo test.
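A minimal sketch of such a Monte Carlo (permutation) test, assuming we permute one variable and use Spearman's rho as the test statistic (the data are made-up placeholders with ties, not the example data):

    # placeholder data with ties -- the real example supplies its own x and y
    x <- c(1, 2, 2, 3, 4, 5, 5, 6, 7, 8)
    y <- c(2, 1, 3, 3, 5, 4, 6, 6, 8, 7)

    rho.hat <- cor(x, y, method = "spearman")

    # permute y, recompute rho, and see how extreme the observed value is
    nsim <- 10000
    rho.sim <- numeric(nsim)
    for (i in 1:nsim)
        rho.sim[i] <- cor(x, sample(y), method = "spearman")

    # two-tailed Monte Carlo p-value (counting the observed value itself)
    (sum(abs(rho.sim) >= abs(rho.hat)) + 1) / (nsim + 1)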
Spearman's rho doesn't actually estimate any population quantity of interest. It does estimate the quantity φ defined on p. 405 of Hollander and Wolfe, but that's not a very interesting quantity. Hence we don't consider it an estimator of anything, and the question of confidence intervals is moot.