
Statistics 5101, Fall 2000, Geyer


Homework Solutions #7

Problem L6-2

(a)

Let Y be the random variable denoting the number of successes in four trials. Then $Y \sim \text{Bin}(4, 0.5)$.

\begin{displaymath}P(Y = 4) = \binom{4}{4}
\left(\frac{1}{2}\right)^4 \left(\frac{1}{2}\right)^0
= \frac{1}{2^4} = \frac{1}{16}.
\end{displaymath}

(b)

Since $Y \sim \text{Bin}(10, 0.5)$,

\begin{eqnarray*}P(Y \leq 8) & = & 1 - P(Y > 8) \\
& = & 1 - P(Y = 9) - P(Y = 10) \\
& = & 1 - \binom{10}{9} \left( \frac{1}{2} \right)^9 \left( \frac{1}{2} \right)^1
- \binom{10}{10} \left( \frac{1}{2} \right)^{10} \left( \frac{1}{2} \right)^0 \\
& = & 1 - \frac{11}{2^{10}} \\
& = & 0.989
\end{eqnarray*}


(c)

Since $Y \sim \text{Bin}(12, 0.5)$,

\begin{eqnarray*}P(Y \geq 3) & = & 1 - P(Y \leq 2) \\
& = & 1 - P(Y = 0) - P(Y = 1) - P(Y = 2) \\
& = & 1 - \binom{12}{0} \left( \frac{1}{2} \right)^0 \left( \frac{1}{2} \right)^{12}
- \binom{12}{1} \left( \frac{1}{2} \right)^1 \left( \frac{1}{2} \right)^{11}
- \binom{12}{2} \left( \frac{1}{2} \right)^2 \left( \frac{1}{2} \right)^{10} \\
& = & 1 - \frac{79}{2^{12}} \\
& = & 0.981
\end{eqnarray*}


(d)

Assuming independent trials, the results of the preceding three trials are irrelevant. Since $Y \sim \text{Bin}(3, 0.5)$,

\begin{displaymath}P(Y = 3) = \binom{3}{3}
\left( \frac{1}{2} \right) ^3 \left( \frac{1}{2} \right) ^0 = \frac{1}{2^3} = \frac{1}{8}.
\end{displaymath}
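
As a quick sanity check, here is a Python sketch (not part of the original solution) that recomputes all four parts from the binomial pmf:

\begin{verbatim}
# Recompute the Problem L6-2 answers from the binomial pmf
# (math.comb needs Python 3.8+).
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binom_pmf(4, 4, 0.5))                              # (a) 1/16 = 0.0625
print(sum(binom_pmf(k, 10, 0.5) for k in range(9)))      # (b) 0.9893...
print(sum(binom_pmf(k, 12, 0.5) for k in range(3, 13)))  # (c) 0.9807...
print(binom_pmf(3, 3, 0.5))                              # (d) 1/8 = 0.125
\end{verbatim}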

Problem L6-6

(a)

Since $X \sim \text{Bin}(100, 0.05)$,

\begin{displaymath}E(X) = 100 \times 0.05 = 5
\end{displaymath}

and

\begin{displaymath}\mathop{\rm sd}\nolimits(X) = \sqrt{100 \times 0.05 \times 0.95} = \sqrt{4.75} = 2.179.
\end{displaymath}

(b)

Since $X \sim \text{Bin}(50, 0.05)$,

\begin{displaymath}P(X \leq 3) = \sum_{k=0}^3 \binom{50}{k} (0.05)^k (0.95)^{50 - k} = 0.76
\end{displaymath}
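
A short Python sketch (again not part of the original solution) confirming both parts:

\begin{verbatim}
# Check Problem L6-6: mean and sd of Bin(100, 0.05), and
# P(X <= 3) for Bin(50, 0.05).
from math import comb, sqrt

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(100 * 0.05)                                     # (a) E(X) = 5.0
print(sqrt(100 * 0.05 * 0.95))                        # (a) sd(X) = 2.179...
print(sum(binom_pmf(k, 50, 0.05) for k in range(4)))  # (b) 0.7604...
\end{verbatim}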

Problem L6-23


Since $X_1 \sim \text{Bin}(n_1, p)$ and $X_2 \sim \text{Bin}(n_2, p)$ are independent, $X_1 + X_2 \sim \text{Bin}(n_1 + n_2, p)$, and

\begin{eqnarray*}P(X_1 = k \mid X_1 + X_2 = m) & = & \frac{P(X_1 = k, X_2 = m - k)}{P(X_1 + X_2 = m)} \\
& = & \frac{\binom{n_1}{k} p^k (1 - p)^{n_1 - k}
\binom{n_2}{m - k} p^{m - k} (1 - p)^{n_2 - m + k}}
{\binom{n_1 + n_2}{m} p^m (1 - p)^{n_1 + n_2 - m}} \\
& = & \frac{\binom{n_1}{k}
\binom{n_2}{m - k}
}{ \binom{n_1 + n_2}{m}
}
\end{eqnarray*}


That is, the conditional distribution of $X_1$ given $X_1 + X_2 = m$ is $\text{Hypergeometric}(n_1 + n_2, n_1, m)$.
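
A numerical sketch of this result (with arbitrary illustrative values of $n_1$, $n_2$, $m$, and $p$); note that $p$ cancels, so either choice of $p$ gives the same conditional probabilities:

\begin{verbatim}
# Conditional probabilities P(X1 = k | X1 + X2 = m) computed from the
# binomial joint, compared with the hypergeometric pmf.
from math import comb

n1, n2, m = 6, 4, 5
for p in (0.2, 0.7):
    def b(k, n):
        return comb(n, k) * p**k * (1 - p)**(n - k)
    denom = sum(b(j, n1) * b(m - j, n2) for j in range(m + 1))
    cond  = [b(k, n1) * b(m - k, n2) / denom for k in range(m + 1)]
    hyper = [comb(n1, k) * comb(n2, m - k) / comb(n1 + n2, m)
             for k in range(m + 1)]
    print(max(abs(c - h) for c, h in zip(cond, hyper)))  # ~ 0
\end{verbatim}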

Problem L6-25

(a)

Let U denote the number of trials required to obtain the r-th success, here with $r = 3$ and $p = 0.05$, so $U \sim \text{NegBin}(3, 0.05)$. Hence

\begin{displaymath}E(U) = \frac{r}{p} = \frac{3}{0.05} = 60.
\end{displaymath}

(b)


\begin{displaymath}P(\text{0 or 1 successes in first 7 tries})
= 0.95^7 + 7 \times 0.05 \times 0.95^6 = 0.9556
\end{displaymath}
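
Both answers can be checked by simulation; the following Python sketch (illustrative only) uses 100,000 replications:

\begin{verbatim}
# Simulate Problem L6-25 by Monte Carlo.
import random

random.seed(0)
p, reps = 0.05, 100_000

def trials_to_third_success():
    successes = trials = 0
    while successes < 3:
        trials += 1
        if random.random() < p:
            successes += 1
    return trials

# (a) mean number of trials to the 3rd success: should be near 60
print(sum(trials_to_third_success() for _ in range(reps)) / reps)

# (b) P(0 or 1 successes in the first 7 trials): should be near 0.9556
hits = sum(sum(random.random() < p for _ in range(7)) <= 1
           for _ in range(reps))
print(hits / reps)
\end{verbatim}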

Problem L6-30

(a)

$X \sim \text{Poi}(4)$. So

\begin{displaymath}P(X \leq 6)
=
e^{-4} \left[ 1 + 4 + \frac{4^2}{2!} + \frac{4^3}{3!} + \frac{4^4}{4!}
+ \frac{4^5}{5!} + \frac{4^6}{6!} \right] = 0.8893
\end{displaymath}

(b)

The average number in 1/4 min is $\lambda t = 1$. Hence

\begin{displaymath}P(X \geq 3) = 1 - P(X < 3)
= 1 - e^{-1} \left[ 1 + 1 + \frac{1^2}{2!} \right] = 0.0803.
\end{displaymath}

(c)

The average number in 1/2 min is $\lambda t = 2$. Hence

\begin{displaymath}P(X = 0) = e^{-2} = 0.1353
\end{displaymath}
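
The three Poisson computations can be checked numerically with a short Python sketch (not part of the original solution):

\begin{verbatim}
# Check the three Poisson computations in Problem L6-30.
from math import exp, factorial

def poi_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

print(sum(poi_pmf(k, 4) for k in range(7)))      # (a) 0.8893...
print(1 - sum(poi_pmf(k, 1) for k in range(3)))  # (b) 0.0803...
print(poi_pmf(0, 2))                             # (c) 0.1353...
\end{verbatim}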

Problem L6-44

There are 30 arrivals per hour, or 0.5 per minute, so the expected number of arrivals in a five-minute interval is 2.5 and

\begin{displaymath}P(\text{at least two arrivals in a five-minute interval})
= 1 - e^{-2.5}(1 + 2.5) = 0.7127.
\end{displaymath}
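
A one-line numerical check (Python, illustrative only):

\begin{verbatim}
# Check Problem L6-44.
from math import exp
print(1 - exp(-2.5) * (1 + 2.5))   # 0.7127...
\end{verbatim}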

Problem L6-45

$X_1$ and $X_2$ are independent, so $X_1 + X_2 \sim \text{Poi}(\lambda t_1 + \lambda t_2)$. Then

\begin{eqnarray*}P(X_1 = k \mid X_1 + X_2 = m) & = & \frac{P(X_1 = k, X_2 = m - k)}{P(X_1 + X_2 = m)} \\
& = & \frac{\frac{(\lambda t_1)^k e^{-\lambda t_1}}{k!}
\cdot \frac{(\lambda t_2)^{m - k} e^{-\lambda t_2}}{(m - k)!}}
{\frac{(\lambda t_1 + \lambda t_2)^m e^{-(\lambda t_1 + \lambda t_2)}}{m!}} \\
& = & \frac{m!}{k! ( m - k)!} p^k (1 - p)^{m - k},
\end{eqnarray*}


where $p = t_1 / (t_1 + t_2)$. Hence the conditional distribution is $\text{Bin}\bigl(m, \frac{t_1}{t_1 + t_2}\bigr)$.
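
The following Python sketch checks this numerically; the rate, times, and m below are arbitrary illustrative choices:

\begin{verbatim}
# Compare P(X1 = k | X1 + X2 = m) computed from Poisson pmfs with
# the Bin(m, t1/(t1 + t2)) pmf.
from math import comb, exp, factorial

def poi_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

lam, t1, t2, m = 1.3, 2.0, 3.0, 6
p = t1 / (t1 + t2)
for k in range(m + 1):
    cond = (poi_pmf(k, lam * t1) * poi_pmf(m - k, lam * t2)
            / poi_pmf(m, lam * (t1 + t2)))
    print(abs(cond - comb(m, k) * p**k * (1 - p)**(m - k)))  # ~ 0
\end{verbatim}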

Problem N3-7

(a)

The conditional density of Y given X is

\begin{displaymath}f(y \mid x)
=
\frac{1}{\sqrt{2\pi/x}} e^{-\frac{1}{2(1/x)} y^2}
=
\frac{x^{1/2}}{\sqrt{2 \pi}} e^{- x y^2 / 2}.
\end{displaymath}

The marginal density of X is

\begin{displaymath}f(x) = \frac{\lambda^{\alpha}x^{\alpha-1}e^{-\lambda x}}{\Gamma(\alpha)}
\end{displaymath}

Thus the joint density is (marginal times conditional)

\begin{displaymath}\begin{split}
f(x, y)
& =
\frac{x^{1/2}}{\sqrt{2 \pi}} e^{- x y^2 / 2}
\cdot
\frac{\lambda^{\alpha} x^{\alpha - 1} e^{-\lambda x}}{\Gamma(\alpha)}
\\
& =
\frac{\lambda^{\alpha}}{\sqrt{2 \pi} \, \Gamma(\alpha)}
x^{\alpha + 1/2 - 1} e^{- (\lambda + y^2 / 2) x}
\end{split}
\end{displaymath}

It is clear from the form that this, considered as a function of x for fixed y, is proportional to a $\text{Gam}(\alpha + 1 / 2, \lambda + y^2 / 2)$ density. Thus that is the conditional distribution of X given Y. The conditional density is

\begin{displaymath}f(x \mid y)
=
\frac{(\lambda + \frac{y^2}{2})^{\alpha + \frac{1}{2}}}{\Gamma(\alpha + \frac{1}{2})}
x^{\alpha + 1/2 - 1} e^{- (\lambda + y^2 / 2) x}
\end{displaymath}

(b)

The marginal is the joint divided by the conditional:

\begin{displaymath}\begin{split}
f_Y(y)
& =
\frac{f(x, y)}{f(x \mid y)}
\\
& =
\frac{\displaystyle
\frac{\lambda^{\alpha}}{\sqrt{2 \pi} \, \Gamma(\alpha)}
x^{\alpha + 1/2 - 1} e^{- (\lambda + y^2 / 2) x}}
{\displaystyle
\frac{(\lambda + \frac{y^2}{2})^{\alpha + \frac{1}{2}}}{\Gamma(\alpha + \frac{1}{2})}
x^{\alpha + 1/2 - 1} e^{- (\lambda + y^2 / 2) x}}
\\
& =
\frac{\Gamma(\alpha + \frac{1}{2})}{\sqrt{2 \pi} \, \Gamma(\alpha)}
\cdot
\frac{\lambda^{\alpha}}{(\lambda+\frac{y^2}{2})^{\alpha+\frac{1}{2}}}
\end{split}
\end{displaymath}
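
As a numerical check, the following Python sketch (with arbitrary illustrative values of alpha and lambda) integrates the derived marginal density by a crude Riemann sum; the result should be close to one:

\begin{verbatim}
# Riemann-sum check that the derived marginal density of Y
# integrates to one.
from math import gamma, pi, sqrt

alpha, lam = 2.0, 1.5

def f_Y(y):
    return (gamma(alpha + 0.5) * lam**alpha
            / (sqrt(2 * pi) * gamma(alpha)
               * (lam + y**2 / 2)**(alpha + 0.5)))

h = 0.001
print(sum(f_Y(k * h) * h for k in range(-20000, 20000)))  # close to 1
\end{verbatim}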

Problem N3-8

(a)


\begin{displaymath}E(Y \mid X) = E(X + X^2 + Z \mid X) = X + X^2 + E(Z) = X + X^2
\end{displaymath}

(b)


\begin{displaymath}\begin{split}
\mathop{\rm var}\nolimits(Y \mid X)
& =
E\{ [ Y - E(Y \mid X) ]^2 \mid X \}
\\
& =
E(Z^2 \mid X)
\\
& =
E(Z^2)
\\
& =
\mathop{\rm var}\nolimits(Z)
\end{split}
\end{displaymath}

the last step because $E(Z) = 0$ implies $\mathop{\rm var}\nolimits(Z) = E(Z^2)$.

(c)

Just another way of describing part (a). The conditional expectation is the best prediction.

(d)

Just another way of describing part (b). The expected squared prediction error of the best prediction is $E\{\mathop{\rm var}\nolimits(Y \mid X)\}$, which in this case is the same as $\mathop{\rm var}\nolimits(Y \mid X)$, because the conditional variance is not actually a function of X.
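
The following Monte Carlo sketch illustrates parts (c) and (d). The particular distributions $X \sim N(0, 1)$ and $Z \sim N(0, 1)$ are illustrative assumptions, not given in the problem, which requires only that Z be independent of X with $E(Z) = 0$:

\begin{verbatim}
# Monte Carlo for N3-8 (c) and (d).  ILLUSTRATIVE ASSUMPTIONS:
# X ~ N(0,1) and Z ~ N(0,1), Z independent of X (our choice).
import random

random.seed(0)
n = 200_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + x**2 + random.gauss(0, 1) for x in xs]

# squared error of the best predictor X + X^2: near var(Z) = 1
print(sum((y - (x + x**2))**2 for x, y in zip(xs, ys)) / n)

# any other predictor, e.g. 2X, does worse (near 5 for these choices)
print(sum((y - 2 * x)**2 for x, y in zip(xs, ys)) / n)
\end{verbatim}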

Problem N4-1

The convolution formula is

\begin{displaymath}f_{X + Y}(z) = \sum_{y = 0}^\infty f_X(z - y) f_Y(y)
\end{displaymath}

(Theorem 1.7 in the notes with the integral replaced by a sum because Y is discrete). But the sum actually only goes from 0 to z because x = z - y must be nonnegative. Thus

\begin{displaymath}\begin{split}
f_{X + Y}(z)
& =
\sum_{y = 0}^z
\frac{\mu_X^{z - y} e^{-\mu_X}}{(z - y)!}
\cdot
\frac{\mu_Y^{y} e^{-\mu_Y}}{y!}
\\
& =
\frac{e^{- (\mu_X + \mu_Y)}}{z!}
\sum_{y = 0}^z
\binom{z}{y}
\mu_X^{z - y} \mu_Y^{y}
\\
& =
\frac{1}{z !}
e^{- (\mu_X + \mu_Y)}
(\mu_X + \mu_Y)^z
\end{split}
\end{displaymath}

which is the $\text{Poi}(\mu_X + \mu_Y)$ density, the sum having been evaluated by the binomial theorem.
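
A numerical sketch of the convolution (the means are arbitrary illustrative values):

\begin{verbatim}
# The convolution of Poi(mu_x) and Poi(mu_y) pmfs should match the
# Poi(mu_x + mu_y) pmf.
from math import exp, factorial

def poi_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

mu_x, mu_y = 1.7, 2.4
for z in range(10):
    conv = sum(poi_pmf(z - y, mu_x) * poi_pmf(y, mu_y)
               for y in range(z + 1))
    print(abs(conv - poi_pmf(z, mu_x + mu_y)))  # ~ 0
\end{verbatim}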

Problem N4-3

The count has a $\text{Poi}(84.2)$ distribution. So its variance is 84.2 and its standard deviation is $\sqrt{84.2} = 9.176$.

Problem N4-4


\begin{displaymath}\begin{split}
E\left(\frac{1}{X+1}\right)
& =
\sum_{x=0}^\infty
\frac{1}{x+1}
\cdot
\frac{e^{-\mu}\mu^x}{x!}
\\
& =
\sum_{x=0}^\infty
\frac{e^{-\mu}\mu^x}{(x+1)!}
\end{split}
\end{displaymath}

Do the change of variables y = x + 1. Then

\begin{displaymath}\begin{split}
E\left(\frac{1}{X+1}\right)
& =
\sum_{y = 1}^\infty
\frac{e^{-\mu}\mu^{y-1}}{y!}
\\
& =
\frac{1}{\mu} \sum_{y = 1}^\infty \frac{e^{-\mu}\mu^y}{y !}
\end{split}
\end{displaymath}

The sum now looks like the sum of a Poisson density, which would equal one, except that the y = 0 term is missing. Thus the sum is one minus the y = 0 term, which is $e^{-\mu}$, so

\begin{displaymath}E\left(\frac{1}{X+1}\right)
=
\frac{1 - e^{- \mu}}{\mu}
\end{displaymath}

and the answer is A times this, since A is a constant.
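
A numerical check of the series against the closed form, with an arbitrary illustrative value of mu:

\begin{verbatim}
# Compare the series for E(1/(X+1)) with (1 - e^{-mu})/mu
# (60 terms is plenty for mu = 3).
from math import exp, factorial

mu = 3.0
series = sum(exp(-mu) * mu**x / (factorial(x) * (x + 1))
             for x in range(60))
print(series, (1 - exp(-mu)) / mu)   # the two agree
\end{verbatim}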
Charles Geyer
2000-11-14