Homework #10

EE 364: Spring 2026

Assignment Details

Assigned: 06 April
Due: Thursday, 23 April at 16:00

BrightSpace Assignment: Homework 10

Instructions

Write up your solutions to the problems below and submit your work to BrightSpace by the due date. Show all work and box final answers where appropriate. Do not guess.


Problem 0

Daily derivation #10

  • Show that the sample variance \(S_X^2(n) = \frac{1}{n-1}\sum_{k=1}^{n}(X_k - \bar{X}_n)^2\) is unbiased and consistent: \(E[S_X^2(n)] = \sigma_X^2\) for all \(n \ge 2\) and \(S_X^2(n) \xrightarrow{p} \sigma_X^2\). (A simulation sanity check follows.)
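
A minimal Monte Carlo sanity check of both claims, assuming i.i.d. normal samples with \(\sigma_X^2 = 4\) (an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0  # assumed true variance, illustrative

# Unbiasedness: average S^2 over many small samples should be near sigma2.
n, trials = 5, 200_000
samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
s2 = samples.var(axis=1, ddof=1)           # sample variance, divisor n - 1
print("mean of S^2 at n = 5:", s2.mean())  # expect ~ 4.0

# Consistency: S^2 for one large sample should itself be close to sigma2.
big = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
print("S^2 at n = 1e6:", big.var(ddof=1))  # expect ~ 4.0
```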

Problem 1

Suppose that \(20\%\) of voters are in favor of a certain piece of legislation. A large number \(n\) of voters are polled and a relative-frequency estimate \(f_A(n)\) of this proportion is obtained. How many voters should be polled so that, with probability at least \(0.95\), \(f_A(n)\) differs from \(0.20\) by less than \(0.02\)?
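
To experiment numerically, the sketch below estimates \(P[|f_A(n) - 0.20| < 0.02]\) for one candidate poll size; the value \(n = 2000\) is only a placeholder, not the answer.

```python
import numpy as np

rng = np.random.default_rng(0)
p, eps, trials = 0.20, 0.02, 100_000
n = 2000  # placeholder poll size; vary this to find the requirement

f = rng.binomial(n, p, size=trials) / n   # relative frequency f_A(n)
print("P(|f_A(n) - 0.20| < 0.02) ≈", np.mean(np.abs(f - p) < eps))
```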

Problem 2

The random-variable sequence \(X_1, X_2, X_3, \ldots\) consists of binomial random variables \(X_n \sim b(n,p)\), where \(0 < p < 1\). Define the sequence of estimators \(\theta_n = 1 - \frac{X_n}{n}\). Is \(\theta_n\) a consistent estimator of \(1 - p\)?
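
A quick numerical look at \(\theta_n\) as \(n\) grows, assuming \(p = 0.3\) purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p, trials = 0.3, 100_000  # assumed p, illustrative

for n in (10, 100, 10_000):
    theta = 1.0 - rng.binomial(n, p, size=trials) / n
    # Fraction of runs with theta_n within 0.01 of the target 1 - p.
    print(n, np.mean(np.abs(theta - (1 - p)) < 0.01))
```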

Problem 3

You randomly survey \(n\) voters before the state election between candidates John and Mary. You use the estimator \(\widehat{\theta}_n = \frac{X - \sqrt{n}/4}{n + \sqrt{n}}\) to estimate the unknown probability \(p\) that a voter will vote for John, where the random variable \(X\) is the number of the \(n\) surveyed voters who vote for John. Is the estimator \(\widehat{\theta}_n\) consistent?
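
A simulation sketch of \(\widehat{\theta}_n\) for increasing \(n\), assuming \(p = 0.5\) (an illustrative choice; other values of \(p\) are worth trying too):

```python
import numpy as np

rng = np.random.default_rng(0)
p, trials = 0.5, 100_000  # assumed true p, illustrative

for n in (10, 1_000, 100_000):
    x = rng.binomial(n, p, size=trials)            # votes for John
    est = (x - np.sqrt(n) / 4) / (n + np.sqrt(n))
    print(n, est.mean(), np.mean(np.abs(est - p) < 0.01))
```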

Problem 4

Random variables \(Y_n\) are independent Poisson with mean \(\sum_{i=1}^n \frac{1}{i}\). Does \(X_n = \frac{Y_n}{\ln n}\) converge in probability? [Hint: \(\ln n + \frac{1}{n} \le \sum_{i=1}^{n} \frac{1}{i} \le \ln n + 1\).]
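
A numerical sketch of the mean and variance of \(X_n\) for a few values of \(n\) (the harmonic sum is computed directly):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 100_000

for n in (100, 10_000, 1_000_000):
    mean_n = np.sum(1.0 / np.arange(1, n + 1))        # harmonic sum H_n
    x = rng.poisson(mean_n, size=trials) / np.log(n)  # X_n = Y_n / ln n
    print(n, x.mean(), x.var())
```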

Problem 5

Let \(X_1, \ldots, X_n\) be i.i.d. \(U[0, \theta]\) where \(\theta > 0\) is unknown. Consider two estimators of \(\theta\) (a simulation sketch for comparing them follows the questions below):

\[\hat{\theta}_A = 2\bar{X}_n, \qquad \hat{\theta}_B = \max(X_1, \ldots, X_n).\]

You may use: \(E[\hat{\theta}_B] = \frac{n}{n+1}\theta\) and \(V[\hat{\theta}_B] = \frac{n\theta^2}{(n+1)^2(n+2)}\).

  1. Compute the bias and mean-square error of each estimator.

  2. Which estimator has smaller MSE for \(n \ge 3\)?

  3. Does either estimator converge in mean-square to \(\theta\)?

  4. Is either a consistent estimator of \(\theta\)?
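
A minimal Monte Carlo sketch for sanity-checking parts 1 and 2, assuming \(\theta = 1\) and \(n = 10\) (illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 1.0, 10, 200_000  # assumed theta and n, illustrative

x = rng.uniform(0.0, theta, size=(trials, n))
est_a = 2.0 * x.mean(axis=1)   # theta_hat_A
est_b = x.max(axis=1)          # theta_hat_B

for name, est in (("A", est_a), ("B", est_b)):
    print(name, "bias ≈", est.mean() - theta,
          " MSE ≈", np.mean((est - theta) ** 2))
```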

Problem 6

Let \(X_1, \ldots, X_n\) be i.i.d. with mean \(\mu \ne 0\) and variance \(\sigma^2\). Consider the estimator \(\hat{\mu}_n = c\,\bar{X}_n\) for a constant \(c > 0\). (A simulation sketch follows the questions below.)

  1. Compute the bias and mean-square error of \(\hat{\mu}_n\) as an estimator of \(\mu\).

  2. Find the value \(c^*\) that minimizes \(\text{MSE}(\hat{\mu}_n)\).

  3. For what values of \(c\) does \(\hat{\mu}_n\) have strictly lower MSE than \(\bar{X}_n\)?
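
A Monte Carlo sketch for exploring how the MSE varies with \(c\); the values \(\mu = 2\), \(\sigma = 3\), \(n = 25\), and the normal sampling distribution are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 3.0, 25, 200_000  # all illustrative choices

x = rng.normal(mu, sigma, size=(trials, n))
xbar = x.mean(axis=1)
for c in (0.8, 0.9, 1.0, 1.1):
    print("c =", c, " MSE ≈", np.mean((c * xbar - mu) ** 2))
```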

Problem 7

John records \(n\) independent lifetimes of a mechanical component. Each lifetime \(X_k \sim \textrm{Gamma}(2, \theta)\) with unknown scale parameter \(\theta\). He estimates \(\theta\) with \(\hat{\theta}_n = \bar{X}_n / 2\). Mary suggests using \(\tilde{\theta}_n = \frac{n}{2(n+1)}\bar{X}_n\) instead, claiming it has strictly smaller mean-square error for every \(n \ge 1\) and every \(\theta > 0\). Do you agree?
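
A Monte Carlo sketch for comparing the two estimators, assuming an illustrative \(\theta = 1.5\) and \(n = 10\):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 1.5, 10, 200_000  # assumed scale and sample size

x = rng.gamma(2.0, theta, size=(trials, n))  # Gamma(2, theta), scale theta
xbar = x.mean(axis=1)
print("John MSE ≈", np.mean((xbar / 2.0 - theta) ** 2))
print("Mary MSE ≈", np.mean((n / (2.0 * (n + 1)) * xbar - theta) ** 2))
```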

Problem 8

An environmental engineer measures dissolved oxygen in a reservoir with a portable sensor. She takes \(n\) independent readings \(X_k = \mu + \epsilon_k\) where \(\mu\) is the true concentration (mg/L) and the errors \(\epsilon_k\) are i.i.d. with \(E[\epsilon_k] = 0.2\) (systematic sensor bias) and \(V[\epsilon_k] = 4\). (A simulation sketch follows the questions below.)

  1. Compute \(\textrm{MSE}(\bar{X}_n)\) as an estimator of \(\mu\).

  2. How large must \(n\) be to guarantee \(P\!\left[|\bar{X}_n - \mu| > 1\right] \le 0.05\)?

  3. A colleague discovers and corrects the systematic bias. How large must \(n\) be now?

  4. Suppose instead that the colleague estimates the bias using \(m\) independent calibration readings, producing \(\hat{\delta}_m\) with \(E[\hat{\delta}_m] = 0.2\) and \(V[\hat{\delta}_m] = 1/m\), independent of the \(X_k\). Find \(\textrm{MSE}(\bar{X}_n - \hat{\delta}_m)\) and the minimum \(n = m\) that achieves the same guarantee as part 2.
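
A simulation sketch for parts 1 and 2. The true concentration \(\mu = 8\) and the normality of the errors are illustrative assumptions (only the error mean and variance are given); since the average of \(n\) i.i.d. normal errors is exactly \(N(0.2, 4/n)\), the sample mean is drawn directly:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, bias, var, trials = 8.0, 0.2, 4.0, 200_000  # mu assumed; bias/var given

for n in (10, 50, 200):
    # Mean of n i.i.d. N(0.2, 4) errors is exactly N(0.2, 4/n).
    xbar = mu + rng.normal(bias, np.sqrt(var / n), size=trials)
    print(n, "MSE ≈", np.mean((xbar - mu) ** 2),
          " P(|err| > 1) ≈", np.mean(np.abs(xbar - mu) > 1.0))
```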

Problem 9

John uses \(\hat{p}_n(1 - \hat{p}_n)\) to estimate the population variance \(p(1-p)\) from \(n\) i.i.d. Bernoulli\((p)\) trials, where \(\hat{p}_n = \bar{X}_n\). He claims the estimator is unbiased because \(E[\hat{p}_n] = p\) and \(E[1 - \hat{p}_n] = 1 - p\). Do you agree? Is the estimator consistent?
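
A quick numerical check of \(E[\hat{p}_n(1-\hat{p}_n)]\) against \(p(1-p)\), assuming \(p = 0.3\) for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p, trials = 0.3, 500_000  # assumed p, illustrative

for n in (5, 50, 5_000):
    phat = rng.binomial(n, p, size=trials) / n
    est = phat * (1.0 - phat)
    print(n, "E[estimator] ≈", est.mean(), " target:", p * (1 - p))
```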

Problem 10

Let \(X_1, X_2, \ldots\) be i.i.d. with mean \(\mu\) and variance \(\sigma^2 < \infty\). Show that \((X_1 - X_2)^2/2\) is an unbiased estimator of \(\sigma^2\). Is it consistent?
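
A Monte Carlo check of unbiasedness, using centered exponential samples with \(\sigma^2 = 4\) (any distribution with finite variance would do):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 500_000

# Centered Exp samples: mean 0, variance 4 (scale 2 => variance 2^2 = 4).
x1 = rng.exponential(2.0, size=trials) - 2.0
x2 = rng.exponential(2.0, size=trials) - 2.0
est = (x1 - x2) ** 2 / 2.0
print("E[estimator] ≈", est.mean(), " std of estimator:", est.std())
```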

Problem 11

Let \(X\), \(Z\), and \(U\) be independent random variables with \(X, Z \sim \textrm{Exp}(1)\) and \(U \sim U[-1/2, 1/2]\). Compute \(E[e^{(X+Z)U}]\).
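
A direct Monte Carlo estimate of the requested expectation, useful for checking an analytic answer:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 2_000_000

x = rng.exponential(1.0, size=trials)
z = rng.exponential(1.0, size=trials)
u = rng.uniform(-0.5, 0.5, size=trials)
print("E[exp((X+Z)U)] ≈", np.mean(np.exp((x + z) * u)))
```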

Problem 12

Let \(Y \sim U[1,2]\), and given \(Y = y\), suppose that \(X \sim \textrm{Laplace}(y)\). Find \(E[X^2 Y]\).
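
A Monte Carlo sketch, under the assumption that \(\textrm{Laplace}(y)\) denotes the density \(\frac{y}{2}e^{-y|x|}\) (scale \(1/y\)); adjust if your course uses a different convention:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 2_000_000

y = rng.uniform(1.0, 2.0, size=trials)
x = rng.laplace(0.0, 1.0 / y, size=trials)  # Laplace(y): scale 1/y assumed
print("E[X^2 Y] ≈", np.mean(x ** 2 * y))
```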

Problem 13

Let \(Y \sim \textrm{Exp}(\lambda)\), and suppose that given \(Y = y\), \(X \sim \textrm{Gamma}(p, y)\). Assuming \(r > n\), evaluate \(E[X^n Y^r]\).
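
A Monte Carlo sketch with \(\textrm{Gamma}(p, y)\) read as shape \(p\) and rate \(y\) (an assumed convention), and illustrative values \(\lambda = 1\), \(p = 2\), \(n = 1\), \(r = 2\):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, n_exp, r_exp, trials = 1.0, 2.0, 1, 2, 2_000_000  # assumed values

y = rng.exponential(1.0 / lam, size=trials)   # Y ~ Exp(lam), rate lam
x = rng.gamma(p, 1.0 / y, size=trials)        # Gamma(p, y), rate y assumed
print("E[X^n Y^r] ≈", np.mean(x ** n_exp * y ** r_exp))
```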

Problem 14

Let \(V\) and \(U\) be independent random variables with \(V \sim \textrm{Erlang}(2, 1)\) and \(U \sim U[-1/2, 1/2]\). Put \(Y := e^{VU}\). (A simulation sketch follows the questions below.)

  1. Find the density \(f_Y(y)\) for all \(y\).

  2. Use your answer to part 1 to compute \(E[Y]\).

  3. Compute \(E[Y]\) directly by using the laws of total probability and substitution.
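
A simulation sketch for cross-checking parts 1–3: it estimates \(E[Y]\) directly and approximates \(f_Y\) at one point (here \(y_0 = 1\), an arbitrary choice) with a small histogram window:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 2_000_000

v = rng.gamma(2.0, 1.0, size=trials)     # Erlang(2,1) = Gamma(shape 2, rate 1)
u = rng.uniform(-0.5, 0.5, size=trials)
y = np.exp(v * u)
print("E[Y] ≈", y.mean())

y0, h = 1.0, 0.01                        # histogram-window density estimate
print("f_Y(1) ≈", np.mean(np.abs(y - y0) < h) / (2 * h))
```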

Problem 15

Use the law of total probability to solve the following problems. (A simulation sketch for part 2 follows the list.)

  1. Evaluate \(E[\cos(X + Y)]\) if, given \(X = x\), \(Y\) is conditionally uniform on \([x - \pi, x + \pi]\).

  2. Evaluate \(P(Y > y)\) if \(X \sim U[1, 2]\), and given \(X = x\), \(Y\) is exponential with parameter \(x\).

  3. Evaluate \(E[Xe^Y]\) if \(X \sim U[3, 7]\), and given \(X = x\), \(Y \sim N(0, x^2)\).

  4. Let \(X \sim U[1, 2]\), and suppose that given \(X = x\), \(Y \sim N(0, 1/x)\). Evaluate \(E[\cos(XY)]\).
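
A Monte Carlo sketch for part 2, with \(\textrm{Exp}(x)\) read as rate \(x\) (scale \(1/x\)):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 2_000_000

x = rng.uniform(1.0, 2.0, size=trials)
y = rng.exponential(1.0 / x, size=trials)  # Y | X = x ~ Exp(rate x)
for y0 in (0.5, 1.0, 2.0):
    print("P(Y >", y0, ") ≈", np.mean(y > y0))
```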