Homework #7
EE 364: Spring 2026
Assigned: 24 February. Due: Tuesday, 03 March at 16:00
BrightSpace Assignment: Homework 7
Write your solutions to these homework problems. Submit your work to BrightSpace by the due date. Show all work and box answers where appropriate. Do not guess.
Problem 0
Daily derivation #7
State the formal definitions of the MOPD convergences. Prove the Weak Law of Large Numbers: \(\overline{X}_n \stackrel{p}{\rightarrow} \mu_X\). Show that the sample variance is unbiased and consistent: \(E[S_X^2(n)] = \sigma_X^2\) for all \(n\) and \(S_X^2(n) \stackrel{p}{\rightarrow} \sigma_X^2\).
Derive the pdf for \(Y = g(X)\) if \(g(x)\) is 1-to-1.
Problem 1
Let the random variable \(Y = e^X\). Find the pdf of \(Y\) when \(X\) is a Gaussian random variable; in this case \(Y\) is said to be a lognormal random variable. Plot the pdf and cdf of \(Y\) when \(X\) is zero-mean with variance \(\frac{1}{8}\). Repeat with variance 8.
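One way the requested plots might be produced (a sketch only, not the required derivation), assuming Python with NumPy, SciPy, and Matplotlib; scipy.stats.lognorm takes the standard deviation of \(X\) as its shape parameter s and \(e^{\mu_X}\) as its scale, so no derived formula is needed for the plots:

```python
# Plotting sketch for the lognormal Y = exp(X), X ~ N(0, var), var in {1/8, 8}.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import lognorm

y = np.linspace(0.01, 5, 500)                    # evaluation grid, Y > 0
fig, (ax_pdf, ax_cdf) = plt.subplots(1, 2, figsize=(10, 4))

for var in (1 / 8, 8):                           # the two variances requested
    sigma = np.sqrt(var)
    dist = lognorm(s=sigma, scale=np.exp(0.0))   # zero-mean X
    ax_pdf.plot(y, dist.pdf(y), label=f"var = {var}")
    ax_cdf.plot(y, dist.cdf(y), label=f"var = {var}")

ax_pdf.set_title("pdf of Y")
ax_cdf.set_title("cdf of Y")
for ax in (ax_pdf, ax_cdf):
    ax.set_xlabel("y")
    ax.legend()
plt.tight_layout()
plt.show()
```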
Problem 2
Let \(Y = \alpha \tan \left( \pi X \right)\), where \(X\) is uniformly distributed in the interval \([-1, 1]\). Determine the pdf \(f_Y(y)\) and identify \(Y\) as one of the BEG-CUP random variables.
Problem 3
Let a radius be given by the random variable \(X\) with pdf: \[f_X(x) = \begin{cases} c x \left( 1 - x^2\right) & 0 \le x \le 1 \\ 0 & \textrm{else} \end{cases}\]
Find the pdf of the area covered by a disc with radius \(X\).
Find the pdf of the volume of a sphere with radius \(X\).
Problem 4
A voltage \(X\) is a Gaussian random variable with mean 1 and variance 2. Find the pdf of the power \(P = R X^2\) dissipated by an \(R\)-\(\Omega\) resistor.
Problem 5
Suppose that the number of particle emissions by a radioactive mass in \(t\) seconds is a Poisson random variable with mean \(\lambda t\). Use the Chebyshev inequality to obtain a bound for the probability that \(\left|N(t)/t - \lambda\right|\) exceeds \(\epsilon\).
Problem 6
Suppose that \(20\%\) of voters are in favor of a certain piece of legislation. A large number \(n\) of voters are polled and a relative frequency estimate \(f_A(n)\) for the above proportion is obtained. How many voters should be polled so that the probability is at least \(0.95\) that \(f_A(n)\) differs from \(0.20\) by less than \(0.02\)?
Problem 7
John works as a for-hire consultant while he searches for a job. A small software firm hires John to write some statistical software for a seller. The seller wants an accurate prediction of how many customers to expect on Sunday morning. The seller knows enough about probability that she does not trust John’s “best guess” for a particular probability density function; she requires that John’s analysis be completely data-driven. So John first examines the customer records for the past several Sunday mornings. He uses this information to estimate that the mean number of customers on a Sunday morning is 19 customers with a standard deviation of 3.5 customers. The seller specifically asks John to tell her how likely it is that on Sunday there will be between 12 and 26 customers. What is John’s best answer?
Problem 8
A fair die is tossed 20 times. Bound the probability that the total number of dots is between 60 and 80.
Problem 9
Let \(\zeta\) be selected at random from the interval \(S = [0,1]\), and let the probability that \(\zeta\) is in a subinterval of \(S\) be given by the length of the subinterval. Define the following sequences of random variables for \(n \ge 1\): \[X_n(\zeta) = \zeta^n \qquad Y_n(\zeta) = \cos^2 2 \pi \zeta \qquad Z_n(\zeta) = \cos^n 2 \pi \zeta\] Do the sequences converge, and if so, in what sense and to what limiting random variable?
Problem 10
Let \(X_n \sim \textrm{Cauchy}(\frac{1}{n})\). Show that \(X_n\) converges in probability to zero.
Problem 11
Let \(U \sim \textrm{Uniform}(0,1)\) and put \[X_n = n I_{[0,\frac{1}{\sqrt{n}}]}(U), \qquad n = 1,2,3,\ldots\] Does \(X_n\) converge in probability to zero?
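A quick empirical check may be helpful before writing the proof; the sketch below (sample size, seed, and \(\epsilon\) are arbitrary choices) estimates \(P(|X_n| > \epsilon)\) for a few values of \(n\). It is an illustration only and does not count as a derivation.

```python
# Empirical estimate of P(|X_n| > eps) for X_n = n * indicator{U in [0, 1/sqrt(n)]}.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.5
for n in (10, 100, 1000, 10000):
    u = rng.uniform(0.0, 1.0, size=100_000)    # samples of U ~ Uniform(0, 1)
    x_n = n * (u <= 1.0 / np.sqrt(n))          # realizations of X_n
    print(n, np.mean(np.abs(x_n) > eps))       # estimated P(|X_n| > eps)
```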
Problem 12
Let \(X_n\) be a sequence of Laplacian random variables with parameter \(\alpha = n\). Does this sequence converge in distribution?
Problem 13
The random-variable sequence \(X_1, X_2, X_3, \ldots\) consists of binomially distributed random variables \(X_n \sim b(n,p)\), where \(0 < p < 1\). Define the sequence of estimators \(\theta_n = 1 - \frac{X_n}{n}\). Is \(\theta_n\) a consistent estimator for \(1 - p\)?
Problem 14
You randomly survey \(n\) voters before the state election between candidates John and Mary. Say you use the estimator \(\widehat{\theta}_n = \frac{X- \sqrt{n}/4}{n + \sqrt{n}}\) to estimate the unknown probability \(p\) that a voter will vote for John. Random variable \(X\) is the number of the \(n\) surveyed voters who vote for John. Is the estimator \(\widehat{\theta}_n\) consistent?
Problem 15
Random variables \(Y_n\) are independent Poisson with mean \(\sum_{i=1}^n \frac{1}{i}\). Does \(X_n = \frac{Y_n}{\ln n}\) converge in probability? [hint: \(\ln n + \frac{1}{n} \le \sum_{i=1}^{n} \frac{1}{i} \le \ln n + 1\)].
Problem 16
Random sequences \(X_n \overset{p}{\longrightarrow} X\) and \(Y_n \overset{p}{\longrightarrow} Y\). Is it always true that \(X_n + a Y_n \overset{p}{\longrightarrow} X + a Y\) for any \(a \in \mathbb{R}\)? Only derivations count as an answer.
Random sequences \(V_n \overset{m}{\longrightarrow} V\) and \(W_n \overset{m}{\longrightarrow} W\). Is it always true that \(V_n + b W_n \overset{p}{\longrightarrow} V + b W\) for any \(b \in \mathbb{R}\)? Only derivations count as an answer.
Problem 17
Let \(X_1, \ldots, X_n\) be an i.i.d. sequence of random variables with unknown mean and variance. Define the sample variance as: \[V_n^2 = \frac{1}{n-1} \sum_{k=1}^{n} (X_k - M_n)^2,\] where \(M_n\) is the sample mean.
(a) Show that \[\sum_{k=1}^{n} (X_k - \mu)^2 = \sum_{k=1}^{n} (X_k - M_n)^2 + n (M_n - \mu)^2.\]
(b) Use the result in (a) to show that, for any constant \(c\), \[E\left[c \sum_{k=1}^{n} (X_k - M_n)^2\right] = c (n-1) \sigma^2.\]
(c) Use (b) to show that \(E[V_n^2] = \sigma^2\).
(d) Find the expected value of the sample variance if you replace \(\frac{1}{n-1}\) with \(\frac{1}{n}\).
Problem 18
Computer Problems: Approximate the following integrals using a Monte Carlo simulation. Compare your estimates with the exact values (if known):
\(\displaystyle \int_{-2}^{2} e^{x + x^2} dx\).
\(\displaystyle \int_{0}^{4 \pi} \textrm{sinc}(x) dx\).
\(\displaystyle \int_{0}^{1} \int_{0}^{1} e^{-(x + y)^2} dy dx\).
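A minimal Monte Carlo sketch for the first and third integrals, assuming Python with NumPy; plain uniform sampling over the integration region is just one possible scheme, and the sample size is arbitrary:

```python
# Monte Carlo estimates via uniform sampling: (b - a) * E[f(X)] for the 1-D case,
# and (area of region) * E[f(X, Y)] for the 2-D case.
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

# First integral: exp(x + x^2) over [-2, 2]; interval length is 4.
x = rng.uniform(-2.0, 2.0, size=N)
est_first = 4.0 * np.mean(np.exp(x + x**2))
print("estimate of the first integral:", est_first)

# Third integral: exp(-(x + y)^2) over the unit square; the region has area 1.
x = rng.uniform(0.0, 1.0, size=N)
y = rng.uniform(0.0, 1.0, size=N)
est_third = np.mean(np.exp(-(x + y)**2))
print("estimate of the third integral:", est_third)
```

The same pattern applies to the second integral once \(\textrm{sinc}\) is evaluated with the course's convention.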