% 256f19Assignment7.tex Conditional distributions and independence
\documentclass[12pt]{article}
%\usepackage{amsbsy} % for \boldsymbol and \pmb
%\usepackage{graphicx} % To include pdf files!
\usepackage{amsmath}
\usepackage{amsbsy}
\usepackage{amsfonts}
\usepackage[colorlinks=true, pdfstartview=FitV, linkcolor=blue, citecolor=blue, urlcolor=blue]{hyperref} % For links
\usepackage{fullpage}
% \pagestyle{empty} % No page numbers
\begin{document}
%\enlargethispage*{1000 pt}
\begin{center}
{\Large \textbf{\href{http://www.utstat.toronto.edu/~brunner/oldclass/256f19}{STA 256f19}
Assignment Seven}}\footnote{Copyright information is at the end of the last page.}
\vspace{1 mm}
\end{center}
\noindent
Please read Sections 2.8 and 2.9 in the text. Note that in a departure from the text, the criterion for independence in this class is $F_{_{X,Y}}(x,y) =F_{_X}(x)F_{_Y}(y)$ for all real $x$ and $y$. Also, look over your lecture notes. The following homework problems are not to be handed in. They are preparation for Term Test 3 and the final exam. % Use the formula sheet.
%\vspace{5mm}
\begin{enumerate}
\item Let $X$ and $Y$ be continuous random variables.
\begin{enumerate}
\item Prove that if $f_{_{X,Y}}(x,y) = f_{_X}(x) \, f_{_Y}(y)$ for all real $x$ and $y$, then the random variables $X$ and $Y$ are independent. This result is also true if the condition holds except on a set of probability zero.
\item Prove that if $X$ and $Y$ are independent, then $f_{_{X,Y}}(x,y) = f_{_X}(x) \, f_{_Y}(y)$ at all points where $F_{_{X,Y}}(x,y)$ is differentiable and $f_{_{X,Y}}(x,y)$ is continuous.
\end{enumerate}
\item Let $X$ and $Y$ be discrete random variables. Prove that if $p_{_{X,Y}}(x,y) = p_{_X}(x) \, p_{_Y}(y)$ for all real $x$ and $y$, then $X$ and $Y$ are independent.
\item Do exercises
2.8.1, % Discrete marginal and independence
2.8.3, % Continuous marginal and independence
2.8.5 % Discrete conditional
in the text.
\item Do exercise 2.8.8 in the text. The answer is 2/5. % Condition to get probability
To make this problem easier, first prove that $P(Y>5) = \int_{-\infty}^\infty P(Y>5|X=x) f_{_X}(x) \, dx$. Write that conditional probability as an integral with respect to a conditional density, and switch the order of integration.
\item Let $p_{_{X,Y}}(x,y) = \frac{xy}{36}$ for $x=1,2,3$ and $y=1,2,3$, and zero otherwise.
\begin{enumerate}
\item What is $p_{_{Y|X}}(1|2)$? Answer is 1/6.
\item What is $p_{_{X|Y}}(1|2)$? Answer is 1/6.
\item Are $X$ and $Y$ independent? Answer Yes or No and prove your answer.
\end{enumerate}
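The answers stated above can be checked directly, since this pmf has only nine cells. The following is a minimal Python sketch (not part of the assignment) that tabulates the joint pmf with exact fractions, computes both marginals, and verifies the two conditional probabilities and the independence criterion $p_{_{X,Y}}(x,y) = p_{_X}(x)\,p_{_Y}(y)$:

```python
from fractions import Fraction

# Joint pmf p(x,y) = xy/36 for x, y in {1, 2, 3}, from the problem statement
p = {(x, y): Fraction(x * y, 36) for x in (1, 2, 3) for y in (1, 2, 3)}

# Marginal pmfs by summing over the other variable
pX = {x: sum(p[(x, y)] for y in (1, 2, 3)) for x in (1, 2, 3)}
pY = {y: sum(p[(x, y)] for x in (1, 2, 3)) for y in (1, 2, 3)}

# Conditional pmfs: p_{Y|X}(y|x) = p(x,y)/pX(x), and similarly for X|Y
print(p[(2, 1)] / pX[2])  # p_{Y|X}(1|2) -> 1/6
print(p[(1, 2)] / pY[2])  # p_{X|Y}(1|2) -> 1/6

# Independence: does p(x,y) = pX(x) pY(y) hold in every cell?
print(all(p[(x, y)] == pX[x] * pY[y]
          for x in (1, 2, 3) for y in (1, 2, 3)))  # True
```

Using `Fraction` rather than floats keeps every comparison exact, so the independence check is a genuine equality test rather than a floating-point approximation.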
\item \label{continuousXY} The continuous random variables $X$ and $Y$ have joint probability density function
\begin{displaymath}
f_{_{X,Y}}(x,y) = \left\{ \begin{array}{ll} % ll means left left
k \, x^2y & \mbox{for $0 \leq x \leq 1$ and $0 \leq y \leq x^2$} \\
0 & \mbox{otherwise}
\end{array} \right. % Need that crazy invisible right period!
\end{displaymath}
\begin{enumerate}
\item What is the value of $k$? Answer is $k=14$. % Verified.
\item Find the marginal density function $f_{_X}(x)$. Do not forget to indicate where the density is non-zero.
\item Find the marginal density function $f_{_Y}(y)$. Do not forget to indicate where the density is non-zero.
\item Find the conditional density function $f_{_{X|Y}}(x|y)$. Do not forget to indicate where the density is non-zero. For what values of $y$ is this conditional density defined?
\item Find the conditional density function $f_{_{Y|X}}(y|x)$. Do not forget to indicate where the density is non-zero. For what values of $x$ is this conditional density defined?
\item Are $X$ and $Y$ independent? Answer Yes or No and justify your answer.
\end{enumerate}
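The stated value $k=14$ can be confirmed symbolically by integrating the joint density over its support and setting the total probability to one. A minimal sketch using SymPy (a symbolic algebra library, not something assumed by the course):

```python
import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)

# Joint density k*x^2*y on the region 0 <= x <= 1, 0 <= y <= x^2;
# integrate y first (inner), then x (outer), matching the support.
total = sp.integrate(k * x**2 * y, (y, 0, x**2), (x, 0, 1))  # k/14

# Total probability must equal 1
print(sp.solve(sp.Eq(total, 1), k))  # [14]
```

The inner integral gives $k x^6/2$, and integrating that over $[0,1]$ gives $k/14$, so $k=14$ as stated.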
\pagebreak
\item This question is from the 2018 final exam. There is a continuous version of Bayes' Theorem, which says
\begin{displaymath}
f_{_{Y|X}}(y|x) = \frac{f_{_{X|Y}}(x|y) f_{_Y}(y)}
{\int_{-\infty}^\infty f_{_{X|Y}}(x|t) f_{_Y}(t) \, dt}
\end{displaymath}
Prove it. It's helpful to start with the right-hand side.
\item Do exercise 2.8.9 in the text. Don't waste energy trying to think of a new example. Look at the clever answer in the back of the book and show that
\begin{enumerate}
\item $P(X=1,Y=1) = P(X=1)P(Y=1)$, but
\item $X$ and $Y$ are not independent.
\end{enumerate}
The purpose of this question was to set up the next one.
\item Do exercise 2.8.10 in the text. Hint: Consider all 4 possibilities. Make a $2 \times 2$ table. Fill in the information you know and then solve for the rest.
\item Do exercise 2.8.12 in the text. The answer is 1/3. % This would make a good multiple choice because it is so quick.
\item Do exercise 2.8.15 in the text. % Continuous marginal, conditional, independence.
\item Do exercise 2.8.18 in the text. An additional hint is to define $k_1 = \sum_y h(y)$ and $k_2 = \sum_x g(x)$. Then express the marginal probability mass functions in terms of $k_1$ and $k_2$.
\item Exercise 2.8.19 is just like 2.8.18, except for continuous random variables. You don't have to do it, but you know how; just integrate instead of adding. The question is, does the example of Problem \ref{continuousXY} contradict this theorem? Answer Yes or No and briefly explain. % Support is part of the density.
% These last 3 are straight from the sample problems.
\item Let $X_1, \ldots, X_n$ be independent random variables with probability density function $f_{_X}(x)$ and cumulative distribution function $F_{_X}(x)$. Let $Y = \max(X_1, \ldots, X_n)$. Find the density $f_{_Y}(y)$.
\item Let $X_1, \ldots, X_n$ be independent exponential random variables with parameter $\lambda$. Let $Y = \max(X_1, \ldots, X_n)$. Find the density $f_{_Y}(y)$. Do not forget to indicate where the density is non-zero.
\item Let $X_1, \ldots, X_n$ be independent random variables with probability density function $f_{_X}(x)$ and cumulative distribution function $F_{_X}(x)$. Let $Y = \min(X_1, \ldots, X_n)$. Find the density $f_{_Y}(y)$.
%%%%%%%%%%%%%%%%%%%%% I forgot this whole topic!!
\item Let $X \sim$ Poisson($\lambda_1$) and $Y \sim$ Poisson($\lambda_2$) be independent. Using the convolution formula, find the probability mass function of $Z=X+Y$ and identify it by name.
\item Let $X_1$ and $X_2$ be independent exponential random variables with parameter $\lambda=1$. Find the probability density function of $Y_1 = X_1/X_2$.
\item Let $X_1$ and $X_2$ be independent exponential random variables with parameter $\lambda=1$. Find the probability density function of $Y_1 = \frac{X_1}{X_1+X_2}$. Be sure to specify where the density is non-zero.
\item Let $X_1$ and $X_2$ be independent standard normal random variables; that is, $\mu=0$ and $\sigma^2=1$. Find the probability density function of $Y_1 = X_1/X_2$.
\item Show that the normal probability density function integrates to one. The formula for change to polar co-ordinates is on the formula sheet.
\end{enumerate} % End of all the questions
% \vspace{60mm}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\vspace{3mm}
\hrule
\vspace{3mm}
\noindent
This assignment was prepared by \href{http://www.utstat.toronto.edu/~brunner}{Jerry Brunner},
Department of Mathematical and Computational Sciences, University of Toronto. It is licensed under a
\href{http://creativecommons.org/licenses/by-sa/3.0/deed.en_US}
{Creative Commons Attribution - ShareAlike 3.0 Unported License}. Use any part of it as you like and share the result freely. The \LaTeX~source code is available from the course website:
\begin{center}
\href{http://www.utstat.toronto.edu/~brunner/oldclass/256f19} {\small\texttt{http://www.utstat.toronto.edu/$^\sim$brunner/oldclass/256f19}}
\end{center}
\end{document}