
A couple of exercises from Functions of Random Variables by Xiaojing Ye (Department of Mathematics & Statistics, Georgia State University)

$\textbf{Example}$. If the pdf of $X$ is

$$ f(x)= \begin{cases}6 x(1-x), & \text { if } 0<x<1 \\ 0, & \text { otherwise }\end{cases} $$

Find the probability density function of $Y=X^3$.


$$\Pr[Y \le k] = \Pr[X^3 \le k] = \Pr[X \le k^{1/3}], \quad 0 < k < 1$$

We know that $\int_{0}^{k^{1/3}} 6x(1-x)\, dx = \int_{0}^{k^{1/3}} (6x-6x^2)\, dx = 3(k^{1/3})^2 -2(k^{1/3})^3 = 3 k^{2/3} - 2k $

So, differentiating with respect to $k$, the pdf of $Y$ will be $f(k) = 2k^{-1/3}-2$, $k \in(0,1)$
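As a quick numerical sanity check (a Monte Carlo sketch assuming NumPy; note that $6x(1-x)$ on $(0,1)$ is exactly the Beta$(2,2)$ density):

```python
import numpy as np

rng = np.random.default_rng(0)

# X has pdf 6x(1-x) on (0,1), i.e. X ~ Beta(2, 2)
x = rng.beta(2, 2, size=200_000)
y = x**3

# Compare the empirical CDF of Y at k with the derived CDF 3k^(2/3) - 2k
k = 0.4
empirical = np.mean(y <= k)
derived = 3 * k**(2 / 3) - 2 * k
```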

$\textbf{Example.}$ Let $X$ be a random variable with pdf $f(x)$ and $Y=|X|$. Show that the pdf of $Y$ is

$$ g(y)= \begin{cases}f(y)+f(-y), & \text { if } y>0 \\ 0, & \text { otherwise }\end{cases} $$

Let's consider $\Pr[Y \le k] = \Pr[|X| \le k] = \Pr[-k \le X \le k] = \int_{-k}^k f(y)\, dy$

Then if we take derivative with respect to $k$, we will have $$\frac{d}{dk}\Pr[Y \le k] = \frac{d}{dk}\int_{-k}^k f(y)\, dy = f(k) -(-f(-k)) = f(k) +f(-k)$$

$\textbf{Example.}$ Use the previous result to find the pdf of $Y=|X|$ where $X$ is the standard normal RV.

$$X: f(x) = \frac{1}{\sqrt{2 \pi}} \exp(-\frac{x^2}{2})$$

So $$Y: g(y) = \frac{2}{\sqrt{2 \pi}} \exp(-\frac{y^2}{2}) = \sqrt{\frac{2}{\pi}} \exp(-\frac{y^2}{2}), \quad y > 0$$
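A small simulation check of this half-normal density (a sketch assuming NumPy): its mean should be $\sqrt{2/\pi}$, which is also $\mathbb{E}|X|$ for a standard normal $X$.

```python
import numpy as np

rng = np.random.default_rng(1)
y = np.abs(rng.standard_normal(500_000))  # Y = |X|, X standard normal

# The derived pdf sqrt(2/pi) * exp(-y^2/2) on y > 0 has mean sqrt(2/pi)
empirical_mean = y.mean()
theoretical_mean = np.sqrt(2 / np.pi)
```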

$\textbf{Example.}$ Suppose the joint pdf of $\left(X_1, X_2\right)$ is

$$f\left(x_1, x_2\right)= \begin{cases}6 e^{-3 x_1-2 x_2}, & \text { if } x_1, x_2>0 \\ 0, & \text { otherwise. }\end{cases}$$

Find the pdf of $Y=X_1+X_2$.

Let's consider $$\Pr[Y \le k] = \Pr[X_1 +X_2 \le k] = \int_0^{k} \int_{0}^{k-y} 6 \exp(-3x -2y)\, dx \,dy$$

$$= 6\int_0^k \exp(-2y) \int_{0}^{k-y}\exp(-3x)\, dx \, dy$$

$$ = -2\int_0^k \exp(-2y) \Big[\exp(-3x)\Big]_{0}^{k-y} \, dy$$

$$= -2\int_0^k \exp(-2y) (\exp(-3(k-y)) - 1) \, dy$$

$$=-2 \int_0^k \left(\exp(-3k+y) - \exp(-2y)\right) \, dy$$

$$= -2 \Big[\exp(-3k + y) + \frac{1}{2}\exp(-2y)\Big]_0^k$$

$$= -2 (\exp(-2k ) + \frac{1}{2}\exp(-2k) - \exp(-3k ) - \frac{1}{2} )$$

$$= -3\exp(-2k) +2\exp(-3k) +1$$

Taking the derivative with respect to $k$, we get $f(k) = 6 \exp(-2k) -6\exp(-3k)$ for $k > 0$.
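The CDF $1 - 3e^{-2k} + 2e^{-3k}$ can be checked by simulation (a sketch assuming NumPy; $X_1$ has rate $3$ and $X_2$ rate $2$, so NumPy's `scale` parameters are the reciprocals):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
x1 = rng.exponential(scale=1 / 3, size=n)  # rate 3
x2 = rng.exponential(scale=1 / 2, size=n)  # rate 2
y = x1 + x2

k = 1.0
empirical = np.mean(y <= k)
derived = 1 - 3 * np.exp(-2 * k) + 2 * np.exp(-3 * k)
```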

Now, let's look into some discrete distributions.

$\textbf{Example.}$ Let $X$ be the number of heads by tossing a fair coin for 4 times. Then the pmf $f$ of $X$ is

| $x$ | $f(x)$ |
| --- | --- |
| 0 | 0.0625 |
| 1 | 0.25 |
| 2 | 0.375 |
| 3 | 0.25 |
| 4 | 0.0625 |

Find the pmf $g$ of $Y=\frac{1}{1+X}$.

So, we know that the function $\frac{1}{1+x}$ is one-to-one on $[0, \infty)$,

so we will have

$$y = 1, \frac{1}{2}, \frac{1}{3}, \frac{1}{4}, \frac{1}{5}$$

and

$$g(y) = \frac{1}{16}, \frac{1}{4}, \frac{3}{8}, \frac{1}{4}, \frac{1}{16}$$

$\textbf{Example.}$ Now, let's compute the pmf of $Z = (X-2)^2.$ Here the map is not one-to-one: $z=1$ comes from $x=1,3$ and $z=4$ from $x=0,4$, so the corresponding probabilities add:

$z = 0, 1, 4$

$f_{Z}(z) = \frac{3}{8}, \frac{1}{2}, \frac{1}{8}$
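Both discrete transforms can be verified by exact enumeration (a Python sketch using exact fractions):

```python
from fractions import Fraction
from math import comb

# pmf of X = number of heads in 4 fair tosses: f(x) = C(4, x) / 16
f = {x: Fraction(comb(4, x), 16) for x in range(5)}

# Y = 1/(1+X) is one-to-one, so each probability carries over unchanged
g = {}
for x, p in f.items():
    y = Fraction(1, 1 + x)
    g[y] = g.get(y, Fraction(0)) + p

# Z = (X-2)^2 is many-to-one, so probabilities of colliding x values add
h = {}
for x, p in f.items():
    z = (x - 2) ** 2
    h[z] = h.get(z, Fraction(0)) + p
```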

$\textbf{Theorem.}$ Let $X$ be continuous with pdf $f(x)$ and $Y=u(X)$, where $u$ is strictly monotone on $\operatorname{supp}(f)$. Then the pdf $g(y)$ of $Y$ is

$$g(y)=f(w(y))\left|w^{\prime}(y)\right|$$

where $w$ is the inverse of $u$ (i.e., $y=u(x)$ iff $x=w(y)$ ).

Proof.

Assume that $u$ is monotonically increasing (so $w$ is increasing too, and $w' > 0$).

$$\Pr[Y \le y] = \Pr[u(X) \le y] = \Pr[X \le w(y)]$$

If we take derivative both sides with respect to $y$, we will have

$$g(y) = f(w(y))\, w'(y) = f(w(y))\, |w'(y)|$$

For a decreasing $u$, the same computation gives $g(y) = -f(w(y))\,w'(y) = f(w(y))\,|w'(y)|$, since $w' < 0$ in that case.

$\textbf{Example.}$ Let $X$ be an exponential RV with parameter $\lambda$. Find the pdf of $Y = \sqrt{X}$.

Let's consider

$$Y = u(X) = \sqrt{X}, \qquad X = w(Y) = Y^2,$$

so $w'(y) = 2y$ and $$g(y) = f(y^2)\,|w'(y)| = \lambda e^{-\lambda y^2} \cdot 2y = 2\lambda y e^{-\lambda y^2}, \quad y > 0$$
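A simulation sketch (assuming NumPy, and taking $\lambda = 1$): under the derived pdf $2\lambda y e^{-\lambda y^2}$, $\mathbb{E}[Y] = \mathbb{E}[\sqrt{X}] = \Gamma(3/2) = \sqrt{\pi}/2$.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=500_000)  # exponential with lambda = 1
y = np.sqrt(x)

empirical_mean = y.mean()
theoretical_mean = np.sqrt(np.pi) / 2  # Gamma(3/2)
```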

$\textbf{Example.}$ Let $F(x)$ be the distribution function of the continuous RV $X$. Find the pdf of $Y=F(X)$.

$u(x) = F(x)$, so $w(y) = F^{-1}(y)$.

Differentiating $F(F^{-1}(y)) = y$ gives $$F'(F^{-1}(y))\,(F^{-1})'(y) = 1,$$

so $$g(y) = f(F^{-1}(y))\,(F^{-1})'(y) = \frac{f(F^{-1}(y))}{F'(F^{-1}(y))}=1, \quad y \in (0,1)$$

That is, $Y$ is uniform on $(0,1)$.
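The probability integral transform is easy to check numerically (a sketch assuming NumPy and SciPy, with $X$ standard normal): $F(X)$ should look uniform on $(0,1)$, with mean $1/2$ and variance $1/12$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
x = rng.standard_normal(500_000)
y = norm.cdf(x)  # Y = F(X) with F the standard normal CDF

mean, var = y.mean(), y.var()  # should be near 1/2 and 1/12
```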

$\textbf{Example.}$ Let $X$ be the standard normal random variable. Find the pdf of $Z=X^2$.

Here $u(x) = x^2$ is not monotone on all of $\mathbb{R}$, but by the symmetry of the standard normal we can work on $x > 0$ and double the result. There $w(z) = z^{1/2}$, so $w'(z) = \frac{1}{2}z^{-1/2}$,

so $$g(z) = 2 \times \frac{1}{2}z^{-1/2} \times \frac{1}{\sqrt{2\pi}} \exp(- z/2) = \frac{z^{-1/2}}{\sqrt{2\pi}}\exp(- z/2), \quad z > 0$$
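The derived density is exactly the $\chi^2_1$ density; comparing against SciPy's implementation (a sketch assuming SciPy is available):

```python
import numpy as np
from scipy.stats import chi2

z = np.linspace(0.1, 5.0, 50)
derived = z ** (-0.5) / np.sqrt(2 * np.pi) * np.exp(-z / 2)
reference = chi2.pdf(z, df=1)  # chi-square with 1 degree of freedom
```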

$\textbf{Example.}$ Let $\left(X_1, X_2\right)$ have joint density

$$f\left(x_1, x_2\right)= \begin{cases}e^{-\left(x_1+x_2\right)}, & \text { if } x_1, x_2>0, \\ 0, & \text { otherwise. }\end{cases}$$

Find the pdf of $Y=\frac{X_1}{X_1+X_2}$.

Let's consider $$U = \frac{X_1}{X_1+X_2}$$ and $$V = X_1 +X_2$$

then $X_1 = UV$ and $X_2 = V - UV$.

Now the Jacobian matrix will be

$$\begin{pmatrix}V & U\\ -V & 1-U\end{pmatrix}$$,

whose determinant will be $V- UV +UV = V$.

Then we will have $g(u,v)= \exp(-(uv + v - uv))\cdot v = v\exp(-v)$ for $0 < u < 1$, $v > 0$.

Now, since we are interested in $u$, we integrate out $v$ over its support $(0, \infty)$:

$$\int_0^\infty v \exp(-v)\, dv = \Big[-v\exp(-v) -\exp(-v)\Big]_0^\infty = 1$$

so the probability density function of $Y$ is $f(y) = 1$ for $y \in (0,1)$; that is, $Y$ is uniform on $(0,1)$.
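A simulation check that $Y = X_1/(X_1+X_2)$ is uniform on $(0,1)$ (a sketch assuming NumPy; here $X_1, X_2$ are i.i.d. Exp(1), matching the joint density $e^{-(x_1+x_2)}$):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000
x1 = rng.exponential(size=n)
x2 = rng.exponential(size=n)
y = x1 / (x1 + x2)

# For a uniform Y, the empirical CDF at k should be close to k
probs = {k: np.mean(y <= k) for k in (0.25, 0.5, 0.75)}
```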

$\textbf{Example.}$ Let $\left(X_1, X_2\right)$ be uniformly distributed in $(0,1)^2$. Find the pdf of $Y=X_1+X_2$

Let's consider $U = X_1+X_2$, $V = X_2$, then we will have $X_1 = U-V$ and $X_2 = V$.

So the Jacobian matrix will be

$$\begin{pmatrix} 1 & -1 \\ 0 & 1\end{pmatrix}$$

and its determinant is $1$.

So the joint pdf of $U$ and $V$ is $1$ for $0 < v < 1$ and $0 < u - v < 1$.

Now we will integrate with respect to $V$ by fixing our $U$, then we will have

If $0 < u \le 1$:

$$\int_0^u 1 \, dv = u$$

If $1 \le u < 2$:

$$\int_{u-1}^1 1\, dv = 2-u$$
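This two-piece (triangular) density can be checked by simulation (a sketch assuming NumPy); integrating the pdf gives the CDF $u^2/2$ for $u \le 1$ and $1-(2-u)^2/2$ for $u \ge 1$.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500_000
u = rng.random(n) + rng.random(n)  # X1 + X2, each uniform on (0,1)

p_low = np.mean(u <= 0.5)   # CDF at 0.5: 0.5^2 / 2 = 0.125
p_high = np.mean(u <= 1.5)  # CDF at 1.5: 1 - 0.5^2 / 2 = 0.875
```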

$\textbf{Example.}$ Let $\left(X_1, X_2, X_3\right)$ be RVs with joint pdf $f$ as follows:

$$f\left(x_1, x_2, x_3\right)= \begin{cases}e^{-\left(x_1+x_2+x_3\right)}, & \text { if } x_1, x_2, x_3>0 \\ 0, & \text { otherwise }\end{cases}$$

Suppose $Y_1=X_1+X_2+X_3, Y_2=X_2, Y_3=X_3$. Find the marginal pdf of $Y_1$.

$$X_1 = Y_1 - Y_2 - Y_3$$

$$X_2 = Y_2$$

$$X_3 = Y_3$$

$$\begin{pmatrix} 1 & -1 & -1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

So the Jacobian determinant will be $1$.

And the joint pdf of $(Y_1, Y_2, Y_3)$ will be

$\exp(-y_1)$ with constraints $0 < y_2 < y_1$, $0 < y_3 < y_1 - y_2$.

Now if we integrate with respect to $y_3$ over $(0, y_1 - y_2)$, we get the joint pdf of $(Y_1, Y_2)$:

$$\exp(-y_1)(y_1 - y_2) = \exp(-y_1)y_1 - \exp(-y_1)y_2$$

Now if we integrate with respect to $y_2$ from $0$ to $y_1$, we will have the marginal pdf

$$g(y_1) = \frac{1}{2}\exp(-y_1)y_1^2 $$
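Here $g(y_1) = \frac{1}{2}y_1^2 e^{-y_1}$ is the Gamma$(3,1)$ density, so $Y_1$ should have mean $3$ and variance $3$ (a simulation sketch assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(7)
y1 = rng.exponential(size=(500_000, 3)).sum(axis=1)  # X1 + X2 + X3, i.i.d. Exp(1)

mean, var = y1.mean(), y1.var()  # should both be near 3
```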

$\textbf{Theorem.}$ The MGF of $Y=X_1+\cdots+X_n$, where $X_1, \ldots, X_n$ are independent RVs with MGFs $M_{X_1}(t), \ldots, M_{X_n}(t)$ respectively, is

$$M_Y(t)=\prod_{i=1}^n M_{X_i}(t)$$.

Proof.

$$\mathbb{E}[\exp(Yt)] = \mathbb{E}[\exp((X_1 + \cdots +X_n)t)] = \mathbb{E}[\exp(X_1t + \cdots +X_nt)] = \mathbb{E}\left[\prod_{i=1}^n\exp(X_it)\right]$$

Due to independence, we will have

$$\mathbb{E}[\exp(Yt)] = \prod_{i=1}^n \mathbb{E}[\exp(X_it)] =\prod_{i=1}^n M_{X_i}(t) $$

$\textbf{Example.}$ Suppose $X_1, \ldots, X_n$ are independent Poisson RVs with parameters $\lambda_1, \ldots, \lambda_n$ respectively. Find the distribution of $Y=X_1+\cdots+X_n$.

Let's consider the moment generating function $\mathbb{E}[\exp(Yt)] = \prod_{i=1}^n \exp(\lambda_{i}(e^t-1)) = \exp( (e^t-1)\sum_{i=1}^n \lambda_i) $

So it's a Poisson distribution with parameter $\sum_{i=1}^n \lambda_i$.
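A quick simulation of this Poisson additivity result (a sketch assuming NumPy): the sum should again be Poisson, so its mean and variance should both equal $\sum_i \lambda_i$.

```python
import numpy as np

rng = np.random.default_rng(8)
lams = [0.5, 1.0, 2.5]
n = 500_000
y = sum(rng.poisson(lam, size=n) for lam in lams)

mean, var = y.mean(), y.var()  # both should be near sum(lams) = 4.0
```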

$\textbf{Example.}$ Suppose $X_1, \ldots, X_n$ are independent exponential RVs with the same parameter $\theta$. Find the distribution of $Y=X_1+\cdots+X_n$.

$\mathbb{E}[\exp(Xt)] = \int_0^{\infty} \exp(xt) \frac{1}{\theta}\exp(-\frac{1}{\theta}x) \, dx = \frac{1}{\theta}\int_0^{\infty} \exp(x(t-\frac{1}{\theta})) \, dx = \frac{1}{\theta} \frac{-1}{t-\frac{1}{\theta}} = \frac{1}{1- \theta t}$ for $t < \frac{1}{\theta}$

So $$\mathbb{E}[\exp(Yt)] = \left(\frac{1}{1-\theta t}\right)^n$$,

which is the MGF of a gamma distribution with shape parameter $n$ and scale parameter $\theta$.
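Simulation check (a sketch assuming NumPy): a gamma with shape $n$ and scale $\theta$ has mean $n\theta$ and variance $n\theta^2$.

```python
import numpy as np

rng = np.random.default_rng(9)
theta, n_vars, n = 2.0, 5, 200_000
y = rng.exponential(scale=theta, size=(n, n_vars)).sum(axis=1)

# should be near n_vars * theta = 10 and n_vars * theta^2 = 20
mean, var = y.mean(), y.var()
```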

In total, it took me about an hour and 30 minutes to finish these exercises. It's quite slow, but I'm getting there.

Other References:

https://people.stat.sc.edu/hitchcock/stat512spring2012.html

https://www.math.wustl.edu/~sawyer/math494s10.html