
A couple of exercises from Functions of Random Variables by Xiaojing Ye (Department of Mathematics & Statistics, Georgia State University).

\textbf{Example}. If the pdf of $X$ is

$$f(x)= \begin{cases}6 x(1-x), & \text { if } 0<x<1 \\ 0, & \text { otherwise }\end{cases}$$

Find the probability density function of $Y=X^3$.

$$\Pr[Y \le k] = \Pr[X^3 \le k] = \Pr[X \le k^{1/3}]$$

We know that $\int_{0}^{k^{1/3}} 6x(1-x)\, dx = \int_{0}^{k^{1/3}} (6x-6x^2)\, dx = 3(k^{1/3})^2 -2(k^{1/3})^3 = 3 k^{2/3} - 2k$

Differentiating with respect to $k$, the pdf of $Y$ is $g(k) = 2k^{-1/3}-2$ for $k \in(0,1)$.
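As a quick numerical sanity check (a sketch, not part of the original notes): the pdf $6x(1-x)$ is the Beta(2,2) density, so we can sample $X$ with the standard library and compare the empirical CDF of $Y=X^3$ against $3k^{2/3}-2k$:

```python
import random

random.seed(0)

# X has pdf 6x(1-x) on (0,1), i.e. a Beta(2,2) distribution,
# so the stdlib betavariate sampler applies directly.
samples = [random.betavariate(2, 2) ** 3 for _ in range(100_000)]

def cdf_y(k):
    """CDF of Y = X^3 derived above: 3k^(2/3) - 2k on (0,1)."""
    return 3 * k ** (2 / 3) - 2 * k

for k in (0.125, 0.5):
    empirical = sum(y <= k for y in samples) / len(samples)
    assert abs(empirical - cdf_y(k)) < 0.01, (k, empirical)
```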

\textbf{Example.} Let $X$ be a random variable with pdf $f(x)$ and $Y=|X|$. Show that the pdf of $Y$ is

$$g(y)= \begin{cases}f(y)+f(-y), & \text { if } y>0 \\ 0, & \text { otherwise }\end{cases}$$

Let's consider $\Pr[Y \le k] = \Pr[|X| \le k] = \Pr[-k \le X \le k] = \int_{-k}^k f(y)\, dy$

Then if we take the derivative with respect to $k$ (for $k > 0$), we will have $\frac{d}{dk}\Pr[Y \le k] = \frac{d}{dk}\int_{-k}^k f(y)\, dy = f(k) -(-f(-k)) = f(k) +f(-k)$

\textbf{Example.} Use the previous result to find the pdf of $Y=|X|$ where $X$ is the standard normal RV.

$$X: f(x) = \frac{1}{\sqrt{2 \pi}} \exp\left(-\frac{x^2}{2}\right)$$

So $Y: g(y) = \frac{\sqrt{2}}{\sqrt{\pi}} \exp\left(-\frac{y^2}{2}\right)$ for $y > 0$.
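This is the half-normal density, and we can check it numerically (a sketch using the stdlib normal CDF): the CDF of $Y=|X|$ should be $2\Phi(y)-1$.

```python
import random
from statistics import NormalDist

random.seed(1)

# Y = |X| for standard normal X; the derived pdf
# sqrt(2/pi) * exp(-y^2/2) integrates to the CDF 2*Phi(y) - 1.
phi = NormalDist().cdf
samples = [abs(random.gauss(0, 1)) for _ in range(100_000)]

for y in (0.5, 1.0, 2.0):
    empirical = sum(s <= y for s in samples) / len(samples)
    assert abs(empirical - (2 * phi(y) - 1)) < 0.01, (y, empirical)
```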

\textbf{Example.} Suppose the joint pdf of $(X_1, X_2)$ is

$$f\left(x_1, x_2\right)= \begin{cases}6 e^{-3 x_1-2 x_2}, & \text { if } x_1, x_2>0 \\ 0, & \text { otherwise. }\end{cases}$$

Find the pdf of $Y=X_1+X_2$.

Let's consider (writing $x$ for $x_1$ and $y$ for $x_2$) $\Pr[Y \le k] = \Pr[X_1 +X_2 \le k] = \int_0^{k} \int_{0}^{k-y} 6 \exp(-3x -2y)\, dx \,dy$

$$= 6\int_0^k \exp(-2y) \int_{0}^{k-y}\exp(-3x)\, dx \, dy$$

$$= -2\int_0^k \exp(-2y)\, \exp(-3x) \Big|_{0}^{k-y} \, dy$$

$$= -2\int_0^k \exp(-2y) (\exp(-3(k-y)) - 1) \, dy$$

$$=-2 \int_0^k (\exp(-3k+y) - \exp(-2y)) \, dy$$

$$= -2 \left(\exp(-3k + y) + \frac{1}{2}\exp(-2y)\right) \Big|_0^k$$

$$= -2 \left(\exp(-2k) + \frac{1}{2}\exp(-2k) - \exp(-3k) - \frac{1}{2} \right)$$

$$= -3\exp(-2k) +2\exp(-3k) +1$$

And if we take the derivative with respect to $k$, we will have $g(k) = 6 \exp(-2k) -6\exp(-3k)$ for $k > 0$.
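The joint pdf factors as $(3e^{-3x_1})(2e^{-2x_2})$, i.e. independent exponentials with rates 3 and 2, so the CDF $1 - 3e^{-2k} + 2e^{-3k}$ can be sanity-checked by simulation (a sketch; `expovariate` takes the rate directly):

```python
import math
import random

random.seed(2)

# X1 ~ Exp(rate 3), X2 ~ Exp(rate 2), independent.
samples = [random.expovariate(3) + random.expovariate(2) for _ in range(100_000)]

def cdf_y(k):
    """CDF derived above: 1 - 3e^{-2k} + 2e^{-3k} for k > 0."""
    return 1 - 3 * math.exp(-2 * k) + 2 * math.exp(-3 * k)

for k in (0.5, 1.0, 2.0):
    empirical = sum(s <= k for s in samples) / len(samples)
    assert abs(empirical - cdf_y(k)) < 0.01, (k, empirical)
```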

Now, let's look into some discrete distributions.

\textbf{Example.} Let $X$ be the number of heads in 4 tosses of a fair coin. Then the pmf $f$ of $X$ is

| $x$ | $f(x)$ |
| --- | --- |
| 0 | 0.0625 |
| 1 | 0.25 |
| 2 | 0.375 |
| 3 | 0.25 |
| 4 | 0.0625 |

Find the pmf $g$ of $Y=\frac{1}{1+X}$.

Since $x \mapsto \frac{1}{1+x}$ is one-to-one on $[0, \infty)$, each value of $X$ maps to a distinct value of $Y$, so we will have

$$y = 1, \frac{1}{2}, \frac{1}{3}, \frac{1}{4}, \frac{1}{5}$$

and

$$g(y) = \frac{1}{16}, \frac{1}{4}, \frac{3}{8}, \frac{1}{4}, \frac{1}{16}$$

\textbf{Example.} Now, let's compute the pmf of $Z = (X-2)^2$.

$$z = 0, 1, 4$$

$$f_{Z}(z) = \frac{3}{8}, \frac{1}{2}, \frac{1}{8}$$

(Here distinct values of $x$ can map to the same $z$, e.g. $x=1$ and $x=3$ both give $z=1$, so their probabilities add.)
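Both discrete transforms can be computed mechanically: push each pmf value through the function and accumulate probabilities that land on the same output. A sketch in Python (`transform_pmf` is a hypothetical helper name, with exact arithmetic via `Fraction`):

```python
from collections import defaultdict
from fractions import Fraction
from math import comb

# pmf of X = number of heads in 4 fair tosses: C(4,x) / 16.
f = {x: Fraction(comb(4, x), 16) for x in range(5)}

def transform_pmf(f, u):
    """pmf of u(X): probabilities of x-values mapping to the same y add up."""
    g = defaultdict(Fraction)
    for x, p in f.items():
        g[u(x)] += p
    return dict(g)

g = transform_pmf(f, lambda x: Fraction(1, 1 + x))   # Y = 1/(1+X)
h = transform_pmf(f, lambda x: (x - 2) ** 2)         # Z = (X-2)^2

assert g[Fraction(1, 3)] == Fraction(3, 8)
assert h == {0: Fraction(3, 8), 1: Fraction(1, 2), 4: Fraction(1, 8)}
```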

\textbf{Theorem.} Let $X$ be continuous with pdf $f(x)$ and $Y=u(X)$ where $u$ is strictly monotone on $\operatorname{supp}(f)$; then the pdf $g(y)$ of $Y$ is

$$g(y)=f(w(y))\left|w^{\prime}(y)\right|$$

where $w$ is the inverse of $u$ (i.e., $y=u(x)$ iff $x=w(y)$).

Proof.

Assume first that $u$ is monotonically increasing, so that $u(X) \le y$ iff $X \le w(y)$.

$$\Pr[Y \le y] = \Pr[u(X) \le y] = \Pr[X \le w(y)] = F(w(y))$$

If we take the derivative of both sides with respect to $y$, we will have

$$g(y) = f(w(y))\, w'(y) = f(w(y))\, |w'(y)|,$$

since $w'(y) > 0$ for increasing $u$.

For strictly decreasing $u$, we instead have $\Pr[Y \le y] = \Pr[X \ge w(y)] = 1 - F(w(y))$, whose derivative is $-f(w(y))\,w'(y) = f(w(y))\,|w'(y)|$ since $w'(y) < 0$, so the same formula holds.

\textbf{Example.} Let $X$ be an exponential RV with rate $\lambda$. Find the pdf of $Y = \sqrt{X}$.

Here $Y = u(X)$ with $u(x) = \sqrt{x}$, so $X = w(Y)$ with $w(y) = y^2$ and $w'(y) = 2y$.

So we have $g(y) = f(w(y))\,|w'(y)| = \lambda e^{-\lambda y^2} \cdot 2y = 2\lambda y e^{-\lambda y^2}$ for $y > 0$.
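A quick simulation check of this density (a sketch; the rate value is illustrative): integrating $2\lambda y e^{-\lambda y^2}$ gives the CDF $1 - e^{-\lambda y^2}$, which should match the empirical CDF of $\sqrt{X}$.

```python
import math
import random

random.seed(3)
lam = 2.0  # illustrative rate parameter

# Y = sqrt(X) for X ~ Exp(rate lam).
samples = [math.sqrt(random.expovariate(lam)) for _ in range(100_000)]

def cdf_y(y):
    """P(Y <= y) = P(X <= y^2) = 1 - e^{-lam * y^2}."""
    return 1 - math.exp(-lam * y * y)

for y in (0.3, 0.6, 1.0):
    empirical = sum(s <= y for s in samples) / len(samples)
    assert abs(empirical - cdf_y(y)) < 0.01, (y, empirical)
```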

\textbf{Example.} Suppose $F(x)$ is the distribution function of the continuous RV $X$. Find the pdf of $Y=F(X)$.

$u(x) = F(x)$, then $w(x) = F^{-1}(x)$

Since $F(F^{-1}(x)) = x$, differentiating gives $F'(F^{-1}(x))\,(F^{-1})'(x) = 1$, so $(F^{-1})'(x) = \frac{1}{f(F^{-1}(x))}$,

so $g(x) = f(F^{-1}(x)) \cdot \frac{1}{f(F^{-1}(x))} = 1$ for $x \in (0,1)$, i.e. $Y$ is uniform on $(0,1)$.
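This is the probability integral transform, and it is easy to see in simulation (a sketch): take $X$ exponential with rate 1, whose CDF is $F(x) = 1 - e^{-x}$, and check that $F(X)$ looks uniform.

```python
import math
import random

random.seed(4)

# Y = F(X) = 1 - e^{-X} for X ~ Exp(1) should be uniform on (0, 1).
samples = [1 - math.exp(-random.expovariate(1)) for _ in range(100_000)]

for y in (0.25, 0.5, 0.75):
    empirical = sum(s <= y for s in samples) / len(samples)
    assert abs(empirical - y) < 0.01, (y, empirical)
```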

\textbf{Example.} Let $X$ be the standard normal random variable. Find the pdf of $Z=X^2$.

$u(x) = x^2$ is not monotone on all of $\mathbb{R}$, but since the standard normal is symmetric we can restrict to $x > 0$ and double the result; there $w(x) = x^{1/2}$, so we will have $w'(x) = \frac{1}{2}x^{-1/2}$,

so $g(x) = 2 \times \frac{1}{2}x^{-1/2} \times \frac{1}{\sqrt{2\pi}} \exp(- x/2) = \frac{x^{-1/2}}{\sqrt{2\pi}}\exp(- x/2)$ for $x > 0$ (the $\chi^2_1$ density).

\textbf{Example.} Let $(X_1, X_2)$ have joint density

$$f\left(x_1, x_2\right)= \begin{cases}e^{-\left(x_1+x_2\right)}, & \text { if } x_1, x_2>0, \\ 0, & \text { otherwise. }\end{cases}$$

Find the pdf of $Y=\frac{X_1}{X_1+X_2}$.

Let's consider $U = \frac{X_1}{X_1+X_2}$ and $V = X_1 +X_2$,

then $X_1 = UV$ and $X_2 = V - UV$.

Now the Jacobian matrix will be

$$\begin{pmatrix}V & U\\ -V & 1-U\end{pmatrix},$$

whose determinant will be $V - UV + UV = V$.

Then we will have $g(u,v)= \exp(-uv - v +uv)\,v = \exp(-v)\,v$ for $0 < u < 1$ and $v > 0$.

Now since we are interested in $u$, we integrate $v\exp(-v)$ over the support of $v$, which is $(0, \infty)$:

$$\int v \exp(-v)\, dv = -v\exp(-v) +\int \exp(-v)\, dv = -v \exp(-v) -\exp(-v),$$

which evaluates to $1$ over $(0, \infty)$,

so the probability density function of $Y$ is $f(y) = 1$ for $y \in (0,1)$: $Y$ is uniform.
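The uniformity of $X_1/(X_1+X_2)$ for i.i.d. exponentials is easy to confirm by simulation (a sketch):

```python
import random

random.seed(5)

def ratio():
    # X1, X2 i.i.d. Exp(1); Y = X1 / (X1 + X2).
    x1, x2 = random.expovariate(1), random.expovariate(1)
    return x1 / (x1 + x2)

samples = [ratio() for _ in range(100_000)]

# Y should be uniform on (0, 1), so P(Y <= u) = u.
for u in (0.2, 0.5, 0.8):
    empirical = sum(s <= u for s in samples) / len(samples)
    assert abs(empirical - u) < 0.01, (u, empirical)
```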

\textbf{Example.} Let $(X_1, X_2)$ be uniformly distributed in $(0,1)^2$. Find the pdf of $Y=X_1+X_2$.

Let's consider $U = X_1+X_2$, $V = X_2$; then we will have $X_1 = U-V$ and $X_2 = V$.

So the Jacobian matrix will be

$$\begin{pmatrix} 1 & -1 \\ 0 & 1\end{pmatrix}$$

and its determinant is $1$.

So the joint pdf of $U$ and $V$ is $1$ for $0 < v < 1$ and $0 < u-v < 1$.

Now we integrate with respect to $v$ for fixed $u$:

If $0 < u \le 1$:

$$\int_0^u 1 \, dv = u$$

If $1 \le u < 2$:

$$\int_{u-1}^1 1\, dv = 2-u$$

So $Y$ has the triangular density $g(u) = u$ on $(0,1]$ and $g(u) = 2-u$ on $(1,2)$.
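Integrating this piecewise density gives the CDF $u^2/2$ for $u \le 1$ and $1 - (2-u)^2/2$ for $u \ge 1$, which a quick simulation confirms (a sketch):

```python
import random

random.seed(6)

# Sum of two independent Uniform(0,1) variables.
samples = [random.random() + random.random() for _ in range(100_000)]

def cdf_y(u):
    """Triangular CDF obtained by integrating the piecewise density."""
    return u * u / 2 if u <= 1 else 1 - (2 - u) ** 2 / 2

for u in (0.5, 1.0, 1.5):
    empirical = sum(s <= u for s in samples) / len(samples)
    assert abs(empirical - cdf_y(u)) < 0.01, (u, empirical)
```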

\textbf{Example.} Let $(X_1, X_2, X_3)$ be RVs with joint pdf $f$ as follows:

$$f\left(x_1, x_2, x_3\right)= \begin{cases}e^{-\left(x_1+x_2+x_3\right)}, & \text { if } x_1, x_2, x_3>0 \\ 0, & \text { otherwise }\end{cases}$$

Suppose $Y_1=X_1+X_2+X_3$, $Y_2=X_2$, $Y_3=X_3$. Find the marginal pdf of $Y_1$.

$$X_1 = Y_1 - Y_2 - Y_3, \quad X_2 = Y_2, \quad X_3 = Y_3$$

The Jacobian matrix is

$$\begin{pmatrix} 1 & -1 & -1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

So the Jacobian determinant will be $1$.

And the pdf of $(Y_1, Y_2, Y_3)$ will be

$\exp(-y_1)$ with constraints $0 < y_2 < y_1$, $0< y_3 < y_1 - y_2$.

Now if we integrate out $y_3$ over $(0, y_1 - y_2)$, we get the joint pdf of $(Y_1, Y_2)$:

$$\exp(-y_1)(y_1 - y_2) = \exp(-y_1)y_1 - \exp(-y_1)y_2$$

Now if we integrate out $y_2$ over $(0, y_1)$, we will have the marginal pdf

$$g(y_1) = \frac{1}{2}y_1^2\exp(-y_1) \quad \text{for } y_1 > 0,$$

which is the Gamma(3, 1) density.
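Integrating this density gives the Gamma(3, 1) CDF $1 - e^{-y}(1 + y + y^2/2)$, which can be checked against a simulated sum of three Exp(1) variables (a sketch):

```python
import math
import random

random.seed(7)

# Y1 = X1 + X2 + X3 with X_i i.i.d. Exp(1).
samples = [sum(random.expovariate(1) for _ in range(3)) for _ in range(100_000)]

def cdf_y1(y):
    """Gamma(shape 3, scale 1) CDF: 1 - e^{-y}(1 + y + y^2/2)."""
    return 1 - math.exp(-y) * (1 + y + y * y / 2)

for y in (1.0, 3.0, 5.0):
    empirical = sum(s <= y for s in samples) / len(samples)
    assert abs(empirical - cdf_y1(y)) < 0.01, (y, empirical)
```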

\textbf{Theorem.} The mgf of $Y=X_1+\cdots+X_n$, where $X_1, \ldots, X_n$ are independent RVs with MGFs $M_{X_1}(t), \ldots, M_{X_n}(t)$ respectively, is

$$M_Y(t)=\prod_{i=1}^n M_{X_i}(t).$$

Proof.

$$\mathbb{E}[\exp(Yt)] =\mathbb{E}[\exp((X_1 + \cdots +X_n)t)] = \mathbb{E}[\exp(X_1t + \cdots +X_nt)] = \mathbb{E}\Big[\prod_{i=1}^n\exp(X_it)\Big]$$

Due to independence, we will have

$$\mathbb{E}[\exp(Yt)] = \prod_{i=1}^n \mathbb{E}[\exp(X_it)] =\prod_{i=1}^n M_{X_i}(t)$$

\textbf{Example.} Suppose $X_1, \ldots, X_n$ are independent Poisson RVs with parameters $\lambda_1, \ldots, \lambda_n$ respectively. Find the distribution of $Y=X_1+\cdots+X_n$.

Let's consider the moment generating function $\mathbb{E}[\exp(Yt)] = \prod_{i=1}^n \exp(\lambda_{i}(e^t-1)) = \exp\big( (e^t-1)\sum_{i=1}^n \lambda_i\big)$.

So it's a Poisson distribution with parameter $\sum_{i=1}^n \lambda_i$.
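This can also be verified directly, without MGFs, by convolving two Poisson pmfs and comparing with a single Poisson pmf (a sketch; the parameter values are illustrative, and the comparison is exact up to floating-point error):

```python
import math

def poisson_pmf(lam, k):
    """Poisson pmf: e^{-lam} lam^k / k!."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2 = 1.5, 2.5  # illustrative parameters

def conv_pmf(k):
    """pmf of X1 + X2 by direct convolution of the two Poisson pmfs."""
    return sum(poisson_pmf(lam1, j) * poisson_pmf(lam2, k - j) for j in range(k + 1))

# The convolution should match Poisson(lam1 + lam2).
for k in range(10):
    assert abs(conv_pmf(k) - poisson_pmf(lam1 + lam2, k)) < 1e-12
```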

\textbf{Example.} Suppose $X_1, \ldots, X_n$ are independent exponential RVs with the same parameter $\theta$. Find the distribution of $Y=X_1+\cdots+X_n$.

$$\mathbb{E}[\exp(Xt)] = \int_0^{\infty} \exp(xt)\, \frac{1}{\theta}\exp\Big(-\frac{x}{\theta}\Big) \, dx = \frac{1}{\theta}\int_0^{\infty} \exp\Big(x\Big(t-\frac{1}{\theta}\Big)\Big) \, dx = \frac{1}{\theta} \cdot \frac{-1}{t-\frac{1}{\theta}} = \frac{1}{1- \theta t}$$

for $t < \frac{1}{\theta}$.

So $\mathbb{E}[\exp(Yt)] = \left(\frac{1}{1-\theta t}\right)^n$,

which is the MGF of a Gamma distribution with shape $n$ and scale $\theta$.
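The MGF formula itself can be estimated by simulation (a sketch; $n$, $\theta$, and $t$ are illustrative values with $t < 1/\theta$, so the expectation is finite):

```python
import math
import random

random.seed(8)

n, theta, t = 3, 0.5, 0.5  # illustrative values with t < 1/theta

# Y = sum of n independent exponentials with mean theta
# (expovariate takes the rate, which is 1/theta).
samples = [sum(random.expovariate(1 / theta) for _ in range(n))
           for _ in range(100_000)]

# Compare the empirical E[e^{tY}] with (1 - theta*t)^{-n}.
empirical_mgf = sum(math.exp(t * y) for y in samples) / len(samples)
assert abs(empirical_mgf - (1 - theta * t) ** -n) < 0.05
```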

In total, it took me about an hour and 30 minutes to finish these exercises. It's quite slow, but I'm slowly getting there.

Other References:

https://people.stat.sc.edu/hitchcock/stat512spring2012.html

https://www.math.wustl.edu/~sawyer/math494s10.html