
A compound distribution $S$ of a primary distribution with pgf $P_N(z)$ and secondary distribution with pgf $P_M(z)$ has the following probability generating function:

$$P_S(z)=P_N(P_M(z))$$

Recall the relation between the moment generating function and the pgf,

$$M_X(t) = P_X(e^t)$$

for any distribution $X$, so we will have

$$M_S(z) = P_S(e^z) = P_N(P_M(e^z)) = P_N(M_M(z)).$$

The motivation for the compound distribution is the following: suppose we have a sequence of i.i.d. random variables $A_1, A_2, \ldots$, where the number of terms $N$ is itself a random variable independent of the $A_i$. Then the pgf of $A_1+A_2 +...+A_N$ is

$$\sum_{i=0}^{\infty}\Pr[A_1 + A_2 +... +A_N = i ]z^i$$

by conditioning over $N$, we will have

$$\sum_{i=0}^{\infty}\sum_{n=0}^{\infty}\Pr[N=n]\Pr[A_1 + A_2 +... +A_n = i \mid N= n]z^i = \sum_{n=0}^\infty \Pr[N=n]\sum_{i=0}^{\infty}\Pr[A_1 + A_2 +... +A_n = i ]z^i \\= \sum_{n=0}^\infty \Pr[N=n] \left(P_A(z)\right)^n =P_N(P_A(z)),$$

where the conditioning can be dropped because the $A_i$ are independent of $N$.
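This composition is easy to verify by simulation. A minimal sketch (with assumed illustrative parameters: a Poisson primary and Bernoulli summands, both choices mine) comparing the empirical pgf of the random sum against $P_N(P_A(z))$:

```python
import math
import random

random.seed(42)

# Assumed illustrative parameters: Poisson(lam) primary, Bernoulli(q) summands.
lam, q = 2.0, 0.3

def pgf_poisson(z):
    # P_N(z) = exp(lam (z - 1))
    return math.exp(lam * (z - 1.0))

def pgf_bernoulli(z):
    # P_A(z) = 1 - q + q z
    return 1.0 - q + q * z

def sample_poisson():
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

def empirical_pgf(z, trials=200_000):
    # Monte Carlo estimate of E[z^S] for the random sum S = A_1 + ... + A_N.
    total = 0.0
    for _ in range(trials):
        n = sample_poisson()
        s = sum(1 for _ in range(n) if random.random() < q)
        total += z ** s
    return total / trials

z = 0.5
print(empirical_pgf(z), pgf_poisson(pgf_bernoulli(z)))  # the two should be close
```

With these parameters the exact value at $z=0.5$ is $e^{-0.3}$, and the Monte Carlo estimate should agree to two or three decimal places.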

$\textbf{Example } $ Demonstrate that any zero-modified distribution is a compound distribution.

Recall that $p_k^M = cp_k^0$ for $k \ge 1$, where $p_k^M$ is the probability mass of the zero-modified distribution and $p_k^0$ is the probability mass of the original (unmodified) distribution.

Then we will have the following relation for the pgf:

$$P^M(z) = p_0^M + c\sum_{k=1}^{\infty} p^0_kz^k=p_0^M + c(P(z) - p_0^0)$$

If we set $z$ to be $1$, then we will have $$1 = p_0^M + c(1-p_0^0) \implies c =\frac{1-p_0^M}{1-p_0^0}$$

and so $$p_0^M-\frac{1-p_0^M}{1-p_0^0} \cdot p_0^0 = \frac{p_0^M-p_0^Mp_0^0 -p_0^0+p_0^Mp_0^0}{1-p_0^0}=1-\frac{1-p_0^M}{1-p_0^0},$$ and so we will have a Bernoulli distribution with parameter $\frac{1-p_0^M}{1-p_0^0}$ as the primary distribution and the original (unmodified) distribution as the secondary distribution.
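The identity can be checked numerically. A quick sketch, with an assumed Poisson base distribution and an arbitrary choice of the modified zero mass $p_0^M$ (both values are mine, for illustration only):

```python
import math

# Assumed illustration: base distribution Poisson(1.5), modified zero mass 0.4.
lam, p0M = 1.5, 0.4
p0 = math.exp(-lam)                  # original mass at zero

def P(z):                            # original pgf
    return math.exp(lam * (z - 1.0))

c = (1.0 - p0M) / (1.0 - p0)         # Bernoulli parameter derived above

def P_modified(z):                   # zero-modified pgf: p0M + c (P(z) - p0)
    return p0M + c * (P(z) - p0)

def P_compound(z):                   # Bernoulli(c) primary composed with the original
    return (1.0 - c) + c * P(z)

for z in (0.0, 0.3, 0.7, 1.0):
    assert abs(P_modified(z) - P_compound(z)) < 1e-12
print("zero-modified pgf matches Bernoulli compound")
```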

$\textbf{Example } $

Consider the case where the primary distribution $N$ and the secondary distribution $M$ both have Poisson distributions, with parameters $\lambda_1$ and $\lambda_2$ respectively. Determine the pgf of the compound distribution.

We know that Poisson distribution $X$ with parameter $\lambda$ has the following probability generating function:

$$P_{X}(z) = \exp(\lambda(z-1))$$

And so by the definition of compound distribution, we will have

$$P_{S}(z) = P_N(P_M(z)) = P_N(\exp(\lambda_2(z-1))) = \exp(\lambda_1(e^{\lambda_2(z-1)}-1)).$$

Back to our previous discussion, the probability of $A_1 +A_2 +...+A_n =k$ can be written as

$$\Pr[A_1+A_2 +...+A_N = k] = \sum_{n=0}^{\infty}\Pr[A_1+A_2+...+A_n= k | N=n]\Pr[N=n]$$

And so we can express the probability as a series of convolutions, where we define $\Pr[N=n] = p_n$ and $\Pr[A_1 =k] = q_k$, with $q_k^{*n}$ denoting the $n$-fold convolution of $q$ evaluated at $k$.

Then we have

$$\Pr[A_1+A_2 +...+A_N = k] = \sum_{n=0}^{\infty} p_n q_k^{*n}.$$
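This convolution series can be computed directly by truncating the sum over $n$. A sketch with an assumed Poisson primary and an assumed small secondary pmf (both choices are illustrative):

```python
import math

lam = 2.0                                   # assumed Poisson primary
kmax = 10                                   # truncation point in k
p = [math.exp(-lam) * lam ** n / math.factorial(n) for n in range(40)]  # p_n
q = [0.2, 0.5, 0.3]                         # assumed secondary pmf on {0, 1, 2}

def convolve(f, g, kmax):
    # (f * g)(k) = sum_i f(i) g(k - i), truncated at kmax
    return [sum(f[i] * g[k - i]
                for i in range(k + 1) if i < len(f) and k - i < len(g))
            for k in range(kmax + 1)]

compound = [0.0] * (kmax + 1)
conv = [1.0] + [0.0] * kmax                 # q^{*0}: a point mass at 0
for pn in p:
    # accumulate p_n * q^{*n}(k), then advance to q^{*(n+1)}
    compound = [c + pn * qk for c, qk in zip(compound, conv)]
    conv = convolve(conv, q, kmax)

print(compound[:4])
```

For a light-tailed primary like the Poisson, truncating at 40 terms of $p_n$ is far more than enough; as a check, `compound[0]` should equal $P_N(q_0) = e^{\lambda(q_0-1)}$.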

If the primary distribution $N$ is of the $(a,b,0)$ class, then we will have a recursive formula for the compound distribution. We know the $(a,b,0)$ class has the following recursive formula:

$$k \Pr[N=k] = \left(ak+b\right)\Pr[N=k-1].$$

We want to compute $P_S(z) = P_N(P_M(z)) = \sum_{k=0}^{\infty}\Pr[N=k](P_M(z))^k$. If we multiply both sides of the recursive formula by $(P_M(z))^k$ and sum, we will have

$$\sum_{k=1}^{\infty} k\Pr[N=k](P_M(z))^k = \sum_{k=1}^{\infty}(ak+b)\Pr[N=k-1](P_M(z))^k,$$

but the factor of $k$ lives inside the sums, so neither side reduces to an expression in $P_S(z)$ and we cannot conclude anything from this alone.

However, if we instead differentiate $P_S(z)$, we will have $P'_S(z) = P_N'(P_M(z)) P_M'(z) = \sum_{k=1}^{\infty}k \cdot \Pr[N=k] (P_M(z))^{k-1} P_M'(z)$ and so we will have

$$\sum_{k=1}^{\infty }k \Pr[N=k] (P_M(z))^{k-1} P_M'(z) =\sum_{k=1}^{\infty }(ak+b)\Pr[N=k-1](P_M(z))^{k-1} P_M'(z)\\ \implies P_S'(z) = \sum_{k=1}^{\infty} (ak-a+a+b)\Pr[N=k-1](P_M(z))^{k-1} P_M'(z) \\ \implies P_S'(z) = (P_M(z))\sum_{k=1}^{\infty}a(k-1) \Pr[N=k-1](P_M(z))^{k-2} P_M'(z) + (a+b) \sum_{k=1}^{\infty}\Pr[N=k-1](P_M(z))^{k-1} P_M'(z) \\ \implies P_S'(z) = (P_M(z))\sum_{k=0}^{\infty}ak \Pr[N=k](P_M(z))^{k-1} P_M'(z) + (a+b) P_M'(z) P_N(P_M(z))\\ \implies P'_S(z) = aP_M(z) P_S'(z) + (a+b) P_M'(z) P_S(z)$$

And so we will also have $$P_S(z) = \sum_{k=0}^{\infty} f(k) z^k \implies P_S'(z) = \sum_{k=1}^{\infty} k f(k) z^{k-1}$$

and we also have $$P_M(z) = \sum_{k=0}^{\infty}g(k) z^k \implies P_M'(z)= \sum_{k=1}^{\infty} k g(k) z^{k-1} $$

and so

$$\sum_{k=1}^{\infty} k f(k) z^{k-1}= a\left(\sum_{k=1}^{\infty} k f(k) z^{k-1}\right)\left(\sum_{k=0}^{\infty}g(k) z^k \right) + (a+b) \left(\sum_{k=1}^{\infty} k g(k) z^{k-1}\right)\left(\sum_{k=0}^{\infty} f(k) z^k \right)$$

and so, comparing coefficients of $z^{k-1}$, we have $kf(k)$ on the left-hand side and $a\sum_{i=0}^k i f(i)g(k-i) +(a+b)\sum_{i=0}^k i g(i) f(k-i)$ on the right,

and so the right hand side is the following:

$$a\sum_{i=0}^{k} i f(i)g(k-i) + (a+b) \sum_{i=0}^k (k-i)g(k-i)f(i)=f(k)k \\ \implies a \left(\sum_{i=0}^{k-1} i f(i)g(k-i)+kf(k)g(0)\right)+(a+b)\sum_{i=0}^{k-1} (k-i)g(k-i)f(i) = f(k)k \\ \implies (1-ag(0))kf(k) = a \sum_{i=0}^{k-1} i f(i) g(k-i) + (a+b) \sum_{i=0}^{k-1}(k-i)g(k-i)f(i) \\ \implies kf(k) = \frac{1}{1-ag(0)}\sum_{i=0}^{k-1} (ai +(a+b) (k-i)) g(k-i)f(i) \\=\frac{1}{1-ag(0)}\sum_{i=0}^{k-1} (ak +b(k -i)) g(k-i)f(i)$$

and so we will have

$$f(k) = \frac{1}{1-ag(0)}\sum_{i=0}^{k-1} \left(a +\frac{b(k -i)}{k}\right) g(k-i)f(i).$$
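This recursion is known as Panjer's recursion. A minimal implementation sketch, assuming the secondary pmf $g$ is given as a list and the caller supplies the starting value $f(0) = P_N(g(0))$:

```python
import math

def panjer_ab0(a, b, g, f0, kmax):
    """Compound pmf f(0..kmax) for an (a,b,0) primary and secondary pmf g."""
    f = [f0]
    for k in range(1, kmax + 1):
        s = sum((a + b * (k - i) / k) * g[k - i] * f[i]
                for i in range(k) if k - i < len(g))
        f.append(s / (1.0 - a * g[0]))
    return f

# Example: Poisson(2) primary (a = 0, b = lam) with a secondary pmf on {0, 1, 2};
# the starting value is f(0) = P_N(g(0)) = exp(lam (g(0) - 1)).
lam = 2.0
g = [0.2, 0.5, 0.3]
f = panjer_ab0(0.0, lam, g, math.exp(lam * (g[0] - 1.0)), 10)
print(f[:4])
```

The recursion costs $O(k_{\max}^2)$ at worst (linear in $k_{\max}$ when the secondary has finite support), versus the much more expensive direct convolution sum.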

For $(a,b,1)$ class, we will have

$$k \Pr[N=k] = \left(ak+b\right)\Pr[N=k-1]$$

for $k \ge 2$ and so we will have the following

$$\sum_{k=2}^{\infty} k \Pr[N=k] P_M(z)^{k-1} P_M'(z) = \sum_{k=2}^{\infty} (ak-a+a+b)(P_M(z))^{k-1} P_M'(z)\Pr[N=k-1] \\ \implies P_S'(z) -\Pr[N=1] P_M'(z) = a P_M(z) P_S'(z) +(a+b) P_M'(z) (P_S(z)-\Pr[N=0])$$

If we let $P_S(z) = \sum_{k=0}^{\infty} f(k) z^k$ and $P_M(z) = \sum_{k=0}^{\infty}g(k) z^k$, then we will have

$$f(k) k - p_1 kg(k)= a \left(\sum_{i=0}^k i f(i)g(k-i)\right)+(a+b)\sum_{i=0}^{k-1}(k-i)f(i)g(k-i)- (a+b) p_0 kg(k)$$

$$f(k)k = (p_1k -p_0(a+b)k)g(k) + \sum_{i=0}^{k-1} (ai + (a+b)(k-i))f(i)g(k-i)+akf(k)g(0) \\ \implies f(k) = \frac{(p_1 -p_0(a+b))g(k) + \sum_{i=0}^{k-1} ( a+ b\frac{k -i}{k} )f(i)g(k-i)}{1-ag(0)}$$

So

$$ f(k) = \frac{(p_1 -p_0(a+b))g(k) + \sum_{i=1}^{k} ( a+ b\frac{i}{k} )f(k-i)g(i)}{1-ag(0)}$$
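A sketch of the $(a,b,1)$ version. As a consistency check, an $(a,b,0)$ member such as the Poisson satisfies $p_1 = (a+b)p_0$, so the extra $g(k)$ term vanishes and the recursion reduces to the $(a,b,0)$ one (the parameter values are assumed, for illustration):

```python
import math

def panjer_ab1(a, b, p0, p1, g, f0, kmax):
    """Compound pmf for an (a,b,1) primary: p0 = Pr[N=0] and p1 = Pr[N=1]
    are free parameters; the (a,b) recursion only binds for k >= 2."""
    f = [f0]
    for k in range(1, kmax + 1):
        s = (p1 - (a + b) * p0) * (g[k] if k < len(g) else 0.0)
        s += sum((a + b * i / k) * f[k - i] * g[i]
                 for i in range(1, k + 1) if i < len(g))
        f.append(s / (1.0 - a * g[0]))
    return f

# Consistency check: for a Poisson(2) primary, p1 = (a + b) p0 exactly, so the
# extra g(k) term vanishes and we recover the (a,b,0) recursion's output.
lam = 2.0
g = [0.2, 0.5, 0.3]
p0 = math.exp(-lam)
f = panjer_ab1(0.0, lam, p0, lam * p0, g, math.exp(lam * (g[0] - 1.0)), 10)
print(f[:4])
```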

Example. Develop the recursive formula for the case where the primary distribution is Poisson.

Recall that $$ \begin{equation} \begin{array}{|c|c|c|c|} \hline \text { Distribution } & a & b & p_0 \\ \hline \text { Poisson } & 0 & \lambda & \exp(-\lambda) \\ \hline \text { Negative Binomial } & \frac{\beta}{1+\beta} & \frac{\beta}{1+\beta}(r-1) & (1+\beta)^{-r} \\ \hline \text { Binomial } & -\frac{q}{1-q} & (n+1)\frac{q}{1-q} & (1-q)^n \\ \hline \text { Geometric } & \frac{\beta}{1+\beta} & 0 & (1+\beta)^{-1} \\ \hline \end{array} \end{equation}$$

So we will have

$$ f(k) = \sum_{i=1}^{k} \frac{ \lambda i}{k} f(k-i)g(i)$$

With starting value

$f(0) = P_N(P_M(0)) = \exp(\lambda (P_M(0) -1))$.

Example. Calculate the probabilities for the Poisson-ETNB distribution, where $\lambda=1$ for the Poisson distribution and the ETNB distribution has $r=-0.5$ and $\beta=1$.

From the table above, the ETNB secondary has $a = \frac{\beta}{1+\beta} = \frac{1}{2}$ and $b = \frac{\beta}{1+\beta}(r-1) = -\frac{3}{4}$, and with $\lambda = 1$ we have

$$ f(k) = \sum_{i=1}^{k} \frac{ i}{k} f(k-i)g(i)$$

$$g(k) =\left(\frac{1}{2} - \frac{3}{4k}\right) g(k-1)$$

The original (untruncated) "distribution", which is improper since $r < 0$, has the following values:

$$g_0(0) = (1+1)^{0.5} = \sqrt{2}$$

$$g_0(1) = \left(\frac{1}{2} - \frac{1.5}{2}\right) \cdot (1+1)^{0.5} = -\frac{1}{2 \sqrt 2}$$

$$g_0(2) = -(\frac{1}{2} - \frac{3}{8})\frac{1}{2 \sqrt 2}=-\frac{\sqrt 2}{32}$$

So in general,

$$g_0(k) = {k -1.5 \choose k} \left(\frac{1}{2}\right)^{k-0.5} = \frac{\Gamma(k-1.5+1)}{\Gamma(k+1) \Gamma(-0.5)}\left(\frac{1}{2}\right)^{k-0.5} $$

Notice that $g_0(2) = \frac{\Gamma(1.5)}{\Gamma(3) \Gamma(-0.5)} \left(\frac{1}{2}\right)^{2-0.5} = -\frac{1}{32 } \left(\frac{1}{2}\right)^{-0.5} = -\frac{\sqrt 2}{32}$, which matches. And so back to the problem, we will have

$$g(k) = \frac{1}{1-\sqrt{2}}\frac{\Gamma(k-0.5)}{\Gamma(k+1) \Gamma(-0.5)} \left(\frac{1}{2}\right)^{k-0.5} $$

So we will have recursive formula as

$$f(k) = \sum_{i=1}^{k} \frac{ i}{k} f(k-i)g(i) \\ = \sum_{i=1}^{k} \frac{ i}{k} \frac{\sqrt{2}}{1-\sqrt{2}}\frac{\Gamma(i-0.5)}{\Gamma(i+1) \Gamma(-0.5)} \left(\frac{1}{2}\right)^{i}f(k-i) $$

for which there is no obvious closed form.
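Even without a closed form, the probabilities are easy to generate numerically. A sketch of the recursion for these parameter values ($\lambda=1$, $r=-0.5$, $\beta=1$; the truncation point is an arbitrary choice of mine):

```python
import math

lam, r, beta = 1.0, -0.5, 1.0        # parameters from the example
kmax = 10

a = beta / (1 + beta)                # = 1/2
b = (r - 1) * beta / (1 + beta)      # = -3/4

# Secondary (ETNB) pmf: g(0) = 0; g(1) comes from normalizing the untruncated
# "distribution" g0, then g(k) = (a + b/k) g(k-1) for k >= 2.
g0_0 = (1 + beta) ** (-r)            # sqrt(2) -- not a probability, since r < 0
g0_1 = (a + b) * g0_0                # -sqrt(2)/4
g = [0.0, g0_1 / (1.0 - g0_0)]
for k in range(2, kmax + 1):
    g.append((a + b / k) * g[-1])

# Compound recursion with Poisson primary: f(k) = (lam/k) sum_i i g(i) f(k-i).
f = [math.exp(lam * (g[0] - 1.0))]   # f(0) = exp(lam (P_M(0) - 1)) = e^{-1}
for k in range(1, kmax + 1):
    f.append(lam / k * sum(i * g[i] * f[k - i] for i in range(1, k + 1)))

print([round(x, 6) for x in f[:5]])
```

As sanity checks, $f(0) = e^{-1}$ and the truncated secondary pmf sums to very nearly 1 over $k \le 10$.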

Example. Show that the Poisson-logarithmic distribution is a negative binomial distribution.

The logarithmic distribution has the following pgf:

$$\frac{\ln (1-p z)}{\ln (1-p)}$$

Notice here $p$ is constant.

And Poisson distribution has the following pgf:

$$\exp(\lambda (z-1))$$

and so if we have them composed, we will have the following compound pgf:

$$\exp\left(\lambda\left(\frac{\ln(1-pz)}{\ln(1-p)} - 1\right)\right) = \exp\left(\lambda \frac{\ln((1-pz)/(1-p))}{\ln(1-p)}\right) \\= \left(\frac{1-p+p-pz }{1-p}\right)^{\lambda/ \ln(1-p)} = \left(1-\frac{p}{1-p}(z-1)\right)^{\lambda/ \ln(1-p)}$$

which is a negative binomial distribution with $r=-\lambda/ \ln(1-p)$ and $\beta=\frac{p}{1-p}$.
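A quick numeric check of this identity, with assumed values of $\lambda$ and $p$ (any $\lambda > 0$, $0 < p < 1$ should work):

```python
import math

lam, p = 1.2, 0.4                       # assumed parameters

r = -lam / math.log(1 - p)              # resulting negative binomial r
beta = p / (1 - p)

def pgf_poisson_log(z):
    log_pgf = math.log(1 - p * z) / math.log(1 - p)   # logarithmic secondary
    return math.exp(lam * (log_pgf - 1.0))

def pgf_negbin(z):
    return (1.0 - beta * (z - 1.0)) ** (-r)

for z in (0.0, 0.25, 0.5, 0.75, 1.0):
    assert abs(pgf_poisson_log(z) - pgf_negbin(z)) < 1e-9
print("Poisson-logarithmic pgf matches negative binomial")
```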

From the previous discussion we know:

$$P^M(z)=p_0^{M}+\left(1-p_0^{M}\right) P^T(z)$$

So we can always write a general pgf as

$$P(z) = f_0 +(1-f_0)P^T(z).$$

$\textbf{Theorem } $ Here's a relationship between probability generating functions (pgfs) in a specific scenario. Suppose we have a pgf denoted by $P_N(z ; \theta)$ that satisfies the equation:

$$P_N(z ; \theta) = B[\theta(z-1)]$$

Here, $\theta$ represents a parameter, and $B(z)$ is a function that doesn't depend on $\theta$. Notably, the pgf contains $\theta$ and $z$ solely in the form of $\theta(z-1)$.

In such a situation, we can rewrite the pgf $P_S(z)$ as a composition of pgfs, which is given by:

$$P_S(z) = P_N[P_M(z) ; \theta] \quad$$

The new pgf $P_S(z)$ can also be expressed differently by using a zero-truncated pgf $P_M^T(z)$ and a modified parameter $\theta\left(1-f_0\right)$, leading to the following alternative expression:

$$P_S(z) = P_N[P_M^T(z) ; \theta\left(1-f_0\right)] \quad $$

Proof.

We know that $P_N(P_M(z) ; \theta) = B[\theta(P_M(z) -1)] = B[\theta (f_0 + (1-f_0)P_M^T(z)-1)] = B[\theta (1-f_0)P_M^T(z) -(1-f_0)\theta] = B[\theta(1-f_0)(P_M^T(z)-1)] = P_N[P_M^T(z);\theta(1-f_0)].$
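The theorem is easy to check numerically in the Poisson case, where $B(x) = e^x$ and $\theta = \lambda$. A sketch with an assumed three-point secondary pmf (the values of $\lambda$ and the pmf are mine, for illustration):

```python
import math

# Numeric check of the theorem for a Poisson primary: B(x) = e^x, theta = lam.
lam = 1.7
fm = [0.4, 0.35, 0.25]                  # assumed secondary pmf; f0 = 0.4
f0 = fm[0]

def P_M(z):
    return sum(fk * z ** k for k, fk in enumerate(fm))

def P_M_T(z):                           # zero-truncated version of the secondary
    return (P_M(z) - f0) / (1.0 - f0)

def P_N(z, theta):                      # Poisson pgf, B[theta (z - 1)] with B = exp
    return math.exp(theta * (z - 1.0))

for z in (0.0, 0.5, 0.9):
    lhs = P_N(P_M(z), lam)
    rhs = P_N(P_M_T(z), lam * (1.0 - f0))
    assert abs(lhs - rhs) < 1e-12
print("modified and truncated parameterizations agree")
```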

Example. Determine the probabilities for a Poisson-zero-modified ETNB distribution where the parameters are $\lambda, p_0^M, r, \beta$.

Since the Poisson primary has $a = 0$ and $b = \lambda$, the denominator $1-ag(0)$ equals $1$ even though $g(0) = p_0^M \neq 0$, so

$$ f(k) = \frac{\lambda}{k} \sum_{i=1}^{k} i f(k-i)g(i)$$

and $$g(i) = \frac{1-p_0^M}{1-(1+\beta)^{-r}} \frac{\Gamma(i +r)}{\Gamma(i+1) \Gamma(r)} \left( \frac{1}{1+\beta}\right)^r\left( \frac{\beta}{\beta+1}\right)^i \\=\frac{1-p_0^M}{(1+\beta)^{r}-1} \frac{\Gamma(i +r)}{\Gamma(i+1) \Gamma(r)} \left( \frac{\beta}{\beta+1}\right)^i $$

And so $$ f(k) = \frac{\lambda}{k} \sum_{i=1}^{k} if(k-i)\frac{1-p_0^M}{1-(1+\beta)^{-r}} \frac{\Gamma(i +r)}{\Gamma(i+1) \Gamma(r)} \left( \frac{1}{1+\beta}\right)^r\left( \frac{\beta}{\beta+1}\right)^i \\=\frac{\lambda}{k} \sum_{i=1}^{k} if(k-i)\frac{1-p_0^M}{(1+\beta)^{r}-1} \frac{\Gamma(i +r)}{\Gamma(i+1) \Gamma(r)} \left( \frac{\beta}{\beta+1}\right)^i$$

and $f(0) = \exp(\lambda(p_0^M-1))$, since $P_M(0) = p_0^M$.

Compound Poisson Class

Compound Poisson distributions are the distributions satisfying the following:

$$P_S(z) = \exp(\lambda(P_M(z)-1))$$

where $P_M(z)$ is the pgf of some secondary distribution.

Let's consider its moment generating function:

$$P_S(\exp(z)) = \exp(\lambda(P_M(\exp(z))-1))$$

If we differentiate with respect to $z$, we will have

$$P_S'(\exp(z)) \exp(z) = \exp(\lambda P_M(\exp(z))-\lambda) \lambda P_M'(\exp(z)) \exp(z)$$

So for the first moment we will have

$$\mathbb{E}_S[S] = \lambda\mathbb{E}_M[M]$$

If we differentiate it twice, we will have

$$P_S''(\exp(z)) \exp(z)^2 + P_S'(\exp(z)) \exp(z) = \exp(\lambda P_M(\exp(z))-\lambda) \lambda^2 P_M'(\exp(z))^2 \exp(z)^2 + \exp(\lambda P_M(\exp(z))-\lambda) \lambda P_M''(\exp(z)) \exp(z)^2+ \exp(\lambda P_M(\exp(z))-\lambda) \lambda P_M'(\exp(z)) \exp(z)$$

Then, setting $z=0$ and using $P_X''(1) = \mathbb{E}[X(X-1)]$, we will have

$$\mathbb{E}_S[S(S-1)] + \mathbb{E}_S[S] = \lambda^2 \mathbb{E}_M[M]^2 +\lambda \mathbb{E}_M[M(M-1)] + \lambda \mathbb{E}_M[M]$$

It means that $$\mathbb{E}_S[S^2] = \lambda^2 \mathbb{E}_M[M]^2 +\lambda \mathbb{E}_M[M^2] $$

So the second central moment (the variance) is $$\mathbb{E}_S[S^2] - (\mathbb{E}_S[S])^2= \lambda \mathbb{E}_M[M^2].$$
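Both moment formulas can be sanity-checked against the pmf produced by the recursion derived earlier for a Poisson primary. A sketch with an assumed Poisson(2) primary and an assumed secondary pmf on $\{0,1,2\}$:

```python
import math

lam = 2.0
g = [0.2, 0.5, 0.3]                  # assumed secondary pmf
kmax = 60                            # truncation; the tail beyond this is negligible

# Compound Poisson pmf via the recursion f(k) = (lam/k) sum_i i g(i) f(k-i).
f = [math.exp(lam * (g[0] - 1.0))]
for k in range(1, kmax + 1):
    f.append(lam / k * sum(i * g[i] * f[k - i]
                           for i in range(1, min(k, len(g) - 1) + 1)))

m1 = sum(k * fk for k, fk in enumerate(f))
m2 = sum(k * k * fk for k, fk in enumerate(f))
EM = sum(k * gk for k, gk in enumerate(g))        # E[M]   = 1.1
EM2 = sum(k * k * gk for k, gk in enumerate(g))   # E[M^2] = 1.7

print(m1, lam * EM)                  # mean:     E[S]   = lam E[M]   = 2.2
print(m2 - m1 * m1, lam * EM2)       # variance: Var(S) = lam E[M^2] = 3.4
```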

The formulas are growing exponentially complicated; let's see an alternative approach, working directly with the pgf:

$$P_S'(z) = \exp(\lambda P_M(z)-\lambda) \lambda P_M'(z) $$

$$P_S''(z) = \exp(\lambda P_M(z)-\lambda) \lambda^2 P_M'(z)^2 + \exp(\lambda P_M(z)-\lambda) \lambda P_M''(z)$$

$$P_S'''(z) = \exp(\lambda P_M(z) - \lambda)\lambda^3 P_M'(z)^3 + 2\exp(\lambda P_M(z)-\lambda)\lambda^2 P''_M(z)P_M'(z) + \exp(\lambda P_M(z)-\lambda)\lambda^2 P_M''(z)P_M'(z) + \exp(\lambda P_M(z)-\lambda)\lambda P_M'''(z).$$

If we set $z$ to be 1, we will have

$$\mathbb{E}_S[S(S-1)(S-2)] = \lambda^3 \mathbb{E}_M[M]^3 + 2 \lambda^2 \mathbb{E}_M[M(M-1)]\mathbb{E}_M[M]+\lambda^2 \mathbb{E}_M[M(M-1)] \mathbb{E}_M[M] + \lambda \mathbb{E}_M[M(M-1)(M-2)].$$

Notice that

$$P_X(z) = \sum_{k=0}^{\infty}p_k z^k $$

and if we differentiate $n$ times and evaluate at $z=1$, we will have

$$\frac{d^n}{d z^n}P_X(z)\bigg |_{z=1} = \sum_{k=n}^{\infty} p_k \frac{k!}{(k-n)!} z^{k-n}\bigg|_{z=1} = \mathbb{E}[X(X-1)\cdots(X-n+1)],$$

the $n$th factorial moment of $X$.

Note that $X(X-1)= X^2 - X$ and $X(X-1)(X-2) = (X^2 -X)(X-2) = X^3 - 3X^2 +2X$.

So $$ \mathbb{E}_S[S^3] - 3\mathbb{E}_S[S^2] +2\mathbb{E}_S[S] = \lambda^3 \mathbb{E}_M[M]^3 + 3 \lambda^2 (\mathbb{E}_M[M^2] - \mathbb{E}_M[M])\mathbb{E}_M[M] + \lambda (\mathbb{E}_M[M^3] - 3\mathbb{E}_M[M^2] +2\mathbb{E}_M[M]).$$

To get the third central moment we would now substitute these into

$$\mathbb{E}_S[(S-\mathbb{E}[S])^3] = \mathbb{E}_S[S^3] - 3\mathbb{E}_S[S^2]\mathbb{E}_S[S]+ 2\mathbb{E}_S[S]^3,$$

but carrying out the expansion in full expectation notation is tedious and error-prone.

Let's do something truly innovative.

Let's define $S_k = \mathbb{E}[S^k]$ for $k \ge 1$, and similarly $M_k$ for the secondary distribution, writing $S$ for $S_1$ and $M$ for $M_1$ for the sake of simplicity.

Then the first-moment result reads $S = \lambda M$. (This is an abuse of notation: $S$ and $M$ here denote moments, not the random variables themselves.)

$$P_S''(z) = e^{\lambda(P_M(z)-1)} \lambda^2 (P_M'(z))^2 + \lambda e^{\lambda(P_M(z)-1)}P_M''(z)$$

Evaluating at $z=1$ gives $S_2 - S = \lambda^2 M^2 + \lambda(M_2 - M)$, and since $S = \lambda M$, this means $S_2 = \lambda^2 M^2 + \lambda M_2$.

And if we differentiate, we will get

$$P_S'''(z) = e^{\lambda(P_M(z)-1)} \lambda^3 (P_M'(z))^3 +2(e^{\lambda(P_M(z)-1)} \lambda^2 (P_M'(z))P_M''(z)) + \lambda^2 e^{\lambda(P_M(z)-1)}P_M'(z)P_M''(z) + \lambda e^{\lambda(P_M(z)-1)}P_M'''(z) $$

And so we will have

$$S_3-3S_2 +2 S = \lambda^3 M^3 + 2 \lambda^2 M(M_2 -M) + \lambda^2 M (M_2 -M)+\lambda (M_3 - 3M_2 + 2M)\\ = \lambda^3 M^3 +2\lambda^2 MM_2 -2\lambda^2 M^2 +\lambda^2 MM_2 -\lambda^2 M^2 +\lambda M_3 -3\lambda M_2 +2\lambda M\\ = \lambda^3 M^3 +3\lambda^2 MM_2 -3\lambda^2 M^2 +\lambda M_3 -3\lambda M_2 +2\lambda M$$

$$S_3 = \lambda^3 M^3 +3\lambda^2 MM_2 +\lambda M_3 $$

$$S_3 -3 S_2 S +2S^3 = \lambda^3 M^3 +3\lambda^2 MM_2 +\lambda M_3 - 3 \lambda M(\lambda^2 M^2 +\lambda M_2) +2\lambda^3 M^3 = \lambda M_3$$

And so the third central moment of the compound distribution is lambda times the third raw moment of the secondary distribution.

The coefficient of skewness is

$$\gamma_1 = \frac{S_3 -3S_2 S + 2 S^3}{(S_2 -S^2)^{3/2}} = \frac{\lambda M_3}{\lambda^{3/2}M_2^{3/2}} = \frac{M_3}{\lambda^{1/2} M_2^{3/2}}.$$
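The third-central-moment and skewness formulas can be checked the same way, against the pmf from the Poisson-primary recursion, with an assumed Poisson(2) primary and an assumed three-point secondary pmf:

```python
import math

lam = 2.0
g = [0.2, 0.5, 0.3]                          # assumed secondary pmf
kmax = 80                                    # truncation; the tail is negligible

# pmf of the compound Poisson via f(k) = (lam/k) sum_i i g(i) f(k-i)
f = [math.exp(lam * (g[0] - 1.0))]
for k in range(1, kmax + 1):
    f.append(lam / k * sum(i * g[i] * f[k - i]
                           for i in range(1, min(k, len(g) - 1) + 1)))

S1 = sum(k * fk for k, fk in enumerate(f))
S2 = sum(k ** 2 * fk for k, fk in enumerate(f))
S3 = sum(k ** 3 * fk for k, fk in enumerate(f))
M2 = sum(k ** 2 * gk for k, gk in enumerate(g))
M3 = sum(k ** 3 * gk for k, gk in enumerate(g))

third_central = S3 - 3 * S2 * S1 + 2 * S1 ** 3
skew = third_central / (S2 - S1 ** 2) ** 1.5

print(third_central, lam * M3)               # both should be ~ lam E[M^3]
print(skew, M3 / (lam ** 0.5 * M2 ** 1.5))   # two equivalent skewness formulas
```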

Now suppose the underlying secondary distribution is a binomial distribution; the binomial distribution has the following MGF:

$$f(z)=(1+q(\exp(z)−1))^n.$$

If we take derivative, we will get

$$\frac{\partial}{\partial z} f(z) = n (1 + q(\exp(z)-1))^{n-1} q \exp(z)$$

and so $S=\lambda n q$.

If we take derivative again, we will get

$$\frac{\partial^2}{\partial z^2 } f(z) = n(n-1)(1+q(\exp(z)-1))^{n-2} q^2 \exp(2z) + n (1 + q(\exp(z)-1))^{n-1} q \exp(z)$$

and so $M_2 = n(n-1)q^2 +nq$, which gives the variance of the compound distribution as $\lambda M_2 = \lambda nq(1+(n-1)q) = S (1+(n-1)q)$

And if we take another derivative, we will have

$$\frac{\partial^3}{\partial z^3 } f(z) = n(n-1)(n-2)(1+q(\exp(z)-1))^{n-3} q^3 \exp(3z) + 2n(n-1)(1+q(\exp(z)-1))^{n-2} q^2 \exp(2z)+\\ n(n-1)(1+q(\exp(z)-1))^{n-2}q^2\exp(2z) +n (1 + q(\exp(z)-1))^{n-1} q \exp(z)$$

So we will have $$M_3 = n(n-1)(n-2) q^3 + 2n(n-1)q^2 +n(n-1)q^2 +nq \\= n(n-1)(n-2) q^3 + 3n(n-1)q^2 +nq$$

So notice $M_2 -M = M((n-1)q)$ so $(M_2 - M)^2 = M^2 (n-1)^2 q^2$

So $M_3 =M (n-1)(n-2)q^2 + 3n(n-1)q^2 +3nq - 2nq = M \cdot \frac{(M_2-M)^2}{M^2} \cdot \frac{n-2}{n-1} +3M_2-2M = \frac{(M_2-M)^2}{M} \cdot \frac{n-2}{n-1}+3M_2 -2M.$

Now we consider the following example:

Obtain the pgf for the Poisson-ETNB distribution and show that it looks like the pgf of a Poisson-negative binomial distribution.

Recall that ETNB has the following pgf:

$$f(z)=\frac{1}{(1+\beta)^r-1}\left(\left(\frac{1+\beta}{1-\beta(z-1)}\right)^r-1\right) = \frac{1}{1-(1+\beta)^{-r}} \left((1-\beta(z-1))^{-r}-(1+\beta)^{-r}\right).$$

So the Poisson-ETNB distribution will have the following pgf:

$$e^{\lambda (f(z)-1)}$$

If we take ln, we will have

$$\lambda (f(z)-1)= \lambda \left(\frac{1}{1-(1+\beta)^{-r}} \left((1-\beta(z-1))^{-r}-(1+\beta)^{-r}-1+(1+\beta)^{-r}\right)\right)= \lambda \frac{(1-\beta(z-1))^{-r}-1}{1-(1+\beta)^{-r}}\\= C ( (1-\beta(z-1))^{-r}-1)$$

So the compound pgf is $\exp\left(C\left((1-\beta(z-1))^{-r}-1\right)\right)$ with $C = \lambda/(1-(1+\beta)^{-r})$, which has the same functional form as the pgf of a Poisson-negative binomial distribution, as desired.