By repeating this argument, we get access to all the moments $\mathbf{E}[X^p]$ for every positive integer $p$.
Similarly, we can recover the probabilities $\mathbf{P}_X(x)$ from $P(\cdot)$ (for instance, $\mathbf{P}[X=k] = P^{(k)}(0)/k!$).
Important: Because of all this, the PGF actually determines the distribution of $X$. Therefore, as soon as we can compute the PGF of a random variable $Y$ and recognize the formula for the PGF of a Binomial rv, we can conclude that $Y$ is Binomial and we can identify its parameters.
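As a quick illustration, here is a minimal sketch (assuming `sympy` is available, with hypothetical values $n=4$, $p=3/10$) that recovers the mean from $P'(1)$ and the probabilities $\mathbf{P}[X=k]$ from the derivatives of the PGF at $0$:

```python
# Minimal sketch: recover moments and the PMF of X ~ Binom(n, p) from its PGF.
# The values n = 4, p = 3/10 are hypothetical, chosen only for illustration.
import sympy as sp

z = sp.symbols('z')
n, p = 4, sp.Rational(3, 10)
P = (1 - p + p * z) ** n          # PGF of Binom(n, p)

mean = sp.diff(P, z).subs(z, 1)   # E[X] = P'(1) = n*p
print(mean)                       # 6/5

# P[X = k] = P^{(k)}(0) / k!
pmf = [sp.diff(P, z, k).subs(z, 0) / sp.factorial(k) for k in range(n + 1)]
print(pmf)
# Same values from the binomial formula, for comparison:
print([sp.binomial(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)])
```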
Let $X\sim Binom(n,p)$ and given $X$, $Y\sim Binom(X, r)$.
Let us therefore compute the PGF of $Y$.
$P(z) = \mathbf{E}[z^Y]$
We know that there exist iid random variables $Y_1,Y_2,...,Y_n$, each Bernoulli with parameter $r$, such that $Y=Y_1+Y_2+...+Y_X$.
Therefore, $\displaystyle P(z) = \mathbf{E}\left[z^{Y_1+Y_2+...+Y_X}\right] = \mathbf{E}_X\left[\mathbf{E}_{Y_i\text{'s}} \left[z^{Y_1+Y_2+...+Y_X}\right]\right]$ (in the inner expectation, $X$ is held fixed).
So inside the inner expectation, the sum $Y_1+Y_2+...+Y_X$ is $\sim Binom(X,r)$, with $X$ treated as a constant.
Reminder: if $X\sim Binom(n,p)$, then $\mathbf{E}[z^X] = (1-p+pz)^n$.
$$=\mathbf{E}_X\left[(1-r+rz)^X\right] = (1-p+p(1-r+rz))^n$$
$$=(1-rp+rpz)^n$$
Since we recognize this as the PGF of a binomial with parameters $n$ and $rp$, we have proved that $Y\sim Binom(n, rp)$.
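As a sanity check, here is a minimal Monte Carlo sketch (with hypothetical parameters $n=10$, $p=0.6$, $r=0.4$) that simulates $Y$ in two stages and compares its sample mean and variance with those of a $Binom(n, rp)$ random variable:

```python
# Minimal sketch: simulate X ~ Binom(n, p), then Y ~ Binom(X, r) given X,
# and check that Y's moments match those of Binom(n, r*p).
# The parameters n = 10, p = 0.6, r = 0.4 are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 10, 0.6, 0.4
num_samples = 200_000

X = rng.binomial(n, p, size=num_samples)   # X ~ Binom(n, p)
Y = rng.binomial(X, r)                     # given X, Y ~ Binom(X, r)

# Both pairs should agree up to Monte Carlo error.
print(Y.mean(), n * r * p)                 # both ~ 2.4
print(Y.var(), n * r * p * (1 - r * p))    # both ~ 1.824
```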
Alternative argument: use $Y=Y_1 + Y_2 + ... + Y_k$ given $X=k$, and $X=X_1 + X_2 + ... + X_n$, where the $X_i$'s are iid $Bern(p)$ and the $Y_i$'s are iid $Bern(r)$.
Now, consider $\widetilde{Y}=X_1Y_1 + X_2Y_2 + ... + X_nY_n$
Each term $X_iY_i$ is independent of all the others, and each $X_iY_i$ equals $1$ with probability $\mathbf{P}[X_iY_i=1] = \mathbf{P}[X_i=1]\mathbf{P}[Y_i=1] = rp$ (and $0$ otherwise).
Therefore, $X_iY_i$ are iid $Bern(rp)$. Therefore, $\widetilde{Y}$ is $Binom(n, rp)$.
Finally, we must justify the fact that $\widetilde{Y}$ and $Y$ have the same distribution.
Assume $X=k$. Then exactly $k$ of the $X_i$'s are nonzero, so $\widetilde{Y}$ is a sum of $k$ of the $Y_i$'s and hence, given $X=k$, $\widetilde{Y}\sim Binom(k,r)$. This is exactly the conditional distribution of $Y$ given $X=k$, so $\widetilde{Y}$ and $Y$ have the same distribution.
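The same check can be run for this construction: a minimal sketch (same hypothetical parameters as above) that builds $\widetilde{Y}=X_1Y_1 + X_2Y_2 + ... + X_nY_n$ from iid Bernoulli draws and compares its empirical distribution with that of $Binom(n, rp)$:

```python
# Minimal sketch: build Y_tilde = sum_i X_i * Y_i from iid Bern(p) and Bern(r)
# draws and compare its empirical PMF with that of Binom(n, r*p).
# The parameters n = 10, p = 0.6, r = 0.4 are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n, p, r = 10, 0.6, 0.4
num_samples = 200_000

Xi = rng.binomial(1, p, size=(num_samples, n))   # X_i iid Bern(p)
Yi = rng.binomial(1, r, size=(num_samples, n))   # Y_i iid Bern(r)
Y_tilde = (Xi * Yi).sum(axis=1)                  # each X_i * Y_i is Bern(r*p)

ref = rng.binomial(n, r * p, size=num_samples)   # Binom(n, r*p) for comparison
for k in range(n + 1):
    # Empirical probabilities of {Y_tilde = k} and {ref = k} should agree
    # up to Monte Carlo error.
    print(k, (Y_tilde == k).mean(), (ref == k).mean())
```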