Let $F$ be the CDF of $X$, and let $G(x) = P(X \ge x) = 1 - F(x)$ be the survival function of $X$.
By the memoryless property, $G(s + t) = G(s) \, G(t)$ for all $s, t \ge 0$.
This time, rather than trying to solve for $s$ or $t$, we are going to solve for the function $G$ itself, in order to show that the exponential is the only continuous distribution with the memoryless property.
\begin{align} & \text{let } s = t & \quad \\ & \Rightarrow & G(2t) &= G(t + t) = G(t) \, G(t) = G(t)^{2} & \quad \\ & & G(3t) &= G(2t) \, G(t) = G(t)^{2} \, G(t) = G(t)^{3} & \quad \\ & &\dots & \quad \\ & & G(kt) &= G(t)^{k} & \quad \text{ for any positive integer } k \\ \\ & \text{scaling } t \text{ by } \frac{1}{k} & \quad \\ & \Rightarrow & G\left(2 \, \frac{t}{2}\right) &= G\left(\frac{t}{2}\right)^{2} \text{ so } G\left(\frac{t}{2}\right) = \sqrt{G(t)} = G(t)^{1/2} & \quad \\ & & G\left(\frac{t}{3}\right) &= G(t)^{1/3} & \quad \\ & &\dots & \quad \\ & & G\left(\frac{t}{k}\right) &= G(t)^{1/k} & \quad \\ \\ & \text{combining, for } x = \frac{m}{k} & \quad \\ & \Rightarrow & G\left(\frac{m}{k} \, t \right) &= G(t)^{m/k} & \quad \\ \\ & \text{so for rational } x \ge 0 & \quad \\ & \Rightarrow & G(x \, t) &= G(t)^{x} & \quad \text{ and by right-continuity of } G \text{, for all real } x \ge 0 \\ \\ \\ & \text{now let } t = 1 & \quad \\ & \Rightarrow & G(x) &= G(1)^{x} & \quad \\ & & &= e^{x \ln G(1)} & \quad \text{ where } \ln G(1) \text{ is some negative real number } \\ & & &= e^{-\lambda x} & \quad \text{ with } \lambda = -\ln G(1) > 0 \quad \blacksquare \\ \end{align}And so we see that in the continuous case, $\operatorname{Expo}(\lambda)$ is the only distribution with the memoryless property.
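To see this concretely, here is a small simulation sketch (assuming NumPy; the particular values of $\lambda$, $s$, and $t$ are arbitrary) checking that $X \sim \operatorname{Expo}(\lambda)$ satisfies $P(X \ge s + t \mid X \ge s) = P(X \ge t) = e^{-\lambda t}$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 0.5, 1.0, 2.0
x = rng.exponential(scale=1 / lam, size=10**6)   # draws from Expo(lambda)

lhs = np.mean(x >= s + t) / np.mean(x >= s)      # P(X >= s + t | X >= s)
rhs = np.mean(x >= t)                            # P(X >= t)
print(lhs, rhs, np.exp(-lam * t))                # all three should be close
```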
Moment generating functions are an alternative way to describe a distribution.
A random variable $X$ has MGF $M(t) = \mathbb{E}(e^{tX})$, regarded as a function of $t$, provided this expectation is finite on some interval $(-a, a)$ with $a > 0$.
Note that any function of a random variable is itself a random variable, so for each fixed $t$ the quantity $e^{tX}$ is a random variable and it makes sense to take its expected value $\mathbb{E}(e^{tX})$.
But why is this called moment-generating?
\begin{align} \mathbb{E}(e^{tX}) &= \mathbb{E}\left(\sum_{n=0}^{\infty} \frac{X^{n} \, t^{n}}{n!} \right) &\quad \text{Taylor expand } e^{tX} \\ &= \sum_{n=0}^{\infty} \frac{\mathbb{E}(X^{n}) \, t^{n}}{n!} &\quad \text{ where } \mathbb{E}(X^{n}) \text{ is called the } n^{\text{th}} \text{ moment} \\ \end{align}Let $X$ have MGF $M(t)$. Then $\mathbb{E}(X^{n})$ is the coefficient of $\frac{t^{n}}{n!}$ in this series, so $\mathbb{E}(X^{n}) = M^{(n)}(0)$, the $n^{\text{th}}$ derivative of $M$ evaluated at $0$.
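For example, here is a small sketch (assuming SymPy, and using the known MGF of $\operatorname{Expo}(\lambda)$, $M(t) = \frac{\lambda}{\lambda - t}$ for $t < \lambda$, which is not derived above) that generates moments by repeatedly differentiating $M$ at $t = 0$:

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = lam / (lam - t)                    # MGF of Expo(lambda), finite for t < lambda

# E(X^n) = M^{(n)}(0): differentiate n times and evaluate at t = 0
for n in range(1, 5):
    moment = sp.diff(M, t, n).subs(t, 0)
    print(n, sp.simplify(moment))      # 1/lambda, 2/lambda**2, 6/lambda**3, 24/lambda**4
```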
If we have independent r.v.s $X$ and $Y$, and we know their respective moment generating functions, then we can easily find the moment generating function of $X + Y$:
\begin{align} M_{X+Y}(t) &= \mathbb{E}(e^{t(X+Y)}) \\ &= \mathbb{E}(e^{tX} \, e^{tY}) \\ &= \mathbb{E}(e^{tX}) \, \mathbb{E}(e^{tY}) &\quad \text{ by independence} \\ &= M_X(t) \, M_Y(t) \end{align}Given $X \sim \operatorname{Bern}(p)$, we obtain the MGF with
\begin{align} M(t) &= \mathbb{E}(e^{tX}) \\ &= p \, e^t + q &\quad \text{ where } q = 1-p \end{align}Given $X \sim \operatorname{Bin}(n,p)$, we obtain the MGF with
\begin{align} M(t) &= \mathbb{E}(e^{tX}) \\ &= \left( p \, e^t + q \right)^n &\quad \text{ since } X \text{ is a sum of } n \text{ i.i.d. } \operatorname{Bern}(p) \text{ r.v.s, so its MGF is the Bern MGF raised to the } n^{\text{th}} \text{ power} \end{align}
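As a quick numerical check of the Bernoulli and Binomial MGFs, and of the product rule for independent sums above, here is a small sketch (assuming NumPy; the particular $n$, $p$, $t$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, t = 10, 0.3, 0.7
q = 1 - p

x = rng.binomial(n, p, size=10**6)        # Bin(n, p) is a sum of n i.i.d. Bern(p)
empirical = np.mean(np.exp(t * x))        # Monte Carlo estimate of E(e^{tX})
closed_form = (p * np.exp(t) + q) ** n    # (Bern MGF)^n, by the product rule
print(empirical, closed_form)             # the two should agree closely
```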
Given standard normal $Z \sim \mathcal{N}(0,1)$, we obtain the MGF with
\begin{align} M(t) &= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tz - z^2/2} \, dz \\ &= \frac{1}{\sqrt{2\pi}} ~~ e^{t^2/2} \int_{-\infty}^{\infty} e^{-\frac{1}{2}\,(z-t)^2} \, dz &\quad \text{ completing the square} \\ &= \frac{1}{\sqrt{2\pi}} ~~ e^{t^2/2} ~~ \sqrt{2\pi} &\quad \text{ recall the PDF of the standard normal (Lec. 13)} \\ &= e^{t^2/2} \end{align}And just in case you've forgotten how to complete the square:
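\begin{align} tz - \frac{z^2}{2} &= -\frac{1}{2}\left(z^2 - 2tz\right) \\ &= -\frac{1}{2}\left(z^2 - 2tz + t^2\right) + \frac{t^2}{2} \\ &= -\frac{1}{2}(z - t)^2 + \frac{t^2}{2} \end{align}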
If we have observed the sun rising for the past $n$ days in succession, then what is the probability that the sun will rise tomorrow?
Let $p$ be the probability that the sun rises on any given day, and let $X_k$ be the indicator that the sun rises on day $k$; conditional on $p$, the days $X_1, X_2, \dots$ are i.i.d. $\operatorname{Bern}(p)$, and we let $S_n = X_1 + \cdots + X_n$ count the sunrises in the first $n$ days. But for the question above, we do not know what $p$ is. Bayesians treat $p$ as an r.v.; here we give it the prior $p \sim \operatorname{Unif}(0,1)$.
We write $f$ for the PDF of $p$, so the uniform prior is $f(p) = 1$ for $0 \le p \le 1$. We start with the general case:
\begin{align} f( p | S_n=k) &= \frac{P(S_n=k | p) \, f(p)}{P(S_n=k)} &\quad \text{ from Bayes' Rule} \\ &\propto p^k \, (1-p)^{n-k} \end{align}But since the binomial coefficient in $P(S_n=k | p)$, the prior $f(p) = 1$, and the denominator $P(S_n=k)$ do not depend on $p$, we can work with $f(p | S_n=k)$ up to proportionality and recover the normalizing constant at the end.
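To see why dropping those constants is harmless, here is a small numerical sketch (assuming NumPy; the particular $n$ and $k$ are arbitrary): the factors that do not depend on $p$ cancel as soon as we normalize.

```python
import numpy as np
from math import comb

n, k = 10, 7
p_grid = np.linspace(0, 1, 10_001)

full = comb(n, k) * p_grid**k * (1 - p_grid)**(n - k)   # P(S_n = k | p) * f(p), with f(p) = 1
prop = p_grid**k * (1 - p_grid)**(n - k)                # the same thing with constants dropped

def normalize(g):
    # divide by the numerical integral over [0, 1] (~ the mean on a uniform grid)
    return g / g.mean()

print(np.allclose(normalize(full), normalize(prop)))    # True: both give the same posterior
```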
Now let's consider the case of our question, where the sun has risen for $n$ days straight:
\begin{align} \text{since } \int_{0}^{1} p^n \, dp &= \frac{1}{n+1} \\ \\ \text{we get } f(p | S_n=n) &= \boxed{(n+1) \, p^n} &\quad \text{ normalizing for a valid PDF}\\ \\ \text{and } P(X_{n+1}=1 | S_n=n) &= \int_{0}^{1} (n+1) \, p \, p^n \, dp &\quad \text{ Fundamental Bridge: this is } \mathbb{E}(p | S_n=n) \\ &= \int_{0}^{1} (n+1) \, p^{n+1} \, dp \\ &= \boxed{\frac{n+1}{n+2}} \end{align}View Lecture 17: Moment Generating Functions | Statistics 110 on YouTube.
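As a quick sanity check on the rule of succession, here is a small simulation sketch (assuming NumPy, the $\operatorname{Unif}(0,1)$ prior on $p$ from above, and an arbitrary choice of $n$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 10**6

p = rng.uniform(size=trials)               # prior draws: p ~ Unif(0, 1)
all_n_rose = rng.binomial(n, p) == n       # the event S_n = n
next_rises = rng.uniform(size=trials) < p  # the event X_{n+1} = 1

estimate = next_rises[all_n_rose].mean()   # estimates P(X_{n+1} = 1 | S_n = n)
print(estimate, (n + 1) / (n + 2))         # the two numbers should be close
```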