A similar proof can be found on page 71 of these lecture notes:
Theorem 2.4.7 (The Binomial Expansion) Let $p$ be a real number, and let $P(x)$ be the power series
$$
P(x)=1+p x+\frac{p(p-1)}{2 !} x^{2}+\cdots+\frac{p(p-1) \cdots(p-n+1)}{n !} x^{n}+\cdots
$$
whose radius of convergence is $R=1$ unless $p=0$ or $p \in \mathbb{N}$. If $p \in \mathbb{N}$, then $P(x)$ is a polynomial of degree $p$.
1) For any real number $p$ we have
$$
(1+x)^{p}=P(x) \quad \text { for } x \in(-1,1)
$$
2) If $p>0$ then
$$
(1+x)^{p}=P(x) \quad \text { for } x \in(-1,1]
$$
If $p=0$ or $p \in \mathbb{N}$, then $P(x)$ reduces to a polynomial, and 1) and 2) follow immediately from the ordinary binomial formula.
Let us first show that $P(x)$ is the Taylor expansion of the function $f(x)=(1+x)^{p}$, for $x>-1$, at $a=0$. In fact
$$
\begin{aligned}
f^{\prime}(x)=& p(1+x)^{p-1} \\
f^{\prime \prime}(x)=& p(p-1)(1+x)^{p-2} ; \\
& \cdots ; \\
f^{(k)}(x)=& p(p-1) \cdots(p-(k-1))(1+x)^{p-k}
\end{aligned}
$$
so $f^{(k)}(0)=p(p-1) \cdots(p-(k-1))$. Hence the Taylor expansion of $f(x)$ at $a=0$ is by definition given by
$$
P(x)=\sum_{k=0}^{\infty} \frac{p(p-1) \cdots(p-(k-1))}{k !} x^{k}
$$
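As a quick sanity check (not part of the lecture notes), this derivative formula can be verified symbolically; the snippet below uses sympy and a sample exponent $p=1/3$, both of which are illustrative choices only.

```python
# Hedged symbolic check that the k-th derivative of (1+x)^p at x=0
# equals p(p-1)...(p-(k-1)); sympy and the sample exponent are assumptions.
import sympy as sp

x = sp.symbols('x')
p = sp.Rational(1, 3)                  # any sample real exponent
f = (1 + x) ** p

for k in range(1, 5):
    lhs = sp.diff(f, x, k).subs(x, 0)          # f^(k)(0)
    rhs = sp.prod([p - j for j in range(k)])   # p(p-1)...(p-(k-1))
    print(k, sp.simplify(lhs - rhs))           # prints 0 for each k
```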
If $p \neq 0,1,2, \cdots$, then, by the ratio test, the radius of convergence is $R=1$. For convenience, one may introduce the notation
$$
\left(\begin{array}{c}
p \\
k
\end{array}\right)=\frac{p(p-1) \cdots(p-(k-1))}{k !}
$$
so that the Taylor expansion of $(1+x)^{p}$ may be written as
$$
P(x)=\sum_{k=0}^{\infty}\left(\begin{array}{c}
p \\
k
\end{array}\right) x^{k}
$$
which is a polynomial of degree $p$ in the case that $p$ is zero or a positive integer, since if $p \in \mathbb{N}$, then $\left(\begin{array}{c}p \\ k\end{array}\right)=0$ for $k>p$. Hence the case $p \in \mathbb{N}$ is trivial, and reduces to the elementary binomial expansion. In what follows, we may assume that $p \neq 0,1,2, \cdots$.
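To make these two observations concrete (the vanishing of the coefficients for $p \in \mathbb{N}$, and the ratio-test computation giving $R=1$), here is a small numerical sketch; the helper name gen_binom is ad hoc and not from the notes.

```python
# Hedged sketch: generalized binomial coefficients binom(p, k) = p(p-1)...(p-(k-1))/k!
def gen_binom(p: float, k: int) -> float:
    """Generalized binomial coefficient, built up factor by factor."""
    a = 1.0
    for j in range(k):
        a *= (p - j) / (j + 1)
    return a

# For p a non-negative integer, the coefficients vanish beyond k = p,
# so P(x) is a polynomial of degree p.
assert gen_binom(3, 4) == 0.0 and gen_binom(3, 5) == 0.0

# For non-integer p, |a_{k+1} / a_k| = |p - k| / (k + 1) -> 1,
# which is the ratio-test computation behind R = 1.
p = 0.5
for k in (10, 100, 1000):
    print(k, abs(gen_binom(p, k + 1) / gen_binom(p, k)))
```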
In fact, Taylor's Theorem is not needed to prove 1); the Identity Theorem does the job.
Proof of part 1). Let us apply the Identity Theorem to $f(x)=(1+x)^{p}$ and its Taylor expansion $P(x)$ on the interval $(-1,1)$. Both are differentiable on $(-1,1)$, and, by the chain rule,
$$
f^{\prime}(x)=\frac{d}{d x} \exp (p \ln (1+x))=p(1+x)^{p} \frac{1}{1+x}=\frac{p}{1+x} f(x)
$$
for $x>-1$, so that $f$ satisfies the differential equation:
$$
(1+x) f^{\prime}(x)=p f(x)
$$
where $-1<x<1$. One may expect that its Taylor expansion $P(x)$ satisfies the same differential equation. In fact, we may write
$$
P(x)=1+\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n}
$$
which is a power series with convergence radius $R=1$, so that $P(x)$ is differentiable on $(-1,1)$ and its derivative can be evaluated by differentiating it term by term:
$$
P^{\prime}(x)=\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{(n-1) !} x^{n-1}
$$
Hence
$$
\begin{aligned}
(1+x) P^{\prime}(x) &=\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{(n-1) !}(1+x) x^{n-1} \\
&=\sum_{n=0}^{\infty} \frac{p(p-1) \cdots(p-n)}{n !} x^{n}+\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} n x^{n} \\
&=p+\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !}((p-n)+n) x^{n} \\
&=p+p \sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n} \\
&=p P(x) .
\end{aligned}
$$
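The cancellation above can also be observed numerically: a truncated partial sum of $P$ almost satisfies the same differential equation, the discrepancy being only the truncation error. This is an illustrative sketch, not part of the notes; the sample values of $p$, $x$ and the cut-off $N$ are arbitrary.

```python
# Hedged check that P_N(x) = sum_{n<=N} a_n x^n satisfies (1+x) P'(x) ~ p P(x).
def partial_sum_and_derivative(p: float, x: float, N: int):
    a, P, dP = 1.0, 1.0, 0.0
    for n in range(1, N + 1):
        a *= (p - (n - 1)) / n         # a_n = a_{n-1} * (p - (n-1)) / n
        P += a * x ** n
        dP += n * a * x ** (n - 1)
    return P, dP

p, x, N = 0.3, 0.5, 60
P, dP = partial_sum_and_derivative(p, x, N)
print((1 + x) * dP, p * P)    # the two numbers agree up to the truncation error
```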
We apply the Identity Theorem to $h(x)=P(x) / f(x)$ on $(-1,1)$, which is well defined and differentiable since $f(x) \neq 0$ for $x \in(-1,1)$. Now
$$
\begin{aligned}
h^{\prime} &=\frac{P^{\prime} f-P f^{\prime}}{f^{2}} \\
&=\frac{(1+x) P^{\prime} f-(1+x) f^{\prime} P}{(1+x) f^{2}} \\
&=\frac{p P f-p f P}{(1+x) f^{2}}=0
\end{aligned}
$$
so that, according to the Identity Theorem, $P(x) / f(x)$ is constant on $(-1,1)$, and therefore $\frac{P(x)}{f(x)}=\frac{P(0)}{f(0)}=1$ for all $x \in(-1,1)$.
Hence
$$
(1+x)^{p}=1+\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n} \quad \text { for } x \in(-1,1)
$$
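The identity of part 1) is easy to test numerically on $(-1,1)$; the following sketch (with arbitrary sample values) simply compares partial sums of $P(x)$ with $(1+x)^{p}$.

```python
# Hedged numerical illustration of part 1): partial sums of P(x) vs (1+x)^p.
def P_partial(p: float, x: float, N: int) -> float:
    a, s = 1.0, 1.0
    for n in range(1, N + 1):
        a *= (p - (n - 1)) / n
        s += a * x ** n
    return s

p = -1.7                       # any real, non-natural exponent
for x in (-0.9, -0.5, 0.0, 0.5, 0.9):
    print(x, P_partial(p, x, 400), (1 + x) ** p)
```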
Proof of 2). By 1) we only need to show that $f(1)=P(1)$ when $p>0$. In fact, for $p>0$ we prove that $f(x)=P(x)$ for $x \in[0,1]$ via Taylor's Theorem.
We may assume that $p \in(0,1)$. Let us apply Taylor's Theorem to $f(x)=(1+x)^{p}$, which has derivatives of every order on $(-1, \infty)$. Hence, for any $x>-1$ and every $n \in \mathbb{N}$, there is a number $\xi_{n}$ between 0 and $x$ such that
$$
(1+x)^{p}=1+\sum_{k=1}^{n-1} \frac{p(p-1) \cdots(p-(k-1))}{k !} x^{k}+E_{n}(x)
$$
where
$$
E_{n}(x)=\frac{f^{(n)}\left(\xi_{n}\right)}{n !} x^{n}
$$
where $\xi_{n}$ lies between $0$ and $x$, and
$$
\frac{f^{(n)}(x)}{n !}=\frac{p(p-1) \cdots(p-(n-1))}{n !}(1+x)^{p-n}
$$
Hence
$$
E_{n}(x)=\frac{p(p-1) \cdots(p-(n-1))}{n !}\left(1+\xi_{n}\right)^{p}\left(\frac{x}{1+\xi_{n}}\right)^{n}
$$
If $x \in[0,1]$, then $\xi_{n} \in(0,1)$ so that
$$
\left|\left(1+\xi_{n}\right)^{p}\left(\frac{x}{1+\xi_{n}}\right)^{n}\right| \leq 2^{p}
$$
and therefore
$$
\begin{aligned}
\left|E_{n}(x)\right| & \leq 2^{p}\left|\frac{p(p-1) \cdots(p-(n-1))}{n !}\right| \\
&=2^{p} p \frac{(1-p)(2-p) \cdots(n-1-p)}{n !} \\
&=2^{p} p \frac{1-p}{1} \frac{2-p}{2} \cdots \frac{n-1-p}{n-1} \frac{1}{n} \\
& \leq \frac{2^{p} p}{n} \rightarrow 0
\end{aligned}
$$
so that, by the Sandwich lemma, $E_{n}$ converges to zero uniformly on $[0,1]$. It follows that $(1+x)^{p}=P(x)$ for $x \in[0,1]$, and together with part 1), 2) now follows.
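The estimate $\left|E_{n}(x)\right| \leq 2^{p} p / n$ can also be illustrated numerically at the endpoint $x=1$, where the error is largest; the sketch below (sample $p=0.4$) compares the actual truncation error with the bound. It is only a plausibility check, not a replacement for the argument above.

```python
# Hedged check of the remainder bound |E_n(x)| <= 2^p * p / n for p in (0,1), at x = 1.
def P_partial(p: float, x: float, n: int) -> float:
    a, s = 1.0, 1.0
    for k in range(1, n):              # 1 + sum_{k=1}^{n-1} a_k x^k
        a *= (p - (k - 1)) / k
        s += a * x ** k
    return s

p, x = 0.4, 1.0
for n in (10, 100, 1000):
    err = abs((1 + x) ** p - P_partial(p, x, n))
    print(n, err, 2 ** p * p / n)      # the error stays below the bound
```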
For $p>0$, we can show that $(1+x)^{p}=P(x)$ for every $x \in[-1,1]$, which is the content of the following theorem. Before doing this, we observe that, for $\alpha>0$,
$$
\lim _{x>0, x \rightarrow 0} x^{\alpha}=\lim _{x \downarrow 0} \exp (\alpha \ln x)=0
$$
so we naturally define $0^{\alpha}=0$ for $\alpha>0$. Hence the power function $x^{\alpha}$ is continuous on $[0, \infty)$ if the power $\alpha>0$.
Theorem 2.4.8 Let $p$ be a real number, and $P(x)$ denote the Taylor expansion of $(1+x)^{p}$ at 0, that is
$$
P(x)=1+\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n}
$$
1) If $p>-1$ then $(1+x)^{p}=P(x)$ for all $x \in(-1,1]$.
2) If $p>0$, then $(1+x)^{p}=P(x)$ for all $x \in[-1,1]$, and the convergence of the power series $P(x)$ is uniform on $[-1,1]$.
Assume that $p \neq 0,1,2, \cdots$. According to Taylor's Theorem, for every $x>-1$ and every $n \in \mathbb{N}$, there is $\xi_{n}$ between 0 and $x$ such that
$$
(1+x)^{p}=1+\sum_{m=1}^{n-1} \frac{p(p-1) \cdots(p-(m-1))}{m !} x^{m}+E_{n}(x)
$$
where the error term is given by, as we have seen in the proof of Theorem 2.4.7,
$$
\begin{aligned}
E_{n}(x) &=\frac{p(p-1) \cdots(p-(n-1))}{n !}\left(1+\xi_{n}\right)^{p-n} x^{n} \\
&=\frac{p(p-1) \cdots(p-(n-1))}{n !}\left(1+\xi_{n}\right)^{p}\left(\frac{x}{1+\xi_{n}}\right)^{n}
\end{aligned}
$$
Step 1. If $x \in[0,1]$, then $\left|\frac{x}{1+\xi_{n}}\right| \leq 1$ and $\left(1+\xi_{n}\right)^{p} \leq 2^{p}$, so that
$$
\left|E_{n}(x)\right| \leq 2^{p} \frac{|p(p-1) \cdots(p-(n-1))|}{n !}=2^{p}\left|a(p)_{n}\right|
$$
where
$$
\begin{aligned}
a(p)_{n} &=\frac{p(p-1) \cdots(p-(n-1))}{n !} \\
&=(-1)^{n} \frac{(-p)(1-p) \cdots((n-1)-p)}{n !}
\end{aligned}
$$
If $p \in(0,1)$ then
$$
a(p)_{n}=(-1)^{n-1} \frac{p}{n}\left(1-\frac{p}{1}\right)\left(1-\frac{p}{2}\right) \cdots\left(1-\frac{p}{n-1}\right)
$$
so that
$$
\left|a(p)_{n}\right| \leq \frac{p}{n} \rightarrow 0
$$
which implies that $E_{n} \rightarrow 0$ uniformly on $[0,1]$ in this case where $p \in(0,1)$.
If $p \in(-1,0)$ then $1+p \in(0,1)$ and we may rewrite
$$
\begin{aligned}
a(p)_{n} &=(-1)^{n} \frac{(1-(p+1))(2-(p+1)) \cdots(n-(1+p))}{n !} \\
&=(-1)^{n}\left(1-\frac{p+1}{1}\right)\left(1-\frac{p+1}{2}\right) \cdots\left(1-\frac{p+1}{n}\right)
\end{aligned}
$$
Let us prove the elementary inequality
$$
1-t \leq e^{-t} \quad \text { for } t \geq 0
$$
Let $g(t)=1-t-e^{-t}$. Then $g(0)=0$ and $g^{\prime}(t)=-1+e^{-t} \leq 0$ for $t \geq 0$. Hence $g$ is decreasing on $[0, \infty)$ and therefore $g(t) \leq 0$ for all $t \geq 0$.
By using this inequality we obtain, since $0<1+p<1$,
$$
\left|a(p)_{n}\right| \leq \exp \left(-(1+p) \sum_{k=1}^{n} \frac{1}{k}\right) \rightarrow 0
$$
because $1+p>0$ and $\sum_{k=1}^{n} \frac{1}{k} \rightarrow \infty$. Therefore $E_{n} \rightarrow 0$ as $n \rightarrow \infty$ uniformly on $[0,1]$ for every $p>-1$, so that, together with Theorem 2.4.7, we thus have
$$
(1+x)^{p}=1+\sum_{n=1}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n} \quad \text { for } x \in(-1,1]
$$
and the convergence is uniform on $[-1+\delta, 1]$ for any $0<\delta<1$. This proves 1) and part of 2).
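The decay of $\left|a(p)_{n}\right|$ for $p \in(-1,0)$, which drives Step 1, is easy to observe numerically; the sketch below (sample $p=-0.6$) compares the product $\prod_{k=1}^{n}\left(1-\frac{p+1}{k}\right)$ with the bound $\exp \left(-(1+p) \sum_{k=1}^{n} \frac{1}{k}\right)$. The choice of $p$ and the cut-offs are arbitrary.

```python
# Hedged check: |a(p)_n| = prod_{k=1}^n (1 - (p+1)/k) <= exp(-(1+p) H_n) -> 0 for p in (-1,0).
from math import exp

def abs_a(p: float, n: int) -> float:
    prod = 1.0
    for k in range(1, n + 1):
        prod *= 1 - (p + 1) / k
    return prod

p = -0.6
for n in (10, 100, 1000, 10000):
    H_n = sum(1.0 / k for k in range(1, n + 1))
    print(n, abs_a(p, n), exp(-(1 + p) * H_n))
```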
Step 2. Now we prove 2), so we assume that $p>0$. Without loss of generality, let us assume that $p \in(0,1)$. We want to show that $(1+x)^{p}=P(x)$ for all $x \in[-1,1]$ and that the convergence is uniform on $[-1,1]$. Note that
$$
P(x)=1+p x+\sum_{n=2}^{\infty} a(p)_{n} x^{n} \quad \forall x \in[-1,1]
$$
where
$$
a(p)_{n}=\frac{p(p-1) \cdots(p-(n-1))}{n !}
$$
Of course we only need to show that $P(x)$ is convergent at $x=-1$. By Abel's theorem, it suffices to prove that the series
$$
1-p+\sum_{n=2}^{\infty}(-1)^{n} a(p)_{n}
$$
is convergent. As we have mentioned, we may rewrite
$$
a(p)_{n}=(-1)^{n-1} \frac{p}{n}\left(1-\frac{p}{1}\right)\left(1-\frac{p}{2}\right) \cdots\left(1-\frac{p}{n-1}\right)
$$
so that
$$
(-1)^{n} a(p)_{n}=-\frac{p}{n}\left(1-\frac{p}{1}\right)\left(1-\frac{p}{2}\right) \cdots\left(1-\frac{p}{n-1}\right)
$$
for $n \geq 2$, which has a definite sign (always negative) for $p \in(0,1)$. Using the elementary inequality $1-t \leq e^{-t}$ established above, one obtains
$$
\begin{aligned}
0 & \leq-(-1)^{n} a(p)_{n} \\
& \leq \frac{p}{n} \exp \left\{-p \sum_{k=1}^{n-1} \frac{1}{k}\right\} \\
&=\frac{p}{n} \exp \left\{-p \gamma_{n-1}-p \ln (n-1)\right\} \\
&=\frac{p}{n} \frac{1}{(n-1)^{p}} e^{-p \gamma_{n-1}}
\end{aligned}
$$
where
$$
\gamma_{n-1}=\sum_{k=1}^{n-1} \frac{1}{k}-\ln (n-1) \rightarrow \gamma
$$
the Euler constant. Hence $e^{-p \gamma_{n-1}} \rightarrow e^{-p \gamma}$ as $n \rightarrow \infty$, and therefore the sequence $e^{-p \gamma_{n-1}}$ is bounded by some constant $C$. Therefore
$$
0 \leq-(-1)^{n} a(p)_{n} \leq p C \frac{1}{n(n-1)^{p}}
$$
for any $n \geq 2$. Since $p>0$, $\sum \frac{1}{n(n-1)^{p}}$ is convergent, so that, by the comparison test for series,
$$
\sum_{n=2}^{\infty}(-1)^{n-1} a(p)_{n}
$$
is convergent. Since
$$
\left|\frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n}\right| \leq(-1)^{n-1} a(p)_{n} \leq p C \frac{1}{n(n-1)^{p}}
$$
for every $x \in[-1,1]$ and every $n \geq 2$, the Weierstrass M-test for uniform convergence, together with Abel's theorem, shows that, for $p>0$, the power series
$$
\sum_{n=2}^{\infty} \frac{p(p-1) \cdots(p-(n-1))}{n !} x^{n}
$$
converges uniformly to $(1+x)^{p}-1-p x$ on $[-1,1]$, which proves 2).
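As an illustration of Step 2 (again only a numerical plausibility check with arbitrary sample values), the partial sums of $P(x)$ at the left endpoint $x=-1$ do tend to $(1+(-1))^{p}=0$, roughly like $N^{-p}$, in line with the comparison with $\sum \frac{1}{n(n-1)^{p}}$.

```python
# Hedged check that the series converges at x = -1 for p in (0,1), with sum 0 = 0^p.
def P_partial(p: float, x: float, N: int) -> float:
    a, s = 1.0, 1.0
    for n in range(1, N + 1):
        a *= (p - (n - 1)) / n
        s += a * x ** n
    return s

p = 0.5
for N in (10, 100, 1000, 10000):
    print(N, P_partial(p, -1.0, N))    # decreases to 0, roughly like N**(-p)
```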
For example
$$
\sqrt{1+x}=1+\sum_{n=1}^{\infty} \frac{\frac{1}{2}\left(\frac{1}{2}-1\right) \cdots\left(\frac{1}{2}-(n-1)\right)}{n !} x^{n} \quad \forall x \in[-1,1]
$$
and the convergence of the Taylor expansion on $[-1,1]$ is uniform, and
$$
\frac{1}{\sqrt{1+x}}=1+\sum_{n=1}^{\infty} \frac{\left(-\frac{1}{2}\right)\left(-\frac{1}{2}-1\right) \cdots\left(-\frac{1}{2}-(n-1)\right)}{n !} x^{n} \quad \forall x \in(-1,1] .
$$
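A quick numerical check of these two expansions at the endpoints (sample cut-off $N=2000$; convergence at the endpoints is slow) might look as follows.

```python
# Hedged endpoint check for sqrt(1+x) and 1/sqrt(1+x); partial sums converge slowly here.
def P_partial(p: float, x: float, N: int) -> float:
    a, s = 1.0, 1.0
    for n in range(1, N + 1):
        a *= (p - (n - 1)) / n
        s += a * x ** n
    return s

print(P_partial(0.5, 1.0, 2000), 2 ** 0.5)     # ~ sqrt(2)
print(P_partial(0.5, -1.0, 2000))              # ~ 0 = sqrt(0)
print(P_partial(-0.5, 1.0, 2000), 2 ** -0.5)   # ~ 1/sqrt(2); no convergence at x = -1
```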
A similar proof can be found on page 24 of complex.pdf:
Consider the multi-function \[
\left[(1+z)^\alpha\right]=\{\exp (\alpha \cdot w): w \in \mathbb{C}, \exp (w)=1+z\}
\]
Using the principal branch $L(z)$ of $[\log (z)]$, we obtain a branch $f(z)$ of $[(1+z)^\alpha]$ given by $f(z)=\exp (\alpha \cdot L(1+z))$. Let $\left(\begin{array}{l}\alpha \\ k\end{array}\right)=\frac{1}{k !} \alpha(\alpha-1) \cdots(\alpha-k+1)$. We want to show that a version of the binomial theorem holds for this branch of the multi-function $\left[(1+z)^\alpha\right]$.
Let $s(z)=\sum_{k=0}^{\infty}\left(\begin{array}{l}\alpha \\ k\end{array}\right) z^{k}$.
By the ratio test, $s(z)$ has radius of convergence equal to 1, so that $s(z)$ defines a holomorphic function in $B(0,1)$. Moreover, you can check using the properties of power series established in a previous section, that within $B(0,1), s(z)$ satisfies $(1+z) s^{\prime}(z)=\alpha \cdot s(z)$.
Now $f(z)$ is defined on $\mathbb{C} \backslash(-\infty,-1]$, and hence on all of $B(0,1)$. Moreover $f^{\prime}(z)=\frac{\alpha}{1+z} f(z)$. We claim that within the open ball $B(0,1)$ the power series $s(z)=\sum_{k=0}^{\infty}\left(\begin{array}{c}\alpha \\ k\end{array}\right) z^{k}$ coincides with $f(z)$. Indeed if we set
\[
g(z)=\frac{s(z)}{f(z)}
\]
then $g(z)$ is holomorphic for every $z \in B(0,1)$ and by the chain rule
\[
g^{\prime}(z)=\frac{s^{\prime}(z) f(z)-s(z) f^{\prime}(z)}{f^2(z)}=0
\]
since $s^{\prime}(z)=\frac{\alpha \cdot s(z)}{1+z}$. Also $g(0)=1$ so $g$ is constant and $s(z)=f(z)$.
Here we use the fact that if a holomorphic function $g$ has $g^{\prime}(z)=0$ on $B(0,1)$ then it is constant. We have already proven this for $\mathbb{C}$ and in fact the same proof applies to $B(0,1)$. Indeed, as we saw in the case of $\mathbb{C}$, if $g^{\prime}(z)=0$ for all $z$ then $g$ is constant on any vertical and horizontal segment, which clearly implies that $g$ is constant on $B(0,1)$. We note that this follows also from the following general result that we will prove soon: if a holomorphic function $g$ has $g^{\prime}(z)=0$ for all $z$ in a domain $U$, then $g$ is constant on $U$.
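The same identification of the branch $f(z)=\exp (\alpha \cdot L(1+z))$ with the power series $s(z)$ can be tested numerically inside $B(0,1)$; the snippet below uses Python's cmath.log, whose principal branch plays the role of $L$, with arbitrary sample values of $\alpha$ and $z$.

```python
# Hedged numerical sketch: partial sums of s(z) vs f(z) = exp(alpha * Log(1+z)) in B(0,1).
import cmath

def s_partial(alpha: complex, z: complex, N: int) -> complex:
    a, s = 1.0 + 0j, 1.0 + 0j
    for k in range(1, N + 1):
        a *= (alpha - (k - 1)) / k     # builds binom(alpha, k) recursively
        s += a * z ** k
    return s

alpha, z = 1.5 - 0.7j, 0.3 + 0.4j        # |z| = 0.5 < 1
f = cmath.exp(alpha * cmath.log(1 + z))  # cmath.log returns the principal branch
print(s_partial(alpha, z, 200), f)
```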
Olinde Rodrigues,
Journal de Mathématiques Pures et Appliquées, 1st series, Vol. 3 (1838), pp. 550-551. numdam.org/item?id=JMPA_1838_1_3__550_0
Démonstration élémentaire et purement algébrique du développement d'un binôme élevé à une puissance négative ou fractionnaire
(An elementary and purely algebraic proof of the expansion of a binomial raised to a negative or fractional power)