Section 2.4 Interchange of Differentiation and Limit of a Sequence of Functions

Proof.

Let’s first give a proof under a stronger assumption: each \(f_n'(x)\) is continuous on \([a, b]\text{,}\) and let’s denote \(\lim_{n\to\infty} f_n'(x)\) by \(g(x)\text{.}\) Then \(g(x)\) is continuous on \([a, b]\) as the uniform limit of continuous functions, and by the Fundamental Theorem of Calculus, \(f_n(x)=f_n(x_0)+\int_{x_0}^x f_n'(t)\, dt\text{.}\) Theorem 2.2.7 implies that \(\int_{x_0}^x f_n'(t)\, dt\to \int_{x_0}^x g(t)\, dt\text{,}\) so \(\lim_{n\to\infty} f_n(x)\) exists; denote it by \(f(x)\text{,}\) and
\begin{equation*} f(x)=\lim_{n\to\infty} f_n(x_0)+ \int_{x_0}^x g(t)\, dt\text{.} \end{equation*}
It then follows from the Fundamental Theorem of Calculus, since \(g\) is continuous, that
\begin{equation*} f'(x)= g(x)=\lim_{n\to \infty} f_n'(x). \end{equation*}
Furthermore, the convergence above is uniform over \(x\in [a, b]\text{,}\) as
\begin{equation*} \sup_{x\in [a, b]} \left| \int_{x_0}^x f_n'(t)\, dt- \int_{x_0}^x g(t)\, dt \right| \le \int_a^b \left| f_n'(t)-g(t)\right|\, dt \to 0 \end{equation*}
as \(n\to \infty\text{.}\)
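The estimate above can be checked numerically. The following Python sketch uses a hypothetical sequence \(f_n(x) = x^2 + \sin(nx)/n^2\) on \([0,1]\) (chosen here purely for illustration, not taken from the text), whose derivatives \(f_n'(x) = 2x + \cos(nx)/n\) converge uniformly to \(g(x) = 2x\text{:}\)

```python
import math

# Hypothetical illustration (not from the text): on [a, b] = [0, 1],
# f_n(x) = x^2 + sin(nx)/n^2 has f_n'(x) = 2x + cos(nx)/n, which
# converges uniformly to g(x) = 2x with sup-distance exactly 1/n.
a, b = 0.0, 1.0

def fn_prime(n, x):
    return 2 * x + math.cos(n * x) / n

def g(x):
    return 2 * x

def sup_diff(n, num=1000):
    # grid estimate of sup_{x in [a, b]} |f_n'(x) - g(x)|
    xs = [a + (b - a) * i / num for i in range(num + 1)]
    return max(abs(fn_prime(n, x) - g(x)) for x in xs)

# The key bound in the proof: sup_x |int f_n' - int g| <= int_a^b |f_n' - g|
# <= (b - a) * sup|f_n' - g|, which tends to 0 as n grows.
for n in (10, 100, 1000):
    print(n, sup_diff(n))  # 0.1, 0.01, 0.001
```

The sup is attained at \(x=0\text{,}\) where \(\cos(nx)/n = 1/n\text{,}\) so the printed values shrink like \(1/n\text{,}\) as the proof requires.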
We now do a proof in the general case. First, we prove that \(\{f_n(x)\}\) converges uniformly over \([a, b]\text{.}\) Under the assumption of uniform convergence of \(\{f_n'(x)\}\) over \([a, b]\text{,}\) for any \(\epsilon >0\text{,}\) there exists \(N\) such that for \(n, m\ge N, x\in [a, b]\text{,}\) we have
\begin{equation} |f_n'(x)-f_m'(x)| \lt \epsilon.\tag{2.4.1} \end{equation}
Applying the theorem of the mean to \(f_n(x)-f_m(x)-\left[ f_n(x_0)-f_m(x_0)\right]\text{,}\) we get
\begin{equation*} f_n(x)-f_m(x)-\left[ f_n(x_0)-f_m(x_0)\right]=\left(f_n'(x^*)-f_m'(x^*)\right)(x-x_0) \end{equation*}
for some \(x^*\) depending on \(n, m, x\text{.}\) This leads to
\begin{equation*} \left| f_n(x)-f_m(x)\right| \le \left| f_n(x_0)-f_m(x_0)\right| + \epsilon (b-a). \end{equation*}
Since \(f_n(x_0)-f_m(x_0) \to 0\) as \(n, m \to \infty\text{,}\) this shows that \(\{f_n(x)\}\) satisfies the Cauchy Criterion for Uniform Convergence on \([a, b]\text{,}\) therefore it converges uniformly to some \(f(x)\) on \([a, b]\text{.}\)
Next, for any \(x\in (a, b)\text{,}\) we define \(g_n(h)=\left[f_n(x+h)-f_n(x)\right]/h\) for \(0 \lt |h| \lt \delta\text{,}\) where \(\delta >0\) is chosen small enough that \(x+h\in [a, b]\) (when \(x=a\) or \(x=b\text{,}\) we restrict \(h\) to have the appropriate sign). Note that \(g_n(h)\to f_n'(x)\) as \(h\to 0\) and that \(g_n(h)\to \left[f(x+h)-f(x)\right]/h\) as \(n\to \infty\text{.}\) We next show that the latter convergence is uniform for \(0 \lt |h| \lt \delta\text{.}\)
We apply the theorem of the mean to \(g_n(h)-g_m(h)\) to get
\begin{equation*} g_n(h)-g_m(h)=f_n'(x^*)-f_m'(x^*) \end{equation*}
for some \(x^*\) between \(x\) and \(x+h\) depending on \(n, m, x, h\text{.}\) But (2.4.1) holds for \(n, m\ge N, x\in [a, b]\text{,}\) so we get \(| g_n(h)-g_m(h) | \lt \epsilon\) for \(n, m\ge N\) uniformly in \(0 \lt |h| \lt \delta\text{.}\) This shows that \(g_n(h)\) converges to \(\left[f(x+h)-f(x)\right]/h\) uniformly in \(0 \lt |h| \lt \delta\) as \(n\to \infty\text{.}\) Now Theorem 2.2.1 applies to \(g_n(h)\) to conclude that
\begin{equation*} \lim_{h\to 0} \left[f(x+h)-f(x)\right]/h = \lim_{n\to \infty} f_n'(x). \end{equation*}
Namely, \(f'(x)\) exists and equals \(\lim_{n\to \infty} f_n'(x)\text{.}\)
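The hypothesis that \(\{f_n'\}\) converges uniformly cannot simply be dropped. A standard illustration, sketched below in Python (a hypothetical example chosen for this sketch, not one from the text), is \(f_n(x)=\sqrt{x^2+1/n^2}\text{,}\) which converges uniformly to \(|x|\) on \([-1,1]\) even though the limit is not differentiable at \(0\text{:}\)

```python
import math

# Hypothetical counterexample (not from the text): f_n(x) = sqrt(x^2 + 1/n^2)
# converges uniformly to f(x) = |x| on [-1, 1] (since 0 <= f_n - f <= 1/n),
# but f is not differentiable at 0; here f_n'(x) = x / sqrt(x^2 + 1/n^2)
# converges pointwise to sign(x), and the convergence is not uniform near 0.
def f_n(n, x):
    return math.sqrt(x * x + 1.0 / (n * n))

def f_n_prime(n, x):
    return x / math.sqrt(x * x + 1.0 / (n * n))

# Uniform convergence of f_n: the sup over a grid equals 1/n (attained at 0).
sup_fn = max(abs(f_n(100, x / 500.0) - abs(x / 500.0)) for x in range(-500, 501))

# Non-uniform convergence of f_n': at x = 1/n the derivative is 1/sqrt(2),
# while the pointwise limit sign(x) equals 1 there -- a gap that never closes.
gap = abs(f_n_prime(100, 1.0 / 100) - 1.0)
```

Here `sup_fn` is \(1/100\text{,}\) confirming uniform convergence of \(f_n\text{,}\) while `gap` equals \(1-1/\sqrt 2\) for every \(n\text{,}\) so the derivatives fail the uniformity needed by the theorem.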

Remark 2.4.2.

In Theorem 2.4.1, the assumptions are sufficient but not necessary. E.g. \(f_n(x)= x^n/n\) converges to \(f(x)=0\) uniformly on \(I=(-1,1)\text{,}\) and \(f(x)=0\) is differentiable on \(I\text{,}\) yet the convergence of \(f'_n(x)=x^{n-1}\) (to \(f'(x)=0\)) is not uniform on \(I\) (although, when restricted to \((-\delta, \delta)\) for any fixed \(0 < \delta < 1\text{,}\) the convergence is uniform). Another example is \(f_n(x)= \sin (nx)/n\text{,}\) which converges to \(f(x)=0\) uniformly on \(\mathbb R\text{,}\) yet \(f'_n(x)= \cos (nx)\) does not converge for many values of \(x\) (e.g. any rational multiple of \(\pi\) that is not a multiple of \(2\pi\)).
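The behavior of the first example \(f_n(x)=x^n/n\) is easy to probe numerically; the following Python sketch estimates \(\sup |f_n'(x)| = \sup |x^{n-1}|\) on a grid inside the open interval:

```python
# Grid check of the remark's example f_n(x) = x^n/n (a sketch, not a proof):
# sup |f_n'(x)| = sup |x^(n-1)| over |x| < radius.
def sup_deriv(n, radius, num=1000):
    # sample strictly inside the open interval (largest point: 0.999 * radius)
    return max((radius * i / num) ** (n - 1) for i in range(num))

# On (-1, 1) the sup stays near 1 for every n (no uniform convergence)...
print(sup_deriv(50, 1.0))   # roughly 0.95 on this grid, near the true sup 1
# ...but on (-1/2, 1/2) it is about (1/2)^49, which is essentially 0.
print(sup_deriv(50, 0.5))
```

The first value stays bounded away from \(0\) however large \(n\) is, while the second decays geometrically, matching the remark's claim about \((-\delta,\delta)\text{.}\)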

Example 2.4.3.

Define
\begin{equation*} f(x)= \sum_{k=1}^{\infty} \frac{ \cos (kx)}{k^2}\text{.} \end{equation*}
The series converges uniformly for \(x\in\mathbb R\) (by the Weierstrass M-test, since \(\left|\cos(kx)/k^2\right|\le 1/k^2\)), so it defines a continuous function on \(\mathbb R\text{.}\) To check its differentiability at any \(x\text{,}\) we need to check whether the differentiated series
\begin{equation*} - \sum_{k=1}^{\infty} \frac{ \sin (kx)}{k} \end{equation*}
converges uniformly in a neighborhood of \(x\text{,}\) according to Theorem 2.4.1.
We studied this series earlier and showed that for any \(\delta\) with \(0 \lt \delta \lt \pi\text{,}\) it converges uniformly when restricted to \(\left\{x\in [-\pi, \pi]: \delta \le |x|\right\}\text{.}\) Thus in that region, we do have
\begin{equation*} f'(x)= -\sum_{k=1}^{\infty} \frac{ \sin (kx)}{k}\text{.} \end{equation*}
Does \(f'(0)\) exist?
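As a numerical experiment (not a proof, and with truncation parameters chosen ad hoc), one can compare the one-sided difference quotients of \(f\) at \(0\) using large partial sums; they appear to approach \(-\pi/2\) from the right and \(+\pi/2\) from the left, suggesting a corner at \(0\text{:}\)

```python
import math

# Numerical experiment on f(x) = sum cos(kx)/k^2 near x = 0 (a sketch only;
# 200000 terms is an arbitrary truncation, accurate to about 1e-5 here).
def f(x, terms=200000):
    return sum(math.cos(k * x) / (k * k) for k in range(1, terms + 1))

f0 = f(0.0)  # close to pi^2/6
h = 0.01
dq_right = (f(h) - f0) / h      # one-sided quotient from the right
dq_left = (f(-h) - f0) / (-h)   # from the left (f is even)
print(dq_right, dq_left)        # near -pi/2 and +pi/2, respectively
```

The two one-sided quotients appear to converge to different values, which is consistent with \(f'(0)\) failing to exist.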

Remark 2.4.4.

Weierstrass’ nowhere differentiable function, which is defined as
\begin{equation*} \sum_{n=0}^{\infty} a^n\cos \left(b^n\pi x\right), \quad b \text{ an odd integer}, \quad 0 < a < 1, \quad ab>1+\frac{3\pi }{2}, \end{equation*}
is the uniform limit on \(\mathbb R\) of the partial sums of this series, whose terms are infinitely differentiable. One can also construct a nowhere differentiable function as the uniform limit of a series of piecewise linear sawtooth functions, but the building blocks in that construction are themselves not everywhere differentiable.
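A numerical sketch of the nowhere-differentiability (with the hypothetical parameter choices \(a=1/2\text{,}\) \(b=13\text{,}\) which satisfy \(b\) odd and \(ab = 6.5 > 1+3\pi/2 \approx 5.71\)): the difference quotients of a truncated Weierstrass sum at \(x=0\) grow without bound along \(h=b^{-m}\text{.}\)

```python
import math

# Truncated Weierstrass function W(x) = sum_{n<N} A^n cos(B^n pi x) with the
# hypothetical choices A = 0.5, B = 13 (so A*B = 6.5 > 1 + 3*pi/2).
A, B, N = 0.5, 13, 40  # truncation error ~ A^N, negligible at these scales

def W(x):
    return sum(A ** n * math.cos(B ** n * math.pi * x) for n in range(N))

# Difference quotients at x = 0 along h = B^(-m): for n >= m each term
# contributes cos(odd * pi) - 1 = -2, so the quotients grow like (A*B)^m.
quotients = [abs((W(B ** -m) - W(0.0)) / (B ** -m)) for m in range(1, 6)]
print(quotients)  # increasing roughly by a factor of 6.5 each step
```

No choice of derivative value at \(0\) can survive quotients that blow up along a sequence \(h\to 0\text{,}\) which is the mechanism behind the nowhere-differentiability proof.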
Ideas similar to Weierstrass's construction show up in the later work of J. Nash on constructing \(C^{1}\) isometric imbeddings of a given Riemannian metric, and in more recent work constructing very rough solutions of the Navier-Stokes equations. Roughly speaking, one constructs sufficiently differentiable functions that approximately satisfy the specified equations, but in the limit only low-order regularity is preserved and further differentiability is lost.