Week: 16 Date: June 8, 2022 Session: 1
In this lecture, we show that a convergent power series represents a function $f(x)$, and that the coefficients of the power series can in turn be expressed in terms of the function. This opens a new page for the application of power series, since we can associate a power series to any infinitely differentiable function. To see that the power series derived from a function really converges to that function, we have to use Taylor's formula and estimate the remainder.
Multiplication of Power Series
Theorem 21. The Series Multiplication Theorem for Power Series
If $A(x)=\sum_{n=0}^\infty a_n x^n$ and $B(x)=\sum_{n=0}^\infty b_n x^n$ converge absolutely for $|x|<R$ and
$$ c_n = \text{sum of all the coefficients of $n$-th order terms in the multiplication}\ =a_0 b_n + a_1 b_{n-1} + a_2 b_{n-2} + \cdots + a_n b_0 = \sum_{k=0}^n a_k b_{n-k} $$
then $\sum_{n=0}^\infty c_n x^n$ converges absolutely to $A(x) B(x)$ for $|x|<R$
$$ A(x)B(x) = \left(\sum_{n=0}^\infty a_n x^n\right)\cdot \left(\sum_{n=0}^\infty b_n x^n\right) \underbrace{=}_{\text{important}} \sum_{n=0}^\infty c_n x^n $$
Example 7. Multiply the geometric series
Given the power series $\frac1{1-x} = 1 + x + x^2 + \cdots$, multiplying it with itself, one finds the power series for $\frac1{(1-x)^2}$: here $a_k = b_k = 1$, so $c_n = \sum_{k=0}^n 1\cdot 1 = n+1$, and $\frac1{(1-x)^2} = \sum_{n=0}^\infty (n+1)x^n$ for $|x|<1$.
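A quick numerical sanity check of this product (an illustrative Python sketch, not part of the text):

```python
# Illustrative check: squaring the geometric series 1/(1-x) = sum x^n.
# Cauchy product coefficients with a_k = b_k = 1 are c_n = n + 1.
N = 50
x = 0.3

c = [n + 1 for n in range(N)]                    # c_n = sum_{k=0}^n 1*1
partial = sum(c[n] * x ** n for n in range(N))   # partial sum of sum c_n x^n
exact = 1 / (1 - x) ** 2

assert abs(partial - exact) < 1e-9
```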
§11.8 Taylor and Maclaurin Series
The previous section has shown that the sum of a power series, when treated as a function $f(x)$, is infinitely differentiable inside the interval of convergence. In this section, we ask the converse question: if a function $f(x)$ is infinitely differentiable, can we generate a power series from it? And if we can, will the generated power series converge to the function $f(x)$? The first question is answered in this section.
Suppose $f(x)$ can be written as the sum of a power series on an interval $(a-R,a+R)$
$$ f(x) = \sum_{n=0}^\infty a_n (x-a)^n = a_0 + a_1(x-a) + a_2 (x-a)^2 + \cdots $$
with a positive radius of convergence $R$. By repeated term-by-term differentiation within the interval of convergence $I$ we obtain
$$ f'(x) = a_1 + 2 a_2 (x-a) + 3a_3(x-a)^2 + \cdots \\ f''(x) = 2a_2 + 3\cdot 2\, a_3 (x-a) + 4\cdot 3\, a_4 (x-a)^2 + \cdots \\ \vdots \\ f^{(n)}(x) = n!\, a_n + \text{a sum of terms with $(x-a)$ as a factor} $$
Since these equations all hold at $x=a$, we have
$$ f(a) = a_0,\quad f'(a) = a_1,\quad f''(a) = 2!\, a_2,\quad \cdots,\quad f^{(n)}(a) = n!\, a_n. $$
So if there’s a series converging to $f(x)$, then the coefficients must be
$$ a_n = \frac{f^{(n)}(a)}{n!}. $$
The series is then completely determined by $f(x)$:
$$ f(x) = \sum_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x-a)^n. $$
But what will happen if we only know that $f(x)$ is an infinitely differentiable function on an interval $I$?
For any $a\in I$, we can still generate the series $\sum_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x-a)^n$, but we have to study whether this series converges to $f(x)$ or not. This will be discussed in the next section.
Taylor and Maclaurin Series
Definitions. Taylor Series, Maclaurin Series
Let $f$ be a function with derivatives of all orders throughout some interval containing $a$ as an interior point. The Taylor series generated by $f$ at $x=a$ is
$$ \sum_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x-a)^n. $$
If $x=0$ is an interior point of the interval, the Taylor series generated by $f$ at $x=0$ is also called a Maclaurin series (generated by $f$), which is simply
$$ \sum_{n=0}^\infty \frac{f^{(n)}(0)}{n!}x^n. $$
Example 1. Finding a Taylor Series
Find the Taylor series generated by $f(x) = 1/x$ at $a=2$. Where does the series converge to $1/x$ ?
Solution. The derivatives of $f(x)=x^{-1}$ are $f^{(n)}(x) = (-1)^n n!\, x^{-(n+1)}$, so $f^{(n)}(2) = \frac{(-1)^n n!}{2^{n+1}}$. Therefore the Taylor series is
$$ \sum_{n=0}^\infty \frac{(-1)^n}{2^{n+1}} (x-2)^n. $$
It is a geometric series that converges for $|x-2|<2$, with sum $\frac{1/2}{1+\frac{x-2}{2}} = \frac1{x}$. Therefore the Taylor series converges to $1/x$ on $(0,4)$.
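A small numerical check (an illustrative Python sketch, not part of the text): partial sums of the series approach $1/x$ at a point inside $(0,4)$.

```python
# Partial sums of sum (-1)^n (x-2)^n / 2^(n+1),
# the Taylor series of 1/x at a = 2.
def taylor_partial(x, n_terms):
    return sum((-1) ** n * (x - 2) ** n / 2 ** (n + 1) for n in range(n_terms))

x = 3.0  # inside the interval of convergence (0, 4)
assert abs(taylor_partial(x, 60) - 1 / x) < 1e-12
```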
Example 2. Finding a Maclaurin Series
Find the Maclaurin series generated by $f(x) = e^x$.
Solution. The Maclaurin series generated is
$$ \sum_{n=0}^\infty \frac{x^n}{n!}. $$
Taylor Polynomials
If the function $f(x)$ has higher-order derivatives at $a$, then we can truncate the Taylor series at a finite order, resulting in a polynomial. For example, if $f$ has a derivative at $a$, we can truncate the Taylor series at order $1$, ending up with the order-$1$ polynomial
$$ f(a) + f'(a)(x-a) $$
which is just the linear approximation of $f(x)$ at $x=a$ given in Section 3.8. From this point of view, a higher-order Taylor polynomial might be a better approximation of $f(x)$.
Definition. Taylor Polynomial of Order $n$ (Partial sum of the Taylor Series)
Let $f$ be a function with derivatives of order $k=1,2,\cdots, N$ in some interval containing $a$ as an interior point. Then for any integer $n\le N$, the Taylor polynomial of order $n$ generated by $f$ at $x=a$ is the polynomial
$$ P_n(x) = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n. $$
Notice.
Since the $n$th derivative of $f$ at $a$ may be $0$, the actual degree of the polynomial might be smaller than $n$. That is why we don't call $P_n(x)$ a degree-$n$ polynomial, but use the word order instead.
Example 2. Finding Taylor Polynomials for $e^x$
Find the Taylor polynomials generated by $e^x$ at $x=0.$
Solution.
The order $n$ polynomial is given by $P_n(x) = \sum_{k=0}^n \frac{x^k}{k!}$ .
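The improvement with increasing order can be observed numerically; the following Python sketch (illustrative, the names are ours) compares $P_n(1)$ with $e$:

```python
import math

# Order-n Taylor polynomial of e^x at 0: P_n(x) = sum_{k=0}^n x^k / k!
def P(n, x):
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x = 1.0
errors = [abs(P(n, x) - math.exp(x)) for n in range(1, 8)]

# The error shrinks as the order grows.
assert all(errors[i + 1] < errors[i] for i in range(len(errors) - 1))
```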
Example 3. Finding Taylor Polynomial for $\cos(x)$
Find the Taylor series and Taylor polynomials generated by $\cos(x)$ at $x=0$.
Solution.
The $n$th derivative of $\cos(x)$ can be expressed as $\cos^{(n)}(x) = \cos\!\left(x + \frac{n\pi}{2}\right)$, so the $n$th derivative at $x=0$ is either $0$ or $\pm 1$, depending on the order. The Taylor series is given by
$$ \sum_{k=0}^\infty \frac{(-1)^k x^{2k}}{(2k)!}=1-\frac{x^2}{2!} + \frac{x^4}{4!} - \frac{x^6}{6!} +\cdots $$
And the order $2n$ and $2n+1$ Taylor polynomials (they are identical) are given by
$$ P_{2n}(x) = 1- \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots + \frac{(-1)^n x^{2n}}{(2n)!} $$
Example 4. A Function $f$ Whose Taylor Series Converges at Every $x$ but Converges to $f(x)$ Only at $x=0$
It can be shown that
$$ f(x) = \begin{cases} 0, & x=0 \\ e^{-1/x^2}, & x\neq 0 \end{cases} $$
has derivatives of all orders at $x=0$ and that $f^{(n)}(0) = 0$ for all $n$. This means the Taylor series generated by $f$ at $x=0$ is identically $0$. But the function equals $0$ only at $x=0$, so the Taylor series converges everywhere yet equals $f$ only at $x=0$.
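Numerically, the point is that $e^{-1/x^2}$ is flatter at $0$ than any power of $x$, so no power series with a nonzero coefficient can match it there. A hedged Python sketch:

```python
import math

# f(x) = exp(-1/x^2) for x != 0, f(0) = 0: positive off 0,
# yet smaller than x^10 (indeed any power of x) near 0.
def f(x):
    return 0.0 if x == 0 else math.exp(-1 / x ** 2)

for x in (0.2, 0.1, 0.05):
    assert 0 < f(x) < x ** 10   # positive, but beaten by x^10 near 0

assert f(0.1) < 1e-40           # extremely flat at 0
```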
Two Questions
- For what values of $x$ can we normally expect a Taylor series to converge to its generating function?
- How accurately do a function's Taylor polynomials approximate the function on a given interval?
§ 11.9 Convergence of Taylor Series; Error Estimates
Two questions from the previous section:
- When does a Taylor series converge to its generating function?
- How accurately do a function's Taylor polynomials approximate the function on a given interval?
Taylor’s Theorem
Theorem 22. Taylor’s Theorem and Taylor’s Formula
If $f$ and its first $n$ derivatives $f'$, $f''$, $\cdots$, $f^{(n)}$ are continuous on the closed interval between $a$ and $b$, and $f^{(n)}$ is differentiable on the open interval between $a$ and $b$, then for any $x\in[a,b]$, there exists a number $c$ between $a$ and $x$ such that
$$ \begin{equation}f(x)= f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n + R_n(x)\end{equation} $$
where the remainder $R_n(x)$ is
$$ R_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!}(x-a)^{n+1},\quad \text{for some $c$ between $a$ and $x$}. $$
For any value of $n$ we want, the equation gives both a polynomial approximation of $f$ of that order and a formula for the error involved in using that approximation over the interval $I$.
The equation (1) is called Taylor’s formula. The function $R_n(x)$ is called the remainder of order $n$ or the error term for the approximation of $f$ by $P_n(x)$ over $I.$
If the remainder $R_n(x)\to 0$ as $n\to \infty$ for all $x\in I$, we say that the Taylor series generated by $f$ at $x=a$ converges to $f$ on $I$, and we write
$$ f(x) = \sum_{k=0}^\infty \frac{f^{(k)}(a)}{k!}(x-a)^k. $$
Example 1. The Taylor Series for $e^x$ Revisited
The Taylor series generated by $f(x) = e^x$ at $x=0$ converges to $f(x)$ for every real value of $x$.
Solution. Writing down the Taylor’s formula for $e^x$ at $x=0$ we find
$$ e^x = 1 + x + \frac{x^2}{2!} + \cdots +\frac{x^n}{n!} + R_n(x) $$
where
$$ R_n(x) = \frac{e^c}{(n+1)!} x^{n+1}, \quad \text{for some $c$ between $0$ and $x$}. $$
Using $e^c \le e^{|x|}$ for $c$ between $0$ and $x$, an estimation shows
$$ |R_n(x)|\le \frac{e^{|x|} |x|^{n+1}}{(n+1)!} $$
Therefore, for any $x$, $\lim_{n\to\infty} R_n(x) = 0.$ Thus the series converges to $e^x$ for every $x$.
$$ e^x = \sum_{k=0}^\infty \frac{x^k}{k!}. $$
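The decay of the remainder bound can be seen numerically; this Python sketch (illustrative) evaluates the bound $e^{|x|}|x|^{n+1}/(n+1)!$ and the actual truncation error at $x=5$:

```python
import math

x = 5.0
# Remainder bound |R_n(x)| <= e^|x| |x|^(n+1) / (n+1)! for increasing n.
bounds = [math.exp(abs(x)) * abs(x) ** (n + 1) / math.factorial(n + 1)
          for n in range(1, 40)]
assert bounds[-1] < 1e-12   # the bound eventually becomes tiny

# The partial sum of the series is correspondingly close to e^x.
partial = sum(x ** k / math.factorial(k) for k in range(40))
assert abs(partial - math.exp(x)) < 1e-9
```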
Example 2. The Taylor Series for $\sin(x)$ at $x=0$
The Taylor series for $\sin(x)$ at $x=0$ converges for all $x$.
Solution.
The $n$th order derivative for $\sin(x)$ can be written as
$$ \sin^{(n)}(x) = \sin(x+\frac{n\pi}{2}). $$
So
$$ \sin^{(n)}(0) = \begin{cases} 0, & n=2k \\ (-1)^k, & n=2k+1 \end{cases}. $$
The series has only odd-powered terms, and Taylor's Theorem gives
$$ \sin(x) = x-\frac{x^3}{3!} + \frac{x^5}{5!} - \cdots + \frac{(-1)^k x^{2k+1}}{(2k+1)!} + R_{2k+1}(x), $$
where
$$ |R_{2k+1}(x)| = \left|\frac{\sin^{(2k+2)}(c)}{(2k+2)!}x^{2k+2}\right|\le \frac{|x|^{2k+2}}{(2k+2)!}\to 0 \quad\text{as } k\to\infty. $$
Estimating the Remainder
Theorem 23. The Remainder Estimation Theorem
If there is a positive constant $M$ such that $|f^{(n+1)}(t)|\le M$ for all $t$ between $a$ and $x$, then the remainder term $R_n(x)$ in Taylor's Theorem satisfies the inequality
$$ |R_n(x)|\le M \frac{|x-a|^{n+1}}{(n+1)!}. $$
If this condition holds for every $n$ and the other conditions of Taylor’s Theorem are satisfied by $f$, then the series converges to $f(x)$.
Example 3. The Taylor Series for $\cos(x)$ at $x=0$ Revisited.
Show the Taylor Series for $\cos(x)$ at $x=0$ converges to $\cos(x)$ for every value of $x$.
Solution.
The $n$th derivative for $\cos(x)$ at $x=0$ is
$$ \cos^{(n)}(0) =\cos\!\left(0+\frac{n\pi}{2}\right)= \begin{cases} (-1)^{k}, & n=2k \\ 0, & n=2k+1 \end{cases}. $$
The Taylor’s formula shows that
$$ \cos(x) = 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots + \frac{(-1)^{k} x^{2k}}{(2k)!} + R_{2k}(x) $$
where $R_{2k}(x)= \frac{\cos^{(2k+1)}(c)}{(2k+1)!}x^{2k+1}$. Since every derivative of $\cos(x)$ is bounded by $1$, the Remainder Estimation Theorem with $M=1$ gives
$$ |R_{2k}(x)|\le \frac{|x|^{2k+1}}{(2k+1)!}. $$
For every value of $x$, $R_{2k}(x)\to 0$ as $k\to\infty$. Therefore the series converges to $\cos(x)$ for every value of $x$:
$$ \cos(x) = \sum_{k=0}^\infty \frac{(-1)^k x^{2k}}{(2k)!} = 1-\frac{x^2}{2!} + \frac{x^4}{4!} - \frac{x^6}{6!} + \cdots $$
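The bound $|R_{2k}(x)|\le |x|^{2k+1}/(2k+1)!$ can be checked against the actual truncation error; a Python sketch (illustrative):

```python
import math

# Order-2k Taylor polynomial of cos(x) at 0.
def P_cos(x, k):
    return sum((-1) ** j * x ** (2 * j) / math.factorial(2 * j)
               for j in range(k + 1))

x = 2.0
for k in range(1, 8):
    err = abs(P_cos(x, k) - math.cos(x))
    bound = abs(x) ** (2 * k + 1) / math.factorial(2 * k + 1)
    assert err <= bound   # the remainder stays within the stated bound
```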
Various Ways to Find the Taylor Series
Example 4. Finding a Taylor Series by Substitution
Find the Taylor series for $\cos(2x)$ at $x=0.$
Solution. By substituting $2x$ for $x$ in the Taylor series for $\cos(x)$, we can find the Taylor series for $\cos(2x).$
$$ \cos(2x) = 1-\frac{(2x)^2}{2!} + \frac{(2x)^4}{4!} -\frac{(2x)^6}{6!} + \cdots= \sum_{k=0}^\infty (-1)^k \frac{2^{2k}x^{2k}}{(2k)!}. $$
This equality holds for $-\infty<2x<+\infty$, so the newly created series converges for all $x$.
Question. Why is this series in fact the Taylor series for $\cos(2x)$?
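Numerically, the substituted series does converge to $\cos(2x)$; an illustrative Python sketch:

```python
import math

# Partial sums of sum (-1)^k 2^(2k) x^(2k) / (2k)!, the series
# obtained by substituting 2x into the cosine series.
def P(x, K):
    return sum((-1) ** k * (2 * x) ** (2 * k) / math.factorial(2 * k)
               for k in range(K + 1))

for x in (0.5, 1.0, 2.0):
    assert abs(P(x, 30) - math.cos(2 * x)) < 1e-10
```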
Example 5. Finding a Taylor Series by Multiplication
Find the Taylor series for $x\sin(x)$ at $x=0$.
Solution. By multiplying the Taylor series for $\sin(x)$ with $x$,
$$ x\sin(x) = x^2 - \frac{x^4}{3!} + \frac{x^6}{5!} - \frac{x^8}{7!} + \cdots. $$
Truncation Error
Example 6. Calculate $e$ with an error of less than $10^{-6}$
Solution.
According to the Taylor series,
$$ e = 1 + \frac1{1!} + \frac{1^2}{2!} + \frac{1^3}{3!} + \cdots $$
The first several terms on the right-hand side give a good approximation for $e$. To estimate the approximation error, we use Taylor's formula:
$$ e = 1 + 1 + \frac1{2!} + \frac1{3!} + \cdots + \frac1{n!} + R_n(1). $$
where $R_n(1) = e^c\, \frac{1}{(n+1)!}$ for some $c\in(0,1)$. Assuming we know that $e<3$, the remainder is less than $3/(n+1)!$.
By experiment we find that $3/10! < 10^{-6}$. Thus we should take $n$ to be at least $9$. With an error of less than $10^{-6}$,
$$ e= 1 + 1 + \frac12 + \frac1{3!} + \cdots + \frac1{9!} \approx 2.718282. $$
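The computation is easy to reproduce; this Python sketch checks both the remainder bound and the six-decimal value:

```python
import math

# Nine terms past the leading 1 (i.e. k = 0, ..., 9) approximate e
# with remainder R_9(1) < 3/10! < 1e-6.
approx = sum(1 / math.factorial(k) for k in range(10))

assert 3 / math.factorial(10) < 1e-6
assert abs(approx - math.e) < 1e-6
assert round(approx, 6) == 2.718282
```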
Example. How Accurate Is the Approximation $\cos(10^\circ) \approx 1-\frac{(\pi/18)^2}{2!}=0.984769129$?
Solution.
According to the Taylor’s formula
$$ \cos(x) = 1- \frac{x^2}{2!} + \frac{\cos^{(3)}(c) x^3}{3!} $$
The remainder has absolute value at most $|x|^3/3!$, since $|\cos^{(3)}(c)|\le 1$. In our case $x=\pi/18< 4/18 <1/4$, so the error is less than $\frac{1}{4^3\, 3!}\approx 0.0026.$
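A quick Python check (illustrative) confirms the actual error is in fact far below the $0.0026$ bound:

```python
import math

x = math.pi / 18                 # 10 degrees in radians
approx = 1 - x ** 2 / 2          # the two-term Taylor approximation
err = abs(math.cos(x) - approx)

assert err < 1 / (4 ** 3 * 6)    # within the bound 1/(4^3 * 3!) ~ 0.0026
assert err < 1e-4                # the actual error is far smaller
```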
Example 7. For what values of $x$ can we replace $\sin(x)$ by $x-\frac{x^3}{3!}$ with an error of magnitude no greater than $3\times 10^{-4}$?
Solution.
Since the Taylor series for $\sin(x)$ is an alternating series, we can estimate the error in truncation by the Alternating Series Estimation Theorem
$$ \sin(x) = x - \frac{x^3}{3!} {\color{red}\bigg| } + \frac{x^5}{5!} - \cdots $$
After $x^3/3!$ the error is no greater than $|x|^5/120$. Therefore the error will be less than $3\times 10^{-4}$ if
$$ \frac{|x|^5}{120}< 3\times 10^{-4} \quad \text{or} \quad |x|< \sqrt[5]{360\times 10^{-4}} \approx 0.514. $$
Notice that the alternating property shows that the truncation $x-\frac{x^3}{3!}$ is an underestimation for $\sin(x)$ when $x>0$.
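Both claims, the $3\times 10^{-4}$ bound and the underestimation, can be checked numerically; an illustrative Python sketch:

```python
import math

# For 0 < x < 0.514, the truncation x - x^3/3! sits below sin(x)
# (alternating series) and is off by less than 3e-4.
for x in (0.1, 0.3, 0.5):
    trunc = x - x ** 3 / 6
    err = math.sin(x) - trunc
    assert 0 < err <= x ** 5 / 120   # underestimate, within |x|^5/120
    assert err < 3e-4
```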
How does the estimate given by the Remainder Estimation Theorem compare with the one just obtained from the Alternating Series Estimation Theorem?
$$ \sin(x) = x -\frac{x^3}{3!} + R_3(x) $$
The remainder satisfies $| R_3(x) |\le \frac{|x|^4}{4!} = \frac{|x|^4}{24}$, which is not as good as the estimate obtained from the Alternating Series Estimation Theorem. But since the coefficient of the $x^4$ term is $0$, the truncation $x-\frac{x^3}{3!}$ is also the order-$4$ Taylor polynomial, whose remainder is $R_4(x) = \frac{\sin^{(5)}(c)}{5!}x^5$; this gives the same estimate as the Alternating Series Estimation Theorem.
Combining Taylor Series
If $f(x)$ and $g(x)$ both have convergent Taylor series on an interval $I$, then the sum $f(x)+g(x)$ has a Taylor series obtained by combining the Taylor series for $f(x)$ and $g(x)$. The Taylor series for $\sin(x) + \cos(x)$ is the term-by-term sum of the Taylor series for $\sin(x)$ and $\cos(x)$.
A Glimpse of Euler's Identity
Definition.
For any real number $x$,
$$ e^{ix} = \cos x + i\sin x, \quad i=\sqrt{-1}. $$
Since
$$ e^{ix} = 1 + \frac{ix}{1!} + \frac{(ix)^2}{2!} + \frac{(ix)^3}{3!} + \frac{(ix)^4}{4!} + \frac{(ix)^5}{5!} + \frac{(ix)^6}{6!} + \cdots \\ = \left(1-\frac{x^2}{2!} + \frac{x^4}{4!} -\cdots\right) + i\left(x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots\right) \\ = \cos (x) + i\sin(x). $$
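Python's `cmath` module can be used to check the identity numerically (an illustrative sketch):

```python
import cmath
import math

# e^{ix} agrees with cos(x) + i sin(x) at several sample points.
for x in (0.0, 1.0, math.pi / 3, math.pi):
    lhs = cmath.exp(1j * x)
    rhs = complex(math.cos(x), math.sin(x))
    assert abs(lhs - rhs) < 1e-12

# In particular, Euler's identity e^{i pi} + 1 = 0 (up to rounding).
assert abs(cmath.exp(1j * math.pi) + 1) < 1e-12
```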