# Taylor series

###### Contents:

- 1 Real Taylor series
- 2 Taylor polynomials
- 3 Examples
- 4 Complex Taylor series
- 5 Taylor series and polynomials in Banach spaces
- 6 Taylor series and polynomials for functions of several variables
- 7 Taylor expansion of formal polynomials
- References

# 1 Real Taylor series

Let $f\colon I\to\mathbb{R}$ be a function defined on an open interval $I$, possessing derivatives of all orders at $a\in I$. Then the power series

$T(x)=\sum_{k=0}^{\infty}\frac{f^{(k)}(a)}{k!}(x-a)^{k}$

is called the *Taylor series* for $f$ centered at $a$.

Often the case $a=0$ is considered, and we have the simpler

$T(x)=\sum_{k=0}^{\infty}\frac{f^{(k)}(0)}{k!}x^{k}\,,$

called the *Maclaurin series* for $f$ by some authors.

If we perform formal term-by-term differentiation of $T(x)$, we find that $T^{(k)}(a)=f^{(k)}(a)$, so it is plausible that $T$ is an extrapolation or an approximation to $f$ based on the derivatives of $f$ at a single point $a$.

In general, $T$ may not extrapolate or approximate $f$ in the strictest sense: a Taylor series does not necessarily converge, and even if it does, it may not necessarily converge to the original function $f$. It is also not necessarily true that a Taylor series about $a$ equals the Taylor series of $f$ about some other point $b$, when considered as functions.
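For a concrete sense of how the partial sums behave when things do work out, here is a minimal sketch in plain Python (not part of the original entry) computing partial sums of the Maclaurin series of $e^{x}$, whose $k$th coefficient is $1/k!$:

```python
import math

def maclaurin_exp(x, n):
    """Partial sum through degree n of the Maclaurin series of e^x;
    every derivative of exp at 0 equals 1, so the k-th coefficient is 1/k!."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# The partial sums approach e = exp(1) as the degree grows.
for n in (2, 5, 10):
    print(n, abs(maclaurin_exp(1.0, n) - math.e))
```

For this particular function the series converges to $e^{x}$ for every $x$, but, as noted above, that is not automatic for an arbitrary smooth function.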

Those functions whose Taylor series do converge
to the function are termed
*analytic functions*.

If we start with a *convergent* power series
$f(x)=c_{0}+c_{1}(x-a)+c_{2}(x-a)^{2}+\cdots$ to
define a function $f$, then the Taylor series of $f$
about $a$ will turn out to be the same as our original power series.

# 2 Taylor polynomials

The $n$th degree *Taylor polynomial* for $f$ centered at $a$
is the polynomial

$P_{n,a}(x)=\sum_{k=0}^{n}\frac{f^{(k)}(a)}{k!}(x-a)^{k}\,.$

In general, $P_{n,a}$ has degree $\leq n$; the degree is strictly less than $n$ precisely when $f^{(n)}(a)=0$. Nevertheless, $P_{n,a}$ is characterized by the following properties: it is the unique polynomial of degree $\leq n$ whose derivatives up to order $n$ at $a$ agree with those of $f$; it is also the unique polynomial $p$ of degree $\leq n$ such that

$f(x)-p(x)=o(\lvert x-a\rvert^{n})\,,\quad\textrm{as $x\to a$}.$

(Landau notation is being used here.) These characterizations are sometimes helpful in actually computing Taylor polynomials.
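The $o(\lvert x-a\rvert^{n})$ characterization lends itself to a quick numerical sanity check. The following sketch (plain Python, not part of the entry) builds $P_{3,0}$ for $\sin$ from the derivative values $0,1,0,-1$ and watches the ratio $(f(x)-p(x))/x^{3}$ shrink as $x\to 0$:

```python
import math

def taylor_poly(derivs, a, x):
    """Evaluate P_{n,a}(x) from derivs = [f(a), f'(a), ..., f^(n)(a)]."""
    return sum(d / math.factorial(k) * (x - a)**k
               for k, d in enumerate(derivs))

# f = sin, a = 0, n = 3: the derivatives of sin at 0 are 0, 1, 0, -1.
p3 = lambda x: taylor_poly([0.0, 1.0, 0.0, -1.0], 0.0, x)

# The defining property f(x) - p(x) = o(|x - a|^n): the ratio tends to 0.
for h in (0.1, 0.01, 0.001):
    print(h, (math.sin(h) - p3(h)) / h**3)
```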

The Taylor polynomial $P_{n,a}$ is applicable even if $f$ is only differentiable $n$ times at $a$, or when its Taylor series does not converge to $f$.

The error in approximating $f$ by $P_{n,a}$, called the *remainder term*, $R_{n,a}(x)=f(x)-P_{n,a}(x)$, can be quantified precisely using Taylor's theorem. In particular, Taylor's theorem is often used to show

$\lim_{n\to\infty}R_{n,a}(x)=0\,,$

which is equivalent to $T(x)=f(x)$, i.e. the Taylor series converges to the original function.
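As an illustration (a sketch in plain Python, using the Lagrange form of the remainder from Taylor's theorem as the assumed bound): for $f=\sin$ every derivative is bounded by $1$, so $\lvert R_{n,0}(x)\rvert\leq\lvert x\rvert^{n+1}/(n+1)!\to 0$ for every fixed $x$.

```python
import math

def sin_taylor(x, n):
    """Degree-n Maclaurin polynomial of sin, evaluated at x."""
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n // 2 + 1) if 2*k + 1 <= n)

x = 2.0
for n in (3, 7, 11, 15):
    remainder = abs(math.sin(x) - sin_taylor(x, n))
    bound = x**(n + 1) / math.factorial(n + 1)  # |sin^(n+1)| <= 1 everywhere
    print(n, remainder, bound)
    assert remainder <= bound
```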

The term “Taylor expansion” is frequently encountered; depending on the circumstance, it may mean either the Taylor series or the $n$th degree Taylor polynomial. Both are useful for linearizing or otherwise reducing the analytical complexity of a function. They are also useful for numerical approximation of functions when the magnitudes of the later terms fall off rapidly.

# 3 Examples

Using the above definition of a Taylor series about $0$, we have the following important series representations:

$e^{x}=1+\frac{x}{1!}+\frac{x^{2}}{2!}+\frac{x^{3}}{3!}+\cdots$

$\sin x=\frac{x}{1!}-\frac{x^{3}}{3!}+\frac{x^{5}}{5!}-\frac{x^{7}}{7!}+\cdots$

$\cos x=1-\frac{x^{2}}{2!}+\frac{x^{4}}{4!}-\frac{x^{6}}{6!}+\cdots$

That the series on the right converge to the functions on the left can be proven by Taylor’s Theorem.
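A quick numerical confirmation (a plain-Python sketch, not part of the entry) compares high-degree partial sums of these three series with the library functions:

```python
import math

def partial_sum(coeffs, x):
    """Evaluate sum_k coeffs[k] * x^k."""
    return sum(c * x**k for k, c in enumerate(coeffs))

N = 20  # enough terms that truncation error is far below double precision here
exp_c = [1 / math.factorial(k) for k in range(N)]
sin_c = [(-1)**(k // 2) / math.factorial(k) if k % 2 == 1 else 0.0 for k in range(N)]
cos_c = [(-1)**(k // 2) / math.factorial(k) if k % 2 == 0 else 0.0 for k in range(N)]

x = 1.3
print(abs(partial_sum(exp_c, x) - math.exp(x)))
print(abs(partial_sum(sin_c, x) - math.sin(x)))
print(abs(partial_sum(cos_c, x) - math.cos(x)))
```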

# 4 Complex Taylor series

If $f\colon U\to\mathbb{C}$ is a holomorphic function from an open subset $U$ of the complex plane, and $a\in U$, we may also consider its Taylor series about $a$ (defined with the same formulae as before, but with complex numbers).

In contrast with the real case, it turns out that all holomorphic functions are infinitely differentiable and have Taylor series that converge to them. (The radius of convergence of the Taylor series at $a$ is the radius of the largest open disk about $a$ to which $f$ can be extended holomorphically.)

This of course makes the theory of analytic functions very nice, and many questions about real power series and real analytic functions are more easily answered by looking at the complex case. For example, we can immediately tell that the Taylor series about the origin for the tangent function

$\frac{\sin z}{\cos z}=\tan z=z+\frac{1}{3}z^{3}+\frac{2}{15}z^{5}+\frac{17}{315}z^{7}+\frac{62}{2835}z^{9}+\cdots$

has a radius of convergence of $\pi/2$, because $\tan$ is holomorphic everywhere except at its poles $z=\pi/2+k\pi$, $k\in\mathbb{Z}$, and $\pi/2$ is the distance from the origin to the nearest of these poles.
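The coefficients already hint at this radius: by the Cauchy–Hadamard (root test) formula, $\lvert c_{k}\rvert^{1/k}$ should approach $1/R=2/\pi\approx 0.6366$. A small sketch (plain Python, using only the coefficients of $\tan z$ listed above):

```python
import math
from fractions import Fraction

# Odd-degree Maclaurin coefficients of tan z, as listed above.
coeffs = {1: Fraction(1), 3: Fraction(1, 3), 5: Fraction(2, 15),
          7: Fraction(17, 315), 9: Fraction(62, 2835)}

# Cauchy-Hadamard: |c_k|^(1/k) -> 1/R = 2/pi for tan.
for k, c in coeffs.items():
    print(k, float(c) ** (1 / k))
print("2/pi =", 2 / math.pi)
```

Convergence of the root-test estimates is slow; with only five coefficients the estimate at $k=9$ is still a few percent above $2/\pi$.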

# 5 Taylor series and polynomials in Banach spaces

Taylor series and polynomials can be generalized to Banach spaces: for details, see Taylor’s formula in Banach spaces.

# 6 Taylor series and polynomials for functions of several variables

The simplest Banach spaces are the spaces $\mathbb{R}^{n}$, and in this case Taylor series and Taylor polynomials for functions $f\colon\mathbb{R}^{n}\to\mathbb{R}$ (“functions of $n$ variables”) look like this:

$T(x)=\sum_{i_{1}=0}^{\infty}\cdots\sum_{i_{n}=0}^{\infty}\frac{f^{(i_{1},i_{2},\ldots,i_{n})}(0)}{i_{1}!\,i_{2}!\cdots i_{n}!}\,x_{1}^{i_{1}}x_{2}^{i_{2}}\cdots x_{n}^{i_{n}}\,,\quad f^{(i_{1},i_{2},\ldots,i_{n})}(0)=\left.\frac{\partial^{i_{1}+\cdots+i_{n}}f}{\partial x_{1}^{i_{1}}\cdots\partial x_{n}^{i_{n}}}\right|_{x=0}\,,$

$P_{N,0}(x)=\sum_{i_{1}+\cdots+i_{n}\leq N}\frac{f^{(i_{1},i_{2},\ldots,i_{n})}(0)}{i_{1}!\,i_{2}!\cdots i_{n}!}\,x_{1}^{i_{1}}x_{2}^{i_{2}}\cdots x_{n}^{i_{n}}=\sum_{\lvert I\rvert\leq N}\frac{1}{I!}\left.\frac{\partial^{\lvert I\rvert}f}{\partial x^{I}}\right|_{x=0}x^{I}\,.$

(For simplicity, we have put the center at $a=0$. The last expression employs the commonly-used multi-index notation.)

For example, the second-degree Taylor polynomial for $f(x,y)=\cos(x+y)$ centered at $(0,0)$ is

$P_{2,0}(x,y)=1-\frac{1}{2}x^{2}-xy-\frac{1}{2}y^{2}\,.$

Note that $P_{2,0}(x,y)$ can also be obtained by taking the one-variable Taylor series $\cos t=1-t^{2}/2+\cdots$, substituting $t=x+y$, and keeping only the terms of degree $\leq 2$. This procedure works because of the uniqueness characterization of Taylor polynomials.
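To see the quality of this two-variable approximation numerically, here is a short sketch (plain Python, not from the entry); since the first omitted terms of cosine have degree four, the error behaves like the fourth power of the distance to the origin:

```python
import math

def p2(x, y):
    """Second-degree Taylor polynomial of cos(x+y) about (0,0)."""
    return 1.0 - 0.5 * x**2 - x * y - 0.5 * y**2

# Error shrinks like the fourth power of the distance to the origin.
for s in (0.1, 0.01):
    print(s, abs(math.cos(s + s) - p2(s, s)))
```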

# 7 Taylor expansion of formal polynomials

If $f$ is a polynomial function of degree $n$, then its Taylor series, and its Taylor polynomials $P_{m,a}$ for every $m\geq n$, actually equal $f$. For this reason, we can consider Taylor series and polynomials applied to formal polynomials, without any notion of convergence. (The usual derivative is replaced by formal differentiation.) In this setting, a “Taylor expansion” of a formal polynomial $p(x)$ about $a$ amounts to nothing more than rewriting $p(x)$ in the form $c_{0}+c_{1}(x-a)+\cdots+c_{n}(x-a)^{n}$.
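This rewriting is easy to carry out mechanically. The following sketch (plain Python, function name my own) computes the coefficients $c_{k}=p^{(k)}(a)/k!$ by formal differentiation:

```python
import math

def taylor_shift(coeffs, a):
    """Rewrite p(x) = sum_k coeffs[k] x^k as sum_k c[k] (x-a)^k,
    using c_k = p^(k)(a) / k! with formal differentiation."""
    shifted = []
    d = list(coeffs)  # coefficients of the current formal derivative
    for k in range(len(coeffs)):
        shifted.append(sum(c * a**j for j, c in enumerate(d)) / math.factorial(k))
        d = [j * c for j, c in enumerate(d)][1:]  # formal derivative
    return shifted

# p(x) = x^2 about a = 1:  x^2 = 1 + 2(x-1) + (x-1)^2.
print(taylor_shift([0, 0, 1], 1))  # [1.0, 2.0, 1.0]
```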

Similar considerations apply to formal power series, or to formal polynomials of several variables.

# References

- 1 Lars V. Ahlfors. Complex Analysis, third edition. McGraw-Hill, 1979.
- 2 Michael Spivak. Calculus, third edition. Publish or Perish, 1994.

## Mathematics Subject Classification

41A58, 26A24, 30B10


## Attached Articles

getting Taylor series from differential equation by Wkbj79

Taylor series, derivation of by apmc

Taylor's formula in Banach spaces by stevecheng

example of computing limits using Taylor expansion by stevecheng

Taylor expansion of $\sqrt{1+x}$ by stevecheng

Taylor formula remainder: various expressions by gufotta

Taylor series of arcus tangent by pahio

approximate non-linear transformation of affine combination by stevecheng

Taylor series via division by pahio

## Comments

## Taylor series

Some authors (see, for example, Courant & John, Introduction to Calculus and Analysis, Vol. I, Chap. 5, Sec. 5.4, footnote 1, Wiley, 1965) do not grant separate recognition to Maclaurin's paper because it is a particular case ($a=0$) of Taylor's. The former was published in 1742, the latter in 1715. So I think it is preferable to take the Taylor series as

$f(x)=\sum_{k=0}^{\infty}\frac{f^{(k)}(a)}{k!}(x-a)^{k},\qquad f^{(0)}(a)=f(a).$

## Re: Taylor series

Or is the Taylor series just a Maclaurin series translated to $x=a$?

Anyway, in all books I've seen, Maclaurin series are a special case of Taylor series. So I think the entry should be changed: make a copy and call it the Maclaurin series, and then change $x=0$ to $x=a$ in the Taylor entry.

## Re: Taylor series

There's too much loose talk in the Encyclopedia's Taylor series entry, such as

'a function with infinitely differentiable point [sic]',

'for most functions one encounters in college calculus, f(x)=T(x)'.

A Taylor series centered at the point $a$ (or a Maclaurin [the proper spelling of poor Colin's Scottish surname] series, centered at $a=0$) may be associated to any function $f$ infinitely differentiable at $a$, but the requirement is that the series have a strictly positive radius of convergence $R$; in such a case the series is absolutely convergent on the open disk (interval) $|x-a|<R$, uniformly convergent on any closed disk $|x-a|\leq r$ with $0<r<R$, and $f(x)=T(x)$ for $|x-a|<R$.

Note that in general $f(x)=T(x)$ holds on the disk of convergence only, not on all of $f$'s domain; e.g., $x\mapsto\arctan x$ is defined and infinitely differentiable on the real line, but its Maclaurin series converges only on the interval $[-1,1]$.

For a real function of one real variable, a sufficient condition for the Taylor series centered at $a$ to converge to $f$ is that all derivatives of $f$ are uniformly bounded in a neighborhood of $a$;

for a complex function of one complex variable, the necessary and sufficient condition is that the function is differentiable with respect to the field of complex numbers (or equivalently, that its real and imaginary parts are differentiable and satisfy the Cauchy-Riemann conditions).

## Potential confusion when talking about bounds on $R_n$

For the bounds on $R_n$, the author refers to http://planetmath.org/encyclopedia/TaylorsTheorem.html where we have some information on $R_n$.

But on the two pages, the same symbol ($R_n$) describes something different:

- for the "Taylor series" entry, $R_n$ is the remainder after approximating by a polynomial of degree $n$ ($n+1$ terms in the approximation);

- for "Taylor's Theorem", $R_n$ is the remainder after approximating by a series of $n$ terms (thus a polynomial of degree $n-1$).

This is quite confusing when reading only those two pages and not the definition http://planetmath.org/encyclopedia/RemainderTerm.html

## converge to what?

it says: "even if it converges, it may not necessarily converge to the original function"

I think there should be an example for this remarkable claim.

## Re: converge to what?

> it says: "even if it converges, it may not necessarily

> converge to the original function"

>

> I think there should be an example for this remarkable

> claim.

The usual example from calculus is

\[
f(x)=\begin{cases}
e^{-1/x^{2}} & x\neq 0\\
0 & x=0.
\end{cases}
\]

This function is nonzero away from the origin and its Taylor series at $0$ converges, but that series is identically $0$, so it does not converge to $f$. A proof is given in the entry

http://planetmath.org/InfinitelyDifferentiableFunctionThatIsNotAnalytic.... .
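To get a feel for just how flat this function is at the origin, a plain-Python sketch (not part of the thread); already at $x=0.1$ the value is $e^{-100}\approx 3.7\times 10^{-44}$, which is the extreme flatness behind all the Maclaurin coefficients vanishing:

```python
import math

def f(x):
    """The standard smooth-but-not-analytic example."""
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# Nonzero away from 0, yet extraordinarily flat at 0:
print(f(0.5))  # e^{-4}, about 0.0183
print(f(0.1))  # e^{-100}, about 3.7e-44
```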