
Taylor series

Defines: 
Taylor polynomial, Taylor expansion, Maclaurin series
Type of Math Object: 
Definition
Major Section: 
Reference

Mathematics Subject Classification

41A58, 26A24, 30B10

Comments

Some authors (see, for example, Courant & John, Introduction to Calculus and Analysis, Vol. I, Chap. 5, Sec. 5.4, footnote 1, Wiley, 1965) do not grant recognition to Maclaurin's paper because it is a particular case (a=0) of Taylor's. The former was published in 1742, the latter in 1715. So I think it is preferable to take Taylor's series as
\[
f(x)=\sum_{k=0}^{\infty}\frac{f^{(k)}(a)}{k!}(x-a)^k, \qquad f^{(0)}(a)=f(a).
\]
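As a quick numerical sanity check of the formula above, here is a Python sketch (the function names are ours) that sums the first n+1 terms for f = exp centered at a = 1, where every derivative at a equals e^a:

```python
import math

def taylor_partial_sum(derivs, a, x, n):
    """Evaluate sum_{k=0}^{n} f^(k)(a)/k! * (x - a)^k,
    where derivs(k) returns the k-th derivative of f at a."""
    return sum(derivs(k) / math.factorial(k) * (x - a) ** k
               for k in range(n + 1))

# For f = exp, every derivative at a equals exp(a).
a, x = 1.0, 2.0
approx = taylor_partial_sum(lambda k: math.exp(a), a, x, 20)
print(abs(approx - math.exp(x)) < 1e-10)  # True: the series converges to exp(x)
```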

Or is the Taylor series just a Maclaurin series translated to x=a?

Anyway, in all the books I've seen, Maclaurin series are a special case of Taylor series. So I think the entry should be changed: make a copy and call it the Maclaurin series, then change x=0 to x=a in the Taylor entry.

There's too much loose talk in the Encyclopedia's Taylor series entry, such as

'a function with infinitely differentiable point [sic]',

'for most functions one encounters in college calculus, f(x)=T(x)'.

A Taylor series centered at the point a (or a Maclaurin [the proper spelling of poor Colin's Scottish surname] series, centered at a=0) may be associated with any function f infinitely differentiable at a, but the requirement is that the series have a strictly positive radius of convergence R; in that case the series is absolutely convergent on the open disk (interval) |x-a|<R, uniformly convergent on any closed disk |x-a|<= r with 0<r<R, and f(x)=T(x) for |x-a|<R.

Note that in general f(x)=T(x) holds only on the disk of convergence, not on all of f's domain; e.g., x -> arctan x is defined and infinitely differentiable on the whole real line, but its Maclaurin series has radius of convergence 1 and diverges for |x|>1.
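The arctan example is easy to see numerically. This sketch (function name is ours) uses the Maclaurin series of arctan, sum of (-1)^m x^(2m+1)/(2m+1): partial sums converge inside the disk of convergence and blow up outside it:

```python
import math

def arctan_partial_sum(x, n):
    # Partial sum of the Maclaurin series of arctan:
    # sum_{m=0}^{n} (-1)^m x^(2m+1) / (2m+1)
    return sum((-1) ** m * x ** (2 * m + 1) / (2 * m + 1)
               for m in range(n + 1))

inside = arctan_partial_sum(0.5, 50)   # |x| < 1: converges to arctan(0.5)
outside = arctan_partial_sum(2.0, 50)  # |x| > 1: partial sums blow up
print(abs(inside - math.atan(0.5)) < 1e-12)  # True
print(abs(outside) > 1e20)                   # True: divergence outside the disk
```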

For a real function of one real variable, a sufficient condition for the existence of a Taylor series expansion centered at a is that all derivatives of f be uniformly bounded in a neighborhood of a;
for a complex function of one complex variable, the necessary and sufficient condition is that the function be differentiable with respect to the field of complex numbers (or equivalently, that its real and imaginary parts be differentiable and satisfy the Cauchy-Riemann conditions).
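To make the Cauchy-Riemann conditions concrete, here is a small finite-difference check (the test point and the choice f(z) = z^2 are ours) that u_x = v_y and u_y = -v_x:

```python
# Central-difference check of the Cauchy-Riemann equations u_x = v_y, u_y = -v_x
# for u + iv = z^2 = (x^2 - y^2) + i(2xy), at an arbitrary test point.
h = 1e-6
u = lambda x, y: x * x - y * y
v = lambda x, y: 2 * x * y
x0, y0 = 0.7, -1.3
du_dx = (u(x0 + h, y0) - u(x0 - h, y0)) / (2 * h)
dv_dy = (v(x0, y0 + h) - v(x0, y0 - h)) / (2 * h)
du_dy = (u(x0, y0 + h) - u(x0, y0 - h)) / (2 * h)
dv_dx = (v(x0 + h, y0) - v(x0 - h, y0)) / (2 * h)
print(abs(du_dx - dv_dy) < 1e-6 and abs(du_dy + dv_dx) < 1e-6)  # True
```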

For bounds on $R_n$, the author refers to http://planetmath.org/encyclopedia/TaylorsTheorem.html where we have some information on $R_n$.

But on the two pages, the same symbol ($R_n$) denotes different things:
- in "Taylor series", $R_n$ is the remainder of the approximation by a polynomial of degree n (n+1 terms in the approximation);
- in "Taylor's theorem", $R_n$ is the remainder of the approximation by a sum of n terms (thus a polynomial of degree n-1).

This is quite confusing when one reads only those two pages and not the definition http://planetmath.org/encyclopedia/RemainderTerm.html
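The off-by-one between the two conventions is easy to exhibit numerically. In this sketch (the names and the choice f = exp are ours), R_A follows the first convention (degree-n polynomial, n+1 terms) and R_B the second (n terms, degree n-1), with the same index n:

```python
import math

def maclaurin_poly_exp(x, n):
    # Degree-n Maclaurin polynomial of exp: n + 1 terms.
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x, n = 1.0, 5
# First convention: R_n = f(x) minus the degree-n polynomial (n+1 terms).
R_A = math.exp(x) - maclaurin_poly_exp(x, n)
# Second convention: R_n = f(x) minus the first n terms (degree n-1).
R_B = math.exp(x) - maclaurin_poly_exp(x, n - 1)
print(0 < R_A < R_B)  # True: the same index n yields different remainders
```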

It says: "even if it converges, it may not necessarily converge to the original function".

I think there should be an example for this remarkable claim.

> it says: "even if it converges, it may not necessarily
> converge to the original function"
>
> I think there should be an example for this remarkable
> claim.

The usual example from calculus is
\[
f(x)=\begin{cases}
e^{-1/x^2} & x \ne 0 \\
0 & x = 0.
\end{cases}
\]
This function is nonzero (for $x \ne 0$) and has a convergent Taylor series at $0$,
but that Taylor series is identically $0$. A proof is given in the entry

http://planetmath.org/InfinitelyDifferentiableFunctionThatIsNotAnalytic.... .
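A quick numerical illustration (the function name f is ours): every derivative of this function vanishes at 0, so its Maclaurin series is identically 0, yet f itself is nonzero away from the origin:

```python
import math

def f(x):
    # The standard smooth-but-not-analytic function.
    return math.exp(-1.0 / x ** 2) if x != 0 else 0.0

# Its Maclaurin series is identically 0 (all derivatives vanish at 0),
# yet f is nonzero everywhere else:
print(f(0.0))  # 0.0
print(f(0.5))  # e^{-4}, about 0.0183 -- the series and the function disagree
```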
