A Guide to Taylor and Maclaurin Series

by: Kathryn Marsh, proud Member of the Math Squad.

keywords: Taylor series, Maclaurin series

Introduction

The purpose of this tutorial is to give an overview of Taylor and Maclaurin series: what they are, how to derive them, and a few of their applications. It is meant to be a guide to UNDERSTANDING them and to finding Taylor series expansions of functions, not just to solving the problems on your homework, because math is a lot more fun that way :).

Contents

  • The Almighty Power Series
  • What's a Taylor Series?
  • Why is this useful again?

The Almighty Power Series

Before we get too deep into the magic of Taylor Series, we need to start with a firm understanding of the power series. So let's take a look at what a power series is.

A power series is like a polynomial, except that it is allowed to have infinitely many terms (so it may not have a finite degree). Formally, this is anything of the form

$ \sum_{n=0}^{\infty} c_n x^n = c_0 + c_1 x + c_2 x^2 + c_3 x^3 + ... $

where the $ c_n $'s can be any constants and $ x $ is a variable.

Sometimes it is more useful to write a power series in another form, called a power series centered at a (or a power series about a), which we write as $ \sum_{n=0}^{\infty} c_n (x-a)^n = c_0 + c_1 (x-a) + c_2 (x-a)^2 + c_3 (x-a)^3 + ... $

Let's say we have the following function.

$ \ f(x)=c_0 + c_1 (x-a) + c_2 (x-a)^2 + c_3 (x-a)^3 + ... $

Now as soon as we choose a specific x to plug into our function, we have an infinite series which may or may not converge. The domain for this function is the set of all x's for which the series converges.
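
For a concrete picture (this example is not part of the derivation here, just a familiar illustration), take every $ c_n=1 $ and $ a=0 $, which gives the geometric series

$ \sum_{n=0}^{\infty} x^n = 1 + x + x^2 + x^3 + ... $

This series converges exactly when $ |x|<1 $, where it adds up to $ \frac{1}{1-x} $, so the domain of the function it defines is the interval $ (-1,1) $.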

How would we go about actually finding which x values are in the domain? To do this, we treat x as a number and proceed the same as we would with any other infinite series. Recall that we have several tests at our disposal for deciding whether a series converges or not, namely the integral test, the comparison and limit comparison tests, the alternating series test, the p-series test, the ratio test, and the root test. In general, the ratio test is the most useful for these kinds of series, but another test may also work.

Now, if we apply the ratio test to our series, we get one of three possible outcomes:

(i) The series only converges when x=a.

(ii) The series converges for all x.

(iii) There is some positive number R such that the series converges if $ \ |x-a|<R $ and diverges if $ \ |x-a|>R $.

We call this R the radius of convergence; in case (i) we say R=0, and in case (ii) we say R=$ \infty $.
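
To see case (iii) in action, here is a quick worked example (not from the original text, but a standard one): apply the ratio test to

$ \sum_{n=1}^{\infty} \frac{(x-1)^n}{n \, 2^n} $

The ratio of consecutive terms is

$ \left| \frac{a_{n+1}}{a_n} \right| = \frac{|x-1|^{n+1}}{(n+1) \, 2^{n+1}} \cdot \frac{n \, 2^n}{|x-1|^n} = \frac{n}{n+1} \cdot \frac{|x-1|}{2} \rightarrow \frac{|x-1|}{2} $

so the series converges when $ \frac{|x-1|}{2}<1 $, that is, when $ |x-1|<2 $, and diverges when $ |x-1|>2 $. Here $ a=1 $ and the radius of convergence is $ R=2 $.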



What's a Taylor Series?

Let's say we have some function that isn't a particularly nice function. And by not "nice", I mean something like $ f(x)=e^x $ or $ \ln(x) $, because if I asked you what $ e^{3.81} $ was, you would have to use a calculator to find an approximate answer. But what if there were a way to rewrite a "nasty" function as a polynomial? Polynomials are generally easier to compute, so it would be great if we could represent difficult functions as polynomials or power series (just infinite polynomials, remember). As it turns out, we can represent some functions as power series, and if a function can be represented as a power series, that series is its Taylor series, which we can find with the recipe below.

1. We start with the assumption that we have a function $ f(x) $ which can be represented as a power series. Therefore,

$ \ f(x)=c_0+c_1(x-a)+ c_2(x-a)^2+c_3(x-a)^3+ ... $

but we need to find out what the values of all the $ c_n $'s are. We notice that if we plug in $ a $ for $ x $ we get

$ \ f(a)=c_0 $

because all the other terms become 0.

Ok, cool, so now we know that $ \ c_0=f(a) $.

2. Just for fun, let's take the derivative of this function. When we do, we get

$ \ f '(x)=c_1 +2c_2(x-a)+3c_3(x-a)^2+4c_4(x-a)^3+... $.

Again, we plug in a for x and all the terms except $ c_1 $ become zero so we are left with

$ \ f '(a)=c_1 $.

So now we know the first two terms of our power series.

$ \ f(x)=f(a)+f '(a)(x-a)+c_2(x-a)^2+c_3(x-a)^3+ ... $

3. Taking a derivative helped us last time so let's try taking a second derivative.

$ \ f ''(x)=2c_2 +2 \cdot 3c_3(x-a)+3 \cdot 4c_4(x-a)^2+... $.

Again, we plug in a for x and we get that

$ \ f ''(a)=2c_2 $.

Solving for $ c_2 $ we get that

$ \ c_2=\frac{f ''(a)}{2} $

So now we know the first three terms of our power series.

$ \ f(x)=f(a)+f '(a)(x-a)+\frac{f ''(a)}{2}(x-a)^2+c_3(x-a)^3+ ... $

4. If we continue this process (taking another derivative, plugging in a for x, and solving for $ c_n $), we realize that every $ c_n $ has the following form:

$ \ c_n=\frac{f^{(n)}(a)}{n!} $

This is easy to verify, and if you don't believe me, please try it on your own! To continue, just find the 3rd derivative, plug in a for x, and solve for $ c_3 $; then do the same with the 4th derivative, and so on, until you see the pattern.
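
For reference, here is the very next step written out (the same argument, one derivative further). Differentiating $ f ''(x) $ gives

$ \ f '''(x)=2 \cdot 3c_3 + 2 \cdot 3 \cdot 4c_4(x-a)+... $

and plugging in a for x leaves $ \ f '''(a)=2 \cdot 3c_3 = 3! \, c_3 $, so $ \ c_3=\frac{f '''(a)}{3!} $, exactly what the formula above predicts for n=3.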

So what did we just learn? If we have some function that can be represented as a power series, then its power series representation is of the form,

$ \ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n $
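
If you would like to check this formula on a computer, here is a minimal sketch (it assumes you have Python with the sympy library installed; neither the code nor the library is part of this tutorial). It builds the first few coefficients $ \frac{f^{(n)}(a)}{n!} $ directly and compares them against sympy's own series expansion.

    # Quick check of c_n = f^(n)(a) / n!  -- a sketch only, assuming sympy is installed.
    import sympy as sp

    x = sp.symbols('x')
    f = sp.exp(x)    # any smooth "nasty" function works here
    a = 0            # center of the expansion

    # Build the first few coefficients straight from the formula above.
    coeffs = [sp.diff(f, x, n).subs(x, a) / sp.factorial(n) for n in range(5)]
    print(coeffs)                  # [1, 1, 1/2, 1/6, 1/24]

    # Compare with sympy's built-in expansion of the same function about a.
    print(sp.series(f, x, a, 5))   # 1 + x + x**2/2 + x**3/6 + x**4/24 + O(x**5)

The two printouts describe the same polynomial, which is exactly what the derivation above says should happen.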


Why is this useful again?

So let's say we have some function $ f(x) $ which is not a nice function to work with. Maybe you would like to know the area under the curve of this function, but you can't integrate it. If, however, your function could be represented by a power series, you could easily integrate the power series because, remember, power series are just infinite polynomials! Polynomials can be broken up into terms when integrating or differentiating, so we have now turned a seemingly impossible integration problem into a fairly straightforward one. Voila!
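
In symbols, integrating a power series term by term looks like this (a standard fact about power series, stated here without proof; it is valid inside the radius of convergence):

$ \int \sum_{n=0}^{\infty} c_n (x-a)^n \, dx = C + \sum_{n=0}^{\infty} \frac{c_n}{n+1} (x-a)^{n+1} $

Each term is just the power rule applied to one piece of the infinite polynomial.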


Let's find the Taylor series centered at 0 (the Maclaurin series) of the following function.


$ f(x)=e^{-x} $


Now we assume that this function has a Taylor series expansion. As we showed above, if it does, then it will have the following form:

$ \ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n $

But we are centering this at 0, so $ a=0 $, and we get

$ \ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!}(x)^n $

Now all we have to do is find out what $ f^{(n)}(0) $ is for our function. Remember, this is the nth derivative of the function evaluated at 0. The 0th derivative would just be the function itself evaluated at 0. Let's make a table to organize this information.


n     $ f^{(n)}(x) $     $ f^{(n)}(0) $
0     $ e^{-x} $         $ e^{-0}=1 $
1     $ -e^{-x} $        $ -e^{-0}=-1 $
2     $ e^{-x} $         $ e^{-0}=1 $

Note that, because of the chain rule, the sign of the derivative flips every time we differentiate, so $ f^{(n)}(0) $ alternates between $ 1 $ and $ -1 $. We want to write this behavior in a concise way, and the standard way to denote this simple change in sign is with a factor of $ (-1)^n $. So now we have all the pieces of our formula, and we can say that

$ e^{-x}=\sum_{n=0}^{\infty} \frac{(-1)^n}{n!}(x)^n $
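
As a quick sanity check (not part of the original example), here is a short piece of plain Python that compares a partial sum of this series against the math library's own exponential. The value 3.81 just echoes the $ e^{3.81} $ from the introduction.

    # Sanity check of e^{-x} = sum of (-1)^n x^n / n!  -- a sketch using only the
    # standard library; 20 terms is plenty for x this size.
    import math

    def maclaurin_exp_neg(x, terms=20):
        """Partial sum of the Maclaurin series for e^{-x}."""
        return sum((-1)**n * x**n / math.factorial(n) for n in range(terms))

    x = 3.81
    print(maclaurin_exp_neg(x))   # about 0.022148
    print(math.exp(-x))           # the library's answer, essentially the same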


Hints and Other Helpful Bits of Advice:

1. Tables are the way to go for these problems. They help you stay organized and keep track of which n you are on.
2. Be careful when you are taking derivatives! Remember the chain rule, product rule, quotient rule, etc. People often forget these rules of differentiation in the middle of this type of problem because they are not used to having so many steps, so they think they can skip a few. This is not the case! Be careful with your differentiation and treat each derivative as its own problem.
3. Don't simplify too quickly. The purpose of writing out the table is to find a pattern that you can write down, so show all of your steps so that later you can look back and see where the pattern came from.
4. Go slowly. These are long problems and it's very easy to make a small mistake somewhere. Take a deep breath and don't rush.
5. Have fun :) 

Questions and comments

If you have any questions, comments, etc., please, please, please post them below:

  • One can find a long list of Taylor series in this table.


The Spring 2013 Math Squad was supported by an anonymous gift to Project Rhea. If you enjoyed reading these tutorials, please help Rhea "help students learn" with a donation to this project. Your contribution is greatly appreciated.
