1.1 Introduction to Convergence
We have introduced the concept of infinite series in our previous modules. Today, in this module, we shall talk about the convergence of an infinite series. Convergence, in elementary terms, means that the sum of an infinite series settles to a finite value, i.e. \(\sum\limits_{n=1}^{\infty} a_n = S\) for some finite real number \(S\).
In other words, let us define \(S_n=\sum\limits_{i=1}^n a_i\). If \(S_n\) tends to a fixed real constant as \(n\rightarrow \infty\), we say that \(\sum\limits_{n=1}^\infty a_n\) converges.
There are a few tests for convergence, and from these tests we can determine which series are convergent and which are not.
One important result we may need is that if \(\sum a_n\) converges, we must have \(a_n \rightarrow 0\) as \(n \rightarrow \infty\). While this is a necessary condition for convergence, it is not sufficient: the harmonic series \(\sum \frac{1}{n}\) has \(\frac{1}{n}\rightarrow 0\) yet diverges. The proof of the necessary condition follows from the fact that if the series converges then \(S_n\rightarrow S\) as \(n \rightarrow \infty\), so \(S_{n+1}-S_n\rightarrow S-S=0\thinspace\Rightarrow a_{n+1}\rightarrow 0\) as \(n\rightarrow \infty\).
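As a quick numerical illustration of this point, here is a minimal sketch (plain Python; the two series chosen are our own examples, not part of the module) comparing the harmonic series with \(\sum \frac{1}{n^2}\): both have terms tending to \(0\), but only the latter's partial sums settle down.

```python
# Compare partial sums of the harmonic series (divergent) and of sum 1/n^2 (convergent).
# Both have a_n -> 0, which shows that a_n -> 0 alone cannot guarantee convergence.

def partial_sum(terms, n):
    """Return S_n = terms(1) + terms(2) + ... + terms(n)."""
    return sum(terms(k) for k in range(1, n + 1))

harmonic = lambda k: 1.0 / k          # a_k = 1/k, terms tend to 0
p_series = lambda k: 1.0 / (k * k)    # a_k = 1/k^2, terms tend to 0

for n in (10, 1_000, 100_000):
    print(f"n={n:>7}  harmonic S_n={partial_sum(harmonic, n):8.4f}  "
          f"1/n^2 S_n={partial_sum(p_series, n):8.6f}")
# The harmonic partial sums keep growing (roughly like ln n),
# while the 1/n^2 partial sums approach pi^2/6 = 1.644934...
```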
1.2 Tests for Convergence
For simplicity, we will use the notation \(\sum a_n\) to denote \(\sum\limits_{n=1}^{\infty}a_n\) in the remainder of this module, unless otherwise stated.
Comparison test
Let \(0 \le a_n\le b_n\) for all integers \(n>k\), for some non-negative integer \(k\). Then, if \(\sum b_n\) converges, \(\sum a_n\) also converges. Conversely, if \(a_n \ge b_n \ge 0\) for all \(n \ge k\) for some \(k\), and \(\sum b_n\) diverges, then \(\sum a_n\) also diverges.
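To see the comparison test in action, here is a minimal sketch under our own choice of series (not taken from the module): with \(a_n = \frac{1}{2^n + n}\) and \(b_n = \frac{1}{2^n}\) we have \(0 \le a_n \le b_n\), and \(\sum b_n\) is a convergent geometric series, so \(\sum a_n\) must converge as well.

```python
# Comparison test illustration: 0 <= a_n = 1/(2^n + n) <= b_n = 1/2^n.
# Since sum b_n (geometric, ratio 1/2) converges to 1, sum a_n must also converge.

a = lambda n: 1.0 / (2**n + n)
b = lambda n: 1.0 / 2**n

S_a = S_b = 0.0
for n in range(1, 61):
    S_a += a(n)
    S_b += b(n)
    if n in (5, 20, 60):
        print(f"n={n:>2}  sum a_k = {S_a:.10f}  sum b_k = {S_b:.10f}")
# sum b_k approaches 1, while sum a_k stays below it and settles to its own finite limit.
```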
Ex-1
Show that the series \(\sum a_n\) converges iff \(\sum \left( \frac{a_n}{1+a_n}\right)\) converges. Assume that \(a_i>0\) for all \(i\).
Solution:
Assume that \(\sum a_n\) converges. We have \(\frac{a_k}{1+a_k}<a_k\) for all positive integers \(k\). Thus \(\sum \left(\frac{a_k}{1+a_k} \right) <\sum a_k\); since the right-hand side converges, the comparison test tells us the left-hand side converges.
Next, assume that \(\sum \left(\frac{a_k}{1+a_k} \right)\) converges. Then \(\frac{a_k}{1+a_k}\rightarrow 0\) as \(k \rightarrow \infty\), and writing \(t_k=\frac{a_k}{1+a_k}\) we get \(a_k=\frac{t_k}{1-t_k}\rightarrow 0\) as \(k\rightarrow \infty\). In particular, \(a_k<1\) eventually, so \(1+a_k<2\) and hence \(\frac{a_k}{2} <\frac{a_k}{1+a_k}\) eventually. Since the series on the right converges, the comparison test shows that \(\sum \frac{a_k}{2}\), and therefore \(\sum a_k\), converges.
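As a sanity check on Ex-1, the sketch below picks one concrete convergent positive series of our own, \(a_k = \frac{1}{2^k}\), and confirms numerically that both \(\sum a_k\) and \(\sum \frac{a_k}{1+a_k}\) settle to finite values.

```python
# Sanity check for Ex-1 with a_k = 1/2^k (a convergent positive series):
# both sum a_k and sum a_k/(1 + a_k) should approach finite limits.

a = lambda k: 1.0 / 2**k

S1 = S2 = 0.0
for k in range(1, 61):
    S1 += a(k)
    S2 += a(k) / (1.0 + a(k))
print(f"sum a_k           = {S1:.10f}")   # approaches 1
print(f"sum a_k/(1 + a_k) = {S2:.10f}")   # approaches its own finite limit
```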
Ratio test
Consider the series \(\sum a_n\) with \(a_n \ne 0\) for all \(n\) (a short numerical sketch follows the list below).
- If \(\left | \frac{a_{n+1}}{a_n} \right | \le q\) eventually, for some fixed \(0<q<1\), then \(\sum |a_n|\) converges.
- If \(\left | \frac{a_{n+1}}{a_n} \right | \ge 1\) eventually, then \(\sum a_n\) diverges.
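Here is a minimal numerical sketch of the first case, using our own example \(a_n = \frac{x^n}{n!}\) with \(x=3\): the ratios \(\left|\frac{a_{n+1}}{a_n}\right| = \frac{|x|}{n+1}\) eventually fall below any fixed \(q<1\), so the series converges (its sum is \(e^x\)).

```python
import math

# Ratio test for a_n = x^n / n! at x = 3: |a_(n+1)/a_n| = |x|/(n+1) -> 0,
# so the ratios are eventually <= q for any 0 < q < 1 and the series converges.

x = 3.0
a = lambda n: x**n / math.factorial(n)

for n in (1, 5, 10, 50):
    print(f"n={n:>2}  |a_(n+1)/a_n| = {abs(a(n + 1) / a(n)):.4f}")

# Partial sums approach e^x = exp(3) = 20.0855...
print(sum(a(n) for n in range(0, 30)), math.exp(x))
```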
Test Your Concepts:
Show that if \(0 \le a_n \le x^n\) eventually for some \(0<x<1\), then \(\sum a_n\) is convergent. Can we say the same for \(x=1\)?
2.1 Cauchy Sequences
In our earlier discussion we used the term ‘eventually’ to describe the behaviour of sequences. When we say that \(a_n \approx b_n\) eventually, what we mean is the following:
Given any \(\epsilon >0\), we can find an integer \(N\) such that for all \(n>N\), \(|a_n-b_n|<\epsilon\). In other words, no matter how small the number \(\epsilon\) is, we can find a sufficiently large integer \(N\) such that the difference between the two sequences from the \((N+1)\)th term onwards is less than \(\epsilon\). We shall cover this concept in more detail when we cover limits.
A sequence \(\left \{ a_n \right \}\) is said to be Cauchy if for every \(\epsilon>0\), there exists an integer \(N\) such that, for all \(m,n\ge N\), \(|a_n-a_m|<\epsilon\). Every convergent sequence is a Cauchy sequence.
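As a rough numerical picture (the sequences below are our own examples), the partial sums of \(\sum \frac{1}{k^2}\) converge and are therefore Cauchy, so the gaps \(|S_{2n}-S_n|\) shrink towards \(0\); the harmonic partial sums, by contrast, are not Cauchy, and this gap stays above \(\frac{1}{2}\).

```python
# Cauchy check on partial sums: S_n of 1/k^2 (convergent, hence Cauchy)
# versus S_n of the harmonic series (not Cauchy: S_(2n) - S_n stays above 1/2).

def S(terms, n):
    return sum(terms(k) for k in range(1, n + 1))

sq = lambda k: 1.0 / (k * k)
harm = lambda k: 1.0 / k

for n in (10, 100, 1000):
    print(f"n={n:>4}  |S_2n - S_n| for 1/k^2: {S(sq, 2*n) - S(sq, n):.6f}   "
          f"for 1/k: {S(harm, 2*n) - S(harm, n):.6f}")
# The 1/k^2 gaps shrink towards 0, while the harmonic gaps hover near ln 2 = 0.693...
```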
2.3 Power Series and Radius of Convergence
Let \(a_n\) be a sequence of real numbers and \(c\in\mathbb{R}\). The series \(\sum\limits_{n=0}^\infty a_n(x-c)^n\) is defined to be a power series centered at \(c\). Thus \(\sum\limits_{n=0}^\infty \frac{x^n}{n!}\) and \(\sum\limits_{n=0}^\infty x^n\) are both power series centered at \(0\). Note that for any power series centered at \(c\), the substitution \(y=x-c\) turns it into \(\sum\limits_{n=0}^\infty a_n y^n\), a power series centered at \(0\).
If the power series \(\sum\limits_{n=0}^\infty a_n(x-c)^n\) converges for \(|x-c|<R\) and diverges for \(|x-c|>R\), for some \(0 \le R \le \infty\), then \(R\) is called the Radius of Convergence of this power series. The series may or may not converge at \(|x-c|=R\). Every power series has a radius of convergence (possibly \(R=\infty\), in which case the series converges for every \(x\)). The proofs of these theorems will be covered in more advanced College Math Modules.
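When the limit exists, a standard way to compute \(R\) is the ratio formula \(R = \lim\limits_{n\to\infty}\left|\frac{a_n}{a_{n+1}}\right|\), which comes from applying the ratio test to the terms \(a_n(x-c)^n\). The sketch below estimates it for two series of our own choosing, \(\sum \frac{x^n}{n!}\) and \(\sum 2^n x^n\).

```python
import math

# Estimate the radius of convergence via R = lim |a_n / a_(n+1)|
# (valid when this limit exists; it follows from the ratio test).

def ratio(a, n):
    return abs(a(n) / a(n + 1))

exp_coeff = lambda n: 1.0 / math.factorial(n)   # sum x^n / n!  ->  R = infinity
geo_coeff = lambda n: 2.0**n                    # sum 2^n x^n   ->  R = 1/2

for n in (5, 20, 50):
    print(f"n={n:>2}  x^n/n!: |a_n/a_(n+1)| = {ratio(exp_coeff, n):6.1f}   "
          f"2^n x^n: {ratio(geo_coeff, n):.3f}")
# The first ratio grows without bound (R = infinity); the second settles at 0.5 (R = 1/2).
```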
Test Your Concepts:
What is the radius of convergence for \(\sum\limits_{n=0}^\infty x^n\)?