In mathematics, a series is the sum of the terms of an infinite sequence of numbers. More precisely, an infinite sequence $(a_1, a_2, a_3, \ldots)$ defines a series $S$ that is denoted
$$S = a_1 + a_2 + a_3 + \cdots = \sum_{k=1}^{\infty} a_k.$$
The nth partial sum $S_n$ is the sum of the first n terms of the sequence; that is,
$$S_n = a_1 + a_2 + \cdots + a_n = \sum_{k=1}^{n} a_k.$$
A series is convergent (or converges) if the sequence $(S_1, S_2, S_3, \ldots)$ of its partial sums tends to a limit; that means that, when adding one $a_k$ after the other in the order given by the indices, one gets partial sums that become closer and closer to a given number. More precisely, a series converges if there exists a number $\ell$ such that for every arbitrarily small positive number $\varepsilon$, there is a (sufficiently large) integer $N$ such that for all $n \ge N$,
$$\left|S_n - \ell\right| < \varepsilon.$$
If the series is convergent, the (necessarily unique) number $\ell$ is called the sum of the series.
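As a concrete illustration of this definition, the following minimal sketch (Python, using the geometric series $\sum_{k=1}^{\infty} 2^{-k}$, whose sum is 1, as an assumed example) finds an $N$ for a given $\varepsilon$; since these partial sums increase monotonically toward 1, every later partial sum also stays within $\varepsilon$ of the limit.

```python
# Illustrative check of the epsilon-N definition for the geometric series
# sum_{k>=1} 2^(-k), whose sum is 1: find an N with |S_N - 1| < eps.
eps = 1e-6        # an arbitrarily small positive number
limit = 1.0       # the sum of the series
partial = 0.0
n = 0
while abs(partial - limit) >= eps:
    n += 1
    partial += 2.0 ** (-n)    # S_n = S_{n-1} + a_n
print(f"N = {n}: |S_N - 1| = {abs(partial - limit):.3e} < {eps}")
```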
The same notation
$$\sum_{k=1}^{\infty} a_k$$
is used for the series and, if it is convergent, for its sum. This convention is similar to that used for addition: $a + b$ denotes the operation of adding $a$ and $b$ as well as the result of this addition, which is called the sum of $a$ and $b$.
Any series that is not convergent is said to be divergent or to diverge.
Examples of convergent and divergent series
- The reciprocals of the positive integers produce a divergent series (harmonic series; see the numerical sketch after this list):
  $$\frac{1}{1}+\frac{1}{2}+\frac{1}{3}+\frac{1}{4}+\frac{1}{5}+\frac{1}{6}+\cdots \to \infty.$$
- Alternating the signs of the reciprocals of positive integers produces a convergent series (alternating harmonic series):
  $$\frac{1}{1}-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\frac{1}{5}-\cdots = \ln 2.$$
- The reciprocals of prime numbers produce a divergent series (so the set of primes is "large"; see divergence of the sum of the reciprocals of the primes):
  $$\frac{1}{2}+\frac{1}{3}+\frac{1}{5}+\frac{1}{7}+\frac{1}{11}+\frac{1}{13}+\cdots \to \infty.$$
- The reciprocals of triangular numbers produce a convergent series:
  $$\frac{1}{1}+\frac{1}{3}+\frac{1}{6}+\frac{1}{10}+\frac{1}{15}+\frac{1}{21}+\cdots = 2.$$
- The reciprocals of factorials produce a convergent series (see e):
  $$\frac{1}{1}+\frac{1}{1}+\frac{1}{2}+\frac{1}{6}+\frac{1}{24}+\frac{1}{120}+\cdots = e.$$
- The reciprocals of square numbers produce a convergent series (the Basel problem):
  $$\frac{1}{1}+\frac{1}{4}+\frac{1}{9}+\frac{1}{16}+\frac{1}{25}+\frac{1}{36}+\cdots = \frac{\pi^2}{6}.$$
- The reciprocals of powers of 2 produce a convergent series (so the set of powers of 2 is "small"):
  $$\frac{1}{1}+\frac{1}{2}+\frac{1}{4}+\frac{1}{8}+\frac{1}{16}+\frac{1}{32}+\cdots = 2.$$
- The reciprocals of powers of any $n>1$ produce a convergent series:
  $$\frac{1}{1}+\frac{1}{n}+\frac{1}{n^2}+\frac{1}{n^3}+\frac{1}{n^4}+\cdots = \frac{n}{n-1}.$$
- Alternating the signs of reciprocals of powers of 2 also produces a convergent series:
  $$\frac{1}{1}-\frac{1}{2}+\frac{1}{4}-\frac{1}{8}+\frac{1}{16}-\cdots = \frac{2}{3}.$$
- Alternating the signs of reciprocals of powers of any $n>1$ produces a convergent series:
  $$\frac{1}{1}-\frac{1}{n}+\frac{1}{n^2}-\frac{1}{n^3}+\frac{1}{n^4}-\cdots = \frac{n}{n+1}.$$
- The reciprocals of Fibonacci numbers produce a convergent series (see ψ, the reciprocal Fibonacci constant):
  $$\frac{1}{1}+\frac{1}{1}+\frac{1}{2}+\frac{1}{3}+\frac{1}{5}+\frac{1}{8}+\cdots = \psi.$$
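The following is a minimal numerical sketch (Python, illustrative only) contrasting the divergent harmonic series with the convergent series of reciprocal squares from the list above: the partial sums of the first keep growing, while the second approaches $\pi^2/6 \approx 1.6449$.

```python
import math

# Partial sums of the harmonic series (divergent) and the Basel series (convergent).
harmonic = 0.0
basel = 0.0
for n in range(1, 1_000_001):
    harmonic += 1.0 / n
    basel += 1.0 / n**2

print(f"harmonic partial sum after 10^6 terms: {harmonic:.4f} (keeps growing, roughly ln n)")
print(f"reciprocal squares after 10^6 terms:   {basel:.6f} (pi^2/6 = {math.pi**2/6:.6f})")
```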
Convergence tests
There are a number of methods for determining whether a series converges or diverges.
Comparison test. The terms of the sequence $(a_n)$ are compared to those of another sequence $(b_n)$. If, for all $n$, $0 \le a_n \le b_n$, and $\sum_{n=1}^{\infty} b_n$ converges, then so does $\sum_{n=1}^{\infty} a_n.$
However, if, for all $n$, $0 \le b_n \le a_n$, and $\sum_{n=1}^{\infty} b_n$ diverges, then so does $\sum_{n=1}^{\infty} a_n.$
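For instance (an assumed example, sketched below), $0 \le 1/(n^2+n) \le 1/n^2$ for all $n \ge 1$, and $\sum 1/n^2$ converges, so the comparison test gives convergence of $\sum 1/(n^2+n)$.

```python
# Sketch: the comparison test for a_n = 1/(n^2 + n) <= b_n = 1/n^2 (assumed example).
# The partial sums of a_n stay below those of b_n, which are bounded by pi^2/6.
a_sum = 0.0
b_sum = 0.0
for n in range(1, 100_001):
    a_sum += 1.0 / (n * n + n)
    b_sum += 1.0 / (n * n)

print(f"sum of 1/(n^2+n): {a_sum:.6f}  (telescopes to 1)")
print(f"sum of 1/n^2:     {b_sum:.6f}  (pi^2/6 ~ 1.644934)")
```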
Ratio test. Assume that for all $n$, $a_n$ is not zero. Suppose that there exists $r$ such that
$$\lim_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| = r.$$
If r < 1, then the series is absolutely convergent. If r > 1, then the series diverges. If r = 1, the ratio test is inconclusive, and the series may converge or diverge.
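A brief sketch (Python, with $a_n = n/2^n$ as an assumed example) shows the ratios approaching $r = 1/2 < 1$, so that series converges absolutely.

```python
# Sketch: estimating the ratio-test limit for a_n = n / 2^n (an assumed example).
# The ratios |a_(n+1) / a_n| = (n + 1) / (2n) approach r = 1/2 < 1.
def a(n):
    return n / 2.0 ** n

for n in (10, 100, 1000):
    print(f"n = {n:4d}: |a_(n+1)/a_n| = {a(n + 1) / a(n):.6f}")
```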
Root test or nth root test. Suppose that the terms of the sequence in question are non-negative. Define $r$ as follows:
$$r = \limsup_{n \to \infty} \sqrt[n]{a_n},$$
where "lim sup" denotes the limit superior (possibly ∞; if the limit exists it is the same value).
If r < 1, then the series converges. If r > 1, then the series diverges. If r = 1, the root test is inconclusive, and the series may converge or diverge.
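For the same assumed example $a_n = n/2^n$, the nth roots $\sqrt[n]{a_n} = \sqrt[n]{n}/2$ approach $1/2 < 1$; the sketch below (Python, using logarithms to avoid overflow) estimates this limit numerically.

```python
import math

# Sketch: estimating the root-test limit for a_n = n / 2^n (assumed example).
# The nth roots n^(1/n) / 2 approach 1/2 < 1, so the series converges.
for n in (10, 100, 1000, 10000):
    log_a_n = math.log(n) - n * math.log(2)       # log of a_n, avoids overflow
    print(f"n = {n:6d}: a_n^(1/n) = {math.exp(log_a_n / n):.6f}")
```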
The ratio test and the root test are both based on comparison with a geometric series, and as such they work in similar situations. In fact, if the ratio test works (meaning that the limit exists and is not equal to 1) then so does the root test; the converse, however, is not true. The root test is therefore more generally applicable, but as a practical matter the limit is often difficult to compute for commonly seen types of series.
Integral test. The series can be compared to an integral to establish convergence or divergence. Let $f(n) = a_n$ be a positive and monotonically decreasing function. If
$$\int_{1}^{\infty} f(x)\,dx = \lim_{t \to \infty} \int_{1}^{t} f(x)\,dx < \infty,$$
then the series converges. But if the integral diverges, then the series does so as well.
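As an illustration (assumed examples, not from the original text), $\int_1^t x^{-2}\,dx = 1 - 1/t$ stays bounded, so $\sum 1/n^2$ converges, whereas $\int_1^t x^{-1}\,dx = \ln t$ grows without bound, matching the divergence of the harmonic series. A quick numerical sketch:

```python
import math

# Sketch: the integral test for f(x) = 1/x^2 versus f(x) = 1/x (assumed examples).
# The first integral stays bounded as t grows; the second does not.
for t in (10.0, 1e3, 1e6):
    print(f"t = {t:>9}: integral of 1/x^2 = {1 - 1/t:.6f},  integral of 1/x = {math.log(t):.4f}")
```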
Limit comparison test. If $(a_n), (b_n) > 0$, and the limit $\lim_{n \to \infty} \frac{a_n}{b_n}$ exists and is not zero, then $\sum_{n=1}^{\infty} a_n$ converges if and only if $\sum_{n=1}^{\infty} b_n$ converges.
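For instance (an assumed example), $a_n = 1/(n^2 + 3n)$ and $b_n = 1/n^2$ have ratio $a_n/b_n = n/(n+3) \to 1$, a nonzero limit, so the two series converge together; a minimal numerical sketch:

```python
# Sketch: the limit comparison test for a_n = 1/(n^2 + 3n) against b_n = 1/n^2
# (assumed example).  The ratios a_n / b_n approach 1, a nonzero limit.
for n in (10, 1000, 100000):
    a_n = 1.0 / (n * n + 3 * n)
    b_n = 1.0 / (n * n)
    print(f"n = {n:6d}: a_n / b_n = {a_n / b_n:.6f}")
```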
Alternating series test. Also known as the Leibniz criterion, the alternating series test states that for an alternating series of the form $\sum_{n=1}^{\infty} (-1)^n a_n$, if $(a_n)$ is monotonically decreasing and has a limit of 0 at infinity, then the series converges.
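A standard instance is the alternating harmonic series, where $a_n = 1/n$ decreases to 0; the short sketch below (Python, illustrative only) shows its partial sums settling toward $\ln 2 \approx 0.6931$.

```python
import math

# Sketch: partial sums of the alternating harmonic series 1 - 1/2 + 1/3 - ...
# The terms 1/n decrease to 0, so the Leibniz criterion guarantees convergence;
# the partial sums oscillate around, and close in on, ln 2.
s = 0.0
for n in range(1, 10_001):
    s += (-1) ** (n + 1) / n
print(f"partial sum after 10^4 terms: {s:.6f}   (ln 2 = {math.log(2):.6f})")
```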
Cauchy condensation test. If $(a_n)$ is a positive monotone decreasing sequence, then $\sum_{n=1}^{\infty} a_n$ converges if and only if $\sum_{k=0}^{\infty} 2^k a_{2^k}$ converges.
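Applied to the harmonic series $a_n = 1/n$ (an assumed illustration), the condensed terms are $2^k \cdot 2^{-k} = 1$, which do not shrink, so the condensed series and hence the harmonic series diverge; for $a_n = 1/n^2$ the condensed terms $2^k \cdot 2^{-2k} = 2^{-k}$ form a convergent geometric series.

```python
# Sketch: Cauchy condensation applied to a_n = 1/n (assumed example).
# The condensed terms 2^k * a_(2^k) = 2^k / 2^k = 1 never shrink, so the
# condensed series diverges and therefore so does the harmonic series.
for k in range(5):
    n = 2 ** k
    print(f"k = {k}: 2^k * a_(2^k) = {n * (1.0 / n):.1f}")
```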
Conditional and absolute convergence
For any sequence $(a_1, a_2, a_3, \ldots)$, $a_n \le |a_n|$ for all $n$. Therefore,
$$\sum_{n=1}^{\infty} a_n \le \sum_{n=1}^{\infty} |a_n|.$$
This means that if $\sum_{n=1}^{\infty} |a_n|$ converges, then $\sum_{n=1}^{\infty} a_n$ also converges (but not vice versa).
If the series $\sum_{n=1}^{\infty} |a_n|$ converges, then the series $\sum_{n=1}^{\infty} a_n$ is absolutely convergent. The Maclaurin series of the exponential function is absolutely convergent for every complex value of the variable.
If the series $\sum_{n=1}^{\infty} a_n$ converges but the series $\sum_{n=1}^{\infty} |a_n|$ diverges, then the series is conditionally convergent. The Maclaurin series of the logarithm function $\ln(1+x)$ is conditionally convergent for x = 1.
The Riemann series theorem states that if a series converges conditionally, it is possible to rearrange the terms of the series in such a way that the series converges to any value, or even diverges.
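The sketch below (illustrative Python, with the target value chosen arbitrarily) applies the usual greedy rearrangement to the conditionally convergent alternating harmonic series: take positive terms until the running sum exceeds the target, then negative terms until it drops below, and repeat; the rearranged partial sums approach the chosen target.

```python
# Sketch: greedy rearrangement of the alternating harmonic series 1 - 1/2 + 1/3 - ...
# so that its partial sums approach an arbitrarily chosen target (Riemann series theorem).
target = 2.5          # arbitrary target value (assumption for the illustration)
pos = 1               # next odd denominator: positive terms 1, 1/3, 1/5, ...
neg = 2               # next even denominator: negative terms -1/2, -1/4, ...
s = 0.0
for _ in range(1_000_000):
    if s <= target:
        s += 1.0 / pos
        pos += 2
    else:
        s -= 1.0 / neg
        neg += 2
print(f"rearranged partial sum: {s:.6f}  (target {target})")
```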
Uniform convergence
Let $(f_1, f_2, f_3, \ldots)$ be a sequence of functions. The series $\sum_{n=1}^{\infty} f_n$ is said to converge uniformly to $f$ if the sequence of partial sums defined by
$$s_N(x) = \sum_{n=1}^{N} f_n(x)$$
converges uniformly to f.
There is an analogue of the comparison test for infinite series of functions called the Weierstrass M-test.
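As an illustration of the M-test (an assumed example, not from the original text), take $f_n(x) = \sin(nx)/n^2$; since $|f_n(x)| \le M_n = 1/n^2$ for every $x$ and $\sum M_n$ converges, the series of functions converges uniformly. The sketch below estimates the tail bound, which is independent of $x$.

```python
# Sketch: Weierstrass M-test for f_n(x) = sin(n x) / n^2 (assumed example).
# |f_n(x)| <= M_n = 1/n^2 for every x, and sum M_n converges, so the series of
# functions converges uniformly: the tail sum_{n > N} M_n bounds the error
# |f(x) - s_N(x)| for all x at once.
N = 1000
tail_estimate = sum(1.0 / n**2 for n in range(N + 1, 1_000_000))
print(f"estimate of sum_(n>N) M_n for N = {N}: {tail_estimate:.6f}  (about 1/N)")
```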
Cauchy convergence criterion
The Cauchy convergence criterion states that a series
$$\sum_{n=1}^{\infty} a_n$$
converges if and only if the sequence of partial sums is a Cauchy sequence. This means that for every $\varepsilon > 0$ there is a positive integer $N$ such that for all $n \ge m \ge N$ we have
$$\left| \sum_{k=m}^{n} a_k \right| < \varepsilon.$$
This is equivalent to
$$\lim_{n \to \infty} \left( \sup_{p \ge 0} \left| \sum_{k=n}^{n+p} a_k \right| \right) = 0.$$
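The sketch below (Python, with $\sum 1/n^2$ as an assumed example) checks the criterion numerically: blocks of consecutive terms starting at ever larger indices have arbitrarily small sums, no matter how long the block is.

```python
# Sketch: Cauchy criterion for sum 1/n^2 (assumed example).  The block sums
# |a_m + ... + a_n| shrink as m grows, regardless of the block length.
def block_sum(m, n):
    return sum(1.0 / k**2 for k in range(m, n + 1))

for m in (10, 100, 1000):
    print(f"m = {m:5d}: sum from m to 10*m = {block_sum(m, 10 * m):.6f}")
```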
External links
- "Series", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- Weisstein, Eric (2005). Riemann Series Theorem. Retrieved May 16, 2005.