Real Analysis (2)


Sequence

The Limit of Sequence

Definition 2.2.1: Definition of Sequence

A sequence is a function whose domain is \(\mathbb{N}\). Given a function \(f: \mathbb{N} \rightarrow \mathbb{R}\), \(f(n)\) is just the \(n\)th term on the list.

\((1, \frac{1}{2}, \frac{1}{3}, \ldots)\) is a sequence, given by the function \[f(n) = \frac{1}{n}, \quad n \in \mathbb{N}\]

Definition 2.2.3: Convergence of a Sequence

A sequence \((a_n)\) converges to a real number \(a\) if, for every positive number \(\epsilon\), there exists an \(N \in \mathbb{N}\) such that whenever \(n \geq N\), it follows that \(|a_n - a| < \epsilon\).

\[\lim_{n \rightarrow \infty} a_n = a\]
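The \(\epsilon\)-\(N\) definition can be explored numerically. The sketch below (an illustration, not a proof, since only finitely many terms can be checked) searches for the smallest valid \(N\) for \(a_n = 1/n\) converging to \(0\):

```python
# Numerical illustration of Definition 2.2.3 for a_n = 1/n, a = 0.
# For each epsilon we search for the smallest N that works, by scanning
# backward from a (finite) search limit. Illustrative only: a finite
# check can suggest, but never prove, convergence.

def find_N(a_n, a, eps, search_limit=10**6):
    """Smallest N such that |a_n(n) - a| < eps for all
    n = N, ..., search_limit (checked by backward scan)."""
    N = search_limit
    for n in range(search_limit, 0, -1):
        if abs(a_n(n) - a) >= eps:
            return n + 1  # first failure found; N is one past it
        N = n
    return N

for eps in [0.1, 0.01, 0.001]:
    print(eps, find_N(lambda n: 1 / n, 0, eps))
```

For \(1/n\) the answer agrees with the algebra: \(|1/n| < \epsilon\) exactly when \(n > 1/\epsilon\), so \(\epsilon = 0.1\) yields \(N = 11\).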

Definition 2.2.4

Given a real number \(a \in \mathbb{R}\) and a positive number \(\epsilon > 0\), the set:

\[V_{\epsilon} (a) = \{x \in \mathbb{R}: |x - a| < \epsilon\}\]

is called the \(\boldsymbol{\epsilon}\)-neighborhood of \(a\). In other words, \(V_{\epsilon} (a)\) is an interval, centered at \(a\), with radius \(\epsilon\).

Definition 2.2.3B

A sequence \((a_n)\) converges to \(a\) if, given any \(\epsilon\)-neighborhood \(V_{\epsilon}(a)\) of \(a\), there exists a point in the sequence after which all of the terms are in \(V_{\epsilon} (a)\).

Theorem 2.2.7: Uniqueness of Limits

The limit of a sequence, when it exists, must be unique.

The Algebraic and Order Limit Theorems

Definition 2.3.1: Bounded Sequence

A sequence \((x_n)\) is bounded if there exists a number \(M > 0\) such that \(|x_n| \leq M\) for all \(n \in \mathbb{N}\). Geometrically, this means that we can find an interval \([-M, M]\) that contains every term of the sequence \((x_n)\).

Theorem 2.3.2

Every convergent sequence is bounded.

Theorem 2.3.3: Algebraic Limit Theorem

Let \(\lim a_n = a\) and \(\lim b_n = b\). Then:

  1. \(\lim (c a_n) = ca, \forall c \in \mathbb{R}\)
  2. \(\lim(a_n + b_n) = a + b\)
  3. \(\lim(a_n b_n) = ab\)
  4. \(\lim(a_n / b_n) = a / b\), provided \(b \neq 0\)

Theorem 2.3.4: Order Limit Theorem

Assume \(\lim a_n = a\) and \(\lim b_n = b\):

  1. If \(a_n > 0\) for all \(n \in \mathbb{N}\), then \(a \geq 0\).
  2. If \(a_n \leq b_n\) for all \(n \in \mathbb{N}\), then \(a \leq b\).
  3. If there exists \(c \in \mathbb{R}\) for which \(c \leq b_n\) for all \(n \in \mathbb{N}\), then \(c \leq b\). Similarly for \(a_n\).

The Monotone Convergence Theorem and a First Look at Infinite Series

Definition 2.4.1: Monotone

A sequence \((a_n)\) is increasing if \(a_n \leq a_{n+1}\) for all \(n \in \mathbb{N}\) and decreasing if \(a_n \geq a_{n+1}\) for all \(n \in \mathbb{N}\). A sequence is monotone if it is either increasing or decreasing.

Theorem 2.4.2: Monotone Convergence Theorem

If a sequence is monotone and bounded, then it converges.
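A classic way to see the theorem in action (the example is assumed here for illustration, not taken from the text): define \(a_1 = \sqrt{2}\), \(a_{n+1} = \sqrt{2 + a_n}\). One can show the sequence is increasing and bounded above by \(2\), so Theorem 2.4.2 guarantees a limit; solving \(L = \sqrt{2 + L}\) then gives \(L = 2\).

```python
import math

# Monotone Convergence Theorem illustration (assumed example):
# a_1 = sqrt(2), a_{n+1} = sqrt(2 + a_n) is increasing and bounded
# above by 2, hence convergent; the fixed point of sqrt(2 + x) is 2.

a = math.sqrt(2)
for _ in range(50):
    a = math.sqrt(2 + a)
print(a)  # very close to 2
```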

Definition 2.4.3: Convergence of a Series

Let \((b_n)\) be a sequence. An infinite series is a formal expression of the form:

\[\sum^{\infty}_{n=1} b_n = b_1 + b_2 + b_3 +....\]

We define the corresponding sequence of partial sums \((s_m)\) by

\[s_m = b_1 + b_2 + .... + b_m\]

and say that \(\sum^{\infty}_{n=1} b_n\) converges to \(B\) if the sequence \((s_m)\) converges to \(B\). In this case, we write:

\[\sum^{\infty}_{n=1} b_n = B\]

For example, consider the series \(\sum^{\infty}_{n=1} \frac{1}{n^2}\). Its partial sums are increasing, and

\[s_m = 1 + \frac{1}{4} + \cdots + \frac{1}{m^2} < 1 + \sum^{m}_{n=2} \frac{1}{n(n-1)} = 1 + \left(1 - \frac{1}{m}\right) < 2\]

so by the Monotone Convergence Theorem the sequence of partial sums converges to some value \(B\), even though the theorem does not tell us what \(B\) is.
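Numerically, the partial sums of \(\sum 1/n^2\) do stay below \(2\) and settle toward a limit. That limit, found by entirely different methods (Euler's famous result), is \(\pi^2/6 \approx 1.6449\):

```python
import math

# Partial sums of sum 1/n^2: increasing and bounded above by 2,
# hence convergent. The limit (known by other means) is pi^2/6.

s = 0.0
for n in range(1, 100001):
    s += 1 / n**2
print(s, math.pi**2 / 6)
```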

Harmonic Series

\[\sum^{\infty}_{n=1} \frac{1}{n}\]

The partial sums of the harmonic series are unbounded, so the series diverges.
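The divergence is glacial: the partial sums grow roughly like \(\ln m\) (in fact \(s_m - \ln m\) approaches Euler's constant \(\gamma \approx 0.5772\), a known fact stated here without proof):

```python
import math

# Harmonic partial sums: unbounded, but growing only like log(m).
# The difference s_m - log(m) tends to Euler's constant gamma.
for m in [10, 1000, 100000]:
    s = sum(1 / n for n in range(1, m + 1))
    print(m, s, s - math.log(m))
```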

Theorem 2.4.6: Cauchy Condensation Test

Suppose \((b_n)\) is decreasing and satisfies \(b_n \geq 0\) for all \(n \in \mathbb{N}\). Then, the series \(\sum^{\infty}_{n=1} b_n\) converges if and only if the series:

\[\sum^{\infty}_{n=0} 2^n b_{2^n} = b_1 + 2b_2 + 4b_4 + \cdots\]

converges.

Corollary 2.4.7

The series \(\sum^{\infty}_{n=1} \frac{1}{n^p}\) converges if and only if \(p > 1\).

Subsequences and the Bolzano-Weierstrass Theorem

Definition 2.5.1: Subsequence

Let \((a_n)\) be a sequence of real numbers, and let \(n_1 < n_2 < n_3 < \cdots\) be an increasing sequence of natural numbers. Then the sequence:

\[(a_{n_1}, a_{n_2}, a_{n_3}, \ldots)\]

is called a subsequence of \((a_n)\) and is denoted by \((a_{n_k})\), where \(k \in \mathbb{N}\) indexes the subsequence. Notice that the order of the terms in a subsequence is the same as in the original sequence, and repetitions are not allowed.

Theorem 2.5.2

Subsequences of a convergent sequence converge to the same limit as the original sequence.

Theorem 2.5.5: Bolzano-Weierstrass Theorem

Every bounded sequence contains a convergent subsequence.
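A small illustration (the example sequence is assumed, not from the text): \(a_n = (-1)^n + 1/n\) is bounded but divergent, oscillating near \(-1\) and \(1\). Bolzano-Weierstrass promises some convergent subsequence, and indeed the even-indexed terms converge to \(1\):

```python
# Bounded, divergent sequence a_n = (-1)^n + 1/n (assumed example).
# The even-indexed subsequence a_{2k} = 1 + 1/(2k) converges to 1,
# exhibiting the convergent subsequence that the theorem guarantees.

def a(n):
    return (-1) ** n + 1 / n

sub = [a(2 * k) for k in range(1, 8)]
print(sub)  # 1.5, 1.25, 1.1666..., heading toward 1
```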

The Cauchy Criterion

Definition 2.6.1: Cauchy Sequence

A sequence \((a_n)\) is called a Cauchy sequence if, for every \(\epsilon > 0\), there exists an \(N \in \mathbb{N}\) such that whenever \(m, n \geq N\) it follows that \(|a_n - a_m | < \epsilon\).

This definition resembles the definition of convergence. A sequence is a Cauchy sequence if, for every \(\epsilon\), there is a point in the sequence after which the terms are all closer to each other than the given \(\epsilon\).

Theorem 2.6.2

Every convergent sequence is a Cauchy sequence.

Lemma 2.6.3

Cauchy sequences are bounded.

Cauchy Criterion

A sequence is convergent if and only if it is a Cauchy sequence.
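A brute-force spot check of the Cauchy property for \(a_n = 1/n\) (illustrative only, since a computer can only inspect finitely many pairs): for \(m, n \geq N\) we have \(|1/n - 1/m| < 1/N\), so \(N = 101\) suffices for \(\epsilon = 0.01\).

```python
# Exhaustively check a finite window of pairs (m, n >= N) for a_n = 1/n:
# the worst gap should be 1/101 - 1/1000, which is below eps = 0.01.
eps, N = 0.01, 101
worst = max(abs(1 / n - 1 / m)
            for n in range(N, 1001) for m in range(N, 1001))
print(worst, worst < eps)
```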

Properties of Infinite Series

Given an infinite series \(\sum^{\infty}_{k=1} a_k\), it is important to keep a clear distinction between:

  1. the sequence of terms: \((a_1, a_2, a_3, ...)\)
  2. the sequence of partial sums: \((s_1, s_2, s_3, ...)\), where \(s_n = a_1 + a_2 + ... + a_n\)

The convergence of the series is defined in terms of the sequence \(s_n\). Specifically, the statement:

\[\sum^{\infty}_{k=1} a_k = A\]

Means that

\[\lim_{n \rightarrow \infty} s_n = A\]

Theorem 2.7.1: Algebraic Limit Theorem for Series

If \(\sum^{\infty}_{k=1} a_k = A\) and \(\sum^{\infty}_{k=1} b_k = B\):

  1. \(\sum^{\infty}_{k=1} ca_k = cA, \;\forall c \in \mathbb{R}\)
  2. \(\sum^{\infty}_{k=1} a_k + b_k = A + B\)

Theorem 2.7.2: Cauchy Criterion for Series

The series \(\sum^{\infty}_{k=1} a_k\) converges if and only if, given \(\epsilon > 0\), there exists an \(N \in \mathbb{N}\) such that whenever, \(n > m \geq N\) it follows that:

\[|a_{m+1} + a_{m+2} + .... + a_n| < \epsilon\]

Theorem 2.7.3

If the series \(\sum^{\infty}_{k=1} a_k\) converges, then \((a_k) \rightarrow 0\). (The converse is false.)

Theorem 2.7.4: Comparison Test

Assume \((a_k)\) and \((b_k)\) are sequences satisfying \(0 \leq a_k \leq b_k\) for all \(k \in \mathbb{N}\) (the inequality need not hold for every \(k\); it is enough that it holds eventually):

  1. If \(\sum^{\infty}_{k=1} b_k\) converges, then \(\sum^{\infty}_{k=1} a_k\) converges.
  2. If \(\sum^{\infty}_{k=1} a_k\) diverges, then \(\sum^{\infty}_{k=1} b_k\) diverges.

Definition 2.7.5: Geometric Series

A series is called geometric if it is of the form:

\[\sum^{\infty}_{k=0} a r^k = a + ar + ar^2 + ar^3 + ...\]

If \(r=1\) and \(a \neq 0\), the series evidently diverges. For \(r \neq 1\), the algebraic identity

\[(1 - r) (1 + r + r^2 + r^3 + \cdots + r^{m-1}) = 1 - r^m\]

gives the partial sums:

\[s_m = a + ar + ar^2 + \cdots + ar^{m-1} = a(1 + r + r^2 + \cdots + r^{m-1}) = \frac{a(1 - r^m)}{1 - r}\]

For \(|r| < 1\) we have \(r^m \rightarrow 0\), so

\[\lim_{m \rightarrow \infty} (1 - r^m) = 1\]

and by the Algebraic Limit Theorem (Theorem 2.3.3):

\[\sum^{\infty}_{k=0} a r^k = \frac{a}{1 - r} \quad \text{for } |r| < 1\]
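A quick numerical check of the finite formula \(s_m = a(1-r^m)/(1-r)\) and of the limit \(a/(1-r)\); the values \(a = 3\) and \(r = 1/2\) are assumed purely for illustration.

```python
# Compare the partial sum computed term by term against the closed
# form a(1 - r^m)/(1 - r), then against the limit a/(1 - r).
a, r = 3.0, 0.5
for m in [5, 20, 60]:
    s_m = sum(a * r**k for k in range(m))
    print(m, s_m, a * (1 - r**m) / (1 - r))
print(a / (1 - r))  # the infinite sum: 6.0
```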

Theorem 2.7.6: Absolute Convergence Test

If the series \(\sum^{\infty}_{n=1} |a_n|\) converges, then \(\sum^{\infty}_{n=1} a_n\) converges as well.

Theorem 2.7.7: Alternating Series Test

Let \((a_n)\) be a sequence satisfying:

  1. \(a_1 \geq a_2 \geq a_3 \geq ... \geq a_n \geq a_{n+1} \geq ...\)
  2. \((a_n) \rightarrow 0\)

Then the alternating series \(\sum^{\infty}_{n=1} (-1)^{n+1} a_n\) converges.
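The alternating harmonic series \(\sum (-1)^{n+1}/n\) satisfies both hypotheses, so it converges; its value (established by other means, not by the test itself) is \(\ln 2 \approx 0.6931\). Note that it does not converge absolutely, since the series of absolute values is the harmonic series.

```python
import math

# Partial sum of the alternating harmonic series, which the
# Alternating Series Test shows convergent; the limit is ln 2.
s = sum((-1) ** (n + 1) / n for n in range(1, 100001))
print(s, math.log(2))
```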

Definition 2.7.8: Conditional and Absolute Convergence

If \(\sum^{\infty}_{n=1} |a_n|\) converges, then we say that the original series \(\sum^{\infty}_{n=1} a_n\) converges absolutely. If, on the other hand, the \(\sum^{\infty}_{n=1} a_n\) converges but the series of absolute values \(\sum^{\infty}_{n=1} |a_n|\) does not converge, we say that the original series \(\sum^{\infty}_{n=1} a_n\) converges conditionally.

Definition 2.7.9 Rearrangement

Let \(\sum^{\infty}_{k=1} a_k\) be a series. A series \(\sum^{\infty}_{k=1} b_k\) is called a rearrangement of \(\sum^{\infty}_{k=1} a_k\) if there exists a one-to-one, onto function \(f: \mathbb{N} \rightarrow \mathbb{N}\) such that \(b_{f(k)} = a_k\) for all \(k \in \mathbb{N}\).

Theorem 2.7.10

If a series converges absolutely, then any rearrangement of this series converges to the same limit.
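Absolute convergence is essential here. The classical counterexample (assumed for illustration): the alternating harmonic series converges only conditionally, and rearranging it to take one positive term followed by two negative terms produces a series converging to a different value, \(\frac{1}{2}\ln 2\).

```python
import math

# Rearranged alternating harmonic series: 1 - 1/2 - 1/4 + 1/3 - 1/6
# - 1/8 + ... Each loop iteration adds one block
# 1/(2k-1) - 1/(4k-2) - 1/(4k), and the rearranged sum tends to
# (1/2) ln 2 rather than ln 2.
s = 0.0
for k in range(1, 50001):
    s += 1 / (2 * k - 1) - 1 / (4 * k - 2) - 1 / (4 * k)
print(s, 0.5 * math.log(2))
```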

Double Summations and Products of Infinite Series

In general, the two iterated sums of a doubly indexed array need not agree:

\[\sum^{\infty}_{i=1}\sum^{\infty}_{j=1} a_{ij} \neq \sum^{\infty}_{j=1}\sum^{\infty}_{i=1} a_{ij}\]

Theorem 2.8.1

Let \(\{a_{ij}: i, j \in \mathbb{N}\}\) be a doubly indexed array of real numbers. If

\[\sum^{\infty}_{i=1}\sum^{\infty}_{j=1} |a_{ij}|\]

converges, then both \(\sum^{\infty}_{i=1}\sum^{\infty}_{j=1} a_{ij}\) and \(\sum^{\infty}_{j=1}\sum^{\infty}_{i=1} a_{ij}\) converge to the same value. Moreover:

\[\lim_{n \rightarrow \infty} s_{nn} = \sum^{\infty}_{i=1}\sum^{\infty}_{j=1} a_{ij} = \sum^{\infty}_{j=1}\sum^{\infty}_{i=1} a_{ij}\]

where \(s_{nn} = \sum^{n}_{j=1}\sum^{n}_{i=1} a_{ij} = \sum^{n}_{i=1}\sum^{n}_{j=1} a_{ij}\)
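The hypothesis on \(\sum\sum |a_{ij}|\) cannot be dropped. A standard array where the iterated sums disagree (a classical example, assumed here): \(a_{ij} = 1\) if \(j = i\), \(a_{ij} = -1\) if \(j = i+1\), and \(0\) otherwise. Every row sums to \(0\), while the first column sums to \(1\) and every other column to \(0\), so the two iterated sums are \(0\) and \(1\); Theorem 2.8.1 does not apply because \(\sum\sum |a_{ij}|\) diverges.

```python
# Doubly indexed array with unequal iterated sums: 1 on the diagonal,
# -1 just above it, 0 elsewhere. Each inner sum below is exact once
# the truncation passes the two nonzero entries of that row/column.

def a(i, j):
    if j == i:
        return 1
    if j == i + 1:
        return -1
    return 0

N = 200
rows_then = sum(sum(a(i, j) for j in range(1, N + 2)) for i in range(1, N + 1))
cols_then = sum(sum(a(i, j) for i in range(1, N + 2)) for j in range(1, N + 1))
print(rows_then, cols_then)  # 0 1
```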