Geometric Series - Converge or Diverge?

Jason76

Senior Member
Joined
Oct 19, 2012
Messages
1,180
\(\displaystyle \sum_{n = 0}^{\infty}\dfrac{3}{10^{n + 1}}\)

\(\displaystyle r = \dfrac{1}{10}\), but how can we find \(\displaystyle r\)? Somebody said to divide one term of the series by the one before it, but when I do that, I don't get \(\displaystyle \dfrac{1}{10}\).

Here is the formula for \(\displaystyle r\):

\(\displaystyle r = \dfrac{a_{n+1}}{a_{n}}\) What goes in the numerator and the denominator in this case?

Anyhow, since \(\displaystyle \left|\dfrac{1}{10}\right| < 1\), the geometric series does NOT diverge.

But here is another question. Now we have to find the sum. Is it possible that, after this point, the series could still diverge?

\(\displaystyle S = \dfrac{a}{1 - r}\)

\(\displaystyle S = \dfrac{\dfrac{3}{10}}{1 - \dfrac{1}{10}} = \dfrac{1}{3}\) - The infinite series converges. But what does convergence have to do with the sum?
 

The first term of the series is \(\displaystyle a_0 = \dfrac{3}{10}\), when \(\displaystyle n=0\).

The second term of the series is \(\displaystyle a_1 = \dfrac{3}{100}\), when \(\displaystyle n=1\).

Now the common ratio is \(\displaystyle r =\dfrac{a_1}{a_0}= \dfrac{1}{10}\).

Use the sum formula \(\displaystyle S = \dfrac{a_0}{1 - r}\). Since \(\displaystyle |r| = \dfrac{1}{10} < 1\), the series converges, and "converges" means exactly that the partial sums approach the finite value \(\displaystyle S = \dfrac{3/10}{9/10} = \dfrac{1}{3}\). Nothing can "diverge later": convergence is a statement about the whole infinite sum.
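
Here is a quick numerical check (a minimal Python sketch, not part of the thread; the variable names are my own): the partial sums of \(\displaystyle \dfrac{3}{10^{n+1}}\) settle on \(\displaystyle \dfrac{1}{3}\), matching \(\displaystyle \dfrac{a_0}{1 - r}\).

Code:
a0 = 3 / 10   # first term of the series (n = 0)
r = 1 / 10    # common ratio

# Add up the first 20 terms of 3 / 10**(n + 1).
partial = 0.0
for n in range(20):
    partial += 3 / 10 ** (n + 1)

print(partial)        # ≈ 0.3333333333333333, i.e. 1/3
print(a0 / (1 - r))   # ≈ 0.3333333333333333, the closed-form sum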
 
Given \(\displaystyle n = 0\). That is, the index starts at 0, so \(\displaystyle a_0\) is the first term of the series.

\(\displaystyle a_{n} = a_{0} = \dfrac{3}{10^{(0) + 1}} = \dfrac{3}{10^{1}} = \dfrac{3}{10}\)

\(\displaystyle a_{n + 1} = a_{1} = \dfrac{3}{10^{(1) + 1}} = \dfrac{3}{10^{2}} = \dfrac{3}{100}\)

\(\displaystyle \dfrac{a_{n + 1}}{a_{n}} = \dfrac{a_{1}}{a_{0}} = \dfrac{\dfrac{3}{100}}{ \dfrac{3}{10}} = \dfrac{1}{10}\)


OK, that makes sense. And we can tell right away from \(\displaystyle r\) that the series converges, because \(\displaystyle |r| = \dfrac{1}{10} < 1\). After that we compute the sum. If instead \(\displaystyle |r| \ge 1\), the series would have diverged, and there would be no finite sum to compute.
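
For completeness, here is the partial-sum calculation that the \(\displaystyle |r| < 1\) test summarizes (a worked step, not written out in the thread):

\(\displaystyle S_N = \sum_{n = 0}^{N}\dfrac{3}{10^{n + 1}} = \dfrac{3}{10} \cdot \dfrac{1 - \left(\dfrac{1}{10}\right)^{N + 1}}{1 - \dfrac{1}{10}}\)

Since \(\displaystyle \left(\dfrac{1}{10}\right)^{N + 1} \to 0\) as \(\displaystyle N \to \infty\), the partial sums approach \(\displaystyle \dfrac{3/10}{9/10} = \dfrac{1}{3}\). If instead \(\displaystyle |r| \ge 1\), the \(\displaystyle r^{N + 1}\) term would not vanish and the partial sums would have no finite limit.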
 