Limit of x^2(e^(1/x) - 1 - 1/x)

bluemath

Hello,

I'm trying to find the limit, as x goes to +infinity, of \(\displaystyle x^2\, \left(e^{1/x}\, -\, 1\, -\, \frac{1}{x}\right)\). (The solution is 0.5.)

Well, \(\displaystyle x^2\, e^{1/x}\, -\, x^2\, -\, x\) gives me nothing, and everything else I tried looks similar; it seems impossible to get 0.5.

(I also tried putting u = 1/x, which gives \(\displaystyle \frac{e^u\, -\, 1\, -\, u}{u^2}\), but I see nothing after that.)

Could I have a lead, please?

Thanks
 

Impossible!!! - Delete that word from your vocabulary!!

Expand: \(\displaystyle x^2\, e^{1/x}\, =\, x^2\, \left[1\, +\, \frac{1}{x}\, +\, \frac{1}{2x^2}\, +\, \cdots\right]\)

Then:

\(\displaystyle x^2\, \left(e^{1/x}\, -\, 1\, -\, \frac{1}{x}\right)\, =\, \left[x^2\, +\, x\, +\, \frac{1}{2}\, +\, \cdots\right]\, -\, x^2\, -\, x\, =\, ?\)
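(For the record, a sketch of where the hint leads: the \(x^2\) and \(x\) terms cancel, and every omitted term carries a negative power of x, so only the constant survives:

\(\displaystyle \lim_{x\, \rightarrow\, +\infty}\, \left[\frac{1}{2}\, +\, \frac{1}{6x}\, +\, \cdots\right]\, =\, \frac{1}{2}\)

Making the "\(\cdots\)" rigorous is exactly the tail-bounding argument worked out near the end of this thread.)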
 
Really magic, thanks.

But I didn't know this expansion of \(e^{1/x}\).

Can you tell me more about it, please?
 
Mmmh, it is the definition of exp(x), actually.
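(Written out, the series used above is the power series for the exponential, valid for every real t, here evaluated at t = 1/x:

\(\displaystyle e^t\, =\, \sum_{n=0}^{\infty}\, \frac{t^n}{n!}\, =\, 1\, +\, t\, +\, \frac{t^2}{2!}\, +\, \frac{t^3}{3!}\, +\, \cdots\))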

Thanks a lot for your help.
 
Have you run into l'Hospital's Rule yet? If so: after making the u substitution, you have a zero-over-zero situation, so the rule applies; after the first pass you will still have a 0/0 form, so apply l'Hospital's Rule again.
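(Spelled out, that route applies l'Hospital's Rule twice, since each of the first two quotients is a 0/0 form, and then evaluates directly:

\(\displaystyle \lim_{u\, \rightarrow\, 0^+}\, \frac{e^u\, -\, 1\, -\, u}{u^2}\, =\, \lim_{u\, \rightarrow\, 0^+}\, \frac{e^u\, -\, 1}{2u}\, =\, \lim_{u\, \rightarrow\, 0^+}\, \frac{e^u}{2}\, =\, \frac{1}{2}\))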
 
Hi! Next time, please state which calculus class you are in (I, II, or III). Your substitution will help, as Ishuda pointed out. I just want to add something: using the substitution u = 1/x, when x goes to infinity, what does u tend to? Then try computing that limit.

Let us know how you make out.
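(That is, since u = 1/x tends to \(0^+\) as x tends to \(+\infty\), the problem becomes \(\displaystyle \lim_{u\, \rightarrow\, 0^+}\, \frac{e^u\, -\, 1\, -\, u}{u^2}\), the form the rest of the thread works with.)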
 
Sorry, I forgot to say that l'Hospital's Rule is forbidden here.

Hi! Next time, please state which calculus class you are in (I, II, or III).

L1 in France (if I understood your question correctly).
 
Hi! Next time, please state which calculus class you are in (I, II, or III).
L1 in France (if I understood your question correctly).
It's kind of a loaded question, because there is much variation in what is meant by Calc I, II, and III (and sometimes IV).

I'm going to guess that you're in differential calculus (so Calc I), where you're studying limits and derivatives; in particular, you have not studied integration yet.

(Integrals are covered, usually, in Calc II; advanced topics, such as vectors and multi-variable contexts, are often covered in Calc III.)

I'm trying to find:

. . . . .\(\displaystyle \displaystyle \lim_{x\, \rightarrow\, +\infty}\, \left[\, x^2\, \left(\, e^{\frac{1}{x}}\, -\, 1\, -\, \dfrac{1}{x}\, \right)\, \right]\)

(The solution is 0.5)

...l'Hospital's Rule [is] forbidden here.
Expand: \(\displaystyle x^2\, e^{1/x}\, =\, x^2\, \left[1\, +\, \frac{1}{x}\, +\, \frac{1}{2x^2}\, +\, \cdots\right]\)
I didn't know this expansion of \(e^{1/x}\)
Have you worked with expansions of the sort provided above by Subhotosh Khan? I'm guessing that you probably have not, both because of your follow-up question and also because power series, etc (here), are not usually introduced until after integrals have been covered. Sooo....

Has your book (or your instructor) covered any similar types of limits (so maybe, by working backwards, we can figure out what's being expected of you)? Are there maybe some tricks, formulas, algorithms, or methods that you're expected to apply? Because I'm kinda drawing a blank here. About the only thing I can come up with is this:

Let f(x) = e^x. Let g(x) = 1 + x. Let h(x) = f(x) - g(x). You can confirm that h(x) attains its minimum value of 0 at x = 0, is increasing after x = 0, and is concave up. Since h(x) > 0 for all x ≠ 0, we have f(x) > g(x) for all x ≠ 0; in particular, e^x > 1 + x for all nonzero x. Using e^(1/x) and 1 + (1/x), you can show that the expression is positive for every x > 0, so the limit, if it exists, is at least zero.
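(A quick check of those claims: \(h'(x)\, =\, e^x\, -\, 1\) is negative for x < 0, zero at x = 0, and positive for x > 0, while \(h''(x)\, =\, e^x\, >\, 0\), so h decreases to its minimum h(0) = 0 and increases afterward. In particular, for x > 0,

\(\displaystyle x^2\, \left(e^{1/x}\, -\, 1\, -\, \frac{1}{x}\right)\, =\, x^2\, h\!\left(\frac{1}{x}\right)\, >\, 0\))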

But this is barely any closer than we'd already been. To get what they're wanting, there must be some other method, some better technique, some superior bounding functions, which would allow us to, say, apply the Squeeze Theorem to this limit. I'm afraid that, at the moment, I'm not seeing it. But I'll keep kicking this around for a while.... :oops:
 
Sorry, I forgot to say that l'Hospital's Rule is forbidden here.

L1 in France (if I understood your question correctly).

O.K., let's go another way. Hold on to the u substitution for a moment; we'll also make another one, but first let's define a function
f(u) = \(\displaystyle e^u\, -\, u\)
so that our limit expression can be written as
L0(u) = \(\displaystyle \frac{f(u)\, -\, 1}{u^2}\, =\, \frac{f(u)\, -\, f(0)}{u^2}\)
If instead we make the substitution x = -1/u, then our original expression can be written as
L1(u) = \(\displaystyle \frac{f(-u)\, -\, 1}{u^2}\, =\, \frac{f(-u)\, -\, f(0)}{u^2}\)
Note that L1(u) = L0(-u), so the two substitutions approach u = 0 from the two sides. Granting that the two-sided limit exists, the limit we want is also the limit of the average
L(u) = \(\displaystyle \frac{L_0(u)\, +\, L_0(-u)}{2}\)

So consider
\(\displaystyle \underset{u\, \to\, 0}{lim}\, L(u)\) = \(\displaystyle \underset{u\, \to\, 0}{lim}\, \frac{L_0(u)\, + \, L_0(-u)}{2}\)
...
=\(\displaystyle \underset{u\, \to\, 0}{lim}\, \frac{\frac{f(u)\, -\, f(0)}{u\, -\,0}\, -\, \frac{f(-u)\, -\, f(0)}{-u\, -\,0}}{u\, -\,(-u)}\)
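(A sketch of the finish: the last display is half of the symmetric second-difference quotient,

\(\displaystyle L(u)\, =\, \frac{1}{2}\, \cdot\, \frac{f(u)\, -\, 2f(0)\, +\, f(-u)}{u^2}\)

and for a function whose second derivative is continuous at 0 that quotient tends to \(f''(0)\) as u goes to 0. Here \(f''(u)\, =\, e^u\), so the limit is \(\displaystyle \frac{f''(0)}{2}\, =\, \frac{e^0}{2}\, =\, \frac{1}{2}\).)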
 
Thanks a lot, Ishuda! Really nice.

A question about the first answer, where we used the definition of the exponential:

How can we be sure that an infinite sum of terms converges towards zero at +infinity?

Suppose, for example, the limit at +infinity of \(\displaystyle \frac{1}{n}\, +\, \frac{1}{n^2}\, +\, \frac{1}{n^3}\, +\, \cdots\, +\, \frac{1}{n^k}\, +\, \cdots\)
All the terms converge to zero, but how do we know whether the sum converges towards the same limit (zero)?
 
Well, an infinite series (that is, an infinite number of terms added together) whose terms are all nonnegative can only converge (sum) to zero if every term is zero. An infinite series can converge to some real number L, and many do just that. But other series diverge (sum to infinity or negative infinity). There's a theorem in my Calculus textbook called the "Divergence Test"; your book probably calls it something different. It says, basically, that if we have an infinite series whose terms don't converge to zero, then the series diverges. Conversely, however, if the terms do go to 0, the series may or may not converge to a real number; we don't know in that case, and must use a different test. Here are some examples where the individual terms converge to zero but the series behave differently:

\(\displaystyle \sum _{k=1}^{\infty }\:\frac{1}{2^k}=\frac{1}{2^1}+\frac{1}{2^2}+\frac{1}{2^3}+...=1\)

\(\displaystyle \sum _{k=1}^{\infty }\:\frac{1}{n^k}=\frac{1}{n^1}+\frac{1}{n^2}+\frac{1}{n^3}+...=\frac{1}{n-1}\) (a geometric series; the sum holds for any fixed n > 1)

\(\displaystyle \sum _{k=1}^{\infty }\:\frac{1}{k}=\frac{1}{1}+\frac{1}{2}+\frac{1}{3}+...=\infty\) (This series is sometimes called the "harmonic series", and it diverges.)
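(Tying this back to your question: the middle example is exactly your sum. For each fixed n > 1 it adds up to \(\displaystyle \frac{1}{n-1}\), and \(\displaystyle \frac{1}{n-1}\) tends to 0 as n tends to \(\infty\); so in this particular case the sum really does go to zero.)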
 
In general you don't have that an infinite sum of vanishing terms converges to zero and, in fact, that sum can be anything you like: think of an integral as \(\displaystyle \Sigma\, f(x_n)\, \Delta x\) and let \(\displaystyle \Delta x\) go to zero, so that each term goes to zero while the sum tends to the integral.

However, in certain cases the limit is a particular number; certainly that works for suitable power series. Looking at the problem here (and making the u substitution), write the numerator as
f(u) = \(\displaystyle e^u\, -\, 1\, -\, u\, =\, u^2\, \left[\frac{1}{2}\, +\, \sum_{n=3}^{\infty}\, \frac{u^{n-2}}{n!}\right]\, =\, u^2\, \left[\frac{1}{2}\, +\, s(u)\right]\)
Now, if we can show that s(u) goes to zero as u goes to zero, we will have
\(\displaystyle \underset{u\, \to\, 0}{lim}\, \frac{f(u)}{u^2}\, =\, \frac{1}{2}\)

Now,
s(u) = \(\displaystyle \sum_{n=3}^{\infty}\, \frac{u^{n-2}}{n!}\, =\, u\, \sum_{n=3}^{\infty}\, \frac{u^{n-3}}{n!}\, =\, u\, t(u)\)
Now, t(u) converges for all finite u (think ratio test, for example), and thus for u in some neighborhood of zero, say |u| < a, we have |t(u)| < A for some positive A (why?), and
0 \(\displaystyle \le\) |s(u)| \(\displaystyle \le\) A|u| for |u| < a.
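(One concrete answer to the "why?", as a sketch: for \(|u|\, \le\, 1\),

\(\displaystyle |t(u)|\, \le\, \sum_{n=3}^{\infty}\, \frac{1}{n!}\, =\, e\, -\, \frac{5}{2}\, <\, 1,\)

so a = 1 and A = 1 work. More generally, a convergent power series is continuous, hence bounded on any closed interval inside its interval of convergence.)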
So, by the 'squeeze theorem' that means that
\(\displaystyle \underset{u\, \to\, 0}{lim}\, s(u)\, =\, 0\)

You can do this sort of thing for every power series that converges to a function on a neighborhood of the point in question, although I'm not sure how far the argument extends for an arbitrary power series.
 