Oblique Asymptote of f(x) = x / (1 + e^(-x))

?math?

So to find a function's oblique asymptote, you do the long division and then take the limit to infinity of the function minus what you think is the asymptote, right? The limit should equal 0. Is this correct?

I have f(x) = x / (1 + e^(-x))

1) I don't see how this can be divided any further.
2) I'm supposed to prove that the asymptote is y=x, but when I took the limit of the function - x as it went to infinity, I did not get 0.

Could you tell me where I'm going wrong and how I can solve this? Thanks!
 
It appears that you have memorized a technique and failed to understand the idea. Long division is a trick that works for polynomials; it is not guaranteed to help with anything else.

Generally, you should think about the limit and what it means.

\(\displaystyle e^{-x}\) tends to zero as x increases without bound in the positive direction.

You tell me what it does in the negative direction.
 
e^{-x} tends to zero as x increases without bound in the positive direction.

You tell me what it does in the negative direction.

I know that limit is 0, but what I need is for the limit of f(x) - x to equal zero, not the limit of e^{-x}.

Or am I misunderstanding what you are saying?
 
We're working with \(\displaystyle \frac{x}{1+e^{-x}}\)

Let x go unbounded in the positive direction.

\(\displaystyle e^{-x}\) tends to zero

\(\displaystyle 1+e^{-x}\) tends to one

\(\displaystyle \frac{x}{1+e^{-x}}\) therefore behaves like x/1 = x for large x
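To make that last step precise, here is the subtraction the original poster attempted, written out in full:

\(\displaystyle f(x) - x \;=\; \frac{x}{1+e^{-x}} - x \;=\; \frac{x - x\left(1+e^{-x}\right)}{1+e^{-x}} \;=\; \frac{-x\,e^{-x}}{1+e^{-x}}\)

Since \(\displaystyle x\,e^{-x} \to 0\) as \(x \to \infty\) and the denominator tends to 1, the difference f(x) - x tends to 0, which is exactly the condition for y = x to be the oblique asymptote on the right.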

You do the negative direction.
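A quick numerical sanity check can also build confidence here. The sketch below (the name `f` is just for illustration, not from the thread) evaluates f(x) - x at a few increasingly large x and shows the difference shrinking toward zero:

```python
import math

def f(x):
    """The function from the thread: x / (1 + e^(-x))."""
    return x / (1 + math.exp(-x))

# If y = x really is the oblique asymptote as x -> +infinity,
# then f(x) - x should approach 0 as x grows.
for x in [5, 10, 20, 50]:
    print(x, f(x) - x)
```

Each successive difference is dramatically smaller than the last, consistent with f(x) - x tending to 0.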
 