Trouble solving a question: lim[x->-infty]f(x) * lim[x->+infty]f(x) < 0; find max of g(x)

Eohyn

New member
Joined
Jan 6, 2018
Messages
12

Hello!
I'm new to the forum, so please excuse any mistakes I might be making.

I'm having trouble solving an exercise, I feel like there's something obvious I'm missing but can't see where...

There's a function f which is continuous everywhere and \[\lim_{x\to-\infty}f(x)\times\lim_{x\to+\infty}f(x)<0\] (both limits exist and are finite). I'm supposed to find the maximum of \[g(x)=\dfrac{1}{1+\left[f(x)\right]^2}\]
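(For example, I think a function like \(\displaystyle f(x)=\arctan x\) would satisfy these conditions, just to have something concrete in mind: \[\lim_{x\to-\infty}\arctan x=-\dfrac{\pi}{2},\qquad\lim_{x\to+\infty}\arctan x=\dfrac{\pi}{2},\qquad\left(-\dfrac{\pi}{2}\right)\times\dfrac{\pi}{2}=-\dfrac{\pi^2}{4}<0\] but of course the exercise doesn't say which function it is.)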

Since those limits exist, are finite, and have different signs, I believe f has a maximum and a minimum (given by those two horizontal asymptotes)...

I got to the derivative of g easily, but I'm not sure what it gets me. \[g'(x)=\dfrac{-2f(x)f'(x)}{\left(1+\left(f(x)\right)^2\right)}\] but since I don't know much about f, I can't say anything about the sign of g'... I've tried to find the second derivative, but in one attempt it was always positive, so it couldn't have a maximum...

Can anyone help me see what I'm missing? Thanks in advance!
 
You are on the wrong track altogether. You cannot use derivatives because you do not know that f(x) is everywhere differentiable. (Even assuming it is, you would still need to ensure that g(x) is everywhere differentiable and calculate the derivative correctly: writing g(x) as a power of 1 + [f(x)]^2 and applying the power and chain rules shows that the denominator of g'(x) should be squared.) Moreover, you cannot assume that f(x) ever actually attains the asymptote values as x increases or decreases without bound. Nor can you assume that f(x) does not have absolute values greater than the absolute values of the asymptotes at some finite values of x.
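For instance (a purely illustrative example, not anything given in the problem), a function such as \[f(x)=\arctan x+5e^{-x^2}\] has \(\displaystyle \lim_{x\to-\infty}f(x)=-\frac{\pi}{2}\) and \(\displaystyle \lim_{x\to+\infty}f(x)=\frac{\pi}{2}\), yet \(\displaystyle f(0)=5\), which is well above either limiting value.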

What you do know is that the asymptotes have opposite signs, so f(x) has at least one zero (Intermediate Value Theorem).

\(\displaystyle f(a) = 0 \implies \{f(a)\}^2 = 0 \implies g(a) = 1.\)

\(\displaystyle f(b) \ne 0 \implies \{f(b)\}^2 > 0 \implies g(b) < 1.\)

Now what do you think the maximum of g(x) is?

Differential calculus is a very powerful tool, but not all problems can take advantage of it.
 

First of all, thanks for the help!

You're right, I can't assume the function is differentiable or that the asymptotes are the maximum and minimum... Silly me!

If I'm understanding correctly, then g has a maximum at any x such that \(\displaystyle f(x)=0 \). I know such an x exists because of the Intermediate Value Theorem. Another question: can I guarantee there's only one such x?
 
You cannot guarantee that f(x) has a unique zero. But you do not need to do so. You are asked to find the maximum of g(x). Its value will be 1 at EVERY x such that f(x) = 0. The value of g(x) will be less than 1 but more than zero at every x such that f(x) \(\displaystyle \ne\) 0. Consequently the maximum value of g(x) is 1 no matter how many zeroes f(x) may have.
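For example (again purely an illustration; nothing in the problem forces this), a function like \[f(x)=\arctan\left(x^3-x\right)\] satisfies all the hypotheses, has zeroes at x = -1, x = 0, and x = 1, and g(x) equals 1 at each of those three points while being less than 1 everywhere else.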
 

Thank you, you've been a great help!
 
Hello,

Just one final question. Do you think it's necessary to show that I can find a closed and limited interval from a to b, and to formally prove that such an interval is bounded? Personally, I don't think that's needed, since the function is continuous everywhere...

Thanks in advance!
 

Any help, please?
 
I don't understand the question. The interval [a, b] is bounded; that is what the notation means.

That's what I thought, but the teacher is saying unless you prove the interval is closed and bounded, the answer is very incomplete...
 

It will help if you show us the exact answer you submitted, so we can see what might have been omitted. As it is, it is not clear what a and b you are referring to.
 

In my answer I said that, since the function is continuous everywhere, I can pick a closed interval from a to b (both real, with a < b) on which the function is continuous and, as such, it has at least one zero there, since the limits at infinity have opposing signs.

My exact answer is in another language, so I'm not sure it's helpful to post it here...

Thanks for the help :)
 

Actually, you never know; at least we might be able to get a sense of the flow of your argument after Google translates it for us (or someone might know the language).

Okay, the interval is by definition closed and bounded; but how do you know it exists? Perhaps the teacher is saying that you must prove that such a closed and bounded interval exists.

Did you? How did you justify being able to "pick" it?
 

Thank you for your quick answer!

I didn't justify that the interval is bounded, only that it is closed (our wording of the theorem only states it must be closed, not closed and limited). The teacher agrees that if the function is continuous everywhere, then there exists a closed interval from a to b which will be limited, but says I had to prove it. It's the only question where I didn't get full marks, and the teacher isn't really explaining why it was needed...

Is my line of thought wrong? If so, can you help clarify?
 

I'm finding it very hard to be sure what you are referring to. I can't tell what the teacher is complaining about without knowing all (more or less) of what you said.

I presume "the theorem" refers to the intermediate value theorem; but we need to see the details of how you applied it. If you can give us a reasonably complete translation of what you said, it will help a lot. In particular, I think what you are saying is that you claim that there is an interval [a,b] such that f(a) and f(b) have opposite signs; saying that really just means that there are points a and b, with a < b, such that, etc. Is that what you said?

I assume you are using "limited" to mean the same thing as "bounded".
 

My reasoning was as follows:

Since f is continuous everywhere, the fact that the limits at infinity have opposing signs indicates f has at least one zero in its domain (similar to the corollary of Bolzano's theorem). As such, choosing \[x=a\in D_f\] so that \[f(a)=0\], it follows that \[g(a)=\dfrac{1}{1+(f(a))^2}=\dfrac{1}{1+0}=1\]

If \[f(a)\neq 0\], then \[1+(f(a))^2>1\], since the square of a nonzero real number is always positive, and \[1+(f(a))^2>1\Rightarrow g(a)<1\]

It follows that the maximum of g is 1. Also note that g is continuous everywhere because f is and the denominator is never 0.

The teacher's comment was that I didn't prove f was continuous on a bounded set, only on a closed set, and that this was the intended part... I thought that any closed interval [a,b], with a < b both real numbers, is also bounded.

This is pretty much translating the whole thing. I hope it's clear now!
 

One thing is unclear: I see no mention of [a,b] in your work! So you haven't even mentioned that the function is continuous on some [a,b], which is necessary in order to apply Bolzano's theorem. (Maybe you need to tell us the statement of the corollary you are referring to, in case it's not what I think -- does it mention a closed and bounded set, or an interval?) When you apply a theorem, you have to show that it applies.

But then, I don't see that you've even mentioned a closed set, so I don't know what the teacher is referring to!
 
I am guessing. Perhaps what is meant is that you must show that there exists a real a such that f(a) is negative and a real b such that f(b) is positive, and that therefore there is some x in (a, b) with f(x) = 0.
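A sketch of how that could go, writing \(\displaystyle L_1 = \lim_{x\to-\infty}f(x)\) and \(\displaystyle L_2 = \lim_{x\to+\infty}f(x)\) and assuming for definiteness \(\displaystyle L_1 < 0 < L_2\) (the other case is symmetric): by the definition of the limit with \(\displaystyle \varepsilon = \frac{|L_1|}{2}\) there is a real a with \(\displaystyle f(a) < \frac{L_1}{2} < 0\), and with \(\displaystyle \varepsilon = \frac{L_2}{2}\) there is a real b > a with \(\displaystyle f(b) > \frac{L_2}{2} > 0\). Then f is continuous on the closed, bounded interval [a, b] and changes sign on it, so the Intermediate Value Theorem gives some \(\displaystyle x_0 \in (a, b)\) with \(\displaystyle f(x_0) = 0\).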
 

That's true, I didn't state in my answer that the function is continuous on a closed [a,b]. I can understand that being a mistake, but since I didn't use the theorem explicitly, only its idea, it seemed ok at the time... Still, the teacher's only comment is that I didn't prove the interval is bounded.

My version of the theorem only mentions the function is continuous in a closed [a,b] (doesn't mention bounded).

When I tried to understand why I couldn't argue that, since the function is continuous everywhere, I can choose a closed interval [a,b] on which the function is continuous, the teacher's reply was that obviously I could, and that it would be bounded, but that I still didn't prove the interval was bounded, so the answer was pretty much wrong... which is why this one is confusing me a lot!

Thanks for all the help!
 

Even so, the teacher barely gave any points if you didn't prove such an interval was bounded...
 

"since I didn't use the theorem explicitly, only its idea, it seemed ok at the time": If you were told to prove your result, then you have to give clear reasons (though the level of detail depends on the course). I imagine the real issue may be that you didn't use the theorem explicitly. If you were just told to solve the problem, on the other hand, then it wouldn't really be necessary.

"the teachers' reference is only that I didn't prove the interval is bounded": that still seems odd if you didn't even mention an interval, as you indicated. But there may be a language issue here.

"continuous in a closed [a,b] (doesn't mention bounded)": An interval [a,b] is by definition bounded; they don't have to say that separately. I'm almost sure what your teacher said was that you didn't prove that it is continuous on a closed, bounded interval, which amounts to saying "on some [a,b]".

But you'll have to ask your teacher for a fuller explanation, since we can't know just what was said, much less what was meant. Teachers want to be understood, so asking is always a good thing.

If you do find out what is going on, some of us would probably like to hear about it ...
 