Not sure I understand all the closed vs. bounded business, but if you need to prove that points a and b exist such that f(a) > 0 and f(b) < 0, I would try something along these lines:

Assume f(x) -> A as x -> +inf, with A > 0. Suppose, for contradiction, that there is NO a with f(a) > 0. By the definition of lim f(x), for any epsilon > 0 we can make f(x) closer to A than epsilon for all large enough x: |f(x) - A| < eps.

Now take A itself as the epsilon (legal, since A > 0). For all large enough x:

|f(x) - A| < A

Two cases for such an x:

If f(x) >= A, then f(x) >= A > 0 - contradicts our assumption that f(a) > 0 at no a.

If f(x) < A, then |f(x) - A| = A - f(x) < A =>

=> -f(x) < 0 => f(x) > 0 - again contradicts our assumption that f(a) > 0 at no a.

Therefore, there must be an a with f(a) > 0.
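The whole argument compresses into one line (a sketch in my own notation; M is just the threshold from the definition of the limit at +inf):

```latex
\lim_{x\to+\infty} f(x) = A > 0
\;\Longrightarrow\;
\exists M \ \forall x > M:\ |f(x) - A| < A
\;\Longrightarrow\;
0 < f(x) < 2A \quad \text{for all } x > M.
```

So in fact every sufficiently large x works as the point a, not just one.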

The same argument, applied to the limit at -inf (where the limit is negative), produces a b with f(b) < 0; and likewise in the case where the signs of the two limits are reversed.
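A quick numeric illustration of the epsilon argument, using an example function of my own choosing (not from the question): f(x) = x / (1 + |x|), which tends to 1 at +inf and -1 at -inf.

```python
def f(x):
    # Example function: f(x) -> 1 as x -> +inf, f(x) -> -1 as x -> -inf.
    return x / (1 + abs(x))

A = 1.0   # limit at +inf
B = -1.0  # limit at -inf

# Walk x outward until |f(x) - A| < A; the argument above then
# guarantees f(x) > 0 at that point, so it serves as our a.
x = 1.0
while not abs(f(x) - A) < A:
    x *= 2
a = x
print(a, f(a))  # f(a) > 0

# Symmetric search toward -inf: |f(x) - B| < |B| forces f(x) < 0.
x = -1.0
while not abs(f(x) - B) < abs(B):
    x *= 2
b = x
print(b, f(b))  # f(b) < 0
```

Both loops terminate precisely because of the limit definition: past some threshold M, every x satisfies the epsilon inequality.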

Sorry if this is sloppy, took calculus 20+ years ago
