@Dr.Peterson and JeffM- I apologize but I still don't understand. N^{2} + N^{3} for any input over 1 will always be higher than N^{3}. How does N^{2} + N^{3} have the same growth rate as N^{3}? I apologize if I am not understanding what you are saying. I have tried to read over both of the links but they are honestly over my head.

In the web.mit.edu link the first page says this:

"f(x) <= C*g(x) for all x>N. Intuitively, this means that f does not grow faster than g."

Maybe I am misunderstanding this, but to me that would indicate that the value on the left side does not grow faster than the value on the right. But wouldn't N^{2} + N^{3} grow faster than N^{3} ?

Yes, it does say that. I strongly reiterate my suggestion that you dismiss, as confusing, attempts to explain big O notation in "intuitive" or "simple" terms. Look at what you just quoted. It quite explicitly says, formally,

\(\displaystyle f(x) \le C * |g(x)|\)

and then IMMEDIATELY says that, intuitively, this means

\(\displaystyle f'(x) \not > g'(x)\),

which is clearly wrong, as I shall now prove using your example.

\(\displaystyle f(x) = x^2 + x^3 \implies f'(x) = 2x + 3x^2.\)

\(\displaystyle g(x) = x^3 \implies g'(x) = 3x^2. \)

\(\displaystyle \therefore f(x) - g(x) = x^2 \text { and } f'(x) - g'(x) = 2x.\)

\(\displaystyle \therefore f(x) - g(x) > 0 \text { and } f'(x) - g'(x) > 0 \text { if } x > 0.\)

\(\displaystyle \therefore f(x) > g(x) \text { and } f'(x) > g'(x) \text { if } x > 0.\)
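The inequalities above are easy to confirm numerically. Here is a quick sanity check in Python (the helper names are my own, not from the thread):

```python
# f(x) = x^2 + x^3 and g(x) = x^3 from the example above,
# together with their derivatives.
def f(x):
    return x**2 + x**3

def g(x):
    return x**3

def f_prime(x):
    return 2*x + 3*x**2

def g_prime(x):
    return 3*x**2

# Both f and f' strictly exceed g and g' at every sampled x > 0.
all_hold = all(f(x) > g(x) and f_prime(x) > g_prime(x)
               for x in range(1, 1001))
print(all_hold)  # True
```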

YOU ARE RIGHT.

Now let's talk seriously about Big O notation.

\(\displaystyle 0 < \epsilon < 1.\)

We are thinking about epsilon as a very tiny positive number.

\(\displaystyle \text {Let } n = \dfrac{1}{\epsilon},\ c = (1 + \epsilon), \text { and } h(x) = c * g(x).\)

\(\displaystyle x = p > n \implies f(p) = p^2 + p^3, h(p) = cp^3 = p^3 + p^3 \epsilon,\)

\(\displaystyle f'(p) = 3p^2 + 2p, \text { and } h'(p) = 3cp^2 = 3p^2 + 3p^2 \epsilon. \)

\(\displaystyle f(p) - h(p) = p^2 - p^3 \epsilon = p^2(1 - p \epsilon) = p^2 * \left ( \dfrac{n}{n} - p * \dfrac{1}{n} \right ) = \dfrac{p^2}{n} * (n - p).\)

\(\displaystyle \therefore p > n \implies f(p) - h(p) < 0 \implies f(p) < h(p) \implies f(p) \le h(p).\)
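To make this concrete, we can pick a specific epsilon and verify the bound numerically. A small Python check (my own choice of epsilon, not from the thread):

```python
# With eps = 0.01 we get n = 1/eps = 100 and c = 1 + eps.
eps = 0.01
n = 1 / eps
c = 1 + eps

def f(p):
    return p**2 + p**3

def h(p):
    return c * p**3   # h(p) = c * g(p)

# For every integer p strictly above n, f(p) <= h(p) as derived above.
bound_holds = all(f(p) <= h(p) for p in range(int(n) + 1, int(n) + 1001))
print(bound_holds)  # True
```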

For large enough x, f(x) never exceeds a **multiple** of g(x).

\(\displaystyle f'(p)= 3p^2 + 2p \text { and } h'(p) = 3p^2(1 + \epsilon ) \implies\)

\(\displaystyle \dfrac{f'(p)}{h'(p)} = \dfrac{3p^2 + 2p}{3p^2(1 + \epsilon)} < \dfrac{3p^2 + 3p^2 \epsilon}{3p^2(1 + \epsilon)} = 1 \text { because } 2p < 3p^2 \epsilon \text { when } p > n.\)

For large enough x, f'(x) never exceeds a **multiple** of g'(x).

What is even more important, however, is that

\(\displaystyle \lim_{x \rightarrow \infty} \dfrac{f'(x)}{g'(x)} = 1.\)

For large enough x, the ratio of f'(x) to g'(x) approaches or equals a **positive constant**.
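You can watch that ratio settle toward 1. A short Python illustration (my own code, not from the thread):

```python
# f'(x)/g'(x) = (3x^2 + 2x) / (3x^2) = 1 + 2/(3x), which shrinks toward 1.
def ratio(x):
    return (3*x**2 + 2*x) / (3*x**2)

for x in [10, 100, 1000, 10_000]:
    print(x, ratio(x))
# The printed ratios decrease steadily toward 1 as x grows.
```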

Now in your example the multiple and constant happen to be 1. This will not normally be the case.

I am sure there is a good reason (but I do not know it) why the multiple is dropped in informal discussion. But the multiple is key as reference to the formal definition makes obvious.

REVISED EDIT: Dr. Peterson and I have talked. He has suggested that my use of derivatives and \(\displaystyle \delta\)-\(\displaystyle \epsilon\) logic may be introducing concepts that the OP is not familiar with. There is no disagreement between our posts, but they are giving very different perspectives.