The closest the book comes to defining growth rates is the following on page 70:
"The idea of these definitions is to establish a relative order among functions. Given two functions, there are usually points where one function is smaller than the other. So it does not make sense to claim, for instance, f(N) < g(N). Thus we compare their relative rates of growth. When we apply this to the analysis of algorithms, we shall see why this is the important measure."
a) How does N^2 + N^3 not have a higher rate of growth than N^3? For every value of N, N^2 + N^3 yields a higher value than N^3. Is it because N^2 becomes functionally irrelevant for very large values of N?
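For what it's worth, a quick numerical sketch (my own illustration, not from the book) shows why the N^2 term stops mattering: the ratio (N^2 + N^3) / N^3 approaches 1 as N grows, so both functions grow at the same rate even though N^2 + N^3 is always the larger value.

```python
# Sketch: the ratio (N^2 + N^3) / N^3 tends to 1 as N grows,
# which is why the lower-order N^2 term does not affect the growth rate.
def ratio(n: int) -> float:
    return (n**2 + n**3) / n**3

for n in (10, 1_000, 1_000_000):
    print(n, ratio(n))
# 10        -> 1.1
# 1000      -> 1.001
# 1000000   -> 1.000001
```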
b) I apologize; I kept forgetting it was little-o. But wouldn't my example still be correct? Wouldn't N^3 - N^2 have a slower growth rate than whatever f(N) is? Based on the original premises, we know that f(N) has to be at least equal to the larger of the two T(N) values. Because of that, T1(N) - T2(N) must be either a positive or a negative value that is less than f(N).
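One thing I tried while thinking about this (my own hypothetical choices, not from the book): take T1(N) = 2N and T2(N) = N, which are both O(N), so f(N) = N works. The difference is always less than f(N), yet the ratio (T1(N) - T2(N)) / f(N) stays at 1 instead of shrinking toward 0, which is what little-o would require. So "less than f(N)" and "slower growth rate than f(N)" seem to be different claims.

```python
# Hypothetical functions: T1(N) = 2N and T2(N) = N are both O(N),
# so we may take f(N) = N. If T1 - T2 were o(f), the ratio below
# would tend to 0 as N grows; instead it is 1 for every N.
def t1(n: int) -> int:
    return 2 * n

def t2(n: int) -> int:
    return n

def f(n: int) -> int:
    return n

for n in (10, 1_000, 1_000_000):
    print(n, (t1(n) - t2(n)) / f(n))  # always 1.0
```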