Another algorithm, is this answer correct?

mathguy78

Hello all,

I was working on this problem and wanted to make sure it was correct:

Find functions f and g satisfying

f(n) ≠ O(g(n))
g(n) ≠ O(f(n))





f(n) = {1 if n is even, 0 if n is odd}
g(n) = {1 if n is even, 0 if n is odd}


If n is odd it will move toward 0; if n is even it will move toward infinity.


If that is right, could there be any other way to do that problem?


thanks for the help.
 
[I assume you didn't actually mean that f(n) = g(n), and that you meant to switch the odd/even cases for g(n). Obviously, two equal functions are each O() of each other.]

I believe both of those functions are O(1) (more precisely, O(c) for a constant c). They can always be evaluated in a constant number of steps. In other words, the function's "run-time" (or number of 'steps') does not depend on the input.

f(n) is equivalent to:
int f(int n){ int x = n / 2; if ((n - x * 2) == 0) return 1; else return 0; }

No matter what the input n is, f(n) will always terminate in a constant number of steps. Similarly for g(n), as sketched below.
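Here's a minimal sketch of the flipped version (assuming, per the bracketed note above, that g(n) is meant to return 1 for odd n and 0 for even n):

int g(int n){ if (n % 2 != 0) return 1; else return 0; } /* one modulo and one comparison: constant work regardless of n */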
 
Here's what I hope to be a push in the right direction:

(assuming f,g are always positive functions)
\(\displaystyle f(n) \in O(g(n)) \,\, \Rightarrow \,\, \exists \, c > 0 \,\, \exists \, k \in \mathbb{N} \,\, s.t. \,\, \forall \, n > k, \,\, f(n) \le c\,g(n)\)

Simply put, some constant multiple of g(n) is at least as big as f(n) for all sufficiently large inputs n.

\(\displaystyle f(n) \notin O(g(n))\) means that no constant multiple of g(n) eventually bounds f(n) from above: for every constant c > 0, f(n) exceeds c·g(n) for arbitrarily large n. For example, n<sup>3</sup> is not O(4n<sup>2</sup>), because no matter how large a constant c you pick, n<sup>3</sup> eventually exceeds c · 4n<sup>2</sup>.
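Spelling that example out as a quick check against the definition above:

\(\displaystyle \frac{n^3}{4n^2} = \frac{n}{4} > c \quad \text{whenever} \quad n > 4c\)

so f(n) ≤ c·g(n) fails for arbitrarily large n, no matter which constant c is chosen.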

It is very easy to prove that if \(\displaystyle f(n) \notin O(g(n))\), then \(\displaystyle f(n) \in \omega(g(n))\) (assuming the limit of f(n)/g(n) exists, as below).

So, your question is equivalent to finding functions f, g such that:
\(\displaystyle f(n) \in \omega(g(n)) \,\, \text{and} \,\, g(n) \in \omega(f(n))\).

Hope that helps,
-daon
 
Thanks for the help on this,

I have a question though,

in the final answer you have, what does the "w" stand for?

I'm new to algorithms and it's all just so confusing.

Thanks
 
Okay, my apologies.

\(\displaystyle \omega\) or "little-omega" works like this:

\(\displaystyle f(n) = \omega(g(n)) \,\, \Leftrightarrow \,\, \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty\). Basically, f(n) eventually dominates any constant multiple of g(n). For example, x<sup>15</sup> = \(\displaystyle \omega\)(x<sup>12</sup>).
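To check that example against the limit definition:

\(\displaystyle \lim_{x \to \infty} \frac{x^{15}}{x^{12}} = \lim_{x \to \infty} x^{3} = \infty\)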

I am *pretty sure* the answer to your question should be "no such functions exist." If each were little-omega of the other, that would imply that f(n)/g(n) approaches infinity and that g(n)/f(n) also approaches infinity. To my knowledge, that is not possible.
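To see why (assuming the limits exist, as in the definition above): if f(n)/g(n) grows without bound, its reciprocal must shrink to zero:

\(\displaystyle \lim \frac{f(n)}{g(n)} = \infty \,\, \Rightarrow \,\, \lim \frac{g(n)}{f(n)} = 0 \ne \infty\)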

------
If you've been introduced to only "Big-O", then here's another way of looking at it (again, assuming f, g are always positive; otherwise use absolute values in the limits):

Well,

\(\displaystyle f \notin O(g) \,\, \Leftrightarrow \,\, \lim \frac{f}{g} \not< \infty \,\, \Leftrightarrow \,\, \lim \frac{f}{g} = \infty\) (assuming the limit exists)

So, we would need functions f and g such that lim f/g = infinity and also lim g/f = infinity.
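For instance, just to show the limit test in use with a hypothetical pair: with f(n) = 2<sup>n</sup> and g(n) = n<sup>2</sup>,

\(\displaystyle \lim \frac{2^n}{n^2} = \infty \quad \text{but} \quad \lim \frac{n^2}{2^n} = 0\)

so 2<sup>n</sup> ∉ O(n<sup>2</sup>) while n<sup>2</sup> ∈ O(2<sup>n</sup>); the two ratios cannot both blow up.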

Hope that makes it clearer.
-Daon
 
Yeah,

So far we have only covered Big-O, essentially; I guess that is why I was getting a little confused on that.



It's still a little confusing, but I'll try to work with what you have posted.
 