Proving an equation

miller72

Hi everyone!

One of my friends asked me to prove this equation, but sadly it seems I'm not able to do it.
Please help me with this problem. I need an explanation of how to solve it.


Here is what needs to be proven: log(n!) = θ(n log(n))


I'll be waiting for your replies ...

Thank you all in advance.
 
One of my friends asked me to prove this equation, but sadly it seems I'm not able to do it.
Please help me with this problem. I need an explanation of how to solve it.
Here is what needs to be proven: log(n!) = θ(n log(n))
How does your friend define \(\displaystyle \Theta(n\cdot\log(n))~?\)
 
First, I'm assuming that \(\displaystyle \theta(n \cdot \log(n))\) is meant to refer to Big O notation, such that the task at hand is really to prove that \(\displaystyle \log(n!)\) "grows like" \(\displaystyle n \cdot \log(n)\) as n grows without bound. If that's not correct, you'll need to reply and tell us what the notation you've used means.

Begin by noting the definition of \(\displaystyle n! = n \times (n-1) \times (n-2) \times \dots \times 1\). Thus, we have: \(\displaystyle \log(n!) = \log(n \times (n-1) \times (n-2) \times \dots \times 1)\). Do you know any logarithm rules that would help you proceed from here? More specifically, do you know any logarithm rules that apply when you're taking the log of a product of two (or more) terms? You can find a refresher on some common logarithm rules online if you need it.
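As a quick numerical illustration of that product rule (an addition to this write-up, not part of the original hint; the value of n is arbitrary), here is a small Python sketch:

Code:
import math

n = 20
direct = math.log(math.factorial(n))                # log(n!) computed directly
as_sum = sum(math.log(k) for k in range(1, n + 1))  # log(n) + log(n-1) + ... + log(1)
print(direct, as_sum)                               # the two agree up to floating-point error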

Edit: Oh boy, did I make a doozy of a mistake! Sorry about that. Hopefully everyone else, including the OP, caught my gaffe like JeffM did. I've now corrected the plus signs to be minus signs.
 
First, I'm assuming that \(\displaystyle \theta(n \cdot \log(n))\) is meant to refer to Big O notation, such that the task at hand is really to prove that \(\displaystyle \log(n!)\) "grows like" \(\displaystyle n \cdot \log(n)\) as n grows without bound. If that's not correct, you'll need to reply and tell us what the notation you've used means.
Begin by noting the definition of \(\displaystyle n! = n \times (n+1) \times (n+2) \times \dots \times 1\). Thus, we have: \(\displaystyle \log(n!) = \log(n \times (n+1) \times (n+2) \times \dots \times 1)\). Do you know any logarithm rules that would help you proceed from here? More specifically, do you know any logarithm rules that apply when you're taking the log of a product of two (or more) terms? You can find a refresher of some common logarithm rules here if you need it.
I hope that guess is correct. I hate to guess and be wrong; it is a waste of time. I have not seen \(\displaystyle \Theta\) used for big-O.
That said, because \(\displaystyle \log(AB)=\log(A)+\log(B)\) where both \(\displaystyle A>0~\&~B>0\), we know that
\(\displaystyle \log (n!) = \sum\limits_{k = 1}^n {\log (k)} \). Also, the function \(\displaystyle \log(x)\) is increasing, so \(\displaystyle \log (n!) = \sum\limits_{k = 1}^n {\log (k)} \le n\log (n)\).
Can you finish?
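For readers who want to see that upper bound numerically (an added illustration, not part of the original reply; the sample values of n are arbitrary), here is a short Python check:

Code:
import math

for n in (10, 100, 1000):
    log_fact = sum(math.log(k) for k in range(1, n + 1))  # log(n!)
    bound = n * math.log(n)                               # n * log(n)
    print(n, log_fact, bound, log_fact / bound)           # ratio creeps toward 1 as n grows

The ratio tending toward 1 also hints at why a matching lower bound, and hence the big Theta statement, should hold.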
 
One of my friends asked me to prove this equation, but sadly it seems I'm not able to do it.
Please help me with this problem. I need an explanation of how to solve it.

Here is what needs to be proven: log(n!) = θ(n log(n))

You are presumably referring to Big-Theta notation as shown here: https://en.wikipedia.org/wiki/Big_O_notation#Family_of_Bachmann–Landau_notations . I am not aware of a "little-theta", which is what you wrote.

What have you learned about proving it? Have you tried using the definition? Or do you know any properties you could use?
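For reference, paraphrasing the standard definition from the linked article: \(\displaystyle f(n)=\Theta(g(n))\) means there exist constants \(\displaystyle c_1, c_2 > 0\) and an \(\displaystyle n_0\) such that \(\displaystyle c_1\, g(n) \le f(n) \le c_2\, g(n)\) for all \(\displaystyle n \ge n_0\). So here the task is to find such constants with \(\displaystyle f(n)=\log(n!)\) and \(\displaystyle g(n)=n\log(n)\).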
 
First, I'm assuming that \(\displaystyle \theta(n \cdot \log(n))\) is meant to refer to Big O notation, such that the task at hand is really to prove that \(\displaystyle \log(n!)\) "grows like" \(\displaystyle n \cdot \log(n)\) as n grows without bound. If that's not correct, you'll need to reply and tell us what the notation you've used means.

Begin by noting the definition of \(\displaystyle n! = n \times (n+1) \times (n+2) \times \dots \times 1\). Thus, we have: \(\displaystyle \log(n!) = \log(n \times (n+1) \times (n+2) \times \dots \times 1)\). Do you know any logarithm rules that would help you proceed from here? More specifically, do you know any logarithm rules that apply when you're taking the log of a product of two (or more) terms? You can find a refresher of some common logarithm rules here if you need it.
n < (n+1) < (n+2) < ..., which is NOT approaching 1, so that product can never terminate at 1; the factors should be n, (n-1), (n-2), ..., 1.
 
If we assume that you mean big Theta, you need to prove the statement for both big O and big Omega.

Assuming log is to a base greater than 1 and n is a positive integer

\(\displaystyle 0 \le x = \log(n!) = \displaystyle \log \left ( \prod_{j=1}^n j \right) = \sum_{j=2}^n \log(j) \implies\)

\(\displaystyle 0 \le x \le (n - 1) \cdot \log(n) < n\log(n) \text { if } n > 1 \implies\)

\(\displaystyle n > 1 \text { and } c \ge 1 \implies |\log(n!)| \le c \cdot n\log(n) \implies\)

\(\displaystyle \log(n!) = O(n\log(n)).\)

Your equation will be true for big Theta only if it is also true for big Omega. Big Omega looks like a steeper hill to climb; in fact, I myself do not see how to climb it right now.
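One standard way to climb that hill (added here as a sketch, not part of the original post) is to keep only the larger half of the factors of \(\displaystyle n!\):

\(\displaystyle \log(n!) = \sum_{j=1}^n \log(j) \ge \sum_{j=\lceil n/2 \rceil}^{n} \log(j) \ge \frac{n}{2}\,\log\!\left(\frac{n}{2}\right) \ge \frac{1}{4}\, n\log(n) \text{ for } n \ge 4,\)

since \(\displaystyle \frac{1}{2}\log\!\left(\frac{n}{2}\right) \ge \frac{1}{4}\log(n)\) is equivalent to \(\displaystyle \frac{n^2}{4} \ge n\). That gives \(\displaystyle \log(n!) = \Omega(n\log(n))\), which together with the big O bound above yields \(\displaystyle \log(n!) = \Theta(n\log(n))\).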

For another site for information on Big O, Big Theta, and Big Omega, see

https://en.wikipedia.org/wiki/Big_O_notation
 