Deriving log-likelihood functions

Vernon

Not sure what I'm doing is correct but my question goes as follows:

Suppose that the data \(\displaystyle (y_{1}, \dots, y_{n})\) are assumed to be a random sample of observations from the model \(\displaystyle y_{i} \sim p(y; \theta) = \theta(1 - y)^{\theta - 1}\) for \(\displaystyle 0 \le y \le 1\) and \(\displaystyle \theta > 0\).

Derive the log-likelihood function for \(\displaystyle \theta\), and obtain the maximum likelihood estimate \(\displaystyle \hat{\theta}\).

Well, my working so far is (L = likelihood, l = log-likelihood):

L(\(\displaystyle \theta\)) = \(\displaystyle \prod_{i=1}^n\) \(\displaystyle \theta (1 - y_{i})^{\theta -1}\)

= \(\displaystyle \theta^{n}\) \(\displaystyle \prod_{i=1}^n (1 - y_{i})^{\theta -1}\)

so the log-likelihood is l(\(\displaystyle \theta\)) = log(L(\(\displaystyle \theta\)))


= \(\displaystyle n \log \theta + (\theta - 1) \log \prod_{i=1}^n (1 - y_{i})\)

The maximum likelihood estimate will be where \(\displaystyle \frac{dl(\theta)}{d\theta} = 0\):

\(\displaystyle \frac{n}{\theta} + \frac{1-\theta}{\prod_{i=1}^n (1 - y_{i})} = 0\)


Is it correct up to this point? Is there a way of simplifying the product term, and how do I rearrange that equation to solve for \(\displaystyle \theta\)?
 
Use the fact that the log of a product is the sum of the logs of the individual factors: \(\displaystyle \log \prod_{i=1}^n (1 - y_{i}) = \sum_{i=1}^n \log(1 - y_{i})\). If you write the log-likelihood with the sum before differentiating, the derivative becomes \(\displaystyle \frac{n}{\theta} + \sum_{i=1}^n \log(1 - y_{i})\), which you can set to zero and solve directly for \(\displaystyle \hat{\theta}\).
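To sanity-check the algebra numerically, here is a minimal Python sketch (the names `sample` and `theta_mle` are mine, not from the thread). It draws from the density by inverse-CDF sampling, using \(\displaystyle F(y) = 1 - (1 - y)^{\theta}\), and checks that the closed-form estimate \(\displaystyle \hat{\theta} = -n / \sum_{i=1}^n \log(1 - y_{i})\) recovers the true \(\displaystyle \theta\):

```python
import math
import random

def sample(theta, n, rng):
    # Inverse-CDF sampling: F(y) = 1 - (1 - y)**theta on [0, 1],
    # so y = 1 - u**(1/theta) with u uniform on (0, 1].
    return [1 - (1.0 - rng.random()) ** (1 / theta) for _ in range(n)]

def theta_mle(ys):
    # Closed-form MLE: theta_hat = -n / sum(log(1 - y_i)).
    return -len(ys) / sum(math.log(1 - y) for y in ys)

rng = random.Random(42)
ys = sample(theta=3.0, n=100_000, rng=rng)
print(theta_mle(ys))  # close to the true theta = 3.0
```

With n = 100,000 draws, the estimate lands within a few hundredths of the true value, since \(\displaystyle -\log(1 - Y)\) is exponentially distributed with mean \(\displaystyle 1/\theta\).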
 