Thank you for the reply.
I don't fully understand the whole concept of maximum likelihood estimation, but I do know that the probability function is needed to achieve it. What I don't understand is how to get the probability function for this data set...
I found a lot of work online about deriving the maximum likelihood estimator of a Bernoulli distribution, and I thought I could use that by recoding the dice roll as a dummy variable [MATH]Y[/MATH] that equals 1 when [MATH]X[/MATH] is an odd number and 0 when [MATH]X[/MATH] is an even number.
So I thought that the probability function would be
[MATH] Pr(Y_i) = (3p_o)^{Y_i}(1-3p_o)^{1-Y_i} [/MATH]
and the likelihood function is
[MATH] L_n(p_o) = \prod_{i=1}^n(3p_o)^{Y_i}(1-3p_o)^{1-Y_i}[/MATH]
and the log-likelihood function is
[MATH] \log{L_n(p_o) }=\log{ \prod_{i=1}^n(3p_o)^{Y_i}(1-3p_o)^{1-Y_i}}[/MATH][MATH] = \sum_{i=1}^n\log{[(3p_o)^{Y_i}(1-3p_o)^{1-Y_i}]}[/MATH][MATH]= \log{(3p_o)}\cdot\sum_{i=1}^nY_i + \log{(1-3p_o)}\cdot\sum_{i=1}^n(1-Y_i)[/MATH]
and the maximum likelihood estimator is
[MATH] \frac{\partial \log{L_n(p_o) }}{\partial p_o}= \frac{n\mu_y}{p_o} - \frac{3n(1-\mu_y)}{1-3p_o}=0[/MATH][MATH]\Rightarrow p_o=\frac{1}{3}\mu_y = \frac{1}{3n}\sum_{i=1}^nY_i[/MATH]
where [MATH]\mu_y = \frac{1}{n}\sum_{i=1}^nY_i[/MATH] is the sample mean of the [MATH]Y_i[/MATH].
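Just to sanity-check the estimator, here is a quick simulation sketch in Python. The setup is my own assumption (each odd face has probability [MATH]p_o[/MATH] and the even faces split the remaining probability equally), and the function names are made up for illustration:

```python
import random

# Assumed setup (mine, not from the original problem): each odd face
# 1, 3, 5 has probability p_o, each even face 2, 4, 6 has probability
# (1 - 3*p_o) / 3, so the probability of an odd roll is 3*p_o.

def roll_biased_die(p_o, n, rng):
    """Draw n rolls from the assumed biased die."""
    faces = [1, 2, 3, 4, 5, 6]
    weights = [p_o, (1 - 3 * p_o) / 3] * 3  # odd, even, odd, even, ...
    return rng.choices(faces, weights=weights, k=n)

def mle_p_odd(rolls):
    """MLE p_o_hat = (1/(3n)) * sum(Y_i), with Y_i = 1 for an odd roll."""
    y = [x % 2 for x in rolls]  # dummy Y: 1 for odd faces, 0 for even
    return sum(y) / (3 * len(y))

rng = random.Random(0)
rolls = roll_biased_die(0.25, 100_000, rng)
print(mle_p_odd(rolls))  # should land close to the true p_o = 0.25
```

With 100,000 rolls the estimate comes out very close to the true [MATH]p_o[/MATH], which at least suggests the algebra above is on the right track.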
But I was wondering if there is a different way, possibly without recoding [MATH]X[/MATH] as [MATH]Y[/MATH]. If I could show that the biased die roll follows a normal distribution, I could use the normal density as the probability function, but I am not sure whether the biased die roll actually follows a normal distribution...
Anyways, here is my work. I am not sure if it makes sense or not so please give me your feedback. Thank you.