Multivariate normal distribution and marginal distribution

nakys

Hi everyone,
I have the following exercise:
Given \(\displaystyle Y \sim \mathcal{N}_p(\mu,\Omega ) \),


a) Consider the following decomposition \(\displaystyle Y=(Y_1,Y_2)^T\), \(\displaystyle \mu=(\mu_1, \mu_2)^T\), \(\displaystyle \Omega=\begin{pmatrix} \Omega_{11} & \Omega_{12} \\ \Omega_{21} & \Omega_{22} \end{pmatrix}\), where \(\displaystyle \Omega\) is partitioned as a block matrix.
Show that the conditional distribution \(\displaystyle Y_1 \mid (Y_2=y_2) \) is \(\displaystyle \mathcal{N}_p\big( \mu_1+\Omega_{12}\Omega_{22}^{-1}(y_2-\mu_2),\ \Omega_{11}-\Omega_{12}\Omega_{22}^{-1}\Omega_{21}\big)\), where \(\displaystyle p\) is the dimension of \(\displaystyle Y_1\).


This one, I have shown.
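For completeness, one standard route to a), sketched here only because the same decorrelation trick keeps b) light: set \(\displaystyle Z = Y_1 - \Omega_{12}\Omega_{22}^{-1}Y_2\). Then
\(\displaystyle \operatorname{Cov}(Z, Y_2) = \Omega_{12} - \Omega_{12}\Omega_{22}^{-1}\Omega_{22} = 0,\)
so \(\displaystyle Z\) and \(\displaystyle Y_2\) are independent (they are jointly normal), and since \(\displaystyle Y_1 = Z + \Omega_{12}\Omega_{22}^{-1}Y_2\),
\(\displaystyle Y_1 \mid (Y_2 = y_2) \;\sim\; \mathcal{N}\big(\mu_1 + \Omega_{12}\Omega_{22}^{-1}(y_2 - \mu_2),\ \Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}\big).\)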


b) Let \(\displaystyle a,b \in \mathbb{R}^n\). Find the conditional distribution of \(\displaystyle X_1 \mid X_2=x_2\), where \(\displaystyle X_1=a^TY\) and \(\displaystyle X_2=b^TY\). In which case does this distribution not depend on \(\displaystyle x_2\)?

This one is causing me trouble. Using the linear transformation \(\displaystyle \begin{pmatrix} a^T \\ b^T \end{pmatrix} Y = \begin{pmatrix} X_1 \\ X_2 \end{pmatrix}\) together with question a), I found the conditional distribution for b), but I am left with some atrocious matrix multiplication to get the exact form of the new matrix \(\displaystyle \Omega\) in terms of \(\displaystyle a\), \(\displaystyle b\) and the old \(\displaystyle \Omega\). I'm really wondering if there isn't another way. Also, my answer for the last part is the case \(\displaystyle \Sigma_{12}\Sigma_{22}^{-1} = 0\), where \(\displaystyle \Sigma\) denotes the new covariance matrix, but this seems to imply a lot of ugly sub-cases. What am I missing? I don't think it should be as messy as what I've found.
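To spell that transformation out: writing \(\displaystyle M = \begin{pmatrix} a^T \\ b^T \end{pmatrix}\) (here \(\displaystyle M\) is just a name for this \(\displaystyle 2 \times n\) matrix), so that \(\displaystyle \begin{pmatrix} X_1 \\ X_2 \end{pmatrix} = MY\), the usual rule for linear transformations of Gaussians gives
\(\displaystyle \begin{pmatrix} X_1 \\ X_2 \end{pmatrix} \sim \mathcal{N}_2\!\left( \begin{pmatrix} a^T\mu \\ b^T\mu \end{pmatrix},\ M\Omega M^T \right), \qquad M\Omega M^T = \begin{pmatrix} a^T\Omega a & a^T\Omega b \\ b^T\Omega a & b^T\Omega b \end{pmatrix},\)
so the blocks of the new covariance matrix are just the scalars \(\displaystyle a^T\Omega a\), \(\displaystyle a^T\Omega b = b^T\Omega a\) and \(\displaystyle b^T\Omega b\).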


Thank you in advance for taking time to answer my question.
 
I believe the covariance matrix \(\displaystyle \Omega\) has to be symmetric. In the \(\displaystyle 2 \times 2\) case that means
\(\displaystyle \Omega_{11} = V_1 = \sigma_1^2\)
\(\displaystyle \Omega_{22} = V_2 = \sigma_2^2\)
\(\displaystyle \Omega_{12} = \Omega_{21} = \rho\ \sigma_1\ \sigma_2\)
where \(\displaystyle \rho\) is the correlation coefficient.

Would that help? Would the new \(\displaystyle \Omega\) be a symmetry transform of the old?
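A sketch of how that parametrization plugs into part a), assuming \(\displaystyle b^T\Omega b > 0\): with \(\displaystyle \sigma_1^2 = a^T\Omega a\), \(\displaystyle \sigma_2^2 = b^T\Omega b\) and \(\displaystyle \rho\,\sigma_1\sigma_2 = a^T\Omega b\), the scalar version of the part a) formula becomes
\(\displaystyle X_1 \mid (X_2 = x_2) \;\sim\; \mathcal{N}\!\left( a^T\mu + \frac{a^T\Omega b}{b^T\Omega b}\,(x_2 - b^T\mu),\ \ a^T\Omega a - \frac{(a^T\Omega b)^2}{b^T\Omega b} \right) = \mathcal{N}\!\left( a^T\mu + \rho\,\frac{\sigma_1}{\sigma_2}(x_2 - b^T\mu),\ \ \sigma_1^2(1-\rho^2) \right),\)
which does not depend on \(\displaystyle x_2\) exactly when \(\displaystyle a^T\Omega b = 0\), i.e. when \(\displaystyle \operatorname{Cov}(X_1, X_2) = 0\) (in the notation above, \(\displaystyle \rho = 0\)).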
 