I can't understand an equation in LDA. The context is:
Maybe the lecturer is trying to create a proof of the equation given below.
I know that LDA projects the points onto an axis so that we get maximum separation between the two classes, in addition to reducing dimensionality.
Assume we have a set of D-dimensional samples ##\{x_1, x_2, \dots, x_N\}##, ##N_1## of which belong to class
##\Omega_1## and ##N_2## to class ##\Omega_2##. We seek to obtain a scalar ##y## by projecting the
samples ##x## onto a line:
##y = w^T x##
In the above equation, what is ##w##?
In my opinion, ##w## defines the projection axis. Is that correct? Please guide me as to what ##w## is.
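To make my understanding concrete, here is a small numerical sketch of the projection ##y = w^T x##. The toy data and the choice of ##w## via Fisher's criterion, ##w \propto S_W^{-1}(m_1 - m_2)##, are my own assumptions, not taken from the lecture:

```python
import numpy as np

# Two classes of 2-D samples (toy data, just for illustration)
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))  # class Omega_1
X2 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))  # class Omega_2

# Fisher's LDA direction: w is proportional to S_W^{-1} (m_1 - m_2),
# where S_W is the within-class scatter and m_1, m_2 are the class means.
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w = np.linalg.solve(S_W, m1 - m2)
w = w / np.linalg.norm(w)  # only the direction matters, not the scale

# The projection y = w^T x turns each D-dimensional sample into a scalar
y1 = X1 @ w
y2 = X2 @ w

# The two sets of projected scalars should be well separated along this axis
print(y1.mean(), y2.mean())
```

So if I understand correctly, ##w## is just a D-dimensional direction vector, and every sample is reduced to a single number by taking its dot product with ##w##.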
Kindly also tell me how to activate LaTeX on this forum; I have tried both the $$ and ## delimiters.
Zulfi.
The objective of LDA is to perform dimensionality reduction while preserving as much of the class-discriminatory information as possible.