Understanding what w is in Linear Discriminant Analysis

zak100

New member
Joined
Dec 10, 2013
Messages
14
I can't understand an equation in LDA. The context is:
The objective of LDA is to perform dimensionality reduction while
preserving as much of the class discriminatory information as
possible

Maybe the lecturer is trying to derive the equation given below.

I know from the above that LDA projects the points onto an axis so that we get maximum separation between the two classes, in addition to reducing the dimensionality.

Assume we have a set of D-dimensional samples ##\{x_1, x_2, \dots, x_N\}##, ##N_1## of which belong to class
##\Omega_1## and ##N_2## to class ##\Omega_2##. We seek to obtain a scalar ##y## by projecting the
samples ##x## onto a line:
##y = w^T x##

In the above equation, what is ##w##?
In my opinion, ##w## is used to find the projection axis. So please guide me: what exactly is ##w##?
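To check my understanding, here is a small toy sketch I wrote (made-up 2-D data, numpy, and an arbitrary candidate direction ##w##; none of this is from the slides) of what ##y = w^T x## seems to do: every D-dimensional sample collapses to one scalar along the chosen line.

```python
import numpy as np

# Made-up 2-D samples for two classes (for illustration only)
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[1.0, 1.0], scale=0.5, size=(10, 2))  # class Omega_1, N1 = 10
X2 = rng.normal(loc=[3.0, 2.0], scale=0.5, size=(10, 2))  # class Omega_2, N2 = 10

# An arbitrary candidate projection direction w (a D-vector, here D = 2)
w = np.array([1.0, 0.5])
w = w / np.linalg.norm(w)  # unit length, so y is a distance along the line

# y = w^T x: each D-dimensional sample x becomes a single scalar y
y1 = X1 @ w  # shape (10,) -- one number per class-1 sample
y2 = X2 @ w  # shape (10,) -- one number per class-2 sample

print(y1.mean(), y2.mean())  # well-separated means suggest a good choice of w
```

If I follow the slides correctly, the rest of the derivation is about finding the particular ##w## that best separates these projected ##y## values for the two classes.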

Kindly also tell me how to activate LaTeX on this forum; I have tried both the $$ and ## options.

Zulfi.
 

Attachments

  • LDA 1st Slide of 16_pdf.jpg