Proof of bias term in nearest centroid classifier

Sauraj

Hello, I don't even know how to start the proof.

[Attached image: New Text Document.jpg]


This is true, but only if I know that \(\displaystyle w = \frac{\mu_2 - \mu_1}{\|\mu_2 - \mu_1\|}\) when \(\displaystyle \beta = w^T \frac{\mu_1 + \mu_2}{2}\), from here (the picture linked above), so in my case \(w\) should be:

\[
w \;=\; \frac{w_\Delta - w_o}{\|w_\Delta - w_o\|},
\]
and since rescaling \(w\) by a positive constant (and recomputing \(\beta\) from the rescaled \(w\)) does not move the decision boundary, I can just as well take
\[
w \;=\; \frac{w_\Delta - w_o}{(w_\Delta - w_o)^T (w_\Delta - w_o)}.
\]

If I insert everything into \(\displaystyle w^T x - \beta\):

\[
\left(\frac{w_\Delta - w_o}{(w_\Delta - w_o)^T (w_\Delta - w_o)}\right)^{T} x
\;-\;
\left(\frac{w_\Delta - w_o}{(w_\Delta - w_o)^T (w_\Delta - w_o)}\right)^{T} \frac{w_o + w_\Delta}{2}
\;=\;
\frac{(w_\Delta - w_o)^T}{(w_\Delta - w_o)^T (w_\Delta - w_o)}\, x
\;-\;
\frac{(w_\Delta - w_o)^T}{(w_\Delta - w_o)^T (w_\Delta - w_o)}\, \frac{w_o + w_\Delta}{2},
\]
and now I have the expected \(\beta\).
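Spelling out the rescaling claim above (my own one-line check, not from the original text): for any constant \(c > 0\),
\[
\tilde w = c\,w, \qquad
\tilde \beta = \tilde w^{T} \frac{w_o + w_\Delta}{2} = c\,\beta
\quad\Longrightarrow\quad
\tilde w^{T} x - \tilde \beta = c\,\bigl(w^{T} x - \beta\bigr),
\]
so dividing by \(\|w_\Delta - w_o\|\) or by \((w_\Delta - w_o)^T (w_\Delta - w_o)\) puts the decision boundary in exactly the same place.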

But how do I calculate this if I don't know what \(w\) is? I only have \(\displaystyle \beta = w^T \frac{w_o + w_\Delta}{2}\) and \(w = w_o - w_\Delta\) from the formula above.
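My only idea for a starting point (just a sketch, assuming the classifier assigns \(x\) to whichever of the two centroids \(w_o, w_\Delta\) is nearer) would be to expand the squared distances:
\[
\|x - w_o\|^2 < \|x - w_\Delta\|^2
\;\Longleftrightarrow\;
-2\,w_o^T x + \|w_o\|^2 < -2\,w_\Delta^T x + \|w_\Delta\|^2
\;\Longleftrightarrow\;
(w_o - w_\Delta)^T x > \tfrac{1}{2}\bigl(\|w_o\|^2 - \|w_\Delta\|^2\bigr),
\]
and since \(\|w_o\|^2 - \|w_\Delta\|^2 = (w_o - w_\Delta)^T (w_o + w_\Delta)\), the right-hand side is exactly \(w^T \frac{w_o + w_\Delta}{2} = \beta\) with \(w = w_o - w_\Delta\) and no normalization at all. Is that the right way to go?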
 