Linear dependence of vectors

akim_bandana

New member
Joined: Oct 19, 2020
Messages: 2
Let \(v_1, v_2\) be two vectors in \(\mathbb{R}^2\). Show that if \(v_1\) and \(v_2\) form a basis of \(\mathbb{R}^2\), then \(w_1 = v_1 + v_2\) and \(w_2 = v_1\) also form a basis of \(\mathbb{R}^2\).

I kind of know the theory for this, but I'm struggling with where to begin. Help much appreciated!
 
Start with the definition and properties of basis vectors.

Please show us what you have tried and exactly where you are stuck.

Please follow the rules of posting in this forum, as enunciated at:


Please share your work/thoughts about this problem.
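
For reference, a spelled-out version of the definition the hint points at (standard material, not specific to this thread): vectors \(v_1, v_2\) form a basis of \(\mathbb{R}^2\) when

\[
c_1 v_1 + c_2 v_2 = 0 \implies c_1 = c_2 = 0 \quad\text{(linear independence)}
\qquad\text{and}\qquad
\operatorname{span}\{v_1, v_2\} = \mathbb{R}^2 \quad\text{(spanning)}.
\]

In \(\mathbb{R}^2\), any two linearly independent vectors automatically span, so checking independence is enough.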
 
Okay, so I'm trying to show that \(w_1\) and \(w_2\) are linearly independent. Is that the right approach? I also computed a matrix that takes \((v_1, v_2)\) to \((w_1, w_2)\), but I'm not sure what to do with it.
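
For what it's worth, here is one way that matrix can be used (a sketch, assuming the usual column convention, i.e. each new vector's coordinates in the old basis form a column):

\[
\begin{pmatrix} w_1 & w_2 \end{pmatrix}
= \begin{pmatrix} v_1 & v_2 \end{pmatrix}
\begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix},
\qquad
\det\begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix} = -1 \neq 0.
\]

An invertible change-of-basis matrix sends a basis to a basis, so the nonzero determinant already settles the question; the direct scalar argument below reaches the same conclusion.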



We are told that \(v_1\) and \(v_2\) form a basis for \(\mathbb{R}^2\). That means that \(v_1\) and \(v_2\) are linearly independent, right?
Now consider \(\alpha w_1 + \beta w_2 = 0\), that is, \(\alpha(v_1 + v_2) + \beta v_1 = 0\). Can you show that the scalars must be zero? (Hint: collect the coefficients of \(v_1\) and \(v_2\).)
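
Spelling out that computation, in case it helps (spanning then comes for free, since two linearly independent vectors in \(\mathbb{R}^2\) span it):

\[
\alpha w_1 + \beta w_2 = \alpha(v_1 + v_2) + \beta v_1 = (\alpha + \beta)\,v_1 + \alpha\, v_2 = 0.
\]

Because \(v_1\) and \(v_2\) are linearly independent, \(\alpha + \beta = 0\) and \(\alpha = 0\), hence \(\alpha = \beta = 0\). So \(w_1\) and \(w_2\) are linearly independent and form a basis of \(\mathbb{R}^2\).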
 