Basis: Let {u,v,w} belong to R^n and suppose that....

Imum Coeli

Question:
Let {u,v,w} belong to R^n and suppose that {2u+w, u-v, w} is a basis for a subspace W. Show that {u,v,w} is a basis for W by showing that it satisfies the two conditions in the definition of a basis.

Notes:
First start with the definition of a basis.
1) The vectors must be linearly independent
2) The vectors must span W
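
Writing the two conditions out for a general set \(\displaystyle \{v_1,\dots,v_k\}\subseteq W\):

\(\displaystyle c_1v_1+\dots+c_kv_k=0 \;\Rightarrow\; c_1=\dots=c_k=0, \qquad \text{span}\{v_1,\dots,v_k\}=W\)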

To show 1)
Want to show that u,v,w are linearly independent.
For some a,b,c that belong to R we can write
0 = a(2u+w)+(a+b)(u-v)+(a+b+c)(w)
Which can be written as
0 = (3a+b)u - (a+b)v + (2a+b+c)w
Putting the coefficients in a matrix and row reducing (my first attempt at latex. I hope it works...)

\(\displaystyle \left(\begin{array}{ccc} 3\, a & b & 0\\ - a & - b & 0\\ 2\, a & b & c \end{array}\right)\)

becomes

\(\displaystyle \left(\begin{array}{ccc} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1 \end{array}\right)\)

As there is a unique solution a=b=c=0, u,v,w are linearly independent. This meets the requirements of 1)
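
As a quick sanity check on the coefficient system from the expansion above, a minimal sympy sketch (it only confirms that 3a+b = 0, a+b = 0, 2a+b+c = 0 force a = b = c = 0, the same conclusion as the row reduction):

import sympy as sp

a, b, c = sp.symbols('a b c')
# The three coefficient equations read off from 0 = (3a+b)u - (a+b)v + (2a+b+c)w
eqs = [sp.Eq(3*a + b, 0), sp.Eq(-(a + b), 0), sp.Eq(2*a + b + c, 0)]
print(sp.solve(eqs, [a, b, c]))   # {a: 0, b: 0, c: 0}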


To show 2)
Want to show that u,v,w span W.
Let x belong to span{2u+w, u-v, w}. Now I want to show that x is a linear combination of u,v,w.
Write
x
= a(2u + w) + b(u-v) + cw
Which can be written
x = (2a+b)u - bv + (a +c)w
Which implies that x belongs to W
This satisfies condition 2)

Thus we can conclude that {u,v,w} is a basis for W.

I'm not sure if I have done any of this right... Any advice would be very helpful.
Thanks
 
Question:
Let {u,v,w} belong to R^n and suppose that {2u+w, u-v, w} is a basis for a subspace W. Show that {u,v,w} is a basis for W by showing that it satisfies the two conditions in the definition of a basis.

Notes:
First start with the definition of a basis.
1) The vectors must be linearly independent
2) The vectors must span W

To show 1)
Want to show that u,v,w are linearly independent.
For some a,b,c that belong to R we can write
0 = a(2u+w)+(a+b)(u-v)+(a+b+c)(w)
Which can be written as
0 = (3a+b)u - (a+b)v + (2a+b+c)w
Putting the coefficients in a matrix and row reducing (my first attempt at latex. I hope it works...)
When you say "putting coefficients in a matrix ..." you are saying that each of 3a + b, a + b, and 2a + b + c must be equal to 0. You recognize that this is true because u, v, and w are independent, right?

\(\displaystyle \left(\begin{array}{ccc} 3\, a & b & 0\\ - a & - b & 0\\ 2\, a & b & c \end{array}\right)\)

becomes

\(\displaystyle \left(\begin{array}{ccc} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1 \end{array}\right)\)

As there is a unique solution a=b=c=0, u,v,w are linearly independent. This meets the requirements of 1)


To show 2)
Want to show that u,v,w span W.
Let x belong to span{2u+w, u-v, w}. Now I want to show that x is a linear combination of u,v,w.
No! If x were to lie in any subspace of W, it would be a linear combination of u, v, w. What you want to do is the other way: Let x be in W (the span of u, v, w) and show that x can be written as a linear combination of 2u + w, u - v, and w.

Write
x
= a(2u + w) + b(u-v) + cw
Which can be written
x = (2a+b)u - bv + (a +c)w
Which implies that x belongs to W
This satisfies condition 2)

Thus we can conclude that {u,v,w} is a basis for W.

I'm not sure if I have done any of this right... Any advice would be very helpful.
Thanks
 
When I said "putting coefficients in a matrix ..." I did realise that this is true because u, v, and w are independent. I was just trying to be explicit. I guess it was unnecessary.

I may be totally misunderstanding this but to...

"Let x be in W (the span of u, v, w and show that x can be written as a linear combination of 2u+ w, u- v, and w."

...I need to start with x being a linear combination of u, v, w and show that it is also a linear combination of 2u + w, u - v, and w? I will end up with more terms than I begin with. How do I go about this?

Thank you so much for your time.
 

You're thinking too hard. 2u+w, u-v, w are all in span{u,v,w}. Then 3 = dim(span{2u+w, u-v, w}) <= dim(span{u,v,w}) <= 3. Therefore they span the same space, and u, v, w are linearly independent.
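
One way to see the dimension count concretely is to try it with made-up numbers. A small numpy sketch with randomly chosen u, v, w in R^5 (purely illustrative; the matrix ranks are the dimensions of the two spans):

import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 5))           # example vectors in R^5

B_given = np.column_stack([2*u + w, u - v, w])  # the basis we are given for W
B_test = np.column_stack([u, v, w])             # the set we want to show is a basis

# If the given set has rank 3, then u, v, w must also have rank 3,
# so the two spans have the same dimension and therefore coincide.
print(np.linalg.matrix_rank(B_given), np.linalg.matrix_rank(B_test))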
 
Wow. Thanks. I had to look up dimension though.

For some reason this problem is still not clicking for me.

I was looking at a similar, simpler question trying to solve this one.

Question:
Show that if {u,v} is a basis for a subspace W, then so is {u+v, v}.
Solution (ignoring linear independence):
Let x belong to W then x = au + bv (since {u,v} span W)
Then x = au + av - av + bv = a(u+v) + (b-a)v
So x belongs to span {u+v, v}.
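
The identity can also be read as solving for the old vectors in terms of the new ones, which is all the "adding zero" step amounts to:

\(\displaystyle u=(u+v)-v,\qquad v=v,\qquad\text{so}\qquad x=au+bv=a\big[(u+v)-v\big]+bv=a(u+v)+(b-a)v\)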

I was wondering if a similar approach is possible with this problem, but then I can't see the trick of adding zero...

Thanks again.
 

If you must do it that way, then (and I did this by inspection):

\(\displaystyle \dfrac{1}{2}\left([2u+w] + (-1)[w]\right)=u\)

\(\displaystyle \dfrac{1}{2}\left([2u+w] + (-1)[w] +(-2)[u-v]\right)=v\)

So \(\displaystyle u,v,w\in \text{span}\{2u+w,u-v,w\}\)

Therefore \(\displaystyle \text{span}\{u,v,w\} = \text{span}\{2u+w,u-v,w\}\)
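
A quick check of the coefficient algebra in those two combinations, as a sympy sketch (it treats u, v, w as ordinary symbols, which is enough to verify the cancellations):

import sympy as sp

u, v, w = sp.symbols('u v w')
half = sp.Rational(1, 2)
print(sp.simplify(half*((2*u + w) - w) - u))              # 0, so the first combination is u
print(sp.simplify(half*((2*u + w) - w - 2*(u - v)) - v))  # 0, so the second combination is v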



To do this in general, given a basis B for W and another subset B' of W that you wish to show is a basis for W, you create a map \(\displaystyle T: W \to W\) by mapping B' onto B. If T is invertible, then B' is a basis.

For this example I choose \(\displaystyle T(2u+w)=u,\ T(u-v)=v,\ T(w)=w\). By using linearity of \(\displaystyle T\),

\(\displaystyle T(2u+w)=2T(u)+T(w)=u\) (1)
\(\displaystyle T(u-v)=T(u)-T(v)=v\) (2)
\(\displaystyle T(w)=w\) (3)

By subbing (3) into (1) we get: \(\displaystyle T(u) = \dfrac{1}{2}(u-w)\)
By subbing this into (2) we get: \(\displaystyle T(v) = \dfrac{1}{2}(u-w-2v)\)

So \(\displaystyle T\) is invertible, and is in fact called a change of basis transformation (and has an associated matrix, the change of basis matrix). To get your linear combination, apply \(\displaystyle T^{-1}\)

\(\displaystyle u = T^{-1}\left(\dfrac{1}{2}(u-w)\right) = \dfrac{1}{2}\left(T^{-1}(u)-T^{-1}(w)\right) = \dfrac{1}{2}\left([2u+w] + (-1)[w]\right)\)

\(\displaystyle v = T^{-1}\left(\dfrac{1}{2}(u-w-2v)\right) = \dfrac{1}{2}\left(T^{-1}(u)-T^{-1}(w)-2T^{-1}(v)\right) = \dfrac{1}{2}\left([2u+w] + (-1)[w]+(-2)[u-v]\right)\)
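
For the matrix version of this, a sympy sketch (the columns below record the coordinates of 2u+w, u-v, w with respect to (u, v, w), one common convention for the change of basis matrix):

import sympy as sp

# Columns: coordinates of 2u+w, u-v, w relative to (u, v, w).
P = sp.Matrix([[2, 1, 0],
               [0, -1, 0],
               [1, 0, 1]])

print(P.det())   # -2, nonzero, so the change of basis map is invertible
print(P.inv())   # the first two columns reproduce the combinations for u and v above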
 
At the risk of looking like an idiot...

If I use HallsofIvy's condition that x is in W then does my first proof hold?

x = a(2u + w) + b(u-v) + cw
Which can be written
x = (2a+b)u - bv + (a +c)w

because
x = a(2u + w) + b(u-v) + cw = x1u + x2v + x3w (as a, b and c are arbitrary constants can't I write 2a+b=x1, -b=x2 and a+c=x3?)
So isn't this just a linear combination of u,v and w? And as x is just any vector in W, then doesn't {u, v, w} span W?

Sorry if I'm being dense and annoying but I feel like I'm missing something important if I can't understand this.
 

What that shows is that \(\displaystyle W\subset \text{span}\{u,v,w\}\), not equality.
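
Equality of the spans needs the reverse inclusion as well, i.e. u, v, w must themselves lie in W, and that is exactly what the explicit combinations earlier in the thread provide:

\(\displaystyle \text{span}\{u,v,w\}=W \iff W\subseteq\text{span}\{u,v,w\} \ \text{ and } \ u,v,w\in W\)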
 