Prove the determinant is non-zero (linear independence w/o row-echelon form)

Mampac

Hi there,

In R100 we are given three vectors:
v1 = (1, 2, 3, 4, . . . , 100),
v2 = (2, 3, 4, 5, . . . , 100, 0),
v3 = (3, 4, 5, 6, . . . , 100, 0, 0).
I have to show that these are linearly independent, without using any row-echelon matrix.

I know that for a square matrix the rows are dependent if and only if [MATH]\det A = 0[/MATH], so I've got to prove the opposite.

I see only 2 ways of finding the determinant -- out of the ones we've covered so far:
1) I'm hesitant to use the triangle method, since we have unknowns. Can I form a matrix with the vectors as rows, zero out [MATH]a_{21}, a_{31}, a_{32}[/MATH], and say that the product of the diagonal entries is nonzero? I'm hesitant because it's not a square matrix in this case. Even then I get 1 * (-1) * 0 = 0.
2) As for the Laplace expansion... I just can't find a valid entry to apply the theorem to, even after performing row operations. Everything fails. If I pick out 3 of the known columns and build a matrix from them, I always get a zero determinant.

By the way, when creating a matrix, should I drop the last zeros in v2 and v3, or should I append more to v1 and v2?

It seems to me there's a mistake in the subject and it should've been "show whether independent", because to me these rows are dependent as ****.
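For what it's worth, here is a quick numerical sanity check (assuming Python with NumPy, which is outside the coursework and of course not a proof): build the three vectors and compute the rank of the 3×100 matrix whose rows they are; rank 3 would mean the rows are independent.

[CODE]
import numpy as np

# The three vectors in R^100, written out explicitly.
v1 = np.arange(1, 101)                            # (1, 2, 3, ..., 100)
v2 = np.concatenate([np.arange(2, 101), [0]])     # (2, 3, ..., 100, 0)
v3 = np.concatenate([np.arange(3, 101), [0, 0]])  # (3, 4, ..., 100, 0, 0)

# Stack them as the rows of a 3 x 100 matrix and compute its rank.
A = np.vstack([v1, v2, v3])
print(A.shape)                    # (3, 100)
print(np.linalg.matrix_rank(A))   # 3 -> the three rows are linearly independent
[/CODE]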
 
You need to start with a definition of linear independence.
[Attached image: the definition of linear independence -- the vectors are independent if [MATH]xv_1 + yv_2 + zv_3 = 0[/MATH] only when [MATH]x = y = z = 0[/MATH].]
Write out the three equations for the last three entries.
Solve for x, y, z (starting with the final equation, then the second-last...)
Then look at the definition of linear independence and decide.

(There is no square matrix here, so determinants are not of use).
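Concretely (assuming, as in the attached definition, that we set [MATH]xv_1 + yv_2 + zv_3 = 0[/MATH]), the equations coming from the last three coordinates are
[MATH]98x + 99y + 100z = 0[/MATH]
[MATH]99x + 100y = 0[/MATH]
[MATH]100x = 0[/MATH]
Starting with the final equation, [MATH]100x = 0[/MATH] forces x = 0; back-substitution then gives y = 0 and z = 0.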
 
(There is no square matrix here, so determinants are not of use).
Oops!
I was going to say look at [MATH]aV_1 + bV_2 + cV_3 = 0[/MATH], but the determinant of an upper/lower triangular square matrix would have done the trick nicely -- if in fact we had a square matrix. I'll be in the corner for a while, thinking about how I made such a simple error.

BTW, lex, welcome to the forum!
 
(There is no square matrix here, so determinants are not of use).
Only a couple of hours later I realized (a classmate pointed it out) that the first vector ends in ..., 98, 99, 100. I had thought the dimensions of the vectors were different (that v1 has 98 components, v2 has 99, and v3 all 100), which is why I was so confused and thought about appending 0s.

I was required not to use any row-echelon stuff (well, as a matter of fact, the problem states "Show that these vectors are linearly independent. This can be done without any row-echelon matrix", but it seems to me I'd lose some points that way).
After the aforementioned discovery I did understand that if I bring the matrix formed by these vectors to row-echelon form, I don't get a zero row at the bottom, and thus they're linearly independent. You said it's not a square matrix and I agree, but can I now take the last three entries of each vector to form one? Then it's already in a mirrored triangular form, so the determinant is [MATH]-(100 \times 100 \times 100) \neq 0[/MATH], and thus the whole set is independent?
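Just to double-check that value (again with NumPy, outside the actual proof), the 3×3 matrix of last entries does have a nonzero determinant:

[CODE]
import numpy as np

# Rows are the last three entries of v1, v2, v3 respectively.
B = np.array([[ 98,  99, 100],
              [ 99, 100,   0],
              [100,   0,   0]])

# Floating-point LU computation; prints approximately -1000000.0,
# i.e. -(100 * 100 * 100), which is nonzero.
print(np.linalg.det(B))
[/CODE]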
 
it's not a square matrix and I agree, but can I now take the last three entries of each vector to form one? Then it's already in a mirrored triangular form, so the determinant is [MATH]-(100 \times 100 \times 100) \neq 0[/MATH], and thus the whole set is independent?
Certainly so. If the only solution of the last three equations is x=0, y=0, z=0 then the only solution of the entire system is x=0, y=0, z=0.
Well done. Everyone is happy!
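For completeness, expanding that 3×3 determinant along its last row gives
[MATH]\begin{vmatrix} 98 & 99 & 100 \\ 99 & 100 & 0 \\ 100 & 0 & 0 \end{vmatrix} = 100\begin{vmatrix} 99 & 100 \\ 100 & 0 \end{vmatrix} = 100(99 \cdot 0 - 100 \cdot 100) = -1{,}000{,}000 \neq 0,[/MATH]
which matches the value [MATH]-(100 \times 100 \times 100)[/MATH] quoted above.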
 