Linear combination and linear dependence

plepamo · New member · Joined Mar 11, 2014 · Messages: 2
Hi,

My questions are based on a set of vectors which are homogeneous.

Am I correct by saying the determinant not equal to zero implies a trivial solution which also implies linear independence?

If the determinant is equal zero, does it mean there exists a linear combination? If not, please explain.
 
Hi,

My questions are based on a set of vectors which are homogeneous.

Be precise. Vectors are not homogeneous, the linear system is homogeneous.
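To spell the distinction out: a linear system is homogeneous when its right-hand side is the zero vector,

\(\displaystyle A\mathbf{x} = \mathbf{0},\)

so it always has at least the trivial solution \(\displaystyle \mathbf{x} = \mathbf{0}\); the interesting question is whether it has any others.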

Am I correct by saying the determinant not equal to zero implies a trivial solution which also implies linear independence?

It implies that the only solution to the corresponding linear system (equivalently, the corresponding vector equation) is the trivial one.
If the determinant is equal zero, does it mean there exists a linear combination? If not, please explain.

Exists a linear combination such that what? Again, be precise and I'll help further.
 
Be precise. Vectors are not homogeneous, the linear system is homogeneous.

What I meant was the homogeneous system Ax = 0, where the columns of the matrix A are the vectors in the set.

It implies that the only solution to the corresponding linear system (equivalently, the corresponding vector equation) is the trivial one.

Does having only the trivial solution imply linear independence? In other words, provided A is a square matrix, can I determine whether the set of vectors is linearly independent by checking whether Ax = 0 has only the trivial solution?


Exists a linear combination such that what? Again, be precise and I'll help further.
If I'm given a set of vectors S and asked to determine whether another vector w is a linear combination of the vectors in S, is it sufficient for me to conclude that it is if the determinant of the matrix formed from those vectors is zero?

Hopefully that makes more sense.
 
What I meant was the homogeneous system Ax = 0, where the columns of the matrix A are the vectors in the set.



Does having only the trivial solution imply linear independence? In other words, provided A is a square matrix, can I determine whether the set of vectors is linearly independent by checking whether Ax = 0 has only the trivial solution?

If A is square, then Ax = 0 has only the trivial solution if and only if A is invertible, which holds if and only if the columns of A are linearly independent.
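As a quick numerical sketch of that equivalence (using NumPy; the matrices here are made-up examples, not from the thread): a nonzero determinant goes with independent columns, and a zero determinant goes with dependent ones.

```python
import numpy as np

# Hypothetical square matrix whose columns are the vectors in question.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# det(A) != 0  <=>  Ax = 0 has only the trivial solution
#              <=>  the columns of A are linearly independent.
det = np.linalg.det(A)
print(det != 0)  # columns of A are independent

# A singular example: the second column is twice the first,
# so the columns are dependent and the determinant is 0.
B = np.array([[1.0, 2.0],
              [3.0, 6.0]])
print(np.isclose(np.linalg.det(B), 0.0))
```

(In floating point, compare the determinant to zero with a tolerance, as `np.isclose` does, rather than with exact equality.)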

If I'm given a set of vectors S and asked to determine whether another vector w is a linear combination of the vectors in S, is it sufficient for me to conclude that it is if the determinant of the matrix formed from those vectors is zero?

Hopefully that makes more sense.

What matrix? How are you forming the matrix? If you mean the members of S together with w, then no. For example,

\(\displaystyle \left[\begin{matrix} 1 & 2 & 1\\1 & 2 & 0\\1&2&0\end{matrix}\right]\)

has a determinant of zero, but the third column vector (w) is not a linear combination of the first two. Similarly,


\(\displaystyle \left[\begin{matrix} 1 & 3 & 1\\1 & 2 & 0\\1&2&0\end{matrix}\right]\)

also has a determinant of zero, but this time w is a linear combination of the other two columns (indeed \(\displaystyle w = -2v_1 + v_2\)).

To figure out whether w is a linear combination of the vectors in S, form the matrix whose columns are the members of S, augment it with w, and row-reduce. Then check whether the corresponding system is consistent.
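That procedure can be sketched in code (using SymPy's `rref`, which is not part of the thread; the vectors are taken from the second example matrix above):

```python
from sympy import Matrix

# Example: S = {v1, v2}; test whether w is in span(S).
v1, v2 = Matrix([1, 1, 1]), Matrix([3, 2, 2])
w = Matrix([1, 0, 0])

# Build the augmented matrix [v1 v2 | w] and row-reduce it.
aug = v1.row_join(v2).row_join(w)
rref, pivots = aug.rref()

# The system is consistent iff the augmented column is NOT a pivot column.
consistent = (aug.shape[1] - 1) not in pivots
print(consistent)  # True here: the reduced system gives w = -2*v1 + 1*v2
```

When the system is consistent, the last column of the reduced matrix reads off the coefficients of the combination; when the augmented column is a pivot column, the system is inconsistent and w is not in the span of S.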
 