
Linear Dependence



Definition




The vectors v1,v2,...,vk in a vector space V are said to be linearly dependent if there exist constants a1,a2,...,ak, not all zero, such that

                                             $ a_1v_1+a_2v_2+\cdots+a_kv_k=0 $                        (1)

Otherwise, v1,v2,...,vk are called linearly independent. That is, v1,v2,...,vk are linearly independent if, whenever $ a_1v_1+a_2v_2+\cdots+a_kv_k=0 $, then

                                             $ a_1=a_2=\cdots=a_k=0 $



If S = {v1,v2,...,vk}, then we also say that the set S is linearly dependent or linearly independent if the vectors have the corresponding property.


It should be emphasized that for any vectors v1,v2,...,vk, Equation (1) always holds if we choose all the scalars a1,a2,...,ak equal to zero. The important point in this definition is whether it is possible to satisfy (1) with at least one of the scalars different from zero.


Remark: The definition is stated for a finite set of vectors, but it also applies to an infinite subset S of a vector space, using corresponding notation for infinite sums.


To determine whether a set of vectors is linearly independent or linearly dependent, we use Equation (1). Regardless of the form of the vectors, Equation (1) yields a homogeneous linear system of equations, which is always consistent, since a1=a2=...=ak=0 is a solution. The key point in the definition is whether there is a nontrivial solution.
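
This check can also be carried out by computer. The following is a minimal sketch (not from the original page, and assuming the SymPy library is available): the vectors become the columns of a matrix, and the set is independent exactly when the homogeneous system has only the trivial solution, i.e. when the null space is zero. The helper name is_linearly_independent is purely illustrative.

    # Sketch, assuming SymPy: test linear independence by looking for a
    # nontrivial solution of the homogeneous system A x = 0.
    from sympy import Matrix

    def is_linearly_independent(vectors):
        """Return True if the given vectors (lists of numbers) are linearly independent."""
        A = Matrix.hstack(*[Matrix(v) for v in vectors])  # each vector becomes a column
        return len(A.nullspace()) == 0                    # empty null space -> only the trivial solution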





Example 1



Determine whether the vectors

$ v_1=\left[\begin{array}{c}3\\2\\1\end{array}\right] $

$ v_2=\left[\begin{array}{c}1\\2\\0\end{array}\right] $

$ v_3=\left[\begin{array}{c}-1\\2\\-1\end{array}\right] $

are linearly independent.


Solution

Forming Equation (1)

$ a_1\left[\begin{array}{c}3\\2\\1\end{array}\right]+a_2\left[\begin{array}{c}1\\2\\0\end{array}\right]+a_3\left[\begin{array}{c}-1\\2\\-1\end{array}\right]=\left[\begin{array}{c}0\\0\\0\end{array}\right] $

we obtain the homogeneous system

3a1 + a2 − a3 = 0

2a1 + 2a2 + 2a3 = 0

a1 − a3 = 0


The corresponding augmented matrix is

$ \left[\begin{array}{cccc}3&1&-1&|0\\2&2&2&|0\\1&0&-1&|0\end{array}\right] $


and its reduced row echelon form is

$ \left[\begin{array}{cccc}1&0&-1&|0\\0&1&2&|0\\0&0&0&|0\end{array}\right] $


Thus there is a nontrivial solution

$ \left[\begin{array}{c}a_1\\a_2\\a_3\end{array}\right]=\left[\begin{array}{c}k\\-2k\\k\end{array}\right], \qquad k \neq 0, $

so the vectors are linearly dependent.
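
As a quick cross-check of this example (a sketch assuming SymPy, not part of the original solution), row-reducing the coefficient matrix and computing its null space reproduces both the reduced row echelon form and the family of solutions (k, −2k, k):

    # Sketch, assuming SymPy: verify Example 1.
    from sympy import Matrix

    A = Matrix([[3, 1, -1],
                [2, 2,  2],
                [1, 0, -1]])   # columns are v1, v2, v3

    print(A.rref()[0])         # Matrix([[1, 0, -1], [0, 1, 2], [0, 0, 0]])
    print(A.nullspace())       # [Matrix([[1], [-2], [1]])] -> a1 = k, a2 = -2k, a3 = k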






Example 2



Are the vectors

$ v_1=\left[\begin{array}{cccc}1&0&1&2\end{array}\right] $

$ v_2=\left[\begin{array}{cccc}0&1&1&2\end{array}\right] $

$ v_3=\left[\begin{array}{cccc}1&1&1&3\end{array}\right] $     in R4

linearly dependent or linearly independent?


Solution

We form Equation (1),

a1v1+a2v2+a3v3=0

and solve for a1,a2,a3.

The resulting homogeneous system is

a1 + a3 = 0

a2 + a3 = 0

a1 + a2 + a3 = 0

2a1 + 2a2 + 3a3 = 0


The corresponding augmented matrix is

$ \left[\begin{array}{cccc}1&0&1&|0\\0&1&1&|0\\1&1&1&|0\\2&2&3&|0\end{array}\right] $


and its reduced row echelon form is

$ \left[\begin{array}{cccc}1&0&0&|0\\0&1&0&|0\\0&0&1&|0\\0&0&0&|0\end{array}\right] $


Thus the only solution is the trivial solution

a1 = a2 = a3 = 0

so the vectors are linearly independent.
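
The same kind of check (again a sketch assuming SymPy, not part of the original solution) confirms that the null space is empty here, so only the trivial solution exists:

    # Sketch, assuming SymPy: verify Example 2.
    from sympy import Matrix

    A = Matrix([[1, 0, 1],
                [0, 1, 1],
                [1, 1, 1],
                [2, 2, 3]])    # columns are v1, v2, v3 written as column vectors

    print(A.rank())            # 3 -> the rank equals the number of vectors
    print(A.nullspace())       # [] -> only a1 = a2 = a3 = 0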



Theorem

Let S = {v1,v2,...,vn} be a set of n vectors in Rn. Let A be the matrix whose columns (rows) are the elements of S. Then S is linearly independent if and only if det(A) is not equal to 0.



Example 3


Is     $ S=\left\{\left[\begin{array}{ccc}1&2&3\end{array}\right],\left[\begin{array}{ccc}0&1&2\end{array}\right],\left[\begin{array}{ccc}3&0&-1\end{array}\right]\right\} $

a linearly independent set of vectors in R3?


Solution

We form the matrix A whose rows are the vectors in S:


$ A=\left[\begin{array}{cccc}1&2&3\\0&1&2\\3&0&-1\end{array}\right] $


Since det(A) = 2, we conclude that S is linearly independent.
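
A small cross-check (a sketch assuming SymPy; not part of the original example) of the determinant test:

    # Sketch, assuming SymPy: the determinant test of the theorem above.
    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [0, 1, 2],
                [3, 0, -1]])   # rows are the vectors of S

    print(A.det())             # 2, which is nonzero, so S is linearly independent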



Theorem

Let S1 and S2 be finite subsets of a vector space and let S1 be a subset of S2. Then the following statements are true:

(a) If S1 is linearly dependent, so is S2.

(b) If S2 is linearly independent, so is S1.



Theorem

The nonzero vectors v1,v2,...,vn in a vector space V are linearly dependent if and only if one of the vectors vj $ (j \ge 2) $ is a linear combination of the preceding vectors v1,v2,...,vj−1.
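
This theorem can be illustrated computationally. The sketch below (not from the original page, assuming SymPy; the helper name first_dependent_index is purely illustrative) appends the vectors one at a time and reports the first vj whose addition does not increase the rank, which is exactly a vector that is a linear combination of its predecessors:

    # Sketch, assuming SymPy: find the first v_j (j >= 2) that is a linear
    # combination of v_1, ..., v_{j-1}, by watching the rank as columns are added.
    from sympy import Matrix

    def first_dependent_index(vectors):
        A = Matrix(vectors[0])                      # start with v_1 as a column
        for j, v in enumerate(vectors[1:], start=2):
            B = Matrix.hstack(A, Matrix(v))
            if B.rank() == A.rank():                # v_j added no new direction
                return j                            # v_j depends on the preceding vectors
            A = B
        return None                                 # the vectors are linearly independent

    print(first_dependent_index([[3, 2, 1], [1, 2, 0], [-1, 2, -1]]))  # 3, as in Example 1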


Example 4

Let $ V $=$ R_3 $ and also 
