Bonus point project for MA265.

# Elementary Linear Algebra Chapter 4: Real Vector Spaces

Welcome!

Note: This page is based on the fourth chapter of *Elementary Linear Algebra with Applications* (Ninth Edition) by Bernard Kolman and David R. Hill.

**4.1 Vectors in the Plane and in 3-Space**

This section gives the basic definitions of vectors and coordinate systems (see the book). I am under the impression that you have had enough math to know what these are.

**4.2 Vector Spaces**

A **real vector space** is a set V of elements on which we have two operations + and ∙ defined with the following properties:

(a) If u and v are any elements in V, then u + v is in V. We say that V is **closed** under the operation +

1. u + v = v + u for all u, v in V

2. u + (v + w) = (u + v) + w for all u, v, w in V

3. There exists an element 0 in V such that u + 0 = 0 + u = u for any u in V

4. For each u in V there exists an element –u in V such that u + -u = -u + u = 0

(b) If u is any element in V and c is any real number, then c ∙ u is in V

1. c ∙ (u + v) = c ∙ u + c ∙ v for any u, v in V and any real number c

2. (c + d) ∙ u = c ∙ u + d ∙ u for any u in V and any real numbers c and d

3. c ∙ (d ∙ u) = (cd) ∙ u for any u in V and any real numbers c and d

4. 1 ∙ u = u for any u in V

The operation "+" is called **vector addition** and the operation "∙" is called **scalar multiplication**.

*Examples:*

1. Let R^{n} be the set of all 1 × n matrices [a_{1} a_{2} ··· a_{n}], where we define "+" by [a_{1} a_{2} ··· a_{n}] + [b_{1} b_{2} ··· b_{n}] = [a_{1}+b_{1} a_{2}+b_{2} ··· a_{n}+b_{n}] and we define "·" by c · [a_{1} a_{2} ··· a_{n}] = [ca_{1} ca_{2} ··· ca_{n}]

2. A **polynomial** (in t) is a function that is expressible as

p(t) = a_{n}t^{n} + a_{n-1}t^{n-1} + ··· + a_{1}t + a_{0}

where the a_{i} are real numbers and n is a nonnegative integer. If a_{n} ≠ 0, then the polynomial has degree n. Let

q(t) = b_{n}t^{n} + b_{n-1}t^{n-1} + ··· + b_{1}t + b_{0}

be a second polynomial.

We define p(t) + q(t) as:

p(t) + q(t) = (a_{n}+b_{n})t^{n }+ (a_{n-1} + b_{n-1})t^{n-1} + ··· + (a_{1} + b_{1})t + (a_{0} + b_{0})

If c is a scalar, we define c · p(t) as:

c · p(t) = (ca_{n})t^{n} + (ca_{n-1})t^{n-1} + ··· + (ca_{1})t + (ca_{0})
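The R^{n} operations from the first example can be spot-checked in code. Below is a minimal sketch (my own Python, not from the book) that represents a 1 × n vector as a list and verifies a few of the 4.2 axioms on sample vectors:

```python
def vec_add(u, v):
    """Componentwise "+": [a1 ... an] + [b1 ... bn] = [a1+b1 ... an+bn]."""
    return [a + b for a, b in zip(u, v)]

def scalar_mul(c, u):
    """Scalar "·": c · [a1 ... an] = [c*a1 ... c*an]."""
    return [c * a for a in u]

u, v, w = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]
c, d = 2.0, 3.0

# Spot-check several axioms from 4.2 on these sample vectors.
assert vec_add(u, v) == vec_add(v, u)                           # (a)1: u + v = v + u
assert vec_add(u, vec_add(v, w)) == vec_add(vec_add(u, v), w)   # (a)2: associativity
assert vec_add(u, [0.0, 0.0, 0.0]) == u                         # (a)3: zero vector
assert scalar_mul(c, vec_add(u, v)) == vec_add(scalar_mul(c, u), scalar_mul(c, v))  # (b)1
assert scalar_mul(c + d, u) == vec_add(scalar_mul(c, u), scalar_mul(d, u))          # (b)2
assert scalar_mul(1.0, u) == u                                  # (b)4: 1 · u = u
print("all sampled axioms hold")
```

Passing these checks on sample vectors does not prove the axioms, but it is a useful sanity check when you define operations yourself.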

**4.3 Subspaces**

Let V be a vector space and W a nonempty subset of V. If W is a vector space with respect to the operations in V, then W is called a **subspace** of V. A subspace is like a "mini" vector space that satisfies all of the properties mentioned in section 4.2. An easy way to test whether a nonempty subset is a subspace is to check that it is closed under the addition and scalar multiplication of V.

*Example:*

- If R^{2} is a vector space, what is a possible subspace of R^{2}? If we think of this problem graphically, then R^{2} is the same as the xy-plane. One possible answer is the y-axis, that is, the set of all vectors of the form [0 y]. How do we know this is actually a subspace? Check that the closure properties hold: the sum of any two vectors on the y-axis is another vector on the y-axis, and any scalar multiple (positive, negative, or zero) of a vector on the y-axis is still on the y-axis.
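The y-axis closure test above can be sketched in code. This is my own small Python check (not from the book), with vectors as 2-element lists:

```python
def in_y_axis(v):
    """True if v = [x, y] lies on the y-axis (x-component is 0)."""
    return v[0] == 0

def add(u, v):
    return [u[0] + v[0], u[1] + v[1]]

def scale(c, v):
    return [c * v[0], c * v[1]]

u, v = [0, 3], [0, -7]
assert in_y_axis(add(u, v))    # closed under addition
assert in_y_axis(scale(5, u))  # closed under scalar multiplication
assert in_y_axis(scale(0, u))  # the zero vector [0, 0] is in the subset
print("y-axis passes the closure checks")

# Contrast: the shifted line y = x + 1 misses the zero vector [0, 0]
# (since 0 != 0 + 1), so it cannot be a subspace of R^2.
```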

**4.4 Span**

Let S be a set of vectors in a vector space V. If every vector in V is a linear combination of the vectors in S, then the set S is said to **span** V, or V is spanned by the set S; that is, span S = V.

*Example:*

- Let V be the vector space of all vectors of the form [a b c d]. The set S = {[1 0 0 0], [0 1 0 0], [0 0 1 0], [0 0 0 1]} spans V, since every vector in V is a linear combination of the vectors in S:

[a b c d] = a[1 0 0 0] + b[0 1 0 0] + c[0 0 1 0] + d[0 0 0 1]
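The linear combination above can be verified numerically. Here is a minimal sketch (my own helper, not from the book) that builds c1·v1 + c2·v2 + ··· and confirms the identity for one choice of a, b, c, d:

```python
def lin_comb(coeffs, vectors):
    """Return the linear combination c1*v1 + c2*v2 + ... as a list."""
    n = len(vectors[0])
    out = [0] * n
    for c, v in zip(coeffs, vectors):
        for i in range(n):
            out[i] += c * v[i]
    return out

# The standard vectors in R^4.
e1, e2, e3, e4 = [1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]

a, b, c, d = 2, -3, 5, 7
assert lin_comb([a, b, c, d], [e1, e2, e3, e4]) == [a, b, c, d]
print("[a b c d] = a*e1 + b*e2 + c*e3 + d*e4 confirmed")
```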

**4.5 Linear Independence**

A set of vectors is **linearly independent** if the only way to write the zero vector as a linear combination of them is with every coefficient equal to zero. Equivalently, no vector in the set is a linear combination of the other vectors in the set. If some vector in the set *can* be written as a combination of the others, the set is **linearly dependent**.

*Examples:*

Determine whether the following sets of vectors are linearly independent or linearly dependent.

A = $ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} $ B = $ \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} $ C = $ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} $ D = $ \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} $ E = $ \begin{bmatrix} 1 \\ 2 \\ 6 \end{bmatrix} $ F = $ \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} $ G = $ \begin{bmatrix} 5 \\ 5 \\ 5 \end{bmatrix} $

- ABF - linearly independent (none of A, B, F is a linear combination of the other two)
- ACD - linearly independent (none of A, C, D is a linear combination of the other two)
- BCFG - linearly dependent (note that C = B + F and G = 5 · C)
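These answers can be checked mechanically: a set of vectors is linearly independent exactly when the matrix built from them has rank equal to the number of vectors. Below is a small sketch (my own Python, not from the book) that computes rank by Gaussian elimination with a tolerance:

```python
def rank(rows, eps=1e-9):
    """Row-reduce a list of row-lists and count the pivot rows."""
    m = [row[:] for row in rows]
    r = 0
    for col in range(len(m[0])):
        # Find a pivot row for this column among the remaining rows.
        pivot = next((i for i in range(r, len(m)) if abs(m[i][col]) > eps), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        # Eliminate this column from every other row.
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > eps:
                f = m[i][col] / m[r][col]
                m[i] = [x - f * y for x, y in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    # Row rank equals column rank, so we may treat each vector as a row.
    return rank(vectors) == len(vectors)

A, B, C = [1, 0, 0], [1, 0, 1], [1, 1, 1]
D, F, G = [0, 0, 1], [0, 1, 0], [5, 5, 5]

assert independent([A, B, F])         # ABF: independent
assert independent([A, C, D])         # ACD: independent
assert not independent([B, C, F, G])  # BCFG: dependent (C = B + F, G = 5*C)
print("all three sets classified as above")
```

Note that four vectors in R^3 can never be independent, so BCFG fails before we even look for the specific relations.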

**4.6 Basis and Dimension**

The vectors in a vector space V are said to form a **basis** for V if they:

(1) span V

(2) are linearly independent

The **dimension **of a nonzero vector space V is the number of vectors in a basis for V. We often write **dim V** for the dimension of V. We also define the dimension of the trivial vector space {0} to be zero.

*Example:*

- The vectors [1 0] and [0 1] are a basis for R^{2} (linearly independent and they span R^{2})
- The vectors [1 0 0], [0 1 0], and [0 0 1] are a basis for R^{3} (linearly independent and they span R^{3})
- The vectors [1 2], [2 3], and [3 4] are not a basis for R^{2} (although they span R^{2}, they are not linearly independent)
- The vector [1 1] is not a basis for R^{2} (although it is linearly independent, it does not span R^{2})
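For pairs of vectors in R^{2}, independence reduces to a nonzero 2 × 2 determinant, which makes the examples above easy to check. A small sketch (my own, not from the book):

```python
def det2(u, v):
    """Determinant of the 2x2 matrix with columns u and v."""
    return u[0] * v[1] - u[1] * v[0]

# [1 0], [0 1]: nonzero determinant, so independent -> a basis for R^2.
assert det2((1, 0), (0, 1)) != 0

# [1 2], [2 3], [3 4]: three vectors in the 2-dimensional space R^2 can
# never be independent; concretely, [3 4] = -1*[1 2] + 2*[2 3].
assert (-1 * 1 + 2 * 2, -1 * 2 + 2 * 3) == (3, 4)

# [1 1] alone: a single vector only spans a line, not all of R^2, e.g.
# [1 0] is not a multiple of [1 1].
print("basis examples check out")
```

Since dim R^{2} = 2, any basis for R^{2} must contain exactly two vectors; both failing examples can be rejected just by counting.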

**4.7 Homogeneous Systems**

A **homogeneous system** is a linear system of the form Ax = 0; its solution set is the **null space** of A. If A is an m × n matrix, we refer to the dimension of the null space of A as the **nullity** of A, denoted by nullity A.

**4.9 Rank of a Matrix**

• If A is an m × n matrix, then rank A + nullity A = n

• An n × n matrix A is nonsingular if and only if rank A = n

• If A is an n × n matrix, then rank A = n if and only if det(A) ≠ 0

See page 272 for Example 4 to show how to calculate the above statements.
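As a quick numerical illustration of rank A + nullity A = n (this is my own toy matrix, not Example 4 from the book), take a 2 × 3 matrix whose second row is twice its first, so its rank is 1:

```python
def rank(rows, eps=1e-9):
    """Rank via Gaussian elimination: count the pivot rows."""
    m = [row[:] for row in rows]
    r = 0
    for col in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if abs(m[i][col]) > eps), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > eps:
                f = m[i][col] / m[r][col]
                m[i] = [x - f * y for x, y in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [2, 4, 6]]        # row 2 = 2 * row 1, so rank A = 1

n = len(A[0])          # number of columns of A
nullity = n - rank(A)  # rearranged rank-nullity: nullity A = n - rank A
assert rank(A) + nullity == n
print("rank =", rank(A), " nullity =", nullity)
```

Here the null space is 2-dimensional: the single equation x + 2y + 3z = 0 leaves two free variables.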

By Michael Munizzi