Homework 1 collaboration area

Feel free to toss around ideas here. Feel free to form teams to toss around ideas. Feel free to create your own workspace for your own team. --Steve Bell 12:11, 20 August 2010 (UTC)

Here is my favorite formula:

$ f(a)=\frac{1}{2\pi i}\int_\gamma \frac{f(z)}{z-a}\ dz. $

Question from a student:

I have a question about P.301 #33. I see in the back of the book that it is not a vector space. I don't understand why though. In the simplest form I would think an identity matrix would satisfy the requirements mentioned in the book for #33. Isn't an identity matrix a vector space?

Answer from Bell:

One of the key elements of being a vector space is that the thing must be closed under addition. The set consisting of just the identity matrix is not a vector space because if I add the identity matrix to itself, I get a matrix with twos down the diagonal, and that isn't in the set. So it isn't closed under addition. (The set consisting of the ZERO matrix is a vector space, though.)
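
For a concrete 2x2 illustration of that closure failure:

$ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} + \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}, $

and the matrix on the right is not the identity matrix, so the sum falls outside the one-element set.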

Another question from the same student:

Does a set of vectors have to contain the 0 vector to be a vector space? If that is true, then wouldn't any set of linearly independent vectors fail to be a vector space?

Answer from Bell:

Yes, a vector space must contain the zero vector (because a constant times any vector has to be in the space if the vector is, even if the constant is zero).

The set of all linear combinations of a bunch of vectors is a vector space, and the zero vector is in there because one of the combinations involves taking all the constants to be zero.
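
In symbols: if $v$ belongs to a vector space $V$, then closure under scalar multiplication forces

$ 0 \cdot v = \mathbf{0} \in V, $

and in a span the zero vector appears as the combination with every coefficient equal to zero,

$ 0\,v_1 + 0\,v_2 + \cdots + 0\,v_k = \mathbf{0}. $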

Question from a student:

How do I start problem 6 (reducing the matrix) on page 301? I understand that the row and column space will be the same, but I'm not sure how to get it to reduce to row-echelon form.

Answer from Bell:

You'll need to do row reduction. At some point you'll get

 1  1    a
 0  a-1  1-a
 0  1-a  1-a^2

At this point, you'll need to consider the case a=1. If a=1, the matrix is all 1's and the row space is just the linear span of

[ 1 1 1 ].

If a is not 1, then you can divide row two by a-1 and continue doing row reduction. I think you'll have one more case to consider when you try to deal with the third row after that.
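
Here is a sketch of how that continuation might look, working only from the intermediate matrix above and assuming a is not 1: divide row two by a-1, then add a-1 times the new row two to row three:

$ \begin{bmatrix} 1 & 1 & a \\ 0 & 1 & -1 \\ 0 & 1-a & 1-a^2 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & 1 & a \\ 0 & 1 & -1 \\ 0 & 0 & (1-a)(a+2) \end{bmatrix}. $

The entry $(1-a)(a+2)$ is zero exactly when $a=-2$ (since $a\ne 1$ here), which would be the extra case to consider for the third row.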

Question from a student:

Also, what are you looking for as an answer to questions 22-25 (a full proof, an explanation, an example, etc.)?

Answer from Bell:

For problems 22-25, I'd want you to explain the answer so that someone who has just read 7.4 would understand you and believe what you say. For example, the answer to #25 might start:

If the row vectors of a square matrix are linearly independent, then Rank(A) is equal to the number of rows. But Rank(A)=Rank(A^T). Etc.
