Revision as of 11:30, 8 September 2013 by Jayling (Talk | contribs)


Homework 3 collaboration area

MA527 Fall 2013


Question from James Down Under (Jayling):

For Page 329 Question 11. Am I meant to calculate all eigenvalues and eigenvectors or just calculate the eigenvector corresponding to the given eigenvalue of 3?

Answer from Steve Bell :

Yes, you are only supposed to find the eigenvector for lambda=3. (The idea here is to spare you from finding the roots of a rather nasty 3rd degree polynomial.)

Jayling: thanks Steve, I did try the hard way first but then started to drown in the algebra.


Question from a student:

Let 3x + 4y + 2z = 0; 2x + 5z = 0 be the system for which I have to find a basis.

When row reduced, the above system gives [ 1 0 2.5 0 ; 0 1 -1.375 0 ].

Rank = number of nonzero rows = 2, so dim(row space) = 2; nullity = number of free variables = 1.

Q1: Aren't [ 1 0 2.5 ] and [ 0 1 -1.375 ] called a basis of the system?

A1 from Steve Bell:

Those two vectors form a basis for the ROW SPACE.

The solution space is only 1 dimensional (since the number of free variables is only 1).

Q2: Why is it that we get a basis by treating the free variable as some "parameter" and reducing further (getting 1 vector in this case)? Isn't that the solution of the system?

A2 from Steve Bell :

If the system row reduces to

[ 1 0  2.5   0 ]
[ 0 1 -1.375 0 ]

then z is the free variable. Let it be t. The top equation gives

x = -2.5 t

and the second equation gives

y = 1.375 t

and of course,

z = t.

So the general solution is

[ x ]   [ -2.5   ]
[ y ] = [  1.375 ] t
[ z ]   [  1     ]

Thus, you can find the solution from the row echelon matrix, but I wouldn't say that you can read it off from there -- not without practice, at least.
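Steve's reading of the general solution can be checked numerically. A quick sketch with NumPy, using the original system from the question:

```python
import numpy as np

# Coefficient matrix of the original system: 3x + 4y + 2z = 0, 2x + 5z = 0
A = np.array([[3.0, 4.0, 2.0],
              [2.0, 0.0, 5.0]])

# Basis vector of the solution space read off above (take t = 1)
v = np.array([-2.5, 1.375, 1.0])

# v solves the homogeneous system, so A @ v is the zero vector
print(A @ v)                        # -> [0. 0.]

# Rank 2 with 3 unknowns => nullity 1, matching the single free variable
print(np.linalg.matrix_rank(A))     # -> 2
```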


Question from a student :

On problem 11, I swapped rows 1 and 2 during row reduction and my final solution has x1 and x2 swapped. Do I need to swap back any row swaps or did I make a mistake along the way? Tlouvar

Eun Young discussed this issue in a way that is slightly beyond the scope of our course, so I've moved her remark here:

Remark from Eun Young

Remark from Steve Bell :

Step 1: Find the eigenvalues from det(A - lambda I)=0.

Step 2: Choose an eigenvalue lambda and plug it into the system

(A - lambda I) a = 0

and solve the system for the eigenvector a. Swapping rows does not change the answer, so you are safe here.

Sometimes you might think you are swapping entries of a vector when you are really multiplying by -1. For example, if [1, -1] is an eigenvector, then so is [-1, 1].
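Here is a small sketch of the point that row swaps don't change the eigenvector. The matrix from p.329 #11 isn't reproduced in the thread, so this uses a made-up 2x2 matrix that happens to have eigenvalue 3:

```python
import sympy as sp

# Hypothetical matrix (NOT the one from p.329 #11) with eigenvalue lambda = 3
A = sp.Matrix([[2, 1],
               [1, 2]])
lam = 3

M = A - lam * sp.eye(2)                           # (A - lambda I)
M_swapped = sp.Matrix.vstack(M.row(1), M.row(0))  # same system, rows swapped

# Both systems have the same null space, hence the same eigenvector
print(M.nullspace())            # -> [Matrix([[1], [1]])]
print(M_swapped.nullspace())    # -> [Matrix([[1], [1]])]
```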



Question from Dalec

For #2 on page 351, I found the spectrum to be lambda = 2i and lambda = -i. For the case lambda = 2i, I am trying to find the eigenvectors, and I get the matrix

[ -i 1+i  |   0]
[ -1+i  -2i  |   0]

Is there a way to get a 0 in the bottom left, or is this system simply overconstrained?

- Chris

Suggestions from Shawn Whitman

In one step: multiply row 1 by (1+i) and add to row 2.

In two easier steps: multiply row 1 by i,

[  1     -1+i ]
[ -1+i   -2i  ]

then multiply row 1 by (1-i) and add it to row 2:

[ 1   -1+i ]
[ 0    0   ]
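Shawn's two steps can be verified symbolically; a sketch using SymPy's exact complex arithmetic on the matrix from Chris's question:

```python
import sympy as sp

i = sp.I
# The coefficient matrix (A - 2i I) from Chris's question
M = sp.Matrix([[-i, 1 + i],
               [-1 + i, -2*i]])

M[0, :] = (i * M[0, :]).expand()                   # step 1: multiply row 1 by i
M[1, :] = (M[1, :] + (1 - i) * M[0, :]).expand()   # step 2: add (1-i) * row 1 to row 2

print(M)    # -> Matrix([[1, -1 + I], [0, 0]])
```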


I have questions about determinants. For a homogeneous system, a nonzero determinant gives only the trivial solution, while a zero determinant gives infinitely many solutions. For a non-homogeneous system, a nonzero determinant gives exactly one solution.

1. What happens if a non-homogeneous system has a zero determinant?

2. From the determinant of a non-homogeneous system, can we tell when the system has no solution?

- Farhan

Suggestion from Ryan Russon

Here is what I understand:

For question 1) If we are thinking of a system of equations, then by looking at the determinant we are only looking at the left-hand side (LHS) of the system. If the determinant is zero, one or more of the equations are dependent on the others; said differently, one or more of those expressions can be formed by combining the other expressions on the LHS. This also means that a non-homogeneous system built from that LHS may have more than one way to combine the expressions to reach the desired right-hand side (i.e. $ \bar{x} $ is not unique). Now if the expanded system looks like:

[1  4  1  | 4] 
[0  2  0  | 1] 
[0  0  0  | 3] 

where the last row makes the statement "0 = 3", the system is inconsistent and has no solution.

For 2) From the determinant alone, it is not possible to tell whether the system has no solution. If the determinant is zero, the system may have infinitely many solutions or it may be inconsistent (no solution at all).

Please others chime in and correct me if I am flawed in my thinking.
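Ryan's two cases can be illustrated with a 2x2 example: the same singular coefficient matrix paired with two right-hand sides, one consistent and one not. A sketch using the rank criterion (a system is solvable iff rank(A) equals the rank of the augmented matrix):

```python
import numpy as np

# Singular coefficient matrix: the second row is twice the first
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(A))          # determinant is 0 (singular)

b_many = np.array([3.0, 6.0])    # consistent: infinitely many solutions
b_none = np.array([3.0, 7.0])    # elimination yields "0 = 1": no solution

# Solvable iff rank(A) == rank([A | b]); the determinant alone can't distinguish
for b in (b_many, b_none):
    aug = np.column_stack([A, b])
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(aug))
# -> 1 1   (infinitely many solutions)
#    1 2   (no solution)
```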




Question from Ryan Russon:

About p. 338, #3, 6, and 8: are we supposed to be finding eigenvectors here? I noticed that they put them in the back of the book, although the problems only ask for the spectrum of each matrix, which was defined in 8.1 as the set of eigenvalues. I understand that we are using Thms 1-5 to prove our results, and it seems like #3 doesn't require finding eigenvectors to prove that it isn't any of the listed types of matrices. I hope I am not way off-base here. Thanks!

Follow-up question: On p. 338, #6 Are we only to consider $ A \in \mathbb{R}^{n \times n} $ or are we to consider complex matrices as well? Thanks again!


Response from Jake Eppehimer:

I found that #8 is orthogonal, by Theorem 5. It took quite a bit of manipulation with trig identities, but I believe my answer is reasonable. For #6, I am not exactly sure how to find the eigenvalues; I am considering substituting a couple of prime numbers for k and a, but I am unsure whether that is the correct approach. The problem doesn't say anything about eigenvectors, and you don't need them to determine what kind of matrix it is.

Response from Ryan Russon

Thanks Jake. I found the eigenvalues for #6 to be $ \lambda_{1} = a , \lambda_{2} = a+k , \lambda_{3} = a-k $ by using a cofactor expansion, which wasn't too bad. And I think I am a little brain dead today, as I can answer my own follow-up: we are obviously not considering $ A \in \mathbb{C}^{n \times n} $, because we are talking about symmetric, skew-symmetric, and orthogonal matrices, which are classes of real-valued matrices.
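For what it's worth, a spectrum like that can be checked symbolically. The matrix below is only a guess at the general form (a on the diagonal, k in two symmetric off-diagonal spots that produce that spectrum); the book's actual matrix for p.338 #6 is not reproduced in this thread:

```python
import sympy as sp

a, k = sp.symbols('a k', real=True)

# Hypothetical symmetric matrix; an assumed form, not copied from the book
A = sp.Matrix([[a, 0, k],
               [0, a, 0],
               [k, 0, a]])

# The spectrum is the set of eigenvalue keys (values are multiplicities)
print(A.eigenvals())    # eigenvalues a, a - k, a + k, each with multiplicity 1
```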

Response from Jayling: Ryan, I was confused by the definition of spectrum also, but Steve did state in the last lecture that it is the set of all eigenvalues of A, and the index of the text confirms this. In summary, there is no need to calculate eigenvectors if the question only asks for the spectrum.






Back to MA527, Fall 2013
