From: Kaba on 4 Jun 2010 16:10

achille wrote:
> On Jun 4, 9:48 pm, Kaba <n...(a)here.com> wrote:
> > Hi,
> >
> > This is part of the first question for chapter 1 of the "Applied
> > Numerical Linear Algebra" book, but I just can't come up with a
> > solution:
> >
> > If A and B are orthogonal matrices, and det(A) = -det(B), show that
> > A + B is singular.
> >
> > Any hints?
> >
> > -- http://kaba.hilvi.org
>
> Hint: A^t (A+B) B^t = B^t + A^t, and take det(.) on both sides.

Well, that's straightforward, thanks :) I am not sure this is the best
kind of exercise, at least when separated from any context. It seems to
be just a trick, with no deeper lesson to learn. But maybe it is used
somewhere in the following pages.

--
http://kaba.hilvi.org
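[For the record, achille's hint can be carried through as follows; a sketch, using only orthogonality (A^t A = I, B B^t = I), det(M^t) = det(M), and multiplicativity of det:]

```latex
\begin{align*}
A^{\mathsf T}(A+B)B^{\mathsf T} &= B^{\mathsf T} + A^{\mathsf T} = (A+B)^{\mathsf T}, \\
\det(A)\,\det(A+B)\,\det(B) &= \det\!\bigl((A+B)^{\mathsf T}\bigr) = \det(A+B), \\
-\det(A+B) &= \det(A+B)
  \qquad\text{since } \det(A)\det(B) = -1, \\
\det(A+B) &= 0,
\end{align*}
```

so A + B is singular.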
From: Stephen Montgomery-Smith on 5 Jun 2010 01:10

Kaba wrote:
> Hi,
>
> This is part of the first question for chapter 1 of the "Applied
> Numerical Linear Algebra" book, but I just can't come up with a
> solution:
>
> If A and B are orthogonal matrices, and det(A) = -det(B), show that
> A + B is singular.
>
> Any hints?

Here is my attempt. If A is orthogonal, then its determinant is either
1 or -1. WLOG det(A) = 1. Now

A + B = A(I + A^{-1} B),

so WLOG A = I, and B is an orthogonal matrix whose determinant is -1.
The eigenvalues of B are complex numbers with absolute value 1. The
non-real eigenvalues come in conjugate pairs, and each such pair has
product 1. Since the product of all the eigenvalues is det(B) = -1,
B must have at least one eigenvalue equal to -1. Hence I + B is
singular.
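[A quick numerical sanity check of the claim; a sketch using NumPy, with random orthogonal matrices drawn via QR factorization of Gaussian matrices, which is one standard way to generate them:]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random orthogonal matrices from QR factorization of Gaussian matrices.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
B, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Force det(B) = -det(A) by flipping one column of B if necessary.
if np.sign(np.linalg.det(A)) == np.sign(np.linalg.det(B)):
    B[:, 0] = -B[:, 0]

print(np.linalg.det(A), np.linalg.det(B))  # +/-1 with opposite signs
print(np.linalg.det(A + B))                # ~0: A + B is singular
```

Repeating this for other seeds and dimensions gives det(A + B) on the order of machine epsilon every time.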
From: Ostap Bender on 5 Jun 2010 01:22

On Jun 4, 6:48 am, Kaba <n...(a)here.com> wrote:
> Hi,
>
> This is part of the first question for chapter 1 of the "Applied
> Numerical Linear Algebra" book, but I just can't come up with a
> solution:
>
> If A and B are orthogonal matrices, and det(A) = -det(B), show that
> A + B is singular.

Look up the theorem that says that a square matrix X is singular iff
det(X) = 0. But

det(A + B) = det(A) + det(B) = -det(B) + det(B) = 0
From: Ostap Bender on 5 Jun 2010 01:26

On Jun 4, 10:22 pm, Ostap Bender <ostap_bender_1...(a)hotmail.com> wrote:
> Look up the theorem that says that a square matrix X is singular iff
> det(X) = 0.
>
> But det(A + B) = det A + det B = -det(B) + det(B) = 0

Never mind: the determinant is not a linear function. Sorry.
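[Indeed, det is multiplicative but not additive. A minimal counterexample, my illustration rather than part of the thread:]

```python
import numpy as np

# det is not linear: det(A) + det(B) generally differs from det(A + B).
A = np.eye(2)
B = np.eye(2)

print(np.linalg.det(A) + np.linalg.det(B))  # 1 + 1 = 2
print(np.linalg.det(A + B))                 # det(2I) = 2^2 = 4
```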
From: Kaba on 5 Jun 2010 07:23

Stephen Montgomery-Smith wrote:
> Here is my attempt. If A is orthogonal, then the determinant is either
> 1 or minus 1. WLOG det(A) = 1. Now
>
> A + B = A(I + A^{-1} B)
>
> so WLOG A = I, and B is an orthogonal matrix whose determinant is -1.

Could you be more specific about why WLOG here?

--
http://kaba.hilvi.org
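[For completeness, the reduction behind both "WLOG" steps can be spelled out; a sketch, not part of the original thread:]

```latex
Since $\det(A) \neq 0$,
\[
  \det(A+B) = \det(A)\,\det\bigl(I + A^{-1}B\bigr),
\]
so $A + B$ is singular iff $I + C$ is singular, where
$C = A^{-1}B = A^{\mathsf T}B$. As a product of orthogonal matrices,
$C$ is orthogonal, and
\[
  \det(C) = \det(A)^{-1}\det(B) = -1
  \quad (\text{assuming } \det(A) = 1),
\]
which is exactly the case $A = I$ treated in the argument above. If
instead $\det(A) = -1$, swap the roles of $A$ and $B$.
```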