From: JEMebius on 4 Jun 2010 17:18

Kaba wrote:
> Hi,
>
> This is part of the first question for chapter 1 of the "Applied Numerical
> Linear Algebra" book, but I just can't come up with a solution:
>
> If A and B are orthogonal matrices, and det(A) = -det(B), show that
> A + B is singular.
>
> Any hints?

Beautiful theorem - I guess there exists a great purely geometrical proof.
I think that the book mentioned may be great and interesting too.
Could you please tell me who the author and the publisher are?

Thanks in advance - Johan E. Mebius
From: Kaba on 4 Jun 2010 17:04

JEMebius wrote:
> Kaba wrote:
> > Hi,
> >
> > This is part of the first question for chapter 1 of the "Applied Numerical
> > Linear Algebra" book, but I just can't come up with a solution:
> >
> > If A and B are orthogonal matrices, and det(A) = -det(B), show that
> > A + B is singular.
> >
> > Any hints?
>
> Beautiful theorem - I guess there exists a great purely geometrical proof.
> I think that the book mentioned may be great and interesting too.
> Could you please tell me who the author and the publisher are?

Hi, it is:

"Applied Numerical Linear Algebra", James W. Demmel:
http://www.amazon.com/Applied-Numerical-Linear-Algebra-Demmel/dp/0898713897

Our university uses it as the course book for a course in numerical linear
algebra.

--
http://kaba.hilvi.org
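A possible hint, plus a quick numerical check (a sketch only, not anything
taken from Demmel's book): since A is orthogonal, A + B = A(I + A^T B), so
det(A + B) = det(A) det(I + A^T B). Likewise A + B = B(B^T A + I), and
det(B^T A + I) = det((A^T B + I)^T) = det(I + A^T B), so
det(A + B) = det(B) det(I + A^T B) as well. Subtracting the two expressions
gives (det(A) - det(B)) det(I + A^T B) = 0, and det(A) - det(B) = 2 det(A) = +-2
is nonzero, so det(I + A^T B) = 0 and A + B is singular. The NumPy snippet
below only illustrates the statement on random orthogonal matrices; the helper
random_orthogonal and the dimension n = 5 are choices made for this sketch.

import numpy as np

def random_orthogonal(n, rng):
    # Q from the QR factorization of a Gaussian matrix is orthogonal.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

rng = np.random.default_rng(0)
n = 5
A = random_orthogonal(n, rng)
B = random_orthogonal(n, rng)

# Arrange det(B) = -det(A): flipping one row of an orthogonal matrix
# keeps it orthogonal and flips the sign of its determinant.
if np.sign(np.linalg.det(A)) == np.sign(np.linalg.det(B)):
    B[0, :] *= -1.0

print("det(A)     =", np.linalg.det(A))       # close to +1 or -1
print("det(B)     =", np.linalg.det(B))       # opposite sign
print("det(A + B) =", np.linalg.det(A + B))   # close to 0
print("smallest singular value of A + B =",
      np.linalg.svd(A + B, compute_uv=False)[-1])

The smallest singular value is printed alongside the determinant because it is
a more reliable numerical indicator of (near-)singularity than det(A + B)
alone.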