From: Rob Johnson on 17 Jun 2010 01:36

In article <MPG.26839c71d4b6b6ae9898e2(a)news.cc.tut.fi>,
Kaba <none(a)here.com> wrote:
>Rob Johnson wrote:
>> In article <MPG.26838c5c17078b909898e1(a)news.cc.tut.fi>,
>> Kaba <none(a)here.com> wrote:
>> >Given a skew-symmetric real (n x n)-matrix A, and a real orthogonal
>> >(n x n)-matrix Q, is it possible to find a real (n x n)-matrix B such
>> >that:
>> >
>> >A = Q^T B - B^T Q
>>
>> Yes, there are several. For example, let U be the upper triangular
>> part of A; then, B = (Q^T)^{-1} U.
>
>Right. Meanwhile I found:
>
>Q^T B can be decomposed into a sum of a skew-symmetric matrix X and a
>symmetric matrix S:
>
>Q^T B = X + S
>
>where
>X = 0.5 [Q^T B - (Q^T B)^T]
>S = 0.5 [Q^T B + (Q^T B)^T]
>
>Therefore
>B = Q(X + S)
>
>Q^T B - B^T Q
>= Q^T Q (X + S) - (X^T + S^T) Q^T Q
>= (X + S) - (X^T + S^T)
>= 2X
>
>For equality to hold:
>A = 2X
>=>
>X = A / 2
>
>Thus:
>
>A = Q^T B - B^T Q
><=>
>B = Q(0.5 A + S)
>
>[]

For any A, if A = V - V^T, then we know V up to a symmetric matrix.
This is because 0 = V - V^T says precisely that V is a symmetric
matrix. Thus, we know B modulo (Q^T)^{-1} S, where S is a symmetric
matrix. Therefore, the general solution is

    B = (Q^T)^{-1} (U + S)

where U is the upper-triangular part of A, and S is any symmetric
matrix (S = 0 in my previous example).

>This is actually related to reproving the conformal affine regression,
>the one you wrote about...

Please send me a link to, or copy of, your proof. Thanks.

Rob Johnson <rob(a)trash.whim.org>
take out the trash before replying
to view any ASCII art, display article in a monospaced font
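[Editor's note: the general solution B = (Q^T)^{-1}(U + S) can be spot-checked numerically. The sketch below is not from the thread; the 2x2 matrices Q, A, S are arbitrary example choices, and the helper functions are minimal ad hoc matrix routines.]

```python
import math

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def T(X):
    return [list(row) for row in zip(*X)]

def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

c, s = math.cos(0.3), math.sin(0.3)
Q = [[c, -s], [s, c]]            # a real orthogonal matrix (rotation)
A = [[0.0, 2.0], [-2.0, 0.0]]    # skew-symmetric
U = [[0.0, 2.0], [0.0, 0.0]]     # upper-triangular part of A, so A = U - U^T
S = [[1.0, 0.5], [0.5, -3.0]]    # an arbitrary symmetric matrix

# general solution: B = (Q^T)^{-1} (U + S) = Q (U + S)
B = mul(Q, add(U, S))
lhs = sub(mul(T(Q), B), mul(T(B), Q))   # Q^T B - B^T Q, should equal A
assert all(abs(lhs[i][j] - A[i][j]) < 1e-9 for i in range(2) for j in range(2))
```

The check works because Q^T B - B^T Q = (U + S) - (U + S)^T = U - U^T = A: the symmetric S cancels against its transpose, which is exactly why any symmetric S may be added.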
From: Kaba on 17 Jun 2010 06:52

Rob Johnson wrote:
> >Thus:
> >
> >A = Q^T B - B^T Q
> ><=>
> >B = Q(0.5 A + S)
> >
> >[]
>
> For any A, if A = V - V^T, then we know V up to a symmetric matrix.
> This is because 0 = V - V^T says precisely that V is a symmetric
> matrix. Thus, we know B modulo (Q^T)^{-1} S, where S is a symmetric
> matrix. Therefore, the general solution is
>
>     B = (Q^T)^{-1} (U + S)
>
> where U is the upper-triangular part of A, and S is any symmetric
> matrix (S = 0 in my previous example).

Sure, the solutions are actually the same. For let

1) B = (Q^T)^{-1} (U + S) = Q(U + S)
2) B' = Q(0.5 A + S')

We know:

A = U - U^T

Thus

B' = Q(0.5 (U - U^T) + S')
   = Q(0.5 (U - U^T) + 0.5 (U + U^T) - 0.5 (U + U^T) + S')
   = Q(U - 0.5 (U + U^T) + S')

We get B' = B, when S' = S + 0.5 (U + U^T).

> >This is actually related to reproving the conformal affine regression,
> >the one you wrote about...
>
> Please send me a link to, or copy of, your proof. Thanks.

I will when I am ready.

--
http://kaba.hilvi.org
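[Editor's note: the equivalence of the two parametrizations can be confirmed numerically. This sketch is not from the thread; the example matrices are arbitrary choices. It checks that with S' = S + 0.5 (U + U^T), the formulas Q(U + S) and Q(0.5 A + S') give the same B.]

```python
import math

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def T(X):
    return [list(row) for row in zip(*X)]

def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sc(a, X):
    return [[a * x for x in row] for row in X]

c, s = math.cos(0.3), math.sin(0.3)
Q = [[c, -s], [s, c]]            # orthogonal
A = [[0.0, 2.0], [-2.0, 0.0]]    # skew-symmetric
U = [[0.0, 2.0], [0.0, 0.0]]     # upper-triangular part of A, A = U - U^T
S = [[1.0, 0.5], [0.5, -3.0]]    # an arbitrary symmetric matrix

Sp = add(S, sc(0.5, add(U, T(U))))    # S' = S + 0.5 (U + U^T)
B  = mul(Q, add(U, S))                # first form:  Q (U + S)
Bp = mul(Q, add(sc(0.5, A), Sp))      # second form: Q (0.5 A + S')
assert all(abs(B[i][j] - Bp[i][j]) < 1e-9 for i in range(2) for j in range(2))
```

Note that 0.5 A + S' = 0.5 (U - U^T) + S + 0.5 (U + U^T) = U + S, so the agreement is exact up to rounding.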
From: Rob Johnson on 17 Jun 2010 08:29

In article <MPG.26842acf670e8f229898e5(a)news.cc.tut.fi>,
Kaba <none(a)here.com> wrote:
>Rob Johnson wrote:
>> >Thus:
>> >
>> >A = Q^T B - B^T Q
>> ><=>
>> >B = Q(0.5 A + S)
>> >
>> >[]
>>
>> For any A, if A = V - V^T, then we know V up to a symmetric matrix.
>> This is because 0 = V - V^T says precisely that V is a symmetric
>> matrix. Thus, we know B modulo (Q^T)^{-1} S, where S is a symmetric
>> matrix. Therefore, the general solution is
>>
>>     B = (Q^T)^{-1} (U + S)
>>
>> where U is the upper-triangular part of A, and S is any symmetric
>> matrix (S = 0 in my previous example).
>
>Sure, the solutions are actually the same. For let
>
>1) B = (Q^T)^{-1} (U + S) = Q(U + S)
>2) B' = Q(0.5 A + S')
>
>We know:
>
>A = U - U^T
>
>Thus
>
>B' = Q(0.5 (U - U^T) + S')
>   = Q(0.5 (U - U^T) + 0.5 (U + U^T) - 0.5 (U + U^T) + S')
>   = Q(U - 0.5 (U + U^T) + S')
>
>We get B' = B, when S' = S + 0.5 (U + U^T).

Duh! I didn't carry through a couple of things. First, when I saw
that Q was orthogonal, I knew it was invertible, but I didn't use the
fact that (Q^T)^{-1} = Q. Second, since U - U^T = A, I didn't make
the connection that U - A/2 = (U + U^T)/2, which is symmetric. I
assumed that the two solutions were the same, but didn't carry things
through. Thanks.

>> >This is actually related to reproving the conformal affine regression,
>> >the one you wrote about...
>>
>> Please send me a link to, or copy of, your proof. Thanks.
>
>I will when I am ready.

I guess that's better than getting an incomplete proof ;-)

Rob Johnson <rob(a)trash.whim.org>
take out the trash before replying
to view any ASCII art, display article in a monospaced font
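[Editor's note: the two observations above are easy to confirm numerically. This sketch is not from the thread; Q and A are arbitrary examples. It checks that Q^T Q = I, hence (Q^T)^{-1} = Q, and that U - A/2 equals (U + U^T)/2 and is symmetric.]

```python
import math

def mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def T(X):
    return [list(row) for row in zip(*X)]

c, s = math.cos(0.3), math.sin(0.3)
Q = [[c, -s], [s, c]]            # orthogonal (a rotation)
A = [[0.0, 2.0], [-2.0, 0.0]]    # skew-symmetric
U = [[0.0, 2.0], [0.0, 0.0]]     # upper-triangular part of A

# Q^T Q = I, hence (Q^T)^{-1} = Q
QtQ = mul(T(Q), Q)
assert all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(2) for j in range(2))

# U - A/2 = (U + U^T)/2, the symmetric part of U
M = [[U[i][j] - 0.5 * A[i][j] for j in range(2)] for i in range(2)]
assert M == [[0.0, 1.0], [1.0, 0.0]]
assert M == T(M)                 # symmetric
```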
From: Kaba on 17 Jun 2010 10:54

Rob Johnson wrote:
> >> >This is actually related to reproving the conformal affine regression,
> >> >the one you wrote about...
> >>
> >> Please send me a link to, or copy of, your proof. Thanks.
> >
> >I will when I am ready.
>
> I guess that's better than getting an incomplete proof ;-)

It is :)

--
http://kaba.hilvi.org
From: Kaba on 17 Jun 2010 22:38

Rob Johnson wrote:
> >> >This is actually related to reproving the conformal affine regression,
> >> >the one you wrote about...
> >>
> >> Please send me a link to, or copy of, your proof. Thanks.
> >
> >I will when I am ready.
>
> I guess that's better than getting an incomplete proof ;-)

Ok, I now have a complete proof/derivation, check the recently
started threads!

--
http://kaba.hilvi.org