From: Schoenfeld on 26 Feb 2010 19:12 On Feb 26, 4:55 pm, mpalenik <markpale...(a)gmail.com> wrote: > On Feb 25, 8:26 pm, Schoenfeld <schoenfeld.fore...(a)gmail.com> wrote: > > > > > On Feb 25, 11:39 pm, mpalenik <markpale...(a)gmail.com> wrote: > > > > On Feb 25, 2:57 am, Schoenfeld <schoenfeld.fore...(a)gmail.com> wrote: > > > > > The statement A.B = INVARIANT means A.B = A'.B' which necessarily > > > > means B is a vector as it can be transformed to B'. > > > > I just want to emphasize this one more time, the way the problem in > > > Jackson is phrased, the existence of B' does not on its own mean that > > > B is a 4-vector. B' is constructed by frame dependent measurements, > > > in this case (w, kx, ky, kz). In this case B *is* a 4-vector but it > > > could easily have been something like (E, x, y, m) which does not > > > transform as a 4-vector (i.e. (E', x', y', m) is not (E, x, y, m) > > > under a coordinate transformation). We are using the invariance of > > > A*B to PROVE that B transforms as a 4-vector. > > > > I agree that this is completely valid but the OP's question was "is > > > the only possible way that A*B can be invariant if B is a 4-vector." > > > The answer is yes--but although A*B DOES give an invariant when B is a > > > 4-vector, it is not clear by this definition that the ONLY way for A*B > > > to be invariant is for B to be a 4-vector. > > > Okay. So reading all that, I look at question as: > > > QUESTION > > "If A.B = INVARIANT SCALAR and A is an invariant vector, prove B MUST > > be an invariant vector". By invariant vector, we mean covariant or > > contravariant which is acceptable usage of word invariant. > > > PROOF > > > We know that A.B = A'.B', so in tensor form > > > [1] A'^r B'_r = A^r B_r > > > We chose A as contravariant, so now we must show that given [1], B is > > covariant for all B. > > > So we know that A' is contravariant a priori > > > [2] A'^r = A^j (@x'^r/@x^j) > > > Substitute [2] into LHS of [1] > > > [3] A'^r B'_r = A^j (@x'^r/@x^j) B'_r > > > Substituting covariant B'_r = B_j (@x^j/@x'r) in RHS of [3] > > > [4] A'^r B'_r = A^j (@x'^r/@x^j) B_j (@x^j/@x'r) > > > Simplifying > > > [5] A'^r B'_r = A^j B_j KroneckaDelta^r_r > > > Thus recover equation [1] > > > [6] A'^r B'_r = A^j B_j > > > [7] So in step 4 we assumed B was covariant we recovered equation [1] > > as the jacobian elements cancelled out to a contraction of the > > KroneckaDelta tensor. > > > The question is now was the ONLY POSSIBLE way to recover [1] the > > subsitution made in [4]? And yes, it is based on the uniqueness of the > > inverse. > > > For example > > > A*B = A*B*10*C implies that C = 1/10 and only 1/10 > > > So observe that > > > A^j B_j = A^j B_j (@x'^r/@x^j) C^j_r > > > Uniquely implies 'C' cancel out the previous term such that > > > A'^r B'_r = A^j B_j (@x'^r/@x^j) (@x^j/@x'r) = A^j B_j > > > But rather than cancelling it, just rewrite it as > > > A'^r B'_r = A^j (@x'^r/@x^j) B_j (@x^j/@x'r) > > > and you can clearly see that given [1], B' is UNIQUELY > > > B'_r = B_j (@x^j/@x'r) > > > which is covariant. > > > QED > > > If you chose A_r as covariant then the above would show B is uniquely > > contravariant.- Hide quoted text - > > > - Show quoted text - > > And hence we have a more general version of the exact same thing I > wrote several posts ago, where I used the Lorentz matrices (written as > L) instead of general Jacobian matrices, since the problem dealt > specifically with flat, Minkowski spacetime. 
Actually you wrote it more concisely, whereas I kind of fumbled around until I got it. Still, though, I prefer tensors: they are easier to work with in the end (the notation takes longer to write out, but there are fewer operations) and the results apply generally.
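For what it's worth, the proof is easy to spot-check numerically. A rough sketch (the boost and the components of A and B below are just made-up illustrative numbers, nothing more): if B picks up the inverse Jacobian, B'_r = B_j (@x^j/@x'^r), the contraction A^r B_r comes out the same in both frames.

import numpy as np

# Illustrative boost along x with beta = 0.6; the components of A and B are arbitrary.
beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)
L = np.array([[ gamma,       -gamma*beta, 0.0, 0.0],
              [-gamma*beta,   gamma,      0.0, 0.0],
              [ 0.0,          0.0,        1.0, 0.0],
              [ 0.0,          0.0,        0.0, 1.0]])   # plays the role of @x'^r/@x^j

A = np.array([2.0, 1.0, 0.5, -0.3])    # contravariant components A^j
B = np.array([1.5, -0.7, 0.2, 0.9])    # covariant components B_j

A_prime = L @ A                        # A'^r = (@x'^r/@x^j) A^j
B_prime = np.linalg.inv(L).T @ B       # B'_r = (@x^j/@x'^r) B_j, the inverse Jacobian
                                       # (transposed so it acts on the other index)

print(A @ B, A_prime @ B_prime)        # prints the same number twice

Swap in a different rule for B_prime (say L @ B) and the two printed numbers no longer match.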
From: mpalenik on 26 Feb 2010 19:51 On Feb 26, 7:12 pm, Schoenfeld <schoenfeld.fore...(a)gmail.com> wrote: > On Feb 26, 4:55 pm, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > On Feb 25, 8:26 pm, Schoenfeld <schoenfeld.fore...(a)gmail.com> wrote: > > > > On Feb 25, 11:39 pm, mpalenik <markpale...(a)gmail.com> wrote: > > > > > On Feb 25, 2:57 am, Schoenfeld <schoenfeld.fore...(a)gmail.com> wrote: > > > > > > The statement A.B = INVARIANT means A.B = A'.B' which necessarily > > > > > means B is a vector as it can be transformed to B'. > > > > > I just want to emphasize this one more time, the way the problem in > > > > Jackson is phrased, the existence of B' does not on its own mean that > > > > B is a 4-vector. B' is constructed by frame dependent measurements, > > > > in this case (w, kx, ky, kz). In this case B *is* a 4-vector but it > > > > could easily have been something like (E, x, y, m) which does not > > > > transform as a 4-vector (i.e. (E', x', y', m) is not (E, x, y, m) > > > > under a coordinate transformation). We are using the invariance of > > > > A*B to PROVE that B transforms as a 4-vector. > > > > > I agree that this is completely valid but the OP's question was "is > > > > the only possible way that A*B can be invariant if B is a 4-vector." > > > > The answer is yes--but although A*B DOES give an invariant when B is a > > > > 4-vector, it is not clear by this definition that the ONLY way for A*B > > > > to be invariant is for B to be a 4-vector. > > > > Okay. So reading all that, I look at question as: > > > > QUESTION > > > "If A.B = INVARIANT SCALAR and A is an invariant vector, prove B MUST > > > be an invariant vector". By invariant vector, we mean covariant or > > > contravariant which is acceptable usage of word invariant. > > > > PROOF > > > > We know that A.B = A'.B', so in tensor form > > > > [1] A'^r B'_r = A^r B_r > > > > We chose A as contravariant, so now we must show that given [1], B is > > > covariant for all B. > > > > So we know that A' is contravariant a priori > > > > [2] A'^r = A^j (@x'^r/@x^j) > > > > Substitute [2] into LHS of [1] > > > > [3] A'^r B'_r = A^j (@x'^r/@x^j) B'_r > > > > Substituting covariant B'_r = B_j (@x^j/@x'r) in RHS of [3] > > > > [4] A'^r B'_r = A^j (@x'^r/@x^j) B_j (@x^j/@x'r) > > > > Simplifying > > > > [5] A'^r B'_r = A^j B_j KroneckaDelta^r_r > > > > Thus recover equation [1] > > > > [6] A'^r B'_r = A^j B_j > > > > [7] So in step 4 we assumed B was covariant we recovered equation [1] > > > as the jacobian elements cancelled out to a contraction of the > > > KroneckaDelta tensor. > > > > The question is now was the ONLY POSSIBLE way to recover [1] the > > > subsitution made in [4]? And yes, it is based on the uniqueness of the > > > inverse. > > > > For example > > > > A*B = A*B*10*C implies that C = 1/10 and only 1/10 > > > > So observe that > > > > A^j B_j = A^j B_j (@x'^r/@x^j) C^j_r > > > > Uniquely implies 'C' cancel out the previous term such that > > > > A'^r B'_r = A^j B_j (@x'^r/@x^j) (@x^j/@x'r) = A^j B_j > > > > But rather than cancelling it, just rewrite it as > > > > A'^r B'_r = A^j (@x'^r/@x^j) B_j (@x^j/@x'r) > > > > and you can clearly see that given [1], B' is UNIQUELY > > > > B'_r = B_j (@x^j/@x'r) > > > > which is covariant. 
> > > > QED > > > > If you chose A_r as covariant then the above would show B is uniquely > > > contravariant.- Hide quoted text - > > > > - Show quoted text - > > > And hence we have a more general version of the exact same thing I > > wrote several posts ago, where I used the Lorentz matrices (written as > > L) instead of general Jacobian matrices, since the problem dealt > > specifically with flat, Minkowski spacetime. > > Actually you wrote it more concisely, whereas I kind of fumbled around > until I got it. Still though, Tensors are preferred for me as they are > easier to work with in the end (takes longer but fewer operations) and > the results apply generally.- Hide quoted text - > > - Show quoted text - Fair enough. I like using tensors too, but I don't like writing out the indices on an ASCII newsgroup. Also, I think it actually does make things more clear when you can see which indices match up with each other.
From: blackhead on 1 Mar 2010 20:22 On 18 Feb, 21:08, mpalenik <markpale...(a)gmail.com> wrote: > On Feb 18, 6:46 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > On 17 Feb, 15:39, mpalenik <markpale...(a)gmail.com> wrote: > > > > On Feb 17, 10:02 am, mpalenik <markpale...(a)gmail.com> wrote: > > > > > On Feb 17, 8:52 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > On 17 Feb, 12:47, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > On Feb 17, 6:10 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > > On 16 Feb, 22:57, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > > > On Feb 16, 5:32 pm, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > > > > The scalar product of 2 4-vectors is an invariant. However, Page 530 > > > > > > > > > of Jackson's Electrodynamics makes the claim that because the phase of > > > > > > > > > a wave is an invariant and given by the scalar product of a 4 vector > > > > > > > > > with (w/c, K), then the latter is a 4 vector. > > > > > > > > > > Is this generally true? > > > > > > > > > > . > > > > > > > > > Yes, it is. The inner product of two 4 vectors is a scalar, which > > > > > > > > should be invariant in any frame. Typically, you would show that A*A > > > > > > > > is invariant in any frame but it suffices to show that it's invariant > > > > > > > > when you take the product with another 4 vector. > > > > > > > > But if the scalar product of a 4 vector with 4 numbers is a scalar, > > > > > > > does that imply those 4 numbers are the components of a 4 vector? > > > > > > > You can arrange any 4 numbers into a vector and get a scalar when you > > > > > > take the scalar product with a 4 vector. That's why it's called a > > > > > > scalar product. > > > > > > Suppose these 4 numbers transform in a certain way under a coordinate > > > > > transformation, so that their scalar product with a 4-vector is a > > > > > scalar invariant. Must the 4 numbers transform as a 4-vector? > > > > > Yes, that's what I was trying to say in the first response. > > > > If you want a more complete answer, let's say we know A is a 4 vector > > > but we're unsure about B. > > > > in the first frame, we have A*B > > > > Now, let's transform into another frame, where we have A' and B' > > > > A transforms like a 4 vector, so we know that A' = LA > > > > So LA*B' = A*B = (L*L^-1)A*B > > > Seems clear to me. > > > > from this, we can see B' = L^-1B > > > You've lost me. > > > > In order to take a an inner product, if A is a vector, B must be a one- > > > form (or vector and co-vector, or covariant and contravariant vectors, > > > whatever terminology you use). The transformation rule for one-forms > > > is B' = (L^-1)B > > > > Therefore, B transforms like a one-form. So it is a 4-vector as well.- Hide quoted text - > > > > - Show quoted text -- Hide quoted text - > > > - Show quoted text -- Hide quoted text - > > > - Show quoted text - > > Well, since A' = LA and A*B = LA*B', then LA*B' = (L*L^-1)*A*B = > (L^-1)A'*B = A'*B' OK >so B' = (L^-1)B- Hide quoted text - And this is where I feel stupid, because I don't see the logical step. Are you using some property of the LT other than L^-1 L(A) = A? If not, then it seems your proof applies to any transformation that has an inverse. > - Show quoted text -
From: mpalenik on 1 Mar 2010 20:48 On Mar 1, 8:22 pm, blackhead <larryhar...(a)softhome.net> wrote: > On 18 Feb, 21:08, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > On Feb 18, 6:46 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > On 17 Feb, 15:39, mpalenik <markpale...(a)gmail.com> wrote: > > > > > On Feb 17, 10:02 am, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > On Feb 17, 8:52 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > On 17 Feb, 12:47, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > > On Feb 17, 6:10 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > > > On 16 Feb, 22:57, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > > > > On Feb 16, 5:32 pm, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > > > > > The scalar product of 2 4-vectors is an invariant. However, Page 530 > > > > > > > > > > of Jackson's Electrodynamics makes the claim that because the phase of > > > > > > > > > > a wave is an invariant and given by the scalar product of a 4 vector > > > > > > > > > > with (w/c, K), then the latter is a 4 vector. > > > > > > > > > > > Is this generally true? > > > > > > > > > > > . > > > > > > > > > > Yes, it is. The inner product of two 4 vectors is a scalar, which > > > > > > > > > should be invariant in any frame. Typically, you would show that A*A > > > > > > > > > is invariant in any frame but it suffices to show that it's invariant > > > > > > > > > when you take the product with another 4 vector. > > > > > > > > > But if the scalar product of a 4 vector with 4 numbers is a scalar, > > > > > > > > does that imply those 4 numbers are the components of a 4 vector? > > > > > > > > You can arrange any 4 numbers into a vector and get a scalar when you > > > > > > > take the scalar product with a 4 vector. That's why it's called a > > > > > > > scalar product. > > > > > > > Suppose these 4 numbers transform in a certain way under a coordinate > > > > > > transformation, so that their scalar product with a 4-vector is a > > > > > > scalar invariant. Must the 4 numbers transform as a 4-vector? > > > > > > Yes, that's what I was trying to say in the first response. > > > > > If you want a more complete answer, let's say we know A is a 4 vector > > > > but we're unsure about B. > > > > > in the first frame, we have A*B > > > > > Now, let's transform into another frame, where we have A' and B' > > > > > A transforms like a 4 vector, so we know that A' = LA > > > > > So LA*B' = A*B = (L*L^-1)A*B > > > > Seems clear to me. > > > > > from this, we can see B' = L^-1B > > > > You've lost me. > > > > > In order to take a an inner product, if A is a vector, B must be a one- > > > > form (or vector and co-vector, or covariant and contravariant vectors, > > > > whatever terminology you use). The transformation rule for one-forms > > > > is B' = (L^-1)B > > > > > Therefore, B transforms like a one-form. So it is a 4-vector as well.- Hide quoted text - > > > > > - Show quoted text -- Hide quoted text - > > > > - Show quoted text -- Hide quoted text - > > > > - Show quoted text - > > > Well, since A' = LA and A*B = LA*B', then LA*B' = (L*L^-1)*A*B = > > (L^-1)A'*B = A'*B' > > OK > > >so B' = (L^-1)B- Hide quoted text - > > And this is where I feel stupid, because I don't see the logical step. > Are you using some property of the LT other than L^-1 L(A) = A? > > If not, then it seems your proof applies to any transformation that > has an inverse. > > It does, although I wasn't thinking of that at the time. 
The transformation matrices for one-forms are the inverses of those for vectors, so this isn't unique to the Lorentz matrices. If L is the transformation for a vector, L^-1 will always give the transformation for one-forms (contracted on the other index). I was using the fact that we know this is true for Lorentz matrices, but it is not specific to them.
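To make that concrete, here is a sketch with an arbitrary invertible matrix standing in for the Jacobian (the matrix and the components are just random made-up numbers): pair the vector rule with the inverse rule for the one-form and the contraction is unchanged, with nothing Lorentz-specific anywhere.

import numpy as np

rng = np.random.default_rng(0)

# Any invertible 4x4 matrix will do; nothing here is special to Lorentz boosts.
L = rng.normal(size=(4, 4))
assert abs(np.linalg.det(L)) > 1e-6    # make sure L^-1 exists

A = rng.normal(size=4)                 # vector components
B = rng.normal(size=4)                 # one-form components

A_prime = L @ A                        # vector rule: A' = L A
B_prime = np.linalg.inv(L).T @ B       # one-form rule: the inverse of L, acting on the other index

print(np.allclose(A @ B, A_prime @ B_prime))   # True: the contraction is frame-independent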
From: Schoenfeld on 2 Mar 2010 10:11
On Mar 2, 11:22 am, blackhead <larryhar...(a)softhome.net> wrote: > On 18 Feb, 21:08, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > On Feb 18, 6:46 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > On 17 Feb, 15:39, mpalenik <markpale...(a)gmail.com> wrote: > > > > > On Feb 17, 10:02 am, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > On Feb 17, 8:52 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > On 17 Feb, 12:47, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > > On Feb 17, 6:10 am, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > > > On 16 Feb, 22:57, mpalenik <markpale...(a)gmail.com> wrote: > > > > > > > > > > On Feb 16, 5:32 pm, blackhead <larryhar...(a)softhome.net> wrote: > > > > > > > > > > > The scalar product of 2 4-vectors is an invariant. However, Page 530 > > > > > > > > > > of Jackson's Electrodynamics makes the claim that because the phase of > > > > > > > > > > a wave is an invariant and given by the scalar product of a 4 vector > > > > > > > > > > with (w/c, K), then the latter is a 4 vector. > > > > > > > > > > > Is this generally true? > > > > > > > > > > > . > > > > > > > > > > Yes, it is. The inner product of two 4 vectors is a scalar, which > > > > > > > > > should be invariant in any frame. Typically, you would show that A*A > > > > > > > > > is invariant in any frame but it suffices to show that it's invariant > > > > > > > > > when you take the product with another 4 vector. > > > > > > > > > But if the scalar product of a 4 vector with 4 numbers is a scalar, > > > > > > > > does that imply those 4 numbers are the components of a 4 vector? > > > > > > > > You can arrange any 4 numbers into a vector and get a scalar when you > > > > > > > take the scalar product with a 4 vector. That's why it's called a > > > > > > > scalar product. > > > > > > > Suppose these 4 numbers transform in a certain way under a coordinate > > > > > > transformation, so that their scalar product with a 4-vector is a > > > > > > scalar invariant. Must the 4 numbers transform as a 4-vector? > > > > > > Yes, that's what I was trying to say in the first response. > > > > > If you want a more complete answer, let's say we know A is a 4 vector > > > > but we're unsure about B. > > > > > in the first frame, we have A*B > > > > > Now, let's transform into another frame, where we have A' and B' > > > > > A transforms like a 4 vector, so we know that A' = LA > > > > > So LA*B' = A*B = (L*L^-1)A*B > > > > Seems clear to me. > > > > > from this, we can see B' = L^-1B > > > > You've lost me. > > > > > In order to take a an inner product, if A is a vector, B must be a one- > > > > form (or vector and co-vector, or covariant and contravariant vectors, > > > > whatever terminology you use). The transformation rule for one-forms > > > > is B' = (L^-1)B > > > > > Therefore, B transforms like a one-form. So it is a 4-vector as well.- Hide quoted text - > > > > > - Show quoted text -- Hide quoted text - > > > > - Show quoted text -- Hide quoted text - > > > > - Show quoted text - > > > Well, since A' = LA and A*B = LA*B', then LA*B' = (L*L^-1)*A*B = > > (L^-1)A'*B = A'*B' > > OK > > >so B' = (L^-1)B- Hide quoted text - > > And this is where I feel stupid, because I don't see the logical step. > Are you using some property of the LT other than L^-1 L(A) = A? You only need to know that [1] A'.B' = A.B and [2] A' = TA Start with [3] A'.B' = A'.B' Substitute [2] into the RHS of [3] to get [4] A'.B' = (TA) . 
B' Which implies that [5] B' = T^-1 B since substitution of [5] into the RHS of [4] gives back [1] [6] A'.B' = (TA).(T^-1 B) = (T T^-1) A.B = A.B Also, since a coordinate transformation is defined as [7] T is a differentiable bijective function it follows that [8] T^-1 exists and is unique, which makes [5] the unique solution. > If not, then it seems your proof applies to any transformation that > has an inverse. A coordinate transformation is simply a differentiable bijective function. By definition, every coordinate transformation T has a unique inverse transformation T^-1.
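If it helps, the uniqueness is easy to check numerically as well. A sketch (using an arbitrary made-up invertible T rather than a Lorentz matrix, which also shows that only the existence of T^-1 matters here): imposing A'.B' = A.B for four independent choices of A fixes B' completely, and the unique solution is the inverse-transformed B.

import numpy as np

rng = np.random.default_rng(1)

T = rng.normal(size=(4, 4))            # an arbitrary invertible transformation (only invertibility is used)
assert abs(np.linalg.det(T)) > 1e-6
B = rng.normal(size=4)                 # the untransformed components B_j

# Demand (T A).B' = A.B for the four basis vectors as the choices of A.
# Each choice gives one linear equation for the four unknown components of B'.
A_choices = np.eye(4)
M = np.array([T @ A for A in A_choices])       # rows are (T A) for each choice of A
rhs = np.array([A @ B for A in A_choices])     # the invariant A.B for each choice

B_prime = np.linalg.solve(M, rhs)      # the one and only B' satisfying all four conditions

# It coincides with the inverse-Jacobian (one-form) transform of B:
print(np.allclose(B_prime, np.linalg.inv(T).T @ B))   # True

(In plain matrix language the inverse acts through its transpose, which is the same statement as B'_r = B_j (@x^j/@x'^r) in index form.)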