thomas5267
  • thomas5267
Can someone explain this proof? http://math.stackexchange.com/questions/554957/center-of-the-orthogonal-group-and-special-orthogonal-group
Mathematics
Kainui
  • Kainui
Well I guess you answered your own question, haha
thomas5267
  • thomas5267
Why does \(A=EAE^{-1}\) imply all the diagonal entries are the same?
thomas5267
  • thomas5267
In fact, what does it mean to right-multiply by an elementary matrix with two rows switched?

thomas5267
  • thomas5267
\(E^T=E^{-1}\) for every row-switching elementary matrix \(E\), if that helps.
Kainui
  • Kainui
Well, it looks like you're really just taking the identity matrix and flipping two rows or columns. So when you transpose a matrix where you've flipped two columns, it's the same as if you had flipped two rows. The difference between left and right multiplying, I believe, is that right multiplying flips the columns and left multiplying flips the rows. I haven't really played with this sort of stuff in a while, but intuitively I'm familiar with the concept of what you're trying to prove, since you can geometrically think of these as rotation matrices. But that's not quite a proof, I understand.
thomas5267
  • thomas5267
\[ E=\begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}\\ A=\begin{bmatrix} a_{1,1} & a_{1,2} & a_{1,3} & a_{1,4} \\ a_{2,1} & a_{2,2} & a_{2,3} & a_{2,4} \\ a_{3,1} & a_{3,2} & a_{3,3} & a_{3,4} \\ a_{4,1} & a_{4,2} & a_{4,3} & a_{4,4} \end{bmatrix}\\ A=EAE^{-1}=EAE=\begin{bmatrix} a_{3,3} & a_{3,2} & a_{3,1} & a_{3,4} \\ a_{2,3} & a_{2,2} & a_{2,1} & a_{2,4} \\ a_{1,3} & a_{1,2} & a_{1,1} & a_{1,4} \\ a_{4,3} & a_{4,2} & a_{4,1} & a_{4,4} \end{bmatrix} \] The diagonal entries are permuted.
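A quick numerical sanity check of the computation above (a minimal sketch of my own in NumPy, not part of the thread): conjugating a generic \(4\times4\) matrix by this \(E\) swaps the first and third diagonal entries, exactly as shown.

```python
import numpy as np

# Row-switching elementary matrix that swaps rows/columns 1 and 3 (0-indexed 0 and 2)
E = np.eye(4)
E[[0, 2]] = E[[2, 0]]

A = np.arange(1, 17, dtype=float).reshape(4, 4)  # a generic 4x4 matrix

conj = E @ A @ np.linalg.inv(E)  # E A E^{-1}; here E^{-1} = E

print(np.diag(A))     # [ 1.  6. 11. 16.]
print(np.diag(conj))  # [11.  6.  1. 16.]  -> diagonal entries 1 and 3 are swapped
```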
thomas5267
  • thomas5267
In this case, \(EAE\) is equivalent to exchanging row 1 and row 3 and then exchanging column 1 and column 3.
Kainui
  • Kainui
Right, or you could say it the other way around: it's exchanging columns 1 and 3 and then exchanging rows 1 and 3, since matrix multiplication is associative.
ganeshie8
  • ganeshie8
By considering \(AD=DA\), it is established first that \(A\) is a diagonal matrix: \[A=\begin{bmatrix} a&0\\0&d\end{bmatrix}\] Next, conjugate this with any row-switching matrix to further establish that the diagonal entries must be the same.
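For the first step, a small symbolic sketch of my own (taking \(D=\operatorname{diag}(1,-1)\) as one concrete choice of diagonal orthogonal matrix; the thread does not pin \(D\) down here): requiring \(AD=DA\) kills the off-diagonal entries of a generic \(2\times2\) matrix.

```python
import sympy as sp

# A generic 2x2 matrix and the diagonal orthogonal matrix D = diag(1, -1)
a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
D = sp.diag(1, -1)

print(A * D - D * A)
# Matrix([[0, -2*b], [2*c, 0]])  -> AD = DA forces b = c = 0, so A is diagonal
```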
thomas5267
  • thomas5267
Okay. Let \(D_k\) be the matrix with \(-1\) in the \((k,k)\) entry and otherwise the same as the identity matrix. Note that \(D_k=D_k^{-1}=D_k^T\). Conjugation of \(A\) by \(D_k\) multiplies column \(k\) of \(A\) by \(-1\) and row \(k\) by \(-1\), and it leaves the diagonal unchanged. For example, \[ D_2=\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}\\ A=D_2AD_2^{-1}=D_2AD_2=\begin{bmatrix}a_{1,1} & -a_{1,2} & a_{1,3} & a_{1,4} \\ -a_{2,1} & a_{2,2} & -a_{2,3} & -a_{2,4} \\ a_{3,1} & -a_{3,2} & a_{3,3} & a_{3,4} \\ a_{4,1} & -a_{4,2} & a_{4,3} & a_{4,4} \end{bmatrix} \] By comparing the entries, we see that \(a_{2,1}=-a_{2,1}\), hence \(a_{2,1}=0\), and so on. By considering all the possible \(D_k\), we see that all entries outside the diagonal are zero.
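A numerical illustration of this step (again a sketch of my own, not from the thread): with a generic \(4\times4\) matrix, the entries whose sign flips under conjugation by \(D_2\) are exactly the off-diagonal entries of row 2 and column 2, so \(A=D_2AD_2\) forces those entries to be zero.

```python
import numpy as np

# D_2: identity with -1 in the (2,2) position (0-indexed (1,1))
D2 = np.eye(4)
D2[1, 1] = -1.0

A = np.arange(1, 17, dtype=float).reshape(4, 4)

conj = D2 @ A @ D2  # D_2 A D_2^{-1}, since D_2 is its own inverse

# Mark the entries whose sign flipped; those must vanish if A = D_2 A D_2
flipped = ~np.isclose(A, conj)
print(flipped.astype(int))
# [[0 1 0 0]
#  [1 0 1 1]
#  [0 1 0 0]
#  [0 1 0 0]]   <- row 2 and column 2, except the diagonal entry
```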
thomas5267
  • thomas5267
I think I understand now why he only considers the row-switching elementary matrices and the matrices with a \(-1\) on the diagonal. Any invertible matrix is generated by a chain of elementary matrices, and these two kinds of elementary matrices are the only ones satisfying \(\det(E)=\pm 1\) and \(E^T=E^{-1}\).
thomas5267
  • thomas5267
Those are the requirements for matrices in the orthogonal group.
ganeshie8
  • ganeshie8
Yes. Since the inverse of a row-switching matrix is itself, we may try this. If \(EA\) switches the rows \(i,j\), then it changes four elements of \(A\): \(A_{ij} = A_{jj}\tag{1}\) \(A_{ji} = A_{ii}\tag{2}\) \(A_{ii} = A_{jj}=0\) Also, \(AE\) switches the columns \(i,j\), so it changes four elements of \(A\): \(A_{ij} = A_{ii}\tag{1'}\) \(A_{ji} = A_{jj}\tag{2'}\) \(A_{ii} = A_{jj}=0\) From \((1)\) and \((1')\) it follows that \(A_{jj} = A_{ii}\), because we want \(AE=EA\) for the center.
thomas5267
  • thomas5267
If that is the case, wouldn't we end up with a zero matrix? That seems wrong, as the center of O(n) certainly contains the identity matrix.
ganeshie8
  • ganeshie8
How would you end up with 0?
ganeshie8
  • ganeshie8
Notice that \(i\) and \(j\) are specific rows/columns that are affected by \(E\)
thomas5267
  • thomas5267
But if \(A_{ii}=A_{jj}=0\), we will certainly have a matrix with 0 on the diagonal, right?
ganeshie8
  • ganeshie8
\(i, j\) are the specific two rows that are affected by doing \(EA\).
ganeshie8
  • ganeshie8
Consider a 2x2 example: we must have \(AE-EA = 0\) for \(A\) to be an element of the center: http://www.wolframalpha.com/input/?i=%7B%7Ba%2C0%7D%2C%7B0%2Cd%7D%7D*%7B%7B0%2C1%7D%2C%7B1%2C0%7D%7D-%7B%7B0%2C1%7D%2C%7B1%2C0%7D%7D*%7B%7Ba%2C0%7D%2C%7B0%2Cd%7D%7D As expected, we must have \(a=d\).
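The same check can be written symbolically (my own sketch, mirroring the Wolfram Alpha computation above):

```python
import sympy as sp

a, d = sp.symbols('a d')
A = sp.Matrix([[a, 0], [0, d]])   # diagonal A, as established earlier
E = sp.Matrix([[0, 1], [1, 0]])   # the 2x2 row-switching matrix

# The commutator AE - EA must vanish for A to be in the center
print(A * E - E * A)
# Matrix([[0, a - d], [d - a, 0]])  -> forces a = d
```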
thomas5267
  • thomas5267
I should interpret \(A_{ii}=A_{jj}=0\) as \((EA)_{ii}=(EA)_{jj}=0\) right?
ganeshie8
  • ganeshie8
Nope, as I said, \(i,j\) are the specific two rows in \(A\) that are affected by doing \(EA\). \(EA\) swaps two particular rows of \(A\), for example \(i=2, j=3\).
thomas5267
  • thomas5267
What exactly do you mean by \(A_{ii}=A_{jj}=0\)? Clearly you don't mean \(A_{22}=A_{33}=0\), as that would mean the diagonal entries are zero?
ganeshie8
  • ganeshie8
I mean exactly that: after doing \(EA\), the diagonal elements in rows \(2,3\) become \(0\).
ganeshie8
  • ganeshie8
Of course, \(i\ne j\).
ganeshie8
  • ganeshie8
http://www.wolframalpha.com/input/?i=%7B%7B1%2C0%2C0%7D%2C%7B0%2C0%2C1%7D%2C%7B0%2C1%2C0%7D%7D*%7B%7Ba%2C0%2C0%7D%2C%7B0%2Cb%2C0%7D%2C%7B0%2C0%2Cc%7D%7D
thomas5267
  • thomas5267
So by \(A_{ii}=A_{jj}=0\) you are referring to the elements of \(EA\) instead of \(A\) itself? That is absolutely confusing. So the equalities are more like assignment operators in computer programming?
ganeshie8
  • ganeshie8
Yes, sorry about that. I can't think of a better notation at the moment, because I also don't know much about groups.
thomas5267
  • thomas5267
Okay, I get it now. As \(A_{ii}=A_{jj}\) for all \(i\) and \(j\), and all other entries are 0, \(A\) must be a multiple of the identity matrix. Moreover, as \(A\) is required to be in the orthogonal group, in which every element has determinant 1 or -1, we get \(A_{ii}=A_{jj}=\pm 1\).
thomas5267
  • thomas5267
So the only elements in the center of O(n) are \(\pm I\).
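As a last sanity check (a sketch of my own, not from the thread), among the scalar multiples \(cI\) only \(c=\pm1\) gives an orthogonal matrix:

```python
import numpy as np

n = 4
for c in [1.0, -1.0, 2.0, 0.5]:
    A = c * np.eye(n)
    print(c, np.allclose(A.T @ A, np.eye(n)))  # orthogonality test A^T A = I
# 1.0 True, -1.0 True, 2.0 False, 0.5 False
```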
ganeshie8
  • ganeshie8
looks good to me!
thomas5267
  • thomas5267
I was just trying to find the biggest group inside a certain subset of orthogonal matrices: the orthogonal matrices that are symmetric. As the product of two symmetric matrices A and B is symmetric if and only if they commute, I had to find the center of the orthogonal group. It turns out that the group is almost trivial. \[ AB=(AB)^T=B^TA^T=BA \quad \text{as } A \text{ and } B \text{ are both symmetric.} \]
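A small numerical illustration of that last point (my own sketch; building the commuting pair from shared eigenvectors is just one convenient construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two symmetric matrices with the same eigenvectors commute,
# so their product is again symmetric...
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
B = Q @ np.diag([4.0, 5.0, 6.0]) @ Q.T
print(np.allclose(A @ B, (A @ B).T))  # True

# ...while the product of two generic symmetric matrices is usually not symmetric.
M = rng.standard_normal((3, 3))
C = M + M.T
print(np.allclose(A @ C, (A @ C).T))  # typically False
```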
