Empty
 one year ago
does \(A^3+B^3=C^3\) have any solutions as nxn invertible matrices with integer entries?

This Question is Closed

ganeshie8
 one year ago
1x1 matrices are out due to Fermat's last theorem

ganeshie8
 one year ago
oh wait, trivial solutions with 0, 1, 1 etc. are allowed, right?

Empty
 one year ago
I don't know, does FLT allow negative integers? I would prefer only positive integers, but I don't care; I just really thought this would be a fun extension. :P

Empty
 one year ago
My first thought is to diagonalize these, which might essentially turn the problem into FLT in each of the entries, but I think that proof would only show that similar matrices can't obey FLT for matrices. So what about nonsimilar matrices?

ganeshie8
 one year ago
brute force spat out this http://www.wolframalpha.com/input/?i=%7B%7B1%2C3%7D%2C%7B0%2C1%7D%7D%5E3%2B%7B%7B1%2C0%7D%2C%7B1%2C1%7D%7D%5E3%3D%7B%7B0%2C3%7D%2C%7B1%2C0%7D%7D%5E3

Empty
 one year ago
More formally: suppose X, Y, Z are diagonal matrices similar to A, B, and C via the same P: \[A=PXP^{-1}\]\[B=PYP^{-1}\]\[C=PZP^{-1}\] Then \[A^n+B^n=C^n\]\[PX^n P^{-1}+PY^nP^{-1}=PZ^nP^{-1}\]\[P(X^n +Y^n)P^{-1}=PZ^nP^{-1}\]\[X^n +Y^n=Z^n\] which is Fermat's last theorem in each of the diagonal entries, and that has been proven to have no solutions, so there is no such solution of this form in matrices. There may still exist solutions among nonsimilar matrices. Hmm.

ganeshie8
 one year ago
does that mean the eigenvectors need to be different?

anonymous
 one year ago
Can you be certain that when you diagonalize them, their surrounding matrices will be the same?

Empty
 one year ago
@ganeshie8 Yeah, the eigenvalues must be different. @wio No, you can't, which is why this question isn't over yet haha.

ganeshie8
 one year ago
looks like there are many solutions in 2x2

ganeshie8
 one year ago
infinitely many, it seems...

anonymous
 one year ago
Wait, if two matrices have the same eigenvalues, does that mean they'll have the same surrounding matrices?

anonymous
 one year ago
I guess I can't completely remember all the theorems about eigenvectors

anonymous
 one year ago
Is there something else motivating this question?

Empty
 one year ago
Hmmm wait, yeah, I guess I am not quite right. I am thinking of matrices that have the same eigenvalues being similar, not eigenvectors like @ganeshie8 is saying, since the eigenvectors make up P and P inverse.

anonymous
 one year ago
Oh right, so \(P\) is a matrix of eigenvectors and the diagonal matrix is just the eigenvalues.

Empty
 one year ago
No, nothing is really motivating this other than that I wanted to know whether FLT applies to matrices as well. I think it came up the other day when talking about the dual numbers, but I forget why; I'll probably go back and revisit it after this, though.

anonymous
 one year ago
The invertibility restriction does rule out a lot of matrices, though

ganeshie8
 one year ago
We found a few solutions, so clearly FLT is not applicable to matrices.

Empty
 one year ago
What solutions did we find?

Empty
 one year ago
Ahhh ok, I missed it. Do you mind if we restrict ourselves to only positive integer entries?

anonymous
 one year ago
The following matrix: \[ A = [a_{i,j}] = \begin{cases} 1 & i+1\equiv j\pmod n\\ 0 & \text{otherwise} \end{cases}\]which looks like this in the \(3\times3\) case: \[ A = \begin{bmatrix} 0&1&0 \\ 0&0&1 \\ 1&0&0 \end{bmatrix} \]has the property \(A^n = I\). If you make the bottom-left entry \(s\), then you have \(A^n = sI\).
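This construction is easy to sanity-check numerically. Below is a minimal sketch in plain Python with exact integer arithmetic; the helper names (`matmul`, `matpow`, `cyclic`, `identity`) are mine, not from the thread:

```python
def matmul(A, B):
    # Multiply two n x n integer matrices given as lists of lists.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A raised to the p-th power by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

def cyclic(n, s=1):
    # 1s on the superdiagonal (i+1 = j), plus s in the bottom-left corner.
    A = [[0] * n for _ in range(n)]
    for i in range(n - 1):
        A[i][i + 1] = 1
    A[n - 1][0] = s
    return A

def identity(n, c=1):
    # c times the n x n identity matrix.
    return [[c * int(i == j) for j in range(n)] for i in range(n)]

assert cyclic(3) == [[0, 1, 0], [0, 0, 1], [1, 0, 0]]  # the 3x3 example above
assert matpow(cyclic(3), 3) == identity(3)             # A^3 = I
assert matpow(cyclic(3, 5), 3) == identity(3, 5)       # A^3 = sI with s = 5
```

The same check passes for other sizes, e.g. `matpow(cyclic(4, 7), 4) == identity(4, 7)`.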

ganeshie8
 one year ago
so 0 is also not allowed?

Empty
 one year ago
Hmm, good question, I don't know. I think having zero as an entry should be allowed as long as the matrix is invertible, but the zero matrix itself shouldn't be, for obvious trivial reasons: \(0^3+A^3=A^3\), for instance.

anonymous
 one year ago
\[ A^3 + A^3 = 2I \]

anonymous
 one year ago
Hmmm, maybe that doesn't help

anonymous
 one year ago
\[ (A_7)^3 + (A_1)^3 = 8I = (2I)^3 \]
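Writing \(A_s\) for the 3×3 cyclic matrix with bottom-left entry \(s\), this identity can be verified directly; a minimal sketch (helper names are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A**p by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

def add(A, B):
    # Entrywise sum of two equal-size matrices.
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

def cyclic(s):
    # 3x3: superdiagonal 1s plus s in the bottom-left.
    return [[0, 1, 0], [0, 0, 1], [s, 0, 0]]

A7, A1 = cyclic(7), cyclic(1)
twoI = [[2, 0, 0], [0, 2, 0], [0, 0, 2]]

# A7^3 + A1^3 = 7I + I = 8I = (2I)^3, and all three matrices are invertible.
assert add(matpow(A7, 3), matpow(A1, 3)) == matpow(twoI, 3)
assert matpow(twoI, 3) == [[8, 0, 0], [0, 8, 0], [0, 0, 8]]
```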

Empty
 one year ago
No, I think you're right, this solves it: FLT doesn't apply. Awesome!

ganeshie8
 one year ago
Nice! That shows FLT doesn't apply for power 3

ganeshie8
 one year ago
or powers of the form \(3k\)

Empty
 one year ago
Oh true, but I think we can extend this to arbitrary powers, maybe by just adding an extra column and row of all 0s with a 1 on the new diagonal entry.

ganeshie8
 one year ago
you're talking about matrix dimensions?

anonymous
 one year ago
You can define \(A_{n,k}\) as follows: \[ \begin{bmatrix} 0&I_{n-1}\\ k&0 \end{bmatrix} \]And \((A_{n,k})^n = kI_n\).
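The general claim \((A_{n,k})^n = kI_n\) can be spot-checked across several sizes; a quick sketch (helper names are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A**p by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

def A(n, k):
    # Block form [[0, I_{n-1}], [k, 0]]: superdiagonal 1s, bottom-left k.
    M = [[0] * n for _ in range(n)]
    for i in range(n - 1):
        M[i][i + 1] = 1
    M[n - 1][0] = k
    return M

def scaled_identity(n, c):
    return [[c * int(i == j) for j in range(n)] for i in range(n)]

# (A_{n,k})^n = k I_n for a range of dimensions and values of k.
for n in (2, 3, 4, 5):
    for k in (1, 2, 7):
        assert matpow(A(n, k), n) == scaled_identity(n, k)
```

This is the companion matrix of \(x^n - k\), which is one way to see why the identity holds.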

Empty
 one year ago
Wait, yeah, sorry, I was off. I guess what we really showed is that this is true for 3x3 matrices with power 3. But really we need arbitrary-dimension matrices for arbitrary powers, and I accidentally went back to thinking of it the other way.

Empty
 one year ago
Interesting, how did you find this solution, @wio?

anonymous
 one year ago
When \(k=1\), it's just a permutation which shifts everything to the right. You need \(n\) shifts to get back to the identity matrix. Fortunately the multiplication by \(k\) only gets applied to each row once.

anonymous
 one year ago
Make that \(n-1\) shifts

anonymous
 one year ago
Likewise, if you have \((A_{n,k})^2\), it's a permutation matrix that shifts twice for each multiplication. So it will take \(n/2-1\) shifts to get to the identity, I believe. Obviously you need even dimensions for this to get you back to the identity.

Empty
 one year ago
Nice, I really like this. The only problem I have with it is that it tells us there's always a specific matrix size for every power, but maybe there are exceptions where the powers don't match the dimension of the matrix. The only reason this seems important to me is that we can consider the 1x1 case as an exception where it fails for powers 3 and higher, so there could be others out there with special powers and dimensions that follow some pattern.

anonymous
 one year ago
Whoops, I keep making little mistakes. Anyway, \[ \large [(A_{n,k})^p]^{n/p} = (A_{n,k})^n = kI \]If we let \(n/p=3\) then \(n=3p\), so we can use this method for \(3m\times 3m\) matrices. We would still need a method for the other cases.

Empty
 one year ago
Ok here's the table I'd like to sort of fill out for me to feel satisfied the problem is solved haha. $$A_{n \times n}^p+B_{n \times n}^p=C_{n \times n}^p$$ \begin{array}{c|c} \hline n & p \\ \hline 1 & p \ge 3 \\ \hline 2 & \\ \hline 3 & \\ \hline 4 & \\ \hline \end{array}

anonymous
 one year ago
Hold on, we know it is false when \(n=1\).

anonymous
 one year ago
I've only shown it for cases \(n=3p\).

Empty
 one year ago
Only for p>2, since we have Pythagorean triples.

anonymous
 one year ago
I found another interesting matrix. Let \(L_{n,k}\) be an \(n\times n\) matrix of all zeros except the bottom-left entry, which is \(k\). \[ (L_{n,k}+I_n)^n = nL_{n,k}+I_n \]I'm wondering if this could have good uses.
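This identity follows from \(L^2 = 0\) (the binomial expansion kills every term past the linear one), and it checks out numerically; a minimal sketch (helper names are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A**p by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

n, k = 4, 5
L = [[0] * n for _ in range(n)]
L[n - 1][0] = k                                   # bottom-left entry is k
I = [[int(i == j) for j in range(n)] for i in range(n)]

assert matmul(L, L) == [[0] * n for _ in range(n)]  # L^2 = 0

LpI = [[L[i][j] + I[i][j] for j in range(n)] for i in range(n)]
expected = [[n * L[i][j] + I[i][j] for j in range(n)] for i in range(n)]
assert matpow(LpI, n) == expected                 # (L + I)^n = nL + I
```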

Empty
 one year ago
Interesting, I think this is good; that can definitely help us out, I think.

anonymous
 one year ago
You could also write that as: \[ (L_{n,k}+I_n)^n = L_{n,nk}+I_n \]

Empty
 one year ago
Do we have to place the entry at the bottom left, or can it be in any arbitrary location as long as it's not on the main diagonal?

anonymous
 one year ago
It doesn't have to be \(n\) either. \[ (L_{n,k}+I_n)^p = L_{n,pk}+I_n \]

anonymous
 one year ago
It can be anywhere, but the corners are the only places we can expect it to work.

ganeshie8
 one year ago
\[\begin{bmatrix}1&0\\k&1\end{bmatrix}^n=\begin{bmatrix}1&0\\nk&1\end{bmatrix}\]
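This 2×2 special case is a one-line check; a quick sketch (the `matmul`/`matpow` helpers are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A**p by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

k, n = 3, 5
# [[1,0],[k,1]]^n = [[1,0],[n*k,1]]: the unipotent lower-triangular case.
assert matpow([[1, 0], [k, 1]], n) == [[1, 0], [n * k, 1]]
```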

anonymous
 one year ago
Yeah, basically that's the 2D case

anonymous
 one year ago
The only issue is that \(L^{2} = 0\), and \(L\) isn't invertible, so we can't use it as a candidate.

Empty
 one year ago
No, but we can make invertible matrices out of noninvertible ones by adding them, so they're not useless.

Empty
 one year ago
In fact, any nxn matrix of all zeroes and one single entry off the diagonal squares to 0. The proof is to look at the square in summation notation: $$ (A^2)_{ik} = \sum_j a_{ij}a_{jk} $$ Every term has a factor of 0 unless the single nonzero entry is used twice, which would force \(a_{ij} = a_{ji}\), i.e. put the entry on the diagonal, which we have excluded.
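This claim is easy to spot-check exhaustively for a small size; a minimal sketch (helper names are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def single_entry(n, i, j, v):
    # n x n zero matrix with v at position (i, j).
    E = [[0] * n for _ in range(n)]
    E[i][j] = v
    return E

def is_zero(M):
    return all(x == 0 for row in M for x in row)

# Every off-diagonal position squares to the zero matrix...
assert all(is_zero(matmul(single_entry(4, i, j, 7), single_entry(4, i, j, 7)))
           for i in range(4) for j in range(4) if i != j)
# ...but a diagonal entry does not.
assert not is_zero(matmul(single_entry(4, 2, 2, 7), single_entry(4, 2, 2, 7)))
```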

anonymous
 one year ago
Found another interesting matrix. Suppose you reverse the rows of the identity matrix, so \(T_{ij} = \delta_{i,\,n-j+1}\). For the \(3\times 3\) case we have: \[ T = \begin{bmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{bmatrix} \]It is always the case that \(T^2=I\).
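\(T^2 = I\) can be checked for several sizes at once; a quick sketch (helper names are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def exchange(n):
    # Identity with its rows reversed: a 1 at each position (i, n-1-i).
    return [[int(j == n - 1 - i) for j in range(n)] for i in range(n)]

# T is its own inverse in every dimension we try.
for n in (2, 3, 4, 5):
    I = [[int(i == j) for j in range(n)] for i in range(n)]
    assert matmul(exchange(n), exchange(n)) == I
```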

anonymous
 one year ago
This matrix flips the elements

Empty
 one year ago
Ahhh yeah this is called the exchange matrix.

anonymous
 one year ago
For cases where the dimension is even, we can have the bottom-half entries be \(k\), and it will result in \(kI\) when squared.

anonymous
 one year ago
Otherwise, having all the entries be \(k\) gives \(k^2I\), obviously

anonymous
 one year ago
Okay, I found a really interesting matrix, but I've not completely formalized how to construct it. I think what you do is start with the \(A_{n,1}\) matrix, and then you swap the \(n\)th row with the \(m\)th row. I think with this matrix, we'll call it \(B\), you have: \[ B^m = I \]

anonymous
 one year ago
Not sure when it is certain to work.

Empty
 one year ago
The whole reasoning behind proving that all nxn zero matrices with a single off-diagonal element square to 0 is that it allows us to separate a single matrix into a diagonal matrix in a linear combination with a lot of other matrices that square to 0, which makes it easier to simplify powers, I think. Hmmmm, I think I like your reasoning better, though; it looks like it's going to take us somewhere. This is a tricky thing to try to prove.

anonymous
 one year ago
I am thinking you should always be able to find a permutation matrix such that \(P^m = I\). The shift case is just where \(m=n-1\), where \(n\) is the dimension
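One concrete way to get \(P^m = I\) is to put an \(m\)-cycle in the top-left block and fix the remaining rows. This construction is my guess at what's meant, not from the thread; a minimal sketch:

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A**p by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

def cycle_perm(n, m):
    # Permutation matrix cycling the first m indices, fixing the rest.
    P = [[0] * n for _ in range(n)]
    for i in range(m):
        P[i][(i + 1) % m] = 1
    for i in range(m, n):
        P[i][i] = 1
    return P

n, m = 5, 3
I = [[int(i == j) for j in range(n)] for i in range(n)]
assert matpow(cycle_perm(n, m), m) == I                 # order divides m
assert matpow(cycle_perm(n, m), 1) != I                 # and is exactly m
assert matpow(cycle_perm(n, m), 2) != I
```

This gives a \(P\) of order exactly \(m\) for any \(m \le n\).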

anonymous
 one year ago
I just am not sure how to construct it

Empty
 one year ago
I think you're right, because the permutation matrices essentially correspond to modular arithmetic, so I think we can actually use Fermat's little theorem to show this. I'm not entirely sure how to construct them, but I am certain we can, which might be all we need.

anonymous
 one year ago
Matrix multiplication is associative, right?

Empty
 one year ago
Yeah you better believe it. It's just not commutative in general.

Empty
 one year ago
Something I'm reading right now: http://math.stackexchange.com/questions/625140/afermatsfltlooklikeformatrices

UsukiDoll
 one year ago
Matrix multiplication isn't commutative because, in general, \[AB\neq BA\]: the result of AB doesn't equal the result of BA

Empty
 one year ago
If A and B are 2D rotation matrices they'll commute, for instance, or if the transformations are 'orthogonal' to each other; for instance, stretching and rotation are orthogonal. These are all captured more easily by the complex-number form of these matrices: \[re^{i \theta}\]

anonymous
 one year ago
Are you allowing non-integer matrices?

ganeshie8
 one year ago
below seem to work for all powers http://www.wolframalpha.com/input/?i=Table%5B%7B%7B%7B1%2C0%7D%2C%7B0%2C1%7D%7D%7B%7B0%2C1%7D%2C%7B1%2C0%7D%7D%7D%5En%2B%7B%7B%7B2%2C0%7D%2C%7B0%2C1%7D%7D%7B%7B0%2C1%7D%2C%7B1%2C0%7D%7D%7D%5En%7B%7B%7B3%2C0%7D%2C%7B0%2C1%7D%7D%7B%7B0%2C1%7D%2C%7B1%2C0%7D%7D%7D%5En%2C%7Bn%2C1%2C10%7D%5D

ganeshie8
 one year ago
wolfram must be lying, that doesn't look correct

Empty
 one year ago
\(\color{blue}{\text{Originally Posted by}}\) @wio Are you allowing non-integer matrices? \(\color{blue}{\text{End of Quote}}\) No, I was just sort of adding a note for @UsukiDoll to add to her comment about 2x2 matrices and commutativity.

Empty
 one year ago
@ganeshie8 Hmmmm.... I don't know, I guess I'll just have to check them in MATLAB or something. All I have is the online compiler for it, though. Weird.

ganeshie8
 one year ago
I'm trying to construct the solutions using the method in that MSE post http://gyazo.com/31615b52acb858db5875d66eaec0d715

anonymous
 one year ago
I've been thinking of something... Note matrices like: \[ I = \begin{bmatrix} 1 \\ 2 \\ \vdots \\ n \end{bmatrix} \]where the number represents the column which has a \(1\), and all else is \(0\). The shift matrix mentioned earlier is expressed as: \[ R = \begin{bmatrix} 2 \\ 3 \\ \vdots \\ n \\ 1 \end{bmatrix} = \begin{bmatrix} 1+1 \\ 2+1 \\ \vdots \\ n+1 \end{bmatrix} \]Obviously, when a number is \(>n\), we just subtract \(n\) to get its actual value. This allows us to denote permutation matrices.

anonymous
 one year ago
I put \(R\) because it shifts right. We get a shift left by doing \(R^{n-1} = L\). Notice that\[ R^k = \begin{bmatrix} 1+k \\ 2+k \\ \vdots \\ n+ k \end{bmatrix} \]

Empty
 one year ago
I see, essentially we're cycling through with these matrices.

anonymous
 one year ago
We can have a swapping matrix, which always swaps the \(k\)th row with the \(1\)st row: \[ S_{k} = \begin{bmatrix} k \\ 2 \\ \vdots \\ k-1 \\ 1 \\ \vdots \\ n \end{bmatrix} \]We can swap the \(k\)th row with the \(j\)th row by using \(S_kS_jS_k\).

anonymous
 one year ago
It turns out that \(R = S_2S_3\dots S_n\).

anonymous
 one year ago
So we need to find a permutation that takes 3 cycles to return to the identity

Empty
 one year ago
Well, we can represent every permutation matrix by permuting the columns of the identity matrix, so as long as we have an nxn matrix we can shift the elements to make the permutation matrix that cycles by 1. Since it's the permutation matrix on n elements, we know its nth power will be the identity. It's weird and fascinating, because permuting the permutation matrix is the permutation we are doing simultaneously. I found it amazing when I learned this.

anonymous
 one year ago
The nice thing about these swap matrices is that it doesn't matter how large the matrix is. We can just ignore the larger dimensions and focus on the top row.

anonymous
 one year ago
So, for example, \((S_2)^2 = I\), whereas \((S_2S_3)^3 = (S_3S_2)(S_2S_3)=I \)

anonymous
 one year ago
And I believe that \((S_2S_3\ldots S_m)^m = I\).

anonymous
 one year ago
You are basically making a shift matrix out of the first m rows, leaving the rest as is.

Empty
 one year ago
I see, so m<n and all of these are really nxn matrices?

Empty
 one year ago
Ahhh ok, right of course

anonymous
 one year ago
However, permuting this around, I'm not sure what happens with non-one values.

anonymous
 one year ago
Suppose we have \(k^m = p\). Then we can use a matrix of this form: \[ \begin{bmatrix} A_{m,p} & 0 \\ 0 & kI_{n-m} \end{bmatrix}^m =\begin{bmatrix} pI_{m} & 0 \\ 0 & pI_{n-m} \end{bmatrix}= pI \]
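For example, with \(m=3\), \(k=2\) (so \(p=8\)) inside a 5×5 matrix, the block construction checks out; a minimal sketch (helper names are mine):

```python
def matmul(A, B):
    # Multiply two n x n integer matrices.
    n = len(A)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(n)]
            for i in range(n)]

def matpow(A, p):
    # A**p by repeated multiplication, p >= 1.
    R = A
    for _ in range(p - 1):
        R = matmul(R, A)
    return R

def block(m, p, k, n):
    # Block diagonal of A_{m,p} (superdiagonal 1s, bottom-left p) and k*I_{n-m}.
    M = [[0] * n for _ in range(n)]
    for i in range(m - 1):
        M[i][i + 1] = 1
    M[m - 1][0] = p
    for i in range(m, n):
        M[i][i] = k
    return M

m, k, n = 3, 2, 5
p = k ** m                      # p = 8, so k^m = p as required
M = block(m, p, k, n)

# M^m = pI: the top block gives pI_m, the bottom block gives (kI)^m = pI_{n-m}.
assert matpow(M, m) == [[p * int(i == j) for j in range(n)] for i in range(n)]
```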

Empty
 one year ago
I'm pretty confident that a permutation matrix that has within it an m-cycle will equal the identity matrix when raised to the m-th power.

anonymous
 one year ago
Now, for a \(2\times 2\) matrix, we have a problem. It's only going to have 2-cycle permutations. So we won't be able to find one for odd cases.

anonymous
 one year ago
So far we have \(m\leq n\) and \(2 \mid m\) as conditions.

Empty
 one year ago
A small example of the intuition I have when thinking of raising a 6x6 matrix with a 4-cycle to the 4th power: (drawing not preserved)