## Empty: does $$A^3+B^3=C^3$$ have any solutions as $$n\times n$$ invertible matrices with integer entries?

1. ganeshie8

1x1 matrices are out due to Fermat's last theorem

2. ganeshie8

oh wait, trivial solutions with 0,1,-1 etc are allowed right ?

3. Empty

I don't know, does FLT allow negative integers? I would prefer only positive integers but I don't care just really thought this would be a fun extension. :P

4. Empty

My first thoughts are to diagonalize these, which might end up essentially turning the problem into FLT in each of the entries, but I think this proof only shows that similar matrices can't obey FLT for matrices. So what about nonsimilar matrices?

5. ganeshie8
6. Empty

More formally: $X$, $Y$, $Z$ are diagonal matrices similar to $A$, $B$, and $C$, related by $A=PXP^{-1}$, $B=PYP^{-1}$, $C=PZP^{-1}$. Then $A^n+B^n=C^n$ becomes $PX^n P^{-1}+PY^nP^{-1}=PZ^nP^{-1}$, so $P(X^n +Y^n)P^{-1}=PZ^nP^{-1}$, and hence $X^n +Y^n=Z^n$, which is Fermat's last theorem in each of the diagonal entries, proven to have no solutions, so there is no solution of this form in matrices. There may still exist solutions among matrices that are not all diagonalized by the same $P$. Hmm.

7. ganeshie8

does that mean eigen vectors need to be different ?

8. anonymous

Can you be certain that when you diagonalize them, their surrounding matrices will be the same?

9. Empty

@ganeshie8 Yeah the eigenvalues must be different. @wio No, you can't which is why this question isn't over yet haha.

10. ganeshie8

looks like there are many solutions in 2x2

11. ganeshie8

infinitely many it seems..

12. anonymous

Wait, if two matrices have the same eigen values, does that mean they'll have the same surrounding matrices?

13. anonymous

I guess I can't completely remember all the theorems about eigen vectors

14. anonymous

Is there something else motivating this question?

15. Empty

Hmmm wait yeah I guess I am not quite right. I am thinking of the fact that matrices with the same eigenvalues are similar, not the eigenvectors like @ganeshie8 is saying, since the eigenvectors make up $P$ and $P$ inverse.

16. anonymous

Oh right, so $$P$$ is a matrix of eigenvectors and the diagonal matrix just holds the eigenvalues.

17. Empty

No, nothing is really motivating this other than I wanted to know if FLT also applied to matrices as well or not. I think it came up the other day when talking about the dual numbers but I forget why I thought that, I'll probably go back and visit it after this though.

18. anonymous

The invertible restriction does rule out a lot of matrices though

19. ganeshie8

We found a few solutions, so clearly FLT is not applicable to matrices.

20. Empty

What solutions did we find?

21. ganeshie8

scroll up

22. Empty

Ahhh ok I missed it, I guess do you mind if I restrict ourselves to only positive integer entries?

23. anonymous

The following matrix: $A = [a_{i,j}], \quad a_{i,j} = \begin{cases} 1 & i+1\equiv j\pmod n\\ 0 & \text{otherwise} \end{cases}$Which looks like this in the $$3\times3$$ case: $A = \begin{bmatrix} 0&1&0 \\ 0&0&1 \\ 1&0&0 \end{bmatrix}$Has the property $$A^n = I$$. If you make the bottom left entry be $$s$$, then you have $$A^n = sI$$.
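
A quick numerical check of this claim, using numpy; the helper name `cyclic_shift` is mine, not from the thread.

```python
import numpy as np

def cyclic_shift(n, s=1):
    # n x n matrix with a[i, i+1] = 1 for i < n-1 and the wrap-around
    # bottom-left entry set to s, so that A^n == s * I
    A = np.zeros((n, n), dtype=int)
    for i in range(n - 1):
        A[i, i + 1] = 1
    A[n - 1, 0] = s
    return A

assert (np.linalg.matrix_power(cyclic_shift(3), 3) == np.eye(3, dtype=int)).all()
assert (np.linalg.matrix_power(cyclic_shift(4, 5), 4) == 5 * np.eye(4, dtype=int)).all()
```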

24. ganeshie8

so 0 is also not allowed ?

25. Empty

Hmm. Good question, I don't know. I think having zero as an entry should be allowed as long as the matrix is invertible, but the zero matrix itself shouldn't be, for obvious trivial reasons: $$0^3+A^3=A^3$$ for instance.

26. anonymous

$A^3 + A^3 = 2I$

27. anonymous

Hmmm, maybe that doesn't help

28. anonymous

$(A_7)^3 + (A_1)^3 = 8I = (2I)^3$
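A sketch verifying this identity numerically, with $A_s$ taken as the 3x3 shift matrix with bottom-left entry $s$ from the earlier post:

```python
import numpy as np

def A(s):
    # 3x3 cyclic shift with bottom-left entry s, so A(s)^3 == s * I
    return np.array([[0, 1, 0],
                     [0, 0, 1],
                     [s, 0, 0]])

lhs = np.linalg.matrix_power(A(7), 3) + np.linalg.matrix_power(A(1), 3)
rhs = np.linalg.matrix_power(2 * np.eye(3, dtype=int), 3)   # (2I)^3 == 8I
assert (lhs == rhs).all()
```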

29. Empty

No I think you're right, this solves it, FLT doesn't apply awesome!

30. ganeshie8

Nice! that shows FLT doesn't apply for power 3

31. ganeshie8

or power of form $$3k$$

32. Empty

Oh true, but I think we can extend this to arbitrary powers maybe by just adding an extra column and row with all 0s and a 1 on the new diagonal entry.

33. ganeshie8

you're talking about matrix dimensions ?

34. anonymous

You can define $$A_{n,k}$$ as follows: $\begin{bmatrix} 0&I_{n-1}\\ k&0 \end{bmatrix}$And $$(A_{n,k})^n = kI_n$$.
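A check of the general block definition, again with a helper name of my own choosing:

```python
import numpy as np

def A_nk(n, k):
    # [[0, I_{n-1}], [k, 0]]: identity block top-right, k in the bottom-left
    M = np.zeros((n, n), dtype=int)
    M[:n - 1, 1:] = np.eye(n - 1, dtype=int)
    M[n - 1, 0] = k
    return M

for n in (2, 3, 5):
    assert (np.linalg.matrix_power(A_nk(n, 4), n) == 4 * np.eye(n, dtype=int)).all()
```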

35. Empty

Wait, yeah sorry I was off. I guess what we really showed was that this is true for 3x3 matrices of power 3. But really we need arbitrary dimension matrices for arbitrary powers and I accidentally went to thinking of the other way.

36. Empty

Interesting, how did you find this solution @wio ?

37. anonymous

When $$k=1$$, it's just a permutation which shifts everything to the right. You need $$n$$ shifts to get to the identity matrix. Fortunately the $$k$$ multiplication only gets applied to each row once.

38. anonymous

Make that $$n-1$$ shifts

39. ganeshie8

beautiful!

40. anonymous

Likewise, if you have $$(A_{n,k})^2$$, it's a permutation matrix that shifts twice for each multiplication. So it will take $$n/2-1$$ shifts to get to the identity, I believe. Obviously you need even dimensions for this to get you back to the identity.

41. Empty

Nice, I really like this. The only problem I have with it is that it tells us there's always a specific matrix size for every power, but maybe there are exceptions where powers don't match the dimension of the matrix. The only reason this seems important to me is because we can consider the 1x1 matrix dimension case as being an exception where it fails for power 3 and higher, so there could be others out there with other special powers and dimensions that follow some pattern.

42. anonymous

Whoops, I keep making little mistakes. Anyway, $\large [(A_{n,k})^p]^{n/p} = (A_{n,k})^n = kI$If we let $$n/p=3$$ then $$n=3p$$, so we can do this method for $$3m\times 3m$$ matrices. We would still need a method for other cases.
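A sketch of this for $p=2$, $n=6$: taking $X_k = (A_{6,k})^2$, the earlier $7+1=8=2^3$ trick gives cube solutions in dimension $3p$ (the `A_nk` helper name is mine):

```python
import numpy as np

def A_nk(n, k):
    # [[0, I_{n-1}], [k, 0]], with A_nk(n, k)^n == k * I
    M = np.zeros((n, n), dtype=int)
    M[:n - 1, 1:] = np.eye(n - 1, dtype=int)
    M[n - 1, 0] = k
    return M

p = 2
n = 3 * p                                    # n = 3p, so (A^p)^3 = A^n = k * I
X = lambda k: np.linalg.matrix_power(A_nk(n, k), p)
lhs = np.linalg.matrix_power(X(7), 3) + np.linalg.matrix_power(X(1), 3)
assert (lhs == 8 * np.eye(n, dtype=int)).all()   # 8I == (2I)^3
```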

43. Empty

Ok here's the table I'd like to sorta fill out for me to feel satisfied the problem is solved haha. $$A_{n \times n}^p+B_{n \times n}^p=C_{n \times n}^p$$ \begin{array}{|c|c|} \hline n & p \\ \hline 1 & p \ge 3 \\ \hline 2 & \\ \hline 3 & \\ \hline 4 & \\ \hline \end{array}

44. anonymous

Hold on, we know it is false when $$n=1$$.

45. anonymous

I've only shown it for cases $$n=3p$$.

46. Empty

Only for $$p>2$$, since we have Pythagorean triples.

47. anonymous

I found another interesting matrix. Let $$L_{n,k}$$ be an $$n\times n$$ matrix of all zeros except the bottom left entry is $$k$$. $(L_{n,k}+I_n)^n = nL_{n,k}+I_n$I'm wondering if this could have good uses.

48. Empty

Interesting I think this is good, that can definitely help us out I think.

49. anonymous

You could also write that as: $(L_{n,k}+I_n)^n = L_{n,nk}+I_n$

50. Empty

Do we have to place the entry at the bottom left or can it be any arbitrary location as long as it's not on the main diagonal?

51. anonymous

It doesn't have to be $$n$$ either. $(L_{n,k}+I_n)^p = L_{n,pk}+I_n$
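A check of this general power identity; it follows from $L^2=0$ and the binomial theorem, and the helper name `L` below is mine:

```python
import numpy as np

def L(n, k):
    # All zeros except the bottom-left entry k; L @ L == 0 for n >= 2,
    # so (I + L)^p = I + p*L by the binomial theorem
    M = np.zeros((n, n), dtype=int)
    M[n - 1, 0] = k
    return M

n, k, p = 4, 3, 7
lhs = np.linalg.matrix_power(L(n, k) + np.eye(n, dtype=int), p)
assert (lhs == L(n, p * k) + np.eye(n, dtype=int)).all()
```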

52. anonymous

Can be anywhere, but the corners are the only places we can expect it to work.

53. ganeshie8

$\begin{bmatrix}1&0\\k&1\end{bmatrix}^n=\begin{bmatrix}1&0\\nk&1\end{bmatrix}$

54. anonymous

Yeah, basically that's the 2D case

55. anonymous

The only issue is that $$L^{2} = 0$$, and it isn't invertible, so we can't use it as a candidate.

56. Empty

No, but we can make invertible matrices out of non-invertible ones by adding them, so they're not useless.

57. Empty

In fact any nxn matrix you make with all zeroes and one single off-diagonal entry squares to 0. The proof would be to look at the square in summation notation: $$(A^2)_{ik} = \sum_j a_{ij}a_{jk}$$ For a term to be nonzero, both factors must be the single nonzero entry, say at position $(r,c)$; that forces $j=c$ and $j=r$ at once, i.e. $r=c$, a diagonal entry, which we have excluded.
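A brute-force check of this claim over every off-diagonal position of a 5x5 matrix:

```python
import numpy as np

n = 5
for r in range(n):
    for c in range(n):
        if r == c:
            continue                     # diagonal positions are excluded
        M = np.zeros((n, n), dtype=int)
        M[r, c] = 9                      # single off-diagonal entry
        assert not (M @ M).any()         # M squares to the zero matrix
```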

58. anonymous

Found another interesting matrix. Suppose you reflect the rows of the identity matrix, so something like $$T_{i,j} = \delta_{i,n-j+1}$$. For the $$3\times 3$$ case we have: $T = \begin{bmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{bmatrix}$It is always the case that $$T^2=I$$.

59. anonymous

This matrix flips the elements

60. Empty

Ahhh yeah this is called the exchange matrix.

61. anonymous

For cases where the dimensions are even, we can have the bottom half entries be $$k$$ and it will result in $$kI$$ when squared.

62. anonymous

Otherwise, having all entries as k gives $$k^2I$$, obviously
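A sketch checking both of these anti-diagonal variants in the 4x4 case; `antidiag` is a helper name I've made up:

```python
import numpy as np

def antidiag(entries):
    # Matrix with the given entries on the anti-diagonal, zeros elsewhere
    n = len(entries)
    T = np.zeros((n, n), dtype=int)
    for i, t in enumerate(entries):
        T[i, n - 1 - i] = t
    return T

k = 5
T = antidiag([1, 1, k, k])               # top half 1s, bottom half ks (even n)
assert (T @ T == k * np.eye(4, dtype=int)).all()

Tk = antidiag([k, k, k, k])              # all entries k
assert (Tk @ Tk == k * k * np.eye(4, dtype=int)).all()
```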

63. anonymous

Okay, I found a really interesting matrix, but I've not completely formalized how to construct it. I think what you do is you start with the $$A_{n,1}$$ matrix, and then you swap the $$n$$th row with the $$m$$th row. I think with this matrix, we'll call it $$B$$, you have: $B^m = I$

64. anonymous

Not sure when it is certain to work.

65. Empty

The whole reasoning behind proving that all nxn zero matrices with a single off diagonal element squaring to 0 allows us to separate a single matrix into a diagonal matrix in a linear combination with a lot of other matrices that square to 0, which makes it easier to simplify powers I think. Hmmmm I think I like your reasoning better though it looks like it's going to take us somewhere. This is a tricky thing to try to prove.

66. anonymous

I am thinking you should always be able to find a permutation matrix such that $$P^m = I$$. The shift case is just where $$m=n-1$$, where $$n$$ is the dimension

67. anonymous

I just am not sure how to construct it

68. Empty

I think you're right because the permutation matrices essentially correspond to modular arithmetic so I think we can actually use Fermat's little theorem to show this. I'm not entirely sure how to construct them but I am certain we can which might be all we need.

69. anonymous

Matrix multiplication is associative, right?

70. Empty

Yeah you better believe it. It's just not commutative in general.

71. Empty

Something I'm reading right now: http://math.stackexchange.com/questions/625140/a-fermats-flt-look-like-for-matrices

72. UsukiDoll

Matrix multiplication isn't commutative: in general $AB\neq BA$, i.e. the result of $AB$ doesn't have to equal the result of $BA$

73. Empty

If A and B are 2D rotation matrices, they'll commute for instance, or if the transformations are 'orthogonal' to each other. For instance stretching and rotation are orthogonal. These are really all more easily captured by the complex number form of these matrices: $re^{i \theta}$

74. anonymous

Are you allowing non integer matrices?

75. ganeshie8
76. ganeshie8

wolfram must be lying, that doesn't look correct

77. Empty

$$\color{blue}{\text{Originally Posted by}}$$ @wio Are you allowing non integer matrices? $$\color{blue}{\text{End of Quote}}$$ No, I was just sorta adding a note for @UsukiDoll to add to her comment about 2x2 matrices and commutativity.

78. Empty

@ganeshie8 Hmmmm.... I don't know, I guess I'll just have to check them on matlab or something. All I have is the online compiler for it though. Weird.

79. ganeshie8

I'm trying to construct the solutions using the method in that mse post http://gyazo.com/31615b52acb858db5875d66eaec0d715

80. anonymous

I've been thinking of something... Denote matrices like: $I = \begin{bmatrix} 1 \\ 2 \\ \vdots \\ n \end{bmatrix}$Where each number represents the column which has a $$1$$, and all else is $$0$$. The shift matrix mentioned earlier is expressed as: $R = \begin{bmatrix} 2 \\ 3 \\ \vdots \\ n \\ 1 \end{bmatrix} = \begin{bmatrix} 1+1 \\ 2+1 \\ \vdots \\ n+1 \end{bmatrix}$Obviously, when a number is $$>n$$, we just subtract $$n$$ to get its actual value. This allows us to denote permutation matrices.

81. anonymous

I put $$R$$ because it shifts right. We have shift left by doing $$R^{n-1} = L$$. Notice that$R^k = \begin{bmatrix} 1+k \\ 2+k \\ \vdots \\ n+ k \end{bmatrix}$
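A small check of this notation in code, with a 0-indexed version of the column vectors above (`perm_matrix` is my own helper name):

```python
import numpy as np

def perm_matrix(cols):
    # Row i has its single 1 in column cols[i] (0-indexed version of the
    # column-number vectors described above)
    n = len(cols)
    P = np.zeros((n, n), dtype=int)
    for i, j in enumerate(cols):
        P[i, j] = 1
    return P

n = 5
R = perm_matrix([(i + 1) % n for i in range(n)])   # the shift matrix
for k in range(n):
    expected = perm_matrix([(i + k) % n for i in range(n)])
    assert (np.linalg.matrix_power(R, k) == expected).all()
```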

82. Empty

I see, essentially we're cycling through with these matrices.

83. anonymous

We can have a swapping matrix, which always swaps the $$k$$th row with the $$1$$st row: $S_{k} = \begin{bmatrix} k \\ 2 \\ \vdots \\k-1 \\ 1 \\ \vdots \\ n \end{bmatrix}$We can swap the $$k$$th row with the $$j$$th row by using $$S_kS_jS_k$$.

84. anonymous

It turns out that $$R = S_2S_3\dots S_n$$.

85. anonymous

So we need to find a permutation that takes 3 cycles to return to the identity

86. Empty

Well we can represent every permutation matrix by permuting the columns of the identity matrix, so as long as we have an nxn matrix we can shift the elements to make the permutation matrix that cycles by 1. Since it's the permutation matrix on n elements, we know its nth power will be the identity. It's weird and fascinating because permuting the permutation matrix is the same permutation we are doing simultaneously. I found it amazing when I learned this.

87. anonymous

The nice thing about these swap matrices is that it doesn't matter how large the matrix is. We can just ignore the larger dimensions and focus on the top row.

88. anonymous

So, for example $$(S_2)^2 = I$$, whereas $$(S_2S_3)^3 = (S_3S_2)(S_2S_3)=I$$, since the square of a 3-cycle is its inverse

89. anonymous

And I believe that $$(S_2S_3\ldots S_m)^m = I$$.

90. anonymous

You are basically making a shift matrix out of the first m rows, leaving the rest as is.
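A sketch verifying that $S_2S_3\dots S_m$ has order $m$, with `S` as my name for the swap matrices described above:

```python
import numpy as np

def S(n, k):
    # Permutation matrix that swaps row 1 and row k (1-indexed), identity elsewhere
    P = np.eye(n, dtype=int)
    P[[0, k - 1]] = P[[k - 1, 0]]
    return P

n = 5
for m in range(2, n + 1):
    prod = np.eye(n, dtype=int)
    for k in range(2, m + 1):
        prod = prod @ S(n, k)            # S_2 S_3 ... S_m, an m-cycle on the first m rows
    assert (np.linalg.matrix_power(prod, m) == np.eye(n, dtype=int)).all()
```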

91. Empty

I see so m<n and all of these are really nxn matrices?

92. anonymous

$m\leq n$

93. Empty

Ahhh ok, right of course

94. anonymous

However, permuting this around, I'm not sure what happens with non-one values.

95. anonymous

Suppose we have $$k^m = p$$. Then we can use a matrix of this form: $\begin{bmatrix} A_{m,p} & 0 \\ 0 & kI_{n-m} \end{bmatrix}^m =\begin{bmatrix} pI_{m} & 0 \\ 0 & pI_{n-m} \end{bmatrix}= pI$
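A check of this block construction for $n=5$, $m=3$, $k=2$ (so $p=8$); the helper names `A_nk` and `block` are mine:

```python
import numpy as np

def A_nk(n, k):
    # [[0, I_{n-1}], [k, 0]], with A_nk(n, k)^n == k * I
    M = np.zeros((n, n), dtype=int)
    M[:n - 1, 1:] = np.eye(n - 1, dtype=int)
    M[n - 1, 0] = k
    return M

def block(n, m, k):
    # [[A_{m,p}, 0], [0, k*I_{n-m}]] with p = k**m, so the m-th power is p * I_n
    p = k ** m
    B = np.zeros((n, n), dtype=int)
    B[:m, :m] = A_nk(m, p)
    B[m:, m:] = k * np.eye(n - m, dtype=int)
    return B

n, m, k = 5, 3, 2                        # p = 2**3 = 8
B = block(n, m, k)
assert (np.linalg.matrix_power(B, m) == (k ** m) * np.eye(n, dtype=int)).all()
```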

96. Empty

I'm pretty confident that a permutation matrix that has within it an m-cycle will equal the identity matrix when raised to the mth power.

97. anonymous

Now, for a $$2\times 2$$ matrix, we have a problem. It's only going to have 2-cycle permutations. So we won't be able to find one for odd cases.

98. anonymous

So far we have $$m\leq n$$ and $$2 | m$$ as conditions.

99. Empty

A small example of the intuition I have when thinking of raising a 6x6 matrix with a 4-cycle to the 4th power: (drawing not preserved)