
1x1 matrices are out due to Fermat's last theorem

oh wait, trivial solutions with 0, 1, -1 etc. are allowed, right?

does that mean the eigenvectors need to be different?

Can you be certain that when you diagonalize them, their surrounding matrices will be the same?

looks like there are many solutions in 2x2

infinitely many, it seems...

I guess I can't completely remember all the theorems about eigenvectors

Is there something else motivating this question?

Oh right, so \(P\) is a matrix of eigenvectors and the diagonal matrix just holds the eigenvalues.
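A quick numpy sketch of that, on a toy matrix of my own choosing, just to illustrate \(A = PDP^{-1}\):

```python
import numpy as np

# Toy example (not from the discussion above): diagonalize a small symmetric
# matrix and confirm A = P D P^{-1}, where the columns of P are eigenvectors
# and D holds the eigenvalues on its diagonal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, P = np.linalg.eig(A)   # columns of P are the eigenvectors
D = np.diag(eigvals)            # diagonal matrix of eigenvalues

# Reconstruct A from its eigendecomposition
A_rebuilt = P @ D @ np.linalg.inv(P)
assert np.allclose(A, A_rebuilt)
```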

The invertible restriction does rule out a lot of matrices though

We found a few solutions, so clearly FLT is not applicable to matrices.

What solutions did we find?

scroll up

Ahhh ok, I missed it. Do you mind if we restrict ourselves to only positive integer entries?

so 0 is also not allowed?

\[
A^3 + A^3 = 2I
\]

Hmmm, maybe that doesn't help

\[
(A_7)^3 + (A_1)^3 = 8I = (2I)^3
\]

No, I think you're right, this solves it. FLT doesn't apply, awesome!

Nice! That shows FLT doesn't apply for power 3

or power of form \(3k\)
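If I'm reading the notation right (my guess: \(A_k\) is the 3x3 cyclic shift matrix with \(k\) in the corner, so that \((A_k)^3 = kI\)), a quick numpy check confirms the counterexample:

```python
import numpy as np

# Assumed reading of the notation above: A_k is the 3x3 cyclic shift
# with k in the corner, so that (A_k)^3 = k * I.
def A(k):
    return np.array([[0, 0, k],
                     [1, 0, 0],
                     [0, 1, 0]])

I = np.eye(3, dtype=int)
cube = lambda M: M @ M @ M

assert np.array_equal(cube(A(7)), 7 * I)   # (A_7)^3 = 7I
assert np.array_equal(cube(A(1)), I)       # (A_1)^3 = I
# (A_7)^3 + (A_1)^3 = 8I = (2I)^3: an integer-matrix "counterexample" to FLT for power 3
assert np.array_equal(cube(A(7)) + cube(A(1)), cube(2 * I))
```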

you're talking about matrix dimensions?

Make that \(n-1\) shifts

beautiful!

Hold on, we know it is false when \(n=1\).

I've only shown it for cases \(n=3p\).

Only for \(p > 2\), since we have Pythagorean triples.

Interesting, I think this is good; that can definitely help us out.

You could also write that as: \[
(L_{n,k}+I_n)^n = L_{n,nk}+I_n
\]

It doesn't have to be \(n\) either. \[
(L_{n,k}+I_n)^p = L_{n,pk}+I_n
\]

It can be anywhere, but the corners are the only places we can expect it to exist.

\[\begin{bmatrix}1&0\\k&1\end{bmatrix}^n=\begin{bmatrix}1&0\\nk&1\end{bmatrix}\]

Yeah, basically that's the 2D case
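A sanity check of that 2D identity in numpy (picking \(k = 3\), \(n = 5\) arbitrarily):

```python
import numpy as np

# Check the 2x2 identity: [[1,0],[k,1]]^n = [[1,0],[n*k,1]].
# k and n are arbitrary choices for the check.
k, n = 3, 5
M = np.array([[1, 0],
              [k, 1]])
P = np.linalg.matrix_power(M, n)
assert np.array_equal(P, np.array([[1, 0],
                                   [n * k, 1]]))
```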

The only issue is that \(L^{2} = 0\), and it isn't invertible, so we can't use it as a candidate.

This matrix flips the elements

Ahhh yeah this is called the exchange matrix.

Otherwise, having all the antidiagonal entries as \(k\) gives \(k^2I\) when squared, obviously
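A quick numpy check of the exchange matrix properties (using \(n = 4\), \(k = 3\) as arbitrary choices):

```python
import numpy as np

# The exchange matrix J reverses coordinates: J^2 = I, and putting k on
# every antidiagonal entry gives (kJ)^2 = k^2 * I.
n, k = 4, 3
J = np.fliplr(np.eye(n, dtype=int))  # 1s on the antidiagonal

v = np.arange(1, n + 1)
assert np.array_equal(J @ v, v[::-1])                 # J flips the entries of v
assert np.array_equal(J @ J, np.eye(n, dtype=int))    # J^2 = I
assert np.array_equal((k * J) @ (k * J),
                      k**2 * np.eye(n, dtype=int))    # (kJ)^2 = k^2 I
```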

Not sure when it is certain to work.

I just am not sure how to construct it

Matrix multiplication is associative, right?

Yeah you better believe it. It's just not commutative in general.

Are you allowing non integer matrices?

Wolfram must be lying, that doesn't look correct

I see, essentially we're cycling through with these matrices.

It turns out that \(R = S_2S_3\dots S_n\).

So we need to find a permutation that takes 3 cycles to return to the identity

So, for example \((S_2)^2 = I\), whereas \((S_2S_3)^3 = (S_3S_2)(S_2S_3)=I \)

And I believe that \((S_2S_3\ldots S_m)^m = I\).
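A quick numpy check of that (assuming \(S_i\) means the permutation matrix swapping rows \(i-1\) and \(i\), 1-indexed):

```python
import numpy as np

# Assumed notation: S_i is the permutation matrix that swaps rows i-1 and i
# (1-indexed). Then S_2 S_3 ... S_m acts as an m-cycle on the first m
# coordinates, so its m-th power should be the identity.
def S(i, n):
    P = np.eye(n, dtype=int)
    P[[i - 2, i - 1]] = P[[i - 1, i - 2]]  # swap rows i-1 and i (1-indexed)
    return P

n = 5
for m in range(2, n + 1):
    R = np.eye(n, dtype=int)
    for i in range(2, m + 1):
        R = R @ S(i, n)                    # R = S_2 S_3 ... S_m
    assert np.array_equal(np.linalg.matrix_power(R, m),
                          np.eye(n, dtype=int))
```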

You are basically making a shift matrix out of the first m rows, leaving the rest as is.

I see, so
\[
m\leq n
\]

Ahhh ok, right of course

However, permuting this around, I'm not sure what happens with non-one values.

So far we have \(m\leq n\) and \(2 | m\) as conditions.