If \(A= \left[\begin{matrix}a&0\\b&c\end{matrix}\right]\), 1) compute \(e^{At}\); 2) find the eigenvalues and eigenvectors of \(e^{-A}\). Please help.

@oldrin.bataku I got \[e^{At}= \frac{1}{a-c}\left[\begin{matrix}e^{at}(a-c)&0\\b(e^{at}-e^{ct})&e^{ct}(a-c)\end{matrix}\right]=\left[\begin{matrix}e^{at}&0\\\frac{b(e^{at}-e^{ct})}{a-c}&e^{ct}\end{matrix}\right]\] (assuming \(a\neq c\))
hence if \(t = 1\), we have \(e^{A}= \left[\begin{matrix}e^{a}&0\\\frac{b(e^{a}-e^{c})}{a-c}&e^{c}\end{matrix}\right]\)
for part 2), we know that \(e^{-A}=(e^{A})^{-1}\), so we can calculate its eigenvalues and eigenvectors that way. However, it takes a long time to work out, and I would like to know whether there is any link between the eigenvalues and eigenvectors of the two, since both come from the same A. Moreover, there is a theorem for eigenvalues: if \(\lambda\) is an eigenvalue of A, then \(\lambda^n\) is an eigenvalue of \(A^n\). But I don't know whether this carries over to \(e^{A}\) and \(e^{-A}\) or not; please explain.
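If anyone wants to sanity-check that closed form numerically, here is a small sketch using scipy.linalg.expm; the values of a, b, c, t are arbitrary sample numbers I picked, not part of the problem:

```python
# Numerical sanity check of the closed form above, assuming a != c.
# a, b, c, t are arbitrary sample values, not from the problem statement.
import numpy as np
from scipy.linalg import expm

a, b, c, t = 2.0, 3.0, -1.0, 0.7
A = np.array([[a, 0.0],
              [b, c]])

# Closed form for the lower-triangular 2x2 case with distinct eigenvalues a, c
closed_form = np.array([
    [np.exp(a * t), 0.0],
    [b * (np.exp(a * t) - np.exp(c * t)) / (a - c), np.exp(c * t)],
])

print(np.allclose(expm(A * t), closed_form))  # expected: True
```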

Other answers:

One more thing I'm confused about: surely \(-A \neq A^{-1}\). If I calculate \(-A\) by putting a minus sign in front of A, then \(-A = \left[\begin{matrix}-a&0\\-b&-c\end{matrix}\right]\), while \[A^{-1}= \left[\begin{matrix}1/a &0\\-b/(ac) & 1/c\end{matrix}\right]\] So how can \(e^{-A}=(e^A)^{-1}\)?
by definition \(-A\) and \(A\) commute, so \(e^Ae^{-A}=e^{A+(-A)}=e^0=I\)
since \(e^{-A}=(e^A)^{-1}\), it follows that the eigenvectors are identical while the eigenvalues are related by \(\lambda_{e^{-A},i}=1/\lambda_{e^A,i}\)
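Here is a quick numerical illustration of that relationship; the matrix A below is just an arbitrary example with the triangular shape from the problem, not anything canonical:

```python
# Check that e^{-A} = (e^A)^{-1}, so the eigenvectors match and the
# eigenvalues are reciprocals.  A is an arbitrary sample matrix.
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 0.0],
              [3.0, -1.0]])

eA = expm(A)
e_minus_A = expm(-A)

print(np.allclose(eA @ e_minus_A, np.eye(2)))               # e^A e^{-A} = I
vals, vecs = np.linalg.eig(eA)
vals_inv, _ = np.linalg.eig(e_minus_A)
print(np.allclose(np.sort(vals_inv), np.sort(1.0 / vals)))  # reciprocal eigenvalues
# The columns of `vecs` are also eigenvectors of e^{-A}, with eigenvalues 1/vals:
print(np.allclose(e_minus_A @ vecs, vecs @ np.diag(1.0 / vals)))
```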
Hmmm is there a reason why this method I'm using here doesn't work? First I separate A into a sum of the diagonal matrix D and the corner matrix C: \[A=D+C\] \[e^{At}=e^{Dt+Ct}=e^{Dt}e^{Ct}\] Then I compute them individually: \[e^{Dt} = \left[\begin{matrix}e^{at} & 0 \\ 0 & e^{ct} \end{matrix}\right]\] \[e^{Ct} = I+tC=\left[\begin{matrix}1 & 0 \\ bt & 1 \end{matrix}\right]\] (the series terminates since \(C^2=0\)). Of course multiplying these together doesn't give the matrix I wanted: addition in the exponents is commutative, but it's easy to check that multiplying these two matrices together is not. I guess my most important question is, if \(C\) had been a nonsingular matrix, would this be valid, or is there more to it than that?
well, you need \(C,D\) to commute to compute it in that way, and both being invertible is definitely insufficient -- consider a change of basis followed by a scaling along the coordinate axes. these very clearly do not commute as linear transformations and yet both are invertible
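To see this concretely for the split above, here is a small check that D and C from this particular problem do not commute, so the product of exponentials misses \(e^{At}\); the numbers are arbitrary samples:

```python
# D and C from the split A = D + C do not commute (unless a = c or b = 0),
# so e^{Dt} e^{Ct} != e^{At}.  Sample numbers only.
import numpy as np
from scipy.linalg import expm

a, b, c, t = 2.0, 3.0, -1.0, 0.7
D = np.diag([a, c])
C = np.array([[0.0, 0.0],
              [b, 0.0]])

print(np.allclose(D @ C, C @ D))                                   # False
print(np.allclose(expm((D + C) * t), expm(D * t) @ expm(C * t)))   # False
```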
the standard way to compute \(\exp(A)\) for diagonalizable \(A\) is to diagonalize \(A=P^{-1}DP\) where \(P\) rotates into an eigenbasis and \(D\) describes the scaling, since \(A^n=(P^{-1}DP)^n=P^{-1}D^nP\) so: $$\exp(A)=P^{-1}\left(\sum_{n=0}^\infty\frac1{n!}D^n\right)P$$ and \(D^n\) is trivial for diagonal matrices
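A rough sketch of that diagonalization route in code, assuming A is diagonalizable (for this A that means \(a\neq c\)); again the numbers are just sample values:

```python
# Compute exp(A) via eigendecomposition and compare with scipy's expm.
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 0.0],
              [3.0, -1.0]])

# numpy returns A = V @ diag(w) @ V^{-1} (V plays the role of P^{-1} above),
# so exp(A) = V @ diag(exp(w)) @ V^{-1}
w, V = np.linalg.eig(A)
expA = V @ np.diag(np.exp(w)) @ np.linalg.inv(V)

print(np.allclose(expA, expm(A)))  # expected: True
```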
@empty if you're curious as to when two matrices A,B commute: https://en.wikipedia.org/wiki/Commuting_matrices#Characterization_in_terms_of_eigenvectors
Yeah, I'm only able to tell when matrices commute when I understand their geometric interpretation, such as rotation matrices commuting with each other and with scalar matrices, things like that. Thanks @oldrin.bataku
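For what it's worth, a tiny numerical check of the rotation-matrix intuition (the angles are arbitrary sample values):

```python
# 2D rotation matrices commute with each other, as mentioned above.
import numpy as np

def rot(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

R1, R2 = rot(0.3), rot(1.1)
print(np.allclose(R1 @ R2, R2 @ R1))  # True: planar rotations commute
```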
