what is multiplication

- ParthKohli

what is multiplication

- ParthKohli

@Kainui please teach

- Kainui

It can be hard to understand lol

- Kainui

repeated addition is complicated

## More answers

- ParthKohli

Thanks. Now that we've got that out of the way, let's begin.

- ParthKohli

How do I indicate the k-th row/column?
Better yet, how do I indicate a row/column vector but with the same dimension as the parent set?

- Kainui

Alright, so let's rehash Einstein summation notation: whenever the same index appears twice in a term, that implies you sum over it. The indices range over the dimensions of the space; I'll assume 2 or 3 dimensions whenever I feel like it, but usually 3D for the sake of simplicity. Here's our "dot product" in this notation (this isn't the complete story yet, but it will be fixed once we understand a couple of things about matrices):
\[\bar a \cdot \bar b = a^i b_i = \sum_{i=1}^3 a^ib_i=a^1b_1+a^2b_2+a^3b_3\]
Similarly, matrix multiplication follows from the same rule:
\[(AB)^i_k = A^i_jB^j_k = \sum_{j=1}^3 A^i_jB^j_k=A^i_1B^1_k+A^i_2B^2_k+A^i_3B^3_k\]
Ok, now that that's out of the way, on to whatever needs to be clarified, or the next thing.
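Editor's aside: the two sums above can be checked mechanically. This is a minimal sketch of my own (plain Python, indices running 0..2 instead of 1..3; all names and numbers are made up for illustration):

```python
# Dot product a^i b_i: the repeated index i is summed over.
a = [1, 2, 3]
b = [4, 5, 6]
dot = sum(a[i] * b[i] for i in range(3))  # 1*4 + 2*5 + 3*6 = 32

# Matrix product (AB)^i_k = A^i_j B^j_k: the repeated index j is summed over.
A = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # identity, so AB should equal B
B = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
AB = [[sum(A[i][j] * B[j][k] for j in range(3)) for k in range(3)]
      for i in range(3)]

print(dot)      # 32
print(AB == B)  # True
```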

- ParthKohli

Like if I have\[A=\left[\begin{matrix}a & b \\ c&d\end{matrix}\right]\]how do I indicate\[A_0 = \left[\begin{matrix}a & b \\ 0&0\end{matrix}\right]\]

- Kainui

So, to index the individual entries of your matrix here: \[A=A^i_j=\left[\begin{matrix}a & b \\ c&d\end{matrix}\right] = \left[\begin{matrix}A^1_1 & A^1_2 \\ A^2_1&A^2_2\end{matrix}\right]\]

- Kainui

So you say "the matrix \(A^i_j\) and the entry \(A^2_1\)", whereas before you would have said "the matrix \(A\) and the entry \(c\)"; these are completely identical, if that makes sense.

- ParthKohli

So is there no preexisting notation for the matrix \[\left[\begin{matrix}a & b \\ 0&0\end{matrix}\right]\] or for the matrix \[\left[\begin{matrix}a & 0 \\ c&0\end{matrix}\right]\]?

- Kainui

No, not like that, although you can single out a specific row or column of the matrix. For example: \[A^2_j=\left[\begin{matrix}c&d\end{matrix}\right] = \left[\begin{matrix} A^2_1&A^2_2\end{matrix}\right]\] \[A^i_1=\left[\begin{matrix}a \\ c\end{matrix}\right] = \left[\begin{matrix}A^1_1 \\ A^2_1\end{matrix}\right]\]
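Editor's aside: in code, fixing one index while letting the other range free is just row or column extraction. A small sketch of mine (0-based indices, placeholder entries):

```python
# A^i_j as a nested list; entries are placeholder strings.
A = [["a", "b"],
     ["c", "d"]]

row_2 = A[1]                          # A^2_j: fix i = 2, j ranges free
col_1 = [A[i][0] for i in range(2)]   # A^i_1: fix j = 1, i ranges free

print(row_2)  # ['c', 'd']
print(col_1)  # ['a', 'c']
```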

- ParthKohli

OK, that's still better.

- Kainui

So there are two matrices I'll show you that are pretty common. The first is the Kronecker delta, which is just the identity matrix.
\[I = \delta^i_j\]
Also, I don't think I answered your question about dimensionality earlier, so I'll say it now while we're here. Usually Latin indices represent 3D and Greek indices represent 2D, so if you're projecting a vector onto a surface you'd have a 2x3 matrix. One example of such an object (specifically, I'm thinking of the "shift tensor", but there are others as well) is:
\[Z^\alpha_i=\left[ \begin{array}{ccc} Z^1_1 & Z^1_2 & Z^1_3\\ Z^2_1 & Z^2_2 & Z^2_3\\ \end{array} \right]\]

- ParthKohli

What do you really mean when you talk about projection?

- Kainui

Ok, I said two matrices that are common; the shift tensor is common, but it wasn't what I intended to introduce, since that's complicated and far off.

- Kainui

Forget the projection stuff, we can't meaningfully talk about that yet.
The matrix I really wanted to introduce is the metric tensor. That's a very important matrix and it's called "metric" because it is what tells you how to measure distances.
Have you heard of a Gram Matrix before?

- ParthKohli

lol no, I don't know any of this.

- Kainui

I hadn't heard of them before either, so not really a big deal. The metric tensor is defined this way:
\[ \large Z_{ij} = \bar Z_i \cdot \bar Z_j =\left[ \begin{array}{ccc} \bar Z_1 \cdot \bar Z_1 & \bar Z_1 \cdot \bar Z_2 & \bar Z_1 \cdot \bar Z_3\\ \bar Z_2 \cdot \bar Z_1 & \bar Z_2 \cdot \bar Z_2 & \bar Z_2 \cdot \bar Z_3 \\ \bar Z_3 \cdot \bar Z_1 & \bar Z_3 \cdot \bar Z_2 & \bar Z_3 \cdot \bar Z_3\\ \end{array} \right]\]

- Kainui

Gasp what the hell is this crap?

- ParthKohli

What the...

- Kainui

I'll explain it: \(\bar Z_i\) is just the general form of the basis vectors. One set of basis vectors you may be comfortable with is:
\(\bar Z_1=\hat i\), \(\bar Z_2=\hat j\), and \(\bar Z_3=\hat k\).
There's nothing mystical here; these are just the regular orthonormal unit basis vectors we know and love, I hope.
So what's the metric tensor? Well, since we know \(\hat i \cdot \hat i = 1\) and \(\hat i \cdot \hat j = 0\), we fill in the matrix and we get the identity matrix. Try it out a bit and make sure that makes sense; there are some gaps to fill in here.
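Editor's aside: filling in those gaps is a three-line computation. A sketch of my own, assuming the standard orthonormal basis written as coordinate lists:

```python
# Basis vectors Z_1 = i-hat, Z_2 = j-hat, Z_3 = k-hat.
basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

def dot(u, v):
    """Ordinary Euclidean dot product of two coordinate lists."""
    return sum(x * y for x, y in zip(u, v))

# Metric tensor Z_ij = Z_i . Z_j, built entry by entry.
metric = [[dot(basis[i], basis[j]) for j in range(3)] for i in range(3)]
print(metric)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]] -- the identity, as claimed
```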

- ParthKohli

Oh, I see.

- Kainui

So now comes in the true definition of a vector and dot product.

- Kainui

\[\bar V = V^i \bar Z_i = V^1 \bar Z_1+V^2 \bar Z_2+V^3 \bar Z_3\] A vector really has NO indices on it. What we were really looking at were the components of a vector, \(V^i\), and we must contract them with the basis elements \(\bar Z_i\) in order to get the real vector.
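Editor's aside: the contraction \(V^i \bar Z_i\) can be spelled out numerically. A sketch of mine, with made-up component values, using the standard basis (where components and coordinates happen to coincide):

```python
# Basis vectors Z_i and components V^i (numbers are arbitrary).
basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
V = [2, -1, 5]

# The vector itself: V^i Z_i, summed over i, one coordinate at a time.
vector = [sum(V[i] * basis[i][n] for i in range(3)) for n in range(3)]
print(vector)  # [2, -1, 5]: with this basis, the contraction returns V itself
```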

- ParthKohli

Again, that conforms to my knowledge of vectors. Haha.

- Kainui

In tensor calculus a vector is a geometric object in space, an invariant. We must contract this specific set of elements with its basis, so now this is where tensors will begin to play a role past what you know. First I will explain the dot product, then we can talk about the true nature of a tensor.

- Kainui

The dot product now between two vectors is truly:
\[\bar A \cdot \bar B = A^i \bar Z_i \cdot B^j \bar Z_j \]
We can pull the \(A^i\) and \(B^j\) terms away from the vectors since they're just scalars.
\[ A^i B^j\bar Z_i \cdot \bar Z_j \]
And we see that we have the definition of the metric tensor! \(\bar Z_i \cdot \bar Z_j = Z_{ij}\)
\[ A^i B^j Z_{ij}\]
Now remember that in our Euclidean basis \(\hat i, \hat j, \text{ and } \hat k\) the metric tensor was the identity matrix, so we can write:
\[ A^i B^j \delta_{ij}\] And just as we would expect the identity matrix to do, it renames (and, in this case, lowers) the index, that's all. See if you can write this out in terms of old linear algebra with a matrix and two vectors; I'll help you in a minute if you have trouble.
So continuing on, we can write either:
\[ A^i B^j \delta_{ij} = A^iB_i\]
or
\[ A^i B^j \delta_{ij} = A^jB_j\]
since it doesn't matter which index we contract the identity matrix with; we will still get the same dot product we had earlier.

- Kainui

Try to do that exercise I described: write out \[ A^i B^j \delta_{ij} \] in terms of linear algebra. Just make a quick drawing of the matrix and two "vectors" (even though they are not truly vectors, just the components of the vectors; it's common for authors to call these the vectors, with the understanding that they mean the components).
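Editor's aside: one way to check your answer to the exercise. The double sum \(A^i B^j \delta_{ij}\) is the row-vector/matrix/column-vector product, and with \(\delta\) it collapses to the ordinary dot product. A sketch of mine with made-up components:

```python
# Components A^i and B^j (arbitrary numbers), and the Kronecker delta.
A = [1, 2, 3]
B = [4, 5, 6]
delta = [[1 if i == j else 0 for j in range(3)] for i in range(3)]

# A^i B^j delta_ij: both i and j are repeated, so both are summed.
result = sum(A[i] * delta[i][j] * B[j]
             for i in range(3) for j in range(3))
print(result)  # 32, the same as the ordinary dot product A . B
```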

- Kainui

Hopefully my presentation of the metric tensor makes it clear why we want to define one at all. If not, I'll say it explicitly: it allows us to define the dot product to be whatever we want it to be. Spaces where you define the metric tensor like this are called Riemannian spaces, and they let us do non-Euclidean geometry. =)
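Editor's aside: a concrete taste of "the dot product is whatever the metric says". In polar coordinates the metric is \(\mathrm{diag}(1, r^2)\) rather than the identity, so the same component pairs produce a position-dependent dot product. A sketch of mine; the radius and components are assumed numbers for illustration:

```python
# Polar-coordinate metric Z_ij = diag(1, r^2) at radius r.
r = 2.0
Z = [[1.0, 0.0],
     [0.0, r * r]]

# Components A^i and B^j in the (r, theta) basis.
A = [1.0, 0.5]
B = [0.0, 1.0]

# A . B = A^i B^j Z_ij -- the metric, not the identity, does the lowering.
dot = sum(A[i] * Z[i][j] * B[j] for i in range(2) for j in range(2))
print(dot)  # 0.5 * r^2 = 2.0, not the naive component sum 0.5
```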

- Kainui

Another fact to prove to yourself is that the metric tensor is symmetric.
\(Z_{ij} = Z_{ji}\)
Start at one end and follow the definition. At this point I've introduced enough things to stir up a bunch of mud and maybe make you question some basic things, so if you have to ask, ask. There are a lot of things I left unsaid in order to make the main point, so if you have questions about details, I'll help fill them in for you.

- ParthKohli

Hey, hold on - I was out for breakfast. Gimme a second.

- Kainui

Alright, I'm about to go to bed in about 10 minutes, so I was just trying to throw a bunch of stuff up here before I left. I didn't mean to overload you hahaha, but I'll be back tomorrow of course.

- ParthKohli

Yeah, thanks for this. Good night.

- Kainui

There are a couple more hurdles to overcome but you're not too far off from deriving some very fascinating and powerful identities.

- ParthKohli

Yeah, the symmetry of the metric tensor follows from the commutativity of dot product.

- ParthKohli

@Kainui

- dan815

lolool
