What do upper indices represent?
I have seen people represent matrix entries in two ways.
- $$\sum_{j=1}^n a_{ij}$$
If we fix $i=1$, this represents a column matrix (really a vector):
$$\begin{bmatrix}a_{11} & a_{12} & a_{13} & \cdots\end{bmatrix}$$
- $$\sum_{j=1}^n a^{ij}$$
What does this represent? At first I thought it was a row matrix (vector), since it should be the opposite of the column matrix (vector) above. But while writing this question I couldn't generate a row matrix from the second expression. I became more confused when I saw $a_j^{i}$, and sometimes there are two indices in both the subscript and the superscript: $a_{ji}^{kl}$. I don't remember whether either of these matches what I actually saw (I may have written them the wrong way around).
After searching a little, I found that when the components change (under a change of basis), the vectors themselves don't change. But I can't get any deeper into covariant and contravariant. I have even seen people use notation like $^ia_j$.
I was reading https://physics.stackexchange.com/q/541822/, but those answers don't explain covariant and contravariant for a beginner (they are written for people who already know something about covariance and contravariance).
I was also watching a video. What the presenter said is: if we take some basis vectors and build a vector from them, and then we shrink those basis vectors, we get contravariant vectors (I think he meant that the components change). But that explanation isn't clear enough for me; he might be correct, but I have no way to tell. If he is saying that the changed basis vectors are contravariant, are "the original" basis vectors covariant? And how do we deal with covariant and contravariant together, as in $g^i_j$ or sometimes $g_j^i$?
Post
In a general sense, covariant and contravariant aren't such mysterious things; what causes the confusion is the meaning of those words.
Think of two different sets of basis vectors: one set will be covariant and the other contravariant (in this picture, one basis vector is smaller than the other). Suppose we have a vector that looks like this: $\vec{A}=2\vec{e_1}+2\vec{e_2}$. Here $\vec{e_1}$ and $\vec{e_2}$ are basis vectors; I am taking two copies of $\vec{e_1}$ and two of $\vec{e_2}$ to build the vector. Now I am going to shrink those basis vectors: let $\tilde{\vec{e_1}}=\frac{1}{2}\vec{e_1}$ and $\tilde{\vec{e_2}}=\frac{1}{2}\vec{e_2}$. To build the same $\vec{A}$ from these new basis vectors I must increase the coefficients, so the vector becomes $\vec{A}=4\tilde{\vec{e_1}}+4\tilde{\vec{e_2}}$, and I recover the same vector this way. We can write $\tilde{\vec{e_1}}=e^1$ (that's a superscript, not an exponent). Then $e_1$ is called covariant and $e^1$ is called contravariant.
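The scaling example above can be checked numerically. This is a minimal sketch (the concrete basis vectors are my own illustrative choice, not part of any standard library API): halving the basis vectors forces the components to double, which is exactly the "varying against the basis" behaviour the word contravariant refers to.

```python
import numpy as np

# Original basis vectors (standard basis in the plane).
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# The vector A = 2*e1 + 2*e2 from the text.
A = 2 * e1 + 2 * e2

# Shrink the basis vectors to half their length.
e1_tilde = 0.5 * e1
e2_tilde = 0.5 * e2

# The components must double so the geometric vector stays the same:
# they change *against* the change of basis ("contravariant").
A_new = 4 * e1_tilde + 4 * e2_tilde

print(np.allclose(A, A_new))  # True: same vector, different components
```

The vector itself is basis-independent; only the pair (basis, components) changes, and the two changes cancel.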
We usually use this machinery to describe spacetime from different viewpoints. There are also ways to transform between the two kinds of basis (what I sketch here is not the treatment I originally saw, but it conveys the idea a little).
When contravariant and covariant indices appear in a single term, it means we are representing a vector using different bases together. For example, take the basis vectors $e_1$, $e_2$ and $e_3$ along the x, y and z axes. Now, instead of shrinking them, I transform them, and I call the transformed basis vectors $e^1$, $e^2$ and $e^3$. To describe a curvy vector we can use the transformed and the normal basis vectors together (we could express the vector with the normal basis alone, but it is easier by mixing both), e.g. $\vec{A}=g^1_2+g^3_1$. For an arbitrary vector in three-dimensional coordinates we use the indices $i,j,k$, so we can take another vector $\vec{B}=\sum_{ijk}\left(g_i^{jk}+g_k^{ij}\right)$. Einstein observed that since we sum over repeated indices every time, we can drop the summation sign from the equation; that doesn't mean we aren't summing, the sum is implied by default. The same vector then looks like $\vec{B}=g_i^{jk}+g_k^{ij}$.
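The "repeated index means sum" convention at the end can be made concrete with NumPy's `einsum`, which implements exactly that rule. The matrix and vector below are made-up example values, not anything from the post:

```python
import numpy as np

g = np.arange(9.0).reshape(3, 3)  # a matrix with entries g_ij (made-up values)
v = np.array([1.0, 2.0, 3.0])     # a vector with components v^j

# Explicit summation: w_i = sum over j of g_ij * v^j.
w_explicit = np.array([sum(g[i, j] * v[j] for j in range(3)) for i in range(3)])

# Einstein convention: the repeated index j is summed automatically,
# so "g_ij v^j" is written without any summation sign.
w_einsum = np.einsum('ij,j->i', g, v)

print(np.allclose(w_explicit, w_einsum))  # True
```

In the subscript string `'ij,j->i'`, the index `j` appears twice on the left and not on the right, so it is summed over, mirroring how a repeated upper/lower index pair is contracted in index notation.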