The vectors $\vec{e}_i$ are called basis vectors of the vector space.
In this representation, sometimes Einstein's summation convention is used: We write $x_i \vec{e}_i \equiv \sum_i x_i \vec{e}_i$, omitting the sum symbol in order to simplify the notation. The sum is automatically carried out over repeated indices. Here, the repeated index is $i$.
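As a quick illustration (not part of the original text), the summation convention can be mimicked with NumPy's `einsum`, which sums over repeated index letters automatically; the component values below are arbitrary:

```python
import numpy as np

# Expanding x = x_i e_i (summed over the repeated index i)
# for a 2-component example with the standard basis.
x_components = np.array([3.0, 4.0])
basis = np.array([[1.0, 0.0],   # e_1
                  [0.0, 1.0]])  # e_2

# Explicit sum over the repeated index i:
x_explicit = sum(x_components[i] * basis[i] for i in range(2))

# np.einsum applies the same convention: the repeated letter 'i'
# is summed over automatically.
x_einsum = np.einsum('i,ij->j', x_components, basis)

print(x_explicit)                         # [3. 4.]
print(np.allclose(x_explicit, x_einsum))  # True
```

Both forms produce the same vector; `einsum` simply hides the explicit loop, just as the summation convention hides the sum symbol.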
The element $A_{ij}$ of a matrix $A$ is the entry in its $i$-th row and its $j$-th column. For two-by-two matrices, this reads
$$A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}.$$
The result $\vec{y} = A\vec{x}$ of a linear mapping can be written in index form, too:
$$y_i = A_{ij} x_j.$$
This means that the first and second components, $y_1$ and $y_2$, of $\vec{y}$ are given by
$$y_1 = A_{11} x_1 + A_{12} x_2, \qquad y_2 = A_{21} x_1 + A_{22} x_2.$$
Note that the index $j$ runs over the columns of the matrix $A$.
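The index form $y_i = A_{ij} x_j$ translates directly into a nested loop; a minimal sketch (with arbitrarily chosen entries) is:

```python
import numpy as np

# y_i = A_ij x_j for a 2x2 example; the numbers are arbitrary.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# Index form: for each row index i, sum over the column index j.
y = np.array([sum(A[i, j] * x[j] for j in range(2)) for i in range(2)])

print(y)                      # [17. 39.]
print(np.allclose(y, A @ x))  # True
```

The inner sum over $j$ is exactly the sum hidden by the summation convention; `A @ x` performs the same computation in one call.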
This is simple: the $i$-th component of $\vec{y}$ is just the scalar product of the $i$-th row of $A$ with the vector $\vec{x}$.
A matrix $A$ moves a vector $\vec{x}$ into a new vector $\vec{y} = A\vec{x}$. This new vector can again be transformed into another vector $\vec{z}$ by acting with another matrix $B$ on it: $\vec{z} = B\vec{y}$. The combined operation $\vec{z} = BA\vec{x}$ transforms the original vector $\vec{x}$ into $\vec{z}$ in one single step. This matrix product $C = BA$ is calculated according to
$$C_{ik} = B_{ij} A_{jk}.$$
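That acting twice, first with $A$ and then with $B$, equals acting once with the product $BA$ can be checked numerically; the matrices below are chosen arbitrarily for this sketch:

```python
import numpy as np

# z = B(A x) equals (BA) x: applying A first and then B is the
# same as applying the single product matrix BA.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
x = np.array([1.0, 1.0])

# Product via the index formula C_ik = B_ij A_jk:
BA = np.array([[sum(B[i, j] * A[j, k] for j in range(2))
                for k in range(2)] for i in range(2)])

print(np.allclose(B @ (A @ x), BA @ x))  # True
```

The index formula with the explicit sum over $j$ reproduces NumPy's built-in matrix product.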
In general, the matrix product does not commute, i.e.,
$$AB \neq BA.$$
This means that, in contrast to real or complex numbers, the result of a multiplication of two matrices $A$ and $B$ depends on the order of $A$ and $B$.
The commutator of two matrices $A$ and $B$ is defined as
$$[A, B] \equiv AB - BA.$$
The commutator plays a central role in quantum mechanics, where classical variables like position and momentum are replaced by operators (matrices) which in general do not commute, i.e., their commutator is non-zero.
Example:
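The matrices of the original example are not reproduced here; as an illustrative pair (chosen for this sketch, not taken from the text), the following two matrices have a non-zero commutator:

```python
import numpy as np

# Two illustrative 2x2 matrices that do not commute.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

# Commutator [A, B] = AB - BA:
commutator = A @ B - B @ A

print(commutator)
# [[ 1.  0.]
#  [ 0. -1.]]
```

Since $[A, B] \neq 0$, the order in which $A$ and $B$ act matters, exactly as for the non-commuting operators of quantum mechanics.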
The abstract way to write a matrix multiplication with indices is
$$(AB)_{ij} = A_{ik} B_{kj}.$$
To get the element in the $i$-th row and $j$-th column of the product $AB$, take the scalar product of the $i$-th row vector of $A$ with the $j$-th column vector of $B$. This looks complicated, but it is not: it is just another formulation of our definition, Eq. (5.28).
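The "row times column" rule can be spelled out directly: each element of the product is one scalar product of a row with a column. A minimal sketch with arbitrary entries:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Element (i, j) of AB is the scalar product of row i of A
# with column j of B.
AB = np.array([[np.dot(A[i, :], B[:, j]) for j in range(2)]
               for i in range(2)])

print(AB)
# [[19. 22.]
#  [43. 50.]]
print(np.allclose(AB, A @ B))  # True
```

Each `np.dot` call is one row-times-column scalar product, so the nested comprehension is a literal transcription of the rule above.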