In this representation, Einstein's summation convention is sometimes used: we write
$a = \sum_{i=1}^{2} a_i e_i = a_i e_i$,
omitting the sum symbol in order to simplify the notation. The sum is automatically carried out over repeated indices; here, the repeated index is $i$.
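The convention can be checked numerically. Below is a minimal sketch using NumPy, whose `einsum` function takes index strings that mirror the summation convention; the component values are illustrative.

```python
import numpy as np

# Components a_i and basis vectors e_i in two dimensions
a_components = np.array([3.0, 4.0])          # a_1, a_2
e = np.array([[1.0, 0.0], [0.0, 1.0]])       # rows: e_1, e_2

# Explicit sum: a = sum_i a_i e_i
a_explicit = sum(a_components[i] * e[i] for i in range(2))

# Summation convention via einsum: the repeated index i is summed over
a_einsum = np.einsum("i,ij->j", a_components, e)

print(a_explicit)  # [3. 4.]
print(a_einsum)    # [3. 4.]
```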
Matrices
The element {A}_{ij} of a
matrix A is the entry
in its i-th row and its
j-th column. For two-by-two
matrices, this reads
$\left(\begin{array}{cc} A_{11} & A_{12} \\ A_{21} & A_{22} \end{array}\right)$.
Note: be very careful not to mix up the row and the column index!
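One common pitfall when translating this to code: most array libraries index from zero, so the textbook element $A_{ij}$ lives at position $(i-1, j-1)$. A small NumPy sketch (the entries are chosen to encode their own indices):

```python
import numpy as np

# A 2x2 matrix; A[i-1, j-1] is the textbook element A_ij
A = np.array([[11.0, 12.0],
              [21.0, 22.0]])

# A_12 (first row, second column) vs A_21 (second row, first column)
print(A[0, 1])  # 12.0  -> A_12
print(A[1, 0])  # 21.0  -> A_21
```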
Matrix operating on vector
The result of a linear mapping $x \rightarrow y = Ax$
can be written in index form, too:
$y_i = \sum_{j=1}^{2} A_{ij} x_j = A_{ij} x_j.$
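The index form can be verified against the built-in matrix-vector product; a minimal sketch with illustrative values:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# Index form: y_i = sum_j A_ij x_j
y_index = np.array([sum(A[i, j] * x[j] for j in range(2)) for i in range(2)])

# Built-in matrix-vector product
y_matmul = A @ x

print(y_index)   # [17. 39.]
print(y_matmul)  # [17. 39.]
```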
A matrix A moves
a vector x into a
new vector y = Ax.
This new vector can again be transformed into another vector
y' by acting with another
matrix B on it: y' = By = BAx. The
combined operation C = BA
transforms the original vector x
into y' in
one single step. This matrix product is calculated according to
$\left(\begin{array}{cc} C_{11} & C_{12} \\ C_{21} & C_{22} \end{array}\right) = \left(\begin{array}{cc} B_{11}A_{11}+B_{12}A_{21} & B_{11}A_{12}+B_{12}A_{22} \\ B_{21}A_{11}+B_{22}A_{21} & B_{21}A_{12}+B_{22}A_{22} \end{array}\right).$
This means that in contrast to real or complex numbers, the result of a multiplication of two matrices
A and
B depends on
the order of A
and B.
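Both facts, the one-step combined operation C = BA and the order dependence, can be demonstrated numerically. A short sketch with illustrative matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # swaps the rows when applied from the left

x = np.array([1.0, 1.0])

# Combined operation C = BA transforms x in one step
C = B @ A
print(np.allclose(C @ x, B @ (A @ x)))  # True

# The order matters: AB != BA in general
print(np.allclose(A @ B, B @ A))        # False
```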
The commutator [A,B]
of two matrices A
and B is
defined as [A,B] = AB − BA.
The commutator plays a central role in quantum mechanics, where classical variables like position
x and
momentum p
are replaced by operators (matrices) which in general do not commute, i.e., their commutator is
non-zero.
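The definition [A,B] = AB − BA is one line of code. As a concrete illustration (the Pauli matrices are chosen here as example non-commuting operators; they are not introduced in the text above), one finds the well-known relation [σₓ, σᵧ] = 2i σ_z:

```python
import numpy as np

def commutator(A, B):
    """[A, B] = AB - BA."""
    return A @ B - B @ A

# Pauli matrices as example non-commuting operators
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

# [sigma_x, sigma_y] = 2i sigma_z  (a non-zero commutator)
print(np.allclose(commutator(sigma_x, sigma_y), 2j * sigma_z))  # True
```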
The abstract way to write a matrix multiplication with indices:
\begin{eqnarray}
C = BA \;\rightsquigarrow\; C_{ij} = \sum_{k=1}^{2} B_{ik} A_{kj}. \quad \text{($= B_{ik} A_{kj}$ in the summation convention).} & & (5.31) \\
\end{eqnarray}
To get the element in the i-th row and j-th column of the product BA, take the scalar product of the i-th row vector of B with the j-th column vector of A.
This looks complicated, but it is not: it is just another formulation of our definition, Eq. (5.28).
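Eq. (5.31) translates directly into two nested loops over i and j, with the sum over the repeated index k inside; a minimal sketch with illustrative matrices:

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# C_ij = sum_k B_ik A_kj: row i of B dotted with column j of A
C = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        C[i, j] = sum(B[i, k] * A[k, j] for k in range(2))

print(np.allclose(C, B @ A))  # True
```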