Matrix multiplication is defined precisely so that \(\mathcal{M}(ST) = \mathcal{M}(S)\mathcal{M}(T)\) holds:
\begin{equation} (AC)_{j,k} = \sum_{r=1}^{n}A_{j,r}C_{r,k} \end{equation}
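For example, taking \(A = \mqty(1 & 2\\ 3 & 4)\) and \(C = \mqty(5 & 6\\ 7 & 8)\), each entry is a row-times-column sum:
\begin{equation} AC = \mqty(1\cdot 5 + 2\cdot 7 & 1\cdot 6 + 2\cdot 8\\ 3\cdot 5 + 4\cdot 7 & 3\cdot 6 + 4\cdot 8) = \mqty(19 & 22\\ 43 & 50) \end{equation}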
While matrix multiplication is distributive and associative, it is NOT commutative: in general, \(ST \neq TS\).
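For instance, reversing the order of the product above gives a different answer:
\begin{equation} CA = \mqty(5 & 6\\ 7 & 8)\mqty(1 & 2\\ 3 & 4) = \mqty(23 & 34\\ 31 & 46) \neq \mqty(19 & 22\\ 43 & 50) = AC \end{equation}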
memorization
- it's always row-by-column: move down the rows first, then across the columns
- multiply element-wise and add (row times column and add)
other ways of thinking about matrix multiplication
- it is “row times column”: \((AC)_{j,k} = A_{j, .} \cdot C_{., k}\)
- it is “matrix times columns”: \((AC)_{. , k} = A C_{., k}\)
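With the same \(A\) and \(C\) as before, both views in action: \((AC)_{1,2} = A_{1,.} \cdot C_{.,2} = 1 \cdot 6 + 2 \cdot 8 = 22\), and
\begin{equation} (AC)_{.,1} = A C_{.,1} = \mqty(1 & 2\\ 3 & 4)\mqty(5\\ 7) = \mqty(19\\ 43) \end{equation}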
matrix as a linear combinator
Suppose \(A\) is an \(m\) by \(n\) matrix and \(c = \mqty(c_1\\ \vdots\\ c_{n})\) is an \(n\) by \(1\) matrix; then:
\begin{equation} Ac = c_1 A_{., 1} + \dots + c_{n} A_{., n} \end{equation}
(i.e. you can use a vector to linearly combine the column vectors of \(A\).)
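For example:
\begin{equation} \mqty(1 & 2\\ 3 & 4)\mqty(10\\ -1) = 10 \mqty(1\\ 3) + (-1) \mqty(2\\ 4) = \mqty(8\\ 26) \end{equation}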
linear maps are like matrix multiplication
\begin{equation} \mathcal{M}(Tv) = \mathcal{M}(T)\mathcal{M}(v) \end{equation}
“the matrix of the vector formed by applying some linear map \(T\) to \(v\) is the same as the product of the matrix of \(T\) and the matrix of \(v\)”
Proof:
Let \(v_1, \dots, v_{n}\) be a basis of \(V\), and write \(v = c_1 v_{1} + \dots + c_{n} v_{n}\).
So, we have that \(Tv = c_1Tv_{1} + \dots + c_{n}T v_{n}\) by the additivity and homogeneity of \(T\).
Then, converting it all to matrices:
\begin{align} \mathcal{M}(Tv) &= c_1 \mathcal{M}(Tv_{1}) + \dots + c_{n} \mathcal{M}(Tv_{n}) \\ &= c_1 \mathcal{M}(T)_{.,1} + \dots + c_{n}\mathcal{M}(T)_{.,n} \end{align}
because the \(k\)-th column of \(\mathcal{M}(T)\) is \(\mathcal{M}(Tv_{k})\): the columns record where each basis vector gets taken in the new space.
Notice that \(c_1, \dots, c_{n}\) are exactly the entries of \(\mathcal{M}(v)\), and that \(\mathcal{M}(T)_{.,1}, \dots, \mathcal{M}(T)_{.,n}\) are the columns of \(\mathcal{M}(T)\). So, by the “matrix as a linear combinator” fact above:
\begin{equation} c_1 \mathcal{M}(T)_{.,1} + \dots + c_{n}\mathcal{M}(T)_{.,n} = \mathcal{M}(T) \mathcal{M}(v) = \mathcal{M}(Tv) \end{equation}
as desired. \(\blacksquare\)
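A quick sanity check: take, say, \(T: \mathbb{R}^{2} \to \mathbb{R}^{2}\) defined by \(T(x, y) = (x + y,\, 2y)\) with the standard bases, and \(v = (3, 4)\). Then \(Tv = (7, 8)\), and indeed:
\begin{equation} \mathcal{M}(T)\mathcal{M}(v) = \mqty(1 & 1\\ 0 & 2)\mqty(3\\ 4) = \mqty(7\\ 8) = \mathcal{M}(Tv) \end{equation}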