What is an intuitive way to understand the dot product in the context of matrix multiplication?

by Pinocchio   Last Updated November 14, 2017 18:20

I was trying to understand why each entry of a matrix-vector product is a dot product of a row of the matrix with the vector, as in:

$$ Ax = \left( \begin{array}{ccc} a_{1}^T \\ \vdots \\ a_m^T \end{array} \right)x = \left( \begin{array}{ccc} a_{1}^Tx \\ \vdots \\ a_m^T x \end{array} \right) $$

What is an intuitive explanation or interpretation of the fact that each entry is a dot product of a row of $A$ with the vector $x$?

What I do understand is that $Ax$ encodes a linear transformation $T$. Consider a super simple example in 2 dimensions to explain what I do understand. I understand that $Ax = A \left[ \begin{array}{c} x_1 \\ x_2 \end{array} \right] = T(x) = T(x_1 \hat i + x_2 \hat j) = x_1 T(\hat i) + x_2 T( \hat j)$. This makes me interpret, intuitively, that multiplication by a matrix gives a new vector composed of the same linear combination, but of the transformed basis vectors (or of whatever vectors $x$ is composed). Furthermore, one can easily see from this view where matrix multiplication comes from:

$$Ax = \left[ \begin{array}{ccc} a_{11} & a_{12} \\ a_{21} & a_{22} \\ \end{array} \right] x = \left[ \begin{array}{ccc} T(\hat i)_1 & T(\hat j)_1\\ T(\hat i)_2 & T(\hat j)_2\\ \end{array} \right] \left[ \begin{array}{ccc} x_{1} \\ x_{2} \\ \end{array} \right] = x_1\left[ \begin{array}{ccc} T(\hat i)_1 \\ T(\hat i)_2 \\ \end{array} \right] + x_2 \left[ \begin{array}{ccc} T(\hat j)_1\\ T(\hat j)_2\\ \end{array} \right] = \left[ \begin{array}{ccc} T(\hat i)_1 x_1 + T(\hat j)_1 x_2\\ T(\hat i)_2 x_1 + T(\hat j)_2 x_2\\ \end{array} \right] $$

where it is now obvious why matrix multiplication is defined the way it is (because of linear transformations). The nice thing about this view is that one can interpret each column of the matrix as telling us how each basis vector changes, i.e. each column specifies how $\hat i$ and $\hat j$ are transformed. Furthermore, each coordinate keeps the weight it had in the old vector, but that weight now multiplies the new direction, e.g. $T(\hat i)$ for the first coordinate. This for me is really intuitive and explains a lot of where matrix multiplication comes from.
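For concreteness, here is a tiny NumPy sketch of this column view (the matrix and vector values below are arbitrary, purely my own illustration):

```python
import numpy as np

# Columns of A play the role of T(i-hat) and T(j-hat).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# Ax as the same linear combination, but of the transformed basis vectors.
column_view = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(A @ x, column_view)
print(A @ x, column_view)  # both give [17. 39.]
```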

However, notice that this view also reveals that each entry $(Ax)_i = a_i^T x$ is a dot product of a row of $A$ with the original array representation of the vector. This seems to me not to be a coincidence; something deeper has to be going on. Usually dot products are related to projections, so I was trying to understand whether each coordinate $(Ax)_i$ might actually encode how much the original $x$ is projected onto each row vector of $A$ (or possibly something to do with the row space of $A$, i.e. $C(A^T)$). In an attempt to understand this I considered what each row means:

$$ \left[ a_{i1} \dots a_{in} \right] \left[ \begin{array}{ccc} x_1\\ \vdots\\ x_n \\ \end{array} \right] = \sum^n_{j=1} a_{ij} x_j$$

In my old interpretation of what a column of a matrix means (this time the matrix is 1 by $n$), each entry $a_{ij}$ specifies how much the basis vector $e_j$ is transformed, i.e. the row sends $e_j$ to the scalar $a_{ij}$. However, I've had difficulty understanding, beyond that, the significance of the dot product of $x$ with the rows of $A$. Does someone know how to interpret this, or how to understand it at a conceptual level, similar to the interpretation I gave of what the columns of a matrix mean? Are we doing some transformation to the row space of $A$ or something like that?
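For comparison, the row view is just as easy to check numerically (again an illustrative sketch with made-up values, not anything canonical):

```python
import numpy as np

# A 2x3 matrix whose rows are a_1^T and a_2^T.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([7.0, 8.0, 9.0])

# Entry i of Ax is the dot product of row i with x.
row_view = np.array([A[i, :] @ x for i in range(A.shape[0])])
assert np.allclose(A @ x, row_view)
print(row_view)  # [ 50. 122.]
```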



1 Answer


Using the property of the transpose $\langle A^Tw,v\rangle = \langle w, Av\rangle$, I get:

$$\pmatrix{a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23}}\pmatrix{x \\ y \\ z} = \pmatrix{\langle A^T\pmatrix{1 \\ 0}, \pmatrix{x \\ y \\ z}\rangle \\ \langle A^T\pmatrix{0 \\ 1}, \pmatrix{x \\ y \\ z}\rangle} = \pmatrix{\langle \pmatrix{1 \\ 0}, A\pmatrix{x \\ y \\ z}\rangle \\ \langle \pmatrix{0 \\ 1}, A\pmatrix{x \\ y \\ z}\rangle} = \pmatrix{\operatorname{proj}_{e_1}(Ax) \\ \operatorname{proj}_{e_2}(Ax)}$$

That last bit should be absolutely clear: the first component of $Ax$ is the (scalar) projection of $Ax$ onto $e_1 = \pmatrix{1 \\ 0}$, and likewise for the second component.
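A quick numerical sanity check of the adjoint identity $\langle A^T w, v\rangle = \langle w, Av\rangle$ with $w = e_1, e_2$ (the values are arbitrary, chosen only for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([7.0, 8.0, 9.0])
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

lhs = np.array([(A.T @ e1) @ x, (A.T @ e2) @ x])  # <A^T e_i, x>, dot with a row of A
rhs = np.array([e1 @ (A @ x), e2 @ (A @ x)])      # <e_i, A x>, component i of Ax
assert np.allclose(lhs, rhs) and np.allclose(lhs, A @ x)
```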

Bobbie D
November 12, 2016 03:55 AM
