- matrix = linear map
- tensor = multilinear map
## The Long Short Of It
This is a brief “explanation” and contains almost no detail. To be fair, it’s really just meant to give some intuition to someone who understands linear algebra semi-well.
First, think of a matrix, which we'll call $A$.
(Digression: A matrix is nothing but an array of stuff, $a_{ij}$, which are usually numbers, but don’t have to be. We’ll say they’re real numbers since in machine learning, that’s pretty much always the case. If you want to learn more, read about fields.)
These matrices can be used for more than linear algebra, but that’s where they really shine, because of the next piece of information.
Linear function: A function $f$ satisfying $f(u + v) = f(u) + f(v)$ and $f(cv) = c\,f(v)$ for all vectors $u, v$ and scalars $c$.
Linear functions and matrices are in 1-1 correspondence. Every linear function has a unique representation as a matrix (in terms of some basis), and every matrix represents some linear function (in that basis).
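This is easy to check numerically. A minimal sketch (the matrix $A$ and vectors here are arbitrary placeholders, not anything from the text): applying a matrix with `@` behaves exactly like a linear function.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary 2x3 matrix standing in for some linear function f.
A = rng.standard_normal((2, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)
c = 2.5

# Linearity: f(u + v) = f(u) + f(v), and f(c*u) = c * f(u).
assert np.allclose(A @ (u + v), A @ u + A @ v)
assert np.allclose(A @ (c * u), c * (A @ u))
print("matrix-vector multiplication is linear")
```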
That means every matrix $A$ represents some linear function $f_A$, and every linear function $f$ is represented by some matrix $A_f$.
Incidentally, matrix multiplication is defined to be equivalent to function composition. So if the matrices $A$ and $B$ correspond to linear functions $f_A$ and $f_B$, and $v$ is a vector, then $(AB)v = A(Bv) = f_A(f_B(v)) = (f_A \circ f_B)(v)$.
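A quick sketch of that composition identity, with arbitrary random matrices as stand-ins: multiplying the matrices first and then applying the product gives the same answer as applying one function after the other.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))  # f_A : R^3 -> R^2
B = rng.standard_normal((3, 4))  # f_B : R^4 -> R^3
v = rng.standard_normal(4)

# (AB)v = A(Bv): the matrix product AB is the composition f_A o f_B.
assert np.allclose((A @ B) @ v, A @ (B @ v))
print("matrix multiplication is function composition")
```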
Additionally, an $n \times m$ matrix ($n$ for row, $m$ for column) is a linear function from $\mathbb{R}^m$ to $\mathbb{R}^n$ (or $F^m$ to $F^n$ for any field $F$, not just $\mathbb{R}$).
This is why an $n \times m$ matrix, $A$, can only be (left) multiplied with an $m \times 1$ vector, $v$. Since $f_A : \mathbb{R}^m \to \mathbb{R}^n$, it can only be applied to vectors in $\mathbb{R}^m$, which are $m \times 1$ vectors.
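The shape rule shows up directly in NumPy (the sizes below are arbitrary examples): a 2×3 matrix happily eats a length-3 vector and produces a length-2 one, but refuses a length-4 vector.

```python
import numpy as np

A = np.ones((2, 3))  # a 2x3 matrix: a map from R^3 to R^2
v3 = np.ones(3)
v4 = np.ones(4)

print((A @ v3).shape)  # (2,) -- the output lives in R^2

try:
    A @ v4  # a vector from R^4: wrong input dimension
except ValueError:
    print("shape mismatch: a 2x3 matrix only accepts vectors in R^3")
```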
Notice that linear functions take in only one argument. It’s $f(v)$, not $f(u, v)$.
We’ll fix that now.
There’s an extension of linear functions to multiple arguments that preserves linearity in a simple way. Consider a multilinear function $f(v_1, v_2, \ldots, v_k)$.
Multilinear (with respect to a single argument $v_i$): Linear if every other argument is held fixed, and $v_i$ is allowed to vary. Think of a partial derivative for the basic idea.
A tensor is a multilinear function. Tensors and multilinear functions are in the same kind of 1-1 correspondence that linear functions and matrices are. This is why matrices are a (simple) type of tensor. After all, a function of 1 argument is just the simplest case of a function of many arguments.
There is a 1-1 correspondence between linear functions in one space and multilinear functions in another, related space, so tensors can be written as multi-dimensional arrays, or as nested arrays in libraries such as NumPy.
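A minimal sketch of that correspondence in NumPy (the tensor shape, the helper `apply_T`, and the einsum contraction are illustrative choices, not from the text): a 3-dimensional array acts as a function of three vector arguments, and fixing two of them leaves a function that is linear in the third.

```python
import numpy as np

rng = np.random.default_rng(2)

# A 3-dimensional array T, viewed as a multilinear function T(u, v, w) -> R.
T = rng.standard_normal((2, 3, 4))
u, u2 = rng.standard_normal(2), rng.standard_normal(2)
v = rng.standard_normal(3)
w = rng.standard_normal(4)

def apply_T(u, v, w):
    # Contract each axis of T against one argument, yielding a scalar.
    return np.einsum("ijk,i,j,k->", T, u, v, w)

# Multilinearity: with v and w held fixed, T is linear in its first argument.
assert np.isclose(apply_T(u + u2, v, w), apply_T(u, v, w) + apply_T(u2, v, w))
print("the 3-d array acts as a multilinear function")
```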
Tensors are multilinear functions. Matrices are linear functions.
If you find any errors, or if I’m just plain wrong, feel free to tell me.