2 Definitions: Computational and Abstract


In math, you notice that a lot of concepts have multiple definitions: formally equivalent, but each giving insight into a different aspect of the concept.

Some concepts have 2 especially notable definitions: one that lets you compute things and a more abstract one that lets you know what’s actually going on.

This is not a perfect division, but a useful one. I’ll use linear algebra as the main source of examples, since so much of computation relies on linear algebra.

If you want to escape the prison of convention, you need to learn both definitions. Otherwise, you’re forced to think in only one way, which imposes a particular taste on you. Not a good form of taste either, as why the lucky stiff pointed out.

When you don’t create things, you become defined by your tastes rather than ability. Your tastes only narrow and exclude people. So create.


Let’s start with a simple example: matrix multiplication.

Matrix Multiplication

Computational Definition

\[(AB)_{ij} := \sum_k A_{ik}B_{kj}\]

That formula is ugly. You can check that it satisfies properties like distributivity and associativity, but doing so is a pain with all the indices.

On the other hand, it’s a breeze to compute.
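
For concreteness, here’s a minimal Python sketch of the formula (none of this is from the original post; numpy is used only to check the result):

```python
import numpy as np

def matmul(A, B):
    """Compute (AB)_ij = sum_k A_ik * B_kj, entry by entry."""
    m, n, inner = A.shape[0], B.shape[1], A.shape[1]
    C = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(inner))
    return C

A = np.random.rand(3, 4)
B = np.random.rand(4, 2)
assert np.allclose(matmul(A, B), A @ B)  # matches numpy's built-in product
```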

Abstract Definition

Matrices are linear functions. Matrix multiplication is defined the way it is to correspond to function composition. Check that \(AB\) corresponds to the composition of the linear functions that \(A\) and \(B\) represent, and you get all those properties for free.

But that elegance has some downsides. It’s not really clear what the composite function looks like. You need the computational definition to find its matrix.
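
The correspondence is a one-line check in code: applying \(B\) and then \(A\) to a vector agrees with applying the single matrix \(AB\). A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.random((3, 3)), rng.random((3, 3))
x = rng.random(3)

# Composing the functions x -> Bx and y -> Ay ...
composed = A @ (B @ x)
# ... is the same linear function as x -> (AB)x.
assert np.allclose(composed, (A @ B) @ x)
```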

So both definitions have upsides and downsides. Luckily, we can use both of them at will.

Eigenvalues and Eigenvectors

Computational Definition

\(Ax = \lambda x\) for some nonzero vector \(x\). To find the eigenvalues, solve \(\det(A - \lambda I) = 0\); then, for each eigenvalue \(\lambda\), find the eigenvectors by solving \((A - \lambda I) x = 0\).
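
In practice you’d let a library do this; here’s a quick sketch (using numpy, not part of the original post) confirming that the solver’s output really satisfies \(Ax = \lambda x\):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)  # columns are eigenvectors

for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)  # Ax = lambda x
```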

Abstract Definition

The abstract definition has a nice geometric interpretation that makes most of the properties of eigenstuff obvious: an eigenvector spans a one-dimensional subspace that the operator maps into itself. I’ll refer you to my blog post linked above for more. It, and 2 others I wrote, show how eigenvectors let you view a linear operator as a sum of simple linear operators.

But to actually find the eigenvectors, you need the computational definition.
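
As a sketch of that “sum of simple operators” view (restricted here to a symmetric matrix, so the eigenvectors are orthonormal and the decomposition is clean):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric, so np.linalg.eigh applies
eigenvalues, V = np.linalg.eigh(A)  # columns of V are orthonormal eigenvectors

# Rebuild A as a sum of rank-one "simple" operators: A = sum_i lambda_i v_i v_i^T.
rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigenvalues, V.T))
assert np.allclose(rebuilt, A)
```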

Tensor Products

Computational Definition: The Kronecker Product

In this case, the computational definition may be clearer, since the abstract definition is really abstract.

\[\mathbf{A}\otimes\mathbf{B} = \begin{bmatrix} a_{11} \mathbf{B} & \cdots & a_{1n}\mathbf{B} \\ \vdots & \ddots & \vdots \\ a_{m1} \mathbf{B} & \cdots & a_{mn} \mathbf{B} \end{bmatrix}\]

See Wikipedia for a clearer expansion.
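
numpy implements this as np.kron; here’s a small sketch checking the mixed-product property \((A \otimes B)(C \otimes D) = AC \otimes BD\), which is the Kronecker product’s version of “multiplication corresponds to composition”:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C, D = (rng.random((2, 2)) for _ in range(4))

# Mixed-product property: (A ⊗ B)(C ⊗ D) = (AC) ⊗ (BD).
assert np.allclose(np.kron(A, B) @ np.kron(C, D),
                   np.kron(A @ C, B @ D))
```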

Abstract Definition(s)

Either the quotient of a free vector space by the relations that enforce bilinearity, or the space spanned by the bilinear maps \(e_i \otimes f_j\) built from bases of the two factors.

The first definition gives you all the abstract properties pretty much by definition but will leave you with no idea of what a tensor product looks like. The 2nd one is a little clearer, and gives a hint towards the universal property of the tensor product.

If you squint at the computational definition, you can morally believe the universal property, and it certainly tells you what tensors look like as matrices.
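
You can also watch the bilinearity relations from the quotient definition hold for the Kronecker product directly. A sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
A1, A2, B = (rng.random((2, 2)) for _ in range(3))
c = 3.0

# Additivity in the first slot: (A1 + A2) ⊗ B = A1 ⊗ B + A2 ⊗ B.
assert np.allclose(np.kron(A1 + A2, B), np.kron(A1, B) + np.kron(A2, B))
# Scalars slide between the slots: (cA) ⊗ B = c(A ⊗ B) = A ⊗ (cB).
assert np.allclose(np.kron(c * A1, B), c * np.kron(A1, B))
assert np.allclose(np.kron(c * A1, B), np.kron(A1, c * B))
```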

Determinants

Going in the other direction, here’s a concept where the computational definition has never managed to convince anyone that the concept was important.

Computational Definition

A weird formula that’s hard to remember and harder to typeset. The standard cofactor expansion is so inefficient it’s not worth writing down.

Abstract Definition

The product of all the eigenvalues, counted with multiplicity. If you know the abstract definition of eigenvalues, all the properties of the determinant are blindingly obvious.
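
A quick numerical sanity check of that definition (a sketch; the eigenvalues of a real matrix may be complex, but their product comes out real up to floating-point error):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((4, 4))

# det(A) is the product of the eigenvalues, counted with multiplicity.
product = np.prod(np.linalg.eigvals(A))
assert np.isclose(product, np.linalg.det(A))
```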
