Accuracy vs Precision


I had a hard time remembering this¹, almost as bad as Type 1 vs Type 2 errors (which are still the worst terminology I’ve ever seen).

Reframing them in actual statistical terms made them easy to remember.

Accuracy

A random variable (the estimator) is accurate if it has low bias.
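In symbols: if \(\hat\theta\) is an estimator of a parameter \(\theta\), its bias is

\[ \operatorname{Bias}(\hat\theta) = \mathbb{E}[\hat\theta] - \theta, \]

and accurate means this is zero (or close to it).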

Precision

A random variable is precise if it has low variance.
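Precision, by contrast, never mentions the true \(\theta\):

\[ \operatorname{Var}(\hat\theta) = \mathbb{E}\!\left[ (\hat\theta - \mathbb{E}[\hat\theta])^2 \right] \]

is the estimator's spread around its own mean. The standard decomposition \(\operatorname{MSE}(\hat\theta) = \operatorname{Bias}(\hat\theta)^2 + \operatorname{Var}(\hat\theta)\) makes the independence explicit: they're separate components of the error, and either can be large or small on its own.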

Examples

From these definitions, it’s finally clear to me why neither implies the other. Say your true distribution is a standard Gaussian (\(\mu = 0, \sigma = 1\)). Consider these cases for your estimator, also a Gaussian, with the following parameters (a quick simulation of all four cases follows the list):

(\(\mu = 0, \sigma = 1000\))

Perfectly accurate because it’s unbiased. Imprecise as hell.

(\(\mu = 1000, \sigma = 0.01\))

Inaccurate. Very precise.

(\(\mu = 1000, \sigma = 1000\))

Inaccurate and imprecise.

(\(\mu = 0, \sigma = 0.01\))

Perfectly accurate, more precise than the true distribution because its spread is lower.
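To see all four cases numerically, here’s a minimal NumPy sketch (my own illustration, not from the post; the names and sample size are arbitrary) that draws from each estimator’s distribution and reports empirical bias and spread:

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu = 0.0  # the parameter we're estimating

# (mu, sigma) of each estimator's sampling distribution, matching the cases above
cases = {
    "accurate, imprecise":   (0.0,    1000.0),
    "inaccurate, precise":   (1000.0, 0.01),
    "inaccurate, imprecise": (1000.0, 1000.0),
    "accurate, precise":     (0.0,    0.01),
}

for name, (mu, sigma) in cases.items():
    draws = rng.normal(mu, sigma, size=100_000)
    bias = draws.mean() - true_mu  # accuracy: how far the average lands from the truth
    spread = draws.std()           # precision: spread around the estimator's own mean
    print(f"{name:22s}  bias ≈ {bias:9.2f}   std ≈ {spread:9.2f}")
```

The first column tracks accuracy (distance of the empirical mean from the true \(\mu = 0\)); the second tracks precision. The two move independently, which is the whole point.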

1. For 4 years, until 10 minutes ago, when I finally thought about it properly.
