Accuracy vs Precision


I had a hard time remembering this [1], almost as bad as Type I vs Type II errors (which are still the worst terminology I've ever seen).

Reframing them in actual statistical terms made them easy to remember.

Accuracy

A random variable (the estimator) is accurate if it has low bias.
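In symbols (using \(\hat\theta\) for the estimator and \(\theta\) for the true parameter, which aren't named anywhere else in this post): the bias is \(\operatorname{Bias}(\hat\theta) = \mathbb{E}[\hat\theta] - \theta\), and accurate means that's close to zero.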

Precision

A random variable is precise if it has low variance.
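With the same notation, precise means \(\operatorname{Var}(\hat\theta) = \mathbb{E}\big[(\hat\theta - \mathbb{E}[\hat\theta])^2\big]\) is small.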

Examples

From these definitions, it’s finally clear to me why neither implies the other. Say your true distribution is a standard (\(\mu = 0, \sigma = 1\)) Gaussian and you’re estimating its mean, so the true value is 0. Consider these cases for your estimator, whose sampling distribution is also a Gaussian with the following parameters:

(\(\mu = 0, \sigma = 1000\))

Perfectly accurate because it’s unbiased. Imprecise as hell.

(\(\mu = 1000, \sigma = 0.01\))

Inaccurate. Very precise.

(\(\mu = 1000, \sigma = 1000\))

Inaccurate and imprecise.

(\(\mu = 0, \sigma = 0.01\))

Perfectly accurate, more precise than the true distribution because its spread is lower.
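If you want to see this numerically rather than take my word for it, here's a minimal NumPy sketch (the case names and sample count are just choices I made for illustration): draw a bunch of estimates from each of the four sampling distributions above and look at the empirical bias and spread.

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 0.0   # the quantity we're estimating (mean of the standard Gaussian)
n = 100_000        # number of simulated estimates per case

# (mu, sigma) of each estimator's sampling distribution
cases = {
    "accurate, imprecise":   (0.0,    1000.0),
    "inaccurate, precise":   (1000.0, 0.01),
    "inaccurate, imprecise": (1000.0, 1000.0),
    "accurate, precise":     (0.0,    0.01),
}

for name, (mu, sigma) in cases.items():
    estimates = rng.normal(mu, sigma, size=n)
    bias = estimates.mean() - true_value   # low bias -> accurate
    spread = estimates.std()               # low variance -> precise
    print(f"{name:24s} bias ≈ {bias:10.3f}   std ≈ {spread:10.3f}")
```

The first and last cases should print a bias near 0 (accurate), the middle two a bias near 1000 (inaccurate), and precision shows up directly in the std column.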

[1] 4 years until 10 minutes ago, when I finally thought about it properly.
