The Traps Behind the Data: A Guide to 55 Statistical Fallacies

The Blind Leading The Blind – Pieter Bruegel

Compiling these 55 fallacies was a humbling experience.

Not because I discovered other people’s mistakes, but because in nearly every example, I could recall some version I had committed myself. Not the dramatic, large-scale kind of failure — more like those everyday moments at work where you think you are rigorously analyzing data, when in reality you are just convincing yourself.

Knowing the name of a bias does not make you immune. But at least, when that feeling creeps in, you have a better chance of recognizing it sooner.


The 8 Articles in This Series

I. Sampling Bias (6 types)

Your sample does not represent your target population — the analysis is skewed before it even begins.

Survivorship Bias · Sample Selection Bias · Coverage Bias · Self-Selection Bias · Convenience Sampling Bias · Time Window Bias
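Survivorship Bias, the first trap in this group, is easy to reproduce numerically: average only the entities that survived some cutoff, and the average flatters itself. A minimal sketch in Python (the fund returns and the survival cutoff are invented for illustration):

```python
import random

random.seed(42)

# Hypothetical annual returns (%) for 1,000 funds: true mean 0, wide spread.
returns = [random.gauss(0, 10) for _ in range(1000)]

# Funds that lost more than 5% are closed and vanish from the database.
survivors = [r for r in returns if r > -5]

population_mean = sum(returns) / len(returns)
survivor_mean = sum(survivors) / len(survivors)

# Averaging only the survivors overstates typical performance.
print(f"all funds:  {population_mean:.2f}%")
print(f"survivors:  {survivor_mean:.2f}%")
```

The survivor average is pulled up purely by deleting the left tail; no fund got any better.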


II. Measurement Bias (6 types)

What you measured is not what you think you measured — the ruler itself is bent.

Social Desirability Bias · Observer Bias · Recall Bias · Instrument & Measurement Error · Confirmation Bias (Collection Phase) · Temporal & Seasonal Bias


III. Numerical Intuition Traps (6 types)

Mathematical results defy intuition — the brain systematically errs when processing probabilities and aggregated numbers.

Base Rate Fallacy · Gambler’s Fallacy · Law of Small Numbers · Simpson’s Paradox · Ecological Fallacy · Atomistic Fallacy
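Of the traps above, Simpson's Paradox most deserves a worked example: an effect can reverse when subgroups are pooled. The figures below are the classic kidney-stone numbers often used to illustrate it:

```python
# (successes, total) per treatment, split by stone size.
small = {"A": (81, 87),   "B": (234, 270)}  # small stones: A wins (93% vs 87%)
large = {"A": (192, 263), "B": (55, 80)}    # large stones: A wins (73% vs 69%)

for name, groups in [("small", small), ("large", large)]:
    for t in ("A", "B"):
        s, n = groups[t]
        print(f"{name} stones, treatment {t}: {s/n:.0%}")

# Pooled, B looks better — because B was mostly given the easy (small) cases.
for t in ("A", "B"):
    s = small[t][0] + large[t][0]
    n = small[t][1] + large[t][1]
    print(f"overall, treatment {t}: {s/n:.0%}")
```

Treatment A wins in every subgroup yet loses overall; the pooled comparison is confounded by case severity.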


IV. Causal Inference (5 types)

You see a correlation and assume you have found a cause, but the road from correlation to causation runs through several specific trap mechanisms.

Confounding Factor · Reverse Causality · Collider Bias · Spurious Correlation · Mediation Fallacy


V. Cognitive Bias (7 types)

During analysis and decision-making, the brain takes specific, well-documented systematic shortcuts that lead you astray.

Confirmation Bias · Anchoring Bias · Availability Heuristic · Representativeness Heuristic · McNamara Fallacy · Goodhart’s Law · Path Dependence


VI. Statistical Methods (11 types)

You assume the tool is correct, but the tool’s assumptions have been violated — or you misunderstood what the tool is telling you.

Regression to the Mean · Multicollinearity · Omitted Variable Bias · Overfitting · Data Leakage · Look-ahead Bias · Extrapolation Bias · P-value Misinterpretation · Effect Size Neglect · Underpowered Study · Multiple Comparisons
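Multiple Comparisons, the last item above, is easy to feel in simulation: test enough pure-noise hypotheses and some will clear the significance bar by chance alone. The experiment counts and threshold below are invented:

```python
import random

random.seed(7)

N_TESTS, N_FLIPS = 1000, 100

# 1,000 "experiments", each 100 fair coin flips: every null hypothesis is true.
false_positives = 0
for _ in range(N_TESTS):
    heads = sum(random.random() < 0.5 for _ in range(N_FLIPS))
    # Normal approximation: |heads - 50| >= 10 is roughly p < 0.05 (sd = 5).
    if abs(heads - 50) >= 10:
        false_positives += 1

print(f"{false_positives} of {N_TESTS} pure-noise tests look 'significant'")
```

Roughly 5% of these nulls "succeed"; run enough tests and a finding is guaranteed, which is exactly why corrections like Bonferroni exist.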


VII. Experiment Design (9 types)

You assume the experiment is fair, but its own design and execution introduced the bias.

Hawthorne Effect · Placebo Effect · Experimenter Expectancy Effect · Intervention Bias · Non-Response Bias · Questionnaire Bias · Information Bias · Detection Bias · Exclusion Bias
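Non-Response Bias from this group is another one-simulation trap: if satisfaction drives the probability of answering a survey, the respondents' average drifts above the truth. All parameters below are invented:

```python
import random

random.seed(3)

# True satisfaction scores, 1-10, for 10,000 customers.
scores = [min(10, max(1, round(random.gauss(5.5, 2)))) for _ in range(10000)]

# Happier customers are more likely to answer: response prob grows with score.
responses = [s for s in scores if random.random() < s / 12]

true_mean = sum(scores) / len(scores)
survey_mean = sum(responses) / len(responses)
print(f"true mean:   {true_mean:.2f}")
print(f"survey mean: {survey_mean:.2f}")  # biased upward
```

The survey did nothing wrong question by question; the bias lives entirely in who chose to answer.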


VIII. Presentation & Reporting (5 types)

The data itself is fine, but selective presentation or visualization skews the conclusions.

Truncated Y-Axis · Dual-Axis Manipulation · Cherry-Picking / Texas Sharpshooter · File Drawer Problem · Publication Bias
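Cherry-Picking / Texas Sharpshooter from this list is simple to demonstrate: slice pure noise into enough segments and the best-looking one always tells a story. The segment count and sizes below are invented:

```python
import random

random.seed(11)

# A metric with no real effect: 26 segments (say, A-Z) of pure noise.
segments = {chr(65 + i): [random.gauss(0, 1) for _ in range(30)]
            for i in range(26)}

means = {name: sum(v) / len(v) for name, v in segments.items()}
best = max(means, key=means.get)

# Reporting only the best segment manufactures a "finding" out of noise.
print(f"segment {best} 'improved' by {means[best]:+.2f} (true effect: 0)")
```

Draw the target after the shots land and you will always hit the bullseye; the honest fix is to declare the segment of interest before looking.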


For the complete checklist, see the Debugging Your Thinking Map.