---
id: "prereq-the-bitter-lesson"
type: "prereq"
source_timestamps: ["00:11:17"]
tags: ["ai-theory", "rich-sutton"]
related: ["framework-web-rebuild-layers"]
canonical_url: "http://www.incompleteideas.net/IncIdeas/BitterLesson.html"
sources: ["s20-50x-faster"]
sourceVaultSlug: "s20-50x-faster"
originDay: 20
---
# The Bitter Lesson of AI Research

## What You Need to Know

The speaker explicitly invokes **"the bitter lesson from AI research,"** a reference to **Rich Sutton's 2019 essay** of the same name, which argues:

> General methods that leverage massive computation ultimately dominate over human-engineered, domain-specific heuristics.

Throughout AI history, every attempt to encode human knowledge directly into a system (computer chess, Go, speech recognition, computer vision) was eventually overtaken by general methods that scaled raw compute, search, and learning.

## Why It Matters Here

The Bitter Lesson is the **theoretical foundation** for the claim that human scaffolding will be stripped out of the software stack: it explains why **Layer 3** of [[framework-web-rebuild-layers]] is inevitable rather than optional.

It directly underwrites:

- [[quote-tools-become-drag]] — human inspection interfaces become overhead
- [[quote-computing-efficiency]] — efficiency is a strong attractor
- The contrarian tilt of [[contrarian-model-speed-is-irrelevant]] — gains come from systemic compute scale, not local model tricks

## Canonical Reference

- http://www.incompleteideas.net/IncIdeas/BitterLesson.html

## Related

- [[framework-web-rebuild-layers]]
- [[quote-tools-become-drag]]
- [[concept-agentic-primitives]]
