---
id: "framework-web-rebuild-layers"
type: "framework"
source_timestamps: ["00:06:56", "00:09:18", "00:11:10"]
tags: ["architecture", "evolution", "infrastructure"]
related: ["concept-agentic-primitives", "concept-tool-agent-coevolution"]
validation_status: "aligned-with-validation-stacks"
sources: ["s20-50x-faster"]
sourceVaultSlug: "s20-50x-faster"
originDay: 20
---
# The 3 Layers of the Web Rebuild

## Overview

The transition to an agentic web is happening in three distinct, sequential phases. Each phase strips away more human-centric scaffolding than the last.

## The Three Layers

### Layer 1 — Optimize Existing Tools

Rewrite existing ecosystems in faster languages so agents wait less. Concrete examples:

- JavaScript build tools rewritten in [[entity-rust]] or Go
- Faster compilers, bundlers, and package managers

This layer keeps the *abstractions* humans recognize and simply accelerates them; it is enabled by [[concept-tool-agent-coevolution]].
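The Layer 1 gain is purely latency: the tool's contract is unchanged, only the wait shrinks. A minimal sketch of how that gain shows up to an agent (`slow_bundle` and `fast_bundle` are hypothetical stand-ins, not tools from the source):

```python
import time
from typing import Any, Callable, Tuple

def time_tool(invoke: Callable[[], Any]) -> Tuple[Any, float]:
    """Run one tool invocation, returning (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = invoke()
    return result, time.perf_counter() - start

# Illustrative stand-ins: a legacy bundler vs. a rewrite in a faster language.
def slow_bundle() -> str:
    time.sleep(0.05)   # simulated legacy-tool latency
    return "bundle.js"

def fast_bundle() -> str:
    time.sleep(0.005)  # same abstraction, an order of magnitude less waiting
    return "bundle.js"

_, slow_s = time_tool(slow_bundle)
_, fast_s = time_tool(fast_bundle)
assert fast_s < slow_s  # identical output, less agent wait time
```

The point of the sketch is that nothing about the tool's interface changes, which is exactly why Layer 1 is the easiest migration to prove out.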

### Layer 2 — Replace Tool Abstractions with Agent-Native Primitives

Abandon human-recognizable tools in favor of [[concept-agentic-primitives]]:

- Persistent shells / always-on containers (no startup cost)
- Shared KV caches replacing text-based message passing
- Sub-millisecond branching file systems like [[entity-branchfs]]
- Wire formats that assume the consumer can ingest millions of rows at once

This is the architectural break with [[concept-human-affordance-bottleneck]].
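The branching file system bullet can be made concrete with a copy-on-write sketch. This is a toy model of the general technique, not the actual [[entity-branchfs]] API (which the source does not specify): a branch stores only a pointer to its parent plus a local overlay, so creating one costs O(1) regardless of tree size.

```python
class BranchingStore:
    """Toy copy-on-write key-value store. Branching is O(1) because a
    branch records only its parent link plus a local overlay of changes."""

    def __init__(self, parent=None):
        self._parent = parent
        self._local = {}        # writes land here; the parent is never touched
        self._deleted = set()   # paths deleted in this branch only

    def write(self, path: str, data: str) -> None:
        self._local[path] = data
        self._deleted.discard(path)

    def read(self, path: str):
        if path in self._deleted:
            return None
        if path in self._local:
            return self._local[path]
        return self._parent.read(path) if self._parent else None

    def delete(self, path: str) -> None:
        self._local.pop(path, None)
        self._deleted.add(path)

    def branch(self) -> "BranchingStore":
        # No data is copied: the child shares everything via the parent link.
        return BranchingStore(parent=self)

root = BranchingStore()
root.write("src/main.rs", "fn main() {}")
b = root.branch()                  # instant, regardless of store size
b.write("src/main.rs", "fn main() { panic!() }")
assert root.read("src/main.rs") == "fn main() {}"   # parent unaffected
```

Structural sharing like this is what makes "fork the world, try something, throw it away" cheap enough for agents to do millions of times.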

### Layer 3 — Remove Human Scaffolding Entirely

As models become more capable, the interfaces and frameworks built to *inspect and manage* them become pure overhead: tools built for today's models turn into a drag on tomorrow's, as captured in [[quote-tools-become-drag]].

This layer is the operational expression of Rich Sutton's Bitter Lesson — see [[prereq-the-bitter-lesson]]. Human-engineered heuristics get out-competed by general methods leveraging massive computation.

## Sequencing Logic

The layers cannot be skipped in practice:
- Layer 1 buys time and proves the migration path
- Layer 2 requires Layer 1 toolchains to be viable
- Layer 3 requires the model capability that Layer 2 enables

## External Validation

Aligns with modern validation stacks:

- Layer 1 is validated via benchmarks
- Layer 2 via continuous monitoring of tool-invocation success and latency
- Layer 3 via drift detection that drops the need for human inspection
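The Layer 2 signal can be sketched as a small in-process recorder; `ToolCallMonitor` and its thresholds are illustrative assumptions, not part of any named stack:

```python
from statistics import quantiles

class ToolCallMonitor:
    """Track tool-invocation outcomes; flag degradation when success rate
    drops or tail latency grows past fixed thresholds."""

    def __init__(self, min_success=0.95, max_p95_ms=50.0):
        self.min_success = min_success
        self.max_p95_ms = max_p95_ms
        self.outcomes = []   # list of (succeeded: bool, latency_ms: float)

    def record(self, succeeded: bool, latency_ms: float) -> None:
        self.outcomes.append((succeeded, latency_ms))

    def success_rate(self) -> float:
        return sum(ok for ok, _ in self.outcomes) / len(self.outcomes)

    def p95_latency_ms(self) -> float:
        latencies = sorted(ms for _, ms in self.outcomes)
        # statistics.quantiles with n=20 yields 19 cut points; the last is p95
        return quantiles(latencies, n=20)[-1] if len(latencies) > 1 else latencies[0]

    def healthy(self) -> bool:
        return (self.success_rate() >= self.min_success
                and self.p95_latency_ms() <= self.max_p95_ms)

m = ToolCallMonitor()
for _ in range(99):
    m.record(True, 10.0)
m.record(False, 200.0)   # one slow failure out of 100 calls
assert m.success_rate() == 0.99
assert m.healthy()       # still within both thresholds
```

A real deployment would push these numbers to a metrics backend instead of holding them in memory, but the signals monitored are the same: success rate and tail latency per tool.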

## Related

- [[concept-agentic-primitives]]
- [[concept-tool-agent-coevolution]]
- [[prereq-the-bitter-lesson]]
- [[quote-tools-become-drag]]
- [[claim-speed-bottleneck-limit]]
