---
id: "concept-ai-energy-function"
type: "concept"
source_timestamps: ["00:12:27"]
tags: ["economics", "energy", "artificial-intelligence"]
related: ["concept-lng-helium-production-link", "quote-ai-energy", "action-model-energy-costs", "claim-tsmc-energy-vulnerability"]
definition: "The economic principle that the cost and feasibility of artificial intelligence compute are directly downstream of regional energy prices."
sources: ["s50-helium-48-days"]
sourceVaultSlug: "s50-helium-48-days"
originDay: 50
---
# AI as a Function of Energy Costs

The speaker posits a fundamental economic equation for the modern tech era: **AI is a function of energy costs.** See [[quote-ai-energy]].

This applies not only to the electricity required to run data centers, but also, crucially, to the energy required to manufacture the chips themselves. Fabs in East Asia are heavily dependent on imported LNG to power their operations — see [[claim-tsmc-energy-vulnerability]] for the canonical example.

When LNG prices spike — due to disruptions at [[concept-qatar-ras-laffan-chokepoint]] or rerouting of shipping lanes — the overhead costs for [[entity-tsmc]], [[entity-samsung-electronics]], and [[entity-sk-hynix]] increase dramatically. These higher input costs must eventually be passed through the supply chain, resulting in more expensive chips, more expensive servers, and ultimately more expensive AI inference.
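The pass-through chain above can be sketched as a toy cost model. This is a minimal illustration only: the function name, the energy-cost share, and the pass-through fraction are hypothetical assumptions, not figures from the source.

```python
def chip_unit_cost(base_cost: float,
                   energy_share: float,
                   lng_multiplier: float,
                   pass_through: float = 1.0) -> float:
    """Toy model of chip cost after an LNG price change.

    base_cost:      baseline cost per chip (arbitrary units).
    energy_share:   fraction of base_cost attributable to energy
                    (hypothetical; not a sourced figure).
    lng_multiplier: new LNG price divided by the baseline price,
                    e.g. 2.0 for a 100% price spike.
    pass_through:   fraction of the energy cost increase passed on
                    to buyers (1.0 = fully passed through).
    """
    energy_cost = base_cost * energy_share
    delta = energy_cost * (lng_multiplier - 1.0) * pass_through
    return base_cost + delta

# Example: a chip with an assumed 8% energy cost share, LNG doubling,
# full pass-through: cost rises from 100.0 to 108.0.
print(chip_unit_cost(100.0, 0.08, 2.0))
```

Even a modest energy share compounds downstream: the dearer chip feeds into server cost, which feeds into inference cost, which is the chain the note describes.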

The corollary: a strategic advantage in AI requires securing cheap, reliable, long-term energy. This is why [[concept-power-of-siberia-2]] is a central piece of the [[concept-chinese-native-chip-stack]] thesis — and why planners should heed [[action-model-energy-costs]].
## Related across days
- [[concept-cloud-ai-economics]]
- [[concept-data-center-nimbyism]]
- [[concept-inference-wall]]
