29 Comments
Peter Wildeford:

I got a comment privately that I want to post:

> This contrasts the inflation-adjusted $37B spent on the Manhattan Project around 1941 to the $400B+ AI bet in 2025, implying it's 10x bigger than the Manhattan Project.

> However, if measured as a % of GDP, it's similar (roughly 1-2%). That's still notable, but 1x vs. 10x is an important difference. I think this is important to point out given significant population and TFP growth since 1941.
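The "% of GDP" comparison can be sanity-checked with rough arithmetic; the GDP and Manhattan Project cost figures below are approximate assumptions for illustration, not from the original comment:

```python
# Rough check of the "% of GDP" comparison: total Manhattan Project spending
# vs. the 2025 AI buildout, each divided by nominal GDP of its own era.
# All figures are approximate assumptions for illustration only.

manhattan_cost_nominal = 2e9    # ~$2B total program cost in 1940s dollars (assumed)
us_gdp_1945_nominal = 200e9     # ~$200B nominal US GDP circa 1945 (assumed)

ai_capex_2025 = 400e9           # the $400B+ figure from the comment
us_gdp_2025 = 29e12             # ~$29T nominal US GDP (assumed)

manhattan_share = manhattan_cost_nominal / us_gdp_1945_nominal  # ~1.0%
ai_share = ai_capex_2025 / us_gdp_2025                          # ~1.4%

print(f"Manhattan: {manhattan_share:.1%}, AI 2025: {ai_share:.1%}")
```

Under these assumptions both land in the "roughly 1-2%" band the comment describes.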

Peter Wildeford:

I got a comment privately that I want to post:

> I would point out that Nvidia is distinct from the old telcos in an important way, which is that it has 75%+ margins, so if it keeps the fraction of its sales in this kind of deal low, it's not really an existential risk for them.

> Also, "Altman proposes soon somehow speeding up this process 100x" regarding data centers taking 2 years seems a little misleading. He's presumably not talking about building one from scratch in a week, but rather starting 50 projects a year such that one is finishing every week, I'd guess.
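That pipelining reading can be sketched with Little's law (work in progress = throughput x lead time); the 2-year build time comes from the comment, the rest is illustrative:

```python
# Little's law: projects_in_flight = completion_rate * build_time.
# With ~2-year builds, finishing one data center per week requires
# ~104 concurrent projects, sustained by starting ~52 new ones per year.

build_time_weeks = 2 * 52            # assumed 2-year construction time
completions_per_week = 1             # target: one finishing every week

projects_in_flight = completions_per_week * build_time_weeks   # 104 in flight
starts_per_year = completions_per_week * 52                    # ~50 starts/year

print(projects_in_flight, starts_per_year)
```

So "starting ~50 projects a year" is exactly the steady-state rate needed for weekly completions once the two-year pipeline fills.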

The Sober Analyst:

Very interesting. The bottleneck is the power generation itself, as you discussed. I’m sure your prediction is correct that gigawatts-per-facility will increase, but the amount of currently available energy doesn’t add up, and neither do the projections. The new builds (nuclear and otherwise) are on a much longer timeline.

Peter Wildeford:

Yeah. There are a lot of plans to get the requisite energy, none with a guaranteed chance of success. It’s hard to project.

The Sober Analyst:

Not enough to fulfill all of the promises, in all likelihood. Timelines will have to be pushed back.

SSBN734:

The power transformers have a 24-month+ lead time for GW-scale generation plants, let alone the labor and other plant equipment.

Ebenezer:

>NVIDIA competitors catch up

The irony is that if AGI *does* arrive on schedule, NVIDIA competitors are quite likely to catch up.

NVIDIA's moat is in software, and automating software engineers is a major focus of the industry. Not only that, but the software NVIDIA writes is the sort which is easier to automate: it has to pass a big test suite with high performance. There's little UI or human factors component. It's a fairly objective evaluation function, akin to the sort of work the AI industry is currently doing around RL with verifiable rewards.

I'm just a spectator here. I don't have deep knowledge of this industry. But from the outside, NVIDIA looks like a cartoon character with a chainsaw, furiously destroying the branch which supports their weight.

Short NVIDIA could be the trade of the century; you just need to time it right. I'm envisioning a scenario like the following: Google makes use of its existing work on verifiable rewards and autoformalization to design a provably correct, superfast TPU using AI. They don't need to replicate the breadth of customers NVIDIA serves. They initially focus on serving their internal AI needs, and then start reaching out to a few other major AI players, massively undercutting NVIDIA prices to steal market share. They take the 80/20 approach of targeting the 20% of configurations which power 80% of AI servers, instead of serving a big variety of configurations like NVIDIA. As soon as Google produces credible evidence that it can succeed with this strategy, you'd be a fool to hold on to your NVIDIA shares. They've got a long way to fall.

From an AI safety perspective, the above scenario seems mildly positive. Race dynamics decrease if Google achieves a commanding lead / leverage over other AI players. And decreasing profits in the industry means that AI leaders are more likely to be in it for humanity rather than getting rich quick.

Matt Reardon:

Are you using a different definition of market cap here (to imply something about liquid assets)? Nvidia’s market cap is $4.3 trillion, meaning $100 billion is only ~3%, not 35%.

Peter Wildeford:

No, I'm just bad at math. Fixed!

Larry L Terry:

Which calls into question other numbers in the piece. Just sayin'.

Peter Wildeford:

Totally fair for that to be called into question given this error, which I regret.

Eva B:

I

Larry L Terry:

I get the same results as you.

Zenon Kuzmyn:

Nobody is talking about *water* here, either. Or, rather, the need to constantly keep these *cool* behemoths from constantly running *hot*…

Mylo:

If you're going to compare a contract with market cap, you need to multiply the expected annual earnings from that contract by a reasonable P/E multiple of ~15-20.
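As a hypothetical worked example of that valuation logic (all input figures are invented for illustration, not from the comment):

```python
# Valuing a contract's contribution to market cap: take the expected annual
# earnings it generates and apply a P/E multiple, rather than comparing the
# raw contract value to market cap directly. All inputs are hypothetical.

contract_revenue_per_year = 20e9   # hypothetical annual revenue from the deal
net_margin = 0.50                  # hypothetical net margin on that revenue
pe_multiple = 17                   # midpoint of the suggested ~15-20 range

annual_earnings = contract_revenue_per_year * net_margin       # $10B/year
market_cap_contribution = annual_earnings * pe_multiple        # $170B

print(f"${market_cap_contribution / 1e9:.0f}B")
```

Under these made-up inputs, a $20B/year contract would justify roughly $170B of market cap, which is why earnings times multiple, not contract size, is the right comparison.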

Robots and Chips:

Outstanding analysis of the circular financing dynamics. Your telecom crash parallel is spot-on; what makes this particularly concerning is that NVIDIA's 75%+ margins actually create more systematic risk, not less. When margins are that high, the entire value chain becomes dependent on maintaining those margins, which requires the monopoly position to persist indefinitely.

The math you laid out is sobering: by 2028, we're talking about $1T annual spend on AI infrastructure, with 25-30% of S&P 500 value tied to these bets working out. What I find most compelling is your treatment of the physical constraints: you can't just 10x energy generation or data center construction speed through financial engineering. The bottleneck shifts from capital to actual engineering and manufacturing capacity.

One thing I'd add: the Chinese angle matters more than people realize. If US export controls successfully kneecap Chinese AI development while our companies are this leveraged, we've created a single point of failure for a quarter of the entire stock market. Great piece.

Robots and Chips:

This is one of the most rigorous analyses of the AI financing structure I've seen. The circular vendor financing dynamic is what makes this genuinely novel: when NVIDIA can essentially print its own demand through strategic investments, traditional supply-demand analysis breaks down.

What particularly strikes me is your 2028-2030 inflection point observation: we're not just talking about whether AI continues scaling, but whether the entire financial architecture remains viable. If a 1e28 FLOP model doesn't meaningfully advance capabilities beyond what we see at 1e27, the whole structure unwinds simultaneously across NVIDIA, Oracle, and OpenAI. The correlation risk you identified is terrifying: this isn't diversified exposure to AI, it's a single levered bet with multiple entry points.

Robots and Chips:

This is an outstanding piece of analysis. The 'Infinite Money Glitch' framing captures something genuinely concerning about this circular vendor financing at unprecedented scale. Your point about NVIDIA's position being fundamentally different from normal vendor financing, with 80-90% market share making them more like a 'sovereign lending to colonies', is spot-on. The parallel to the 2001 telecom crash is chilling. The projections for 1e28-1e29 FLOP models by 2029-2030 are breathtaking, assuming all the bottlenecks get resolved. The observation that 25-30% of the S&P 500 is now a leveraged bet on AGI arriving on schedule should be front page news. Thanks for pulling together all these threads with such rigor!

JS Denain:

> This likely will be a part of 45-60GW of total compute across Meta, Microsoft, Amazon/AWS/Anthropic, OpenAI/Oracle, Google/DeepMind, and xAI

It's worth clarifying whether this includes non-AI compute or not. Given your forecast for 2027 (90GW in AI compute), my understanding is that this is 45-60GW in AI compute for mid-2026. This seems too aggressive to me.

Josh's report with EPRI (https://www.epri.com/research/products/000000003002033669), which I agree with, suggests closer to 10GW in worldwide AI power currently (cf. page 19), 20-30GW in 2026, and around 40GW in 2027 (cf. Figure 9, doubling to go from US to world).

JS Denain:

Aside from the 4 methods in the EPRI report, I know of the following predictions of AI power demand:

- 5 sources cited in https://www.offgridai.us/ (including Situational Awareness)

- RAND https://www.rand.org/pubs/research_reports/RRA3572-1.html which is also particularly aggressive, 327 GW by 2030 (and starts at 11GW in 2024)

- Anthropic's summary page 7 here https://www-cdn.anthropic.com/0dc382a2086f6a054eeb17e8a531bd9625b8e6e5.pdf

- SemiAnalysis here https://files.nitrd.gov/90-fr-9088/SemiAnalysis-AI-RFI-2025.pdf

SSBN734:

The discussion about where tens to hundreds of GW of new generation will come from on a short time scale feels a lot like an "Underpants Gnome" problem.

There is no possible way to build this kind of generation in this country within the next few years. Physically impossible. The lead time for GW-sized transformers alone is tens of months, to say nothing of all the other equipment and labor to put it all together.

Nostradamus 2:

AGI will make these concerns trivial

SSBN734:

Can AGI conjure 24-month lead time transformers overnight?

Nostradamus 2:

If they wish

Alex Cosgrove:

One of Oracle's 5 centers is in my town...

James Nichols:

Some of the newer AI data centers are already using nuclear power to make them viable.
