2 Comments
Sim Blaustein

I love reading these as always (and the walk down memory lane!). While I broadly agree with your thesis, I'll add a few "whatabouts" here given the trickiness of market timing:

1) What about a Moore's-law-style deflation in AI capex costs? I agree that AI demand is likely to only grow, but does some combination of improved model efficiency and GPU capability result in GPU capacity growing faster than expected demand?

2) Similarly, if the pace of LLM improvements slows, does the main workload shift away from the compute-intensive training to more efficient/lighter-weight AI applications?

3) Are today's scaled tech companies (Mag7) "smarter" than the tech incumbents of 1995, and thus more likely to dominate the AI wave? (i.e., they've all read Clay Christensen, etc.) I could see a world in which Google, Meta, Microsoft (heck, even Apple and Tesla) reap a much larger share of the incremental AI value generated than their 1995 equivalents did from the dot-com boom (ironically, that list would have included Microsoft).

David Levy

All valid points.

Re (1): agree on a unit basis (i.e., $/token), but I believe capex will increase in aggregate for at least the next several years. And demand plus larger and larger models will beg for more power.

Re (2): as a percentage of workloads, yes, but in absolute tokens / dollars / etc., no.

Re (3): possibly, though AMZN and (to a lesser extent) MSFT are already so far behind that AMZN has bet the farm on Anthropic (which it doesn't own… yet) and MSFT on OpenAI (a profoundly expensive acquisition that a normal FTC wouldn't allow, but lord knows now….)
