a bubble inside the ai bubble?
there's an interesting definition of a bubble by ruchir sharma: "all a bubble is, is a good idea gone too far."
while i do think we're experiencing an ai bubble, i don't mean it in a negative way, because i truly believe ai remains a profoundly transformative idea, much like the internet was in its time. in retrospect, the dot-com bubble formed around the internet, not because of the technology itself. similarly, we may be seeing an ai-generated (pun intended!) bubble, not because the technology lacks promise but because of the excitement and capital building around it.
perhaps this investment is creating new layers, or "bubbles within bubbles," reflecting both the promise and the hype around the field.
→the outer bubble
infrastructure build-out: record US capex spending on chips, data centers, and gpu farms, framed as the last big bet to outpace china.
cheaper money, bigger bets: today's fed rate cut pours more money into the system, lowering borrowing costs and fueling even riskier bets. the bubble still has air to grow, especially because the US cannot afford to lose the race to china.
where the profits flow: profits would flow towards chip makers like nvidia, hyperscalers like microsoft, big labs like openai, anthropic, and google, and certainly towards consumers through added ai productivity.
maybe too many wrappers: yet far too many startups are building thin layers on top without a guaranteed share of profits, echoing charlie munger's reminder that not all new-tech profits flow to the builders. venture capital is amplifying this by funding every thin ai wrapper, from yc batches to mega rounds, and not all of those bets will see profits flow downstream.
→is there an inner bubble?
a bubble within: but the capex story is not the only bubble; there may be another one forming within it.
llm ≠ agi: llms have been conflated with agi, even though they capture only one facet of intelligence: language, not perception or physical action.
caution from researchers: leading voices like fei-fei li and yann lecun caution that language-only models are limited: they can mimic reasoning but lack grounding in perception, action, and embodied understanding.
as fei-fei li explains:
"there is so much more than language. language is a powerful encoding of thoughts and information, but it's a lossy encoding of the 3d physical world that all animals and living things experience, and at human intelligence, so much is beyond the realm of language. language is purely generative. language doesn't exist in nature as a physical word. whereas the entire physical, perceptual, visual world is there, and eventually embodied intelligence."
where attention skews: the bubble here is less about capex and more about research focus. the same chips and data centers can run llms, vision models, or robotics simulators, but funding and talent are disproportionately funneled into language systems.
waiting for other breakthroughs: while i argue the focus is skewed, it's also true that large language models have had a huge impact, and other domains haven't yet had their "gpt moment". that breakthrough may well demand edge computing.
bets beyond language: researchers like fei-fei li continue to push vision-centric intelligence, and companies such as physical intelligence, figure, agility robotics, and even xAI are betting that embodied and spatial ai will unlock the next wave of impact.
but regardless, bubbles accelerate timelines: they create excess, but they also force breakthroughs.
and while the current focus and hype skew towards language models, with heavy bets on agentic and generative ai, many other ai domains around spatial and embodied ai are also being worked on, and they will (and should) likely have breakthroughs of their own soon.