The Price-to-Dream Ratio

Autonomous shopping agents, co-working and research agents, coding agents that write, test, and deploy software: the demos are impressive and the announcements relentless, but how much economic value is any of this generating? Almost all AI-related spending is capital expenditure. Companies are buying chips, building data centers, and scaling up cloud capacity. This is spending on AI infrastructure, not productivity from AI deployment. AI is in the infrastructure buildout phase, not the value-capture phase.

Massive spending is generating minimal returns, yet the market has decided to price the dream rather than the earnings. Can any of this translate into economic reality before the capital runs out and political patience expires? Infrastructure, capability, and revenue growth are all real. Productivity gains are emerging, but slowly, and a significant gap remains between capital invested and the return on that investment. AI is risky, but these investments are not irrational. They price the dream, and the long-term winners remain unclear.

The New Software Stack

For the better part of three decades, enterprise software followed a remarkably stable economic logic. You built a product. You sold access to it. You charged per seat. You expanded revenue by increasing the number of people required to operate the system. It was elegant, scalable, and wildly profitable.

Now it is breaking, and the cause is the decoupling of software revenue from human labor. The industry continues to frame this moment as a competition between AI and software. That framing is wrong. AI is not competing with software; it is becoming the operating system for work.

Capturing AI

AI models produce raw intelligence. They generate tokens. But tokens are an intermediate good, not a finished product. What customers actually pay for is legal work completed, code shipped, claims processed, research synthesized, and decisions supported.

They pay for refined output.

Attention has focused on the infrastructure layer — the frontier labs, the compute stack, and the data centers. That attention is not misplaced, but it overlooks a structural shift already underway. Once you understand the model as an intermediate good rather than the end product, the center of gravity moves. The decisive question is no longer who can produce intelligence, but who can turn it into something usable, trusted, repeatable, and economically defensible.

In other words, who can refine it into a usable product?

At the base of the chain sit the token producers — OpenAI, Anthropic, Google DeepMind, Meta, DeepSeek, and Qwen. They produce raw capability. This layer is expensive to build, technically formidable, and still moving fast. But crude oil is not gasoline.

Enterprises and consumers pay for gasoline.

Reimagining Software

Software is the central nervous system of the global economy, and its demise is greatly exaggerated. There’s a growing narrative that software is becoming commoditized: large language models write code, autonomous agents assemble applications, and the barriers to building digital products appear to be collapsing. If software can be generated instantly, the reasoning goes, then software itself must be losing value.

This conclusion fundamentally misunderstands how technological disruptions develop and expand. Software is becoming the infrastructure layer of modern civilization. The economic, industrial, and geopolitical systems constructed over the next three decades will not run on software. They will run as software.