
How AI Tokens Challenge Broadband Economics

Figure: Broadband Traffic vs. AI Token Generation, 2015–2025. Sources: Cisco VNI, IBISWorld, ITU (internet traffic); LifeArchitect.ai, Fireworks AI, Stanford AI Index 2025, Epoch AI (token data).

Key stats at a glance:

- Global internet traffic: ~72.5 exabytes/month (2015) → ~522 exabytes/month (2025), a sevenfold increase
- AI token generation: ~6 billion tokens/day (2021) → 50+ trillion tokens/day (end of 2025), a roughly 8,000-fold increase
- Cost of GPT-3.5-quality tokens: $20 → $0.07 per million in two years, a 280-fold collapse
- Enterprise generative AI spending: $37 billion in 2025, up 3.2× over 2024

The Commodity Logic of Bandwidth

Over the past decade, global internet traffic grew from roughly 72.5 exabytes per month in 2015 to 522 exabytes in 2025—a sevenfold increase driven by video streaming, cloud migration, and the pandemic-era digitization of daily life.
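
As a quick check on that trajectory, the implied compound annual growth rate works out to just over 20% per year. A minimal sketch, using only the figures above:

```python
# Implied compound annual growth rate (CAGR) of global internet traffic,
# using the 2015 and 2025 figures cited above.
traffic_2015 = 72.5   # exabytes per month, 2015
traffic_2025 = 522.0  # exabytes per month, 2025
years = 10

growth = traffic_2025 / traffic_2015
cagr = growth ** (1 / years) - 1
print(f"Total growth: {growth:.1f}x over {years} years")
print(f"Implied CAGR: {cagr:.1%} per year")
# Total growth: 7.2x over 10 years
# Implied CAGR: 21.8% per year
```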

Broadband evolved from a scarce luxury into a utility-grade commodity: priced by the megabit, traded in peering agreements, and regulated as essential infrastructure across the EU and most OECD nations. The commodity trajectory of bandwidth follows a classic pattern: initial scarcity, massive capital investment in fibre and spectrum, price compression, and eventual ubiquity.

By 2025, according to the ITU, nearly 74% of the world's population is online, and fixed broadband penetration has reached 20 subscriptions per 100 inhabitants.

The Emergent Commodity of Tokens

AI token generation presents a starkly different curve. Where broadband followed a decades-long commoditisation arc, tokens have moved from nonexistence to mass-scale commodity in under five years. In 2021, all frontier AI models combined produced roughly 6 billion tokens per day.

By the end of 2025, the figure exceeded 50 trillion—a roughly 8,000-fold increase. The cost of generating tokens at GPT-3.5 quality has collapsed 280-fold in just two years, from $20 per million tokens to $0.07.
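
Annualising those two figures makes the contrast with broadband concrete. In the sketch below, the 2021-to-2025 span is treated as roughly 4.5 years, an assumption, since the text gives only the endpoint years:

```python
# Annualised rates behind the token figures cited above: volume growth from
# 2021 to end-2025, and the GPT-3.5-quality price collapse over two years.
tokens_2021 = 6e9     # tokens per day, all frontier models, 2021
tokens_2025 = 50e12   # tokens per day, end of 2025
volume_years = 4.5    # assumed span, 2021 to end-2025

price_start = 20.00   # USD per million tokens, GPT-3.5 quality
price_end = 0.07      # USD per million tokens, two years later
price_years = 2

volume_rate = (tokens_2025 / tokens_2021) ** (1 / volume_years)
price_rate = (price_start / price_end) ** (1 / price_years)

print(f"Token volume: {tokens_2025 / tokens_2021:,.0f}x total, ~{volume_rate:.0f}x per year")
print(f"Price decline: {price_start / price_end:.0f}x total, ~{price_rate:.0f}x per year")
# Token volume: 8,333x total, ~7x per year
# Price decline: 286x total, ~17x per year
```

Over this particular two-year window, the annualised price decline (~17×) even exceeds the 10× annual rate cited below.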

This price collapse mirrors, and in some respects outpaces, the bandwidth commoditisation of the late 1990s and 2000s. As Introl’s analysis notes, “LLM inference costs declined 10× annually—faster than PC compute during the microprocessor revolution or bandwidth during the dotcom boom.”

The Register describes the new economics bluntly: “Power goes in and tokens come out… For inference providers serving open-weights models, tokens are a commodity. For those serving them, it’s a race to the bottom.”

“If cloud infrastructure was built on the primitive of compute, AI infrastructure is being built on a different primitive: tokens.”

— Tommy Leep, Jetstream Ventures, via X

Structural Parallels, Structural Divergences

Both bandwidth and tokens exhibit the hallmarks of economic commoditisation: fungibility (a megabit is a megabit; a token at GPT-3.5 quality is interchangeable across providers), price transparency (published per-unit rates), competitive pressure driving margins toward zero, and the emergence of spot and forward markets for capacity.

Nvidia CEO Jensen Huang’s framing of “tokens per watt” as the defining metric of AI infrastructure profitability directly parallels how telecoms measured their competitive position in “cost per megabit delivered.”
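
That metric translates directly into unit economics. The sketch below is illustrative only: the throughput, power draw, and electricity price are hypothetical numbers, not figures from this article:

```python
# How "tokens per watt" maps onto an energy cost per million tokens.
# All three inputs are hypothetical, chosen only for illustration.
tokens_per_second = 10_000   # assumed cluster throughput
power_draw_watts = 50_000    # assumed cluster power draw
usd_per_kwh = 0.08           # assumed industrial electricity price

tokens_per_watt_second = tokens_per_second / power_draw_watts
joules_per_token = 1 / tokens_per_watt_second
kwh_per_million_tokens = joules_per_token * 1e6 / 3.6e6  # 1 kWh = 3.6 MJ
energy_cost = kwh_per_million_tokens * usd_per_kwh

print(f"{tokens_per_watt_second:.2f} tokens per watt-second")
print(f"Energy cost: ${energy_cost:.3f} per million tokens")
# 0.20 tokens per watt-second
# Energy cost: $0.111 per million tokens
```

Even with these made-up inputs, energy alone lands in the same range as the $0.07-per-million market price, which is why efficiency per watt, rather than raw capacity, sets the floor in a race to the bottom.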

Yet the divergences are equally instructive. Broadband commoditisation required decades of physical infrastructure deployment—fibre-optic cable laid across ocean floors, spectrum auctions governed by sovereign states, last-mile copper-to-fibre upgrades.

Token commoditisation runs on a software-defined supply chain layered atop GPU clusters, where algorithmic efficiencies (quantization, speculative decoding, mixture-of-experts) can compress costs by orders of magnitude without touching the physical plant.

As Deloitte projects, AI compute demand is growing 4–5× per year even as chips become more efficient—a dynamic that echoes Moore’s Law but operates at a pace more reminiscent of the early internet’s explosive traffic growth.

Tokens as a New Economic Layer

The most consequential implication may be structural. Broadband created the transport layer upon which the entire digital economy was built—e-commerce, social platforms, cloud computing, the creator economy. In the same way, tokens are forming the substrate of a new economic layer: one where intelligence, rather than connectivity, is the scarce resource being commoditised and distributed.

This is not merely metaphorical.

Enterprise spending on generative AI reached $37 billion in 2025, a 3.2× increase over 2024. The inference market alone is projected to grow from $106 billion to $255 billion by 2030.

Industries most exposed to AI already report revenue-per-employee growth three times higher than those least exposed. The academic literature frames this as a structural transformation: Ferrari (2023) describes “neural production networks” where compute concentration creates new geographies of value, while Verdegem (2022) argues that the “commodification of compute” under Big Tech follows the same winner-takes-all logic that characterised telecom consolidation.

| Dimension | Broadband (Bandwidth) | AI Inference (Tokens) |
| --- | --- | --- |
| Commodity unit | Megabit per second | Token (input/output) |
| Commoditisation timeline | ~20 years (1995–2015) | ~3 years (2022–2025) |
| Cost decline rate | ~25% per year | ~10× per year (280× in 2 years) |
| Infrastructure | Physical (fibre, spectrum, towers) | Software-defined (GPUs, algorithms) |
| Market structure | Regulated oligopoly | Competitive, rapidly consolidating |
| Economic layer enabled | Digital commerce, cloud, social | AI-native applications, agentic work |
| Geopolitical dimension | Spectrum sovereignty, subsea cables | Chip export controls, GPU sovereignty |

Implications: From Pipes to Intelligence

The transition from bandwidth-as-commodity to tokens-as-commodity signals more than a technological shift—it marks the emergence of a fundamentally new form of economic infrastructure. Just as broadband transformed the cost structure of information distribution, token commoditisation is collapsing the cost of applying intelligence. A dollar now buys approximately 14 million tokens at GPT-3.5 quality—the equivalent of 10.77 million words, more than most people read in a decade.
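
The arithmetic behind that figure is straightforward; the 0.75 words-per-token ratio below is a common rule of thumb rather than a sourced constant, which is why the result lands slightly under the article's 10.77 million:

```python
# Tokens and words per dollar at 2025 GPT-3.5-quality pricing.
price_per_million = 0.07   # USD per million tokens
words_per_token = 0.75     # rough English-text average (assumption)

tokens_per_dollar = 1e6 / price_per_million
words_per_dollar = tokens_per_dollar * words_per_token

print(f"Tokens per dollar: {tokens_per_dollar / 1e6:.2f} million")
print(f"Words per dollar:  {words_per_dollar / 1e6:.2f} million")
# Tokens per dollar: 14.29 million
# Words per dollar:  10.71 million
```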

By mid-2026, according to some projections, total daily AI output will surpass total daily human output (spoken and written) for the first time. Inference is projected to become the dominant AI workload by 2030, consuming over 40% of global data centre capacity.
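
That crossover claim can be stress-tested with explicit assumptions. Everything about human output in the sketch below is an illustrative guess, not a sourced estimate:

```python
import math

# Rough sanity check on the mid-2026 crossover projection.
ai_tokens_per_day = 50e12          # end-2025, from the text
ai_growth_per_year = 7.0           # assumed, continuing the recent trend
world_population = 8e9             # approximate
words_per_person_per_day = 16_000  # assumed spoken plus written output
tokens_per_word = 1.33             # inverse of ~0.75 words per token

human_tokens_per_day = world_population * words_per_person_per_day * tokens_per_word
years_to_cross = (math.log(human_tokens_per_day / ai_tokens_per_day)
                  / math.log(ai_growth_per_year))

print(f"Human output: ~{human_tokens_per_day:.1e} tokens per day")
print(f"Crossover in ~{years_to_cross:.1f} years at {ai_growth_per_year:.0f}x/yr growth")
# Human output: ~1.7e+14 tokens per day
# Crossover in ~0.6 years at 7x/yr growth
```

Under these assumptions the crossover lands around mid-2026, consistent with the projection, though the result is sensitive to every input.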

“AI compute is racing toward commodity pricing. Foundation models are converging. The differentiation window on raw capability is closing.”

— @DrevZiga, via X, March 2026

CRITICAL REVIEW

The Bubble Thesis: Commoditisation as Trap, Not Prize

The commodity narrative above is, in one reading, a story of progress: costs fall, access widens, a new economic layer forms. But a growing body of analysis—most forcefully articulated by the venture capitalist and MIT researcher Paul Kedrosky—argues that the very speed of token commoditisation is not a sign of health but a symptom of one of the largest capital-expenditure bubbles in modern history.

Kedrosky’s Core Argument: Commoditisation Strands Capital

Kedrosky’s thesis begins with the observation that model capabilities have plateaued even as infrastructure spending has accelerated. In “Commoditization, Orchestration, and the New AI Stack” (March 2026), he argues that large language models are the new silicon: a foundational layer being rapidly commoditised while value migrates upward to orchestration tools like Claude Code and Codex. The implication is devastating for the current capex cycle: hundreds of billions of dollars are being spent on training infrastructure whose output—the models themselves—is becoming interchangeable. As Kedrosky puts it, “slowing model progress is masked and made less relevant by improved model coordination, which has immense capex implications.”

This parallels our broadband comparison in an unflattering way. The 1990s fibre-optic boom laid more cable than demand could absorb, leading to the collapse of companies like WorldCom, Global Crossing, and 360networks. The cable endured; the capital did not. Kedrosky sees the same pattern: “AI is a bubble because it’s one of the probably five largest CapEx bubbles in history… if it didn’t happen this time, it would truly be the first time in modern economic history”.

“AI is a bubble. There is no question. It’s one of the probably five largest CapEx bubbles in history—like canals, like railroads, like rural electrification, like fiber optics.”

— Paul Kedrosky, Plain English with Derek Thompson, March 2026

The Ghost Economy and the Displacement Spiral

Kedrosky deepens the critique in “AI and the Rise of the Ghost Economy” (March 2026), where he identifies a feedback loop that undermines the commodity-as-progress narrative. AI raises production while simultaneously weakening labour income. GDP continues to rise—driven by the “I” in the GDP equation (investment)—even as the consumption economy hollows out. He calls this “Ghost GDP.”

The mechanism is a displacement spiral: firms automate tasks, labour income declines, consumption falls, demand weakens, and firms respond to margin pressure by automating even more. Historically, “task reinstatement”—the creation of new work categories—has absorbed displaced workers. But Kedrosky argues AI is attacking the exact sector (cognitive work) where new tasks have historically formed after automation episodes. His back-of-the-envelope math suggests the AI buildout was responsible for 64–80% of US GDP growth in 2025.
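
The spiral is easy to make concrete. The toy loop below is not Kedrosky's model; the coefficients are arbitrary and chosen only to show how the feedback compounds:

```python
# Toy displacement spiral: automation cuts labour income, which cuts
# consumption, and margin pressure then drives further automation.
# All coefficients are arbitrary, for illustration only.
automation = 0.05    # share of tasks automated
labour_income = 1.0  # index, year 0
for year in range(1, 6):
    labour_income *= 1 - 0.6 * automation     # displaced wages
    consumption = 0.9 * labour_income + 0.1   # income-driven demand plus a floor
    margin_pressure = max(0.0, 1.0 - consumption)
    automation = min(1.0, automation * (1 + 2.0 * margin_pressure))
    print(f"year {year}: automation={automation:.2f}, "
          f"income={labour_income:.2f}, consumption={consumption:.2f}")
# By year 5 automation has more than doubled while consumption has
# fallen ~16%: the loop accelerates rather than settling.
```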

This directly challenges the chart above: the parallel rise of broadband traffic and token generation may not be two commodity curves building complementary economic layers, but rather a mature infrastructure (broadband) propping up a speculative one (tokens) whose ultimate demand is uncertain.

Circular Demand and Off-Balance-Sheet Risk

Man Group’s institutional analysis—which cites Kedrosky extensively—identifies a recursive financing loop at the heart of the AI buildout. Hyperscalers simultaneously act as suppliers, customers, investors, and validators of each other’s demand signals. “Revenue growth can appear spectacular because each node in the loop pays another. Capex looks justified because demand from inside the loop appears endless. But the demand signal becomes circular and divorced from the market”. The capex–revenue gap is stark: $200+ billion in infrastructure investment against roughly $12 billion in direct AI revenue at end-2025.
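
A toy example shows how intra-loop payments inflate the headline numbers. The figures are invented for illustration:

```python
# Three firms in a loop: each books revenue by selling to the next
# (chips, cloud credits, model access), while outside customers
# contribute only a small fraction. All figures are invented.
firms = 3
intra_loop_payment = 10.0  # each firm's sales to the next firm in the loop
external_revenue = 1.0     # genuine outside demand per firm

gross_revenue = firms * (intra_loop_payment + external_revenue)
external_total = firms * external_revenue

print(f"Reported revenue across the loop: ${gross_revenue:.0f}")
print(f"Revenue from outside the loop:    ${external_total:.0f}")
# Reported revenue across the loop: $33
# Revenue from outside the loop:    $3
```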

Perhaps more troubling is the financing structure. Kedrosky has documented how data centre developers build facilities, lease access to hyperscalers, and package the rental income into asset-backed securities—recycling proceeds into the next construction round. Man Group notes that GPUs depreciate in roughly one year, yet are collateralised in SPVs with 7–15 year assumptions—creating what amounts to “illusory collateral.”
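
The scale of that mismatch is easy to quantify. A minimal sketch, assuming a hypothetical $100M fleet and straight-line depreciation on both schedules:

```python
# Book value under the SPVs' 7-15 year collateral assumption versus the
# ~1-year economic life of the GPUs. Fleet cost is hypothetical.
fleet_cost = 100.0   # $M, hypothetical GPU fleet
spv_life = 10        # years, within the 7-15 year assumption
economic_life = 1    # years, approximate useful life cited above

def book_value(cost, life, age):
    """Straight-line depreciation, floored at zero."""
    return max(0.0, cost * (1 - age / life))

for age in (1, 2, 3):
    spv = book_value(fleet_cost, spv_life, age)
    econ = book_value(fleet_cost, economic_life, age)
    print(f"year {age}: SPV collateral ${spv:.0f}M vs economic value ${econ:.0f}M")
# year 1: SPV collateral $90M vs economic value $0M
# year 2: SPV collateral $80M vs economic value $0M
# year 3: SPV collateral $70M vs economic value $0M
```

After a single year, the collateral on paper exceeds the economic value of the hardware by nearly the full purchase price: Man Group's “illusory collateral.”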

The risk has metastasised to utilities, pension funds, insurers, and retail investors via interval funds and REITs. As NPR reported, only 3% of AI users are paying for the service, while OpenAI does not expect profitability for five years.

The Counter-Thesis: Revenue Is Real This Time

The bubble argument is not without serious challenges. In late 2025 and early 2026, AI revenue surged at a pace without precedent in enterprise software history. Anthropic’s annualised revenue went from $9 billion at year-end 2025 to $19 billion by early March 2026—more than doubling in roughly two months.

OpenAI reportedly added $1 billion in annualised revenue per week in late 2025. As Derek Thompson noted in his interview with Kedrosky, this represents a “rare exception” to the Carlota Perez bubble model: “a historic rate of spending coinciding with a similarly historic surge of revenue.”

Menlo Ventures’ enterprise data reinforces the demand signal: $37 billion in generative AI spending in 2025, with at least 10 products exceeding $1 billion in ARR and 50 exceeding $100 million. AI deal conversion rates run at 47%—nearly double the 25% rate for traditional SaaS. Citigroup raised its 2026–2030 global AI capex estimate to $8.9 trillion, citing rapid enterprise adoption.

Citadel Securities offered a more structural rebuttal to the displacement thesis, arguing that technological diffusion follows an S-curve, not exponential compounding; that physical constraints (energy, compute cost) create natural braking mechanisms; and that “if the marginal cost of compute rises above the marginal cost of human labour for certain tasks, substitution will not occur.” Job-posting data from Indeed shows software engineering demand up 11% year-over-year in early 2026.

“The demand side tells a different story: broad adoption, real revenue, and productivity gains at scale, signalling a boom versus a bubble.”

— Menlo Ventures, State of Generative AI 2025

The Academic Middle Ground

The scholarly literature suggests the truth may be more uncomfortable than either camp admits. Occhipinti et al. (2024) used a system dynamics model to show that even a moderate AI-capital-to-labour shift could double labour underutilisation, decrease per capita disposable income by 26%, and reduce the consumption index by 21%. Korinek (2024) identifies eight structural policy challenges for the “Age of AI,” including the risk that “unprecedented productivity gains” coexist with “concerns about job disruption, income distribution, and the value of human capital.” Liang & Liu (2025) find a temporal dichotomy: AI exacerbates wealth inequality in the short term while potentially reducing it in the long term, provided policy interventions are in place.

Forbes’ framing may be the most useful synthesis: the market is not headed for a singular burst but a bifurcation. Organisations with robust data foundations and workflow-level AI integration are seeing transformative returns, while the majority remain in unstructured experimentation—a “cost centre with good intentions”.

Synthesis: The Commodity Is Real, the Financing May Not Be

The broadband-to-tokens parallel holds under scrutiny—but with a critical caveat that Kedrosky’s work forces into view. The commodity is real. Tokens are becoming as fungible, as competitively priced, and as foundational as bandwidth. But the financial architecture built to fund the transition—the recursive demand loops, the off-balance-sheet leverage, the GPU-collateralised SPVs—may be as fragile as the fibre-optic financing of 1999.

The technology survived the dotcom crash; the investors largely did not. The bandwidth commodity endured; the companies that overbuilt it went bankrupt. The relevant question for the token economy is not whether tokens will remain a foundational commodity—they almost certainly will—but who absorbs the losses when the financing structure resets.

Kedrosky’s answer, informed by every prior infrastructure bubble, is that the reset is not a matter of if but when.

The productive residue—the data centres, the models, the orchestration layers—will survive, as fibre-optic cable survived the telecom crash. The question is whether the economic layer built atop tokens matures fast enough to justify the capital deployed, or whether it remains, for now, a ghost economy: GDP rising on investment while the consumption base that must ultimately sustain it continues to erode.

“Large language models are the new silicon. Orchestration layers are the new operating systems. As raw model improvement slows, value migrates up the stack. This will have huge implications, from model commoditisation, to inference traffic, to stranded training spending.”

— Paul Kedrosky, “Commoditization, Orchestration, and the New AI Stack,” March 2026