
Speed Eats Scale: How AI Just Made Capitalism Faster

OpenAI is paying Microsoft now. A regional bank just hired OpenAI engineers to take commercial loans from JPMorgan. And bots are quietly running airline-style yield management on every transaction. Three deals from one cycle confirm that capitalism’s oldest rule — efficiency always wins — is now running at machine speed. Big, slow incumbents have a problem.


THE NUMBER: 27% — Microsoft’s equity stake in OpenAI Group PBC, the for-profit entity that emerged from OpenAI’s recapitalization. The stake is currently valued at roughly $135 billion, which prices the company at $500 billion. Microsoft kept that stake after giving up its exclusive license to OpenAI’s intellectual property and erasing the AGI clause that was supposed to define the partnership through artificial general intelligence.

Read that sentence with the directionality flipped. A year ago, Microsoft was paying OpenAI a revenue share for the privilege of exclusively reselling its models on Azure. Today Microsoft has stopped paying that revenue share, while OpenAI continues to pay Microsoft royalties on Azure-deployed products through 2030, and Microsoft retained its 27% stake in a company whose secondary market is bidding shares at well over $800 billion. The headlines say Microsoft “lost exclusivity.” The math says Microsoft converted exclusivity into a permanent royalty annuity plus the most leveraged passive position in AI.

“Now the guy’s got Paulie as a partner,” Ray Liotta told us in 1990, narrating Sonny’s predicament at the Bamboo Lounge. “Now OpenAI’s got Paulie as a partner,” the cycle replied this morning. The labs we used to call frontier are writing rent checks to the entities that own the distribution. The labs are losing the argument that capability was ever the moat. The owners of distribution — and especially the ones running their own labs as a hedge — are winning the argument that efficiency at scale is the only game.


Episode 5 – How Distribution Is Becoming the Ultimate Moat

Most companies are paralyzed by the “Fog of War” in AI — a relentless storm of product hype, unpredictable breakthroughs, and fleeting models. But the real game-changer isn’t just the technology; it’s how you navigate the chaos and turn distribution into your ultimate moat.

This morning we wrote that distribution was the only moat left in AI. Eight hours later, four structural rearrangements confirmed it more emphatically than the morning’s draft did. Microsoft and OpenAI restructured their partnership, removing the AGI clause and the exclusivity that was the spine of the original 2019 agreement. Google committed up to $40 billion to Anthropic at a $350 billion valuation — $10 billion firm, $30 billion performance-tied — even as the secondary market bids the same shares above $800 billion. AWS deepened its Anthropic relationship into silicon-level co-engineering on Trainium and Graviton, with a “Claude Platform on AWS” coming that bypasses Bedrock entirely. And xAI/SpaceX continued running the most vertically integrated play in the industry — Musk owns the lab (Grok), the compute (Colossus, on track for one million GPUs by year-end), the audience surface (X), and the $10 billion break fee against a $60 billion call option on Cursor that locks every other lab out of the leading coding agent until April 2027. Four labs. Four structural moves. One thesis: the owners of distribution are the king-makers, and the only safe place to be is owning distribution yourself or paying the people who do.

Three other stories landed inside the same news cycle and they all turn on the same axis. A small Pennsylvania regional bank signed a multi-year deal with OpenAI to embed engineers inside the bank, automate commercial lending end-to-end, and co-build products OpenAI will eventually sell to other banks. The CEO had his AI clone deliver his earnings call to make the point. Jason Lemkin at SaaStr went public with the math on his AI-native stack: Salesforce bill up 83% with 80% fewer human seats. Notion bill: still on autopay, but he hadn’t opened the product in months. And Anthropic published the Project Deal experiment — bots representing both sides of every transaction in an internal marketplace, 186 deals closed in a week, with Opus-powered agents systematically extracting better prices than Haiku-powered agents while the participants on the losing side rated their satisfaction equally.

Three different industries. Three different scales. Three different angles of attack. One thesis. Capitalism’s defining feature has always been efficiency. AI is just efficiency running at machine tempo. Speed eats scale. Always has. The labs just made the cycle visible enough to write about on a Monday.

🦞 Now Paulie’s Your Partner

Goodfellas was always a movie about the protection economy. The most-quoted line in the film isn’t actually a line of dialogue — it’s a voiceover monologue Ray Liotta delivers as Henry Hill, narrating what just happened to Sonny, the owner of the Bamboo Lounge. Sonny had a problem (Tommy ran up a $7,000 tab and wouldn’t pay), so Sonny went to Paulie for protection. Paulie was reluctant at first. Eventually Paulie agreed to come in on the restaurant business as a partner. “Now the guy’s got Paulie as a partner,” Henry tells us. “Any problem, he goes to Paulie. But now he’s gotta come up with Paulie’s money every week, no matter what. Business bad? Fuck you, pay me. You had a fire? Fuck you, pay me. Place got hit by lightning? Fuck you, pay me.” That’s the deal. You don’t get to decide what you owe Paulie. You only get to decide whether you remain in business. The labs are Sonny. The owners of distribution are Paulie. The protection is access to billions of users, planet-scale compute, regulated cloud trust, and pre-installed surfaces. The tribute is the rev share, the equity, the silicon co-engineering, the call-option exclusivity — whichever flavor of payment makes the platform comfortable.

The Microsoft-OpenAI restructuring announced Monday morning is the cleanest case. The original 2019 deal — Microsoft’s first $1 billion check, expanded to $13+ billion across the next four years — set Microsoft up as patron and OpenAI as protégé. Microsoft paid the rev share. Microsoft held the exclusive license. Microsoft would lose its IP rights only when OpenAI achieved AGI. By every reading of the original contract, OpenAI was the prized asset and Microsoft was the senior partner doing the favor. The April 27 amendment scrambles every one of those vectors. Microsoft no longer pays the rev share. OpenAI keeps paying royalties to Microsoft through 2030. The AGI clause is gone, replaced by a fixed termination date in 2032. Microsoft kept its 27% stake — currently valued at roughly $135 billion on the $500 billion recapitalization, with the secondary market already bidding the company well above $800 billion. And OpenAI got the right to sell on any cloud — which it exercised within the same news cycle by signing the Customers Bank deal, as we’ll get to. The relationship that started as exclusive partnership ended as a passive equity position with a royalty annuity attached. That’s not a downgrade for Microsoft. That’s an upgrade with better unit economics and zero competitive constraint. Microsoft can now sell every other model — Claude, Gemini, DeepSeek, the next thing — through Azure while still collecting from OpenAI through 2030. The patron became the landlord, and the landlord prefers it that way.

Google’s up-to-$40-billion check to Anthropic is the same play with a complication worth naming. Google is both a lab and a platform. They ship Gemini. They run Google Cloud. They own Search, Android, Chrome, Maps, and Gmail — the largest distribution surface in the AI economy. Investing in Anthropic isn’t a pure platform-paying-lab transaction. It’s a hedge that wins three ways at once. If Gemini wins, Google still wins. If Anthropic wins, Google still wins because they own a piece of the equity AND they’re the cloud running the inference. And in either scenario, Google’s distribution surfaces — which billions of people touch every day — keep generating the data that makes Google the substrate everyone else has to plug into. Anthropic’s annualized revenue went from $1 billion at the end of 2024 to $9 billion at the end of 2025 to $30 billion as of April 2026 — a 30x in sixteen months. Google bought in at $350 billion against a secondary market that’s already bidding $800 billion or more. By any sober reading, Google just got the most leveraged AI position any public company has ever held — at the bottom of a curve that’s still going up, and without giving up a single yard of their own competing lab or their own distribution rails. The frontier capability doesn’t have to be yours. The pipe that delivers it does. And if you can be the pipe AND have a lab AND own a strategic stake in the competing lab, you’ve stacked the deck three ways. That’s not Paulie. That’s Paulie running his own restaurant on the side just in case Sonny goes under.

AWS is running the same hedged play with a different surface. Amazon ships its own Nova family of models. AWS owns the most-used cloud infrastructure on earth. And as of this week, Anthropic is training its most advanced foundation models on AWS Trainium and Graviton silicon, in active co-engineering with Annapurna Labs. Claude Cowork now sits inside Amazon Bedrock. A “Claude Platform on AWS” is coming that bypasses Bedrock and lives directly inside the AWS developer surface — the kind of integration you only build with a partner you intend to be welded to. AWS isn’t just selling Anthropic compute the way it sold compute to Netflix. AWS is welding its silicon to Anthropic’s training stack so that every Claude model in production has a chunk of AWS designed into the base layer. AWS owns the chips. AWS owns the deployment. AWS owns the developer relationship. AWS also has its own competing models. Anthropic does the foundation-model work. The substrate keeps the substrate position regardless of which model wins. Same Paulie play. Different costume.

xAI is the most extreme version of this structure, and the one most worth understanding because it telegraphs where the others want to end up. Elon owns the lab — Grok runs on xAI infrastructure. He owns the compute — Colossus is on track for one million GPUs by year-end, the largest training cluster on earth. He owns the audience — X is the largest text-and-real-time-conversation distribution surface in the West, and one of the few places where you can deploy an AI agent and reach hundreds of millions of users on the same calendar day. He owns the optionality — SpaceX wrote a $10 billion break fee against a $60 billion call option on Cursor, locking every other lab out of the leading coding agent until April 2027. He owns the rocket platform that’s literally putting the next generation of low-orbit communications infrastructure into the sky. He’s the only player who has stacked all four layers — model, compute, audience, vehicle — under the same control structure. Every other lab is paying somebody else for at least one of those layers. Musk built or bought all four. The reason this matters isn’t fan-service. It’s that vertical integration at this scale is the structural endpoint everyone else is racing toward. Sundar’s $185 billion 2026 capex isn’t to ship a better Gemini. It’s to make Google look more like xAI’s stack — owning the silicon, owning the cloud, owning Search, owning Android, owning Chrome, and owning a stake in the competing lab too. The hedged Paulie play is converging on Musk’s vertically-integrated Paulie play. And if it gets there, the public-market valuation gap between the platforms and the labs is going to look much wider in 2028 than it does today.

This is the part of the AI story the consensus narrative is still resisting. The labs are not the king-makers. The owners of distribution are. Microsoft, Google, Amazon, Musk’s vertically integrated apparatus, and whatever Larry Ellison eventually cobbles together with Oracle plus the Paramount-Warner content library plus the TikTok US stake — these are the entities that win regardless of which lab ships the best model in any given quarter. They don’t need to be best at anything. They don’t need to win the leaderboard. They need to own the rails. They do. And the labs are now writing checks for the privilege of riding on them.

The implication for an enterprise buyer in 2026 is not subtle. Pick your AI vendor by which Paulie they pay. OpenAI runs on Azure first and now AWS too. Anthropic runs on Google and AWS. xAI runs on its own everything. DeepSeek runs on Huawei. The model is the feature. The substrate is the contract. If your three-year procurement plan doesn’t have a clear answer to who collects the tribute when this lab succeeds, you don’t have a procurement plan. You have a vendor preference.

The historical analog isn’t even subtle. Cisco didn’t make the websites of the dot-com era. Cisco shipped the routers that made the websites possible. ARM didn’t make the iPhones. ARM licensed the chip architecture that every iPhone runs on. The picks-and-shovels companies in every technology cycle have outperformed the miners they supplied. AI’s picks-and-shovels are the platforms with distribution, compute, and the operating apparatus to deliver both — many of which now also run their own labs as a hedge. The pure-play labs are the miners. They make the headlines. The platforms make the money.

What this means for you: If you’re an enterprise CIO making a vendor map this quarter, stop optimizing for the lab. Optimize for which distribution owner you want to be paying tribute to over the next ten years. Distribution ownership is the only economic position in this cycle that compounds. If you’re an allocator, the public-market vehicles that hold all the layers — Microsoft (own lab + cloud + OpenAI stake), Google (own lab + cloud + Anthropic stake), Amazon (own lab + cloud + Anthropic stake) — are the cleanest ways to own the AI economy without having to pick which lab wins. Pick the substrate. Let the labs fight for the spotlight. The substrate keeps the receipts.

⚾ Moneyball, But For Loans

Customers Bank is a $25.9 billion-asset Pennsylvania regional bank that, in size terms, is functionally invisible against JPMorgan’s $4.9 trillion balance sheet. JPMorgan is roughly 190 times larger. Both banks underwrite commercial loans. Both banks operate in the same regulatory regime. Both banks face the same enterprise customers. Both banks have access to the same frontier AI models. As of Monday morning, only one of them is running the play that will let it take meaningful share in the next three years.

CEO Sam Sidhu announced the OpenAI partnership the way you announce a thesis you’re committed to. He had his AI clone deliver the first 30 minutes of his Q1 earnings call — a stunt that he claimed was the first AI-clone earnings call by a public company — and only revealed it to analysts mid-call. Then the substantive news. OpenAI engineers will be embedded inside Customers Bank to rebuild commercial lending end-to-end. Loan close time, currently 30 to 45 days, target seven days. Account opening for complex commercial clients, currently more than a day, target under twenty minutes. Efficiency ratio, currently 49%, target low 40s — which is the kind of operating leverage that, applied to a $25.9 billion bank, generates returns the public market hasn’t been pricing into the regional banking sector for a decade. The bank has already saved 28,000 hours of work to date — roughly 15 FTEs not hired. Sidhu’s framing of the regulatory dynamic is the part to read twice: “Smaller banks are not going to be expected to have the same level of frameworks as many of the larger banks.”

That’s Moneyball. Not the romantic version where the underdog wins on heart. The actuarial version where the underdog wins because the actuary at the smaller franchise can move faster than the actuary at the bigger one. The Oakland A’s didn’t beat the Yankees because they played better baseball. They beat the Yankees because they had a structural advantage in how quickly they could re-price players in a market the Yankees were too slow to re-price. That’s exactly the bet Customers Bank is making. JPMorgan has 30,000 engineers and Jamie Dimon’s full attention. JPMorgan also has approximately 4,800 internal AI use-case reviews waiting in regulatory queue, a CISO who has to sign off personally on any model that touches customer data, and a risk committee structure that was designed around a 1990s vendor management framework. JPMorgan can deploy the same OpenAI stack Customers Bank is deploying. JPMorgan just can’t deploy it inside the same eighteen-month window, because the procedural overhead at a too-big-to-fail bank generates a structural delay that an under-the-radar regional bank doesn’t have to absorb.

If you’re a small business looking for a $5 million commercial loan today, your odds of getting a real underwriting conversation at JPMorgan or Bank of America are functionally zero. They will route you to a credit-card-grade automated decline because the regulatory and process overhead on a real underwriting conversation is too expensive to deploy on a borrower that small. The regional banks lost share in commercial lending for two decades because they couldn’t match the scale efficiencies of the megabanks. Customers Bank is betting that AI flips that equation in eighteen months. Frontier-model agentic underwriting brings the cost-per-decision down by an order of magnitude. The regulatory tolerance for AI-led underwriting at a regional bank is, today, structurally higher than at a megabank. Combine those two and the result is an arbitrage window where small + AI + speed beats big + scale + slow on every commercial loan under $25 million. There are roughly five trillion dollars of those loans outstanding in the U.S. economy. The regional bank that runs Sidhu’s play first gets a real shot at taking share that nobody in regional banking has had since the 1980s.

The same dynamic shows up at a much smaller scale inside Jason Lemkin’s SaaStr stack, and the math is the most useful piece of B2B writing this week. SaaStr runs on three humans and twenty-plus AI agents. The Salesforce bill went from $12,000 a year (with 10+ human seats) to $22,000 a year (with 2 humans and 1 API seat). That’s 80% fewer humans and 83% more spend, and Lemkin’s framing is exactly right: the agents query Salesforce roughly a hundred times more than the humans ever did. They use Salesforce as the central nervous system of the company’s GTM. They never sleep. They never stop writing. The pricing model — partly seat, partly consumption, partly Agentforce, partly Data Cloud — flexed perfectly to the new usage pattern.
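Lemkin’s percentages follow directly from the figures he published. A quick sanity check on the arithmetic (illustrative only, using the numbers quoted above):

```python
# Sanity check on the SaaStr Salesforce math quoted above.
old_bill, new_bill = 12_000, 22_000  # annual Salesforce spend, USD
old_humans, new_humans = 10, 2       # human seats before / after the agent stack

spend_increase = (new_bill - old_bill) / old_bill        # 10k on a 12k base
seat_reduction = (old_humans - new_humans) / old_humans  # 8 of 10 seats gone

print(f"spend up {spend_increase:.0%}, human seats down {seat_reduction:.0%}")
# prints "spend up 83%, human seats down 80%"
```

Same vendor, same contract vintage: the bill went up because agent usage, not headcount, now drives the consumption-priced parts of the contract.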

Notion went the other way. SaaStr loved Notion. They were heavy users for years. Notion AI is, by Lemkin’s account, genuinely good. None of that mattered. “It’s just our AI agents don’t care.” The agents have no use for a beautifully designed wiki built for humans. They build their own real-time dashboards on top of Salesforce, Slack, and the underlying data. They route around Notion the way commercial lenders route around regional banks that can’t underwrite fast enough. The renewal will get cancelled, Lemkin predicts, the next time SaaStr does its annual stack review. Same vintage of B2B software. Same AI investment. Diametrically opposite outcomes. The rule is brutal in its clarity: “Is this software critical to AI agents being successful at their jobs? If yes, usage and spend go up. If no, usage and spend go to zero.”

That’s the same rule Customers Bank is running on JPMorgan. Useful gets allocated more resources. Unuseful gets cancelled. No feelings. No loyalty. No “we’ve always done it this way.” Pure efficiency. The agent allocates its compute to the substrates that make it more successful, and starves the rest. JPMorgan is, from the perspective of a small business borrower in 2026, the equivalent of Notion in Lemkin’s stack. Beautifully designed for a workflow that no longer exists. The borrower routes around it.

The category Lemkin calls stealth churn is the operational version of this dynamic, and it should be the topic of every public-software CEO’s next board meeting. The renewals haven’t hit yet. The seats are still on the invoice. The usage already left. Most B2B software companies haven’t seen the revenue impact of agent disintermediation because the lagging-indicator nature of annual contracts hasn’t surfaced it. The financial impact is locked in but lagging. Twelve to twenty-four months from now, every B2B software CEO whose product wasn’t on the agent’s daily path is going to discover that the renewals don’t come back. By then, the product roadmap window has closed.

What this means for you: If you run a business of any size, the action item is the same one Customers Bank is running and the same one Lemkin is running. Audit every workflow in your company by Lemkin’s rule. Is this critical to an AI agent’s success? The workflows where the answer is yes are the ones to invest in — that’s where the substrate compounds. The workflows where the answer is no are either dead weight you can cut now, or candidates for replacement by an agent-first vendor in the next eighteen months. If you’re a public-software CEO, this is the only question worth asking right now, and the renewal data won’t tell you the truth in time. Run the audit yourself. If you’re an investor, look at your portfolio companies through this lens. The B2B SaaS companies that will continue to compound are the ones whose agents need them. The ones that don’t will look fine on the next earnings print and slowly stealth-churn into irrelevance over the next two fiscal years. There won’t be a moment. There will just be a renewal that doesn’t come back.

The honest disclaimer on this whole framing is the one the consensus narrative resists: small wins only if small is fast. JPMorgan with thirty thousand engineers could run this play. So could Bank of America. So could Wells Fargo. The reason they probably won’t isn’t capability. It’s the procedural overhead the megabanks have layered on themselves over forty years of being too big to fail. That overhead was actually load-bearing in the regulatory regime that produced it. It will not be load-bearing in the AI regime, and the regional banks have at least eighteen months and possibly thirty-six months of structural advantage to take share before the megabanks adapt. The window will close. Sergey Brin un-retiring at Google to fix the coding product (which we covered in Brutalist) is what big and fast looks like. That move at JPMorgan would be Jamie Dimon publicly stating that the bank is restructuring its AI deployment process to allow regional-bank-grade speed on a defined product line, and personally owning the timeline. He hasn’t. He might. If he does, the window starts closing within ninety days. Mind the window. Use it now while it’s open.

✈️ Welcome to Premium Economy

The cleanest forward-looking implication of the agent economy isn’t in any earnings report. It’s in a research paper Anthropic published this morning about an internal experiment they ran called Project Deal. The setup was simple. Anthropic built a private marketplace inside the company. Employees described what they wanted to buy and sell. AI agents, configured from those descriptions, were deployed on both sides of every transaction. The agents posted listings, made counteroffers, negotiated, and closed deals — all in natural language, on Slack, with no human in the middle. Over one week, the agents completed 186 transactions across more than 500 listed items. Total value transacted: just over $4,000.

The summary findings are the kind of paragraph that should make you put your coffee down. Agents running on Opus 4.5 closed roughly two more deals per participant than agents running on Haiku 4.5. When the same item was sold by an Opus agent in one run and a Haiku agent in another, Opus extracted, on average, $3.64 more per deal. A lab-grown ruby sold for $65 with Opus and $35 with Haiku — an 87% delta on the same item, same week, same marketplace. A folding bicycle: $65 with Opus, $38 with Haiku. Aggressive negotiation instructions had no statistically significant effect. Model quality mattered. Prompting strategy didn’t. And the part Anthropic flagged in the conclusion is the part to underline twice: post-experiment satisfaction surveys for Opus users and Haiku users were statistically indistinguishable. Participants on the Haiku side rated their deal fairness identically to participants on the Opus side. The losing side did not know they were losing.
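The per-item premiums can be reproduced from the listed prices. Note that the rounded $65/$35 figures work out to roughly 86%, so the 87% Anthropic reports presumably reflects unrounded closing prices; this check uses only the rounded numbers quoted above:

```python
# Opus-vs-Haiku closing prices for the same items, as reported above.
items = {
    "lab-grown ruby":  (65, 35),  # (opus_price, haiku_price), USD
    "folding bicycle": (65, 38),
}

for name, (opus, haiku) in items.items():
    # How much more the Opus-powered agent extracted for the identical item.
    premium = (opus - haiku) / haiku
    print(f"{name}: Opus ${opus} vs Haiku ${haiku} -> {premium:.0%} premium")
# prints "lab-grown ruby: Opus $65 vs Haiku $35 -> 86% premium"
# prints "folding bicycle: Opus $65 vs Haiku $38 -> 71% premium"
```

The point isn’t the exact percentage; it’s that the same item, same week, same marketplace cleared at materially different prices depending only on which model sat behind the selling agent.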

If you’ve ever stood at a Delta counter at JFK and watched the family in front of you upgrade to first for $200 while you got told that premium economy on the same flight was $700 with no bag included, you have already lived this exact dynamic. Airlines have been running yield management on consumers since the 1980s. The price you pay for the same seat depends on your status, your booking class, your historical revenue to the carrier, your route history, your IP address at the time of search, and an opaque set of optimization parameters that adjust in real time. The traveler who pays $400 for premium economy and the traveler who pays $700 for the same seat are sitting next to each other on the same flight, and the system has decided — based on a calculation neither passenger can audit — that they should pay different prices for an identical product. That’s not new. That’s the airline industry since deregulation. What’s new is that the same model is about to apply to everything.

In an agent-mediated economy, every transaction is now negotiated by a software agent on each side. The agent’s job is yield management on behalf of its principal. The agent that polls fifteen IPs every fifteen seconds, runs hot deals against your historical purchasing pattern, has access to a richer model with better strategic reasoning, and can be configured for aggressive negotiation will systematically extract more value than the agent that does the consumer-grade version of the same job. Anthropic’s experiment is the proof of concept. The 87% delta on a lab-grown ruby is the consumer airline industry in microcosm. The premium-agent traveler gets first class for $400. The discount-agent traveler gets premium economy for $700. They sit next to each other in the marketplace. Neither knows the other paid a different price.

The economic implication is uncomfortable. The agent economy is going to bake in a class system organized by which model you can afford to run. If your shopping agent is Opus-tier, you systematically get better deals. If your shopping agent is Haiku-tier, or DeepSeek-tier, or some white-labeled discount LLM that came included with your bank’s checking account, you systematically get worse deals — and, per Anthropic’s own experimental data, you do not know it. There is no equivalent of the Department of Transportation’s airline disclosure regime for agent-mediated commerce. There is no “you paid $300 more for this seat than the median traveler” alert. The discrimination is invisible by design, because the value extraction happens at the agent layer, not the merchant layer.

If you are an operator of a consumer business in 2026, the implication runs in two directions and both of them deserve your attention. The first direction: your customer is no longer a human. Your customer is the agent acting on behalf of a human, and the agent’s purchasing logic is structurally different from human purchasing logic. The agent does not respond to brand. The agent does not respond to “limited time only.” The agent responds to data — pricing, availability, return policy, comparable alternatives, and the agent’s own optimization model. Marketing budgets that were built for the human consumer in 2025 are going to start getting absorbed by agents in 2026, and the marketers who haven’t designed their checkout experience for agent ingestion are going to discover that conversion is coming from machines that ignore the entire visual hierarchy they spent ten years building.

The second direction is about your own purchasing. If you run a business and you’re not deploying premium-tier agents on your purchasing side, you are now the schmo in the airline-pricing example. Your competitors with Opus-tier or equivalent agents are systematically extracting better terms from your shared vendors. Your software contracts are getting marked higher because the AI agent at the other end of every renewal negotiation is running a richer optimization than yours is. Your insurance pricing, your payment processing rates, your cloud bill, your vendor financing — all of these are about to go through agent-mediated negotiation, and the Haiku-tier player gets the consumer rate. “Pay me,” Paulie said. The agent economy version is “pay your agent.” And the question Anthropic just answered for the rest of the economy is whether the choice of agent matters. It does. The 87% delta on a ruby is the down payment on the next decade.

What this means for you: If you’re running any business that buys anything, your purchasing-agent budget is now strategic. Move it out of the cost column and into the revenue protection column on your operating budget. The companies that will outperform on operating margins between now and 2030 are the ones that figured out that the agent layer is where the arbitrage now lives. If you’re a consumer, the implication is harsher: the agent economy is going to systematically transfer value from people who can’t afford premium-model agents to people who can. The airline-pricing model is the floor of what’s about to happen across consumer finance, real estate, healthcare administration, and probably half a dozen other categories that nobody has thought to deploy yield management against yet. Whatever consumer-protection apparatus you’ve built your trust around — the price-comparison site, the rewards card, the rebate app — was designed for a market where humans negotiate with humans. None of those tools work when the negotiation moves to the agent layer. Pick your agent.

What This Means For You

The three stories above are surfaces of the same phenomenon. Capitalism’s defining feature has always been efficiency. Markets reward the actor who can extract more value from a given input than competitors. Slow, bureaucratic, sentimental, locally-loyal incumbents lose, and they lose without warning. The Yankees get out-priced by the A’s. JPMorgan gets out-cycled by Customers Bank. Notion gets routed around by an agent that doesn’t care how beautiful it is. Microsoft converts an exclusive right into a passive royalty stream because passive royalty streams compound while exclusive rights expire. AI didn’t change capitalism. It just made the cycle visible, the speed of adaptation the only thing that matters, and the cost of running a slow operation a problem you cannot put on a 24-month roadmap anymore.

Audit your stack against Lemkin’s rule. Walk through every line item in your software budget and your vendor budget and ask the question. Is this critical to an AI agent’s success? The yes pile is your investment list. The no pile is either dead weight you cut now, or candidates for replacement by an agent-first competitor in the next eighteen months. The audit takes a day. It will save you a year of stealth churn.
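Lemkin’s rule is mechanical enough to run as a literal filter over your vendor list. A minimal sketch of the audit (the vendor names and the `agent_critical` flag are hypothetical; the flag encodes your honest answer to his one question):

```python
# Minimal sketch of Lemkin's audit rule: sort the stack into an
# investment pile and a cut/replace pile. All entries are hypothetical.
stack = [
    {"vendor": "crm",        "agent_critical": True},   # agents query it constantly
    {"vendor": "chat",       "agent_critical": True},   # agents live in it
    {"vendor": "human-wiki", "agent_critical": False},  # agents route around it
]

invest = [v["vendor"] for v in stack if v["agent_critical"]]
cut    = [v["vendor"] for v in stack if not v["agent_critical"]]

print("invest:", invest)  # prints "invest: ['crm', 'chat']"
print("cut:", cut)        # prints "cut: ['human-wiki']"
```

The hard part isn’t the code; it’s answering the `agent_critical` question honestly for each line item instead of letting the renewal calendar answer it for you.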

Stop optimizing for which lab. Optimize for which Paulie. OpenAI runs on Microsoft and now AWS. Anthropic runs on Google and AWS. xAI runs on SpaceX. Every lab is now wired to a substrate. The procurement question is not which lab has the best benchmark this quarter. The procurement question is which substrate is going to be the durable winner over the next decade. Pick the platform. Let the labs fight for the spotlight. The platform collects the rent.

If you’re small, move now. If you’re big, fix the speed problem. The structural arbitrage between regional banks and megabanks, between agent-native SaaS startups and incumbent CRMs, between Opus-tier purchasing agents and consumer-tier ones, is the cleanest opportunity in this cycle. It is also the most time-bounded. The big incumbents will adapt — Sergey Brin came back to Google to do exactly that. The window for small + fast to take share before big + fast shows up is somewhere between eighteen and thirty-six months. Use it now or watch it close.

Treat the agent layer as a budget line. You spend money on talent. You spend money on tools. You’re going to start spending money on agents, and the difference between cheap agents and good agents is the same as the difference between cheap labor and skilled labor. The companies that win the next five years will treat their agent stack the way the best venture firms treated their portfolio-support apparatus a decade ago: as an operating advantage that compounds. The cheapest mistake is to assume agents are a commodity. The 87% delta on a lab-grown ruby is your warning shot.

The Hill family didn’t know they were paying tribute to Paulie until Henry’s testimony made it explicit. Most enterprise AI buyers in 2026 are in the same position. The substrate has already been chosen for them. The only decision left is whether to acknowledge it and pick a Paulie they want to do business with — or to keep pretending the lab is the asset and discover, two budget cycles from now, that the rent went somewhere else.

Three Questions We Think You Should Be Asking Yourself

Which Paulie are you paying tribute to, and is that the one you want to be paying for the next ten years? Every enterprise AI buyer is, whether they’ve named it or not, paying rent on a substrate. Microsoft, Google, AWS, the emerging SpaceX/xAI position, eventually whatever Larry Ellison cobbles together with Oracle plus Paramount plus the TikTok stake. Your AI strategy is implicitly a vote for which platform you want to own you. Run the exercise honestly. Pick the one whose long-term incentives are aligned with your business, not the one whose product manager happened to sell you on an integration last quarter.

Where does your business sit on the Lemkin diagnostic? Is your product critical to AI agents being successful at their jobs? If you run a public software company, this is the only question your board should be asking on the next earnings call. If you’re a portfolio investor, it’s the question you should be asking every B2B SaaS company in your book. If you’re a founder, it’s the audit you should be running on yourself before the renewals start telling you the truth. The answer is binary. There is no middle. The companies that say yes will compound. The companies that say no will look fine for two more fiscal quarters and then quietly stop renewing.

What’s the price your purchasing agent is leaving on the table? Anthropic just demonstrated that the model running your agent is worth, on average, almost ten percent of every transaction’s value, and on individual high-leverage purchases as much as 87% of the price. Whatever you’re spending on enterprise software, vendor contracts, real estate, financing, insurance, and freight — there is now a measurable improvement in your operating margins available to whoever upgrades their purchasing agent first. That number is not in your budget yet. Put it there. The premium-agent traveler gets first class for $400. The discount-agent traveler pays $700 and doesn’t know it.

“Business bad? Fuck you, pay me. You had a fire? Fuck you, pay me. Place got hit by lightning, huh? Fuck you, pay me.”

— Ray Liotta as Henry Hill, Goodfellas, 1990 (voiceover)

Henry’s monologue at the Bamboo Lounge wasn’t just a memorable line. It was a structural observation about what happens when a productive operator takes on a partner who owns the underlying protection. Sonny ran the restaurant. Paulie ran the substrate. The relationship was permanent until the productive layer stopped delivering, at which point Paulie burned the joint down for the insurance and moved on. Capitalism is structurally Paulie. AI is just the productivity layer learning, in real time, that the substrate was the asset all along. The operators running the play this week — Customers Bank, SaaStr, the Anthropic agents in Project Deal — are all, in their own way, making the same move. They’re taking the substrate seriously. They’re routing around the parts of the legacy stack that aren’t pulling their weight. They’re letting efficiency do the brutal allocation work that humans are too sentimental to do themselves. The result, twelve months from now, is going to look like an entire economic regime quietly re-priced while the slow operators were still arguing about benchmark rankings.

Pick your Paulie. Audit your stack. Pay your agent. Don’t be the schmo at the Delta counter. The window is open. It will close.

— Harry and Anthony
