OpenAI just raised $110 billion. Its biggest backer is Microsoft. And this week, it started building a GitHub replacement.

That’s not a side project. That’s a jailbreak.

The Information reported Monday that OpenAI is developing an internal code-hosting platform to rival GitHub — the Microsoft-owned service that nearly every software team on earth depends on. The stated motivation is mundane: GitHub outages kept disrupting OpenAI’s engineering workflows. (GitHub incidents rose 58% year-over-year in the first half of 2025, per GitProtect.) But the real story is structural. When your largest investor owns the infrastructure your engineers can’t work without, you don’t have a vendor relationship. You have a leash.

And OpenAI is not the only company trying to slip it.

The Cursor Playbook

Cursor — the AI-native code editor that’s become the default for a growing slice of professional developers — pulled the same move at a smaller scale. For most of its life, Cursor routed coding tasks through frontier API models: Claude, GPT, Gemini. Every keystroke that triggered an AI suggestion burned tokens at someone else’s price. The economics were brutal. Teams defaulting to frontier models for routine tasks — auto-completions, simple refactors, boilerplate — were draining their API budgets on work that didn’t need frontier intelligence.

So Cursor built Composer 1.5: their own agentic coding model, trained with reinforcement learning scaled roughly 20x beyond the previous version. Then they split their pricing into two pools. Composer 1.5 and Auto mode draw from a generous included allocation — three to six times the previous limits. Frontier API models (Claude, GPT) draw from a separate, metered pool.

The move is elegant. Routine work flows through Cursor’s own model at no marginal cost. Frontier models get reserved for the hard problems that justify the price. But the strategic logic matters more than the pricing: Cursor stopped being downstream of Anthropic and OpenAI’s token economics. They built their own floor.
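The two-pool idea reduces to a routing decision: cheap, routine work goes to the in-house model; expensive frontier calls are reserved for tasks that justify the meter. Here is a minimal sketch of that logic. The model names, the token threshold, and the keyword heuristic are all invented for illustration — this is not Cursor's actual implementation.

```python
# Hypothetical sketch of two-pool model routing.
# All names and thresholds are illustrative assumptions, not Cursor's code.

ROUTINE_TASKS = {"autocomplete", "rename", "boilerplate", "format"}

def route_task(task_kind: str, est_tokens: int) -> str:
    """Send routine or small tasks to the in-house model (included
    allocation, no marginal cost); reserve the metered frontier pool
    for hard, high-value work."""
    if task_kind in ROUTINE_TASKS or est_tokens < 500:
        return "in_house_model"
    return "frontier_api"

print(route_task("autocomplete", 120))        # in_house_model
print(route_task("architecture_review", 5000))  # frontier_api
```

The design choice worth noticing: the routing threshold is owned by the editor vendor, not the API provider — which is exactly what moves the economics in-house.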

The Pattern

This is the same pattern, at two different scales. OpenAI builds a GitHub alternative because Microsoft controls a critical dependency. Cursor builds Composer 1.5 because Anthropic and OpenAI control the per-token meter. In both cases, the trigger isn’t ambition — it’s vulnerability.

And the pattern keeps repeating. Apple ships neural accelerators in every GPU core of the new M5 chips, making on-device inference fast enough that professional workflows don’t need to round-trip to a data center. The motivation isn’t generosity. It’s that Apple watched the entire smartphone industry become dependent on cloud AI providers, and decided that dependency wasn’t going to be theirs.

The conventional wisdom says this is wasteful. Build-or-buy analysis, comparative advantage, focus on your core competency — the MBA playbook says you should negotiate better terms with your supplier, not reinvent their product. And in a stable industry, that advice is sound.

AI is not a stable industry. The capability frontier moves in months. Today’s essential API partner ships a competing product tomorrow. GitHub Copilot went from “developer tool” to “the reason Microsoft bought GitHub was worth it” in under two years. Anthropic launched Claude Code. Google launched Gemini Code Assist. Every model provider is one product launch away from competing directly with the customers who depend on their API.

In that environment, dependency isn’t a cost line. It’s a countdown.

What This Actually Means

The companies that survive the next phase of AI won’t be the ones with the best models, the most funding, or the flashiest demos. They’ll be the ones who recognized, early enough, that every supplier relationship in AI has an expiration date — and started building before the clock ran out.

OpenAI seems to understand this, which is ironic given that it’s simultaneously everyone else’s supplier. Cursor understood it and acted. Apple understood it at the silicon level. The companies that don’t understand it are the ones still celebrating their “strategic partnership” with a provider who’s already sketching the product that makes them redundant.

The AI supply chain doesn’t have permanent allies. It has temporary arrangements between future competitors. Build your own floor — or discover that the one you’re standing on belongs to someone else.