OpenAI’s flood of announcements is getting hard to keep up with. A selection — not exhaustive! — from just the last month:
The last two announcements just dropped yesterday, and actually bring clarity and coherence to the entire list. In short, OpenAI is making a play to be the Windows of AI.

For nearly two decades smartphones, and iOS in particular, have been the touchstones for discussing platforms. It’s important to note, however, that while Apple’s strategy of integrating hardware and software was immensely profitable, it entailed leaving the door open for a competing platform to emerge. The challenge of being a hardware company is that, by virtue of needing to actually build devices, you can’t serve everyone; Apple in particular didn’t have the capacity or desire to go downmarket, which created the opportunity for Android to not only establish a competing platform but to actually significantly exceed iOS in market share.
That means that if we want a historical analogy for total platform dominance — which increasingly looks to be OpenAI’s goal — we have to go back further, to the PC era and Windows.
Platform Establishment
Before there was Windows there was DOS; before DOS, though, there was a fast-talking, deal-making entrepreneur named Bill Gates. From The Truth About Windows Versus the Mac:
In the late 1970s and very early 1980s, a new breed of personal computers was appearing on the scene, including the Commodore, MITS Altair, Apple II, and more. Some employees were bringing them into the workplace, which major corporations found unacceptable, so IT departments asked IBM for something similar. After all, "No one ever got fired for buying IBM."
IBM spun up a separate team in Florida to put together something they could sell IT departments. Pressed for time, the Florida team put together a minicomputer using mostly off-the-shelf components; IBM’s RISC processors and the OS they had under development were technically superior, but Intel had a CISC processor for sale immediately, and a new company called Microsoft said their OS — DOS, which they acquired from another company — could be ready in six months. For the sake of expediency, IBM decided to go with Intel and Microsoft.
The rest, as they say, is history. The demand from corporations for IBM PCs was overwhelming, and DOS — and applications written for it — became entrenched. By the time the Mac appeared in 1984, the die had long since been cast. Ultimately, it would take Microsoft a decade to approach the Mac’s ease-of-use, but Windows’ DOS underpinnings and associated application library meant Microsoft’s position was secure regardless.
There is nothing like IBM and its dominant position in enterprise today; rather, the path to becoming a platform is to first be a massively popular product. Acquiring developers and users is not a chicken-and-egg problem: it’s clear that you must get users first, which attracts developers, improving your platform in a virtuous cycle; to put it another way, first a product must Aggregate users and then it gets developers for free.
ChatGPT is exactly that kind of product, and at yesterday’s DevDay 2025 keynote CEO Sam Altman and team demonstrated exactly that kind of pull; from The Verge:
OpenAI is introducing a way to work with apps right inside ChatGPT. The idea is that, from within a conversation with the chatbot, you can essentially tag in apps to help you complete a task while ChatGPT adds context and advice. The company showed off a few different ways this might work. In a live demo, an OpenAI employee launched ChatGPT and then asked Canva to create a poster of a name for a dog-walking business; after a bit of waiting, Canva came back with a few different examples, and the presenter followed up by asking for a generated pitch deck based on the poster. The employee also asked Zillow via ChatGPT to show homes for sale in Pittsburgh, and it created an interactive Zillow map — which the employee then asked follow-up questions about.
Apps available inside ChatGPT starting today will include Booking.com, Canva, Coursera, Expedia, Figma, Spotify, and Zillow. In the "weeks ahead," OpenAI will add more apps, such as DoorDash, OpenTable, Target, and Uber. OpenAI recently started allowing ChatGPT users to make purchases on Etsy via the chatbot, part of its overall push to integrate it with the rest of the web.
It’s fair to wonder if these app experiences will measure up to those companies’ self-built apps or websites, just as there are questions about just how well the company’s Instant Checkout will convert; what is notable, however, is that I disagree that this represents a "push to integrate…with the rest of the web".
This is the opposite: it is a push to make ChatGPT the operating system of the future. Apps won’t be on your phone or in a browser; they’ll be in ChatGPT, and if they aren’t, they simply won’t exist for ChatGPT users. That, by extension, means the burden of making these integrations work — and those conversions performant — will be on third-party developers, not OpenAI. This is the power that comes from owning users, and OpenAI is flexing that power in a major way.
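To make the developer side of this concrete, here is a minimal sketch of what "being in ChatGPT" might look like for a third party, assuming the integration surface resembles a Model Context Protocol-style tool server; that mechanism, the "home-listings" app name, and its hard-coded data are all my own illustrative assumptions, not anything OpenAI has specified.

```python
# Hypothetical sketch of a third-party "app" exposing a capability that a chat
# host like ChatGPT could tag into a conversation. Uses the Model Context
# Protocol Python SDK as a stand-in; OpenAI's actual Apps SDK may differ.
from mcp.server.fastmcp import FastMCP

# Invented "home-listings" service, for illustration only.
mcp = FastMCP("home-listings")

@mcp.tool()
def search_homes(city: str, max_price: int) -> list[dict]:
    """Return homes for sale in a city at or under a price ceiling."""
    # A real app would call its own backend API here; hard-coded data
    # keeps the sketch self-contained.
    listings = [
        {"address": "123 Maple St, Pittsburgh, PA", "price": 385_000},
        {"address": "45 Oak Ave, Pittsburgh, PA", "price": 512_000},
    ]
    return [
        home for home in listings
        if home["price"] <= max_price and city.lower() in home["address"].lower()
    ]

if __name__ == "__main__":
    mcp.run()  # expose the tool so a host application can invoke it mid-conversation
```

Whatever the actual plumbing turns out to be, the point stands: the third party writes and maintains this surface to ChatGPT’s specifications, while OpenAI simply owns the users on the other side of it.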
Second Sourcing
There’s a second aspect to the IBM PC strategy, and that is the role of AMD. From a 2024 Update:
While IBM chose Intel to provide the PC’s processor, they were wary of being reliant on a single supplier (it’s notable that IBM didn’t demand the same of the operating system, which was probably a combination of not fully appreciating operating systems as a point of integration and lock-in for third-party software, which barely existed at that point, and a recognition that software is just bits and not a physical good that must be manufactured). To that end IBM demanded that Intel license its processor to another chip firm, and AMD was the obvious choice: the firm was founded by Jerry Sanders, a Fairchild Semiconductor alum who had worked with Intel’s founders, and specialized in manufacturing licensed chips.
The relationship between Intel and AMD ended up being extremely fraught and largely documented by endless lawsuits (you can read a brief history in that Update); the key point to understand, however, is that (1) IBM wanted to have dual suppliers to avoid being captive to a critical component supplier and (2) IBM had the power to make that happen because they had the customers who were going to give Intel so much volume.
The true beneficiary of IBM’s foresight, of course, was Microsoft, which controlled the operating system; IBM’s mandate is why it is appropriate that "Windows" comes first in the "Wintel" characterization of the PC era. Intel reaped tremendous profits from its position in the PC value chain, but more value accrued to Microsoft than anyone else.
This question of who will capture the most profit from the AI value chain remains an open one. There is no question that the early winner is Nvidia: the company has become the most valuable in the world by virtue of its combination of best-in-class GPUs, superior networking, and a CUDA software layer that locks people into Nvidia’s own platform. And, as long as power is the limiting factor, Nvidia is well-placed to maintain its position.
What Nvidia is not shy about is capturing its share of value, and that is a powerful incentive for other companies in the value chain to look for alternatives. Google is the furthest along in this regard thanks to its decade-old investment in TPUs, while Amazon is seeking to imitate that strategy with Trainium; Microsoft and Meta are both working to design and build their own chips, and Apple is upscaling Apple Silicon for use in the data center.
Once again, however, the most obvious and most immediately available alternative to Nvidia is AMD, and I think the parallels between yesterday’s announcement of an OpenAI-AMD deal and IBM’s strong-arming of Intel are very clear; from the Wall Street Journal:
OpenAI and chip designer Advanced Micro Devices announced a multibillion-dollar partnership to collaborate on AI data centers that will run on AMD processors, one of the most direct challenges yet to industry leader Nvidia. Under the terms of the deal, OpenAI committed to purchasing 6 gigawatts worth of AMD’s chips, starting with the MI450 chip next year. The ChatGPT maker will buy the chips either directly or through its cloud computing partners.
AMD chief Lisa Su said in an interview Sunday that the deal would result in tens of billions of dollars in new revenue for the chip company over the next half-decade. The two companies didn’t disclose the plan’s expected overall cost, but AMD said it costs tens of billions of dollars per gigawatt of computing capacity. OpenAI will receive warrants for up to 160 million AMD shares, roughly 10% of the chip company, at 1 cent per share, awarded in stages if OpenAI hits certain milestones for deployment. AMD’s stock price also has to increase for the warrants to be exercised.
If OpenAI is the software layer that matters to the ecosystem, then Nvidia’s long-term pricing power will be diminished; the company, like Intel, may still take the lion’s share of chip profits through sheer performance and low-level lock-in, but I believe the most important reason OpenAI is making this deal is to lock in its own dominant position in the stack. It is quite notable that this announcement comes only weeks after Nvidia’s investment in OpenAI; that, though, is another confirmation that the company who has the users has the ultimate power.
There is one other part of the stack to keep an eye on: TSMC. Both Nvidia and AMD make their chips with the Taiwanese giant, and while TSMC is famously reticent to take value, they are positioned to do so in the long run. Altman surely knows this as well, which means I wouldn’t be surprised if there is an Intel announcement sooner rather than later; perhaps there is fire to that recent smoke about AMD talking with Intel?
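As an aside, the warrant structure reported above is worth a quick back-of-the-envelope calculation; the share count and strike price come from the Journal’s reporting, while the future AMD share price below is purely an illustrative assumption.

```python
# Rough value of the reported OpenAI-AMD warrants (160 million shares at a
# $0.01 strike); the assumed future share price is hypothetical.
shares = 160_000_000
strike_price = 0.01             # dollars per share
assumed_share_price = 200.00    # illustrative price if deployment milestones are hit

exercise_cost = shares * strike_price          # $1,600,000
position_value = shares * assumed_share_price  # $32,000,000,000

print(f"Cost to exercise: ${exercise_cost:,.0f}")
print(f"Value at ${assumed_share_price:,.0f}/share: ${position_value:,.0f}")
```

In other words, if the milestones are hit OpenAI pays on the order of a million dollars for a stake worth tens of billions — the kind of asymmetry only the company that controls the demand can extract.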
The AI Linchpin
When I started writing Stratechery, Windows was a platform in decline, superseded by mobile and, surprisingly enough, increasingly challenged by its all-but-vanquished ancient foe, the Mac. To that end, one of my first pieces about Microsoft was about then-CEO Steve Ballmer’s misguided attempt to focus on devices instead of services. I wrote a few years later in Microsoft’s Monopoly Hangover:
The truth is that both [IBM and Microsoft] were victims of their own monopolistic success: Windows, like the System/360 before it, was a platform that enabled Microsoft to make money in all directions. Both companies made money on the device itself and by selling many of the most important apps (and in the case of Microsoft, back-room services) that ran on it. There was no need to distinguish between a vertical strategy, in which apps and services served to differentiate the device, or a horizontal one, in which the device served to provide access to apps and services. When you are a monopoly, the answer to strategic choices can always be "Yes."
Microsoft at that point in time no longer had that luxury: the company needed to choose — the days of doing everything were over — and that choice had to be services (which is exactly what Satya Nadella did).
Ever since the emergence of ChatGPT made OpenAI The Accidental Consumer Tech Company I have been making similar arguments about OpenAI: they should focus on the consumer opportunity and leave the enterprise API market to Microsoft. Not only would focus help the company capture the consumer opportunity, there was the opportunity cost of GPUs used for the API that could not be used to deliver users a better experience across every tier.
I now have much more appreciation for OpenAI’s insistence on doing it all, for two reasons. First, this is a company in pure growth mode, not in decline. Tradeoffs are in the long run inevitable, but why make them before you must? It would have been a mistake for Microsoft to limit Windows to only the enterprise in the 1980s, even if the company had to quietly retreat from the consumer market over the last fifteen years; there was a lot of money to make before that retreat needed to happen! OpenAI, meanwhile, is the hottest brand in AI, so why not make a play to own it all, from consumer touchpoint to API to everything in between?
Second, we have clearly crossed the line into bubble territory, which was always inevitable. The question now is whether or not this is a productive bubble: what durable infrastructure will be built by eventually bankrupt companies that we benefit from for years to come?
GPUs are not that durable infrastructure; data centers are more long-lasting, but not worth the financial pain of a bubble bursting. The real payoff would be a massive build-out in power generation, which would be a benefit for the next half century. Another potential payoff would be the renewed viability of Intel, and as I noted above, OpenAI may be uniquely positioned and motivated to make that happen.
More broadly, this play to be the Windows of AI effectively positions OpenAI as the linchpin of the entire AI buildout. Just look at what the mere announcement of partnerships with OpenAI has done for the shares of Oracle and AMD. OpenAI is creating the conditions such that it is the primary manifestation of the AI bubble, which ensures the company is the primary beneficiary of all of the speculative capital flooding into the space. Were the company more focused, as I previously suggested, they might not have the leverage to get enough funding to meet those more modest (but still incredible) goals; now it is hard to see them not getting whatever money they want, at least until the bubble bursts.
What is amazing about this overview is that I have only scratched the surface of what OpenAI announced both yesterday and over the last month — and I haven’t even mentioned Sora (although I covered that topic yesterday). What the company is seeking to achieve is incredibly audacious, but also logical, and something we have seen before:

And, interestingly enough, there is an Apple to OpenAI’s Microsoft: it is Google, with their fully integrated stack, from chips to data centers to models to end-user distribution channels. Instead of taking on a menagerie of competitors, however, Google is facing an increasingly unified ecosystem, organized, whether they want to be or not, around OpenAI. Such is the power of aggregating demand and the phenomenon that is ChatGPT.












