With a $1.4 trillion, eight-year spending plan looming, the biggest question facing OpenAI is simple: where will the money come from? CEO Sam Altman has a multi-part answer, betting on a massive expansion of current revenue streams and the creation of entirely new ones to pay for the “compute” needed to power its AI.
The company’s current financial engine runs on ChatGPT: consumer subscriptions account for a full 75% of its income, supplemented by corporate versions of the chatbot and licensing of its AI models to other companies. That business is already scaling fast, with Altman projecting annualised revenue to top $20 billion by the end of this year, a significant jump from earlier figures.
However, $20 billion is a rounding error next to $1.4 trillion: spread over eight years, those commitments average roughly $175 billion a year, nearly nine times current revenue. To close that gap, Altman is projecting exponential growth, aiming for “hundreds of billions” in revenue by 2030. That growth is expected to come from the company’s 800 million weekly users and 1 million business customers, who Altman believes will pay more as the AI models become more powerful and indispensable.
The plan doesn’t stop there. Altman has hinted at several other future revenue sources. One would see other companies pay to use OpenAI’s vast datacenters, turning that cost center into a profit center. Another, more futuristic, involves the “huge value” he expects AI to create through scientific breakthroughs. Perhaps most concretely, the company is reportedly building new hardware devices, potentially creating a new ecosystem of products.
A Silicon Valley investor notes the entire strategy hinges on several factors: the AI models must keep improving, the cost of operating them must keep falling, and the chips that power them must get cheaper. It’s a high-stakes bet that OpenAI can turn its popular brand into a diversified, high-margin business before its massive bills come due.