
OpenAI CFO Sarah Friar declares: Priority in 2026 is closing the gap between what AI now makes possible and how people, companies, and countries are using it day to day

OpenAI CFO Sarah Friar has said that the company is shifting its focus towards practical adoption of artificial intelligence in 2026. In a recent blog post, Friar outlined the company’s priorities. “The priority is closing the gap between what AI now makes possible and how people, companies, and countries are using it day to day,” Friar wrote. “The opportunity is large and immediate, especially in health, science, and enterprise, where better intelligence translates directly into better outcomes.”

Signs of momentum are already visible. Data from Ramp showed that business spending on OpenAI models hit a record in December, surpassing rivals Anthropic and Google. The surge underscores growing enterprise reliance on OpenAI’s technology.

Despite the increasing adoption, analysts advise caution. ChatGPT-maker OpenAI has announced $1.4 trillion in infrastructure deals over the past year, including massive data center commitments. Some investors question whether revenue growth can keep pace with such spending.

Friar countered these concerns by highlighting that revenue has scaled alongside compute availability.
OpenAI’s compute capacity expanded from 0.2 GW in 2023 to 1.9 GW in 2025, while annualized revenue jumped from $2 billion to more than $20 billion. She described this as “never-before-seen growth at such scale.”

One potential path forward is advertising, which OpenAI confirmed it has begun testing. CEO Sam Altman previously called ads a “last resort,” but the move has been anticipated for months as the company seeks to diversify revenue.
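As a quick sanity check on the figures cited above, the stated multiples follow directly from the numbers themselves. The Python sketch below uses only the figures quoted in this article (the variable names are our own, for illustration):

```python
# Figures as reported: compute capacity in gigawatts, annualized revenue in $B.
compute_gw = {2023: 0.2, 2024: 0.6, 2025: 1.9}
revenue_b = {2023: 2, 2024: 6, 2025: 20}

# Cumulative growth over the two-year span.
compute_total = compute_gw[2025] / compute_gw[2023]
revenue_total = revenue_b[2025] / revenue_b[2023]

print(f"Compute grew {compute_total:.1f}x from 2023 to 2025")  # 9.5x
print(f"Revenue grew {revenue_total:.1f}x from 2023 to 2025")  # 10.0x
```

Both ratios match the blog post’s claims of roughly 9.5X compute growth and 10X revenue growth, consistent with roughly 3X year-over-year scaling on each axis.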

Read OpenAI CFO Sarah Friar’s blog post here

We launched ChatGPT as a research preview to understand what would happen if we put frontier intelligence directly in people’s hands. What followed was broad adoption and deep usage on a scale that no one predicted.

More than experimenting with AI, people folded ChatGPT into their lives. Students started using it to untangle homework they were stuck on late at night. Parents started using it to plan trips and manage budgets. Writers used it to break through blank pages. More and more, people used it to understand their lives. People used ChatGPT to help make sense of health symptoms, prepare for doctor visits, and navigate complex decisions. People used it to think more clearly when they were tired, stressed, or unsure.

Then they brought that leverage to work.

At first, it showed up in small ways. A draft refined before a meeting. A spreadsheet checked one more time. A customer email rewritten to land the right tone. Very quickly, it became part of daily workflows. Engineers reasoned through code faster. Marketers shaped campaigns with sharper insight. Finance teams modeled scenarios with greater clarity. Managers prepared for hard conversations with better context.

What began as a tool for curiosity became infrastructure that helps people create more, decide faster, and operate at a higher level.

That transition sits at the heart of how we build OpenAI. We are a research and deployment company. Our job is to close the distance between where intelligence is advancing and how individuals, companies, and countries actually adopt and use it.

As ChatGPT became a tool people rely on every day to get real work done, we followed a simple and enduring principle: our business model should scale with the value intelligence delivers.

We have applied that principle deliberately. As people demanded more capability and reliability, we introduced consumer subscriptions.
As AI moved into teams and workflows, we created workplace subscriptions and added usage-based pricing so costs scale with real work getting done. We also built a platform business, enabling developers and enterprises to embed intelligence through our APIs, where spend grows in direct proportion to outcomes delivered.

More recently, we have applied the same principle to commerce. People come to ChatGPT not just to ask questions, but to decide what to do next. What to buy. Where to go. Which option to choose. Helping people move from exploration to action creates value for users and for the partners who serve them. Advertising follows the same arc. When people are close to a decision, relevant options have real value, as long as they are clearly labeled and genuinely useful.

Across every path, we apply the same standard. Monetization should feel native to the experience. If it does not add value, it does not belong.

Both our Weekly Active User (WAU) and Daily Active User (DAU) figures continue to hit all-time highs. This growth is driven by a flywheel across compute, frontier research, products, and monetization. Investment in compute powers leading-edge research and step-change gains in model capability. Stronger models unlock better products and broader adoption of the OpenAI platform. Adoption drives revenue, and revenue funds the next wave of compute and innovation. The cycle compounds.

Looking back on the past three years, our ability to serve customers, as measured by revenue, directly tracks available compute. Compute grew 3X year over year, or 9.5X from 2023 to 2025: 0.2 GW in 2023, 0.6 GW in 2024, and ~1.9 GW in 2025. Revenue followed the same curve, growing 3X year over year, or 10X from 2023 to 2025: $2B ARR in 2023, $6B in 2024, and $20B+ in 2025. This is never-before-seen growth at such scale. And we firmly believe that more compute in these periods would have led to faster customer adoption and monetization.

Compute is the scarcest resource in AI.
Three years ago, we relied on a single compute provider. Today, we are working with providers across a diversified ecosystem. That shift gives us resilience and, critically, compute certainty. We can plan, finance, and deploy capacity with confidence in a market where access to compute defines who can scale.

This turns compute from a fixed constraint into an actively managed portfolio. We train frontier models on premium hardware when capability matters most. We serve high-volume workloads on lower-cost infrastructure when efficiency matters more than raw scale. Latency drops. Throughput improves. And we can deliver useful intelligence at costs measured in cents per million tokens. That is what makes AI viable for everyday workflows, not just elite use cases.

On top of this compute layer sits a product platform that spans text, images, voice, code, and APIs. Individuals and organizations use it to think, create, and operate more effectively. The next phase is agents and workflow automation that run continuously, carry context over time, and take action across tools. For individuals, that means AI that manages projects, coordinates plans, and executes tasks. For organizations, it becomes an operating layer for knowledge work.

As these systems move from novelty to habit, usage becomes deeper and more persistent. That predictability strengthens the economics of the platform and supports long-term investment.

The business model closes the loop. We began with subscriptions. Today we operate a multi-tier system that includes consumer and team subscriptions, a free ad- and commerce-supported tier that drives broad adoption, and usage-based APIs tied to production workloads. Where this goes next will extend beyond what we already sell. As intelligence moves into scientific research, drug discovery, energy systems, and financial modeling, new economic models will emerge. Licensing, IP-based agreements, and outcome-based pricing will share in the value created.
That is how the internet evolved. Intelligence will follow the same path.

This system requires discipline. Securing world-class compute requires commitments made years in advance, and growth does not move in a perfectly smooth line. At times, capacity leads usage. At other times, usage leads capacity. We manage that by keeping the balance sheet light, partnering rather than owning, and structuring contracts with flexibility across providers and hardware types. Capital is committed in tranches against real demand signals. That lets us lean forward when growth is there without locking in more of the future than the market has earned.

That discipline sets up our focus for 2026: practical adoption. The priority is closing the gap between what AI now makes possible and how people, companies, and countries are using it day to day. The opportunity is large and immediate, especially in health, science, and enterprise, where better intelligence translates directly into better outcomes.

Infrastructure expands what we can deliver. Innovation expands what intelligence can do. Adoption expands who can use it. Revenue funds the next leap. This is how intelligence scales and becomes a foundation for the global economy.

About the Author: TOI Tech Desk

The TOI Tech Desk is a dedicated team of journalists committed to delivering the latest and most relevant news from the world of technology to readers of The Times of India. TOI Tech Desk’s news coverage spans a wide spectrum across gadget launches, gadget reviews, trends, in-depth analysis, exclusive reports and breaking stories that impact technology and the digital universe. Be it how-tos or the latest happenings in AI, cybersecurity, personal gadgets, platforms like WhatsApp, Instagram, Facebook and more; TOI Tech Desk brings the news with accuracy and authenticity.
