How OpenAI is the reason why Microsoft CEO Satya Nadella is not ‘worried’ about Google’s AI capability

Microsoft CEO Satya Nadella confirmed the company has access to all of OpenAI's intellectual property except consumer hardware, bolstering its AI capabilities against Google. While Microsoft's own Maia AI chip production faces delays until 2026, OpenAI is pursuing a multi-pronged chip strategy, including partnerships with Nvidia, AMD, and Google, alongside developing its own chips with Broadcom.
Microsoft CEO Satya Nadella said the company's partnership with OpenAI gives it access to all of OpenAI's intellectual property except consumer hardware, providing Microsoft with AI capabilities that reduce concerns about Google's competition in the field. OpenAI, for its part, has also struck deals with Nvidia, AMD and Google for AI chips, data centers and AI model training.

During a podcast, Nadella explained the scope of Microsoft's access to OpenAI's technology. When asked if Microsoft has access to OpenAI's IP, Nadella responded: "In our case, the good news here is OpenAI has a program in which we have access to." When pressed about whether Microsoft has access to all of it, Nadella confirmed: "All of it."

"So the only IP you don't have is consumer hardware?" the interviewer asked, to which Nadella responded, "That's it."

"By the way, we gave them a bunch of IP as well to bootstrap them, right? So, this is one of the reasons why they amassed, because we built all these supercomputers together, and they benefited from it. Rightfully so," Nadella added.

Microsoft’s Maia chip delay

Microsoft has delayed mass production of its next-generation Maia AI chips until 2026, according to a previous report from The Information.
The tech giant pushed mass production of the new AI chips, codenamed Braga, back by at least six months. Braga is expected to be renamed Maia 200 upon release.

Microsoft had planned to deploy the Braga chip in data centers this year, but delays attributed to unanticipated design changes, staffing constraints and high turnover made this impossible.

While Microsoft's Maia chips face delays, Google's Tensor Processing Units have become a preferred option for training AI models. The TPUs have established themselves as reliable hardware for AI development, giving Google an advantage in the AI infrastructure market.

Google designed its TPUs specifically for machine learning workloads and has refined them through multiple generations. The chips are widely available through Google Cloud and have been used to train major AI models, making them a proven option for AI developers.

OpenAI's chip strategy

In addition to influencing Microsoft's chip development efforts, OpenAI is building its own AI chips in partnership with Broadcom. The AI company has also struck deals with Nvidia, AMD and Google for AI chips, data centers and AI model training.

About the Author: TOI Tech Desk

The TOI Tech Desk is a dedicated team of journalists committed to delivering the latest and most relevant technology news to readers of The Times of India. Its coverage spans gadget launches and reviews, trends, in-depth analysis, exclusive reports and breaking stories that impact technology and the digital universe. From how-tos to the latest happenings in AI, cybersecurity, personal gadgets, and platforms like WhatsApp, Instagram and Facebook, TOI Tech Desk brings the news with accuracy and authenticity.
