Apple has opted for chips designed by Google rather than those from industry leader Nvidia to develop crucial components of its artificial intelligence software infrastructure, according to an Apple research paper published on Monday.
The decision is notable because Nvidia produces the most sought-after AI processors; Apple instead relied on Google's cloud infrastructure.
Nvidia controls roughly 80% of the AI processor market; the remainder is made up of chips from Google, Amazon.com, and other cloud computing companies. Apple's research paper did not explicitly say that no Nvidia chips were used, but its description of the hardware behind Apple's AI tools and features made no mention of Nvidia.
Apple, which did not comment on Monday, indicated that it used two types of Google’s tensor processing units (TPUs) in large clusters to train its AI models. Specifically, Apple employed 2,048 TPUv5p chips for the AI model intended for iPhones and other devices, and 8,192 TPUv4 processors for its server AI model.
Unlike Nvidia, which sells its graphics processing units (GPUs) as standalone products that customers can deploy themselves, Google sells access to its TPUs through the Google Cloud Platform, and customers must build their software through Google's cloud to use the chips.
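In practice, "building software through Google's cloud" usually means writing training or inference code against a framework with a TPU backend, such as JAX or TensorFlow, and running it on Cloud TPU VMs. The fragment below is a minimal, hypothetical sketch of that workflow for illustration only; it is not taken from Apple's paper, and the model, parameters, and shapes are placeholders.

```python
# Hypothetical sketch: a small JAX computation as it might be run on a Cloud TPU VM.
# On a TPU VM, jax.devices() lists the attached TPU cores; on a laptop it falls
# back to CPU, so the same code runs either way.
import jax
import jax.numpy as jnp

devices = jax.devices()
print(f"Backend: {jax.default_backend()}, device count: {len(devices)}")

@jax.jit  # XLA-compiles the function for the available accelerator (TPU if present)
def predict(params, x):
    w, b = params
    return jnp.tanh(x @ w + b)

# Toy parameters and inputs, purely for illustration.
key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (128, 64))
b = jnp.zeros(64)
x = jax.random.normal(key, (32, 128))

out = predict((w, b), x)
print(out.shape)  # (32, 64), computed on the TPU when one is attached
```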
Apple is set to release portions of its Apple Intelligence AI tools to beta users this week.
Reuters first reported in June that Apple was using TPU chips, but the full extent of its reliance on Google's hardware was detailed only in the research paper published Monday. Google did not respond to requests for comment, and Nvidia declined to comment.
Apple's engineers said in the paper that it would be possible to build even larger and more sophisticated models with Google's chips than the two it described.
At its June developer conference, Apple introduced a range of new AI features, including the integration of OpenAI’s ChatGPT technology into its software.
Apple's shares dipped 0.1% to $218.24 in regular trading on Monday.