Top Stories

How Apple used Google’s help to train its AI models

Published by Uma Rajagopal

Posted on June 12, 2024

2 min read

Last updated: January 30, 2026

[Image: Apple CEO Tim Cook on stage discussing the collaboration with Google on training AI models]


How Apple used Google’s help to train its AI models

By Max A. Cherney

SAN FRANCISCO (Reuters) – On stage on Monday, Tim Cook’s Apple announced a splashy deal with OpenAI to include its powerful artificial intelligence model as part of its voice assistant, Siri.

But in the fine print of a technical document Apple published after the event, the company makes clear that Alphabet’s Google has emerged as another winner in the Cupertino, California, company’s quest to catch up in AI.

To build Apple’s foundation AI models, the company’s engineers used its own framework software with a range of hardware: its own on-premises graphics processing units (GPUs) and chips available only on Google’s cloud, called tensor processing units (TPUs).
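The article doesn’t detail Apple’s training code, but the general pattern it describes, one framework running on a mix of GPUs and TPUs, is what accelerator-agnostic libraries such as JAX are built for. The sketch below is illustrative only (the linear model, data, and learning rate are made up): the same compiled training step runs unchanged on whichever backend JAX finds, whether CPU, GPU, or TPU.

```python
# A minimal, backend-agnostic training step in JAX. XLA compiles the
# same code for whatever accelerator is available (cpu, gpu, or tpu).
# The model and data are toy placeholders, not Apple's actual setup.
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Simple linear model: predictions = x @ w + b
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit  # compiled via XLA for the active backend
def train_step(params, x, y, lr=0.1):
    grads = jax.grad(loss_fn)(params, x, y)
    # Gradient-descent update applied across the parameter pytree
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

params = {"w": jnp.zeros((3,)), "b": jnp.zeros(())}
x = jnp.ones((8, 3))
y = jnp.ones((8,))

for _ in range(100):
    params = train_step(params, x, y)

print(jax.default_backend())          # which accelerator ran the step
print(float(loss_fn(params, x, y)))   # loss driven toward zero
```

The point of the sketch is that nothing in the training step names the hardware; moving from on-premises GPUs to cloud TPUs is a deployment choice, not a code change.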

Google has been building TPUs for roughly 10 years and has publicly discussed two variants of its fifth-generation chips that can be used for AI training; the performance-oriented version of the fifth generation is competitive with Nvidia’s H100 AI chips, Google has said.

Google announced at its annual developer conference that a sixth generation will launch this year.

The processors are designed specifically to run AI applications and train models, and Google has built a cloud computing hardware and software platform around them.

Apple and Google did not immediately return requests for comment.

Apple did not discuss the extent to which it relied on Google’s chips and software compared with hardware from Nvidia or other AI vendors.

But using Google’s chips typically requires a client to purchase access to them through Google’s cloud division, much in the same way customers buy computing time from Amazon.com’s AWS or Microsoft’s Azure.

(Reporting by Max A. Cherney; Editing by Sandra Maler)

Frequently Asked Questions

What is Artificial Intelligence?
Artificial Intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think and learn. It encompasses various technologies, including machine learning and natural language processing.
What are Tensor Processing Units?
Tensor Processing Units (TPUs) are specialized hardware accelerators designed by Google to efficiently run machine learning workloads, particularly for training and inference of deep learning models.
What is cloud computing?
Cloud computing is the delivery of computing services over the internet, allowing users to access and store data and applications on remote servers instead of local computers.
What is a voice assistant?
A voice assistant is an AI-powered software application that can understand and respond to voice commands, helping users perform tasks using natural language.
What is machine learning?
Machine learning is a subset of AI that enables systems to learn from data and improve their performance over time without being explicitly programmed.
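The definition above can be made concrete with a small sketch. The snippet below (plain Python, no frameworks; the data and learning rate are purely illustrative) "learns" the slope of a line from example pairs by gradient descent, rather than having the rule hard-coded.

```python
# Minimal machine-learning example: learn y = 3x from (x, y) pairs
# by gradient descent on squared error, instead of hard-coding the rule.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]

w = 0.0    # model parameter; initially the model knows nothing
lr = 0.02  # learning rate (illustrative)

for _ in range(500):
    # Average gradient of (w*x - y)^2 with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # close to 3.0, recovered from the data alone
```

After training, the parameter converges to the slope implied by the examples, which is the "improving from data" behavior the FAQ describes.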
