

Google LLC is reportedly preparing to partner with Taiwanese fabless semiconductor company MediaTek Inc. on the next version of its tensor processing units, Google's in-house artificial intelligence chips.
According to The Information, citing an internal source, the next-generation TPUs could be produced by MediaTek starting sometime next year. The report also claims the work could be divided differently, with MediaTek handling the input/output modules that manage communication between the main processors and peripheral components.
Google currently uses Broadcom Inc. to produce its TPU chips, and while MediaTek may take on the next generation, the report notes that Google hasn't cut ties with Broadcom, at least for the time being.
Google’s TPUs are custom-designed application-specific integrated circuits developed to accelerate machine learning tasks, particularly those involving neural networks. The chips are optimized for Google’s TensorFlow framework and enhance both training and inference processes by efficiently handling the computational demands of deep learning models.
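For illustration, the following is a minimal sketch of how a Cloud TPU is typically targeted from TensorFlow using the framework's distribution APIs; the empty resolver argument and the toy model are assumptions made for the example, not details from the report.

```python
import tensorflow as tf

# Resolve and initialize the TPU system; the empty argument targets a
# TPU attached to the host VM (an assumption for this sketch).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates computation across the TPU cores; variables
# created inside its scope are placed on the TPU.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
# model.fit(...) would then run training steps on the TPU cores.
```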
TPUs differ from traditional processors by being tailored for high-volume, low-precision arithmetic operations. That specialization enables TPUs to deliver significant performance and energy-efficiency gains compared with general-purpose central processing units and graphics processing units.
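In practice, that low-precision arithmetic is usually requested through a mixed-precision policy. The sketch below uses Keras' "mixed_bfloat16" policy, under the assumption that the workload computes in bfloat16, the reduced-precision format TPUs are built around, while keeping weights in float32; the layer sizes are arbitrary.

```python
import tensorflow as tf

# Compute in bfloat16, keep variables in float32.
tf.keras.mixed_precision.set_global_policy("mixed_bfloat16")

layer = tf.keras.layers.Dense(256)
x = tf.random.normal([8, 128])   # inputs are cast to bfloat16 on the way in

y = layer(x)

print(layer.compute_dtype)  # bfloat16 -> matrix multiplies run in low precision
print(layer.dtype)          # float32  -> weights stay in full precision
print(y.dtype)              # bfloat16
```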
In cloud environments, TPUs are integrated into Google’s data centers to provide scalable and efficient resources for large-scale machine-learning tasks. For edge computing, Google offers the Edge TPU, a compact and power-efficient version designed to bring AI capabilities to devices such as smartphones and IoT applications.
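On the edge side, inference typically goes through TensorFlow Lite with Google's Coral runtime delegating supported operations to the Edge TPU. The sketch below assumes a Linux device with the Coral `tflite_runtime` package and `libedgetpu` installed; the model file name is hypothetical.

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load a model compiled for the Edge TPU and attach the Edge TPU delegate.
interpreter = tflite.Interpreter(
    model_path="mobilenet_v2_edgetpu.tflite",  # hypothetical file name
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print(scores.shape)
```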
While TPUs serve various purposes, AI is their best-known application, thanks to their ability to handle the massive computations required for model training and inference.
As noted by Reuters, being able to produce AI chips in-house also gives Google a competitive edge in the AI race by reducing its reliance on Nvidia Corp., which currently dominates the market for AI chips.
The most recent release of Google TPUs was the sixth generation, the Trillium TPU, which was announced in October. Offered as an alternative to Nvidia's popular GPUs, Trillium provides a four-times boost in AI training performance and a three-times boost in inference throughput compared with the previous generation of chips.
Trillium TPUs also include increased memory capacity and bandwidth, allowing the chips to run much larger large language models with more weights and larger key-value caches. The chips support a broader range of model architectures in both training and inference.
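To see why memory matters for key-value caches, here is an illustrative back-of-envelope estimate of how much memory a decoder-only transformer's cache consumes; all model dimensions below are hypothetical examples, not specifications for Trillium or any Google model.

```python
# Rough key-value cache size for a decoder-only transformer.
def kv_cache_bytes(layers, heads, head_dim, seq_len, batch, bytes_per_value=2):
    # Two tensors (keys and values) per layer, each of shape
    # [batch, heads, seq_len, head_dim], stored at bytes_per_value
    # (2 bytes for bfloat16).
    return 2 * layers * heads * head_dim * seq_len * batch * bytes_per_value

# A hypothetical large-model configuration serving an 8,192-token
# context for 8 concurrent requests.
size = kv_cache_bytes(layers=80, heads=64, head_dim=128, seq_len=8192, batch=8)
print(f"{size / 1e9:.1f} GB")  # roughly 171.8 GB in this illustrative case
```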