Google unveils chips for AI training and inference in latest shot at Nvidia
AI-generated summary. Read the full article for complete information.
Google announced that its eighth-generation Tensor Processing Units (TPUs) will be split into two dedicated chips—one for training AI models and another for inference—marking a new effort to compete with Nvidia’s GPU-based AI hardware. The training chip offers roughly 2.8× the performance of the previous generation at the same price, while the inference chip delivers about 80% better performance and includes 384 MB of SRAM, triple the amount in its predecessor. This move follows similar strategies by other tech giants such as Amazon, Microsoft, Meta, and Apple, all of which are developing custom AI silicon to improve efficiency and support specialized use cases. Early adopters include Citadel Securities, U.S. Energy Department laboratories, and Anthropic, which plans to use multiple gigawatts of Google TPUs. Despite these advances, Google has not directly compared its new chips to Nvidia’s, and analysts estimate the combined TPU and DeepMind AI business could be worth around $900 billion.
Read more: https://www.cnbc.com/2026/04/22/google-launches-training-and-inference-tpus-in-latest-shot-at-nvidia.html
#Google #Nvidia #AminVahdat #DOE #Anthropic