Google built its own chips to expedite its machine learning algorithms

As Google announced at its I/O developer conference today, the company recently started building its own specialized chips to expedite its machine learning algorithms. These so-called Tensor Processing Units (TPUs) are custom-built chips that Google has been using in its own data centers for almost a year, as Urs Hölzle, Google's senior VP of technical infrastructure, noted in a press conference at I/O. Google says it's getting "an order of magnitude better-optimized performance per watt for machine learning" and argues that this is "roughly equivalent to fast-forwarding technology about seven years into the future." The TPUs also let Google speed up its machine learning algorithms because they don't need the high precision of standard CPUs and GPUs. Instead of 32-bit precision, the algorithms happily run with a reduced…
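To make the reduced-precision idea concrete, here is a minimal sketch in Python of how float32 values can be mapped to a narrower integer format and back. The 8-bit target and the single-scale quantization scheme are assumptions chosen for illustration; the article is truncated before stating the exact precision the TPUs use.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float32 values onto int8 using one shared scale factor.
    (Illustrative scheme only; not Google's actual TPU format.)"""
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float32 values."""
    return q.astype(np.float32) * scale

# Reduced precision trades a small approximation error for much
# cheaper arithmetic and memory traffic.
weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
print(f"max quantization error: {np.abs(weights - approx).max():.4f}")
```

Because rounding to the nearest of 255 levels bounds the per-value error by half a quantization step, many machine learning workloads tolerate this loss of precision with little effect on accuracy, which is the trade-off the article alludes to.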


