Google has announced a beta program making its in-house Tensor Processing Units (TPUs) available to cloud customers. Each TPU board is built from four custom application-specific integrated circuits (ASICs) and delivers up to 180 teraflops of machine-learning performance with 64 GB of high-bandwidth memory.
The board is designed to run computationally intensive machine-learning algorithms that support Web applications such as language translation, text search, and ad serving. Google says the TPU can speed up these workloads while consuming far less power than conventional graphics-processing units (GPUs) and central-processing units (CPUs).
Google has open-sourced a number of machine-learning models for the TPU, such as ResNet-50 for image classification, Transformer for language processing, and RetinaNet for object detection. For applications outside those domains, TPUs can be programmed at a lower level using the TensorFlow application programming interfaces (APIs), as in the sketch below.
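The article does not show code, but a minimal sketch of what that lower-level TensorFlow path looked like at the time is given below, using the TensorFlow 1.x TPUEstimator APIs. The TPU name, Cloud Storage path, model, and batch size are illustrative assumptions, not details from the announcement.

```python
# Sketch only: trains a tiny illustrative model on a Cloud TPU via
# the TensorFlow 1.x tf.contrib.tpu APIs. Names such as "my-tpu" and
# "gs://my-bucket/model" are hypothetical placeholders.
import tensorflow as tf


def model_fn(features, labels, mode, params):
    # Trivial linear classifier, purely for illustration.
    logits = tf.layers.dense(features["x"], units=10)
    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=logits))
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
    # CrossShardOptimizer aggregates gradients across the TPU cores.
    optimizer = tf.contrib.tpu.CrossShardOptimizer(optimizer)
    train_op = optimizer.minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.contrib.tpu.TPUEstimatorSpec(
        mode=mode, loss=loss, train_op=train_op)


def input_fn(params):
    # TPUEstimator supplies the per-shard batch size in params["batch_size"].
    batch_size = params["batch_size"]
    dataset = tf.data.Dataset.from_tensor_slices(
        ({"x": tf.random_normal([1024, 64])},
         tf.zeros([1024], dtype=tf.int32)))
    return dataset.repeat().batch(batch_size, drop_remainder=True)


# Resolve the Cloud TPU ("my-tpu" is a placeholder name or gRPC address).
resolver = tf.contrib.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
run_config = tf.contrib.tpu.RunConfig(
    cluster=resolver,
    model_dir="gs://my-bucket/model",  # hypothetical GCS path
    tpu_config=tf.contrib.tpu.TPUConfig(iterations_per_loop=100))

estimator = tf.contrib.tpu.TPUEstimator(
    model_fn=model_fn,
    config=run_config,
    use_tpu=True,
    train_batch_size=128)

estimator.train(input_fn=input_fn, max_steps=1000)
```

The open-sourced reference models (ResNet-50, Transformer, RetinaNet) wrap this same pattern, so most users only needed the lower-level APIs when their models fell outside those prepackaged domains.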
From Top500
Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA