
Google says its custom machine learning chips are often 15-30x faster than GPUs and CPUs


It’s no secret that Google has developed its own custom chips to accelerate its machine learning algorithms. The company first revealed those chips, called Tensor Processing Units (TPUs), at its I/O developer conference back in May 2016, but it never went into all that many details about them, except for saying that they were optimized around the company’s own TensorFlow machine-learning framework. Today, for the first time, it’s sharing more details and benchmarks about the project.
If you’re a chip designer, you can find all the gory details of how the TPU works in Google’s paper. The numbers that matter most here, though, are that based on Google’s own benchmarks (and it’s worth keeping in mind that this is Google evaluating its own chip), the TPUs are on average 15x to 30x faster at executing Google’s regular machine learning workloads than a standard GPU/CPU combination (in this case, Intel Haswell processors and Nvidia K80 GPUs). And because power consumption counts in a data center, the TPUs also offer 30x to 80x higher TeraOps/Watt (and with faster memory in the future, those numbers will probably increase).
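To see how a TeraOps/Watt comparison like that is computed, here is a quick sketch with made-up numbers; the throughput and power figures below are placeholders chosen only to show the arithmetic, not Google's measurements:

```python
# Illustrative perf-per-Watt comparison with hypothetical numbers,
# not the figures from Google's paper.

def tera_ops_per_watt(tera_ops_per_sec: float, watts: float) -> float:
    """Throughput divided by power draw, the efficiency metric Google highlights."""
    return tera_ops_per_sec / watts

# Placeholder figures for a GPU and a TPU-like accelerator.
gpu = tera_ops_per_watt(tera_ops_per_sec=3.0, watts=150.0)   # 0.02 TeraOps/W
tpu = tera_ops_per_watt(tera_ops_per_sec=45.0, watts=40.0)   # 1.125 TeraOps/W

print(f"Efficiency advantage: {tpu / gpu:.0f}x TeraOps/Watt")  # ~56x with these numbers
```

The takeaway is simply that power draw sits in the denominator, so a chip that does less work per second can still come out well ahead once electricity and cooling costs are factored in.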
It’s worth noting that these numbers are about running trained machine learning models in production (inference), not about training the models in the first place.
Google also notes that most architects optimize their chips for convolutional neural networks (a specific type of neural network that works well for image recognition, for example). Those networks, however, only account for about 5 percent of Google’s own data center workload; the majority of its applications use multi-layer perceptrons.
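For readers who haven’t run into the distinction, here is a minimal sketch in TensorFlow (the framework the TPUs are optimized around) of the two network types mentioned above; the layer shapes are arbitrary placeholders rather than anything Google has described:

```python
import tensorflow as tf

# Multi-layer perceptron: a stack of fully connected (Dense) layers.
# Layer sizes here are arbitrary placeholders.
mlp = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convolutional neural network: convolution and pooling layers that
# exploit spatial structure, which is why CNNs suit image recognition.
cnn = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```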
Google says it started looking into how it could use GPUs, FPGAs and custom ASICs (which is essentially what the TPUs are) in its data centers back in 2006. At the time, though, there weren’t all that many applications that could really benefit from this special hardware because most of the heavy workloads they required could just make use of the excess hardware that was already available in the data center anyway. “The conversation changed in 2013 when we projected that DNNs could become so popular that they might double computation demands on our data centers, which would be very expensive to satisfy with conventional CPUs,” the authors of Google’s paper write. “Thus, we started a high-priority project to quickly produce a custom ASIC for inference (and bought off-the-shelf GPUs for training).” The goal here, Google’s researchers say, “was to improve cost-performance by 10x over GPUs.”
Google isn’t likely to make the TPUs available outside of its own cloud, but the company notes that it expects that others will take what it has learned and “build successors that will raise the bar even higher.”
