Google’s TPU Hit a Tight Sked
Summary
In this article, Google reveals that it has been using a custom-designed processor for its machine learning workloads for several years. The processor was specially designed to accelerate the TensorFlow algorithms Google relies on, and for that reason it is named the Tensor Processing Unit (TPU). From experiments on GPUs, CPUs, and FPGAs, research engineers observed that customization at the silicon-fabrication level brings performance benefits, with improvements of several orders of magnitude. The philosophy behind this is lower precision: use only as many bits as are necessary. Google has many similar but diverse chips that include accelerators. The work is led by engineer Norman Jouppi, who was a member of the MIPS architecture development team at Stanford University.
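The "lower precision" philosophy mentioned above can be illustrated with a small sketch. This is not the TPU's actual quantization scheme, just an assumed linear mapping of 32-bit floats onto 8-bit integers to show why fewer bits can be enough; the example values are invented.

```python
def quantize(values, num_bits=8):
    """Map floats onto a signed integer grid of the given bit width."""
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = max(abs(v) for v in values) / qmax
    return [round(v / scale) for v in values], scale

def dequantize(quantized, scale):
    """Recover approximate float values from the integer grid."""
    return [q * scale for q in quantized]

# Hypothetical neural-network weights, not taken from the article.
weights = [0.31, -1.27, 0.64, 0.05]
q, scale = quantize(weights)
approx = dequantize(q, scale)
```

Each quantized value now fits in a single byte instead of four, yet the dequantized values stay close to the originals, which is why narrow arithmetic units can still deliver useful accuracy.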
At university we are required to design processors to perform tasks. Through this article, it is shown that improvements can be made by allocating more resources to the areas that are used most (Amdahl's Law).
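Amdahl's Law, cited above, can be made concrete with a short sketch. The fractions and speedup factor here are made-up illustrative numbers, not figures from the article.

```python
def amdahl_speedup(accelerated_fraction, factor):
    """Overall speedup when `accelerated_fraction` of the work runs
    `factor` times faster and the remainder is unchanged."""
    return 1.0 / ((1.0 - accelerated_fraction) + accelerated_fraction / factor)

# Accelerating 90% of a workload by 10x yields only about 5.3x overall,
# because the untouched 10% now dominates the runtime.
overall = amdahl_speedup(0.9, 10)
```

This is why it pays to spend silicon on the operations a workload actually spends its time in, which is exactly what a task-specific chip like the TPU does.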
This article passes on the message that by designing a chip to undertake a particular task, we can eliminate unwanted processing and achieve higher performance. For this reason, companies such as Google are moving towards custom-designed processors.