Deep Learning with FPGAs

Here is why FPGAs are making inroads even though GPUs remain the de facto standard for implementing deep learning algorithms. In deep learning, graphics processing units (GPUs) have become the computing architecture of choice because of their raw speed. So why would engineers switch to FPGAs when GPUs are doing such a good job and keep getting better at it? The short answer lies in lower cost and power consumption. According to industry estimates, an FPGA is roughly 10 times more power-efficient than a high-end GPU, which makes FPGAs a viable alternative on a performance-per-watt basis in large data centers running deep learning workloads. Clustering FPGAs can then make up the difference in raw speed, approaching GPU-like performance. Companies like Microsoft and…
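As a rough illustration of the performance-per-watt argument, here is a minimal Python sketch. The throughput and power figures are hypothetical placeholders, not measurements; only the ~10x efficiency ratio reflects the industry estimate cited above.

```python
# Back-of-the-envelope performance-per-watt comparison.
# All figures are hypothetical placeholders for illustration only;
# only the ~10x efficiency ratio reflects the industry estimate cited above.

def perf_per_watt(throughput_tops: float, power_watts: float) -> float:
    """Return throughput (TOPS) delivered per watt of power drawn."""
    return throughput_tops / power_watts

# Hypothetical single-accelerator numbers (assumed, not measured).
gpu_tops, gpu_watts = 100.0, 300.0    # high-end GPU
fpga_tops, fpga_watts = 30.0, 9.0     # single FPGA

print(f"GPU:  {perf_per_watt(gpu_tops, gpu_watts):.2f} TOPS/W")
print(f"FPGA: {perf_per_watt(fpga_tops, fpga_watts):.2f} TOPS/W")

# A small cluster of FPGAs can approach the GPU's raw throughput
# while drawing far less power in aggregate.
cluster_size = 4                      # assumed number of FPGAs
cluster_tops = cluster_size * fpga_tops
cluster_watts = cluster_size * fpga_watts
print(f"FPGA cluster: {cluster_tops:.0f} TOPS at {cluster_watts:.0f} W "
      f"({perf_per_watt(cluster_tops, cluster_watts):.2f} TOPS/W)")
```

With these assumed numbers, the single FPGA delivers about ten times the TOPS per watt of the GPU, and a four-FPGA cluster matches the GPU's throughput at a fraction of the power draw, which is the trade-off the article describes.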
