Aiming to Boost AI, Microsoft Restructures Catapult into V2

Project Catapult, the experimental server effort responsible for delivering fast and accurate Bing results, has just received an architectural upgrade. The Field Programmable Gate Arrays (FPGAs) in the Catapult servers help deliver better Bing results: they can quickly score, rank, filter, and measure the relevancy of text and image queries. The original Catapult server, which was used to investigate the role of FPGAs in speeding up servers, has now been redesigned. The proposed Catapult V2 design offers more flexibility by circumventing traditional data-center structures for machine learning, and it expands the role of FPGAs as accelerators.
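To illustrate the kind of work being offloaded, here is a minimal sketch in Python of query-document relevance scoring with an optional accelerator path. The `fpga` object and its `score` method are hypothetical stand-ins for a hardware interface, not Microsoft's actual API, and the term-overlap scorer is only a placeholder for Bing's real ranking models.

```python
from dataclasses import dataclass
from typing import List, Sequence


@dataclass
class ScoredDoc:
    doc_id: str
    score: float


def score_on_cpu(query: str, docs: Sequence[str]) -> List[ScoredDoc]:
    """Naive term-overlap relevance score, standing in for the software ranker."""
    q_terms = set(query.lower().split())
    results = []
    for i, doc in enumerate(docs):
        d_terms = set(doc.lower().split())
        overlap = len(q_terms & d_terms) / max(len(q_terms), 1)
        results.append(ScoredDoc(doc_id=f"doc{i}", score=overlap))
    return sorted(results, key=lambda r: r.score, reverse=True)


def score_query(query: str, docs: Sequence[str], fpga=None) -> List[ScoredDoc]:
    """Offload scoring to an attached FPGA if one is available; otherwise use the CPU."""
    if fpga is not None:
        return fpga.score(query, docs)  # hypothetical accelerator interface
    return score_on_cpu(query, docs)


if __name__ == "__main__":
    ranked = score_query("fpga servers", ["fpga based bing servers", "cloud image recognition"])
    for r in ranked:
        print(r.doc_id, round(r.score, 2))
```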


Microsoft proposed the new Catapult V2 design at the Scaled Machine Learning Conference in California. The design allows the FPGAs to be hooked up to a much larger set of computing resources: each FPGA is connected to the CPU, DRAM, and the network switches. An FPGA can therefore serve as a processing resource for large-scale deep-learning models or accelerate local applications. FPGAs can also handle the training and scoring of deep-learning models, workloads that closely resemble Bing's. This is a big improvement over the original model, where FPGAs were limited to a smaller network of servers.
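A rough Python sketch of that difference follows, assuming hypothetical `LocalFpga` and `FpgaPool` classes: in the V2 topology a request can either stay on the server's own FPGA or fan out across FPGAs reachable through the network switches, rather than being confined to a small ring of neighbouring servers.

```python
import random
from typing import List, Sequence


class LocalFpga:
    """Stand-in for the FPGA on a single server's own I/O path (local acceleration)."""

    def run(self, batch: Sequence[float]) -> List[float]:
        # Placeholder for a model evaluated in hardware.
        return [x * 2.0 for x in batch]


class FpgaPool:
    """Stand-in for FPGAs reachable through the data-center network switches,
    usable as a shared processing resource for large deep-learning models."""

    def __init__(self, nodes: int) -> None:
        self.nodes = nodes

    def run(self, batch: Sequence[float]) -> List[float]:
        # Split the batch into contiguous shards, one per remote FPGA,
        # then stitch the results back together in order.
        shard_size = -(-len(batch) // self.nodes)  # ceiling division
        out: List[float] = []
        for start in range(0, len(batch), shard_size):
            shard = batch[start:start + shard_size]
            out.extend(x * 2.0 for x in shard)  # each shard processed "remotely"
        return out


def dispatch(batch: Sequence[float], local: LocalFpga, pool: FpgaPool,
             threshold: int = 8) -> List[float]:
    """Small requests stay on the local accelerator; large ones fan out to the pool."""
    return local.run(batch) if len(batch) <= threshold else pool.run(batch)


if __name__ == "__main__":
    activations = [random.random() for _ in range(32)]
    print(len(dispatch(activations, LocalFpga(), FpgaPool(nodes=4))))
```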


The Catapult V2 design can be used for natural language processing, cloud-based image recognition, and other tasks typically associated with machine learning.