Bing takes on Google and displays search results 10x faster

by Radu Tyrsina

Microsoft’s specialized hardware for AI computation is called Brainwave, and it is designed to run a neural network as fast as possible with minimal latency.

The company announced that, since adopting Brainwave, it has achieved ten times faster performance from Bing’s AI, the machine learning model that powers much of the search engine’s functionality.

Microsoft’s goal for Brainwave is to deliver real-time AI predictions for applications such as the latest Bing features.

Bing receives new features

Microsoft is also giving Bing a few new features. The most significant ones are more answers to how-to questions and definitions for less frequently used words when users hover the mouse pointer over them. These features are fueled by Brainwave.

Microsoft is using Intel’s FPGAs to power AI computation


FPGAs are a sort of blank canvas that developers can reconfigure with new circuits simply by deploying new software. This offers a potent mix of performance and programmability.

Now, Microsoft can do more than create faster models with the hardware; the company can also build more sophisticated AI systems. For instance, Bing’s Turing Prototype 1 is now ten times more complex and faster thanks to the computing capacity added via Brainwave.

The FPGAs deployed by Microsoft come with dedicated digital signal processors on board, tuned for complex AI math.

The main benefit of FPGAs

The most significant advantage of FPGAs over GPUs (which have become the preferred choice for AI computation) is that they don’t require extensive batching of calculations to run efficiently.
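As a rough illustration of why batching hurts latency (the numbers below are hypothetical, not Microsoft’s figures): if an accelerator waits to accumulate a full batch of requests before running the model, the first request queued pays the entire wait, while batch-of-one hardware can answer each request immediately.

```python
def worst_case_latency_ms(batch_size: int, arrival_interval_ms: int, compute_ms: int) -> int:
    """Worst-case end-to-end latency when requests arrive one every
    `arrival_interval_ms` and the accelerator waits for a full batch:
    the first request queued must wait for the remaining (batch_size - 1)
    requests to arrive before compute even starts."""
    queueing = (batch_size - 1) * arrival_interval_ms
    return queueing + compute_ms

# Hypothetical figures: a request every 5 ms, 10 ms of compute per run
print(worst_case_latency_ms(32, 5, 10))  # batched serving -> 165
print(worst_case_latency_ms(1, 5, 10))   # batch-of-one serving -> 10
```

The arithmetic is deliberately simple, but it captures the tradeoff: batching improves throughput at the cost of per-request latency, which is exactly what a real-time search engine wants to avoid.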

A key innovation that allows Microsoft to get such successful results from FPGAs is the use of 8- and 9-bit floating-point data types, which radically increase performance.
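The article doesn’t spell out how a narrow floating-point type works, but the core idea can be sketched in software: keep a floating-point exponent for dynamic range while rounding the mantissa down to just a couple of bits. The model below is a minimal illustration of that principle, not the exact specification of Brainwave’s 8/9-bit formats.

```python
import math

def quantize(x: float, mantissa_bits: int = 2) -> float:
    """Round x to a value representable with one implicit leading bit plus
    `mantissa_bits` explicit mantissa bits, keeping the full exponent.
    A rough software model of a narrow floating-point format."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e, with 0.5 <= |m| < 1
    steps = 2 ** (mantissa_bits + 1)  # representable mantissa values
    m = round(m * steps) / steps      # round mantissa to the nearest step
    return math.ldexp(m, e)

third = 1.0 / 3.0
q = quantize(third)                       # -> 0.3125 with 2 explicit bits
print(q, abs(q - third) / third <= 0.125)  # relative error within 2**-3
```

Shrinking the mantissa this way costs precision (here, at most a 2⁻³ relative error) but lets far more multipliers fit in the same silicon, which is where the throughput gain comes from.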

Intel’s FPGA chips allow Bing to quickly read and analyze billions of documents across the entire web and provide the best answer to your question in less than a fraction of a second. […]
In fact, Intel’s FPGAs have enabled us to decrease the latency of our models by more than 10x while also increasing our model size by 10x.

The use of FPGAs to accelerate AI computation dates back to 2012

Microsoft started using FPGAs to speed up AI computation more than six years ago, when the Bing team first adopted the chips. All this is great news for Intel as well. The company purchased Altera, an FPGA maker, in 2015, and the $16.7 billion deal has given Intel the means to meet Microsoft’s needs ever since.

We’re looking forward to seeing Brainwave available via Microsoft Azure, giving customers the chance to deploy their own models.

