Bing takes on Google and displays search results 10x faster


Microsoft’s specialized hardware for AI computation is called Brainwave, and it’s designed to run neural networks as fast as possible with minimal latency.

The company announced that, since deploying Brainwave, it has achieved ten times faster performance from the AI models that power Bing’s search functionality.

Microsoft’s goal with Brainwave is to provide real-time AI predictions for apps such as the latest Bing features.

Bing receives new features

Microsoft is also giving Bing a few new features. The most significant ones are more detailed answers to how-to questions and support for defining infrequently used words when users hover the mouse pointer over them. These features are fueled by Brainwave.

Microsoft is using Intel’s FPGAs to power AI computation


FPGAs are a sort of blank canvas that developers can configure with new circuits simply by sending new software. This provides a potent mix of performance and programmability.

Now, Microsoft can do more than create faster models with the hardware; the company can also build more sophisticated AI systems. For instance, Bing’s Turing Prototype 1 is now ten times more complex and faster thanks to the computation capacity added via Brainwave.

The FPGAs deployed by Microsoft come with dedicated digital signal processors on board, optimized for complex AI math.

The main benefit of FPGAs

The most significant advantage of FPGAs over GPUs (which have become the preferred choice for AI computation) is that they don’t require extensive batching of calculations.
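To see why batching matters for a search engine, consider a toy simulation. The batch size, arrival rate, and step cost below are illustrative assumptions, not measured figures: a GPU-style accelerator amortizes one model invocation across a full batch, so early requests sit idle waiting for the batch to fill, while dedicated per-request hardware (the FPGA-style case) answers each query as it arrives.

```python
# Hypothetical sketch: batching helps throughput but hurts latency.
# Assumes one model step costs about the same for 1 or 32 items
# (GPU-style), and that requests arrive every 2 ms. All numbers are
# illustrative, not taken from Microsoft's published results.

ARRIVAL_MS = 2.0   # one request arrives every 2 ms
STEP_MS = 10.0     # one model invocation, batched or not
BATCH = 32

def batched_latencies():
    """Requests wait until a full batch accumulates, then run together."""
    latencies = []
    batch_ready = (BATCH - 1) * ARRIVAL_MS  # when the last request arrives
    for i in range(BATCH):
        arrival = i * ARRIVAL_MS
        finish = batch_ready + STEP_MS
        latencies.append(finish - arrival)
    return latencies

def unbatched_latencies():
    """Each request runs immediately on dedicated hardware (FPGA-style)."""
    return [STEP_MS for _ in range(BATCH)]

print(f"batched:   worst {max(batched_latencies()):.0f} ms, "
      f"best {min(batched_latencies()):.0f} ms")
print(f"unbatched: every request {unbatched_latencies()[0]:.0f} ms")
```

Under these assumptions the first request in a batch waits 72 ms while the unbatched path answers every request in 10 ms, which is the latency gap Brainwave is built to avoid.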

A key innovation that allows Microsoft to get such successful results from FPGAs is the use of 8- and 9-bit floating-point data types, which radically increase performance.
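Microsoft has not published the exact bit layout of its narrow floating-point formats, so the sketch below uses a hypothetical 8-bit layout (1 sign bit, 4 exponent bits, 3 mantissa bits) purely to illustrate the idea: each value is rounded to the nearest number the narrow format can represent, trading a little precision for a quarter of the storage and bandwidth of float32.

```python
# Minimal sketch of low-precision floating point, assuming a
# hypothetical 8-bit layout: 1 sign bit, 4 exponent bits, 3 mantissa
# bits. Microsoft's actual formats are not publicly specified; this
# only illustrates why narrow floats save bandwidth and logic.

import math

MAN_BITS = 3          # fractional mantissa bits in the toy format
EXP_MIN, EXP_MAX = -6, 7  # exponent range a 4-bit biased field could cover

def quantize(x: float) -> float:
    """Round x to the nearest value representable in the toy 8-bit format."""
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    exp = math.floor(math.log2(abs(x)))
    exp = max(EXP_MIN, min(EXP_MAX, exp))   # clamp to the exponent range
    frac = abs(x) / (2.0 ** exp)            # normalized significand
    steps = round(frac * (1 << MAN_BITS))   # round to 3 mantissa bits
    return sign * steps / (1 << MAN_BITS) * (2.0 ** exp)

print(quantize(0.1))   # -> 0.1015625, the closest toy-8-bit value
```

With only 8 bits per weight, four times as many weights move through the same memory bandwidth as with 32-bit floats, which is a large part of where the speedup comes from.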

Intel’s FPGA chips allow Bing to quickly read and analyze billions of documents across the entire web and provide the best answer to your question in less than a fraction of a second. […]
In fact, Intel’s FPGAs have enabled us to decrease the latency of our models by more than 10x while also increasing our model size by 10x.

The use of FPGAs to accelerate AI computation dates back to 2012

Microsoft started using FPGAs to speed up AI computation more than six years ago, when the Bing team first adopted the chips. All this is great news for Intel as well. The company purchased Altera, an FPGA maker, in 2015, and the $16.7 billion deal has given Intel the capacity to fuel Microsoft’s needs to this day.

We’re looking forward to seeing Brainwave become available via Microsoft Azure, giving customers the chance to deploy their own models.
