Bing takes on Google and displays search results 10x faster

By: Costea Lestoc

Microsoft’s specialized hardware platform for AI computation is called Brainwave, and it’s designed to run neural networks as fast as possible with minimal latency.

The company announced that since it started using Brainwave, it has made Bing’s AI ten times faster. These machine learning models power the search engine’s core functionality.

Microsoft’s goal for Brainwave is to deliver real-time AI predictions for applications such as the latest Bing features.

Bing receives new features

Microsoft is also giving Bing a few new features. The most significant ones are more answers to how-to questions and definitions for less frequently used words when users hover the mouse pointer over them. These features are powered by Brainwave.

Microsoft is using Intel’s FPGAs to power AI computation


FPGAs are a sort of blank canvas that developers can reconfigure into different circuits by sending them new software. This provides a potent mix of performance and programmability.

Now, Microsoft can do more than create faster models with this hardware; the company can also build more sophisticated AI systems. For instance, Bing’s Turing Prototype 1 is now ten times more complex and faster thanks to the computational capacity added by Brainwave.

The FPGAs deployed by Microsoft come with dedicated on-board digital signal processors optimized for complex AI math.

The main benefit of FPGAs

The most significant advantage of FPGAs over GPUs (which have become the preferred choice for AI computation) is that they don’t require extensive batching of calculations to perform well.
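To make the batching point concrete, here is a toy back-of-the-envelope model in Python. Every number in it (query arrival rate, batch size, compute times) is an illustrative assumption rather than a figure from Microsoft; it simply shows how waiting to fill a batch adds to each request’s response time, which is exactly what an interactive search engine wants to avoid.

```python
# Toy latency model: why batch-free inference helps interactive search.
# All numbers are illustrative assumptions, not Microsoft's measurements.

def batched_latency(batch_size, arrival_rate_qps, batch_compute_ms):
    """Worst-case per-request latency when requests are queued into batches.

    A request may wait for (batch_size - 1) later arrivals before the
    batch is dispatched, then waits for the whole batch to compute.
    """
    wait_for_batch_ms = (batch_size - 1) / arrival_rate_qps * 1000.0
    return wait_for_batch_ms + batch_compute_ms

def single_request_latency(compute_ms):
    """Per-request latency when each request is served as soon as it arrives."""
    return compute_ms

if __name__ == "__main__":
    qps = 100              # assumed arrival rate: 100 queries per second
    gpu_batch = 32         # assumed batch size needed for good GPU utilization
    gpu_batch_ms = 20.0    # assumed compute time for the whole batch
    fpga_ms = 5.0          # assumed compute time for one query on the FPGA

    print(f"Batched (GPU-style) latency:  {batched_latency(gpu_batch, qps, gpu_batch_ms):.1f} ms")
    print(f"Batch-1 (FPGA-style) latency: {single_request_latency(fpga_ms):.1f} ms")
```

With these assumed numbers, the batched path spends most of its time just waiting for the batch to fill, which is why hardware that runs well on single requests is attractive for search.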

A key innovation that allows Microsoft to get such results from FPGAs is the use of 8- and 9-bit floating-point data types, which radically increase performance.
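As a rough illustration of what narrow floating-point formats trade away, here is a short Python sketch that rounds full-precision values to a hypothetical 8-bit-style layout (one sign bit, four exponent bits, three mantissa bits). Both the layout and the quantize_float helper are assumptions made for illustration only; Microsoft’s actual Brainwave formats are not specified in this article.

```python
# Sketch of low-precision floating-point quantization, assuming a hypothetical
# 8-bit layout (1 sign bit, 4 exponent bits, 3 mantissa bits). Microsoft's
# actual narrow formats used by Brainwave may differ.
import math

def quantize_float(x, exp_bits=4, man_bits=3):
    """Round x to the nearest value representable with the given bit budget."""
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    # abs(x) = mantissa * 2**exponent, with mantissa in [0.5, 1)
    mantissa, exponent = math.frexp(abs(x))
    # Keep man_bits explicit mantissa bits plus the implicit leading bit.
    scale = 2 ** (man_bits + 1)
    mantissa = round(mantissa * scale) / scale
    # Clamp the exponent to the range a small exponent field could hold.
    max_exp = 2 ** (exp_bits - 1)
    exponent = max(-max_exp, min(max_exp - 1, exponent))
    return sign * math.ldexp(mantissa, exponent)

if __name__ == "__main__":
    for w in [0.1234, -3.14159, 42.0, 0.5]:
        q = quantize_float(w)
        print(f"{w:>10.5f} -> {q:>10.5f}  (error {abs(w - q):.5f})")
```

The point of the sketch is that each value survives with only a few bits of precision; hardware that computes directly on such narrow values can move and multiply far more numbers per cycle than it could at full 32-bit precision.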

In Microsoft’s words:

Intel’s FPGA chips allow Bing to quickly read and analyze billions of documents across the entire web and provide the best answer to your question in less than a fraction of a second. […]
In fact, Intel’s FPGAs have enabled us to decrease the latency of our models by more than 10x while also increasing our model size by 10x.

The use of FPGAs to accelerate AI computation dates back to 2012

Microsoft started using FPGAs to speed up AI computation more than six years ago, when the Bing team began working with the chips. This is great news for Intel as well. The company purchased Altera, an FPGA maker, in 2015, and the $16.7 billion deal has given Intel the capacity to meet Microsoft’s needs ever since.

We’re looking forward to Brainwave becoming available through Microsoft Azure, which will give customers the chance to deploy their own models.
