
Communications of the ACM

ACM News

Google Just Got Better at Understanding Your Trickiest Searches



Google is rolling out an update to its English-language search engine designed to give it a deeper understanding of subtle queries, which will let it deliver more relevant results, company executives say.

In a briefing at Google headquarters earlier this week, some of the company's search executives showed examples of the new algorithm's improved results and explained the technology behind them. They set expectations high. Pandu Nayak, Google's vice president of search, called the changes "the single biggest change we've had in the last five years and perhaps one of the biggest since the beginning of the company."

The improvements leverage a technology developed at Google called BERT, which stands for Bidirectional Encoder Representations from Transformers. The gist is that BERT trains machine learning models by feeding them chunks of text with some of the words removed, so the models learn to predict the missing words from the surrounding context. Using supercomputers it designed itself to train machine learning models, Google is applying BERT to give its search algorithm a deeper understanding of both search queries and the web pages that contain relevant information.
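To make that masked-word training idea concrete, the short Python sketch below uses the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint (assumptions made here for illustration; the article does not describe Google's production setup) to ask a BERT model to fill in a hidden word from its context.

    # Minimal sketch of BERT-style masked-word prediction, assuming the
    # open-source Hugging Face "transformers" library and the public
    # bert-base-uncased checkpoint -- not Google's production search stack.
    from transformers import pipeline

    # The fill-mask pipeline hides one token ([MASK]) and asks BERT to
    # predict it from the words on both sides -- the "bidirectional"
    # part of the acronym.
    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    for candidate in fill_mask("Paris is the capital of [MASK]."):
        # Each candidate carries a predicted word and a confidence score.
        print(f"{candidate['token_str']:>10}  {candidate['score']:.3f}")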

From Fast Company
View Full Article


 
