How Does BERT Modify Google Search?

Figuratively speaking, Google’s search engine is an endless ocean that contains pretty much anything you could want to know. Be it a complex mathematical equation or a random sports website, Google will have an answer for almost all of your questions! 

Of the billions of queries asked on Google Search every day, almost 15% are ones that users have never asked before. To handle them better, Google has rolled out a series of updates that improve the results returned for each query. 

How Does Google Search Work?

To explain it in the simplest possible way, Google Search tries to understand and figure out the nuances of different languages as queries are typed into the search engine. It has traditionally been difficult for AI to understand basic, conversational queries, which is why people frequently search with specific keywords to aid the search! 

Google analyzes the content of all the web pages it finds through a process called indexing. After analyzing a page, Google stores its information in a database called the Google Index. From that database, Google ranks web pages against the queries the end-user enters on Google Search. When Google ranks the pages, it factors in many signals, including the user’s device and location. 
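If you want to see the general idea in code, here is a toy Python sketch of an inverted index with a crude word-overlap ranking. The pages, the query, and the scoring below are all invented for illustration and have nothing to do with Google’s real pipeline, which relies on a huge number of signals.

```python
# Toy illustration of indexing and ranking, not Google's actual pipeline.
# The documents, the query, and the simple word-overlap scoring are invented
# purely to show the general idea of an inverted index.
from collections import defaultdict

documents = {
    "page1": "how to solve a quadratic equation step by step",
    "page2": "latest football scores and sports news",
    "page3": "quadratic formula explained with solved examples",
}

# Indexing: map each word to the set of pages that contain it.
index = defaultdict(set)
for page, text in documents.items():
    for word in text.split():
        index[word].add(page)

def rank(query):
    """Rank pages by how many query words they contain (a crude stand-in
    for the many signals a real search engine would use)."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for page in index.get(word, set()):
            scores[page] += 1
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank("quadratic equation examples"))
# e.g. [('page1', 2), ('page3', 2)] -- the order of ties may vary
```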

Implementation of BERT in Google Search

What is BERT? 
Going by Google Search’s explanation, it is an open-sourced, neural-network-based technique for natural language processing (NLP) pre-training. BERT (Bidirectional Encoder Representations from Transformers) enables anyone to train their own question answering system. Its specialty lies in the fact that BERT models capture the meaning and intent of most search queries far better than their predecessors. And beyond the software itself, BERT’s introduction also pushed hardware limits: Google has noted that it needed its latest Cloud TPUs to serve BERT-powered search results. 
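To make that concrete, here is a minimal sketch of BERT-style question answering using the open-source Hugging Face transformers library and a publicly released SQuAD-fine-tuned BERT checkpoint. This only illustrates the kind of question answering BERT enables; it is not the setup Google Search uses internally.

```python
# Minimal sketch of question answering with a BERT model, assuming the
# Hugging Face `transformers` library is installed and the public
# SQuAD-fine-tuned checkpoint below can be downloaded.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT (Bidirectional Encoder Representations from Transformers) is an "
    "open-sourced, neural-network-based technique for natural language "
    "processing pre-training released by Google in 2018."
)

result = qa(question="What does BERT stand for?", context=context)
print(result["answer"])  # expected: the expansion of the acronym
```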

So, is BERT a language model? Very much so. BERT has been designed to improve Google Search’s understanding of language, especially conversational queries in which small connecting words such as “for” and “to” carry a lot of the meaning. 

To put it in layman's terms, these BERT language models help tailor results so that the user sees only the most relevant ones. For now, Google Search with BERT feels like a brand new experience, because roughly 1 in 10 English searches in the U.S. is now understood better thanks to BERT. And given Google’s technical expertise, it may not be a long wait before results in other languages and regions are affected by BERT as well. 

Coming back to BERT’s modifications to Google Search - even though it is called a bidirectional transformer, it is really non-directional. The usual directional models can only read and interpret a search in a single direction (either left-to-right or right-to-left). Through the BERT NLP system, however, the encoder reads and understands the entire query at once, regardless of the written sequence. This greatly reduces the old problem of misinterpreting conversational queries, which used to force users to type keywords instead of questions. 
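A quick way to see that bidirectionality at work is a fill-in-the-blank (masked word) test. The sketch below assumes the Hugging Face transformers library and the public bert-base-uncased checkpoint; the example sentence is made up, and the point is simply that the model uses the words on both sides of the blank when guessing it.

```python
# Sketch of BERT's bidirectional (whole-sentence) reading, assuming the
# `transformers` library and the public bert-base-uncased checkpoint.
# The masked word can only be guessed well because BERT attends to the
# words on BOTH sides of [MASK] at the same time.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The context to the RIGHT of the mask ("at the pharmacy") is what makes
# a word like "get" or "buy" likely; a purely left-to-right model would
# never see it before predicting the blank.
for pred in fill("can you [MASK] medicine for someone at the pharmacy?")[:3]:
    print(pred["token_str"], round(pred["score"], 3))
```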

Another instrumental benefit of BERT is its ability to take what it has learned in one language (primarily English) and apply it to others. The model has already driven considerable improvements in languages such as Hindi, Korean, and Portuguese. 
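The sketch below hints at why that kind of transfer is plausible: a single multilingual BERT checkpoint maps queries from different languages into one shared representation space. It assumes the transformers and torch libraries plus the public bert-base-multilingual-cased model, and the similarity check is purely illustrative; Google has not published the details of how it transfers BERT across languages.

```python
# Hypothetical sketch: one multilingual BERT model embedding the same
# question in two languages into a shared vector space. Assumes the
# `transformers` and `torch` libraries and the public
# bert-base-multilingual-cased checkpoint; the similarity score is only
# illustrative.
import torch
from transformers import AutoModel, AutoTokenizer

name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

def embed(text):
    """Mean-pool the last hidden states into a single sentence vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    return hidden.mean(dim=1).squeeze(0)

english = embed("how to book a train ticket")
hindi = embed("ट्रेन का टिकट कैसे बुक करें")  # the same question in Hindi
print(torch.cosine_similarity(english, hindi, dim=0).item())
```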

What lies ahead for the development of BERT? While there is plenty of good and some bad in BERT’s case, the road ahead is not exactly sun-kissed. There are still countless cases where Google Search fails to tailor results to what you actually meant, which is precisely where a technology that bridges language barriers becomes so important! 
