BERT is here: Google’s latest search algorithm can better understand natural language

A new article on Marketing Land reports on the arrival of BERT, Google’s neural network-based technique for pre-training natural language processing (NLP) models. BERT stands for Bidirectional Encoder Representations from Transformers, and it is kind of a big deal: it is the biggest change to Google’s search system since RankBrain, which launched nearly five years ago. The company says BERT will affect about one in every ten queries and influence how results are ranked. BERT is already here, working quietly in the background; it started rolling out this week and will be fully live soon. So far, the rollout ...
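
To make the "bidirectional" part concrete, here is a minimal sketch using a public pre-trained BERT checkpoint from the Hugging Face `transformers` library. This is purely illustrative and is not Google's production search system; the model name (`bert-base-uncased`) and the masked-word task are assumptions chosen to show how BERT uses context on both sides of a word, which is what distinguishes it from older left-to-right language models.

```python
from transformers import pipeline

# Illustrative only: a publicly available pre-trained BERT checkpoint,
# not the model Google runs in search ranking.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the WHOLE sentence at once, so words both before and after
# the [MASK] token ("withdraw some cash") shape its prediction.
sentence = "The traveler stood at the [MASK] to withdraw some cash."

for prediction in fill_mask(sentence):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Running a sketch like this shows the model ranking candidate words for the blank using the full surrounding context, which is the same general capability the article credits for better interpretation of conversational search queries.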