TL;DR: Google’s MUM, which stands for Multitask Unified Model, is kind of like BERT 2.0 if it went and learnt 75 new languages. MUM is all about finding the best answer to a query irrespective of language and format barriers like images or video.
In Spring 2021, Google introduced us to the development of new technology designed to better understand language in complex search queries. By this Summer’s BrightonSEO, MUM was the word.
Google’s MUM, which stands for Multitask Unified Model, is designed to better serve complex search queries, knowing that users are turning to the search engine to make comparisons, scenario-based searches or multiple research queries – like a journey through information – about one topic.
‘Many of us tackle all sorts of tasks that require multiple steps with Google every day. In fact, we find that people issue eight queries on average for complex tasks.’ – Google
How it works
Evolving technologies, algorithms and trust in Google as a know-it-all have seen search queries shifting: search engines are no longer designed for short, simple queries alone. ‘What’s the time in Australia?’, ‘Directions to Pret A Manger’ and ‘How to…’ still dominate our collective usage of Google, but the evolution of the engine and society’s unwavering trust in Google as a first port of call mean that queries are becoming much more specific, long and complex.
Today, we might need to search multiple pieces of information about one holiday one after the other. We’d look at the best hotels, then what’s nearby, then perhaps what the travel options are, then it’s flight times, then we’re looking at an airport transfer. But like the typical matriarch, Google’s MUM knows it all.
MUM in the real world
In June 2021, MUM broke into real-world search results for the first time, to deliver more extensive information on the Coronavirus vaccine programme. Its multilingual, multimodal capability allowed Google to identify over 800 keywords that people were using to search for vaccines across 50 languages.
MUM’s understanding of language
Using a comprehensive understanding of 75 different languages, the technology can bring together larger quantities of information around a subject, taking language variations into account. Today, your numerous searches around a trip to Spain aren’t likely to reveal Spanish-language results – which, in this circumstance, could actually give you some of the best information from people who know the area: where the locals go, viewing spots, local delicacies, how it’s much cheaper to hop on the Metro than get yourself a taxi.
MUM brings together all relevant results regardless of language, giving users access to information they otherwise couldn’t get their hands on.
Multimodal information gathering
MUM is designed to provide genuinely useful responses to searches that don’t have a simple or straight answer, so its multimodal capability is particularly interesting.
The ability to read and understand multiple formats means that MUM can present information gathered from video and image resources as well as text, utilising an in-depth understanding of the world to provide nuanced answers and advise that yes, those hiking boots are appropriate for this trip.
MUM vs BERT
When Google introduced BERT in 2018, the update was said to help Google ‘understand searches better than ever before’. BERT – or Bidirectional Encoder Representations from Transformers – enabled Google to further consider context around language and user intent to provide more useful and accurate search results.
MUM is essentially a hefty extension of BERT and works in a similar way, using natural language processing, with greater ability to multitask across media types and the ability to understand more languages. According to Google, MUM is ‘1,000 times more powerful than BERT’. It’s quite a statement – but mums always know what to do, don’t they?
If you’re interested in finding out more about SEO, take a read of our blog here diving into the link between Social Media and SEO.