Google says it is going to be radically updating its search engine by integrating its new “MUM” algorithm into its systems. 

This will allow Google’s search engines to better understand topics, find better answers and sources, and provide more intuitive ways to explore ideas.

Accompanying these new search systems, Google is redesigning its search pages with features that provide new ways to discover information and conduct more visual searches.

What is the MUM Algorithm?

Introduced earlier this year, the Multitask Unified Model algorithm, or MUM, allows Google to better find information using images and across multiple languages. 

The main purpose of the algorithm is to improve Google’s ability to search with images and other types of visual content, rather than just text.

Three Ways MUM Is Changing Search

While it is hard to know exactly how transformative the introduction of the MUM algorithm will be before it arrives, Google did highlight three key features which will be coming with the change.

  1. “Things to know”
  2. Topic Zoom
  3. Visual Topic Exploration

Google’s “Things to Know”

Using predictive models, Google’s search engine will soon intuit the most likely steps you will take after an initial search and deliver websites that will facilitate those actions.

To help illustrate this process, the announcement uses the example of a user searching for “acrylic painting”.

According to the search engine’s data, there are more than 350 topics associated with that specific keyword phrase.

Using this knowledge, the “things to know” feature will then identify the most relevant or popular “paths” users are likely to take to further explore that topic and surface related content.

Topic Exploration

The next feature piggybacks on the last by making it easy to dive into related topics or find more in-depth information.

Using the feature, users can quickly broaden the topic they are looking at to find more general information, or zoom in to more detailed resources.

Visual Exploration

The last update enabled by MUM is actually already live on the search engine, providing a new way to visually explore topics.

Specifically, the visual search results page will appear for searches where a user is “looking for inspiration.”

As Google explains it:

“This new visual results page is designed for searches that are looking for inspiration, like ‘Halloween decorating ideas’ or ‘indoor vertical garden ideas,’ and you can try it today.”


It is likely that these new features are just the start of Google’s introduction of the MUM algorithm to revamp how it does search. Since its start, the search engine has struggled to understand visual content, but MUM finally provides a path to not only understand but also deliver visual content across the entire Google platform.

In 2020 alone, Google changed its search engine more than 4,500 times, according to the newly updated “How Search Works” site. 

Or, as Google puts it, “There have been 4,500 such improvements in 2020 alone.”

Whether or not you agree with Google’s description of its changes as “improvements”, the disclosure is interesting because it shows that the search engine has continued to ramp up how frequently it updates parts of its system – even during the initial outbreak of the COVID pandemic. 

In comparison, Google made 3,200 changes to its search engine in 2019, the year before. At the same time, the company said this was nearly a 10x increase from a decade before. In 2009, the search engine reported just 350-400 changes.

What Do These Changes Include?

Google’s 2020 ‘improvements’ can include anything from updates to its user interface and changes to search results to adjustments in how specific carousels or sub-sections like “news” function. 

As such, it isn’t all that surprising that Google is making significantly more updates to its systems than it was a decade ago. The search engine is considerably more complex and multifaceted these days compared to its 2009 counterpart. 

Still, I think many expected to see a relative slowdown in these updates as many workers began working remotely and the country braced for the spread of the novel coronavirus. 

“How Search Works” Site Gets a Redesign

This info was revealed as part of a much larger redesign of the search engine’s ‘How Search Works’ website, which “explains the ins and outs of search.”

Since 2013, Google has used the portal to help educate users about the broad principles Google uses to rank sites and filter out spam or inappropriate content. 

With the latest update, the company has “updated the site with fresh information, made it easier to navigate and bookmark sections and added links to additional resources that share how Search works and answer common questions.” 

“The website gives you a window into what happens from the moment you start typing in the search bar to the moment you get your search results. It gives an overview of the technology and work that goes into organizing the world’s information, understanding what you’re looking for and then connecting you with the most relevant, helpful information,” Google added.

Google is rolling out a new addition to its “About this result” feature in search results which will explain why the search engine chose a specific result to rank.

The new section, called “Your search & this result” explains the specific factors which made Google believe a specific page may have what you’re looking for.

This can include a number of SEO factors, ranging from the keywords which matched with the page (including related but not directly matching terms), backlink details, related images, location-based information, and more. 

How Businesses Can Use This Information

For users, this feature can help them understand why they are seeing specific search results and even provide tips for refining their searches to get better results. 

The unspoken utility of this tool for businesses is glaringly obvious, however. 

This feature essentially provides an SEO report card, showing exactly where you are doing well on ranking for important keywords. By noting what is not included, you can also get an idea of what areas could be improved to help you rank better in the future.

Taking this even further, you could explore the details for other pages ranking for your primary keywords, helping you better strategize to overtake your competition.

What It Looks Like

Below, you can see a screenshot of what the feature looks like in action:

The information box provides a quick bullet point list of several factors which caused the search engine to return the specific result.
While Google has only described a few of the possible entries the box may include, users around the web have reported seeing information about all of these factors:

  • Included search terms: Google can show which exact search terms were matched with the content or HTML on the related page. This includes content that is not typically visible to users, such as the title tag or metadata.
  • Related search terms: Along with the keywords which were directly matched with the related page, Google can also show “related” terms. For example, Google knew to include results related to the COVID-19 vaccine based on the keyword “shot”.
  • Other websites link to this page: The search engine may choose to highlight a page which might otherwise appear unrelated because several pages using the searched keyword link to that specific page.
  • Related images: If the images are properly optimized, Google may be able to identify when images on a page are related to your search.
  • This result is [Language]: Obviously, users who don’t speak or read your language are unlikely to have much use for your website or content. This essentially notes that the page is in the same language you use across the rest of Google.
  • This result is relevant for searches in [Region]: Lastly, the search engine may note if locality helped influence its search result based on other contextual details. For example, it understood that a user in Vermont was likely looking for nearby results when searching “get the shot”.

The expanded “About this result” section is rolling out to English-language U.S. users already and is expected to be widely available across the country within a week. From there, Google says it will work to bring the feature to more countries and languages soon.

Google’s upcoming Page Experience ranking update – initially believed to be exclusive to mobile search – will also be coming to desktop search results in the future. 

The reveal came this week during Google’s annual I/O event, courtesy of Google Search product manager Jeffrey Jose. 

Since its announcement, the Page Experience update – which will implement new ranking signals based on “Core Web Vitals” that assess the user-friendliness of a site – was believed to be rolling out only to Google’s mobile search results. 

As Jose explained, however, the update will also be coming to desktop search – at a later date.

“Today I am happy to announce that we are bringing Page Experience ranking to desktop. While we’re launching Page Experience on mobile soon, we believe page experience is critical no matter the surface the user is browsing the web. This is why we’re working hard on bringing page experience ranking to desktop. As always we’ll be providing updated guidance, documentation, and tools along the way to help your pages perform at its best. Stay tuned for more details on this.”

The specific wording of the announcement suggests the desktop update may use its own set of unique or modified ranking signals or criteria. This is reasonable considering users are likely to have different usability expectations depending on which platform they are using. 

While the launch date of the desktop Page Experience update is unknown, the mobile version is still scheduled to begin rolling out in June and be completely implemented by August. 

To learn more about the Page Experience update and to see the announcement for yourself, check out the video below:

It can be easy to take for granted how little spam shows up in the dozens of Google searches we make every day.

While we are almost always able to find what we need through the search engine without wading through malicious, copied, or just plain spammy websites, the search engine says it has been ramping up spam detection behind the scenes to keep the seemingly endless hordes of illicit or otherwise problematic sites from filling up its search results.

In fact, Google’s webspam report for 2020 says the search engine detected more than 40 billion pages of spam every day last year. This reflects a 60% increase from the year before.

How Google Search is Fighting Spam

It is possible there was a distinct increase in spammy sites last year, potentially due to disruptions and other changes brought about by the Covid pandemic. According to the search engine though, the bulk of this increase is the result of increased spam prevention efforts with the help of AI.

Artificial intelligence and machine learning have helped the company keep up with new spam methods and are credited with allowing the search engine to reduce auto-generated or scraped content “by more than 80% compared to a couple of years ago.”

This AI-based approach also frees up Google’s manual action spam team to focus on more advanced forms of spam, such as hacked sites which were “still rampant in 2020.”

To show you how this approach works and helps filter out the bulk of webspam before it even gets added to Google’s indexes, the company shared a simple graphic:

COVID Spam and Misinformation

Like everyone else, Google faced unprecedented situations in the past year as it responded to the COVID-19 pandemic. This included devoting “significant effort in extending protection to the billions of searches” related to the virus.

One part of this effort was introducing a “more about this result” feature, which adds context about sites before users click through to one of their pages. This is intended to help users avoid the bad actors that popped up, especially during the early stages of the pandemic.

Additionally, the search engine says it worked to remove misinformation that could be dangerous during the course of the pandemic.

What This Means For You

Assuming you are a reputable professional in your industry, Google’s increased efforts to fight spam should only be a source of comfort. There have been fewer reports of sites being incorrectly targeted by these spam prevention methods in recent years, while the overall level of deceptive, spammy, or harmful sites in the search results has plummeted. 

All in all, this means a better experience for both users trying to find information and products, as well as brands fighting to reach new customers online.

In a Google Search Central SEO session recently, Google’s John Mueller shed light on a way the search engine’s systems can go astray – keeping pages on your site from being indexed and appearing in search. 

Essentially the issue comes from Google’s predictive approach to identifying duplicate content based on URL patterns, which has the potential to incorrectly identify duplicate content based on the URL alone. 

Google uses the predictive system to increase the efficiency of its crawling and indexing by skipping over content which is just a copy of another page. By leaving these pages out of the index, Google is less likely to show repetitious content in its search results and can reach other, more unique content more quickly. 

Obviously the problem is that content creators could unintentionally trigger these predictive systems when publishing unique content on similar topics, leaving quality content out of the search engine. 

John Mueller Explains How Google Could Misidentify Duplicate Content

In a response to a question from a user whose pages were not being indexed correctly, Mueller explained that Google uses multiple layers of filters to weed out duplicate content:

“What tends to happen on our side is we have multiple levels of trying to understand when there is duplicate content on a site. And one is when we look at the page’s content directly and we kind of see, well, this page has this content, this page has different content, we should treat them as separate pages.

The other thing is kind of a broader predictive approach that we have where we look at the URL structure of a website where we see, well, in the past, when we’ve looked at URLs that look like this, we’ve seen they have the same content as URLs like this. And then we’ll essentially learn that pattern and say, URLs that look like this are the same as URLs that look like this.”

He also explained how these systems can sometimes go too far and Google could incorrectly filter out unique content based on URL patterns on a site:

“Even without looking at the individual URLs we can sometimes say, well, we’ll save ourselves some crawling and indexing and just focus on these assumed or very likely duplication cases. And I have seen that happen with things like cities.

I have seen that happen with things like, I don’t know, automobiles is another one where we saw that happen, where essentially our systems recognize that what you specify as a city name is something that is not so relevant for the actual URLs. And usually we learn that kind of pattern when a site provides a lot of the same content with alternate names.”
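Google hasn’t published how this pattern learning works, but the general idea can be illustrated with a small, hypothetical Python sketch: group a site’s URLs by their structure and flag large clusters whose pages might share heavily overlapping content. The domain, helper names, and threshold below are all made up for illustration.

```python
from collections import defaultdict
from urllib.parse import urlparse

def pattern_key(url):
    """Collapse the last path segment into a wildcard so URLs that
    differ only by, say, a city name fall into the same bucket."""
    segments = urlparse(url).path.rstrip("/").split("/")
    if len(segments) > 1:
        segments[-1] = "{slug}"
    return "/".join(segments)

def duplicate_risk_groups(urls, threshold=3):
    """Flag URL patterns with enough near-identical pages that a
    predictive filter might lump them together as duplicates."""
    groups = defaultdict(list)
    for url in urls:
        groups[pattern_key(url)].append(url)
    return {k: v for k, v in groups.items() if len(v) >= threshold}

urls = [
    "https://example.com/services/boston",
    "https://example.com/services/springfield",
    "https://example.com/services/worcester",
    "https://example.com/about",
]
print(duplicate_risk_groups(urls))
# → {'/services/{slug}': [the three /services/ URLs]}
```

Auditing your own URL list this way can surface the “city name” style clusters Mueller describes – the ones most worth consolidating with redirects or canonical tags if their content overlaps heavily.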

How Can You Protect Your Site From This?

While Google’s John Mueller wasn’t able to provide a foolproof solution or prevention for this issue, he did offer some advice for sites that have been affected:

“So what I would try to do in a case like this is to see if you have this kind of situations where you have strong overlaps of content and to try to find ways to limit that as much as possible.

And that could be by using something like a rel canonical on the page and saying, well, this small city that is right outside the big city, I’ll set the canonical to the big city because it shows exactly the same content.

So that really every URL that we crawl on your website and index, we can see, well, this URL and its content are unique and it’s important for us to keep all of these URLs indexed.

Or we see clear information that this URL you know is supposed to be the same as this other one, you have maybe set up a redirect or you have a rel canonical set up there, and we can just focus on those main URLs and still understand that the city aspect there is critical for your individual pages.”

It should be clarified that duplicate content or pages impacted by this problem will not hurt the overall SEO of your site. So, for example, having several pages tagged as being duplicate content won’t prevent your home page from appearing for relevant searches. 

Still, the issue has the potential to gradually decrease the efficiency of your SEO efforts, not to mention making it harder for people to find the valuable information you are providing. 

To see Mueller’s full explanation, watch the video below:

Google appears to be testing the idea of “upgrading” Google My Business profiles with a special “Google Guaranteed” badge for a $50 monthly fee.

Twitter user Tom Waddington shared a screenshot of a promotional page within the GMB dashboard offering the profile upgrade.

What Is Google Guaranteed?

The search engine has been playing with the “Google Guaranteed” badge since last year, though it has typically been used in Local Service Ads for home services businesses. 

To be eligible for the badge, businesses must meet a number of criteria including certification, licensing, and background checks. 

The idea appears to be to inspire more confidence in brands listed in Google’s local results by highlighting those who have been vetted. 

Why Would Anyone Pay For This?

On its face, the idea of paying $50 a month for what amounts to a stamp of approval sounds a little silly. However, the badge comes with some backing which may help customers feel more at ease.

Along with the Google Guarantee badge, businesses which pass the screening process are also backed with a customer satisfaction guarantee. If a customer finds your business through the search engine and is not satisfied with the results, Google will refund the amount paid up to $2,000.

Along with this aspect, there is always the issue of getting ahead of your competition. Any little advantage can be the key to standing apart from your competitors.

Just an “Experiment” … For Now

When asked about the program via email, a Google spokesperson told Search Engine Journal:

“We’re always testing new ways to improve our experience for our advertisers, merchants, and users. This experiment will show the Google Guaranteed badge on the business profile. We don’t have anything additional to announce right now.”

A lot has changed at Google over the past few years, but one thing remains the same – the majority of people will click the top link on any search result page. 

A new study of over 80 million keywords and billions of search results found that an average of 28.5% of users will click the top organic result for a given search. 

From there, the average CTR sharply declines. Listings in second place receive an average of 15% of clicks, while third place falls to 11%. 

By the time you get to the last listing of a results page, links receive only a 2.5% click-through rate. 

You can imagine what the CTRs for anything after the first page would be like. 
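To make those averages concrete, here is a minimal sketch that converts the study’s figures into expected monthly clicks. The CTR table simply restates the averages cited above, and the 10,000-search keyword is a made-up example.

```python
# Average organic CTR by position, using the figures cited above
AVG_CTR = {1: 0.285, 2: 0.15, 3: 0.11, 10: 0.025}

def estimated_clicks(monthly_searches, position):
    """Rough expected monthly clicks for a keyword at a given rank."""
    return round(monthly_searches * AVG_CTR[position])

# For a keyword searched 10,000 times a month:
print(estimated_clicks(10_000, 1))   # → 2850 clicks in first place
print(estimated_clicks(10_000, 10))  # → 250 clicks in last place on page one
```

In other words, the gap between first and last place on the same results page is more than a tenfold difference in traffic.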

Other Factors Influencing Search CTRs

Unsurprisingly, there is quite a bit of variance in the actual click-through rates for some results pages. In the study, Sistrix found click-through rates for listings in the first position swung from 13.7% to almost 50%. 

While the relevance of the top listing has some effect on its CTR, the study suggests another major factor is the SERP layout. 

For example, search results including sitelinks extensions significantly outperformed those without. 

On the other hand, the study found that search results including featured snippets had a significant negative impact, dropping click-through rates by at least 5% on average. 

Similarly, knowledge panels reduced the average CTR from 28% to 16%.

In these situations, the researchers believe users don’t feel the need to investigate further when provided with quick answers directly within the search results pages:

“The CTR in the first two organic positions drops significantly compared to the average. Many users appear to find the information they are looking for in the Knowledge Panel – especially on their smartphones, where each time a page is loaded it takes a lot of time.”

For more information, you can explore the full study report here.

Google appears to be testing the idea of integrating its normal web search into search results on YouTube. 

Reddit users have been reporting seeing results and links to traditional web pages when doing searches on the video platform, as you can see in the screenshot below:

YouTube Showing Google Search Result

As you can see, Google places a single web page result among the video results, with an option to click the link or jump to a search directly on Google. 

The test appears to be limited, with many (including myself) being unable to replicate it. However, there are enough reports to conclude this is a legitimate test and not a glitch or hoax. 

So far, reaction to the move has been mixed. Many have decried the potential new feature as “annoying” and said it would “ruin the YouTube experience.”

However, there have also been those who see potential in the concept, saying it could make it easier to leap to Google when YouTube doesn’t provide the results someone is hoping for.

As one user described:

“Sometimes I’m looking for a tutorial but I want a video explaining it, and if it doesn’t exist now I have the option to do a quick Google search in the app.”

Personally I see some utility in integrating a single, non-obtrusive link within video search results. Obviously, those searching on YouTube are primarily looking for exclusively video content, but there are certainly scenarios where users are moving back and forth between YouTube and Google. This would be a convenient option for those situations.

Google will soon be updating its search ranking algorithm with a new ranking signal. This new signal will combine a number of existing signals with a recently introduced set of metrics known as Core Web Vitals. 

The search engine says the goal of the new update is to better rank pages based on the quality of users’ experiences with the site. 

In addition to the new ranking signal, the company announced a few other changes it will be making to its systems in the near future:

  • Incorporating page experience metrics into rankings for Top Stories in Search on mobile
  • Removing the AMP requirement for content to be shown in Top Stories

The “New” Ranking Signal

While the new signal is being called the Page Experience signal, it actually combines a few existing search ranking signals with the recently introduced Core Web Vitals metrics. The existing signals being brought under the Page Experience umbrella include:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security certification
  • Following intrusive interstitial guidelines

As the company said in its announcement:

“The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile.”

How To Monitor Your Core Web Vitals

To help prepare webmasters for the coming update, Google has also created a new report section within Search Console. The goal is for the new report to replace the need for a suite of tools aimed at specific issues such as page speed and mobile-friendliness.

The tool can also filter pages based on whether they are rated “Poor,” “Needs Improvement,” or “Good.”
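Those three ratings correspond to thresholds Google has published for each Core Web Vitals metric – Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). As a rough sketch of how a single measurement maps to a rating (the report itself aggregates real-user field data rather than one-off values):

```python
# Google's published Core Web Vitals thresholds (good / poor boundaries):
# LCP in seconds, FID in milliseconds, CLS is unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.1, 0.25),
}

def classify(metric, value):
    """Bucket a metric value into the report's three ratings."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

print(classify("LCP", 2.1))  # → Good
print(classify("FID", 180))  # → Needs Improvement
print(classify("CLS", 0.4))  # → Poor
```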

When Will The Update Happen?

While the update doesn’t really change all that much regarding how webmasters and SEO specialists should approach managing sites, the company sees it as important enough to give a significant notice ahead of the release. 

In fact, Google says these changes to the algorithm will not be happening before 2021. Additionally, the search engine will provide another notice 6 months before it is rolled out.