Most people these days understand the general idea of how search engines work. Search engines like Google send out automated bots to scan or “crawl” all the pages on a website, before using their algorithms to sort through which sites are best for specific search queries. 

What few outside Google knew until recently is that the search engine uses two distinct methods to crawl websites – one that seeks out new content and another that revisits content already in its search index.

Google Search Advocate John Mueller revealed this recently during one of his regular Search Central SEO office-hours chats on January 7th.

During this session, an SEO professional asked Mueller about the Googlebot crawling behavior he had observed on his website.

Specifically, the user said Googlebot used to crawl his site daily when he was publishing content frequently. Since publishing has slowed on the site, he has noticed Googlebot crawling it less often.

As it turns out, Mueller says this is quite normal and is the result of how Google approaches crawling web pages.

How Google Crawls New vs. Old Content

Mueller acknowledges there are several factors that can contribute to how often Google crawls different pages on a website – including what type of pages they are, how new they are, and how Google understands your site.

“It’s not so much that we crawl a website, but we crawl individual pages of a website. And when it comes to crawling, we have two types of crawling roughly.

One is a discovery crawl where we try to discover new pages on your website. And the other is a refresh crawl where we update existing pages that we know about.”

These different types of crawling target different types of pages, so it is reasonable that they also occur more or less frequently depending on the type of content.

“So for the most part, for example, we would refresh crawl the homepage, I don’t know, once a day, or every couple of hours, or something like that.

And if we find new links on their home page then we’ll go off and crawl those with the discovery crawl as well. And because of that you will always see a mix of discover and refresh happening with regard to crawling. And you’ll see some baseline of crawling happening every day.

But if we recognize that individual pages change very rarely, then we realize we don’t have to crawl them all the time.”
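Mueller's two crawl types can be pictured with a toy scheduler (purely illustrative, not Google's actual implementation) in which each known page carries an estimated change frequency, and newly found links are queued for a discovery crawl:

```python
# Toy illustration of refresh vs. discovery crawling.
# Not Google's actual implementation -- just a sketch of the two-queue idea.

known_pages = {
    # url: estimated change frequency in days
    "example.com/": 1,           # homepage changes often -> refresh daily
    "example.com/old-post": 30,  # rarely changes -> refresh monthly
}
discovered_links = ["example.com/new-post"]  # new link found on the homepage

def plan_crawls(day):
    """Return the crawls scheduled for a given day number."""
    crawls = []
    for url, freq in known_pages.items():
        if day % freq == 0:
            crawls.append(("refresh", url))
    # Links discovered during refresh crawls get a discovery crawl
    while discovered_links:
        crawls.append(("discovery", discovered_links.pop()))
    return crawls

print(plan_crawls(day=30))
```

On day 30 the sketch refresh-crawls both pages and discovery-crawls the new link; on most other days only the frequently changing homepage gets a crawl, mirroring the "baseline of crawling" Mueller describes.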

The takeaway here is that Google adapts to your site according to your own publishing habits. Neither the type of crawling it uses nor how frequently it happens is an inherently good or bad indicator of your website’s health, and your focus should be (as always) on providing the smoothest online sales experience for your customers.

Nonetheless, it is interesting to know that Google has made this adjustment to how it crawls content across the web and to speculate about how this might affect its ranking process.

To hear Mueller’s full response (including more details about why Google crawls some sites more often than others), check out the video below:

For years upon years, those working in search engine optimization could consistently agree on just one thing – links were the most important ranking signal around. They were the lynchpin that could decide whether you were on the top of page 1 of the search results or deep into page 5. 

Over the past few years, though, that has changed significantly. Google’s search engine algorithms have gotten increasingly complex, relying on hundreds of different search signals contextually based on a user’s intent with their search. With this, the perceived importance of links has steadily decreased.

These days, it is easy to find experts who will earnestly tell you that links are dead or don’t matter anymore. Typically they will point to the recent prevalence of social media and the importance of quality content as proof that you don’t need to invest money or energy into establishing an authoritative link profile for your website.

Well, Patrick Stox from Ahrefs recently decided to settle this debate once and for all. He simply chose three pages on the Ahrefs website – which receives thousands of visitors a day – and convinced the team to remove and disavow all links to those pages for a month.

After seeing the results from a month without links, the Ahrefs team then restored every link pointing to these pages and shared the results.

Ahrefs Links Chart

If you’re interested in the details from this experiment, you’ll definitely want to check out Stox’s recent article detailing what happened when he disavowed links to just three pages. It’s a revealing look at how a seemingly small SEO tweak can have a significant impact on the traffic your business receives online. Spoilers: links still matter quite a bit for SEO.

If your site is offline for more than a couple of days you could be at risk of having your pages deindexed, according to Google Search Advocate John Mueller.

It should go without saying that the less downtime your website experiences, the better. Still, some downtime is unavoidable thanks to maintenance, updates, redesigns, and other issues which can be entirely out of your hands.

This inevitably raises the question of exactly how long is too long for your site to be offline. At what point does this begin to hurt your rankings?

After years of debate, we finally have an official answer from Google courtesy of John Mueller during the most recent Google Search Central SEO office hours session.

How Long is Too Long to Be Offline?

The topic arose when an SEO specialist named Aakash Singh asked Mueller what can be done to minimize the loss of rankings or search performance while his client’s website undergoes an expected week of downtime.

The bad news is that a week is simply too long for a site to be offline without negative side effects. In fact, Mueller says pages can start being de-indexed after a site has been down for just a few days.

John Mueller On How Site Downtime Impacts Rankings

Beginning his response, Mueller explains how Google “sees” sites that are experiencing downtime.

“For an outage of maybe a day or so, using a 503 result code is a great way to tell us that we should check back. But after a couple of days we think this is a permanent result code, and we think your pages are just gone, and we will drop them from the index.”

“And when the pages come back we will crawl them again and we will try to index them again. But it’s essentially during that time we will probably drop a lot of the pages from the website from our index, and there’s a pretty good chance that it’ll come back in a similar way but it’s not always guaranteed.”

The general message is that sites should minimize downtime, even when using the proper redirects or status codes.
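For planned, short outages, the 503 response Mueller describes can be paired with a Retry-After header so crawlers know when to check back. Here is a minimal sketch using Python's standard library; in practice this would usually be configured at the web server or CDN level rather than in application code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 + Retry-After during planned downtime."""

    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # ask clients to retry in 1 hour
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1>")

def run_maintenance_server(port=8080):
    """Serve the 503 page until stopped; run only during the outage window."""
    HTTPServer(("", port), MaintenanceHandler).serve_forever()

# run_maintenance_server()  # uncomment to serve during the outage
```

Per Mueller's comments, this only buys you a day or so of grace; past that point Google may treat the 503 as permanent and start dropping pages.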

Mueller does leave us with a suggestion for avoiding the worst fallout from downtime, but he still emphasizes the importance of getting a site back up as quickly as possible:

“… that could be something like setting up a static version of the website somewhere and just showing that to users for the time being. But especially if you’re doing this in a planned way I would try to find ways to reduce the outage to less than a day if at all possible.”

To hear Mueller’s full explanation, check out the recording from the December 10th SEO office hours session below:

After testing the feature for much of the year, Microsoft has officially launched Bing Page insights for all Bing search results, according to an announcement earlier this week.

With this, users will now be able to see a small lightbulb icon on the right side of search results, which can provide additional information about the search result.

As the announcement says, Page insights “provides summarized insights from a page on your search results so you can find what you’re looking for faster.”

How It Works

The idea behind the new feature is fairly simple: users click or mouse over the new lightbulb icon next to a search result to see more information.

According to Microsoft, this “helps you verify that the source is relevant to your needs, helps you get caught up to speed at a glance on top factoids you didn’t know about, and lets you jump straight to the relevant section of the page when you click ‘Read more’ for a specific question.”

You can see what this looks like in practice below:

Last Notes

It is important for webmasters to know this feature is currently only available on desktop search results. Microsoft is tight-lipped about any possibility of bringing this to mobile search because of the “screen size required to properly display the results.”

Given that most modern search features are designed to responsively resize content based on the device being used and the importance of mobile search, this decision and statement raise more questions than they answer.

If you’d like to implement Bing Page insights for your website, simply ensure your site’s Bing snippet has the page insight feature added and that your content is accurate and relevant, or find out more from the full announcement here.

A few weeks ago, Google teased that it planned to refine its PageSpeed Insights tools to make data “more intuitive” and easy to understand. Now, that update has arrived.

What Is The PageSpeed Insights Tool?

If you’re unfamiliar, the PageSpeed Insights tool from Google evaluates your web pages to provide suggestions to improve how quickly content loads. 

The tool has been around in various forms since 2013, when it was a simple API webmasters could use to test their page speeds. Version 5, the most recent major update, arrived in 2018. However, smaller updates like this week’s happen somewhat regularly.

Along with this new update, Google has moved the PageSpeed Insights tool to a new home at https://pagespeed.web.dev/.

What Is New In The PageSpeed Insights Tool?

The biggest focus of the new update is a change to the user interface to be more intuitive by “clearly differentiating between data derived from a synthetic environment and data collected from users in the field.”

To do this, Google has added dedicated sections for each type of data.

Where the tool used to include a label specifying which type of data you were viewing, Google has instead added information about what the data means for you and how it may be used to improve your performance.

Additionally, Google has shifted its emphasis to data collected from real users by moving field data to the top.

The Core Web Vitals assessment has also been expanded, with a label showing if your site has passed a Core Web Vitals assessment in the field and in-depth metrics from simulated environments.
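The same field-versus-lab split is visible in the PageSpeed Insights v5 API, where field data is returned under `loadingExperience` and lab data under `lighthouseResult`. A sketch of pulling one metric from each, using a truncated, hypothetical response:

```python
import json

# Truncated, hypothetical response from the PageSpeed Insights v5 API:
# GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=<page>
response = json.loads("""
{
  "loadingExperience": {
    "overall_category": "FAST",
    "metrics": {
      "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 1800, "category": "FAST"}
    }
  },
  "lighthouseResult": {
    "audits": {
      "largest-contentful-paint": {"displayValue": "2.1 s"}
    }
  }
}
""")

# Field data: collected from real Chrome users visiting the page
field = response["loadingExperience"]
print("Field LCP (75th percentile):",
      field["metrics"]["LARGEST_CONTENTFUL_PAINT_MS"]["percentile"], "ms")

# Lab data: a single synthetic Lighthouse run in a simulated environment
lab = response["lighthouseResult"]
print("Lab LCP:", lab["audits"]["largest-contentful-paint"]["displayValue"])
```

The two numbers can legitimately disagree, which is exactly why the redesigned tool now separates them into dedicated sections instead of a single label.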

Importantly, the PageSpeed Insights tool also includes details at the bottom of the page specifying how the data was collected in the field. This information includes:

  • Data collection period
  • Visit durations
  • Devices
  • Network connections
  • Sample size
  • Chrome versions

Lastly, Google has removed the previously included screenshot of the page as it indexed your content, replacing it with a series of images displaying the full loading sequence. 

For more, read the announcement for the update from Google’s Web.Dev blog.

Google’s Page Experience Algorithm update is officially coming to some desktop search results, beginning in February of next year.

Google Search product manager Jeffrey Jose teased this news earlier this year at the annual I/O event. At the time, however, details about when it would be rolled out and how it would be implemented were scarce. Now, we have the full rundown.

What Is Google’s Page Experience Algorithm?

The Page Experience Algorithm was originally rolled out exclusively for searches coming from mobile devices earlier this year, but the search engine confirmed it will be bringing much of the algorithm to desktop searches. This includes the much-talked-about “Core Web Vitals” metrics which are intended to ensure a good user experience on sites.

As the announcement says:

“This means the same three Core Web Vitals metrics: LCP, FID, and CLS, and their associated thresholds will apply for desktop ranking. Other aspects of page experience signals, such as HTTPS security and absence of intrusive interstitials, will remain the same as well.”

However, one notable signal from the mobile Page Experience Algorithm will not be coming to desktop search results for obvious reasons: mobile-friendliness.
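For reference, the “good” thresholds for the three metrics come from Google’s public Core Web Vitals documentation rather than this announcement: LCP at or under 2.5 seconds, FID at or under 100 milliseconds, and CLS at or under 0.1, each measured at the 75th percentile of page loads. A small helper can check a set of measurements against them:

```python
# "Good" thresholds from Google's public Core Web Vitals documentation
# (web.dev); each is evaluated at the 75th percentile of page loads.
GOOD_THRESHOLDS = {
    "LCP": 2.5,   # Largest Contentful Paint, seconds
    "FID": 100,   # First Input Delay, milliseconds
    "CLS": 0.1,   # Cumulative Layout Shift, unitless
}

def passes_core_web_vitals(metrics):
    """True if every supplied metric is within its 'good' threshold."""
    return all(
        metrics[name] <= limit
        for name, limit in GOOD_THRESHOLDS.items()
        if name in metrics
    )

print(passes_core_web_vitals({"LCP": 2.1, "FID": 80, "CLS": 0.05}))  # True
print(passes_core_web_vitals({"LCP": 3.4, "FID": 80, "CLS": 0.05}))  # False
```

With the desktop rollout, the same pass/fail logic will apply to desktop measurements, minus the mobile-friendliness signal.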

To accompany the new search signal, Google says it is working on a new Search Console report dedicated to showing how your desktop pages stack up when this algorithm is applied to them. For now, the report’s release date is unknown, but most believe it will arrive before or at the same time as the algorithm update.

For more information, read the full announcement here.

When it comes to ranking a website in Google, most people agree that high-quality content is essential. But, what exactly is quality content? 

For a number of reasons, most online marketers assumed Google defined high-quality content as something very specific: text-based content that clearly and engagingly communicated valuable information to readers.

Recently, though, Google’s John Mueller shot down that assumption during a video chat. 

While he still emphasizes that great content should inform or entertain viewers, Mueller explained that the search engine actually has a much broader view of “content quality” than most thought.

What Google Means When They Say “Quality Content”

In response to a question about whether SEO content creators should prioritize technical improvements to content or expand the scope of content, Mueller took a moment to talk about what content quality means to Google.

“When it comes to the quality of the content, we don’t mean like just the text of your articles. It’s really the quality of your overall website, and that includes everything from the layout to the design.”

This is especially notable, as Mueller specifically highlights two factors that many continue to ignore – images and page speed. 

“How you have things presented on your pages? How you integrate images? How you work with speed? All of those factors, they kind of come into play there.”

Ultimately, Mueller’s response emphasizes taking a much more holistic view of your content and focusing on providing an all-around great experience for users on your website. 

There is an unspoken aspect to what Mueller says which should be mentioned. Mueller subtly shows that Google still prefers text-based content rather than videos or audio-only formats. While the company wants to integrate even more types of content, the simple fact is that the search engine still struggles to parse these formats without additional information.

Still, Mueller’s statement broadens the concept of “quality content” from what is often understood. 

“So it’s not the case that we would look at just purely the text of the article and ignore everything else around it and say, oh this is high-quality text. We really want to look at the website overall.”

It is no secret that Google knows the price you, your competitors, and even the shady third-party companies charge for your products or services. In some cases, you might even directly tell the company how much you charge through Google’s Merchant Center. So, it is reasonable to think that the search engine might also use that information when it is ranking brands or product pages in search results.

In a recent livestream, however, Google Webmaster Trends Analyst John Mueller denied the idea.

What John Mueller Has To Say About Price as a Google Ranking Signal

The question arose during an SEO Office-Hours hangout on October 8, which led to Mueller explaining that while Google can access this information, it does not use it when ranking traditional search results.

As he says in the recording of the discussion:

“Purely from a web search point of view, no, it’s not the case that we would try to recognize the price on a page and use that as a ranking factor.

“So it’s not the case that we would say we’ll take the cheaper one and rank that higher. I don’t think that would really make sense.”

At the same time, Mueller says he can’t speak on how products in shopping results (which may be shown in regular search results) are ranked. 

Within shopping search results, users can manually select to sort their results by price. Whether it is used as a factor the rest of the time isn’t something Mueller can answer:

“A lot of these products also end up in the product search results, which could be because you submit a feed, or maybe because we recognize the product information on these pages, and the product search results I don’t know how they’re ordered.

“It might be that they take the price into account, or things like availability, all of the other factors that kind of come in as attributes in product search.”

Price Is And Isn’t A Ranking Factor

At the end of the day, Mueller doesn’t work in the areas related to product search so he really can’t say whether price is a ranking factor within those areas of Google. This potentially includes when they are shown within normal search results pages.

What he can say for sure, is that within traditional web search results, Google does not use price to rank results:

“So, from a web search point of view, we don’t take price into account. From a product search point of view it’s possible.

“The tricky part, I think, as an SEO, is these different aspects of search are often combined in one search results page. Where you’ll see normal web results, and maybe you’ll see some product review results on the side, or maybe you’ll see some mix of that.”

You can hear Mueller’s full response in the recording from the October 8, 2021, Google SEO Office Hours hangout below:

Google says it is going to be radically updating its search engine by integrating its new “MUM” algorithm into its systems. 

This will allow Google’s search engines to better understand topics, find better answers and sources, and provide more intuitive ways to explore ideas.

Accompanying these new search systems, Google is redesigning its search pages with features that offer fresh ways to discover information and conduct more visual searches.

What is the MUM Algorithm?

Introduced earlier this year, the Multitask Unified Model (MUM) algorithm allows Google to better find information using images and across multiple languages.

The main purpose of the algorithm is to improve Google’s ability to search with images and other types of visual content, rather than just text.

Three Ways MUM Is Changing Search

While it is hard to know exactly how transformative the introduction of the MUM algorithm will be before it arrives, Google did highlight three key features which will be coming with the change.

  1. “Things to know”
  2. Topic exploration
  3. Visual exploration

Google’s “Things to Know”

Using predictive models, Google’s search engine will soon intuit the most likely steps you will take after an initial search and deliver websites that will facilitate those actions.

To help illustrate this process, the announcement uses the example of a user searching for “acrylic painting”.

According to the search engine’s data, there are more than 350 topics associated with that specific keyword phrase.

Using this knowledge, the “things to know” feature will then identify the most relevant or popular “paths” users are likely to take to further explore that topic and find content relating to that.

Topic Exploration

The next feature piggybacks on the last by making it easy to dive into related topics or find more in-depth information.

Using the feature, users can quickly broaden the topic they are looking at to find more general information, or zoom in to more detailed resources.

Visual Exploration

The last update enabled by MUM is actually already live on the search engine, providing a new way to visually explore topics.

Specifically, the visual search results page will appear for searches where a user is “looking for inspiration.”

As Google explains it:

“This new visual results page is designed for searches that are looking for inspiration, like ‘Halloween decorating ideas’ or ‘indoor vertical garden ideas,’ and you can try it today.”


It is likely that these new features are just the start of Google’s introduction of the MUM algorithm to revamp how it does search. Since its start, the search engine has struggled to understand visual content, but MUM finally provides a path to not only understand but deliver visual content across the entire Google platform.

Head of Instagram Adam Mosseri has been opening up recently in a series of blog posts about how the app surfaces content.

First, he went in-depth on how the social app’s recommendation features find and highlight content in users’ primary feeds, as well as in stories, the explore section, and more.

Now, he is focusing on the app’s search engine, explaining how Instagram ranks search results and how to optimize content for the platform.

How Instagram Search Works

As with any modern search engine, the first and foremost goal of Instagram’s search feature is to find and return the most relevant results for an individual user’s query.

“Your search tells us what you’re looking for, and it’s noticeable when the results aren’t useful. It’s important for us to get this right, so we try to organize search results by what’s most relevant to you — whether it be a close friend, a creator you love, or ideas for vegan desserts.

“Let’s say you’re interested in finding pictures of space after seeing the blue moon. When you tap the search bar on the Explore page, the first thing you see is your recent searches. As you begin typing “space,” we show you accounts, audio, hashtags, and places that match the text of your search. In this case, results like @space and #space show up because “space” appears in their name.”

Instagram’s Top Three Ranking Signals

To deliver these results, Instagram looks at a number of factors including account data, hashtags, user engagement, and more. Specifically, Mosseri highlights three major ranking signals to pay attention to:

  • Your text in Search. The text you enter in the search bar is by far the most important signal for Search. We try to match what you type with relevant usernames, bios, captions, hashtags and places.
  • Your activity. This includes accounts you follow, posts you’ve viewed, and how you’ve interacted with accounts in the past. We usually show accounts and hashtags you follow or visit higher than those you don’t.
  • Information about the search results. When there are a lot of potential results, we also look at popularity signals. These include the number of clicks, likes, shares and follows for a particular account, hashtag or place.
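Purely as an illustration (this is not Instagram’s actual algorithm), the three signals could be combined into a toy score in which the text match gates a result, while activity and popularity break ties:

```python
# Toy scoring sketch of the three signals Mosseri describes.
# Purely illustrative -- Instagram's real ranking is far more complex.
def score_result(query, result):
    # 1. Text match is the dominant signal: no match, no result.
    if query.lower() not in result["name"].lower():
        return 0.0
    score = 10.0
    # 2. Your activity: boost accounts you already follow or interact with.
    if result["followed_by_user"]:
        score += 5.0
    # 3. Popularity signals break ties among many matching candidates.
    score += min(result["follower_count"] / 1_000_000, 3.0)
    return score

results = [
    {"name": "@space", "followed_by_user": True, "follower_count": 900_000},
    {"name": "#space", "followed_by_user": False, "follower_count": 2_500_000},
    {"name": "@cooking", "followed_by_user": True, "follower_count": 50_000},
]
ranked = sorted(results, key=lambda r: score_result("space", r), reverse=True)
print([r["name"] for r in ranked])
```

In this sketch the account you follow outranks the more popular hashtag, matching Mosseri's note that accounts you follow usually appear above those you don't.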

Tips for Getting Your Content in Instagram Search Results

Mosseri goes on to offer three suggestions for optimizing your profile and posts for the app’s search engine:

  • Use a fitting handle and profile name. Search results are matched by text. Using an Instagram handle or profile name that’s related to the content of your posts is your best bet for showing up in relevant searches. If your friends or fans know you by a certain name, include that name in your username or profile so that you can show up when they search for you.
  • Include relevant keywords and locations in your bio. Same principle here. Make sure your bio includes keywords about who you are and what your profile is about. If your account is location-specific, like for a small business, sharing your location in your bio can make it easier for people in your area to find you.
  • Use relevant keywords and hashtags in captions. For a post to be found in Search, put keywords and hashtags in the caption, not the comments.

How Instagram Filters Unsafe Content

Of course, Instagram has to filter out its fair share of spam, inappropriate content, and problematic pages.

This is done by penalizing specific posts, accounts, and, on some rare occasions, entire hashtags.

As Mosseri explains:

“Accounts that post spam or violate our guidelines may appear lower in search results, and you may have to search their full username to find them. We also balance searches for sensitive topics with additional safety measures to make sure we don’t show you related content that could be harmful. Accounts, hashtags and posts that violate our Community Guidelines are removed from Instagram entirely, which prevents them from showing up in Search.”

Plans for the Future

Mosseri concludes his blog post by sharing a bit about the upcoming improvements Instagram plans to make to its search results. Notably, he says the company wants to make Instagram Search “more than just a way to find accounts and hashtags” by moving towards a “full search results page experience.”

“For example, your search for “space” will show you space-related photos and videos, too. This is especially helpful when you don’t have an exact username or hashtag in mind when searching for a certain topic.”

If you want to read Adam Mosseri’s full blog post about how Instagram ranks search results, click here.