Google is adding a new set of ranking signals to its search engine algorithm in the coming year, according to an announcement this week. 

The search engine says it will begin factoring in “Core Web Vitals” as a ranking signal starting in May 2021, combining them with existing user experience-related ranking signals. 

Google has been measuring Core Web Vitals since earlier this year, assessing the speed, responsiveness, and stability of web pages. 

These factors are what Google calls the Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
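
The three thresholds above can be expressed as a simple pass/fail check. Here is a minimal sketch in Python; the metric names and “good” limits come straight from the list above, while the function itself is purely illustrative:

```python
# "Good" thresholds for the three Core Web Vitals, as described above.
THRESHOLDS = {
    "LCP": 2.5,   # Largest Contentful Paint, seconds
    "FID": 100,   # First Input Delay, milliseconds
    "CLS": 0.1,   # Cumulative Layout Shift, unitless score
}

def passes_core_web_vitals(metrics: dict) -> bool:
    """Return True if every reported metric meets its 'good' threshold."""
    return all(
        metrics[name] < limit
        for name, limit in THRESHOLDS.items()
        if name in metrics
    )

# Example: a page that paints its largest element in 2.1s,
# responds to input in 80ms, and shifts layout by 0.05.
print(passes_core_web_vitals({"LCP": 2.1, "FID": 80, "CLS": 0.05}))  # True
print(passes_core_web_vitals({"LCP": 3.4, "FID": 80, "CLS": 0.05}))  # False
```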

These signals will be joining the already announced page experience signals:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security
  • Intrusive interstitial guidelines

As Google explains in its announcement:

“These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web.”

Based on recent data assessments, this should concern the majority of websites. A study published in August suggests fewer than 15% of all websites would pass a Core Web Vitals assessment if the signals were implemented today. 

The search engine has also hinted at the potential to introduce new labels in search results, highlighting pages with the best user experience. Though nothing is set in stone, this would provide even more motivation for pages trying to maintain the best place in search results. 

For more information about updating your site for Core Web Vitals, you can explore Google’s resources and tools here.

Search engine algorithms are tightly protected, with most of what we know pieced together through data. Google, YouTube, Bing, and Facebook prefer to keep as little publicly known as possible, to prevent people from “gaming” the algorithm to leapfrog to the top of the results. 

This week, however, YouTube revealed quite a bit about its video recommendation algorithm in a Q&A, including how a few signals directly impact rankings. 

Below, we’ve collected a few of the best questions asked in the Q&A, as well as the responses from YouTube’s team responsible for maintaining the YouTube recommendation algorithm.

Underperforming Videos

Many believe that having even one or two underperforming videos can hurt your channel overall. Is it true that a few poor videos can affect your future videos’ performance?

YouTube recognizes that not every video is going to be a smash hit. In fact, they regularly see that some channels have videos that perform very well, while others fail to hit the mark. 

This is why YouTube focuses more on how people are responding to a given video, rather than past video performance. 

As the team says, the recommendation algorithm will always “follow the audience”.

Too Many Uploads

Is there a point where a creator can be uploading too many videos? Can having a large number of uploads in a day hurt your chances of being recommended?

The simple answer here is no. YouTube’s recommendation algorithm does not directly punish channels for uploading too many videos in a day. 

In fact, there are channels which benefit from uploading numerous videos in a series at once. 

What it comes down to is how many videos your viewers are willing to watch at once. 

The recommendation algorithm will continue to recommend your videos to viewers so long as they continue to watch. 

However, if you begin to lose viewers with each successive upload, it may be a sign that your audience is at their limit. 

While there is no limit to how many videos YouTube will recommend from your channel, there is a limit to how many notifications viewers will receive in a given day. Viewers can receive up to three notifications for new videos from a single channel in a 24-hour period. 

Inactive Subscribers

After a few years, channels can develop a significant number of inactive subscribers. Can these hurt your channel, and would it be beneficial to start a new channel to reduce these numbers?

YouTube knows that there are many reasons subscribers can become inactive. Because of this, they do not factor in inactive subscribers when recommending videos. 

With this in mind, there is no real value to starting a new channel to reduce inactive subscribers or reconnect with lost viewers. 

The only reason you should consider starting a new channel is if you decide to go in a different direction with your content.

External Traffic

Does external traffic help your channel?

External traffic is absolutely a factor that YouTube’s recommendation algorithm considers and can help your videos get recommended. 

However, there are limits.

While external traffic will help your video get recommended to viewers, it has to continue to perform well to continue being shown. 

To continue being recommended, viewers have to not only click on your video but respond well to the content. 

Does this mean it will hurt my video if I’m getting lots of traffic from external websites and it is dragging down my click-through rates and average view durations?

This is actually a common phenomenon, so YouTube will not punish your video if its average view duration drops when it receives large amounts of external traffic. 

What really matters is how people respond after clicking on your video in their recommendations.

To hear the YouTube recommendation team answer these questions in more detail, watch the full video below:

This week, Microsoft announced a free new analytics service called Clarity which aims to walk the line between in-depth insights and user privacy. 

The set of tools is said to be completely GDPR compliant while still digging deep on a wide range of metrics that matter to businesses. 

Doesn’t Slow Your Site Down

One of the biggest aspects of Microsoft Clarity is that it tracks an astonishing amount of non-identifiable data without measurably slowing your website. Even more surprising is the lack of traffic caps – making Clarity an option even if you’re getting millions of visitors a day. 

As the company says:

“Clarity is designed to have a very low impact on page load times, so you can make sure users navigating to your site won’t have to wait for pages to load.

Additionally, we don’t place any caps on your traffic so whether you get 10 visitors per day or 1,000,000, Clarity will be able to handle your traffic with no additional cost for you.”

Privacy Focused

Despite the amount of information gathered by Clarity, the analytics service still prioritizes privacy. 

According to the announcement:

“We are GDPR compliant as a data controller for visitors to our site and processor for the data gathered by the Clarity script on your site.”

Session Playback

One of the three biggest features highlighted in Clarity is the ability to replay site visitors’ time on your site, including where they moved their mouse, where they clicked, and what made them pause. 

This is shown in a simple video recreating the visitor’s browser window with highlighted cursor movement.

Heat Map

While session playbacks allow you to see what single visitors are doing on your site, Clarity’s heat maps will show you what everyone is doing on your site. 

This feature shows what page elements are getting clicked the most and where users are spending the most time on a page. 

This can provide great insights into what is catching people’s attention, and where you are starting to lose them. 

Though not available at launch, the company says they will also feature an option for scrolling metrics in the future, helping understand how users are traveling through your content. 

Insight Dashboard

To help break down and visualize all this information, Clarity includes a dashboard with a wide array of important metrics. 

“We provide a dashboard of aggregate metrics to help you get an overall understanding of the traffic on your site. At a glance you will be able to see how many users were clicking on non-existent links or how many people scrolled up and down a page in search of something they couldn’t readily find.

You can also see things like how many concurrent JavaScript errors are occurring across your clients or how much time the average user spends navigating your site.”

To start using Clarity for yourself, sign up here.

Google My Business is an essential tool for any local business trying to spread their name online. It is also deceptively complicated. 

At first glance, GMB seems very simple and easy to set up. You just fill out a few forms, answer a few questions, upload a couple of pictures, and presto! You’ve got a GMB listing. 

Actually optimizing that listing to ensure it appears in nearby customers’ searches, however, is where things get complicated. 

As usual, Google is remarkably non-transparent about how it ranks local searches. Still, a few things have become very apparent over the years. It is pretty much undeniable that having a lot of 5-star reviews will help you rank better. On the other hand, there is reason to believe some sections have absolutely no impact on your local rankings. To get to the truth of how the algorithm works, we have to look at data from tests.

Recently, MozCon speaker Joy Hawkins shared some findings she and her team have made from their own tests and data about which GMB sections help you rank better.

Which Google My Business Sections Affect Rankings

1) Business Name

Sometimes the simplest things can become unbelievably complicated. You almost certainly chose your business name well before making a listing, and you can’t exactly change it now. 

Unfortunately, this puts some businesses at a disadvantage while others get a natural step up. 

According to Hawkins, businesses with a keyword in their name get a boost in local rankings. There is one thing you can do, though.

As she explains:

“The real action item would be to kind of look to see if your competitors are taking advantage of this by adding descriptive words into their business name and then submitting corrections to Google for it, because it is against the guidelines.”

2) Categories

This is another section that seems like it should be very simple. You can check up to 10 boxes that match your business, covering everything from Aboriginal Art Gallery to Zoo. Where this becomes tricky is ensuring the categories you choose remain the most accurate for your business. 

Hawkins’ team found that Google updates its list of categories between 2 and 10 times each month on average. In some cases, it adds new categories that may be a more specific match for your business. Other times, it may entirely remove categories it feels are irrelevant or unnecessary. 

Either way, it is up to you to keep your business categorized properly to protect your ranking.

3) Website

The vast majority of businesses use the homepage of their website as the primary link everywhere, including on Google My Business. It makes sense, and it works perfectly fine. 

What Hawkins found, though, is that some businesses actually benefit from linking to a more specific page of their site. For example, businesses with multiple locations can link to an individual location page to specify exactly which store the listing points to. 

In this section, there is no agreed-upon best practice. Instead, Hawkins says to test several pages over time to ensure you are maximizing your exposure. 

4) Reviews

I mentioned it above, but it bears repeating: the number of positive reviews absolutely affects your ranking in local search results. 

There is a small catch, however. According to what Hawkins’ team has seen, increasing the number of reviews on your listing may have diminishing returns.

“So for example, if you’re a business and you go from having no reviews to, let’s say, 20 or 30 reviews, you might start to see your business rank further away from your office, which is great. But if you go from, let’s say, 30 to 70, you may not see the same lift. So that’s something to kind of keep in mind.”

Still, reviews have consistently been shown to be a major ranking factor AND they improve the click-through rate of listings. This is obviously an area you will want to invest some energy in. 

If you want to learn a little more about how these sections impact your rankings or you want to see which fields have absolutely no effect, you can read Joy Hawkins’ original post here.

The past six months have seen upheavals in just about every area of life, from schooling, to work, to our daily shopping habits. Now, a report from BrightEdge suggests these shifts are going to continue at unprecedented levels through the holiday shopping season. 

The report, based on an evaluation of eCommerce clients across a wide range of industries predicts a historic online holiday shopping season for a massive range of interests – emphasizing the need to start preparing now. 

What Changed

It’s no secret that the COVID-19 pandemic has forced many to do more online shopping this year. In-store shortages, quarantines, and general concern about public gatherings have made online shopping a go-to choice for both essentials and luxuries.

Interestingly, the analysis suggests that though shoppers are doing significantly more online shopping compared to 2019, the revenue per order remains relatively stable. 

The report offers two theories for why this shift is occurring:

“Here are our hypotheses:

1. Shoppers shifted purchasing behaviors online during the first few weeks of the COVID-19 pandemic and are more keenly aware of their budgets – refraining from placing big-ticket purchases online, while stocking up on more essential goods or affordable luxuries.

2. Shoppers started buying more frequently online after COVID-19, supplanting offline purchases. This would include the buy-online, pickup-curbside behavior that may include cheaper items that were previously always purchased in-store.”

Black Friday Goes Virtual

Another major factor contributing to the predicted surge in online shopping during the 2020 holiday shopping season is the cancellation of many Black Friday events. 

Not only does this mean consumers will be forced to look elsewhere for big deals, but many retailers are also expected to push huge online sales for Black Friday to make up for lost revenue. Cyber Monday is likewise expected to gain an even higher profile this year. 

What You Can Do

With all these factors in mind, BrightEdge has one recommendation for brands trying to regain their footing this holiday season – invest in search engine optimization. 

As the report says:

“An impressive 60% of consumers have been shopping online more often since COVID-19, and of that group, 73% plan to continue after the pandemic. What digital marketers and SEOs have long known is finally coming to fruition: online shopping is convenient and easy. Now the trick is to make SEO important within your organization.”

Other Takeaways

In the conclusion of the report, BrightEdge offered a few key insights into the current behavior of online shopping and what they expect to see in the future:

  • “The research suggests that shoppers browse more frequently, leading to more purchases and overall revenue, though these purchases are smaller in value.
    This could be because shoppers are becoming more aware of their budgets – refraining from placing big-ticket purchases online while stocking up on more essential goods or affordable luxuries.
  • As we enter Q4 and the holiday shopping season, search is helping reveal radical changes in real-time.
  • The traditional customer journey is being radically altered in many industry sectors of the economy. As a result, it has never been more important to truly understand how consumer behavior [is changing] and use this understanding to drive engaging experiences.”

To view the full report, check out BrightEdge’s complete holiday shopping guide here.

It’s a question we all have dealt with at least once or twice, and one that rarely has a satisfying answer: “Why did my Google rankings suddenly drop?”

Sometimes, a simple audit will reveal a technical hiccup or issue that is downgrading your rankings. Just as often, though, everything appears to be working as it should, yet you are suddenly further down the page or not even on the first page anymore. 

In this situation, Google’s John Mueller says there are four major reasons for sites to lose rankings. 

John Mueller Explains Why Sites Lose Rankings

In a recent Google Webmaster Central chat, Mueller was asked why a publisher who had ranked well for “seven or eight years” had suddenly lost rankings for three different sites. Notably, the person asking the question couldn’t find any signs of problems in their inbound or outbound links, and all the sites used the same keywords (they sell similar products by different brands). 

Of course, Mueller couldn’t get too specific with his answer because he didn’t have actual data or analytics on the sites. Still, he did his best to address four general reasons sites may suddenly rank worse.

1) Rankings Are Temporary

Once a site is ranking at the top for its ideal keywords, many site owners feel like they have accomplished their mission and will continue to rank there. Unfortunately, John Mueller says that rankings are malleable and change constantly.

Mueller explained:

“In general, just because the site was appearing well in search results for a number of years does not mean that it will continue to appear well in search results in the future.

These kinds of changes are essentially to be expected on the web, it’s a very common dynamic environment”

2) The Internet Is Always Changing

The reason why rankings are so prone to fluctuations is that the internet itself is always changing. New sites are being created every day, links might die, competitors might improve their own SEO, and people’s interests change.

Each and every one of these can have a big impact on the search results people see at any given time. 

As Mueller put it:

“On the one hand, things on the web change with your competitors, with other sites…”

3) Google Changes Its Algorithms

To keep up with the constantly changing internet, Google itself has to regularly overhaul how its search engine interprets and ranks websites. 

To give you one idea how this plays out, a few years ago search results were absolutely dominated by “listicles” (short top 5 or top 10 lists). Over time, people got tired of the shallow information these types of lists provided and how easily they could be abused as clickbait. Google recognized this and tweaked its algorithm to better prioritize in-depth information hyper-focusing on a single topic or issue. Now, though a listicle can still rank on Google, it is considerably harder than it used to be.

As Mueller simply explained:

“On the other hand, things on our side change with our algorithms in search.”

4) People Change

This is one that has been touched upon throughout the list Mueller gave, but it really gets to the heart of what Google does. What people expect out of the internet is constantly changing, and it is Google’s job to keep up with these shifts. 

In some cases, this can mean that people outright change how they search. For example, simple keywords like “restaurants near me” or “fix Samsung TV” were the main tool people used to find information for years and years. As voice search has become widespread and people have gotten more accustomed to using search engines all the time, queries have expanded to frequently include full sentences or phrases like “What is the best Chinese restaurant in midtown?”

At the same time, what people expect out of the same queries is also shifting with technological innovation and content trends. 

Mueller describes the situation by saying:

“And finally on the user side as well, the expectations change over time. So, just because something performed well in the past doesn’t mean it will continue to perform well in search in the future.”

Always Be Monitoring and Improving

The big theme behind all of these reasons sites lose rankings is that they are standing still while the world moves past them. To maintain your high rankings, your site has to be constantly in motion – moving with the trends and providing the content users want and expect from sites at any given time. 

This is why successful sites are also constantly monitoring their analytics to identify upcoming shifts and respond to any drops in rankings as soon as they happen.

If you want to see the full response, watch the video below (it starts with Mueller’s response but you can choose to watch the entire Webmaster Central office-hours discussion if you wish).

Bing has officially completed the launch of its new Bing Webmaster Tools, which streamlines the old tool suite while offering a number of new features. 

According to the announcement, the process managed to condense the old version’s 47 unique links to just 17 different links without losing any of the functionality previously available. This was done by bundling redundant or related functions into more powerful tools.

At the same time, Bing announced it had introduced a new URL inspection tool, a Robots.txt testing feature, a site scan tool, and revamped webmaster guidelines. 

Choose Your Bing Webmaster Tools

For now, webmasters can choose to use the new or old version of Bing Webmaster Tools. The old suite is available at https://www.bing.com/webmaster/. The new version can be found at https://www.bing.com/webmasters/.

However, the old version won’t be sticking around for too long. The announcement says it will be disabled sometime next month. 

Enhanced Tools

While streamlining the platform, Bing expanded the functionality of several tools. These updates include:

  • Backlinks lists backlinks for any site, including similar websites.
  • Keyword Research lets you filter data by countries, languages, and devices. 
  • URL Submission is better streamlined for easier navigation. This includes simplifying URL submission via the Bing WordPress plugin for faster indexing. 
  • SEO Reports provides improved classification of errors or issues. 

New Tools

Along with the consolidated and enhanced tools from the old version of Bing Webmaster Tools, the company revealed several new tools. These include:

  • URL Inspection: A beta feature that allows Bing to inspect crawled versions of your site for potential indexing issues.
  • Site Scan: A site audit tool that crawls and checks your site for common SEO issues which may affect your search ranking. 
  • Robots.txt Tester: Check your robots.txt file using the same inspection tools Bing uses to verify your URLs.
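
Bing has not published how its tester works internally, but the basic robots.txt check such a tool performs can be approximated with Python’s standard library. A minimal sketch (the rules, user agent, and URLs here are illustrative placeholders; in practice you would fetch the live robots.txt file):

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules directly from text. "bingbot" is the user agent
# string Bing's crawler identifies itself with.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check whether specific URLs may be crawled under these rules.
print(parser.can_fetch("bingbot", "https://example.com/page"))       # True
print(parser.can_fetch("bingbot", "https://example.com/private/x"))  # False
```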

Google appears to be testing the idea of “upgrading” Google My Business profiles with a special “Google Guaranteed” badge for a $50 monthly fee.

Twitter user Tom Waddington shared a screenshot of a promotional page within the GMB dashboard offering the profile upgrade.

What Is Google Guaranteed?

The search engine has been playing with the “Google Guaranteed” badge since last year, though it has typically been used in Local Service Ads for home services businesses. 

To be eligible for the badge, businesses must meet a number of criteria including certification, licensing, and background checks. 

The idea appears to be to inspire more confidence in brands listed in Google’s local results by highlighting those who have been vetted. 

Why Would Anyone Pay For This?

On its face, the idea of paying $50 a month for what amounts to a stamp of approval sounds a little silly. However, the badge comes with some backing which may help customers feel more at ease.

Along with the Google Guarantee badge, businesses which pass the screening process are also backed with a customer satisfaction guarantee. If a customer finds your business through the search engine and is not satisfied with the results, Google will refund the amount paid up to $2,000.

Along with this aspect, there is always the issue of getting ahead of your competition. Any little advantage can be the key to standing apart from your competitors.

Just an “Experiment” … For Now

When asked about the program via email, a Google spokesperson told Search Engine Journal:

“We’re always testing new ways to improve our experience for our advertisers, merchants, and users. This experiment will show the Google Guaranteed badge on the business profile. We don’t have anything additional to announce right now.”

A lot has changed at Google over the past few years, but one thing remains the same – the majority of people will click the top link on any search result page. 

A new study of over 80 million keywords and billions of search results found that an average of 28.5% of users will click the top organic result for a given search. 

From there, the average CTR for results declines sharply. Listings in second place receive an average of 15% of clicks, while third place falls to 11%. 

By the time you get to the last listing of a results page, links receive only a 2.5% click-through rate. 

You can imagine what the CTRs for anything after the first page would be like. 

Other Factors Influencing Search CTRs

Unsurprisingly, there is quite a bit of variance in the actual click-through rates for some results pages. In the study, Sistrix found click-through rates for listings in the first position swung from 13.7% to almost 50%. 

While the relevance of the top listing has some effect on its CTR, the study suggests another major factor is the SERP layout. 

For example, search results including sitelinks extensions significantly outperformed those without. 

On the other hand, the study found that search results including featured snippets had a significant negative impact, dropping click-through rates by at least 5% on average. 

Similarly, knowledge panels reduced the average CTR from 28% to 16%.

In these situations, the researchers believe users don’t feel the need to investigate further when provided with quick answers directly within the search results pages:

“The CTR in the first two organic positions drops significantly compared to the average. Many users appear to find the information they are looking for in the Knowledge Panel – especially on their smartphones, where each time a page is loaded it takes a lot of time.”

For more information, you can explore the full study report here.

Google is making a change to how some featured snippets function by taking users directly to the associated text when clicked. 

Featured snippets are the highlighted search results that appear at the top of some results pages, showing a specifically relevant bit of text.

The company announced the update through its Google SearchLiaison Twitter account, which posted:

“As we have done with AMP pages since December 2018, clicking on a featured snippet now takes users to the exact text highlighted for HTML pages, when we can confidently determine where the text is.”

While it is a relatively small change, it makes featured snippets even more useful to searchers (and thus, more essential for businesses to put in place). 

Surprisingly, the company says there is no additional code or special markup needed to prepare your featured snippets for this change. 

Instead, the search engine is essentially using a trick that highlights specific text by tweaking the URL for each snippet.

As Roger Montti explained over at Search Engine Journal, this is a feature previously used for Accelerated Mobile Pages (AMP), which has the dual benefits of being easy to implement and trackable.
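
The URL tweak in question appears to be the “text fragment” syntax, where the passage to highlight is URL-encoded after `#:~:text=`. As a rough illustration of the idea (the page URL and snippet below are placeholders, not taken from Google’s implementation):

```python
from urllib.parse import quote

def text_fragment_url(page_url: str, snippet: str) -> str:
    """Build a link that scrolls to and highlights `snippet` on the page."""
    return f"{page_url}#:~:text={quote(snippet)}"

print(text_fragment_url("https://example.com/guide", "featured snippets"))
# https://example.com/guide#:~:text=featured%20snippets
```

A browser that supports text fragments will scroll to and highlight the first match for the encoded phrase; browsers that don’t simply ignore the fragment.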