
Blog comments are a tricky issue for many business websites. 

On one hand, everyone dreams of building a community of loyal customers who follow every post and regularly hold healthy discussions in the comments. Not only can those discussions be helpful for other potential customers, but comments also tend to help Google rankings and inspire future content for your site. 

On the other hand, most business-based websites receive significantly more spam than genuine comments. Even the best anti-spam measures can’t prevent every sketchy link or comment on every post. For the most part, these are more of an annoyance than an actual problem. However, if left completely unmonitored, spam could build up and potentially hurt your rankings.

This can make it tempting to just remove comments from your blog entirely. If you do, you don’t have to worry about monitoring comments, responding to trolls, or weeding out spam. After all, your most loyal fans can still talk about your posts on your Facebook page, right?

Unfortunately, as Google’s John Mueller recently explained, removing comments from your blog is likely to hurt more than it helps. 

John Mueller Addresses Removing Blog Comments

In a Google Search Central SEO hangout on February 5, Google’s John Mueller explored a question from a site owner about how Google factors blog comments into search rankings. Specifically, they wanted to remove comments from their site but worried about potentially dropping in the search results if they did. 

While the answer was significantly more complicated, the short version is this:

Google does factor blog comments into where they decide to rank web pages. Because of this, it is unlikely that you could remove comments entirely without affecting your rankings. 

How Blog Comments Impact Search Rankings

Google sees comments as a separate but significant part of your content. So, while the search engine recognizes that comments may not directly reflect your own content, they do reflect things like engagement and occasionally provide helpful extra information. 

This also means that removing blog comments is essentially removing a chunk of information, keywords, and context from every blog post on your site in the search engine’s eyes. 

However, John Mueller didn’t go so far as to recommend keeping blog comments over removing them. The right choice depends on several factors, including how many comments you’ve received, what type of comments they are, and how much they have added to your SEO.

As Mueller answered:

“I think it’s ultimately up to you. From our point of view we do see comments as a part of the content. We do also, in many cases, recognize that this is actually the comment section so we need to treat it slightly differently. But ultimately if people are finding your pages based on the comments there then, if you delete those comments, then obviously we wouldn’t be able to find your pages based on that.

So, that’s something where, depending on the type of comments that you have there, the amount of comments that you have, it can be the case that they provide significant value to your pages, and they can be a source of additional information about your pages, but it’s not always the case.

So, that’s something where I think you need to look at the contents of your pages overall, the queries that are leading to your pages, and think about which of these queries might go away if comments were not on those pages anymore. And based on that you can try to figure out what to do there.

It’s certainly not the case that we completely ignore all of the comments on a site. So just blindly going off and deleting all of your comments in the hope that nothing will change – I don’t think that will happen.”

It is clear that removing blog comments entirely from your site is all but certain to affect your search rankings on some level. Whether this means a huge drop in rankings or potentially a small gain, though, depends entirely on what type of comments your site is actually losing. 

To watch Mueller’s full answer, check out the video below:

Google has always had a love-hate relationship with pop-ups or ‘interstitials’. 

Since 2016, the search engine has reportedly used a ranking penalty to punish sites using aggressive or intrusive pop-ups on their pages. Of course, if you’ve been to many sites recently, you know these disruptive pop-ups are still common across the web.

In a recent stream, Google’s John Mueller clarified exactly how the interstitial “penalty” works, and why so many sites get away with using disruptive pop-ups.

John Mueller on Website Pop-Ups

During a recent Google Search Central office hours stream, Mueller was asked about the possibility of using mobile pop-ups on their site for a short period of time.

Specifically, the individual wanted to know if they would be devalued for using interstitials to ask visitors to take a survey when visiting the site.

Perhaps surprisingly, Mueller didn’t see much issue with temporarily running pop-ups on their mobile site. 

Going even further, he explained that even if the site was hit with a penalty for the pop-ups, it could potentially continue to rank well in search results. 

This is because the so-called “interstitials penalty” is quite a minor ranking factor in the grand scheme. While it can affect your rankings, it is unlikely to have a significant impact unless other issues are present.

Still, Mueller says if you are going to use pop-ups on your mobile sites, the best course is to only use them temporarily and not to show them to every visitor coming to your site.

Here’s his full response:

“I don’t think we would penalize a website for anything like this. The web spam team has other things to do than to penalize a website for having a pop-up.

There are two aspects that could come into play. On one hand we have, on mobile, the policy of the intrusive interstitials, so that might be something to watch out for that you don’t keep it too long or show it to everyone all the time.

With that policy it’s more of a subtle ranking factor that we use to adjust the ranking slightly if we see that there’s no useful content on the page when we load it. That’s something that could come into play, but it’s more something that would be a temporary thing.

If you have this survey on your site for a week or so, then during that time we might pick up on that signal, we might respond to that signal, and then if you have removed it we can essentially move on as well. So it’s not that there’s going to be a lasting effect there.

Another aspect that you want to watch out for is if you’re showing the pop-up instead of your normal content then we will index the content of the pop-up. If you’re showing the pop-up in addition to the existing content, which sounds like the case, then we would still have the existing content to index and that would kind of be okay.”

Ultimately, the takeaway is not to fixate on being penalized specifically for using an interstitial pop-up on your site. Rather, put your attention on doing what is right for your website and what provides the best experience for visitors.

If you want to hear the question and full answer for yourself, check out the video below:

Google confirmed this week that its most recent broad core update, which began rolling out on December 3, 2020, is now completely rolled out to all search users.

Google’s SearchLiaison account announced that “the December 2020 Core Update rollout is complete” yesterday, following almost two weeks of anxious waiting from webmasters and SEOs.

What We Know

Google is notoriously tight-lipped about its “secret recipe” used to rank websites around the world. Still, this update was big enough that the search engine felt it necessary to alert the public when the December core update started rolling out. 

This may simply be because the update rollout is global, affecting all users in all countries, across all languages, and across all website categories. 

However, early signs suggest the algorithm update was uncommonly big, with many reporting huge gains or losses in organic traffic from search engines. 

What Is a Broad Core Update?

Google’s “broad core updates” are essentially a tuneup of the search engine’s systems. Rather than adding a specific feature, targeting a singular widespread issue like linkspam, or prioritizing a ranking signal, a core update more subtly tweaks Google’s existing systems. This can be rebalancing the impact of some search signals, refining Google’s indexing tools, or any other combination of changes. 

What To Do If You Are Affected

The first thing any webmaster should do is thoroughly check their analytics to see whether they have experienced a significant change in search traffic. 

If you have, you will be disappointed to hear that Google has not provided any specific guidance for how to recover from this update. In fact, the company suggests a negative impact from a core update may not even reflect any actual problems with your website.

What the search engine does offer is a series of questions to consider if you have been affected by a recent core update. Though not as useful as actual suggestions for fixing lost rankings, these questions can help you assess your site and identify areas for improvement before the next broad core update.

With the announcement that Google will begin including the “Core Web Vitals” (CWV) metrics in its search engine algorithm starting next year, many are scrambling to make sense of what exactly these metrics measure and how they work.

Unlike metrics such as “loading speed” or “dwell time” which are direct and simple to understand, Core Web Vitals combine a number of factors which can get very technical.

To help you prepare for the introduction of Core Web Vitals as a ranking signal next year, Google is sharing a comprehensive guide to what CWV measures, and how they can affect your website. 

What Are Core Web Vitals

The first thing to understand is what exactly Core Web Vitals are. Simply put, CWV are a combination of three specific metrics assessing your page’s loading speed, usability, and stability. These three metrics appear very technical at first, but the gist is that your site needs to load quickly and provide a secure and easy to use experience. As for the specifics, Core Web Vitals include:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
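The three thresholds above can be read as a simple pass/fail check. As a rough illustration (the function name and structure here are hypothetical, not any official Google tool), a page meets the “good” bar only when all three metrics clear their thresholds at once:

```python
# A minimal sketch of checking page metrics against the Core Web Vitals
# thresholds listed above. This is an illustration, not an official tool.

def passes_core_web_vitals(lcp_seconds, fid_ms, cls_score):
    """Return True if all three metrics meet the 'good' thresholds."""
    return (
        lcp_seconds <= 2.5      # Largest Contentful Paint within 2.5 seconds
        and fid_ms <= 100       # First Input Delay of 100 milliseconds or less
        and cls_score <= 0.1    # Cumulative Layout Shift of 0.1 or less
    )

print(passes_core_web_vitals(2.1, 80, 0.05))   # True: fast, responsive, stable
print(passes_core_web_vitals(3.4, 80, 0.05))   # False: slow LCP alone fails it
```

Note that a single weak metric is enough to miss the mark, which is why the guide treats all three as a set rather than letting a strong score in one area offset a weak one elsewhere.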

Importantly, in the new guide, Google reaffirmed its intention to start using Core Web Vitals as a ranking signal in 2021. 

“Starting May 2021, Core Web vitals will be included in page experience signals together with existing search signals including mobile-friendliness, safe-browsing, HTTPS-security, and intrusive interstitial guidelines.”

Does Every Page Need To Meet CWV Standards?

In the help document, Google explains that the Core Web Vitals standards it set out should be seen as a mark to aim for, but not necessarily a requirement for good ranking. 

Q: Is Google recommending that all my pages hit these thresholds? What’s the benefit?

A: We recommend that websites use these three thresholds as a guidepost for optimal user experience across all pages. Core Web Vitals thresholds are assessed at the per-page level, and you might find that some pages are above and others below these thresholds. The immediate benefit will be a better experience for users that visit your site, but in the long-term we believe that working towards a shared set of user experience metrics and thresholds across all websites, will be critical in order to sustain a healthy web ecosystem.

Will Core Web Vitals Make or Break Your Site?

It is unclear exactly how strongly Core Web Vitals metrics will affect your site when they are implemented, but Google’s current stance suggests they could be a meaningful part of your rankings.

Q: How does Google determine which pages are affected by the assessment of Page Experience and usage as a ranking signal?

A: Page experience is just one of many signals that are used to rank pages. Keep in mind that intent of the search query is still a very strong signal, so a page with a subpar page experience may still rank highly if it has great, relevant content.

Other Details

Among the Q&A, Google also gives a few important details on the scope and impact of Core Web Vitals.

Q: Is there a difference between desktop and mobile ranking? 

A: At this time, using page experience as a signal for ranking will apply only to mobile Search.

Q: What can site owners expect to happen to their traffic if they don’t hit Core Web Vitals performance metrics?

A: It’s difficult to make any kind of general prediction. We may have more to share in the future when we formally announce the changes are coming into effect. Keep in mind that the content itself and its match to the kind of information a user is seeking remains a very strong signal as well.

The full document covers a wide range of technical issues which will be relevant for any web designer or site manager, but the big picture remains the same. Google has been prioritizing sites with the best user experience for years, and the introduction of Core Web Vitals only advances that effort. 

Find out more about Core Web Vitals here.

Google is adding a new set of ranking signals to its search engine algorithm in the coming year, according to an announcement this week. 

The search engine says it will begin factoring “Core Web Vitals” as a ranking signal starting in May 2021, combining with already existing user experience-related ranking signals. 

Google has been measuring Core Web Vitals since earlier this year, assessing the speed, responsiveness, and stability of web pages. 

These factors are what Google calls the Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
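Of the three metrics, CLS is the least intuitive. Per the metric’s published definition, each unexpected layout shift is scored as the product of an impact fraction (how much of the viewport was affected) and a distance fraction (how far elements moved), and those scores accumulate. The sketch below is a simplification of that arithmetic, not Google’s actual measurement code:

```python
# A simplified sketch of how a Cumulative Layout Shift score accumulates.
# Each unexpected shift contributes impact_fraction * distance_fraction
# (both values between 0 and 1), and the contributions are summed.

def layout_shift_score(impact_fraction, distance_fraction):
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts):
    """shifts: list of (impact_fraction, distance_fraction) pairs."""
    return sum(layout_shift_score(i, d) for i, d in shifts)

# Two small shifts stay comfortably under the 0.1 "good" threshold...
print(cumulative_layout_shift([(0.2, 0.1), (0.3, 0.1)]))
# ...while a single large shift (say, a late-loading banner pushing
# content down a quarter of the viewport) can exceed it on its own.
print(cumulative_layout_shift([(0.6, 0.25)]))
```

This is why a page with many tiny shifts can still score well while one late-loading ad or banner sinks the metric.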

These signals will be joining the already announced page experience signals:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security
  • Intrusive interstitial guidelines

“These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web.”

Based on recent data assessments, this should concern the majority of websites out there. A study published in August suggests less than 15% of all websites would pass a Core Web Vitals assessment if the signals were implemented today. 

The search engine has also hinted at the potential to introduce new labels in search results, highlighting pages with the best user experience. Though nothing is set in stone, this would provide even more motivation for pages trying to maintain the best place in search results. 

For more information about updating your site for Core Web Vitals, you can explore Google’s resources and tools here.

It’s a question we all have dealt with at least once or twice, and one that rarely has a satisfying answer: “Why did my Google rankings suddenly drop?”

Sometimes, a simple audit will reveal a technical hiccup or issue that is downgrading your rankings. Just as often, though, it appears everything is working as it should but you are suddenly further down the page or not even on the first page anymore. 

In this situation, Google’s John Mueller says there are four major reasons for sites to lose rankings. 

John Mueller Explains Why Sites Lose Rankings

In a recent Google Webmaster Central chat, Mueller was asked why a publisher who had ranked well for “seven or eight years” had suddenly lost rankings for three different sites. Notably, the person asking the question couldn’t find any signs of problems in their inbound or outbound links, and all the sites used the same keywords (they sell similar products by different brands). 

Of course, Mueller couldn’t get too specific with his answer because he didn’t have actual data or analytics on the sites. Still, he did his best to address four general reasons sites may suddenly rank worse.

1) Rankings Are Temporary

Once a site is ranking at the top for its ideal keywords, many site owners feel like they have accomplished their mission and will continue to rank there. Unfortunately, John Mueller says that rankings are malleable and change constantly.

Mueller explained:

“In general, just because the site was appearing well in search results for a number of years does not mean that it will continue to appear well in search results in the future.

These kinds of changes are essentially to be expected on the web, it’s a very common dynamic environment”

2) The Internet Is Always Changing

The reason why rankings are so prone to fluctuations is that the internet itself is always changing. New sites are being created every day, links might die, competitors might improve their own SEO, and people’s interests change.

Each and every one of these can have a big impact on the search results people see at any given time. 

As Mueller put it:

“On the one hand, things on the web change with your competitors, with other sites…”

3) Google Changes Its Algorithms

To keep up with the constantly changing internet, Google itself has to regularly overhaul how its search engine interprets and ranks websites. 

To give you one idea how this plays out, a few years ago search results were absolutely dominated by “listicles” (short top 5 or top 10 lists). Over time, people got tired of the shallow information these types of lists provided and how easily they could be abused as clickbait. Google recognized this and tweaked its algorithm to better prioritize in-depth information hyper-focusing on a single topic or issue. Now, though a listicle can still rank on Google, it is considerably harder than it used to be.

As Mueller simply explained:

“On the other hand, things on our side change with our algorithms in search.”

4) People Change

This is one that has been touched upon throughout the list Mueller gave, but it really gets to the heart of what Google does. What people expect out of the internet is constantly changing, and it is Google’s job to keep up with these shifts. 

In some cases, this can mean that people outright change how they search. For example, simple keywords like “restaurants near me” or “fix Samsung TV” were the main tool people used to find information for years and years. As voice search has become widespread and people have gotten more accustomed to using search engines all the time, queries have expanded to frequently include full sentences or phrases like “What is the best Chinese restaurant in midtown?”

At the same time, what people expect out of the same queries is also shifting with technological innovation and content trends. 

Mueller describes the situation by saying:

“And finally on the user side as well, the expectations change over time. So, just because something performed well in the past doesn’t mean it will continue to perform well in search in the future.”

Always Be Monitoring and Improving

The big theme behind all of these reasons sites lose rankings is that they are standing still while the world moves past them. To maintain your high rankings, your site has to be constantly in motion – moving with the trends and providing the content users want and expect from sites at any given time. 

This is why successful sites are also constantly monitoring their analytics to identify upcoming shifts and respond to any drops in rankings as soon as they happen.

If you want to see the full response, watch the video below (it starts with Mueller’s response but you can choose to watch the entire Webmaster Central office-hours discussion if you wish).

Google will soon be updating their search ranking algorithm with a new ranking signal. This new signal will combine a number of existing signals with a recently introduced metric known as Core Web Vitals. 

The search engine says the goal of the new update is to better rank pages based on the quality of users’ experiences with the site. 

In addition to the new ranking signal, the company announced a few other changes it will be making to its systems in the near future:

  • Incorporating page experience metrics into rankings for Top Stories in Search on mobile
  • Removing the AMP requirement for content to be shown in Top Stories

The “New” Ranking Signal

While the new signal is being called the Page Experience Signal, it actually combines a few existing search ranking signals with the recently introduced Core Web Vitals metrics. The existing signals being folded into the new page experience signal include:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security certification
  • Following intrusive interstitial guidelines

As the company said in its announcement:

“The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile.”

How To Monitor Your Core Web Vitals

To help prepare webmasters for the coming update, Google has also created a new report section within Search Console. The goal is for the new report to replace the need for a suite of tools aimed at specific issues such as page speed and mobile-friendliness.

The report can also filter URLs based on whether their metrics are rated “Poor,” “Needs Improvement,” or “Good.”
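That three-status grouping can be sketched in code. The “good” and “poor” cut-offs below are the thresholds Google has published for each metric (LCP in seconds, FID in milliseconds, CLS unitless); the function itself is just an illustration of the bucketing, not part of Search Console:

```python
# A sketch of the three-status bucketing used in the Search Console report.
# Threshold pairs are the published (good, poor) cut-offs per metric.

THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless score
}

def status(metric, value):
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

print(status("LCP", 2.1))   # Good
print(status("FID", 180))   # Needs Improvement
print(status("CLS", 0.3))   # Poor
```

A URL’s overall status in the report follows its weakest metric, so a single “Poor” reading is enough to flag the page.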

When Will The Update Happen?

While the update doesn’t really change all that much regarding how webmasters and SEO specialists should approach managing sites, the company sees it as important enough to give a significant notice ahead of the release. 

In fact, Google says these changes to the algorithm will not be happening before 2021. Additionally, the search engine will provide another notice 6 months before it is rolled out.

Google has announced it is rolling out a widespread update to its search engine algorithm, which it has simply titled the ‘January 2020 Core Update’.

The update began rolling out late yesterday and will affect how the search engine ranks all web pages around the world. However, as it is a “broad core” update, there is no specific issue or ranking signal being prioritized like in past mobile or speed-related updates.

Rather, Google’s recommendations for optimizing for this update remain the same as past core updates, which can be found here.

In the past, Google has described its broad core updates using a metaphor:

“One way to think of how a core update operates is to imagine you made a list of the top 100 movies in 2015. A few years later in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before.”

While the update is unlikely to radically shift search engine rankings, Google’s announcement of the update is relatively uncommon. Typically, Google prefers to quietly roll out broad updates and only confirm core updates when they relate to specific issues or are widely recognized.

This may signal that Google expects relatively large impacts on some search results, though it will take some time for the full impact of the update to become apparent.

A new study suggests that although high-ranking sites may be well optimized for search engines, they are failing to make their sites accessible to a large number of actual people – specifically, those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, built with the latest web technologies, and relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tools, the average overall score for accessibility for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score analyzes a number of issues that go unnoticed by many users but are hugely important for those with disabilities or impairments – such as color contrast and the presence of alt text to provide context or understanding for visual elements.

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”
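One of the checks mentioned above, images missing alt text, is simple enough to approximate yourself. The sketch below is an illustration using Python’s standard-library HTML parser, not Lighthouse’s actual audit code:

```python
# A small sketch of one accessibility check: finding <img> tags that lack
# alt text, which screen readers rely on to describe images. This is an
# illustration, not how Lighthouse itself implements the audit.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Flag images whose alt attribute is absent or empty
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

html = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
checker = MissingAltChecker()
checker.feed(html)
print(checker.missing)  # ['chart.png']
```

Running a pass like this over your templates is a quick way to catch the most common accessibility gaps before a full Lighthouse audit.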

Are you afraid typos or grammatical errors in your blogs might be hurting your Google ranking? According to Google Webmaster Trends Analyst John Mueller, worry no more.

The good news is typos won’t hurt your search rankings. The bad news is they may still hurt you in other ways.

Responding to a Twitter user who believed that errors in content can hurt your Google presence by getting content marked as low quality, Mueller explained that Google doesn’t actually care that much.

“It’s always good to fix known issues with a site, but Google’s not going to count your typsos (sic),” Mueller wrote.

While that might be a relief for many, there is still the obvious issue of how actual people perceive content with typos. People are prone to forgive a mistake here and there, but error-filled or poorly written content is going to be dismissed by most.

Poorly written content comes off as unprofessional and won’t help build your authority like well-edited, well-composed content. So, while you might be able to get away with some typos on Google, it always pays to take the time to edit and revise anything you are going to publish under your company name before the public ever gets to see it.