
One of the most frustrating aspects of search engine optimization is the time it takes to see results. In some cases, changes start to show up in Google’s search results within just a few hours. In others, you can spend weeks waiting for new content to be indexed, with no indication of when Google will get around to your pages.

In a recent AskGooglebot session, Google’s John Mueller said this huge variation in the time it takes for pages to be indexed is to be expected for a number of reasons. However, he also provided some tips for speeding up the process so you can start seeing the fruits of your labor as soon as possible.

Why Indexing Can Take So Long

In most cases, Mueller says sites that produce consistently high quality content should expect to see their new pages get indexed within a few hours to a week. In some situations, though, even high quality pages can take longer to be indexed due to a variety of factors.

Technical issues can pop up that delay Google’s ability to spot your new pages or prevent indexing entirely. Additionally, there is always the chance that Google’s systems are simply tied up elsewhere and need time to get to your new content.

Why Google May Not Index Your Page

It is important to note that Google does not index everything. In fact, there are plenty of reasons the search engine might not index your new content.

For starters, you can just tell Google not to index a page or your entire site. It might be that you want to prioritize another version of your site or that your site isn’t ready yet. 
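For example, the most common way to tell Google not to index a specific page is a robots meta tag in that page’s head. A minimal sketch (nothing else on the page needs to change):

    <!-- Placed in the <head> of any page you do not want to appear in Google's index. -->
    <meta name="robots" content="noindex">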

The search engine also excludes content that doesn’t bring sufficient value. This includes duplicate content, malicious or spammy pages, and websites that mirror other existing sites.

How To Speed Up Indexing

Thankfully, Mueller says there are ways to help speed up the indexing of your content.

  • Prevent server overloading by ensuring your server can handle the traffic coming to it. This ensures Google can get to your site in a timely manner. 
  • Use prominent internal links to help Google’s systems navigate your site and understand which pages are most important (see the sketch after this list).
  • Avoid unnecessary URLs to keep your site well organized and make it easy for Google to spot new content.
  • Publish consistently high-quality content that provides real value for users. The more important Google thinks your site is to people online, the higher the priority your new pages will get for indexing and ranking.
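To illustrate the internal-linking point above, a prominent link from a page Google already crawls regularly (such as your home page or main navigation) is often how its crawler discovers new content in the first place. A minimal sketch with placeholder URLs:

    <!-- Main navigation on a frequently crawled page; all URLs below are placeholders. -->
    <!-- A prominent link to the new page helps Google discover and prioritize it sooner. -->
    <nav>
      <a href="/services/">Services</a>
      <a href="/blog/">Blog</a>
      <a href="/blog/new-buyers-guide/">New: Buyer's Guide</a>
    </nav>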

For more about how Google indexes web pages and how to speed up the process, check out the full AskGooglebot video below:

In a Google Search Central SEO session recently, Google’s John Mueller shed light on a way the search engine’s systems can go astray – keeping pages on your site from being indexed and appearing in search. 

Essentially, the issue stems from Google’s predictive approach to identifying duplicate content: the system can flag pages as duplicates based on URL patterns alone, without ever examining their content.

Google uses this predictive system to make crawling and indexing more efficient by skipping over content that is just a copy of another page. By leaving these pages out of the index, Google is less likely to show repetitious content in its search results, and its indexing systems can reach other, more unique content more quickly.

Obviously, the problem is that content creators can unintentionally trigger these predictive systems when publishing unique content on similar topics, leaving quality content out of the index.

John Mueller Explains How Google Could Misidentify Duplicate Content

In a response to a question from a user whose pages were not being indexed correctly, Mueller explained that Google uses multiple layers of filters to weed out duplicate content:

“What tends to happen on our side is we have multiple levels of trying to understand when there is duplicate content on a site. And one is when we look at the page’s content directly and we kind of see, well, this page has this content, this page has different content, we should treat them as separate pages.

The other thing is kind of a broader predictive approach that we have where we look at the URL structure of a website where we see, well, in the past, when we’ve looked at URLs that look like this, we’ve seen they have the same content as URLs like this. And then we’ll essentially learn that pattern and say, URLs that look like this are the same as URLs that look like this.”

He also explained how these systems can sometimes go too far and Google could incorrectly filter out unique content based on URL patterns on a site:

“Even without looking at the individual URLs we can sometimes say, well, we’ll save ourselves some crawling and indexing and just focus on these assumed or very likely duplication cases. And I have seen that happen with things like cities.

I have seen that happen with things like, I don’t know, automobiles is another one where we saw that happen, where essentially our systems recognize that what you specify as a city name is something that is not so relevant for the actual URLs. And usually we learn that kind of pattern when a site provides a lot of the same content with alternate names.”

How Can You Protect Your Site From This?

While Google’s John Mueller wasn’t able to provide a foolproof solution or prevention for this issue, he did offer some advice for sites that have been affected:

“So what I would try to do in a case like this is to see if you have this kind of situations where you have strong overlaps of content and to try to find ways to limit that as much as possible.

And that could be by using something like a rel canonical on the page and saying, well, this small city that is right outside the big city, I’ll set the canonical to the big city because it shows exactly the same content.

So that really every URL that we crawl on your website and index, we can see, well, this URL and its content are unique and it’s important for us to keep all of these URLs indexed.

Or we see clear information that this URL you know is supposed to be the same as this other one, you have maybe set up a redirect or you have a rel canonical set up there, and we can just focus on those main URLs and still understand that the city aspect there is critical for your individual pages.”
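For reference, the rel canonical Mueller mentions is a single link element in the head of the near-duplicate page, pointing Google at the version you want indexed. A minimal sketch with placeholder URLs:

    <!-- On the "small city" page whose content duplicates the "big city" page. -->
    <!-- Both the domain and the path are placeholders. -->
    <link rel="canonical" href="https://www.example.com/locations/big-city/">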

It should be clarified that duplicate content or pages impacted by this problem will not hurt the overall SEO of your site. So, for example, having several pages tagged as being duplicate content won’t prevent your home page from appearing for relevant searches. 

Still, the issue has the potential to gradually decrease the efficiency of your SEO efforts, not to mention making it harder for people to find the valuable information you are providing. 

To see Mueller’s full explanation, watch the video below:

With the announcement that Google will begin including the “Core Web Vitals” (CWV) metrics in its search engine algorithm starting next year, many are scrambling to make sense of what exactly these metrics measure and how they work.

Unlike metrics such as “loading speed” or “dwell time” which are direct and simple to understand, Core Web Vitals combine a number of factors which can get very technical.

To help you prepare for the introduction of Core Web Vitals as a ranking signal next year, Google is sharing a comprehensive guide to what the CWV metrics measure and how they can affect your website.

What Are Core Web Vitals?

The first thing to understand is what exactly Core Web Vitals are. Simply put, CWV are a combination of three specific metrics assessing your page’s loading performance, interactivity, and visual stability. The metrics look very technical at first, but the gist is that your site needs to load quickly and provide a stable, easy-to-use experience. As for the specifics, Core Web Vitals include the following (a small measurement sketch follows the list):

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
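If you want to see how your own pages score on these three metrics, the open-source web-vitals JavaScript library can report them from real visits. Below is a minimal sketch, not an official requirement from Google; the import URL and version are assumptions, and console.log stands in for whatever analytics call you actually use:

    <!-- Minimal field-measurement sketch using the open-source web-vitals library. -->
    <!-- The import URL and version are assumptions; load the library however you prefer. -->
    <script type="module">
      import { onLCP, onFID, onCLS } from 'https://unpkg.com/web-vitals@3?module';

      onLCP(console.log);  // Largest Contentful Paint, target under 2.5 seconds
      onFID(console.log);  // First Input Delay, target under 100 milliseconds
      onCLS(console.log);  // Cumulative Layout Shift, target under 0.1
    </script>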

Importantly, in the new guide, Google reaffirmed its intention to start using Core Web Vitals as a ranking signal in 2021. 

“Starting May 2021, Core Web vitals will be included in page experience signals together with existing search signals including mobile-friendliness, safe-browsing, HTTPS-security, and intrusive interstitial guidelines.”

Does Every Page Need To Meet CWV Standards?

In the help document, Google explains that the Core Web Vitals standards it set out should be seen as a mark to aim for, but not necessarily a requirement for good ranking. 

Q: Is Google recommending that all my pages hit these thresholds? What’s the benefit?

A: We recommend that websites use these three thresholds as a guidepost for optimal user experience across all pages. Core Web Vitals thresholds are assessed at the per-page level, and you might find that some pages are above and others below these thresholds. The immediate benefit will be a better experience for users that visit your site, but in the long-term we believe that working towards a shared set of user experience metrics and thresholds across all websites, will be critical in order to sustain a healthy web ecosystem.

Will Core Web Vitals Make or Break Your Site?

It is unclear exactly how strongly the Core Web Vitals metrics will affect your site once they are implemented, but Google’s current stance suggests they will be a significant part of your ranking.

Q: How does Google determine which pages are affected by the assessment of Page Experience and usage as a ranking signal?

A: Page experience is just one of many signals that are used to rank pages. Keep in mind that intent of the search query is still a very strong signal, so a page with a subpar page experience may still rank highly if it has great, relevant content.

Other Details

Among the Q&A, Google also gives a few important details on the scope and impact of Core Web Vitals.

Q: Is there a difference between desktop and mobile ranking? 

A: At this time, using page experience as a signal for ranking will apply only to mobile Search.

Q: What can site owners expect to happen to their traffic if they don’t hit Core Web Vitals performance metrics?

A: It’s difficult to make any kind of general prediction. We may have more to share in the future when we formally announce the changes are coming into effect. Keep in mind that the content itself and its match to the kind of information a user is seeking remains a very strong signal as well.

The full document covers a wide range of technical issues which will be relevant for any web designer or site manager, but the big picture remains the same. Google has been prioritizing sites with the best user experience for years, and the introduction of Core Web Vitals only advances that effort. 

Find out more about Core Web Vitals here.

Google is adding a new set of ranking signals to its search engine algorithm in the coming year, according to an announcement this week. 

The search engine says it will begin factoring in “Core Web Vitals” as a ranking signal starting in May 2021, combining them with the existing user experience-related ranking signals.

Google has been measuring Core Web Vitals since earlier this year, assessing the speed, responsiveness, and stability of web pages. 

These factors are what Google calls the Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.

These signals will be joining the already announced page experience signals:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security
  • Intrusive interstitial guidelines

As Google explained in the announcement:

“These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web.”

Based on recent data assessments, this should concern the majority of websites out there. A study published in August suggests less than 15% of all websites would pass a Core Web Vitals assessment if the signals were implemented today. 

The search engine has also hinted at the potential to introduce new labels in search results, highlighting pages with the best user experience. Though nothing is set in stone, this would provide even more motivation for pages trying to maintain the best place in search results. 

For more information about updating your site for Core Web Vitals, you can explore Google’s resources and tools here.

It’s a question we all have dealt with at least once or twice, and one that rarely has a satisfying answer: “Why did my Google rankings suddenly drop?”

Sometimes, a simple audit will reveal a technical hiccup or issue that is dragging down your rankings. Just as often, though, everything appears to be working as it should, yet you are suddenly further down the page or not even on the first page anymore.

In this situation, Google’s John Mueller says there are four major reasons for sites to lose rankings. 

John Mueller Explains Why Sites Lose Rankings

In a recent Google Webmaster Central chat, Mueller was asked why a publisher who had ranked well for “seven or eight years” had suddenly lost rankings for three different sites. Notably, the person asking the question couldn’t find any signs of problems in their inbound or outbound links, and all the sites used the same keywords (they sell similar products by different brands). 

Of course, Mueller couldn’t get too specific with his answer because he didn’t have actual data or analytics on the sites. Still, he did his best to address four general reasons sites may suddenly rank worse.

1) Rankings Are Temporary

Once a site is ranking at the top for its ideal keywords, many site owners feel like they have accomplished their mission and will continue to rank there. Unfortunately, John Mueller says that rankings are malleable and change constantly.

Mueller explained:

“In general, just because the site was appearing well in search results for a number of years does not mean that it will continue to appear well in search results in the future.

These kinds of changes are essentially to be expected on the web, it’s a very common dynamic environment”

2) The Internet Is Always Changing

The reason why rankings are so prone to fluctuations is that the internet itself is always changing. New sites are being created every day, links might die, competitors might improve their own SEO, and people’s interests change.

Each and every one of these can have a big impact on the search results people see at any given time. 

As Mueller put it:

“On the one hand, things on the web change with your competitors, with other sites…”

3) Google Changes Its Algorithms

To keep up with the constantly changing internet, Google itself has to regularly overhaul how its search engine interprets and ranks websites. 

To give you one idea of how this plays out, a few years ago search results were absolutely dominated by “listicles” (short top 5 or top 10 lists). Over time, people got tired of the shallow information these types of lists provided and how easily they could be abused as clickbait. Google recognized this and tweaked its algorithm to better prioritize in-depth content focused on a single topic or issue. Now, though a listicle can still rank on Google, it is considerably harder than it used to be.

As Mueller simply explained:

“On the other hand, things on our side change with our algorithms in search.”

4) People Change

This is one that has been touched upon throughout the list Mueller gave, but it really gets to the heart of what Google does. What people expect out of the internet is constantly changing, and it is Google’s job to keep up with these shifts. 

In some cases, this can mean that people outright change how they search. For example, simple keywords like “restaurants near me” or “fix Samsung TV” were the main tool people used to find information for years and years. As voice search has become widespread and people have gotten more accustomed to using search engines all the time, queries have expanded to frequently include full sentences or phrases like “What is the best Chinese restaurant in midtown?”

At the same time, what people expect out of the same queries is also shifting with technological innovation and content trends. 

Mueller describes the situation by saying:

“And finally on the user side as well, the expectations change over time. So, just because something performed well in the past doesn’t mean it will continue to perform well in search in the future.”

Always Be Monitoring and Improving

The common theme behind all of these reasons is that sites lose rankings when they stand still while the world moves past them. To maintain your high rankings, your site has to be constantly in motion – moving with the trends and providing the content users want and expect at any given time.

This is why successful sites are also constantly monitoring their analytics to identify upcoming shifts and respond to any drops in rankings as soon as they happen.

If you want to see the full response, watch the video below (it starts with Mueller’s response but you can choose to watch the entire Webmaster Central office-hours discussion if you wish).

When creating content to help your SEO, many people believe they should aim for an “ideal” word count. The perfect number has ranged from 300 to 1,500 words per post depending on when and who you ask. There’s just one problem – Google’s leading experts say there is no perfect word count.

Why Do Word Counts Seem Important?

Since Google is relatively tight-lipped about the exact recipe it uses to rank sites in its search engine, SEO experts have traditionally had to rely on their own data to understand the inner workings of the search engine.

Sometimes, this information is later confirmed. Marketing experts had long believed that site speed was an important ranking signal for Google before the company confirmed its impact.

The problem is this approach relies strongly on correlation – which can be unreliable or lead to incorrect conclusions.

This is also why the “ideal” word counts recommended by “experts” tend to vary so wildly. When you have to rely on relatively limited data (at least compared to Google’s), the conclusions drawn from it can easily be skewed.

This is where Google’s John Mueller comes in.

What Google Has To Say

The company’s leading experts have repeatedly denied that they consider word counts to be an important ranking signal. Some have suggested it is lightly considered, but the impact is negligible compared to other factors like keyword relevance or backlinks to the page.

The latest Googler to speak out about the issue is John Mueller, Webmaster Trends Analyst at Google.

In a recent tweet, Mueller used a simple analogy to explain why focusing on word counts is the wrong approach.

Simply put, focusing on how long each piece of content is puts the attention in the wrong place. If you write long posts simply for the sake of hitting a total word count, there is a high risk of drifting off-topic or including irrelevant details.

The better approach is to create content with the goal of answering a specific question or responding to a specific need. Then, write until you’ve provided all the relevant information – whether it takes 300 or 1,500 words to do so.

In the latest episode of Google’s “Search for Beginners” series, the company focused on 5 things everyone should consider for their website.

While it is relatively brief and to the point, the video shares insight into the process of ranking your site on Google and ensuring smooth performance for users across a wide range of devices and platforms.

Specifically, Google’s video recommends:

  1. Check if your site is indexed: Perform a search on Google for “site:[yourwebsite.com]” to ensure your site is being properly indexed and included in search results. If your site isn’t showing up, it means there is an error keeping your site from being crawled or indexed.
  2. Provide high quality content: Content is essential for informing users AND search engines about your site. Following the official webmaster guidelines and best practice documents will help your site rank better and improve overall traffic.
  3. Maximize performance across all devices: Most searches now occur on mobile devices, so it is important that your site loads quickly and displays properly on every device (see the snippet after this list). You can check whether your site is mobile friendly using Google’s online tool here.
  4. Secure your website: Upgrading from HTTP to HTTPS helps protect your users’ information and limits the chance of bad actors manipulating your site.
  5. Hire an SEO professional: With increasingly competitive and fast-changing search results pages, Google recommends hiring an outside professional to assist you.
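As a quick illustration of the mobile point in item 3, the usual starting place for a site that renders poorly on phones is a responsive viewport declaration. This particular tag is not something the video calls out, just a common first step; a minimal sketch:

    <!-- Placed in the <head>; tells mobile browsers to render at device width -->
    <!-- instead of a zoomed-out desktop layout. -->
    <meta name="viewport" content="width=device-width, initial-scale=1">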

The video actually implies that hiring an SEO professional is so important that Google will be devoting significantly more time to it in the future. Here’s what the presenter had to say:

“Are you looking for someone to work on [your website] on your behalf? Hiring a search engine optimizer, or “SEO,” might be an option. SEOs are professionals who can help improve the visibility and ranking of your website. We’ll talk more about hiring an SEO in future episodes.”

This month, Google announced that more than half of all web pages in its search results around the globe are being pulled from its mobile-first index.

That means that the majority of pages being shown in Google’s search results were crawled, indexed, and ranked based on the mobile version of that page. As such, it marks a huge turning point for the increasing mobile-emphasis in web design and optimization.

What exactly is mobile-first indexing?

Over the past two years, Google has established a second, distinct index which prioritizes mobile pages and search results. This came as more than half of all search results were originating from mobile devices, rather than desktop computers.

Gradually, Google has expanded this index with the intent of eventually making it the primary search index.

With the launch of this index, Google also changed how it approached website indexing. Rather than defaulting to the desktop version of a page to assess its optimization and search value, the search engine began indexing mobile pages over their desktop counterparts. Thus, Google began its process of “mobile-first indexing.”

Is your site in Google’s mobile index?

If your site has been added to Google’s mobile-first index, you will likely have been notified within Google Search Console. Simply check your messages to see if your site has been migrated over.

If your site has not been migrated over, there is a chance that Google is having issues viewing the mobile version of your site, has found significant discrepancies between the mobile and desktop versions of your site, or has decided your mobile version is not up to snuff.

You should probably take the time to review the mobile version of your site to ensure it is properly optimized and laid out for Google’s search crawlers. You should also ensure that both versions of your site are largely similar, as Google prefers websites with parity across devices.

While Google is never going to reveal the “secret recipe” it uses to rank the billions and billions of web pages around the web, the company still wants to help you ensure your site is performing as well as possible.

To help with this, Google has launched a new tool designed to evaluate your website and rate how it follows the company’s SEO best practices and guidelines.

The tool is currently in open beta, but is available to all webmasters at web.dev.

According to the search engine, the tool is the end result of more than 10 years of learning and iteration.

“As the bar for high-quality experience continues to rise, users are quickly disappointed in a web experience that doesn’t deliver. And then they’re gone.

“We believe, however, the web now has the capabilities to overcome that challenge—to give all users the best possible experience wherever they are.”

The most useful part of the tool for most webmasters will be its SEO assessment, but it also includes audits for performance, accessibility, and more.

Specifically, web.dev can evaluate a website’s:

  • Performance: Audits for metrics like first paint and time to interactive to determine lag.
  • PWA: Assesses your page against the baseline Progressive Web App Checklist.
  • Best Practices: Looks for everything from HTTPS usage to correct image aspect ratios.
  • SEO: Checks for best practices to ensure your site is discoverable.
  • Accessibility: Checks for common issues that may prevent users from accessing your content.

All you have to do to evaluate your own site is enter the URL.

Along with some simple images rating your site’s performance, you will also be given a list of recommended improvements you can make, listed in order of how important they are. The recommendations at the top of the list will have the biggest impact on your site, while those at the bottom are more minute changes that will have little effect on your ranking – though they may improve your site’s overall performance.

Web.dev also provides detailed downloadable reports which can be printed or digitally shared with site owners, providing an easy-to-understand breakdown of your site’s performance on Google.

The tool generates an up-to-date report on a daily basis, so you can also quickly see how any changes you make affect your site’s performance.

Not long ago, it seemed like every business website had a “Testimonials” page filled with reviews and references from either past customers or fellow members of their industry. If you have a keen eye, though, you might have noticed these pages are slowly falling out of use in favor of posting your Google, Yelp, and other online reviews on your site.

The practice has led to some confusion, as many experts claimed that reposting your online reviews from across the web on your own site could be potentially dangerous for search engine optimization. There have even been suggestions that it could lead to Google penalties.

Now, you can breathe easy and share your online reviews with pride, as Google webmaster trends analyst John Mueller has confirmed that it is totally fine to highlight your reviews on your company website – with one exception.

While posting your reviews on your website is acceptable, Mueller warns that you cannot use review structured data on these reviews.

As Mueller explained on Twitter:

“From a Google SEO point of view, I don’t see a problem with that. I imagine the original is more likely to rank for that text, but if you use that to provide context, that’s fine (it shouldn’t be marked up with structured data though).”

Mueller then went on to explain that review structured data is intended for reviews “directly produced by your site,” and that using it on third-party reviews reposted on your own site would go against Google’s guidelines.
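For reference, “review structured data” means schema.org Review markup like the minimal sketch below (the business, author, and rating values are placeholders). Per Mueller’s guidance, it should only describe reviews collected directly on your own site, never reviews copied over from Google, Yelp, or other third-party platforms:

    <!-- Minimal schema.org Review markup (JSON-LD); all names and values are placeholders. -->
    <!-- Only use this for first-party reviews submitted directly on your site. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Review",
      "itemReviewed": {
        "@type": "LocalBusiness",
        "name": "Example Plumbing Co."
      },
      "author": { "@type": "Person", "name": "Jane D." },
      "reviewRating": { "@type": "Rating", "ratingValue": "5" },
      "reviewBody": "Fast, friendly service and a fair price."
    }
    </script>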