A representative from Google announced the search engine began rolling out a broad core update (appropriately titled the June 2021 Core Update) this week. Surprisingly, the announcement also revealed a second update is expected to roll out next month. 

Note that this is not the Page Experience Update which Google is planning to launch in mid-June.

Typically, Google rolls out a broad core update every few months. For example, the last update before this came nearly six months ago, in December 2020. The gap between updates before that was even longer, with the previous update arriving in May 2020. 

Obviously, this raises some questions about why the company felt the need to release a two-part algorithm update now, rather than waiting to roll it all out at once next month. 

Google being Google, details about what the broad core updates will change are relatively scant. Still, here’s what we do know:

Why Two Core Updates?

Based on statements from Google liaison Danny Sullivan and others, it seems the search engine simply didn’t want to sit on some of the completed updates while it waited for the rest to be finalized. 

Sullivan did note, however, that some effects from the first part of the update may be temporary as the second part rolls out. 

“Of course, any core update can produce drops or gains for some content. Because of the two-part nature of this release, it’s possible a very small slice of content might see changes in June that reverse in July.”

What You Should Expect

As with most broad core updates, Google is giving somewhat mixed signals about how big the impact will be. 

On one hand, the company says most sites won’t notice any changes to their presence in search results. At the same time, Google says the update will produce “some widely noticeable effects.”

From past experience, we can predict that sites producing quality content and keeping up with Google’s overall guidelines will be largely unaffected. Those within more controversial or less reputable industries (online gambling, some medical niches, law, etc.) may be more likely to see some fallout even if they have been doing everything “right”. 

Those using tactics which can be seen as more “spammy” – such as republishing content, using user-generated content in overbearing ways, or relying on questionable guest-blogging practices – may also see some negative results as the update rolls out.

Ultimately, we will all have to wait and see as the update finishes, which Google says should take about two weeks. 

What To Do If You Are Affected

Perhaps one of the most frustrating things about broad core updates is that you can be impacted even if you aren’t doing anything ostensibly “wrong”. Some pages may see negative ranking shifts despite following all of Google’s guidance. 

This makes recovering a tricky proposition, but Google has provided some advice for brands negatively impacted. 

Specifically, the company suggests asking yourself the following questions about your brand:

Content and Quality Questions

  • Does the content provide original information, reporting, research or analysis?
  • Does the content provide a substantial, complete or comprehensive description of the topic?
  • Does the content provide insightful analysis or interesting information that is beyond obvious?
  • If the content draws on other sources, does it avoid simply copying or rewriting those sources and instead provide substantial additional value and originality?
  • Does the headline and/or page title provide a descriptive, helpful summary of the content?
  • Does the headline and/or page title avoid being exaggerating or shocking in nature?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Would you expect to see this content in or referenced by a printed magazine, encyclopedia or book?

Expertise Questions

  • Does the content present information in a way that makes you want to trust it, such as clear sourcing, evidence of the expertise involved, background about the author or the site that publishes it, such as through links to an author page or a site’s About page?
  • If you researched the site producing the content, would you come away with an impression that it is well-trusted or widely-recognized as an authority on its topic?
  • Is this content written by an expert or enthusiast who demonstrably knows the topic well?
  • Is the content free from easily-verified factual errors?
  • Would you feel comfortable trusting this content for issues relating to your money or your life?

Presentation and Production Questions

  • Is the content free from spelling or stylistic issues?
  • Was the content produced well, or does it appear sloppy or hastily produced?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Does the content have an excessive amount of ads that distract from or interfere with the main content?
  • Does content display well for mobile devices when viewed on them?

Comparative Questions

  • Does the content provide substantial value when compared to other pages in search results?
  • Does the content seem to be serving the genuine interests of visitors to the site or does it seem to exist solely by someone attempting to guess what might rank well in search engines?

While not hard and fast guidance, these questions can help you evaluate your site and find areas to improve upon before the next broad core update. 

Thankfully, in this case we know the next update is coming quite soon – July 2021 – so there is a chance any negative effects from the ongoing update will be short-lived. 

Have you gotten your brand’s website ready for the upcoming Google Page Experience ranking signal update? 

If not, Google Developer Martin Splitt says there’s no need to panic. 

In an interview on the Search Engine Journal Show on YouTube, host Loren Baker asks Splitt what advice he would give to anyone worried their site isn’t prepared for the update set to launch in mid-June. 

Giving a rare peek at the expected impact of the impending update, Splitt reveals the Page Experience signal update isn’t going to be a massive gamechanger. Instead, it is more of a “tiebreaker.”

As a “lightweight ranking signal”, just optimizing your site’s Page Experience metrics isn’t going to launch you from the back of the pack to the front. If you are competing with a site with exactly the same performance in every other area, however, this will give you the leg up to receive the better position in the search results. 

Don’t Ignore The Update

While the Page Experience update isn’t set to radically change up the search results, Splitt says brands and site owners should still work to optimize their site with the new signals in mind. 

Ultimately, making your page faster, more accessible on a variety of devices, and easier to use is always a worthwhile effort – even if it’s not a major ranking signal. 

As Splitt says:

“First things first, don’t panic. Don’t completely freak out, because as I said it’s a tiebreaker. For some it will be quite substantial, for some it will not be very substantial, so you don’t know which bucket you’ll be in because that depends a lot on context and industry and niche. So I wouldn’t worry too much about it.

I think generally making your website faster for users should be an important goal, and it should not just be like completely ignored. Which is the situation in many companies today that they’re just like ‘yeah, whatever.’”

As for how he thinks brands should approach the update, Splitt recommended focusing on new projects and content rather than prioritizing revamping your entire site upfront. 

“… For new projects, definitely advise them to look into Core Web Vitals from the get-go. For projects that are already in maintenance mode, or are already actively being deployed, I would look into making some sort of plan for the mid-term future — like the next six months, eight months, twelve months — to actually work on the Core Web Vitals and to improve performance. Not just from an SEO perspective, but also literally for your users.”
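For site owners ready to start on that kind of plan, field measurement can begin with a small snippet. This is a minimal sketch using Google’s open-source web-vitals library loaded from a CDN; the version pin and `console.log` reporting are placeholder choices, and a real deployment would send the values to an analytics endpoint instead.

```html
<!-- Minimal sketch: collect Core Web Vitals from real users with the
     open-source web-vitals library. Function names follow version 3
     of the library; earlier versions used getCLS/getLCP/getFID. -->
<script type="module">
  import {onCLS, onLCP, onFID} from 'https://unpkg.com/web-vitals@3?module';

  // Each callback fires when its metric value is ready to report.
  // In production, send these values to your analytics endpoint
  // rather than logging them to the console.
  onCLS(console.log);
  onLCP(console.log);
  onFID(console.log);
</script>
```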

Much of the discussion focuses on the perspective of SEO professionals, but it includes several bits of relevant information for anyone who owns or manages a website for their business. 

To hear the full conversation, check out the video below from Search Engine Journal:

Google’s upcoming Page Experience ranking update – initially believed to be exclusive to mobile search – will also be coming to desktop search results in the future. 

The reveal came during part of Google’s annual big I/O event this week, by Google Search product manager Jeffrey Jose. 

Since its announcement, the Page Experience update – which will implement new ranking signals based on “Core Web Vitals” assessing the user-friendliness of a site – was expected to roll out only to Google’s mobile search results.

As Jose explained, however, the update will also be coming to desktop search – at a later date.

“Today I am happy to announce that we are bringing Page Experience ranking to desktop. While we’re launching Page Experience on mobile soon, we believe page experience is critical no matter the surface the user is browsing the web. This is why we’re working hard on bringing page experience ranking to desktop. As always we’ll be providing updated guidance, documentation, and tools along the way to help your pages perform at its best. Stay tuned for more details on this.”

The specific wording of the announcement suggests the desktop update may use its own set of unique or modified ranking signals or criteria. This is reasonable considering users are likely to have different usability expectations depending on which platform they are using. 

While the launch date of the desktop Page Experience update is unknown, the mobile version is still scheduled to begin rolling out in June and be completely implemented by August.

To learn more about the Page Experience update and to see the announcement for yourself, check out the video below:

It can be easy to take for granted how little spam shows up in the dozens of Google searches we make every day.

While we are almost always able to find what we need through the search engine without an abundance of malicious, copied, or just plain spammy websites, the search engine says it has been ramping up spam detection behind the scenes to fight the seemingly endless hordes of illicit or otherwise problematic sites from filling up its search results.

In fact, Google’s webspam report for 2020 says the search engine detected more than 40 billion pages of spam every day last year. This reflects a 60% increase from the year before.

How Google Search is Fighting Spam

It is possible there was a distinct increase in spammy sites last year, potentially due to disruptions and other changes brought about by the Covid pandemic. According to the search engine though, the bulk of this increase is the result of increased spam prevention efforts with the help of AI.

Artificial intelligence and machine learning have helped the company keep pace with new spam methods and are credited with allowing the search engine to reduce auto-generated or scraped content “by more than 80% compared to a couple of years ago.”

This AI-based approach also frees up Google’s manual action spam team to focus on more advanced forms of spam, such as hacked sites which were “still rampant in 2020.”

To show you how this approach works and helps filter out the bulk of webspam before it even gets added to Google’s indexes, the company shared a simple graphic:

COVID Spam and Misinformation

As with everyone, Google faced unprecedented situations in the past year as it responded to the COVID-19 pandemic. This included devoting “significant effort in extending protection to the billions of searches” related to the virus.

One part of this effort was instituting a “more about this result” feature, which adds context about a site before users click through to one of its pages. This is intended to help users avoid bad actors that popped up, especially during the early stages of the pandemic.

Additionally, the search engine says it worked to remove misinformation that could be dangerous during the course of the pandemic.

What This Means For You

Assuming you are a reputable professional in your industry, Google’s increased efforts to fight spam should only be a source of comfort. There have been fewer reports of sites being incorrectly targeted by these spam prevention methods in recent years, while the overall level of deceptive, spammy, or harmful sites in the search results has plummeted. 

All in all, this means a better experience for both users trying to find information and products, as well as brands fighting to reach new customers online.

For many small-to-medium businesses, appearing in search results around their local area is significantly more important than popping up in the results for someone halfway across the country. 

This raises the question, though: how many of the countless searches made every day are actually locally based?

We now have the answer to that question thanks to a new tool released by LocalSEOGuide.com and Traject Data.

What Percent Of Searches Are Local?

Working together, the companies analyzed over 60 million U.S. search queries and found that over a third (approx. 36%) of all queries returned Google’s local pack – indicating the search was location-based. 

Perhaps the biggest surprise from the data is that locally-based searches have remained largely consistent throughout the year. Following an uptick in early 2020 (likely driven by the coronavirus pandemic), the rate stayed around 36% over the course of the year. The only significant exception came in September, when the data shows a notable decrease in locally-driven searches. 

This data shows just how important it is for even strictly local brands to establish a presence online and optimize for search engines. Otherwise, you might be missing out on a big source of potential business.

Other Features In The Local Pack-O-Meter

Along with data on the appearance of local packs in Google search results, the Local Pack-O-Meter includes information on several other search features. These include:

  • Knowledge Graphs
  • “People Also Ask” Panels
  • Image Boxes
  • Shopping Boxes
  • Ads
  • Related Searches
  • And more

Though the current form of the tool doesn’t include ways to filter the information more selectively, there is plenty to take from it when planning which search features you need to prioritize and which can be put on the back burner. 

To explore the Local Pack-O-Meter for yourself, click here.

If you are an online retailer, you are no doubt familiar with Google’s wide array of special features built for online shopping. You are also probably aware of how confusing it can be to get included in these unique search results.

To help clarify this process and make it easier to get your products highlighted in Google’s search results, the search engine recently revealed some technical tips and tricks for e-commerce sites. 

Why It Takes Extra Work To Get In Google Shopping Results

The first question most business owners or site managers might have when they start trying to get their products included in Google Shopping results is “why do I have to do all this extra work?”

Google’s whole thing is analyzing sites and automatically delivering that information in its search results, right? Why can’t they just pull your product info when your pages get indexed?

The simple answer is that Google knows online retail changes very quickly and shoppers get very frustrated with out of date or inaccurate information. If this became a frequent problem, users would likely stop paying attention to Google’s product-related search results. 

While the search engine regularly re-indexes updated webpages, it can’t guarantee pages will be indexed fast enough to ensure information is up-to-date for searchers. 

Additionally, some features online retailers provide to help shoppers can make things a little confusing for search engines to understand. 

For example, Google says it still struggles with accurately telling the difference between these types of information:

  • Original Price vs. Discounted Price
  • Related Products vs. The Main Product Being Sold
  • Taxes or Shipping Costs vs. The Actual Product Price

This is why the search engine asks online retailers to help provide this information for Google Shopping results.

Now, let’s get into the advice from Google Developer Advocate Alan Kent and how you can get your products into Google product showcases.

Two Ways To Give Google Your Product Data

In the latest Lightning Talks video, Kent discusses two different ways site managers can get their product information to Google. 

The first method is by using structured data – special code embedded in pages that provides Google with additional information not typically conveyed through regular site markup. 

This is generally seen as the advanced approach because it requires significant knowledge of coding and the latest structured data techniques. 
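As a sketch of what that looks like in practice, here is a minimal schema.org `Product` block in JSON-LD form, placed in a page’s HTML; the product name, image URL, and price are hypothetical placeholders.

```html
<!-- Minimal sketch of Product structured data in JSON-LD form,
     using schema.org vocabulary. All values are hypothetical. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://www.example.com/widget.jpg",
  "description": "A sample product listing.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Note how the markup separates the offer price and currency into their own fields – exactly the kind of distinction (price vs. tax, main product vs. related products) Google says it struggles to infer from page content alone.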

The other method covered by Kent is by directly providing product data through Google Merchant Center, which can be done with:

  • A feed of all product data manually submitted to the search engine.
  • An API developed to update products individually as changes are made on your site. 
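For the feed route, this is a sketch of what a minimal entry might look like in the RSS 2.0 format Merchant Center accepts; all values are hypothetical placeholders, and real feeds typically carry many more attributes.

```xml
<!-- Minimal sketch of a Merchant Center product feed (RSS 2.0 with
     Google's "g:" namespace). Values are hypothetical placeholders. -->
<rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
  <channel>
    <title>Example Store</title>
    <item>
      <g:id>WIDGET-001</g:id>
      <g:title>Example Widget</g:title>
      <g:price>19.99 USD</g:price>
      <g:availability>in stock</g:availability>
      <g:link>https://www.example.com/widget</g:link>
    </item>
  </channel>
</rss>
```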

For more information, check out the guide provided by Google.

Conclusion

While providing product data to search engines is essential for appearing in these specific product-centric search results, the company emphasizes that these practices don’t replace traditional SEO.

“Remember that SEO still matters for organic search. Make your product details, such as images and descriptions, appealing to your customers.”

If you want to watch the full explanation from Kent, it is available below:

Throughout 2020, approximately 65% of searches made on Google were “zero-click searches”, meaning that the search never resulted in an actual website visit.

Zero-click searches have been steadily on the rise, reaching 50% in June 2019 according to a study published by online marketing expert Rand Fishkin and SimilarWeb.

The steep rise in these types of searches between January and December 2020 is particularly surprising because it was widely believed zero-click searches were largely driven by mobile users looking for quick answers. Throughout 2020, however, most of us were less mobile than ever due to Covid restrictions, social distancing, and quarantines.

The findings of this latest report don’t entirely disprove this theory, though. Mobile devices still saw the majority of zero-click Google searches. On desktop, less than half (46.5%) were zero-click searches, while more than three-fourths (77.2%) of searches from mobile devices did not result in a website visit.

Study Limitations

Fishkin acknowledges that his reports do come with a small caveat. Each analysis used different data sources and included different searching methods, which may explain some of the variance. Additionally, the newer study – which included data from over 5.1 trillion Google searches – had access to a significantly larger data pool compared to the approximately one billion searches used in the 2019 study.

“Nonetheless, it seems probable that if the previous panel were still available, it would show a similar trend of increasing click cannibalization by Google,” Fishkin said in his analysis.

What This Means For Businesses

The most obvious takeaway from these findings is that people are increasingly finding the information they are looking for directly on the search results pages, rather than needing to visit a webpage for more in-depth information.

It also means that attempts to regulate Google are largely failing.

Many have criticized and even pursued legal action (with varying levels of success) against the search engine for abusing their access to information on websites by showing that information in “knowledge panels” on search results.

The argument is that Google is stealing copyrighted information and republishing it on their own site. Additionally, this practice could potentially create less reason for searchers to click on ads, meaning Google is contributing to falling click-through rates and making more money off of it.

Ultimately, Google is showing no signs of slowing down on its use of knowledge panels and direct answers within search results. To adjust to the rise of zero-click searches, brands should put more energy into optimizing their content to appear in knowledge panels (increasing your brand awareness) and diversify their web presence with social media activity to directly reach customers.

In a Google Search Central SEO session recently, Google’s John Mueller shed light on a way the search engine’s systems can go astray – keeping pages on your site from being indexed and appearing in search. 

Essentially, the issue comes from Google’s predictive approach to identifying duplicate content based on URL patterns, which can incorrectly flag unique pages as duplicates based on their URLs alone. 

Google uses the predictive system to increase the efficiency of its crawling and indexing by skipping over content that is just a copy of another page. By leaving these pages out of the index, Google reduces the chance of showing repetitious content in its search results and allows its indexing systems to reach other, more unique content more quickly. 

Obviously the problem is that content creators could unintentionally trigger these predictive systems when publishing unique content on similar topics, leaving quality content out of the search engine. 

John Mueller Explains How Google Could Misidentify Duplicate Content

In a response to a question from a user whose pages were not being indexed correctly, Mueller explained that Google uses multiple layers of filters to weed out duplicate content:

“What tends to happen on our side is we have multiple levels of trying to understand when there is duplicate content on a site. And one is when we look at the page’s content directly and we kind of see, well, this page has this content, this page has different content, we should treat them as separate pages.

The other thing is kind of a broader predictive approach that we have where we look at the URL structure of a website where we see, well, in the past, when we’ve looked at URLs that look like this, we’ve seen they have the same content as URLs like this. And then we’ll essentially learn that pattern and say, URLs that look like this are the same as URLs that look like this.”

He also explained how these systems can sometimes go too far and Google could incorrectly filter out unique content based on URL patterns on a site:

“Even without looking at the individual URLs we can sometimes say, well, we’ll save ourselves some crawling and indexing and just focus on these assumed or very likely duplication cases. And I have seen that happen with things like cities.

I have seen that happen with things like, I don’t know, automobiles is another one where we saw that happen, where essentially our systems recognize that what you specify as a city name is something that is not so relevant for the actual URLs. And usually we learn that kind of pattern when a site provides a lot of the same content with alternate names.”

How Can You Protect Your Site From This?

While Google’s John Mueller wasn’t able to provide a foolproof solution or prevention for this issue, he did offer some advice for sites that have been affected:

“So what I would try to do in a case like this is to see if you have this kind of situations where you have strong overlaps of content and to try to find ways to limit that as much as possible.

And that could be by using something like a rel canonical on the page and saying, well, this small city that is right outside the big city, I’ll set the canonical to the big city because it shows exactly the same content.

So that really every URL that we crawl on your website and index, we can see, well, this URL and its content are unique and it’s important for us to keep all of these URLs indexed.

Or we see clear information that this URL you know is supposed to be the same as this other one, you have maybe set up a redirect or you have a rel canonical set up there, and we can just focus on those main URLs and still understand that the city aspect there is critical for your individual pages.”
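As a sketch of the rel canonical signal Mueller describes, the near-duplicate “small city” page would carry a tag like this in its `<head>`, pointing at the “big city” page; the URLs here are hypothetical.

```html
<!-- On a hypothetical near-duplicate page such as
     https://www.example.com/locations/smallville/
     whose content matches the big-city page: -->
<link rel="canonical" href="https://www.example.com/locations/big-city/" />
```

A 301 redirect is the alternative signal Mueller mentions, appropriate when the duplicate page shouldn’t exist as its own URL at all.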

It should be clarified that duplicate content or pages impacted by this problem will not hurt the overall SEO of your site. So, for example, having several pages tagged as being duplicate content won’t prevent your home page from appearing for relevant searches. 

Still, the issue has the potential to gradually decrease the efficiency of your SEO efforts, not to mention making it harder for people to find the valuable information you are providing. 

To see Mueller’s full explanation, watch the video below:

Google My Business is expanding its performance report for business listings with a new breakdown of how people are finding your listing.

The new analytics section details whether people are coming to your listing from a mobile or desktop device, as well as whether they found you through Google Search or Maps.

How To Find The New Report

To access the report for your listing, first sign in and select the location or business you want to assess. Then, select the Insights tab on the left. On this page, you’ll find the new performance reports directly at the top.

Below, you can see an example of the report shared by Barry Schwartz from Search Engine Roundtable.

Within the performance report, you’ll find a section explaining “How people discovered you.”

On one side of the report, you’ll see the “People who viewed your business profile” section, while the right column shows the specific searches being used to find your page.

Learning More About Device and Source Reports

To coincide with the launch of these reports, Google has updated its help documents to add a section explaining the “users who viewed your profile” data.

As the document explains:

“A user can be counted a limited number of times if they visit your Business Profile on multiple devices and platforms such as desktop or mobile and Google Maps or Google Search. Per breakdown device and platform, a user can only be counted once a day. Multiple daily visits aren’t counted.”

There are also a few important details to keep in mind when viewing the report:

  • Since this metric represents the number of unique users, it may be lower than the number of views you find on Google My Business and in email notifications. 
  • Since the metric focuses on views of the Business Profile, as opposed to overall views of the Business on Google, it may also be lower than the number of views you find on Google My Business and in email notifications.

Insights like these don’t just help you improve your listings and optimization so they perform more effectively in search results. They can also help you understand your customers and their specific needs or behaviors, which may, in turn, allow you to provide better service for them.

Google My Business has officially launched a new label that highlights the number of years you’ve been in business within local search results.

The “years in business” label has been in testing over the past few years, and was quietly launched officially on February 9th, 2021.

While it is just a small label added to your listing, this could prove to be a significant way to differentiate yourself in the crowded “local pack” search results.

As Google put it in the announcement, you can now “add an opening date to your Business Profile to tell customers when your business first opened, or will open, and its address.”

To get an idea of what the label looks like, Barry Schwartz from RustyBrick (and who first noticed the launch of the label) took a screenshot of his own business listing with the new tag.

Source: Barry Schwartz/RustyBrick, Inc.

How To Get The ‘Years in Business’ Tag

Adding this label to your own Google My Business listing is relatively simple. All you have to do is add the open date of your business within your GMB profile. 

To do this, just sign into your GMB account, click the location you want to update, then select the “info” option in the menu. From there, click “add opening date”, enter the date you opened up shop, and voila. The label should be added to your local listing within the next few days.

“I’ve Been Seeing This Label For Months”

Many might have noticed that Google has been slowly adding this label to eligible listings over the past year. Users first spotted the tag way back in September of 2020, with a larger rollout in November.

Still, this week marks the official launch of the feature for all Google My Business listings.

How This Helps You

Thanks to bad actors listing non-existent or questionable businesses within Google My Business, it has become more important than ever to visibly show that you are a real, active, and trustworthy business within your listing.

This feature allows you to quickly do this by showing you have been a part of your community for years – if not decades – and won’t be going anywhere anytime soon.