Google is preparing to release its next core algorithm update “in the coming weeks,” likely signaling a major shakeup to search results in the near future. 

The reveal came from Google Search Liaison Danny Sullivan, a longtime SEO journalist, whose lengthy message about the coming update was covered by Search Engine Roundtable.

When Is The Algorithm Update Coming?

In his message, Sullivan says Google’s teams haven’t settled on an exact date for the core update because testing is still underway. Even so, Sullivan felt confident enough to say he expects the update to roll out in the coming weeks, even if it needs further tweaking after testing.

Notably, Sullivan says he had considered posting similar updates before the release of past core algorithm updates but did not because of the potential for them to be pushed back. In this instance, he is apparently more confident the update will pass through testing relatively quickly.

Sullivan’s full post reads:

“We’d tell you when the next core update will be if we knew. But we don’t know exactly yet, that’s all. These aren’t scheduled to a particular day. The ranking team makes changes, tests those, evaluates those and eventually we get a launch date. There have been many times I could have said “Core update next week!” because everything was on track for that to happen, but then there’s a need to do a bit more work or other things that might cause a pushback. I would expect we’ll see one in the coming weeks, because that fits in with our general cycle. But precisely when, that’s just not known yet.”

Past Major Algorithm Updates

This upcoming core algorithm update will be the first since the one that began rolling out in March 2024 and completed in April. That longer-than-normal rollout reflected the fact that it was the largest core algorithm update to date. 

Before that, Google released a slew of smaller updates in August, October, and November of 2023. 

Sullivan did not give any insight into how big the upcoming update might be or what might be targeted by the update. For now, we can only assume that this update is aimed at reducing spam and improving the relevance of search results. 

We will update you as more information about the upcoming core algorithm update is revealed or when it begins rolling out to the public. 

Listing menu items in your Google Business Profile and having a busy shop appear to be powerful ways to boost your business’s local Google rankings, according to a newly published set of tests by SEO expert Claudia Tomina.

Google Business Profiles are the central way local shoppers find new and nearby businesses, so keeping your listing up to date and as full of information as possible is crucial. 

Though officially unconfirmed, Tomina’s tests provide strong evidence that menu items and how busy a business is are ranking signals for local Google searches. 

How Menu Items May Impact Rankings

According to the report, adding specific menu items on Google can help your restaurant rank for searches for those foods. 

In one example, Tomina added “caesar salad” to the menu items for a restaurant’s account and clearly saw an uptick in search position for the query “best caesar salad near me.” The addition didn’t just give her a small bump in the rankings. The restaurant went from search position 71 to the very top position.

How Busier Locations May Impact Rankings

Tomina’s tests also found that stores or restaurants that are busier during Google’s popular times windows tend to rank higher than less busy establishments.

As she wrote in her report, “My research shows that if a business is busier at a specific time of day then they outrank their competitors.”

In the charts below, you can see how rankings for the keyword “caesar salad near me” improved during a business’s popular times of day.

The Big Picture

Local search results can be a highly competitive area for many businesses. Any edge that you can get on your competition can be the difference between getting a lead or missing out on it to another local business. 

If you haven’t updated your menu on Google, now is the time to do so.

A lot has been made of the importance of new content when it comes to ranking on Google. But, what’s so bad about older content? Are all old posts bad for your site? Should you be regularly removing old posts?

Thankfully, Google’s John Mueller and Lizzi Sassman addressed this recently on an episode of the Search Off The Record podcast.

In the episode, Mueller and Sassman talked at length about content decay, a term referring to content that becomes outdated or irrelevant over time, how it affects your site, and what you should do about it.

What Is Content Decay According to Google?

While “content decay” isn’t a commonly used term within the SEO community, it aptly describes some types of content. Specifically, this is how Mueller defines content decay:

“[Content decay is] something where, when you look at reference material, it’s kind of by definition old. People wrote about it because they’ve studied it for a really long time, so it’s an old thing. But, that doesn’t mean it’s no longer true or no longer useful.”

Is Content Decay Inherently Bad?

As Mueller and Sassman explained, content decay isn’t inherently bad. Even some posts that may seem outdated, such as old event announcements or product updates, shouldn’t be treated as a problem. Sassman recommends keeping this content around for historical accuracy.

As an example, Sassman pointed to Google’s help pages, which still use the outdated term “Webmaster Tools.”

“If we went back and we replaced everything, like where we said Google Webmasters or Webmaster Tools, if we replaced Search Console everywhere we said Webmaster Tools, it would be factually incorrect.”

What Should You Do About Content Decay?

It might be tempting to simply delete outdated content, but Mueller recommends going back and adding context to it instead. This way, you retain the value of past content while making clear which aspects are now irrelevant and preventing confusion among readers. 

As he stated:

“People come to our site for whatever reason, then we should make sure that they find information that’s helpful for them and that they understand the context. If something is old and they search for it, they should be able to recognize, ‘Oh, maybe I have to rethink what I wanted to do because what I was searching for doesn’t exist anymore or is completely different now.’”

For more, listen to the full episode of Google’s Search Off The Record podcast below:

Every brand wants to put its best foot forward, and doing that online means understanding canonical URLs. So, today we are going to talk about what canonical URLs are, why your pages may have a canonical version, and how Google chooses which page is the canonical one. 

What Are Canonical URLs?

A canonical URL or web page is the version of a page selected to be indexed by Google when there are multiple versions of the page. 

Google uses this version to rank the web page and display it in search results, preventing duplicate search listings. 

As the owner of the website, you have some control over which pages are chosen to be canonical URLs. As we will get into further down, though, Google doesn’t always select the page you believe should be the canonical version.

Before we get to that, let’s take a moment to talk about the legitimate reasons why you may have duplicate versions of a page.

5 Reasons For Having Duplicate Web Pages

According to Google’s official documentation and guidelines about canonical webpages, the search engine believes there are five legitimate reasons a webpage may have multiple versions. 

  1. Region variants: for example, a piece of content for the USA and the UK, accessible from different URLs, but essentially the same content in the same language
  2. Device variants: for example, a page with both a mobile and a desktop version
  3. Protocol variants: for example, the HTTP and HTTPS versions of a site
  4. Site functions: for example, the results of sorting and filtering functions of a category page
  5. Accidental variants: for example, the demo version of the site is accidentally left accessible to crawlers
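
To make a few of these variant types concrete, here is a small TypeScript sketch showing how several URLs for the same page can be collapsed into one preferred form, which is the same kind of consolidation a canonical URL achieves in Google’s index. The preferred protocol, host, and ignorable parameters used here are assumptions for illustration, not rules.

  // Illustrative sketch only: collapse common URL variants (protocol, www,
  // tracking and sorting parameters) of the same page into one preferred form.
  function normalizeUrl(rawUrl: string): string {
    const url = new URL(rawUrl);

    // Protocol variant: prefer HTTPS.
    url.protocol = 'https:';

    // Host variant: strip a leading "www." so both forms map to one host.
    url.hostname = url.hostname.replace(/^www\./, '');

    // Site-function variants: drop query parameters that don't change content.
    for (const param of ['utm_source', 'utm_medium', 'utm_campaign', 'sort']) {
      url.searchParams.delete(param);
    }

    return url.toString();
  }

  // Both variants collapse to the same preferred URL: https://example.com/menu
  console.log(normalizeUrl('http://www.example.com/menu?utm_source=newsletter'));
  console.log(normalizeUrl('https://example.com/menu?sort=price'));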

How Google Chooses A Canonical Webpage 

Until very recently, it was unclear exactly how Google selected canonical pages. Website owners and managers could signal which version they wanted to appear in search results using the rel=”canonical” tag in the code of the page. 
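
If you want to check what your own pages are signaling, a quick script can confirm which canonical URL each page currently declares. Below is a minimal TypeScript sketch, assuming Node 18 or newer for the built-in fetch; the regex-based parsing and the example URL are simplifications for illustration, and a real audit tool would use a proper HTML parser.

  // Minimal sketch: print the canonical URL a page declares in its HTML.
  async function getDeclaredCanonical(pageUrl: string): Promise<string | null> {
    const response = await fetch(pageUrl);
    const html = await response.text();
    return extractCanonical(html);
  }

  function extractCanonical(html: string): string | null {
    // Find <link ...> tags, then keep the first one marked rel="canonical".
    const linkTags = html.match(/<link\b[^>]*>/gi) ?? [];
    for (const tag of linkTags) {
      if (/rel=["']canonical["']/i.test(tag)) {
        const href = tag.match(/href=["']([^"']+)["']/i);
        if (href) return href[1];
      }
    }
    return null;
  }

  // Example usage with a placeholder URL.
  getDeclaredCanonical('https://www.example.com/some-page')
    .then((canonical) => console.log('Declared canonical:', canonical ?? 'none found'))
    .catch((err) => console.error('Fetch failed:', err));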

However, this version wasn’t always the one that Google went with.

Gary Illyes from Google cleared the mystery up (mostly) in a recent Google Search Central video. 

The process starts with Google finding the content and identifying the main content, or “centerpiece,” of a page. Then, Google groups pages with similar content into duplicate clusters. 

From there, Google uses a handful of signals to essentially rank each version of the page as it would a listing in search results. The version with the best ranking is selected as the canonical version and included in most search results. 

While he doesn’t list exactly which signals are used, Illyes did say this:

“Some signals are very straightforward, such as site owner annotations in HTML like rel=”canonical”, while others, like the importance of an individual page on the internet, are less straightforward.”

Notably, this doesn’t mean that Google only indexes one version of the page to be used in all contexts. There are situations where Google may decide to show users a version of the page other than the canonical version.

“The other versions in the cluster become alternate versions that may be served in different contexts, like if the user is searching for a very specific page from the cluster.”

To hear Gary Illyes himself talk about the process, check out the full Google Search Central video below:

Thanks to a new review algorithm, Google says it has become better and faster at identifying fake reviews. In a new blog post, the company declared, “In 2023, this new algorithm helped us take down 45% more fake reviews than the year before.”

According to Google, it receives more than 20 million contributions every day to Maps and Search. This can make it very difficult to filter out the inauthentic or malicious contributions, so Google uses complex algorithms, along with employees, to help spot these fake contributions. 

The New Algorithm

The latest major algorithm designed to detect and remove fake reviews was added last year. In the blog post, the company describes the algorithm as “a machine learning algorithm that detects questionable review patterns even quicker” by evaluating “longer-term signals on a daily basis” to spot “one-off cases and broader attack patterns.”

For example, the algorithm may act if it sees that “a reviewer leaves the same review on multiple businesses or if a business receives a sudden spike in 1 or 5-star reviews.”
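
To make those patterns easier to picture, here is a toy TypeScript sketch of the two checks described above: identical review text posted to multiple businesses, and a sudden burst of 1- or 5-star reviews at a single business. This is purely illustrative and is not Google’s actual system; the data shapes and thresholds are invented for the example.

  // Toy illustration only, not Google's algorithm.
  interface Review {
    reviewerId: string;
    businessId: string;
    rating: number; // 1 to 5
    text: string;
    postedAt: Date;
  }

  // Flag review text that appears on more than one business profile.
  function findDuplicateTextAcrossBusinesses(reviews: Review[]): string[] {
    const businessesByText = new Map<string, Set<string>>();
    for (const r of reviews) {
      const key = r.text.trim().toLowerCase();
      if (!businessesByText.has(key)) businessesByText.set(key, new Set());
      businessesByText.get(key)!.add(r.businessId);
    }
    return [...businessesByText.entries()]
      .filter(([, businesses]) => businesses.size > 1)
      .map(([text]) => text);
  }

  // Flag a sudden burst of extreme ratings within a recent time window.
  function hasRatingSpike(reviews: Review[], windowDays = 7, threshold = 20): boolean {
    const cutoff = Date.now() - windowDays * 24 * 60 * 60 * 1000;
    const recentExtremes = reviews.filter(
      (r) => r.postedAt.getTime() >= cutoff && (r.rating === 1 || r.rating === 5)
    );
    return recentExtremes.length >= threshold; // threshold is arbitrary here
  }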

In one case, the algorithm noticed when a group of scammers began falsely claiming people could get paid for completing high-paying online tasks like writing fake reviews or clicking ads. 

As the company described, the algorithm “quickly identified this surge in suspicious reviews thanks to its ability to continuously analyze patterns, like whether an account had previously posted reviews.” 

Human review analysts were then able to cross-reference this data with reports from merchants who had seen a spike in suspicious 5-star reviews, removing even more of the fake reviews. 

From this one scheme alone, Google says it was able to remove more than 5 million fake reviews within a few weeks. 

More From The Data

Along with showing how the new algorithm helps Google identify fake reviews, the blog post shares several other statistics about the fake reviews and spam removed throughout 2023:

  • Google blocked or removed over 170 million policy-violating reviews in the past year (a 45% increase from 2022). 
  • Over 12 million fake business profiles were blocked or removed in 2023.
  • 14 million policy-violating videos were identified and removed (7 million more than the year before).
  • Google prevented more than 2 million attempts from bad actors to claim Business Profiles that were not theirs (double the amount from 2022).

For more, read the complete blog post detailing how Google identified and removed spammy or malicious contributions to Business Profiles and online reviews last year. 

Google is making a big change to its Core Web Vitals ranking signals soon, as the company announced that the new Interaction to Next Paint (INP) signal will replace the First Input Delay (FID) on March 12.

The new INP metric measures the time from when a user interacts with a web page (for example, by clicking a button) to when the browser next renders pixels to the screen.

Though FID measured a similar delay between user input and browser rendering, Google says INP captures interactivity in ways that were not possible previously, since it considers interactions throughout a visit rather than only the first input.
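
For site owners who want to see the INP values real visitors experience, the open-source web-vitals JavaScript library exposes an onINP helper. Here is a minimal TypeScript sketch, assuming the web-vitals package is installed and that /analytics is your own collection endpoint (both are assumptions for the example):

  // Minimal sketch: report real-user INP values using the web-vitals library.
  import { onINP } from 'web-vitals';

  onINP((metric) => {
    // metric.value is the INP duration in milliseconds;
    // metric.rating is 'good', 'needs-improvement', or 'poor'.
    const body = JSON.stringify({
      name: metric.name,
      value: metric.value,
      rating: metric.rating,
      page: location.pathname,
    });

    // sendBeacon keeps reporting reliable even if the page is unloading.
    navigator.sendBeacon('/analytics', body);
  });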

The History Behind FID and INP Metrics

FID debuted in 2018 and went on to become one of the metrics Google uses to evaluate sites through Core Web Vitals. However, Google quickly began to see that the metric didn’t fully capture user interactions as it had hoped. 

This led to Google introducing INP as an experimental or “pending” metric in 2022. Now, almost 2 years later, Google has decided to fully replace FID with the INP metric in March.

What You Should Do

Before March 12, website managers should ensure their site meets the threshold for a “good” INP score. 

If you do not meet this mark, Google suggests optimizing your site with these strategies:

  • Evaluate your site’s performance using tools such as PageSpeed Insights or the Google Chrome User Experience Report.
  • Identify issues that may be dragging down INP, like long JavaScript tasks, excessive main thread activity, or a large DOM. 
  • Optimize issues based on Google’s optimization guides for the specific issue.

As Google’s ranking algorithms evolve, this and other ranking signals will likely be updated or replaced. This emphasizes how important it is to use the latest optimization standards and to ensure a smooth user experience if you want your business to be easily found online.

The Google SEO Starter Guide is designed to help individuals and organizations quickly learn the most important steps necessary for getting their websites ranking within Google Search. 

While the guide reportedly maintains a 91% approval rating, it has largely gone without updates for several years. That will be changing soon.

In a recent episode of Google’s “Search Off The Record” podcast, the company’s Search Relations team discussed plans to update the SEO Starter Guide, including talking about what would and would not be included in the revised document. 

Discussions like this offer a window into how SEO is talked about inside Google and what the company prioritizes when ranking sites, along with identifying SEO myths that might lead you astray when optimizing your own site. 

So, what’s changing in the revised SEO Starter Guide?

HTML Structure

One topic the group discussed was the importance (or lack thereof) of HTML structure when it comes to online rankings.

While the team agreed that using proper HTML structure can help with online rankings, they indicated the guide will clarify that it is not all that important in the grand scheme.

As Google’s Gary Illyes said:

“Using headings and a good title element and having paragraphs, yeah, sure. It’s all great, but other than that it’s pretty futile to think about how the page… or the HTML is structured.”

Branded Domain Names vs Keyword Rich Domain Names

SEO experts have been increasingly debating whether it is better to focus on your existing branding when establishing a domain name, or if domains perform better when including specific keywords.

According to the Google team, the new guide will clarify this by indicating that brands should focus on including branding in their domains over using keywords. The thought process shared by those in the discussion was that establishing a memorable brand will have a more long-term impact than trying to optimize your domain specifically for search engines. 

Debunking SEO Myths

Lastly, the group said one thing they want to improve in the document is how it addresses widespread SEO myths and misconceptions. 

For example, everyone agreed that the SEO Starter Guide should specifically debunk the idea that using Google products while creating or optimizing your site will improve search rankings. 

They indicated they would address this myth and several others to prevent people from optimizing their site based on misinformation found elsewhere online. 

For more from the discussion, listen to the entire episode of “Search Off The Record” here.

A lead Google spokesperson gave a surprising response to claims that the search engine stole content from a publisher without providing any benefit to the publisher’s website. 

Google’s rich search results have been controversial since their launch, as some feel that these results simply copy information from other websites instead of sending users to that content where it was originally posted. 

The search engine has largely ignored these criticisms by saying that rich results improve the search experience and include links to the original content. 

That’s what makes it so surprising that Google Search Liaison Danny Sullivan recently publicly responded to one publisher’s complaints directly.

The Original Complaint

In several recent tweets, a representative for travel brand Travel Lemming posted:

“Google is now stealing Travel Lemming’s own brand searches (even via site search).

They take our list — INCLUDING MY ORIGINAL PHOTOS 📸 — and present it in a rich result so people don’t click through.

I am literally IN that Red Rocks photo!…

They are doing this across all travel searches – unbranded and branded alike.

Example: “Mexico Travel Tips” – they have an AI answer & also a rich result that basically just re-creates an entire blog post, including our stolen photos.

Again, I am IN that Mexico packing photo!

Like how is it legal for Google to just essentially create entire blog posts from creators’ content and images?

I literally have a law degree from the top law school in the world, and even I can’t figure it out!

Fair use does NOT apply if you’re using the content to compete directly against the creator, which they clearly are.

I can’t sit outside a movie theatre, project the movie on a wall, earn money from it, and claim fair use.

I spent SO much time taking those photos in Denver.

It was 10+ full days worth of work for me and partner Clara, going around the city to photograph everything. $100s of money spent in attraction admission fees, gas, parking.

Now Google just gets to extract all that value?

How much does Google get to take before creators say “enough is enough”?

How hard does the water have to boil before the frog jumps?

The comments show it is a prisoner’s dilemma as long as Google has a monopoly on search …”

Google’s Response

Danny Sullivan, Google’s Search Liaison, provided a lengthy response that delves specifically into what is happening, why, and ways they are hoping to improve the situation. 

Not only does Sullivan give insight into the company’s perspective, he also shares his own opinions about the feature. Importantly, Sullivan doesn’t disregard Travel Lemming’s complaints and is sympathetic to how rich search results impact publishers:

“Hey Nate, this got flagged to my attention. I’ll pass along the feedback to the team. Pretty sure this isn’t a new feature. Elsewhere in the thread, you talk about it being an AI answer, and I’m pretty sure that’s not the case, either. It’s a way to refine an initial query and browse into more results.

With the example you point out, when you expand the listing, your image is there with a credit. If you click, a preview with a larger view comes up, and that lets people visit the site. Personally, I’m not a fan of the preview-to-click.

I think it should click directly to the site (feedback I’ve shared internally before, and I’ll do this again). But it’s making use of how Google Images operates, where there’s a larger preview that helps people decide if an image is relevant to their search query. Your site is also listed there, too. Click on that, people get to your site.

If you don’t want your images to appear in Google Search, this explains how to block them:

https://developers.google.com/search/docs/crawling-indexing/prevent-images-on-your-page

I suspect you’d prefer an option to not have them appear as thumbnails in particular features. We don’t have that type of granular control, but I’ll also pass the feedback on. 

I appreciate your thoughts and concerns. I do. The intention overall is to make search better, which includes ensuring people do indeed continue to the open web — because we know for us to thrive, the open web needs to thrive.

But I can also appreciate that this might not seem obvious from how some of the features display.

I’m going to be sharing these concerns with the search team, because they’re important.

You and other creators that are producing good content (and when you’re ranking in the top results, that’s us saying it’s good content) should feel we are supporting you.

We need to look at how what we say and how our features operate ensure you feel that way.

I’ll be including your response as part of this.”

I doubt Sullivan is going to change many minds about Google’s rich search results, but this rare interaction reveals how Google sees the situation as it tries to walk a tightrope between providing a seamless search experience and sustaining the sites it relies on.

Google has released its annual Year in Search list of trends and popular searches, but this year’s list includes a special twist.

To celebrate 25 years as a search engine, this year’s Year In Search includes a ton of interesting fun facts, milestones, and a time capsule that helps users see how far we’ve come.

25 Years of Digital Search

To open the latest Year in Search report, Google included a film that highlights how our interests and technology have changed in the last two and a half decades.

The video includes all sorts of gigantic accomplishments from popular figures such as Taylor Swift and iconic brands like Pokemon, along with massive social and scientific developments like nuclear fusion, and the increasing acceptance of marriage equality.

The Time Capsule

The Google Trends Time Capsule shows the most popular searches for certain categories from each year.

For example, the graph below lets you see Pokemon’s gradual fall out of the top five card games and eventual return to popularity 17 years later.

You can explore the featured categories, gain a new perspective of the most powerful trends, and even take an interactive quiz to see which year’s search trends were most interesting to you.

Most Searched Playground

Another new part of the Year in Search report is the inclusion of an interactive Google Doodle/game that highlights many of the most popular places, people, and events from the past 25 years.

Zoom in and explore to find over 1,700 figures, easter eggs, and other surprising finds while touring Google’s history as a search engine.

The Year In Search Report

Of course, we have the annual search trends report itself. The Year in Search 2023 report highlights the most popular searches of the past year for several categories. You can explore the global trends that have shaped the world or filter the trends by country.

Local Year In Search

If you are looking for more localized trends, you can also explore the top trends from specific cities in Google’s Local Year in Search.

The section includes trends for each city including entertainment, “near me” searches, and relevant area-specific searches.

Even More

This year’s report includes even more cool trivia and interactive information, including Google Search Milestones, notable Facts About Google, and a statement from CEO Sundar Pichai on his vision for Google’s future.

Google is giving advertisers the ability to opt out of showing ads across its Search Partner Network (SPN) following a concerning report suggesting ads may be appearing on controversial websites – even ones you’ve placed on a blocklist. 

The Claims

A recently published report from Adalytics asserts that an unnamed Fortune 500 company was upset and “surprised” after it learned that its ads were being served across several non-Google websites. 

Specifically, the report says that the company’s ads were being shown on Breitbart.com – a controversial right-wing news site that has been accused of racial bias, misleading articles, and incendiary perspectives. This is particularly problematic, as the company had added the domain to its exclusion list years before. 

As Adalytics stated:

“This raises the possibility that ads were served on websites and publishers despite the brand’s deliberate efforts to achieve brand safety and exercise control over their own media investments.”

In response, Google broadly denied the claims in Adalytics’ report and suggested the ads shown were intentionally triggered. Still, the company announced it will allow brands to entirely opt out of showing ads across the SPN if they desire. 

The company said:

“Though we take enormous issue with Adalytics’ methodology and conclusions, we always look to improve our products to meet our partners’ needs.”

Why It Matters

Having your ads shown alongside inappropriate or hateful content can be damaging to a brand’s reputation and develop negative associations for those who see it. Additionally, those who frequent problematic websites are unlikely to be your target market, so your ad budget is likely being wasted when this happens. 

Similar issues at X (formerly Twitter) recently led several large advertisers to publicly announce they were pulling ads from the social network, following a report indicating X was displaying their ads alongside hate speech and racist or white nationalist content. 

By giving advertisers the opportunity to opt out, Google is ensuring that brands can still feel confident their ads aren’t being shown alongside objectionable content while the findings from Adalytics’ report are further investigated. 

For more, read the full report here.