Tag Archive for: SEO

Google is preparing to release its next core algorithm update “in the coming weeks”, likely signaling a major shakeup in search results.

The reveal came from Google Search Liaison Danny Sullivan, himself a well-known former SEO journalist, whose lengthy statement about the coming update was reported by Search Engine Roundtable.

When Is The Algorithm Update Coming?

In his message, Sullivan says the teams at Google haven’t settled on exactly what day the core update will arrive because testing is still underway. Despite this, he felt confident enough to say he expects the update to roll out in the coming weeks, even if further tweaking is needed after testing.

Notably, Sullivan says he had considered posting similar updates before the release of past core algorithm updates but did not because of the potential for them to be pushed back. In this instance, he is apparently more confident the update will pass through testing relatively quickly.

Sullivan’s full post reads:

“We’d tell you when the next core update will be if we knew. But we don’t know exactly yet, that’s all. These aren’t scheduled to a particular day. The ranking team makes changes, tests those, evaluates those and eventually we get a launch date. There have been many times I could have said “Core update next week!” because everything was on track for that to happen, but then there’s a need to do a bit more work or other things that might cause a pushback. I would expect we’ll see one in the coming weeks, because that fits in with our general cycle. But precisely when, that’s just not known yet.”

Past Major Algorithm Updates

This upcoming core algorithm update will be the first since the one that began rolling out in March 2024 and finished in April. That rollout took longer than usual because it was the largest core algorithm update to date.

Before that, Google released a slew of smaller updates in August, October, and November of 2023. 

Sullivan did not give any insight into how big the upcoming update might be or what might be targeted by the update. For now, we can only assume that this update is aimed at reducing spam and improving the relevance of search results. 

We will update you as more information about the upcoming core algorithm update is revealed or when it begins rolling out to the public. 

A lot has been made of the importance of new content when it comes to ranking on Google. But, what’s so bad about older content? Are all old posts bad for your site? Should you be regularly removing old posts?

Thankfully, Google’s John Mueller and Lizzi Sassman addressed this recently on an episode of the Search Off The Record podcast.

In the episode, Mueller and Sassman talked at length about content decay, a term referring to content that becomes outdated or irrelevant over time, how it affects your site, and what you should do about it.

What Is Content Decay According to Google?

While content decay isn’t a commonly used term within the SEO community, it aptly describes some types of content. Specifically, this is how Mueller defines content decay:

“[Content decay is] something where, when you look at reference material, it’s kind of by definition old. People wrote about it because they’ve studied it for a really long time, so it’s an old thing. But, that doesn’t mean it’s no longer true or no longer useful.”

Is Content Decay Inherently Bad?

As Google’s team explained, content decay isn’t inherently bad. Even some posts that may seem outdated, such as old event announcements or product updates, shouldn’t be treated as a problem. Sassman recommends keeping this content around for historical accuracy.

As an example, Sassman pointed toward Google’s help pages which still use the outdated term “Webmaster Tools.”

“If we went back and we replaced everything, like where we said Google Webmasters or Webmaster Tools, if we replaced Search Console everywhere we said Webmaster Tools, it would be factually incorrect.”

What Should You Do About Content Decay?

It might be tempting to simply delete outdated content, but Mueller recommends going back and adding context to it instead. This way, you retain the value of past content while making it clear which aspects are now irrelevant, preventing confusion among readers.

As he stated:

“People come to our site for whatever reason, then we should make sure that they find information that’s helpful for them and that they understand the context. If something is old and they search for it, they should be able to recognize, ‘Oh, maybe I have to rethink what I wanted to do because what I was searching for doesn’t exist anymore or is completely different now.’”

For more, listen to the full episode of Google’s Search Off The Record podcast below:

Google is making some big changes to how it ranks results, aiming to deliver more personalized search results and increase the prevalence of “first-hand knowledge”.

The search engine announced the changes earlier this month while spotlighting two specific updates that have recently come to users. 

Cathy Edwards, Vice President of Search at Google, says these updates will better connect humans with the topics and content that are most relevant to their interests and needs:

“Search has always been about connecting human curiosity with the incredible expanse of human wisdom on the net. These advancements will help users find the most helpful information just for them, no matter how specific their questions may be.”

Bringing First-Hand Knowledge To The Surface

Google has made adjustments to its ranking algorithm to show more first-person perspectives higher in search results. While the company didn’t tell us exactly how it tweaked the algorithm, Edwards emphasizes that it will help people find new individual experiences, advice, and opinions when searching. 

With this change, the company says it hopes to show fewer repetitive pieces of content that don’t bring new perspectives or opinions on the first pages of results.

The announcement says:

“As part of this work, we’ve also rolled out a series of ranking improvements to show more first-person perspectives in results, so it’s easier to find this content across Search.”

Follow Topics For More Curated Results

Google is giving you the ability to curate your own search results by following topics that are important to you. 

By following topics in search results, such as a favorite football team, style of restaurant, or genre of music, you can stay in touch with these topics naturally while you are searching. 

Follows not only impact what you see in typical search results but also help highlight important topics in Discover and other areas of Google.

You can see an example of how this can shape your search results in the images below, which show what results looked like before and after this update rolled out.

Like most changes to the search results, however, it is unclear exactly how this affects optimization strategies going forward. We will know more as we get more data in the coming weeks.

Personalization Is The Future

Google has been increasingly customizing search results for users based on numerous factors, including location, demographics, and more. These latest updates continue this effort to ensure that the search results you see aren’t just the most relevant sites for anyone. They are the most relevant search results for you.

For years, backlinks have been considered one of the most important ranking factors for ranking on Google’s search engine. In 2016, the company even confirmed as much when a search quality senior strategist said that the top ranking factors were links, content, and RankBrain.

According to new comments from Google’s Gary Illyes, an analyst for Google Search, things have changed since then.

What Was Said

During a panel at Pubcon Pro, Illyes was asked directly whether links are still one of the top three ranking factors. In response, here is what he said:

“I think they are important, but I think people overestimate the importance of links. I don’t agree it’s in the top three. It hasn’t been for some time.”

Illyes even went as far as to say there are cases where sites have absolutely zero links (internal or external) yet consistently rank in the top spot because they provide excellent content.

The Lead Up

Gary Illyes isn’t the first person from Google to suggest that links have lost the SEO weight they used to carry. Last year, during a Google SEO Office Hours session, Dan Nguyen from the search quality team stated that links have lost much of their impact:

“First, backlinks as a signal has a lot less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries.”

Other major figures at Google, including Matt Cutts and John Mueller, have predicted this would happen for years. As far back as 2014, Cutts (a leading figure at Google at the time) said:

“I think backlinks still have many, many years left in them. But inevitably, what we’re trying to do is figure out how an expert user would say, this particular page matched their information needs. And sometimes backlinks matter for that. It’s helpful to find out what the reputation of the site or a page is. But, for the most part, people care about the quality of the content on that particular page. So I think over time, backlinks will become a little less important.”

Ultimately, this shift was bound to happen because search has become so much more complex. With each search, Google considers the intent behind the search, the actual query, and personal information to help tailor the search results for each user. With so much in flux, we have reached a point where the most important ranking signals may even differ based on the specific site that is trying to rank.

A recent article from Gizmodo has lit up the world of SEO, drawing a rebuff from Google and sparking extensive conversation about when it’s right to delete old content from your website.

The situation kicked off when Gizmodo published an article detailing how CNET had supposedly deleted thousands of pages of old content to “game Google Search.”

What makes this so interesting is that deleting older content that is not performing well is a long-recognized part of search engine optimization called “content pruning”. By framing its article as “exposing” CNET for dirty tricks, Gizmodo sparked a discussion about when content pruning is effective and whether SEO is inherently negative for a site’s health.

What Happened

The trigger for all of this occurred when CNET appeared to redirect, repurpose, or fully remove old pages based on analytics data including pageviews, backlink profiles, and how long a page has gone without an update. 

An internal memo obtained by Gizmodo shows that CNET did this believing that deprecating and removing old content “sends a signal to Google that says CNET is fresh, relevant, and worthy of being placed higher than our competitors in search results.”

What’s The Problem?

First, simply deleting old content does not send a signal that your site is fresh or relevant. The only way to do this is by ensuring your content itself is fresh and relevant to your audience. 

That said, there can be benefits to removing old content if it is not actually relevant or high-quality. 

The biggest issue here seems to be that CNET believes old content is inherently bad, but there is no such “penalty” or harm in leaving older content on your site if it may still be relevant to users.

As Google Search Liaison Danny Sullivan posted on X (formerly Twitter):

“Are you deleting old content from your site because you somehow believe Google doesn’t like ‘old’ content? That’s not a thing! Our guidance doesn’t encourage this. Old content can still be helpful, too.”

Which Is It?

The real takeaway from this is a reminder that Google isn’t as concerned with “freshness” as many may think. 

Yes, the search engine prefers sites that appear to be active and up-to-date, which includes posting relevant new content regularly. That said, leaving old content on your site won’t hurt you – unless it’s low-quality. Removing low-quality or irrelevant content can help improve your overall standing with search engines by showing that you recognize when content isn’t up to snuff. Just don’t go deleting content solely because it is “old”.

The Washington Post may not be the first organization you imagine when you think about SEO experts, but as a popular news organization read by millions around the world, The Post has dealt with its fair share of issues in developing its long-term strategies for web performance and SEO. 

Now, the news site is sharing the fruit of that hard work by releasing its own Web Performance and SEO Best Practices and Guidelines.

These guidelines help ensure that The Washington Post remains competitive and visible in highly competitive search spaces, drives more organic traffic, and maintains a positive user experience on its website. 

In the announcement, engineering lead Arturo Silva said:

“We identified a need for a Web Performance and SEO engineering team to build technical solutions that support the discovery of our journalism, as the majority of news consumers today read the news digitally. Without proper SEO and web performance, our stories aren’t as accessible to our readers. As leaders in engineering and media publishing, we’re creating guidelines that serve our audiences and by sharing those technical solutions in our open-source design system, we are providing tools for others to certify that their own site practices are optimal.”

What’s In The Washington Post’s SEO and Web Performance Guidelines?

If you’re hoping to see a surprise trick or secret tool being used by The Washington Post, you are likely to be disappointed. 

The guidelines are largely in line with practices used by most SEO experts, albeit with a focus on the publication’s specific search and web performance issues.

For example, the Web Performance section covers three specific areas: loading performance, rendering performance, and responsiveness. Similarly, the SEO guidelines are split into on-page SEO, content optimization, technical SEO, and off-page SEO. 

More than anything, the guidelines highlight the need for brands to focus their SEO efforts on their unique needs and goals and develop strategies that are likely to remain useful for the foreseeable future (instead of chasing every new SEO trend). 

To read the guidelines for yourself, visit the Washington Post’s site here. 

Just last week, Google Search Liaison Danny Sullivan once again took to Twitter to dispel a longstanding myth about word counts and search engine optimization (SEO).

The message reads:

“Reminder. The best word count needed to succeed in Google Search is … not a thing! It doesn’t exist. Write as long or short as needed for people who read your content.”

Sullivan also linked to long-existing help pages and included a screencap of a statement from these pages which says:

“Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t.)”

Of course, this is not a new message from Google. Even so, many of the most popular SEO tools and experts still claim that anywhere between 300 and 1,500 words is ideal for ranking in Google search results.

Incidentally, a day later, Google’s John Mueller also responded to an SEO professional who argued there was a “correlation between word count and outranking competition.” In a short but simple reply, Mueller said, “Are you saying the top ranking pages should have the most words? That’s definitely not the case.”

Most likely, this myth of an ideal SEO word count will persist as long as search engine optimization exists in its current form. Still, it is always good to get a clear reminder from major figures at Google that content should be as long as necessary to share valuable information with your audience – whether you can do that in a couple of sentences or in exhaustive, multi-thousand-word content.

One of Google’s most visible spokespeople, John Mueller, made a rare appearance on Reddit to answer a series of “dumb” SEO questions covering everything from geotagging images to how often you should blog.

In a thread on the r/BigSEO subreddit called “incoming dumb question barrage”, a user asked a series of five questions:

  1. Should we be geotagging images. Does Google even care?
  2. Blogging. If we do it, is it everyday or once a week with some seriously solid stuff?
  3. Google Business Profile posting: Everyday, once a week, or why bother?
  4. Since stuff like Senuke died 10 years ago, is it all about networking with webmasters of similar and same niche sites for links?
  5. Piggybacking off #4, what about PBNs? Are they back? If so, does it have to be a group of completely legit looking websites vs some cobbled together WP blogs?

Mueller provided a series of candid answers which we will get into below:

Geotagging Images

Here Mueller kept it short and sweet: “No need to geotag images for SEO.”

How Often Should You Blog?

As always, Google won’t provide a specific post frequency that is “best” for SEO blog content. Rather, Mueller says to post “as often as you have something unique & compelling to say.”

However, the Google Search Advocate admits that more frequent posting can bring more traffic if you are able to maintain the quality of your content.

“The problem with trying to keep a frequency up is that it’s easy to end up with mediocre, fluffy content, which search engine quality algorithms might pick up on.”

Additionally, he indicates that those who are using AI to create a lot of content quickly are unlikely to be rewarded.

Google Business Profile Posting Frequency

Unfortunately, this is not Mueller’s area of expertise. His answer was a simple “no idea.”

Outdated Linkbuilding Strategies

The last two questions ask whether older link building methods are still relevant at all. Clearly, this tickled Mueller, as he largely dismissed both approaches.

“SENuke, hah, that’s a name I haven’t heard in ages, lol. Sorry. Giggle. I have thoughts on links, but people love to take things out of context to promote their link efforts / tools, so perhaps someone else will say something reasonable, or not.

“OMG, PBNs too. What is this thread even. Now I won’t say anything without a lawyer present.”

No Shortcuts To Online Riches

Of course, there is an underlying current connecting all of these questions. Mueller takes note of this as well, saying:

“Reading between the lines, it seems you want to find a short-cut to making money online.”

The truth is, there are no real shortcuts to online success these days. However, there are a lot of questionable people willing to take your money to provide tools and courses that often get you nowhere. 

“Unfortunately, there’s a long line of people trying to do the same, and some have a lot of practice. Some will even sell you tools and courses on how to make money online (and *they* will be the ones making the money, fwiw, since people pay them for the tools and courses). The good tools cost good money, and they’re not marketed towards people who just want to make money online — they’re targeted at companies who need to manage their online presence and report on progress to their leadership chain.”

At the same time, Mueller encourages individuals such as the person who started the thread to keep learning and practicing SEO:

“… learn HTML, learn a bit of programming, and go for it. 90% of the random tricks you run across won’t work, 9% of the remaining ones will burn your sites to the ground, but if you’re lucky & persistent (is that the same?), you’ll run across some things that work for you.

“If you want to go this route, accept that most – or all – of the things you build will eventually blow up, but perhaps you’ll run into some along the way that make it worthwhile.”

“And … after some time, you might notice that actually building something of lasting value can also be intriguiing [sic], and you’ll start working on a side-project that does things in the right way, where you can put your experience to good use and avoid doing all of the slash & burn site/spam-building.”

Having a robust backlink profile remains one of the most crucial factors for ranking a webpage highly in search, so it is always big news when Google actually tells us what it looks for in quality links. 

Yesterday, the search engine published a new set of guidelines and best practices for building backlinks, detailing how to make your links crawlable, how to craft well-ranking anchor text, and how to best establish internal links on your site. 

Below, we will cover all the new guidelines and best SEO practices for links on your website according to Google:

Crawlable Links

As the page Google updated was originally dedicated specifically to making links crawlable, this section remains largely unchanged. It reads, “Generally, Google can only crawl your link if it’s an <a> HTML element (also known as anchor element) with an href attribute. Most links in other formats won’t be parsed and extracted by Google’s crawlers. Google can’t reliably extract URLs from <a> elements that don’t have an href attribute or other tags that perform as links because of script events.”
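To illustrate the difference, here is a minimal example of our own (not taken from Google’s documentation): the first link uses a standard <a> element with an href attribute and can be crawled, while the second relies on a script event and likely cannot be followed reliably.

  <!-- Crawlable: a standard anchor element with an href attribute -->
  <a href="https://www.example.com/services">See our services</a>

  <!-- Not reliably crawlable: no href attribute, navigation handled by a script event -->
  <span onclick="window.location='https://www.example.com/services'">See our services</span>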

Anchor Text Placement 

The best practice for placing anchor text for links reads: “Anchor text (also known as link text) is the visible text of a link. This text tells people and Google something about the page you’re linking to. Place anchor text between <a> elements that Google can crawl.”

Writing Anchor Text

As for the anchor text itself, Google encourages you to balance descriptiveness with brevity: “Good anchor text is descriptive, reasonably concise, and relevant to the page that it’s on and to the page it links to. It provides context for the link, and sets the expectation for your readers. The better your anchor text, the easier it is for people to navigate your site and for Google to understand what the page you’re linking to is about.”
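As a simple illustration (our own hypothetical example, not from Google’s guidelines), compare a vague link with a descriptive one:

  <!-- Vague: tells readers and Google nothing about the destination -->
  <a href="https://www.example.com/winter-tire-guide">Click here</a>

  <!-- Descriptive and reasonably concise: sets expectations for the linked page -->
  Read our <a href="https://www.example.com/winter-tire-guide">guide to choosing winter tires</a> before the first snow.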

Internal Links 

While Google emphasizes the importance of internal links on your website, it also states that the search engine doesn’t look for a target number of links.

“You may usually think about linking in terms of pointing to external websites, but paying more attention to the anchor text used for internal links can help both people and Google make sense of your site more easily and find other pages on your site. Every page you care about should have a link from at least one other page on your site. Think about what other resources on your site could help your readers understand a given page on your site, and link to those pages in context.”
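In practice, that might look something like the hypothetical snippet below, where one page links to a related resource on the same site using descriptive anchor text in context:

  <p>
    Not sure which model fits your space? See our
    <a href="/blog/choosing-the-right-standing-desk">guide to choosing the right standing desk</a>
    for measurements and setup tips.
  </p>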

External Links

When it comes to external links, Google has advice for creating powerful links that don’t come off as spam: “Linking to other sites isn’t something to be scared of; in fact, using external links can help establish trustworthiness (for example, citing your sources). Link out to external sites when it makes sense, and provide context to your readers about what they can expect.”

If your site gets hit with an algorithmic penalty from Google, you’ll likely be eager to fix the issue and improve your rankings again. However, Google’s top experts say it can take quite some time to recover if Google believes your site is spammy.

In a recent Google SEO Office Hours session, representatives were asked how long it can take to recover from an algorithm penalty related to content quality problems. 

While many details about the question remain unclear – such as how significant the penalty is – the search engine’s spokespeople encouraged site owners to be proactive. Otherwise, it may be months before they regain ground in the search results.

Specifically, the question posed in the video is:

“If a website gets algorithmically penalized for thin content, how much of the website’s content do you have to update before the penalty is lifted?”

There are a few ways the question could be read, so in this case, the experts kept it simple and straight to the point:

“Well, it’s generally a good idea to clean up low-quality content or spammy content that you may have created in the past.

For algorithmic actions, it can take us months to reevaluate your site again to determine that it’s no longer spammy.”

In other words, it is always better to share high-quality original content than to risk being labeled as spam. Once that happens, you’ll likely be in the doghouse for at least a few months.

To hear the answer, check out the video below beginning at 24:24.