
Google is making some big changes to how it ranks search results, aiming to deliver more personalized results and increase the prevalence of “first-hand knowledge”.

The search engine announced the changes earlier this month while spotlighting two specific updates that have recently come to users. 

Cathy Edwards, Vice President of Search at Google, says these updates will better connect humans with the topics and content that are most relevant to their interests and needs:

“Search has always been about connecting human curiosity with the incredible expanse of human wisdom on the net. These advancements will help users find the most helpful information just for them, no matter how specific their questions may be.”

Bringing First-Hand Knowledge To The Surface

Google has made adjustments to its ranking algorithm to show more first-person perspectives higher in search results. While the company didn’t tell us exactly how it tweaked the algorithm, Edwards emphasizes that it will help people find new individual experiences, advice, and opinions when searching. 

With this change, the company hopes to show fewer repetitive pieces of content that don’t bring new perspectives or opinions on the first pages of results.

The announcement says:

“As part of this work, we’ve also rolled out a series of ranking improvements to show more first-person perspectives in results, so it’s easier to find this content across Search.”

Follow Topics For More Curated Results

Google is giving you the ability to curate your own search results by following topics that are important to you. 

By following topics in search results, such as a favorite football team, style of restaurant, or genre of music, you can stay in touch with these topics naturally while you are searching. 

Follows not only impact what you see in typical search results, but also help highlight important topics in Discover and other areas of Google.

Google’s announcement includes before-and-after images showing how following topics can shape your search results.

Like most changes to the search results, however, it is unclear exactly how this affects optimization strategies going forward. We will know more as we get more data in the coming weeks.

Personalization Is The Future

Google has been increasingly customizing search results for users based on numerous factors, including location, demographics such as age and gender, and more. These latest updates continue this effort to ensure that the search results you see aren’t just the most relevant sites for anyone. They are the most relevant search results for you.

For years, backlinks have been considered one of the most important factors for ranking on Google’s search engine. In 2016, the company even confirmed as much when a search quality senior strategist said that the top ranking factors were links, content, and RankBrain.

According to new comments from Google’s Gary Illyes, an analyst for Google Search, things have changed since then.

What Was Said

During a panel at Pubcon Pro, Illyes was asked directly whether links are still one of the top three ranking factors. In response, here is what he said:

“I think they are important, but I think people overestimate the importance of links. I don’t agree it’s in the top three. It hasn’t been for some time.”

Illyes even went as far as to say there are cases where sites have absolutely zero links (internal or external) but consistently rank in the top spot because they provide excellent content.

The Lead Up

Gary Illyes isn’t the first person from Google to suggest that links have lost the SEO weight they used to carry. Last year, during a Google SEO Office Hours session, Dan Nguyen from the search quality team stated that links have lost much of their impact:

“First, backlinks as a signal has a lot less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries.”

Other major figures at Google, including Matt Cutts and John Mueller, have predicted this would happen for years. As far back as 2014, Cutts (a leading figure at Google at the time) said:

“I think backlinks still have many, many years left in them. But inevitably, what we’re trying to do is figure out how an expert user would say, this particular page matched their information needs. And sometimes backlinks matter for that. It’s helpful to find out what the reputation of the site or a page is. But, for the most part, people care about the quality of the content on that particular page. So I think over time, backlinks will become a little less important.”

Ultimately, this shift was bound to happen because search has become so much more complex. With each search, Google considers the intent behind the search, the actual query, and personal information to help tailor the search results for each user. With so much in flux, we have reached a point where the most important ranking signals may even differ based on the specific site that is trying to rank.

A recent article from Gizmodo has lit up the world of SEO, drawing a rebuff from Google and extensive conversation about when it’s right to delete old content on your website. 

The situation kicked off when Gizmodo published an article detailing how CNET had supposedly deleted thousands of pages of old content to “game Google Search.”

What makes this so interesting is that deleting older content that is not performing well is a long-recognized part of search engine optimization called “content pruning”. By framing its article as “exposing” CNET for dirty tricks, Gizmodo sparked a discussion about when content pruning is effective and whether SEO is inherently negative for a site’s health.

What Happened

The trigger for all of this came when CNET appeared to redirect, repurpose, or fully remove old pages based on analytics data including pageviews, backlink profiles, and how long a page had gone without an update.

An internal memo obtained by Gizmodo shows that CNET did this believing that deprecating and removing old content “sends a signal to Google that says CNET is fresh, relevant, and worthy of being placed higher than our competitors in search results.”

What’s The Problem?

First, simply deleting old content does not send a signal that your site is fresh or relevant. The only way to do this is by ensuring your content itself is fresh and relevant to your audience. 

That said, there can be benefits to removing old content if it is not actually relevant or high-quality. 

The biggest issue here seems to be that CNET believes old content is inherently bad, but there is no such “penalty”, and no harm in leaving older content on your site if it may still be relevant to users.

As Google Search Liaison Danny Sullivan posted on X (formerly Twitter):

“Are you deleting old content from your site because you somehow believe Google doesn’t like ‘old’ content? That’s not a thing! Our guidance doesn’t encourage this. Old content can still be helpful, too.”

Which Is It?

The real takeaway from this is a reminder that Google isn’t as concerned with “freshness” as many may think. 

Yes, the search engine prefers sites that appear to be active and up-to-date, which includes posting relevant new content regularly. That said, leaving old content on your site won’t hurt you – unless it’s low-quality. Removing low-quality or irrelevant content can help improve your overall standing with search engines by showing that you recognize when content isn’t up to snuff. Just don’t go deleting content solely because it is ‘old’.

The Washington Post may not be the first organization you imagine when you think about SEO experts, but as a popular news organization read by millions around the world, The Post has dealt with its fair share of issues in developing its long-term strategies for web performance and SEO. 

Now, the news site is sharing the fruit of that hard work by releasing its own Web Performance and SEO Best Practices and Guidelines.

These guidelines help ensure that The Washington Post remains competitive and visible in highly competitive search spaces, drives more organic traffic, and maintains a positive user experience on its website. 

In the announcement, engineering lead Arturo Silva said:

“We identified a need for a Web Performance and SEO engineering team to build technical solutions that support the discovery of our journalism, as the majority of news consumers today read the news digitally. Without proper SEO and web performance, our stories aren’t as accessible to our readers. As leaders in engineering and media publishing, we’re creating guidelines that serve our audiences and by sharing those technical solutions in our open-source design system, we are providing tools for others to certify that their own site practices are optimal.”

What’s In The Washington Post’s SEO and Web Performance Guidelines?

If you’re hoping to see a surprise trick or secret tool being used by The Washington Post, you are likely to be disappointed. 

The guidelines are largely in line with practices used by most SEO experts, albeit with a particular focus on The Post’s own search and web performance issues.

For example, the Web Performance section covers three specific areas: loading performance, rendering performance, and responsiveness. Similarly, the SEO guidelines are split into on-page SEO, content optimization, technical SEO, and off-page SEO. 

More than anything, the guidelines highlight the need for brands to focus their SEO efforts on their unique needs and goals and develop strategies that are likely to remain useful for the foreseeable future (instead of chasing every new SEO trend). 

To read the guidelines for yourself, visit The Washington Post’s site.

Just last week, Google Search Liaison Danny Sullivan once again took to Twitter to dispel a longstanding myth about word counts and search engine optimization (SEO).

The message reads:

“Reminder. The best word count needed to succeed in Google Search is … not a thing! It doesn’t exist. Write as long or short as needed for people who read your content.”

Sullivan also linked to long-existing help pages and included a screencap of a statement from these pages which says:

“Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t.)”

Of course, this is not a new message from Google. Still, many of the most popular SEO tools and experts still claim that anywhere between 300 to 1,500 words is ideal for ranking in Google search results. 

Incidentally, a day later Google’s John Mueller also responded to an SEO professional who asked whether there was a “correlation between word count and outranking competition.” In a short but simple reply, Mueller said: “Are you saying the top ranking pages should have the most words? That’s definitely not the case.”

Most likely, this myth of an ideal SEO word count will persist as long as search engine optimization exists in its current form. Still, it is always good to get a clear reminder from major figures at Google that content should be as long as necessary to share valuable information with your audience – whether you can do that in a couple of sentences or an exhaustive multi-thousand-word piece.

One of Google’s most visible spokespeople, John Mueller, made a rare appearance on Reddit to answer a series of “dumb” SEO questions covering everything from geotagging images to how often you should blog.

In a thread on the r/BigSEO subreddit called “incoming dumb question barrage”, a user asked a series of five questions:

  1. Should we be geotagging images. Does Google even care?
  2. Blogging. If we do it, is it everyday or once a week with some seriously solid stuff?
  3. Google Business Profile posting: Everyday, once a week, or why bother?
  4. Since stuff like Senuke died 10 years ago, is it all about networking with webmasters of similar and same niche sites for links?
  5. Piggybacking off #4, what about PBNs? Are they back? If so, does it have to be a group of completely legit looking websites vs some cobbled together WP blogs?

Mueller provided a series of candid answers which we will get into below:

Geotagging Images

Here Mueller kept it short and sweet: “No need to geotag images for SEO.”

How Often Should You Blog?

As always, Google won’t provide a specific post frequency that is “best” for SEO blog content. Rather, Mueller says to post “as often as you have something unique & compelling to say.”

However, the Google Search Advocate admits that more frequent posting can drive more traffic if you are able to maintain the quality of your content.

“The problem with trying to keep a frequency up is that it’s easy to end up with mediocre, fluffy content, which search engine quality algorithms might pick up on.”

Additionally, he indicates that those who are using AI to create a lot of content quickly are unlikely to be rewarded.

Google Business Profile Posting Frequency

Unfortunately, this is not Mueller’s area of expertise. His answer was a simple “no idea.”

Outdated Linkbuilding Strategies

The last two questions asked whether older methods for link building are still relevant at all. Clearly, this tickled Mueller, as he largely dismissed both approaches.

“SENuke, hah, that’s a name I haven’t heard in ages, lol. Sorry. Giggle. I have thoughts on links, but people love to take things out of context to promote their link efforts / tools, so perhaps someone else will say something reasonable, or not.

“OMG, PBNs too. What is this thread even. Now I won’t say anything without a lawyer present.”

No Shortcuts To Online Riches

Of course, there is an underlying current connecting all of these questions. Mueller takes note of this as well, saying:

“Reading between the lines, it seems you want to find a short-cut to making money online.”

The truth is, there are no real shortcuts to online success these days. However, there are a lot of questionable people willing to take your money to provide tools and courses that often get you nowhere. 

“Unfortunately, there’s a long line of people trying to do the same, and some have a lot of practice. Some will even sell you tools and courses on how to make money online (and *they* will be the ones making the money, fwiw, since people pay them for the tools and courses). The good tools cost good money, and they’re not marketed towards people who just want to make money online — they’re targeted at companies who need to manage their online presence and report on progress to their leadership chain.”

At the same time, Mueller encourages individuals such as the person who started the thread to keep learning and practicing SEO:

“… learn HTML, learn a bit of programming, and go for it. 90% of the random tricks you run across won’t work, 9% of the remaining ones will burn your sites to the ground, but if you’re lucky & persistent (is that the same?), you’ll run across some things that work for you.

“If you want to go this route, accept that most – or all – of the things you build will eventually blow up, but perhaps you’ll run into some along the way that make it worthwhile.”

“And … after some time, you might notice that actually building something of lasting value can also be intriguiing [sic], and you’ll start working on a side-project that does things in the right way, where you can put your experience to good use and avoid doing all of the slash & burn site/spam-building.”

Having a robust backlink profile remains one of the most crucial factors for ranking a webpage highly in search, so it is always big news when Google actually tells us what it looks for in quality links. 

Yesterday, the search engine published a new set of guidelines and best practices for building backlinks, detailing how to make your links crawlable, how to craft well-ranking anchor text, and how to best establish internal links on your site. 

Below, we will cover all the new guidelines and best SEO practices for links on your website according to Google:

Crawlable Links

As the page Google updated was originally dedicated to specifically making links crawlable, this section remains largely unchanged. It reads, “Generally, Google can only crawl your link if it’s an <a> HTML element (also known as anchor element) with an href attribute. Most links in other formats won’t be parsed and extracted by Google’s crawlers. Google can’t reliably extract URLs from <a> elements that don’t have an href attribute or other tags that perform as links because of script events.”
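
To illustrate (as a hypothetical sketch – the URLs and link text here are invented placeholders), the first link below is a standard anchor element that Google can crawl, while the second relies on a script event and may not be picked up:

    <!-- Crawlable: an <a> element with an href attribute -->
    <a href="https://example.com/widgets">Browse our widget catalog</a>

    <!-- Not reliably crawlable: no href attribute; navigation depends on a script event -->
    <span onclick="window.location='https://example.com/widgets'">Browse our widget catalog</span>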

Anchor Text Placement 

The best practice for placing anchor text for links reads: “Anchor text (also known as link text) is the visible text of a link. This text tells people and Google something about the page you’re linking to. Place anchor text between <a> elements that Google can crawl.”

Writing Anchor Text

As for the anchor text itself, Google encourages you to balance descriptiveness with brevity: “Good anchor text is descriptive, reasonably concise, and relevant to the page that it’s on and to the page it links to. It provides context for the link, and sets the expectation for your readers. The better your anchor text, the easier it is for people to navigate your site and for Google to understand what the page you’re linking to is about.”
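
As a hypothetical example of the difference descriptive anchor text makes (the page and wording are invented for illustration):

    <!-- Vague: tells readers and Google nothing about the destination page -->
    <a href="https://example.com/winter-tire-guide">Click here</a>

    <!-- Descriptive and reasonably concise: sets expectations for readers and crawlers -->
    Read <a href="https://example.com/winter-tire-guide">our guide to choosing winter tires</a> before the first snow.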

Internal Links 

While Google emphasizes the importance of internal links on your website, it also states that the search engine doesn’t look for a target number of links.

“You may usually think about linking in terms of pointing to external websites, but paying more attention to the anchor text used for internal links can help both people and Google make sense of your site more easily and find other pages on your site. Every page you care about should have a link from at least one other page on your site. Think about what other resources on your site could help your readers understand a given page on your site, and link to those pages in context.”
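
For instance, a contextual internal link might look something like this invented snippet, which points readers of one page toward a related resource on the same site:

    <p>
      Once you've picked out a set of tires, see
      <a href="/articles/tire-rotation-schedule">how often you should rotate your tires</a>
      to get the most life out of them.
    </p>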

External Links

When it comes to external links, Google has advice for creating powerful links that don’t come off as spam: “Linking to other sites isn’t something to be scared of; in fact, using external links can help establish trustworthiness (for example, citing your sources). Link out to external sites when it makes sense, and provide context to your readers about what they can expect.”

If your site gets hit with an algorithmic penalty from Google, you’ll likely be eager to fix the issue and improve your rankings again. However, Google’s top experts say it can take quite some time to recover if they believe your site is spammy.

In a recent Google SEO Office Hours session, representatives were asked how long it can take to recover from an algorithm penalty related to content quality problems. 

While many details about the question remain unclear – such as how significant the penalty is – the search engine’s spokespeople encouraged site owners to be proactive. Otherwise, it may be months before they regain ground in the search results.

Specifically, the question posed in the video is:

“If a website gets algorithmically penalized for thin content, how much of the website’s content do you have to update before the penalty is lifted?”

There are a few ways the question could be read, so in this case, the experts kept it simple and straight to the point:

“Well, it’s generally a good idea to clean up low-quality content or spammy content that you may have created in the past.

For algorithmic actions, it can take us months to reevaluate your site again to determine that it’s no longer spammy.”

In other words, it is always better to share high-quality original content than to risk being labeled as spam. Once that happens, you’ll likely be in the doghouse for at least a few months.

To hear the full answer, check out the recording of the Google SEO Office Hours session beginning at 24:24.

Keeping up with all of Google’s ranking algorithms and systems can be a lot. It seems like every time you turn around, the search engine has pushed out some new ranking system that brands need to be aware of if they want to reach users on the largest search engine around. 

Making matters even more complicated, Google also occasionally retires older systems as they become obsolete or redundant over the years.

Thankfully, Google has released a comprehensive guide to its many different ranking systems so you can be sure you are optimized for the most important ranking signals without investing resources into systems that are out of use. 

Ranking Systems Vs. Ranking Updates

Along with information about each ranking system and how it influences your standings on Google Search, the guide clarifies the distinction between ranking updates and ranking systems.

These terms have been used somewhat interchangeably but Google is finally drawing a clear line between the two.

According to the guide, a ranking system is something that is constantly operating behind the scenes – such as RankBrain or the helpful content system.

On the other hand, a ranking update is a one-time change to the ranking systems. For example, Google regularly rolls out updates to its spam detection systems.

Active Google Ranking Systems

Here are Google’s currently active ranking systems in alphabetical order:

  • BERT: BERT (or Bidirectional Encoder Representations from Transformers) is an AI system that allows Google to understand how combinations of words may change meanings and intent.
  • Crisis Information Systems: This is a system Google has in place to handle important information during times of crisis – both personal and public. For example, the system helps intervene when users search for content related to potentially dangerous personal crises, such as suicide, sexual assault, or poison ingestion.
  • Deduplication Systems: This is used to help Google avoid delivering search results with duplicate or nearly identical content.
  • Exact Match Domain System: A system used to balance the importance of ranking brands highly for searches containing their exact business name without giving too much credit to sites with domain names that exactly match broader queries.
  • Freshness Systems: Google’s freshness systems work to show newer content more prominently for queries where it would be expected.
  • Helpful Content System: The relatively new Helpful Content System works to ensure that users see original content written with their needs in mind, rather than content crafted specifically to rank well.
  • Link Analysis Systems and PageRank: These systems determine what content is about and what pages may be most helpful for specific queries based on how pages across the web are linked together.
  • Local News Systems: Google uses this to highlight information from local news sources when they will be the best resource for a query.
  • Neural Matching: This lets Google understand representations of concepts in queries and match them with the most relevant pages.
  • Original Content Systems: Google’s Original Content Systems help identify the original source of content and highlight it above pages that simply cite it.
  • Page Experience System: The Page Experience System is designed to assess which sites will provide the best user experience.
  • Passage Ranking System: Passage ranking is an AI system used to identify specific sections of content which may be most relevant for search.
  • Product Reviews System: As part of Google’s shopping tools in search, Google uses the Product Reviews System to reward highly reviewed products and to showcase reviews that contain the most insightful or relevant information.
  • RankBrain: RankBrain is an AI system crucial to the search engine’s ability to understand how words and concepts are related and return more relevant content – even when all the exact words in a search may not be present.
  • Reliable Information Systems: These are a number of systems that ensure Google’s search results prioritize information from reliable sources.
  • Removal-Based Demotion Systems: These systems demote sites that are subject to a high volume of content removal requests.
  • Site Diversity System: The Site Diversity System prevents Google from showing more than two specific pages from the same domain in the top results for a query.
  • Spam Detection Systems: The Spam Detection Systems identify content and behaviors which violate Google’s spam policies and deal with them appropriately by demoting or delisting them.

Retired Google Ranking Systems

  • Hummingbird: Originally rolled out in 2013, Hummingbird was a broad overhaul of Google’s ranking systems. Google’s systems have since evolved past the need for it.
  • Mobile-Friendly Ranking System: This system rewarded sites that were optimized to render well on mobile devices. Since then, it has been absorbed into the Page Experience System.
  • Page Speed System: Initially a standalone system that highlighted sites that loaded quickly on mobile devices, this system has since been incorporated into the Page Experience System.
  • The Panda System: Panda was released in 2011 with the purpose of surfacing high-quality, original content. Since 2015, it has been part of Google’s core ranking systems.
  • The Penguin System: The “cousin” to Panda, Penguin demoted websites that used spammy linkbuilding strategies to rank abnormally well. It has been part of the core ranking systems since 2016.
  • Secure Sites System: Originally, it gave a small boost to sites that adopted HTTPS security protocols when it was less commonly used across the web. Though HTTPS sites are much more common these days, the system is still in use as part of Google’s Page Experience System.

With the holidays approaching, SEO platform BrightEdge is releasing its yearly list of important optimization trends ecommerce brands should know about.

Based on data collected by tracking over 6,000 ecommerce keywords across 10 categories over the past three years, the latest list makes one thing very clear – successful ecommerce brands are increasingly relying on content creation to drive their sales.

Of the top five new trends covered, three highlight different ways content creators and other publishers are leading the ecommerce market by delivering the most valuable content to consumers at the right times.

Let’s explore the latest ecommerce trends below:

1) Brands And Publishers Are Siphoning Away Retail Traffic

Retailers these days have a lot of competition to contend with online. Not only are you fighting to stand out among the slew of other online retailers, but you have to outrank brands and publishers in search results. 

According to the report, retailers’ performance for top ecommerce keywords is down 70% from 2020. Meanwhile, brands are making headway into shopping results by adopting direct-to-consumer models while content publishers are attracting attention with product overviews and reviews.

2) Retailers Are Driving Ecommerce With Content

While the report does not include data impacted by the new “helpful content update”, the data does emphasize that retailers who publish quality content are more effectively able to differentiate their brand and their products from those who only offer product descriptions.

Specifically, BrightEdge says retailers should:

  • Focus on creating context for your products through content.
  • Organize categories in ways that make it easier to learn about and shop multiple related products.

3) Organic Links Are Still Crucial

As Google’s ad platform and other features like localized business listings have taken over more and more space in search results, many have suggested that organic search results have lost their importance. 

However, BrightEdge’s data suggests that classic organic search results are still the most effective traffic source for retail brands. For the top ecommerce results, up to 70% of all clicks went to organic search results.

In fact, it appears Google may be aware that online shopping-related searches are best served through organic search results, as local packs, videos, and image carousels have all become less common for ecommerce searches. 

4) More Ecommerce Sites Are Adopting Schema Markup

Brands, publishers, and retailers involved in ecommerce are all increasingly adopting a few specific types of schema markup to make their pages easier for search engines to understand and index. 

Specifically, these three schema types have seen significantly increased usage around shopping results, as illustrated in the sketch after this list:

  • Product
  • ImageObject
  • ItemList
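
As a rough sketch of what this looks like in practice, Product markup is typically embedded in a page as a JSON-LD script. The product details below are invented placeholders, not a template from BrightEdge or Google:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Acme Blue Widget",
      "image": "https://example.com/images/blue-widget.jpg",
      "description": "A durable blue widget for everyday use.",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>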

5) Article and Category Pages Dominate Ecommerce

Category pages have always been a major driver of clicks for ecommerce, and this remains true in 2022. For the top keywords, category pages have the highest click-through rate 70% of the time. However, BrightEdge notes that articles about products have recently seen higher click-through rates than links directly to product pages.


Google’s search results are always shifting. It is important for brands to stay aware of the latest trends in their market and adapt the most effective SEO strategies if they want to stay ahead of the competition – especially leading up to the holiday season.