For years, backlinks have been considered one of the most important factors for ranking on Google’s search engine. In 2016, the company even confirmed as much when a senior search quality strategist said that the top ranking factors were links, content, and RankBrain.

According to new comments from Google’s Gary Illyes, an analyst for Google Search, things have changed since then.

What Was Said

During a panel at Pubcon Pro, Illyes was asked directly whether links are still one of the top three ranking factors. In response, here is what he said:

“I think they are important, but I think people overestimate the importance of links. I don’t agree it’s in the top three. It hasn’t been for some time.”

Illyes even went as far as to say there are cases where sites with absolutely zero links (internal or external) consistently rank in the top spot because they provide excellent content.

The Lead Up

Gary Illyes isn’t the first person from Google to suggest that links have lost the SEO weight they used to carry. Last year, Dan Nguyen from the search quality team stated that links had lost their impact during a Google SEO Office Hours session:

“First, backlinks as a signal has a lot less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries.”

Other major figures at Google, including Matt Cutts and John Mueller, have predicted this would happen for years. As far back as 2014, Cutts (a leading figure at Google at the time) said:

“I think backlinks still have many, many years left in them. But inevitably, what we’re trying to do is figure out how an expert user would say, this particular page matched their information needs. And sometimes backlinks matter for that. It’s helpful to find out what the reputation of the site or a page is. But, for the most part, people care about the quality of the content on that particular page. So I think over time, backlinks will become a little less important.”

Ultimately, this shift was bound to happen because search has become so much more complex. With each query, Google considers the searcher’s intent, the actual wording, and personal information to help tailor the results for each user. With so much in flux, we have reached a point where the most important ranking signals may even differ based on the specific site that is trying to rank.

Bing made quite the splash six months ago with the launch of its new AI-powered search experience, Bing Chat. New data, which Microsoft disputes, suggests the new experience may have failed to make much of a lasting impression.

According to the latest report from StatCounter, Bing saw a brief boost to its share of the search market, peaking at 6.61% in March (about a month after the launch of the new search experience). However, its current share (6.47%) is just 0.12 percentage points above where it stood (6.35%) when the new experience launched in February.

Even worse, Bing consistently held a higher share of the search market throughout 2022, reaching a high of 7.82% in November.

Is The Data Accurate?

StatCounter has long been considered a reliable source of data on search engine traffic and market share, often cited by major news publications.

Microsoft, on the other hand, has often disputed its findings – just as it did with this report.

In a statement, a Microsoft representative told The Wall Street Journal that “third-party data companies aren’t measuring all the people who are going directly to Bing’s chat page.”

Microsoft Corporate VP Yusuf Mehdi also claimed that “we’ve made more progress in the last six months than we have in the previous decade or two combined.”

While Microsoft may argue that a small percentage of searches are not included in StatCounter’s numbers, those users would ultimately have a minimal effect on most analyses. The search engine has always struggled to gain ground on Google, and it looks like its implementation of AI has done little to help.

A recent article from Gizmodo has lit up the world of SEO, drawing a rebuke from Google and sparking extensive conversation about when it’s right to delete old content from your website.

The situation kicked off when Gizmodo published an article detailing how CNET had supposedly deleted thousands of pages of old content to “game Google Search.”

What makes this so interesting is that deleting older content that is not performing well is a long-recognized part of search engine optimization called “content pruning.” By framing its article as “exposing” CNET for dirty tricks, Gizmodo sparked a discussion about when content pruning is effective and whether SEO is inherently negative for a site’s health.

What Happened

It all began when CNET appeared to redirect, repurpose, or fully remove old pages based on analytics data, including pageviews, backlink profiles, and how long a page had gone without an update.

An internal memo obtained by Gizmodo shows that CNET did this believing that deprecating and removing old content “sends a signal to Google that says CNET is fresh, relevant, and worthy of being placed higher than our competitors in search results.”
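To make the idea concrete, here is a minimal Python sketch of what a pruning triage based on those kinds of signals might look like. It is purely illustrative: the Page structure, thresholds, and category names are all hypothetical, not taken from CNET’s memo or from any Google guidance.

    from dataclasses import dataclass

    # Hypothetical triage using the kinds of signals reportedly in play:
    # pageviews, backlink profile, and time since the last update.
    # Every threshold below is an invented example, not a recommendation.
    @dataclass
    class Page:
        url: str
        monthly_pageviews: int
        referring_domains: int
        days_since_update: int

    def triage(page: Page) -> str:
        if page.referring_domains > 10:
            return "keep"      # strong backlink profile: update rather than delete
        if page.monthly_pageviews < 50 and page.days_since_update > 730:
            return "redirect"  # stale and rarely read: 301 to a closely related page
        return "review"        # everything else gets a human decision

    print(triage(Page("/2009/old-review", monthly_pageviews=12,
                      referring_domains=1, days_since_update=1400)))
    # -> "redirect"

The point of a triage like this is that deletion is never automatic: pages with real value (traffic or backlinks) are kept and refreshed, and only genuinely stale, unread content is redirected or removed.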

What’s The Problem?

First, simply deleting old content does not send a signal that your site is fresh or relevant. The only way to do this is by ensuring your content itself is fresh and relevant to your audience. 

That said, there can be benefits to removing old content if it is not actually relevant or high-quality. 

The biggest issue here seems to be that CNET believed old content is inherently bad, but there is no such “penalty” or harm in leaving older content on your site if it may still be relevant to users.

As Google Search Liaison Danny Sullivan posted on X (formerly Twitter):

“Are you deleting old content from your site because you somehow believe Google doesn’t like ‘old’ content? That’s not a thing! Our guidance doesn’t encourage this. Old content can still be helpful, too.”

Which Is It?

The real takeaway from this is a reminder that Google isn’t as concerned with “freshness” as many may think. 

Yes, the search engine prefers sites that appear to be active and up-to-date, which includes posting relevant new content regularly. That said, leaving old content on your site won’t hurt you – unless it’s low-quality. Removing low-quality or irrelevant content can help improve your overall standing with search engines by showing that you recognize when content isn’t up to snuff. Just don’t go deleting content solely because it is ‘old’.

The Washington Post may not be the first organization you imagine when you think about SEO experts, but as a popular news organization read by millions around the world, The Post has dealt with its fair share of issues in developing its long-term strategies for web performance and SEO. 

Now, the news site is sharing the fruit of that hard work by releasing its own Web Performance and SEO Best Practices and Guidelines.

These guidelines help ensure that The Washington Post remains competitive and visible in highly competitive search spaces, drives more organic traffic, and maintains a positive user experience on its website. 

In the announcement, engineering lead Arturo Silva said:

“We identified a need for a Web Performance and SEO engineering team to build technical solutions that support the discovery of our journalism, as the majority of news consumers today read the news digitally. Without proper SEO and web performance, our stories aren’t as accessible to our readers. As leaders in engineering and media publishing, we’re creating guidelines that serve our audiences and by sharing those technical solutions in our open-source design system, we are providing tools for others to certify that their own site practices are optimal.”

What’s In The Washington Post’s SEO and Web Performance Guidelines?

If you’re hoping to see a surprise trick or secret tool being used by The Washington Post, you are likely to be disappointed. 

The guidelines are largely in line with practices used by most SEO experts, albeit with a focus on the publication’s own search and web performance issues.

For example, the Web Performance section covers three specific areas: loading performance, rendering performance, and responsiveness. Similarly, the SEO guidelines are split into on-page SEO, content optimization, technical SEO, and off-page SEO. 
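To give a taste of what “loading performance” work involves in practice, here is a small, illustrative Python probe that measures time to first byte (TTFB), one common loading metric. This is a rough sketch of the kind of check such guidelines describe, not code from The Washington Post’s design system, and the function name and URL are just examples.

    import time
    import urllib.request

    def time_to_first_byte(url: str) -> float:
        """Crude TTFB probe: time from request start until the first body byte."""
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read(1)  # stop as soon as the first byte arrives
        return time.perf_counter() - start

    # Example (any public URL works here):
    print(f"TTFB: {time_to_first_byte('https://www.washingtonpost.com') * 1000:.0f} ms")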

More than anything, the guidelines highlight the need for brands to focus their SEO efforts on their unique needs and goals and develop strategies that are likely to remain useful for the foreseeable future (instead of chasing every new SEO trend). 

To read the guidelines for yourself, visit the Washington Post’s site here. 

Just last week, Google Search Liaison Danny Sullivan once again took to Twitter to dispel a longstanding myth about word counts and search engine optimization (SEO).

The message reads:

“Reminder. The best word count needed to succeed in Google Search is … not a thing! It doesn’t exist. Write as long or short as needed for people who read your content.”

Sullivan also linked to long-existing help pages and included a screencap of a statement from those pages that says:

“Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t.)”

Of course, this is not a new message from Google. Still, many of the most popular SEO tools and experts claim that anywhere between 300 and 1,500 words is ideal for ranking in Google search results.

Incidentally, a day later, Google’s John Mueller also responded to an SEO professional who asked whether there was a “correlation between word count and outranking competition.” In a short but simple reply, Mueller said, “Are you saying the top ranking pages should have the most words? That’s definitely not the case.”

Most likely, this myth of an ideal SEO word count will persist for as long as search engine optimization exists in its current form. Still, it is always good to get a clear reminder from major figures at Google that content should be as long as necessary to share valuable information with your audience – whether you can do that in a couple of sentences or in exhaustive, multi-thousand-word content.

Microsoft is overhauling its Bing search engine’s mobile experience with new features, better formatting, and integration with mobile apps for Skype and Edge.

The news came from Microsoft’s Global Head of Marketing, Divya Kumar, who showcased the new mobile experience and upcoming features in a blog post. 

Previewed Features Are Arriving This Week

First, Kumar announced that several features previewed in May will be launched over the next week. These features include:

  • Richer video experience on mobile and desktop
  • Knowledge Cards
  • Graphs in search results
  • Improved formatting
  • Better social sharing abilities

Along with these updates, Kumar says that chat history will be coming to desktop over the next week after already arriving on mobile. To access your chat history, hit the clock icon in the top right of an existing chat.

New Updates To Bing

The bulk of the announcement is dedicated to highlighting upcoming features for users on mobile devices.

For starters, Microsoft is premiering a Bing Chat widget that can be added directly to iOS or Android home screens, making it possible to launch the new Bing Chat tools with just a tap.

Additionally, Divya Kumar says that Bing is implementing the ability to continue a conversation across different platforms if you are signed in. For example, a user might start a conversation on desktop and pick up where they left off after moving to a mobile device.

Microsoft is also working to improve language support for non-English users with better voice input.

Third-Party App Integration

Microsoft has integrated its AI tools into its mobile keyboard app, SwiftKey, to make drafting new messages more efficient and intuitive.

Additionally, the company is bringing Bing’s AI abilities to Skype by making the new Bing experience available from within any group chat. Just tag Bing in the chat to access the tools.

Why It Matters

Bing has been pushing to change its standing among search engines through its diverse AI tools and major updates to all of its services. According to the announcement, the effort seems to be working.

The company says it has seen 8x as many daily downloads since it launched the new AI-assisted Bing, and it expects further growth as it continues to develop these tools and products.

After months of rumors and speculation, Google’s AI-powered generative search experience is here – sort of. 

The new conversational search tool is available only as a Google Labs experiment, accessible by signing up for a waitlist. That means it is not replacing the current version of Google Search (at least, not yet), but it is the first public look at what is likely to be the biggest overhaul to Google Search in decades.

Though we at TMO have been unable to get our hands on the new search experience directly, we have gathered the most important details from those who have, to show you what to expect when the generative search experience becomes more widely available.

What The AI-Powered Google Generative Search Experience Looks Like

The new experience appears at the very top of Google search results, giving context, answering basic questions, and providing a conversational way to refine your search for better results.

Notably, any AI-generated search information is currently tagged with a label that reads “Generative AI is experimental.”

Google will also subtly shade AI content based on the specific search to “reflect specific journey types and the query intent itself.” For example, AI-created results for shopping-related searches are placed on a light blue background.

Where Does The Information Come From?

Unlike most current AI-powered tools, Google’s new search experience cites its sources. 

Sources are mentioned and linked to, making it easier for users to keep digging. 

Additionally, the AI tools can pull from Google’s existing search tools and data, such as Google Shopping product listings and more. 

Conversational Search

The biggest change that comes with the new AI-powered search is the ability to ask follow-up questions that use context from your previous search. As the announcement explains:

“Context will be carried over from question to question, to help you more naturally continue your exploration. You’ll also find helpful jumping-off points to web content and a range of perspectives that you can dig into.”

What AI Won’t Answer

The AI-powered tool will not provide information on a range of topics that might be sensitive or where accuracy is particularly important. For example, Google’s AI tools won’t give answers about giving medicine to a child because of the potential risks involved. Similarly, reports suggest the tool won’t answer questions about financial issues.

Additionally, Google’s AI-powered search will not discuss or provide information on topics that may be “potentially harmful, hateful, or explicit”.

To try out the new Google AI-powered generative search experience for yourself, sign up for the waitlist here.

Google Discover will not show content or images that would normally be blocked by the search engine’s SafeSearch tools. 

Though not surprising, this is the closest we have come to seeing this confirmed by someone at Google. Google Search Liaison Danny Sullivan responded to a question on Twitter from SEO professional Lily Ray. In a recent tweet, Ray posed the question:

“Is the below article on SafeSearch filtering the best place to look for guidance on Google Discover? Seems that sites with *some* adult content may be excluded from Discover entirely; does this guidance apply?”

In his initial response, Sullivan wasn’t completely certain but stated: “It’s pretty likely SafeSearch applies to Discover, so yes. Will update later if that’s not the case.”

While Sullivan never came back to state this was not the case, he later explained that “our systems, including on Discover, generally don’t show content that might be borderline explicit or shocking etc. in situations where people wouldn’t expect it.”

Previously, other prominent figures at Google including Gary Illyes and John Mueller had indicated this may be the case, also suggesting adult language may limit the visibility of content in Discover. 

For most brands, this won’t be an issue, but more adult-oriented brands may struggle to appear in the Discover feed, even with significant optimization.

To understand and rank websites in search results, Google constantly uses tools called crawlers to find and analyze new or recently updated web pages. What may surprise you is that the search engine actually uses three different types of crawlers depending on the task at hand. In fact, some of these crawlers may ignore the robots.txt rules used to control how crawlers interact with your site.

In the past week, those in the SEO world were surprised by the reveal that the search engine had begun using a new crawler, GoogleOther, to relieve the strain on its main crawlers. Amid this, I noticed some asking: “Google has three different crawlers? I thought it was just Googlebot” (the most well-known crawler, which the search engine has used for over a decade).

In reality, the company uses quite a few more than just one crawler, and it would take a while to go into exactly what each one does (Search Engine Roundtable has published a full list).

However, Google recently updated a help document called “Verifying Googlebot and other Google crawlers” that breaks all these crawlers into three specific groups. 

The Three Types of Google Web Crawlers

Googlebot: The first type of crawler is easily the most well-known and recognized. Googlebot is the tool used to crawl and index pages for the company’s main search results. It always observes the rules set out in robots.txt files.

Special-case Crawlers: In some cases, Google creates crawlers for very specific functions, such as AdsBot, which assesses web page quality for those running ads on the platform. Depending on the situation, these crawlers may ignore the rules dictated in a robots.txt file.

User-triggered Fetchers: When a user does something that requires the search engine to verify information (when the Google Site Verifier is triggered by a site owner, for example), Google uses special fetchers dedicated to these tasks. Because these requests are initiated by a user to complete a specific process, the fetchers ignore robots.txt rules entirely.

Why This Matters

Understanding how Google analyzes and processes the web can help you better optimize your site for the best possible performance. Additionally, it is important to identify the crawlers used by Google and filter them out of your analytics tools, or they can appear as false visits or impressions.
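The help document also describes a reverse-then-forward DNS check for confirming that a visitor claiming to be a Google crawler really is one. Here is a minimal Python sketch of that check; the domain suffixes reflect Google’s documentation, while the helper name and example IP address are just illustrations.

    import socket

    # Domain suffixes Google documents for its crawlers and fetchers.
    GOOGLE_DOMAINS = (".googlebot.com", ".google.com", ".googleusercontent.com")

    def is_google_crawler(ip: str) -> bool:
        try:
            # Step 1: reverse DNS - map the IP back to a hostname.
            host, _, _ = socket.gethostbyaddr(ip)
        except socket.herror:
            return False  # no PTR record, so it can't be verified
        if not host.endswith(GOOGLE_DOMAINS):
            return False  # hostname isn't on a Google-owned domain
        try:
            # Step 2: forward DNS - the hostname must resolve back to the same IP.
            return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
        except socket.gaierror:
            return False

    # Example with a commonly cited Googlebot address:
    print(is_google_crawler("66.249.66.1"))

The two-step check matters because the User-Agent string alone is trivial to fake; only an IP whose reverse and forward DNS records agree can be treated as a verified Google crawler.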

For more, read the full help article here.

Typically, when a site starts ranking worse for one keyword, the effect is also seen across several of the other keywords it ranks for. So what does it mean when a website loses rankings for just one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem.

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner whose site had effectively disappeared from the search results for a specific keyword – despite consistently ranking at the top of results in the past.

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating whether the issue is isolated to a specific region or part of a larger, global problem.

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 

Sudden changes to your backlink profile – either through mass disavowing of links or through the use of low-quality or spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).

To hear the full discussion, check out the video below: