To understand and rank websites in search results, Google constantly uses tools called crawlers to find and analyze new or recently updated web pages. What may surprise you is that the search engine actually uses three different types of crawlers depending on the situation. In fact, some of these crawlers may even ignore the robots.txt rules used to control how they interact with your site.

In the past week, those in the SEO world were surprised by the reveal that the search engine had begun using a new crawler, called GoogleOther, to relieve the strain on its main crawlers. Amidst this, I noticed some asking, “Google has three different crawlers? I thought it was just Googlebot” (the most well-known crawler, which the search engine has used for over a decade).

In reality, the company uses quite a few more crawlers than just one, and it would take a while to go into exactly what each one does, as you can see from the list of them (from Search Engine Roundtable) below:

However, Google recently updated a help document called “Verifying Googlebot and other Google crawlers” that breaks all these crawlers into three specific groups. 
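
Among other things, that document explains how to confirm that a visitor claiming to be a Google crawler is the real thing: run a reverse DNS lookup on the requesting IP address and check that the resulting hostname resolves back to the same IP. Here is a minimal sketch of that check in Python; the IP address is only an illustrative example, so substitute one from your own server logs.

```python
# A minimal sketch of the reverse-then-forward DNS check for verifying that a
# request really came from a Google crawler.
import socket

def is_google_crawler(ip_address: str) -> bool:
    """Return True if the IP reverse-resolves to a Google crawler hostname
    and that hostname resolves back to the same IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)  # reverse DNS lookup
    except socket.herror:
        return False  # no reverse DNS record at all

    # Genuine Google crawlers resolve to googlebot.com or google.com hosts.
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False

    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward-confirm
    except socket.gaierror:
        return False

    return ip_address in forward_ips

print(is_google_crawler("66.249.66.1"))  # an address claiming to be Googlebot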

The Three Types of Google Web Crawlers

Googlebot: The first type of crawler is easily the most well-known and recognized. Googlebot is the crawler used to index pages for the company’s main search results, and it always observes the rules set out in robots.txt files.

Special-case Crawlers: In some cases, Google creates crawlers for very specific functions, such as AdsBot, which assesses web page quality for those running ads on the platform. Depending on the situation, these crawlers may ignore the rules dictated in a robots.txt file.

User-triggered Fetchers: When a user does something that requires the search engine to verify information (when the site owner triggers the Google Site Verifier, for example), Google uses special robots dedicated to these tasks. Because the process is initiated by the user, these crawlers ignore robots.txt rules entirely.
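
To see how these groups interact with your robots.txt rules in practice, here is a minimal sketch using Python’s built-in robots.txt parser and a made-up set of rules. As noted above, the special-case crawlers and user-triggered fetchers may not honor those rules the way Googlebot does.

```python
# A minimal sketch of checking what a given crawler may fetch under a set of
# robots.txt rules. The rules and URLs are hypothetical examples.
from urllib import robotparser

SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: AdsBot-Google
Disallow: /landing-pages/drafts/
"""

parser = robotparser.RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

# Googlebot honors the generic (*) group, so /private/ is off-limits to it.
print(parser.can_fetch("Googlebot", "https://example.com/private/report.html"))  # False

# Per Google's crawler documentation, special-case crawlers such as AdsBot only
# obey groups that name them explicitly, so restrictions must address them by name.
print(parser.can_fetch("AdsBot-Google", "https://example.com/landing-pages/drafts/a.html"))  # False
```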

Why This Matters

Understanding how Google analyzes and processes the web can help you better optimize your site’s performance. Additionally, it is important to identify the crawlers used by Google and ensure their traffic is filtered out of your analytics tools, where it can otherwise appear as false visits or impressions.

For more, read the full help article here.

Typically when a site starts ranking worse for one keyword, the effect is also seen for several of the other keywords it ranks for. So what does it mean when a website only loses rankings for one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem. 

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner whose site had effectively disappeared from the search results for a specific keyword – despite consistently ranking at the top of results in the past.

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating whether the issue is isolated to a specific area or is part of a larger, global problem.

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 
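
If you suspect a technical cause, one quick thing to rule out is an accidental noindex directive. The sketch below is a rough first check rather than an audit: the URL is a hypothetical example and the meta tag test is a crude substring match.

```python
# A quick check for two common indexing blockers on a page: a "noindex"
# X-Robots-Tag response header and a "noindex" robots meta tag.
import urllib.request

def find_noindex_signals(url: str) -> list[str]:
    issues = []
    request = urllib.request.Request(url, headers={"User-Agent": "indexing-check/0.1"})
    with urllib.request.urlopen(request) as response:
        # Header-level directive, e.g. "X-Robots-Tag: noindex"
        robots_header = response.headers.get("X-Robots-Tag", "")
        if "noindex" in robots_header.lower():
            issues.append(f"X-Robots-Tag header contains noindex: {robots_header}")

        # Page-level directive, e.g. <meta name="robots" content="noindex">
        html = response.read().decode("utf-8", errors="replace").lower()
        if 'name="robots"' in html and "noindex" in html:
            issues.append("page appears to contain a robots meta tag with noindex")
    return issues

print(find_noindex_signals("https://example.com/some-page"))
```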

Sudden changes to your backlink profile – either through mass disavowing of links or through the use of low-quality or spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).

To hear the full discussion, check out the video below:

Google has confirmed it is rolling out its latest broad core algorithm update, signifying yet another potential shake-up for the search engine’s results.

Google’s broad core algorithm updates are some of the most significant changes to the search engine, compared to the smaller updates that happen multiple times a day. They can affect rankings on search engine results pages (SERPs) across Google’s entire platform.

As is usual with Google, the search company is being tight-lipped about specific details, only going so far as to confirm the latest update. The update is also expected to take several weeks to roll out, so its full impact may not be obvious right away.

With this in mind, it is wise for brands to take note and monitor their own search performance in the coming weeks.

What Can You Do?

Aside from always striving to provide the best online experience possible with your website, there are a few specific steps you can take to safeguard your site from updates like these:

  • Monitor site performance regularly to identify early signs of issues with your site (see the sketch after this list for one way to automate this)
  • Create content geared to your audience’s needs and interests
  • Optimize your site’s performance (including speed, mobile-friendliness, and user experience) to ensure your site isn’t off-putting to visitors
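
As one example of putting the monitoring and performance points above on autopilot, the sketch below queries the PageSpeed Insights v5 API for a page’s Lighthouse performance score. The URL is a hypothetical example, and heavier use of the API generally calls for an API key passed via the key query parameter.

```python
# A minimal sketch of automating a recurring performance check with the
# PageSpeed Insights v5 API.
import json
import urllib.parse
import urllib.request

def pagespeed_performance_score(page_url: str, strategy: str = "mobile") -> float:
    params = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    api_url = f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?{params}"
    with urllib.request.urlopen(api_url) as response:
        data = json.load(response)
    # Lighthouse reports the performance category as a score from 0 to 1.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(pagespeed_performance_score("https://example.com/"))  # e.g. 0.92
```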

TL;DR

Google has launched its latest broad core algorithm update, which could potentially affect rankings across search engine results pages. The update may take several weeks to have its full impact, so brands are advised to monitor their search performance. To safeguard your site, monitor its performance regularly, create audience-specific content, and optimize for speed, mobile-friendliness, and user experience.

Even though the new AI-powered Bing search experience is rolling out to a limited number of users, Microsoft says it is seeing record-setting growth and engagement that may indicate a big shift is coming to the search landscape. 

Though Microsoft is still only receiving a single-digit percentage of overall search volume, these early numbers could be a sign that Google might finally have a real challenger as the new AI-powered Bing and Edge browsing experience becomes more widely available.

Additionally, Microsoft reports it is now seeing more than 100 million daily active users – with around a third of those users being entirely new to Bing. 

Importantly, Microsoft says users are returning to Bing more often each day thanks to expanded uses of the Edge browser and improvements to Bing’s search result relevance.

Microsoft reported that around a third of the users with access to the AI-powered search experience are using the Chat feature every day for a wide variety of tasks including search, content creation, and more. 

While the new AI-powered search experience is likely driving much of this increased engagement and usage, long-term data shows that use of the Edge browser has also been steadily growing over the past two years.

Additionally, Microsoft says the implementation of AI-assisted search has significantly improved the relevance of search results, saying: “The second factor driving trial and usage is that our core web search ranking has taken several significant jumps in relevancy due to the introduction of the Prometheus model so our Bing search quality is at an all-time high.” 

As we are in the early days of Bing’s new AI-powered search and browser experience, it will be interesting to see whether this growth continues – especially once Google’s own AI-powered tools begin to roll out.

For more, read the full report from Microsoft here.

After an… interesting rollout, Bing is making some changes to its much-talked-about AI chatbot. As the company announced yesterday afternoon, Bing will limit users to 50 questions per day and 5 questions per session to rein in the new system. 

Since its rollout, users have been sharing examples of the chatbot, created in a partnership with OpenAI, getting up to all sorts of bad behavior. Some of the most notable include gaslighting users about the year, committing accidental racism, and even trying to blackmail a user by threatening to release personal information.

Early AI Chatbots “Somewhat Broken”

Addressing the situation in a tweet thread, OpenAI CEO Sam Altman admitted that the current AI tools are “somewhat broken” but stressed the importance of letting the world see and influence these early stages to help “get it right” down the line. 

“We think showing these tools to the world early, while still somewhat broken, is critical if we are going to have sufficient input and repeated efforts to get it right. the level of individual empowerment coming is wonderful, but not without serious challenges.”

At the same time, Altman says it is important to regulate these tools while they are more bark than bite, saying “we are potentially not that far away from potentially scary ones.”

What Bing Is Changing

Bing is limiting users to 50 chat “turns” or questions per day, with each session limited to 5 turns. Microsoft defines a turn as a complete exchange: a question from the user and a reply from Bing.

“Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start.”

For more, read the announcement from Bing here.

Having a robust backlink profile remains one of the most crucial factors for ranking a webpage highly in search, so it is always big news when Google actually tells us what it looks for in quality links. 

Yesterday, the search engine published a new set of guidelines and best practices for building backlinks, detailing how to make your links crawlable, how to craft well-ranking anchor text, and how to best establish internal links on your site. 

Below, we will cover all the new guidelines and best SEO practices for links on your website according to Google:

Crawlable Links

As the page Google updated was originally dedicated specifically to making links crawlable, this section remains largely unchanged. It reads: “Generally, Google can only crawl your link if it’s an <a> HTML element (also known as anchor element) with an href attribute. Most links in other formats won’t be parsed and extracted by Google’s crawlers. Google can’t reliably extract URLs from <a> elements that don’t have an href attribute or other tags that perform as links because of script events.”
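
To illustrate the distinction Google is drawing, the sketch below uses Python’s built-in HTML parser and made-up markup to collect only the links that fit the crawlable pattern; links built purely from script events or non-anchor elements are left out, mirroring the limitation described above.

```python
# A minimal sketch: extract only <a> elements that carry an href attribute,
# i.e. the links Google says it can reliably crawl.
from html.parser import HTMLParser

class CrawlableLinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:  # an <a> without an href is not reliably crawlable
                self.links.append(href)

sample_html = """
<a href="/pricing">See our pricing</a>
<a onclick="goTo('/pricing')">See our pricing</a>  <!-- script-only, no href -->
<span class="link">Looks like a link</span>        <!-- not an anchor at all -->
"""

extractor = CrawlableLinkExtractor()
extractor.feed(sample_html)
print(extractor.links)  # ['/pricing']
```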

Anchor Text Placement 

The best practice for placing anchor text for links reads: “Anchor text (also known as link text) is the visible text of a link. This text tells people and Google something about the page you’re linking to. Place anchor text between <a> elements that Google can crawl.”

Writing Anchor Text

As for the anchor text itself, Google encourages you to balance descriptiveness with brevity: “Good anchor text is descriptive, reasonably concise, and relevant to the page that it’s on and to the page it links to. It provides context for the link, and sets the expectation for your readers. The better your anchor text, the easier it is for people to navigate your site and for Google to understand what the page you’re linking to is about.”

Internal Links 

While Google emphasizes the importance of internal links on your website, it also states that the search engine doesn’t look for a target number of links.

“You may usually think about linking in terms of pointing to external websites, but paying more attention to the anchor text used for internal links can help both people and Google make sense of your site more easily and find other pages on your site. Every page you care about should have a link from at least one other page on your site. Think about what other resources on your site could help your readers understand a given page on your site, and link to those pages in context.”

External Links

When it comes to external links, Google has advice for creating powerful links that don’t come off as spam: “Linking to other sites isn’t something to be scared of; in fact, using external links can help establish trustworthiness (for example, citing your sources). Link out to external sites when it makes sense, and provide context to your readers about what they can expect.”

In a closed-door presentation at the Microsoft offices, the company revealed it would be integrating ChatGPT’s AI capabilities into Bing and Microsoft web browsers. 

Introducing the new feature, Microsoft CEO Satya Nadella reportedly told event attendees that “this technology is going to reshape pretty much every software category.”

Billed as “your AI-powered copilot for the web”, the new feature unites information from Bing with the capabilities of the Edge web browser and artificial intelligence. With it, users can turn to AI features to get direct answers to questions, find information in more effective ways, and recontextualize the content they find.

According to Nadella, search engines currently fail to deliver the most efficient experience up to 40% of the time, causing users to click on a result and immediately click back to the results page.

With these new capabilities, Microsoft hopes to change that radically.

How It Works

The new AI-powered Bing uses a next-generation language model from OpenAI (the creators of ChatGPT) which is reportedly even more powerful than ChatGPT.

Additionally, Microsoft is using a new model to improve the relevance of answers and keep them up to date. 

Nadella says this AI model has already been applied to Bing’s core search algorithm, causing the biggest jump in search relevance ever. 

The centerpiece of the new experience is an expanded search box that allows users to input up to 1,000 characters and a chatbot that allows users to interact with Bing in conversational language. 

Notably, this means the search engine will be able to easily follow up on previous searches and provide greater context or translate information into more understandable formats.

For example, the company demonstrated the AI’s capabilities by quickly summarizing a 15-page PDF with a single click or translating a piece of code into another programming language.

Try It Yourself

The new search experience is seeing a limited preview release on desktop devices. Starting today, anyone can try out the new experience by visiting Bing.com and conducting a series of sample searches. 

However, the feature is expected to see a complete release, along with a mobile version, soon.

If you’re still unclear on how Google thinks about marketing agencies that offer negative SEO linkbuilding services or link disavowal services, the latest comments from John Mueller should help clarify the company’s stance. 

In a conversation that popped up on Twitter between Mueller and several marketing experts, Mueller clearly and definitively slammed companies offering these types of services by saying that they are “just making stuff up and cashing in from those who don’t know better.”

This is particularly notable, as some have accused Google of being unclear about its handling of link disavowals made using its own tools.

The post that started it all came from Twitter user @RyanJones who said, “I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

In response, one user began talking about negative SEO, which caught Mueller’s attention. The user mentioned that “agencies know what kind of links hurt the website because they have been doing this for a long time. It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well. They will provide you examples as well with proper insights.”

In response, Mueller gave what is possibly his clearest statement on this type of “service” yet:

“That’s all made up & irrelevant. These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Instead of spending time and effort on any of this, Mueller instead recommended something simple:

“Don’t waste your time on it; do things that build up your site instead.”

Google is encouraging brands to ensure content is properly dated in search engines by using multiple date indicators on each page. 

The recommendation came in the wake of an issue with Google News where the wrong dates were being shown.

In his response, Google’s Search Liaison, Danny Sullivan, emphasized that while many factors may have contributed in this specific situation, the lack of proper date signals made it difficult to show the correct information in search results.

“That page is a particular challenge since the main story lacks a visible date (it only has a time), and the page contains multiple stories which do contain full dates. Our guidance warns about this.”

To prevent situations like this from arising, Sullivan says it is important to use several signals to clarify the date content is published:

“Understand that ideally, the meta data alone would seem to some to be enough, and we’ll keep working to improve. But there are good reasons why we like multiple date signals present.”

Why Does This Matter?

It may not seem like a big deal for the wrong date to occasionally be shown with content in the search results. However, these errors can undermine your authority, lead to confusion, and create a poor user experience. All of these can lead to decreased page performance and even demotions in Google’s search results.

On the other hand, situations like this also highlight the need for Google to deliver more consistent ways to signal a page’s publishing date. 

For now, the best recommendation Google has is to use a scattershot approach for the best chance of having your page correctly dated:

“Google doesn’t depend on a single date factor because all factors can be prone to issues. That’s why our systems look at several factors to determine our best estimate of when a page was published or significantly updated.”
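
One machine-readable signal you can pair with a visible on-page date is Article structured data carrying datePublished and dateModified. Here is a minimal sketch; the headline and dates are placeholder values for illustration.

```python
# A minimal sketch of one machine-readable date signal: NewsArticle structured
# data with datePublished and dateModified, embedded in the page as JSON-LD.
import json

article_structured_data = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": "2023-05-12T09:30:00-05:00",  # when the page first went live
    "dateModified": "2023-05-14T08:00:00-05:00",   # when it was significantly updated
}

# Place the output inside the page's <head> as:
# <script type="application/ld+json"> ... </script>
print(json.dumps(article_structured_data, indent=2))
```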

If your site gets hit with an algorithmic penalty from Google, you’ll likely be eager to fix the issue and improve your rankings again. However, Google’s top experts say it can take quite some time to recover if Google believes your site is spammy.

In a recent Google SEO Office Hours session, representatives were asked how long it can take to recover from an algorithm penalty related to content quality problems. 

While many details about the question remain unclear – such as how significant the penalty is – the search engine’s spokespeople encouraged site owners to be proactive. Otherwise, it may be months before they regain ground in the search results.

Specifically, the question posed in the video is:

“If a website gets algorithmically penalized for thin content, how much of the website’s content do you have to update before the penalty is lifted?”

There are a few ways the question could be read, so in this case, the experts kept it simple and straight to the point:

“Well, it’s generally a good idea to clean up low-quality content or spammy content that you may have created in the past.

For algorithmic actions, it can take us months to reevaluate your site again to determine that it’s no longer spammy.”

In other words, it is always better to share high-quality original content than to risk being labeled as spam. Once that happens, you’ll likely be in the doghouse for at least a few months.

To hear the answer, check out the video below beginning at 24:24.