The Washington Post may not be the first organization you imagine when you think about SEO experts, but as a popular news organization read by millions around the world, The Post has dealt with its fair share of issues in developing its long-term strategies for web performance and SEO. 

Now, the news site is sharing the fruit of that hard work by releasing its own Web Performance and SEO Best Practices and Guidelines.

These guidelines help ensure that The Washington Post remains visible in highly competitive search spaces, drives more organic traffic, and maintains a positive user experience on its website. 

In the announcement, engineering lead Arturo Silva said:

“We identified a need for a Web Performance and SEO engineering team to build technical solutions that support the discovery of our journalism, as the majority of news consumers today read the news digitally. Without proper SEO and web performance, our stories aren’t as accessible to our readers. As leaders in engineering and media publishing, we’re creating guidelines that serve our audiences and by sharing those technical solutions in our open-source design system, we are providing tools for others to certify that their own site practices are optimal.”

What’s In The Washington Post’s SEO and Web Performance Guidelines?

If you’re hoping to see a surprise trick or secret tool being used by The Washington Post, you are likely to be disappointed. 

The guidelines are largely in line with practices used by most SEO experts, albeit with a focus on the publication's particular search and web performance issues.

For example, the Web Performance section covers three specific areas: loading performance, rendering performance, and responsiveness. Similarly, the SEO guidelines are split into on-page SEO, content optimization, technical SEO, and off-page SEO. 
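
The full guidelines go into far more depth on each of those areas, but to make "loading performance" monitoring concrete, here is a minimal Python sketch (our illustration, not code from The Post's guidelines) that pulls real-user field metrics for a page from Google's public PageSpeed Insights API. The target URL is just an example:

```python
import json
import urllib.parse
import urllib.request

# Public PageSpeed Insights endpoint; an API key is optional for light usage.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def field_metrics(url: str, strategy: str = "mobile") -> dict:
    """Fetch real-user (CrUX) field metrics for a URL, if Google has them."""
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        data = json.load(resp)
    # "loadingExperience" holds field data; it may be absent for low-traffic URLs.
    return data.get("loadingExperience", {}).get("metrics", {})

if __name__ == "__main__":
    for name, values in field_metrics("https://www.washingtonpost.com/").items():
        # Each metric reports a percentile and a FAST/AVERAGE/SLOW category.
        print(f"{name}: p75={values.get('percentile')} ({values.get('category')})")
```

Tracking those percentiles over time is one low-effort way to catch the loading and responsiveness regressions the guidelines are concerned with.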

More than anything, the guidelines highlight the need for brands to focus their SEO efforts on their unique needs and goals and develop strategies that are likely to remain useful for the foreseeable future (instead of chasing every new SEO trend). 

To read the guidelines for yourself, visit the Washington Post’s site here. 

Just last week, Google Search Liaison, Danny Sullivan, once again took to Twitter to dispel a longstanding myth about word counts and search engine optimization (SEO). 

The message reads:

“Reminder. The best word count needed to succeed in Google Search is … not a thing! It doesn’t exist. Write as long or short as needed for people who read your content.”

Sullivan also linked to long-existing help pages and included a screencap of a statement from these pages which says:

“Are you writing to a particular word count because you’ve heard or read that Google has a preferred word count? (No, we don’t.)”

Of course, this is not a new message from Google. Still, many of the most popular SEO tools and experts continue to claim that anywhere from 300 to 1,500 words is ideal for ranking in Google search results. 

Incidentally, a day later, Google’s John Mueller also responded to an SEO professional who asked whether there was a “correlation between word count and outranking competition.” In a short but simple reply, Mueller said: “Are you saying the top ranking pages should have the most words? That’s definitely not the case.”

Most likely, this myth of an ideal SEO word count will persist as long as search engine optimization exists in its current form. Still, it is always good to get a clear reminder from major figures at Google that content should be as long as necessary to share valuable information with your audience – whether you can do that in a couple of sentences or in an exhaustive multi-thousand-word guide. 

Microsoft is overhauling its Bing search engine’s mobile experience with new features, better formatting, and integration with mobile apps for Skype and Edge.

The news came from Microsoft’s Global Head of Marketing, Divya Kumar, who showcased the new mobile experience and upcoming features in a blog post. 

Previewed Features Are Arriving This Week

First, Kumar announced that several features previewed in May will be launched over the next week. These features include:

  • A richer video experience on mobile and desktop
  • Knowledge Cards
  • Graphs in search results
  • Improved formatting
  • Better social sharing abilities

Along with these updates, Kumar says that chat history will be coming to desktop over the next week after already arriving on mobile. To access your chat history, hit the clock icon in the top right of an existing chat.

New Updates To Bing

The bulk of the announcement is dedicated to highlighting upcoming features for users on mobile devices.

For starters, Microsoft is premiering a Bing Chat widget that can be added directly to iOS or Android home screens, so launching the new Bing Chat tools is always just a tap away.

Additionally, Kumar says that Bing is implementing the ability to continue a conversation across different platforms if you are signed in. For example, a user might start a conversation on desktop and pick up where they left off after moving to a mobile device. 

Microsoft is also working to improve language support for non-English users with better voice input.

Third-Party App Integration

Microsoft has integrated its AI tools into its mobile keyboard app, SwiftKey, to make drafting new messages more efficient and intuitive.

Additionally, the company is bringing Bing’s AI abilities to Skype by making the new Bing experience available from within any group chat. Just tag Bing in the chat to access the tools. 

Why It Matters

Bing has been pushing to elevate its status as a search engine through its diverse AI tools and major updates to all of its services. According to the announcement, the effort seems to be working.

The company says it is seeing 8x the number of daily downloads since it launched the new AI-assisted Bing, and it expects further growth as it develops these tools and products.

After months of rumors and speculation, Google’s AI-powered generative search experience is here – sort of. 

The new conversational search tool is available to users as a Google Labs experiment only accessible by signing up for a waitlist. That means it is not replacing the current version of Google Search (at least, not yet), but it is the first public look at what is likely to be the biggest overhaul to Google Search in decades. 

Though we at TMO have been unable to get our hands on the new search experience directly, we have gathered the most important details from those who have, to show you what to expect when the generative search experience becomes more widely available. 

What The AI-Powered Google Generative Search Experience Looks Like

The new Google search experience is present at the very top of Google search results, giving context, answering basic questions, and providing a conversational way to refine your search for better results. 

Notably, any AI-generated search information is currently tagged with a label that reads “Generative AI is experimental.”

Google will also subtly shade AI content based on specific searches to “reflect specific journey types and the query intent itself.” For example, in one shopping-related search, the AI-created results were placed on a light blue background. 

Where Does The Information Come From?

Unlike most current AI-powered tools, Google’s new search experience cites its sources. 

Sources are mentioned and linked to, making it easier for users to keep digging. 

Additionally, the AI tools can pull from Google’s existing search tools and data, such as Google Shopping product listings and more. 

Conversational Search

The biggest change that comes with the new AI-powered search is the ability to refine queries with follow-up questions that use context from your previous search. As the announcement explains:

“Context will be carried over from question to question, to help you more naturally continue your exploration. You’ll also find helpful jumping-off points to web content and a range of perspectives that you can dig into.”

What AI Won’t Answer

The AI-powered tool will not provide information on a range of topics that might be sensitive or where accuracy is particularly important. For example, Google’s AI tools won’t give answers about giving medicine to a child because of the potential risks involved. Similarly, reports suggest the tool won’t answer questions about financial issues.

Additionally, Google’s AI-powered search will not discuss or provide information on topics that may be “potentially harmful, hateful, or explicit”.

To try out the new Google AI-powered generative search experience for yourself, sign up for the waitlist here.

Google Discover will not show content or images that would normally be blocked by the search engine’s SafeSearch tools. 

Though not surprising, this is the closest we have come to seeing this confirmed by someone at Google. Google Search Liaison Danny Sullivan responded to a question on Twitter from SEO professional Lily Ray. In a recent tweet, Ray posed the question:

“Is the below article on SafeSearch filtering the best place to look for guidance on Google Discover? Seems that sites with *some* adult content may be excluded from Discover entirely; does this guidance apply?”

In his initial response, Sullivan wasn’t completely certain but stated: “It’s pretty likely SafeSearch applies to Discover, so yes. Will update later if that’s not the case.”

While Sullivan never came back to state this was not the case, he later explained that “our systems, including on Discover, generally don’t show content that might be borderline explicit or shocking etc. in situations where people wouldn’t expect it.”

Previously, other prominent figures at Google including Gary Illyes and John Mueller had indicated this may be the case, also suggesting adult language may limit the visibility of content in Discover. 

For most brands, this won’t be an issue, but more adult-oriented brands may struggle to appear in the Discover feed, even with significant optimization.

To understand and rank websites in search results, Google constantly uses tools called crawlers to find and analyze new or recently updated web pages. What may surprise you is that the search engine actually uses three different types of crawlers depending on the situation. In fact, some of these crawlers may ignore the robots.txt rules used to control how they interact with your site.

In the past week, those in the SEO world were surprised by the reveal that the search engine had begun using a new crawler, GoogleOther, to relieve the strain on its main crawlers. Amidst this, I noticed some asking: “Google has three different crawlers? I thought it was just Googlebot” – the most well-known crawler, which the search engine has used for over a decade.

In reality, the company uses quite a few more crawlers than that, and it would take a while to go into exactly what each one does; Search Engine Roundtable maintains a full list. 

However, Google recently updated a help document called “Verifying Googlebot and other Google crawlers” that breaks all these crawlers into three specific groups. 

The Three Types of Google Web Crawlers

Googlebot: The first type of crawler is easily the most well-known and recognized. Googlebot crawls pages for the company’s main search results, and it always observes the rules set out in robots.txt files.

Special-case Crawlers: In some cases, Google creates crawlers for very specific functions, such as AdsBot, which assesses web page quality for those running ads on the platform. Depending on the situation, these crawlers may ignore the rules dictated in a robots.txt file. 

User-triggered Fetchers: When a user does something that requires the search engine to verify information (when the Google Site Verifier is triggered by the site owner, for example), Google uses special fetchers dedicated to these tasks. Because the user initiates these requests to complete a specific process, these fetchers ignore robots.txt rules entirely. 
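
To see how the robots.txt rules in question behave in practice, here is a minimal Python sketch using the standard library’s urllib.robotparser. The domain and path are placeholders, and the check only tells you what a rule-abiding crawler such as Googlebot would do:

```python
from urllib.robotparser import RobotFileParser

# Placeholder site; swap in your own domain to test real rules.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt file

# Googlebot honors these rules; special-case crawlers and user-triggered
# fetchers may bypass them, as described above.
for agent in ("Googlebot", "AdsBot-Google"):
    allowed = parser.can_fetch(agent, "https://www.example.com/private/page")
    print(f"{agent} may fetch: {allowed}")
```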

Why This Matters

Understanding how Google analyzes and processes the web can help you better optimize your site for peak performance. It is also important to identify the crawlers Google uses and filter them out of your analytics tools, or they can show up as false visits or impressions.
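
The help document’s core verification method is a reverse-then-forward DNS check: a genuine Google crawler’s IP resolves to a hostname on a Google-owned domain, and that hostname resolves back to the same IP. Here is a rough Python sketch of that check (the sample IP is purely illustrative):

```python
import socket

def is_google_crawler(ip: str) -> bool:
    """Verify a claimed Google crawler IP via reverse-then-forward DNS."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS: IP -> hostname
    except socket.herror:
        return False
    # Per Google's help doc, genuine crawler hostnames end in these domains.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

# Illustrative usage with an IP pulled from a server log.
print(is_google_crawler("66.249.66.1"))
```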

For more, read the full help article here.

Typically when a site starts ranking worse for one keyword, the effect is also seen for several of the other keywords it ranks for. So what does it mean when a website only loses rankings for one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem. 

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner who had effectively disappeared from the search results for a specific keyword – despite ranking at the top of results consistently in the past. 

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating whether the issue is isolated to a specific region or is part of a larger, global problem. 

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.
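
If you would rather script that check, one option is Google’s Custom Search JSON API. A rough Python sketch follows; the API key, engine ID, keyword, and domain are placeholders you would supply, and keep in mind that Custom Search results only approximate what users see on google.com:

```python
import json
import urllib.parse
import urllib.request

# Placeholders: create these in Google Cloud / Programmable Search Engine.
API_KEY = "YOUR_API_KEY"
ENGINE_ID = "YOUR_SEARCH_ENGINE_ID"
ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def domain_in_top_results(keyword: str, domain: str, country: str) -> bool:
    """Check whether `domain` appears in the top 10 results for `keyword`."""
    query = urllib.parse.urlencode({
        "key": API_KEY,
        "cx": ENGINE_ID,
        "q": keyword,
        "gl": country,  # country bias for results, e.g. "us", "de", "jp"
        "num": 10,
    })
    with urllib.request.urlopen(f"{ENDPOINT}?{query}") as resp:
        items = json.load(resp).get("items", [])
    return any(domain in item.get("link", "") for item in items)

for country in ("us", "gb", "de", "jp", "au"):
    print(country, domain_in_top_results("your target keyword", "example.com", country))
```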

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 

Sudden changes to your backlink profile – either through mass disavowing of links or through the use of low-quality or spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).
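
For reference, if spammy links are the problem, Google’s disavow tool accepts a plain text file with one URL or domain per line. A minimal example (all entries are placeholders):

```
# Comments start with a hash and are ignored.
# Disavow every link from an entire domain:
domain:spammy-example.com
# Disavow a single linking page:
https://another-example.com/spammy-page.html
```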

To hear the full discussion, check out the video below:

Google has confirmed it is rolling out its latest broad core algorithm update, signifying yet another potential shake-up for the search engine’s results.

Google’s broad core algorithm updates are among the most significant changes the search engine makes, compared to the smaller updates that happen multiple times a day. They can affect rankings on search engine results pages (SERPs) across Google’s entire platform.

As is usual with Google, the search company is being tight-lipped about specifics, going only so far as to confirm the update is live. It is also expected to take several weeks for the update’s full impact to become apparent.

With this in mind, it is wise for brands to take note and monitor their own search performance in the coming weeks.

What Can You Do?

Aside from always striving to provide the best online experience possible with your website, there are a few specific steps you can take to safeguard your site from updates like these:

  • Monitor site performance regularly to identify early signs of issues with your site (see the sketch after this list)
  • Create content geared to your audience’s needs and interests
  • Optimize your site’s performance (including speed, mobile-friendliness, and user experience) to ensure your site isn’t off-putting to visitors
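
On the first point, ranking movement after a core update often shows up first in Search Console data. Here is a rough Python sketch that pulls the past week’s top queries via the Search Console API; it assumes you have installed google-api-python-client and google-auth and created a service account with read access to your property (the property URL and key file below are placeholders):

```python
from datetime import date, timedelta

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials: a service account key with Search Console access.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

end = date.today() - timedelta(days=3)  # Search Console data lags a few days
start = end - timedelta(days=7)

# Top queries by clicks for the week; compare week over week to spot
# early movement after a core update.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # your verified property
    body={
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], round(row["position"], 1))
```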

TL;DR

Google has launched its latest broad core algorithm update, which could affect rankings across search engine results pages. The update may take several weeks to reach its full impact, so brands are advised to monitor their search performance. To safeguard your site, monitor its performance regularly, create audience-focused content, and optimize for speed, mobile-friendliness, and user experience.

Even though the new AI-powered Bing search experience is rolling out to a limited number of users, Microsoft says it is seeing record-setting growth and engagement that may indicate a big shift is coming to the search landscape. 

Though Microsoft is still only receiving a single-digit percentage of overall search volume, these early numbers could be a sign that Google might finally have a real challenger as the new AI-powered Bing and Edge browsing experience become more widely available. 

Additionally, Microsoft reports it is now seeing more than 100 million daily active users – with around a third of those users being entirely new to Bing. 

Importantly, Microsoft says users are returning to Bing more often each day thanks to expanded uses of the Edge browser and improvements to Bing’s search result relevance.

Microsoft reported that around a third of the users with access to the AI-powered search experience are using the Chat feature every day for a wide variety of tasks including search, content creation, and more. 

While the new AI-powered search experience is likely driving much of this increased engagement and usage, long-term data shows that use of the Edge browser has also been steadily growing over the past two years.

Additionally, Microsoft says the implementation of AI-assisted search has significantly improved the relevance of search results, saying: “The second factor driving trial and usage is that our core web search ranking has taken several significant jumps in relevancy due to the introduction of the Prometheus model so our Bing search quality is at an all-time high.” 

As we are still in the early days of Bing’s new AI-powered search and browser experience, it will be interesting to see whether this growth continues – especially as Google’s own AI-powered tools develop. 

For more, read the full report from Microsoft here.

After an… interesting rollout, Bing is making some changes to its much-talked-about AI chatbot. As the company announced yesterday afternoon, Bing will limit users to 50 questions per day and 5 questions per session to rein in the new system. 

Since its rollout, users have been sharing examples of the chatbot, created in a partnership with OpenAI, getting up to all sorts of bad behavior. Some of the most notable include gaslighting users about the year, committing accidental racism, and even trying to blackmail a user by threatening to release personal information.

Early AI Chatbots “Somewhat Broken”

Addressing the situation in a tweet thread, OpenAI CEO Sam Altman admitted that the current AI tools are “somewhat broken” but stressed the importance of letting the world see and influence these early stages to help “get it right” down the line. 

“We think showing these tools to the world early, while still somewhat broken, is critical if we are going to have sufficient input and repeated efforts to get it right. the level of individual empowerment coming is wonderful, but not without serious challenges.”

At the same time, Altman says it is important to regulate these tools while they are more bark than bite, saying “we are potentially not that far away from potentially scary ones.”

What Bing Is Changing

Bing is limiting users to 50 chat “turns” per day, with each individual session limited to 5 turns. Microsoft defines a turn as a complete exchange: a question from the user and Bing’s reply. 

“Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start.”
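
Purely as an illustration (this is not Microsoft’s implementation), the limits described above boil down to two counters, which a sketch like the following could enforce:

```python
from dataclasses import dataclass, field

DAILY_LIMIT = 50   # turns per user per day, per the announcement
SESSION_LIMIT = 5  # turns per session before a forced new topic

@dataclass
class ChatLimiter:
    daily_turns: int = 0
    session_turns: int = 0
    context: list = field(default_factory=list)

    def take_turn(self, question: str, answer: str) -> None:
        if self.daily_turns >= DAILY_LIMIT:
            raise RuntimeError("Daily turn limit reached; try again tomorrow.")
        self.daily_turns += 1
        self.session_turns += 1
        self.context.append((question, answer))
        if self.session_turns >= SESSION_LIMIT:
            self.new_topic()  # the "broom icon" moment: prompt a fresh start

    def new_topic(self) -> None:
        # Clearing context keeps the model from getting confused.
        self.session_turns = 0
        self.context.clear()
```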

For more, read the announcement from Bing here.