After months of rumors and speculation, Google’s AI-powered generative search experience is here – sort of. 

The new conversational search tool is available to users as a Google Labs experiment only accessible by signing up for a waitlist. That means it is not replacing the current version of Google Search (at least, not yet), but it is the first public look at what is likely to be the biggest overhaul to Google Search in decades. 

Though we at TMO have been unable to get our hands on the new search experience directly, we have gathered the most important details from those who have tried it, so you know what to expect when the generative search experience becomes more widely available. 

What The AI-Powered Google Generative Search Experience Looks Like

The new Google search experience is present at the very top of Google search results, giving context, answering basic questions, and providing a conversational way to refine your search for better results. 

Notably, any AI-generated search information is currently tagged with a label that reads “Generative AI is experimental.”

Google will also subtly shade AI content based on specific searches to “reflect specific journey types and the query intent itself.” For example, AI-created results for shopping-related searches are placed on a light blue background. 

Where Does The Information Come From?

Unlike most current AI-powered tools, Google’s new search experience cites its sources. 

Sources are mentioned and linked to, making it easier for users to keep digging. 

Additionally, the AI tools can pull from Google’s existing search tools and data, such as Google Shopping product listings and more. 

Conversational Search

The biggest change that comes with the new AI-powered search is the ability to ask follow-up questions that use context from your previous search. As the announcement explains:

“Context will be carried over from question to question, to help you more naturally continue your exploration. You’ll also find helpful jumping-off points to web content and a range of perspectives that you can dig into.”

What AI Won’t Answer

The AI-powered tool will not provide information for a range of topics that might be sensitive or where accuracy is particularly important. For example, Google’s AI tools won’t give answers about giving medicine to a child because of the potential risks involved. Similarly, reports suggest the tool won’t answer questions about financial issues.

Additionally, Google’s AI-powered search will not discuss or provide information on topics that may be “potentially harmful, hateful, or explicit”.

To try out the new Google AI-powered generative search experience for yourself, sign up for the waitlist here.

Google Discover will not show content or images that would normally be blocked by the search engine’s SafeSearch tools. 

Though not surprising, this is the closest we have come to seeing this confirmed by someone at Google. Google Search Liaison Danny Sullivan responded to a question on Twitter by SEO Professional Lily Ray. In a recent tweet, Ray posed the question:

“Is the below article on SafeSearch filtering the best place to look for guidance on Google Discover? Seems that sites with *some* adult content may be excluded from Discover entirely; does this guidance apply?”

In his initial response, Sullivan wasn’t completely certain but stated: “It’s pretty likely SafeSearch applies to Discover, so yes. Will update later if that’s not the case.”

While Sullivan never came back to state this was not the case, he later explained that “our systems, including on Discover, generally don’t show content that might be borderline explicit or shocking etc. in situations where people wouldn’t expect it.”

Previously, other prominent figures at Google including Gary Illyes and John Mueller had indicated this may be the case, also suggesting adult language may limit the visibility of content in Discover. 

For most brands, this won’t be an issue, but more adult-oriented brands may struggle to appear in the Discover feed, even with significant optimization.

One of Google’s most visible spokespeople, John Mueller, made a rare appearance on Reddit to answer a series of “dumb” SEO questions covering everything from geotagging images to how often you should blog.

In a thread on the r/BigSEO subreddit called “incoming dumb question barrage”, a user asked a series of five questions:

  1. Should we be geotagging images. Does Google even care?
  2. Blogging. If we do it, is it everyday or once a week with some seriously solid stuff?
  3. Google Business Profile posting: Everyday, once a week, or why bother?
  4. Since stuff like Senuke died 10 years ago, is it all about networking with webmasters of similar and same niche sites for links?
  5. Piggybacking off #4, what about PBNs? Are they back? If so, does it have to be a group of completely legit looking websites vs some cobbled together WP blogs?

Mueller provided a series of candid answers which we will get into below:

Geotagging Images

Here Mueller kept it short and sweet: “No need to geotag images for SEO.”

How Often Should You Blog?

As always, Google won’t provide a specific post frequency that is “best” for SEO blog content. Rather, Mueller says to post “as often as you have something unique & compelling to say.”

However, the Google Search Advocate admits that more frequent posting can drive more traffic if you are able to maintain the quality of your content. 

“The problem with trying to keep a frequency up is that it’s easy to end up with mediocre, fluffy content, which search engine quality algorithms might pick up on.”

Additionally, he indicates that those who are using AI to create a lot of content quickly are unlikely to be rewarded.

Google Business Profile Posting Frequency

Unfortunately, this is not Mueller’s area of knowledge. His answer was a simple “no idea.”

Outdated Linkbuilding Strategies

The last two questions ask whether older link building methods are still relevant at all. Clearly, this tickled Mueller, as he largely dismissed both approaches. 

“SENuke, hah, that’s a name I haven’t heard in ages, lol. Sorry. Giggle. I have thoughts on links, but people love to take things out of context to promote their link efforts / tools, so perhaps someone else will say something reasonable, or not.

“OMG, PBNs too. What is this thread even. Now I won’t say anything without a lawyer present.”

No Shortcuts To Online Riches

Of course, there is an underlying current connecting all of these questions. Mueller takes note of this as well, saying:

“Reading between the lines, it seems you want to find a short-cut to making money online.”

The truth is, there are no real shortcuts to online success these days. However, there are a lot of questionable people willing to take your money to provide tools and courses that often get you nowhere. 

“Unfortunately, there’s a long line of people trying to do the same, and some have a lot of practice. Some will even sell you tools and courses on how to make money online (and *they* will be the ones making the money, fwiw, since people pay them for the tools and courses). The good tools cost good money, and they’re not marketed towards people who just want to make money online — they’re targeted at companies who need to manage their online presence and report on progress to their leadership chain.”

At the same time, Mueller encourages individuals such as the person who started the thread to keep learning and practicing SEO:

“… learn HTML, learn a bit of programming, and go for it. 90% of the random tricks you run across won’t work, 9% of the remaining ones will burn your sites to the ground, but if you’re lucky & persistent (is that the same?), you’ll run across some things that work for you.

“If you want to go this route, accept that most – or all – of the things you build will eventually blow up, but perhaps you’ll run into some along the way that make it worthwhile.”

“And … after some time, you might notice that actually building something of lasting value can also be intriguiing [sic], and you’ll start working on a side-project that does things in the right way, where you can put your experience to good use and avoid doing all of the slash & burn site/spam-building.”

To understand and rank websites in search results, Google constantly uses tools called crawlers to find and analyze new or recently updated web pages. What may surprise you is that the search engine actually uses three different types of crawlers depending on the situation. In fact, some of these crawlers may ignore the robots.txt rules used to control how crawlers interact with your site.

In the past week, those in the SEO world were surprised by the reveal that the search engine had begun using a new crawler, called GoogleOther, to relieve the strain on its main crawlers. Amidst this, I noticed some asking, “Google has three different crawlers? I thought it was just Googlebot (the most well-known crawler, which has been used by the search engine for over a decade).”

In reality, the company uses quite a few more than just one crawler, and it would take a while to go into exactly what each one does (Search Engine Roundtable has compiled a full list). 

However, Google recently updated a help document called “Verifying Googlebot and other Google crawlers” that breaks all these crawlers into three specific groups. 

The Three Types of Google Web Crawlers

Googlebot: The first type of crawler is easily the most well-known and recognized. Googlebot is the crawler used to index pages for the company’s main search results, and it always observes the rules set out in robots.txt files.

Special-case Crawlers: In some cases, Google will create crawlers for very specific functions, such as AdsBot which assesses web page quality for those running ads on the platform. Depending on the situation, this may include ignoring the rules dictated in a robots.txt file. 

User-triggered Fetchers: When a user does something that requires the search engine to verify information (when the Google Site Verifier is triggered by the site owner, for example), Google will use special robots dedicated to these tasks. Because these fetches are initiated by a user to complete a specific process, these crawlers ignore robots.txt rules entirely. 
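To see the distinction in practice, here is a minimal robots.txt sketch (the paths are purely illustrative). Googlebot honors the rules that apply to it, special-case crawlers like AdsBot only obey groups that name them explicitly, and user-triggered fetchers skip the file altogether:

```
# Googlebot (and most search crawlers) honor this group.
User-agent: Googlebot
Disallow: /private/

# AdsBot must be named explicitly; it does not follow the wildcard group.
User-agent: AdsBot-Google
Disallow: /drafts/

# Wildcard group for any crawler without a more specific group above.
# User-triggered fetchers ignore robots.txt entirely, so no rule here
# can block them.
User-agent: *
Disallow: /tmp/
```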

Why This Matters

Understanding how Google analyzes and processes the web can help you optimize your site for the best performance. Additionally, it is important to identify the crawlers used by Google and ensure they are filtered out of your analytics tools, or they can appear as false visits or impressions.
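Because spam bots routinely fake the Googlebot user-agent string, the help document mentioned above recommends verifying a crawler by DNS rather than by user-agent alone. Here is a minimal Python sketch of that check; the hostname suffixes follow Google's published guidance, but treat them as assumptions to confirm against the current help article:

```python
import socket

# Hostname suffixes Google documents for its crawlers and fetchers
# (e.g. crawl-66-249-66-1.googlebot.com). Assumed from Google's
# verification help doc; confirm against the current version.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com", ".googleusercontent.com")

def is_google_hostname(hostname: str) -> bool:
    """Check whether a reverse-DNS hostname belongs to Google."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, then forward-confirm the name resolves back to it."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
        if not is_google_hostname(hostname):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward lookup
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False  # no PTR record or lookup failure: treat as unverified
```

The forward-confirmation step matters: a suffix check alone can be spoofed by an attacker who controls reverse DNS for their own IP range.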

For more, read the full help article here.

Typically when a site starts ranking worse for one keyword, the effect is also seen for several of the other keywords it ranks for. So what does it mean when a website only loses rankings for one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem. 

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner who had effectively disappeared from the search results for a specific keyword – despite ranking at the top of results consistently in the past. 

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating if the issue is isolated to a specific area or part of a larger ongoing global problem. 

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 

Sudden changes to your backlink profile – either through mass disavowing of links or through the use of low-quality or spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).

To hear the full discussion, check out the full SEO Office Hours episode.

Google released its annual Ads Safety Report this week, highlighting the company’s efforts to guarantee advertising on its platforms is safe and trusted.

Along with suspending more than 6.7 million spammy ad accounts over the last year, the report details how Google is fighting fraud, preventing potentially harmful ads from running, and protecting user privacy.

Using machine learning algorithms, Google is able to identify suspicious activity and patterns faster than ever and quickly remove fraudulent or harmful ads.

This has contributed to a huge improvement in Google’s abilities to detect spam and harmful activity at scale, leading to over 2 billion more ads being blocked in 2022 compared to the previous year. 

At the same time Google released the report, the company also announced it is launching an Ads Transparency Center to help users better understand the ads they are seeing and who is paying to display them. 

Highlights From The 2022 Google Ads Safety Report

The full Ads Safety Report includes a lot of details about how Google detects and removes malicious or spammy ads, but these were the details we think are most important for you to know:

  • Google blocked over 5.2 billion ads for policy violations
  • Ad restrictions were down by over a billion annually in 2022
  • Over 6.7 million advertiser accounts were suspended for “egregious” policy violations
  • The number of ads removed from web pages stayed largely stable compared to the previous year

What Is The Ads Transparency Center?

In response to the leap in blocked ads and suspended ad accounts, Google decided to create the Ads Transparency Center – a central knowledge hub containing information about verified advertisers and ads. 

Here you’ll be able to find detailed information about the ads a specific advertiser has run, what ads are being shown in a specific area, and more about ads appearing on the platform. 

Users can also access My Ad Center here, which gives them the ability to like, block, or report potentially problematic ads. 

For more about Google’s attempts to keep the ads on its platform safe for users, check out the full 2022 Ads Safety Report here or the Ads Transparency Center announcement here.

Just ahead of International Fact-Checking Day on April 2, Google announced a wave of new features intended to make it easier for users to learn more about where their information is coming from.

As the company revealed in a recent announcement, Google is introducing five new features to verify information online:

  • Expanding the “About this result” feature worldwide
  • Introducing an “About this author” section
  • Making it easier to learn about websites using the “About this page” feature
  • Providing context for top stories with “Perspectives”
  • Helping spot information gaps

Expanding the “About this result” feature worldwide

Launched in 2021, the “About this result” feature gives searchers access to additional information about the sources and websites that appear in search results. 

Though English-speaking countries have been able to find this information by clicking the three vertical dots next to most search results for a while, users in other countries or speaking other languages are just now getting access to the feature. 

Introducing an “About this author” section

Google is adding a new section to the “About this result” feature which gives information specifically about the author of the content you see. 

At this time, it is unclear exactly where Google will be gathering this information, but it is worth keeping an eye on as the feature rolls out – especially if your site publishes blog content.

Making it easier to learn about websites using the “About this page” feature

Google is adding a new way to access the “About this page” feature, which details information about a webpage similar to the “About this result” feature.

Now, you can learn more about a page by typing the URL of a site into Google’s search. The following search results will include information from the “About this page” feature at the top of the page. 

Here, you’ll see information about how the site describes itself and what others across the web have said about the site. 

Providing context for top stories with “Perspectives”

The Perspectives carousel aims to provide additional context around Top Stories by sharing helpful insights from journalists and other experts.

The feature has been in testing since 2022, but Google says it will be widely available in the coming days. 

Helping spot information gaps

When Google is unable to confidently provide information about a topic – either because there are few good sources available or because the information is changing quickly around that topic – the search engine will display a content warning with the search results it provides. 

To learn more about these new features, read the complete announcement from Google here.

Google has started giving users in the US and UK access to Google Bard, its answer to Bing and ChatGPT’s AI chat tools. The company is doing a gradual rollout through a waitlist at bard.google.com.

What Is Bard?

Bard is a generative AI. That means it will generate content for you based on prompts that you submit through a chatbot. 

In today’s announcement (partially written with the help of Bard), the company suggested a variety of ways users might be able to take advantage of the AI tool:

“You can use Bard to boost your productivity, accelerate your ideas and fuel your curiosity. You might ask Bard to give you tips to reach your goal of reading more books this year, explain quantum physics in simple terms or spark your creativity by outlining a blog post.”

Is Bard an AI Search Tool?

Yes and no. 

Bard is something of a complementary tool to Google’s search engine. While it is not directly integrated into Google Search, it is “designed so that you can easily visit Search to check its responses or explore sources across the web.”

Along with suggesting queries, you can immediately open a new tab with search results for a given query. 

At the same time, Bard is not considered a direct part of Google search. Instead, the company suggests it will be adding other AI tools to its search engine in the future. 

Bard Is In Early Stages

Throughout the announcement, Google repeatedly described Bard as an early experiment. As with Bing’s AI tools, Bard is likely to have some early quirks and weirdness as users get their hands on it. 

Additionally, Google pointed out that the AI tool is far from perfect. It can get information wrong or phrase things in misleading ways. Some of these errors may be small. In Google’s example, Bard got the scientific name for a plant wrong – Zamioculcas zamiifolia, not Zamioculcas zamioculcas. However, the company cautions it may be inaccurate in other ways.

Still, it will be fun to see what Bard can do now that it is coming to the public.

Google has confirmed it is rolling out its latest broad core algorithm update, signifying yet another potential shake-up for the search engine’s results.

Google’s broad core algorithm updates serve as some of the most significant updates for the search engine compared to the smaller updates that are happening multiple times a day. They can affect rankings for search engine results pages (SERPs) throughout Google’s entire platform.

As is usual with Google, the search company is being tight-lipped about specific details regarding the update, only going so far as to confirm it is rolling out. The update is also expected to take several weeks for its full impact to become apparent.

With this in mind, it is wise for brands to take note and monitor their own search performance in the coming weeks.

What Can You Do?

Aside from always striving to provide the best online experience possible with your website, there are a few specific steps you can take to safeguard your site from updates like these:

  • Monitor site performance regularly to identify early signs of issues with your site
  • Create content geared to your audience’s needs and interests
  • Optimize your site’s performance (including speed, mobile-friendliness, and user experience) to ensure your site isn’t off-putting to visitors

TL;DR

Google has launched its latest broad core algorithm update, which could potentially affect rankings for search engine results pages. The update may take several weeks to have full impact, so brands are advised to monitor their search performance. To safeguard your site, monitor its performance regularly, create audience-specific content, and optimize its performance for speed, mobile-friendliness, and user experience.

Having a robust backlink profile remains one of the most crucial factors for ranking a webpage highly in search, so it is always big news when Google actually tells us what it looks for in quality links. 

Yesterday, the search engine published a new set of guidelines and best practices for building backlinks, detailing how to make your links crawlable, how to craft well-ranking anchor text, and how to best establish internal links on your site. 

Below, we will cover all the new guidelines and best SEO practices for links on your website according to Google:

Crawlable Links

As the page Google updated was originally dedicated to specifically making links crawlable, this section remains largely unchanged. It reads, “Generally, Google can only crawl your link if it’s an <a> HTML element (also known as anchor element) with an href attribute. Most links in other formats won’t be parsed and extracted by Google’s crawlers. Google can’t reliably extract URLs from <a> elements that don’t have an href attribute or other tags that perform as links because of script events.”
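As a quick illustration of that guidance (the URLs here are hypothetical), compare a link Google can reliably crawl with formats it generally cannot:

```html
<!-- Crawlable: an <a> element with an href attribute -->
<a href="https://example.com/services">Our services</a>

<!-- Not reliably crawlable: no href; the URL only exists in a script event -->
<a onclick="window.location='https://example.com/services'">Our services</a>

<!-- Not crawlable: not an anchor element at all -->
<span data-url="https://example.com/services">Our services</span>
```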

Anchor Text Placement 

The best practice for placing anchor text for links reads: “Anchor text (also known as link text) is the visible text of a link. This text tells people and Google something about the page you’re linking to. Place anchor text between <a> elements that Google can crawl.”

Writing Anchor Text

As for the anchor text itself, Google encourages you to balance descriptiveness with brevity: “Good anchor text is descriptive, reasonably concise, and relevant to the page that it’s on and to the page it links to. It provides context for the link, and sets the expectation for your readers. The better your anchor text, the easier it is for people to navigate your site and for Google to understand what the page you’re linking to is about.”
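To make that concrete (again with hypothetical URLs), compare vague anchor text with a descriptive, reasonably concise alternative:

```html
<!-- Vague: tells readers and Google nothing about the target page -->
<a href="https://example.com/roof-repair">Click here</a> to learn more.

<!-- Descriptive and concise: provides context and sets expectations -->
Learn more in our <a href="https://example.com/roof-repair">roof repair guide</a>.
```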

Internal Links 

While Google emphasizes the importance of internal links on your website, it also states that the search engine doesn’t look for a target number of links.

“You may usually think about linking in terms of pointing to external websites, but paying more attention to the anchor text used for internal links can help both people and Google make sense of your site more easily and find other pages on your site. Every page you care about should have a link from at least one other page on your site. Think about what other resources on your site could help your readers understand a given page on your site, and link to those pages in context.”

External Links

When it comes to external links, Google has advice for creating powerful links that don’t come off as spam: “Linking to other sites isn’t something to be scared of; in fact, using external links can help establish trustworthiness (for example, citing your sources). Link out to external sites when it makes sense, and provide context to your readers about what they can expect.”