
Google is making a big change to its Core Web Vitals ranking signals: the company has announced that the new Interaction to Next Paint (INP) metric will replace First Input Delay (FID) on March 12.

The new INP metric measures the time from when a user interacts with a web page (for example, by clicking a button) to when the browser paints the next frame in response.

Though FID measured something similar, it only captured the delay before the browser could begin processing a page’s very first interaction. Google says INP assesses responsiveness across all of a page’s interactions, capturing interactivity in ways that were not possible previously.

The History Behind FID and INP Metrics

FID debuted in 2018 and went on to become one of the ranking signals in Google’s Core Web Vitals. However, Google quickly began to see that the metric didn’t fully capture user interactions as it had hoped.

This led Google to introduce INP as an experimental or “pending” metric in 2022. Now, almost two years later, Google has decided to fully replace FID with INP in March.

What You Should Do

Before March, website managers should ensure their site meets the threshold for “good” INP performance, which Google defines as 200 milliseconds or less.

If you do not meet this mark, Google suggests optimizing your site with these strategies:

  • Evaluate your site’s performance using tools such as PageSpeed Insights or the Chrome User Experience Report, or measure INP from real users yourself (see the sketch after this list).
  • Identify issues that may be slowing down INP, such as long JavaScript tasks, excessive main-thread activity, or a large DOM.
  • Address each issue using Google’s optimization guide for that specific problem.
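
If you want to collect INP from real users directly, Google’s open-source web-vitals JavaScript library reports the metric in the field. Here is a minimal sketch; the /analytics endpoint is a placeholder for your own collection URL, not anything from Google’s announcement:

```typescript
// A minimal sketch of field-measuring INP with the open-source
// web-vitals library (npm install web-vitals). The /analytics
// endpoint is a placeholder for your own collection URL.
import { onINP } from 'web-vitals';

onINP((metric) => {
  // metric.value is the interaction latency in milliseconds;
  // Google treats <= 200 ms as "good" and > 500 ms as "poor".
  console.log(`INP: ${metric.value} ms (${metric.rating})`);

  // Optionally beacon the measurement to your own analytics.
  navigator.sendBeacon(
    '/analytics',
    JSON.stringify({
      name: metric.name,
      value: metric.value,
      rating: metric.rating,
    })
  );
});
```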

As Google’s ranking algorithms evolve, this and other ranking signals will likely be updated or replaced. This emphasizes how important it is to use the latest optimization standards and to ensure a smooth user experience if you want your business to be easily found online.

The Google SEO Starter Guide is designed to help individuals and organizations quickly learn the most important steps necessary for getting their websites ranking within Google Search. 

While the guide reportedly maintains a 91% approval rating, it has largely gone without updates for several years. That will be changing soon.

In a recent episode of Google’s “Search Off The Record” podcast, the company’s Search Relations team discussed plans to update the SEO Starter Guide, including talking about what would and would not be included in the revised document. 

Discussions like this offer a window into how SEO is talked about within the search engine, what the company prioritizes when ranking sites, and which SEO myths might lead you astray when optimizing your own site.

So, what’s changing in the revised SEO Starter Guide?

HTML Structure

One topic the group discussed was the importance (or lack thereof) of HTML structure when it comes to online rankings.

While the team agreed that using proper HTML structure can help with online rankings, they indicated the guide will clarify that it is not all that important in the grand scheme.

As Google’s Gary Illyes said:

“Using headings and a good title element and having paragraphs, yeah, sure. It’s all great, but other than that it’s pretty futile to think about how the page… or the HTML is structured.”

Branded Domain Names vs Keyword Rich Domain Names

SEO experts have been increasingly debating whether it is better to focus on your existing branding when establishing a domain name, or if domains perform better when including specific keywords.

According to the Google team, the new guide will clarify this by indicating that brands should focus on including branding in their domains rather than keywords. The thought process shared by those in the discussion was that establishing a memorable brand will have a longer-term impact than trying to optimize your domain specifically for search engines.

Debunking SEO Myths

Lastly, the group said one thing they want to improve in the document is how it addresses widespread SEO myths and misconceptions.

For example, everyone agreed that the SEO Starter Guide should specifically debunk the idea that using Google products while creating or optimizing your site will improve search rankings. 

They indicated they would address this myth and several others to prevent people from optimizing their site based on misinformation found elsewhere online. 

For more from the discussion, listen to the entire episode of “Search Off The Record” here.

A lead Google spokesperson gave a surprising response to claims that the search engine stole content from a publisher without providing any benefit to the publisher’s website. 

Google’s rich search results have been controversial since their launch, as some feel that these results simply copy information from other websites instead of sending users to that content where it was originally posted. 

The search engine has largely ignored these criticisms by saying that rich results improve the search experience and include links to the original content. 

That’s what makes it so surprising that Google Search Liaison Danny Sullivan recently publicly responded to one publisher’s complaints directly.

The Original Complaint

In several recent tweets, a representative for travel brand Travel Lemming posted:

“Google is now stealing Travel Lemming’s own brand searches (even via site search).

They take our list — INCLUDING MY ORIGINAL PHOTOS 📸 — and present it in a rich result so people don’t click through.

I am literally IN that Red Rocks photo!…

They are doing this across all travel searches – unbranded and branded alike.

Example: “Mexico Travel Tips” – they have an AI answer & also a rich result that basically just re-creates an entire blog post, including our stolen photos.

Again, I am IN that Mexico packing photo!

Like how is it legal for Google to just essentially create entire blog posts from creators’ content and images?

I literally have a law degree from the top law school in the world, and even I can’t figure it out!

Fair use does NOT apply if you’re using the content to compete directly against the creator, which they clearly are.

I can’t sit outside a movie theatre, project the movie on a wall, earn money from it, and claim fair use.

I spent SO much time taking those photos in Denver.

It was 10+ full days worth of work for me and partner Clara, going around the city to photograph everything. $100s of money spent in attraction admission fees, gas, parking.

Now Google just gets to extract all that value?

How much does Google get to take before creators say “enough is enough”?

How hard does the water have to boil before the frog jumps?

The comments show it is a prisoner’s dilemma as long as Google has a monopoly on search …”

Google’s Response

Danny Sullivan, Google’s Search Liaison, provided a lengthy response that delves specifically into what is happening, why, and ways they are hoping to improve the situation. 

Not only does Sullivan give insight into the company’s perspective, he also shares his own opinions about the feature. Importantly, Sullivan doesn’t disregard Travel Lemming’s complaints and is sympathetic to how rich search results impact publishers:

“Hey Nate, this got flagged to my attention. I’ll pass along the feedback to the team. Pretty sure this isn’t a new feature. Elsewhere in the thread, you talk about it being an AI answer, and I’m pretty sure that’s not the case, either. It’s a way to refine an initial query and browse into more results.

With the example you point out, when you expand the listing, your image is there with a credit. If you click, a preview with a larger view comes up, and that lets people visit the site. Personally, I’m not a fan of the preview-to-click.

I think it should click directly to the site (feedback I’ve shared internally before, and I’ll do this again). But it’s making use of how Google Images operates, where there’s a larger preview that helps people decide if an image is relevant to their search query. Your site is also listed there, too. Click on that, people get to your site.

If you don’t want your images to appear in Google Search, this explains how to block them:

https://developers.google.com/search/docs/crawling-indexing/prevent-images-on-your-page

I suspect you’d prefer an option to not have them appear as thumbnails in particular features. We don’t have that type of granular control, but I’ll also pass the feedback on. 

I appreciate your thoughts and concerns. I do. The intention overall is to make search better, which includes ensuring people do indeed continue to the open web — because we know for us to thrive, the open web needs to thrive.

But I can also appreciate that this might not seem obvious from how some of the features display.

I’m going to be sharing these concerns with the search team, because they’re important.

You and other creators that are producing good content (and when you’re ranking in the top results, that’s us saying it’s good content) should feel we are supporting you.

We need to look at how what we say and how our features operate ensure you feel that way.

I’ll be including your response as part of this.”
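
For site owners who, like Travel Lemming, would rather keep certain images out of Google Search entirely, the help document Sullivan links describes a few mechanisms, including the noimageindex robots rule. As a hedged illustration, here is what serving that rule as an X-Robots-Tag response header might look like in a Node.js/Express setup; the /images path and port are assumptions for the example, not details from the article:

```typescript
// A minimal sketch (assuming Node.js with Express) of one documented way
// to keep images out of Google Search: the X-Robots-Tag response header.
// The /images path and port 3000 are illustrative assumptions.
import express from 'express';

const app = express();

// Ask Google not to index anything served under /images as an image result.
app.use('/images', (req, res, next) => {
  res.setHeader('X-Robots-Tag', 'noimageindex');
  next();
});

// Serve the actual image files.
app.use('/images', express.static('public/images'));

app.listen(3000, () => console.log('Listening on port 3000'));
```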

I doubt Sullivan will change many minds about Google’s rich search results, but this rare interaction reveals how Google sees the situation as it tries to walk a tightrope between providing a seamless search experience and sustaining the sites it relies on.

One of Google’s most visible spokespeople, John Mueller, made a rare appearance on Reddit to answer a series of “dumb” SEO questions covering everything from geotagging images to how often you should blog.

In a thread on the r/BigSEO subreddit called “incoming dumb question barrage”, a user asked a series of five questions:

  1. Should we be geotagging images. Does Google even care?
  2. Blogging. If we do it, is it everyday or once a week with some seriously solid stuff?
  3. Google Business Profile posting: Everyday, once a week, or why bother?
  4. Since stuff like Senuke died 10 years ago, is it all about networking with webmasters of similar and same niche sites for links?
  5. Piggybacking off #4, what about PBNs? Are they back? If so, does it have to be a group of completely legit looking websites vs some cobbled together WP blogs?

Mueller provided a series of candid answers, which we get into below:

Geotagging Images

Here Mueller kept it short and sweet: “No need to geotag images for SEO.”

How Often Should You Blog?

As always, Google won’t provide a specific post frequency that is “best” for SEO blog content. Rather, Mueller says to post “as often as you have something unique & compelling to say.”

However, the Google Search Advocate admits that more frequent posting can drive more traffic if you are able to maintain the quality of your content.

“The problem with trying to keep a frequency up is that it’s easy to end up with mediocre, fluffy content, which search engine quality algorithms might pick up on.”

Additionally, he indicates that those who are using AI to create a lot of content quickly are unlikely to be rewarded.

Google Business Profile Posting Frequency

Unfortunately, this is not Mueller’s area of expertise. His answer was a simple “no idea.”

Outdated Linkbuilding Strategies

The last two questions asked whether older link-building methods are still relevant at all. Clearly, this tickled Mueller, as he largely dismissed both approaches.

“SENuke, hah, that’s a name I haven’t heard in ages, lol. Sorry. Giggle. I have thoughts on links, but people love to take things out of context to promote their link efforts / tools, so perhaps someone else will say something reasonable, or not.

“OMG, PBNs too. What is this thread even. Now I won’t say anything without a lawyer present.”

No Shortcuts To Online Riches

Of course, there is an underlying current connecting all of these questions. Mueller takes note of this as well, saying:

“Reading between the lines, it seems you want to find a short-cut to making money online.”

The truth is, there are no real shortcuts to online success these days. However, there are a lot of questionable people willing to take your money to provide tools and courses that often get you nowhere. 

“Unfortunately, there’s a long line of people trying to do the same, and some have a lot of practice. Some will even sell you tools and courses on how to make money online (and *they* will be the ones making the money, fwiw, since people pay them for the tools and courses). The good tools cost good money, and they’re not marketed towards people who just want to make money online — they’re targeted at companies who need to manage their online presence and report on progress to their leadership chain.”

At the same time, Mueller encourages individuals such as the person who started the thread to keep learning and practicing SEO:

“… learn HTML, learn a bit of programming, and go for it. 90% of the random tricks you run across won’t work, 9% of the remaining ones will burn your sites to the ground, but if you’re lucky & persistent (is that the same?), you’ll run across some things that work for you.

“If you want to go this route, accept that most – or all – of the things you build will eventually blow up, but perhaps you’ll run into some along the way that make it worthwhile.”

“And … after some time, you might notice that actually building something of lasting value can also be intriguiing [sic], and you’ll start working on a side-project that does things in the right way, where you can put your experience to good use and avoid doing all of the slash & burn site/spam-building.”

To understand and rank websites in search results, Google constantly uses tools called crawlers to find and analyze new or recently updated web pages. What may surprise you is that the search engine actually uses three different types of crawlers depending on the situation. In fact, some of these crawlers may ignore the robots.txt rules meant to control how they interact with your site.

In the past week, the SEO world was surprised by the reveal that the search engine had begun using a new crawler, GoogleOther, to relieve the strain on its main crawlers. Amidst this, I noticed some asking, “Google has three different crawlers? I thought it was just Googlebot” (the most well-known crawler, which the search engine has used for over a decade).

In reality, the company uses quite a few crawlers, and it would take a while to go into exactly what each one does (Search Engine Roundtable maintains a full list).

However, Google recently updated a help document called “Verifying Googlebot and other Google crawlers” that breaks all these crawlers into three specific groups. 

The Three Types of Google Web Crawlers

Googlebot: The first type of crawler is easily the most well-known and recognized. Googlebot crawls and indexes pages for the company’s main search results, and it always observes the rules set out in robots.txt files.

Special-case Crawlers: In some cases, Google creates crawlers for very specific functions, such as AdsBot, which assesses web page quality for those running ads on the platform. Depending on the situation, these crawlers may ignore the rules dictated in a robots.txt file.

User-triggered Fetchers: When a user does something that requires the search engine to verify information (when a site owner triggers the Google Site Verifier, for example), Google uses special fetchers dedicated to these tasks. Because the user initiates the request to complete a specific process, these fetchers ignore robots.txt rules entirely.
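
Since each group identifies itself differently, the help document also explains how to confirm that a visitor claiming to be Googlebot really is one: a reverse DNS lookup followed by a forward confirmation. Here is a rough sketch of that two-step check for Node.js; the sample IP is illustrative:

```typescript
// A rough sketch of the reverse-DNS verification Google documents for
// confirming a request really comes from one of its crawlers.
// Assumes Node.js; handles IPv4 A records only for simplicity.
import { promises as dns } from 'node:dns';

async function isGoogleCrawler(ip: string): Promise<boolean> {
  try {
    // Step 1: reverse DNS lookup on the visiting IP address.
    const [hostname] = await dns.reverse(ip);
    if (!/\.(googlebot|google)\.com$/.test(hostname)) return false;

    // Step 2: forward-confirm the hostname resolves back to the same IP.
    const addresses = await dns.resolve(hostname);
    return addresses.includes(ip);
  } catch {
    return false; // Any lookup failure means the visitor is unverified.
  }
}

// Example usage with an illustrative IP address:
isGoogleCrawler('66.249.66.1').then((ok) => console.log(ok));
```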

Why This Matters

Understanding how Google analyzes and processes the web helps you optimize your site for the best possible performance. Additionally, it is important to identify Google’s crawlers and filter them out of your analytics tools, or they can appear as false visits or impressions.

For more, read the full help article here.

Typically when a site starts ranking worse for one keyword, the effect is also seen for several of the other keywords it ranks for. So what does it mean when a website only loses rankings for one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem. 

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner who had effectively disappeared from the search results for a specific keyword – despite ranking at the top of results consistently in the past. 

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating whether the issue is isolated to a specific region or is part of a larger global problem.

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 

Sudden changes to your backlink profile – whether from mass-disavowing links or from acquiring low-quality, spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).

To hear the full discussion, check out the complete Google SEO Office Hours episode.

Google released its annual Ads Safety Report this week, highlighting the company’s efforts to guarantee advertising on its platforms is safe and trusted.

Along with suspending more than 6.7 million spammy ad accounts over the last year, the report details how Google is fighting fraud, preventing potentially harmful ads from running, and protecting user privacy.

Using machine learning algorithms, Google is able to identify suspicious activity and patterns faster than ever and quickly remove fraudulent or harmful ads.

This has contributed to a huge improvement in Google’s abilities to detect spam and harmful activity at scale, leading to over 2 billion more ads being blocked in 2022 compared to the previous year. 

At the same time Google released the report, the company also announced it is launching an Ads Transparency Center to help users better understand the ads they are seeing and who is paying to display them. 

Highlights From The 2022 Google Ads Safety Report

The full Ads Safety Report includes a lot of details about how Google detects and removes malicious or spammy ads, but these were the details we think are most important for you to know:

  • Google blocked over 5.2 billion ads for policy violations
  • Ad restrictions were down by over a billion annually in 2022
  • Over 6.7 million advertiser accounts were suspended for “egregious” policy violations
  • The number of ads removed from web pages stayed largely stable compared to the previous year

What Is The Ads Transparency Center?

In response to the leap in blocked ads and suspended ad accounts, Google decided to create the Ads Transparency Center – a central knowledge hub containing information about verified advertisers and ads. 

Here you’ll be able to find detailed information about the ads a specific advertiser has run, what ads are being shown in a specific area, and more about ads appearing on the platform. 

Users can also access My Ad Center here, which gives them the ability to like, block, or report potentially problematic ads. 

For more about Google’s attempts to keep the ads on its platform safe for users, check out the full 2022 Ads Safety Report here or the Ads Transparency Center announcement here.

Just ahead of International Fact-Checking Day on April 2, Google announced a wave of new features intended to make it easier for users to learn more about where their information is coming from.

As the company revealed in a recent announcement, Google is introducing five new features to verify information online:

  • Expanding the “About this result” feature worldwide
  • Introducing an “About this author” section
  • Making it easier to learn about websites using the “About this page” feature
  • Providing context for top stories with “Perspectives”
  • Helping spot information gaps

Expanding the “About this result” feature worldwide

Launched in 2021, the “About this result” feature gives searchers access to additional information about the sources and websites that appear in search results. 

Though English-speaking countries have been able to find this information by clicking the three vertical dots next to most search results for a while, users in other countries or speaking other languages are just now getting access to the feature. 

Introducing an “About this author” section

Google is adding a new section to the “About this result” feature which gives information specifically about the author of the content you see. 

At this time, it is unclear exactly where Google will gather this information, but it is worth keeping an eye on as the feature rolls out – especially if your site publishes blog content.

Making it easier to learn about websites using the “About this page” feature

Google is adding a new way to access the “About this page” feature, which details information about a webpage similar to the “About this result” feature.

Now, you can learn more about a page by typing its URL into Google’s search bar. The resulting search results will include information from the “About this page” feature at the top of the page.

Here, you’ll see information about how the site describes itself and what others across the web have said about the site. 

Providing context for top stories with “Perspectives”

The Perspectives carousel aims to provide additional context around Top Stories by sharing helpful insights from journalists and other experts.

The feature has been in testing since 2022, but Google says it will be widely available in the coming days. 

Helping spot information gaps

When Google is unable to confidently provide information about a topic – either because there are few good sources available or because the information is changing quickly around that topic – the search engine will display a content warning with the search results it provides. 

To learn more about these new features, read the complete announcement from Google here.

Google has started giving users in the US and UK access to Google Bard, its answer to AI chat tools like Bing Chat and ChatGPT. The company is doing a gradual rollout through a waitlist at bard.google.com.

What Is Bard?

Bard is a generative AI. That means it will generate content for you based on prompts that you submit through a chatbot. 

In today’s announcement (partially written with the help of Bard), the company suggested a variety of ways users might be able to take advantage of the AI tool:

“You can use Bard to boost your productivity, accelerate your ideas and fuel your curiosity. You might ask Bard to give you tips to reach your goal of reading more books this year, explain quantum physics in simple terms or spark your creativity by outlining a blog post.”

Is Bard an AI Search Tool?

Yes and no. 

Bard is something of a complementary tool to Google’s search engine. While it is not directly integrated into Google Search, it is “designed so that you can easily visit Search to check its responses or explore sources across the web.”

Along with suggesting queries, Bard lets you immediately open a new tab with search results for a given query.

At the same time, Bard is not considered a direct part of Google search. Instead, the company suggests it will be adding other AI tools to its search engine in the future. 

Bard Is In Early Stages

Throughout the announcement, Google repeatedly described Bard as an early experiment. As with Bing’s AI tools, Bard is likely to have some early quirks and weirdness as users get their hands on it.

Additionally, Google pointed out that the AI tool is far from perfect. It can get information wrong or phrase things in misleading ways. Some of these errors may be small. In Google’s example, Bard got the scientific name of a plant wrong (the correct name is Zamioculcas zamiifolia, not Zamioculcas zamioculcas). However, the company cautions it may be inaccurate in other ways.

Still, it will be fun to see what Bard can do now that it is coming to the public.

Google has confirmed it is rolling out its latest broad core algorithm update, signifying yet another potential shake-up for the search engine’s results.

Google’s broad core algorithm updates are some of the search engine’s most significant changes, in contrast to the smaller updates that happen multiple times a day. They can affect rankings for search engine results pages (SERPs) across Google’s entire platform.

As is usual with Google, the search company is being tight-lipped about specific details, only going so far as to confirm the update is rolling out. The update is also expected to take several weeks for its full impact to become apparent.

With this in mind, it is wise for brands to take note and monitor their own search performance in the coming weeks.

What Can You Do?

Aside from always striving to provide the best online experience possible with your website, there are a few specific steps you can take to safeguard your site from updates like these:

  • Monitor site performance regularly to identify early signs of issues with your site (see the sketch after this list)
  • Create content geared to your audience’s needs and interests
  • Optimize your site’s performance (including speed, mobile-friendliness, and user experience) to ensure your site isn’t off-putting to visitors
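
On the monitoring point, one way to watch for ranking swings programmatically is the Search Console API’s Search Analytics endpoint. The sketch below assumes the googleapis Node package and a service account with read access to the property; the site URL, key file path, and dates are placeholders:

```typescript
// A hedged sketch of pulling daily clicks and impressions from the
// Search Console API (npm install googleapis) to watch for sudden
// swings after a core update. Site URL and key file are placeholders.
import { google } from 'googleapis';

async function fetchSearchPerformance(): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json', // placeholder credentials file
    scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
  });

  const searchconsole = google.searchconsole({ version: 'v1', auth });

  const res = await searchconsole.searchanalytics.query({
    siteUrl: 'https://www.example.com/', // placeholder property
    requestBody: {
      startDate: '2023-03-01', // placeholder reporting window
      endDate: '2023-03-31',
      dimensions: ['date'], // one row per day makes drops easy to spot
    },
  });

  for (const row of res.data.rows ?? []) {
    console.log(row.keys?.[0], row.clicks, row.impressions);
  }
}

fetchSearchPerformance();
```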

TL;DR

Google has launched its latest broad core algorithm update, which could affect rankings across search engine results pages. The update may take several weeks to show its full impact, so brands are advised to monitor their search performance. To safeguard your site, monitor its performance regularly, create audience-focused content, and optimize for speed, mobile-friendliness, and user experience.